Education

Should The Government Pay For Veterans To Attend Code Schools? (backchannel.com) 168

mirandakatz writes: David Molina was finishing up his 12-year stint in the Army when he started teaching himself to code, and began to think that he might like to pursue it professionally once his service was done. But with a wife and family, he couldn't dedicate the four years he'd need to get an undergraduate degree in computer science -- and the GI Bill, he learned, won't cover accelerated programs like code schools. So he started an organization dedicated to changing that. Operation Code is lobbying politicians to allow vets to attend code schools through the GI Bill and prepare themselves for the sorts of stable, middle-class jobs that have come to be called "blue-collar coding." Molina sees it as a serious failing that the GI Bill will cover myriad vocational programs, but not those that can prepare veterans for one of the fastest-growing industries in existence.
The issue seems to be quality. The group estimates there are already nine code schools in the U.S. that do accept GI Bill benefits -- but only "longer-standing ones that have made it through State Approving Agencies." Meanwhile, Course Report calculates that 18,000 people finished coding bootcamps last year -- and that two-thirds of them found a job within three months.

But I just liked how Molina described his introduction into the world of programmers. While stationed at Dover Air Force Base, he attended Baltimore's long-standing Meetup for Ruby on Rails, where "People taught me about open source. There was pizza, there was beer. They made me feel like I was at home."
Java

Red Hat And IBM Will Vote Against Java's Next Release (infoworld.com) 57

An anonymous reader quotes InfoWorld: The next edition of standard Java had been proceeding toward its planned July 27 release after earlier bumps in the road over modularity. But now Red Hat and IBM have opposed the module plan. "JDK 9 might be held up by this," Oracle's Georges Saab, vice president of development for the Java platform, said late Wednesday afternoon. "As is the case for all major Java SE releases, feedback from the Java Community Process may affect the timeline..."

Red Hat's Scott Stark, vice president of architecture for the company's JBoss group, expressed a number of concerns about how applications would work with the module system and its potential impact on the planned Java Enterprise Edition 9. Stark also said the module system, which is featured in Java Specification Request 376 and Project Jigsaw, could result in two worlds of Java: one for Jigsaw and one for everything else, including Java SE classloaders and OSGi. Stark's analysis received input from others in the Java community, including Sonatype.

"The result will be a weakened Java ecosystem at a time when rapid change is occurring in the server space with increasing use of languages like Go," Stark wrote, also predicting major challenges for applications dealing with services and reflection. His critique adds that "In some cases the implementation...contradicts years of modular application deployment best practices that are already commonly employed by the ecosystem as a whole." And he ultimately concludes that this effort to modularize Java has limitations which "almost certainly prevent the possibility of Java EE 9 from being based on Jigsaw, as to do so would require existing Java EE vendors to completely throw out compatibility, interoperability, and feature parity with past versions of the Java EE specification."
Oracle

In Oracle's Cloud Pitch To Enterprises, an Echo of a Bygone Tech Era (siliconangle.com) 55

An anonymous reader writes: Oracle sought to position itself once again this week as the best place for everything companies need to move to cloud computing. On Thursday, executives at the database and business software giant distanced Oracle from public cloud leaders such as Amazon Web Services, Google Cloud Platform and Microsoft Azure that provide computing, storage and other services to corporations looking to reduce or eliminate their data centers. "Our cloud is more comprehensive than any other cloud in the market today, a full end-to-end cloud," said David Donatelli, Oracle's executive vice president of converged infrastructure. "We design from the chip all the way up to the application, fully vertically integrated." What's interesting about that messaging, which Oracle has been refining since at least its OpenWorld conference last September, is not simply the competitive positioning. Oracle is essentially saying that the nature of cloud computing suggests customers need to move away from the notion that has dominated information technology since personal computers and PC-based servers began to displace mainframes and minicomputers: cherry-picking the best applications and hardware and cobbling together their own IT setups. In short, Oracle contends, it's time for another broad swing back to the integrated, uber-suppliers of a bygone era of technology. Of course, the new tech titans such as Google, Facebook and Amazon arguably wield as much power in their particular domains of advertising and e-commerce as the Big Blue of old. But it has been a long time since a soup-to-nuts approach has worked for enterprise tech companies, and for those few still attempting it, such as Dell and Oracle, it's far from obvious it will work. The cloud, Oracle contends, may well change that.
Education

How Scratch Is Feeding Hacker Values into Young Minds (backchannel.com) 48

Reader mirandakatz writes: It's the 10th anniversary of Scratch, the kids programming language that's become a popular tool for training the next generation of minds in computer science. But as Steven Levy writes at Backchannel, Scratch's real value is how it imparts lessons in sharing, logic, and hackerism: 'A product of the MIT Media Lab, Scratch is steeped in a complicated set of traditions -- everything from educational philosophy to open source activism and the pursuit of artificial life. The underpinnings of this tool subtly, and sometimes not so subtly, convey a set of values through its use... These values include reverence of logic, an unshakeable belief in the power of collaboration, and a celebration of the psychic and tangible rewards of being a maker.'
NASA

NASA Runs Competition To Help Make Old Fortran Code Faster (bbc.com) 205

NASA is seeking help from coders to speed up the software it uses to design experimental aircraft. From a report on BBC: It is running a competition that will share $55,000 between the top two people who can make its FUN3D software run up to 10,000 times faster. The FUN3D code is used to model how air flows around simulated aircraft in a supercomputer. The software was developed in the 1980s and is written in an older computer programming language called Fortran. "This is the ultimate 'geek' dream assignment," said Doug Rohn, head of NASA's transformative aeronautics concepts program that makes heavy use of the FUN3D code. In a statement, Mr Rohn said the software is used on the agency's Pleiades supercomputer to test early designs of futuristic aircraft. The software suite tests them using computational fluid dynamics, which makes heavy use of complicated mathematical formulae and data structures to see how well the designs work.
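FUN3D itself is Fortran and far beyond a snippet, but the flavor of work such a challenge involves -- replacing a loop that redoes work with one that reuses it -- can be sketched in a few lines. This toy Python example (not NASA's code, purely illustrative) computes sliding-window sums two ways: a naive O(n·k) version, and an O(n) version that keeps a running sum.

```python
def window_sums_naive(values, k):
    # O(n*k): recompute each window from scratch
    return [sum(values[i:i + k]) for i in range(len(values) - k + 1)]

def window_sums_fast(values, k):
    # O(n): maintain a running sum -- the classic way inner loops get sped up
    s = sum(values[:k])
    out = [s]
    for i in range(k, len(values)):
        s += values[i] - values[i - k]
        out.append(s)
    return out

data = list(range(20))
assert window_sums_naive(data, 5) == window_sums_fast(data, 5)
```

Real CFD speedups come from cache-friendly data layouts, vectorization, and parallelism rather than one-liners, but the principle -- same answer, less redundant arithmetic -- is the same.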
China

China Makes Quantum Leap In Developing Quantum Computer (scmp.com) 70

hackingbear writes: Researchers at the University of Science and Technology of China created a quantum device, called a boson sampling machine, that can carry out calculations for five photons 24,000 times faster than previous experiments. Pan Jianwei, the lead scientist on the project, said that though their device was (only) 10 to 11 times faster at carrying out the calculations than the first electronic digital computer, ENIAC, and the first transistor computer, TRADIC, running the classical algorithm, their machine would eclipse all of the world's supercomputers in a few years. "Our architecture is feasible to be scaled up to a larger number of photons and with a higher rate to race against increasingly advanced classical computers," they said in the research paper published in Nature Photonics. This device is said to be the first quantum computer to beat a real electronic classical computer in practice. Scientists estimate that today's fastest supercomputers would struggle to simulate the behavior of 20 photons.
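The reason boson sampling is hard for classical machines is that the output probabilities are given by matrix permanents, which (unlike determinants) have no known efficient algorithm. The best-known classical approach, Ryser's formula, still takes exponential time -- a minimal Python sketch, for illustration only:

```python
from itertools import combinations

def permanent(a):
    """Ryser's formula: O(2^n * n^2), versus n! for naive expansion.

    perm(A) = (-1)^n * sum over nonempty column subsets S of
              (-1)^|S| * prod_i (sum_{j in S} a[i][j])
    """
    n = len(a)
    total = 0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1
            for row in a:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total

# For a 2x2 matrix [[a, b], [c, d]] the permanent is a*d + b*c
assert permanent([[1, 2], [3, 4]]) == 10
```

Even this clever formula doubles in cost with every added photon, which is why the researchers expect a modestly larger photonic device to outrun any supercomputer.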
Facebook

Facebook Rejects Female Engineers' Code More Often Than Male Counterparts, Analysis Finds (theverge.com) 450

According to The Wall Street Journal, female engineers who work at Facebook may face gender bias that prevents their code from being accepted at the same rate as male counterparts. "For Facebook, these revelations call into question the company's ongoing diversity efforts and its goal to build overarching online systems for people around the globe," reports The Verge. "The company's workforce is just 33 percent female, with women holding just 17 percent of technical roles and 27 percent of leadership positions." From the report: The findings come in two parts. An initial study by a former employee found that code written by female engineers was less likely to make it through Facebook's internal peer review system. This seemed to suggest that a female engineer's work was more heavily scrutinized. Facebook, alarmed by this data, commissioned a second study by Jay Parikh, its head of infrastructure, to investigate any potential issues. Parikh's findings suggested that the code rejections were due to engineering rank, not gender. However, Facebook employees now speculate that Parikh's findings mean female engineers might not be rising in the ranks as fast as male counterparts who joined the company at the same time, or perhaps that female engineers are leaving the company more often before being promoted. Either possibility could result in the 35 percent higher code rejection rate for female engineers. When contacted by The Wall Street Journal, Facebook called the initial study "incomplete and inaccurate" and based on "incomplete data," but did not shy away from confirming Parikh's separate findings.
Programming

Power of Modern Programming Languages is That They Are Expressive, Readable, Concise, Precise, and Executable (scientificamerican.com) 268

An anonymous reader shares a Scientific American article: Programming has changed. In first generation languages like FORTRAN and C, the burden was on programmers to translate high-level concepts into code. With modern programming languages -- I'll use Python as an example -- we use functions, objects, modules, and libraries to extend the language, and that doesn't just make programs better, it changes what programming is. Programming used to be about translation: expressing ideas in natural language, working with them in math notation, then writing flowcharts and pseudocode, and finally writing a program. Translation was necessary because each language offers different capabilities. Natural language is expressive and readable, pseudocode is more precise, math notation is concise, and code is executable. But the price of translation is that we are limited to the subset of ideas we can express effectively in each language. Some ideas that are easy to express computationally are awkward to write in math notation, and the symbolic manipulations we do in math are impossible in most programming languages. The power of modern programming languages is that they are expressive, readable, concise, precise, and executable. That means we can eliminate middleman languages and use one language to explore, learn, teach, and think.
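A tiny illustration of the article's claim, in Python: the code reads almost as directly as the math notation it replaces, but unlike the notation, it runs.

```python
# Math notation:  H_n = sum over k = 1..n of 1/k   (the n-th harmonic number)
def harmonic(n):
    return sum(1 / k for k in range(1, n + 1))

# The definition and the program are nearly the same artifact --
# no flowchart or pseudocode step in between.
assert abs(harmonic(3) - (1 + 1/2 + 1/3)) < 1e-12
```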
Programming

Developer Hacks Together Object-Oriented HTML (github.com) 184

An anonymous reader writes: Ever since I started coding, I have always loved object-oriented design patterns. I built an HTML preprocessor that adds inheritance, polymorphism, and public methods to this venerable language. It offers more freedom than a templating engine and has a wider variety of use cases. Pull requests appreciated!
Data Storage

Developer Shares A Recoverable Container Format That's File System Agnostic (github.com) 133

Long-time Slashdot reader MarcoPon writes: I created a thing: SeqBox. It's an archive/container format (and corresponding suite of tools) with some interesting and unique features. Basically an SBX file is composed of a series of sector-sized blocks, each with a small header containing a recognizable signature, an integrity check, info about the file they belong to, and a sequence number. The result of this encoding is the ability to recover an SBX container even if the file system is corrupted, completely lost, or simply unknown, no matter how badly the file is fragmented.
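The core idea -- self-describing sector-sized blocks that can be found and reordered by raw scanning -- can be sketched in Python. This is an illustration of the concept only: the signature, header layout, and field sizes below are invented for the demo and do not match the actual SBX specification (which also records file metadata and length; this sketch just zero-pads).

```python
import struct
import zlib

SIG = b"DEMO"                       # hypothetical signature, NOT the real SBX magic
BLOCK = 512                         # one disk sector
HEADER = struct.Struct(">4sII")     # signature, file id, sequence number
PAYLOAD = BLOCK - HEADER.size - 4   # trailing 4 bytes hold a CRC32

def encode(data, file_id=1):
    """Split data into self-describing, individually checksummed blocks."""
    blocks = []
    for seq, off in enumerate(range(0, len(data), PAYLOAD)):
        chunk = data[off:off + PAYLOAD].ljust(PAYLOAD, b"\0")
        body = HEADER.pack(SIG, file_id, seq) + chunk
        blocks.append(body + struct.pack(">I", zlib.crc32(body)))
    return blocks

def recover(raw):
    """Scan an arbitrary byte soup for valid blocks; reassemble by sequence."""
    found, i = {}, 0
    while i <= len(raw) - BLOCK:
        if raw[i:i + 4] == SIG:
            block = raw[i:i + BLOCK]
            body, crc = block[:-4], struct.unpack(">I", block[-4:])[0]
            if zlib.crc32(body) == crc:           # reject false signature hits
                _, _, seq = HEADER.unpack(body[:HEADER.size])
                found[seq] = body[HEADER.size:]
                i += BLOCK
                continue
        i += 1
    return b"".join(found[s] for s in sorted(found))

# Blocks survive being shuffled and buried in unrelated bytes:
data = b"hello, world. " * 100
raw = b"\xffnoise\x00" * 40 + b"".join(reversed(encode(data)))
assert recover(raw).rstrip(b"\0") == data
```

Because every block carries its own signature, checksum, and sequence number, no file system metadata is needed to put the pieces back together -- which is exactly the property SeqBox is built around.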
Businesses

Should Banks Let Ancient Programming Language COBOL Die? (thenextweb.com) 383

COBOL is a programming language designed in 1959, drawing heavily on Grace Hopper's earlier work, and while it is several decades old, it's still widely used by the financial sector, major corporations and parts of the federal government. Mar Masson Maack from The Next Web interviews Daniel Doderlein, CEO of Auka, who explains why banks don't have to actively kill COBOL and how they can modernize and "minimize the new platforms' connections to the old systems so that COBOL can be switched out in a safe and cheap manner." From the report: According to [Doderlein], COBOL-based systems still function properly but they're faced with a more human problem: "This extremely critical part of the economic infrastructure of the planet is run on a very old piece of technology -- which in itself is fine -- if it weren't for the fact that the people servicing that technology are a dying race." And Doderlein literally means dying. Despite the fact that three trillion dollars run through COBOL systems every single day, they are mostly maintained by retired programming veterans. There are almost no new COBOL programmers available, so as retirees pass away, so does the maintenance for software written in the ancient programming language. Doderlein says that banks have three options when it comes to deciding how to deal with this emerging crisis. First off, they can simply ignore the problem and hope for the best. Software written in COBOL is still good for some functions, but ignoring the problem won't fix how impractical it is for making new consumer-centric products. Option number two is replacing everything, creating completely new core banking platforms written in more recent programming languages. The downside is that it can cost hundreds of millions and it's highly risky changing the entire system all at once. The third option, however, is the cheapest and probably easiest.
Instead of trying to completely revamp the entire system, Doderlein suggests that banks take a closer look at the current consumer problems. Basically, Doderlein suggests making light-weight add-ons in more current programming languages that only rely on COBOL for the core feature of the old systems.
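That third option is essentially a facade: keep the legacy core's narrow, battle-tested interface, and build new features in a modern layer that touches it as little as possible. A rough Python sketch of the pattern (the class and method names are invented for illustration; a real deployment would call the COBOL system over a messaging or terminal interface):

```python
class LegacyCore:
    """Stand-in for the COBOL core: only the essential ledger operations."""
    def __init__(self):
        self._balances = {}

    def post(self, account, amount_cents):
        self._balances[account] = self._balances.get(account, 0) + amount_cents

    def balance(self, account):
        return self._balances.get(account, 0)

class MobilePaymentsFacade:
    """Modern add-on: new consumer features live here, in a current language.
    Only post() and balance() touch the old system, so the core can later be
    swapped out behind this seam without rewriting the new features."""
    def __init__(self, core):
        self._core = core

    def pay(self, payer, payee, amount_cents):
        if self._core.balance(payer) < amount_cents:
            raise ValueError("insufficient funds")
        self._core.post(payer, -amount_cents)
        self._core.post(payee, amount_cents)

core = LegacyCore()
core.post("alice", 10_000)
MobilePaymentsFacade(core).pay("alice", "bob", 2_500)
assert core.balance("alice") == 7_500 and core.balance("bob") == 2_500
```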
Nintendo

Early Nintendo Programmer Worked Without a Keyboard (arstechnica.com) 111

Like IT support staff, many programmers have horror stories about extreme work environments that forced them to hack things together. But as Ars Technica points out, not many of them can beat the keyboard-free coding environment that Masahiro Sakurai apparently used to create the first Kirby's Dream Land. From the story: The tidbit comes from a talk Sakurai gave ahead of a Japanese orchestral performance celebrating the 25th anniversary of the original Game Boy release of Kirby's Dream Land in 1992. Sakurai recalled how HAL Laboratory was using a Twin Famicom as a development kit at the time. Trying to program on the hardware, which combined a cartridge-based Famicom and the disk-based Famicom Disk System, was "like using a lunchbox to make lunch," Sakurai said. As if the limited power wasn't bad enough, Sakurai revealed that the Twin Famicom testbed they were using "didn't even have keyboard support, meaning values had to be input using a trackball and an on-screen keyboard."
Databases

Five Years Later, Legal Megaupload Data Is Still Trapped On Dead Servers (arstechnica.com) 82

An anonymous reader quotes a report from Ars Technica: It's been more than five years since the government accused Megaupload and its founder Kim Dotcom of criminal copyright infringement. While Dotcom himself was arrested in New Zealand, U.S. government agents executed search warrants and grabbed a group of more than 1,000 servers owned by Carpathia Hosting. That meant that a lot of users with gigabytes of perfectly legal content lost access to it. Two months after the Dotcom raid and arrest, the Electronic Frontier Foundation filed a motion in court asking to get back data belonging to one of those users, Kyle Goodwin, whom the EFF took on as a client. Years have passed. The U.S. criminal prosecution of Dotcom and other Megaupload executives is on hold while New Zealand continues with years of extradition hearings. Meanwhile, Carpathia's servers were powered down and are kept in storage by QTS Realty Trust, which acquired Carpathia in 2015. Now the EFF has taken the extraordinary step of asking an appeals court to step in and effectively force the hand of the district court judge. Yesterday, Goodwin's lawyers filed a petition for a writ of mandamus (PDF) with the U.S. Court of Appeals for the 4th Circuit, which oversees Virginia federal courts. "We've been asking the court for help since 2012," said EFF attorney Mitch Stolz in a statement about the petition. "It's deeply unfair for him to still be in limbo after all this time."
Java

Ask Slashdot: Do You Like Functional Programming? (slashdot.org) 418

An anonymous reader writes: Functional programming seems to be all the rage these days. Efforts are being made to highlight its use in Java, JavaScript, C# and elsewhere. Lots of claims are being made about its virtues that seem relatively easy to prove or disprove, such as "Its use will reduce your debugging time" or "It will clarify your code." My co-workers are resorting to arm-wrestling matches over this style choice. Half of my co-workers have drunk the Kool-Aid and are evangelizing its benefits. The other half are unconvinced of its virtues over Object Oriented Design patterns, etc.

What is your take on functional programming and related technologies (i.e. lambdas and streams)? Is it our salvation? Is it merely another useful design pattern? Or is it a technological dead-end?

Python creator Guido van Rossum has said most programmers aren't used to functional languages, and when he answered Slashdot reader questions in 2013, he said the only functional language he knew much about was Haskell, and "any language less popular than Haskell surely has very little practical value." He even added, "I also don't think that the current crop of functional languages is ready for mainstream."

Leave your own opinions in the comments. Do you like functional programming?
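For readers who haven't picked a side yet, here is the stylistic difference in miniature -- the same computation written imperatively and then as a filter/map/reduce pipeline with lambdas, in Python (a toy example with made-up data, not an endorsement of either camp):

```python
from functools import reduce

orders = [{"item": "disk", "qty": 2, "price": 50},
          {"item": "ram",  "qty": 0, "price": 80},
          {"item": "cpu",  "qty": 1, "price": 300}]

# Imperative: explicit loop and mutable accumulator
total = 0
for o in orders:
    if o["qty"] > 0:
        total += o["qty"] * o["price"]

# Functional: the same logic as a pipeline of pure functions
total_fn = reduce(lambda acc, x: acc + x,
                  map(lambda o: o["qty"] * o["price"],
                      filter(lambda o: o["qty"] > 0, orders)),
                  0)

assert total == total_fn == 400
```

Whether the second version is "clearer" is precisely what the submitter's co-workers are arm-wrestling about.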
Programming

Flawed Online Tutorials Led To Vulnerabilities In Software (helpnetsecurity.com) 96

An anonymous reader quotes Help Net Security: Researchers from several German universities have checked the PHP codebases of over 64,000 projects on GitHub, and found 117 vulnerabilities that they believe have been introduced through the use of code from popular but insufficiently reviewed tutorials. The researchers identified popular tutorials by inputting search terms such as "mysql tutorial", "php search form", "javascript echo user input", etc. into Google Search. The first five results for each query were then manually reviewed and evaluated for SQLi and XSS vulnerabilities by following the Open Web Application Security Project's Guidelines. This resulted in the discovery of 9 tutorials containing vulnerable code (6 with SQLi, 3 with XSS).
The researchers then checked for the code in GitHub repositories, and concluded that "there is a substantial, if not causal, link between insecure tutorials and web application vulnerabilities." Their paper is titled "Leveraging Flawed Tutorials for Seeding Large-Scale Web Vulnerability Discovery."
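The tutorials the researchers examined were PHP, but the flaw they propagate is language-agnostic: building SQL by string concatenation. This Python/sqlite3 sketch shows the vulnerable pattern next to the parameterized fix:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

name = "x' OR '1'='1"   # attacker-controlled input

# Vulnerable: string concatenation -- the pattern flawed tutorials teach.
# The input becomes part of the SQL, and the OR clause matches every row.
leaked = conn.execute(
    "SELECT secret FROM users WHERE name = '" + name + "'").fetchall()

# Safe: a parameterized query; the driver never interprets input as SQL.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (name,)).fetchall()

assert leaked == [("s3cret",)]   # injection dumps the stored secret
assert safe == []                # placeholder query matches nothing
```

Copy-pasting the first form into production is exactly how a tutorial bug becomes 117 GitHub vulnerabilities.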
Education

Slashdot Asks: What Was Your First Programming Language? (stanforddaily.com) 633

This question was inspired by news that Stanford's computer science professor Eric Roberts will try JavaScript instead of Java in a new version of the college's introductory computer programming course. The Stanford Daily reports: When Roberts came to Stanford in 1990, CS106A was still taught in Pascal, a programming language he described as not "clean." The department adopted the C language in 1992. When Java came out in 1995, the computer science faculty was excited to transition to the new language. Roberts wrote the textbooks, worked with other faculty members to restructure the course and assignments and introduced Java at Stanford in 2002... "Java had stabilized," Roberts said. "It was clear that many universities were going in that direction. It's 2017 now, and Java is showing its age." According to Roberts, Java was intended early on as "the language of the Internet". But now, more than a decade after the transition to Java, Javascript has taken its place as a web language.
In 2014 Python and Java were the two most commonly-taught languages at America's top universities, according to an analysis published by the Communications of the ACM. And Java still remains the most-commonly taught language in a university setting, according to a poll by the Special Interest Group on Computer Science Education. In a spreadsheet compiling the results, "Python appears 60 times, C++ 54 times, Java 84 times, and JavaScript 28 times," writes a computing professor at the Georgia Institute of Technology, adding "if Java is dying (or "showing its age"...) it's going out as the reigning champ."

I'm guessing Slashdot's readers have their own opinions about this, so share your educational experiences in the comments. What was your first programming language?
Programming

Stack Overflow Reveals Which Programming Languages Are Most Used At Night (stackoverflow.blog) 99

Stack Overflow data scientist David Robinson recently calculated when people visit the popular programming question-and-answer site, but then also calculated whether those results differed by programming language. Quoting his results:
  • "C# programmers start and stop their day earlier, and tend to use the language less in the evenings. This might be because C# is often used at finance and enterprise software companies, which often start earlier and have rigid schedules."
  • "C programmers start the day a bit later, keep using the language in the evening, and stay up the longest. This suggests C may be particularly popular among hobbyist programmers who code during their free time (or perhaps among summer school students doing homework)."
  • "Python and Javascript are somewhere in between: Python and Javascript developers start and end the day a little later than C# users, and are a little less likely than C programmers to work in the evening."

The site also released an interactive app which lets users see how the results for other languages compared to C#, JavaScript, Python, and C, though of those four, "C# would count as the 'most nine-to-five,' and C as the least."

And they've also calculated the technologies used most between 9 to 5 (which "include many Microsoft technologies, such as SQL Server, Excel, VBA, and Internet Explorer, as well as technologies like SVN and Oracle that are frequently used at enterprise software companies.") Meanwhile, the technologies most often used outside the 9-5 workday "include web frameworks like Firebase, Meteor, and Express, as well as graphics libraries like OpenGL and Unity. The functional language Haskell is the tag most visited outside of the workday; only half of its visits happen between 9 and 5."
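The metric behind these rankings is straightforward: for each tag, what fraction of its visits fall inside working hours? A minimal Python sketch with synthetic timestamps (not Stack Overflow's actual data -- the numbers here are invented to mirror the reported pattern):

```python
# Hypothetical visit hours-of-day per tag; illustrative only
visits = {
    "c#":      [9, 10, 10, 11, 14, 15, 16, 16],
    "haskell": [2, 8, 12, 19, 20, 21, 22, 23],
}

def workday_share(hours):
    """Fraction of visits falling between 9:00 and 17:00."""
    return sum(1 for h in hours if 9 <= h < 17) / len(hours)

shares = {tag: workday_share(h) for tag, h in visits.items()}
assert shares["c#"] == 1.0        # all synthetic C# visits are nine-to-five
assert shares["haskell"] == 0.125  # mostly evenings, matching the finding
```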


Databases

Microsoft Will Support Python In SQL Server 2017 (infoworld.com) 98

There was a surprise in the latest Community Technology Preview release of SQL Server 2017. An anonymous reader quotes InfoWorld: Python can now be used within SQL Server to perform analytics, run machine learning models, or handle most any kind of data-powered work. This integration isn't limited to enterprise editions of SQL Server 2017, either -- it'll also be available in the free-to-use Express edition... Microsoft has also made it possible to embed Python code directly in SQL Server databases by including the code as a T-SQL stored procedure. This allows Python code to be deployed in production along with the data it'll be processing. These behaviors, and the RevoScalePy package, are essentially Python versions of features Microsoft built for SQL Server back when it integrated the R language into the database...
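A sketch of what that embedding looks like. The `sp_execute_external_script` procedure and its `InputDataSet`/`OutputDataSet` convention are SQL Server 2017's actual mechanism for in-database Python, but the table and column names below are invented for illustration, and the statement is shown here only as a Python string since running it requires a live SQL Server instance.

```python
# Illustrative T-SQL calling the embedded Python interpreter.
# InputDataSet arrives as a pandas DataFrame; OutputDataSet is returned
# to SQL Server as a result set.
tsql = """
EXEC sp_execute_external_script
    @language = N'Python',
    @script = N'
OutputDataSet = InputDataSet
OutputDataSet["doubled"] = OutputDataSet["amount"] * 2
',
    @input_data_1 = N'SELECT amount FROM dbo.Sales'
WITH RESULT SETS ((amount INT, doubled INT));
"""

assert "sp_execute_external_script" in tsql
```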

An existing Python installation isn't required. During the setup process, SQL Server 2017 can pull down and install its own edition of CPython 3.5, the stock Python interpreter available from the Python.org website. Users can install their own Python packages as well or use Cython to generate C code from Python modules for additional speed.

Except it's not yet available for Linux users, according to the article. "Microsoft has previously announced SQL Server would be available for Linux, but right now, only the Windows version of SQL Server 2017 supports Python."
Botnet

Developer of BrickerBot Malware Claims He Destroyed Over Two Million Devices (bleepingcomputer.com) 88

An anonymous reader writes: In an interview today, the author of BrickerBot, a malware that bricks IoT and networking devices, claimed he destroyed over 2 million devices, but he never intended to do so in the first place. His intentions were to fight the rising number of IoT botnets that were used to launch DDoS attacks last year, such as Gafgyt and Mirai. He says he created BrickerBot with 84 routines that try to secure devices so they can't be taken over by Mirai and other malware. Nevertheless, he realized that some devices are so badly designed that he could never protect them. He says that for these, he created a "Plan B," which meant deleting the device's storage, effectively bricking the device. His identity was revealed after a reporter received an anonymous tip about a HackForums user claiming he had been destroying IoT devices since last November, just after BrickerBot appeared. When contacted, BrickerBot's author revealed that the malware is a personal project which he calls "Internet Chemotherapy," and that he's "the doctor" who will kill all the cancerous unsecured IoT devices.
Cloud

Amazon Cloud Chief Jabs Oracle: 'Customers Are Sick of It' (cnbc.com) 81

It's no secret that Amazon and Oracle don't see eye to eye. But things are far from improving, it appears. From a report: On Wednesday, two months after Oracle co-CEO Mark Hurd called Amazon's cloud infrastructure "old" and claimed his company was gaining share, Amazon Web Services chief Andy Jassy slammed Oracle for locking customers into painfully long and expensive contracts. "People are very sensitive about being locked in given the experience they've had the last 10 to 15 years," Jassy said on Wednesday on stage at Amazon's AWS Summit in San Francisco. "When you look at cloud, it's nothing like being locked into Oracle." Jassy was addressing a cultural shift in the way technology is bought and sold. No longer does the process involve the purchase of heavy proprietary software with multi-year contracts that include annual maintenance fees. Now, Jassy says, it's about choice and ease of use, including letting clients turn things off if they're not working.

Slashdot Top Deals