Python

Did Programming Language Flaws Create Insecure Apps? (bleepingcomputer.com) 92

Several popular interpreted programming languages are affected by severe vulnerabilities that expose apps built on these languages to attacks, according to research presented at the Black Hat Europe 2017 security conference. An anonymous reader writes: The author of this research is IOActive Senior Security Consultant Fernando Arnaboldi, who says he used an automated software testing technique named fuzzing to identify vulnerabilities in the interpreters of five of today's most popular programming languages: JavaScript, Perl, PHP, Python, and Ruby.

Fuzzing involves providing invalid, unexpected, or random data as input to a software application. The researcher created his own fuzzing framework, named XDiFF, which broke each programming language down into its core functions and fuzzed each one for abnormalities. His work exposed severe flaws in all five languages, such as a hidden flaw in PHP constant names that can be abused to perform remote code execution, and undocumented Python methods that can be used for OS code execution. Arnaboldi argues that attackers can exploit these flaws even in the most secure applications built on top of these programming languages.
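As a rough illustration of the technique (not XDiFF itself, whose internals aren't described here), a minimal fuzzer feeds malformed strings to a language's built-in functions and records which inputs trigger exceptions:

```python
import random
import string

def fuzz_one(func, trials=200):
    """Feed random, often invalid strings to func and log any exceptions raised."""
    findings = []
    for _ in range(trials):
        # Mix printable junk with an embedded NUL byte, at varying lengths.
        payload = "".join(random.choice(string.printable + "\x00")
                          for _ in range(random.randint(0, 64)))
        try:
            func(payload)
        except Exception as exc:  # an unexpected exception is a finding
            findings.append((type(exc).__name__, payload))
    return findings

# Fuzz two of the interpreter's own "core functions".
for target in (int, float):
    crashes = fuzz_one(target)
    print(target.__name__, "raised", len(crashes), "exceptions")
```

A real framework like XDiFF would enumerate far more functions, use smarter input generation, and look for memory corruption or code execution rather than ordinary exceptions, but the loop structure is the same.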

Programming

What Mistakes Can Stall An IT Career? (cio.com) 199

Quoting snydeq: "In the fast-paced world of technology, complacency can be a career killer," Paul Heltzel writes in an article on 20 ways to kill your IT career without knowing it. "So too can any number of hidden hazards that quietly put your career on shaky ground -- from not knowing your true worth to thinking you've finally made it. Learning new tech skills and networking are obvious ways to solidify your career. But what about accidental ways that could put your career in a slide? Hidden hazards -- silent career killers? Some tech pitfalls may not be obvious."
CIO's reporter "talked to a number of IT pros, recruiters, and developers about how to build a bulletproof career and avoid lesser-known pitfalls," citing hazards like burning bridges and skipping social events. But it also warns of the dangers of staying in your comfort zone too long instead of asking for "stretch" assignments and accepting training opportunities.

The original submission puts the same question to Slashdot readers. "What silent career killers have you witnessed (or fallen prey to) in your years in IT?"
Technology

Sexual Harassment In Tech Is As Old As the Computer Age (ieee.org) 367

Tekla Perry writes: Historian Marie Hicks, speaking at the Computer History Museum, talks about how women computer operators and programmers were driven out of the industry, gives examples of sexual harassment dating back to the Colossus era, and previews her next research. "It's all a matter of power, Hicks pointed out -- and women have never had their share of it," reports IEEE Spectrum. "Women dominated computer programming in its early days because the field wasn't seen as a career, just something someone could do without a lot of training and would do for only a short period of time. Computer jobs had no room for advancement, so having women 'retire' in their 20s was not seen as a bad thing. And since women, of course, could never supervise men, Hicks said, women who were good at computing ended up training the men who ended up as their managers. But when it became clear that computers -- and computer work -- were important, women were suddenly pushed out of the field."

Hicks has also started looking at the bias baked into algorithms, specifically at when it first crossed from human to computer. The first example she turned up had "something to do with transgender people and the government's main pension computer." She says that when humans were in the loop, petitions to change gender on national insurance cards generally went through, but when the computer came in, the system was "specifically designed to no longer accommodate them, instead, to literally cause an error code to kick out of the processing chain any account of a 'known transsexual.'"

Google

Google Is Pulling YouTube Off the Fire TV and Echo Show as Feud With Amazon Grows (theverge.com) 238

An anonymous reader shares a report: Three months ago, YouTube pulled its programming from Amazon's Echo Show device -- the first skirmish in what is apparently an ongoing war. Shortly after, Amazon stopped selling the Nest Thermostat E, the Nest Cam IQ, and the Nest Secure alarm system. Two weeks ago, Amazon got YouTube back on the Echo Show by simply directing users to the web version, a workaround that left a lot to be desired. But even that version won't be available after today. In a statement, Google said it has been trying to reach an agreement with Amazon to provide customers with access to each other's products and services. But, Google said, Amazon doesn't carry Google products like Chromecast and Google Home, doesn't make Prime Video available for Google Cast users, and last month stopped selling some of Nest's latest products. "Given this lack of reciprocity, we are no longer supporting YouTube on Echo Show and FireTV. We hope we can reach an agreement to resolve these issues soon."
Piracy

Not Even Free TV Can Get People To Stop Pirating Movies and TV Shows (qz.com) 221

An anonymous reader quotes a report from Quartz: Since the internet made it easier to illegally download and stream movies and TV shows, Hollywood has struggled with people pirating its works online. About $5.5 billion in revenue was lost to piracy globally last year, Digital TV Research found (pdf), and it's expected to approach $10 billion by 2022. Streaming-video services like Netflix and Hulu have made it more affordable to access a wide range of titles from different TV networks and movie studios. But the availability of cheap content online has done little to curb piracy, according to research published in Management Science (paywall) last month. Customers who were offered free subscriptions to a video-on-demand package (SVOD) were just as likely to turn to piracy to find programming as those without the offering, researchers at Catolica Lisbon School of Business & Economics and Carnegie Mellon University found.

The researchers partnered with an unnamed internet-service provider -- in a region they chose not to disclose -- to offer customers who were already prone to piracy an on-demand package for free for 45 days. About 10,000 households participated in the study, and about half were given the free service. The on-demand service was packaged like Netflix or Hulu in layout, appearance, and scope of programming, but was delivered through a TV set-top box. It had a personalized recommendation engine that surfaced popular programming based on what those customers were already watching illegally, as revealed by BitTorrent logs obtained from a third-party firm. The study found that while the participants watched 4.6% more TV overall when they had the free on-demand service, they did not stop using BitTorrent to pirate movies and TV shows that were not included in the offering.

Windows

Lead Developer of Popular Windows Application Classic Shell Is Quitting 97

WheezyJoe writes: Classic Shell is a free Windows application that for years has replaced Microsoft's Start Screen or Start Menu with a highly configurable, more familiar non-tile Start menu. Yesterday, the lead developer released what he said would be the last version of Classic Shell. Citing other interests and the frequency at which Microsoft releases updates to Windows 10, as well as lagging support for the Win32 programming model, the developer says that he won't work on the program anymore. The application's source code is available on SourceForge, so there is a chance others may come and fork the code to continue development. There are several alternatives available, some paid and some free (like Start10 and Start Is Back++), but Classic Shell has an exceptionally broad range of tweaks and customizability.
Education

To Solve the Diversity Drought in Software Engineering, Look to Community Colleges (vice.com) 331

An anonymous reader shares a report: Community college is not flashy and does not make promises about your future employability. You will also likely not learn current way-cool web development technologies like React and GraphQL. In terms of projects, you're more likely to build software for organizing a professor's DVD or textbook collection than you are responsive web apps. I would tell you that all of this is OK because in community college computer science classes you're learning fundamentals, broad concepts like data structures, algorithmic complexity, and object-oriented programming. You won't learn any of those things as deeply as you would in a full-on university computer science program, but you'll get pretty far. And community college is cheap, though that varies depending on where you are. Here in Portland, OR, the local community college network charges $104 per credit. Which means it's possible to get a solid few semesters of computer science coursework down for a couple of grand. Which is actually amazing. In a new piece published in the Communications of the ACM, Silicon Valley researchers Louise Ann Lyon and Jill Denner make the argument that community colleges have the potential to play a key role in increasing equity and inclusion in computer science education. If you haven't heard, software engineering has a diversity problem. Access to education is a huge contributor to that, and Denner and Lyon see community college as something of a solution in plain sight.
Programming

'24 Pull Requests' Suggests Contributing Code For Christmas (24pullrequests.com) 30

An anonymous reader writes: "On December 1st, 24 Pull Requests will be opening its virtual doors once again, asking you to give the gift of a pull request to an open source project in need," writes UK-based software developer Andrew Nesbitt -- noting that last year the site registered more than 16,000 pull requests. "And they're not all by programmers. Often the contribution with the most impact might be an improvement to technical documentation, some tests, or even better -- guidance for other contributors."

This year they're even touting "24 Pull Requests hack events," happening around the world from Lexington, Kentucky to Torino, Italy. (Last year 80 people showed up for an event in London.) "You don't have to hack alone this Christmas!" suggests the site, also inviting local communities and geek meetups (as well as open source-loving companies) to host their own events.

Contributing to open source projects can also beef up your CV (for when you're applying for your next job), the site points out, and "Even small contributions can be really valuable to a project."

"You've been benefiting from the use of open source projects all year. Now is the time to say thanks to the maintainers of those projects, and a little birdy tells me that they love receiving pull requests!"
Encryption

PHP Now Supports Argon2 Next-Generation Password Hashing Algorithm (bleepingcomputer.com) 94

An anonymous reader quotes Bleeping Computer: PHP got a whole lot more secure this week with the release of the 7.2 branch, a version that improves and modernizes the language's support for cryptography and password hashing algorithms.

Of all changes, the most significant is, by far, the support for Argon2, a password hashing algorithm developed in the early 2010s. Back in 2015, Argon2 beat 23 other algorithms to win the Password Hashing Competition, and -- as the reward for winning the contest -- is now in the midst of becoming a universally recognized Internet standard at the Internet Engineering Task Force (IETF). The algorithm is currently considered to be superior to Bcrypt, today's most widely used password hashing function, in terms of both security and cost-effectiveness, and is also slated to become a favorite among cryptocurrencies, as it can also handle proof-of-work operations.
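In PHP 7.2 the new algorithm is exposed through the existing `password_hash()` API via the `PASSWORD_ARGON2I` constant. Argon2 itself isn't in Python's standard library, but the same salt-then-hash-then-verify flow can be sketched with `hashlib.scrypt`, another memory-hard function that is in the stdlib (the cost parameters below are illustrative, not a tuning recommendation):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a memory-hard hash from a password using a fresh random salt."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive with the stored salt and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong guess", salt, digest)
```

The point of memory-hard functions like Argon2 and scrypt is that each guess costs an attacker significant RAM as well as CPU, which blunts GPU and ASIC cracking rigs in a way plain iterated hashing does not.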

The other major change in PHP 7.2 was the removal of the old Mcrypt cryptographic library from the PHP core and the addition of Libsodium, a more modern alternative.

Space

A Programming Error Blasted 19 Russian Satellites Back Towards Earth (arstechnica.com) 90

An anonymous reader quotes Ars Technica's report on Russia's failed attempt to launch 19 satellites into orbit on Tuesday: Instead of boosting its payload, the Soyuz 2.1b rocket's Fregat upper stage fired in the wrong direction, sending the satellites on a suborbital trajectory that burned them up in Earth's atmosphere... According to the normally reliable Russian Space Web, a programming error caused the Fregat upper stage, which is the spacecraft on top of the rocket that deploys satellites, to be unable to orient itself. Specifically, the site reports, the Fregat's flight control system did not have the correct settings for a mission launching from the country's new Vostochny cosmodrome. It evidently was still programmed for Baikonur, or one of Russia's other spaceports capable of launching the workhorse Soyuz vehicle. Essentially, then, after the Fregat vehicle separated from the Soyuz rocket, it was unable to find its correct orientation. Therefore, when the Fregat first fired its engines to boost the satellites into orbit, it was still trying to correct this orientation -- and was in fact aimed downward toward Earth. Though the Fregat space tug has been in operation since the 1990s, this is its fourth failure -- all of which have happened within the last 8 years.
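The failure class described here -- site-specific settings that silently default to the wrong launch pad -- can be illustrated with a toy sketch. All names and azimuth values below are invented for illustration; the real Fregat flight software is not public:

```python
# Hypothetical launch-site-specific guidance settings (values are made up).
LAUNCH_AZIMUTH_DEG = {
    "baikonur": 61.0,
    "plesetsk": 63.0,
    "vostochny": 97.0,
}

def initial_azimuth(site: str = "baikonur") -> float:
    """Return the guidance azimuth; silently defaults to Baikonur."""
    return LAUNCH_AZIMUTH_DEG[site]

# A caller who forgets to pass the new site gets Baikonur's settings
# with no error raised -- the dangerous part of a silent default:
assert initial_azimuth() == LAUNCH_AZIMUTH_DEG["baikonur"]
assert initial_azimuth("vostochny") != initial_azimuth()
```

A stricter design would make `site` a required parameter (or validate it against the actual mission manifest), so that stale configuration fails loudly before launch rather than mid-flight.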

"In each of the cases, the satellite did not reach its desired orbit," reports Ars Technica, adding "As the country's heritage rockets and upper stages continue to age, the concern is that the failure rate will increase."
Education

Computer Science GCSE in Disarray After Tasks Leaked Online (bbc.com) 53

An anonymous reader shares a report: The new computer science GCSE has been thrown into disarray after programming tasks worth a fifth of the total marks were leaked repeatedly online. Exams regulator Ofqual plans to pull this chunk of the qualification from the overall marks as it has been seen by thousands of people. Ofqual said the non-exam assessment may have been leaked by teachers as well as students who had completed the task. The breach affects two year groups. The first will sit the exam in summer 2018. Last year 70,000 students were entered for computer science GCSE. A quick internet search reveals numerous posts about the non-exam assessment, with questions and potential answers.
Programming

Why ESR Hates C++, Respects Java, and Thinks Go (But Not Rust) Will Replace C (ibiblio.org) 608

Open source guru Eric S. Raymond followed up his post on alternatives to C by explaining why he won't touch C++ any more, calling the story "a launch point for a disquisition on the economics of computer-language design, why some truly unfortunate choices got made and baked into our infrastructure, and how we're probably going to fix them." My problem with [C++] is that it piles complexity on complexity upon chrome upon gingerbread in an attempt to address problems that cannot actually be solved because the foundational abstractions are leaky. It's all very well to say "well, don't do that" about things like bare pointers, and for small-scale single-developer projects (like my eqn upgrade) it is realistic to expect the discipline can be enforced. Not so on projects with larger scale or multiple devs at varying skill levels (the case I normally deal with)... C is flawed, but it does have one immensely valuable property that C++ didn't keep -- if you can mentally model the hardware it's running on, you can easily see all the way down. If C++ had actually eliminated C's flaws (that is, been type-safe and memory-safe) giving away that transparency might be a trade worth making. As it is, nope.
He calls Java a better attempt at fixing C's leaky abstractions, but believes it "left a huge hole in the options for systems programming that wouldn't be properly addressed for another 15 years, until Rust and Go." He delves into a history of programming languages, touching on Lisp, Python, and programmer-centric languages (versus machine-centric languages), identifying one of the biggest differentiators as "the presence or absence of automatic memory management." Falling machine-resource costs led to the rise of scripting languages and Node.js, but Raymond still sees Rust and Go as a response to the increasing scale of projects.
Eventually we will have garbage collection techniques with low enough latency overhead to be usable in kernels and low-level firmware, and those will ship in language implementations. Those are the languages that will truly end C's long reign. There are broad hints in the working papers from the Go development group that they're headed in this direction... Sorry, Rustaceans -- you've got a plausible future in kernels and deep firmware, but too many strikes against you to beat Go over most of C's range. No garbage collection, plus Rust is a harder transition from C because of the borrow checker, plus the standardized part of the API is still seriously incomplete (where's my select(2), again?).

The only consolation you get, if it is one, is that the C++ fans are screwed worse than you are. At least Rust has a real prospect of dramatically lowering downstream defect rates relative to C anywhere it's not crowded out by Go; C++ doesn't have that.

Education

Why Do Employers Require College Degrees That Aren't Necessary? (thestreet.com) 358

Slashdot reader pefisher writes: A lot of us on Slashdot have noticed that potential employers advertise for things they don't need. To the point that sometimes they even ask for things that don't exist. Like asking for ten years of experience in a technology that has only just been introduced. It's frustrating because it makes you wonder "what's this employer's real game?"

Do they just want to say they advertised for the position, or are they really so immensely stupid, so disconnected from their own needs, that they think they are actually asking for something they can have...? Here is a Harvard Study that addresses one particular angle of this. It doesn't answer any questions, but it does prove that you aren't crazy. And it quantifies the craziness.

The study's author calls it "degree inflation," and after studying 26 million job postings concluded that employers are now less willing to actually train new people on the job, possibly to save money. "Many companies have fallen into a lazy way of thinking about this," the study's author tells The Street, saying companies are "[looking for] somebody who is just job-ready to just show up." The irony is that college graduates will ultimately be paid a higher salary -- even though for many jobs, the study found that a college degree yields zero improvement in actual performance.

The Street reports that "In a market where companies increasingly rely on computerized systems to cull out early-round applicants, that has led firms to often consider a bachelor's degree indicative of someone who can socialize, run a meeting and generally work well with others." One company tells them that "we removed the requirement to have a computer science degree, and we removed the requirement to have experience in development computer programming. And when we removed those things we found that the pool of potential really good team members drastically expanded."
Programming

More Than Half of GitHub Is Duplicate Code, Researchers Find (theregister.co.uk) 115

Richard Chirgwin, writing for The Register: Given that code sharing is a big part of the GitHub mission, it should come as no surprise that the platform stores a lot of duplicated code: 70 per cent, a study has found. An international team of eight researchers didn't set out to measure GitHub duplication. Their original aim was to try to define the "granularity" of copying -- that is, how much files changed between different clones -- but along the way, they turned up a "staggering rate of file-level duplication" that made them change direction. Presented at this year's OOPSLA conference (part of the Association for Computing Machinery's SPLASH conference, held in late October in Vancouver), the University of California, Irvine-led research found that out of 428 million files on GitHub, only 85 million are unique. Before readers say "so what?", the reason for this study was to improve other researchers' work. Anybody studying software using GitHub probably seeks random samples, and the authors of this study argued duplication needs to be taken into account.
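The core measurement -- counting files that are byte-for-byte identical -- can be sketched by hashing file contents. This is a simplified stand-in for the researchers' actual tooling, not a reconstruction of it:

```python
import hashlib
import tempfile
from pathlib import Path

def duplication_rate(root: str) -> float:
    """Share of files under root that duplicate another file's exact contents
    (total files minus unique contents, divided by total files)."""
    counts: dict[str, int] = {}
    files = [p for p in Path(root).rglob("*") if p.is_file()]
    for path in files:
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        counts[digest] = counts.get(digest, 0) + 1
    return 1.0 - len(counts) / len(files) if files else 0.0

# Tiny demo: three files, two of them identical copies.
demo = tempfile.mkdtemp()
Path(demo, "a.txt").write_bytes(b"print('hi')")
Path(demo, "b.txt").write_bytes(b"print('hi')")   # exact duplicate of a.txt
Path(demo, "c.txt").write_bytes(b"print('bye')")
print(round(duplication_rate(demo), 2))  # prints 0.33
```

Applied to the study's headline numbers, 428 million files with 85 million unique contents gives a duplication rate of roughly 80 per cent at the file level; the article's 70 per cent figure is The Register's summary of the overall finding.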
Software

Google Is Working On Fuchsia OS Support For Apple's Swift Programming Language (androidpolice.com) 54

An anonymous reader shares a report from Android Police: Google's in-development operating system, named "Fuchsia," first appeared over a year ago. It's quite different from Android and Chrome OS, as it runs on top of the real-time "Magenta" kernel instead of Linux. According to recent code commits, Google is working on Fuchsia OS support for the Swift programming language. If you're not familiar with it, Swift is a programming language developed by Apple, which can be used to create iOS/macOS/tvOS/watchOS applications (it can also compile to Linux). Apple calls it "Objective-C without the C," and on the company's own platforms, it can be mixed with existing C/Objective-C/C++ code (similar to how apps on Android can use both Kotlin and Java in the same codebase). We already know that Fuchsia will support apps written in Dart, a C-like language developed by Google, but it looks like Swift could also be supported. On Swift's GitHub repository, a pull request was created by a Google employee that adds Fuchsia OS support to the compiler. At the time of writing, there are discussions about splitting it into several smaller pull requests to make reviewing the code changes easier.
AI

Deep Learning Is Eating Software (petewarden.com) 147

Pete Warden, engineer and CTO of Jetpac, shares his view on how deep learning is already starting to change how some programming is done. From a blog post, shared by a reader last week: The pattern is that there's an existing software project doing data processing using explicit programming logic, and the team charged with maintaining it find they can replace it with a deep-learning-based solution. I can only point to examples within Alphabet that we've made public, like upgrading search ranking, data center energy usage, language translation, and solving Go, but these aren't rare exceptions internally. What I see is that almost any data processing system with non-trivial logic can be improved significantly by applying modern machine learning. This might sound less than dramatic when put in those terms, but it's a radical change in how we build software. Instead of writing and maintaining intricate, layered tangles of logic, the developer has to become a teacher, a curator of training data and an analyst of results. This is very, very different than the programming I was taught in school, but what gets me most excited is that it should be far more accessible than traditional coding, once the tooling catches up. The essence of the process is providing a lot of examples of inputs, and what you expect for the outputs. This doesn't require the same technical skills as traditional programming, but it does need a deep knowledge of the problem domain. That means motivated users of the software will be able to play much more of a direct role in building it than has ever been possible. In essence, the users are writing their own user stories and feeding them into the machinery to build what they want.
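The shift Warden describes -- supplying input/output examples instead of explicit logic -- can be seen in miniature even without a deep-learning framework. Below, a toy model learns the rule y = 2x + 1 purely from example pairs via gradient descent; it is a deliberately simplified stand-in for a real training pipeline, not anything Warden's post contains:

```python
# Instead of hand-coding the rule y = 2x + 1, "teach" it from examples.
examples = [(x, 2 * x + 1) for x in range(-5, 6)]  # inputs and expected outputs

w, b = 0.0, 0.0          # model parameters, learned rather than programmed
lr = 0.01                # learning rate
for _ in range(2000):    # gradient descent on mean squared error
    grad_w = sum(2 * (w * x + b - y) * x for x, y in examples) / len(examples)
    grad_b = sum(2 * (w * x + b - y) for x, y in examples) / len(examples)
    w -= lr * grad_w
    b -= lr * grad_b

# The parameters converge to the rule implied by the examples.
assert abs(w - 2.0) < 0.01 and abs(b - 1.0) < 0.01
```

The "program" here is the example list plus a generic training loop; changing the desired behavior means editing the examples, not the logic -- which is exactly the curator-of-training-data role Warden describes, scaled down to two parameters.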
Chrome

Is Firefox 57 Faster Than Chrome? (mashable.com) 234

An anonymous reader quotes TechNewsWorld: Firefox is not only fast on startup -- it remains zippy even when taxed by multitudes of tabs. "We have a better balance of memory to performance than all the other browsers," said Firefox Vice President for Product Nick Nguyen. "We use 30 percent less memory, and the reason for that is we can allocate the number of processes Firefox uses on your computer based on the hardware that you have," he told TechNewsWorld. The performance improvements in Quantum could be a drink from the fountain of youth for many Firefox users' systems. "A significant number of our users are on machines that are two cores or less, and less than 4 gigabytes of RAM," Nguyen explained.
Mashable ran JetStream 1.1 tests on the ability to run advanced web applications, and concluded that "Firefox comes out on top, but not by much. This means it's, according to JetStream, slightly better suited for 'advanced workloads and programming techniques.'" Firefox also performed better on "real-world speed tests" on Amazon.com and the New York Times' site, while Chrome performed better on National Geographic, CNN, and Mashable. Unfortunately for Mozilla, Chrome looks like it's keeping the top spot, at least for now. The only test that favors Quantum is JetStream, and that's by a hair. And in Ares-6 [which measures how quickly a browser can run new Javascript functions, including mathematical functions], Quantum gets eviscerated... Speedometer simulates user actions on web applications (specifically, adding items to a to-do list) and measures the time they take... When it comes to user interactions in web applications, Chrome takes the day...

In reality, however, Quantum is no slug. It's a capable, fast, and gorgeous browser with innovative bookmark functionality and a library full of creative add-ons. As Mozilla's developers fine-tune Quantum in the coming months, it's possible it could catch up to Chrome. In the meantime, the differences in page-load time are slight at best; you probably won't notice the difference.

Open Source

Proprietary Software is the Driver of Unprecedented Surveillance: Richard Stallman (factor-tech.com) 197

From a wide-ranging interview of Richard Stallman, president of the Free Software Foundation, programming legend and recipient of at least 15 honorary doctorates and professorships: "The reason that we are subject now to more surveillance than there was in the Soviet Union is that digital technology made it possible," he says. "And the first disaster of digital technology was proprietary software that people would install and run on their own computers, and they wouldn't know what it was doing. They can't tell what it's doing. And that is the first injustice that I began fighting in 1983: proprietary software, software that is not free, that the users don't control." Here, Stallman is keen to stress, he doesn't mean free in the sense of not costing money -- plenty of free software is paid for -- but free in the sense of freedom to control. Software, after all, instructs your computer to perform actions, and when another company has written and locked down that software, you can't know exactly what it is doing. "You might think your computer is obeying you, when really it's obeying the real master first, and it only obeys you when the real master says it's ok. With every program there are two possibilities: either the user controls the program or the program controls the users," he says. "It's free software if users control it. And that's why it respects their freedom. Otherwise it's a non-free, proprietary, user subjugating program."
Businesses

Amazon Developing a Free, Ad-Supported Version of Prime Video: Report (adage.com) 74

Amazon is developing a free, ad-supported complement to its Prime streaming video service, AdAge reported on Monday, citing people familiar with Amazon's plans. From the report: The company is talking with TV networks, movie studios and other media companies about providing programming to the service, they say. Amazon Prime subscribers pay $99 per year for free shipping but also access to a mix of ad-free TV shows, movies and original series such as "Transparent" and "The Man in the High Castle." It has dabbled in commercials on Prime to a very limited degree, putting ads inside National Football League games this season and offering smaller opportunities for brand integrations. A version paid for by advertisers instead of subscribers could provide a new foothold in streaming video for marketers, whose opportunities to run commercials are eroding as audiences drift away from traditional TV and toward ad-free services like Netflix and Prime.
Programming

ESR Sees Three Viable Alternatives To C (ibiblio.org) 595

An anonymous reader writes: After 35 years of programming in C, Eric S. Raymond believes that we're finally seeing viable alternatives to the language. "We went thirty years -- most of my time in the field -- without any plausible C successor, nor any real vision of what a post-C technology platform for systems programming might look like. Now we have two such visions...and there is another."

"I have a friend working on a language he calls 'Cx' which is C with minimal changes for type safety; the goal of his project is explicitly to produce a code lifter that, with minimal human assistance, can pull up legacy C codebases. I won't name him so he doesn't get stuck in a situation where he might be overpromising, but the approach looks sound to me and I'm trying to get him more funding. So, now I can see three plausible paths out of C. Two years ago I couldn't see any. I repeat: this is huge... Go, or Rust, or Cx -- any way you slice it, C's hold is slipping."

Raymond's essay also includes a fascinating look back at the history of programming languages after 1982, when the major compiled languages (FORTRAN, Pascal, and COBOL) "were either confined to legacy code, retreated to single-platform fortresses, or simply ran on inertia under increasing pressure from C around the edges of their domains.

"Then it stayed that way for nearly thirty years."

Slashdot Top Deals