AI

AI Software Juggles Probabilities To Learn From Less Data (technologyreview.com) 49

moon_unit2 quotes a report from MIT Technology Review: You can, for instance, train a deep-learning algorithm to recognize a cat with a cat-fancier's level of expertise, but you'll need to feed it tens or even hundreds of thousands of images of felines, capturing a huge amount of variation in size, shape, texture, lighting, and orientation. It would be a lot more efficient if, a bit like a person, an algorithm could develop an idea about what makes a cat a cat from fewer examples. A Boston-based startup called Gamalon has developed technology that lets computers do this in some situations, and it is releasing two products Tuesday based on the approach. Gamalon uses a technique that it calls Bayesian program synthesis to build algorithms capable of learning from fewer examples. Bayesian probability, named after the 18th-century mathematician Thomas Bayes, provides a mathematical framework for refining predictions about the world based on experience. Gamalon's system uses probabilistic programming -- code that deals in probabilities rather than specific variables -- to build a predictive model that explains a particular data set. From just a few examples, a probabilistic program can determine, for instance, that it's highly probable that cats have ears, whiskers, and tails. As further examples are provided, the code behind the model is rewritten and the probabilities tweaked. This provides an efficient way to learn the salient knowledge from the data.
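Gamalon hasn't published its code, but the core Bayesian mechanism the summary describes -- a probability estimate that sharpens as examples arrive -- can be sketched with a minimal Beta-Bernoulli update in Python. This is a drastic simplification of probabilistic programming; the function name and the numbers are illustrative only:

```python
from fractions import Fraction

def beta_update(successes, failures, prior_a=1, prior_b=1):
    """Posterior mean of a Beta-Bernoulli model: starts at the prior
    and is pulled toward the observed frequency as examples arrive."""
    return Fraction(prior_a + successes,
                    prior_a + prior_b + successes + failures)

# With no examples, the model is maximally uncertain.
print(beta_update(0, 0))   # 1/2

# After seeing just 4 cats, all with whiskers, "cats have whiskers"
# is already judged highly probable -- no thousands of images needed.
print(beta_update(4, 0))   # 5/6
```

Each new observation rewrites the posterior rather than requiring retraining from scratch, which is the intuition behind learning salient structure from few examples.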
Databases

Story Of a Country Which Has Built a Centralized Biometrics Database Of 1.1B People But Appears To Be Mishandling It Now (mashable.com) 60

In a bid to get more Indians a birth certificate or some form of ID card, India announced the Aadhaar project in 2009. At the time, more Indians lacked such ID cards than had them, and as a result much of the government funding intended for citizens disappeared before it reached them. But according to several security experts, lawyers, politicians, and journalists, the government's poor security practices are putting people's biometric data -- photo, name, address, fingerprint, and iris information -- at risk. More than 1.1 billion people -- 99 percent of all adults in India -- have enrolled in the system. From a report: "There are two fundamental flaws in Aadhaar: it is poorly designed, and it is being poorly verified," Member of Parliament and privacy advocate Rajeev Chandrasekhar told Mashable India. Another issue with Aadhaar, Chandrasekhar explains, is that there is no firm legislation to safeguard the privacy and rights of the billion people who have enrolled in the system; there is little a person whose Aadhaar data has been compromised can do. [...] "Aadhaar is remote, covert, and non-consensual," Abraham told Mashable India, adding that the existence of a central database of any kind -- but especially one like Aadhaar, at the scale it operates -- is appalling. Abraham said a person's fingerprint and iris data can be stolen with little effort: a "gummy bear," which sells for a few cents, can store a fingerprint, while a high-resolution camera can capture iris data. The report goes on to say that the Indian government is also not disclosing how the data is shared with private companies. Experts cited in the story worry that those companies (some of which are run by people who previously helped design Aadhaar's framework) can store the data and build a parallel database of their own.
On top of that, the government is making Aadhaar mandatory for a growing list of services, including registration for nationwide examinations, even though at the outset it promised Aadhaar would only be used to help the poor buy groceries at subsidized prices.
Programming

H-1Bs Reduced Computer Programmer Employment By Up To 11%, Study Finds (marketwatch.com) 271

An anonymous reader quotes a report from MarketWatch: There would have been up to 11% more computer science jobs, at wages up to 5% higher, were it not for the immigration program that brings in foreign high-skilled employees, a new study finds. The paper -- by John Bound and Nicolas Morales of the University of Michigan and Gaurav Khanna of the University of California, San Diego -- studied the economy between 1994 and 2001, during the internet boom. It was also a period when recruitment of so-called H-1B labor was at or close to the cap, and one that largely preceded the rise of India's vibrant IT sector. In 2001, the number of U.S. computer scientists was between 6.1% and 10.8% lower, and wages were between 2.6% and 5.1% lower, than they would have been without the program. Of course, there were also beneficiaries -- namely consumers and employers. Immigration lowered prices by between 1.9% and 2.4%, and profits increased, as did the total number of IT firms.
Microsoft

Microsoft's Open-Source Graph Engine Takes On Neo4j (infoworld.com) 17

An anonymous reader quotes a report from InfoWorld: Sometimes the relationships between the data you've gathered are more important than the data itself. That's when a graph processing system comes in handy. It's an important but often poorly understood method for exploring how items in a data set are interrelated. Microsoft's been exploring this area since at least 2013, when it published a paper describing the Trinity project, a cloud-based, in-memory graph engine. The fruits of the effort, known as the Microsoft Graph Engine, are now available as an MIT-licensed open source project as an alternative to the likes of Neo4j or the Linux Foundation's recently announced JanusGraph. Microsoft describes Graph Engine (GE) as "both a RAM store and a computation engine." Data can be inserted into GE and retrieved at high speed since it's kept in memory and only written back to disk as needed. It can work as a simple key-value store like Memcached, but Redis may be the better comparison, since GE stores data in strongly typed schemas (string, integer, and so on). How does all this shape up against the leading open source graph database, Neo4j? For one, Neo4j has been in the market longer and has an existing user base. It's also available in both an open source community edition and a commercial product, whereas GE is only an open source project right now.
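GE's actual API is .NET-based and is not shown here; purely to illustrate the difference between a schemaless cache like Memcached and an in-memory store with strongly typed schemas, here is a minimal Python sketch (all names are hypothetical, not GE's API):

```python
from dataclasses import dataclass

@dataclass
class Person:
    """A schema-typed record: every stored value must carry
    exactly these fields with these types."""
    name: str
    age: int

class TypedStore:
    """In-memory key-value store that enforces a schema per value,
    unlike a schemaless byte-blob cache such as Memcached."""
    def __init__(self, schema):
        self.schema = schema
        self.data = {}

    def put(self, key, value):
        if not isinstance(value, self.schema):
            raise TypeError(f"expected {self.schema.__name__}")
        self.data[key] = value

    def get(self, key):
        return self.data[key]

store = TypedStore(Person)
store.put(1, Person(name="alice", age=30))
print(store.get(1).name)   # alice
```

Because values keep their structure in memory, fields can be read directly (`store.get(1).age`) instead of deserializing an opaque blob on every access -- the property that makes a typed RAM store usable as a computation engine.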
Chrome

Chrome's Sandbox Feature Infringes On Three Patents So Google Must Now Pay $20 Million (bleepingcomputer.com) 104

An anonymous reader writes: After five years of litigation at various levels of the U.S. legal system, a jury has ordered Google to pay $20 million to two developers, ruling that Google infringed on three patents when it designed Chrome's sandboxing feature. Litigation had been going on since 2012, with Google winning the original verdict but then losing on appeal. After the Supreme Court declined to hear Google's petition, the case was sent back for a retrial in the U.S. District Court for the Eastern District of Texas, a favorite venue of patent trolls. As expected, Google lost the retrial and must now pay $20 million in damages in the form of rolling royalties, meaning the company stands to pay more as Chrome grows more popular in the future.
Businesses

Story of Two Developers Who Are Reporting Growth in Revenue After Leaving Apple's App Store (techcrunch.com) 65

John Biggs, writing for TechCrunch: In what amounts to one of the purest and most interesting experiments in assessing the value of the Mac App Store, Rogue Amoeba founder Paul Kafasis posted a description of what happened when he pulled his app Piezo from the store. The result? More revenue overall, without much damage to sales. The impetus for the move came after Apple pulled the Dash app off the App Store. In the 100-day period since that move, Dash maintained and even increased its revenue, and its developer found that users didn't care which distribution channel they were using -- 84% of customers simply moved from the App Store license to the independent app license. The bottom line? "It feels great to have full control over my business and to avoid App Store installation/updating/purchasing issues," wrote Dash creator Bogdan Popescu. When Kafasis decided to move away from the App Store, he worried he'd lose half his sales; after all, in many months about 50% of sales came from the App Store directly. When he pulled the app a year ago, however, all of those App Store sales turned into direct sales through his website, a fact that surprised and amused Kafasis.
Programming

GitHub Commits Reveal The Top 'Weekend Programming' Languages (medium.com) 149

An anonymous reader writes: Google "developer advocate" Felipe Hoffa has determined the top "weekend programming languages," those which see the biggest spike in commit activity on the weekends. "Clearly 2016 was a year dedicated to play with functional languages, up and coming paradigms, and scripting 3d worlds," he writes, revealing that the top weekend programming languages are:

Rust, GLSL, D, Haskell, Common Lisp, KiCad, Emacs Lisp, Lua, Scheme, Julia, Elm, Eagle, Racket, Dart, NSIS, Clojure, Kotlin, Elixir, F#, OCaml

Earlier this week another data scientist ended up with an entirely different list by counting the frequency of each language's tag in StackOverflow questions. Hoffa's analysis, by contrast, was performed using Google's BigQuery web service, and he's also compiled a list of 2016's least popular weekend languages -- the ones people seem to prefer using at the office rather than in their own free time.

Nginx, MATLAB, Processing, Vue, Fortran, Visual Basic, Objective-C++, PL/SQL, PL/pgSQL, Web Ontology Language, Smarty, Groovy, Batchfile, Objective-C, PowerShell, XSLT, Cucumber, HCL, Puppet, GCC Machine Description

What's most interesting are the changes over time. In the last year Perl has become more popular than Java, PHP, and ASP as a weekend programming language. And Rust "used to be a weekday language," Hoffa writes, but it soon grew more popular on Saturdays and Sundays as well. Meanwhile, "The more popular Go grows, the more it settles as a weekday language," while Puppet "is the champion of weekday coders." Ruby, on the other hand, is "slowly leaving the week and embracing the weekend."

Hoffa is also a long-time Slashdot reader who analyzed one billion files on GitHub last summer to determine whether they'd been indented with spaces or tabs. But does this new list resonate with anybody? What languages are you using for your weekend coding projects?
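Hoffa's ranking came from BigQuery SQL over GitHub's public commit data, but the underlying computation -- each language's share of commits that land on a weekend -- is simple enough to sketch in a few lines of Python (the sample commits below are made up, not real GitHub data):

```python
from collections import defaultdict
from datetime import date

# (language, commit date) pairs -- illustrative sample only.
commits = [
    ("Rust", date(2016, 7, 2)),   # Saturday
    ("Rust", date(2016, 7, 3)),   # Sunday
    ("Rust", date(2016, 7, 4)),   # Monday
    ("Go",   date(2016, 7, 4)),   # Monday
    ("Go",   date(2016, 7, 5)),   # Tuesday
]

totals = defaultdict(int)
weekend = defaultdict(int)
for lang, day in commits:
    totals[lang] += 1
    if day.weekday() >= 5:        # weekday(): 5 = Saturday, 6 = Sunday
        weekend[lang] += 1

# Weekend share per language, highest first.
shares = {lang: weekend[lang] / totals[lang] for lang in totals}
ranking = sorted(shares, key=shares.get, reverse=True)
print(ranking)   # ['Rust', 'Go'] -- Rust skews toward the weekend here
```

The same grouping over billions of real commits is what separates "weekend languages" like Rust from "weekday languages" like Puppet.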
Android

Oracle Refuses To Accept Android's 'Fair Use' Verdict, Files Appeal (wsj.com) 155

An anonymous reader quotes the Wall Street Journal: The seven-year legal battle between tech giants Google and Oracle just got new life. Oracle on Friday filed an appeal with the U.S. Court of Appeals for the Federal Circuit that seeks to overturn a federal jury's decision last year... The case has now gone through two federal trials and bounced around at appeals courts, including a brief stop at the U.S. Supreme Court. Oracle has sought as much as $9 billion in the case.

In the trial last year in San Francisco, the jury ruled Google's use of 11,000 lines of Java code was allowed under "fair use" provisions in federal copyright law. In Oracle's 155-page appeal on Friday, it called Google's "copying...classic unfair use" and said "Google reaped billions of dollars while leaving Oracle's Java business in tatters."

Oracle's brief also argues that "When a plagiarist takes the most recognizable portions of a novel and adapts them into a film, the plagiarist commits the 'classic' unfair use."
Software

Valve Is Shutting Down Steam's Greenlight Community Voting System (theverge.com) 99

Valve's crowdsourced Greenlight submission program, which let the gaming community select which games get chosen for distribution via Steam, is shutting down after nearly five years. It will be replaced with a new system called Steam Direct that will charge developers a fee for each title they plan to distribute. The Verge reports: Steam Greenlight was launched in 2012 as a way for indie developers to get their games on Steam, even if they weren't working with a big publisher that had a relationship with Valve. Steam users would vote on Greenlight games, and Valve would accept titles with enough support to suggest that they'd sell well. Valve says that "over 100" Greenlight titles have made $1 million or more. But Greenlight has also had significant problems. Developers could game the system by offering rewards for votes, and worthy projects could get lost amidst a slew of bad proposals. Since Valve ultimately made the call on including games, the process could also seem arbitrary and opaque. The big question is whether what's replacing it is better. To get a game on Steam Direct, developers will need to "complete a set of digital paperwork, personal or company verification, and tax documents similar to the process of applying for a bank account." Then, they'll pay an application fee for each game, "which is intended to decrease the noise in the submission pipeline" -- a polite way of saying that it will make people think twice before spending money submitting a low-quality game. Steam Direct is supposed to launch in spring of 2017, but the application fee hasn't been decided yet. Developer feedback has apparently suggested anywhere from $100 -- the current Greenlight submission fee -- to $5,000.
Programming

Slashdot Asks: How Do You Know a Developer is Doing a Good Job? 229

An anonymous reader writes: One of the easiest ways to evaluate a developer is to keep tabs on the amount of value they provide to the business. The problem with this approach is that the nature of software development makes it hard to measure the value a single developer brings. Some managers are aware of this, so they look instead at the number of lines of code a developer has written -- the fewer, the better, many believe. I recently came across this in a blog post: "If you paid your developers per line of code, you would reward the inefficient developers. An analogy to this is writing essays, novels, blog posts, etc. Would you judge a writer solely on the number of words written? Probably not. There are a minimum number of words needed to get a complex point across, but those points get lost when a writer clutters their work with useless sentences. So the lines of code metric doesn't work. The notion of a quantifiable metric for evaluating developers is still attractive though. Some may argue that creating many code branches is the mark of a great developer. Yet I once worked with a developer who would create code branches to hide the fact that he wasn't very productive." Good point. But then, what other options do we have?
Books

The Most Mentioned Books On StackOverflow (dev-books.com) 92

An anonymous reader writes: People over at DevBooks have analyzed more than four million questions and answers on StackOverflow to rank the most-mentioned books. You can check out the full list for yourself here, but here are the top 10: Working Effectively with Legacy Code by Michael C. Feathers; Design Patterns by Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides; Clean Code by Robert C. Martin; Java Concurrency in Practice by Brian Goetz and Tim Peierls; Domain-Driven Design by Eric Evans; JavaScript: The Good Parts by Douglas Crockford; Patterns of Enterprise Application Architecture by Martin Fowler; Code Complete by Steve McConnell; Refactoring by Martin Fowler and Kent Beck; Head First Design Patterns by Eric Freeman, Elisabeth Freeman, Kathy Sierra, and Bert Bates.
Programming

Goldman Sachs Automated Trading Replaces 600 Traders With 200 Engineers (technologyreview.com) 185

Goldman Sachs's New York headquarters has replaced 600 of its traders with 200 computer engineers over the last two decades or so, thanks to automated trading programs. (The effort has accelerated over the past five years.) "Marty Chavez, the company's deputy chief financial officer and former chief information officer, explained all this to attendees at a symposium on computers' impact on economic activity held by Harvard's Institute for Applied Computational Science last month," reports MIT Technology Review. From the report: The experience of its New York traders is just one early example of a transformation of Goldman Sachs, and increasingly other Wall Street firms, that began with the rise in computerized trading but has accelerated over the past five years, moving into more fields of finance that humans once dominated. Chavez, who will become chief financial officer in April, says areas of trading like currencies and even parts of business lines like investment banking are moving in the same automated direction that equities have already traveled. Today, nearly 45 percent of trading is done electronically, according to Coalition, a U.K. firm that tracks the industry. On Wall Street, machines are replacing not just back-office clerical workers but a lot of highly paid people, too. Complex trading algorithms, some with machine-learning capabilities, first replaced trades where the price of what's being sold was easy to determine on the market, including the stocks traded by Goldman's old 600. Now areas of trading like currencies and futures, which are not traded on an exchange like the New York Stock Exchange and whose prices fluctuate, are coming in for more automation as well. To execute these trades, algorithms are being designed to emulate as closely as possible what a human trader would do, explains Coalition's Shahani.
Goldman Sachs has already begun to automate currency trading, and has found consistently that four traders can be replaced by one computer engineer, Chavez said at the Harvard conference. Some 9,000 people, about one-third of Goldman's staff, are computer engineers.
Microsoft

Microsoft Debuts Customizable Speech-To-Text Tech, Releases Some Cognitive Services Tools To Developers (geekwire.com) 23

Microsoft is readying three of its 25 Cognitive Services tools for wider release to developers. From a report on GeekWire: Microsoft's AI and Research Group, a major new engineering and research division formed last year inside the Redmond company, is debuting a new technology that lets developers customize Microsoft's speech-to-text engine for use in their own apps and online services. The new Custom Speech Service is set for release today as a public preview. Microsoft says it lets developers upload a unique vocabulary -- such as alien names in Human Interact's VR game Starship Commander -- to produce a sophisticated language model for recognizing voice commands and other speech from users. It's the latest in a series of "cognitive services" from Microsoft's Artificial Intelligence and Research Group, a 5,000-person division led by Microsoft Research chief Harry Shum. The company says it has expanded from four to 25 cognitive services in the last two years, including 19 in preview and six that are generally available.
Java

Ask Slashdot: How To Get Started With Programming? [2017 Edition] 312

Reader joshtops writes: I know this is a question that must have been asked -- and answered -- on Slashdot several times, but I am hoping to hear from the community again (fresh perspective, if you will). I'm in my 20s, and have a day job that doesn't require any programming skills. But I want to learn programming nonetheless. I have done some research, but people have varied opinions. Essentially my question is: What is the best way to learn programming for my use case? I am looking for the best possible resources -- tutorials on the internet, the right books, and the order in which I should read or watch them. Some people have advised me to start with the C language, but I was wondering if I could kickstart things with another language, such as Apple's Swift, instead.
Open Source

How Open Sourcing Made Apache Kafka A Dominant Streaming Platform (techrepublic.com) 48

Open sourced in 2010, the Apache Kafka distributed streaming platform is now used at more than a third of Fortune 500 companies (as well as seven of the world's top 10 banks). An anonymous reader writes: Co-creator Neha Narkhede says "We saw the need for a distributed architecture with microservices that we could scale quickly and robustly. The legacy systems couldn't help us anymore." In a new interview with TechRepublic, Narkhede explains that while working at LinkedIn, "We had the vision of building the entire company's business logic as stream processors that express transformations on streams of data... [T]hough Kafka started off as a very scalable messaging system, it grew to complete our vision of being a distributed streaming platform."

Narkhede became the CTO and co-founder of Confluent, which supports enterprise installations of Kafka, and now says that being open source "helps you build a pipeline for your product and reduce the cost of sales... [T]he developer is the new decision maker. If the product experience is tailored to ensure that the developers are successful and the technology plays a critical role in your business, you have the foundational pieces of building a growing and profitable business around an open-source technology... Kafka is used as the source-of-truth pipeline carrying critical data that businesses rely on for real-time decision-making."
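Kafka itself needs a running broker, but the core abstraction Narkhede describes -- expressing business logic as transformations on streams of records -- can be sketched broker-free in Python (all names here are illustrative, not Kafka's API):

```python
def stream_processor(transform):
    """Wrap a per-record transformation so it can be applied lazily to
    any iterable stream of records, the way a stream-processing
    topology applies logic to each message flowing through a topic."""
    def run(stream):
        for record in stream:
            out = transform(record)
            if out is not None:      # returning None filters the record out
                yield out
    return run

# A processor that normalizes page-view events and drops bot traffic.
clean_views = stream_processor(
    lambda e: {"user": e["user"].lower()} if not e.get("bot") else None
)

events = [{"user": "Alice"}, {"user": "Eve", "bot": True}, {"user": "Bob"}]
print(list(clean_views(events)))
# [{'user': 'alice'}, {'user': 'bob'}]
```

Chaining such processors over a durable, replayable log of records is the "distributed streaming platform" vision: the pipeline becomes the source of truth that downstream systems consume in real time.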
