Programming

GitHub Commits Reveal The Top 'Weekend Programming' Languages (medium.com) 149

An anonymous reader writes: Google "developer advocate" Felipe Hoffa has determined the top "weekend programming languages," those which see the biggest spike in commit activity on the weekends. "Clearly 2016 was a year dedicated to play with functional languages, up and coming paradigms, and scripting 3d worlds," he writes, revealing that the top weekend programming languages are:

Rust, Glsl, D, Haskell, Common Lisp, Kicad, Emacs Lisp, Lua, Scheme, Julia, Elm, Eagle, Racket, Dart, Nsis, Clojure, Kotlin, Elixir, F#, Ocaml

Earlier this week another data scientist ended up with an entirely different list by counting the frequency of each language's tag in StackOverflow questions. But Hoffa's analysis was performed using Google's BigQuery web service, and he's also compiled a list of 2016's least popular weekend languages -- the ones people seem to prefer using at the office rather than in their own free time.

Nginx, Matlab, Processing, Vue, Fortran, Visual Basic, Objective-C++, Plsql, Plpgsql, Web Ontology Language, Smarty, Groovy, Batchfile, Objective-C, Powershell, Xslt, Cucumber, Hcl, Puppet, Gcc Machine Description

What's most interesting are the changes over time. In the last year Perl has become more popular than Java, PHP, and ASP as a weekend programming language. And Rust "used to be a weekday language," Hoffa writes, but it has since grown more popular for Saturdays and Sundays. Meanwhile, "The more popular Go grows, the more it settles as a weekday language," while Puppet "is the champion of weekday coders." Ruby, on the other hand, is "slowly leaving the week and embracing the weekend."

Hoffa is also a long-time Slashdot reader who analyzed one billion files on GitHub last summer to determine whether they'd been indented with spaces or tabs. But does this new list resonate with anybody? What languages are you using for your weekend coding projects?
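Hoffa's ranking boils down to comparing, per language, the share of commits that land on a Saturday or Sunday. A minimal sketch of that calculation (the sample data and function below are illustrative; Hoffa's actual analysis ran as SQL over BigQuery's GitHub dataset):

```python
from datetime import datetime

def weekend_ratio(commit_timestamps):
    """Fraction of commits falling on Saturday (weekday 5) or Sunday (6)."""
    weekend = sum(1 for ts in commit_timestamps if ts.weekday() >= 5)
    return weekend / len(commit_timestamps)

# Illustrative commit dates for two hypothetical languages.
commits = {
    "rust": [datetime(2016, 1, 2), datetime(2016, 1, 3), datetime(2016, 1, 4)],
    "matlab": [datetime(2016, 1, 4), datetime(2016, 1, 5), datetime(2016, 1, 6)],
}

# Rank languages by weekend share, highest first.
ranked = sorted(commits, key=lambda lang: weekend_ratio(commits[lang]), reverse=True)
```

In this toy data, Rust's commits cluster on the weekend and MATLAB's on weekdays, so Rust sorts first -- the same shape of result Hoffa reports at GitHub scale.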
Android

Oracle Refuses To Accept Android's 'Fair Use' Verdict, Files Appeal (wsj.com) 155

An anonymous reader quotes the Wall Street Journal: The seven-year legal battle between tech giants Google and Oracle just got new life. Oracle on Friday filed an appeal with the U.S. Court of Appeals for the Federal Circuit that seeks to overturn a federal jury's decision last year... The case has now gone through two federal trials and bounced around at appeals courts, including a brief stop at the U.S. Supreme Court. Oracle has sought as much as $9 billion in the case.

In the trial last year in San Francisco, the jury ruled Google's use of 11,000 lines of Java code was allowed under "fair use" provisions in federal copyright law. In Oracle's 155-page appeal on Friday, it called Google's "copying...classic unfair use" and said "Google reaped billions of dollars while leaving Oracle's Java business in tatters."

Oracle's brief also argues that "When a plagiarist takes the most recognizable portions of a novel and adapts them into a film, the plagiarist commits the 'classic' unfair use."
Software

Valve Is Shutting Down Steam's Greenlight Community Voting System (theverge.com) 99

Valve's crowdsourced Greenlight submission program, which let the gaming community select which games get chosen for distribution via Steam, is shutting down after nearly five years. It will be replaced with a new system called Steam Direct that will charge developers a fee for each title they plan to distribute. The Verge reports: Steam Greenlight was launched in 2012 as a way for indie developers to get their games on Steam, even if they weren't working with a big publisher that had a relationship with Valve. Steam users would vote on Greenlight games, and Valve would accept titles with enough support to suggest that they'd sell well. Kroll says that "over 100" Greenlight titles have made $1 million or more. But Greenlight has also had significant problems. Developers could game the system by offering rewards for votes, and worthy projects could get lost amidst a slew of bad proposals. Since Valve ultimately made the call on including games, the process could also seem arbitrary and opaque. The big question is whether what's replacing it is better. To get a game on Steam Direct, developers will need to "complete a set of digital paperwork, personal or company verification, and tax documents similar to the process of applying for a bank account." Then, they'll pay an application fee for each game, "which is intended to decrease the noise in the submission pipeline" -- a polite way of saying that it will make people think twice before spending money submitting a low-quality game. Steam Direct is supposed to launch in spring of 2017, but the application fee hasn't been decided yet. Developer feedback has apparently suggested anything from $100 -- the current Greenlight submission fee -- to $5,000.
Programming

Slashdot Asks: How Do You Know a Developer is Doing a Good Job? 229

An anonymous reader writes: One of the easiest ways to evaluate a developer is keeping tabs on the amount of value they provide to a business. But the problem with this approach is that the nature of software development does not make it easy to measure the value a single developer brings. Some managers are aware of this, so they look at the number of lines of code a developer has written. The fewer, the better, many believe. I recently came across this in a blog post: "If you paid your developers per line of code, you would reward the inefficient developers. An analogy to this is writing essays, novels, blog posts, etc. Would you judge a writer solely on the number of words written? Probably not. There is a minimum number of words needed to get a complex point across, but those points get lost when a writer clutters their work with useless sentences. So the lines-of-code metric doesn't work. The notion of a quantifiable metric for evaluating developers is still attractive, though. Some may argue that creating many code branches is the mark of a great developer. Yet I once worked with a developer who would create code branches to hide the fact that he wasn't very productive." Good point. But then, what other options do we have?
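The essay analogy is easy to make concrete: two functions with identical behaviour can differ several-fold in line count, and a per-line metric would reward the longer one (a contrived illustration of my own, not from the blog post quoted above):

```python
# Verbose version: many lines, rewarded by a lines-of-code metric.
def total_verbose(numbers):
    total = 0
    for n in numbers:
        total = total + n
    return total

# Concise version: same behaviour in a fraction of the lines.
def total_concise(numbers):
    return sum(numbers)

assert total_verbose([1, 2, 3]) == total_concise([1, 2, 3]) == 6
```

Any metric that counts lines would score the first function higher, even though the two are interchangeable -- which is exactly the point the submitter's blog quote makes.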
Books

The Most Mentioned Books On StackOverflow (dev-books.com) 92

An anonymous reader writes: People over at DevBooks have analyzed more than four million questions and answers on StackOverflow to rank the most-mentioned books. You can check out the list for yourself here, but here are the top 10 books: Working Effectively with Legacy Code by Michael C. Feathers; Design Patterns by Ralph Johnson, Erich Gamma, John Vlissides, and Richard Helm; Clean Code by Robert C. Martin; Java Concurrency in Practice by Brian Goetz and Tim Peierls; Domain-Driven Design by Eric Evans; JavaScript by Douglas Crockford; Patterns of Enterprise Application Architecture by Martin Fowler; Code Complete by Steve McConnell; Refactoring by Martin Fowler and Kent Beck; Head First Design Patterns by Eric Freeman, Elisabeth Freeman, Kathy Sierra, and Bert Bates.
Programming

Goldman Sachs Automated Trading Replaces 600 Traders With 200 Engineers (technologyreview.com) 185

Goldman Sachs' New York headquarters has replaced 600 of its traders with 200 computer engineers over the last two decades or so, thanks to automated trading programs. (Though the effort to do so has accelerated over the past five years.) "Marty Chavez, the company's deputy chief financial officer and former chief information officer, explained all this to attendees at a symposium on computing's impact on economic activity held by Harvard's Institute for Applied Computational Science last month," reports MIT Technology Review. From the report: The experience of its New York traders is just one early example of a transformation of Goldman Sachs, and increasingly other Wall Street firms, that began with the rise in computerized trading, but has accelerated over the past five years, moving into more fields of finance that humans once dominated. Chavez, who will become chief financial officer in April, says areas of trading like currencies and even parts of business lines like investment banking are moving in the same automated direction that equities have already traveled. Today, nearly 45 percent of trading is done electronically, according to Coalition, a U.K. firm that tracks the industry. In addition to back-office clerical workers, on Wall Street machines are replacing a lot of highly paid people, too. Complex trading algorithms, some with machine-learning capabilities, first replaced trades where the price of what's being sold was easy to determine on the market, including the stocks traded by Goldman's old 600. Now areas of trading like currencies and futures, which are not traded on a stock exchange like the New York Stock Exchange but rather have prices that fluctuate, are coming in for more automation as well. To execute these trades, algorithms are being designed to emulate as closely as possible what a human trader would do, explains Coalition's Shahani.
Goldman Sachs has already begun to automate currency trading, and has found consistently that four traders can be replaced by one computer engineer, Chavez said at the Harvard conference. Some 9,000 people, about one-third of Goldman's staff, are computer engineers.
Microsoft

Microsoft Debuts Customizable Speech-To-Text Tech, Releases Some Cognitive Services Tools To Developers (geekwire.com) 23

Microsoft is readying three of its 25 Cognitive Services tools for wider release to developers. From a report on GeekWire: Microsoft's AI and Research Group, a major new engineering and research division formed last year inside the Redmond company, is debuting a new technology that lets developers customize Microsoft's speech-to-text engine for use in their own apps and online services. The new Custom Speech Service is set for release today as a public preview. Microsoft says it lets developers upload a unique vocabulary -- such as alien names in Human Interact's VR game Starship Commander -- to produce a sophisticated language model for recognizing voice commands and other speech from users. It's the latest in a series of "cognitive services" from Microsoft's Artificial Intelligence and Research Group, a 5,000-person division led by Microsoft Research chief Harry Shum. The company says it has expanded from four to 25 cognitive services in the last two years, including 19 in preview and six that are generally available.
Java

Ask Slashdot: How To Get Started With Programming? [2017 Edition] 312

Reader joshtops writes: I know this is a question that must have been asked -- and answered -- on Slashdot several times, but I am hoping to hear from the community again (fresh perspective, if you will). I'm in my 20s, and have a day job that doesn't require any programming skills. But I want to learn programming nonetheless. I have done some research, but people have varied opinions. Essentially my question is: what is the best way to learn programming for my use case? I am looking for the best possible resources -- tutorials on the internet, the right books, and the order in which I should read/watch them. Some people have advised me to start with the C language, but I was wondering if I could kickstart things with another language, such as Apple's Swift?
Open Source

How Open Sourcing Made Apache Kafka A Dominant Streaming Platform (techrepublic.com) 48

Open sourced in 2010, the Apache Kafka distributed streaming platform is now used at more than a third of Fortune 500 companies (as well as seven of the world's top 10 banks). An anonymous reader writes: Co-creator Neha Narkhede says "We saw the need for a distributed architecture with microservices that we could scale quickly and robustly. The legacy systems couldn't help us anymore." In a new interview with TechRepublic, Narkhede explains that while working at LinkedIn, "We had the vision of building the entire company's business logic as stream processors that express transformations on streams of data... [T]hough Kafka started off as a very scalable messaging system, it grew to complete our vision of being a distributed streaming platform."

Narkhede became the CTO and co-founder of Confluent, which supports enterprise installations of Kafka, and now says that being open source "helps you build a pipeline for your product and reduce the cost of sales... [T]he developer is the new decision maker. If the product experience is tailored to ensure that the developers are successful and the technology plays a critical role in your business, you have the foundational pieces of building a growing and profitable business around an open-source technology... Kafka is used as the source-of-truth pipeline carrying critical data that businesses rely on for real-time decision-making."

Education

Disney Thinks High Schools Should Let Kids Take Coding In Place of Foreign Languages 328

theodp writes: Florida lawmakers are again proposing a contentious plan that would put coding and foreign languages on equal footing in a public high school student's education. Under the proposed bill, students who take two credits of computer coding and earn a related industry certification could count that coursework toward two foreign language credits.

"I sort of comically applaud that some would want to categorize coding as a foreign language," said Miami-Dade Schools Superintendent Alberto Carvalho. "Coding cannot be seen as an equivalent substitute." Disclosure records show that Walt Disney Parks and Resorts has three lobbyists registered to fight in support of the bill. Disney did not return an email seeking comment, but State Senator Jeff Brandes said the company's interest is in a future workforce... Disney has provided signature tutorials for the nation's Hour of Code over the past three years, including Disney's Frozen princess-themed tutorial.
Security

14,000 Domains Dropped Dyn's DNS Service After Mirai Attack (securityledger.com) 27

chicksdaddy writes: New data suggests that some 14,500 web domains stopped using Dyn's Managed DNS service in the immediate aftermath of an October DDoS attack by the Mirai botnet. That's around 8% of the web domains using Dyn Managed DNS... "The data show that Dyn lost a pretty big chunk of their customer base because they were affected by (Mirai)," said Dan Dahlberg, a research scientist at BitSight Technologies in Cambridge, Massachusetts... BitSight, which provides security rating services for companies, analyzed a set of 178,000 domains that were hosted on Dyn's managed DNS infrastructure before and immediately after the October 21st attacks.
It's possible some of those domains later returned to Dyn -- and the number of actual customers may be smaller than the number of hosted domains. But in the end it may not have mattered much, since Dyn was acquired by Oracle the next month, and TechCrunch speculates that the deal had already been set in motion before the attack.

The report also adds that "Oracle, of course, is no stranger to breaches itself: in August it was found that hundreds of its own computer systems were breached."
Programming

Developer Argues For 'Forgotten Code Constructs' Like GOTO and Eval (techbeacon.com) 600

mikeatTB quotes TechBeacon: Some things in the programming world are so easy to misuse that most people prefer to never use them at all. These are the programming equivalent of a flamethrower... [But] creative use of features such as goto, multiple inheritance, eval, and recursion may be just the right solution for experienced developers when used in the right situation. Is it time to resurrect these four forgotten code constructs?
The article notes that the Linux kernel uses goto statements, and links to Linus Torvalds' defense of them. ("Any if-statement is a goto. As are all structured loops...") And it points out that eval statements are supported by JavaScript, Python, PHP, and Ruby. But when the article describes recursion as "more forgotten than forbidden," it raises the inevitable question: are you using these "forgotten code constructs" -- and should you be?
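The eval point is easiest to make concrete in Python, where the construct is built in. A small sketch of my own (not from the TechBeacon article) contrasting raw eval with ast.literal_eval, the usual safer choice when the goal is parsing literal data rather than running code:

```python
import ast

user_input = "[1, 2, 3]"

# eval executes arbitrary Python -- the "flamethrower" the article describes.
# It parses this list, but it would just as happily run hostile code.
values = eval(user_input)

# For the common case of parsing literal data, ast.literal_eval accepts
# only Python literals (strings, numbers, lists, dicts, ...), never code.
safe_values = ast.literal_eval(user_input)

assert values == safe_values == [1, 2, 3]
```

That trade-off is the article's thesis in miniature: the dangerous construct is sometimes exactly the right tool, but most uses are better served by a constrained alternative.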
Social Networks

Ask Slashdot: How Do You Deal With Aggressive Forum Users? 477

Slashdot reader dryriver writes: I've noticed a disturbing trend while trying to resolve a rather tricky tech issue by asking questions on a number of internet forums. The number of people who don't help at all with problems but rather butt into threads with unhelpful comments like "Why would you want to do that in the first place?" or "Why don't you look at X poorly written documentation page?" was staggering. One forum user with 1,500+ posts even posted "you are such a n00b if you can't figure this out" in my question thread, even though my tech question wasn't one that is obvious or easy to resolve...

I seem to remember a time when people helped each other far more readily on the internet. Now there seems to be a new breed of forum user who a) hangs out at a forum socially all day b) does not bother to help at all and c) gets a kick out of telling you things like "what a stupid question" or "nobody will help you with that here" or similar... Where have the good old days gone when people much more readily gave other people step-by-step tips, tricks, instructions and advice?

The original submission claims the ratio of unhelpful comments to helpful ones was 5 to 1. Has anyone else experienced this? And if so, what's the best response? Leave your best answers in the comments. How do you deal with aggressive forum users?
Communications

IMDb Is Shutting Down Its Long-Running, Popular Message Boards After 16 Years (polygon.com) 168

An anonymous reader quotes a report from Polygon: After 16 years, IMDb's message boards and the ability to privately message other users is shutting down, with many members of the community openly mourning the loss of the section. IMDb, which stands for the Internet Movie Database, is one of the world's biggest databases for film and television. According to the company, there is information on more than 4.1 million titles and 7.7 million personalities available on the site as of January 2017. The message board, which was introduced in 2001, reportedly remains one of the most used services on the website, but despite that, the company is getting ready to shut it down, citing a desire to foster a positive environment and serve its audience the best way it can. "After in-depth discussion and examination, we have concluded that IMDb's message boards are no longer providing a positive, useful experience for the vast majority of our more than 250 million monthly users worldwide," a statement on the site reads. "The decision to retire a long-standing feature was made only after careful consideration and was based on data and traffic. Because IMDb's message boards continue to be utilized by a small but passionate community of IMDb users, we announced our decision to disable our message boards on February 3, 2017 but will leave them open for two additional weeks so that users will have ample time to archive any message board content they'd like to keep for personal use. During this two-week transition period, which concludes on February 19, 2017, IMDb message board users can exchange contact information with any other board users they would like to remain in communication with (since once we shut down the IMDb message boards, users will no longer be able to send personal messages to one another)."
Microsoft

Microsoft Introduces GVFS (Git Virtual File System) (microsoft.com) 213

Saeed Noursalehi, principal program manager at Microsoft, writes in a blog post: We've been working hard on a solution that allows the Git client to scale to repos of any size. Today, we're introducing GVFS (Git Virtual File System), which virtualizes the file system beneath your repo and makes it appear as though all the files in your repo are present, but in reality only downloads a file the first time it is opened. GVFS also actively manages how much of the repo Git has to consider in operations like checkout and status, since any file that has not been hydrated can be safely ignored. And because we do this all at the file system level, your IDEs and build tools don't need to change at all! In a repo that is this large, no developer builds the entire source tree. Instead, they typically download the build outputs from the most recent official build, and only build a small portion of the sources related to the area they are modifying. Therefore, even though there are over 3 million files in the repo, a typical developer will only need to download and use about 50-100K of those files. With GVFS, this means that they now have a Git experience that is much more manageable: clone now takes a few minutes instead of 12+ hours, checkout takes 30 seconds instead of 2-3 hours, and status takes 4-5 seconds instead of 10 minutes. And we're working on making those numbers even better.
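The core trick -- advertise every file name up front, but download contents only on first open -- can be modelled in a few lines. A toy sketch (GVFS itself virtualizes at the Windows file-system-driver level; the class and names below are illustrative, not Microsoft's implementation):

```python
class LazyRepo:
    """Toy model of GVFS-style hydration: all paths appear present,
    but a file's contents are fetched only on first access."""

    def __init__(self, manifest, fetch):
        self.manifest = set(manifest)   # every path in the repo, known up front
        self.fetch = fetch              # callable that downloads one file
        self.cache = {}                 # files hydrated so far

    def open(self, path):
        if path not in self.manifest:
            raise FileNotFoundError(path)
        if path not in self.cache:      # hydrate on first open only
            self.cache[path] = self.fetch(path)
        return self.cache[path]

# Track which files actually get downloaded.
downloads = []
repo = LazyRepo(
    ["a.txt", "b.txt"],
    lambda p: downloads.append(p) or f"contents of {p}",
)
repo.open("a.txt")
repo.open("a.txt")   # second open hits the cache; no second download
```

After the two opens, only "a.txt" has been downloaded; "b.txt" exists in the manifest but costs nothing until someone opens it -- which is why a 3-million-file repo can feel like a 50K-file one.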
Programming

GitLab Says It Found Lost Data On a Staging Server (theregister.co.uk) 101

GitLab.com, the wannabe GitHub alternative that went down hard earlier this week and reported data loss, has said that some data is gone but that its services are now operational again. From a report by The Register: The incident did not result in Git repos disappearing. Which may be why the company's PR reps characterised the lost data as "peripheral metadata that was written during a 6-hour window". But in a prose account of the incident, GitLab says "issues, merge requests, users, comments, snippets, etc" were lost. The Register imagines many developers may not be entirely happy with those data types being considered peripheral to their efforts. GitLab's PR flaks added that the incident impacted "less than 1% of our user base." But the firm's incident log says 707 users have lost data. The startup, which has raised over $25 million, added that it lost six hours of data and asserted that the lost data doesn't include users' code.
Data Storage

GitLab.com Melts Down After Wrong Directory Deleted, Backups Fail (theregister.co.uk) 356

An anonymous reader quotes a report from The Register: Source-code hub Gitlab.com is in meltdown after experiencing data loss as a result of what it has suddenly discovered are ineffectual backups. On Tuesday evening, Pacific Time, the startup issued a sobering series of tweets, starting with "We are performing emergency database maintenance, GitLab.com will be taken offline" and ending with "We accidentally deleted production data and might have to restore from backup. Google Doc with live notes [link]." Behind the scenes, a tired sysadmin, working late at night in the Netherlands, had accidentally deleted a directory on the wrong server during a frustrating database replication process: he wiped a folder containing 300GB of live production data that was due to be replicated. Just 4.5GB remained by the time he canceled the rm -rf command. The last potentially viable backup was taken six hours beforehand. That Google Doc mentioned in the last tweet notes: "This incident affected the database (including issues and merge requests) but not the git repos (repositories and wikis)." So some solace there for users because not all is lost. But the document concludes with the following: "So in other words, out of 5 backup/replication techniques deployed none are working reliably or set up in the first place." At the time of writing, GitLab says it has no estimated restore time but is working to restore from a staging server that may be "without webhooks" but is "the only available snapshot." That source is six hours old, so there will be some data loss.
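The failure mode here -- a destructive command aimed at the wrong machine -- is one a small guard can catch. A minimal sketch in Python (the hostname and path are invented for illustration; this is a generic defence, not what GitLab actually ran):

```python
import socket

def confirm_host(expected):
    """Refuse to proceed unless we are actually on the machine we think we're on."""
    actual = socket.gethostname()
    if actual != expected:
        raise RuntimeError(f"refusing to run: on {actual!r}, expected {expected!r}")

# Example usage: guard a destructive cleanup behind the check.
# confirm_host("db2.staging.example")                 # hypothetical hostname
# shutil.rmtree("/var/lib/postgresql/data")           # only reached if the guard passed
```

It's a blunt instrument, but it turns "deleted a directory on the wrong server" from a data-loss incident into an error message.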
Facebook

Facebook's Parse Is Shutting Down Today (parse.com) 14

Facebook acquired Parse, a toolkit and support system for mobile developers, in 2013. At the time, the social network's ambitions were high: Parse would be Facebook's way into one day harnessing developers to become a true cloud business, competing alongside the likes of Amazon, Google and Microsoft. Three years later, Facebook announced it would be shutting down Parse. Today is that day. From Parse's status page: As we previously shared, the Parse service is shutting down today. Throughout the day we will be disabling the Parse API on an app-by-app basis. When your app is disabled, you will not be able to access the data browser or export any data, and your applications will no longer be able to access the Parse API.
Oracle

Oracle Effectively Doubles Licence Fees To Run Its Stuff in AWS (theregister.co.uk) 198

Oracle has changed the way it charges users to run its software in Amazon Web Services, effectively doubling the cost along the way. From a report: Big Red's previous licensing regime recognised that AWS's virtual CPUs were a single thread of a core that runs two threads. Each virtual CPU therefore counted as half a core. That's changed: Oracle's new cloud licensing policy says an AWS vCPU is now treated as a full core if hyperthreading is not enabled. A user hiring two AWS vCPUs therefore needs to pay full freight for both, effectively doubling the number of Oracle licences required to run Big Red inside AWS. And therefore doubling the cost as well. The new policy also says: "When counting Oracle Processor license requirements in Authorized Cloud Environments, the Oracle Processor Core Factor Table is not applicable." That table says Xeon cores count as half a licence. Making the Table inapplicable to the cloud again doubles the licence count required.
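The licence arithmetic described above is easy to check. A sketch of the maths as The Register lays it out (the 8-vCPU figure and the function are illustrative, not Oracle's official calculator):

```python
def oracle_licences(vcpus, vcpu_is_full_core, core_factor):
    """Processor licences needed for a given number of AWS vCPUs."""
    cores = vcpus if vcpu_is_full_core else vcpus / 2
    return cores * core_factor

# Old policy: a vCPU counts as half a core, and the 0.5 Xeon core factor applies.
old = oracle_licences(8, vcpu_is_full_core=False, core_factor=0.5)
# New policy: a vCPU is a full core, and the core factor table is "not applicable".
new = oracle_licences(8, vcpu_is_full_core=True, core_factor=1.0)
# Each change doubles the count, so the combined effect is 4x.
```

For 8 vCPUs that's 2 licences under the old rules and 8 under the new ones -- the two doublings the article describes, compounded.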
Communications

US Intelligence Seeks a Universal Translator For Text Search In Any Language (arstechnica.com) 47

An anonymous reader quotes a report from Ars Technica: The Intelligence Advanced Research Projects Agency (IARPA), the U.S. Intelligence Community's own science and technology research arm, has announced it is seeking contenders for a program to develop what amounts to the ultimate Google Translator. IARPA's Machine Translation for English Retrieval of Information in Any Language (MATERIAL) program intends to provide researchers and analysts with a tool to search for documents in their field of concern in any of the more than 7,000 languages spoken worldwide. The specific goal, according to IARPA's announcement, is an "'English-in, English-out' information retrieval system that, given a domain-sensitive English query, will retrieve relevant data from a large multilingual repository and display the retrieved information in English as query-biased summaries." Users would be able to search vast numbers of documents with a two-part query: the first giving the "domain" of the search in terms of what sort of information they are seeking (for example, "Government," "Science," or "Health") and the second an English word or phrase describing the information sought (the examples given in the announcement were "zika virus" and "Asperger's syndrome"). The system would be used in situations like natural disasters or military interventions in remote locations where the military has little or no local language expertise. Those taking on the MATERIAL program will be given access to a limited set of machine translation and automatic speech recognition training data from multiple languages "to enable performers to learn how to quickly adapt their methods to a wide variety of materials in various genres and domains," the announcement explained. "As the program progresses, performers will apply and adapt these methods in increasingly shortened time frames to new languages... 
Since language-independent approaches with quick ramp up time are sought, foreign language expertise in the languages of the program is not expected." The good news for the broader linguistics and technology world is that IARPA expects the teams competing on MATERIAL to publicly publish their research. If successful, this moonshot for translation could radically change how accessible materials in many languages are to the rest of the world.
