Profile of William H. Alsup, a Judge Who Codes and Decides Tech's Biggest Cases (37)

Sarah Jeong at The Verge has an interesting profile of William H. Alsup, the judge in the Oracle v. Google case, who, to the surprise of many, was able to comment knowledgeably on the technical issues that Oracle and Google were fighting about. The profile traces how Alsup came to be known as "the judge who learned Java" -- a legend that, as the excerpt below shows, he is quick to deflate. Here's an excerpt:

On May 18th, 2012, attorneys for Oracle and Google were battling over nine lines of code in a hearing before Judge William H. Alsup of the Northern District of California. The first jury trial in Oracle v. Google, the fight over whether Google had hijacked code from Oracle for its Android system, was wrapping up. The argument centered on a function called rangeCheck. Of all the lines of code that Oracle had tested -- 15 million in total -- these were the only ones that were "literally" copied. Every keystroke, a perfect duplicate. It was in Oracle's interest to play up the significance of rangeCheck as much as possible, and David Boies, Oracle's lawyer, began to argue that Google had copied rangeCheck so that it could take Android to market more quickly. Judge Alsup was not buying it. "I couldn't have told you the first thing about Java before this trial," said the judge. "But, I have done and still do a lot of programming myself in other languages. I have written blocks of code like rangeCheck a hundred times or more. I could do it. You could do it. It is so simple." It was an offhand comment that would snowball out of control, much to Alsup's chagrin. It was first repeated among lawyers and legal wonks, then by tech publications. With every repetition, Alsup's skill grew, until eventually he became "the judge who learned Java" -- Alsup the programmer, the black-robed nerd hero, the 10x judge, the "master of the court and of Java."
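The function at the center of the argument is exactly the kind of mundane bounds check Alsup described. The sketch below is not Oracle's actual nine lines, just a routine in the same spirit: validate that a sub-range of an array is sane, and throw if it isn't.

```java
// A bounds-checking routine in the spirit of rangeCheck -- the sort of
// trivial validation code Judge Alsup said any programmer could write.
// This is an illustrative sketch, not the litigated source.
public class RangeCheckDemo {
    // Throws if [fromIndex, toIndex) is not a valid sub-range of an
    // array of length arrayLen.
    static void rangeCheck(int arrayLen, int fromIndex, int toIndex) {
        if (fromIndex > toIndex)
            throw new IllegalArgumentException(
                "fromIndex(" + fromIndex + ") > toIndex(" + toIndex + ")");
        if (fromIndex < 0)
            throw new ArrayIndexOutOfBoundsException(fromIndex);
        if (toIndex > arrayLen)
            throw new ArrayIndexOutOfBoundsException(toIndex);
    }

    public static void main(String[] args) {
        rangeCheck(10, 2, 8);          // valid range: returns silently
        try {
            rangeCheck(10, 8, 2);      // fromIndex > toIndex
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```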

Ask Slashdot: How Can You Apply For A Job When Your Code Samples Suck? (408)

An anonymous Slashdot reader ran into a problem when looking for a new employer: Most [prospective employers] ask for links to "recent work," but the reason I'm leaving my current job is because this company doesn't produce good code. After years of trying to force them to change, they have refused to change any of their poor practices, because the CTO is a narcissist and doesn't recognize that so much is wrong. I have written good code for this company. The problem is it is mostly back-end code where I was afforded some freedom, but the front-end is still a complete mess that doesn't reflect any coherent coding practice whatsoever...

I am giving up on fixing this company, but I'm finding it hard to exemplify my work when it is hidden behind some of the worst front-end code I have ever seen. Most job applications ask for links to live code, not for code samples (which I would more easily be able to supply). Some of the websites look okay on the surface, but are one right click -> inspect element away from giving away the mess; most of the projects require a username and password to log in as well, but account registration is not open. So how do I reference my recent work when all of my recent work is embarrassing on the front-end?

The original submission's title asked what to use for work samples "when the CTO has butchered all my work." Any suggestions? Leave your best thoughts in the comments. How can you apply for a job when your code samples suck?

Does the Rise of AI Precede the End of Code? (202)

An anonymous reader shares an article: It's difficult to know what's in store for the future of AI, but let's tackle the most looming question first: are engineering jobs threatened? As anticlimactic as it may be, the answer is entirely dependent on what timeframe you are talking about. In the next decade? No, entirely unlikely. Eventually? Most definitely. The kicker is that engineers never truly know how the computer is able to accomplish the tasks it learns. In many ways, the neural operations of the AI system are a black box. Programmers, therefore, become the AI coaches. They coach cars to self-drive, coach computers to recognise faces in photos, coach your smartphone to detect handwriting on a check in order to deposit electronically, and so on. In fact, the possibilities of AI and machine learning are limitless. The capabilities of AI through machine learning are wondrous, magnificent... and not going away. Attempts to apply artificial intelligence to programming tasks have resulted in further developments in knowledge and automated reasoning. Therefore, programmers must redefine their roles. Essentially, software development jobs will not become obsolete anytime soon but will instead require more collaboration between humans and computers. For one, there will be an increased need for engineers to create, test and research AI systems. AI and machine learning will not be advanced enough to automate and dominate everything for a long time, so engineers will remain the technological handmaidens.

Learn To Code, It's More Important Than English as a Second Language, Says Apple CEO (296)

Apple CEO Tim Cook says it is more important to learn how to code than it is to learn English as a second language. From a report: The tech executive made the remarks to French outlet Konbini while in the country for a meeting with French President Emmanuel Macron, who has called for tech companies to pay higher taxes in Europe. "If I were a French student and I were 10 years old, I think it would be more important for me to learn coding than English. I'm not telling people not to learn English in some form -- but I think you understand what I am saying is that this is a language that you can [use to] express yourself to 7 billion people in the world," Cook tells Konbini. "I think that coding should be required in every public school in the world. [...] It's the language that everyone needs, and not just for the computer scientists. It's for all of us."

The Case Against Biometric IDs (146)

"The White House and Equifax Agree: Social Security Numbers Should Go," reads a headline at Bloomberg. Securities lawyer Jerri-Lynn Scofield tears down one proposed alternative: a universal biometric identity system (possibly using fingerprints and an iris scan) with further numeric verification. Presto Vivace shared the article: Using a biometric system when the basic problems of securing and safeguarding data have yet to be solved will only worsen, not address, the hacking problem. What we're being asked to do is to turn over our biometric information, and then trust those to whom we do so to safeguard that data. Given the current status of database security, corporate and governmental accountability, etc.: How do you think that is going to play out...?

[M]aybe we should rethink the whole impulse to centralize such data collection, for starters. And, after such a thought experiment, then further focus on obvious measures to safeguard such information -- such as installing regular software patches that could have prevented the Equifax hack -- should be the priority. And, how about bringing back a concept in rather short supply in C-suites -- that of accountability? Perhaps measures to increase that might be a better idea than gee whiz misdirected techno-wizardry... The Equifax hack has revealed the sad and sorry state of cybersecurity. But inviting the biometric ID fairy to drop by and replace the existing Social Security number is not the solution.

The article calls biometric identification systems "another source of data to be mined by corporations, and surveilled by those who want to do so. And it would ultimately not foil identity theft." It suggests biometric IDs are currently a distraction from the push to change the credit bureau business model -- for example, requiring consumers to opt in to the collection of their personal data.

New Video Peeks 'Inside the Head' of Perl Creator Larry Wall (106)

"I was trained more as a linguist than a computer scientist," says Perl creator Larry Wall, "and some people would say it shows." An anonymous reader describes Wall's new video interview up on InfoQ: "With a natural language, you learn it as you go," Wall says. "You're not expected to know the whole language at once. It's okay to have dialects... Natural languages evolve over time, and they don't have arbitrary limits. They naturally cover multiple paradigms. There are external influences on style... It has fractal dimensionality to it. Easy things should be easy, hard things should be possible. And, you know, if you get really good at it, you can even speak CompSci."

Wall also touched on the long delay for the release of Perl 6. "In the year 2000, we said 'Maybe it's time to break backward compatibility, just once. Maybe we can afford to do that, get off the worse-is-worse cycle, crank the thing once for a worse-is-better cycle.'" The development team received a whopping 361 suggestions -- and was also influenced by Paul Graham's essay on the 100-year language. "We put a lot of these ideas together and thought really hard, and came up with a whole bunch of principles in the last 15 years." Among the pithy principles: "Give the user enough rope to shoot themselves in the foot, but hide the rope in the corner," and "Encapsulate cleverness, then reuse the heck out of it."

But Wall emphasized the flexibility and multi-paradigm nature that they finally implemented in Perl 6. "The thing we really came up with was... There really is no one true language. Not even Perl 6, because Perl 6 itself is a braid of sublanguages -- slangs for short -- and they interact with each other, and you can modify each part of the braid..."

Wall even demoed a sigil-less style, and argued that Perl 6 was everything from "expressive" and "optimizable" to "gradually-typed" and "concurrency aware," while supporting multiple virtual machines. He also noted that Perl 6 borrows powerful features from other languages, including Haskell (lazy evaluation), Smalltalk (traits), Go (promises and channels), and C# (functional reactive programming).

And towards the end of the interview Wall remembers how the original release of Perl was considered by some as a violation of the Unix philosophy of doing one thing and doing it well. "I was already on my rebellious slide into changing the world at that point."

Java Coders Are Getting Bad Security Advice From Stack Overflow (236)

Slashdot reader Orome1 quotes Help Net Security: A group of Virginia Tech researchers has analyzed hundreds of posts on Stack Overflow, a popular developer forum/Q&A site, and found that many of the developers who offer answers do not appear to understand the security implications of coding options, showing a lack of cybersecurity training. Another thing they discovered is that, sometimes, the most upvoted posts/answers contain insecure suggestions that introduce security vulnerabilities in software, while correct fixes are less popular and visible simply because they have been offered by users with a lower reputation score...

The researchers concentrated on posts relevant to Java security, from both software engineering and security perspectives, and on posts addressing questions tied to Spring Security, a third-party Java framework that provides authentication, authorization and other security features for enterprise applications... Developers are frustrated when they have to spend too much time figuring out the correct usage of APIs, and often end up choosing completely insecure-but-easy fixes such as using obsolete cryptographic hash functions, disabling cross-site request forgery protection, trusting all certificates in HTTPS verification, or using obsolete communication protocols. "These poor coding practices, if used in production code, will seriously compromise the security of software products," the researchers pointed out.
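The "obsolete cryptographic hash functions" anti-pattern is easy to illustrate. In the sketch below the helper names are hypothetical, but both algorithm identifiers are standard ones accepted by the JDK's java.security.MessageDigest API -- which is part of the problem: the insecure choice is exactly as easy to type as the safer one.

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class HashDemo {
    // Shared helper; the algorithm names are standard JDK identifiers.
    static byte[] digest(String algorithm, byte[] data) {
        try {
            return MessageDigest.getInstance(algorithm).digest(data);
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);
        }
    }

    // The "easy fix" pattern highly upvoted answers still suggest:
    // MD5 is collision-broken and obsolete for any security purpose.
    static byte[] insecureDigest(byte[] data) {
        return digest("MD5", data); // DON'T use in production code
    }

    // A safer choice, available from the very same API call.
    static byte[] saferDigest(byte[] data) {
        return digest("SHA-256", data);
    }

    public static void main(String[] args) {
        System.out.println("SHA-256 digest length: "
                + saferDigest("hello".getBytes()).length + " bytes");
    }
}
```

The same shape applies to the other fixes the researchers flagged: disabling CSRF protection or trusting all certificates is usually a one-line change, which is precisely why it spreads.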

The researchers blame "the rapidly increasing need for enterprise security applications, the lack of security training in the software development workforce, and poorly designed security libraries." Among their suggested solutions: new developer tools which can recognize security errors and suggest patches.

According To Star Trek: Discovery, Starfleet Still Runs Microsoft Windows (237)

AmiMoJo shares a report from The Verge: The third episode of Star Trek: Discovery aired this week, and at one point in the episode, Sonequa Martin-Green's Michael Burnham is tasked with reconciling two suites of code. In the show, Burnham claims the code is confusing because it deals with quantum astrophysics, biochemistry, and gene expression. And while the episode later reveals that it's related to the USS Discovery's experimental new mycelial network transportation system, Twitter user Rob Graham noted the code itself is a little more pedestrian in nature. More specifically, it seems to be decompiled code for the infamous Stuxnet virus, developed by the United States to attack Iranian computers running Windows.

Slack Locks Down Oracle Partnership Targeting Enterprises (43)

From a report: Slack Technologies has secured a partnership with Oracle to integrate the tech giant's enterprise software products into the popular workplace messaging app, the two companies told Reuters. The partnership is a victory for Slack as the young startup ramps up its efforts to win the business of large enterprises in an increasingly competitive marketplace that has seen the entry of Microsoft, Facebook and countless startups. "As you see all these large enterprise software companies looking at messaging as a major platform, they're looking to partner with us first and foremost," said Brad Armstrong, Slack's head of global business and corporate development. The partnership will allow workers to use Slack as the interface for Oracle's sales, human resources and business software.

Code is Too Hard To Think About (397)

From a longform piece on The Atlantic: What made programming so difficult was that it required you to think like a computer. The strangeness of it was in some sense more vivid in the early days of computing, when code took the form of literal ones and zeros. Anyone looking over a programmer's shoulder as they pored over line after line like "100001010011" and "000010011110" would have seen just how alienated the programmer was from the actual problems they were trying to solve; it would have been impossible to tell whether they were trying to calculate artillery trajectories or simulate a game of tic-tac-toe. The introduction of programming languages like Fortran and C, which resemble English, and tools, known as "integrated development environments," or IDEs, that help correct simple mistakes (like Microsoft Word's grammar checker but for code), obscured, though did little to actually change, this basic alienation -- the fact that the programmer didn't work on a problem directly, but rather spent their days writing out instructions for a machine.

"The problem is that software engineers don't understand the problem they're trying to solve, and don't care to," says Nancy Leveson, an MIT software-safety expert. The reason is that they're too wrapped up in getting their code to work. "Software engineers like to provide all kinds of tools and stuff for coding errors," she says, referring to IDEs. "The serious problems that have happened with software have to do with requirements, not coding errors." When you're writing code that controls a car's throttle, for instance, what's important is the rules about when and how and by how much to open it. But these systems have become so complicated that hardly anyone can keep them straight in their head. "There's 100 million lines of code in cars now," Leveson says. "You just cannot anticipate all these things."

Chip Reprograms Cells To Regenerate Damaged Tissue (16)

An anonymous reader quotes a report from Scientific American about a device that delivers infusions of DNA and other molecules to restore injured limbs in mice, and maybe someday, humans: Cells are typically reprogrammed using mixtures of DNA, RNA and proteins. The most popular method uses viruses as a delivery vehicle -- although they can infect unintended cells, provoke immune responses and even turn cells cancerous. One alternative, called bulk electroporation, exposes cells to an electric field that pokes holes in their membranes to let in genetic material and proteins. Yet this method can stress or kill them. Tissue nanotransfection, described in a study published in August in Nature Nanotechnology, involves a chip containing an array of tiny channels that apply electric fields to individual cells. "You affect only a small area of the cell surface, compared with the conventional method, which upsets the entire cell," says study co-author James Lee, a chemical and biomolecular engineer at The Ohio State University. "Essentially we create a tiny hole and inject DNA right into the cell, so we can control the dosage."

Chandan Sen, a physiologist at Ohio State, and his colleagues developed a genetic cocktail that rapidly converts skin cells into endothelial cells -- the main component of blood vessels. They then used their technique on mice whose legs had been damaged by a severed artery that cut off blood supply. New blood vessels formed, blood flow increased, and after three weeks the legs had completely healed.


Former Female Oracle Employees Sue Company For Alleged Pay Discrimination (121)

Three female former Oracle employees are suing the company for allegedly paying women less than men in similar jobs. Rong Jewett, Sophy Wang and Xian Murray filed a lawsuit August 28, seeking class-action status to represent all other women who have worked at the company. TechCrunch reports: The lawsuit, first reported by The Information, alleges that Oracle discriminated against women by "systematically paying them lower wage rates than Oracle pays to male employees performing substantially equal or similar work under similar working conditions," the filing states. The time period the lawsuit references is four years prior to the filing and through the date of the trial in California. Referencing how the U.S. Department of Labor sued Oracle in January based on its compliance review that found "systemic discrimination against women" and "gross disparities in pay," the lawsuit states Oracle had known or should have known about the pay disparity between its male and female employees. The plaintiffs are seeking wages due, interest and liquidated damages plus interest. They also want Oracle to guarantee it won't pay women less than men for similar work in the future.

'Tetris' Recreated In Conway's 'Game of Life' (87)

In 1970 mathematician John Conway created rules for the "Game of Life," a now famous "zero-player game" where a grid of cells evolves (following Conway's rules) from an initial state proposed by the player. In 2013 someone challenged readers of StackExchange's "Programming Puzzles & Code Golf" section to devise an initial state "that will allow for the playing of a game of Tetris."
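Conway's rules themselves are famously small: a live cell survives with exactly two or three live neighbors, and a dead cell comes alive with exactly three. A minimal, non-wrapping sketch of one generation (in Java, purely for illustration; the challenge itself lives entirely inside Life patterns, not host-language code):

```java
// One generation of Conway's Game of Life on a fixed, non-wrapping grid.
public class LifeStep {
    static boolean[][] step(boolean[][] g) {
        int rows = g.length, cols = g[0].length;
        boolean[][] next = new boolean[rows][cols];
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                int n = 0; // count live neighbors of (r, c)
                for (int dr = -1; dr <= 1; dr++)
                    for (int dc = -1; dc <= 1; dc++) {
                        if (dr == 0 && dc == 0) continue;
                        int rr = r + dr, cc = c + dc;
                        if (rr >= 0 && rr < rows && cc >= 0 && cc < cols
                                && g[rr][cc]) n++;
                    }
                // Survival on 2 or 3 neighbors; birth on exactly 3.
                next[r][c] = g[r][c] ? (n == 2 || n == 3) : (n == 3);
            }
        }
        return next;
    }

    public static void main(String[] args) {
        // A "blinker": three cells in a row oscillate between
        // horizontal and vertical with period 2.
        boolean[][] g = new boolean[5][5];
        g[2][1] = g[2][2] = g[2][3] = true;
        boolean[][] g2 = step(g);
        System.out.println(g2[1][2] && g2[2][2] && g2[3][2]); // true
    }
}
```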

An anonymous Slashdot reader reports that "This challenge sat around, gathering upvotes but no answer, for four years. Then, it was answered." Citing the work of seven contributors, a massive six-part response says their solution took one and a half years to create, and "began as a quest but ended as an odyssey." The team created their own assembly language, known as QFTASM (Quest for Tetris Assembly) for use within Conway's mathematical universe, and then also designed their own processor architecture, and eventually even a higher-level language that they named COGOL. Their StackExchange response includes a link to all of their code on GitHub, as well as to a page where you can run the code online.

One StackExchange reader hailed the achievement as "the single greatest thing I've ever scrolled through while understanding very little."

Do Strongly Typed Languages Reduce Bugs? (456)

"Static vs dynamic typing is always one of those topics that attracts passionately held positions," writes the Morning Paper -- reporting on an "encouraging" study that attempted to empirically evaluate the efficacy of statically typed systems on mature, real-world code bases. The study was conducted by Christian Bird at Microsoft's "Research in Software Engineering" group with two researchers from University College London. Long-time Slashdot reader phantomfive writes: This study looked at bugs found in open source JavaScript code. Looking through the commit history, they enumerated the bugs that would have been caught if a more strongly typed language (like TypeScript) had been used. They found that a strongly typed language would have reduced bugs by 15%.

Does this make you want to avoid Python?
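The class of defect such a study counts is easy to illustrate. A hypothetical example, written in Java standing in for any statically typed language: JavaScript silently coerces a string operand, while a static type checker rejects the same mistake before the program ever runs.

```java
public class TypedDemo {
    // A checkout-style helper. In JavaScript, the same function body
    // would happily accept a string (e.g. a value read from a form)
    // and concatenate instead of add.
    static double addShipping(double subtotal, double shipping) {
        return subtotal + shipping;
    }

    public static void main(String[] args) {
        System.out.println(addShipping(20.0, 5.0)); // prints 25.0
        // In JavaScript: addShipping(20, "5") evaluates to "205"
        // (string concatenation) -- a silent runtime surprise.
        // In Java the equivalent call simply does not compile:
        //   addShipping(20.0, "5");
        //   error: incompatible types: String cannot be converted to double
    }
}
```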


Apple's Swift 4.0 Includes A Compatibility Mode For 'The Majority' Of Swift 3.x Code (122)

An anonymous reader quotes InfoWorld: Swift 4.0 is now available. It's a major upgrade to Apple's Swift, the three-year-old successor to the Objective-C language used for MacOS and iOS application development. The Swift 4 upgrade enhances the Swift Package Manager and provides new compatibility modes for developers. Apple said Swift 4 also makes Swift more stable and improves its standard library. Swift 4 is largely source-compatible with Swift 3 and ships as part of Apple's Xcode 9 IDE...

Swift 4's new compatibility modes could save you from having to modify code to be able to use the new version of the compiler. Two modes are supported: the Swift 3.2 mode, which accepts most source files built with Swift 3.x compilers, and the Swift 4.0 mode, which includes the Swift 4 language and API changes. Apple said that some source migration will be needed for many projects, but the number of source changes is "quite modest" compared to many previous major changes between Swift releases.

Apple calls Swift 4.0 "a major language release" that also includes new language changes and updates that came through the Swift Evolution process.
