Businesses

Ask Slashdot: How Can You Apply For A Job When Your Code Samples Suck? 403

An anonymous Slashdot reader ran into a problem when looking for a new employer: Most ask for links to "recent work," but the reason I'm leaving my current job is that this company doesn't produce good code. After years of trying to get them to change, they have refused to fix any of their poor practices, because the CTO is a narcissist who doesn't recognize how much is wrong. I have written good code for this company. The problem is that it's mostly back-end code, where I was afforded some freedom; the front-end is still a complete mess that doesn't reflect any coherent coding practice whatsoever...

I am giving up on fixing this company, but finding it hard to showcase my work when it is hidden behind some of the worst front-end code I have ever seen. Most job applications ask for links to live code, not for code samples (which I could more easily supply). Some of the websites look okay on the surface, but are one right-click -> Inspect Element away from giving away the mess; most of the projects also require a username and password to log in, and account registration is not open. So how do I reference my recent work when all of my recent work is embarrassing on the front-end?

The original submission's title asked what to use for work samples "when the CTO has butchered all my work." Any suggestions? Leave your best thoughts in the comments. How can you apply for a job when your code samples suck?
AI

Does the Rise of AI Precede the End of Code? (itproportal.com) 199

An anonymous reader shares an article: It's difficult to know what's in store for the future of AI, but let's tackle the most looming question first: are engineering jobs threatened? As anticlimactic as it may be, the answer depends entirely on what timeframe you are talking about. In the next decade? No, entirely unlikely. Eventually? Most definitely. The kicker is that engineers never truly know how the computer accomplishes the tasks it learns; in many ways, the neural operations of an AI system are a black box. Programmers, therefore, become the AI coaches. They coach cars to self-drive, coach computers to recognise faces in photos, coach your smartphone to detect handwriting on a check so it can be deposited electronically, and so on. The capabilities of AI through machine learning are wondrous, magnificent... and not going away. Attempts to apply artificial intelligence to programming tasks have resulted in further developments in knowledge and automated reasoning. Programmers must therefore redefine their roles: software development jobs will not become obsolete anytime soon, but will instead require more collaboration between humans and computers. For one, there will be an increased need for engineers to create, test and research AI systems. AI and machine learning will not be advanced enough to automate and dominate everything for a long time, so engineers will remain the technological handmaidens.
Education

Learn To Code, It's More Important Than English as a Second Language, Says Apple CEO (cnbc.com) 295

Apple CEO Tim Cook says it is more important to learn how to code than it is to learn English as a second language. From a report: The tech executive made the remarks to French outlet Konbini while in the country for a meeting with French President Emmanuel Macron, who has called for tech companies to pay higher taxes in Europe. "If I were a French student and I were 10 years old, I think it would be more important for me to learn coding than English. I'm not telling people not to learn English in some form -- but I think you understand what I am saying is that this is a language that you can [use to] express yourself to 7 billion people in the world," Cook tells Konbini. "I think that coding should be required in every public school in the world. [...] It's the language that everyone needs, and not just for the computer scientists. It's for all of us."
Businesses

The Case Against Biometric IDs (nakedcapitalism.com) 146

"The White House and Equifax Agree: Social Security Numbers Should Go," reads a headline at Bloomberg. Securities lawyer Jerri-Lynn Scofield tears down one proposed alternative: a universal biometric identity system (possibly using fingerprints and an iris scan) with further numeric verification. Presto Vivace shared the article: Using a biometric system when the basic problem of securing and safeguarding data have yet to be solved will only worsen, not address, the hacking problem. What we're being asked to do is to turn over our biometric information, and then trust those to whom we do so to safeguard that data. Given the current status of database security, corporate and governmental accountability, etc.: How do you think that is going to play out...?

[M]aybe we should rethink the whole impulse to centralize such data collection, for starters. And, after such a thought experiment, the priority should be obvious measures to safeguard such information -- such as installing the routine software patches that could have prevented the Equifax hack. And how about bringing back a concept in rather short supply in C-suites -- accountability? Measures to increase that might be a better idea than gee-whiz, misdirected techno-wizardry... The Equifax hack has revealed the sad and sorry state of cybersecurity. But inviting the biometric ID fairy to drop by and replace the existing Social Security number is not the solution.

The article calls biometric identification systems "another source of data to be mined by corporations, and surveilled by those who want to do so. And it would ultimately not foil identity theft." It suggests biometric IDs are currently a distraction from the push to change the credit bureau business model -- for example, by requiring consumers to opt in to the collection of their personal data.
Perl

New Video Peeks 'Inside the Head' of Perl Creator Larry Wall (infoq.com) 106

"I was trained more as a linguist than a computer scientist," says Perl creator Larry Wall, "and some people would say it shows." An anonymous reader describes Wall's new video interview up on InfoQ: "With a natural language, you learn it as you go," Wall says. "You're not expected to know the whole language at once. It's okay to have dialects... Natural languages evolve over time, and they don't have arbitrary limits. They naturally cover multiple paradigms. There are external influences on style... It has fractal dimensionality to it. Easy things should be easy, hard things should be possible. And, you know, if you get really good at it, you can even speak CompSci."

Wall also touched on the long delay before Perl 6's release. "In the year 2000, we said, 'Maybe it's time to break backward compatibility, just once. Maybe we can afford to do that, get off the worse-is-worse cycle, crank the thing once for a worse-is-better cycle.'" The development team received a whopping 361 suggestions -- and was also influenced by Paul Graham's essay on the 100-year language. "We put a lot of these ideas together and thought really hard, and came up with a whole bunch of principles in the last 15 years." Among the pithy principles: "Give the user enough rope to shoot themselves in the foot, but hide the rope in the corner," and "Encapsulate cleverness, then reuse the heck out of it."

But Wall emphasized the flexibility and multi-paradigm nature that they finally implemented in Perl 6. "The thing we really came up with was... There really is no one true language. Not even Perl 6, because Perl 6 itself is a braid of sublanguages -- slangs for short -- and they interact with each other, and you can modify each part of the braid..."

Wall even demoed a sigil-less style, and argued that Perl 6 is everything from "expressive" and "optimizable" to "gradually-typed" and "concurrency aware," while supporting multiple virtual machines. He also noted that Perl 6 borrows powerful features from other languages, including Haskell (lazy evaluation), Smalltalk (traits), Go (promises and channels), and C# (functional reactive programming).
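For those unfamiliar with the first of those borrowings: lazy evaluation means values are computed only when something actually demands them, which is what lets a language describe infinite sequences safely. As a rough analogue (a hypothetical Java illustration, not Perl 6 itself), the Java streams API behaves the same way:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Rough analogue of the lazy evaluation Perl 6 borrows from Haskell:
// Stream.iterate describes a conceptually infinite sequence, but no
// element is computed until a terminal operation demands it.
public class LazyDemo {
    public static void main(String[] args) {
        Stream<Long> naturals = Stream.iterate(1L, n -> n + 1); // infinite, but lazy
        List<Long> firstTenSquares = naturals
                .map(n -> n * n)               // still lazy: nothing computed yet
                .limit(10)                     // take only what is needed
                .collect(Collectors.toList()); // evaluation happens here
        System.out.println(firstTenSquares);   // [1, 4, 9, ..., 100]
    }
}
```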

And towards the end of the interview Wall remembers how the original release of Perl was considered by some as a violation of the Unix philosophy of doing one thing and doing it well. "I was already on my rebellious slide into changing the world at that point."
Java

Java Coders Are Getting Bad Security Advice From Stack Overflow (helpnetsecurity.com) 236

Slashdot reader Orome1 quotes Help Net Security: A group of Virginia Tech researchers has analyzed hundreds of posts on Stack Overflow, a popular developer forum/Q&A site, and found that many of the developers offering answers do not appear to understand the security implications of their coding choices, showing a lack of cybersecurity training. They also discovered that, sometimes, the most upvoted answers contain insecure suggestions that introduce security vulnerabilities into software, while correct fixes are less popular and less visible simply because they were offered by users with lower reputation scores...

The researchers concentrated on posts relevant to Java security, from both software engineering and security perspectives, and on posts addressing questions tied to Spring Security, a third-party Java framework that provides authentication, authorization and other security features for enterprise applications... Developers are frustrated when they have to spend too much time figuring out the correct usage of APIs, and often end up choosing completely insecure-but-easy fixes such as using obsolete cryptographic hash functions, disabling cross-site request forgery protection, trusting all certificates in HTTPS verification, or using obsolete communication protocols. "These poor coding practices, if used in production code, will seriously compromise the security of software products," the researchers pointed out.
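The "trusting all certificates" fix is worth seeing in full, because it looks so harmless. Below is a generic Java sketch of that anti-pattern (an illustration of the kind of snippet the researchers describe, not code taken from the study): it makes certificate errors vanish by disabling validation entirely, leaving every HTTPS connection open to man-in-the-middle attacks.

```java
import java.security.SecureRandom;
import java.security.cert.X509Certificate;
import javax.net.ssl.HttpsURLConnection;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManager;
import javax.net.ssl.X509TrustManager;

// DO NOT USE: the classic Stack Overflow "fix" for SSL errors.
// Every method that should validate the certificate chain is a no-op,
// so any attacker with a self-signed certificate can intercept traffic.
public class TrustAllAntiPattern {
    public static void install() throws Exception {
        TrustManager[] trustAll = {
            new X509TrustManager() {
                public X509Certificate[] getAcceptedIssuers() { return new X509Certificate[0]; }
                public void checkClientTrusted(X509Certificate[] chain, String authType) { /* accepts anything */ }
                public void checkServerTrusted(X509Certificate[] chain, String authType) { /* accepts anything */ }
            }
        };
        SSLContext ctx = SSLContext.getInstance("TLS");
        ctx.init(null, trustAll, new SecureRandom());
        HttpsURLConnection.setDefaultSSLSocketFactory(ctx.getSocketFactory());
        // Usually pasted alongside it: a hostname verifier that accepts any host.
        HttpsURLConnection.setDefaultHostnameVerifier((hostname, session) -> true);
    }
}
```

The correct fix -- adding the server's certificate to a trust store, or repairing the certificate itself -- takes more effort, which is exactly the frustration the researchers describe.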

The researchers blame "the rapidly increasing need for enterprise security applications, the lack of security training in the software development workforce, and poorly designed security libraries." Among their suggested solutions: new developer tools which can recognize security errors and suggest patches.
Sci-Fi

According To Star Trek: Discovery, Starfleet Still Runs Microsoft Windows (theverge.com) 237

AmiMoJo shares a report from The Verge: The third episode of Star Trek: Discovery aired this week, and at one point in the episode, Sonequa Martin-Green's Michael Burnham is tasked with reconciling two suites of code. In the show, Burnham claims the code is confusing because it deals with quantum astrophysics, biochemistry, and gene expression. And while the episode later reveals that it's related to the USS Discovery's experimental new mycelial network transportation system, Twitter user Rob Graham noted the code itself is a little more pedestrian in nature. More specifically, it seems to be decompiled code for the infamous Stuxnet virus, developed by the United States to attack Iranian computers running Windows.
Communications

Slack Locks Down Oracle Partnership Targeting Enterprises (reuters.com) 43

From a report: Slack Technologies has secured a partnership with Oracle to integrate the tech giant's enterprise software products into the popular workplace messaging app, the two companies told Reuters. The partnership is a victory for Slack as the young startup ramps up its efforts to win the business of large enterprises in an increasingly competitive marketplace that has seen the entry of Microsoft, Facebook and countless startups. "As you see all these large enterprise software companies looking at messaging as a major platform, they're looking to partner with us first and foremost," said Brad Armstrong, Slack's head of global business and corporate development. The partnership will allow workers to use Slack as the interface for Oracle's sales, human resources and business software.
Software

Code is Too Hard To Think About (theatlantic.com) 397

From a longform piece on The Atlantic: What made programming so difficult was that it required you to think like a computer. The strangeness of it was in some sense more vivid in the early days of computing, when code took the form of literal ones and zeros. Anyone looking over a programmer's shoulder as they pored over line after line like "100001010011" and "000010011110" would have seen just how alienated the programmer was from the actual problems they were trying to solve; it would have been impossible to tell whether they were trying to calculate artillery trajectories or simulate a game of tic-tac-toe. The introduction of programming languages like Fortran and C, which resemble English, and of tools known as "integrated development environments," or IDEs, that help correct simple mistakes (like Microsoft Word's grammar checker, but for code), obscured, though did little to actually change, this basic alienation -- the fact that the programmer didn't work on a problem directly, but rather spent their days writing out instructions for a machine.

"The problem is that software engineers don't understand the problem they're trying to solve, and don't care to," says Nancy Leveson, an MIT software-safety expert. The reason is that they're too wrapped up in getting their code to work. "Software engineers like to provide all kinds of tools and stuff for coding errors," she says, referring to IDEs. "The serious problems that have happened with software have to do with requirements, not coding errors." When you're writing code that controls a car's throttle, for instance, what's important is the rules about when and how and by how much to open it. But these systems have become so complicated that hardly anyone can keep them straight in their head. "There's 100 million lines of code in cars now," Leveson says. "You just cannot anticipate all these things."
Biotech

Chip Reprograms Cells To Regenerate Damaged Tissue (scientificamerican.com) 16

An anonymous reader quotes a report from Scientific American about a device that delivers infusions of DNA and other molecules to restore injured limbs in mice -- and, maybe someday, humans: Cells are typically reprogrammed using mixtures of DNA, RNA and proteins. The most popular method uses viruses as a delivery vehicle -- although they can infect unintended cells, provoke immune responses and even turn cells cancerous. One alternative, called bulk electroporation, exposes cells to an electric field that pokes holes in their membranes to let in genetic material and proteins. Yet this method can stress or kill them. Tissue nanotransfection, described in a study published in August in Nature Nanotechnology, involves a chip containing an array of tiny channels that apply electric fields to individual cells. "You affect only a small area of the cell surface, compared with the conventional method, which upsets the entire cell," says study co-author James Lee, a chemical and biomolecular engineer at The Ohio State University. "Essentially we create a tiny hole and inject DNA right into the cell, so we can control the dosage."

Chandan Sen, a physiologist at Ohio State, and his colleagues developed a genetic cocktail that rapidly converts skin cells into endothelial cells -- the main component of blood vessels. They then used their technique on mice whose legs had been damaged by a severed artery that cut off blood supply. New blood vessels formed, blood flow increased, and after three weeks the legs had completely healed.

Businesses

Former Female Oracle Employees Sue Company For Alleged Pay Discrimination (techcrunch.com) 121

Three former female Oracle employees are suing the company for allegedly paying women less than men in similar jobs. Rong Jewett, Sophy Wang and Xian Murray filed a lawsuit August 28, seeking class-action status to represent all other women who have worked at the company. TechCrunch reports: The lawsuit, first reported by The Information, alleges that Oracle discriminated against women by "systematically paying them lower wage rates than Oracle pays to male employees performing substantially equal or similar work under similar working conditions." The lawsuit covers the period from four years prior to the filing through the date of the trial in California. Referencing the U.S. Department of Labor's January suit against Oracle -- based on a compliance review that found "systemic discrimination against women" and "gross disparities in pay" -- the lawsuit states Oracle knew or should have known about the pay disparity between its male and female employees. The plaintiffs are seeking wages due, interest and liquidated damages plus interest. They also want Oracle to guarantee it won't pay women less than men for similar work in the future.
Programming

'Tetris' Recreated In Conway's 'Game of Life' (stackexchange.com) 87

In 1970 mathematician John Conway created the rules for the "Game of Life," a now-famous "zero-player game" in which a grid of cells evolves (following Conway's rules) from an initial state proposed by the player. In 2013 someone challenged readers of StackExchange's "Programming Puzzles & Code Golf" section to devise an initial state "that will allow for the playing of a game of Tetris."
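For reference, Conway's rules are short: a live cell with two or three live neighbors survives, a dead cell with exactly three live neighbors comes alive, and every other cell dies or stays empty. A minimal Java sketch of one generation (assuming a finite grid with dead borders; the actual challenge plays out on Life's infinite plane):

```java
// One generation of Conway's Game of Life on a finite grid.
// Cells outside the array are treated as permanently dead.
public class Life {
    static boolean[][] step(boolean[][] grid) {
        int rows = grid.length, cols = grid[0].length;
        boolean[][] next = new boolean[rows][cols];
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                int neighbors = 0;
                for (int dr = -1; dr <= 1; dr++) {
                    for (int dc = -1; dc <= 1; dc++) {
                        if (dr == 0 && dc == 0) continue;
                        int nr = r + dr, nc = c + dc;
                        if (nr >= 0 && nr < rows && nc >= 0 && nc < cols && grid[nr][nc]) neighbors++;
                    }
                }
                // Birth on exactly 3 neighbors; survival on 2 or 3.
                next[r][c] = grid[r][c] ? (neighbors == 2 || neighbors == 3) : (neighbors == 3);
            }
        }
        return next;
    }
}
```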

An anonymous Slashdot reader reports that "This challenge sat around, gathering upvotes but no answer, for four years. Then, it was answered." Citing the work of seven contributors, a massive six-part response says their solution took one and a half years to create, and "began as a quest but ended as an odyssey." The team created their own assembly language, known as QFTASM (Quest for Tetris Assembly) for use within Conway's mathematical universe, and then also designed their own processor architecture, and eventually even a higher-level language that they named COGOL. Their StackExchange response includes a link to all of their code on GitHub, as well as to a page where you can run the code online.

One StackExchange reader hailed the achievement as "the single greatest thing I've ever scrolled through while understanding very little."
Programming

Do Strongly Typed Languages Reduce Bugs? (acolyer.org) 456

"Static vs dynamic typing is always one of those topics that attracts passionately held positions," writes the Morning Paper -- reporting on an "encouraging" study that attempted to empirically evaluate the efficacy of statically-typed systems on mature, real-world code bases. The study was conducted by Christian Bird at Microsoft's "Research in Software Engineering" group with two researchers from University College London. Long-time Slashdot reader phantomfive writes: This study looked at bugs found in open source Javascript code. Looking through the commit history, they enumerated the bugs that would have been caught if a more strongly typed language (like Typescript) had been used. They found that a strongly typed language would have reduced bugs by 15%.

Does this make you want to avoid Python?

Cellphones

Apple's Swift 4.0 Includes A Compatibility Mode For 'The Majority' Of Swift 3.x Code (infoworld.com) 122

An anonymous reader quotes InfoWorld: Swift 4.0 is now available. It's a major upgrade to Apple's Swift, the three-year-old successor to the Objective-C language used for macOS and iOS application development. The Swift 4 upgrade enhances the Swift Package Manager and provides new compatibility modes for developers. Apple said Swift 4 also makes Swift more stable and improves its standard library. Swift 4 is largely source-compatible with Swift 3 and ships as part of Apple's Xcode 9 IDE...

Swift 4's new compatibility modes could save you from having to modify code in order to use the new version of the compiler. Two modes are supported: the Swift 3.2 mode, which accepts most source files built with Swift 3.x compilers, and the Swift 4.0 mode, which includes the Swift 4 language and API changes. Apple said that some source migration will be needed for many projects, but the number of source changes is "quite modest" compared to many previous major changes between Swift releases.

Apple calls Swift 4.0 "a major language release" that also includes new language changes and updates that came through the Swift Evolution process.
IBM

IBM Open Sources 'WebSphere Liberty' For Java Microservices and Cloud-Native Apps (techrepublic.com) 17

An anonymous reader quotes TechRepublic: On Wednesday, IBM revealed the Open Liberty project, open sourcing its WebSphere Liberty code on GitHub to support Java microservices and cloud-native apps. The company created Liberty five years ago to help developers more quickly and easily create applications using agile and DevOps principles, according to an IBM developerWorks blog post from Ian Robinson, WebSphere Foundation chief architect at IBM... Developers can also choose to move to the commercial versions of WebSphere Liberty at any time, he noted, which include technical support and more specialized features... "We hope Open Liberty will help more developers turn their ideas into full-fledged, enterprise ready apps," Robinson wrote. "We also hope it will broaden the WebSphere family to include more ideas and innovations to benefit the broader Java community of developers at organizations big and small."

IBM argues that Open Liberty, along with the OpenJ9 VM it open sourced last week, "provides the full Java stack from IBM with a fully open licensing model."

Interestingly, Slashdot ran a story asking "IBM WebSphere SE To Be Opened?" -- back in 2000.
