AI

At Amazon, Some Coders Say Their Jobs Have Begun To Resemble Warehouse Work (nytimes.com) 206

Amazon software engineers are reporting that AI tools are transforming their jobs into something resembling the company's warehouse work, with managers pushing faster output and tighter deadlines while teams shrink in size, according to the New York Times.

Three Amazon engineers told the New York Times that the company has raised productivity goals over the past year and expects developers to use AI assistants that suggest code snippets or generate entire program sections. One engineer said his team was cut roughly in half but still expected to produce the same amount of code by relying on AI tools.

The shift mirrors historical workplace changes during industrialization, the Times argues, where technology didn't eliminate jobs but made them more routine and fast-paced. Engineers describe feeling like "bystanders in their own jobs" as they spend more time reviewing AI-generated code rather than writing it themselves. Tasks that once took weeks now must be completed in days, with less time for meetings and collaborative problem-solving, according to the engineers.
Programming

Is AI Turning Coders Into Bystanders in Their Own Jobs? (msn.com) 101

"AI's downside for software engineers for now seems to be a change in the quality of their work," reports the New York Times. "Some say it is becoming more routine, less thoughtful and, crucially, much faster paced... The new approach to coding at many companies has, in effect, eliminated much of the time the developer spends reflecting on his or her work."

And Amazon CEO Andy Jassy even recently told shareholders Amazon would "change the norms" for programming by how they used AI. Those changing norms have not always been eagerly embraced. Three Amazon engineers said managers had increasingly pushed them to use AI in their work over the past year. The engineers said the company had raised output goals [which affect performance reviews] and had become less forgiving about deadlines. It has even encouraged coders to gin up new AI productivity tools at an upcoming hackathon, an internal coding competition. One Amazon engineer said his team was roughly half the size it was last year, but it was expected to produce roughly the same amount of code by using AI.

Other tech companies are moving in the same direction. In a memo to employees in April, the CEO of Shopify, a company that helps entrepreneurs build and manage e-commerce websites, announced that "AI usage is now a baseline expectation" and that the company would "add AI usage questions" to performance reviews. Google recently told employees that it would soon hold a companywide hackathon in which one category would be creating AI tools that could "enhance their overall daily productivity," according to an internal announcement. Winning teams will receive $10,000.

The shift has not been all negative for workers. At Amazon and other companies, managers argue that AI can relieve employees of tedious tasks and enable them to perform more interesting work. Jassy wrote last year that the company had saved "the equivalent of 4,500 developer-years" by using AI to do the thankless work of upgrading old software... As at Microsoft, many Amazon engineers use an AI assistant that suggests lines of code. But the company has more recently rolled out AI tools that can generate large portions of a program on their own. One engineer called the tools "scarily good." The engineers said that many colleagues have been reluctant to use these new tools because they require a lot of double-checking and because the engineers want more control.

"It's more fun to write code than to read code," said Simon Willison, an AI fan who is a longtime programmer and blogger, channelling the objections of other programmers. "If you're told you have to do a code review, it's never a fun part of the job. When you're working with these tools, it's most of the job."

"This shift from writing to reading code can make engineers feel like bystanders in their own jobs," the article points out (adding "The automation of coding has special resonance for Amazon engineers, who have watched their blue-collar counterparts undergo a similar transition..."

"While there is no rush to form a union for coders at Amazon, such a move would not be unheard of. When General Motors workers went on strike in 1936 to demand recognition of their union, the United Auto Workers, it was the dreaded speedup that spurred them on."
Programming

Python Can Now Call Code Written in Chris Lattner's Mojo (modular.com) 26

Mojo (the programming language) reached a milestone today.

The story so far... Chris Lattner created the Swift programming language (and answered questions from Slashdot readers in 2017 on his way to new jobs at Tesla, Google, and SiFive). But in 2023 he created a new programming language called Mojo — a superset of Python with added functionality for high-performance code that takes advantage of modern accelerators — as part of his work at AI infrastructure company Modular.

And today Modular's product manager Brad Larson announced Python users can now call Mojo code from Python. (Watch for it in Mojo's latest nightly builds...) The Python interoperability section of the Mojo manual has been expanded and now includes a dedicated document on calling Mojo from Python. We've also added a couple of new examples to the modular GitHub repository: a "hello world" that shows how to round-trip from Python to Mojo and back, and one that shows how even Mojo code that uses the GPU can be called from Python. This is usable through any of the ways of installing MAX [their Modular Accelerated Xecution platform, an integrated suite of AI compute tools] and the Mojo compiler: via pip install modular / pip install max, or with Conda via Magic / Pixi.

One of our goals has been the progressive introduction of MAX and Mojo into the massive Python codebases out in the world today. We feel that enabling selective migration of performance bottlenecks in Python code to fast Mojo (especially Mojo running on accelerators) will unlock entirely new applications. I'm really excited for how this will expand the reach of the Mojo code many of you have been writing...

It has taken months of deep technical work to get to this point, and this is just the first step in the roll-out of this new language feature. I strongly recommend reading the list of current known limitations to understand what may not work just yet, both to avoid potential frustration and to prevent the filing of duplicate issues for known areas that we're working on.

"We are really interested in what you'll build with this new functionality, as well as hearing your feedback about how this could be made even better," the post concludes.

Mojo's licensing makes it free on any device, for any research, hobby, or learning project, as well as on x86 or ARM CPUs or NVIDIA GPUs.
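The announcement itself doesn't include sample code above, but to give a sense of the workflow, here is a rough sketch of the Python side, assuming a Mojo source file named mojo_module.mojo in the working directory that exports a factorial function. The max.mojo.importer hook, the module name, and the function name here are assumptions drawn from the Mojo interop documentation the post points to, not verbatim from it.

```python
# Hypothetical sketch: calling a Mojo-compiled module from Python.
# Assumes MAX/Mojo is installed (e.g. via `pip install modular`) and that a
# file mojo_module.mojo in the current directory exports a "factorial"
# function. The import-hook module path below follows Modular's interop docs
# and may differ across releases.
import sys

import max.mojo.importer  # noqa: F401 -- registers a finder that compiles .mojo files on import

sys.path.insert(0, "")  # let the import hook see .mojo files in the current directory

import mojo_module  # compiled from mojo_module.mojo on first import

print(mojo_module.factorial(10))  # calls into Mojo-compiled code from plain Python
```

The selling point described in the post is exactly this shape of migration: a hot loop moves into Mojo while the surrounding Python code and imports stay unchanged.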
Java

Java Turns 30 (theregister.com) 97

Richard Speed writes via The Register: It was 30 years ago when the first public release of the Java programming language introduced the world to Write Once, Run Anywhere -- and showed devs something cuddlier than C and C++. Originally called "Oak," Java was designed in the early 1990s by James Gosling at Sun Microsystems. Initially aimed at digital devices, its focus soon shifted to another platform that was pretty new at the time -- the World Wide Web.

The language, which has some similarities to C and C++, usually compiles to a bytecode that can, in theory, run on any Java Virtual Machine (JVM). The intention was to allow programmers to Write Once Run Anywhere (WORA) although subtle differences in JVM implementations meant that dream didn't always play out in reality. This reporter once worked with a witty colleague who described the system as Write Once Test Everywhere, as yet another unexpected wrinkle in a JVM caused their application to behave unpredictably. However, the language soon became wildly popular, rapidly becoming the backbone of many enterprises. [...]

However, the platform's ubiquity has meant that alternatives exist to Oracle Java, and the language's popularity is undiminished by so-called "predatory licensing tactics." Over 30 years, Java has moved from an upstart new language to something enterprises have come to depend on. Yes, it may not have the shiny baubles demanded by the AI applications of today, but it continues to be the foundation for much of today's modern software development. A thriving ecosystem and a vast community of enthusiasts mean that Java remains more than relevant as it heads into its fourth decade.

IT

Glitch is Basically Shutting Down (theverge.com) 7

Glitch, the coding platform where developers can share and remix projects, will soon no longer offer its core feature: hosting apps on the web. From a report: In an update on Thursday, Glitch CEO Anil Dash said it will stop hosting projects and close user profiles on July 8th, 2025 -- but stopped short of saying that it's shutting down completely.

Users will be able to access their dashboard and download code for their projects through the end of 2025, and Glitch is working on a new feature that allows users to redirect their project subdomains. The platform has also stopped taking new Pro subscriptions, but it will continue to honor existing subscriptions until July 8th.

Microsoft

Microsoft's Edit on Windows is a New Command-Line Text Editor (theverge.com) 105

Microsoft unveiled "Edit on Windows," a new command-line text editor, at its Build conference today. The open-source tool allows developers to edit files directly in the command line without switching to another app, similar to vim but designed to be more user-friendly.

Accessible by typing "edit" in a command prompt, the lightweight editor (less than 250KB) includes features like multiple file support via ctrl + P shortcuts, find and replace functionality, and regular expression support. "What motivated us to build Edit was the need for a default CLI text editor in 64-bit versions of Windows," said Christopher Nguyen, product manager of Windows Terminal, noting that 32-bit Windows versions already ship with MS-DOS Edit.

Microsoft also wanted to avoid the notorious "how do I exit vim?" problem by creating a modeless editor, The Verge writes. The tool will be available to Windows Insiders in the coming months.
Programming

'Rust is So Good You Can Get Paid $20K to Make It as Fast as C' (itsfoss.com) 180

The Prossimo project (funded by the nonprofit Internet Security Research Group) seeks to "move the Internet's security-sensitive software infrastructure to memory safe code." Two years ago the Prossimo project announced it had begun work on rav1d, a safer, high-performance AV1 decoder written in Rust. From a new update: We partnered with Immunant to do the engineering work. By September of 2024 rav1d was basically complete and we learned a lot during the process. Today rav1d works well — it passes all the same tests as the dav1d decoder it is based on, which is written in C. It's possible to build and run Chromium with it.

There's just one problem — it's not quite as fast as the C version...

Our Rust-based rav1d decoder is currently about 5% slower than the C-based dav1d decoder (the exact amount differs a bit depending on the benchmark, input, and platform). This is enough of a difference to be a problem for potential adopters, and, frankly, it just bothers us. The development team worked hard to get it to performance parity. We brought in a couple of other contractors who have experience with optimizing things like this. We wrote about the optimization work we did. However, we were still unable to get to performance parity and, to be frank again, we aren't really sure what to do next.

After racking our brains for options, we decided to offer a bounty pool of $20,000 for getting rav1d to performance parity with dav1d. Hopefully folks out there can help get rav1d performance advanced to where it needs to be, and ideally we and the Rust community will also learn something about how Rust performance stacks up against C.

This drew a snarky response from FFmpeg, the framework that powers audio and video processing for everyone from VLC to Twitch. "Rust is so good you can get paid $20k to make it as fast as C," they posted to their 68,300 followers on X.com.

Thanks to the It's FOSS blog for spotting the announcement.
Programming

Ask Slashdot: Would You Consider a Low-Latency JavaScript Runtime For Your Workflow? (github.com) 185

Amazon's AWS Labs has created LLRT (Low Latency Runtime), an experimental, lightweight JavaScript runtime designed to address the growing demand for fast and efficient serverless applications.

Slashdot reader BitterEpic wants to know what you think of it: Traditional JavaScript runtimes like Node.js rely on garbage collection, which can introduce unpredictable pauses and slow down performance, especially during cold starts in serverless environments like AWS Lambda. LLRT's manual memory management, courtesy of Rust, eliminates this issue, leading to smoother, more predictable performance. LLRT also has a runtime under 2MB, a huge reduction compared to the 100MB+ typically required by Node.js. This lightweight design means lower memory usage, better scalability, and reduced operational costs. Without the overhead of garbage collection, LLRT has faster cold start times and can initialize in milliseconds—perfect for latency-sensitive applications where every millisecond counts.

For JavaScript developers, LLRT offers the best of both worlds: rapid development with JavaScript's flexibility, combined with Rust's performance. This means faster, more scalable applications without the usual memory bloat and cold start issues. Still in beta, LLRT promises to be a major step forward for serverless JavaScript applications. By combining Rust's performance with JavaScript's flexibility, it opens new possibilities for building high-performance, low-latency applications. If it continues to evolve, LLRT could become a core offering in AWS Lambda, potentially changing how we approach serverless JavaScript development.

Would you consider JavaScript as the core of your future workflow? Or maybe you would prefer to go lower-level with QuickJS?

Programming

Stack Overflow Seeks Realignment 'To Support the Builders of the Future in an AI World' (devclass.com) 58

"The world has changed," writes Stack Overflow's blog. "Fast. Artificial intelligence is reshaping how we build, learn, and solve problems. Software development looks dramatically different than it did even a few years ago — and the pace of change is only accelerating."

And they believe their brand "at times" lost "fidelity and clarity. It's very much been always added to and not been thought of holistically. So, it's time for our brand to evolve too," they write, hoping to articulate a perspective "forged in the fires of community, powered by collaboration, shaped by AI, and driven by people."

The developer news site DevClass notes the change happens "as the number of posts to its site continues a dramatic decline thanks to AI-driven alternatives." According to a quick query on the official data explorer, the sum of questions and answers posted in April 2025 was down by over 64 percent from the same month in 2024, and plunged more than 90 percent from April 2020, when traffic was near its peak...

Although declining traffic is a sign of Stack Overflow's reduced significance in the developer community, the company's business is not equally affected so far. Stack Exchange is a business owned by investment company Prosus, and the Stack Exchange products include private versions of its site (Stack Overflow for Teams) as well as advertising and recruitment. According to the Prosus financial results, in the six months ended September 2024, Stack Overflow increased its revenue and reduced its losses. The company's search for a new direction though confirms that the fast-disappearing developer engagement with Stack Overflow poses an existential challenge to the organization.

DevClass says Stack Overflow's parent company "is casting about for new ways to provide value (and drive business) in this context..." The company has already experimented with various new services, via its Labs research department, including an AI Answer Assistant and Question Assistant, as well as a revamped jobs site in association with recruitment site Indeed, Discussions for technical debate, and extensions for GitHub Copilot, Slack, and Visual Studio Code.
From the official announcement on Stack Overflow's blog: This rebrand isn't just a fresh coat of paint. It's a realignment with our purpose: to support the builders of the future in an AI world — with clarity, speed, and humanity. It's about showing up in a way that reflects who we are today, and where we're headed tomorrow.

"We have appointed an internal steering group and we have engaged with an external expert partner in this area to help bring about the required change," notes a post in Stack Exchange's "meta" area. This isn't just about a visual update or marketing exercise — it's going to bring about a shift in how we present ourselves to the world which you will feel everywhere from the design to the copywriting, so that we can better achieve our goals and shared mission. As the emergence of AI has called into question the role of Stack Overflow and the Stack Exchange Network, one of the desired outputs of the rebrand process is to clarify our place in the world.

We've done work toward this already — our recent community AMA is an example of this — but we want to ensure that this comes across in our brand and identity as well. We want the community to be involved and have a strong voice in the process of renewing and refreshing our brand. Remember, Stack Overflow started with a public discussion about what to name it!

And in another post two months ago, Stack Exchange said it is exploring early ideas for expanding beyond the "single lane" Q&A highway. Our goal right now is to better understand the problems, opportunities, and needs before deciding on any specific changes...

The vision is to potentially enable:

- A slower lane, with high-quality durable knowledge that takes time to create and curate, like questions and answers.

- A medium lane, for more flexible engagement, with features like Discussions or more flexible Stack Exchanges, where users can explore ideas or share opinions.

- A fast lane for quick, real-time interaction, with features like Chat that can bring the community together to discuss topics instantly.

With this in mind, we're seeking your feedback on the current state of Chat, what's most important to you, and how you see Chat fitting into the future.

In a post in Stack Exchange's "meta" area, brand design director David Longworth says the "tension mentioned between Stack Overflow and Stack Exchange" is probably the most relevant to the rebranding.

But he posted later that "There's a lot of people behind the scenes on this who care deeply about getting this right! Thank you on behalf of myself and the team."
Programming

Curl Warns GitHub About 'Malicious Unicode' Security Issue (daniel.haxx.se) 69

A Curl contributor replaced an ASCII letter with a Unicode alternative in a pull request, writes Curl lead developer/founder Daniel Stenberg. And not a single human reviewer on the team (or any of their CI jobs) noticed.

The change "looked identical to the ASCII version, so it was not possible to visually spot this..." The impact of changing one or more letters in a URL can of course be devastating depending on conditions... [W]e have implemented checks to help us poor humans spot things like this. To detect malicious Unicode. We have added a CI job that scans all files and validates every UTF-8 sequence in the git repository.

In the curl git repository most files and most content are plain old ASCII so we can "easily" whitelist a small set of UTF-8 sequences and some specific files, the rest of the files are simply not allowed to use UTF-8 at all as they will then fail the CI job and turn up red. In order to drive this change home, we went through all the test files in the curl repository and made sure that all the UTF-8 occurrences were instead replaced by other kind of escape sequences and similar. Some of them were also used more or less by mistake and could easily be replaced by their ASCII counterparts.

The next time someone tries this stunt on us it could be someone with less good intentions, but now ideally our CI will tell us... We want and strive to be proactive and tighten everything before malicious people exploit some weakness somewhere but security remains this never-ending race where we can only do the best we can and while the other side is working in silence and might at some future point attack us in new creative ways we had not anticipated. That future unknown attack is a tricky thing.
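Curl's actual CI job isn't reproduced in the post, but the policy it describes — plain ASCII everywhere except an explicit allowlist of files — is simple to sketch. The Python illustration below is not curl's script; the allowlisted file names are placeholders, and it simply flags any git-tracked file containing bytes outside the ASCII range.

```python
#!/usr/bin/env python3
"""Minimal sketch of a "no unexpected UTF-8" repository scan.

Not curl's real CI script -- just the idea described in the post: most files
must be pure ASCII, and only an explicit allowlist may contain UTF-8.
"""
import subprocess
import sys

# Files allowed to contain non-ASCII content (illustrative names only).
ALLOWLIST = {"docs/THANKS", "README.md"}


def tracked_files():
    # Ask git for every tracked file so generated and ignored files are skipped.
    out = subprocess.run(["git", "ls-files"], capture_output=True, text=True, check=True)
    return out.stdout.splitlines()


def main():
    bad = []
    for path in tracked_files():
        if path in ALLOWLIST:
            continue
        with open(path, "rb") as f:
            data = f.read()
        # Any byte >= 0x80 means the file is not plain ASCII (e.g. a Unicode
        # homoglyph swapped in for an ASCII letter).
        if any(b >= 0x80 for b in data):
            bad.append(path)
    if bad:
        print("Non-ASCII content found in files not on the allowlist:")
        for path in bad:
            print(f"  {path}")
        return 1
    return 0


if __name__ == "__main__":
    sys.exit(main())
```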

In the original blog post Stenberg complained he got "barely no responses" from GitHub (joking "perhaps they are all just too busy implementing the next AI feature we don't want.") But hours later he posted an update.

"GitHub has told me they have raised this as a security issue internally and they are working on a fix."
Programming

Rust Creator Graydon Hoare Thanks Its Many Stakeholders - and Mozilla - on Rust's 10th Anniversary (rustfoundation.org) 35

Thursday was the 10-year anniversary of Rust's first stable release. "To say I'm surprised by its trajectory would be a vast understatement," writes Rust's original creator Graydon Hoare. "I can only thank, congratulate, and celebrate everyone involved... In my view, Rust is a story about a large community of stakeholders coming together to design, build, maintain, and expand shared technical infrastructure." It's a story with many actors:

- The population of developers the language serves who express their needs and constraints through discussion, debate, testing, and bug reports arising from their experience writing libraries and applications.

- The language designers and implementers who work to satisfy those needs and constraints while wrestling with the unexpected consequences of each decision.

- The authors, educators, speakers, translators, illustrators, and others who work to expand the set of people able to use the infrastructure and work on the infrastructure.

- The institutions investing in the project who provide the long-term funding and support necessary to sustain all this work over decades.

All these actors have a common interest in infrastructure.

Rather than just "systems programming", Hoare sees Rust as a tool for building infrastructure itself, "the robust and reliable necessities that enable us to get our work done" — a wide range that includes everything from embedded and IoT systems to multi-core systems. So the story of "Rust's initial implementation, its sustained investment, and its remarkable resonance and uptake all happened because the world needs robust and reliable infrastructure, and the infrastructure we had was not up to the task." Put simply: it failed too often, in spectacular and expensive ways. Crashes and downtime in the best cases, and security vulnerabilities in the worst. Efficient "infrastructure-building" languages existed but they were very hard to use, and nearly impossible to use safely, especially when writing concurrent code. This produced an infrastructure deficit many people felt, if not everyone could name, and it was growing worse by the year as we placed ever-greater demands on computers to work in ever more challenging environments...

We were stuck with the tools we had because building better tools like Rust was going to require an extraordinary investment of time, effort, and money. The bootstrap Rust compiler I initially wrote was just a few tens of thousands of lines of code; that was nearing the limits of what an unfunded solo hobby project can typically accomplish. Mozilla's decision to invest in Rust in 2009 immediately quadrupled the size of the team — it created a team in the first place — and then doubled it again, and again in subsequent years. Mozilla sustained this very unusual, very improbable investment in Rust from 2009-2020, as well as funding an entire browser engine written in Rust — Servo — from 2012 onwards, which served as a crucial testbed for Rust language features.

Rust and Servo had multiple contributors at Samsung, Hoare acknowledges, and Amazon, Facebook, Google, Microsoft, Huawei, and others "hired key developers and contributed hardware and management resources to its ongoing development." Rust itself "sits atop LLVM" (developed by researchers at UIUC and later funded by Apple, Qualcomm, Google, ARM, Huawei, and many other organizations), while Rust's safe memory model "derives directly from decades of research in academia, as well as academic-industrial projects like Cyclone, built by AT&T Bell Labs and Cornell."

And there were contributions from "interns, researchers, and professors at top academic research programming-language departments, including CMU, NEU, IU, MPI-SWS, and many others." JetBrains and the Rust-Analyzer OpenCollective essentially paid for two additional interactive-incremental reimplementations of the Rust frontend to provide language services to IDEs — critical tools for productive, day-to-day programming. Hundreds of companies and other institutions contributed time and money to evaluate Rust for production, write Rust programs, test them, file bugs related to them, and pay their staff to fix or improve any shortcomings they found. Last but very much not least: Rust has had thousands and thousands of volunteers donating years of their labor to the project. While it might seem tempting to think this is all "free", it's being paid for! Just less visibly than if it were part of a corporate budget.

All this investment, despite the long time horizon, paid off. We're all better for it.

He looks ahead with hope for a future with new contributors, "steady and diversified streams of support," and continued reliability and compatibility (including "investment in ever-greater reliability technology, including the many emerging formal methods projects built on Rust.")

And he closes by saying Rust's "sustained, controlled, and frankly astonishing throughput of work" has "set a new standard for what good tools, good processes, and reliable infrastructure software should be like."

"Everyone involved should be proud of what they've built."
AI

OpenAI Launches Codex, an AI Coding Agent, In ChatGPT 12

OpenAI has launched Codex, a powerful AI coding agent in ChatGPT that autonomously handles tasks like writing features, fixing bugs, and testing code in a cloud-based environment. TechCrunch reports: Codex is powered by codex-1, a version of the company's o3 AI reasoning model optimized for software engineering tasks. OpenAI says codex-1 produces "cleaner" code than o3, adheres more precisely to instructions, and will iteratively run tests on its code until passing results are achieved.

The Codex agent runs in a sandboxed, virtual computer in the cloud. By connecting with GitHub, Codex's environment can come preloaded with your code repositories. OpenAI says the AI coding agent will take anywhere from one to 30 minutes to write simple features, fix bugs, answer questions about your codebase, and run tests, among other tasks. Codex can handle multiple software engineering tasks simultaneously, says OpenAI, and it doesn't limit users from accessing their computer and browser while it's running.

Codex is rolling out starting today to subscribers to ChatGPT Pro, Enterprise, and Team. OpenAI says users will have "generous access" to Codex to start, but in the coming weeks, the company will implement rate limits for the tool. Users will then have the option to purchase additional credits to use Codex, an OpenAI spokesperson tells TechCrunch. OpenAI plans to expand Codex access to ChatGPT Plus and Edu users soon.
Software

Carmack: World Could Run on Older Hardware if Software Optimization Was Priority 174

Gaming pioneer John Carmack believes we're not nearly as dependent on cutting-edge silicon as most assume -- we just lack the economic incentive to prove it. Responding to a "CPU apocalypse" thought experiment on X, the id Software founder and former Oculus CTO suggested that software inefficiency, not hardware limitations, is our greatest vulnerability. "More of the world than many might imagine could run on outdated hardware if software optimization was truly a priority," Carmack wrote, arguing that market pressures would drive dramatic efficiency improvements if new chips stopped arriving.

His solution? "Rebuild all the interpreted microservice based products into monolithic native codebases!" -- essentially abandoning modern development patterns for the more efficient approaches of earlier computing eras. The veteran programmer noted that such changes would come with significant tradeoffs: "Innovative new products would get much rarer without super cheap and scalable compute."
Google

Google Developing Software AI Agent 9

An anonymous reader shares a report: After weeks of news about Google's antitrust travails, the tech giant will try to reset the narrative next week by highlighting advances it is making in artificial intelligence, cloud and Android technology at its annual I/O developer conference.

Ahead of I/O, Google has been demonstrating to employees and outside developers an array of different products, including an AI agent for software development. Known internally as a "software development lifecycle agent," it is intended to help software engineers navigate every stage of the software process, from responding to tasks to documenting code, according to three people who have seen demonstrations of the product or been told about it by Google employees. Google employees have described it as an always-on coworker that can help identify bugs to fix or flag security vulnerabilities, one of the people said, although it's not clear how close it is to being released.
Programming

Over 3,200 Cursor Users Infected by Malicious Credential-Stealing npm Packages (thehackernews.com) 30

Cybersecurity researchers have flagged three malicious npm packages that target the macOS version of AI-powered code-editing tool Cursor, reports The Hacker News: "Disguised as developer tools offering 'the cheapest Cursor API,' these packages steal user credentials, fetch an encrypted payload from threat actor-controlled infrastructure, overwrite Cursor's main.js file, and disable auto-updates to maintain persistence," Socket researcher Kirill Boychenko said. All three packages continue to be available for download from the npm registry. "Aiide-cur" was first published on February 14, 2025...

In total, the three packages have been downloaded over 3,200 times to date.... The findings point to an emerging trend where threat actors are using rogue npm packages as a way to introduce malicious modifications to other legitimate libraries or software already installed on developer systems... "By operating inside a legitimate parent process — an IDE or shared library — the malicious logic inherits the application's trust, maintains persistence even after the offending package is removed, and automatically gains whatever privileges that software holds, from API tokens and signing keys to outbound network access," Socket told The Hacker News.

"This campaign highlights a growing supply chain threat, with threat actors increasingly using malicious patches to compromise trusted local software," Boychenko said.

The npm packages "restart the application so that the patched code takes effect," letting the threat actor "execute arbitrary code within the context of the platform."
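The report doesn't include detection guidance, but one way a developer might notice this kind of local patching is to compare the editor's main.js against a checksum recorded from a known-good install. A minimal sketch in Python follows; the file path and baseline hash are placeholders, not real values for Cursor.

```python
"""Illustrative sketch: detecting local tampering of an application file.

The path and the baseline hash are placeholders -- the point is only that a
patched main.js no longer matches a checksum recorded when the editor was
installed or last updated.
"""
import hashlib
from pathlib import Path

# Placeholder path -- the real location depends on the Cursor installation.
MAIN_JS = Path("/Applications/Cursor.app/Contents/Resources/app/out/main.js")
# Placeholder baseline recorded from a known-good install.
EXPECTED_SHA256 = "0" * 64


def file_sha256(path: Path) -> str:
    # Hash the file in chunks so large files don't need to fit in memory.
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


if __name__ == "__main__":
    actual = file_sha256(MAIN_JS)
    if actual != EXPECTED_SHA256:
        print("main.js does not match the recorded baseline -- possible tampering")
    else:
        print("main.js matches the recorded baseline")
```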
