Cloud

Google, Oracle Cloud Servers Wilt in UK Heatwave, Take Down Websites (theregister.com) 61

Cloud services and servers hosted by Google and Oracle in the UK have dropped offline due to cooling issues as the nation experiences a record-breaking heatwave. From a report: When the mercury hit 40.3C (104.5F) in eastern England, the highest ever registered in a country not used to these conditions, datacenters couldn't take the heat. Selected machines were powered off to avoid long-term damage, causing some resources, services, and virtual machines to become unavailable, taking down unlucky websites and the like.

Multiple Oracle Cloud Infrastructure resources are offline, including networking, storage, and compute provided by its servers in the south of the UK. Cooling systems were blamed, and techies switched off equipment in a bid to prevent hardware burning out, according to a status update from Team Oracle. "As a result of unseasonal temperatures in the region, a subset of cooling infrastructure within the UK South (London) Data Centre has experienced an issue," Oracle said on Tuesday at 1638 UTC. "As a result some customers may be unable to access or use Oracle Cloud Infrastructure resources hosted in the region."

Google

Google Will Let European Developers Use Their Own Billing Systems (theverge.com) 19

Google will start allowing the developers of non-gaming apps in the European Economic Area (EEA) to offer alternate payment systems. In a blog post, Google outlines its plans to comply with the Digital Markets Act (or DMA), a piece of legislation aimed at regulating big tech. From a report: The DMA passed through the European Parliament earlier this month but isn't expected to go into force until spring 2023. Google is rolling out the changes ahead of time to make sure that its plans "serve the needs" of users.

The legislation requires "gatekeepers," or companies with a market capitalization of $75.8 billion or over, to follow a set of rules meant to promote competition among digital platforms. Failing to comply could lead to fines of up to 10 percent of a firm's global revenue or 20 percent in case of repeat offenses. Android developers who choose to use an alternate payment processor will still have to pay Google a service fee for each transaction on the first $1 million they make within one year. However, Google says it will reduce this fee by 3 percent, meaning the company will take a 12 percent or lower cut from every transaction. If developers make more than $1 million in one year, Google will charge developers a 27 percent fee on transactions (3 percent less than the standard 30 percent).
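The fee arithmetic above is easy to misread, so here is a small worked sketch (the function name is illustrative, and it assumes the 27 percent rate applies only to revenue above the $1 million threshold, a point the report leaves slightly ambiguous):

```python
def alt_billing_fee(annual_revenue_usd: float) -> float:
    """Estimate Google's service fee for a non-gaming EEA app using an
    alternate billing system, per the figures in the report: the standard
    15%/30% tiers, each reduced by 3 percentage points."""
    first_tier = min(annual_revenue_usd, 1_000_000)
    remainder = max(annual_revenue_usd - 1_000_000, 0)
    # 15% - 3% = 12% on the first $1M; 30% - 3% = 27% above it.
    return first_tier * 0.12 + remainder * 0.27

# A developer earning $1.5M pays 12% on the first $1M, 27% on the rest.
print(alt_billing_fee(1_500_000))  # 255000.0
```

So under this reading, a $1.5 million year costs $255,000 in fees rather than the $420,000 the standard 15/30 split would have charged.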

GNU is Not Unix

GCC Rust Approved by Steering Committee, Beta Likely Next April (phoronix.com) 51

Phoronix reports: The GCC Steering Committee has approved the GCC Rust front-end, which provides Rust programming language support in the GNU Compiler Collection. This front-end will likely be merged ahead of the GCC 13 release next year.

The GCC Steering Committee this morning has announced that the Rust front-end "GCC Rust" is appropriate for inclusion into the GCC mainline code-base. This is the effort that has been in the works for a while as an alternative to Rust's official LLVM-based compiler. GCC Rust is still under active development but is getting into shape for mainlining.

The hope is to have at least "beta" level support for the Rust programming language in GCC 13, which will be released as stable around April of next year.

Programming

Ask Slashdot: Does WebAssembly Increase Your Web Browser's Attack Surface? (github.com) 104

Steve Springett is a conscientious senior security architect. And in 2018, he published an essay on GitHub arguing that from a security engineer's perspective, WebAssembly "increases the attack surface of any browser that supports it."

Springett wrote that WebAssembly modules are sent in (unsigned) binary format — without a transport-layer security mechanism — and rely on browser sandboxing for safety. But the binary format makes it harder to analyze the code, while sandboxing "is prone to breakouts and effectiveness varies largely by implementation. Adobe Flash is an example of a technology that was sandboxed after a series of exploits, yet exploits and breakouts still occurred." Springett even went so far as to offer the commands for switching off WebAssembly in your browser.

Now Tablizer (Slashdot reader #95,088) wants to know what other Slashdot readers think of Springett's security concerns around WebAssembly.

And also offers this suggestion to browser makers: Browsers should have a way to easily disable WebAssembly — including whitelisting. For example, if you need it for a specific gaming site, you can whitelist just that site and not have WASM exposed for other sites.
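Tablizer's per-site whitelist amounts to an origin check before a page is given WebAssembly at all. A minimal sketch of that decision logic (the allowlist and function here are hypothetical, not an existing browser setting):

```python
from urllib.parse import urlparse

# Hypothetical user-maintained allowlist; WASM stays disabled everywhere else.
WASM_ALLOWLIST = {"game.example.com"}

def wasm_enabled_for(url: str) -> bool:
    """Return True only if the page's host is explicitly whitelisted."""
    host = urlparse(url).hostname or ""
    return host in WASM_ALLOWLIST

print(wasm_enabled_for("https://game.example.com/play"))  # True
print(wasm_enabled_for("https://ads.example.net/track"))  # False
```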

Programming

Top Languages for WebAssembly Development: Rust, C++, Blazor, Go - and JavaScript? (visualstudiomagazine.com) 49

This year's "State of WebAssembly" report has been published by Colin Eberhardt (CTO at the U.K.-based software consultancy Scott Logic). Hundreds of people were surveyed for the report, notes this article by Visual Studio Magazine.

Published by B2B media company 1105 Media, the magazine notes that Eberhardt's survey included some good news for Rust — and for Microsoft's free open source framework Blazor (for building web apps using C# and HTML): This year, like last year, Rust was found to be the most frequently used and most desired programming language for WebAssembly development.... "Rust once again comes out on top, with 45 percent saying they use it frequently or sometimes," Eberhardt said. "WebAssembly and Rust do have quite a close relationship, most WebAssembly runtimes are written in Rust, as are the various platforms based on wasm. It also enjoys some of the best tooling, so this result doesn't come as a big surprise."

While Rust usage and desirability have continued to climb, the Blazor web-dev framework is coming on strong in the report, which treats Blazor as a programming language, though it's not. On that desirability scale, Blazor climbed from sixth spot in 2021 to fourth this year among seven "programming languages" [based on] percentage of respondents who use a given language 'frequently' or 'sometimes' [for WebAssembly development] compared to last year. Eberhardt said, "Rust has had a modest rise in desirability, but the biggest climber is Blazor, with Go following just behind."

Commenting on another graphic that shows which language people most want to use for WebAssembly development, Eberhardt said, "This shows that Rust usage has climbed steadily, but the biggest climbers are Blazor and Python."

While you can now compile WebAssembly from a variety of languages (including C, C#, and C++), the report also found that JavaScript has somehow become a viable WebAssembly language — sort of, even though JavaScript itself can't be compiled to WebAssembly... There's a cunning workaround for this challenge: rather than compiling JS to Wasm, you can instead compile a JavaScript engine to WebAssembly, then use that to execute your code.

This is actually much more practical than you might think.

Android

Google Play Hides App Permissions In Favor of Developer-Written Descriptions (arstechnica.com) 33

An anonymous reader quotes a report from Ars Technica: Google's developer deadline for the Play Store's new "Data Safety" section is next week (July 20), and we're starting to see what the future of Google Play privacy will look like. The actual Data Safety section started rolling out in April, but now that the developer deadline is approaching, Google is turning off the separate "app permissions" section. That doesn't sound like a great move for privacy at all.

The Play Store's new Data Safety section is Google's answer to a similar feature in iOS 14, which displays a list of developer-provided privacy considerations, like what data an app collects, how that data is stored, and who the data is shared with. At first blush, the Data Safety entries might seem pretty similar to the old list of app permissions. You get items like "location," and in some ways, it's better than a plain list of permissions since developers can explain how and why each bit of data is collected.

The difference is in how that data ends up in Google's system. The old list of app permissions was guaranteed to be factual because it was built by Google, automatically, by scanning the app. The Data Safety system, meanwhile, runs on the honor system. Here's Google's explanation to developers of how the new section works: "You alone are responsible for making complete and accurate declarations in your app's store listing on Google Play. Google Play reviews apps across all policy requirements; however, we cannot make determinations on behalf of the developers of how they handle user data. Only you possess all the information required to complete the Data safety form. When Google becomes aware of a discrepancy between your app behavior and your declaration, we may take appropriate action, including enforcement action."

The Military

DARPA Is Worried About How Well Open-Source Code Can Be Trusted (technologyreview.com) 85

An anonymous reader quotes a report from MIT Technology Review: "People are realizing now: wait a minute, literally everything we do is underpinned by Linux," says Dave Aitel, a cybersecurity researcher and former NSA computer security scientist. "This is a core technology to our society. Not understanding kernel security means we can't secure critical infrastructure." Now DARPA, the US military's research arm, wants to understand the collision of code and community that makes these open-source projects work, in order to better understand the risks they face. The goal is to be able to effectively recognize malicious actors and prevent them from disrupting or corrupting crucially important open-source code before it's too late. DARPA's "SocialCyber" program is an 18-month-long, multimillion-dollar project that will combine sociology with recent technological advances in artificial intelligence to map, understand, and protect these massive open-source communities and the code they create. It's different from most previous research because it combines automated analysis of both the code and the social dimensions of open-source software.

Here's how the SocialCyber program works. DARPA has contracted with multiple teams of what it calls "performers," including small, boutique cybersecurity research shops with deep technical chops. One such performer is New York-based Margin Research, which has put together a team of well-respected researchers for the task. Margin Research is focused on the Linux kernel in part because it's so big and critical that succeeding here, at this scale, means you can make it anywhere else. The plan is to analyze both the code and the community in order to visualize and finally understand the whole ecosystem.

Margin's work maps out who is working on what specific parts of open-source projects. For example, Huawei is currently the biggest contributor to the Linux kernel. Another contributor works for Positive Technologies, a Russian cybersecurity firm that -- like Huawei -- has been sanctioned by the US government, says Aitel. Margin has also mapped code written by NSA employees, many of whom participate in different open-source projects. "This subject kills me," says Margin Research founder Sophia d'Antoine of the quest to better understand the open-source movement, "because, honestly, even the most simple things seem so novel to so many important people. The government is only just realizing that our critical infrastructure is running code that could be literally being written by sanctioned entities. Right now." This kind of research also aims to find underinvestment -- that is, critical software run entirely by one or two volunteers. It's more common than you might think -- so common that one way software projects currently measure risk is the "bus factor": Does this whole project fall apart if just one person gets hit by a bus?
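The bus factor can be approximated directly from version-control history: find the smallest set of contributors who account for most of the work. A naive sketch of that idea (real analyses weight files, subsystems, and recency rather than raw commit counts):

```python
from collections import Counter

def bus_factor(commit_authors, threshold=0.5):
    """Smallest number of contributors who together account for at least
    `threshold` of all commits -- a naive proxy for project risk."""
    counts = Counter(commit_authors)
    total = sum(counts.values())
    covered = 0
    for i, (_author, n) in enumerate(counts.most_common(), start=1):
        covered += n
        if covered / total >= threshold:
            return i
    return len(counts)

# One maintainer wrote 8 of 10 commits: the project has a bus factor of 1.
print(bus_factor(["alice"] * 8 + ["bob", "carol"]))  # 1
```

A result of 1 is exactly the underinvestment case the program is looking for: the whole project rests on a single volunteer.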
SocialCyber will tackle other open-source projects too, such as Python, which is "used in a huge number of artificial-intelligence and machine-learning projects," notes the report. "The hope is that greater understanding will make it easier to prevent a future disaster, whether it's caused by malicious activity or not."
Programming

Hundreds of Tech, Business and Nonprofit Leaders Urge States To Boost CS Education 49

theodp writes: In partnership with tech-bankrolled nonprofit Code.org, over 500 of the nation's business, education and nonprofit leaders issued a letter calling for state governments and education leaders to bring more Computer Science to K-12 students across the U.S. The signatories include a who's who of tech leaders, including Bill Gates, Jeff Bezos, Satya Nadella, Steve Ballmer, Tim Cook, Sundar Pichai, and Mark Zuckerberg. A new website -- CEOs for CS -- was launched in conjunction with the campaign. "The United States leads the world in technology, yet only 5% of our high school students study computer science. How is this acceptable?" the CEOs demand to know in their letter addressed "To the Governors and Education Leaders of the United States of America." They add, "Nearly two-thirds of high-skilled immigration is for computer scientists, and every state is an importer of this strategic talent. The USA has over 700,000 open computing jobs but only 80,000 computer science graduates a year. We must educate American students as a matter of national competitiveness."

A press release explains that the announcement "coincides with the culmination of the National Governors Association Chairman's Initiative for K-12 computer science, led by Arkansas Gov. Asa Hutchinson." Hutchinson is a founding Governor of the Code.org-led advocacy group Govs for CS, which launched in anticipation of President Obama's tech-supported but never materialized $4 billion CS for All initiative. Hutchinson was a signatory of an earlier 2016 Code.org organized letter from Governors, business, education, and nonprofit leaders that implored Congress to make CS education for K-12 students a priority.
Bitcoin

Game Developer On 'Why NFTs Are a Nightmare' (pcgamer.com) 89

Game developer Mark Venturelli received a spirited ovation at Brazil's International Games Festival on Friday after he surprised the audience at his "Future of Game Design" talk with a new title: "Why NFTs are a nightmare." PC Gamer reports: Venturelli, who is best known for the game Chroma Squad, didn't just push back against those talks by calling NFTs a nightmare: He argued in detail that they're bad for gaming and run directly counter to his vision for the future of game design. In a follow-up interview with PC Gamer, Venturelli said the event's blockchain sponsors needed "to buy their relevance, because they're not relevant." [...] NFT projects in particular quickly became savvy enough to use phrases like "environment-friendly technology" in their press releases, but none of them grapple with the deeper criticisms of their ideas. That's what Venturelli zeroed in on in his talk and in our follow-up interview. There's the uncanny resemblance between these profit-driven grifts and pyramid schemes, but there's also the philosophical concern that things like cryptocurrency represent a libertarian ideal founded in paranoia about institutions, and about other human beings. That, Venturelli says, is in part why they're so inefficient in the first place.

"Computationally, like in real life, if you don't trust the people that you're working with, you have to spend a lot more energy to achieve the same things," he says. "If I'm living with you in the same house and we don't trust each other, I have to, every time before I leave my house, hide my valuables. I have to make inventory of the things that I own, and maybe put cameras or locks inside of things. When I come back home I need to check everything and see if you messed with any of my stuff, and make sure that you don't get into my room when I'm sleeping and all that shit. It's so much energy that I have to use just to exist in a room with you, because I don't trust you. That, I feel, is a very good metaphor about how computationally blockchain works, and what is the underlying philosophical idea behind it, which is, 'We want a world without any sort of centralized authority because we cannot trust any of them ever.' And that is the opposite of what we want as a society, in my opinion." [...]

Investors see potential value in South America right now due to exploitable political and economic instabilities, which for Venturelli means that presenting his counterargument is more important than ever. "If we don't take up some spaces, and we let these kinds of people take these spaces, suddenly they're dictating what's the future, suddenly they're taking the investments so that they are building our next big projects," he said. "That's when it starts to get really dangerous, because it can jeopardize our future as an industry, in my opinion. Because I don't feel like these things have long legs. I feel like they might be successful in the short term, but they are going to fall on the long term for sure." [He went on to say:] "Right now we are living in a crisis of trust in Western society -- trust in each other, in institutions, and even in our future together is in decline," Venturelli says. "We should be building systems that help connect people and build trust, build sustainable solutions, and build infinitely scalable human solutions. We should not be shifting away from culture, entertainment, and storytelling towards economic activity. We should not just be eliminating the final hiding places that we have to run away from the oppression of capitalist society."
You can watch Venturelli's The Future of Game Design talk on YouTube. An English version of the slides accompanying it is available here.
Security

PyPI Is Rolling Out 2FA For Critical Projects, Giving Away 4,000 Security Keys (zdnet.com) 19

PyPI, or the Python Package Index, is giving away 4,000 Google Titan security keys as part of its move to mandatory two-factor authentication (2FA) for critical projects built in the Python programming language. ZDNet reports: PyPI, which is managed by the Python Software Foundation, is the main repository where Python developers can get third-party developed open-source packages for their projects. [...] One way developers can protect themselves from stolen credentials is by using two-factor authentication, and the PSF is now making it mandatory for developers behind "critical projects" to use 2FA in coming months. PyPI hasn't declared a specific date for the requirement. "We've begun rolling out a 2FA requirement: soon, maintainers of critical projects must have 2FA enabled to publish, update, or modify them," the PSF said on its PyPI Twitter account.

As part of the security drive, it is giving away 4,000 Google Titan hardware security keys to project maintainers gifted by Google's open source security team. "In order to improve the general security of the Python ecosystem, PyPI has begun implementing a two-factor authentication (2FA) requirement for critical projects. This requirement will go into effect in the coming months," PSF said in a statement. "To ensure that maintainers of critical projects have the ability to implement strong 2FA with security keys, the Google Open Source Security Team, a sponsor of the Python Software Foundation, has provided a limited number of security keys to distribute to critical project maintainers."

PSF says it deems any project in the top 1% of downloads over the prior six months as critical. Presently, there are more than 350,000 projects on PyPI, meaning that more than 3,500 projects are rated as critical. PyPI calculates this on a daily basis so the Titan giveaway should go a long way to cover a chunk of key maintainers but not all of them. In the name of transparency, PyPI is also publishing 2FA account metrics here. There are currently 28,336 users with 2FA enabled, with nearly 27,000 of them using a 2FA app like Microsoft Authenticator. There are over 3,800 projects rated as "critical" and 8,241 PyPI users in this group. The critical group is also likely to grow since projects that have been designated as critical remain so indefinitely while new projects are added to mandatory 2FA over time. The 2FA rule applies to both project maintainers and owners.
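That "critical" label is just a percentile cutoff on download counts; a sketch of how such a threshold might be computed (illustrative only, not PyPI's actual implementation):

```python
def critical_projects(download_counts: dict, top_fraction: float = 0.01):
    """Return the project names in the top `top_fraction` by downloads,
    mirroring PyPI's 'top 1% of downloads over the prior six months' rule."""
    ranked = sorted(download_counts, key=download_counts.get, reverse=True)
    cutoff = max(1, int(len(ranked) * top_fraction))
    return set(ranked[:cutoff])

# Toy index of 200 projects; the top 1% is the 2 most-downloaded.
downloads = {f"pkg{i}": i for i in range(200)}
print(sorted(critical_projects(downloads)))  # ['pkg198', 'pkg199']
```

Because the ranking is recomputed daily and membership is sticky (critical projects stay critical), the set can only grow over time, which matches PyPI's description.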

Programming

Meet Bun, a Speedy New JavaScript Runtime (bun.sh) 121

Bun is "a modern JavaScript runtime like Node or Deno," according to its newly-launched web site, "built from scratch to focus on three main things."

- Start fast (it has the edge in mind).
- New levels of performance (extending JavaScriptCore, the engine).
- Being a great and complete tool (bundler, transpiler, package manager).

Bun is designed as a drop-in replacement for your current JavaScript & TypeScript apps or scripts — on your local computer, server or on the edge. Bun natively implements hundreds of Node.js and Web APIs, including ~90% of Node-API functions (native modules), fs, path, Buffer and more. [And Bun also implements Node.js' module resolution algorithm, so you can use npm packages in bun.js]

The goal of Bun is to run most of the world's JavaScript outside of browsers, bringing performance and complexity enhancements to your future infrastructure, as well as developer productivity through better, simpler tooling.... Why is Bun fast? An enormous amount of time spent profiling, benchmarking and optimizing things. The answer is different for every part of Bun, but one general theme: [it's written in Zig.] Zig's low-level control over memory and lack of hidden control flow makes it much simpler to write fast software.

An infographic on the site claims its server-side rendering of React is more than three times faster than Node or Deno. And Bun.js can even automatically load environment variables from .env files, according to the site. No more require("dotenv").load()
Hackaday describes it as "a performant all-in-one approach," including "bundling, transpiling, module resolution, and a fantastic foreign-function interface." Many Javascript projects have a bundling and transpiling step that takes the source and packages it together in a more standard format. Typescript needs to be packaged into javascript, and modules need to be resolved. Bun bakes all this in. Typescript and JSX "just work." This dramatically simplifies many projects as much of the build infrastructure is part of Bun itself, lowering cognitive load when trying to understand a project... Some web-specific APIs, such as fetch and Websockets, are also built-in.
"What's even wilder is that Bun is written by one person, Jared Sumner," the article points out — adding that the all the code is available on GitHub under the MIT License ("excluding dependencies which have various licenses.")
Databases

Baserow Challenges Airtable With an Open Source No-Code Database Platform (techcrunch.com) 19

An anonymous reader quotes a report from TechCrunch: The burgeoning low-code and no-code movement is showing little sign of waning, with numerous startups continuing to raise sizable sums to help the less-technical workforce develop and deploy software with ease. Arguably one of the most notable examples of this trend is Airtable, a 10-year-old business that recently attained a whopping $11 billion valuation for a no-code platform used by firms such as Netflix and Shopify to create relational databases. In tandem, we're also seeing a rise in "open source alternatives" to some of the big-name technology incumbents, from Google's backend-as-a-service platform Firebase to open source scheduling infrastructure that seeks to supplant the mighty Calendly. A young Dutch company called Baserow sits at the intersection of both these trends, pitching itself as an open source Airtable alternative that helps people build databases with minimal technical prowess. Today, Baserow announced that it has raised $5.2 million in seed funding to launch a suite of new premium and enterprise products in the coming months, transforming the platform from its current database-focused foundation into a "complete, open source no-code toolchain," co-founder and CEO Bram Wiepjes told TechCrunch.

So what, exactly, does Baserow do in its current guise? Well, anyone with even the most rudimentary spreadsheet skills can use Baserow for use-cases spanning content marketing, such as managing brand assets collaboratively across teams; managing and organizing events; helping HR teams or startups manage and track applicants for a new role; and countless more, which Baserow provides pre-built templates for. [...] Baserow's open source credentials are arguably its core selling point, with the promise of greater extensibility and customizations (users can create their own plug-ins to enhance its functionality, similar to how WordPress works) -- this is a particularly alluring proposition for businesses with very specific or niche use cases that aren't well supported by an off-the-shelf SaaS solution. On top of that, some sectors require full control of their data and technology stack for security or compliance purposes. This is where open source really comes into its own, given that businesses can host the product themselves and circumvent vendor lock-in.

With a fresh 5 million euros in the bank, Baserow is planning to double down on its commercial efforts, starting with a premium incarnation that's officially launching out of an early access program later this month. This offering will be available as a SaaS and self-hosted product and will include various features such as the ability to export in different formats; user management tools for admin; Kanban view; and more. An additional "advanced" product will also be made available purely for SaaS customers and will include a higher data storage limit and service level agreements (SLAs). Although Baserow has operated under the radar somewhat since its official foundation in Amsterdam last year, it claims to have 10,000 active users, 100 sponsors who donate to the project via GitHub and 800 users already on the waiting list for its premium version. Later this year, Baserow plans to introduce a paid enterprise version for self-hosting customers, with support for specific requirements such as audit logs, single sign-on (SSO), role-based access control and more.

Programming

Vim 9.0 Released (vim.org) 81

After many years of gradual improvement Vim now takes a big step with a major release. Besides many small additions the spotlight is on a new incarnation of the Vim script language: Vim9 script. Why Vim9 script: A new script language, what is that needed for? Vim script has been growing over time, while preserving backwards compatibility. That means bad choices from the past often can't be changed and compatibility with Vi restricts possible solutions. Execution is quite slow; each line is parsed every time it is executed.

The main goal of Vim9 script is to drastically improve performance. This is accomplished by compiling commands into instructions that can be efficiently executed. An increase in execution speed of 10 to 100 times can be expected. A secondary goal is to avoid Vim-specific constructs and get closer to commonly used programming languages, such as JavaScript, TypeScript and Java.
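The cost of re-parsing on every execution versus compiling once is easy to demonstrate in any interpreted language. A Python analogy using `compile()` (this illustrates the principle, not Vim's internals):

```python
import timeit

src = "sum(i * i for i in range(100))"

# Legacy-style: the source string is parsed on every single execution.
reparse = timeit.timeit(lambda: eval(src), number=2000)

# Vim9-style: parse once into instructions, then just execute them.
code = compile(src, "<expr>", "eval")
precompiled = timeit.timeit(lambda: eval(code), number=2000)

print(f"re-parsed:   {reparse:.4f}s")
print(f"precompiled: {precompiled:.4f}s")
```

On a typical machine the precompiled version runs markedly faster, because the parse cost is paid once instead of 2,000 times; Vim9 script applies the same idea to ex commands.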

The performance improvements can only be achieved by not being 100% backwards compatible. For example, making function arguments available by creating an "a:" dictionary involves quite a lot of overhead. In a Vim9 function this dictionary is not available. Other differences are more subtle, such as how errors are handled. For those with a large collection of legacy scripts: Not to worry! They will keep working as before. There are no plans to drop support for legacy script. No drama like with the deprecation of Python 2.

Databases

SQLite or PostgreSQL? It's Complicated! (twilio.com) 101

Miguel Grinberg, a Principal Software Engineer for Technical Content at Twilio, writes in a blog post: We take blogging very seriously at Twilio. To help us understand what content works well and what doesn't on our blog, we have a dashboard that combines the metadata that we maintain for each article such as author, team, product, publication date, etc., with traffic information from Google Analytics. Users can interactively request charts and tables while filtering and grouping the data in many different ways. I chose SQLite for the database that supports this dashboard, which in early 2021 when I built this system, seemed like a perfect choice for what I thought would be a small, niche application that my teammates and I can use to improve our blogging. But almost a year and a half later, this application tracks daily traffic for close to 8000 articles across the Twilio and SendGrid blogs, with about 6.5 million individual daily traffic records, and with a user base that grew to over 200 employees.

At some point I realized that some queries were taking a few seconds to produce results, so I started to wonder if a more robust database such as PostgreSQL would provide better performance. Having publicly professed my dislike of performance benchmarks, I resisted the urge to look up any comparisons online, and instead embarked on a series of experiments to accurately measure the performance of these two databases for the specific use cases of this application. What follows is a detailed account of my effort, the results of my testing (including a surprising twist!), and my analysis and final decision, which ended up being more involved than I expected. [...] If you are going to take one thing away from this article, I hope it is that the only benchmarks that are valuable are those that run on your own platform, with your own stack, with your own data, and with your own software. And even then, you may need to add custom optimizations to get the best performance.
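Grinberg's advice, benchmark on your own platform with your own data, is cheap to follow: Python's stdlib `sqlite3` plus a timer gives a first measurement before any database migration. A minimal harness (the toy traffic schema stands in for your real one):

```python
import sqlite3
import time

# In-memory database with a schema loosely modeled on the dashboard's
# daily traffic records (article id, day, view count).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE traffic (article_id INTEGER, day TEXT, views INTEGER)")
db.executemany(
    "INSERT INTO traffic VALUES (?, ?, ?)",
    [(i % 500, f"2022-{1 + i % 12:02d}-01", i) for i in range(50_000)],
)
db.commit()

# Time a representative aggregation query, the kind the dashboard runs.
start = time.perf_counter()
rows = db.execute(
    "SELECT article_id, SUM(views) FROM traffic GROUP BY article_id"
).fetchall()
elapsed = time.perf_counter() - start

print(f"{len(rows)} groups in {elapsed:.4f}s")
```

Running the same query shapes against a PostgreSQL instance loaded with the same data turns a vague "is it faster?" into a number you can act on.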

Programming

The Really Important Job Interview Questions Engineers Should Ask (But Don't) (posthog.com) 185

James Hawkins: Since we started PostHog, our team has interviewed 725 people. What's one thing I've taken from this? It's normal for candidates not to ask harder questions about our company, so they usually miss out on a chance to (i) de-risk our company's performance and (ii) increase the chances they'll like working here.

Does the company have product-market fit? This is the single most important thing a company can do to survive and grow.
"Do you ever question if you have product-market fit?"
"When did you reach product-market fit? How did you know?"
"What do you need to do to get to product-market fit?"
"What's your revenue? What was it a year ago?"
"How many daily active users do you have?"

It's ok if these answers show you the founder doesn't have product-market fit. In this case, figure out if they will get to a yes. Unless you want to join a sinking ship, of course! Early stage founders are (or should be) super-mega-extra-desperately keen to have product-market fit -- it's all that really matters. The ones that will succeed are those that are honest about this (or those that have it already) and are prioritizing it. Many will think or say (intentionally or through self-delusion) that they have it when they don't. Low user or revenue numbers and vague answers to the example questions above are a sign that it isn't there. Product-market fit is very obvious.

Google

Google To Pay $90 Million To Settle Legal Fight With App Developers (reuters.com) 12

Google has agreed to pay $90 million to settle a legal fight with app developers over the money they earned creating apps for Android smartphones and for enticing users to make in-app purchases. Reuters reports: The app developers, in a lawsuit filed in federal court in San Francisco, had accused Google of using agreements with smartphone makers, technical barriers and revenue sharing agreements to effectively close the app ecosystem and shunt most payments through its Google Play billing system with a default service fee of 30%.

As part of the proposed settlement, Google said in a blog post it would put $90 million in a fund to support app developers who made $2 million or less in annual revenue from 2016-2021. "A vast majority of U.S. developers who earned revenue through Google Play will be eligible to receive money from this fund, if they choose," Google said in the blog post. Google said it would also charge developers a 15% commission on their first million in revenue from the Google Play Store each year. It started doing this in 2021.
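The tiered commission described above (15% on a developer's first $1 million in annual Play Store revenue, with the historical 30% standard rate beyond that) can be sketched as a simple calculation. This is an illustrative simplification, not Google's actual billing logic; real rates vary by program and app category.

```python
def play_store_fee(annual_revenue_usd: float) -> float:
    """Estimate the service fee under the tiered model described in the
    article: 15% on the first $1M of annual revenue, 30% (the historical
    standard rate) on everything above that. Simplified illustration only."""
    tier_cap = 1_000_000
    lower_tier = min(annual_revenue_usd, tier_cap) * 0.15
    upper_tier = max(annual_revenue_usd - tier_cap, 0) * 0.30
    return lower_tier + upper_tier

# A developer grossing $2.5M/year would owe roughly
# 0.15 * 1,000,000 + 0.30 * 1,500,000 = $600,000,
# versus $750,000 under a flat 30% rate.
```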
"There were likely 48,000 app developers eligible to apply for the $90 million fund, and the minimum payout is $250," notes Reuters.
Facebook

Meta Sparks Anger By Charging For VR Apps (arstechnica.com) 32

An anonymous reader quotes a report from the Financial Times: Meta is facing a growing backlash for the charges imposed on apps created for its virtual reality headsets, as developers complain about the commercial terms set around futuristic devices that the company hopes will help create a multibillion-dollar consumer market. [...] But several developers told the Financial Times of their frustration that Meta, which is seen as having an early lead in a nascent market, has insisted on a charging model for its VR app store similar to what exists today on smartphones. This is despite Meta chief Mark Zuckerberg being strongly critical in the past of charging policies on existing mobile app stores.

"Don't confuse marketing with reality -- it's good marketing to pick on Apple. But it doesn't mean Meta won't do the exact same thing," said Seth Siegel, global head of AI and cyber security at Infosys Consulting. "There is no impetus for them to be better." The "Quest Store" for Meta's Quest 2, by far the most popular VR headset on the market, takes a 30 percent cut from digital purchases and charges 15-30 percent on subscriptions, similar to the fees charged by Apple and Android. "Undoubtedly there are services provided -- they build amazing hardware and provide store services," said Daniel Sproll, chief executive of Realities.io, an immersive realities start-up behind the VR game Puzzling Places. "But the problem is that it feels like everybody agreed on this 30 percent and that's what we're stuck with. It doesn't feel like there's any competition. The Chinese companies coming out with headsets are the same. Why would they change it?"

Meta defended its policies, pointing out that unlike iPhone owners, Quest users can install apps outside its official store through SideQuest, a third-party app store, or make use of App Lab, its less restricted, more experimental app store. "We want to foster choice and competition in the VR ecosystem," Meta said. "And it's working -- our efforts have produced a material financial return for developers: as we announced earlier this year, over $1 billion has been spent on games and apps in the Meta Quest Store." Developers welcome these alternatives but say their impact is limited. SideQuest has been downloaded just 396,000 times, versus 19 million for the Oculus app, according to Sensor Tower. App Lab, meanwhile, still takes a 30 percent cut of purchases.
Developers are also frustrated with Meta's shift to a more restrictive approach to allowing apps on its VR app store.

Chris Pruett, Meta's content ecosystem director, said Meta found that lax standards resulted in too many users being frustrated by low-quality content, so the company has opted to play more of a gatekeeper role. But developers said the resulting barriers could lack transparency.

"Getting something on the Quest store is painful," said Lyron Bentovim, chief executive of the Glimpse Group, an immersive experiences group. "It's significantly worse than getting on Apple or Android stores."
Programming

Svelte Origins: a JavaScript Documentary (youtube.com) 48

Svelte Origins: The Documentary tells the story of how Svelte came to be, what makes Svelte different, and how it changes the game as a JavaScript framework. From the description of the documentary, which was recommended by several Slashdot readers: Filmed in locations throughout Europe and the US, it features Svelte's creator Rich Harris and members from the core community who contributed to making Svelte what it is today. Svelte Origins was filmed in late 2021, produced by OfferZen and directed by Dewald Brand, with shoots in the USA, the UK, Ireland, Sweden and Germany.
Programming

Are Today's Programmers Leaving Too Much Code Bloat? (positech.co.uk) 296

Long-time Slashdot reader Artem S. Tashkinov shares a blog post from an indie game programmer who complains: "The special upload tool I had to use today was a total of 230MB of client files, and involved 2,700 different files to manage this process. Oh and BTW it gives error messages and right now, it doesn't work. sigh."

I've seen coders do this. I know how this happens. It happens because not only are the coders not doing low-level, efficient code to achieve their goal, they have never even SEEN low level, efficient, well written code. How can we expect them to do anything better when they do not even understand that it is possible...? It's what they learned. They have no idea what high performance or constraint-based development is....

Computers are so fast these days that you should be able to consider them absolute magic. Everything that you could possibly imagine should happen within a 60th of a second, between refreshes of the screen. And yet, when I click the volume icon on my Microsoft Surface laptop (pretty new), there is a VISIBLE DELAY as the machine gradually builds up a new user interface element, and eventually works out what icons to draw and has them pop in and go live. It takes ACTUAL TIME. I suspect a half second, which in CPU time, is like a billion fucking years....
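The blogger's hyperbole is roughly right in spirit. A quick back-of-the-envelope sketch makes the frame-budget argument concrete (assuming a single hypothetical 3 GHz core as a round number):

```python
# Back-of-the-envelope frame-budget arithmetic for the complaint above.
# The 3 GHz clock is an assumed round number, not a measured figure.
CLOCK_HZ = 3_000_000_000   # one core at 3 GHz
REFRESH_HZ = 60            # a typical display refresh rate

frame_budget_s = 1 / REFRESH_HZ            # ~16.7 ms available per frame
cycles_per_frame = CLOCK_HZ // REFRESH_HZ  # 50 million cycles per frame

# A half-second UI delay wastes 30 frames' worth of time,
# i.e. about 1.5 billion clock cycles on that one core.
delay_frames = 0.5 / frame_budget_s
cycles_in_half_second = CLOCK_HZ // 2
```

Even at 50 million cycles per frame, a volume popup that takes half a second has burned on the order of a billion cycles, which is the blogger's point.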

All I'm doing is typing this blog post. Windows has 102 background processes running. My Nvidia graphics card currently has 6 of them, and some of those have sub-tasks. To do what? I'm not running a game right now; I'm using about the same feature set from a video card driver as I would have done TWENTY years ago, but 6 processes are required. Microsoft Edge WebView has 6 processes, as does Microsoft Edge itself. I don't even use Microsoft Edge. I think I opened an SVG file in it yesterday, and here we are, another 12 useless pieces of code wasting memory, and probably polling the CPU as well.

This is utter, utter madness. It's why nothing seems to work, why everything is slow, why you need a new phone every year, and a new TV to load those bloated streaming apps, which also must be running code this bad. I honestly think it's only going to get worse, because the big, dumb, useless tech companies like Facebook, Twitter, Reddit, etc. are the worst possible examples of this trend....

There was a golden age of programming, back when you had actual limitations on memory and CPU. Now we just live in an ultra-wasteful pit of inefficiency. It's just sad.

Long-time Slashdot reader Z00L00K left a comment arguing that "All this is because everyone today programs on huge frameworks that have everything including two full-size kitchen sinks, one for right-handed people and one for left-handed." But in another comment Slashdot reader youn blames code generators, cut-and-paste programming, and the need to support multiple platforms.

But youn adds that even with that said, "In the old days, there were a lot more blue screens of death... Sure it still happens, but how often do you restart your computer these days?" And they also submitted this list arguing "There's a lot more functionality than before."
  • Some software has been around a long time. Even though the /. crowd likes to bash Windows, you've got to admit backward compatibility is outstanding
  • A lot of things like security were not taken into consideration
  • It's a different computing environment... multitasking, internet, GPUs
  • In the old days, there was one task running all the time. Today, a lot of error handling, soft failures if the app is put to sleep
  • A lot of code is due to software interacting with other software, compatibility with standards
  • Shiny technology like microservices allows scaling, heterogeneous integration

So who's right and who's wrong? Leave your own best answers in the comments.

And are today's programmers leaving too much code bloat?


Programming

Stack Overflow Survey Finds Developers Like Rust, Python, JavaScript and Remote Work (infoworld.com) 97

For Stack Overflow's annual survey, "Over 73,000 developers from 180 countries each spent roughly 15 minutes answering our questions," a blog post announces: The top five languages for professional developers haven't changed: JavaScript is still the most used, and Rust is the most loved for a seventh year. The big surprise came in the most loved web framework category. Showing how fast web technologies change, newcomer Phoenix took the most loved spot from Svelte, itself a new entry last year.... Check out the full results from this year's Developer Survey here.
In fact, 87% of Rust developers said that they want to continue using Rust, notes SD Times' summary of the results: Rust also tied with Python as the most wanted technology in this year's report, with TypeScript and Go following closely behind. The distinction between most loved and most wanted is that most wanted includes only developers who are not currently developing with the language, but have an interest in developing with it.
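The loved/wanted distinction drawn above is easy to mix up, so here is a toy sketch of the two metrics on hypothetical survey rows (the data and field layout are invented for illustration, not Stack Overflow's actual schema):

```python
# Hypothetical survey rows: (language, currently_uses_it, wants_to_use_it).
# "Loved" = share of CURRENT users who want to keep using the language.
# "Wanted" = share of NON-users who want to start using it.
responses = [
    ("Rust", True, True),
    ("Rust", True, True),
    ("Rust", False, True),
    ("Go", True, False),
    ("Go", False, True),
]

def loved(lang: str) -> float:
    users = [r for r in responses if r[0] == lang and r[1]]
    return sum(r[2] for r in users) / len(users)

def wanted(lang: str) -> float:
    non_users = [r for r in responses if r[0] == lang and not r[1]]
    return sum(r[2] for r in non_users) / len(non_users)
```

With this toy data, Rust is both loved (both current users want to continue) and wanted (the one non-user is interested), mirroring how Rust topped both lists in the real survey.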
Slashdot reader logankilpatrick writes, "It should come as no surprise to those following the growth and expansion of the Julia Programming Language ecosystem that in this year's Stack Overflow developer survey, Julia ranked in the top 5 for the most loved languages (above Python — 6th, MATLAB — last, and R — 33rd)."

And the Register shares more highlights: Also notable in the 71,547 responses regarding programming languages was a switch again between Python and SQL. In 2021, Python pushed out SQL to be the third most commonly used language. This year SQL regained third place, just behind second-placed HTML/CSS.

And the most hated...

Unsurprisingly, developers still dread that tap on the shoulder from the finance department for a tweak to that bit of code upon which the entire company depends. Visual Basic for Applications and COBOL still lurk within the top three most dreaded technologies.

The operating system rankings were little changed: Windows won out for personal and professional use, although for professional use Linux passed macOS to take second place with 40 percent of responses compared to Apple's 33 percent. Most notable was the growth of Windows Subsystem for Linux, which now accounts for 14 percent of personal use compared with a barely registering 3 percent in 2021.

But SD Times noted what may be the most interesting statistic: Only 15% of developers work on-site full time. Forty-three percent are fully remote and 42% are hybrid. Smaller organizations with 2-19 employees are more likely to be in-person, while large organizations with over 10k employees are more likely to be hybrid, according to the survey.
InfoWorld delves into what this means: "The world has made the decision to go hybrid and remote, I have a lot of confidence given the data I have seen that that is a one-way train that has left the station," Prashanth Chandrasekar, CEO of Stack Overflow told InfoWorld.

Chandrasekar says that flexibility and the tech stack developers get to work with are the most important contributors to overall happiness at work. "Many developers drop out of the hiring process because of the tech stack they will be working with," he said... Organizational culture is also shifting, and cloud-native techniques have taken hold among Stack Overflow survey respondents. Most professional developers (70%) now use some form of CI/CD and 60% have a dedicated devops function....

Lastly, Web3 still has software developers torn, with 32% of respondents favorable, 31% unfavorable, and 26% indifferent. Web3 refers to the emerging idea of a decentralized web where data and content are registered on blockchains, tokenized, or managed and accessed on peer-to-peer distributed networks.
