Oracle

USPTO Petitioned To Cancel Oracle's JavaScript Trademark (infoworld.com) 26

Software company Deno Land has filed a petition with the U.S. Patent and Trademark Office to cancel Oracle's JavaScript trademark, citing trademark abandonment and fraud. The November 22 filing claims Oracle has not sold JavaScript products or services since acquiring the trademark through its 2009 Sun Microsystems purchase. The petition alleges Oracle committed fraud during its 2019 trademark renewal by submitting Node.js website screenshots without authorization.

The legal action follows a September open letter from JavaScript creator Brendan Eich, Node.js and Deno creator Ryan Dahl, and other prominent JavaScript developers urging Oracle to relinquish the trademark. The letter has garnered over 14,000 signatures.
Software

Europe's Largest Local Authority Slammed For 'Poorest' ERP Rollout Ever (theregister.com) 71

UK government-appointed commissioners have labeled Birmingham City Council's Oracle Fusion rollout as "the poorest ERP deployment" they have seen. From a report: A report published by the UK council's Corporate Finance Overview and Scrutiny Committee found that 18 months after Fusion went live, the largest public authority in Europe "had not tactically stabilized the system or formulated clear plans to resolve the system issues and recover the operation."

The city council's cloud-based Oracle tech replaced the SAP system that it began using in 1999, but the disastrous project encountered a string of landmark failures. The council has failed to produce auditable accounts since Oracle was implemented in 2022, costs have ballooned from around £19 million to a projected £131 million and, because the council chose not to use system audit features, it cannot tell if fraud has taken place on its multibillion-pound spending budget for an 18-month period. In September last year, the council became effectively bankrupt due to outstanding equal pay claims and the Oracle implementation.

The report from "best value commissioners" appointed by central government to investigate struggling councils said that following the Oracle implementation, "a serious lack of trust had developed between members and officers driven by the failed implementation and subsequent lack of progress to resolve the situation."

Networking

DTrace for Linux Comes to Gentoo (gentoo.org) 14

It was originally created back in 2005 by Sun Microsystems for its proprietary Solaris Unix systems, "for troubleshooting kernel and application problems on production systems in real time," explains Wikipedia. "DTrace can be used to get a global overview of a running system, such as the amount of memory, CPU time, filesystem and network resources used by the active processes."

But this week, Gentoo announced: The real, mythical DTrace comes to Gentoo! Need to dynamically trace your kernel or userspace programs, with rainbows, ponies, and unicorns — and all entirely safely and in production?! Gentoo is now ready for that!

Just emerge dev-debug/dtrace and you're all set. All required kernel options are already enabled in the newest stable Gentoo distribution kernel...

Documentation? Sure, there's lots of it. You can start with our DTrace wiki page, the DTrace for Linux page on GitHub, or the original documentation for Illumos. Enjoy!

Thanks to Heraklit (Slashdot reader #29,346) for sharing the news.
Education

Code.org Taps No-Code Tableau To Make the Case For K-12 Programming Courses 62

theodp writes: "Computer science education is a necessity for all students," argues tech-backed nonprofit Code.org in its newly-published 2024 State of Computer Science Education (Understanding Our National Imperative) report. "Students of all identities and chosen career paths need quality computer science education to become informed citizens and confident creators of content and digital tools."

In the 200-page report, Code.org pays special attention to participation in "foundational computer science courses" in high school. "Across the country, 60% of public high schools offer at least one foundational computer science course," laments Code.org (curiously promoting a metric that ignores school size, which nonetheless was embraced by Education Week and others).

"A course that teaches foundational computer science includes a minimum amount of time applying learned concepts through programming (at least 20 hours of programming/coding for grades 9-12 high schools)," Code.org explains in a separate 13-page Defining Foundational Computer Science document. Interestingly, Code.org argues that Data and Informatics courses -- in which "students may use Oracle WebDB, SQL, PL/SQL, SPSS, and SAS" to learn "the K-12 CS Framework concepts about data and analytics" -- do not count, because "the course content focuses on querying using a scripting language rather than creating programs [the IEEE's Top Programming Languages 2024 begs to differ]." Code.org similarly dissed the use of the Wolfram Language for broad educational use back in 2016.

With its insistence on the importance of kids taking Code.org-defined 'programming' courses in K-12 to promote computational thinking, it's probably no surprise to see that the data behind the 2024 State of Computer Science Education report was prepared using Python (the IEEE's top programming language) and presented to the public in a Jupyter notebook. Just kidding. Ironically, the data behind the 2024 State of Computer Science Education analysis is prepared and presented by Code.org in a no-code Tableau workbook.
AI

Is the Microsoft-OpenAI 'Bromance' Beginning to Fray? (seattletimes.com) 30

Though Sam Altman once called OpenAI's partnership with Microsoft "the best bromance in tech," now "ties between the companies have started to fray" reports the New York Times — citing interviews with 19 people "familiar with the relationship". [Alternate URL here.]

Among other things, Satya Nadella "has said privately that Altman's firing in November shocked and concerned him, according to five people with knowledge of his comments. Since then, Microsoft has started to hedge its bet on OpenAI," and reconsidered new investments beyond its initial $13 billion — even as OpenAI expects to lose $5 billion this year. That tension demonstrates a key challenge for AI startups: They are dependent on the world's tech giants for money and computing power because those big companies control the massive cloud computing systems the small outfits need to develop AI... Over the past year, OpenAI has been trying to renegotiate the deal to help it secure more computing power and reduce crushing expenses while Microsoft executives have grown concerned that their AI work is too dependent on OpenAI... [I]n March, Microsoft paid at least $650 million to hire most of the staff from Inflection, an OpenAI competitor...

In June, Microsoft agreed to an exception in [OpenAI's] contract, six people with knowledge of the change said. That allowed OpenAI to sign a roughly $10 billion computing deal with Oracle for additional computing resources, according to two people familiar with the deal. Oracle is providing computers packed with chips suited to building AI, while Microsoft provides the software that drives the hardware... While it was looking for computer power alternatives, OpenAI also raced to broaden its investors, according to two people familiar with the company's plan. Part of the plan was to secure strategic investments from organizations that could bolster OpenAI's prospects in ways beyond throwing around money. Those organizations included Apple, chipmaker Nvidia, and MGX, a tech investment firm controlled by the United Arab Emirates... Earlier this month, OpenAI closed a $6.6 billion funding round led by Thrive Capital, with additional participation from Nvidia, MGX and others. Apple did not invest, but Microsoft also participated in the funding round.

OpenAI expected to spend at least $5.4 billion in computing costs through the end of 2024, according to documents reviewed by The New York Times. That amount was expected to skyrocket over the next five years as OpenAI expanded, soaring to an estimated $37.5 billion in annual computing costs by 2029, the documents showed... Still, OpenAI employees complain that Microsoft is not providing enough computing power, according to three people familiar with the relationship. And some have complained that if another company beat it to the creation of AI that matches the human brain, Microsoft will be to blame because it hasn't given OpenAI the computing power it needs, according to two people familiar with the complaints.

Oddly, that could be the key to getting out from under its contract with Microsoft. The contract contains a clause that says that if OpenAI builds artificial general intelligence, or AGI — roughly speaking, a machine that matches the power of the human brain — Microsoft loses access to OpenAI's technologies.

Intel

Intel and AMD Form an x86 Ecosystem Advisory Group (phoronix.com) 55

Phoronix's Michael Larabel reports: Intel and AMD have jointly announced the creation of an x86 ecosystem advisory group to bring together the two companies as well as other industry leaders -- both companies and individuals such as Linux creator Linus Torvalds. Intel and AMD are forming this x86 ecosystem advisory group to help foster collaboration and innovations around the x86 (x86_64) ISA. [...] Besides Intel and AMD, other founding members include Broadcom, Dell, Google, HPE, HP Inc, Lenovo, Microsoft, Oracle, and Red Hat. Here are the "intended outcomes" for the group, as stated in the press release:
- Enhancing customer choice and compatibility across hardware and software, while accelerating their ability to benefit from new, cutting-edge features.
- Simplifying architectural guidelines to enhance software consistency and standardize interfaces across x86 product offerings from Intel and AMD.
- Enabling greater and more efficient integration of new capabilities into operating systems, frameworks and applications.

Power

The Hot New Trend in Commercial Real Estate? Renting to Data Centers (yahoo.com) 49

U.S. real estate developers "are having a hard time keeping up with demand," reports the Los Angeles Times, "as businesses in search of secure spots for their servers rent nearly every square foot that becomes available..." Construction of new data centers is at "extraordinary levels" driven by "insatiable demand," a recent report on the industry by real estate brokerage JLL found. "Never in my career of 25 years in real estate have I seen demand like this on a global scale," said JLL real estate broker Darren Eades, who specializes in data centers...

The biggest drivers are AI and cloud service providers that include some of the biggest names in tech, such as Amazon, Microsoft, Google and Oracle. With occupancy in conventional office buildings still down sharply following the impact of the COVID-19 pandemic and property values falling, data centers represent a rare ripe opportunity for real estate developers, who are pursuing opportunities in major markets like Los Angeles and less urban locales that are served by plentiful and preferably cheap power needed to run data centers. "If you can find a cluster of power to build a site, they'll come," Eades said of developers. Construction is taking place at an "extraordinary" pace nationwide and still not keeping up, the JLL data center report said. [Data center] "Vacancy declined to a record low of 3% at midyear due to insatiable demand and despite rampant construction."

Development increased more than sevenfold in two years, with the pipeline of new projects leveling off in the first half of 2024, a potential signal that the U.S. power grid cannot support development at a faster pace. But when projects currently under construction or planned are complete, the U.S. colocation market, in which businesses rent space in a data center owned by another company for their servers and other computing hardware, will triple in size from current levels... Real estate investors and landlords are being drawn into the market because demand from tenants is high and they are likely to renew their leases after shouldering the costs of setting up data centers. "They invest in their space and in your space and they tend to stick around longer," said Mark Messana, president of Downtown Properties, which owns offices in Los Angeles and San Francisco. "As we all know, the office market is struggling a little bit, so it's nice to be able to have some data customers in the mix..."

Power demand for computing is growing so intense that it threatens to strain the nation's electrical grid, sending users to remote locations where power is plentiful and preferably cheap. Data center developers are working in Alabama, the Dakotas and Indiana, "traditionally states that wouldn't have data centers," Eades said.

The article includes "the mother of all data centers" in the western U.S. — a 30-story building where "thousands of miles of undersea fiber-optic cables disappear into an ordinary-looking office tower." Once a prestigious location for businesses, "The recent departure of a law firm that had been in the building more than 50 years cleared out five floors that will quickly be re-leased to data tenants, said Eades, who represents the landlord..."

To retrofit the building for data centers, "two elevators were removed so the empty shafts could hold water pipes used to help keep the temperature cool enough for the heat-producing servers" — and developers are happy rents "can be double what they are at newer downtown office high-rises," according to real estate data provider CoStar...

"By 2030, data centers could account for as much as 11% of U.S. power demand — up from 3% now, according to analysts at Goldman Sachs."
Businesses

Oracle Owns Nearly a Third of Arm Chip House Ampere, Could Take Control In 2027 (theregister.com) 6

The Register's Tobias Mann reports: Oracle could choose to take control of Ampere Computing, the Arm processor designer it has backed and uses in its cloud. A proxy statement [PDF] filed on Wednesday reveals that Oracle held a 29 percent stake in Ampere as of May 31, 2024, and has the option to gain majority control over the chip house in 2027. "The total carrying value of our investments in Ampere, after accounting for losses under the equity method of accounting, was $1.5 billion as of May 31, 2024," the filing reads. Oracle also revealed it extended $600 million in loans in the form of convertible debt to Ampere during its 2024 fiscal year, on top of $400 million in debt given during the prior fiscal year. Ampere's debts are set to mature beginning June 2026, when Oracle will have the option of converting those investments into additional equity in the chip startup. "If either of such options is exercised by us or our co-investors, we would obtain control of Ampere and consolidate its results with our results of operations," the filing explains.

According to the document, Oracle spent roughly $48 million on Ampere processors during its 2023 fiscal year -- some of it direct with Ampere and some through a third party. By comparison, Big Red spent just $3 million on Ampere's chips and had $101.1 million worth of products available under a pre-payment order by the end of fiscal year 2024. This is despite the fact that Oracle is aggressively expanding its datacenter footprint to address growing demand for AI infrastructure. These efforts have included the deployment of massive clusters of GPUs from Nvidia and AMD with the largest campus developments nearing a gigawatt in scale. The filing also revealed that Ampere founder and CEO Renee James will not seek re-election to Oracle's board of directors.

IT

Desktop Hypervisors Are Like Buses: None for Ages, Then Four at Once (theregister.com) 34

An anonymous reader shares a report: September has been a big month for desktop hypervisors, with the field's big players all delivering significant updates. Oracle delivered VirtualBox version 7.1, billed as a major upgrade thanks to its implementation of a UI with a "modernized look and feel, offering a selection between Basic and Experienced user level with reduced or full UI functionality."

[...] Parallels also released a desktop hypervisor update last week. Version 20 of the eponymous tool now offers a VM that's packed with tools developers may find handy as they work on generative AI applications. Among those tools are the Docker community edition, lmutils, the OpenCV computer vision library, and the Ollama chatbot interface for AI models. [...] The other big player in desktop hypervisors is VMware, with its Fusion and Workstation products for macOS and Windows respectively. Both were recently updated.

AI

Ellison Declares Oracle 'All In' On AI Mass Surveillance 114

Oracle cofounder Larry Ellison envisions AI as the backbone of a new era of mass surveillance, positioning Oracle as a key player in AI infrastructure through its unique networking architecture and partnerships with AWS and Microsoft. The Register reports: Ellison made the comments near the end of an hour-long chat at the Oracle financial analyst meeting last week during a question and answer session in which he painted Oracle as the AI infrastructure player to beat in light of its recent deals with AWS and Microsoft. Many companies, Ellison touted, build AI models at Oracle because of its "unique networking architecture," which dates back to the database era.

"AI is hot, and databases are not," he said, making Oracle's part of the puzzle less sexy, but no less important, at least according to the man himself - AI systems have to have well-organized data, or else they won't be that valuable. The fact that some of the biggest names in cloud computing (and Elon Musk's Grok) have turned to Oracle to run their AI infrastructure means it's clear that Oracle is doing something right, claimed now-CTO Ellison. "If Elon and Satya [Nadella] want to pick us, that's a good sign - we have tech that's valuable and differentiated," Ellison said, adding: One of the ideal uses of that differentiated offering? Maximizing AI's public security capabilities.

"The police will be on their best behavior because we're constantly watching and recording everything that's going on," Ellison told analysts. He described police body cameras that were constantly on, with no ability for officers to disable the feed to Oracle. Even requesting privacy for a bathroom break or a meal only meant sections of recording would require a subpoena to view - not that the video feed was ever stopped. AI would be trained to monitor officer feeds for anything untoward, which Ellison said could prevent abuse of police power and save lives. [...] "Citizens will be on their best behavior because we're constantly recording and reporting," Ellison added, though it's not clear what he sees as the source of those recordings - police body cams or publicly placed security cameras. "There are so many opportunities to exploit AI," he said.
Oracle

Oracle Is Designing a Data Center That Would Be Powered By Three Small Nuclear Reactors 96

With electricity demand from AI becoming so "crazy," Oracle's Larry Ellison announced the company is designing a data center that will be powered by three small nuclear reactors capable of providing more than a gigawatt of electricity. "The location and the power place we've located, they've already got building permits for three nuclear reactors," Ellison said. "These are the small modular nuclear reactors to power the data center. This is how crazy it's getting. This is what's going on." CNBC reports: Small modular nuclear reactors are new designs that promise to speed the deployment of reliable, carbon-free energy as power demand rises from data centers, manufacturing and the broader electrification of the economy. Generally, these reactors are 300 megawatts or less, about a third the size of the typical reactor in the current U.S. fleet. They would be prefabricated in several pieces and then assembled on the site, reducing the capital costs that stymie larger plants.

Right now, small modular reactors are a technology of the future, with executives in the nuclear industry generally agreeing that they won't be commercialized in the U.S. until the 2030s. There are currently three operational small modular reactors in the world, according to the Nuclear Energy Agency. Two are in China and Russia, the central geopolitical adversaries of the U.S. A test reactor is also operational in Japan.
Oracle

'Oracle's Missteps in Cloud Computing Are Paying Dividends in AI' (msn.com) 26

Oracle missed the tech industry's move to cloud computing last decade and ended up an also-ran. Now the AI boom has given it another shot. WSJ: The 47-year-old company that made its name on relational database software has emerged as an attractive cloud-computing provider for AI developers such as OpenAI, sending its long-stagnant stock to new heights. Oracle shares are up 34% since January, well outpacing the Nasdaq's 14% rise and those of bigger competitors Microsoft, Amazon.com and Google.

It is a surprising revitalization for a company many in the tech industry had dismissed as a dinosaur of a bygone, precloud era. Oracle appears to be successfully making a case to investors that it has become a strong fourth-place player in a cloud market surging thanks to AI. Its lateness to the game may have played to its advantage, as a number of its 162 data centers were built in recent years and are designed for the development of AI models, known as training.

In addition, Oracle isn't developing its own large AI models that compete with potential clients. The company is considered such a neutral and unthreatening player that it now has partnerships with Microsoft, Google and Amazon, all of which let Oracle's databases run in their clouds. Microsoft is also running its Bing AI chatbot on Oracle's servers.

Open Source

Open Source Redis Fork 'Valkey' Has Momentum, Improvements, and Speed, Says Dirk Hohndel (thenewstack.io) 16

"Dirk Hohndel, a Linux kernel developer and long-time open source leader, wanted his audience at KubeCon + CloudNativeCon + Open Source Summit China 2024 to know he's not a Valkey developer," writes Steven J. Vaughan-Nichols. "He's a Valkey user and fan." [Hohndel] opened his speech by recalling how the open source, high-performance key/value datastore Valkey had been forked from Redis... Hohndel emphasized that "forks are good. Forks are one of the key things that open source licenses are for. So, if the maintainer starts doing things you don't like, you can fork the code under the same license and do better..." In this case, though, Redis had done a "bait-and-switch" with the Redis code, Hohndel argued. This was because they had made an all-too-common business failure: They hadn't realized that "open source is not a business model...."

While the licensing change is what prompted the fork, Hohndel sees leadership and technical reasons why the Valkey fork is likely to succeed. First, two-thirds of the formerly top Redis maintainers and developers have switched to Valkey. In addition, AWS, Google Cloud, and Oracle, under the Linux Foundation's auspices, all support Valkey. When both the technical and money people agree, good things can happen.

The other reason is that Valkey already looks like it will be the better technical choice. That's because the recently announced Valkey 8.0, which builds upon the last open source version of Redis, 7.2.4, introduces serious speed improvements and new features that Redis users have wanted for some time. As [AWS principal engineer Madelyn] Olson said at Open Source Summit North America earlier this year, "Redis really didn't want to break anything." Valkey wants to move a bit faster. How much faster? A lot. Valkey 8.0 overhauls Redis's single-threaded event loop threading model with a more sophisticated multithreaded approach to I/O operations. Hohndel reported that on his small Valkey-powered aircraft tracking system, "I see roughly a threefold improvement in performance, and I stream a lot of data, 60 million data points a day."
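The I/O-threading idea described above can be sketched abstractly: connection reads and protocol parsing move to worker threads, while command execution stays serialized on a single thread so the data structures need no locks. This is a toy model in plain Python, not Valkey's actual C implementation; the wire format, function names, and queue-based hand-off are all invented for illustration.

```python
# Toy sketch of multithreaded I/O with single-threaded command execution
# (the general approach Valkey 8.0 takes, per the article; details invented).
import queue
import threading

store = {}                # the data store: touched only by the executor thread
commands = queue.Queue()  # parsed commands handed off to the executor

def parse_worker(raw_lines):
    """I/O thread: decode wire-format lines into command tuples."""
    for line in raw_lines:
        commands.put(tuple(line.strip().split()))
    commands.put(None)    # sentinel: this connection is finished

def executor(n_conns):
    """Single executor thread: applies commands in arrival order."""
    done = 0
    while done < n_conns:
        cmd = commands.get()
        if cmd is None:
            done += 1
        elif cmd[0] == "SET":
            store[cmd[1]] = cmd[2]

conns = [["SET a 1", "SET b 2"], ["SET c 3"]]  # two simulated connections
workers = [threading.Thread(target=parse_worker, args=(c,)) for c in conns]
exec_thread = threading.Thread(target=executor, args=(len(conns),))
for t in workers + [exec_thread]:
    t.start()
for t in workers + [exec_thread]:
    t.join()
print(store)  # all writes applied, serialized through the one executor
```

Because only the executor mutates `store`, the hot data path remains race-free while parsing parallelizes — the design trade-off the article attributes to the 8.0 overhaul.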

The article notes that Valkey is already being supported by major Linux distros including AlmaLinux, Fedora, and Alpine.
AI

NIST Releases an Open-Source Platform for AI Safety Testing (scmagazine.com) 4

America's National Institute of Standards and Technology (NIST) has released a new open-source software tool called Dioptra for testing the resilience of machine learning models to various types of attacks.

"Key features that are new from the alpha release include a new web-based front end, user authentication, and provenance tracking of all the elements of an experiment, which enables reproducibility and verification of results," a NIST spokesperson told SC Media: Previous NIST research identified three main categories of attacks against machine learning algorithms: evasion, poisoning and oracle. Evasion attacks aim to trigger an inaccurate model response by manipulating the data input (for example, by adding noise), poisoning attacks aim to impede the model's accuracy by altering its training data, leading to incorrect associations, and oracle attacks aim to "reverse engineer" the model to gain information about its training dataset or parameters, according to NIST.
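As a toy illustration of the first category, here is a minimal evasion attack in plain Python: the model is untouched, and the attacker perturbs only the input until the predicted label flips. The linear classifier, its weights, and the data are all invented for the demo and have nothing to do with Dioptra's internals.

```python
# Toy evasion attack on a made-up linear classifier (illustrative only).

def predict(weights, bias, x):
    """Linear model: label 1 if w.x + b > 0, else 0."""
    score = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if score > 0 else 0

def evade(weights, bias, x, step=0.1, max_iters=100):
    """FGSM-style perturbation: nudge each feature against the sign of its
    weight (the gradient of a linear score) until the label flips."""
    original = predict(weights, bias, x)
    x = list(x)
    for _ in range(max_iters):
        if predict(weights, bias, x) != original:
            break
        x = [xi - step * (1 if w > 0 else -1) for xi, w in zip(x, weights)]
    return x

w, b = [1.0, 2.0], -1.0
clean = [1.0, 0.5]            # score = 1.0, classified as 1
adv = evade(w, b, clean)      # small perturbation, same model
print(predict(w, b, clean), predict(w, b, adv))  # 1 0
```

A testbed like Dioptra measures how much such perturbations degrade accuracy, and how well defenses (input sanitization, robust training) restore it.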

The free platform enables users to determine to what degree attacks in the three categories mentioned will affect model performance and can also be used to gauge the use of various defenses such as data sanitization or more robust training methods.

The open-source testbed has a modular design to support experimentation with different combinations of factors such as different models, training datasets, attack tactics and defenses. The newly released 1.0.0 version of Dioptra comes with a number of features to maximize its accessibility to first-party model developers, second-party model users or purchasers, third-party model testers or auditors, and researchers in the ML field alike. Along with its modular architecture design and user-friendly web interface, Dioptra 1.0.0 is also extensible and interoperable with Python plugins that add functionality... Dioptra tracks experiment histories, including inputs and resource snapshots that support traceable and reproducible testing, which can unveil insights that lead to more effective model development and defenses.

NIST also published final versions of three "guidance" documents, according to the article. "The first tackles 12 unique risks of generative AI along with more than 200 recommended actions to help manage these risks. The second outlines Secure Software Development Practices for Generative AI and Dual-Use Foundation Models, and the third provides a plan for global cooperation in the development of AI standards."

Thanks to Slashdot reader spatwei for sharing the news.
Government

Why DARPA is Funding an AI-Powered Bug-Spotting Challenge (msn.com) 43

Somewhere in America's Defense Department, the DARPA R&D agency is running a two-year contest to write an AI-powered program "that can scan millions of lines of open-source code, identify security flaws and fix them, all without human intervention," reports the Washington Post. [Alternate URL here.]

But as they see it, "The contest is one of the clearest signs to date that the government sees flaws in open-source software as one of the country's biggest security risks, and considers artificial intelligence vital to addressing it." Free open-source programs, such as the Linux operating system, help run everything from websites to power stations. The code isn't inherently worse than what's in proprietary programs from companies like Microsoft and Oracle, but there aren't enough skilled engineers tasked with testing it. As a result, poorly maintained free code has been at the root of some of the most expensive cybersecurity breaches of all time, including the 2017 Equifax disaster that exposed the personal information of half of all Americans. The incident, which led to the largest-ever data breach settlement, cost the company more than $1 billion in improvements and penalties.

If people can't keep up with all the code being woven into every industrial sector, DARPA hopes machines can. "The goal is having an end-to-end 'cyber reasoning system' that leverages large language models to find vulnerabilities, prove that they are vulnerabilities, and patch them," explained one of the advising professors, Arizona State's Yan Shoshitaishvili.... Some large open-source projects are run by near-Wikipedia-size armies of volunteers and are generally in good shape. Some have maintainers who are given grants by big corporate users that turn it into a job. And then there is everything else, including programs written as homework assignments by authors who barely remember them.

"Open source has always been 'Use at your own risk,'" said Brian Behlendorf, who started the Open Source Security Foundation after decades of maintaining a pioneering free server software, Apache, and other projects at the Apache Software Foundation. "It's not free as in speech, or even free as in beer," he said. "It's free as in puppy, and it needs care and feeding."

40 teams entered the contest, according to the article — and seven received $1 million in funding to continue on to the next round, with the finalists to be announced at this year's Def Con.

"Under the terms of the DARPA contest, all finalists must release their programs as open source," the article points out, "so that software vendors and consumers will be able to run them."
Privacy

Bumble and Hinge Allowed Stalkers To Pinpoint Users' Locations Down To 2 Meters, Researchers Say (techcrunch.com) 23

An anonymous reader quotes a report from TechCrunch: A group of researchers said they found that vulnerabilities in the design of some dating apps, including the popular Bumble and Hinge, allowed malicious users or stalkers to pinpoint the location of their victims down to two meters. In a new academic paper, researchers from the Belgian university KU Leuven detailed their findings (PDF) when they analyzed 15 popular dating apps. Of those, Badoo, Bumble, Grindr, happn, Hinge and Hily all had the same vulnerability that could have helped a malicious user to identify the near-exact location of another user, according to the researchers. While none of those apps share exact locations when displaying the distance between users on their profiles, they did use exact locations for the "filters" feature of the apps. Generally speaking, by using filters, users can tailor their search for a partner based on criteria like age, height, what type of relationship they are looking for and, crucially, distance.

To pinpoint the exact location of a target user, the researchers used a novel technique they call "oracle trilateration." In general, trilateration, which for example is used in GPS, works by using three points and measuring their distance relative to the target. This creates three circles, which intersect at the point where the target is located. Oracle trilateration works slightly differently. The researchers wrote in their paper that the first step for the person who wants to identify their target's location "roughly estimates the victim's location," for example, based on the location displayed in the target's profile. Then, the attacker moves in increments "until the oracle indicates that the victim is no longer within proximity, and this for three different directions. The attacker now has three positions with a known exact distance, i.e., the preselected proximity distance, and can trilaterate the victim," the researchers wrote.
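The final geometric step can be sketched in a few lines: given three probe positions and the exact distance to the target from each, subtracting the circle equations pairwise yields a linear system for the target's coordinates. The planar (x, y) points below are illustrative, not the researchers' data or real GPS coordinates.

```python
# Minimal trilateration sketch: recover a point from three known positions
# and exact distances (as in the paper's "oracle trilateration" final step).

def trilaterate(p1, d1, p2, d2, p3, d3):
    """Return (x, y) where the three distance circles intersect."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting circle equations pairwise cancels the quadratic terms,
    # leaving two linear equations a*x + b*y = c.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = d2**2 - d3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

# A target at (3, 4), measured from three probe positions:
target = (3.0, 4.0)
probes = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [((target[0] - px) ** 2 + (target[1] - py) ** 2) ** 0.5
         for px, py in probes]
print(trilaterate(probes[0], dists[0], probes[1], dists[1],
                  probes[2], dists[2]))  # approximately (3.0, 4.0)
```

The attack's ingenuity lies in obtaining the three exact distances from a yes/no proximity filter, by stepping until the answer flips; once those distances are known, the math above pins the target.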

"It was somewhat surprising that known issues were still present in these popular apps," Karel Dhondt, one of the researchers, told TechCrunch. While this technique doesn't reveal the exact GPS coordinates of the victim, "I'd say 2 meters is close enough to pinpoint the user," Dhondt said. The good news is that all the apps that had these issues, and that the researchers reached out to, have now changed how distance filters work and are not vulnerable to the oracle trilateration technique. The fix, according to the researchers, was to round the exact coordinates to three decimal places, making them less precise.

AI

What Is the Future of Open Source AI? (fb.com) 22

Tuesday Meta released Llama 3.1, its largest open-source AI model to date. But just one day later, Mistral released Large 2, notes this report from TechCrunch, "which it claims to be on par with the latest cutting-edge models from OpenAI and Meta in terms of code generation, mathematics, and reasoning...

"Though Mistral is one of the newer entrants in the artificial intelligence space, it's quickly shipping AI models on or near the cutting edge." In a press release, Mistral says one of its key focus areas during training was to minimize the model's hallucination issues. The company says Large 2 was trained to be more discerning in its responses, acknowledging when it does not know something instead of making something up that seems plausible. The Paris-based AI startup recently raised $640 million in a Series B funding round, led by General Catalyst, at a $6 billion valuation...

However, it's important to note that Mistral's models are, like most others, not open source in the traditional sense — any commercial application of the model needs a paid license. And while it's more open than, say, GPT-4o, few in the world have the expertise and infrastructure to implement such a large model. (That goes double for Llama's 405 billion parameters, of course.)

Mistral's Large 2 has only 123 billion parameters, according to the article. But whichever system prevails, "Open Source AI Is the Path Forward," Mark Zuckerberg wrote this week, predicting that open-source AI will soar to the same popularity as Linux: This year, Llama 3 is competitive with the most advanced models and leading in some areas. Starting next year, we expect future Llama models to become the most advanced in the industry. But even before that, Llama is already leading on openness, modifiability, and cost efficiency... Beyond releasing these models, we're working with a range of companies to grow the broader ecosystem. Amazon, Databricks, and NVIDIA are launching full suites of services to support developers fine-tuning and distilling their own models. Innovators like Groq have built low-latency, low-cost inference serving for all the new models. The models will be available on all major clouds including AWS, Azure, Google, Oracle, and more. Companies like Scale.AI, Dell, Deloitte, and others are ready to help enterprises adopt Llama and train custom models with their own data.
"As the community grows and more companies develop new services, we can collectively make Llama the industry standard and bring the benefits of AI to everyone," Zuckerberg writes. He says that he's heard from developers, CEOs, and government officials that they want to "train, fine-tune, and distill" their own models, protecting their data with a cheap and efficient model — and without being locked into a closed vendor. But they also tell him that they want to invest in an ecosystem "that's going to be the standard for the long term." Lots of people see that open source is advancing at a faster rate than closed models, and they want to build their systems on the architecture that will give them the greatest advantage long term...

One of my formative experiences has been building our services constrained by what Apple will let us build on their platforms. Between the way they tax developers, the arbitrary rules they apply, and all the product innovations they block from shipping, it's clear that Meta and many other companies would be freed up to build much better services for people if we could build the best versions of our products and competitors were not able to constrain what we could build. On a philosophical level, this is a major reason why I believe so strongly in building open ecosystems in AI and AR/VR for the next generation of computing...

I believe that open source is necessary for a positive AI future. AI has more potential than any other modern technology to increase human productivity, creativity, and quality of life — and to accelerate economic growth while unlocking progress in medical and scientific research. Open source will ensure that more people around the world have access to the benefits and opportunities of AI, that power isn't concentrated in the hands of a small number of companies, and that the technology can be deployed more evenly and safely across society. There is an ongoing debate about the safety of open source AI models, and my view is that open source AI will be safer than the alternatives. I think governments will conclude it's in their interest to support open source because it will make the world more prosperous and safer... [O]pen source should be significantly safer since the systems are more transparent and can be widely scrutinized...

The bottom line is that open source AI represents the world's best shot at harnessing this technology to create the greatest economic opportunity and security for everyone... I believe the Llama 3.1 release will be an inflection point in the industry where most developers begin to primarily use open source, and I expect that approach to only grow from here. I hope you'll join us on this journey to bring the benefits of AI to everyone in the world.

Java

Oracle's Java Pricing Brews Bitter Taste, Subscribers Spill Over To OpenJDK (theregister.com) 49

Lindsay Clark reports via The Register: Only 14 percent of Oracle Java subscribers plan to stay on Big Red's runtime environment, according to a study following the introduction of an employee-based subscription model. At the same time, 36 percent of the 663 Java users questioned said they had already moved to the employee-based pricing model introduced in January 2023. Shortly after the new model was implemented, experts warned that it would create a significant price hike for users adopting it. By July, global tech research company Gartner was forecasting that those on the new subscription package would face between two and five times the costs compared with the previous usage-based model.

As such, among the 86 percent of respondents using Oracle Java SE who are currently moving or plan to move all or some of their Java applications off Oracle environments, 53 percent said the Oracle environment was too expensive, according to the study carried out by independent market research firm Dimensional Research. Forty-seven percent said the reason for moving was a preference for open source, and 38 percent said it was because of uncertainty created by ongoing changes in pricing, licensing, and support. [...]

To support OpenJDK applications in production, 46 percent chose a paid-for platform such as BellSoft Liberica, IBM Semeru, or Azul Platform Core; 45 percent chose a free supported platform such as Amazon Corretto or Microsoft Build of OpenJDK; and 37 percent chose a free, unsupported platform. Of the users who have already moved to OpenJDK, 25 percent said Oracle had been significantly more expensive, while 41 percent said Big Red's licensing had made it somewhat more expensive than the alternative. The survey found three-quarters of Java migrations were completed within a year, and 23 percent within three months.
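Gartner's "two to five times" forecast follows directly from the shape of the new model: it bills every employee in the organization, not just the processors or users actually running Java. A toy comparison makes the multiplier concrete (all prices here are hypothetical, not Oracle's actual list prices, which vary by tier and contract):

```python
# Hypothetical mid-size organization under the two licensing models.
employees = 2_000            # new model bills the entire headcount
licensed_processors = 100    # old model billed only Java server processors
old_per_processor = 25.00    # USD/processor/month (hypothetical legacy rate)
new_per_employee = 5.25      # USD/employee/month (hypothetical tier rate)

old_monthly = licensed_processors * old_per_processor  # 2,500 USD
new_monthly = employees * new_per_employee             # 10,500 USD
print(new_monthly / old_monthly)  # → 4.2
```

A 4.2x increase sits within Gartner's forecast range; organizations with a small Java footprint relative to headcount land at the high end, which is consistent with the survey's finding that cost was the top reason for moving off Oracle environments.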

Microsoft

Microsoft: Our Licensing Terms Do Not Meaningfully Raise Cloud Rivals' Costs 21

In a response to the UK's Competition and Markets Authority's investigation into cloud services and licensing, Microsoft has defended its practices, asserting that its terms "do not meaningfully raise cloud rivals' costs." The Windows-maker emphasized Amazon's continued dominance in the UK hyperscale market and noted Google's quarter-on-quarter growth, while also highlighting the declining share of Windows Server relative to Linux in cloud operating systems and SQL Server's second-place position behind Oracle.

[...] The CMA's inquiry primarily focuses on the pricing disparity between using Microsoft products on Azure versus rival cloud platforms, with most surveyed customers perceiving Azure as the more cost-effective option for Microsoft software deployment. The Register adds: Microsoft's bullish take on this is that AWS and Google should be grateful that they even get to run its software. In its response, the company said: "This dispute on pricing terms only arises because Microsoft grants all rivals IP licenses in the first place to its software that is of most popularity for use in the cloud. It does this not because there is any legal obligation to share IP with closest rivals in cloud, but for commercial reasons."
Cloud

Microsoft: Linux Is the Top Operating System on Azure Today (thenewstack.io) 69

Azure used to be a cloud platform dedicated to Windows. Now, Linux is the most widely used operating system on Microsoft Azure. The New Stack's Joab Jackson writes: These days, Microsoft expends considerable effort to ensure that Linux runs as smoothly as possible on Azure, according to a talk given earlier this year at the Linux Foundation Open Source Summit by two Microsoft Azure Linux Platforms Group program managers, Jack Aboutboul and Krum Kashan. "Linux is the #1 operating system in Azure today," Aboutboul said. And all of it must be supported in a way that Microsoft users have come to expect. Hence the need for Microsoft's Linux Platforms Group, which provides Linux support to both internal customers and Azure customers. These days, the duo of engineers explained, Microsoft knows about as much as anyone about how to operate Linux at hyperscale. [...]

As of today, there are hundreds of Azure and Azure-based services running on Linux, including the Azure Kubernetes Service (AKS), OpenAI, HDInsight, and many of the other database services. "A lot of the infrastructure powering everything else is running on Linux," Aboutboul said. "They're different flavors of Linux running all over the place," he added. To run these services, Microsoft maintains its own Linux kernel, and in 2023 the company released its own distribution, Azure Linux. But Azure Linux is just a small portion of all the flavors of Linux running on Azure, all of which Microsoft must work to support.

Overall, there are about 20,000 third-party Software as a Service (SaaS) packages in the Azure marketplace that rely on some Linux distribution. And when things go wrong, it is the Azure service engineers who get the help tickets. The company keeps a set of endorsed Linux distributions, which include Red Hat Enterprise Linux, Debian, Flatcar, SUSE, Canonical, Oracle Linux, and CentOS (as managed by OpenLogic, not Red Hat). [...] Overall, the company gets about 1,000 images a month from these endorsed partners alone. Many of the distributions have multiple images (SUSE has a regular one, and another for high-performance computing, for instance).

Slashdot Top Deals