AI

DeepMind Cracks 'Knot' Conjecture That Bedeviled Mathematicians For Decades (livescience.com) 21

The artificial intelligence (AI) program DeepMind has gotten closer to proving a math conjecture that's bedeviled mathematicians for decades and revealed another new conjecture that may unravel how mathematicians understand knots. Live Science reports: The two pure math conjectures are the first-ever important advances in pure mathematics (or math not directly linked to any non-math application) generated by artificial intelligence, the researchers reported Dec. 1 in the journal Nature. [...] The first challenge was setting DeepMind onto a useful path. [...] They focused on two fields: knot theory, which is the mathematical study of knots; and representation theory, which is a field that focuses on abstract algebraic structures, such as rings and lattices, and relates those abstract structures to linear algebraic equations, or the familiar equations with Xs, Ys, pluses and minuses that might be found in a high-school math class.

In understanding knots, mathematicians rely on something called invariants, which are algebraic, geometric or numerical quantities that stay the same under certain changes. In this case, they looked at invariants that were the same in equivalent knots; equivalence can be defined in several ways, but knots can be considered equivalent if you can distort one into another without breaking the knot. Geometric invariants are essentially measurements of a knot's overall shape, whereas algebraic invariants describe how the knots twist in and around each other. "Up until now, there was no proven connection between those two things," [said Alex Davies, a machine-learning specialist at DeepMind and one of the authors of the new paper], referring to geometric and algebraic invariants. But mathematicians thought there might be some kind of relationship between the two, so the researchers decided to use DeepMind to find it. With the help of the AI program, they were able to identify a new geometric measurement, which they dubbed the "natural slope" of a knot. This measurement was mathematically related to a known algebraic invariant called the signature, which describes certain surfaces on knots.

In the second case, DeepMind took a conjecture generated by mathematicians in the late 1970s and helped reveal why that conjecture works. For 40 years, mathematicians have conjectured that it's possible to look at a specific kind of very complex, multidimensional graph and figure out a particular kind of equation to represent it. But they haven't quite worked out how to do it. Now, DeepMind has come closer by linking specific features of the graphs to predictions about these equations, which are called Kazhdan-Lusztig (KL) polynomials, named after the mathematicians who first proposed them. "What we were able to do is train some machine-learning models that were able to predict what the polynomial was, very accurately, from the graph," Davies said. The team also analyzed what features of the graph DeepMind was using to make those predictions, which got them closer to a general rule about how the two map to each other. This means DeepMind has made significant progress on solving this conjecture, known as the combinatorial invariance conjecture.

Education

California Moves To Recommend Delaying Algebra To 9th Grade Statewide (sfstandard.com) 639

California is in the process of approving a new framework for math education in public schools that "pushes Algebra 1 back to 9th grade, de-emphasizes calculus, and applies social justice principles to math lessons," writes Joe Hong via the San Francisco Standard. The new approach would have been approved earlier this month but has been delayed due to the attention and controversy it has received. Here's an excerpt from the report: When Rebecca Pariso agreed to join a team of educators tasked in late 2019 with drafting California's new mathematics framework, she said she expected some controversy. But she didn't expect her work would be in the national spotlight. [...] Every eight years (PDF), a group of educators comes together to update the state's math curriculum framework. This particular update has attracted extra attention, and controversy, because of perceived changes it makes to how "gifted" students progress -- and because it pushes Algebra 1 back to 9th grade, de-emphasizes calculus, and applies social justice principles to math lessons. San Francisco pioneered key aspects of the new approach, opting in 2014 to delay algebra instruction until 9th grade and to push advanced mathematics courses until at least after 10th grade as a means of promoting equity.

San Francisco Unified School District touted the effort as a success, asserting that algebra failure rates fell and the number of students taking advanced math rose as a result of the change. The California Department of Education cited those results in drafting the statewide framework. But critics have accused the district of using cherry-picked and misleading assertions to bolster the case for the changes. The intent of the state mathematics framework, its designers say, is to maintain rigor while also helping remedy California's achievement gaps for Black, Latino and low-income students, which remain some of the largest in the nation. At the heart of the wrangling lies a broad agreement about at least one thing: The way California public schools teach math isn't working. On national standardized tests, California ranks in the bottom quartile among all states and U.S. territories for 8th grade math scores.

Yet for all the sound and fury, the proposed framework, about 800 pages long, is little more than a set of suggestions. Its designers are revising it now and will subject it to 60 more days of public review. Once it's approved in July, districts may adopt as much or as little of the framework as they choose -- and can disregard it completely without any penalty. "It's not mandated that you use the framework," said framework team member Dianne Wilson, a program specialist at Elk Grove Unified. "There's a concern that it will be implemented unequally."

Education

Why Colleges are Giving Up on Remote Education (salon.com) 111

The president emeritus of the Great Lakes College Association writes that "nearly all colleges have re-adopted in-person education this fall, in spite of delta variant risks...

"As it turns out, student enthusiasm for remote learning is mixed at best, and in some cases students have sued their colleges for refunds. But it is not simply student opinion that has driven this reversion to face-to-face education." Indeed, students are far better off with in-person learning than with online approaches. Recent research indicates that the effects of remote learning have been negative. As the Brookings Institution Stephanie Riegg reports, "bachelor's degree students in online programs perform worse on nearly all test score measures — including math, reading, writing, and English — relative to their counterparts in similar on-campus programs...."

[R]esearch on human learning consistently finds that the social context of learning is critical, and the emotions involved in effective human relations play an essential role in learning. Think of a teacher who had a great impact on you — the one who made you excited, interested, intrigued, and motivated to learn. Was this teacher a calm and cool transmitter of facts, or a person who was passionate about the subject and excited to talk about it...? Research tells us the most effective teachers — those who are most successful in having their students learn — are those who establish an emotional relationship with their students in an environment of care and trust. As former teacher and now neuroscientist Mary Helen Immordino-Yang tells us, emotion is necessary for learning to occur: "Emotion is where learning begins, or, as is often the case, where it ends. Put simply, it is literally neurobiologically impossible to think deeply about things that you don't care about.... Even in academic subjects that are traditionally considered unemotional, such as physics, engineering or math, deep understanding depends on making emotional connections between concepts...."

Today we have the benefit of extensive research documenting the short-term and long-term importance of these social-educational practices. Research based on the widely used National Survey of Student Engagement (NSSE) consistently finds that having meaningful outside-of-class relationships with faculty and advisors increases not only learning but graduation from college and employment after graduation. It is also worth noting that Gallup-Purdue University public opinion research affirms the idea that people believe these personal relationships in college matter. A study of 30,000 graduates reports that they believe "what students are doing in college and how they are experiencing it... has a profound relationship to life and career." Specifically, "if graduates had a professor who cared about them as a person, made them excited about learning, and encouraged them to pursue their dreams, their odds of being engaged at work more than doubled, as did their odds of thriving in their well-being."

Since empirical research documents the powerful impact of meaningful human relationships on learning while in college as well as on graduates' adult lives, and people believe it matters, do we dare replace it with technology?

Math

The 50-year-old Problem That Eludes Theoretical Computer Science (technologyreview.com) 113

A solution to P vs NP could unlock countless computational problems -- or keep them forever out of reach. MIT Technology Review: On Monday, July 19, 2021, in the middle of another strange pandemic summer, a leading computer scientist in the field of complexity theory tweeted out a public service message about an administrative snafu at a journal. He signed off with a very loaded, "Happy Monday." In a parallel universe, it might have been a very happy Monday indeed. A proof had appeared online at the esteemed journal ACM Transactions on Computation Theory, which trades in "outstanding original research exploring the limits of feasible computation." The result purported to solve the problem of all problems -- the Holy Grail of theoretical computer science, worth a $1 million prize and fame rivaling Aristotle's forevermore.

This treasured problem -- known as "P versus NP" -- is considered at once the most important in theoretical computer science and mathematics and completely out of reach. It addresses questions central to the promise, limits, and ambitions of computation, asking:

Why are some problems harder than others?
Which problems can computers realistically solve?
How much time will it take?

And it's a quest with big philosophical and practical payoffs. "Look, this P versus NP question, what can I say?" Scott Aaronson, a computer scientist at the University of Texas at Austin, wrote in his memoir of ideas, Quantum Computing Since Democritus. "People like to describe it as 'probably the central unsolved problem of theoretical computer science.' That's a comical understatement. P vs NP is one of the deepest questions that human beings have ever asked." One way to think of this story's protagonists is as follows: "P" represents problems that a computer can handily solve. "NP" represents problems that, once solved, are easy to check -- like jigsaw puzzles, or Sudoku. Many NP problems correspond to some of the most stubborn and urgent problems society faces. The million-dollar question posed by P vs. NP is this: Are these two classes of problems one and the same? Which is to say, could the problems that seem so difficult in fact be solved with an algorithm in a reasonable amount of time, if only the right, devilishly fast algorithm could be found? If so, many hard problems are suddenly solvable. And their algorithmic solutions could bring about societal changes of utopian proportions -- in medicine and engineering and economics, biology and ecology, neuroscience and social science, industry, the arts, even politics and beyond.
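
For a concrete feel for that asymmetry, here is a small illustrative sketch in Python (the subset-sum example and the numbers in it are invented for illustration, not taken from the article): checking a proposed answer to an NP-style problem is quick, while the obvious way of finding one examines exponentially many candidates.

```python
# Toy NP-style problem: does some subset of `numbers` add up to `target`?
from itertools import combinations

def verify(numbers, target, candidate):
    """Checking a claimed solution is fast: confirm membership and add it up."""
    pool = list(numbers)
    for x in candidate:
        if x not in pool:
            return False
        pool.remove(x)
    return sum(candidate) == target

def solve_brute_force(numbers, target):
    """Finding a solution this way tries up to 2**len(numbers) subsets."""
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if sum(subset) == target:
                return list(subset)
    return None

nums = [3, 34, 4, 12, 5, 2]
print(solve_brute_force(nums, 9))   # [4, 5]
print(verify(nums, 9, [4, 5]))      # True, and checked almost instantly
```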

AI

$28B Startup Says Companies Were Refusing Their Free Open-Source Code as 'Not Enterprise-Ready' (forbesindia.com) 49

"Ali Ghodsi was happily researching AI at Berkeley when he helped invent a revolutionary bit of code — and he wanted to give it away for free," remembers Forbes India. "But few would take it unless he charged for it.

"Now his startup is worth $28 billion, and the career academic is a billionaire with a reputation as one of the best CEOs in the Valley." (Literally. VC Ben Horowitz of Andreessen Horowitz calls him the best CEO in Andreessen Horowitz's portfolio of hundreds of companies.) Inside a 13th-floor boardroom in downtown San Francisco, the atmosphere was tense. It was November 2015, and Databricks, a two-year-old software company started by a group of seven Berkeley researchers, was long on buzz but short on revenue. The directors awkwardly broached subjects that had been rehashed time and again. The startup had been trying to raise funds for five months, but venture capitalists were keeping it at arm's length, wary of its paltry sales. Seeing no other option, NEA partner Pete Sonsini, an existing investor, raised his hand to save the company with an emergency $30 million injection...

Many of the original founders, Ghodsi in particular, were so engrossed with their academic work that they were reluctant to start a company — or to charge at all for their technology, a best-of-breed piece of future-predicting code called Spark. But when the researchers offered it to companies as an open-source tool, they were told it wasn't "enterprise ready". In other words, Databricks needed to commercialise. "We were a bunch of Berkeley hippies, and we just wanted to change the world," Ghodsi says. "We would tell them, 'Just take the software for free', and they would say 'No, we have to give you $1 million'."

Databricks' cutting-edge software uses artificial intelligence to fuse costly data warehouses (structured data used for analytics) with data lakes (cheap, raw data repositories) to create what it has coined data "lakehouses" (no space between the words, in the finest geekspeak tradition). Users feed in their data and the AI makes predictions about the future. John Deere, for example, installs sensors in its farm equipment to measure things like engine temperature and hours of use. Databricks uses this raw data to predict when a tractor is likely to break down. Ecommerce companies use the software to suggest changes to their websites that boost sales. It's used to detect malicious actors — both on stock exchanges and on social networks.

Ghodsi says Databricks is ready to go public soon. It's on track to near $1 billion in revenue next year, Sonsini notes. Down the line, $100 billion is not out of the question, Ghodsi says — and even that could be a conservative figure. It's simple math: Enterprise AI is already a trillion-dollar market, and it's certain to grow much larger. If the category leader grabs just 10 percent of the market, Ghodsi says, that's revenues of "many, many hundred billions."

Later in the article Ghodsi offers this succinct summary of the market they entered.

"It turns out that if you dust off the neural network algorithms from the '70s, but you use way more data than ever before and modern hardware, the results start becoming superhuman."

Math

A Math Teacher is Putting Calculus Lessons on Pornhub (boingboing.net) 57

An anonymous reader shares a report: It's safe to assume that few Pornhub visitors are looking for hour-long calculus videos (by a fully-clothed instructor), but Taiwanese math teacher Changhsu puts them there anyway. His channel is filled with over 200 decidedly unsexy chalkboard lessons about topics like differential equations. The 34-year-old math tutor found the YouTube market for math explainers to be saturated, so he decided to expand his reach into Pornhub. He told Mel Magazine that he wants to reach a new market of mathematics learners.

OS X

Steve Jobs Tried To Convince Dell To License Mac Software (cnet.com) 42

It's been 10 years since the death of Steve Jobs. Michael Dell talks about his memories of the tech icon, including when Jobs tried to convince Dell to license Mac software to run on Intel-based PCs. CNET reports: Fast forward to 1993. Jobs, ousted from Apple after a falling-out with the company's board in 1985, had started a new company, called Next, and created a beautiful (but expensive) workstation, with its own operating system, as well as software called WebObjects for building web-based applications. Dell says Jobs came to his house in Texas several times that year, trying to convince him to use the Next operating system on Dell PCs, by arguing that it was better than Microsoft's Windows software and could undermine the Unix workstation market being touted by Sun Microsystems. The problem, Dell says he told Jobs, was that there were no applications for it and zero customer interest. Still, Dell's company worked a little bit with Next and used WebObjects to build its first online store in the mid-'90s.

In 1997, Jobs rejoined a struggling Apple after it acquired Next for $429 million, and he pitched Dell on another business proposal (as Jobs was evaluating Apple's Mac clone licensing project, which he ultimately shut down). Jobs and his team had ported the Mac software, based on Next's Mach operating system, and had it running on the Intel x86 chips that powered Dell PCs. Jobs offered to license the Mac OS to Dell, telling him he could give PC buyers a choice of Apple's software or Microsoft's Windows OS installed on their machine. "He said, look at this -- we've got this Dell desktop and it's running Mac OS," Dell tells me. "Why don't you license the Mac OS?" Dell thought it was a great idea and told Jobs he'd pay a licensing fee for every PC sold with the Mac OS. But Jobs had a counteroffer: He was worried that the licensing scheme might undermine Apple's own Mac computer sales because Dell computers were less costly. Instead, Dell says, Jobs suggested he just load the Mac OS alongside Windows on every Dell PC and let customers decide which software to use -- and then pay Apple for every Dell PC sold.

Dell smiles when he tells the story. "The royalty he was talking about would amount to hundreds of millions of dollars, and the math just didn't work, because most of our customers, especially larger business customers, didn't really want the Mac operating system," he writes. "Steve's proposal would have been interesting if it was just us saying, 'OK, we'll pay you every time we use the Mac OS' -- but to pay him for every time we didn't use it ... well, nice try, Steve!" Another problem: Jobs wouldn't guarantee access to the Mac OS three, four or five years later "even on the same bad terms." That could leave customers who were using Mac OS out of luck as the software evolved, leaving Dell Inc. no way to ensure it could support those users. Still, Dell acknowledges the deal was a what-could-have-been moment in history. [...] That different direction led to Jobs continuing to evolve the Next-inspired Mac OS and retooling the Mac product line, including adding the candy-colored iMac in mid-1998.

Education

School Reopenings Stymie Teens' Reseller Gigs (pcmag.com) 147

It turns out school reopenings are disrupting the cash flow of industrious teenagers who spent the pandemic scooping up in-demand products via bots and reselling them for a hefty profit. From a report: "Yes, I am back in school. Yea, it's very annoying," said one US high school student named Dillon, who regularly buys video game consoles and graphics cards with automated bots. "I am sitting in math class and drawing class with my computer open, and I get told to shut it down during a [product] drop sometimes," he told PCMag in an interview. Dillon may be young, but he's among the legion of online scalpers who spent the pandemic at home buying and reselling the tech world's most-wanted products. "I would say around $10,000 to $12,500 average a month," he told PCMag. "Some months it would be exponentially higher, some would be lower."

Using automated bots he purchased and installed on his computer, and intel from other online resellers, Dillon scooped up products like the PlayStation 5 ahead of other consumers and sold them off at inflated pricing. But lately, Dillon's reselling hit a snag. After months away from high school because of the pandemic, he's now back in the classroom, where computer use can be strictly controlled. "When everything closed [during the pandemic], I could do whatever I wanted because I was doing my school from home," he said. But with the return of in-classroom teaching, Dillon says his profits have now fallen by about 25%.

Bitcoin

Old Coal Plant Is Now Mining Bitcoin For a Utility Company (arstechnica.com) 59

An anonymous reader quotes a report from Ars Technica: Bitcoin's massive power consumption is the cryptocurrency's dirty secret. To mine bitcoin, computers across the globe chew through enough electricity to power a medium size country, somewhere on the order of the Netherlands or Poland depending on the estimate. In fact, electricity has become such a significant factor that one private equity firm owns a power plant to mine bitcoin. The company, Greenidge Generation, said at one point that they could mine one bitcoin for less than $3,000. Even today -- at $40,000 per bitcoin, some 30 percent off its peak -- the potential for profit is real. Which is why an investor-owned utility has dropped a containerized data center outside a coal-fired power plant 10 miles north of St. Louis. Ameren, the utility, was struggling to keep the 1,099 MW power plant running profitably when wholesale electricity prices dropped. But it wasn't well suited to running only when demand was high, so-called peaker duty. Instead, they're experimenting with running it full-time and using the excess electricity to mine bitcoin.

Ameren executives reportedly blame wind and solar power for the load variability that taxes the 55-year-old power plant. The utility claims that mining bitcoin could reduce its carbon footprint by allowing it to run its plants more consistently rather than ramping them up and down, which they say can increase emissions. "We have pretty dramatic changes in load minute by minute, second by second at times," Warren Wood, the utility's vice president of regulatory and legislative affairs, told E&E News. But when it's running full-time, they only have to take power away from the mining operations. Wood said it takes about 20 seconds to divert power back to the grid.

Ameren attempted to get rate payers to foot a portion of the bill for its experiment, but Missouri's consumer advocate pushed back. "If Ameren Missouri wants to enter into speculative commodities, like virtual currencies, then it should do so as a non-regulated service where ratepayers are unexposed to the economics of them," Geoff Marke, chief economist for the Missouri Office of the Public Counsel, wrote in a filing. "This endeavor is beyond the scope of intended electric utility regulation, and, if allowed, creates a slippery slope where ratepayers could be asked to put up capital for virtually anything." The utility says that if its bitcoin experiment pans out, it could attach similar containerized data centers to wind and solar farms to soak up excess electricity profitably in times of high supply or low demand. The coal-fired power plant that's being used in the experiment is scheduled to be shut down in 2028. Ameren says that so far it's pleased with the project, which has mined 20 coins and mints a new one at a rate of one every 15 days or so. Whether the math continues to work depends largely on the cost of running the plant and the price of bitcoin, which is highly volatile. Based on today's prices, the company has made about $800,000 since it switched on the miners in April.

Businesses

Why Deliveries Are So Slow (theatlantic.com) 84

Americans are habitually unattuned to the massive and profoundly human apparatus that brings us basically everything in our lives. Much of the country's pandemic response has treated us as somehow separate from the rest of the world and the challenges it endures, but unpredictably empty shelves, rising prices, and long waits are just more proof of how foolish that belief has always been. The Atlantic: When I called up Dan Hearsch, a managing director at the consulting firm AlixPartners who specializes in supply-chain management, I described the current state of the industry to him as a little wonky. He laughed. "'A little wonky' is one way to say it," he said. "'Everything's broken' is another way." Hearsch told me about a friend whose company imports consumer goods -- stuff that's normally available in abundance at any Walmart or Target -- from China. Before the pandemic, according to the friend, shipping a container of that merchandise to the U.S. would have cost the company $2,000 to $5,000. Recently, though, the number is more like $30,000, at least for anything shipped on a predictable timeline. You can get it down to $20,000 if you're willing to deal with the possibility of your stuff arriving in a few months, or whenever space on a ship eventually opens up that's not already accounted for by companies willing to pay more.

Such severe price hikes aren't supposed to happen. Wealthy Western countries offloaded much of their manufacturing to Asia and Latin America precisely because container shipping has made moving goods between hemispheres so inexpensive. When that math tips into unprofitability, either companies stop shipping goods and wait for better rates, or they start charging you a lot more for the things they ship. Both options constrain supply further and raise prices on what's available. "You look at the price of cars, you look at the price of food -- the price of practically anything is up significantly from one year ago, from two years ago," Hearsch told me. "The differences are really, really quite shocking." The Bureau of Labor Statistics estimates that as of July, consumer prices had grown almost 5 percent since before the pandemic, with some types of goods showing much larger increases.

Overseas shipping is currently slow and expensive for lots of very complicated reasons and one big, important, relatively uncomplicated one: The countries trying to meet the huge demands of wealthy markets such as the United States are also trying to prevent mass-casualty events. Infection-prevention measures have recently closed high-volume shipping ports in China, the country that supplies the largest share of goods imported to the United States. In Vietnam and Malaysia, where workers churn out products as varied as a third of all shoes imported to the U.S. and chip components that are crucial to auto manufacturing, controlling the far more transmissible [...] Domestically, things aren't a whole lot better. Offshoring has systematically decimated America's capacity to manufacture most things at home, and even products that are made in the United States likely use at least some raw materials or components that need to be imported or are in short supply for other reasons.

Math

Mathematical Model Predicts Best Way To Build Muscle (phys.org) 67

An anonymous reader quotes a report from Phys.Org: Researchers have developed a mathematical model that can predict the optimum exercise regime for building muscle. The researchers, from the University of Cambridge, used methods of theoretical biophysics to construct the model, which can tell how much a specific amount of exertion will cause a muscle to grow and how long it will take. The model could form the basis of a software product, where users could optimize their exercise regimes by entering a few details of their individual physiology. The results, reported in the Biophysical Journal, suggest that there is an optimal weight at which to do resistance training for each person and each muscle growth target. Muscles can only be near their maximal load for a very short time, and it is the load integrated over time which activates the cell signaling pathway that leads to synthesis of new muscle proteins. But below a certain value, the load is insufficient to cause much signaling, and exercise time would have to increase exponentially to compensate. The value of this critical load is likely to depend on the particular physiology of the individual.

In 2018, the Cambridge researchers started a project on how the proteins in muscle filaments change under force. They found that main muscle constituents, actin and myosin, lack binding sites for signaling molecules, so it had to be the third-most abundant muscle component -- titin -- that was responsible for signaling the changes in applied force. Whenever part of a molecule is under tension for a sufficiently long time, it toggles into a different state, exposing a previously hidden region. If this region can then bind to a small molecule involved in cell signaling, it activates that molecule, generating a chemical signal chain. Titin is a giant protein, a large part of which is extended when a muscle is stretched, but a small part of the molecule is also under tension during muscle contraction. This part of titin contains the so-called titin kinase domain, which is the one that generates the chemical signal that affects muscle growth. The molecule will be more likely to open if it is under more force, or when kept under the same force for longer. Both conditions will increase the number of activated signaling molecules. These molecules then induce the synthesis of more messenger RNA, leading to production of new muscle proteins, and the cross-section of the muscle cell increases.

This realization led to the current work. [The researchers] set out to construct a mathematical model that could give quantitative predictions on muscle growth. They started with a simple model that kept track of titin molecules opening under force and starting the signaling cascade. They used microscopy data to determine the force-dependent probability that a titin kinase unit would open or close under force and activate a signaling molecule. They then made the model more complex by including additional information, such as metabolic energy exchange, as well as repetition length and recovery. The model was validated using past long-term studies on muscle hypertrophy. "Our model offers a physiological basis for the idea that muscle growth mainly occurs at 70% of the maximum load, which is the idea behind resistance training," said [one of the paper's authors]. "Below that, the opening rate of titin kinase drops precipitously and precludes mechanosensitive signaling from taking place. Above that, rapid exhaustion prevents a good outcome, which our model has quantitatively predicted." [...] The model also addresses the problem of muscle atrophy, which occurs during long periods of bed rest or for astronauts in microgravity, showing both how long a muscle can afford to remain inactive before starting to deteriorate, and what the optimal recovery regime could be.
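
To make the shape of that argument concrete, here is a purely illustrative toy calculation in Python. It is emphatically not the Cambridge model: every functional form and number below is invented, only to show how "too light barely signals" and "too heavy can't be held long enough" combine to give an optimum at an intermediate load.

```python
import math

def p_open(load):
    """Invented sigmoid: chance that a force-sensing unit opens, rising with load
    (load expressed as a fraction of the maximum the muscle can lift)."""
    return 1.0 / (1.0 + math.exp(-15.0 * (load - 0.65)))

def time_under_load(load):
    """Invented fatigue curve: heavier loads can be sustained for less total time."""
    return 40.0 * (1.0 - load) ** 2 + 1.0   # arbitrary 'seconds'

def growth_signal(load):
    """Signal taken as opening probability integrated over the time the load is held."""
    return p_open(load) * time_under_load(load)

best_signal, best_load = max((growth_signal(l / 100), l / 100) for l in range(1, 100))
print(f"toy optimum near {best_load:.0%} of maximum load")
```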

Math

Ask Slashdot: Is There a 'Standard' Way of Formatting Numbers? 84

Long-time Slashdot reader Pieroxy is working on a new open source project, a web-based version of the system-monitoring software Conky.

The ultimate goal is to send the data to an HTML interface "to find some use for the old iPads/tablets/laptops we all have lying around. You can put them next to your screen and have your metrics displayed there...!"

There's just one problem: "I had to come up with a way for users to format a number." I needed a small string the user could write to describe exactly what they want to do with their number. Some examples can be: write it as a 3-digit number suffixed by SI prefixes when the numbers are too big or too small, display a timestamp as an HH:MM string, or just the day of the week, possibly cut to the first three characters, do the same with a timestamp in milliseconds, or nanoseconds, display a nice string out of a number of seconds to express a duration ("3h 12mn 17s"), pad the number with spaces so that all numbers are aligned (left or right), force a fixed number of digits after the decimal point, etc.

In other words, I was looking for a "universal" way of formatting numbers and failed to find any kind of standard online.

Do Slashdot readers know of such a thing or should I create my own?
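
As a purely illustrative sketch (the two Python helpers below are invented for this example, not taken from any existing library or standard), here is roughly what two of the rules described above might look like:

```python
_SI = ["", "k", "M", "G", "T", "P"]

def si_format(value, digits=3):
    """Format a large number to roughly `digits` significant figures with an SI suffix."""
    n, idx = abs(float(value)), 0
    while n >= 1000 and idx < len(_SI) - 1:
        n /= 1000.0
        idx += 1
    decimals = max(digits - len(str(int(n))), 0)
    sign = "-" if value < 0 else ""
    return f"{sign}{n:.{decimals}f}{_SI[idx]}"

def duration_format(seconds):
    """Render a number of seconds as a duration string such as '3h 12mn 17s'."""
    h, rem = divmod(int(seconds), 3600)
    m, s = divmod(rem, 60)
    return " ".join(f"{v}{u}" for v, u in ((h, "h"), (m, "mn"), (s, "s")) if v) or "0s"

print(si_format(1234567))      # 1.23M
print(duration_format(11537))  # 3h 12mn 17s
```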

Education

Oregon Law Allows Students To Graduate Without Proving They Can Write Or Do Math (oregonlive.com) 337

An anonymous reader quotes a report from Oregon Live: For the next five years, an Oregon high school diploma will be no guarantee that the student who earned it can read, write or do math at a high school level. Gov. Kate Brown had demurred earlier this summer regarding whether she supported the plan passed by the Legislature to drop the requirement that students demonstrate they have achieved those essential skills. But on July 14, the governor signed Senate Bill 744 into law. Through a spokesperson, the governor declined again Friday to comment on the law and why she supported suspending the proficiency requirements. Charles Boyle, the governor's deputy communications director, said the governor's staff notified legislative staff the same day the governor signed the bill.

Boyle said in an emailed statement that suspending the reading, writing and math proficiency requirements while the state develops new graduation standards will benefit "Oregon's Black, Latino, Latina, Latinx, Indigenous, Asian, Pacific Islander, Tribal, and students of color." "Leaders from those communities have advocated time and again for equitable graduation standards, along with expanded learning opportunities and supports," Boyle wrote. The requirement that students demonstrate freshman- to sophomore-level skills in reading, writing and, particularly, math led many high schools to create workshop-style courses to help students strengthen their skills and create evidence of mastery. Most of those courses have been discontinued since the skills requirement was paused during the pandemic before lawmakers killed it entirely.

The state's four-year graduation rate is 82.6%, up more than 10 points from six years ago. However, it still lags behind the national average graduation rate of 85 percent.

Oregon's graduation rates currently rank nearly last in the country. But it's complicated because states use different methodologies to calculate their graduation rates, making some states appear better than others.

Education

Texas Instruments' New Calculator Will Run Programs Written in Python (dallasnews.com) 126

"Dallas-based Texas Instruments' latest generation of calculators is getting a modern-day update with the addition of programming language Python," reports the Dallas Morning News: The goal is to expand students' ability to explore science, technology, engineering and math through the device that's all-but-required in the nation's high schools and colleges...

Though most of the company's $14 billion in annual revenue comes from semiconductors, its graphing calculator remains its most recognized consumer product. This latest TI-84 model, priced between $120 and $160 depending on the retailer, was made to accommodate the increasing importance of programming in the modern world.

Judging by photos in their press release, an "alpha" key maps the calculator's keys to the letters of the alphabet (indicated with yellow letters above each key). One page on its web site also mentions "Menu selections" that "help students with discovery and syntax." (And the site confirms the calculator will "display expressions, symbols and fractions just as you write them.")

There's even a file manager that "gives quick access to Python programs you have saved on your calculator. From here, you can create, edit, run and manage your files." And one page also mentions something called TI Connect CE software application, which "connects your computer and graphing calculator so they can talk to each other. Use it to transfer data, update your operating system, download calculator software applications or take screenshots of your graphing calculator."
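
As a purely generic illustration of the kind of short classroom program a student might save and run, here's ordinary Python (nothing TI-specific is assumed, and the calculator's own modules and screen constraints aren't modeled here):

```python
import math

def quadratic_roots(a, b, c):
    """Real roots of a*x**2 + b*x + c = 0, or None if there are none."""
    disc = b * b - 4 * a * c
    if disc < 0:
        return None
    root = math.sqrt(disc)
    return (-b + root) / (2 * a), (-b - root) / (2 * a)

print(quadratic_roots(1, -3, 2))   # (2.0, 1.0)
print(quadratic_roots(1, 0, 1))    # None (no real roots)
```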

I'm sure Slashdot's readers have some fond memories of their first calculator. But these new models have a full-color screen and a rechargeable battery that can last up to a month on a single charge. And Texas Instruments seems to think they could even replace computers in the classroom. "By adding Python to the calculators many students are already familiar with and use in class, we are making programming more accessible and approachable for all students," their press release argues, "eliminating the need for teachers to reserve separate computer labs to teach these important skills."

Education

SANS Institute Founder Hopes to Find New Cybersecurity Talent With a Game (esecurityplanet.com) 15

storagedude writes: Alan Paller, founder of the cybersecurity training organization SANS Technology Institute, has launched an initiative aimed at finding and developing cybersecurity talent at the community college and high school level — through a game developed by its CTO, James Lyne. A similar game was already the basis of a UK government program that has reached 250,000 students, and Paller hopes the U.S. will adopt a similar model to help ease the chronic shortage of cybersecurity talent. And Paller's own Cyber Talent Institute (or CTI) has already reached 29,000 students, largely through state-level partnerships.

But playing the game isn't the same as becoming a career-ready cybersecurity pro. By tapping high schools and community colleges, the group hopes to "discover and train a diverse new generation of 25,000 cyber stars by the year 2025," Paller told eSecurity Planet. "SANS is an organization that finds people who are already in the field and makes them better. What CTI is doing is going down a step in the pipeline, to the students, to find the talent earlier, so that we don't lose them. Because the way the education system works, only a few people seem to go into cybersecurity. We wanted to change that.

"You did an article earlier this month about looking in different places for talent, looking for people who are already working. That's the purpose of CTI. To reach out to students. It's to go beyond the pipeline that we automatically come into cybersecurity through math, computer science, and networking and open the funnel much wider. Find people who have not already found technology, but who have three characteristics that seem to make superstars — tenacity, curiosity, and love of learning new things. They don't mind being faced with new problems. They like them. And what the game does is find those people. So CTI is just moving to earlier in the pipeline."

Space

How Many Atoms Are In the Observable Universe? (livescience.com) 77

Long-time Slashdot reader fahrbot-bot quotes LiveScience's exploration of the math: To start out 'small,' there are around 7 octillion, or 7x10^27 (7 followed by 27 zeros), atoms in an average human body, according to The Guardian. Given this vast sum of atoms in one person alone, you might think it would be impossible to determine how many atoms are in the entire universe. And you'd be right: Because we have no idea how large the entire universe really is, we can't find out how many atoms are within it.

However, it is possible to work out roughly how many atoms are in the observable universe — the part of the universe that we can see and study — using some cosmological assumptions and a bit of math.

[...]

Doing the math

To work out the number of atoms in the observable universe, we need to know its mass, which means we have to find out how many stars there are. There are around 10^11 to 10^12 galaxies in the observable universe, and each galaxy contains between 10^11 and 10^12 stars, according to the European Space Agency. This gives us somewhere between 10^22 and 10^24 stars. For the purposes of this calculation, we can say that there are 10^23 stars in the observable universe. Of course, this is just a best guess; galaxies can range in size and number of stars, but because we can't count them individually, this will have to do for now.

On average, a star weighs around 2.2x10^32 pounds (10^32 kilograms), according to Science ABC, which means that the mass of the universe is around 2.2x10^55 pounds (10^55 kilograms). Now that we know the mass, or amount of matter, we need to see how many atoms fit into it. On average, each gram of matter has around 10^24 protons, according to Fermilab, a national laboratory for particle physics in Illinois. That means it is the same as the number of hydrogen atoms, because each hydrogen atom has only one proton (hence why we made the earlier assumption about hydrogen atoms).

This gives us 10^82 atoms in the observable universe. To put that into context, that is a 1 followed by 82 zeros: 10,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 atoms.
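
As a quick sanity check, here is the same back-of-the-envelope arithmetic in a few lines of Python, using only the round numbers quoted above:

```python
stars            = 1e23   # rough star count in the observable universe
star_mass_kg     = 1e32   # the article's figure for an average star
protons_per_gram = 1e24   # the article's figure: ~10^24 protons per gram of matter

total_mass_g = stars * star_mass_kg * 1000.0   # kilograms to grams
atoms = total_mass_g * protons_per_gram        # one proton per hydrogen atom

print(f"{atoms:.0e} atoms")   # ~1e+82 atoms
```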

This number is only a rough guess, based on a number of approximations and assumptions. But given our current understanding of the observable universe, it is unlikely to be too far off the mark.

Crime

French Engineer Claims He's Solved the Zodiac Killer's Final Code (msn.com) 57

The New York Times tells the story of Fayçal Ziraoui, a 38-year-old French-Moroccan business consultant who "caused an online uproar" after saying he'd cracked the last two unsolved ciphers of the four attributed to the Zodiac killer in California "and identified him, potentially ending a 50-year-old quest." Maybe because he said he cracked them in just two weeks. Many Zodiac enthusiasts consider the remaining ciphers — Z32 and Z13 — unsolvable because they are too short to determine the encryption key. An untold number of solutions could work, they say, rendering verification nearly impossible.

But Mr. Ziraoui said he had a sudden thought. The code-crackers who had solved the [earlier] 340-character cipher in December had been able to do so by identifying the encryption key, which they had put into the public domain when announcing their breakthrough. What if the killer used that same encryption key for the two remaining ciphers? So he said he applied it to the 32-character cipher, which the killer had included in a letter as the key to the location of a bomb set to go off at a school in the fall of 1970. (It never did, even though police failed to crack the code.) That produced a sequence of random letters from the alphabet. Mr. Ziraoui said he then worked through a half-dozen steps including letter-to-number substitutions, identifying coordinates in numbers and using a code-breaking program he created to crunch jumbles of letters into coherent words...

After two weeks of intense code-cracking, he deciphered the sentence, "LABOR DAY FIND 45.069 NORT 58.719 WEST." The message referred to coordinates based on the earth's magnetic field, not the more familiar geographic coordinates. The sequence zeroed in on a location near a school in South Lake Tahoe, a city in California referred to in another postcard believed to have been sent by the Zodiac killer in 1971.

An excited Mr. Ziraoui said he immediately turned to Z13, which supposedly revealed the killer's name, using the same encryption key and various cipher-cracking techniques. [The mostly un-coded letter includes a sentence which says "My name is _____," followed by a 13-character cipher.] After about an hour, Mr. Ziraoui said he came up with "KAYR," which he realized resembled the last name of Lawrence Kaye, a salesman and career criminal living in South Lake Tahoe who had been a suspect in the case. Mr. Kaye, who also used the pseudonym Kane, died in 2010.

The typo was similar to ones found in previous ciphers, he noticed, likely errors made by the killer when encoding the message. The result that was so close to Mr. Kaye's name and the South Lake Tahoe location were too much to be a coincidence, he thought. Mr. Kaye had been the subject of a report by Harvey Hines, a now-deceased police detective, who was convinced he was the Zodiac killer but was unable to convince his superiors. Around 2 a.m. on Jan. 3, an exhausted but elated Mr. Ziraoui posted a message entitled "Z13 — My Name is KAYE" on a 50,000-member Reddit forum dedicated to the Zodiac Killer.

The message was deleted within 30 minutes.

"Sorry, I've removed this one as part of a sort of general policy against Z13 solution posts," the forum's moderator wrote, arguing that the cipher was too short to be solvable.

Math

Mathematicians Welcome Computer-Assisted Proof in 'Grand Unification' Theory (nature.com) 36

Proof-assistant software handles an abstract concept at the cutting edge of research, revealing a bigger role for software in mathematics. From a report: Mathematicians have long used computers to do numerical calculations or manipulate complex formulas. In some cases, they have proved major results by making computers do massive amounts of repetitive work -- the most famous being a proof in the 1970s that any map can be coloured with just four different colours, and without filling any two adjacent countries with the same colour. But systems known as proof assistants go deeper. The user enters statements into the system to teach it the definition of a mathematical concept -- an object -- based on simpler objects that the machine already knows about.

A statement can also just refer to known objects, and the proof assistant will answer whether the fact is 'obviously' true or false based on its current knowledge. If the answer is not obvious, the user has to enter more details. Proof assistants thus force the user to lay out the logic of their arguments in a rigorous way, and they fill in simpler steps that human mathematicians had consciously or unconsciously skipped. Once researchers have done the hard work of translating a set of mathematical concepts into a proof assistant, the program generates a library of computer code that can be built on by other researchers and used to define higher-level mathematical objects. In this way, proof assistants can help to verify mathematical proofs that would otherwise be time-consuming and difficult, perhaps even practically impossible, for a human to check. Proof assistants have long had their fans, but this is the first time that they had a major role at the cutting edge of a field, says Kevin Buzzard, a mathematician at Imperial College London who was part of a collaboration that checked Scholze and Clausen's result. "The big remaining question was: can they handle complex mathematics?" says Buzzard. "We showed that they can."
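
For a flavor of that workflow, here is a tiny, hypothetical example in the Lean proof assistant (the system used in the collaboration Buzzard describes; the definitions below are invented for illustration and have nothing to do with the Scholze-Clausen mathematics). The user teaches the system a new object in terms of one it already knows, the machine accepts "obvious" facts by computation, and anything more general has to be justified in steps it can check:

```lean
-- Teach Lean a new object built from one it already knows (Nat).
def double (n : Nat) : Nat := n + n

-- 'Obviously' true: Lean certifies these by pure computation.
theorem double_two   : double 2 = 4  := rfl
theorem double_small : double 3 < 10 := by decide

-- A general statement: the user must supply a step Lean can verify.
theorem double_swap (m n : Nat) : double (m + n) = double (n + m) := by
  rw [Nat.add_comm m n]
```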

Math

When Graphs Are a Matter of Life and Death (newyorker.com) 122

Pie charts and scatter plots seem like ordinary tools, but they revolutionized the way we solve problems. From a report: John Carter has only an hour to decide. The most important auto race of the season is looming; it will be broadcast live on national television and could bring major prize money. If his team wins, it will get a sponsorship deal and a chance to start making some real profits for a change. There's just one problem. In seven of the past twenty-four races, the engine in the Carter Racing car has blown out. An engine failure live on TV will jeopardize sponsorships -- and the driver's life. But withdrawing has consequences, too. The wasted entry fee means finishing the season in debt, and the team won't be happy about the missed opportunity for glory. As Burns's First Law of Racing says, "Nobody ever won a race sitting in the pits."

One of the engine mechanics has a hunch about what's causing the blowouts. He thinks that the engine's head gasket might be breaking in cooler weather. To help Carter decide what to do, a graph is devised that shows the conditions during each of the blowouts: the outdoor temperature at the time of the race plotted against the number of breaks in the head gasket. The dots are scattered into a sort of crooked smile across a range of temperatures from about fifty-five degrees to seventy-five degrees. The upcoming race is forecast to be especially cold, just forty degrees, well below anything the cars have experienced before. So: race or withdraw?

This case study, based on real data, and devised by a pair of clever business professors, has been shown to students around the world for more than three decades. Most groups presented with the Carter Racing story look at the scattered dots on the graph and decide that the relationship between temperature and engine failure is inconclusive. Almost everyone chooses to race. Almost no one looks at that chart and asks to see the seventeen missing data points -- the data from those races which did not end in engine failure.
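
To make the missing-data point concrete, here is a small sketch with invented numbers (the case study's actual figures are not reproduced here): once all twenty-four races are on the table, grouping failures by temperature makes the pattern hard to miss.

```python
# 7 blowouts plus 17 invented failure-free races: (temperature in F, engine failed?)
races = [(55, True), (57, True), (58, True), (60, True), (63, True), (66, True), (75, True)]
races += [(t, False) for t in (65, 66, 67, 68, 69, 70, 70, 71, 72, 72, 73, 74, 75, 76, 77, 78, 79)]

def failure_rate(lo, hi):
    band = [failed for temp, failed in races if lo <= temp < hi]
    return sum(band) / len(band) if band else None

for lo, hi in [(50, 65), (65, 70), (70, 80)]:
    print(f"{lo}-{hi} F: {failure_rate(lo, hi):.0%} of races had a blowout")
```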
