Education

Two-Thirds of American Employees Regret Their College Degrees (cbsnews.com) 209

An anonymous reader quotes a report from CBS News: A college education is still considered a pathway to higher lifetime earnings and gainful employment for Americans. Nevertheless, two-thirds of employees report having regrets when it comes to their advanced degrees, according to a PayScale survey of 248,000 respondents conducted this past spring and released Tuesday. Student loan debt, which has ballooned to nearly $1.6 trillion nationwide in 2019, was the No. 1 regret among workers with college degrees. About 27% of survey respondents listed student loans as their top misgiving, PayScale said. College debt was followed by chosen area of study (12%) as a top regret for employees, though this varied greatly by major. Other regrets include poor networking, school choice, too many degrees, time spent completing education and academic underachievement. "Those with science, technology, engineering and math majors, who are typically more likely to enjoy higher salaries, reported more satisfaction with their degrees," the report adds. "About 42% of engineering grads and 35% of computer science grads said they had no regrets."

Those with the most regrets include humanities majors, who are least likely to earn higher pay post-graduation. "About 75% of humanities majors said they regretted their college education," the report says. "About 73% of graduates who studied social sciences, physical and life sciences, and art also said the same." Somewhere in the middle were 66% of business graduates, 67% of health sciences graduates and 68% of math graduates who said they regretted their education.
Programming

Remembering The ENIAC Programmers (freedom-to-tinker.com) 85

On Princeton's "Freedom to Tinker" site, the founder of the ENIAC Programmers Project summarizes 20 years of its research, remembering the "incredible acts of computing innovation during and just after WWII" that "established the foundation of modern computing and programming."

Commissioned in 1942, and launched in 1946, the ENIAC computer, with its 18,000 vacuum tubes, was the world's very first modern computer (all-electronic, programmable, and general-purpose). "Key technologists of the time, of course, told the Army that the ENIAC would never work."

Slashdot reader AmiMoJo quotes Cory Doctorow: The ENIAC programmers had to invent programming as we know it, working without programming codes (these were invented a few years later for UNIVAC by Betty Holberton): they "broke down the differential calculus ballistics trajectory program" into small steps the computer could handle, then literally wired together the program by affixing cables and flicking the machine's 3,000 switches in the correct sequences. To capture it all, they created meticulous flowcharts that described the program's workings.
From the site: Gunners needed to know what angle to shoot their artillery to hit a target 8 to 10 miles away.... The Army's Ballistics Research Labs (BRL) located women math graduates from schools nearby [who] worked day and night, six days a week, calculating thousands of ballistics trajectories which were compiled into artillery firing tables and sent to soldiers in the battlefields. It was a tremendous effort. Second, the Army and BRL agreed to commission a highly-experimental machine... [Six] women studied ENIAC's wiring and logical diagrams and taught themselves how to program it...

After the war, the Army asked all six ENIAC Programmers to continue their work -- no soldier returning home from the battlefield could program ENIAC... They made other pivotal contributions: Jean Bartik led the team that converted ENIAC into one of the world's first stored-program computers, and her best friend Betty Holberton joined the Eckert-Mauchly Computer Corporation and wrote critical new programming tools for UNIVAC I, the first commercial computer, including the C-10 instruction code (a predecessor to programming languages).
You can still find its original operating manual online. ("Do not open d-c fuse cabinet with the d-c power turned on. This not only exposes a person to voltage differences of around 1500 volts but the person may be burned by flying pieces of molten fuse wire in case a fuse should blow.")

It performed calculations that helped design the world's first hydrogen bomb.
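The firing-table work described above is, at its core, numerical integration of a projectile's equations of motion -- the kind of step-by-step calculation the women first did by hand and then wired into ENIAC. A minimal sketch in modern Python, using simple Euler integration (the `trajectory` function and its drag constant are illustrative inventions, not historical firing-table values or ENIAC code):

```python
import math

def trajectory(v0, angle_deg, drag=0.00005, dt=0.01, g=9.81):
    """Euler-method integration of a projectile with quadratic drag.

    A toy version of the ballistics computations ENIAC automated:
    advance position and velocity in small time steps until the
    shell returns to the ground, then report the range in meters.
    """
    angle = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(angle), v0 * math.sin(angle)
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        # drag opposes motion; gravity pulls straight down
        vx -= drag * speed * vx * dt
        vy -= (g + drag * speed * vy) * dt
        x += vx * dt
        y += vy * dt
    return x

print(f"range at 45 degrees: {trajectory(450.0, 45.0) / 1000:.1f} km")
```

Each trajectory in a firing table meant thousands of such steps, which is why a single table took human computers weeks and why automating the work mattered so much.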
Education

Should Schools Teach Computer Science Instead of Physics? (floridaphoenix.com) 316

Long-time Slashdot reader theodp writes: "Other than trying to keep my kids from falling down the stairs in the Governor's mansion I don't know how much I deal with physics daily," quipped Florida governor Ron DeSantis as he explained his support for a bill pushed by Microsoft and Code.org lobbyists that will allow computer science credit to be substituted for traditional science classes to meet high school graduation requirements. "You cannot live in our modern society without dealing with technology or computers in your daily life."

From the Governor's press release: "Expanding access to computer science learning is critically important for the future of Florida's students," said Sheela VanHoose of Code.org. "This historic investment by the Governor and the Florida Legislature represents the nation's largest one-time investment in computer science teachers by a state."

"Providing the tools that students need to learn computer science is crucial to filling the jobs of tomorrow," said Fred Humphries, Corporate Vice President of U.S. Government Affairs at Microsoft. "We applaud Governor DeSantis for approving crucial funding to help train more computer science teachers as part of a broader commitment to prepare students for the thousands of computing and data science jobs in Florida. Microsoft looks forward to continuing to work with Governor DeSantis to ensure that all students are ready for the career opportunities created by our digital economy."

Math

A 53-Year-Old Network Coloring Conjecture Is Disproved (quantamagazine.org) 49

In just three pages, a Russian mathematician has presented a better way to color certain types of networks than many experts thought possible. From a report: A paper posted online last month has disproved a 53-year-old conjecture about the best way to assign colors to the nodes of a network. The paper shows, in a mere three pages, that there are better ways to color certain networks than many mathematicians had supposed possible. Network coloring problems, which were inspired by the question of how to color maps so that adjoining countries are different colors, have been a focus of study among mathematicians for nearly 200 years. The goal is to figure out how to color the nodes of some network (or graph, as mathematicians call them) so that no two connected nodes share the same color. Depending on the context, such a coloring can provide an effective way to seat guests at a wedding, schedule factory tasks for different time slots, or even solve a sudoku puzzle.

Graph coloring problems tend to be simple to state, but they are often enormously hard to solve. Even the question that launched the field -- do four colors suffice to color any map? -- took more than a century to answer (the answer is yes, in case you were wondering). The problem tackled in the new paper seemed, until now, to be no exception to this rule. Unsolved for more than 50 years, it concerns tensor products -- graphs made by combining two different graphs (call them G and H) in a specific way. The tensor product of G and H is a new, larger graph in which each node represents a pair of nodes from the original graphs -- one from G and one from H -- and two nodes in the tensor product are connected if both their corresponding nodes in G and their corresponding nodes in H are connected.
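The tensor-product construction described above is mechanical enough to sketch in a few lines of Python. This is a toy illustration only: `tensor_product` and `is_proper_coloring` are made-up helper names, and graphs are represented as sets of frozenset edges.

```python
from itertools import product

def tensor_product(G, H):
    """Tensor product of two graphs given as sets of frozenset edges.

    Nodes of the product are pairs (g, h); two pairs are adjacent
    exactly when both coordinates are adjacent in their own graph.
    """
    edges = set()
    for (g1, g2), (h1, h2) in product(G, H):
        # each pair of edges, one from G and one from H,
        # contributes two edges to the product
        edges.add(frozenset({(g1, h1), (g2, h2)}))
        edges.add(frozenset({(g1, h2), (g2, h1)}))
    return edges

def is_proper_coloring(edges, color):
    """True if no edge joins two nodes of the same color."""
    return all(color[u] != color[v] for u, v in map(tuple, edges))

# A triangle tensored with itself. Coloring each product node by its
# first coordinate is always proper, which is why the 53-year-old
# conjecture held that you could never use fewer colors than the more
# easily colored of G and H; the new paper shows that for some (much
# larger) graphs, you can.
K3 = {frozenset({0, 1}), frozenset({1, 2}), frozenset({0, 2})}
prod_edges = tensor_product(K3, K3)
coloring = {v: v[0] for e in prod_edges for v in e}
print(is_proper_coloring(prod_edges, coloring))  # True
```

Inheriting a coloring from either factor shows the product never needs more colors than the "easier" of the two graphs; the disproved conjecture was that it could never need fewer.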

Microsoft

New Hampshire Unveils a Historical Highway Marker For The BASIC Programming Language (concordmonitor.com) 68

"It took 10 months to get it done, but the Granite State is now officially a Geeky State," writes Concord Monitor science reporter David Brooks.

"The latest New Hampshire Historical Highway Marker, celebrating the creation of the BASIC computer language at Dartmouth in 1964, has officially been installed. Everybody who has ever typed a GOTO command can feel proud..." Last August, I wrote in this column that the 255 official historical markers placed alongside state roads told us enough about covered bridges and birthplaces of famous people but not enough about geekiness. Since anybody can submit a suggestion for a new sign, I thought I'd give it a shot.

The creation of BASIC, the first programming language designed to let newbies dip their intellectual toes into the cutting-edge world of software, seemed the obvious candidate. Beginner's All-purpose Symbolic Instruction Code has probably done more to introduce people to computer programming than anything else ever created. That includes me: The only functioning programs I've ever created were in vanilla BASIC, and I still recall the great satisfaction of typing 100 END...

But BASIC wasn't just a toy for classrooms. It proved robust enough to survive for decades, helping launch Microsoft along the way, and there are descendants still in use today. In short, it's way more important than any covered bridge.

The campaign for the marker was supported by Thomas Kurtz, the retired Dartmouth math professor who'd created BASIC along with the late John Kemeny. "Our original idea was to mention both BASIC and the Dartmouth Time-Sharing System, an early system by which far-flung computers could share resources. They were created hand-in-hand as part of Kemeny's idea of putting computing in the hands of the unwashed masses.

"However, the N.H. Division of Historical Resources, which has decades of experience creating these markers, said it would be too hard to cram both concepts into the limited verbiage of a sign."

The highway marker calls BASIC "the first user-friendly computer programming language... BASIC made computer programming accessible to college students and, with the later popularity of personal computers, to users everywhere. It became the standard way that people all over the world learned to program computers, and variants of BASIC are still in use today."

In the original submission, an anonymous Slashdot reader notes that last month, Manchester, New Hampshire also unveiled a statue of Ralph Baer, whose team built the first home video game console, sold as the Magnavox Odyssey. The statue depicts Baer seated on a park bench. "The Granite State isn't shy about its geek side."
Advertising

'Apple Wants To Kill the Ad Industry. It's Forcing Developers To Help.' (char.gd) 221

"As a consumer, the idea of Apple sign-in is genuinely an exciting one..." writes developer/tech journalist Owen Williams at Char.gd.

"As a person in digital marketing, as well as a coder and startup founder, the feature terrifies me... I don't have a choice. Apple plans to force developers using third-party signin features to add its signin alongside any competing ones, rather than allowing them to make the choice. Essentially, Apple will force its success..." [B]y selling the tool as a privacy-focused feature, the company is building a new identity system that it owns entirely. Because it is a powerful privacy feature, it makes it hard to debate this move in any constructive way -- personally, I think we need more tools like this, just not from the very platforms further entrenching their own kingdoms... All of the largest tech companies have switched gears to this model, including Google, and now sell a narrative that nobody can be trusted with your data -- but it's fine to give it all to them, instead. There's bitter irony in Apple denouncing other companies' collection of data with a sign-in service, then launching its own, asking that you give that data to them, instead. I definitely trust Apple to act with my interests at heart today, but what about tomorrow, when the bottom falls out of iPhone sales, and the math changes?

I'm not arguing that any of these advertising practices are right or wrong, but rather that such a hamfisted approach isn't all that it seems. The ad industry gets a bad rap -- and does need to improve -- but allowing a company that has a vested interest in crippling it to dictate the rules by forcing developers to implement their technology is wrong...

This feature, and the way it's being forced on developers, is a fantastic example of why companies like Apple and Google should be broken up: it's clearly using the App Store, and its reach, to force the industry's hand in its favor -- rather than compete on merit.

Education

Why Are Some Wealthy Kids Getting Extra Time To Finish Their SAT Tests? (cbsnews.com) 210

Students from wealthy high schools are more than twice as likely to qualify for extra time to finish their SAT or ACT college entrance tests than students from poor schools -- and in some cases, they're getting 50% more time.

An anonymous reader quotes CBS News: About 4.2 percent of students at wealthy high schools qualified for a 504 designation, a plan that enables the students to qualify for accommodations such as extra test-taking time, according to an analysis of federal data for 9,000 public schools by The Wall Street Journal. By comparison, only 1.6 percent of students in poor high schools qualified for the same designation.... These plans, named after a federal statute prohibiting discrimination against students with disabilities, can cover a wide range of issues, ranging from anxiety to deafness and other impairments. But critics of 504 plans say some families may be abusing the system in order to secure much-needed extra time for their children on the high-stakes exams...

About one-sixth of ACT test-takers don't complete the exam within its normal time limit, the Journal noted. And a redesign of the SAT in 2014 signaled how many students struggle with finishing on time, as fewer than half of students completed the math section in a prototype of the new test. Naturally, gaining an extra 50 percent of the allotted time can alleviate some of the stress of time management. And the SATs and ACTs don't alert colleges about whether a student received extra time to complete the tests, eliminating a disincentive for students to request the accommodation.

It's apparently been going on for years, according to CBS. In 2000 a California state report found that students getting extra time for their tests "were predominately white, wealthy, and from private schools."

And now in Boston's "well-heeled" Newton suburb, about one-third of students qualified for extra time.
Government

EPA Plans To Get Thousands of Pollution Deaths Off the Books by Changing Its Math (nytimes.com) 308

The Environmental Protection Agency plans to change the way it calculates the health risks of air pollution, a shift that would make it easier to roll back a key climate change rule because it would result in far fewer predicted deaths from pollution, New York Times reported this week, citing five people with knowledge of the agency's plans. From the report: The E.P.A. had originally forecast that eliminating the Obama-era rule, the Clean Power Plan, and replacing it with a new measure would have resulted in an additional 1,400 premature deaths per year. The new analytical model would significantly reduce that number and would most likely be used by the Trump administration to defend further rollbacks of air pollution rules if it is formally adopted. The proposed shift is the latest example of the Trump administration downgrading the estimates of environmental harm from pollution in regulations. In this case, the proposed methodology would assume there is little or no health benefit to making the air any cleaner than what the law requires. Many experts said that approach was not scientifically sound and that, in the real world, there are no safe levels of the fine particulate pollution associated with the burning of fossil fuels.
Math

Measurements Confirm Universe Is Expanding Faster Than Expected (sciencedaily.com) 186

Slashdot reader The Snazster shares a ScienceDaily report based on materials provided by Johns Hopkins University: New measurements from NASA's Hubble Space Telescope confirm that the Universe is expanding about 9% faster than expected based on its trajectory seen shortly after the big bang, astronomers say. The new measurements, published April 25 in the Astrophysical Journal Letters, reduce the chances that the disparity is an accident from 1 in 3,000 to only 1 in 100,000 and suggest that new physics may be needed to better understand the cosmos.

In this study, [Adam Riess, Bloomberg Distinguished Professor of Physics and Astronomy at The Johns Hopkins University, Nobel Laureate and the project's leader] and his SH0ES (Supernovae, H0, for the Equation of State) Team analyzed light from 70 stars in our neighboring galaxy, the Large Magellanic Cloud, with a new method that allowed for capturing quick images of these stars. The stars, called Cepheid variables, brighten and dim at predictable rates that are used to measure nearby intergalactic distances. The usual method for measuring the stars is incredibly time-consuming; the Hubble can only observe one star for every 90-minute orbit around Earth. Using their new method called DASH (Drift And Shift), the researchers used Hubble as a "point-and-shoot" camera to look at groups of Cepheids, thereby allowing the team to observe a dozen Cepheids in the same amount of time it would normally take to observe just one. [...] As the team's measurements have become more precise, their calculation of the Hubble constant has remained at odds with the expected value derived from observations of the early universe's expansion by the European Space Agency's Planck satellite based on conditions Planck observed 380,000 years after the Big Bang.
"This is not just two experiments disagreeing," Riess explained. "We are measuring something fundamentally different. One is a measurement of how fast the universe is expanding today, as we see it. The other is a prediction based on the physics of the early universe and on measurements of how fast it ought to be expanding. If these values don't agree, there becomes a very strong likelihood that we're missing something in the cosmological model that connects the two eras."
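The size of the disagreement is easy to see with the rough published figures. The numbers below are the approximate 2019 values (both carry error bars) and are used purely for illustration:

```python
# Approximate Hubble constant values, in km/s per megaparsec:
H0_local = 74.0    # SH0ES local measurement via Cepheid-calibrated distances
H0_planck = 67.4   # prediction from Planck's early-universe observations

# The ~9% tension the article describes:
discrepancy = (H0_local - H0_planck) / H0_planck
print(f"local value exceeds the prediction by {discrepancy:.1%}")

# Hubble's law, v = H0 * d: the same galaxy at 100 megaparsecs
# recedes measurably faster under the locally measured value.
d = 100.0  # distance in Mpc
print(f"v(local)  = {H0_local * d:.0f} km/s")
print(f"v(planck) = {H0_planck * d:.0f} km/s")
```

The percentage here is the whole controversy in one number: two sound measurement chains, anchored at opposite ends of cosmic history, refuse to meet in the middle.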
Education

LeBron James' STEM-Based School Is Showing Promise (goodnewsnetwork.org) 102

Last year, NBA superstar LeBron James opened an experimental school that focuses on teaching a STEM curriculum to students who have a higher probability of failing academically or dropping out of school. The New York Times is now reporting that "the inaugural classes of third and fourth graders at [the I PROMISE School] posted extraordinary results in their first set of district assessments. Ninety percent met or exceeded individual growth goals in reading and math (Warning: source may be paywalled; alternative source), outpacing their peers across the district." From the report: The students' scores reflect their performance on the Measures of Academic Progress assessment, a nationally recognized test administered by NWEA, an evaluation association. In reading, where both classes had scored in the lowest, or first, percentile, third graders moved to the ninth percentile, and fourth graders to the 16th. In math, third graders jumped from the lowest percentile to the 18th, while fourth graders moved from the second percentile to the 30th.

The 90 percent of I Promise students who met their goals exceeded the 70 percent of students districtwide, and scored in the 99th growth percentile of the evaluation association's school norms, which the district said showed that students' test scores increased at a higher rate than 99 out of 100 schools nationally. The students have a long way to go to even join the middle of the pack. And time will tell whether the gains are sustainable and how they stack up against rigorous state standardized tests at the end of the year. To some extent, the excitement surrounding the students' progress illustrates a somber reality in urban education, where big hopes hinge on small victories.

Space

'BlackHoles@Home' Will Use Your PC For DIY Gravitational Wave Analysis (phys.org) 50

West Virginia University assistant professor Zachariah Etienne is launching "a global volunteer computing effort" analyzing gravitational waves from colliding black holes, reports Phys.org: "As our gravitational wave detectors become more sensitive, we're going to need to greatly expand our efforts to understand all of the information encoded in gravitational waves from colliding binary black holes," Etienne said. "We are turning to the general public to help with these efforts, which involve generating unprecedented numbers of self-consistent simulations of these extremely energetic collisions. This will truly be an inclusive effort, and we especially hope to inspire the next generation of scientists in this growing field of gravitational wave astrophysics."

His team -- and the scientific community in general -- needs computing capacity to run the simulations required to cover all possibilities related to the properties and other information contained in gravitational waves. "Each desktop computer will be able to perform a single simulation of colliding black holes," said Etienne. By seeking public involvement through use of vast numbers of personal desktop computers, Etienne and others hope to dramatically increase the throughput of the theoretical gravitational wave predictions needed to extract information from observations of the collisions.

Etienne and his team are building a website with downloadable software based on the same Berkeley Open Infrastructure for Network Computing, or BOINC, system used for the SETI@Home project and other scientific applications. The free middleware system is designed to help harness the processing power of thousands of personal computers across the globe. The West Virginia team has named their project BlackHoles@Home and expects to have it up and running later this year.

They have already established a website where the public can begin learning more about the effort.

Social Networks

'Hyperscans' Show How Brains Sync As People Interact (scientificamerican.com) 38

"A growing cadre of neuroscientists is using sophisticated technology -- and some very complicated math -- to capture what happens in one brain, two brains, or even 12 or 15 at a time when their owners are engaged in eye contact, storytelling, joint attention focused on a topic or object, or any other activity that requires social give and take," reports Scientific American. "Although the field of interactive social neuroscience is in its infancy, the hope remains that identifying the neural underpinnings of real social exchange will change our basic understanding of communication and ultimately improve education or inform treatment of the many psychiatric disorders that involve social impairments." Here's an excerpt from the report: [T]he first study to successfully monitor two brains at the same time took place nearly 20 years ago. Physicist Read Montague, now at Virginia Tech, and his colleagues put two people in separate functional magnetic resonance imaging (fMRI) machines and observed their brain activity as they engaged in a simple competitive game in which one player (the sender) transmitted a signal about whether he or she had just seen the color red or green and the other player (the receiver) had to decide if the sender was telling the truth or lying. Correct guesses resulted in rewards. Montague called the technique hyperscanning, and his work proved it was possible to observe two brains at once.

Initially, Montague's lead was followed mostly by other neuroeconomists rather than social neuroscientists. But the term hyperscanning is now applied to any brain imaging research that involves more than one person. Today the techniques that fit the bill include electroencephalography (EEG), magnetoencephalography and functional near-infrared spectroscopy. Use of these varied techniques, many of them quite new, has broadened the range of possible experiments and made hyperscanning less cumbersome and, as a consequence, much more popular.
The report also mentions a study from earlier this year that "used hyperscanning to show that eye contact prepares the social brain to empathize by activating the same areas of each person's brain simultaneously: the cerebellum, which helps predict the sensory consequences of actions, and the limbic mirror system, a set of brain areas that become active both when we move any part of the body (including the eyes) and when we observe someone else's movements."
Math

Old-School Slashdotter Discovers and Solves Longstanding Flaw In Basic Calculus (mindmatters.ai) 222

Longtime Slashdot reader johnnyb (Jonathan Bartlett) shares the findings of a new study he, along with co-author Asatur Zh. Khurshudyan, published this week in the journal DCDIS-A: Recently a longstanding flaw in elementary calculus was found and corrected. The "second derivative" has a notation that has confused many students. It turns out that part of the confusion is because the notation is wrong. Note -- I am the subject of the article. Mind Matters provides the technical details: "[T]he second derivative of y with respect to x has traditionally had the notation 'd²y/dx².' While this notation is expressed as a fraction, the problem is that it doesn't actually work as a fraction. The problem is well-known but it has been generally assumed that there is no way to express the second derivative in fraction form. It has been thought that differentials (the fundamental 'dy' and 'dx' that calculus works with) were not actual values and therefore they aren't actually in ratio with each other. Because of these underlying assumptions, the fact that you could not treat the second derivative as a fraction was not thought to be an anomaly. However, it turns out that, with minor modifications to the notation, the terms of the second derivative (and higher derivatives) can indeed be manipulated as an algebraic fraction. The revised notation for the second derivative is '(d²y/dx²) - (dy/dx)(d²x/dx²).'"

The report adds that while mathematicians haven't been getting wrong answers, "correcting the notation enables mathematicians to work with fewer special-case formulas and also to develop a more intuitive understanding of the nature of differentials."
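The revised expression can be motivated by applying the ordinary quotient rule to the differential of dy/dx, treating differentials as algebraic quantities and writing d²y for d(dy). A sketch of the algebra, consistent with the form quoted above:

```latex
\frac{d\!\left(\frac{dy}{dx}\right)}{dx}
  = \frac{1}{dx}\cdot\frac{dx\,d^{2}y - dy\,d^{2}x}{dx^{2}}
  = \frac{d^{2}y}{dx^{2}} - \frac{dy}{dx}\,\frac{d^{2}x}{dx^{2}}
```

When x is the independent variable, d²x vanishes and the second term drops out, recovering the familiar d²y/dx²; the extra term is what makes the notation behave as a genuine fraction under changes of variable.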
AMD

Could AMD's Upcoming EPYC 'Rome' Server Processors Feature Up To 162 PCIe Lanes? (tomshardware.com) 107

jwhyche (Slashdot reader #6,192) tipped us off to some interesting speculation about AMD's upcoming Zen 2-based EPYC Rome server processors. "The new Epyc processor would be Gen 4 PCIe where Intel is still using Gen 3. Gen 4 PCIe features twice the bandwidth of the older Gen 3 specification."

And now Tom's Hardware reports: While AMD has said that a single EPYC Rome processor could deliver up to 128 PCIe lanes, the company hasn't stated how many lanes two processors could deliver in a dual-socket server. According to ServeTheHome.com, there's a distinct possibility EPYC could feature up to 162 PCIe 4.0 lanes in a dual-socket configuration, which is 82 more lanes than Intel's dual-socket Cascade Lake Xeon servers. That even beats Intel's latest 56-core 112-thread Platinum 9200-series processors, which expose 80 PCIe lanes per dual-socket server.

Patrick Kennedy at ServeTheHome, a publication focused on high-performance computing, and RetiredEngineer on Twitter have both concluded that two Rome CPUs could support 160 PCIe 4.0 lanes. Kennedy even expects there will be an additional PCIe lane per CPU (meaning 129 in a single socket), bringing the total number of lanes in a dual-socket server up to 162, but with the caveat that this additional lane per socket could only be used for the baseboard management controller (or BMC), a vital component of server motherboards... If @RetiredEngineer and ServeTheHome did their math correctly, then Intel has even more serious competition than AMD has let on.

Music

The Swedish DJ Who Invented Industrially-Manufactured Pop Music (bbc.com) 110

"BBC Culture reports on DJ Denniz Pop (born Dagge Volle), who couldn't sing, play an instrument, or write a song but could mathematically craft a song from stitching together electronically programmed sounds and beats," writes Slashdot reader dryriver. "Pop was the musical brains behind acts ranging from the Backstreet Boys, *NSYNC, Ace Of Base to Britney Spears, and trained Max Martin who wrote 22 Billboard #1 hits for the likes of Taylor Swift, The Weeknd, Katy Perry, P!nk, Justin Timberlake, Ariana Grande and Maroon 5 using a technique called 'Melodic Math.'" From the report: In a basement in Stockholm's suburbs, Pop brought together an elite team of eight songwriters and producers for a new venture -- Cheiron Studios -- in 1992. Over the next eight years they would go on to sell hundreds of millions of records through the likes of Ace of Base, 5ive, Robyn, Boyzone, Backstreet Boys, Westlife, *NSYNC and Britney Spears. The secret of their songwriting success was to marry the melody to the beat, not work against it, and to have a big chorus. The team at Cheiron followed Pop's example, experimenting in clubs across the capital with up to a hundred different versions of each new track -- meticulously documenting the combinations of beats and melodies that made the club crowds go wild. Through these experiments, an entirely new genre of music blossomed, one that seemed tailor-made for the age of manufactured boybands and girl groups. Having grown up in socialist Sweden, Pop's approach to writing music was almost utilitarian. Like so many Swedish success stories -- IKEA, H&M, Volvo and Spotify -- the Cheiron team wanted their product to appeal to the maximum number of people, which in a country with a population of only nine million meant focusing outside the nation's borders.
Pop designed his music to reflect the lives of the people who bought more music than anyone else -- American teenagers -- at least as far as he understood them from his basement in faraway Stockholm.
Social Networks

Linus Torvalds on Social Media: 'It's a Disease. It Seems To Encourage Bad Behavior.' (linuxjournal.com) 305

From a wide-ranging interview of Linus Torvalds with Linux Journal on the magazine's 25th anniversary: Linux Journal: If you had to fix one thing about the networked world, what would it be?
Linus: Nothing technical. But, I absolutely detest modern "social media" -- Twitter, Facebook, Instagram. It's a disease. It seems to encourage bad behavior. I think part of it is something that email shares too, and that I've said before: "On the internet, nobody can hear you being subtle". When you're not talking to somebody face to face, and you miss all the normal social cues, it's easy to miss humor and sarcasm, but it's also very easy to overlook the reaction of the recipient, so you get things like flame wars, etc., that might not happen as easily with face-to-face interaction. But email still works. You still have to put in the effort to write it, and there's generally some actual content (technical or otherwise). The whole "liking" and "sharing" model is just garbage. There is no effort and no quality control. In fact, it's all geared to the reverse of quality control, with lowest common denominator targets, and click-bait, and things designed to generate an emotional response, often one of moral outrage.

Add in anonymity, and it's just disgusting. When you don't even put your real name on your garbage (or the garbage you share or like), it really doesn't help. I'm actually one of those people who thinks that anonymity is overrated. Some people confuse privacy and anonymity and think they go hand in hand, and that protecting privacy means that you need to protect anonymity. I think that's wrong. Anonymity is important if you're a whistle-blower, but if you cannot prove your identity, your crazy rant on some social-media platform shouldn't be visible, and you shouldn't be able to share it or like it.

Linux Journal: Is there any advice you'd like to give to young programmers/computer science students?
Linus: I'm actually the worst person to ask. I knew I was interested in math and computers since an early age, and I was largely self-taught until university. And everything I did was fairly self-driven. So I don't understand the problems people face when they say "what should I do?" It's not where I came from at all.

Math

Windows 10 Calculator Will Soon Be Able To Graph Math Equations (zdnet.com) 130

Earlier this month, Microsoft made the source code for its Windows calculator available on GitHub. This has spurred developers to propose new features, including a graphing mode that will make its way into the official Windows Calculator app. The "Graphing Mode" is one of 30+ suggestions that open-source contributors have proposed so far. ZDNet reports: As its name implies, Graphing Mode will allow users to create graphs based on mathematical equations, in a similar way to Matlab's (way more advanced) Plotting Mode. The feature was proposed by Microsoft engineer Dave Grochocki, also a member of the Windows Calculator team. In a GitHub issue Grochocki submitted to support his proposal, he argued that a graphing mode would help students learn algebra more easily.

"High school algebra is the gateway to mathematics and all other disciplines of STEM," Grochocki said. "However, algebra is the single most failed course in high school, as well as the most failed course in community college." By adding a Graphing Mode to Windows Calculator, an app included with all Windows 10 versions, the Microsoft engineer hopes to provide students and teachers with a free tool to help schools across the world.
"Physical graphing calculators can be expensive, software solutions require licenses and configuration by school IT departments, and online solutions are not always an option," he added. "Graphing capabilities in their daily tools are essential for students who are beginning to explore linear algebra as early as 8th grade. [...] At present, Windows Calculator does not currently have the needed functionality to meet the demands of students."

There's no timeline for the new graphing mode, but it is expected to arrive soon.
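At its core, a graphing mode does two things for an algebra student: it samples an equation over a range of x values and draws the resulting points, making features like x-intercepts visible. A minimal, dependency-free sketch of that idea (the function names and the example equation are illustrative, not taken from the Calculator source):

```python
def sample(f, lo, hi, n=200):
    """Sample f over [lo, hi] -- the grid of points a graphing mode would draw."""
    step = (hi - lo) / n
    return [(lo + i * step, f(lo + i * step)) for i in range(n + 1)]

def sign_change_roots(points):
    """Approximate x-intercepts: places where consecutive samples change sign."""
    roots = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if y0 == 0:
            roots.append(x0)
        elif y0 * y1 < 0:
            # Linearly interpolate between the two bracketing samples.
            roots.append(x0 - y0 * (x1 - x0) / (y1 - y0))
    return roots

# y = x^2 - 3x + 2 factors as (x - 1)(x - 2), so it crosses zero at x = 1 and x = 2.
pts = sample(lambda x: x * x - 3 * x + 2, -1.0, 4.0)
print(sign_change_roots(pts))
```

Seeing the curve cross the axis at x = 1 and x = 2 is exactly the kind of visual feedback Grochocki argues struggling algebra students need.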
Programming

Coders' Primal Urge To Kill Inefficiency -- Everywhere (wired.com) 181

For software engineers, lack of friction is an aesthetic joy, an emotional high, the ideal existential state. It's what drives them, and what shapes our world. An excerpt from an upcoming book on coding, via Wired: The thrust of Silicon Valley is always to take human activity and shift it into metabolic overdrive. And maybe you've wondered, why the heck is that? Why do techies insist that things should be sped up, torqued, optimized? There's one obvious reason, of course: They do it because of the dictates of the market. Capitalism handsomely rewards anyone who can improve a process and squeeze some margin out. But with software, there's something else going on too. For coders, efficiency is more than just a tool for business. It's an existential state, an emotional driver.

Coders might have different backgrounds and political opinions, but nearly every one I've ever met found deep, almost soulful pleasure in taking something inefficient -- even just a little bit slow -- and tightening it up a notch. Removing the friction from a system is an aesthetic joy; coders' eyes blaze when they talk about making something run faster or how they eliminated some bothersome human effort from a process. This passion for efficiency isn't unique to software developers. Engineers and inventors have long been motivated by it. During the early years of industrialization, engineers elevated the automation of everyday tasks to a moral good. The engineer was humanity's "redeemer from despairing drudgery and burdensome labor," as Charles Hermany, an engineer himself, wrote in 1904.

[...] Many of today's programmers have their efficiency "aha" moment in their teenage years, when they discover that life is full of blindingly dull repetitive tasks and that computers are really good at doing them. (Math homework, with its dull litany of exercises, was one thing that inspired a number of coders I've talked to.) Larry Wall, who created the Perl programming language, and several coauthors wrote that one of the key virtues of a programmer is "laziness" -- of the variety where your unwillingness to perform rote actions inspires you to do the work to automate them.
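The "laziness" Wall describes is concrete: a few minutes writing a throwaway script beats an hour of rote work. A toy sketch of the math-homework case the excerpt mentions (the drill format here is invented for illustration):

```python
# Instead of grinding through a page of multiplication drills by hand,
# generate the whole answer key once.
def answer_key(lo=2, hi=12):
    """Map each drill problem, e.g. '3 x 4', to its answer."""
    return {f"{a} x {b}": a * b
            for a in range(lo, hi + 1)
            for b in range(lo, hi + 1)}

key = answer_key(2, 4)
for problem, answer in sorted(key.items()):
    print(f"{problem} = {answer}")
```

The payoff compounds: the script handles a 12-by-12 drill sheet as easily as a 3-by-3 one, which is the "aha" the excerpt describes.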

Math

Is Statistical Significance Significant? (npr.org) 184

More than 850 scientists and statisticians told the authors of a Nature commentary that they endorse its call to abandon "statistical significance." Critics say that declaring a result to be statistically significant or not essentially forces complicated questions to be answered as true or false. "The world is much more uncertain than that," says Nicole Lazar, a professor of statistics at the University of Georgia. An entire issue of the journal The American Statistician is devoted to this question, with 43 articles and a 17,500-word editorial that Lazar co-authored.

"In the early 20th century, the father of statistics, R.A. Fisher, developed a test of significance," reports NPR. "It involves a variable called the p-value, which he intended to be a guide for judging results. Over the years, scientists have warped that idea beyond all recognition, creating an arbitrary threshold for the p-value, typically 0.05, and they use that to declare whether a scientific result is significant or not." Slashdot reader apoc.famine writes: In a nutshell, what the statisticians are recommending is that we embrace uncertainty, quantify it, and discuss it, rather than set arbitrary measures for when studies are worth publishing. This way research which appears interesting but which doesn't hit that magical p == 0.05 can be published and discussed, and scientists won't feel pressured to p-hack.
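The statisticians' point is that a p-value is a continuous measure of how surprising the data would be under the null hypothesis, not a true/false verdict. A small stdlib-only permutation test makes that concrete (the sample data here is invented for illustration):

```python
import random

def permutation_p_value(a, b, n_perm=10_000, seed=42):
    """Two-sample permutation test for a difference in means.

    Returns the fraction of random relabelings whose mean difference
    is at least as large as the observed one -- a continuum in [0, 1],
    not a binary significant/not-significant verdict.
    """
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = list(a) + list(b)
    n_a = len(a)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(sum(pooled[:n_a]) / n_a - sum(pooled[n_a:]) / len(b))
        if diff >= observed:
            hits += 1
    return hits / n_perm

group_a = [5.1, 4.9, 5.4, 5.0, 5.2, 5.3]
group_b = [4.8, 4.7, 5.0, 4.6, 4.9, 4.8]
p = permutation_p_value(group_a, group_b)
print(f"p = {p:.3f}")
```

Whether the printed number lands at 0.049 or 0.051, the evidence in the data is essentially the same -- which is exactly why thresholding it at 0.05 draws an arbitrary line.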
Transportation

Toyota Is Losing the Electric Car Race, So It Pretends Hybrids Are Better 434

Ben Jervey from DeSmogBlog writes about how Toyota is "using questionable logic" to claim hybrid vehicles are superior to electric vehicles -- a position it takes only because it decided years ago to invest in gasoline-electric hybrids and fuel cells for the long term instead of battery production, a decision that is now coming back to haunt it. From the report: There are at least 12 car companies currently selling an all-electric vehicle in the United States, and Toyota isn't one of them. Despite admitting recently that the Tesla Model 3 alone is responsible for half of Toyota's customer defections in North America -- as Prius drivers transition to all-electric -- the company has been an outspoken laggard in the race to electrification. Now, the company is using questionable logic to attempt to justify its inaction on electrification, claiming that its limited battery capacity better serves the planet by producing gasoline-electric hybrids. For years, Toyota leadership has shunned investment in all-electric cars, laying out a more conservative strategy to "electrify" its fleet -- essentially doubling down on hybrids and plug-in hybrids -- as a bridge to a future generation of hydrogen fuel cell vehicles. As Tesla, Nissan, and GM have led the technological shift to fully battery electric vehicles, Toyota has publicly bashed the prospects of all-electric fleets. (See, for instance, the swipe the company took at plug-in vehicles in this recent Toyota Corolla Hybrid commercial.)

Last week, at the Geneva Auto Show, a Toyota executive provided a curious explanation for the company's refusal to launch a single battery electric vehicle. As Car and Driver reported, Toyota claims that it is limited by battery production capacity and that "Toyota is able to produce enough batteries for 28,000 electric vehicles each year -- or for 1.5 million hybrid cars." In other words, because Toyota has neglected to invest in battery production, it can only produce enough batteries for a trivial number of all-electric vehicles. Due to this self-inflicted capacity shortage, the company is forced to choose between manufacturing 1.5 million hybrids or 28,000 electric cars. Using what Car and Driver called "fuzzy math," the company tried to justify the strategy to forgo electric vehicles (EVs) on environmental grounds. As Toyota explained it, "selling 1.5 million hybrid cars reduces carbon emissions by a third more than selling 28,000 EVs."
As for the "fuzzy math," Toyota's calculation "seems to assume that for every hybrid sold, a fully gasoline-powered car would be taken off the road," writes Jervey. "In reality, many Toyota hybrid buyers are replacing a Toyota hybrid. And, based on Toyota's own revelation that they are losing Prius drivers to Tesla, it stands to reason that many Toyota hybrid drivers would jump at the opportunity to transition to an all-electric Toyota."
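The "fuzzy math" hinges entirely on what each new hybrid is assumed to replace. A back-of-envelope sketch shows how the conclusion swings; every emission figure below is a hypothetical placeholder, not Toyota's actual data:

```python
# All per-km emission figures are invented for illustration only.
GAS_CAR = 250     # gCO2/km, assumed pure-gasoline car being replaced
NEW_HYBRID = 170  # gCO2/km, assumed new hybrid
EV = 0            # tailpipe emissions only; upstream grid emissions ignored here

N_HYBRIDS = 1_500_000  # Toyota's stated hybrid capacity per year
N_EVS = 28_000         # Toyota's stated EV capacity per year

# Toyota-style accounting: every hybrid sold replaces a pure gasoline car.
hybrids_saving = N_HYBRIDS * (GAS_CAR - NEW_HYBRID)
evs_saving = N_EVS * (GAS_CAR - EV)
print(hybrids_saving / evs_saving)  # hybrids look dramatically better

# Jervey's objection: many hybrid buyers are replacing an older hybrid,
# so the per-car saving is far smaller (again, an invented figure).
OLD_HYBRID = 173  # gCO2/km
hybrids_saving2 = N_HYBRIDS * (OLD_HYBRID - NEW_HYBRID)
print(hybrids_saving2 / evs_saving)  # the advantage can vanish entirely
</```

With these placeholder numbers the hybrid fleet's claimed advantage collapses from many times the EV saving to less than it -- the replacement assumption, not the battery arithmetic, is doing all the work in Toyota's claim.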
