Math

Interviews: Ask Mathematician Neil Sloane a Question 189

Considered by many to be one of the most influential mathematicians alive today, Neil Sloane has made major contributions to the fields of sphere packing, combinatorics, and error-correcting codes. He is probably best known for being the creator and curator of the On-Line Encyclopedia of Integer Sequences (OEIS), known simply as “Sloane” by its many users. The repository is over 50 years old and contains over 260,000 sequences.

Neil recently turned 76 but his passion for mathematics remains as strong as ever. Talking about a recent project, he writes: “Back in September I was looking at an old sequence in the OEIS. The sequence starts 1, 12, 123, 1234, 12345, ..., 123456789, 12345678910, 1234567891011, ... The n-th term: just write all the decimal numbers from 1 to n in a row and think of this as a big number. The entry for the sequence had a comment that it is expected that there are infinitely many terms which are primes, but that no prime was known, even though Dana Jacobsen had checked the first 64,000 terms. So I asked various friends and correspondents about this, and people extended the search somewhat. In fact Ernst Mayer has set up a crowdsourced project to look for primes in the sequence, and the sequence has now been checked to nearly n = 270,000 without finding a prime. But I am hopeful that a prime will appear before we get to n = 10^6. When a prime is found, as it surely will be, it probably won't be the largest prime known, but it will be close to the record (which is held by the latest Mersenne prime). We may make it into the top ten. It will certainly be the largest known prime which is easy to write down! (Explicitly, I mean. You may know that 2^32582657-1 is prime, but you won't be able to write down the decimal expansion without using a computer).”
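For readers who want to poke at the small end of the search themselves, here is a minimal sketch (not the project's actual code, and assuming Python with sympy installed) of generating terms and testing them; the real search near n = 270,000 requires specialized large-number primality tests.

```python
# Toy sketch: the n-th term is the decimal concatenation 1 2 3 ... n.
# sympy.isprime is plenty for small terms; it is nowhere near enough
# for the enormous candidates the real search is testing.
from sympy import isprime

def term(n):
    return int("".join(str(k) for k in range(1, n + 1)))

print(term(5))  # 12345

# Consistent with the search so far, no prime shows up among small terms.
print([n for n in range(1, 50) if isprime(term(n))])  # []
```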

Neil has agreed to take some time away from his favorite sequences and answer any questions you may have. As usual, ask as many as you'd like, but please, one question per post.
Math

Breakthrough Algorithm Reported For Graph Isomorphism (scottaaronson.com) 86

JoshuaZ writes: A major open problem in graph theory is how efficiently one can tell, given two graphs, whether or not they are isomorphic — that is, whether they're the same graph with just the labels changed. This problem is famous, along with factoring integers, as a problem that is potentially in between P and NP in difficulty. Now, Laszlo Babai has reported that he has a quasipolynomial time algorithm which he sketched out at a set of talks at the University of Chicago. Scott Aaronson was one of the first to break the news, and his latest blog entry and its comments contain further discussion of the result. The new algorithm places the problem of graph isomorphism as, at most, just barely above P. Babai's result depends on the classification of finite simple groups, a deep result in algebra whose proof consists of thousands of pages over hundreds of distinct papers. Unlike the problem of factoring integers, improvements in this algorithm are unlikely to impact cryptography in any direct way, since no cryptographic systems depend on the difficulty of determining when graphs are isomorphic.
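For a concrete sense of the problem statement, here is a small sketch using the networkx library (an off-the-shelf checker, nothing to do with Babai's algorithm): two triangles with different vertex labels are isomorphic, while a triangle and a path are not.

```python
# Graph isomorphism in miniature: same structure under relabeling.
import networkx as nx

G1 = nx.Graph([("a", "b"), ("b", "c"), ("c", "a")])  # triangle, letter labels
G2 = nx.Graph([(1, 2), (2, 3), (3, 1)])              # triangle, number labels
G3 = nx.path_graph(3)                                # three nodes in a line

print(nx.is_isomorphic(G1, G2))  # True: only the labels differ
print(nx.is_isomorphic(G1, G3))  # False: different structure
```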
Math

When Slide Rules Were Like Cellphones (hackaday.com) 220

szczys writes: Slide rules and pocket protectors are the go-to props when making fun of old-time geeks. Forget the pocket protectors: slide rules were the first personal computers, and a status symbol akin to what cellphones are today. Of course the general public wasn't attached to them, but engineers were. Before electronic calculators came along, everyone who needed to do serious math owned a slide rule. Stunningly easy to use and extremely effective, they carry tick-marks placed on a logarithmic scale, which turns complex multiplication, division, powers, and the like into visual calculations instead of mental ones.
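The trick a slide rule mechanizes is the identity log(ab) = log(a) + log(b): adding two lengths on logarithmic scales multiplies the numbers they represent. A quick sketch of the same arithmetic in code:

```python
# What sliding the scales does physically, done numerically: to multiply,
# add distances on a base-10 log scale, then read the result back off.
import math

a, b = 2.0, 3.0
distance = math.log10(a) + math.log10(b)  # slide one scale along the other
print(10 ** distance)                      # ~6.0, the product a * b
```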
Education

Celebrate the 200th Birthday of George Boole With Logic (i-programmer.info) 63

mikejuk writes: November 2nd, 2015 is the bicentenary of George Boole, dubbed the forefather of modern information technology. To mark the event, 55,000 school students globally will be learning about Boolean logic. Free lesson plans, puzzles and worksheets have been made available in English, Irish and Mandarin, and schools in over 30 countries have signed up. According to the George Boole 200 website set up by University College Cork (UCC), the Irish university where he was the first Professor of Mathematics in the mid-19th century, Boole is an unsung hero of the digital age who deserves to be recognized as the forefather of the Information Age. An hour-long documentary, The Genius of George Boole, will be released on November 2 and available to view online until November 16. Although Boole did briefly encounter Charles Babbage during his lifetime, he wasn't responsible for bringing together binary arithmetic and what we now call Boolean logic. That achievement is down to Claude Shannon, who recognized the relevance of Boole's symbolic logic to engineering. As a result of Shannon's work, Boole's thinking became the practical foundation of digital circuit design and the theoretical grounding of the digital age.
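Shannon's insight, in miniature: Boole's two-valued algebra describes switching circuits directly. As an illustrative sketch, here is a one-bit half-adder built from nothing but Boolean operators:

```python
# XOR gives the sum bit, AND gives the carry bit -- the same structure
# as the corresponding digital circuit.
def half_adder(a: bool, b: bool):
    return a != b, a and b  # (sum, carry)

for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(f"{int(a)} + {int(b)} -> sum {int(s)}, carry {int(c)}")
```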
Education

University Reprimands Professor For Assigning Cheaper Textbook (slate.com) 363

schwit1 writes: California State University at Fullerton brought a grievance against associate professor Alain Bourget recently. It wasn't for poor results or questionable conduct — it happened because Bourget refused to assign a $180 textbook for his introductory linear algebra and differential equations course, instead using one that cost $75 and supplementing it with free online materials. "Bourget maintains that his choices are just as effective educationally and much less expensive, so he should have the right to use them. But the university says that it makes sense for courses that have multiple sections to all use the same textbooks. Both Bourget and the university say their positions are based on principles of academic freedom."
Education

Despite $30M Tech Push, Half of US States Had Fewer Than 300 AP CS Test Takers 152

theodp writes: As President Obama was 'taught to code' last December, Politico reported that the $30 million tech-financed campaign to promote computer science education was a smash success. And indeed it has been, at least from a PR standpoint. But Code.org and its backers have long spun AP Computer Science test metrics as a true barometer of CS education success, and from that standpoint, things don't look quite so rosy. The College Board raved about "massive gains in AP Computer Science participation (25% growth) AND scores" in a June tweetstorm and at its July conference, where AP CS was declared the '2015 AP Subject of the Year.' But a look at the recently-released detail on 2015 AP CS scores shows wide differences in adoption and success along gender and ethnicity lines (Asian boys and girls, in particular, set themselves apart from other groups with 70%+ pass rates). And, for all the praise the NSF lavished on Code.org for 'its amazing marketing prowess', half of the states still had fewer than 300 AP CS test takers in 2015, and ten states actually saw year-over-year declines in the number of test takers (if my math is correct — scraped data, VBA code here).
Education

Study: Standardized Tests Overwhelming Public Schools (washingtonpost.com) 278

An anonymous reader writes: A new study examined the amount of time U.S. public schools spend on government-mandated standardized tests, and found that the requirements are detrimental to both students and teachers. On average, students will take 112 standardized tests during their K-12 education. From grades 3-11, students spend over 20 hours per year on standardized tests alone. "It portrays a chock-a-block jumble, where tests have been layered upon tests under mandates from Congress, the U.S. Department of Education and state and local governments, many of which the study argues have questionable value to teachers and students. Testing companies that aggressively market new exams also share the blame, the study said."

The U.S. Department of Education has issued an action plan to school districts outlining ways to reduce useless tests and eliminate redundant ones. President Obama even posted a video pledging to reduce the test load of American students. "Standardized testing has caused intense debate on Capitol Hill as lawmakers work to craft a replacement for No Child Left Behind. Testing critics tried unsuccessfully to erase the federal requirement that schools test in math and reading. Civil rights advocates pushed back, arguing that tests are an important safeguard for struggling students because publicly reported test scores illuminate the achievement gap between historically underserved students and their more affluent peers."

Math

New Algorithm Provides Huge Speedups For Optimization Problems (mit.edu) 129

An anonymous reader writes: MIT graduate students have developed a new "cutting-plane" algorithm, a general-purpose algorithm for solving optimization problems. They've also developed a new way to apply their algorithm to specific problems, yielding orders-of-magnitude efficiency gains. Optimization problems seek the best set of values for a group of disparate parameters. For example, the cost function for designing a new smartphone would reward battery life, speed, and durability while penalizing thickness, cost, and overheating. Finding the optimal arrangement of values is a difficult problem, but the new algorithm shaves a significant number of operations (PDF) off those calculations. Satoru Iwata, professor of mathematical informatics at the University of Tokyo, said, "This is indeed an astonishing paper. For this problem, the running time bounds derived with the aid of discrete geometry and combinatorial techniques are by far better than what I could imagine."
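To give a flavor of the cutting-plane idea, here is a toy one-dimensional sketch (emphatically not the MIT algorithm, which handles far more general convex and submodular problems): each gradient query yields a "cut" that rules out half of the remaining search interval.

```python
# Minimize a convex function on [lo, hi] given its (sub)gradient. Each
# query discards the half-interval that cannot contain the minimizer,
# so the feasible region shrinks geometrically.
def cutting_plane_1d(grad, lo, hi, tol=1e-9):
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        g = grad(mid)
        if g > 0:        # increasing here: minimizer is to the left
            hi = mid
        elif g < 0:      # decreasing here: minimizer is to the right
            lo = mid
        else:
            return mid   # zero gradient: exact minimizer
    return (lo + hi) / 2.0

# Example: f(x) = (x - 3)^2 has gradient 2(x - 3); minimum at x = 3.
print(cutting_plane_1d(lambda x: 2 * (x - 3), 0.0, 10.0))  # ~3.0
```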
Government

Getting Over Getting Over Uber: Tim O'Reilly Does the Math 385

Susan Crawford yesterday published at Medium a critique of Uber and similar ride-coordinating services, in the form of a kind of paean to the American taxicab. Though she didn't start out with negative feelings for Uber, Crawford writes, her sentiment has swung away from objections to taxis (such as that they seek unfair protection from competition) to an extravagant defense, though it comes with a long list of "shoulds": "[Cities] should be focusing on making their taxi services better," she writes. "Taxis should be more accessible to everyone. Taxi fares should be low, predictable, and uniform. Taxi geographies should be wide. Taxis should be clean, fuel-efficient, driven by trustworthy, well-trained drivers, and available for frictionless electronic hailing." Even with the flaws that list implies, Crawford's description of how well taxis work now is more positive than I've found to be true: "Their rates are regulated and set; their pricing is transparent and can be double-checked (just look at the meter, which is itself regularly tested); they look like a uniform fleet; they are subject to very strict licensing and safety requirements. With rare exceptions, they don’t employ surge/congestion pricing schemes."

Tim O'Reilly has written a response, calling Crawford's arguments "puzzling and unconvincing." O'Reilly dissects some of the math behind the business of driving others for money, as it applies to both conventional taxi drivers and "gig economy" drivers, as well as some of the qualitative effects of ride-dispatch services; surely some readers will take issue with his figures and examples, but they provide a plausible case for doubting Crawford's rosy picture of taxis and dark view of modern app-dispatched rides. O'Reilly writes: "Regulation is not a good in itself. It is a means of achieving public goods. And so far, it is pretty clear that Uber and Lyft (and in particular, the competition between them) are improving the transportation options in American cities. Regulators should be using the opportunity to revisit the old way of doing things rather than trying to make the new conform to outdated rules that no longer serve their purpose."
Programming

Happy Ada Lovelace Day (findingada.com) 187

Today is Ada Lovelace Day, a time to celebrate the achievements of women in STEM fields. Several publications have put together lists of notable women to commemorate the day, such as tech pioneers, robotics experts, and historical engineers and scientists. Others are taking the opportunity to keep pushing against the elements of tech culture that remain sexist. From the BBC: On Ada Lovelace Day, four female engineers from around the world share their experiences of working in male-dominated professions. When Isis Anchalee's employer OneLogin asked her to take part in its recruitment campaign, she didn't rush to consult the selfie-loving Kardashian sisters for styling tips. "I was wearing very minimal make-up. I didn't brush my hair that day," she said. But the resulting image of Ms Anchalee created a social media storm when it appeared on BART, the San Francisco metro. Lots of people questioned whether she really was an engineer. "It was not just limited to women — it resonates with every single person who doesn't fit with what the stereotype should look like," she said.

"My parents, my brother, my community, all were against me," said Sovita Dahal of her decision to pursue a career in technology. "I was going against traditional things. In my schooldays I was fascinated by electronic equipment like motors, transformers and LED lights. Later on this enthusiasm became my passion and ultimately my career," she said.

Image

Review: The Martian 242

I was both pleased and disappointed, as always, when I heard that a book I enjoyed was being made into a movie. Andy Weir's The Martian was the best new book I'd read in years. It was written for nerds, by a nerd — by somebody with an obvious love for NASA, science, and spaceflight. How could it possibly be condensed into the format of a Hollywood blockbuster? Well, director Ridley Scott and screenwriter Drew Goddard figured out how. The Martian is an excellent film, well worth watching. Read on for my review (very minor spoilers only), and feel free to share your own in the comments.
Toys

Barbie Gets a Brain 235

minstrelmike writes: Mattel is coming out with a Talking Barbie designed by a huge team and pre-scripted with thousands of responses controlled by an AI, with designs on being your child's best friend. The design team remembers the "Math is hard" debacle of the 1990s: if a girl asks if she's pretty, Barbie will respond, "Yes. And you're smart, too." If she asks whether Barbie believes in God, Barbie says a person's beliefs are personal, and suggests talking to grownups about some problems. The linked New York Times article ("Barbie Wants to Get to Know Your Child") even discusses trying to avoid edited videos on YouTube by scripting out words such as "cockroach."
Math

The Handheld Analog Computer That Made the Atomic Bomb 45

szczys writes: When the physicists and mathematicians of the Manhattan Project began their work, they needed to establish which substance was most likely to sustain vigorous fission. This is not trivial math, and the solution of course is to use an advanced computer. If only they had one available. The best computer of the time was a targeting calculation machine that was out of service while being moved from one installation to another. The unlikely fill-in was a simple yet ingenious analog computer called the FERMIAC. When rolled along a piece of paper it calculated neutron collisions with simple markings — doing its small part to forever change the world without a battery, transistor, or tube.
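The FERMIAC was, in effect, a mechanical Monte Carlo sampler. Here is a toy sketch of the same idea in code, with made-up numbers standing in for real cross-sections:

```python
# Trace neutrons through a 1-D slab: draw each free-path length from an
# exponential distribution, then absorb or scatter at each collision.
# All constants are illustrative, not physical data.
import random

MEAN_FREE_PATH = 1.0    # assumed, arbitrary units
ABSORPTION_PROB = 0.3   # assumed chance a collision absorbs the neutron
SLAB_THICKNESS = 5.0

def neutron_escapes():
    x, direction = 0.0, 1.0
    while True:
        x += direction * random.expovariate(1.0 / MEAN_FREE_PATH)
        if x < 0.0 or x > SLAB_THICKNESS:
            return x > SLAB_THICKNESS       # escaped out the far side
        if random.random() < ABSORPTION_PROB:
            return False                    # absorbed inside the slab
        direction = random.choice([-1.0, 1.0])  # crude isotropic scatter

trials = 100_000
escaped = sum(neutron_escapes() for _ in range(trials))
print(f"transmission = {escaped / trials:.3f}")
```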
Software

MIT Simplifies Design Process For 3D Printing 45

An anonymous reader writes: New software out of MIT and the Interdisciplinary Center Herzliya in Israel takes CAD files and automatically builds visual models that users can alter with simple, visual sliders. It works by computing myriad design variations before a user asks for them. When the CAD file is loaded, the software runs through a host of size variations on various properties of the object, evaluating whether the changes would work in a 3D printer, and doing the necessary math to plan tool routes. When a user moves one of the sliders, it switches the design along these pre-computed values. "The system automatically weeds out all the parameter values that lead to unprintable or unstable designs, so the sliders are restricted to valid designs. Moving one of the sliders — changing the height of the shoe's heel, say, or the width of the mug's base — sweeps through visual depictions of the associated geometries."

There are two big drawbacks: first, it requires a lot of up-front processing power to compute the variations on an object. Second, resolution for changes is fixed if you want quick results — changing the design for a pair of 3D-printed shoes from size 8 to size 9 might be instantaneous, but asking for a shoe that's a quarter of a millimeter longer than a size 8 would take several minutes to process. But for scrolling through the pre-computed design changes, the software can present "in real time what would take hours to calculate with a CAD program," and without the requisite experience with CAD.
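The shape of the approach, in a heavily simplified sketch (the names and the validity check below are hypothetical, not the MIT system's API): enumerate parameter combinations up front, filter out the unprintable ones, and let the slider simply index into the cached survivors.

```python
# Precompute-then-slide: all the expensive checking happens before the
# user touches anything; slider motion is just a lookup.
import itertools

def is_printable(heel_mm, width_mm):
    # Stand-in for the real geometric/stability analysis.
    return heel_mm <= 2 * width_mm

heel_values = [10, 20, 30, 40, 50]   # fixed resolution, as in the article
width_values = [15, 20, 25]

valid_designs = [
    (h, w)
    for h, w in itertools.product(heel_values, width_values)
    if is_printable(h, w)
]

print(valid_designs[0])  # moving a slider selects among these, instantly
```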
Math

Ada Lovelace and Her Legacy 139

nightcats writes: Nature has an extensive piece on the legacy of the "enchantress of abstraction," the extraordinary Victorian-era computer pioneer Ada Lovelace, daughter of the poet Lord Byron. Her monograph on the Babbage machine was described by Babbage himself as a creation of "that Enchantress who has thrown her magical spell around the most abstract of Sciences and has grasped it with a force that few masculine intellects (in our own country at least) could have exerted over it." Ada's remarkable merging of intellect and intuition — her capacity to analyze and capture the conceptual and functional foundations of the Babbage machine — is summarized with a historical context which reveals the precocious modernity of her scientific mind. "By 1841 Lovelace was developing a concept of 'Poetical Science', in which scientific logic would be driven by imagination, 'the Discovering faculty, pre-eminently. It is that which penetrates into the unseen worlds around us, the worlds of science.' She saw mathematics metaphysically, as 'the language of the unseen relations between things;' but added that to apply it, 'we must be able to fully appreciate, to feel, to seize, the unseen, the unconscious.' She also saw that Babbage's mathematics needed more imaginative presentation."
Programming

You Don't Have To Be Good At Math To Learn To Code 616

HughPickens.com writes: Olga Khazan writes in The Atlantic that learning to program involves a lot of Googling, logic, and trial-and-error—but almost nothing beyond fourth-grade arithmetic. Victoria Fine explains how she taught herself to code despite hating math. Her secret? Lots and lots of Googling. "Like any good Google query, a successful answer depended on asking the right question. 'How do I make a website red' was not nearly as successful a question as 'CSS color values HEX red' combined with 'CSS background color.' I spent a lot of time learning to Google like a pro. I carefully learned the vocabulary of HTML so I knew what I was talking about when I asked the Internet for answers." According to Khazan, while it's true that some types of code look a little like equations, you don't really have to solve them, just know where they go and what they do. "In most cases you can see that the hard maths (the physics and geometry) is either done by a computer or has been done by someone else. While the calculations do happen and are essential to the successful running of the program, the programmer does not need to know how they are done." Khazan says that in order to figure out what your program should say, you'll need some basic logic skills, and you'll need to be skilled at copying and pasting things from online repositories and tweaking them slightly. "But humanities majors, fresh off writing reams of term papers, are probably more talented at that than math majors are."
Math

Machine Learning Could Solve Economists' Math Problem 157

An anonymous reader writes: Noah Smith argues that the field of economics frequently uses math in an unhealthy way. He says many economists don't use math as a tool to describe reality, but rather as an abstract foundation for whatever theory they've come up with. A possible solution to this, he says, is machine learning: "In other words, econ is now a rogue branch of applied math. Developed without access to good data, it evolved different scientific values and conventions. But this is changing fast, as information technology and the computer revolution have furnished economists with mountains of data. As a result, empirical analysis is coming to dominate econ. ... [Two economists pushing this change] stated that machine learning techniques emphasized causality less than traditional economic statistical techniques, or what's usually known as econometrics. In other words, machine learning is more about forecasting than about understanding the effects of policy. That would make the techniques less interesting to many economists, who are usually more concerned about giving policy recommendations than in making forecasts."
Math

John Conway: All Play and No Work For a Genius 55

An anonymous reader points out Quanta's spotlight piece on mathematician John Conway, whose best known mathematical contribution is probably his "Game of Life," which has inspired many a screensaver and more than a few computer science careers. From the article: Based at Princeton University, though he found fame at Cambridge (as a student and professor from 1957 to 1987), John Horton Conway, 77, claims never to have worked a day in his life. Instead, he purports to have frittered away reams and reams of time playing. Yet he is Princeton's John von Neumann Professor in Applied and Computational Mathematics (now emeritus). He's a fellow of the Royal Society. And he is roundly praised as a genius. "The word 'genius' gets misused an awful lot," said Persi Diaconis, a mathematician at Stanford University. "John Conway is a genius. And the thing about John is he'll think about anything. He has a real sense of whimsy. You can't put him in a mathematical box."
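For anyone who has only ever seen it as a screensaver, the Game of Life fits in a dozen lines. A minimal sketch over a set of live cells:

```python
# Conway's rules: a cell is alive next generation if it has exactly 3 live
# neighbors, or 2 if it was already alive.
from collections import Counter

def step(live):
    counts = Counter(
        (r + dr, c + dc)
        for r, c in live
        for dr in (-1, 0, 1)
        for dc in (-1, 0, 1)
        if (dr, dc) != (0, 0)
    )
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
for _ in range(4):   # after 4 steps the glider reappears shifted by (1, 1)
    glider = step(glider)
print(sorted(glider))
```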
Open Source

Debian Founder: How I Came To Find Linux 136

An anonymous reader writes: Ian Murdock has pretty solid open source cred: in 1993 he founded Debian, he was the CTO of Progeny and the Linux Foundation, and he helped pave the way for OpenSolaris. He has published a post about how he initially joined the Linux ecosystem. Quoting: "[In 1992], I spent most evenings in the basement of the MATH building basking in the green phosphorescent glow of the Z-29 terminals, exploring every nook and cranny of the UNIX system upstairs. ... I was also accessing UNIX from home via my Intel 80286-based PC and a 2400-baud modem, which saved me the trek across campus to the computer lab on particularly cold days. Being able to get to the Sequent from home was great, but I wanted to replicate the experience of the ENAD building's X terminals, so one day, in January 1993, I set out to find an X server that would run on my PC. As I searched for such a thing on Usenet, I stumbled across something called 'Linux.'" How did you come to find Linux?
Math

How Weather Modeling Gets Better 43

Dr_Ish writes: Bob Henson over at Weather Underground has posted a fascinating discussion of the recent improvements made to the major weather models that are used to forecast hurricanes and the like. The post also includes interesting links that explain more about the models. Quoting: "The latest version of the ECMWF model, introduced in May, has significant changes to model physics and the ways in which observations are brought into and used within the model. The overall improvements include better portrayal of clouds and precipitation, including a more accurate depiction of intense rainfall. The main effect of the model upgrade for tropical cyclones is slightly lower central pressure. During the first 3 days of a forecast, the ECMWF has tended to have a slight weak bias on tropical cyclones; the new version is closer to the mark."
