

'For Algorithms, a Little Memory Outweighs a Lot of Time' (quantamagazine.org)
MIT computer science professor Ryan Williams suspected that a small amount of memory "would be as helpful as a lot of time in all conceivable computations..." writes Quanta Magazine.
"In February, he finally posted his proof online, to widespread acclaim..." Every algorithm takes some time to run, and requires some space to store data while it's running. Until now, the only known algorithms for accomplishing certain tasks required an amount of space roughly proportional to their runtime, and researchers had long assumed there's no way to do better. Williams' proof established a mathematical procedure for transforming any algorithm — no matter what it does — into a form that uses much less space.
What's more, this result — a statement about what you can compute given a certain amount of space — also implies a second result, about what you cannot compute in a certain amount of time. This second result isn't surprising in itself: Researchers expected it to be true, but they had no idea how to prove it. Williams' solution, based on his sweeping first result, feels almost cartoonishly excessive, akin to proving a suspected murderer guilty by establishing an ironclad alibi for everyone else on the planet. It could also offer a new way to attack one of the oldest open problems in computer science.
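In complexity-theoretic notation, the two results read roughly as follows (a paraphrase, not the paper's exact wording; the precise machine model and constructibility assumptions are spelled out in the arXiv preprint linked in the comments):

```latex
% Simulation theorem (roughly stated): anything computable in time
% t(n) is computable in about the square root of that much space.
\[
  \mathsf{TIME}[t(n)] \subseteq
  \mathsf{SPACE}\!\left[O\!\left(\sqrt{t(n)\,\log t(n)}\right)\right]
\]
% Corollary (the negative result about time), roughly: for every
% constant eps > 0, there are problems solvable in space s(n) that
% cannot be solved in time s(n)^{2-eps}.
\[
  \mathsf{SPACE}[s(n)] \not\subseteq
  \mathsf{TIME}\!\left[s(n)^{2-\varepsilon}\right]
\]
```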
"It's a pretty stunning result, and a massive advance," said Paul Beame, a computer scientist at the University of Washington.
Thanks to long-time Slashdot reader mspohr for sharing the article.
"In February, he finally posted his proof online, to widespread acclaim..." Every algorithm takes some time to run, and requires some space to store data while it's running. Until now, the only known algorithms for accomplishing certain tasks required an amount of space roughly proportional to their runtime, and researchers had long assumed there's no way to do better. Williams' proof established a mathematical procedure for transforming any algorithm — no matter what it does — into a form that uses much less space.
What's more, this result — a statement about what you can compute given a certain amount of space — also implies a second result, about what you cannot compute in a certain amount of time. This second result isn't surprising in itself: Researchers expected it to be true, but they had no idea how to prove it. Williams' solution, based on his sweeping first result, feels almost cartoonishly excessive, akin to proving a suspected murderer guilty by establishing an ironclad alibi for everyone else on the planet. It could also offer a new way to attack one of the oldest open problems in computer science.
"It's a pretty stunning result, and a massive advance," said Paul Beame, a computer scientist at the University of Washington.
Thanks to long-time Slashdot reader mspohr for sharing the article.
There is no CS in the article (Score:2)
If you studied computer science and didn't forget all of it, the article is a tease.
Algorithms that use relatively little space can solve all problems that require a somewhat larger amount of time. Then, using just a few lines of math, he flipped that around and proved a negative result about the computational power of time: At least a few problems can’t be solved unless you use more time than space
This is gibberish. I'm pretty sure every phrase like "little space" or "more time" should be replaced by the name of a time or space complexity class.
Though perhaps it's too hard to really explain it in the way that P and NP are usually explained (with a simple example).
Re: There is no CS in the article (Score:2)
I can, can't you?
https://arxiv.org/pdf/2502.177... [arxiv.org]
Re: There is no CS in the article (Score:1)
Re: (Score:2)
If you studied computer science and didn't forget all of it, the article is a tease.
Quanta's focus is to help educate the person who does not have a CS degree. If you took classes with Hartmanis or Hopcroft, you are not the expected reader of this article.
Re: (Score:2)
This article explains the algorithm and is targeted toward the non-CS audience.
It may be difficult for some people to understand but they do boil it down to simple terms.
Those who are interested can read the paper at:
https://arxiv.org/pdf/2502.177... [arxiv.org]
Look here (Score:2)
Is this related to Squeeze Space into Time? (Score:2)
Looks like a theoretical result (Score:2)
No implications for actual computations.
I haven't read the paper (Score:2)
Re: I haven't read the paper (Score:2)
That's just kicking the can down the road. Besides, every function needs memory to store its definition, so you've saved nothing.
Re: (Score:2)
I did not read the paper; these are just some comments based on the video: https://www.youtube.com/watch?... [youtube.com]
If I understand it correctly, using references to recompute everything in a data-dependency manner, to save as much space as possible, only gets the space down to roughly O(t / log t) compared with a space-unconstrained algorithm (i.e. one that uses about as much space as time). That is the old result. They also shift the "temporary" variables typically needed during a computation into the computation i
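For what it's worth, here is a toy Python sketch (my own construction, not the paper's method) of the recompute-instead-of-store idea: keep only about sqrt(n) checkpoints of a long computation and recompute intermediate states on demand, trading extra time for less space.

```python
import math

def make_state_oracle(f, x0, n):
    """Answer "what was the state after step i" queries while storing
    only about sqrt(n) checkpoints, recomputing the rest on demand."""
    seg = max(1, math.isqrt(n))          # checkpoint spacing ~ sqrt(n)
    checkpoints = {}
    x = x0
    for i in range(n + 1):
        if i % seg == 0:
            checkpoints[i] = x           # store only every seg-th state
        if i < n:
            x = f(x)

    def state_at(i):
        base = (i // seg) * seg          # nearest stored checkpoint
        y = checkpoints[base]
        for _ in range(i - base):        # recompute within the segment
            y = f(y)
        return y

    return state_at

# Usage: 1,000,000 steps of a toy map, but only ~1,001 stored states.
step = lambda v: (3 * v + 1) % (2**61 - 1)
state_at = make_state_oracle(step, 12345, 1_000_000)
print(state_at(999_999))
```

Each query costs at most seg extra applications of f, which is the classic time-for-space trade; Williams' result is far stronger, since it applies to arbitrary algorithms rather than this kind of straight-line recomputation.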
Fallacy (Score:1)
"Proving a suspected murderer guilty by establishing an ironclad alibi for everyone else on the planet"
I'll see your Sherlock Holmes fallacy and raise you a Reverse Burden of Proof fallacy.
The opposite is more interesting (Score:2)
There are countless examples where more memory saves a lot of computing time. For instance, precomputing large tables of expensive functions, combined with interpolation, spares you from recomputing those frequently used values.
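As a concrete illustration in Python (a hypothetical example, not anything from the article): trade a one-time memory cost for cheap lookups by tabulating sin and linearly interpolating between entries. The names TABLE and fast_sin and the table size are my own choices.

```python
import math

# One-time memory cost: tabulate sin over [0, 2*pi] at N+1 points.
N = 4096
STEP = 2 * math.pi / N
TABLE = [math.sin(i * STEP) for i in range(N + 1)]

def fast_sin(x):
    """Table lookup plus linear interpolation; at this table size the
    error is on the order of 1e-7 -- good enough for many uses, and
    each call avoids recomputing the function from scratch."""
    x = x % (2 * math.pi)
    idx, frac = divmod(x / STEP, 1.0)
    i = int(idx)
    return TABLE[i] + frac * (TABLE[i + 1] - TABLE[i])

# Usage: compare against the directly computed value.
print(fast_sin(1.0), math.sin(1.0))
```

Whether the lookup actually beats recomputation depends on the function's cost and on cache behavior, but for genuinely expensive functions the table wins easily; it is the mirror image of the trade the article describes.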