Programming | Math

'For Algorithms, a Little Memory Outweighs a Lot of Time' (quantamagazine.org) 37

MIT comp-sci professor Ryan Williams suspected that a small amount of memory "would be as helpful as a lot of time in all conceivable computations..." writes Quanta magazine.

"In February, he finally posted his proof online, to widespread acclaim..." Every algorithm takes some time to run, and requires some space to store data while it's running. Until now, the only known algorithms for accomplishing certain tasks required an amount of space roughly proportional to their runtime, and researchers had long assumed there's no way to do better. Williams' proof established a mathematical procedure for transforming any algorithm — no matter what it does — into a form that uses much less space.

What's more, this result — a statement about what you can compute given a certain amount of space — also implies a second result, about what you cannot compute in a certain amount of time. This second result isn't surprising in itself: Researchers expected it to be true, but they had no idea how to prove it. Williams' solution, based on his sweeping first result, feels almost cartoonishly excessive, akin to proving a suspected murderer guilty by establishing an ironclad alibi for everyone else on the planet. It could also offer a new way to attack one of the oldest open problems in computer science.

"It's a pretty stunning result, and a massive advance," said Paul Beame, a computer scientist at the University of Washington.

Thanks to long-time Slashdot reader mspohr for sharing the article.
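
For readers who want the precise statement (it is in Williams' paper "Simulating Time With Square-Root Space", linked in the comments below), the headline result says, roughly, that anything a multitape Turing machine can compute in time t(n) can be computed using space on the order of the square root of t(n) · log t(n):

    TIME[t(n)] ⊆ SPACE[O(√(t(n) · log t(n)))]    for t(n) ≥ n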

'For Algorithms, a Little Memory Outweighs a Lot of Time'

  • by piojo ( 995934 ) on Saturday June 07, 2025 @01:10PM (#65434233)

    If you studied computer science and didn't forget all of it, the article is a tease.

    > Algorithms that use relatively little space can solve all problems that require a somewhat larger amount of time. Then, using just a few lines of math, he flipped that around and proved a negative result about the computational power of time: At least a few problems can’t be solved unless you use more time than space

    This is gibberish. I'm pretty sure every phrase that says "little space", "more time", etc., should be replaced by the name of a time complexity class or a space complexity class.

    Though perhaps it's too hard to really explain it in the way that P and NP are usually explained (with a simple example).

  • The link to the research is in the second link in the OP.
  • by TheMESMERIC ( 766636 ) on Saturday June 07, 2025 @02:38PM (#65434401)
    Sorry, I just woke up and can't think properly. But I watched this just yesterday, all the way to the end, before bed. https://www.youtube.com/watch?... [youtube.com]
  • No implications for doing actual computations.

    • by Tablizer ( 95088 )

      But it will generate generations of buzzwords for salespeople to dupe clueless managers with. It's a sales efficiency algorithm. It's like greatly improving on a traveling salesperson algorithm, but by using deeper bullshit instead of better routes.

      • by gweihir ( 88907 )

        Indeed. It will convince the same morons who think AI is sentient and we will be going to Mars next year, but that vaccines do not work. There has never been a better time in history to learn about the limitations of the average person. Lack of education or access to information is not a factor anymore. It is all 100% genuine stupidity.

        • by Tablizer ( 95088 )

          > It will convince the same morons that think AI is sentient

          Many human morons are not sentient, they just echo their cult leader.

  • But it seems to me you could turn every variable reference into a function to do everything needed to compute that value instead. Inefficient as fuck, but it would remove all space requirements.
    • That's just kicking the can down the road. Besides, every function needs memory to store its definition, so you've saved nothing.

    • by vyvepe ( 809573 )

      I did not read the paper; these are just some comments based on the video: https://www.youtube.com/watch?... [youtube.com]

      If I understand it correctly: using references to recompute everything in a data-dependency manner, to save as much space as possible, would only get the space down to about O(t / log(t)) compared to a space-unconstrained algorithm (i.e. an algorithm which uses about as much space as time). This is the old result. They also shift the "temporary" variables typically needed during a computation into the computation itself...

  • "Proving a suspected murderer guilty by establishing an ironclad alibi for everyone else on the planet"

    I'll see your Sherlock Holmes fallacy and raise you a Reverse Burden of Proof fallacy.

  • There are countless examples where more memory saves a lot of computing time. For instance, precomputing huge tables of complex functions, aided by interpolation, spares you from recomputing these frequently used functions. (A small sketch of this appears after the comments.)

  • Every NP-complete problem is solvable in P time using exponential space, shut up

  • Can anyone give an example of an algorithm that can work faster with more space?
    • by allo ( 1728082 )

      The proof is for multitape Turing machines. You can't ELI5 that stuff.

      • by piojo ( 995934 )

        > The proof is for multitape Turing machines. You can't ELI5 that stuff.

        You can; you just need an infinite number of five-year-olds.

        • by allo ( 1728082 )

          Good luck finding an infinite number of tape recorders. The five-year-olds today have an infinite number of Spotify subscriptions.

    • There are plenty. Memoization, for instance. Try to solve Codeforces problems with algorithms that run within the time limit... (See the sketch after the comments.)
    • Plenty. Loop unrolling is an example. So are memory maps, LUTs, linked lists, etc.
  • YouTube lecture on this by the discoverer, Ryan Williams [youtube.com]
    Ryan Williams's paper, "Simulating Time With Square-Root Space" [arxiv.org]
    This appears to be based in large part on "Tree Evaluation is in Space O(log n * log log n)" by James Cook and Ian Mertz (2023 colloquium [weizmann.ac.il], STOC 2024 conference [acm.org]).

    I'm just a programmer who has spent an hour or so looking at this, so please take the rest of this post with a grain of salt.

    I get the impression that Professor Williams's result so far, though already a tool for making progress on which computational complexity classes are the same and which are different, has the limitation of relying on how slow Turing machines are at accessing memory. This is based on the mention, at 18min:50sec into that YouTube video, of how the space savings degrade for a Turing machine with tapes of more than one dimension.

    If I understand correctly, for such Turing machines, given an algorithm whose running time is bounded above by T(n) for any input of length n, the space used by this potentially much slower space-saving simulation is bounded by O( (T(n) * log T(n)) ** (1 - 1/(D+1)) ), where D is the dimension of the tapes. I'm using "**" as exponentiation, so the exponent means square root (that is, exponent 0.5) for a one-dimensional (linear) tape, the two-thirds power (exponent 0.66...) for a tape that is a two-dimensional surface, the 0.75th power for a three-dimensional tape, and, so far, no known savings for a tree-shaped tape, although I suppose that the three-dimensional limit does ultimately apply to real-world data storage systems.
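
Since one commenter asked for a concrete example of an algorithm that runs faster with more space, here is a minimal Python sketch of the two everyday versions mentioned in the thread: memoization, which spends memory on a cache instead of recomputing subproblems the way the naive recursion does, and a precomputed sine table queried with linear interpolation instead of calling math.sin each time. The names (fib_naive, fib_memo, sin_lut) and the table size are illustrative choices, not anything from the article or the papers.

    import math
    import time
    from functools import lru_cache

    # --- Trade 1: memoization (spend memory on a cache to avoid recomputation) ---

    def fib_naive(n: int) -> int:
        """Tiny space, exponential time: every subproblem is recomputed."""
        if n < 2:
            return n
        return fib_naive(n - 1) + fib_naive(n - 2)

    @lru_cache(maxsize=None)
    def fib_memo(n: int) -> int:
        """O(n) extra space for the cache, but only O(n) additions."""
        if n < 2:
            return n
        return fib_memo(n - 1) + fib_memo(n - 2)

    # --- Trade 2: precomputed lookup table queried with linear interpolation ---

    TABLE_SIZE = 4096
    STEP = (2 * math.pi) / TABLE_SIZE
    # One-time memory cost: TABLE_SIZE + 1 floats.
    SIN_TABLE = [math.sin(i * STEP) for i in range(TABLE_SIZE + 1)]

    def sin_lut(x: float) -> float:
        """Approximate sin(x) from the table, interpolating between neighbours."""
        pos = (x % (2 * math.pi)) / STEP
        i = min(int(pos), TABLE_SIZE - 1)
        frac = pos - i
        return SIN_TABLE[i] + frac * (SIN_TABLE[i + 1] - SIN_TABLE[i])

    if __name__ == "__main__":
        for fn in (fib_naive, fib_memo):
            start = time.perf_counter()
            fn(32)
            print(fn.__name__, "took", time.perf_counter() - start, "seconds")
        print("sin_lut(1.0) =", sin_lut(1.0), "vs math.sin(1.0) =", math.sin(1.0))

None of this says anything about Williams' theorem itself, which concerns worst-case simulations of Turing machines; it just illustrates the ordinary direction of the space-for-time trade-off that the comments above are contrasting with it.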

"Help Mr. Wizard!" -- Tennessee Tuxedo

Working...