The Secret of the Simplex Algorithm Discovered
prostoalex writes "While the Simplex algorithm is considered one of the most widely used algorithms in complex networks, the reason for its efficiency has so far not been clear. Daniel Spielman and Shang-Hua Teng discovered the secret of why the Simplex algorithm works so well by introducing imprecision into the worst-case analysis. Their article will be published in the Journal of the ACM, although MIT Technology Review at the aforementioned link quotes Spielman expressing his doubts that anyone will be able to make it through the 80-page document filled with equations and formal explanations of the method."
Oh really? (Score:2, Interesting)
Pardon my ignorance... (Score:3, Insightful)
The idea of an algorithm that is used on all kinds of major networks, yet no one knows why it works so well, sounds rather intriguing. Can anyone offer any background?
Thanks in advance.
Re:Pardon my ignorance... (Score:5, Informative)
I say 'an optimal solution' because there can be more than one.
Viewed geometrically, the solution set for a linear programming problem is a convex region of space. It isn't necessarily bounded, but it is always convex, never concave.
The simplex algorithm is an algorithm whereby some of the points/vertices of that solution space are visited in a search, until an optimal vertex (i.e. solution) is discovered.
It isn't completely fool-proof: you can get into states where the algorithm bounces among a set of vertices forever (cycling), but for most well-stated linearly constrained problems (in any number of dimensions, really), the simplex method is a good way to find an optimal solution.
Sorry for the rather rambling explanation.
If you're truly interested, do a web search for "simplex method".
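To make the "visit vertices until one is optimal" picture concrete, here is a minimal brute-force sketch in Python. It is not the simplex pivoting itself: it just enumerates every vertex of a small two-dimensional feasible region and keeps the best one, which is the geometric fact the simplex method exploits by walking from vertex to adjacent improving vertex instead of checking them all. The constraint and objective numbers are made up for illustration.

```python
from itertools import combinations
import numpy as np

# Feasible region: A @ x <= b with x >= 0 (a made-up 2-D example).
# Objective: maximize 3*x + 2*y.
A = np.array([[1.0, 1.0],   # x +  y <= 4
              [1.0, 3.0]])  # x + 3y <= 6
b = np.array([4.0, 6.0])
c = np.array([3.0, 2.0])

# Treat x >= 0 and y >= 0 as extra constraints -x <= 0, -y <= 0.
A_all = np.vstack([A, -np.eye(2)])
b_all = np.concatenate([b, np.zeros(2)])

best = None
# Every vertex of the region is the intersection of two constraint boundaries.
for i, j in combinations(range(len(b_all)), 2):
    M = A_all[[i, j]]
    if abs(np.linalg.det(M)) < 1e-9:
        continue                                  # parallel boundaries, no vertex
    v = np.linalg.solve(M, b_all[[i, j]])
    if np.all(A_all @ v <= b_all + 1e-9):         # keep only feasible vertices
        val = c @ v
        if best is None or val > best[0]:
            best = (val, v)

print("optimal value", best[0], "at vertex", best[1])
```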
Re:Pardon my ignorance... (Score:5, Interesting)
He struggled for a while, and finally turned his results in to the professor. However, the problems had been two famous unsolved problems in the field, and Dantzig had effectively solved them.
While perhaps embellished (and not the only example of students solving 'unsolved' problems), it has become a parable for showing the power of unprejudiced thought. Occasionally it is misstated that he invented the simplex method to solve these problems, when in fact he did that a couple of years afterwards.
Re:Pardon my ignorance... (Score:1)
Re:Pardon my ignorance... (Score:5, Interesting)
Huffman didn't do the research; he just spent weeks thinking about it, and eventually, just when he had decided to give up and study for the exam, the answer came to him and he realized the solution was really quite trivial (the famous "greedy algorithm" for constructing optimal minimum-redundancy codes).
On the day of presentation, Huffman strode to the board, described the problem to the class, and totally unaware that it had not previously been solved (despite much effort), wrote the simple algorithm for constructing the codes on the chalkboard.
The professor was, understandably, stunned.
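For reference, the greedy construction the story refers to is short enough to sketch. This is an illustrative Python version (the frequency table is made up), repeatedly merging the two least-frequent subtrees:

```python
import heapq

def huffman_codes(freqs):
    """Greedy construction of a prefix code from symbol frequencies.
    freqs: dict mapping symbol -> weight."""
    # Each heap entry: (total weight, tie-breaker, {symbol: code-so-far}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)   # the two least-frequent subtrees...
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, counter, merged))  # ...are merged
        counter += 1
    return heap[0][2]

print(huffman_codes({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5}))
```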
Re:Pardon my ignorance... (Score:4, Informative)
Re:Pardon my ignorance... (Score:5, Informative)
The simplex method solves a class of problems called "Linear Programming" or simply "LP". Many different kinds of network or graph theory problems can be phrased as an LP.
An LP consists of an objective function and a number of linear constraints. For any given LP, there are three possibilities:
1. no feasible solution exists;
2. the LP is unbounded;
3. an optimal solution exists.
The goal is to determine an assignment to the variables in the linear system so that the objective value is maximized (or minimized) while satisfying every linear constraint.
There are other algorithms for solving LP, such as the cutting-plane algorithm. But the simplex algorithm exhibits many useful properties over other algorithms. For example, if the solution of a linear system has already been computed, and if the system changes slightly, one can compute a new solution quickly from the old solution, instead of recomputing from scratch. This is obviously quite useful for continuously updating routing tables on a network.
Some examples of LP are:
- single source shortest path (think routing)
- maximum st-flow
- minimum cost flow
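As an illustrative sketch of the three possibilities listed above (assuming SciPy is available; the numbers are made up), scipy.optimize.linprog solves a small LP and reports through res.success / res.status / res.message whether an optimum was found, or whether the system was infeasible or unbounded:

```python
from scipy.optimize import linprog  # assumes SciPy is installed

# linprog minimizes c @ x subject to A_ub @ x <= b_ub and the given bounds,
# so a maximization problem is passed in with the objective negated.
c = [-1.0, -2.0]                  # maximize x + 2y  ->  minimize -x - 2y
A_ub = [[1.0, 1.0], [0.0, 1.0]]   # x + y <= 10,  y <= 6
b_ub = [10.0, 6.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])

if res.success:                   # case 3: an optimal solution exists
    print("optimum", -res.fun, "at", res.x)
else:
    # res.status / res.message distinguish the other two cases:
    # an infeasible system or an unbounded objective.
    print(res.message)
```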
Yet another explanation. (Score:2, Informative)
The simplex algorithm is a way of solving Linear Programming problems. Linear Programming problems require you to find an optimal solution subject to a series of constraints.
An example might be:
You are a baker, and you have 20 pounds of flour and a dozen eggs. You can make either loaves of bread (requiring four pounds of flour each) or c
Re:Yet another explanation. (Score:1)
That makes more sense. (Score:1)
So sad... (Score:5, Insightful)
Re:So sad... (Score:4, Funny)
I do. That's why I posted this comment... to bring the count up to 5.
Re:So sad... (Score:4, Informative)
Re:So sad... (Score:1)
That and the subject matter is so specific and technical that most nerds (much less ordinary people) can't follow it. Really it's good reading for a nerd's nerd. I have to admit that I can't understand it.
I'm not WORTHY!
I'm not WORTHY!
I'm not WORTHY!
Re:So sad... (only to a point) (Score:3, Funny)
Re:So sad... (only to a point) (Score:2)
Re:So sad... (Score:2, Funny)
Linear Programming (Score:2)
Another factor may be that, until you get a ways into advanced problems, most of the applications of linear programming and the simplex method are pretty much cookbook these days - but most courses in linear programming are a semester long - so it's hard to tell a CS major that taking a course in LP is more important than taking a course
Re:So sad... (Score:1)
Maybe this says something about the "Me hate Microsoft *grunt*" mentality, but you can't expect 300+ posts on something of this complexity and relative obscurity. If you think it's hard to talk to the general public about OSS and computing topics, try explaining
Re:So sad... (Score:1)
This is going to sound awful so I'm just going to say it and not try to finesse it.
You don't need to know anything to bash MS. Partially it's because there's not a lot there, mostly it's because they are in your face every day of every week, and the rest is because everyone has an opinion.
On the other hand, this stuff isn't opinion driven and once the algorithm is embedded in a Cisco router no one needs to know anything about it.
Maybe slashdot needs a section like Fifty years ago today
Simplex - important for many things (Score:5, Interesting)
It's also used by the airlines to figure out how to schedule planes. There are uses in physics, social sciences, etc. I don't know any offhand, but I wouldn't be surprised if at one time it was used to assist in scheduling computing resources. It's also widely used in complex pricing models.
It's also a way to decide, given 10 women and 10 men (or any number), which ones are most likely to get along with each other based on their preferences and characteristics.
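The pairing the parent describes is an assignment problem. As an illustrative sketch (not the simplex method itself), SciPy's linear_sum_assignment solves it directly with a Hungarian-style algorithm; this assumes SciPy is installed and uses a made-up 10x10 compatibility matrix:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment  # Hungarian-style solver

rng = np.random.default_rng(0)
# Hypothetical matrix: score[i, j] = how well person i gets along with person j.
score = rng.random((10, 10))

# With maximize=True the total score is maximized; cols[i] is the partner
# assigned to person i.
rows, cols = linear_sum_assignment(score, maximize=True)
print(list(zip(rows, cols)), "total score:", score[rows, cols].sum())
```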
Re:Simplex - important for many things (Score:3, Informative)
Re:Simplex - important for many things (Score:1)
I think math is more like law. Rather than trying to learn it all and remember it for 30 years after the fact, one should know the basic concepts, but be able to contact a consultant for the times they actually need something specif
This paper is already available in preprint? (Score:5, Informative)
"Smoothed Analysis of Algorithms: Why the Simplex Algorithm Usually Takes Polynomial Time" by Daniel A. Spielman and Shang-Hua Teng
http://arxiv.org/abs/cs.DS/0111050 [arxiv.org]
Quote (from the intro):
We propose an analysis that we call smoothed analysis which can help explain the success of many algorithms that both worst-case and average-case analysis cannot. In smoothed analysis, we measure the performance of an algorithm under slight random perturbations of arbitrary inputs. In particular, we consider Gaussian perturbations of inputs to algorithms that take real inputs, and we measure the running times of algorithms in terms of their input size and the variance of the Gaussian perturbations.
We show that the simplex method has polynomial smoothed complexity. The simplex method is the classic example of an algorithm that is known to perform well in practice but which takes exponential time in the worst case[KM72, Mur80, GS79, Gol83, AC78, Jer73, AZ99]. In the late 1970's and early 1980's the simplex method was shown to converge in expected polynomial time on various distributions of random inputs by researchers including Borgwardt, Smale, Haimovich, Adler, Karp, Shamir, Megiddo, and Todd[Bor80, Bor77, Sma83, Hai83, AKS87, AM85, Tod86]. However, the last 20 years of research in probability, combinatorics and numerical analysis have taught us that the random instances considered in these analyses may have special properties that one might not find in practice.
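A rough way to see what "measuring performance under Gaussian perturbations" means in practice is the toy sketch below. It is not the paper's actual experiment (the analysis is about pivot counts, not wall-clock time), and the SciPy solver and the example LP are assumptions of convenience:

```python
import time
import numpy as np
from scipy.optimize import linprog  # assumes SciPy; solver choice is illustrative

def smoothed_runtime(c, A_ub, b_ub, sigma, trials=20, seed=0):
    """Average time to solve the LP after perturbing A_ub with Gaussian noise
    of standard deviation sigma -- the measurement protocol the quoted passage
    describes, used here with wall-clock time as a crude stand-in."""
    rng = np.random.default_rng(seed)
    A = np.asarray(A_ub, dtype=float)
    total = 0.0
    for _ in range(trials):
        A_pert = A + sigma * rng.standard_normal(A.shape)
        start = time.perf_counter()
        linprog(c, A_ub=A_pert, b_ub=b_ub)
        total += time.perf_counter() - start
    return total / trials

# A tiny made-up LP, timed at two noise levels.
print(smoothed_runtime([-2, -3], [[1, 2], [2, 1]], [8, 8], sigma=0.01))
print(smoothed_runtime([-2, -3], [[1, 2], [2, 1]], [8, 8], sigma=0.1))
```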
Re:This paper is already available in preprint? (Score:5, Interesting)
Another way of thinking about smoothed complexity is to observe that if an algorithm has low smoothed complexity, then one must be unlucky to choose an input instance on which it performs poorly.
If my first scan of the paper yielded anything worth reporting, it's the idea that even though a worst case may take exponential time to solve, there is always a neighborhood around the worst case that can be solved in polynomial time and has (almost) the same solution. In other words, you can always find a good enough solution quickly.
BTW, those of you who described the idea as largely philosophical were wrong! The paper is mathematically rigorous. It uses the 80 pages to develop the theorems necessary to prove the idea.
Re:This paper is already available in preprint? (Score:3, Interesting)
Re: Perturbing quicksort inputs (Score:1)
This could work, I believe, but only if Quicksort exhibits the same sort of, let's say, "poly time affinity" (for lack of a better description) that the Simplex method does. Not all algorithms have this characteristic.
It is worth looking into, though. If this is (statistically) true, one could try to solve in parallel 3 or 4 "perturbed" quicksort
Re: Perturbing quicksort inputs (Score:2)
Re: Perturbing quicksort inputs (Score:1)
In fact to get O(n^2) it doesn't matter which way the data is sorted. The algorithm works by choosing an element as a "pivot", then dividing the remaining data into two sets, the set of data lower than the "pivot", and the set greater than the "pivot" (and then sorting those two sets recursively). The algorithm is efficient when the two sets are of roughly equal size (as will often be the case when the data is random), but O(n^2) when one is significantly larger than the other.
Since most implementations p
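For reference, here is a plain quicksort sketch (illustrative code, not from the paper). Choosing the pivot at random plays much the same role as perturbing the input: it makes the already-sorted worst case described above unlikely.

```python
import random

def quicksort(a):
    """Plain quicksort on a list. A random pivot acts like an input
    perturbation: the sorted-input worst case becomes very unlikely."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)
    lower  = [x for x in a if x < pivot]    # everything below the pivot
    equal  = [x for x in a if x == pivot]
    higher = [x for x in a if x > pivot]    # everything above the pivot
    return quicksort(lower) + equal + quicksort(higher)

print(quicksort([5, 3, 8, 1, 9, 2, 7]))
```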
Re: Perturbing quicksort inputs (Score:1)
Re: Perturbing quicksort inputs (Score:1)
Re:This paper is already available in preprint? (Score:2, Informative)
So what's new about the result? (Score:2)
Smale, Blum et al have a really good book called "Complexity and Real Computation" which analyzes the Simplex Method and other algorithms at some length. It tries to build up a theoretical foundation for complexity analysis of numerical algorithms (where the objects being computed are real numbers), just like traditional complexi
Simplex and Operational Research (Score:5, Informative)
Simplex, for those who aren't familiar with it, is a method of solving linear inequalities by representing the inequalities as a set of vectors which describe the outer bounds of the valid problem space. All space within those bounds is a "valid" solution to the inequalities.
Simplex assumes that some of the inequalities are in tension, i.e., that improving one variable will worsen one or more others.
The method works by starting off in some corner, and then progressing round the outer bounds until an optimal solution is achieved.
Operational Research is the science of applying the Simplex method to real-world problems. Early uses of OR (and, indeed, where the name originates) were in World War II, where the problem was to commit the fewest possible resources to achieve the greatest possible result with the fewest possible Allied casualties.
(Too few resources, and the enemy would likely be able to inflict more damage. Too many resources would reduce that damage, but would also reduce the number of operations you could perform.)
Modern uses of OR include production of prefabricated components from some material M, such that you get the most components out, and maximise the amount of M that is usable for some other production work, rather than having it as waste, while (at the same time) keeping additional production and processing costs below the savings made from more efficient use of the material.
In this case, the number of components (N) is one inequality. You need to equal or exceed the ordered number.
M is also an inequality - you want to order strictly less than you were ordering before, using the old process, or you've gained nothing.
M' (the usable remainder) is an inequality, equal or greater than 0 and less than M - W.
W (the waste) is the fourth inequality, which is greater than 0 and less than M - M'.
If the cost per unit M is C, and the amount of M needed before applying the Simplex method is I, then your savings are (I - M) * C.
This gives us the final inequality, where P (the increase in cost, due to increase in complexity) must be strictly less than (I - M) * C.
Without OR, these inequalities are horribly complicated, and "good" solutions are very hard to find. So most companies who aren't familiar with OR just don't bother. Such companies are easy to spot - the only way they can cut costs is to cut workforce.
Those companies with a good OR team can often make significant savings by improving the methods used. Such companies don't downsize when the going gets tough. Often, they'll simply revamp their methods and discover they can get more output for less cost, for the same labor force. These companies do brilliantly during recessions, as they can literally rob competitors of the remaining market, by out-producing and under-cutting anything that companies with poorer designs can do.
You can see from that that VERY few companies use OR in their day-to-day practices. The number of layoffs, blamed on "restructuring" but really the result of restructuring the wrong thing, has been horrendous.
OR isn't the perfect solution to all problems, and is only really designed to solve linear inequalities, but it's the best method out there. And it's about time it was understood.
Re:Simplex and Operational Research (Score:1, Offtopic)
Pessimal Algorithms and Simplexity Analysis [nec.com]
Only tangentially related, but good.
Re:Simplex and Operational Research (Score:1, Informative)
Re:Simplex and Operational Research (Score:1)
If the mathematician screams in the forest of managing suits, will anyone hear?
Re:Well and good, but about my NZ Jetstream questi (Score:1, Offtopic)
Re:Well and good, but about my NZ Jetstream questi (Score:1)
Re:Well and good, but about my NZ Jetstream questi (Score:1)
do you have some inside knowledge?
Simplex (Score:2, Funny)
Re:Simplex (Score:3, Interesting)
Re:Simplex (Score:1)
It appears complex while it is actually quite simple.
Re:Integer Linear Programming is harder than LP (Score:1)
Yes, I know I'm using floating point for ILP, but, then, there is a method to get back to ILP.
Oh, something else has me pu
Re:Integer Linear Programming is harder than LP (Score:1)
It's the same reason that 0-1 Knapsack is NP-complete yet fractional Knapsack is P and very easy.
Melissa ^-^
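To illustrate why the fractional version is so much easier, here is a standard greedy sketch (the item values and weights are made up). The same greedy idea is provably optimal for fractional knapsack but not for 0-1 knapsack:

```python
def fractional_knapsack(items, capacity):
    """items: list of (value, weight). Greedy by value density is optimal
    for the fractional problem (unlike 0-1 knapsack, which is NP-hard)."""
    total = 0.0
    for value, weight in sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True):
        take = min(weight, capacity)       # take as much of the densest item as fits
        total += value * take / weight
        capacity -= take
        if capacity == 0:
            break
    return total

print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], capacity=50))  # 240.0
```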
Re:Integer Linear Programming is harder than LP (Score:1)
not news (Score:1, Offtopic)
simplexly fascinating. (Score:1)
Very nice result - may have game applications (Score:5, Informative)
It's been known for a long time that the simplex method is polynomial most of the time, and exponential in the worst case. It's also known that the exponential cases are metastable - a small perturbation in the inputs and they collapse to a polynomial case. So adding a little noise to the algorithm can kick it out of the bad cases.
But that's been an ad-hoc result. There hasn't been theory on how much noise to add, and when, and how. With sound theory underneath, algorithms that use noise injection to get out of the pathological states may become much more reliable.
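As an ad-hoc sketch of that noise-injection idea (not the Spielman-Teng analysis itself; the SciPy solver, the noise level sigma, and the example LP are placeholder assumptions), one can perturb the constraint data slightly before solving so that a metastable worst case is unlikely to be hit:

```python
import numpy as np
from scipy.optimize import linprog  # solver and noise level are illustrative choices

def solve_with_jitter(c, A_ub, b_ub, sigma=1e-7, seed=None):
    """Perturb the constraint matrix slightly before solving. How large
    sigma should be is exactly the question the new theory addresses."""
    rng = np.random.default_rng(seed)
    A = np.asarray(A_ub, dtype=float)
    return linprog(c, A_ub=A + sigma * rng.standard_normal(A.shape), b_ub=b_ub)

# Made-up example: maximize 3x + 5y with x <= 4, 2y <= 12, 3x + 2y <= 18.
res = solve_with_jitter([-3, -5], [[1, 0], [0, 2], [3, 2]], [4, 12, 18])
print(res.x, -res.fun)
```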
Polyhedral collision detection, as used in games, works a lot like the simplex algorithm, and there are some pathological cases. The current best solution, "Enhanced GJK", is adequate but still has trouble in some of them. There are ways out of those difficulties, but they're inelegant. This new work might lead to a cleaner algorithm in that area.
There are other algorithms with similar properties, where the worst case is far slower than the average case. The simplex algorithm is for linear optimization. Many of the same difficulties appear in nonlinear optimization, where everything is more complicated and the methods all have severe restrictions. This may have applications in physics engines for video games, but it's too early to say.
Karmarkar's algorithm (Score:1)
Re:Karmarkar's algorithm (Score:2, Informative)
Popularity? (Score:1, Insightful)