
Evolution of Mona Lisa Via Genetic Programming

mhelander writes "In his weblog Roger Alsing describes how he used genetic programming to arrive at a remarkably good approximation of Mona Lisa using only 50 semi-transparent polygons. His blog entry includes a set of pictures that let you see how 'Poly Lisa' evolved over roughly a million generations. Both beautiful to look at and a striking way to get a feel for the power of evolutionary algorithms."
  • by Anonymous Coward on Tuesday December 09, 2008 @02:22AM (#26043871)

    One individual trying to improve itself isn't evolution; it's simulated annealing. Just because you call your parameters "DNA" doesn't turn it into genetic programming.

    Genetic programming requires a population and a crossover operation.

  • by usul294 ( 1163169 ) on Tuesday December 09, 2008 @02:41AM (#26043961)
    It doesn't necessarily require crossover. Asexual reproduction that selects based on merit and uses mutation will still give an evolving population, though generally not as fast as with crossover.
  • Re:Triangles (Score:5, Informative)

    by prockcore ( 543967 ) on Tuesday December 09, 2008 @02:43AM (#26043977)

    It does: for every generation it makes 20 mutations, so you're seeing each of those 20 mutations run. It takes a while just for one generation to complete.

  • by greg_barton ( 5551 ) <greg_barton@yaho ... m minus math_god> on Tuesday December 09, 2008 @02:55AM (#26044031) Homepage Journal

    ...you'll love Picbreeder: picbreeder.org [picbreeder.org]

  • by OrangeTide ( 124937 ) on Tuesday December 09, 2008 @03:19AM (#26044129) Homepage Journal

    I think it's cheating to use convex polygons. Why not use complex (self-intersecting) polygons? Then you could probably do it with 5 polygons.

  • by khallow ( 566160 ) on Tuesday December 09, 2008 @03:24AM (#26044141)

    One individual trying to improve itself isn't evolution; it's simulated annealing. Just because you call your parameters "DNA" doesn't turn it into genetic programming.

    But doesn't simulated annealing involve quenching? That is, there's a "temperature" of the system. At high temperature, states are more likely to make big random jumps to far-from-optimal states; at low temperature, the jumps are shorter and more-optimal states are strongly preferred. Here's what I dimly recall: as the temperature declines, the more fit states become more strongly preferred. If you cool too rapidly (there is a mathematical sense in which that statement is precise), you risk getting stuck in a suboptimal extreme state. But cooling at a rate proportional to time to the -0.5 power settles to an optimal state as long as the optimal state is isolated, I think. In comparison, the algorithm of the story appears to run at a constant temperature with respect to time.

    Another aspect of simulated annealing is that it doesn't simply take the best fit at each generation. By randomly mutating at each step and lowering the temperature slowly as above (making more-optimal states increasingly preferred), the problem naturally settles to an optimal state. In comparison, the algorithm mentioned in the story takes the best fit at each generation.

    So the algorithm in the story doesn't appear to me to fit very well into the framework of simulated annealing.
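To make that contrast concrete, here is a minimal simulated-annealing sketch on a made-up one-dimensional objective (nothing from the article's code; the function, cooling constant, and step size are all invented for illustration). Worse states are sometimes accepted, with a probability that shrinks as the temperature is lowered on roughly the t^-0.5 schedule mentioned above:

```python
import math
import random

def objective(x):
    # Hypothetical multimodal toy function; its global maximum sits near x ~ 1.6.
    return -(x - 2) ** 2 + math.sin(5 * x)

def anneal(steps=20000, t0=2.0, seed=0):
    rng = random.Random(seed)
    x = rng.uniform(-5, 5)
    best_x, best_f = x, objective(x)
    for k in range(1, steps + 1):
        t = t0 / math.sqrt(k)            # cooling schedule ~ k**-0.5
        cand = x + rng.gauss(0, 0.5)     # random jump to a nearby state
        d = objective(cand) - objective(x)
        # accept every uphill move, and downhill moves with probability exp(d/t)
        if d > 0 or rng.random() < math.exp(d / t):
            x = cand
        if objective(x) > best_f:
            best_x, best_f = x, objective(x)
    return best_x, best_f
```

The key difference from the blog's mutate-and-keep-the-best loop is the `exp(d / t)` acceptance line: early on, while the temperature is high, the chain routinely accepts worse states, which is what lets it climb out of local maxima.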

  • by haggais ( 624063 ) on Tuesday December 09, 2008 @03:46AM (#26044217)

    Sorry, but this is hill climbing [wikipedia.org], pure and simple. The (very cool) result was achieved by introducing random changes ("mutations", if you like) into a "state" or "temporary solution" (the set of polygons), and keeping the new state only if it increases a target function (the similarity to a target image).

    The name "genetic algorithm" [wikipedia.org] is actually used for a more complex setup, more reminiscent of our own genetics: the algorithm maintains a pool of states or temporary solutions, selects two (or more) of them with probability proportional to their target-function score, and then randomly recombines them, possibly with "mutations", to generate a new state for the pool. A low-scoring state is typically removed, to keep the pool at a constant size.

    Quite possibly, a genetic algorithm would do an even better job here, as it could quickly find, for example, two states which each approximates a different half of the image.
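For comparison, here is a toy genetic algorithm in exactly that sense, applied to a hypothetical bitstring-matching problem rather than image fitting (the target, pool size, and rates are all invented for illustration):

```python
import random

TARGET = [1] * 32                       # toy goal: an all-ones bitstring

def fitness(ind):
    return sum(a == b for a, b in zip(ind, TARGET))

def evolve(pool_size=40, generations=1500, seed=1):
    rng = random.Random(seed)
    pool = [[rng.randint(0, 1) for _ in TARGET] for _ in range(pool_size)]
    for _ in range(generations):
        weights = [fitness(ind) for ind in pool]
        # select two parents with probability proportional to fitness
        mom, dad = rng.choices(pool, weights=weights, k=2)
        cut = rng.randrange(1, len(TARGET))          # one-point crossover
        child = mom[:cut] + dad[cut:]
        if rng.random() < 0.2:                       # occasional mutation
            spot = rng.randrange(len(TARGET))
            child[spot] ^= 1
        # replace the lowest-scoring member to keep the pool size constant
        worst = min(range(pool_size), key=lambda i: weights[i])
        pool[worst] = child
    return max(pool, key=fitness)
```

Because only the lowest-scoring member is ever replaced, the best individual in the pool can never get worse, while crossover lets partial solutions discovered in different lineages combine, which is exactly the "two halves of the image" point above.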

  • by lysergic.acid ( 845423 ) on Tuesday December 09, 2008 @03:49AM (#26044229) Homepage

    that depends on what you mean by "population." if by population you simply mean variation, then yes, that's required. but if by population you mean a set of concurrent genetic lineages and breeding individuals, then no, that isn't required. that may be required in biological evolution, but that's an incidental result.

    at its core, evolution is just directed randomness. biological evolution requires large populations only because sexual selection necessitates it. but non-biological evolution (like the evolution of ideas, designs, language, etc.) does not entail sexual selection.

    one good example of this, interestingly enough, is the airfoil design process an aeronautics engineer described in a documentary i saw a while back. he basically started with a rough wing shape and measured its flight characteristics (lift, drag, etc.), then created a handful of random variations, each with a slightly different shape/camber and angle of attack. he'd then test each of the variations and pick the best one to repeat the process with. there's no crossover operation or horizontal gene transfer, but it still demonstrates the basic principles of evolution, such as evolutionary selection and cumulative/incremental change.
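That mutate-test-select loop can be sketched in a few lines (optimisation people call it a "(1+λ) evolution strategy"). The wing model below is a made-up one-dimensional stand-in, not real aerodynamics; the function and every number in it are invented for illustration:

```python
import random

def lift_to_drag(angle):
    # Hypothetical smooth performance curve peaking at 12 degrees.
    return -((angle - 12.0) ** 2)

def evolve_wing(rounds=200, variants=5, seed=7):
    rng = random.Random(seed)
    design = 0.0                          # rough starting design
    for _ in range(rounds):
        # "a handful of random variations", keeping the parent in the running
        # so a round can never make the design worse
        candidates = [design] + [design + rng.gauss(0, 1.0)
                                 for _ in range(variants)]
        design = max(candidates, key=lift_to_drag)
    return design
```

No population of concurrent lineages, no crossover: just one design, a handful of mutants, and selection of the best, repeated.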

  • Re:Source code (Score:5, Informative)

    by elvstone ( 86513 ) on Tuesday December 09, 2008 @04:29AM (#26044385) Homepage Journal

    He says in the comments [rogeralsing.com] that he's supposed to release the source today.

    The source is apparently C# using .NET 3.5, so it might take a bit of work to get running under e.g. Mono, should you be on a non-Windows platform.

  • by Anonymous Coward on Tuesday December 09, 2008 @05:01AM (#26044539)

    Each generation is a population of 20. A given generation is a combination of a weighted breeding of the previous generation based on success plus some random mutation.

    I ran it 3 times for a substantial amount of time. It always started from a population where most or all of the cars failed. It always evolved to one where most (all?) succeeded, in that they ran for the full ~10s without toppling and crashing. It was extremely effective, though it required a bit of patience.

    One singular exceptional population member doesn't ensure that all of the next generation will be exceptional; we're not talking about cloning. You would expect an aberration like that to be possible, but not necessarily common.

    If I'm interpreting the plot correctly, the x axis is iteration and the y is the success metric. I'm assuming the lower line is the population average while the higher is the best of that generation. You can see both of them grow. On a population where you have one exceptional success, you see the one line spike high above the other.

  • by QuantumG ( 50515 ) * <qg@biodome.org> on Tuesday December 09, 2008 @07:37AM (#26045157) Homepage Journal

    One way to boost complexity is to evolve programs, or neural networks if you're that way inclined. One way to speed up the evolutionary process is to use probabilistic modeling to produce offspring; it's much more efficient than just random mutation. See http://www.opencog.org/wiki/MOSES [opencog.org]. Eventually, though, you will reach limits to blind search. At that point you need to complement it with logic. See http://www.opencog.org/wiki/PLN [opencog.org]. And to focus your search you really need some kind of attention allocation. See http://www.opencog.org/wiki/Attention_allocation [opencog.org]. An integrative approach means you can solve real world applications with modest hardware.

  • Re:Triangles (Score:5, Informative)

    by Half-pint HAL ( 718102 ) on Tuesday December 09, 2008 @09:59AM (#26045847)

    the point of genetic evolution is for there to be progressive enhancements. it's not just randomly throwing the dice over and over again. you have to retain the positive enhancements of past iterations for it to "evolve."

    Not entirely true. Let's get back to basics, and hill-climbing algorithms.

    You have a robot and a hill, and you program the robot to always take the steepest uphill slope possible. That's progressive enhancement -- it's always getting "better" (higher).

    Except for one type of thing: local maxima.

    You see, most hills have more than one summit.

    So if the robot ends up on a lower, secondary summit, it will refuse to go down, as it must get better/higher with each step.

    But logically, to get from a lower peak to a higher one, you have to descend a short distance and then start ascending the true summit.

    Any search strategy has to account for local maxima and other dead ends, and in GAs and other evolutionary algorithms this means introducing the possibility that children are less optimal than their parents.

    HAL.
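HAL's robot can be demonstrated in a few lines (a toy two-peak landscape, invented for illustration, not anything from the article): greedy ascent started on the wrong slope stops on the lower summit because it never accepts a downhill step.

```python
def f(x):
    # two summits: a low peak at x = -2 (height 1) and a high one at x = 2 (height 3)
    return max(1 - (x + 2) ** 2, 3 - (x - 2) ** 2)

def hill_climb(x, step=0.1, iters=1000):
    for _ in range(iters):
        best = max((x - step, x, x + step), key=f)
        if best == x:        # no uphill neighbour left: we are on a summit
            break
        x = best
    return x
```

Started at x = -3, the climber stops on the lower summit at x = -2; started at x = 0.5, on the other slope of the saddle, it reaches the true summit at x = 2.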
