Programming IT Technology

Mutating Animations 218

Weird_one writes "Discover magazine's current issue has an intriguing article on using genetic algorithms to evolve an animation of a walking individual."
  • by anttik ( 689060 ) on Saturday July 19, 2003 @03:07AM (#6476944) Journal
    When CPUs become fast enough to use this level of learning on the fly, it will be great for gamers. Maybe the enemies don't have to learn walking, but learning strategy is a different thing. If they learn strategy by themselves rather than from pre-programmed AI, I bet they will be more creative and tougher opponents.

    But I think I'll still have to wait like 20 years for that.
  • by Great_Jehovah ( 3984 ) on Saturday July 19, 2003 @03:08AM (#6476947)
    Check this out: http://q12.org/phd-movies.html [q12.org]

    Not quite as slick but a lot more amusing.

  • Re:sceptical (Score:5, Interesting)

    by dustman ( 34626 ) <dlearyNO@SPAMttlc.net> on Saturday July 19, 2003 @03:38AM (#6477022)
    How would this be any easier than doing keyframed animation with inverse kinematics?

    I read something about this idea a few years ago. I'm pretty sure it was by the guy that did BMRT... a thesis paper of his or something.

    Basically, they redid the animation of "Luxo" from Pixar's animated short. "Luxo" is the bouncing desk lamp. Making animated characters (even those that aren't human) move "nicely" is quite hard. It takes a lot of work.

    For their project, they specified one constraint: that Luxo must move from point A to point B, and that's all. The only input to the model was how much force to apply at each joint at each particular time. So they were animating its "muscles" with a genetic algorithm, while also running a physics simulation on the system (they assigned mass to the individual components, etc.).

    It evolved several techniques of locomotion: the "standard" bouncing hop (which the "real" Luxo does), dragging itself across the table, somersaulting, etc.

    In short, they came up with good looking animation, without requiring much user input. And in the end, they had a genetic algorithm which could make Luxo walk any distance, without requiring the work of an animator.

    This is important, because although it's relatively easy to just loop an animation, it looks rather unnatural.
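The Luxo setup described above can be sketched in miniature. The physics below is a deliberately crude 1-D stand-in (drag plus applied impulses), not the paper's simulation; the GA part - mutate force sequences and keep the ones that travel farthest - is the point:

```python
import random

STEPS, POP, GENS = 20, 30, 40

def simulate(forces):
    # Crude 1-D physics: each impulse adds velocity, drag bleeds it off.
    x, v = 0.0, 0.0
    for f in forces:
        v = 0.8 * v + f         # drag plus the "muscle" impulse
        x += v
    return x                    # fitness: distance covered

def mutate(forces):
    child = list(forces)
    i = random.randrange(len(child))
    child[i] = max(-1.0, min(1.0, child[i] + random.gauss(0, 0.2)))  # clamp to actuator limits
    return child

random.seed(0)
pop = [[random.uniform(-1, 1) for _ in range(STEPS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=simulate, reverse=True)                       # best movers first
    pop = pop[:POP // 2] + [mutate(p) for p in pop[:POP // 2]]  # keep half, breed half

best = max(pop, key=simulate)
print(simulate(best) > 0)       # True: the evolved force pattern moves forward
```

Nothing here tells the hopper *how* to move, only that distance is good - which is exactly why the real experiment could discover hopping, dragging, and somersaulting on its own.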
  • by FleaPlus ( 6935 ) on Saturday July 19, 2003 @03:40AM (#6477028) Journal
    The other day I stumbled upon what could quite possibly be the coolest Java applet ever [biobloc.net]. Once you start the applet, you assemble "bioblocs," which are 3D creatures assembled from connected blocks. Once you've assembled your creature, you can have it use genetic algorithms to try to learn how to most effectively walk, run, jump, and turn around using the blocks you've given it. I assembled a snake-like creature the other day, and was intrigued to see that it evolved a walking movement very similar to a sidewinder's.

    In addition to assembling your own creatures, you can also load creatures that others have previously assembled, as well as enter your creatures into contests. A lot of the previously assembled creatures are -very- impressive, with movements quite similar to those evolved in nature.
  • Non-GA Approach (Score:1, Interesting)

    by Narphorium ( 667794 ) on Saturday July 19, 2003 @03:49AM (#6477046)
    Ken Perlin created a similar technology which procedurally animates characters walking using his Perlin Noise functions.
    While this isn't technically a GA approach, it does provide similar results in real time.

    Check out his cool applet [nyu.edu].

  • Levels of thinking (Score:3, Interesting)

    by autopr0n ( 534291 ) on Saturday July 19, 2003 @03:55AM (#6477062) Homepage Journal
    You know what I always found interesting: it's like these systems of thinking create newer systems of thinking. Evolution created neural networks, and eventually created neural networks that can think and create things much faster than evolution can (the human mind).

    Then the human mind goes and creates digital computers, which again can do things the neural network can't (and vice versa).

    Anyway, just thought it was interesting.
  • by autopr0n ( 534291 ) on Saturday July 19, 2003 @04:00AM (#6477073) Homepage Journal
    An NP-complete problem is one for which no known algorithm finds the optimal answer in polynomial time, but a candidate answer can be verified in polynomial time. A genetic algorithm is similar: you need a fast fitness function, or an operator who does the selection for you. If neither of those is practical, then you probably shouldn't use a GA.

    On the other hand, there are a lot of things that you can use GAs for.
  • by Anonymous Coward on Saturday July 19, 2003 @04:01AM (#6477074)
    "When CPU's will become fast enough to use this level of learning on the fly"

    This does not have to be done in real time to work. Most likely you would just need to evolve the AI once for an environment and record the results of the last generation to a file to be used later.
  • by krahd ( 106540 ) on Saturday July 19, 2003 @04:04AM (#6477085) Homepage Journal
    When I read the article, it immediately made me remember the Boids [red3d.com], a computer model of coordinated animal motion such as bird flocks and fish schools, first introduced by Craig Reynolds [red3d.com]. Those were an example of emergent behaviour, where a bunch of independently moving things start moving in a coordinated way thanks to some "local guidelines" controlling their behaviour.

    So, if each boid is told to steer to avoid crowding local flockmates, steer towards the average heading of local flockmates, and steer to move toward the average position of local flockmates, they start to move as a flock (!).

    This approach of obtaining more developed behaviour automatically (more accurately, in an "emergent" way) is closely related to genetic algorithms (GAs) evolving an animation model, because it is exactly the opposite approach: in the first, the designer must specify the underlying mechanisms that permit the animation, while in the latter, the designer must specify the desired result.

    What I find most appealing about the GA approach is that the system can outperform its initial specifications, as is noted by a lot of papers [channon.net] in ALife. Some have developed an artificial world (named "Geb" in the paper linked above) where individuals can develop (eventually) coordinated behaviour to survive (the fitness function is an implicit "survival function").

    What would be cool is to use a GA in a pre-determined way to evolve, explicitly, the basic behaviours that build up coordinated mass behaviours (like the boids' flocking).

    --krahd

    mod me up, scottie!
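The three steering rules quoted above translate almost directly into code. This is a minimal sketch assuming 2-D boids represented as plain (x, y) tuples; a real implementation would also weight the rules and restrict them to nearby flockmates:

```python
# Each rule returns a steering vector for one boid; positions and
# velocities are plain (x, y) tuples.

def separation(boid, neighbors):
    # Steer away from flockmates that are crowding in.
    return (sum(boid[0] - n[0] for n in neighbors),
            sum(boid[1] - n[1] for n in neighbors))

def alignment(velocities):
    # Steer toward the average heading of local flockmates.
    k = len(velocities)
    return (sum(v[0] for v in velocities) / k,
            sum(v[1] for v in velocities) / k)

def cohesion(boid, neighbors):
    # Steer toward the average position of local flockmates.
    k = len(neighbors)
    avg = (sum(n[0] for n in neighbors) / k,
           sum(n[1] for n in neighbors) / k)
    return (avg[0] - boid[0], avg[1] - boid[1])

# A boid at the origin with two flockmates to its right steers right:
print(cohesion((0.0, 0.0), [(2.0, 0.0), (4.0, 0.0)]))  # (3.0, 0.0)
```

Summing weighted outputs of these three rules into each boid's velocity every frame is all it takes for flocking to emerge.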

  • by fo0bar ( 261207 ) * on Saturday July 19, 2003 @04:25AM (#6477118)
    Download the E3 demo of Half-Life 2. In it there is a section where people are fighting alongside Gordon. This didn't look too spectacular until the presenter announced that these characters were NOT moving according to a script. The characters were given an objective (help Gordon get to point X), but were not given a path to take or any knowledge about the obstacles in the way. At that point my eyes opened wide, watching these people duck behind debris, covering fellow fighters, shoot-move-shoot-move... the movement and logic that they possessed looked either preprogrammed (which again, they say is not the case), or very human-like.
  • by jefu ( 53450 ) on Saturday July 19, 2003 @06:14AM (#6477282) Homepage Journal
    I finally managed to put my genetic grammar program grammidity [sourceforge.net] up on SourceForge. It should work pretty much as packaged on Linux, though it does require POV-Ray for the 3D examples and TiMidity for the sound example.

    It doesn't evolve moving critters, but one of the included examples evolves 3d plants that compete for sunlight (kind of, sort of).

    Another of the included examples lets the user evolve MIDI files - with seriously odd results. Things may get odder, though: I'm currently trying to figure out how to decompile MIDI files into source grammars, then do crossover and mutation on them. Imagine Mozart crossed with Metallica... Beethoven with the Brittany, or... I'd better not say; I'm likely to get attacked.

  • by brianosaurus ( 48471 ) on Saturday July 19, 2003 @06:55AM (#6477371) Homepage
    Nah. Cuz then when your character gets killed, you'll try something different and eventually beat it. But if it can learn what you're doing, it will adapt and respond smarter on your subsequent attempts.

    Years ago I wrote a Tcl-scripted netrek hockey bot. I could reload its script on the fly, so its behavior could change. Early on, people learned that it just shot towards the goal every time. Then all of a sudden, during a game, I loaded in new code that passed to a teammate instead of always shooting on goal. The other team was floored, as the perfectly executed pass led to a goal.

    Sure, it didn't use some advanced AI (rather, it was trivial and totally artificial), but it was enough to fool some of the people.

    Preload the AI with enough info to make it interesting when you start the game. But if it can learn enough (not necessarily in real time) to come back and bite you in the ass on level 12, it's probably doing pretty well.
  • by Trinition ( 114758 ) on Saturday July 19, 2003 @07:06AM (#6477394) Homepage
    When I read this article, my mind immediately conjured up sodaconstructor [sodaplay.com]. With this applet, you create a bunch of "tendons" and/or "muscles" (line segments) connected together. You then adjust the function that controls the cycle of tension on each segment. The result is that you can make "creatures" that walk.

    The thing that triggered this memory was the talk of "700 independent parameters". I pictured each muscle in this virtual walking body to be much like the line segments in sodaconstructor. The difference is that instead of a human thinking about how to adjust each one, random mutation adjusted them and evolution selected them.
  • It's for beer too! (Score:4, Interesting)

    by Tsu Dho Nimh ( 663417 ) <abacaxiNO@SPAMhotmail.com> on Saturday July 19, 2003 @07:30AM (#6477433)
    And that method designed one seriously cool refrigerator [enginsoft.it].

    The practical side of me says it would never fit into the usual slot they are stuffed into by kitchen designers, but I like it anyway.

  • by SystematicPsycho ( 456042 ) on Saturday July 19, 2003 @08:47AM (#6477548)
    Genetic algorithms minimize the amount of searching needed in a search space by "evolving" candidate solutions, breeding new ones only from the best of the old ones. A candidate solution is evaluated against a fitness (or objective) function that assigns a fitness to each candidate solution.

    Don't get too excited about the walking man - the difficulty, and also the art, in genetic algorithms is finding a way to represent a candidate solution, and the walking man isn't difficult to represent. To make it simple, pretend the representation is a task like ordering the numbers 1..10, where 1 == man crawling, 10 == man walking, and each number in between stands for a posture along the way.

    Now think of a pool of such candidates, each number associated with a posture; the genetic algorithm searches through the pool until it finds the numbers 1..10 in order, all standing next to each other.
    ---
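The toy representation described above - a shuffled pool of numbered postures that the GA must put in order - can be sketched directly. Fitness here simply counts postures already in their target slot, and mutation swaps two positions (an assumed, permutation-safe choice):

```python
import random

TARGET = list(range(1, 11))     # 1 == crawling ... 10 == walking

def fitness(perm):
    # Count postures already sitting in their target slot.
    return sum(p == t for p, t in zip(perm, TARGET))

def mutate(perm):
    # Swap two random positions: keeps the candidate a valid permutation.
    child = list(perm)
    i, j = random.sample(range(len(child)), 2)
    child[i], child[j] = child[j], child[i]
    return child

random.seed(1)
pop = [random.sample(TARGET, len(TARGET)) for _ in range(50)]
for _ in range(2000):           # generation cap, for safety
    if max(fitness(p) for p in pop) == len(TARGET):
        break                   # one candidate is fully ordered: done
    pop.sort(key=fitness, reverse=True)
    pop = pop[:25] + [mutate(p) for p in pop[:25]]  # keep best half, breed the rest

print(max(pop, key=fitness))    # [1, 2, 3, ..., 10] once the search converges
```

The walking man is the same idea with a less obvious genome: hundreds of real-valued joint parameters instead of ten posture labels.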
  • by Animats ( 122034 ) on Saturday July 19, 2003 @12:59PM (#6478850) Homepage
    Karl Sims did this about fifteen years ago, although he had to use a Connection Machine back then. It's appeared in a few games, screen savers, and such. The main insight here is that walking is solvable as a hill-climbing problem. Any of the usual hill-climbing algorithms (neural nets, genetic algorithms, simulated annealing, etc.) can do the job.

    Running, though, is harder. Steady-state running is reasonably well understood, but running on rough terrain remains hard. To control running well, you need predictive techniques - look-ahead, two-point boundary value solvers, and model-based control. This is where the insect brain leaves off and the lizard brain starts, and may be the beginnings of low-level AI.
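A minimal hill-climbing sketch of the kind described above: perturb one parameter at a time and keep the change only when the score improves. The quadratic score and the random "target gait" are stand-ins for a real physics-based evaluation:

```python
import random

def score(params, target):
    # Higher is better: negative squared distance to the target gait.
    return -sum((p - t) ** 2 for p, t in zip(params, target))

random.seed(2)
target = [random.uniform(-1, 1) for _ in range(8)]  # stand-in for a good gait
params = [0.0] * 8                                  # initial guess

for _ in range(2000):
    i = random.randrange(8)                 # perturb one parameter...
    trial = list(params)
    trial[i] += random.gauss(0, 0.1)
    if score(trial, target) > score(params, target):
        params = trial                      # ...keep it only if it helps

print(score(params, target) > score([0.0] * 8, target))  # True: we climbed
```

Greedy acceptance like this is exactly what makes rough-terrain running hard: once prediction and look-ahead are required, local improvement steps alone are no longer enough.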

  • by extropy ( 669666 ) on Saturday July 19, 2003 @04:44PM (#6480373)
    I think what they were saying is that the characters' actions are not directly scripted... i.e., "say this line, enemies come up hill, wait 3 seconds, start shooting at enemies"; instead, the actions are more dynamic and controlled by AI. What they are NOT claiming (as far as I know) is that the AI learns and evolves as the walking figures in the parent story do.
