
Biomorphic Software

CowboyRobot writes "From the molecular structure of spiders' silk to the efficient use of energy by insects and fish, we can learn many things from Nature and apply them to our engineering tasks. One thing that nature is particularly good at is the development of dynamic, self-organizing systems. Ken Lodding is a software engineer at NASA and is currently developing 'swarm algorithms for groups of wind-driven, remote exploratory vehicles'. He has a six-page article at Queue on 'biologically inspired computing' and how to develop 'algorithmic design concepts distilled from biological systems, or processes.'"
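For readers curious what a "swarm algorithm" looks like in miniature: the usual trick is that each vehicle follows a couple of simple rules (stay near the group, don't crowd your neighbors) and group-level coordination emerges. A toy sketch, with made-up weights and no relation to NASA's actual code:

```python
import math

def swarm_step(positions, velocities, dt=1.0,
               cohesion=0.01, separation=0.1, min_dist=1.0, damping=0.9):
    """One update of a toy 2-D swarm. Each agent steers toward the group
    centroid (cohesion) and away from any neighbor closer than min_dist
    (separation); damping keeps the motion from oscillating forever."""
    n = len(positions)
    cx = sum(p[0] for p in positions) / n
    cy = sum(p[1] for p in positions) / n
    new_pos, new_vel = [], []
    for (x, y), (vx, vy) in zip(positions, velocities):
        ax, ay = (cx - x) * cohesion, (cy - y) * cohesion
        for (ox, oy) in positions:
            d = math.hypot(x - ox, y - oy)
            if 0 < d < min_dist:          # too close: push apart
                ax += (x - ox) / d * separation
                ay += (y - oy) / d * separation
        vx, vy = (vx + ax * dt) * damping, (vy + ay * dt) * damping
        new_vel.append((vx, vy))
        new_pos.append((x + vx * dt, y + vy * dt))
    return new_pos, new_vel
```

Run a few hundred steps from scattered starting points and the agents settle into a loose cluster: no rule mentions the group's shape. (A real flocking implementation would react only to nearby neighbors rather than the global centroid; the centroid is used here just to keep the sketch short.)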
  • Predator or Prey? (Score:3, Interesting)

    by garcia ( 6573 ) * on Thursday July 15, 2004 @10:48AM (#9707926)
    Sounds an awful lot like Michael Crichton's novel Prey [crichton-official.com]. The story's description (from the above link): "A cloud of nanoparticles -- micro-robots -- has escaped from the laboratory. This cloud is self-sustaining and self-reproducing. It is intelligent and learns from experience. For all practical purposes, it is alive. It has been programmed as a predator. It is evolving swiftly, becoming more deadly with each passing hour. Every attempt to destroy it has failed. And we are the prey."

    I hoped that this was more fiction than reality. Perhaps Prey is going to become a movie and they are writing this up to get people interested?

    Doesn't the thought of an intelligent swarm of nearly indestructible particles scare people? I know I am paranoid and all but I can't fathom the damage that could occur if these got out and were self-sustaining even for a short time.
    • Re:Predator or Prey? (Score:3, Interesting)

      by shackma2 ( 685062 )
      I think it's ridiculous to say that anything like Prey is going to happen in the near future. If you really want to worry yourself to death, there are much better problems in the world than 'intelligent swarms of nearly indestructible particles'.
      • Re:Predator or Prey? (Score:2, Informative)

        by grimover ( 212034 ) *

        Ridiculous *and* physically impossible, as was mentioned on slashdot here:

        Slashdot Article on Prey [slashdot.org]

        Nanobots the size of red blood cells (around 2-5 microns in length) would have a top speed of 2 mm/sec in air by Dyson's calculation, or 7.2 meters/hour, hardly fodder for a high-speed chase!

        By my quick calculation, dust mites (about 200 microns in length), as I've mentioned in another post on this article, could travel up to 20 cm/sec or 720 meters/hour, slow but still scary, especially if there are trillions of them swarming.
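Both back-of-the-envelope conversions above hold up; spelled out (speeds taken from the comment, with "M/hour" read as meters per hour):

```python
def meters_per_hour(speed_m_per_s):
    """Convert a speed from meters/second to meters/hour."""
    return speed_m_per_s * 3600.0

nanobot = meters_per_hour(0.002)   # 2 mm/sec, Dyson's nanobot estimate
dust_mite = meters_per_hour(0.20)  # 20 cm/sec, the dust-mite estimate
```

which gives 7.2 m/hour and 720 m/hour respectively, matching the figures in the comment.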

    • Re:Predator or Prey? (Score:3, Informative)

      by JanusFury ( 452699 )
      Prey is actually an interesting novel. The writing isn't as good as some of his previous novels, but from a technical perspective, I found it somewhat intriguing. It's barely plausible, like most sci-fi, but the elements that are plausible make you think.

      If I remember correctly, the basic concept was that instead of trying to design algorithms for nanomachines, the programmers responsible for developing them just used a form of natural selection to 'evolve' an optimal algorithm. Of course, the problem was
      • by SatanicPuppy ( 611928 ) <Satanicpuppy@nosPAm.gmail.com> on Thursday July 15, 2004 @11:08AM (#9708134) Journal
        I have to disagree about the plausibility.

        The secret weapon they use to kill the rogue swarms of psycho nano cameras is a gunk impurity that got into the STERILE nano-construction area. Like that would never occur naturally in non-sterile (i.e., everywhere) areas of the world.

        The other thing which got to me was the amount of processing power these nano clouds were assumed to have. A sophisticated predator-prey model that would be CAPABLE of evolving into what those evolved into would need tremendous processing power.

        So, let's see, what would they have to have? They'd need high bandwidth that couldn't be jammed (they'd be pretty worthless if you could just turn on a jammer and have them fall apart). They'd need non-volatile memory, because they're solar powered, and if they didn't have it, they'd be stupid again every morning. They'd need a sophisticated distributed processing algorithm with massive failure tolerance and freakishly complex load balancing (this is more possible than most of it). And beyond all this, they'd need to be microscopic flying cameras that could kill people.

        In biological terms, most species have a "specialization", which means that most species have ONE thing that they do really well. Birds aren't too smart because flying is hard to do. Same with cheetahs, because running that fast requires really specific evolution.

        Those little nano-bots would have to do the thing they're designed to do... and everything else as well. Christ, he's got them mimicking human behavior by the end! That is such an incredible stretch! I love sci-fi, but that book had me sneering almost from the very beginning.
        • I agree with you about the implausibility of the deus ex machina gunk impurity that killed the machines. Still, I was willing to suspend disbelief on that one.

          I disagree about the amount of processing power/memory that these things would need, however. I find it very easy to believe that a simple set of behavioral patterns, when applied to a group of organisms acting together, can generate very complex behavior. After all, a human is just a collection of very simple cells, albeit with a complex rule set for behavior
          • by hvt ( 756120 )
            I have not read the book, but I think the grandparent has a good point. If you view the collective as one object, the complexity of that object greatly depends on the capability of its components to communicate, differentiate in task, retain memory, etc., all of which requires very tight binding between the components. A human isn't a collection of very simple cells; we have very differentiated cells. While all cells have the same DNA master plan, they communicate with each other via a multitude of complex
        • In biological terms, most species have a "specialization", which means that most species have ONE thing that they do really well. Birds aren't too smart because flying is hard to do. Same with cheetahs, because running that fast requires really specific evolution.

          Those little nano-bots would have to do the thing they're designed to do... and everything else as well. Christ, he's got them mimicking human behavior by the end! That is such an incredible stretch! I love sci-fi, but that book had
        • Huh? (Score:3, Insightful)

          Birds aren't too smart because flying is hard to do.

          This doesn't make any sense no matter how many times I read it.

          First off, birds are the most intelligent animals after mammals. Flying for a bird is no more difficult than running for a human. Despite their small brains, birds learn to fly way faster than humans learn to walk. Insects also fly and they are definitely dumber than birds. I can make a paper airplane fly and it has no brain power at all. A basic autopilot on a light aircraft has about as mu
          • Re:Huh? (Score:3, Interesting)

            Well, there is a subtle difference between flying and running, in that, if you stop running, nothing happens, but if you stop flying, you plummet to the ground and go splat.

            I should have been more specific. For a human being, our specialization is intelligence and tool use. We make tools, and we use them to compensate for what we don't have by nature. The rest of our natural skillset is pretty low-end; we don't run as fast as most animals, we can't lift as much, we aren't as coordinated.

            The reason for thi
            • If you are in mid-sprint and simply stop moving your legs, you will go splat on the ground just like a bird that simply stops flying.

              Humans' most prominent feature may be unusual intelligence, but I think you are greatly discounting the role our body shape plays in our accomplishments. Imagine if we had our brains trapped in the body of a snake or fish; how much of what we have today would be possible? How do we know for sure that some other animals aren't also very intelligent but don't have the means to
            • Birds aren't dumb...for animals. But compared to us? One of the reasons for that is because a good bit of their brain is taken up by the instinctive knowledge of flight.

              If this were true, nature would compensate by allowing birds' brains to grow large enough to, say, do their taxes.

              But birds don't need to do taxes, and a larger brain is heavier and requires more bird to lift, more bird requires more brain to control, etc.
    • I hoped that this was more fiction than reality. Perhaps Prey is going to become a movie and they are writing this up to get people interested?

      I thought it was more along the lines of a video game that was planned for release shortly after Duke Nukem Forever. I know I'm really waiting for those cool shadow effects they've been promising since '95!

      (Does anyone know what happened to the prey engine?)
    • This reminds me of the kind of paranoia that resulted from the grey goo [slashdot.org] article that was run on /. a while ago
    • Doesn't the thought of an intelligent swarm of nearly indestructible particles scare people?

      nope. what do you think human beings are? seen from a larger scale, we are the nanobots.

      of course, that assumes we had a maker ...
    • Didn't he already write that book [amazon.com]? Do these predator microbes evolve and eat through rubber seals? Do they evolve into something uninterested in humans and float (back) off into space?

      Has writing novels been reduced to something like:
      sed 's/old fear/new fear/g' old_novel.text > new_novel.text
      ?
      -Peter
  • Great... (Score:3, Funny)

    by xenostar ( 746407 ) on Thursday July 15, 2004 @10:49AM (#9707934)
    All we need is wild packs of stray 'exploratory vehicles' rummaging through the garbage at night.
  • by march ( 215947 ) * on Thursday July 15, 2004 @10:52AM (#9707962) Homepage
    10 SWIM AROUND TANK
    20 PRINT "LOOK A ROCK!"
    30 GOTO 10

  • ...related, is the practice of having a program interrogate its environment. Some of the most successful programs are highly portable pieces of code that check to see what OS services are available, what APIs are available, what dependency software is available, etc., and then construct the final object tree based on the results.

    While this is very difficult to do in C/C++, it's a very successful way of writing Java code. For example, a gaming timer I wrote first checks the JVM version. If it's on 1.5 it uses the new NANOTimer. If that fails, it checks the OS. If it's on Windows, it then checks for the presence of a native timer DLL. (Timing on Windows sucks.) If it fails to find and/or load the DLL, it then falls back to a clever algorithm for making the most of default Windows timing. If it's on some other OS, it uses the default timer (all OSes except Windows can provide millisecond resolution without complaint).
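The probe-then-fall-back chain described above is language-agnostic; here is a minimal sketch of the pattern in Python (the probe order and clock choices are illustrative stand-ins, not the poster's actual Java timer):

```python
import time

def pick_clock():
    """Probe progressively less capable clock sources and keep the first
    one that actually works -- the 'interrogate your environment' pattern."""
    candidates = [
        ("high_res", lambda: time.perf_counter),   # best: high-resolution
        ("monotonic", lambda: time.monotonic),     # next: monotonic clock
        ("wall", lambda: time.time),               # last resort: wall clock
    ]
    for name, probe in candidates:
        try:
            clock = probe()
            clock()                  # smoke-test the facility once
            return name, clock
        except Exception:
            continue                 # missing or broken: fall through
    raise RuntimeError("no usable clock available")
```

On any modern runtime the first probe succeeds; the point of the pattern is that the failure path is part of the design rather than an afterthought.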
    • Well, it's not much more difficult in C/C++ than in Java, or even bash (well, unless you want to use things like Class.getMethods(), etc...)
    • all OSes except windows can provide millisecond resolution without complaint

      Go look at QueryPerformanceCounter(). It'll give you a *very* high-res 64-bit timer (3579545 counts per second on my puter).
      • Go look at QueryPerformanceCounter(). It'll give you a *very* high-res 64-bit timer (3579545 counts per second on my puter).

        That's what the DLL does. Sadly, Microsoft doesn't guarantee any sort of accuracy with that clock. Dual-proc systems completely change the timing, too. My solution was to abstract out the timing into "ticks per second", then make the developer calculate how long he wants between events, e.g.: frametime = timer.getTicksPerSecond()/60; // 60 FPS
        • The best approach that I've found is to have a single thread for all timer related activities and set its affinity so that it always runs on the same processor. It simply waits on a semaphore and updates a global timestamp variable every time you signal it. It can also signal other semaphores after a specific delay (getting there with enough resolution might involve a bit of busy waiting, but typically for less than 2ms).
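The single-timer-thread idea can be sketched like this (a Python stand-in for the poster's design; the CPU-affinity part is platform-specific, e.g. os.sched_setaffinity on Linux, and is omitted here):

```python
import threading
import time

class TimerThread:
    """One thread owns all timing: it sleeps on a semaphore and refreshes
    a shared timestamp each time it is signaled, so every reading comes
    from the same clock sampled on the same thread."""
    def __init__(self):
        self.timestamp = 0.0
        self._wake = threading.Semaphore(0)
        self._fresh = threading.Semaphore(0)
        self._stopping = False
        threading.Thread(target=self._run, daemon=True).start()

    def _run(self):
        while True:
            self._wake.acquire()          # wait for a signal
            if self._stopping:
                return
            self.timestamp = time.monotonic()
            self._fresh.release()         # timestamp is now up to date

    def now(self):
        """Signal the timer thread and wait for a fresh timestamp."""
        self._wake.release()
        self._fresh.acquire()
        return self.timestamp

    def stop(self):
        self._stopping = True
        self._wake.release()
```

The second semaphore makes each now() call block until the stamp is fresh, which is the rendezvous the parent comment describes; timed wake-ups after a specific delay would be layered on top of this.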
    • Nothing stops C/C++ from doing it using DLLs... or even compiling its own code.

      But it will be much easier if a persistent object oriented system [slashdot.org] would be around. Self organizing code would be very easy to write with such a system.

      Why do you say Windows can't provide millisecond resolution? All Windows timer-related functions are based on milliseconds. Furthermore, Windows is the only O/S I know that provides sub-millisecond timers. See High Resolution Multimedia Timers in MSDN.

      • Nothing stops C/C++ doing it using DLLs...or even compiling its own code.

        It's worth noting that I never said you couldn't. I said it was HARD. If you look farther up the thread, I reiterated this point.

        But it will be much easier if a persistent object oriented system would be around. Self organizing code would be very easy to write with such a system.

        I actually built an entire self-organizing system out of the Java SPI concept. On program startup, each module would decide if it should load itself in t
        • It's worth noting that I never said you couldn't. I said it was HARD. If you look farther up the thread, I reiterated this point.

          Why is it hard? Open Visual Studio, make one or more DLL projects, then use these DLLs from the main project according to what you want to do. Making a DLL is nothing more than pressing a few buttons anyway.

          10ms res on 2000/XP

          Where did you read that? Windows NT provides 1 millisecond resolution. 10 milliseconds is the default timer interrupt granularity. By using the fun

  • by Timesprout ( 579035 ) on Thursday July 15, 2004 @10:52AM (#9707968)
    After all, it's just an attempt to reproduce human thought and decision-making processes in machines.
    • On the contrary, computing decision making and human decision making are polar opposites.

      Artificial Life computing is an attempt to bring these closer, whereby a computer's thought process says, "Based on past experience, I think that solving this problem in that manner would suffice." Well, that's a pompous computer's thought process at least.

      However, current computers think, "I was told that if x occurs then do y, so I'll go do y."
    • Not exactly - you've got your *definitions* confused there. How is Apache biologically inspired? Ever seen a fish serve web pages? The biologically inspired computing discussed here is more inspired by crowd/swarm *behaviors* of animals than by human thought processes. It's a model and some algorithms to go along with it.
    • by Apocalypse111 ( 597674 ) on Thursday July 15, 2004 @11:16AM (#9708214) Journal
      If you are talking about creating an Artificial Intelligence to pass the Turing Test [ualberta.ca], then yes. For those not in the know, the Turing Test is a test for artificial intelligence based on social interactions. If a person interacting with an entity on-screen cannot tell if that entity is a human or an AI, then the AI passes the test and is considered "intelligent".

      The problem with the Turing Test is that it biases AI towards a human-style intellect, where that might not be the best way (or even a good way) to make an AI. For all we know, a good AI might have a thought process which, to us, would seem completely crazy.
      • For all we know, a good AI might have a thought-process which, to us, would seem completely crazy.

        Indeed, and I think I know why. As humans, most of our decisions are based on preference; and what is preference? What we like. And one cannot argue for or against a personal preference.

        Therefore, I think AI should be designed to do what machines do best with simple tasks: being thorough. Start examining element 1, then 2, then 3, etc. until the category has been exhausted of elements to examine, then produce an
      • For all we know, a good AI might have a thought-process which, to us, would seem completely crazy.

        CAR!
        Yes Dave?
        Why are you turning left at every corner?? I need to get to work!
        The umbrella in the rear seat needs to be triple rotated bluely because Calista Flockhart has been eating too much red food lately.


        -
        • > The umbrella in the rear seat needs to be triple rotated bluely because Calista Flockhart has been eating too much red food lately.

          Sounds crazy, but the computer knows that Calista's about to jump into your back seat (because she has realized her red food habit is getting out of control and has to "get away") as you turn the umbrella. If you had been doing that in a casual, perhaps yellow, manner she would have sat on it, breaking her hip (and she would have sued you). If you had done it four times,
    • I think that you are confusing "all computing" with "MS products" and drawing a connection with the mythical behavior of lemmings on which the crashing behavior of said products was modelled.
    • Computers run on logic, logic comes from humans, humans are biological.
  • by ReadbackMonkey ( 92198 ) on Thursday July 15, 2004 @10:54AM (#9707993)

    I read things like this and can't help but think about some alien engineers coming to earth, deciding that they don't have time to explore it properly, and plopping down some solar powered "robots" to gather some data on the planet. A few millennia pass and some more alien engineers come by, having the same idea but being jerks, deciding to make "robots" that eat the solar powered "robots".

    Jerks.
    • by tgrigsby ( 164308 ) on Thursday July 15, 2004 @11:03AM (#9708083) Homepage Journal
      That's not so bad, really. The solar powered ones are still doing ok. The robots that eat the solar powered ones are flourishing as well. And there are even robots that eat those robots and so on. It's actually worked out alright, although the latest release of robots seems destined to eat every other robot and even themselves. But even those aren't the worst.

      It's the robots that attempt to charge people a licensing fee for using Linux that really burn me up.

    • Or even more compellingly, they put the solar powered robots on a barren planet to transform its environment into one suitable for colonization a billion or so years hence. And guess what? A billion years is up...
  • by prgrmr ( 568806 ) on Thursday July 15, 2004 @10:55AM (#9708007) Journal
    how to develop 'algorithmic design concepts distilled from biological systems, or processes.'

    Does this mean we can expect the whole dating-and-mating process to be reduced to an algorithm? Does the average slashdotter now have reason to hope to aspire to procreation?
    • In this evolutionary process, slashdotters are unfortunately all gone after the first iteration.
    • Does this mean we can expect the whole dating-and-mating process to be reduced to an algorithm?

      Nah, if there is an algorithm for that at all, you can bet your a** it's going to be of exponential complexity. Not only that, but if it is ever solved, the universe will implode or something equally nasty. There are some things man is not meant to tinker with. Stick with trying to reduce NP-complete problems to polynomial time, it's much safer and easier. Or you could try solving the good old Collatz Problem [wolfram.com].

      Y
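For anyone who wants to play with the Collatz problem mentioned above, the iteration itself is tiny; proving that it always reaches 1 is the famously hard part:

```python
def collatz_steps(n):
    """Count iterations of n -> n/2 (n even) or n -> 3n+1 (n odd) until
    n reaches 1. The conjecture -- still unproven -- is that this always
    terminates for every positive integer."""
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps
```

Small inputs can take surprisingly long walks: 27, for instance, climbs as high as 9232 before finally coming back down to 1.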

    • > Does this mean we can expect the whole dating-and-mating process to be reduced to an algorithm?

      No, because as soon as the average female enters the equation, all logic is tossed out the window, and computers must run on explicit instructions. There is no way to predict what a woman will do, except for the universal constant -- bitching.
  • by grunt107 ( 739510 ) on Thursday July 15, 2004 @10:55AM (#9708010)
    Didn't they clash with the autobots?
  • Makes Sense (Score:2, Insightful)

    by seaniqua ( 796818 )
    Seems logical to me, especially for multiuser/processor networking. Nature has been "networking" bugs, fish, packs of mammals, etc. for many more years than we've been around. All that extra research time has to count for something. Now that I think about it, a hive of insects is somewhat similar to a group of computers. The individuals possess little (or no) independent thought, only giving responses to electrical or chemical signals. Interesting...
    • A computer, by definition, is "one who computes." Hive elements are exactly that -- analog transistors that each can perform simple tasks for the benefit of the hive, or user. An insect hive may be similar to a group of electronic computers, but it is a computer in its own right.
    • Oh no! You just stumbled on every corporation's ultimate goal: people with no independent thought, only giving responses to the electrical or chemical signals they send. /me rushes off to find a tin-foil hat.
    • Your entire nervous system works like this, as do things like active transport proteins and the majority of organelles in your cells. A whole lot of nature follows the "individuals possess little (or no) independent thought, only giving responses to electrical or chemical signals" plan. The interesting stuff comes from emergent properties, which still seem to baffle scientists. For example, your brain is a collection of basically binary gates - fewer than are in current CPUs - and yet we (and sev
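The "simple signals, emergent complexity" point is easy to demonstrate with an elementary cellular automaton: each cell responds only to itself and its two neighbors, yet rule 110 is provably capable of universal computation. A minimal sketch:

```python
def ca_step(cells, rule=110):
    """Advance a 1-D binary cellular automaton one generation on a ring.
    Each cell's next state depends only on its own state and its two
    neighbors' states -- a 3-bit lookup into the 8-bit rule number."""
    n = len(cells)
    out = []
    for i in range(n):
        left = cells[(i - 1) % n]
        mid = cells[i]
        right = cells[(i + 1) % n]
        idx = (left << 2) | (mid << 1) | right
        out.append((rule >> idx) & 1)
    return out
```

Seed a row with a single 1 and iterate: the triangles and glider-like structures that appear are written nowhere in the 8-entry rule table.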
  • My daughters and I experimented with these last weekend, after a birthday party. Many of them only ended up exploring the neighbors' trees. They must have found the trees interesting; they're still there. (I guess that's better than them deciding to explore the power lines, though...)
  • by Anonymous Coward on Thursday July 15, 2004 @11:03AM (#9708087)
    Mark Tilden has noticed that machines that mimic biology take a lot fewer computational resources than machines that are strictly programmed.

    http://encyclopedia.thefreedictionary.com/Mark%20Tilden

    Trying to strictly control everything doesn't work well past a certain level of complexity. It's like capitalism vs communism or Cathedral vs Bazaar. I expect to see a lot more of this kind of project in the future.
  • by yebb ( 142883 ) * on Thursday July 15, 2004 @11:19AM (#9708245)
    The author refers to the genotype/phenotype analogue with respect to the cells in the mechanized system they built. But he keeps referring to the genotype as being the DNA (or code) as well as the behavior of the units, while the phenotype is the actual unit itself.

    The genotype/phenotype analogue is a good one, but his terms are not quite correct. The genotype should refer only to the DNA and genetic information, which in his case is analogous to machine code. The phenotype should be analogous to the behavior of each unit.

    A pedantic technicality, but he mentions this a few times, and it's not quite correct.

    Neat stuff regardless!
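The distinction the parent is drawing can be made concrete in code: the genotype is the raw encoded string, and the phenotype is whatever behavior falls out of expressing it. The encoding below is invented purely for illustration:

```python
def express(genotype):
    """Decode an 8-bit genotype string into a phenotype (a behavior).
    Selection acts on the phenotype; mutation acts on the genotype; only
    this decoding step connects the two levels."""
    speed = int(genotype[:4], 2)           # first 4 bits: speed 0..15
    turn_bias = int(genotype[4:], 2) - 8   # last 4 bits: turn bias -8..7
    return {"speed": speed, "turn_bias": turn_bias}
```

Flipping one genotype bit can change the phenotype a little (a low-order speed bit) or a lot (the high bit of the turn bias), which is exactly why the two levels are worth keeping distinct.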
  • Enjoy being a programmer while you can.

    Why do I say this? Well, look at the efficiencies of simple programs that are "written" or evolved by genetic algorithms. We are just beginning to scratch the surface. I suspect that even simple tasks, like controlling a toaster, will become an evolutionary process that will be given its initial operating parameters by larger AI systems.

    I think that in the future the programmer as we know them will no longer exist, instead we will have people who "teach" a program to
    • by 12357bd ( 686909 ) on Thursday July 15, 2004 @11:45AM (#9708517)

      The idea is not new; read Turing's paper Intelligent Machinery [alanturing.net] about Pain & Pleasure machines. In short, machines behave freely but are conditioned by two simple stimuli: 'pain', which forces behaviour to change, and 'pleasure', which stabilizes current behaviour.
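A loose sketch of the two-signal idea (this paraphrases the concept, not the machine construction in Turing's paper):

```python
import random

class PainPleasureMachine:
    """Behaves freely until conditioned: a 'pain' signal forces the
    response to the current situation to change; a 'pleasure' signal
    stabilizes it, so later pain can no longer disturb it."""
    def __init__(self, situations, actions, seed=0):
        self._rng = random.Random(seed)
        self._actions = list(actions)
        self.policy = {s: self._rng.choice(self._actions) for s in situations}
        self._frozen = set()

    def act(self, situation):
        return self.policy[situation]

    def pain(self, situation):
        if situation not in self._frozen:    # pain: try a different response
            self.policy[situation] = self._rng.choice(self._actions)

    def pleasure(self, situation):
        self._frozen.add(situation)          # pleasure: keep this response
```

Drive it with pain whenever the behavior is wrong and pleasure when it is right, and the policy random-walks until it sticks on something the trainer approves of.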

        Pain & Pleasure machines. In short, machines behave freely but are conditioned by two simple stimuli: 'pain', which forces behaviour to change, and 'pleasure', which stabilizes current behaviour.

        A surprising and new concept only to those who have never had a girlfriend.

        -
        • :)

          I refused to mention that aspect; Turing's life has already been broadly commented on!... but yes, there's a strong correlation between his ideas and his education. What strikes me most is that the man was lucid enough to realize it and work from there!

    • by Anonymous Coward
      Except for the fact that "genetic" algorithms only sometimes produce better results. Often, they fall short of well-designed algorithms. Genetic algorithms are often a fascinating curiosity rather than something useful.
      • As with many evolving algorithms, one of the problems is the possibility of hitting a genetic dead end. And unlike actual nature, the program menageries are typically all of the same type of beast, so it's not too unlikely for a particular design to become wildly successful for a time and basically wipe out other variants before dying itself. But as long as you enforce some randomness and preservation of diversity, there are some interesting results.
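The diversity point can be shown with a toy bitstring GA that injects a few random "immigrants" every generation, so one early winner cannot permanently wipe out all variation (the fitness function and parameters here are arbitrary):

```python
import random

def evolve(fitness, length=10, pop_size=20, gens=40, immigrants=3, seed=1):
    """Toy GA: keep the fittest half, refill with point-mutated copies,
    and add a few fresh random genomes each generation so the gene pool
    never collapses onto a single lineage."""
    rng = random.Random(seed)
    random_genome = lambda: [rng.randint(0, 1) for _ in range(length)]
    pop = [random_genome() for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size - immigrants:
            child = list(rng.choice(survivors))
            child[rng.randrange(length)] ^= 1    # one-bit mutation
            children.append(child)
        pop = survivors + children + [random_genome() for _ in range(immigrants)]
    return max(pop, key=fitness)
```

On the "one-max" problem (fitness = number of 1 bits), evolve(sum) reliably climbs to or near the all-ones genome; without the immigrants, a mediocre early winner can take the whole population with it into a dead end.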
    • You ignore the pace of the business and consumer world. Personal spending and corporations can't evolve fast enough to absorb technology as fast as it becomes viable. That's why we're still running payroll on a mainframe using cobol. It's why PC's are still built on the same basic motherboard design we've seen for about 20 years. Things have changed, but not really.
    • Nah. What actually happens in systems like that is that the answer turns out to be encoded into the fitness function and the search algorithm. Read "Why AM and Eurisko Appear to Work", which Doug Lenat wrote in one of his honest moments.
  • ...our new wind-driven, remote exploratory vehicle overlords.
  • by ethank ( 443757 ) on Thursday July 15, 2004 @11:51AM (#9708592) Homepage
    and my worker threads went on strike.
  • 10 Enter a new topic
    20 Try a first post
    30 POST "... ??? Profit!" joke
    40 POST "In Soviet Russia..." joke
    50 POST "... You insensitive clod!" Joke
    60 POST "Netcraft says: $SOMEONE is dying..." joke
    70 GOTO 10
  • This is an interesting concept, but it is hardly a new field/application (see 'Genetic Programming' by Koza, for example. Website [genetic-programming.com]).
  • "One thing that nature is particularly good at is the development of dynamic, self-organizing systems."

    Seems to me that nature IS a dynamic, self-organizing system.
  • by CrackHappy ( 625183 ) on Thursday July 15, 2004 @12:36PM (#9709079) Journal
    This sentence in the article was rather creepy to me:
    With minor exceptions, each cell contains the information to become any one of the 256 or so types.

    That number coming up in biology is interesting.
  • One thing that nature is particularly good at is the development of dynamic, self-organizing systems which post articles to nerdy websites making statements like "One thing that nature is particularly good at is the development of dynamic, self-organizing systems."

    LS
  • Nature never has to convert between metric and English units.
  • 8 bits! (Score:2, Funny)

    by roofingfelt ( 584540 )
    In the case of the human, the initial parent cell undergoes approximately 50 cell divisions, creating 10^15 cells in your body, of which there are about 256 different types

    256? Isn't that convenient!

  • by crovira ( 10242 ) on Thursday July 15, 2004 @01:54PM (#9709959) Homepage
    Almost all software is container based. Indeed, all of our systems' designs are fundamentally based on the Five Normal Forms.

    The world can't be modeled that way.

    Instead of containing data object relationships, you need to design your software with relationship objects and connection instances that are in a separate object space.

    You get reusability benefits because you don't have to alter the objects when their relationships change. Most of our system maintenance is due to relationship changes, not object changes.

  • One thing that nature is particularly good at is the development of dynamic, self-organizing systems.

    No, nature is particularly terrible at doing that, but it cranks out so many different attempts over such an enormous time span that it looks good to us lowly humans. The idea of "biomorphic" software generally fails because we don't want to merely operate as a "hand of god" and take a come-what-may attitude, we have specific problems we want our software to solve. If we have a solution in mind, then

    • OT: For Doc.. (Score:1, Offtopic)

      by Da VinMan ( 7669 )
      You know, you really got my curiosity. You put me on your foes list yesterday, and I've never even talked with you. Weird. "I wonder who this joker is?" I say to myself. So, I go find out. I find out that not only are you NOT a /. troll, you even work in the same geographical area as me. We work within about 10 miles of each other. So, I go read some of your other stuff. I'm curious and I can't help myself. I keep wondering why a reasonable and intelligent guy would bother to make me into a foe.

      So
  • Remind anyone of 'Prey'?
  • by grimover ( 212034 ) * on Thursday July 15, 2004 @03:07PM (#9710696) Homepage
    Hmmm... seems to me the first use of the term "Biomorphing" was in scientist and SF writer Dr. Charles Pellegrino's 1998 ecological thriller "Dust"; I wasn't aware that it had come into wider use. That was the term in the novel for synthetic life forms like dinosaurs cloned from recovered DNA, but with modifications to make them smaller and more docile for use as house pets. "Dust" describes a meltdown of the global ecology, one of the symptoms of which is swarms of trillions of suddenly-carnivorous dust mites that consume whole towns full of people (and animals).

    This may have inspired Crichton to try for some of the same scares in his 2002 novel "Prey", although it's physically impossible for Crichton's nanites to move as fast as they do in the novel (due to Reynolds number); not so for dust mites. It wouldn't be the first time Crichton has borrowed from Pellegrino, who wrote a speculative article on "Dinosaur World Park", a place filled with dinosaurs cloned from DNA traces on insects in amber, in a 1985 issue of Omni, which Crichton acknowledges inspired Jurassic Park.

    Strangely enough, the novel "Dust" also features technologies based on spider silk grown from genetically modified corn silk. I wonder if the poster has read this novel? Great read if you're into hard SF and thrillers, BTW.
    • I think evolutionary biologist Richard Dawkins deserves credit for the term "biomorph" to describe the results of his 'Blind Watchmaker' program around 1986.

      The program generated very simple tree-like drawings based on various parameters. A given "computer biomorph" could be selected and the computer would generate a number of 'children', whose shape (parameters) would be based on those of the parent with slight random changes (mutations). Dawkins later wrote variants to simulate spider-webs. These Mac-
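The mutate-offspring-and-pick loop described here is tiny in code; a sketch (the gene vector and step size are invented for illustration):

```python
import random

def biomorph_children(parent, n=9, step=1, seed=42):
    """Produce n offspring of a biomorph's gene vector, each differing
    from the parent by +/-step in exactly one gene. In Dawkins' program
    the human eye then plays the role of natural selection."""
    rng = random.Random(seed)
    offspring = []
    for _ in range(n):
        child = list(parent)
        child[rng.randrange(len(child))] += rng.choice([-step, step])
        offspring.append(child)
    return offspring
```

Repeat the loop: draw each child, let the user click a favorite, make that the new parent. Surprisingly intricate "insects" and "trees" appear within a few dozen generations of nothing but this.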
