AI Programming

Allen Institute Data Enables Hackathon For the Human Brain 37

Posted by timothy
from the proving-the-old-brains-in-jars-theory dept.
Nerval's Lobster writes "Hackathons are not exactly uncommon things, whether the programmers are assembled to improve a company product or simply to tackle a particular challenge. Few of them, however, offer the chance to hack the human brain. That was the reason behind the Seattle-based Allen Institute for Brain Science's week-long hackathon: give 30 participants from various universities and institutes, along with a smattering of technology companies, the chance to develop data-analysis tools based on the latest version of the Institute's Allen Brain Atlas API, which was released earlier in June. Projects included one application that crunched a list of genes to discover disease patterns. Another translated genomic data into music—because when it comes to data-crunching and neuroscience, you can't be deadly serious all the time." Be careful what you wish for, though, in applying AI to regular I: New submitter jontyl writes of a project led by Google's Dr Jeff Dean which used a "16,000 processor array to create a brain-style 'neural network' with more than a billion connections." Says the article: "There's a certain grim inevitability to the fact that the YouTube company's creation began watching stills from cat videos."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Wrong word (Score:5, Funny)

    by frisket (149522) <[ei.liramlis] [ta] [retep]> on Tuesday June 26, 2012 @09:42AM (#40451741) Homepage
    For a moment I thought that said "Alien Institute..."
    • by Ed_1024 (744566)
      Yeah, me too! The letter pairings li and ll look pretty similar in the font I've got, plus I suppose my brain was hoping for "Alien"...
  • I initially misread the title of this article as "Alien institute data enables hacking the human brain."
  • by A10Mechanic (1056868) on Tuesday June 26, 2012 @09:53AM (#40451861)
    What we wished for - SKYNET. What we actually got - LOLcat
    • by Qzukk (229616)

      Combine them both and you get Ceiling Cat [photobucket.com]. Where is your god now, indeed.

    • by Rei (128717)

      We wished for skynet? Wow, you're really metal. ;)

      What I "wish for" in this field is the ability to A) accurately and completely model the behavior of neurons, not just at a snapshot in time but as they evolve over time, at scales sufficient to model the entire human brain; B) the ability to simultaneously read all of the necessary state data from every neuron in the brain (neurotransmitter levels, connections, etc) to feed into the simulation; and C) the ability to feed back to each neuron, in kind, data from the simulation.

      • Clone is probably easier, technologically, because you don't have to worry so much about speed.

        1. Remove fresh brain. This obviously kills the subject, but that's acceptable to me. Plenty of people would die to live forever.

        2. Slowly lower the brain into liquid nitrogen. You want it stabilised before the neurological damage sets in, and cryonics is the obvious way.

        3. Let your futuretech nano-disassemblers pick it apart cell-by-cell over the course of months, documenting as they go.
  • Not enough time (Score:3, Insightful)

    by Anonymous Coward on Tuesday June 26, 2012 @09:56AM (#40451891)

    One week to analyze petabytes of information is absurdly brief. Just to transfer two petabytes in one week requires a data rate of 3550 Megabytes/sec. A somewhat sane amount of time for this would be to analyze the data for an entire summer. Otherwise, they're only working with a small subset of the data - possibly not enough to be statistically significant. Bringing 30 people to task for this for one week is like trying to make a baby in one month by using 9 women. It could be fun to try, but it just isn't going to happen.
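The parent's throughput figure can be checked with a quick back-of-the-envelope calculation; the sketch below assumes 2 PiB of data, a 7-day window, and binary units, which is the only reading that reproduces the number quoted.

```python
# Back-of-the-envelope check of the transfer-rate figure above
# (assumes 2 PiB moved over exactly 7 days, binary units).

def required_rate_mib_s(data_pib: float, days: float) -> float:
    """Sustained throughput in MiB/s needed to move data_pib pebibytes in `days` days."""
    data_mib = data_pib * 1024 ** 3        # PiB -> MiB
    seconds = days * 24 * 60 * 60
    return data_mib / seconds

rate = required_rate_mib_s(2, 7)
print(f"{rate:.0f} MiB/s")                 # ~3551 MiB/s, matching the figure above
```

With decimal petabytes (10^15 bytes) the answer drops to roughly 3300 MB/s, so the comment is evidently counting in pebibytes.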

    • by mfh (56)

      One week to analyze petabytes of information is absurdly brief.

      Statistics were not really designed to help liars... but god damn the liars love statistics, don't they?

    • Bringing 30 people to task for this for one week is like trying to make a baby in one month by using 9 women. It could be fun to try, but it just isn't going to happen.

      I don't know, statistically, you just might succeed. There is a non-zero chance of one of the randomly-chosen women being in the eighth month. Randomly choosing nine women instead of just one woman obviously increases your success factor, so the whole thing works.

  • will assume that the most important aspects of existence involve cat pictures, kid pictures, porn, viagra, penis enlargement, breast enlargement, celebrity news and that most frequently mentioned country, Nigeria.

    I, for one, welcome our new, well-endowed, artificially intelligent, trailer-trash overlords

  • No one is asking, but I am answering it nonetheless because it seems quite important to me, and it used to be the first question Slashdotters would ask on such a subject: are the data public? The answer is no. They are subject to a license that is fairly permissive. Here is the core:

    Any of your use of the data and tools (including creation of derivative works of the data and tools) must be for noncommercial use unless otherwise agreed to by the Allen Institute;

    Your use must be in accordance with the Freedom to Innovate section below, which generally prohibits you from obtaining intellectual property rights that would limit the Allen Institute's freedom to continue innovation; and

    If you use the data and tools provided by the Allen Institute, you must follow the citation policy in these Terms or, if your use is not specifically described in the citation policy, provide proper attribution of the source.

  • by Okian Warrior (537106) on Tuesday June 26, 2012 @11:18AM (#40452733) Homepage Journal

    ...a project led by Google's Dr Jeff Dean which used a "16,000 processor array to create a brain-style 'neural network' with more than a billion connections."

    Hey, Dr. Dean, can you answer a few questions about your project?

    In your project, what is the correct number of hidden layers to use? What algorithm or rule can I use to choose the right number of layers in my project?

    Which connection scheme are you using? In a topological sense, meaning the rules that determine which nodes are connected to other nodes. Is there a way to determine the correct topology using some method?

    There are over 180 different types of artificial neurons. Which ones are you using? What rule indicates that these are the correct ones to use in your application?

    Neural nets in the human brain have more back-propagation circuits than forward. This would appear to be a major feature of the human brain. Does your system have this feature?

    I'm a little confused on the whole AI bit. I've researched all over the literature and net, but still haven't found a constructive definition for intelligence. Help me out here - what definition of "intelligence" are you using, so that you can relate your project to the field of AI?

    Looking forward to hearing from you. Thanks in advance.

    (An AI researcher)

    • ...a project led by Google's Dr Jeff Dean which used a "16,000 processor array to create a brain-style 'neural network' with more than a billion connections."

      Which connection scheme are you using? In a topological sense, meaning the rules that determine which nodes are connected to other nodes. Is there a way to determine the correct topology using some method?

      (not an AI researcher) honest question - what's the point of having multiple connections between processors? If it's a completely connected K-16000 graph then there would only be ~128 million connections. If he's using over a billion that's averaging about 8 connections from every node to every other node. What's the logic behind that with a processor array?
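The arithmetic in the question checks out; here is a small sketch of it, using the 16,000-processor and billion-connection figures from the article (the "parallel links per pair" reading is the questioner's interpretation, not anything Google has stated).

```python
# Sanity-checking the numbers in the question: edges in a complete
# graph K_n versus the reported billion connections.

def complete_graph_edges(n: int) -> int:
    """Number of edges in the complete graph K_n: n choose 2."""
    return n * (n - 1) // 2

nodes = 16_000
edges = complete_graph_edges(nodes)        # 127,992,000 — about 128 million
multiplicity = 1_000_000_000 / edges       # ~7.8 "connections" per processor pair
print(edges, round(multiplicity, 1))
```

The likely resolution, as replies below suggest, is that "connections" counts neural-network weights, not physical links between processors.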

      • Maybe each processor represents more than one neuron? After all, modern L2 cache is getting pretty big these days.

      • by tobiah (308208)

        The number of processors is somewhat unrelated to the number of nodes in the scheme. Neural networks are highly parallel, so the more cores, the faster it can run, with little practical limit. But you could model the billion-connection network with a single CPU, if you were really, really patient.
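The point above can be illustrated with a toy serial simulation: one core steps the whole network in O(edges) time per update, and parallel hardware only changes the wall-clock time, not what can be computed. Sizes and the update rule here are made-up toy values, not anything from the Google project.

```python
import random

# Toy serial update of a sparse network: each synchronous step sums
# weighted inputs into every node, one edge at a time on one core.

def step(activations, connections, weights):
    """One synchronous update: each node accumulates its weighted inputs."""
    new = [0.0] * len(activations)
    for (src, dst), w in zip(connections, weights):
        new[dst] += w * activations[src]
    return new

random.seed(0)
n_nodes, n_edges = 1000, 8000              # ~8 connections per node, as in the parent
conns = [(random.randrange(n_nodes), random.randrange(n_nodes)) for _ in range(n_edges)]
weights = [random.uniform(-1, 1) for _ in conns]
acts = [1.0] * n_nodes
acts = step(acts, conns, weights)          # serial, O(edges) per step
```

Scaling the same loop to a billion edges is just patience: each step costs a billion accumulate operations, which a single modern core grinds through in seconds, not never.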

    • by xtal (49134)

      You don't need to define intelligence. The encoding for it is in your DNA... or your head. All we need is a way to emulate what is there, apply inputs, and observe how things react. Unfortunately, the last time I investigated, that was the level of understanding we were at with regard to intelligence.

      What is missing is a machine architecture on which to model existing intelligences in a generic way.

  • by Anonymous Coward
    I find it interesting that this, an actual technology-related article, has roughly 20 comments and isn't really going anywhere. Meanwhile, the Orbitz and Mac users article has around 110 comments and growing. Fuck.
  • Another translated genomic data into music—because when it comes to data-crunching and neuroscience, you can't be deadly serious all the time.

    What's non-serious about that? Translating the data into music makes it accessible to the powerful rhythm, tonal sequence, and echo-imaging processing of the auditory system, which might identify interesting and useful features in the data.
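A minimal sketch of the sonification idea the parent is defending: map a series of values onto a musical scale so that trends in the data become melodic contours the ear can follow. The scale choice and the sample values below are fabricated for illustration; nothing here reflects the actual hackathon project.

```python
# Hypothetical sonification sketch: rank each value within the series
# and map it onto a pentatonic scale (MIDI note numbers).

PENTATONIC = [60, 62, 64, 67, 69]          # C major pentatonic

def to_notes(values):
    """Linearly scale each value into an index over the note list."""
    lo, hi = min(values), max(values)
    span = hi - lo or 1
    return [PENTATONIC[int((v - lo) / span * (len(PENTATONIC) - 1))] for v in values]

expression = [0.1, 0.4, 0.9, 0.5, 0.2]     # fabricated sample data
print(to_notes(expression))                # [60, 62, 69, 64, 60]
```

A pentatonic scale is a common choice in sonification precisely because any mapping sounds consonant, leaving the listener free to attend to the shape of the data rather than to dissonance.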

  • Lunkwill: Do you...
    Google X: Have an answer for you? Yes. But you're not going to like it.
    Fook: Please tell us. We must know!
    Google X: Okay. The answer to the ultimate question of life, the universe, and everything is...
    [wild cheers from audience, then silence]
    Google X: Pretty Kitty.
