Supercomputing

Can We Build a Human Brain Into a Microchip? 598

destinyland writes "Can we imprint the circuitry of the human brain onto a silicon chip? It would require a computational capacity of 36.8 petaflops (a petaflop is a thousand trillion floating point operations per second), but a team of European scientists has already simulated 200,000 neurons linked up by 50 million synaptic connections. And their brain-chip is scalable, with plans to create a superchip mimicking 1 billion neurons and 10 trillion synapses. Unfortunately, the human brain has 22 billion neurons and 220 trillion synapses. Just remember Ray Kurzweil's argument: once a machine can achieve a human level of intelligence, it can also exceed it."
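For scale, the gap implied by the summary's own numbers can be sketched with a quick back-of-envelope calculation (the quoted figures are taken at face value):

```python
# Back-of-envelope comparison of the figures quoted in the summary:
# the simulated network versus the quoted size of a human brain.
simulated_neurons = 200_000
simulated_synapses = 50_000_000
brain_neurons = 22_000_000_000            # 22 billion
brain_synapses = 220_000_000_000_000      # 220 trillion

neuron_factor = brain_neurons / simulated_neurons
synapse_factor = brain_synapses / simulated_synapses

print(f"neuron gap:  {neuron_factor:,.0f}x")   # 110,000x
print(f"synapse gap: {synapse_factor:,.0f}x")  # 4,400,000x
```

Even the planned 1-billion-neuron superchip would still fall short of those brain figures by a factor of 22 on both neurons and synapses.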
This discussion has been archived. No new comments can be posted.

  • Sure we can... (Score:5, Informative)

    by thisnamestoolong ( 1584383 ) on Thursday August 06, 2009 @01:01PM (#28974725)
    ...but why would we? The brain was assembled by natural selection -- a process that can only build on what it already has, which is hardly ideal. The human brain is certainly amazing, but it is not perfect. There are certainly better, faster, and more efficient ways of designing the superhuman AIs of the future. Looking at the brain will give us a good road map, but it is not the be-all and end-all.

    I see a strange arrogance and egocentricity in trying to design robots to be exactly like us. Why not think outside the box? Why are upright, bipedal robots always portrayed as the ultimate? There are most certainly more efficient and better designs than the one we are saddled with; this is just how we happened to evolve, and we are simply the current end of one branch of the evolutionary tree.
  • by characterZer0 ( 138196 ) on Thursday August 06, 2009 @01:28PM (#28975207)

    First off, it's been said that people only use 10% of their actual brain power.

    Lots of stupid things have been said. People generally only use 10%-20% of their brains at any given moment. They use nearly all of it through the course of the day.

  • by Hungus ( 585181 ) on Thursday August 06, 2009 @01:29PM (#28975221) Journal

    Lots of "things" are said, and lots of things are wrong. "People only use 10% of their actual brain power" belongs to both groups.

    Though an alluring idea, the "10 percent myth" is so wrong it is almost laughable, says neurologist Barry Gordon at Johns Hopkins School of Medicine in Baltimore. Although there's no definitive culprit to pin the blame on for starting this legend, the notion has been linked to the American psychologist and author William James, who argued in The Energies of Men that "We are making use of only a small part of our possible mental and physical resources." It's also been associated with Albert Einstein, who supposedly used it to explain his towering intellect.

    source: http://www.scientificamerican.com/article.cfm?id=people-only-use-10-percent-of-brain [scientificamerican.com]

  • by hoggoth ( 414195 ) on Thursday August 06, 2009 @01:35PM (#28975323) Journal

    But what if the brain works by exploiting all of the effects of molecules, proteins, ions, electrical charges, even quantum effects at a molecular level? We have seen that evolution is excellent at finding very clever ways of exploiting whatever resources are available. It is possible that the only way to simulate a brain is to simulate every single atom involved within a brain. For obvious reasons a computer made of 'n' atoms cannot simulate a brain made of 'n' atoms as fast as that brain can work.

    I don't know that this is true, but it certainly brings up the possibility that it may be impossible to simulate a brain faster than a brain works, or better than a brain.

    Or, on a slightly less pessimistic level, perhaps a "synapse" could be encapsulated in a software object, but the number of variables that make each synapse's position, arrangement, and connections unique is staggering, and simulating them would require a machine thousands of times more powerful than a real brain. That would move our "singularity" out until we have computers that can process as much as 22,000 billion neurons and 220,000 trillion synapses. I wonder if someone better at math and physics could calculate the bare minimum energy required to store 220,000 trillion somewhat complex pieces of information. I recall reading a calculation that a ZFS filesystem could theoretically (though not practically) hold so much information that the minimum energy required to actually encode it would be enough to boil the Earth.
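As a rough answer to the energy question above: Landauer's principle puts the minimum energy to irreversibly write one bit at k_B * T * ln(2). Treating each of the 220,000 trillion synapses as a single bit is a simplifying assumption for illustration only (real synapses carry far more state), but it sets the thermodynamic floor:

```python
import math

# Landauer bound: minimum energy to irreversibly write one bit is k_B*T*ln(2).
# Assumption for illustration: one bit per synapse (real synapses hold more
# state than a single bit, so this is only a lower bound).
k_B = 1.380649e-23                 # Boltzmann constant, J/K
T = 310.0                          # body temperature, K
synapse_bits = 220_000 * 10**12    # "220,000 trillion" pieces of information

energy_joules = synapse_bits * k_B * T * math.log(2)
print(f"{energy_joules:.2e} J")    # well under a millijoule
```

So at the thermodynamic floor, merely encoding brain-scale quantities of bits is cheap; boil-the-oceans figures for ZFS come from its astronomically larger 2^128-bit address space, not from anything brain-sized.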

  • by gestalt_n_pepper ( 991155 ) on Thursday August 06, 2009 @01:54PM (#28975681)

    Well, it's not being done that way. The idea behind what the Europeans are doing is to simulate actual neuronal behavior. The results were quite interesting: the chip seems to behave much like a real piece of neural tissue (http://www.technologyreview.com/biomedicine/19767/).
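For a sense of what "simulating actual neuronal behavior" looks like in software, here is a minimal leaky integrate-and-fire neuron. This is a textbook toy model with illustrative parameter values, not the analog circuitry the European team's chip actually implements:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: a textbook toy model for
# illustration only; all parameter values here are illustrative, not measured.
def simulate_lif(input_current, dt=1e-4, tau=0.02, v_rest=-0.065,
                 v_thresh=-0.050, v_reset=-0.065, r_m=1e7):
    """Integrate dV/dt = ((v_rest - V) + R_m * I) / tau; emit a spike and
    reset whenever V crosses threshold. Returns spike times in seconds."""
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(input_current):
        v += dt * ((v_rest - v) + r_m * i_in) / tau
        if v >= v_thresh:
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

# A constant 2 nA drive for one second produces regular firing.
spikes = simulate_lif([2e-9] * 10_000)
print(f"{len(spikes)} spikes in 1 s")
```

Neuromorphic hardware of the kind described implements comparable membrane dynamics in dedicated circuitry rather than stepping equations in software, which is roughly where the hoped-for speedup comes from.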

  • Educated guess : No (Score:2, Informative)

    by Iffie ( 1410897 ) on Thursday August 06, 2009 @01:58PM (#28975755)
    As a former neuroscientist I would say no. The brain is not built out of static elements; neurons themselves show DNA/RNA expression variation driven by their own activity and by neuromodulation. A fixed system can do certain kinds of processing, but it has to be reconfigurable, with some software oversight, to be as flexible as our brains.
  • Randomness (Score:5, Informative)

    by Burnhard ( 1031106 ) on Thursday August 06, 2009 @02:08PM (#28975893)

    It requires a computational capacity of 36.8 petaflops -- a thousand trillion floating point operations per second

    It requires far more than that. According to some, the microtubules in the cells' cytoskeletons may themselves be processing units. If so, raise the bar a few orders of magnitude.

  • by seven of five ( 578993 ) on Thursday August 06, 2009 @02:32PM (#28976219)
    Ray Kurzweil is a brilliant computer scientist and brought us many improvements to -- maybe even the invention of -- the electronic musical keyboard.

    err... no. Electronic keyboards go back at least as far as the Ondes Martenot [wikipedia.org].
  • Re:From the article (Score:1, Informative)

    by URL Scruggs ( 1230074 ) on Thursday August 06, 2009 @02:57PM (#28976707)

    I agree with this quote. A lot of computer scientists try to build artificial intelligence without really understanding how their own brain works. It is really too bad because they have an unusually observable specimen right in their own head. Genetic learning? Is that how you feel you learn personally?

    There are a few who are doing really interesting research into just that; I recommend this book: Fluid Concepts and Creative Analogies: Computer Models of the Fundamental Mechanisms of Thought [indiana.edu]. It goes into quite a lot of detail about trying to simulate analogy-making, is written in quite a human, personal style, and studies beautifully simple cognitive processes. It's such a shame that people like Kurzweil get all the attention when there are some really insightful, philosophical AI researchers out there (such as Douglas Hofstadter).

  • by Anonymous Coward on Thursday August 06, 2009 @03:01PM (#28976789)

    "Irving John Good who, in 1965," ... and ... "charlatan"

    Stanislaw Ulam, a Polish mathematician who knew John von Neumann (who died in 1957), recounted a conversation he had with von Neumann about accelerating change:
    "One conversation centered on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue."

    Further to this, even the concept of the Von Neumann probe is based on the accelerating spread of self-replicating machines, so it's very evident John von Neumann was already thinking along these lines decades before most other people.

    Raymond Kurzweil is indeed a brilliant computer scientist, but it's very sad that John von Neumann's ideas are so often overlooked: von Neumann died decades ago, whereas Kurzweil is still able to promote himself at every press event, gaining ever more credit for concepts he never originated.

    What makes John von Neumann's achievement even more amazing (I find it almost mind-blowing) is that technology was so primitive in his time, yet he had the insight to imagine technology far beyond even our current level. I still find that jaw-dropping. How could he imagine it decades before others? He really does deserve to be called a genius. I wish there were far more recognition of his work. We really should be building statues of him to inspire future generations of scientists.

