Supercomputer Simulates Human Visual System 244
An anonymous reader writes "What cool things can be done with the 100,000+ cores of the first petaflop supercomputer, the Roadrunner, that were impossible to do before? Because our brain is massively parallel, with a relatively small amount of communication over long distances, and is made of unreliable, imprecise components, it's quite easy to simulate large chunks of it on supercomputers. The Roadrunner has been up for only about a week, and researchers from Los Alamos National Lab are already reporting inaugural simulations of the human visual system, aiming to produce a machine that can see and interpret as well as a human. After examining the results, the researchers 'believe they can study in real time the entire human visual cortex.' How long until we can simulate the entire brain?"
Ghost in the supercomputer (Score:3, Interesting)
And when this simulation claims to be conscious, what do we make of that?
The Last Step For Ubiquitous Robotics? (Score:5, Interesting)
Extrapolating further, a human-quality object recognition system will yield results which we cannot currently imagine (let's avoid some big-brother robot talk for a second, however).
For example: I was looking at some old WWII photographs of troops getting on a boat - thousands of faces in these very high-quality photographs. To myself, I thought, 'Self, if all historical photographs could be placed in view of a recognition system, perhaps it could be found, interestingly, where certain ancestors of ours appear.'
Throw in a dash of human-style creativity and reasoning, and I'm certain some truly nifty revelations are to be found in our mountains of visual documentation currently languishing in countless vast archives.
Re:New goal... (Score:5, Interesting)
These are also the little dudes who can strike with the force of a
Go Super Shrimp!
The hardware is apparently there (Score:4, Interesting)
We are ignoring for the moment how the neural network simulators work, how they communicate amongst themselves, how they are partitioned, what sensor inputs they receive, how they are trained (that's a tough one), etc. This will turn out to be extraordinarily difficult unless some very clever people mimic nature in very clever ways.
Well, at least the hardware is there.
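The challenges the parent lists (partitioning, communication, training) all sit on top of a fairly simple per-neuron update rule. As a purely illustrative sketch - not anything the Roadrunner team has described - here is a minimal leaky integrate-and-fire network in Python, where every constant (time step, time constant, threshold, coupling strength) is an arbitrary assumption:

```python
import numpy as np

def simulate_lif(weights, input_current, steps=1000, dt=1e-3,
                 tau=0.02, v_thresh=1.0):
    """Toy leaky integrate-and-fire network; all constants are illustrative."""
    n = weights.shape[0]
    v = np.zeros(n)                    # membrane potentials
    spiked = np.zeros(n)               # which neurons fired on the last step
    counts = np.zeros(n, dtype=int)    # spike counts per neuron
    for _ in range(steps):
        # Leak toward the external drive, plus recurrent synaptic input
        v += dt / tau * (input_current - v) + weights @ spiked
        spiked = (v >= v_thresh).astype(float)
        counts += spiked.astype(int)
        v[spiked > 0] = 0.0            # reset neurons that just fired
    return counts

rng = np.random.default_rng(0)
n = 100
weights = rng.normal(0.0, 0.01, (n, n))   # weak random coupling (assumed)
drive = rng.uniform(1.1, 1.5, n)          # above threshold, so cells fire
counts = simulate_lif(weights, drive)
print(counts.sum(), "total spikes across", n, "neurons")
```

The hard parts the parent mentions - partitioning a matrix like `weights` across 100,000 cores and training it - are exactly what this toy version leaves out.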
Why supercomputers? (Score:3, Interesting)
Should be enough of us out here, I would think.
"interpretation" at what level? (Score:4, Interesting)
First, I want to say that this whole level of brain modeling is really cool. However, there are, of course, different levels of "interpretation." I don't think that this computer will be able to achieve a human level of interpretation simply by modeling the visual cortex.
Even "interpretation" at the second level above (which it seems the "Roadrunner" might be able to model) requires a lot more, for humans, than just the visual cortex.
In other words, if we were to call into existence a floating occipital lobe connected to a couple of eyes that had never been attached to the rest of a brain, we would never be able to achieve recognition/categorization, let alone interpretation. If I'm wrong, maybe some of you hardcore neuroscience types can help me out?
Re:The hardware is apparently there (Score:2, Interesting)
The 1 ms minimum re-activation interval is interesting, because given enough CPU cores per RAM bank, the speed of the computer may surpass that of the biological brain.
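A back-of-envelope version of that claim can be written down directly. In this sketch both the per-update cost (100 ns) and the neuron count (10^10) are loud assumptions, not measurements:

```python
# If a neuron can fire at most once per ~1 ms, a core that evaluates one
# neuron-state update in ~100 ns (assumed) can time-multiplex many neurons
# within a single biological "tick".

refractory_s = 1e-3          # ~1 ms minimum re-activation interval
update_cost_s = 100e-9       # assumed cost of one neuron update on one core

neurons_per_core = refractory_s / update_cost_s
print(f"one core could update ~{neurons_per_core:.0f} neurons per 1 ms tick")

# Scale up: cores needed for ~10^10 neurons in real time (assumed count,
# and ignoring the communication costs that dominate in practice)
total_neurons = 1e10
cores_needed = total_neurons / neurons_per_core
print(f"~{cores_needed:.0e} cores for real-time simulation")
```

Under these assumptions one core covers ~10,000 neurons per tick, so the bottleneck is less raw speed than memory bandwidth and inter-core communication.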
Too Optimistic (Score:3, Interesting)
However, that is based on the previously incorrect assumption that neurons are the only kind of brain matter that is important. Now it is clear that glial cells play an important role in coordinating cognition. There are 10 times as many glial cells as there are neurons. That sets our simulation back a few years.
I think Ray Kurzweil is way, way too optimistic regarding the rate of progress.
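For what it's worth, "sets our simulation back a few years" checks out under a Moore's-law-style assumption. A quick sanity check in Python (the 18-month doubling time is an assumption, not a law):

```python
import math

# If glial cells add ~10x more elements to simulate (so 11x total vs.
# neurons alone), how long until hardware growth absorbs the increase,
# assuming compute capacity doubles every 18 months?

extra_factor = 11          # neurons + 10x glial cells vs. neurons alone
doubling_years = 1.5
delay = math.log2(extra_factor) * doubling_years
print(f"~{delay:.1f} years")   # prints "~5.2 years"
```

So "a few years" is about right - roughly five, if the doubling assumption holds.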
Don't hold your breath (Score:4, Interesting)
After examining the results, the researchers 'believe they can study in real time the entire human visual cortex.' How long until we can simulate the entire brain?"
There are researchers who believe that humans use their whole brain to "see." If that is true, the claims of these researchers are highly premature with respect to vision. Everything from stored patterns to extrapolation is used to determine what we see. Even familiarity is used in perception - that is why there is the urban myth that "foreign" people all look the same. If one were to ask those foreigners, they would say the indigenous people all look totally different.
Re:The Singularity is Near (Score:3, Interesting)
More on that: how in the hell are we to keep evolving if not through technology? We won't evolve "naturally" anymore - I think that's well established. Our social system (for ALL of us) has not, erm... evolved into a good evolutionary system that rewards the best.
The only way "up" is through a technological singularity. I don't think it's inevitable, though; I think it's necessary and desirable.
How long? (Score:3, Interesting)
Until they can emulate the quantum/holographic methods the brain employs. Keep in mind, there are some worlds-within-worlds among the physical components. Just as metal siding can form a complete circuit around a house, the nerves of the brain form multiple networks (chemical, electrical, interference patterns, etc.).
Re:The Singularity is Near (Score:3, Interesting)
What I took from the Wikipedia article is that these astrocytes are responsible for neurotransmitter release and reuptake; these chemicals, based on my (admittedly limited) understanding, are the primary movers and shakers in the brain.
Serotonin, for example, is deeply related to mood, which is why many prescription anti-depressant/anti-anxiety drugs are effective.
If my understanding is correct (and it may not be), then astrocytes perform a much more complicated function than a power cable.
Re:Not Bloody Likely (Score:2, Interesting)
As regards your other comments, I am willing to bet that the number of neuroscience publications produced by our team compares favorably to that of almost any group of a similar size. We know what we are doing. For example, some of us are behind the DOE/DOD artificial retina project, intended for blind soldiers coming home from wars. People cannot see with such retinas yet, but they can distinguish light from darkness. So, again, while we are computer geeks, we are also quite respected neuroscientists (read the team roster in the original press release and Google us).
The key thing, of course, is that in this project we didn't want to simulate the real physiology (which, I agree with you, we have no hope of doing in the foreseeable future). We tried to simulate the functional behavior of the network. The difference is the same as, for example, between simulating locomotion at the level of contracting muscles and rigid bones vs. simulating gene expression and protein production in every cell of said muscle.