Future Computers 129
jethro200 writes "Popsci.com has an interesting story on the up-and-coming silicon replacements, ranging from DNA to a little molecule called thiol to using atoms in a quantum state. Obviously, these are a long way from being your next desktop, but an interesting article nonetheless."
In other news... (Score:2, Funny)
In other news, astronauts have discovered a new use for tang.
Re:In other news... (Score:1)
Computing Beyond Silicon (Score:2, Informative)
Re:Computing Beyond Silicon (Score:1)
What I find truly amazing (Score:4, Interesting)
When the claim comes up that someday we will use biological computers, custom-grown neurons that will do calculations for us and grow beyond our own puny brains, I can only nod my head in agreement. Our hardware can't be that difficult to figure out; we've got the raw components, we just need to know the schematics.
It's only a matter of time.
Re:What I find truly amazing (Score:3, Informative)
We don't have a 3-5 pound computer sitting in our heads. We have a 3-5 pound brain emulator sitting on our desk.
The point of the computer (originally) was to do complex tasks that took the human brain too much time. It does slave-like replication. It's an emulation of something we can already do, in theory.
Furthermore, the human brain is far from puny. We have 10^15 synapses, which is far more connections than there are genes in our genome, or even stars in the galaxy. 10^15 is an incredibly large number to imagine. A synapse is a neuronal connection. A data transfer point.
I urge the above poster to consider the fact that life and thought have been debated for thousands of years. We *can* be so difficult to figure out.
Re:What I find truly amazing (Score:1)
Re:What I find truly amazing (Score:1)
Re:What I find truly amazing (Score:1)
Re:What I find truly amazing (Score:2)
Imagine if Lenin had lived for 1000 years.
I like Sagan's observation that a species that could live for millennia (assuming they're in bodies like ours and not distributed) would be far more cautious because they'd have far more to lose. An acceptable risk for us would be a nightmare to them.
Re:What I find truly amazing (Score:1)
Re:What I find truly amazing (Score:1)
Re:What I find truly amazing (Score:1)
Re:What I find truly amazing (Score:3, Funny)
I'd never buy one of those for production, maybe for fun.
Re:What I find truly amazing (Score:1, Funny)
Imagine having to do that each time you wanted to boot your PC.
Re:What I find truly amazing (Score:1)
Re:What I find truly amazing (Score:1)
Re:What I find truly amazing (Score:1)
....sure, but try recompiling the kernel!
Re:What I find truly amazing (Score:1, Interesting)
obviously... (Score:2)
I have to reboot every 16 hours or so.
Re:What I find truly amazing (Score:2)
Yeah, and just try computing factorial 200 with your brain, or solving systems of 5000 equations, or indexing the web.
The brain and computer hardware are two different things, each with its own strengths.
Re:What I find truly amazing (Score:1)
And just look at what most people do with theirs. I can see the fine print now: Luser, PHB and Evil modes installed by default.
-- D
Ceci n'est pas une sig.
Go go gadget Eniac!!! (Score:2)
Man, how far have we progressed? How far do we have to go? This is some cool stuff, wonder how long before I'll be able to walk down the street and tap the street sign to get directions, locate restaurants, etc.....hmmm...come to think of it....
I am applying for a patent on this new found technology. I'll call it National Universities Technology Streetside Assistance Center, or NUTSAC for short.
Yes, soon I will take over the world, one stretch of dirty litter-ridden pavement at a time!!!! MUWHAHAHAHAHA!!!!!!
Encryption? (Score:1, Interesting)
I find it interesting that the problems that this new research is solving are changing along with the solutions.
Re:Encryption? (Score:1)
Re:Encryption? (Score:1)
Quantum Computing
A method of computing that encodes data in the superposition of states of quantum bits. When this superposition collapses, the answer falls out (more or less).
This technique can be used to solve the Integer Factorisation Problem. If that becomes practical, public-key cryptography based on factoring (such as RSA) is useless.
Quantum Cryptography
A technique of sending single photons oscillating (polarized) in different directions. This is secure because it is not possible to determine the direction of oscillation of a photon without changing its state (i.e. it is impossible to eavesdrop without being detected).
This is a means of ensuring secure two-party communications.
If Quantum Computing were viable today, it would invalidate all existing public-key encryption systems. The good news is that Quantum Cryptography would then be available to replace them.
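To make the factoring threat concrete, a toy sketch in Python (textbook-sized numbers; the trial-division "factoring oracle" stands in for a quantum computer running Shor's algorithm on a real key):

import math

def factor(n):
    # Stand-in for a fast factoring oracle; naive trial division only
    # works here because the toy modulus is tiny.
    for p in range(2, math.isqrt(n) + 1):
        if n % p == 0:
            return p, n // p
    raise ValueError("no factor found")

n, e = 3233, 17               # toy public key (p=61, q=53 known only to the owner)
p, q = factor(n)              # the eavesdropper factors the public modulus...
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)           # ...and derives the private exponent (Python 3.8+)

ciphertext = pow(65, e, n)    # someone encrypts the message 65
print(pow(ciphertext, d, n))  # the eavesdropper recovers 65

Once factoring is cheap, deriving the private key is just the same arithmetic the legitimate owner does.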
David Jackson
--
Incorrigible punster - Do not incorrige.
Notes on quantum computing... (Score:5, Informative)
The idea is to have a bit that can be a 1 and a 0 at the same time. This means that with 50 such bits, called qubits, you can represent every number from 0 up to about 10^15 (2^50), all at the same time.
What's really cool is that with this you can use what's called a bogo sort. Imagine a deck of cards that has been shuffled. To sort them in order, most people would go through them one by one and put some in front and some in back. A bogo sort creates a new universe and then throws the cards into the air. If they land in order, great; else, destroy the universe.
All these universes are created at the same time, making it 1 step to sort 52 cards. Like I said, it's interesting.
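For reference, the classical single-universe version just shuffles and retries instead of destroying universes; a quick sketch:

import random

def is_sorted(cards):
    return all(cards[i] <= cards[i + 1] for i in range(len(cards) - 1))

def bogosort(cards):
    # "Throw the cards into the air" until they happen to land in order.
    # Expected running time is O(n * n!), so keep the deck small.
    cards = list(cards)
    while not is_sorted(cards):
        random.shuffle(cards)
    return cards

print(bogosort([7, 3, 5, 1]))   # [1, 3, 5, 7]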
Re:Notes on quantum computing... (Score:1)
Re:Notes on quantum computing... (Score:2)
Re:Notes on quantum computing... (Score:1)
Wouldn't the same argument apply to the recipient?
By trying to read the message, would you alter it?
Re:Notes on quantum computing... (Score:1)
If there is only one transmitter and one receiver, the data only flows one way; when another receiver is added, it changes the message because it's there. It's difficult to explain, but you should find some good books listed on the website above to help you out.
Re:Notes on quantum computing... (Score:1)
Re:Notes on quantum computing... (Score:1)
Up, down, top, bottom, strange and charmed.
This enables you to represent 1 and 0 at the same time (but not 1, 0 and -1, or whatever you'd have in a base-3 system), so you can still encrypt.
Re:Notes on quantum computing... (Score:2)
Quantum Computing [orst.edu]
I agree (Score:1)
Creating universes is all fun, but bollocks.
1: A quantum computer using the oscillation of molecules: each bit adds a resonance frequency to the molecule, so a wiggle on the input can produce lots of wobbles on the output, giving you lots of results at the same time. E.g. you can add, subtract, AND, OR, NOT, multiply and divide all at the same time. (Not quite a load of universes, more a sea of operations with the results being the waves.)
2: Using the spins of things.
This gives you a very efficient base-3 system, but probably doesn't do any magical inter-universe stuff. The problem here is that it's more or less unusable on its own, so you apply the technique across molecules (see point 1).
Re:Notes on quantum computing... (Score:1)
Re:Notes on quantum computing... (Score:1)
Re:Notes on quantum computing... (Score:1)
The future belongs to various flavors of AI and evolved solutions, with the rare human playing the role of big picture conductor.
--
Quantum computers *require* classical components (Score:1)
What you've said is sort of true, but more importantly, quantum computers must contain classical components to operate. A quantum computer will have silicon like a skeleton supporting its foamy quantum flesh.
Take, as an example, simple quantum error correction (which is much more important in quantum computers than in classical ones): it must involve classical routines such as syndrome detection, and these are critical for the basic storage and movement of quantum information, which otherwise decoheres in picoseconds. Basically, you'll have classical error-correction subroutines running like arteries around the qubits as they flow through the computer. So you can bet that silicon will run throughout the quantum computer.
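A purely classical toy of that syndrome-detection bookkeeping, using the 3-bit repetition code (a real quantum code measures these parities without ever reading the data bits themselves, which a classical sketch can't show):

def syndrome(bits):
    # Two parity checks locate a single flipped bit in a 3-bit codeword.
    return bits[0] ^ bits[1], bits[1] ^ bits[2]

def correct(bits):
    flip_at = {(1, 0): 0, (1, 1): 1, (0, 1): 2}   # syndrome -> flipped position
    s = syndrome(bits)
    if s in flip_at:
        bits[flip_at[s]] ^= 1
    return bits

print(correct([0, 1, 0]))   # middle bit got flipped; prints [0, 0, 0]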
Shor's quantum algorithm for factoring integers is based on period finding, and it has many, many classical parts.
How do you pick the result? (Score:1)
Unfortunately, like everybody else's descriptions I've seen so far, you left out one minor detail: how do you pick the right result? With your card-sorting example, I would end up with ca. 8*10^67 results simultaneously. Which is the right one? Do I go through them one by one? Do I build another quantum computer to check them all at once?
Can somebody please explain?
Re:How do you pick the result? (Score:1)
Re:This is bullshit (Score:1)
I only know what I've read, and that is documented; if my sources are wrong, yell at them, not me.
One major benefit of silicon (Score:2, Interesting)
Anything based on a more delicate technology like DNA (which is only stable over a narrow temperature range) would need some kind of homeostatic enclosure. Potential candidates are cheaply available.
That isn't a herd of cows, that's the Pentagon's decryption engine farm. And Bin Laden's successors are really going to have to worry about what the cockroaches might be up to, especially if Dolly the supercomputing sheep is better at processing operational intelligence than the FBI and the CIA - not, on the face of it, difficult.
Output (Score:1)
Future Computer (Score:2)
There have been many mentions of "future computers" involving "DNA" or "Molecular Structures" and so on.
Well
Aren't we made up of "molecules" ?
Don't we have "DNA" ?
Methinks the REAL future computer will be the DRASTICALLY ALTERED HUMAN BEING (if they can still be called "humans"), with their molecular structures perfectly aligned to carry out not only bodily functions but also computational needs, their DNA becoming an ultra-computational device.
Yep, that may sound like sci-fi, but who knows?
Instead of the computer being out of body (wearable, or whatever), the future computers will be THE BODY WE HAVE - no need to wear anything or carry any batteries.
Re:Future Computer (Score:1)
'Scuse me for a minute. I really gotta take a memory leak.
Re:Future Computer (Score:2)
And how about "memory dump" ?
:)
Think about "sanitation engineers" suddenly becoming "memory dump specialists"?!
Our bodies DO compute (Score:1)
Every second of every day, our genetic material and its supporting machinery regulate an unimaginable number of complex chemical pathways by carrying out the entire range of sensing, analysis and control. If they didn't, we'd just be a mush of amino acids.
The machinery that regulates chemical processes in our bodies is an inherent part of the processes themselves. In fact, it's productive and enlightening to think of biological systems as computational, and of the chemical processes within them as algorithms. Researchers like Prof. Erik Winfree [caltech.edu] at Caltech are beginning the difficult process of applying this insight in research. [caltech.edu]
Due to the difference between the environment that DNA computers require and the environment supported by the modern infrastructure we have built for computing, the type of DNA computers studied in today's laboratories will never replace the silicon chip. Also, unlike quantum computing, DNA computing does not offer exponential growth in computing power with the number of elements used. However, DNA computing may find a niche in bioinformatics by offering a way to probe, analyze and ultimately control complex biological processes in vitro.
Hence, research into DNA computing may offer us a way to understand, interact with, and ultimately control nature's algorithms in biological systems.
The challenge for computation over the next century is to overcome the barriers to shrinking circuit sizes in conventional computers, to create practically useful quantum computers, to apply conventional and quantum computers alongside experimentation to understand the role of computation in complex processes (notably biological systems), and to use the understanding gained to create a unified architecture for computation that will allow us to embed synthetic algorithms into every complex dynamic system we design and create, extending our control to the atomic level. When that happens, nanotechnology [foresight.org] will finally fulfill its promise.
Stephen Wolfram [wolframscience.com], Erik Winfree [caltech.edu], Hideo Mabuchi [caltech.edu], Jeff Kimble [caltech.edu], John Preskill [caltech.edu], Bill Goddard [caltech.edu], and Isaac Chuang [stanford.edu] are leaders on the bleeding edge of computation. There are many, many others I don't know about.
On that note, I will end my foray into wild speculation.
Not to say that DNA won't have other niches (Score:1)
Not to say that there won't be other niches [slashdot.org]. What I mean is that there will be huge issues with reliability, durability, and interfacing if one ever tries to replicate the functionality of silicon chips with DNA computing based on base-pair recombination. It is probably easier to shrink the circuit density, power consumption and manufacturing cost of conventional circuits to scales comparable to those of computing with DNA base-pair recombination. It's possible that the materials we would use to make those circuits would be chemically similar to DNA, but the way they compute would be closer to electrical or magnetic switching than to base-pair recombination.
"mentat"! (Score:1)
In reality, however, I doubt if we are ever going to overcome the human & legal issues that would certainly arise from creating a race of supercomputing humans just to use them as our calculators. Or something.
Re:Future Computer (Score:1)
Where's the thiol/nanotube based FPGA? (Score:3, Interesting)
On a totally separate note, I thought the DNA experiment about the party guests was a bit suspicious. I've written GRE study guides in the past and so I've spent quite a bit of time analyzing those kinds of analytical questions. From a test writer's perspective, their experiment raises some interesting issues. The GRE frequently uses seven or more entities with special requirements in the analytical section and most of the questions can be solved with a piece of paper and pencil in a few minutes using simple logic. If that wasn't the case, then how would the test writer be sure what the correct answer is if they couldn't verify it?
So, if they've got all these special-case situations with perhaps dozens of variables for each party goer, then how do they know what the right answer is, and that there is not more than one right answer --the bane of test writers? And if they do know how to accurately calculate this data, then is it really as complicated as they make it seem?
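For what it's worth, a brute-force check is one way a test writer could verify there's exactly one answer. A toy puzzle in Python (the guests and rules are invented, not the experiment's actual constraints):

from itertools import permutations

guests = ["Ann", "Bob", "Cam", "Dee", "Eli", "Fay", "Gus"]

def valid(seating):
    pos = {name: i for i, name in enumerate(seating)}
    return (abs(pos["Ann"] - pos["Bob"]) == 1      # Ann sits next to Bob
            and pos["Cam"] < pos["Dee"]            # Cam sits left of Dee
            and pos["Eli"] == 0                    # Eli takes the first seat
            and abs(pos["Fay"] - pos["Gus"]) > 1)  # Fay and Gus aren't adjacent

# 7! = 5040 arrangements is nothing to enumerate; if the count isn't 1,
# the test writer tightens the rules until it is.
solutions = [s for s in permutations(guests) if valid(s)]
print(len(solutions))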
Organic, then silicone then saline back to silicon (Score:4, Funny)
I thought saline was the newest thing? But hey, if they can get 'em to run quake, I'm all for it. Oh the wonders of technology! . . . Maybe I missed the point . . .
Re:Organic, then silicone then saline back to sili (Score:1)
I thought saline was the newest thing? But hey, if they can get 'em to run quake, I'm all for it. Oh the wonders of technology!
In this case, anything that gets you closer to the hardware will give a better gaming experience.
Whatever next ... force feedback?
Errata (Score:1, Informative)
Vote (Score:1)
Re:Vote (Score:1)
Website (Score:2)
Quantum Computing [orst.edu]
I know, I posted this in my thread too. Oh well, I have karma to burn. Besides, I feel this is important enough to bring it to the top.
Shift happens (Score:2)
Any other books you'd recommend on the subject?
Um, Did Anyone... (Score:1)
I'd hate to think such innovations are right around the corner, only to find they haven't been approved.
According to Frink (Score:1)
Meh (Score:2)
Who's going to write the software? (Score:3, Insightful)
The most pressing limitation of current computing, to my mind, is the software we have available. Either it has bugs in it, or it doesn't quite let us do what we want, or the user interface is clunky and unintuitive.
Ideally, we'd like computers to work out what we are really trying to do. There are some tasks that can be described in just a few words of English, yet to write a script that current computers could understand would be a significant undertaking.
I remember being impressed the first time I used MacDraw and found that if I duplicate a shape, drag it to a new position and duplicate it again, the next shape automatically appears in an analogous position. But this is just one tiny little example of a program being a bit intuitive and helpful. There are millions of other things programs could do like this, but so few are actually implemented.
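A guess at the heuristic behind that behaviour (the class and method names below are invented for illustration, not Apple's): remember the offset the user gave the last duplicate and reuse it for the next one.

class Canvas:
    def __init__(self):
        self.last_offset = (10, 10)    # default nudge for a fresh duplicate

    def duplicate(self, shape):
        dx, dy = self.last_offset
        return {"x": shape["x"] + dx, "y": shape["y"] + dy, "source": shape}

    def move(self, shape, x, y):
        # If the user repositions a duplicate, learn the offset for next time.
        if "source" in shape:
            src = shape["source"]
            self.last_offset = (x - src["x"], y - src["y"])
        shape["x"], shape["y"] = x, y

canvas = Canvas()
a = {"x": 0, "y": 0}
b = canvas.duplicate(a)     # appears at (10, 10)
canvas.move(b, 50, 0)       # the user drags it; the offset becomes (50, 0)
c = canvas.duplicate(b)     # appears at (100, 0), the "analogous position"
print(c["x"], c["y"])

A tiny rule like that is all it takes, which is what makes the scarcity of such touches so frustrating.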
Advances in computer hardware make it more possible to run complex AI algorithms in a short time, but someone has still got to write those algorithms. I think currently there is a bigger gap between the software we want and the software we have than there is between the hardware we want and the hardware we have.
Fact from hype? (Or the New New New Thing?) (Score:2)
The money may come in [redherring.com], but the market has to correct sometime.
I predict a "nanotechnology" version of the web economy bullshit generator [dack.com] in the not-so-near future!
Dot-con business plans were hard enough to understand; I can only imagine how badly these nanotech ones will read...
Classical Balls (Score:1)
Even our most sophisticated machines are based on classical physics, as the electrons are simply shunted around a maze of transistors. Technically it would be possible to rebuild any current CPU using balls and rails in place of bits and circuits.
Thiol-based molecular transistors questionable (Score:2)
The thiol-based systems are the same ones from Bell Labs that are being questioned [slashdot.org] as based on potentially fraudulent data.
Re:Fraudulent is such an ugly word (Score:1)
In this case, I'm afraid not - but the jury is still out.
from an article in Science:
Pioneering Physics Studies Under Suspicion
Officials at Bell Laboratories, the research arm of Lucent Technologies in Murray Hill, New Jersey, are forming a committee of outside researchers to investigate questions about a recent series of acclaimed scientific studies. Outside researchers presented evidence to Bell Labs management last week of possible manipulation of data involving five separate papers published in Science, Nature, and Applied Physics Letters over 2 years.
The papers describe a series of different device experiments, but physicists are voicing suspicions about the figures, portions of which seem almost identical even though the labels are different. Particularly puzzling is the fact that one pair of graphs show the same pattern of "noise," which should be random.
The groundbreaking papers include Bell Labs physicist Jan Hendrik Schön as lead author and his colleagues at Murray Hill and elsewhere as co-authors. Schön is the only researcher who co-authored all five papers in question. Everyone involved agrees that the questions need further investigation, but many fear that the impact could be devastating for Bell Labs and for solid state physics. Schön told ScienceNOW that he stands behind his data, and he says it's not surprising that experiments with similar devices produce similar-looking data.
Schön, who joined Bell Labs in 1998, has worked most closely with former Bell Labs physicist Bertram Batlogg--now at the Swiss Federal Institute of Technology in Zurich--and Bell Labs chemist Christian Kloc. His work has focused on efforts to make novel types of transistors using organic materials. He was the lead author on at least 17 papers in Science and Nature in the last 2.5 years.
Until this week, many physicists believed the impressive string of results was worthy of consideration for a Nobel Prize, although other groups have reported no success in reproducing Schön's most striking results. Last week, several physicists began to present their doubts to company managers. Bell Labs spokesperson Saswato Das says that company officials take the concerns "very seriously." Within hours of hearing of them on 10 May, Das says that Lucent management decided to form an external review panel chaired by Stanford University physicist Malcolm Beasley. Das says, "The panel will be given full freedom to make an independent review of concerns that have been raised." Physicist Paul McEuen of Cornell University, one of the first to question the data openly, says that Lucent is taking the right step: "Malcolm Beasley has great stature in the community. ... Everybody wants to get to the truth."
Each system should have its own design style (Score:1)
VLSI designers can't conceive of a world where you don't have a global clock, or where a transition can take a varying amount of time.
Each silicon replacement should have its own design style rather than try to reuse the things that worked for silicon.
Dear Popular Science Webmaster: (Score:3, Funny)
I'm so tired of those "old school" web pages that use a readable font like the default 10pt Times. I love it so much when I get the opportunity to read an article in a minuscule 6-point sans-serif font in a narrow column that takes up about a fifth of the width of the screen. I'm tired of all these websites that actually flow text to the size of the window I've chosen. It's so refreshing to have all that nice white space.
And I hate those sites that actually put the related content on one page. It's time more webmasters realized how much I appreciate having an article arbitrarily split into seven different pages. And it's so nice of them to save the screen space taken up by those pesky "Next Page" buttons. I really enjoy clicking on those tiny page numbers to flip pages. I thought for a minute that they'd made a mistake and that red rectangle image with the ">" symbol was the page flipper, but after clicking it about ten times it's apparent it doesn't do anything. Phew, that was close.
It's a good thing it was split up into many pages; I was really looking forward to seeing that insightful poll question "Will the Segway change transportation? Yes/No/Maybe." I thought I'd only get to see it once, but instead it was on each page, in case I missed it the first six times. Well done!
Now, usually most webmasters go soft and have a "print this" link that shows the entire article text in the default font, wrapped to the screen size. Popsci does include this link, but they get it! They realize that should I wish to print an article, I don't want to print the whole thing at once. Rather, I enjoy clicking the "print this" link on each page and sending off a different print job for each page. After all, why should my printer driver decide where to break up page boundaries? Is that really its job? Why would I possibly want to have all the article text in one place?
Finally, a webmaster that "gets it"!
Using DNA Computing to solve hard (NP) problems (Score:2, Informative)
There are two compelling advantages to using molecular biology to solve computational problems. Firstly, DNA has a much greater information density than almost any other medium: using DNA it is possible to store data in a trillion times less space than with an electronic computer. Presently, it is possible to contain 10^21 DNA molecules in less than 1 litre of water (with each molecule encoding potentially 400 bits of information).
Secondly, biological operations performed on DNA are massively parallel. All operations that are executed are performed on each strand of DNA simultaneously.
Adleman's experiment encoded a 7-node Hamiltonian Path Problem. Each node of the graph was encoded as a random 20-base strand of DNA, and these were randomly annealed into long potential 'paths' through the graph. Paths were selected (and extracted) based on the length of the strand, which nodes were encoded in the strand, and whether the path encoded all nodes. At the end of this selection process, the remaining strand(s) should encode a Hamiltonian path through the graph.
As can be imagined, this experiment has potentially huge consequences for large computational problems.
The problem remains to make this process reliable.
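For comparison, here is the same kind of search done classically by brute force (the small directed graph below is an invented example, not Adleman's actual instance). The test tube generates and filters all the candidate paths in parallel; a silicon machine has to enumerate them one by one.

from itertools import permutations

edges = {(0, 1), (0, 2), (2, 1), (1, 3), (2, 3), (3, 4), (4, 5), (2, 5), (5, 6), (1, 6)}
nodes, start, end = range(7), 0, 6

# Keep only orderings that begin at start, finish at end, and follow real edges,
# mirroring the length/start/end/"contains every node" selection steps above.
paths = [p for p in permutations(nodes)
         if p[0] == start and p[-1] == end
         and all((p[i], p[i + 1]) in edges for i in range(6))]
print(paths)   # [(0, 2, 1, 3, 4, 5, 6)] for this particular graph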
David Jackson
--
Incorrigible punster - Do not incorrige.
Re:Using DNA Computing to solve hard (NP) problems (Score:2, Informative)
Anyway, what you described is perfectly true. NP-complete problems and other 'hard' problems can scale exponentially in time on a normal computer, but linearly in time on a DNA computer due to the massively parallel nature of working with DNA. Unfortunately, however, the mass of DNA required to represent every possible answer in the one glass beaker simultaneously grows exponentially instead.
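A rough back-of-the-envelope sketch of that blow-up (my assumptions: Adleman-style 20-base node encodings, roughly 330 g/mol per nucleotide, and one strand for each of the n! candidate orderings):

from math import factorial

AVOGADRO = 6.022e23
GRAMS_PER_BASE = 330 / AVOGADRO   # approximate mass of one nucleotide in a strand
BASES_PER_NODE = 20               # Adleman-style 20-base node encodings

def grams_of_dna(n_nodes):
    # Mass needed if one strand encodes each of the n! candidate orderings.
    strand_mass = n_nodes * BASES_PER_NODE * GRAMS_PER_BASE
    return factorial(n_nodes) * strand_mass

for n in (7, 20, 30, 40):
    print(f"{n} nodes: ~{grams_of_dna(n):.1e} g of DNA")
# 7 nodes needs less than a femtogram; 40 nodes already needs tens of Earth masses.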
Even Adleman realises that DNA computers won't be able to outperform electronic computers unless some radically different algorithms can be found. From his research website, he doesn't feel that it was a waste of time, since the principles learned in trying to develop DNA computers may be applicable to other areas of DNA research. This view is also held in microfluidic computing research - see P NATL ACAD SCI USA 98 (6) 2961-2966 for the discussion.
And now I've written more for slashdot on the subject than for my supervisor. Good work, eh?
article in SciAm... (Score:1)
Obviously written three weeks ago... (Score:1)
Natural States of an Electron (Score:1)
alternative digital logic technologies (Score:1)
This PopSci article covers quantum computation, DNA computation, and molecular electronic digital logic devices built of nanotubes, thiol, and DNA.
I wrote an article [canonical.org] on this same topic last month. It's almost exactly the same length as the PopSci article, but it covers a broader range of topics: all of the above (except for thiol), but also inkjet-printed semiconductors, rod logic, buckling-spring logic, optical computing, spherical integrated circuits, fluidics, and Josephson junctions. It's also a little less confused than the PopSci article.
However, its style is not nearly as engaging, and I didn't interview any researchers for it, so it's limited by my own limited knowledge.
I hope you find it interesting.
I never trust predictions (Score:2)
Somehow I don't think we can begin to comprehend what we will see in the future. Try telling someone from 1995 that in 5 years he would be able to buy a 30 gig hard drive for $100, and that in 7 years 80 gig HDs would ship in computers. Try telling someone from 1990 that in 10 years computers would run standard at 1 GHz. Or try telling someone from 1970 that in 30 years he would be able to hold a supercomputer in his lap and it would weigh about 5 pounds.
Interesting but for a true Pocket PC there must be (Score:1)
memory functions in one rewritable storage device like volume holographic optical storage. That is: small, with extreme data transfer bandwidth, colossal amounts of memory, and non-volatile.
GaAsFET (Score:1)
First computer? (Score:1)
Colossus [umass.edu]
Atanasoff-Berry Computer [ameslab.gov]