
Understanding an AI's Timescale

An anonymous reader writes "It's a common trope in sci-fi that when AIs become complex enough to have some form of consciousness, humans will be able to communicate with them through speech. But the rate at which we transmit and analyze data is infinitesimal compared to how fast a computer can do it. Would they even want to bother? Jeff Atwood takes a look at how a computer's timescale breaks down, and relates it to human timeframes. It's interesting to note the huge variance in latency. If we consider one CPU cycle to take 1 second, then sending a ping across the U.S. would take the equivalent of 4 years. A simple conversation could take the equivalent of thousands of years. Would any consciousness be able to deal with such a relative delay?"
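For a feel of the arithmetic behind those figures, here is a minimal sketch that rescales a few typical latencies so that one CPU cycle corresponds to one second. The specific constants (a ~0.3 ns cycle, a ~40 ms coast-to-coast round trip, and the rest) are assumptions in the spirit of Atwood's table, not authoritative measurements:

```python
# Back-of-envelope: rescale machine latencies so that 1 CPU cycle == 1 second.
# All figures below are rough assumptions; real values vary by hardware and
# network path.

CPU_CYCLE_S = 0.3e-9          # ~0.3 ns per cycle on a ~3 GHz CPU
SCALE = 1.0 / CPU_CYCLE_S     # multiply real seconds by this to get "human" seconds

latencies = {
    "One CPU cycle":             0.3e-9,
    "Main memory access":        120e-9,
    "SSD random read":           150e-6,
    "Rotational disk seek":      10e-3,
    "Ping across the U.S.":      40e-3,   # round trip, coast to coast
    "One spoken word (~0.5 s)":  0.5,
    "A one-minute conversation": 60.0,
}

def humanize(seconds: float) -> str:
    """Render a duration in the largest sensible unit."""
    for unit, size in [("years", 365 * 24 * 3600), ("days", 24 * 3600),
                       ("hours", 3600), ("minutes", 60), ("seconds", 1)]:
        if seconds >= size:
            return f"{seconds / size:,.1f} {unit}"
    return f"{seconds:.3g} seconds"

for name, real_s in latencies.items():
    print(f"{name:28s} -> {humanize(real_s * SCALE)}")
```

With these assumptions, the cross-country ping works out to roughly four years and a one-minute conversation to several thousand years, which is where the summary's numbers come from.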
This discussion has been archived. No new comments can be posted.


  • by sandbagger ( 654585 ) on Saturday May 17, 2014 @12:42PM (#47026027)

    I hope they are nice to us.

  • by SJrX ( 703334 ) on Saturday May 17, 2014 @12:47PM (#47026071)
    One CPU cycle as one second might be a good metaphor for computer memory, but not for AI. A CPU cycle is closer to the equivalent of a single neuron firing in the human brain than it is to one second of human time. Human speech takes more than one neuron firing to produce, and it would take far more than one CPU cycle to process. An AI algorithm processing and analyzing data would most likely take millions or billions of cycles to do even the most basic things. While speech recognition has no doubt gotten much faster, it is still, and probably always will be, a massive undertaking for a CPU, as opposed to, say, adding two 32-bit integers.
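A rough back-of-envelope in the spirit of the comment above; every constant here is a ballpark assumption chosen only to illustrate the scale mismatch:

```python
# Rough comparison of "one neuron firing" vs. "one CPU cycle".
# All figures are ballpark assumptions, not measurements.

CPU_HZ = 3e9               # ~3 GHz clock, ~0.33 ns per cycle
NEURON_SPIKE_S = 1e-3      # a spike plus recovery is on the order of milliseconds

cycles_per_spike = CPU_HZ * NEURON_SPIKE_S
print(f"CPU cycles elapsed during one neuron spike: ~{cycles_per_spike:,.0f}")
# -> roughly 3,000,000 cycles per spike: a single cycle is far "smaller"
#    than any biologically meaningful event.

# Recognizing speech is likewise nothing like one cycle. Assume
# (hypothetically) an acoustic model needing ~1e9 arithmetic operations per
# second of audio, and a CPU sustaining ~1e10 operations per second:
OPS_PER_SECOND_OF_AUDIO = 1e9    # assumed model cost
CPU_OPS_PER_S = 1e10             # assumed sustained throughput
print(f"Seconds of CPU work per second of speech: ~{OPS_PER_SECOND_OF_AUDIO / CPU_OPS_PER_S:.2f}")
```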
  • by Anonymous Coward on Saturday May 17, 2014 @12:49PM (#47026093)

    No task can be accomplished in a single CPU cycle.

    A human can actually do something in a second, like move or talk.

  • The wrong question (Score:2, Insightful)

    by Anonymous Coward on Saturday May 17, 2014 @12:50PM (#47026099)

    To a computer, time is meaningless; you can 'suspend' a program and resume it, pop data onto a stack and pull it back later. It doesn't 'age', and there's no lifespan; in fact, even if hardware from 30 years ago completely dies, I can load its software into an emulator. I turned on a computer from 30 years ago. It runs just fine; it can even connect to the internet.

    Furthermore, a consciousness in a computer would have to deal on these timescales in order to survive and be meaningful to us; such an intelligence that didn't learn to deal on these timescales would not survive (think maintenance intervals on machines, shutdowns at night/weekends, etc.). So sure, it may 'exist', and even last for billions of its cycles, but if it cannot persist past these thresholds, it's irrelevant; much like an animal in a tidal pool that dies before the tide comes back. The ones that made it past that were our ancestors.

  • by Jane Q. Public ( 1010737 ) on Saturday May 17, 2014 @01:01PM (#47026191)
    OP's entire premise is pretty thin.

    Human beings perceive light, for example. (They can also perceive electricity, to a degree, but that is not as relevant to the point.)

    But while a human being might perceive that a flashlight at night has been shined his/her way, that light takes roughly the same amount of time to arrive as a fiber optic signal sent from the same distance. So what?

    Generally, it is the speed of perceiving and interpreting the signal that takes time, not the speed of its propagation. We communicate at lightspeed, too. Or close to it. Anybody who has had a video chat has done that. Did that make you superintelligent?

    We have never built an "AI". And in fact we have NO reason to believe -- no evidence whatsoever -- that its speed of perception and interpretation would be any faster than our own. There is a very good chance that it would be much slower... at least in the beginning.

    I would like to remind people that the idea of "intelligent" machines has been around for almost 100 years now. AND we still don't have any solid evidence of being close to achieving such a thing. Sure, computers can do a lot, and what they DO accomplish, they tend to do very fast. But what they accomplish is not "AI". Even Watson is not "intelligence", it is only the illusion of it.
  • by Anonymous Coward on Saturday May 17, 2014 @01:27PM (#47026349)

    Not only that, but trying to relate to individual CPU cycles is absurd. It's not like our minds execute in a stream of arithmetic operations, but rather a complex parallel network of signals. It might take billions or trillions of CPU operations to emulate all the stuff that happens in one "instant" in the brain. A more reasonable cycle comparison might be to compare macro-scale wavefront propagation in the brain (i.e. brain waves) and global synchronization in large-scale supercomputers (i.e. single iteration time of an MPI-based fluid dynamics simulation or other large-scale 3D mesh problem). Even then, I am not sure how many orders of magnitude we need to increase the size of the MPI problem before the per-cycle complexity starts to approximate the signal-processing of the entire nervous system.

    But all that aside, we have historically had many people who worked in relative isolation. Many artists, poets, philosophers, and scientists have had great symbiotic relationships with nothing more than the occasional letter or other work (papers, poems, paintings, sculptures) exchanged over great distances and latencies. Love affairs have been carried out with little more than a furtive glimpse and a series of notes sent through back channels...
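As a hedged back-of-envelope for the macro-scale comparison suggested in the first paragraph above (one period of a slow brain oscillation versus one cluster-wide synchronization step); both constants are assumptions picked purely for illustration:

```python
# Back-of-envelope: a macro-scale "cycle" of the brain (one period of a slow
# oscillation) vs. one global synchronization step on a large cluster.
# Both numbers are rough assumptions, not measurements.

ALPHA_WAVE_PERIOD_S = 0.1      # ~10 Hz oscillation -> ~100 ms per period
MPI_ALLREDUCE_S = 50e-6        # assumed ~50 microseconds for a cluster-wide allreduce

ratio = ALPHA_WAVE_PERIOD_S / MPI_ALLREDUCE_S
print(f"Cluster-wide sync steps per brain-wave period: ~{ratio:,.0f}")
# -> on the order of a few thousand; the open question raised above is how
#    much work each such step would have to represent.
```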

  • by phantomfive ( 622387 ) on Saturday May 17, 2014 @01:53PM (#47026535) Journal

    Generally, it is the speed of perceiving and interpreting the signal that takes time, not the speed of its propagation. We communicate at lightspeed, too. Or close to it. Anybody who has had a video chat has done that. Did that make you superintelligent?

    Another way of looking at it: have you ever sent someone a letter, then waited a long time to receive a response? Did you nearly die from the excruciating pain of not having the response, or did you do something else until the response came?

    Most likely you are highly skilled at carrying on multiple conversations at different speeds.

