A Look Back At Kurzweil's Predictions For 2009

marciot writes "It's interesting to look back at Ray Kurzweil's predictions for 2009 from a decade ago. He was dead on in predicting the ubiquity of portable computers, wireless, the emergence of digital objects, and the rise of privacy concerns. He was a little optimistic in certain areas, predicting the demise of rotating storage and the ubiquity of digital paper a bit earlier than it appears it will actually happen. On the topic of human-computer speech interfaces, though, he seems to be way off." And of course Kurzweil missed 9/11 and the fallout from that. His predictions might have been nearer the mark absent the war on terror.
This discussion has been archived. No new comments can be posted.


  • Civil Liberties (Score:5, Insightful)

    by Kinky Bass Junk ( 880011 ) on Tuesday January 06, 2009 @12:50AM (#26339391)

    And of course Kurzweil missed 9/11 and the fallout from that. His predictions might have been nearer the mark absent the war on terror.

    His prediction on civil liberties might not have been so true if 9/11 never happened.

    • by Anonymous Coward

      Kurzweil may not be as far off on the human-computer speech interfaces as you might first think. It's currently focused on a narrow domain right now: automated telephone systems, which are pretty much all voice activated these days.

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      That is utter bullshit. Civil liberties have been going downhill for a long long time. His prediction on civil liberties was already true before 9/11.

    • Re: (Score:3, Funny)

      by RuBLed ( 995686 )

      His prediction on civil liberties might not have been so true if 9/11 never happened.

      Kurzweil had 4chan predicted to counter that.

  • So, basically (Score:5, Insightful)

    by Junior J. Junior III ( 192702 ) on Tuesday January 06, 2009 @12:54AM (#26339421) Homepage

    Kurzweil has a really good handle on where hardware will be, but not software. What I believe this means is that what drives the creation of software is not how quickly it can be developed, but whether there's demand for it.

    Demand and innovation are a lot trickier to predict than advances in speed and miniaturization of electronics hardware, so what we thought our future selves might want in 2009 isn't actually quite what it turns out we actually wanted.

    Kurzweil thinks speech interface is where it's at, but the world gives us Twitter and Facebook.

    Kurzweil wants to use technology to make us immortal or give rise to machines that supersede humankind and take the next evolutionary step as a technological rather than biological one. Meanwhile, people want to make money, get laid, watch stupid video clips, listen to music, and act like their opinion is the best thing there's ever been.

    So... Where'll we be in the future? Watch Idiocracy.

    • Re:So, basically (Score:4, Interesting)

      by Spaseboy ( 185521 ) on Tuesday January 06, 2009 @01:34AM (#26339639)

      People have never really wanted a speech interface, it's been around FOREVER and has not taken off even when it's quite good.

      • Re:So, basically (Score:5, Interesting)

        by Junior J. Junior III ( 192702 ) on Tuesday January 06, 2009 @01:46AM (#26339715) Homepage

        Right. Kurzweil thinks they're awesome, in part I believe because he sees it as an incremental stepping stone to developing machines that think. In real life, users get tired after talking for a long time. Imagine how hoarse you'd be if you had to talk to a computer all day long in order to dictate a Word document, launch apps, navigate the interface, etc.

        Pointers and keyboards are far more efficient for such tasks. Are there tasks for which a voice interface would be better suited? Perhaps, but I don't think we've seen the applications developed yet that work better with voice than by manual input. Maybe voice-dialing for your cell phone? Nothing else springs to mind.

        Would having a conversation with a computer that was capable of understanding conversational English be awesome? I imagine it would be. But what would we talk about? What would I do with such a computer that I couldn't do with my current PC?

        Probably a few things would be a lot easier (programming by telling the computer what to do in a natural language rather than having to write objects and procedures in a high-level computer language), or perhaps gaming applications.

        Yeah, that'd be awesome, but that's nowhere near being on the horizon yet, and I don't know that we'll ever get there, because where's the demand for the intermediary steps that would lead us there, and what would those intermediary steps even be?

        • What would you do? You'd be reprogramming your younger brother's computer to use the voice of Leslie Nielsen for his Talk Sex chatware.
        • Re:So, basically (Score:5, Insightful)

          by wrook ( 134116 ) on Tuesday January 06, 2009 @04:09AM (#26340289) Homepage

          Probably a few things would be a lot easier (programming by telling the computer what to do in a natural language rather than having to write objects and procedures in a high-level computer language...

          Actually, I don't think programming would be any easier at all. We already have people telling programmers what they want in human language (PGMs) and the result is universally horrible. In reality, the hard part about programming is sorting out the nitty gritty details. Transcribing the solution to the computer is not difficult. And I would *not* want to try to discuss these solutions in such detail in natural language.

          This is precisely why design documentation tends to go out of date very quickly -- it's written in the wrong language. We can't easily specify the level of detail we require in natural languages and so defer it to programming.

          • Pah, it's been around for decades! :) http://img.thedailywtf.com/images/ads/All-programs-you-need-BIG.png [thedailywtf.com]
          • You're right, even when programmers talk among themselves they use pseudo-code; natural language is generally insufficient to describe algorithms.

          • Re:So, basically (Score:4, Insightful)

            by nidarus ( 240160 ) on Tuesday January 06, 2009 @12:24PM (#26343713)

            This is precisely why design documentation tends to go out of date very quickly -- it's written in the wrong language. We can't easily specify the level of detail we require in natural languages and so defer it to programming.

            Another example: legalese. If you've ever tried reading a legal document, you'd notice that while nominally written in a "natural language", it's:

            1. More or less incomprehensible to a layman
            2. Actually much closer to a programming language (with a weird syntax and keywords in Latin)
            3. And, since it's still related to natural languages, it's not precise enough for its purpose. People still argue about what certain words meant 100 years ago (when the law was written), and wage costly legal battles over vague wording.
        • Re: (Score:3, Interesting)

          by olman ( 127310 )

          Probably a few things would be a lot easier (programming by telling the computer what to do in a natural language rather than having to write objects and procedures in a high-level computer language... Or perhaps gaming applications.

          Programming? Yeah right. Probably last thing ever to go voice-activated. Something more plausible would be for example info-desk style application or perhaps GPS navigation system. After all you're supposed to be driving the car if you change your mind about destination etc.

          Gaming is dead-on, too. In fact it's surprising it's been used so little. There was an ancient C64 game that could already be taught 3 speech commands. Given modern CPU and memory capabilities it should be all over the place, especially

          • Re: (Score:3, Interesting)

            Probably a few things would be a lot easier (programming by telling the computer what to do in a natural language rather than having to write objects and procedures in a high-level computer language... Or perhaps gaming applications.

            Programming? Yeah right. Probably last thing ever to go voice-activated. Something more plausible would be for example info-desk style application or perhaps GPS navigation system. After all you're supposed to be driving the car if you change your mind about destination etc.

            Well, when I'm talking about "programming" using a natural language text interface, I don't mean what we currently think of as programming, I just mean programming in the sense of "giving a computer instructions to execute" -- basically, how it's portrayed in Star Trek, where Kirk says "Computer: Do..." and the computer figures out what Kirk means by that, how to do it, and does it.

            It's very unrealistic based on how we understand computers today, of course, but perhaps a super-advanced computer could be deve

        • It's fiction of course, but I think Star Trek had it more or less right. You can talk to the computer, and it can understand you and comply with requests, but people generally only use this capability for quick and dirty things: ordering food, making a minor parameter change to the holodeck, opening a communications channel, etc. When you're doing something like writing a Holodeck scenario, driving the ship, analyzing data, etc you use something that looks like a combination of keyboard and touch screen.

      • Re:So, basically (Score:4, Interesting)

        by Gerzel ( 240421 ) * <brollyferret&gmail,com> on Tuesday January 06, 2009 @02:07AM (#26339795) Journal

        I think the problem is that developers focus on creating a pure speech interface rather than a mixed one.

        Also complicating things is the fact that we already use speech to interface with the world around us; other people, telephones, and such are often talked to while using the computer keyboard and mouse. How is the computer to know what is a command and what is being spoken to someone else?

        You either have to offset spoken commands with some token that won't come up in conversation and normal background speech or you have to give the computer context recognition which is also difficult.

        I'd like to see a revival of Latin. Make all speech control software respond to Latin phrases while normal speech is carried out in everyday language. Latin would be ideal because it is dead, and has a focus on commands in its grammatical makeup.
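The "command token" scheme described above (whether the token is Latin or anything else) can be sketched in a few lines. This is a hypothetical illustration of the filtering idea only, not any real product's API; the wake word "computer" stands in for whatever token is unlikely to occur in background speech:

```python
WAKE_WORD = "computer"

def parse_utterance(transcript):
    """Return the command portion of a transcript, or None if the
    utterance is ordinary speech not addressed to the machine."""
    words = transcript.strip().lower().split()
    if words and words[0] == WAKE_WORD:
        return " ".join(words[1:])
    return None

# Utterances prefixed with the token are treated as commands...
print(parse_utterance("Computer open the mail client"))  # -> open the mail client
# ...everything else is ignored as background conversation.
print(parse_utterance("I told him on the telephone"))    # -> None
```

The alternative the comment mentions -- context recognition, i.e. deciding from content and situation whether speech is addressed to the machine -- is far harder, which is why shipped systems have generally gone the token route.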

      • Re: (Score:3, Insightful)

        by anothy ( 83176 )
        the speech thing is really interesting. kurzweil, like the people producing the software, focuses on dictation-style systems. this is bad for two reasons. first, it's a much harder problem, technically: you need both better signal processing and confidence on your recognition, and you need vastly smarter software to resolve ambiguities based on context (and likely other factors). second, it's a less well-defined use case, which dilutes market demand. saying "use your computer like you do today, but talk to it!
    • Re:Idiocracy (Score:5, Insightful)

      by Anonymous Coward on Tuesday January 06, 2009 @01:37AM (#26339659)

      Excellent post. The worst thing too is that techy Internet pundits always bring up the Idiocracy reference, as if only the Internet could walk in a clean white suit above the supposed muck of the idiot masses.

      But of course, they all forget their own idiocratic backyard that includes places like 4chan, /b/, and Encyclopedia Dramatica. Or even places like Boing Boing or Youtube, which is a constant barrage of bite-sized irrelevant data for the ADHD crowd.

      /.'ers don't need to watch Idiocracy. We are living in an Internet Idiocracy that no one cares to improve because of the lulz. Neil Postman's 'Amusing Ourselves to Death' is THE ultimate predictor of the future. We are going to giggle ourselves to death with LOLcats, and people will argue vehemently that it's morally better than any alternative. Like Postman said, we'll beg to stay entertained.

      • Re:Idiocracy (Score:4, Insightful)

        by lordvalrole ( 886029 ) on Tuesday January 06, 2009 @03:39AM (#26340163)

        Gracchus: Fear and wonder, a powerful combination.
        Gaius: You really think people are going to be seduced by that?
        Gracchus: I think he knows what Rome is. Rome is the mob. Conjure magic for them and they'll be distracted. Take away their freedom and still they'll roar. The beating heart of Rome is not the marble of the senate, it's the sand of the coliseum. He'll bring them death - and they will love him for it.
        -gladiator

      • by smchris ( 464899 )

        So what's the Next Big Medium that has an intellectual price of admission?

        Been there. Seen it before. Amateur radio was a nerd's kingdom in the 60s. Then came CB in the 70s, Good Buddy. Once you've paved the path, the idiots will get on it.

    • Re: (Score:3, Funny)

      by hitchhacker ( 122525 )

      So... Where'll we be in the future? Watch Idiocracy.

      So like.. in the future... we'll be watching Idiocracy?

      -metric

    • Re: (Score:2, Interesting)

      by buraianto ( 841292 )

      so what we envisioned we thought our future selves might want in 2009 isn't actually quite what it turns out we actually wanted.

      Right. He knows that what we want is what we end up achieving. And I'm sure he knows that he will be wrong on some of his predictions. A large part of what he is doing when he makes these predictions is trying to get people informed about what is possible, to stimulate people's imaginations, so that we will want the things that he thinks are important and good for our future. Th

  • by Klootzak ( 824076 ) on Tuesday January 06, 2009 @12:56AM (#26339433)

    The following will happen in the next 10 years:

    1. Some Terrorist group will blow something up.

    2. That people will continue to argue whether Linux is superior to Windows (and vice versa) on an ideological basis and continue to ignore individual situations/circumstances where their opposing OS would make a better choice.

    3. That people will still buy (or not buy) Macs based on a fashion-over-function idea (despite the fact the actual Mac offering isn't too bad functionally).

    4. That people will make a bunch of random predictions, and several of these will pan out as predicted, and the people will say "Oh Wow!!!", (and then post the original predictions to Slashdot).

    • by Koiu Lpoi ( 632570 ) <koiulpoi AT gmail DOT com> on Tuesday January 06, 2009 @01:33AM (#26339627)

      3. That people will still buy (or not buy) Mac's based on a fashion over function idea (despite the fact the actual Mac offering isn't too bad functionally).

      Very true. A friend of mine "obtained" the latest beta of Windows 7, and was showing it to me. I pointed out that pinning the items to the taskbar was just like what's been in OSX for a long time now, and he replied (quite seriously), "Yes, but this isn't pretentious."

    • ...will be eating the LOL cats...

  • ...the emergency of digital objects...

  • by Anonymous Coward

    What? He got like 3 right out of 40.

    If you throw enough crap against a wall, some of it will stick.

    Kurzweil's 60. At this point, he can't seriously believe that technology is going to keep him alive forever anymore, can he?

    • by nprz ( 1210658 ) on Tuesday January 06, 2009 @01:46AM (#26339717)
      Did he even get 3?

      This one is obvious:
      "Individuals primarily use portable computers, which have become dramatically lighter and thinner than the notebook computers of ten years earlier. "

      Am I supposed to think that they just get bigger and bigger after 10 years?

      "Computers routinely include wireless technology to plug into the ever-present worldwide network, providing reliable, instantly available, very-high-bandwidth communication."

      Wrong, we don't have an ever-present worldwide network. Even finding 'hot-spots' is hard.

      "Communication between components, such as pointing devices, microphones, displays, printers, and the occasional keyboard, uses short-distance wireless technology."

      Mouse/keyboard is about it. Display won't be wireless.

      "Government agencies, however, continue to have the right to gain access to people's files, which has resulted in the popularity of unbreakable encryption technologies."

      Umm, I guess they still have access if they have a warrant.
      I don't see your average person using encryption, let alone 'unbreakable' type.

      The only thing he got right is the obvious one. The rest are off. Making a 10-year prediction isn't very fun anyway. 20-year or longer predictions are great, especially if they include flying personal transportation.
      • by wurp ( 51446 ) on Tuesday January 06, 2009 @02:00AM (#26339761) Homepage

        Er, my phone certainly does have essentially an ever-present connection to the worldwide network. And my phone is a linux machine I would have been proud to have on my desktop 7 years ago.

        • by muridae ( 966931 ) on Tuesday January 06, 2009 @02:39AM (#26339947)
          You are right. Just because it doesn't look like what we thought it would look like ten years ago, doesn't mean it isn't happening. To the GP, that unbreakable encryption is available if you want it. Since the government does have access without a warrant (or have you ignored the past few years' discussion of warrantless wiretaps?), it's been quite common. And you might use it without knowing it, like SSL for banking.
          Devices are all capable of talking to each other, via bluetooth or other means. Contactless smart cards fit as the ID protection on a chip, so do RFID passports, even if they aren't as secure as he had hoped. Memory on portable devices has moved away from the rotating platters. Kindle and other e-books are out there, and while I still prefer the contrast of paper and the lack of DRM, they are popular. Telephones do send high res pictures and video, my 'new' cellphone is capable of both. It's only new to me, the model has been out for some time. And his prediction of dating online/ virtual sex, I think it nicely sums up all the problems of Second Life. As for people preferring to interact with female AI, he's right. Wasn't there an article here about more people choosing the female workout instructor in Wii Fit?
          For his predictions of art, I've seen a lot of the things he dreamed up. People are making music with Guitar Hero 'toys', and cooking up strange new instruments with accelerometers. He didn't get it all right, but he was close.
      • Computers routinely include wireless technology to plug into the ever-present worldwide network, providing reliable, instantly available, very-high-bandwidth communication.

        Wrong, we don't have ever-present worldwide network. Even finding 'hot-spots' are hard.

        I beg to differ. Two weeks ago today, I stood on a beach in Australia — at Hat Head, which, for the curious, is a small and fairly unremarkable seaside town in New South Wales, about 500 km from the nearest large city — where I had no trouble using my Swedish mobile phone/SIM to

        • send a photo I'd just taken of a pelican using my mobile to my girlfriend (who, at the time, was in a small town in Spain that happens to be about as close to the middle of nowhere as you can get and still be on the Iberian Peninsula)
        • upload a photo of my daughter holding a hermit crab she'd just caught to my website, which (last I heard) is hosted in Texas, so my parents (in Florida and North Carolina) could see it (this required popping the memory card out of the camera and into the phone, too bad the camera doesn't support Bluetooth)
        • respond to a text message from a friend of mine who runs a café in Stockholm
        • look up the Swedish words for "pelican" and "hermit crab" in an online dictionary
        • ring a friend of mine in Thailand to let her know I'd had to change my plans and would be returning to Europe via Singapore rather than Bangkok
        • fire off scripts on my two laptops — one at my ex's place in Kempsey (35 km inland) and the other back in my flat in Stockholm, both using WiFi connections — to update my MySQL server repos and do new builds
        • update my status on Facebook

        Now... You were saying something about the lack of world-wide wireless connectivity...? :)

      • "Communication between components, such as pointing devices, microphones, displays, printers, and the occasional keyboard, uses short-distance wireless technology."

        All of these except displays are commonplace. And despite the pointlessness, people continue to buy wireless network printers and sit them three feet from the router. Technically even displays are possible; you can get short-range retransmission devices for televisions (meant to hook cable up to an entire household with one output).

        Al

      • In reality, he didn't even get the "things will get smaller" prediction quite right. He was qualitatively right, in that "smaller" is the way things go, but he missed by probably an order of magnitude or two - suggesting that people would be wearing dozens of PCs across their body embedded in clothing and jewellery is a rather different view of "personal computing" than someone carrying around a netbook.
    • Spot on. The guy misses far more than he hits. Even when he hits, the innovation existed to some extent ten years ago.
  • by Anonymous Coward on Tuesday January 06, 2009 @01:03AM (#26339491)

    And of course he missed the Spanish Inquisition. Possibly he didn't expect that.

  • Not bad (Score:3, Interesting)

    by ceoyoyo ( 59147 ) on Tuesday January 06, 2009 @01:11AM (#26339525)

    Quite a bit of that was eerie, when you consider it was written ten years ago. Most decade predictions are way off, with maybe one in ten or twenty hitting near the mark.

  • by VoidEngineer ( 633446 ) on Tuesday January 06, 2009 @01:14AM (#26339539)
    I think he was pretty spot on regarding the visual cortex simulation:
    http://tech.slashdot.org/article.pl?sid=08/06/13/2014225 [slashdot.org]
  • Despite occasional corrections, the ten years leading up to 2009 have seen continuous economic expansion and prosperity due to the dominance of the knowledge content of products and services. The greatest gains continue to be in the value of the stock market. Price deflation concerned economists in the early '00 years, but they quickly realized it was a good thing. The high-tech community pointed out that significant deflation had existed in the computer hardware and software industries for many years earli

    • by Kohath ( 38547 )

      What I want to know is, does any sane person think that overall price deflation isn't terrible for the economy?

      It's not insanity. It's ignorance.

    • What I want to know is, does any sane person think that overall price deflation isn't terrible for the economy?

      I'm no economist, but if we have deflation right now I am pretty happy about it. Maybe pennies will once again be worth the metal we put in them.

      It's crushing to anyone in any significant amount of debt (i.e. anyone who holds a mortgage).

      Don't buy that house that costs so much more than your annual income.

      • by jfruhlinger ( 470035 ) on Tuesday January 06, 2009 @02:19AM (#26339851) Homepage

        The problem is that in deflationary periods incomes drop as well as prices, either by direct cuts in salaries or layoffs followed by new jobs that don't pay as well; otherwise everyone would be rich, by magic, which never happens. Thus my family's outstanding mortgage -- currently a fairly reasonable 120 percent or so of our annual income -- would become more and more of a burden.

    • Given the simple axiom of supply and demand, do you really think that the supply of dollars by the FED [stlouisfed.org] is outstripping the fall in demand caused by the economic downturn? Granted that graph is a reflection of what banks are holding onto, but what do you think is going to happen to the purchasing power of the individual using USD once it all hits the consumer market? The better term for this is Stagflation [wikipedia.org] and there is still a large amount of doubt by this armchair economist if Keynesian Economics can do any
  • by freeweed ( 309734 ) on Tuesday January 06, 2009 @01:24AM (#26339583)

    Most of his predictions that he got right were brain-dead obvious in 1999 - we already had portable computers coming into common use, and cellphones everywhere. This trend was pretty clearly going to continue. Hell, the Gameboy was proof enough that we were about to see a generation who grew up with portable computing. "Body LANs" don't exist in any meaningful form. People at best are wearing the utility belt of gadgets, some of which might talk Bluetooth to each other.

    The rest? Wireless? Please. Bluetooth and Wi-Fi were just coming to fruition around that time, and obviously wireless use was going to come into play. Again, cellphones paved the way for this. Beyond that though... I still see millions of wired speakers, mice, keyboards, DVD players, you name it. I still don't see wireless as the most common form of network access; hell, any network admin worth his salt will rant about the general poor performance of Wi-Fi. Wireless printers and displays never really came about (I do find it amusing that he says "occasional keyboard" - the most obvious use of a low-bandwidth wireless interface). His vision of ubiquitous wireless access never came about - the best we have is the cellphone networks, which again, we already had 10 years ago.

    Digital books, movies, music? Napster was already out by then. The entertainment industry did its best to stop this from happening and it's only been in the past year or three that it's even been practical (from a legal perspective).

    Eyeglass displays have existed for a long, long time and never achieved much success.

    A trillion calculations per second on a home computer, eh?

    Anyway, just seems a bit underwhelming. He got so much completely wrong.

    • You can get a teraflop of performance from one of the newer Nvidia GPUs in a desktop PC!
    • by marciot ( 598356 ) on Tuesday January 06, 2009 @01:46AM (#26339713)

      A trillion calculations per second on a home computer, eh?

      According to wikipedia [wikipedia.org], the ATI Radeon HD4800 series achieves one teraflop. So, I would say Kurzweil was right on the mark on that one.
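The teraflop figure is easy to sanity-check from the card's published specs. The numbers below are the commonly cited Radeon HD 4870 figures, taken here as assumptions; other HD 4800-series cards differ slightly:

```python
# Back-of-the-envelope peak-FLOPS estimate for the Radeon HD 4870.
stream_processors = 800   # shader ALUs on the die
clock_hz = 750e6          # 750 MHz core clock
flops_per_cycle = 2       # one multiply-add per ALU per cycle

peak_flops = stream_processors * clock_hz * flops_per_cycle
print(peak_flops)  # 1.2e12 -- comfortably past one trillion per second
```

Peak theoretical throughput, of course, not what an arbitrary home workload actually sustains.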

  • by LuYu ( 519260 ) on Tuesday January 06, 2009 @01:30AM (#26339615) Homepage Journal

    . . . the lawyers.

    This is surprising since the copyright fanatics spoke much more boldly 10 years ago than they do today.

    How much of the truth of his predictions is the result of his predictions?

  • by Dutch Gun ( 899105 ) on Tuesday January 06, 2009 @01:36AM (#26339649)

    Style improvement and automatic editing software is widely used to improve the quality of writing.

    So close [nwsource.com], and yet [xkcd.com] so, so far... [xkcd.com]

    Almost all the predictions I read in this article have roughly the same problem - they assume technology is much more ubiquitous than it is in the real world. I'd say he was probably off by five to ten years in many of those predictions. Let's see:

    Computers: Personal computers are available in a wide range of sizes and shapes, and are commonly embedded in clothing and jewelry such as wristwatches, rings, earrings, and other body ornaments... The majority of text is created using continuous speech recognition (CSR) dictation software.

    Getting there, but we're not quite at the point of wearing computers in common objects. Keyboard and mouse are still king.

    Education: Students of all ages typically have a computer of their own, which is a thin tabletlike device weighing under a pound with a very high resolution display suitable for reading... Intelligent courseware has emerged as a common means of learning.

    Closer, but education still seems largely clueless about how to effectively use computers. Intelligent teaching software is making strides, but still really can't be called "intelligent" by any stretch of the imagination.

    Communication: "Telephone" communication is primarily wireless, and routinely includes high-resolution moving images... Virtually all communication is digital and encrypted, with public keys available to government authorities.

    Technologists always want that video phone, and the market continually says "no thanks, voice is good enough". In fact, it's gone backwards a bit, with text messaging being rather popular.

    Business and Economics: Intelligent assistants which combine continuous speech recognition, natural-language understanding, problem solving, and animated personalities routinely assist with finding information, answering questions, and conducting transactions... Most purchases of books, musical "albums," videos, games, and other forms of software do not involve any physical object.

    Again, the overestimation of natural interfaces. And as of right now, a large percentage of software (especially games) is still attached to a physical disk, although digital downloads are gaining Steam... (sorry)

    Politics and Society: Privacy has emerged as a primary political issue. The virtually constant use of electronic communication technologies is leaving a highly detailed trail of every person's every move.... There is a growing neo-Luddite movement...

    This one's pretty close regarding privacy concerns. As far as neo-Luddite, I haven't seen any such movement emerge in large numbers. There are some anti-technologists, but it's usually a secondary effect of some other philosophical argument.

    The Arts: The high quality of computer screens, and the facilities of computer-assisted visual rendering software, have made the computer screen a medium of choice for visual art.

    Another one technologists always get wrong is the idea that people are eager to throw away traditional art mediums. I think Star Trek was closer on this one, about how people will always enjoy timeless "classical" entertainment right alongside their "high-tech" (holodeck) entertainment. The two need not be mutually exclusive.

    Etc, etc... I'd say the predictions were generally on the right track, but perhaps just a bit too optimistic in the rate of adoption. Still, overall it was fairly insightful, if somewhat conservative. I'm not sure I could have done nearly as well.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      "Communication: "Telephone" communication is primarily wireless, and routinely includes high-resolution moving images... Virtually all communication is digital and encrypted, with public keys available to government authorities.

      Technologists always want that video phone, and the market continually says "no thanks, voice is good enough". In fact, it's gone backwards a bit, with text messaging being rather popular."

      I think you're missing the reason. It's not that people prefer text messaging. I mean, look a

    • Re: (Score:3, Insightful)

      Technologists always want that video phone, and the market continually says "no thanks, voice is good enough". In fact, it's gone backwards a bit, with text messaging being rather popular.

      Wireless video phones are widely available today and have been for years; even my (pretty cheap) phone has that feature. I've never seen anyone use it though, and I've never used it myself. It seemed like a really cool idea when seen in SciFi movies/TV shows, but in reality it just isn't all that necessary to see the person you're speaking to, especially when on the move as you are with your cellphone.

      • by Lars512 ( 957723 )
        Exactly. People only seem to use video calls on their computers, when they're stationary, for example when Skype calling one another.
  • Apparently it could not predict its need for sufficient bandwidth.

  • mmm (Score:5, Funny)

    by Tomfrh ( 719891 ) on Tuesday January 06, 2009 @02:03AM (#26339773)

    His predictions might have been nearer the mark absent the war on terror.

    Oh I agree. His predictions may have been far more accurate had the future unfolded differently.

  • by TheSync ( 5291 ) on Tuesday January 06, 2009 @02:10AM (#26339809) Journal

    I was sitting next to someone with a Kindle on a plane last week, so the digital paper thing is moving fast.

    Rotational storage is not going away anytime soon (who thought we'd have terabyte drives?), but certainly my iPhone can do a heck of a lot of computing with just Flash.

  • by Al Dimond ( 792444 ) on Tuesday January 06, 2009 @02:34AM (#26339913) Journal

    The things he was right about were fields where the path forward was pretty certain. We had a pretty good idea then how we'd make microchips smaller and faster, a clear path forward. Only now is that path getting clouded by physical limits. Where he was wrong was in predicting steady, linear progress in areas where there isn't a clear path forward. This includes AI, interface design, economics, and general welfare (I just love his dismissal of the underclass; they're a pretty big portion of humanity, you know, and I don't think the human story can be truly told without theirs as well).

  • by Animats ( 122034 ) on Tuesday January 06, 2009 @03:15AM (#26340091) Homepage

    I run Downside [downside.com], where, in 2000, I called the dot-com crash before it happened and named names. Check my track record. Since then, I've occasionally pointed out the obvious before it became conventional wisdom:

    • 2004-10-11 - The coming mortgage crunch
      The next crash looks to be housing-related. Fannie Mae is in trouble. But not because of their accounting irregularities. The problem is more fundamental. They borrow short, lend long, and paper over the resulting interest rate risk with derivatives. In a credit crunch, the counterparties will be squeezed hard. The numbers are huge. And there's no public record of who those counterparties are.
      Derivatives allow the creation of securities with a low probability of loss coupled with a very high but unlikely loss. When unlikely events are uncorrelated, as with domestic fire insurance, this is a viable model. When unlikely events are correlated, as with interest rate risk, everything breaks at once. Remember "portfolio insurance"? Same problem.
      Mortgage financing is so tied to public policy that predictions based on fundamentals are not possible. All we can do is to point out that huge stresses are accumulating in that sector. At some point, as interest rates increase, something will break in a big way. The result may look like the 1980s S&L debacle.
    • 2006-01-01 - Predictions for 2006
      • Saudi Arabia finally admits the Ghawar field has peaked. Oil passes $70 per barrel.
      • US interest rate spike. "Homeowners" with adjustable-rate interest-only loans default and are foreclosed. Housing prices crash as foreclosures glut the market.
      • Nobody wins in Iraq. Neither side can force a decision, so both sides keep bleeding.
      • One of the big three US car manufacturers goes bankrupt.
      • A major hurricane wipes out another southern US city.

    The 2004 prediction describes exactly what happened in housing. No question about that.

    The 2006 predictions took longer to happen than I'd expected. The Fed cut rates sharply in 2007, accelerating the economy when it should have been hitting the brakes. This deferred the collapse of the housing bubble, but not for long. When it did pop, it was worse than it had to be.

    I expected one of the car manufacturers to go bust. Instead, they all almost went bust, and only a Government bailout saved them. The fundamentals indicated something had to give. The housing bubble and interest rate cuts resulted in something of a "car bubble", deferring the inevitable a few more years.

    The hurricane prediction was kind of off the wall, but Galveston was duly flattened.

    It's nice to be right, but it isn't happy-making.

  • Bah! Humbug! (Score:2, Insightful)

    by binpajama ( 1213342 )

    The biggest problem with Kurzweil's view of the world is that it assumes that any innovation, if technologically feasible, is going to be adopted. Take, as a simple example, the issue of voice-to-voice translation that he raises in the article. It's just more economical and practical to do business with someone who knows English (or has easy access to someone who knows English).

    Similar wishful thinking by Sci Fi doyens caused visions of space colonies and interstellar travel by the first decade of the 21st cent

    • Re:Bah! Humbug! (Score:4, Insightful)

      by Lobo42 ( 723131 ) on Tuesday January 06, 2009 @11:05AM (#26342731) Journal
      I had to read the entire book for an AI course once. It was awful! Kurzweil seems to exist in a world of the geeks, by the geeks, for the geeks. He pretty clearly has no concept of, say, poverty, nor any acknowledgment that the further down the economic totem pole you go, the more people you will find. His predictions make *some* sense if you're only talking about wealthy Americans (i.e., geeks), but they make far less sense if you consider a world where people are *different from Ray Kurzweil*.

      Tech things from the past decade that he COMPLETELY misses in his book:
      • The green movement and the resource crunch (Yes! It turns out that even as we spend more time on our computers, we continue to care about natural resources! How quaint!)
      • Cultural clashes. Technology continues to bring people "closer" together, sometimes in ways they don't want. Globalization keeps happening, but it also continues to stir up discontent among people who see their jobs/traditions/beliefs replaced.
      • The degree to which, and the ways in which, computers are used for entertainment. As mentioned in an earlier comment, Facebook and YouTube are really the stars of the past 10 years (at least on the internet). People like communicating with other people. And that doesn't mean better interfaces. (Apple has had mics and cameras installed in every Mac laptop for a few years now and... still I don't know anyone who uses them.) It means *more things to communicate about*. We like our pets - let's videotape them doing stupid things to share with mom and dad. We like our own cleverness - let's update our Facebook status with something wittier than our friends'! Even in gaming, we see a rise in online and co-op play.

      Kurzweil, in his long-term view, predicted a world where technology starts to change us and we are replaced with computers. He envisioned conflict over this - people fighting about computer rights, the meaning of "human," etc. But there hasn't been much conflict, because computers haven't changed us. Human needs are still the driving force behind technological change, and as long as this is the case, technology will continue to satisfy our basic needs - to help us do the same things we *already do* faster and better, rather than suddenly giving me a taste for wireless jewelry.

  • "There are services to keep one's digital objects in central repositories, but most people prefer to keep their private information under their own physical control." Bet google is glad that never hapened :p
  • by Sepiraph ( 1162995 ) on Tuesday January 06, 2009 @03:58AM (#26340241)
    I think Kurzweil is desperate for Singularity to happen sooner because frankly he just doesn't want to die.
    • by Lars512 ( 957723 )

      Maybe he just wants to live long enough to see it begin. Then again, the Singularity changes the ball game; it seems a shame to die when it looks likely that one day people won't have to (for example, through digitising themselves).

  • by Arancaytar ( 966377 ) <arancaytar.ilyaran@gmail.com> on Tuesday January 06, 2009 @06:02AM (#26340779) Homepage

    Rotating memories (that is, computer memories that use a rotating platter, such as hard drives, CD-ROMs, and DVDs) are on their way out, although rotating magnetic memories are still used in "server" computers where large amounts of information are stored.

    We're nearly there. Some netbooks already have solid-state drives without any rotating platters. The limitations right now seem to be write speed and lifespan: flash memory still deteriorates with each erase-and-rewrite cycle. It's getting much better, though.

    As for exchangeable media, well, the USB key seems to have become the medium of choice for personal data - although optical media are still used for mass-produced content like movies and music. Can't see that ever changing - DVDs and Blu-ray discs are much cheaper to produce than rewritable flash memory.

  • The OLPC/XO utopia was more down to earth than Kurzweil's prediction, and the idea looks a lot like his forecasts: a computer for every child, non-rotating storage, networking everywhere, and cheap.
    But even though they are widely used in my own country (in Uruguay they should finally be in all the schools this year; they are already in most of them), the utopia painted at the launch wasn't reached, and the technology involved is the same that was available from the start.

    Another miss in Kurzweil's predictions.
  • The chapter contains a bland stew of ideas that were commonplace even a decade ago (when the chapter was written). Most of the engineering goals were major targets even back then, and he didn't exactly nail the timing on most of these. Factor out general knowledge of the tech industry, and he's no more accurate than your average tea leaf reader (even worse, if you imagine that Kurzweil has some access to industry insiders who actually know what technologies they're going to push next). It's a nice chapte

  • He was way off all over the place. The slashdot summary exaggerates.
  • by mpsmps ( 178373 ) on Tuesday January 06, 2009 @11:37AM (#26343087)

    Don't listen to the naysayers, it's only January...

  • LUIs (Score:3, Insightful)

    by jimfrost ( 58153 ) * <jimf@frostbytes.com> on Tuesday January 06, 2009 @09:28PM (#26352007) Homepage
    On the topic of human-computer speech interfaces, though, he seems to be way off.

    I'm not sure he was so far off. Sure, personal computers don't use it, but have you gone through a phone interface recently? It's not natural language but I've used some of them that are pretty free-form.
