Microsoft's Mundie Sees a Future In Spatial Computing

An anonymous reader writes "Speaking at the MIT Emerging Technology Conference, Microsoft Chief Research and Strategy Officer Craig Mundie explained that he sees the industry evolving into 'spatial computing,' and he envisions a 3-D virtual world populated by virtual presences, using a combination of client and cloud services. 'In a few months, the company plans to test a new virtual reception assistant in some of its campus buildings. The assistant, which takes the form of an avatar, helps schedule shuttle reservations to get people to various locations across the 10-million-square-foot Redmond, Wash., campus. The system includes array microphones and natural language processing by which the avatar listens to the subjects and then interacts with them in real time. The system has been programmed to differentiate people by their clothing. Someone in a suit, for instance, would more likely be a visitor and not a potential shuttle rider.'"
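
A rough sketch of the decision rule that last sentence describes, purely for illustration: the clothing labels and the classify_role() function below are invented here, not taken from Microsoft's (unpublished) implementation.

    # Hypothetical sketch of the clothing-based role heuristic the summary
    # describes. Labels and logic are illustrative assumptions only.
    def classify_role(clothing_label: str) -> str:
        """Guess a person's role from a coarse clothing category."""
        # Per the summary, a suit suggests a visitor rather than an
        # employee heading for the campus shuttle.
        if clothing_label == "suit":
            return "visitor"
        return "potential shuttle rider"

    print(classify_role("suit"))     # -> visitor
    print(classify_role("t-shirt"))  # -> potential shuttle rider
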
  • by Dutch Gun ( 899105 ) on Friday September 26, 2008 @08:49PM (#25174043)

    I wondered when Clippy would resurface... Looks like he'll have a new job soon.

  • by Anonymous Coward on Friday September 26, 2008 @08:49PM (#25174047)

    but you don't see it on the front page of slashdot.

    • by MrNaz ( 730548 )

      Yes you do. It's represented in all its life-sized glory by all the periods at the ends of sentences.

  • Really... (Score:5, Funny)

    by BadAnalogyGuy ( 945258 ) <BadAnalogyGuy@gmail.com> on Friday September 26, 2008 @08:50PM (#25174051)

    Well. Isn't that spatial.

  • by clarkn0va ( 807617 ) <apt.get@NosPAm.gmail.com> on Friday September 26, 2008 @08:54PM (#25174079) Homepage

    And would we be silly to assume that they've made some improvements to their speech recognition software since it was demoed in Vista [youtube.com]?

    db

    • The main problem is with background noise and crappy mics. Vista supports array mics (which are now getting more and more common on laptops), which are supposed to dramatically increase the accuracy of speech recognition. I've never gotten to test one, though.
      • by rtb61 ( 674572 )

        The reality of speech recognition is that it requires training the system to suit each individual user's speech patterns across a broad range of physiological states, as well as training the user to alter their speech to suit what the system is capable of interpreting. People have trouble understanding and interpreting each other's responses; however, when handled in a polite, personable fashion, clarification is easily sought and provided to prevent confusion.

        Computers are not people and people d

        • by syntek ( 1265716 )
          A little off topic but...

          The funny thing about those three tiers is this: if you hired a handful of highly skilled individuals who were personally responsible for customers, you would have happier customers, cheaper labor costs, and customers who felt much more special. Let me explain.

          I call a company and instead of getting an automated system, or 100 different people every time I call, I get one guy who I am familiar with and who is familiar with the services or products I have purchased from the com

          • I think the majority of consumers would not mind paying slightly more for a service if the support was great.

            That would be nice, but from the evidence I've seen they usually go for what's cheapest.

          • by rtb61 ( 674572 )

            Do you know the really interesting thing about all this? It is far easier and simpler for bean counters to calculate the savings per customer from providing cheap crap service than it is for business professionals to prove the losses that will result from providing it.

            Only experienced, knowledgeable business executives will recognise valuable insights from experienced staff; nepotistic idiots will of course grab the easy answers in vain attempts to prove their non-existent value. Think grandki

      • by jipn4 ( 1367823 )

        You don't need array mics for good speech recognition: people with binaural hearing are a walking example that you don't.

        No, the problem is with crappy speech recognition software. Array mics are just a workaround until the software improves enough so that they aren't necessary anymore.

        • You realize people's ears are essentially an array mic, right? Sure, we can understand speech just fine with one ear, but two make it a hell of a lot easier to zero in on someone in noisy environments.

          That's the whole purpose of array mics: let the PC zero in on you for better quality.
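
          For the curious, here is a minimal delay-and-sum beamformer sketch of that "zero in" idea: align the channels for one direction so a talker there adds coherently while off-axis noise does not. The mic spacing, sample rate and signals below are invented for illustration; real array-mic stacks estimate the steering delay adaptively.

              # Minimal two-mic delay-and-sum beamformer (illustrative values only).
              import numpy as np

              SPEED_OF_SOUND = 343.0  # m/s
              MIC_SPACING = 0.10      # metres between the two mics (assumed)
              SAMPLE_RATE = 16000     # Hz

              def delay_and_sum(left, right, angle_deg):
                  """Align both channels for a source at angle_deg, then average.
                  On-axis speech adds coherently; off-axis noise does not."""
                  delay_sec = MIC_SPACING * np.sin(np.radians(angle_deg)) / SPEED_OF_SOUND
                  delay_samples = int(round(delay_sec * SAMPLE_RATE))
                  return 0.5 * (left + np.roll(right, -delay_samples))

              # Toy usage: a 440 Hz "voice" about 40 degrees off-axis (a 3-sample
              # inter-mic delay at these settings), buried in independent noise.
              t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
              voice = np.sin(2 * np.pi * 440 * t)
              rng = np.random.default_rng(0)
              left = voice + 0.5 * rng.standard_normal(t.size)
              right = np.roll(voice, 3) + 0.5 * rng.standard_normal(t.size)
              enhanced = delay_and_sum(left, right, angle_deg=40.0)

          Averaging the two aligned channels roughly halves the power of the uncorrelated noise (about 3 dB) while leaving the aligned speech untouched; larger arrays buy proportionally more.
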

  • by SEWilco ( 27983 ) on Friday September 26, 2008 @08:57PM (#25174097) Journal
    So would one enter this virtual 3-D world through the previously mentioned Internet Filtering Lobby?
  • by Anonymous Coward
    ... the Blue Avatar of Death!!!
  • by Pantero Blanco ( 792776 ) on Friday September 26, 2008 @09:10PM (#25174163)

    he envisions a 3-D virtual world populated by virtual presences, using a combination of client and cloud services

    So, he's predicting basically the same thing as every post-1985 Cyberpunk author. That's not really a story.

    I would have liked to read more about the visual recognition software that the summary mentioned, but the article was (predictably) short on details.

    Now, about the voice interaction part of that software... I don't really understand why someone would want to slow themselves down to the speed of speech (unless they're blind). It takes a minute for someone to hear information that they can read in a matter of seconds. I think this is mostly flash.

    • Re: (Score:2, Insightful)

      by willyhill ( 965620 )

      Visual recognition is actually quite far along now. Just ask Big Brother, seriously.

    • I totally agree about the voice recognition. Voice communication is terribly inefficient. It is also a very difficult thing to do naturally with a machine. For most people, it is very uncomfortable to "talk" to a machine - it's almost universally disdained.
      • Computer: End Program.

      • I don't know; speech recognition software seems to be in pretty high demand. I know a lot of companies prefer to use it for customer support rather than key inputs or live support staff, and a lot of people like using speech-activated speed dial on their cellphones.

        Also, it might be easier to have a talking AI than to set up a bunch of computer terminals for interactive assistance. Sure, it's easier to just type on a keyboard and read text from a screen if you're dealing with computer applications, but

    • by CODiNE ( 27417 ) on Friday September 26, 2008 @10:11PM (#25174467) Homepage

      I think this is mostly flash.

      No I'm pretty sure it will be Silverlight.

    • I don't really understand why someone would want to slow themselves down to the speed of speech (unless they're blind).

      You might be surprised. I once got a personal demonstration of Emacspeak from T.V. Raman and was blown away. The speech output came out so fast I could not follow it.

      Blind folks don't need to be handicapped on computers if idiot "web designers" would cooperate a bit more.

      Sadly, T.V. Raman was working for Adobe at the time; I do not know what he is doing now.

  • Will it help them find Waldo?
  • ...where thousands of sci-fi books have gone before!
  • by caywen ( 942955 )
    Nah, I think the future is a big ol' gob of Win32.
  • Does anyone know of a good cheap one for a desktop PC, preferably pure digital (using something such as the AKU2002 [akustica.com])? It seems like array mics should be cheap and easy to make, but almost nobody does it.
  • Something just doesn't sit right about a computer that tries to determine your purpose (essentially "who you are") based on your choice of clothing. The progression of this technology seems to have some potentially chilling effects. At the same time there is real value in the research of AI. I just hope we know what to do with it when we get "there".
    • by syntek ( 1265716 )

      The system has been programmed to differentiate people by their clothing.

      I knew Microsoft stereotyped consumers!

  • by toby ( 759 ) * on Friday September 26, 2008 @10:19PM (#25174509) Homepage Journal
    I don't see much future for Microsoft.
  • Space: the final frontier. These are the voyages of the Windows Special for Enterprises: to explore strange new virtual worlds, to seek out new second lifes and acquisitions, to boldly go where no blue screen has gone before.
  • Hey Microsoft, I've got news for you. We've had computers in space since at least the Apollo program.

    Sheesh, these guys think they're so smart...

  • by Anonymous Coward

    Flying chairs.

  • by melted ( 227442 ) on Friday September 26, 2008 @11:08PM (#25174825) Homepage

    But not in the next 50 years. I would love it if my computer could serve as an omniscient "super secretary". If I could, for example, just say to it: "I want to go to Chicago on Friday; book airline tickets, coach, no connections, in the evening, lowest cost. Return flight next Friday around the same time. Also reserve a taxi to and from the airport, and a room at the same hotel as last time I went there." Or let it find you a new job, given your experience and a list of available positions. Or ask it for a concise summary of the relative merits of the top 7.1 home theater sound systems under $1K. Etc., etc. The list is endless.

    This is not to say that Mundie has a "vision" - my impression of him is that he will tell you anything to justify Microsoft paying him $1M a year in combined compensation. However, you can't deny the appeal of a truly natural user interface.
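
    To make the idea concrete, here is a toy sketch of turning that first request into a structured booking intent. The keyword rules and field names are invented for illustration; a real "super secretary" would need genuine natural-language understanding, not regexes.

        # Toy intent extraction for the "super secretary" request above.
        # Field names and keyword rules are illustrative assumptions.
        import re

        def parse_travel_request(utterance: str) -> dict:
            """Pull destination, day and constraints out of a flat sentence."""
            intent = {"action": "book_flight"}
            dest = re.search(r"go to (\w+)", utterance)
            if dest:
                intent["destination"] = dest.group(1)
            day = re.search(r"on (\w+day)", utterance)
            if day:
                intent["depart_day"] = day.group(1)
            intent["cabin"] = "coach" if "coach" in utterance else "any"
            intent["nonstop"] = "no connections" in utterance
            intent["optimize"] = "lowest cost" if "lowest cost" in utterance else "default"
            return intent

        request = ("I want to go to Chicago on Friday, book airline tickets, "
                   "coach, no connections, in the evening, lowest cost.")
        print(parse_travel_request(request))
        # {'action': 'book_flight', 'destination': 'Chicago', 'depart_day': 'Friday',
        #  'cabin': 'coach', 'nonstop': True, 'optimize': 'lowest cost'}
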

    • Or let it find you a new job, given your experience and a list of available positions.

      If you had an artificial secretary that could do the things you mentioned, then there would be no job for you because AI would have been solved.

      • AI won't be solved until computers are creative.

        The only time a computer created something "all by itself" is when a human told it to do so, told it what it should look like, and even then the computer used "the million monkeys" approach (think genetic programming). That's not creativity.

        Such an artificial secretary probably won't need "real" AI, just sophisticated algorithms for speech recognition, parsing human language, searching the interwebs, and so on (the last third of this is already done: Google). All t

        • by SL Baur ( 19540 )

          AI won't be solved until computers are creative.

          INTERESTING. NOW LET'S TALK ABOUT HOW YOU FEEL ABOUT YOUR MOTHER.

          Good god. The lameness filter kills what should have been a great joke. Sigh.

        • First we need to know what creativity is, and how we can make things "out of nothing". Maybe it's possible thanks to one thing we try to eliminate from computers all the time: random noise. If we look at the brain, it's really noisy; it's actually about as noisy as it can be and still function. So human creativity is really a "many monkeys" approach with filters (broken ideas are typically filtered out before we even consciously think about them).
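
          A minimal sketch of that "many monkeys with filters" picture, using an arbitrary target string purely for illustration: random mutation supplies the noise, and a fitness check is the filter that quietly discards the broken variants.

              # "Many monkeys with a filter": random variation plus selection.
              # The target string and parameters are arbitrary illustrative choices.
              import random
              import string

              TARGET = "creativity"
              ALPHABET = string.ascii_lowercase

              def fitness(candidate: str) -> int:
                  """The filter: count positions matching the target."""
                  return sum(a == b for a, b in zip(candidate, TARGET))

              def mutate(candidate: str) -> str:
                  """The monkey: randomly change one character."""
                  i = random.randrange(len(candidate))
                  return candidate[:i] + random.choice(ALPHABET) + candidate[i + 1:]

              current = "".join(random.choice(ALPHABET) for _ in TARGET)
              while current != TARGET:
                  child = mutate(current)
                  if fitness(child) >= fitness(current):  # keep non-worse variants
                      current = child
              print(current)  # reaches "creativity" by filtered noise alone
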
          • Actually, I was toying with genetic programming a while ago, and also building a hardware RNG from an old FM radio (using the sound card as an ADC) to supply fine randomness for the "natural selection". The goal was to make the interpreter "rewrite" itself in its own language, and then maybe see what else it could do. The Global Consciousness Project (http://noosphere.princeton.edu/) inspired me to start that experiment, but sadly all it ended up with was a half-broken implementation of my own pr
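
            Assuming a setup like the one described (radio static sampled through the sound card), the whitening stage might look like the sketch below. The sample source here is a simulated stand-in; a real build would read frames from the capture device instead.

                # Sketch of the sound-card RNG idea: keep each noisy sample's
                # least significant bit, then von Neumann-debias the bit stream.
                import random

                def raw_samples(n: int):
                    """Stand-in for capturing 16-bit radio static (assumption)."""
                    return [random.randrange(65536) for _ in range(n)]

                def lsb_bits(samples):
                    """The least significant bits carry most of the analog noise."""
                    return [s & 1 for s in samples]

                def von_neumann(bits):
                    """Debias pairs: 01 -> 0, 10 -> 1, discard 00 and 11."""
                    return [a for a, b in zip(bits[0::2], bits[1::2]) if a != b]

                random_bits = von_neumann(lsb_bits(raw_samples(4096)))
                print(len(random_bits), "whitened bits")
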

      • The "secretary" example is doable with technologies we have today. It does not, strictly speaking, require "strong AI". It would be ridiculously hard to build and ridiculously expensive though, if built using today's technologies.

  • I wish they'd just call it virtual reality.
  • by Orne ( 144925 ) on Friday September 26, 2008 @11:19PM (#25174887) Homepage

    The system has been programmed to differentiate people by their clothing. Someone in a suit, for instance, would more likely be a visitor and not a potential shuttle rider.

    Because nothing says "Good Idea" like differentiating between people based on their appearance.

  • I choose.. the warlock!

    "entering combat"

    "bill gates is afflicted by fear"

    "bill gates gains blessing of BSA lobbyists"

    "bill gates suffers 940 damage from your deathcoil (shadow)"

    "bill gates is afficted by fear"

    "bill gates suffers 9,450 damage from your soul fire (fire)"

    "you have slain mitch bainwol!"

  • Maybe Mundie found old stories from the Olivetti & Oracle Research Lab (ORL), which was owned by Olivetti and Oracle Corporation until AT&T acquired it in 1999. They had an interesting environment where everything followed you around (music, phones, whatever), and you had your own avatar to show and connect with. Some good came out of it, like the free VNC, before AT&T closed the whole thing down. They don't need research?

    Now, "virtual" presence is kind of weird, how to define it? Let me explain - in 80's

  • by Louis Savain ( 65843 ) on Saturday September 27, 2008 @01:19AM (#25175359) Homepage

    Microsoft and Intel are led by aging baby boomers who have run out of good ideas simply because they are too old and set in their ways. They still think in 20th century mode. They have no clue as to how to solve the parallel programming crisis [blogspot.com] and they speak about the future as if a solution were a fait accompli. Whoever is the first to come out with the correct solution will rocket right past them and they won't know what happened until it's too late. Personally, I am tired of Windows and x86-based processors. But then again, Linux is just as old as Windows. Doesn't matter. Soon they will all go the way of the Dodo.

    • by SL Baur ( 19540 )

      But then again, Linux is just as old as Windows. Doesn't matter. Soon they will all go the way of the Dodo.

      Actually, MS Windows is a bit older than Linux, but much younger than Unix. See my latest journal entry for a long explanation of why Linux, the BSDs, etc. are NOT going away any time soon.

    • Microsoft and Intel are led by aging baby boomers who have run out of good ideas simply because they are too old and set in their ways. They still think in 20th century mode. They have no clue as to how to solve the parallel programming crisis and they speak about the future as if a solution were a fait accompli. Whoever is the first to come out with the correct solution will rocket right past them and they won't know what happened until it's too late. Personally, I am tired of Windows and x86-based processors. But then again, Linux is just as old as Windows. Doesn't matter. Soon they will all go the way of the Dodo.

      I love how you come here all the time to spam about how the world doesn't "get" parallel programming, just to push your articles.

      But you're missing that we do use parallel programming extensively nowadays. Do you know what CUDA is? Do you know of Apple's OpenCL project?

      Do you know that even the new Flash Player 10 includes a language specifically designed to run "embarrassingly parallel" workloads on everything from stock CPUs to stock GPUs? You can use this language (called "Pixel Bender") to process and generate anything from

      • But you're missing that we do use parallel programming extensively nowadays. Do you know what CUDA is? Do you know of Apple's OpenCL project?

        So CUDA and OpenCL are the solution to the parallel programming crisis? Quick, go tell that to Microsoft and Intel because they're wasting tens of millions of parallel programming research dollars at Berkeley, Stanford, the University of Illinois at Urbana-Champaign and many other research labs around the world. I'm sure they'll be thrilled and reward you accordingly.

        • So CUDA and OpenCL are the solution to the parallel programming crisis? Quick, go tell that to Microsoft and Intel because they're wasting tens of millions of parallel programming research dollars at Berkeley, Stanford, the University of Illinois at Urbana-Champaign and many other research labs around the world. I'm sure they'll be thrilled and reward you accordingly.

          First of all, that "crisis" is in your head.

          And yes, they are spending money on research because, unlike you, they realize there's no silver bullet for the parallel programming problems of today or tomorrow. OpenCL, CUDA and Pixel Bender handle "embarrassingly parallel" problems perfectly, which applies well to graphics rendering, audio processing, filtering and other math problems.

          Another class of problems is about to be handled with research into transactional memory that companies like M
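
          As a tiny illustration of the "embarrassingly parallel" class mentioned above: every pixel in a filter like this is independent, so the work splits across workers with no shared state. The fake image and brighten() function are made up for the example; CUDA, OpenCL and Pixel Bender apply the same idea with GPU threads instead of processes.

              # Embarrassingly parallel per-pixel filter via a process pool.
              # Data and function are illustrative stand-ins.
              from multiprocessing import Pool

              def brighten(pixel: int) -> int:
                  """Per-pixel operation; no pixel depends on any other."""
                  return min(255, pixel + 40)

              if __name__ == "__main__":
                  image = list(range(256)) * 16  # fake 4096-pixel greyscale image
                  with Pool() as pool:
                      result = pool.map(brighten, image, chunksize=256)
                  print(result[:8])  # [40, 41, 42, 43, 44, 45, 46, 47]
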

          • First of all, that "crisis" is in your head.

            Well, it must also be in the head of computer science professor Kunle Olukotun [stanford.edu], who recently said, "If I were the computer industry, I would be panicked, because it's not obvious what the solution is going to look like and whether we will get there in time for these new machines" (source: CNN Money [cnn.com]). Glad to know you already have the answer. The fact remains that the vast majority of programmers have trouble programming parallel computers. And no, most parallel a

  • They didn't have to execute every command by speaking orders in a virtual world. No wonder it is so s02032139391993 f0020 200f 0200d0 330
  • This article brings Vernor Vinge's Rainbows End [wikipedia.org] to mind. It's a science fiction book set in the near future, where people use personal gear to operate in a globally connected virtual reality. The spatial web seems to have many similarities to it. A good read; I recommend it!

  • by Snart Barfunz ( 526615 ) on Saturday September 27, 2008 @03:27AM (#25175775)

    Pants up butt-crack - Bill Gates
    Carrying stack of chairs - Ballmer's PA
    Bristling with fragmented chair splinters - Ballmer's PA
    Sporting a wedgie - has worn a Zune in public
    Shoes on wrong feet - Windows Vista Guru
    Wearing a food-stained straight-jacket - Mojave experiment subject
    Wearing a suit - Prey. Launch the sales droids!

  • Spatial computing? What we need is aural computing: computing by voice command. I blame Apple for popularizing the graphical user interface. Massive amounts of time and resources have been devoted by programmers and software designers to perfecting the GUI, first windows, now full-blown virtual presences (avatars, or is it MS Bob 2010?). If the Unix command prompt had triumphed (maybe even in its anemic DOS mutation), we would now have true artificial intelligence. Remember Stanley Kubrick and Arthur C. Clarke [wikipedia.org]'s Spa [wikipedia.org]
    • I think blaming Apple is giving them too much credit. There were half a dozen companies offering machines with graphical user interfaces around the time the Mac came out. Apple wasn't the first and they weren't the most successful one either.

      What I really blame Apple for is not doing a better job on the software architecture. The original Mac's toolbox copied much of the Xerox user interface appearance, but almost nothing of the elegant architecture. Even the NeXT machine that came out a few years later wa

  • Wow, virtual worlds, locating people, natural language processing! What obvious, old idea do you want to copy today, Microsoft?

  • ... got any spare red pills?
  • "Someone in a suit, for instance, would more likely be a visitor and not a potential shuttle rider."

    So take your coat off if you want to penetrate the campus more easily...

  • by euxneks ( 516538 )
    Way to read a frickin' book. [wikipedia.org]
