
The 'Pervasive Computing' Community 113

Roland Piquepaille writes "Most of us use not only computers, but also PDAs and cell phones. And this trend is accelerating in our increasingly networked wireless world. We might use hundreds of computing devices by the end of this decade. Still, we are slaves to our machines. With every new device, we have to learn new commands, languages or interfaces. The Cambridge-MIT Institute (CMI), a strategic alliance between the University of Cambridge in the UK and the Massachusetts Institute of Technology in the U.S., has had enough of it and wants to give control back to the users. So it launched its 'Pervasive Computing' initiative with the intention of tackling this challenge. In particular, the group wants to develop new technologies to make it easier for us to interact with all these computers. This overview contains more details and references about this initiative."
This discussion has been archived. No new comments can be posted.


  • I'm a PDA addict (Score:3, Interesting)

    by Face the Facts ( 770331 ) on Monday April 12, 2004 @09:38AM (#8836979) Journal
    I've got a Zaurus 5500, and love it for what I use it for, but a Palm seems to make a better PDA. So, I've come to the conclusion that a Linux handheld device isn't a PDA, but a small-sized computer. So, a Linux PDA makes for a good platform if you are a Unix developer who needs to write custom hand-held software. Also, while there are a bunch of Palm apps out there, not many are free. It's not that I have to have everything for free, but oftentimes an app doesn't quite work the way I want, and I like to be able to tweak them a bit. As an example, I found a good TI-85 calculator emulator, but the buttons looked awful. A bit of messing around with the xpm definitions, and now the button colors are defined in the config file. This is the kind of stuff that you just can't do with the non-free apps that you find on Palm or PocketPC.

    As for what I use mine for:
    * Web lookups (i.e., looking up items in Internet phone books, TV listings, dictionary definitions)
    * Other web browsing when it wouldn't do to carry a laptop (meetings, nature's call, etc.)
    * Custom PIM app -- I wrote a web-based app which allows me to organize data and meeting notes in a unique way that suits me. On my Zaurus, I've got a version of the app served up by a local web server. Whenever I'm within wireless range, a background task automatically keeps the local database synced with the one on my server. (Once I perfect it, I'll put it up on SourceForge.)
    * Entertainment -- with a wireless card in the Zaurus, and one in my laptop, I can stream movies and music to the kids in the car served up by my laptop which I use for navigation. It also runs Mame.
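The background-sync idea described above can be sketched as a simple last-write-wins merge. This is an illustrative sketch, not the poster's actual app: record tables, the `mtime` field, and the `sync` function are all assumed names.

```python
# Hypothetical last-write-wins sync between a local (PDA) and remote (server)
# copy of a PIM database. Each record is {"data": ..., "mtime": timestamp};
# whenever the device is in wireless range, the newer record wins on both sides.
def sync(local, remote):
    """Merge two record dicts keyed by record id; newest mtime wins."""
    for key in set(local) | set(remote):
        a, b = local.get(key), remote.get(key)
        if a is None or (b is not None and b["mtime"] > a["mtime"]):
            local[key] = b      # remote copy is newer (or record is new remotely)
        elif b is None or a["mtime"] > b["mtime"]:
            remote[key] = a     # local copy is newer (or record is new locally)
    return local, remote
```

Real sync protocols also need conflict handling for records edited on both sides between syncs; last-write-wins simply discards the older edit.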
    • All I want is my GPS-enabled, 802.11b/g-capable PDA cell phone. Is that too much to ask?
      • I want my tricorder, dammit!
      • Solution: TUI. Text user interface.

        The most effective tool is literally command-line knowledge and tab completion. Pine is one of the simplest programs to teach someone to use, and it's fast. And it doesn't use many resources.

        Like, compare WordStar to any modern MS Word... I mean, there's like no comparison.

        I can teach someone who is handicapped and impatient how to use WordStar or Pine in minutes. After months, I'd still have to answer questions about Word.

        Actually, I still have to answer my own questio
    • Re:I'm a PDA addict (Score:2, Interesting)

      by RickL ( 64901 ) *
      I got a 5600 a few weeks ago. I'm having a really hard time migrating away from the Palm for PIM applications.

      Sharp did a Bad Thing when they changed the PIM file formats from XML used in all previous versions to a binary file. Not only does it make it harder to roll your own, but it breaks compatibility with other tools.

      I've thought about writing a web-based PIM suite that would synch with the Z through SOAP or such. I found with my Palm that I did most data entry and quite a few of the look-ups at t
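The "roll your own" advantage of the old XML format mentioned above can be illustrated with a few lines of standard-library parsing. The element names here are invented for illustration; they are not Sharp's actual schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical PIM entry in the style of an XML address book.
# With an undocumented binary blob, none of this is possible without
# reverse-engineering the format first.
ENTRY = """
<contacts>
  <contact><name>Ada</name><phone>555-0100</phone></contact>
  <contact><name>Grace</name><phone>555-0101</phone></contact>
</contacts>
"""

def phone_of(xml_text, who):
    """Look up a phone number by contact name."""
    root = ET.fromstring(xml_text)
    for contact in root.findall("contact"):
        if contact.findtext("name") == who:
            return contact.findtext("phone")
    return None
```

This is the compatibility the poster means: any scripting language with an XML parser can read, transform, or sync the data without vendor tools.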
    • The parent post is stolen, except for the first paragraph, word-for-word from this post [slashdot.org] by tchuladdiass (174342) [slashdot.org].

      It was stolen via the anti-slash.org database [anti-slash.org].

      In fact, all of the parent's posts are plagiarized via the anti-slash.org database.

      Mod parent down.
  • Clarification (Score:5, Insightful)

    by jetkust ( 596906 ) on Monday April 12, 2004 @09:40AM (#8836992)
    Still, we are slaves to our machines...

    No, we are slaves to the programmers who program the software that runs on our machines.
    • Re:Clarification (Score:1, Insightful)

      by Anonymous Coward
      could we then say the programmers are slaves to the managers or the company?

    • Are we slaves to the batteries that run the machine that runs the software? What about food - are we slaves to McDonalds so we can have the strength to push the buttons on the machine running on batteries that execute the software?
    • Still, we are slaves to our machines...

      No, we are slaves to the programmers who program the software that runs on our machines.


      You're my slave?

      Cool.

      What, specifically, does that entail? Can I order you to fetch me some peeled grapes?

    • > No, we are slaves to the programers who program the software that runs on our
      > machines.

      No, I chose to use a phone, and other devices. They're simple, and all work the same way, even if they use a different keypress or menu structure. If you have trouble with such things, then the information they offer you will probably be too much for you in the first place and you should probably stick with books or tv or whatever.
      • yes, there's no complex information at all in books.
        • > yes, there's no complex information at all in books.

          Books are ok. Not much interactivity though, nor do they hyperlink to other documents containing definitions or examples. Nor can they play animations, sounds etc.
    • And if we can at least get one down, it's pretty much common sense to run them all. I never had a problem learning the Palm OS, given my extensive knowledge of Windows and Linux.
  • Who really doesn't see a problem with the current setup?

    Varietous [unwords.com] interfaces and commands make things fun, plus it increases one's aptitude.

    I say out with pervasive computing.
    • by Anonymous Coward
      It increases your aptitude, alright, at jumping through a series of arbitrary hoops. Some of us would like to have time to be technically proficient (not 37337, but strongly capable at least) AND study (in depth) the arts and sciences. Hopefully combine the two in some innovative ways, rather than learn X out of the Y platforms available today, but maybe not tomorrow.

      Now who was it running this project? Cambridge and MIT? Yes... more power to them!
    • It's manageable now by us geeky types, but the average person just does not have the time to figure it all out. And if this stuff evolves like the article predicts, it'll become too much for anyone to handle, at least if they want to use the devices for any purpose other than just for the geek factor.

      That being said, I don't really think a pre-emptive initiative like this can really hope to solve the future problems they're aiming for. Technology is just too unpredictable, not to mention all of the economi

    • I enjoy the challenge, I enjoy variety and I like learning new things, new interfaces, etc. However, I get greater satisfaction from making it so that other people don't have to if they don't want to.

      There will always be people who want to do their own thing and this is fine. I will probably be one of these people. But it is better to standardize everything first, so that everyone doesn't have to go through it.
  • by SpaceLifeForm ( 228190 ) on Monday April 12, 2004 @09:41AM (#8837004)
    Yes, I prefer my girlfriend that way also.
  • Long overdue. (Score:5, Informative)

    by torpor ( 458 ) <ibisum@ g m a i l . c om> on Monday April 12, 2004 @09:44AM (#8837019) Homepage Journal
    Disclaimer: I work for a synthesizer manufacturer. [virus.info]

    Synthesizers and other forms of electronic musical instrumentation have been having the same problems as computers.

    Nevertheless, the paradigms of "Page Up/Page Down" and "Parameter Left/Right", and "Patch Up/Dn", and "Edit/Play", as horrible as they are, have served 'standard interface' requirements for years. There is a 'standard user interface' in this realm, as crap as it is.

    Manufacturers in this market have copied each others interface ideas freely and easily, and it has resulted in an, admittedly hodge-podge, 'general user interface' set of 'music machine hacker' chops. "Multi-mode"/"Single-mode", etc. can generally be found on most modern synth platforms. Any synth geek around knows that the patch +/- keys are the ones you look for first, then the 'filter resonance knob', or whatever.

    Computers would do well to learn from the lessons of musical instruments in this regard. It never ceases to amaze me that all these TLA "Initiatives" often disregard even the most obvious examples of solutions to problems... I guess because their grants aren't "directed" to those realms.

    In any case, I hope to see some interesting results from CMI. At Access, we're really interested in human/user-interface problems and good ways to solve them ...
  • by FunWithHeadlines ( 644929 ) on Monday April 12, 2004 @09:44AM (#8837022) Homepage
    Nice goal. It would be great to have computers responding intuitively to my wants, but this is easier said than done. Sounds to me as if the ultimate goal is Star Trek. You know the scenes: Landing party checks out an abandoned alien ship. Some crisis occurs requiring them to access the alien computer. Captain tells Spock (or Data, or O'Brien/Dax, or Seven, or T'Pol depending on which Trek you want to use as an example) to access the alien's computer. Said person punches a few buttons (or touch screens) and voila! Access.

    It's almost as easy in the Trek universe as starting up an alien ship's engines, or navigating it through an asteroid belt. One thing you gotta say about those aliens: They followed the CMI 'Pervasive Computing' initiative slavishly, and we can be so thankful they did or Spock (or Data, or O'Brien/Dax, or Seven, or T'Pol) would have looked like incompetent idiots.

    • One of the subplots in the Dune science fiction series was that humans got too dependent on their machines, particularly their computing machines, and had to fight a galactic war to free themselves. Two of the eleven Dune books so far are specifically about this.
      • The problem in Dune is that they then were dependent on Spice, which gave some the ability to see the future. It was that ability that was used to control a population in the trillions. (With hundreds of worlds, then yes, it becomes possible.)

        I will spoil the ending that I know of: the Atreides line's ultimate gift to humanity is that some people can't be seen in the future, yet they can still affect events, returning us to a better future. Also, spice once again becomes unnecessary.
  • As technology becomes more and more pervasive in our lives we are growing up with a generation of people who don't know what it's like to live without computer assistance. They also are primarily exposed to computers as these large devices that do a whole bunch of things but have a terrible interface. They don't understand that computers can be small, unobtrusive, and do their jobs without the user having any idea they are there.

    Automobile control systems are one type of the latter while microwave oven controllers are a type of the former. The car control system works great and for the most part the user can be completely oblivious to its existence. However, the microwave oven control pad is getting more and more complicated every day with too many settings, too many choices, too much interface getting in the way of the user.

    When working on your next consumer device (those of you working on that kind of thing), think about making it invisible. That is the key to making it indispensable.
    • I don't know about you, but on my microwave there are about twenty buttons you can push (including the numeric keys). Of the twenty buttons I use two:

      1. Speed Defrost (Computer controlled)
      2. Speed Cook (Computer controlled)

      My wife took out something to defrost and spent over an hour trying to get the food defrosted. I walked in, went "Let me do that" and had it defrosted in about five minutes.

      Now, if you wanted to talk about our old microwave - it would have taken me probably the same amount of time to
  • by OglinTatas ( 710589 ) on Monday April 12, 2004 @09:45AM (#8837029)
    From the article, they state this of computers: "It needs to be sentient, loyal, small and low maintenance."

    I propose adding the following rules:
    0. It may not injure humanity or, through inaction, allow humanity to come to harm.
    1. It may not injure a human being, or, through inaction, allow a human being to come to harm except where such orders would conflict with the Zeroth Law.
    2. It must obey the orders given it by human beings, except where such orders would conflict with the Zeroth or First Laws.
    3. It must protect its own existence, except where such protection would conflict with the Zeroth, First or Second Laws.

    Except for "small" and maybe "low maintenance", their goals seem to anthropomorphize computers.
  • Challenges (Score:5, Insightful)

    by marcello_dl ( 667940 ) on Monday April 12, 2004 @09:48AM (#8837048) Homepage Journal
    From the overview:
    There are still significant challenges to face before all these devices can improve our quality of life, such as designing better interfaces with these ever smaller computers. So the CMI has decided to tackle these challenges and is running several projects such as improved security, more robust networks and power-efficient computer architectures.

    IMHO the worst challenges are of a commercial nature, not technical. Given enough time and funds, CMI can surely set usability standards for pervasive computing, but manufacturers are likely to ignore or "extend" them to promote their own platform over the competition.
      Given enough time and funds, CMI can surely set usability standards for pervasive computing, but manufacturers are likely to ignore or "extend" them to promote their own platform over the competition.

      And consumers can choose the best one.
  • variety (Score:2, Insightful)

    by sreid ( 650203 )
    Variety is the spice of computing. Of course, some OSes are terrible, but that makes the others look better.
  • by MrNonchalant ( 767683 ) on Monday April 12, 2004 @09:49AM (#8837056)
    This is something I'd love to see happen in my lifetime, sort of a life goal if you will. The idea here is like Bluetooth but infinitely scalable, extendable in all directions, peer to peer, and so drop-dead simple that a grandmother could use it without a manual.

    In a perfect system like this each node has about a 10 or so foot wireless range, each node extends the network like a repeater, and these babies are embedded in absolutely everything. Your robotic lawnmower needs to talk to your irrigation system but is 20 feet from it? Simple enough, both devices understand the network physical topology intimately and just route the communication through your SUV. And nobody should have to configure a thing for this to work.
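The multi-hop routing this comment imagines (mower to SUV to irrigation system) can be sketched as a breadth-first search over whichever nodes are currently within radio range of one another. The node names and link table below are illustrative, taken from the comment's own example, not any real protocol.

```python
from collections import deque

# Illustrative link table: an edge means two devices are within the
# ~10-foot wireless range of each other. The mower and the irrigation
# system are out of range of each other, but both can reach the SUV.
LINKS = {
    "mower":      ["suv"],
    "suv":        ["mower", "irrigation"],
    "irrigation": ["suv"],
}

def find_route(src, dst):
    """Return the shortest hop-by-hop path from src to dst, or None."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in LINKS.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None
```

In a real mesh, each node would discover its neighbors dynamically and the routing would be distributed rather than computed from a global table; this sketch only shows the "route through an intermediate node" idea.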
    • Presumably, in this case, your lawn mower is negotiating with your irrigation system to reduce the amount of water for the grass. Not that I'm implying your lawn mower is lazy.

      In the future, we will be able to optimize simple things like irrigation by allowing the stakeholders in the process to act as agents in a complex system. These agents will be able to optimize their system(s) by adapting their own (rule based) behavior to the behavior of other agents in the system. Ubiquitous communication is the fi

      • Nothing is worse than trying to mow wet grass. For the record, I do not possess a robotic lawnmower, a sprinkler system, or an SUV. I simply couldn't find another example that didn't somehow involve a toaster. I think you'll agree we've all had enough of that futuristic premise.
      • These agents will be able to optimize their system(s) by adapting their own (rule based) behavior to the behavior of other agents in the system.

        Very true: connecting devices (with or without wires) is just one step, making them "talk" to each other is another. The mention of "sentient" and "loyal" sounded to me a bit like AI - which I think won't provide any solutions in the next decade, as the topic of AI is not so hot anymore in research.

        Agents on the other hand may truly be useful, although IMHO
    • The reason you don't want to do this (and similar P2P wireless phone ideas) is that it drains the batteries something crazy.

      Besides, the complexity of dynamic networks would make it hard to make them work reliably. (Roaming and similar issues, for instance.)
    • Yes, this will happen soon, you don't have to worry, unless you already are on your deathbed. The only reason this didn't happen earlier is that there was no need for it. In the past computers didn't move; they stood on our desks, and notebooks and PDAs were rather rare and didn't move far from the power grid anyway (and where you can find power, you can find a phone line). Only recently we got capable devices with long battery life, and Wi-Fi (Bluetooth, 3G) quickly emerged. Everyone is talking about dynamic netw
  • Focus on software (Score:4, Insightful)

    by vurg ( 639307 ) on Monday April 12, 2004 @09:50AM (#8837060)
    The council seems more focused on developing new hardware that can overcome these issues. But I think the main problem with what we have now is the rogue software programs that take away that control from us (e.g. spyware, open relay SMTP servers that send spam).
  • Ummmm...? (Score:5, Interesting)

    by TubeSteak ( 669689 ) on Monday April 12, 2004 @09:53AM (#8837078) Journal
    Not to troll or be considered flamebait, but doesn't anyone else see the irony?
    in particular, the group wants to
    develop new technologies to make it easier for us to interact with all these computers (read as 'old' technology)
    Now that I've read the article, I like what they're doing. Instead of trying to complicate our lives further, they want to change the way things work, which is good. Longer battery lifespans, secure UIs, ubiquitous communication, etc.

    I do think it's a waste of time to try and create a 'better' input method. Pretty much the only thing faster than typing is a direct connection to your brain. We can type faster than we speak and read faster than we can listen.

  • by dkirchge ( 678383 ) on Monday April 12, 2004 @09:53AM (#8837079)
    Mod me a Luddite troll if you wish, but it seems to me that this is an appropriate time to step back and ask ourselves why we need all this computing power at our fingertips everywhere we go. I tried really hard to get into the PDA thing, as well as having had to use laptop computers for my job over the years; however, I've found that the best computing toolset I could carry for any business trip was... a good pen and a pad of paper, along with a decent solar-powered scientific calculator. Never ran out of power at inconvenient times, never had to be rebooted because it locked up, never started beeping uncontrollably in the middle of a meeting, and it had a friendly interface able to tackle any task from word processing to number-crunching. My doodling during boring meetings even looked attentive and productive rather than looking like someone playing a video game...
    • by neglige ( 641101 ) on Monday April 12, 2004 @10:32AM (#8837344)
      Honestly, I don't think the question "why we need all this" will be asked. It simply will happen. Primarily to create additional 'benefit' for the users, whatever it may be.

      I agree with you that the tools today are not quite there. Laptops are too bulky, PDAs (esp. PocketPCs) drain the battery too quickly. Still, consider that we are pretty much at the beginning of the development, comparable to the 60s or 70s with regard to the PC.

      Taking into account the speed of development (and the interest from both the potential users and the industry), and considering what cell phones looked like 5-10 years ago, imagine what will happen over the next 10 years. My personal bet: it will be impressive.

      To use another parallel from the early days of the internet, I'm sure nobody saw the immediate benefit of transporting some data packets over a network. Want news? Buy a newspaper. Want music? Buy a CD. You get the idea ;)

      Again, I agree with you that today's mobile/pervasive technology can be improved - pen and paper are currently still essential. And I'm sure it will happen. Then we end up with electronic paper which takes your notes and then displays, if requested, the headlines of the major newspapers around the globe.
    • by Anonymous Coward
      The key problem you're having is that you're focusing on the shortcomings of your current technology, instead of the possibilities of the future. Obviously, a laptop or PDA (or even a Tablet PC) isn't going to be as good at note-taking as a pad of paper, but imagine (for example) a device that looked, felt, and acted like paper, except that you could scribble down your equation, write a '=', and the device would write the answer for you. Or imagine that you could write your notes normally, but they would auto
    • It's a matter of both personal tastes and personal abilities.

      I'm glad paper and a calculator work for you, but they were inadequate for me. My productivity and my sanity have improved greatly since I got my first Palm.

      With the ADD I have struggled with all my working life, I find it pretty damned useful to have a Palm track my trivia, including filtering my e-mail for me, as well as keeping my schedule so when it's time for me to do X, I don't absent-mindedly fail to do so.

      A friend from high school is P
  • Interface research (Score:4, Informative)

    by PlatinumInitiate ( 768660 ) on Monday April 12, 2004 @09:57AM (#8837099)

    Georgia Tech and others are working on a product called Squeak [squeak.org] which could gain ground in this regard. Some of the players involved are key names from the early years of computer interface/graphics research, including Dr Alan Kay.

    Squeak is an open source product with quite a flexible license, and although they are mainly concentrating on educational apps, it is worth noting that in the system itself they have developed an unusual, yet addictive, UI. It is such an easy system to learn that quite complex tasks can be done within a few hours of learning the basics of the system and going through the tutorials.

  • by G4from128k ( 686170 ) on Monday April 12, 2004 @09:57AM (#8837102)
    This sounds exactly like what M$ (motto: "All your devices are belong to us") is trying to do - PCs, office software, servers, enterprise software, XBox, PocketPC, media formats, online music sales, tablet computers, MSN, etc. I wonder who will win the interface definition standardization game? A bunch of really smart people at MIT or an even larger bunch of better funded smart people at Microsoft? (Note: at $6 billion, Microsoft's R&D department has more than 4 times the money of ALL of MIT.)

    Call me bitter, but I fear that with billions in R&D and hundreds of millions of dollars for marketing, M$ will win this game unless they commit suicide [slashdot.org].
    • Actually, if you want to support the "good guys" (in my book, anyway) see my other post [slashdot.org] about this topic.
    • by fcw ( 17221 ) *

      I fear that with billion in R&D and hundreds of millions of dollars for marketing, M$ will win this game unless they commit suicide.

      Even if they wanted to play this game, which I don't believe they do, Microsoft have no chance:

      • Microsoft's approach to the market has always been to copy and co-opt, not to invent and to lead.
      • In ten years, almost nothing invented by their big, shiny research group has materially affected their commercial products, supporting the contentions that it's basically just f
  • Things like this have been looked at before, but not enough and not recently enough to keep all of the new technology in mind. I think it is insane that we are still using computers almost the same way as we were when the first GUI OS came out. It is time computers reacted to us better.
  • by StateOfTheUnion ( 762194 ) on Monday April 12, 2004 @10:01AM (#8837135) Homepage
    Doesn't the market already regulate this . . . to a certain extent creating standards by embracing ideas that are well accepted by the customer base?

    For example, the Apple Newton's terrible handwriting recognition system vs. the Palm Pilot's . . . and Palm's system of handwriting recognition is becoming more ubiquitous as others license the operating system (Handspring (now part of Palm), Sony, etc.)

    Or a simple example, how many software products for sound recording or CD audio playing do not have the familiar play, rewind, FF and stop that look like a right arrowhead, double left arrowheads, double right arrowheads, and a square? If someone tried to write a player/recorder without this interface, would a significant number of people actually buy it even if in all other respects it was a great program?

    What about a trash can in the GUI for deleting files? . . . or even the concept of a mouse? All these became "standards" in their own right because they were well accepted by the consumer.

    A standards body may save some knock down drag out fights over "standards" in the marketplace and may speed things up a bit, but the ultimate challenge is the marketplace . . . if people think that the interactive experience from a product sucks, then they're not going to buy it . . .

  • continues from ... (Score:4, Interesting)

    by sir_cello ( 634395 ) on Monday April 12, 2004 @10:05AM (#8837155)

    This has been a floating research topic in Cambridge for a long time.

    The old Olivetti Research Labs (ORL) performed a lot of blue sky research activities, including production of omniORB (a free CORBA ORB) and VNC (Virtual Network Computing) and so on. In fact, VNC was part of the focus on pervasive computing.

    There was an umbilical cord between ORL and Cambridge Computer Laboratory with people like Andy Hopper and so on.

    AT&T bought ORL in the late 1990s, bringing it under its AT&T Labs arm. Unfortunately it was too blue sky for the AT&T of nowadays (e.g. AT&T Labs in Middletown NY is more commercially oriented - and as we've seen recently, they've lost a lot of fantastic talent by changing their focus), and it closed in 2002.

    Microsoft Research Institute in Cambridge has a lot of staff that fell out of these places, and the umbilical cords remain. It's an incestuous community (but a good one, it breeds a lot of new and interesting things).

    The kinds of blue sky technologies that used to come out of these labs are now being produced by the open source community.
  • Still, we are slaves to our machines. With every new device, we have to learn new commands, languages or interfaces. - No, some people are slaves of proprietary so-called operating systems. Other people use operating systems and programs which get improved, but don't change the user interface with every new version. For example, if you are a Linux user (or addicted to Unixes in general), you will feel at home instantly, whether you use it on a cluster, a server, a desktop, a laptop or notebook [tuxmobil.org], a PDA [tuxmobil.org], a mobile (cell) phone [tuxmobil.org], a wearable [tuxmobil.org] or whatever [tuxmobil.org].
  • Just my opinion of course but instant messaging etc. has just served to make people unable to think.
    Don't know how to do something? Don't bother with the manual or anything, just call tech support. I swear no one can make a decision on wiping their rear without consulting someone else.
  • Being able to talk to your computer (or whatever device you happen to be using) looks really cool on TV, but do we really want to be standing around listening to each other yammer on and on to no one?

    For me, the ultimate interface would be one that can receive 'thought waves'. Of course, this should require 'active thought', directed specifically to the device - don't want little boxes hanging around just listening to your brain all the time.

    It would be a challenge to keep other devices from listening in
    • the ultimate interface would be one that can receive 'thought waves'

      And then a low IQ results in sloppy mouse movement? ;)

      No, seriously, it's a good point. Audio is a good way of communicating if you are in a quiet place. For crowds, where everyone babbles at their device, it's probably impractical. Imagine a room full of people talking to their mobile phones. At the same time. *shudder*

      Another possibility could be projection keyboards and displays. Again, very much dependent on your surroundings,
    • It turns out that mentally rehearsing something you are going to say tenses the muscles in the vocal tract. Some researchers (as reported recently on Slashdot) are trying to measure and interpret these muscle movements as a sub-vocal interface, so it may be possible to design a sub-vocal interface in the form of something like a necklace.

      Also, this has a use as a lie detector, because people unconsciously sub-vocalize, unless they have been trained otherwise.
    • Or perhaps someone could offer a course in 'encrypted thinking'?

      Adkja nbia;'wselir hbia'wdlif asdvnaisd'o fsyxucv lznxdfaw ;erl iscvy u8zxo;cf nadfln ascvliyhzx;oicu vhzs diornsa klscv;'zxk ucfioS dnsvioas;dn sicvus dlfkjnms ;dlnzsLI DFu;LIDF JHS;LDKF V;LSIVCJZ;OLIXCJ V;ZLSDKF NJH;LS KDFJ'fkjz ;so ivj x;ldkma sdklfj zxiocv jsl;ek fz;lc xvkjzp'SId jmsL DK ns ;odij S:Ld ij

      Though it might not have made sense, the paragraph above was the result of "encrypted thinking".

      Now if I could just figure out what
  • It's true that the things you own end up owning you. They influence you financially, as well as influencing the things you know and learn.
  • While I do applaud these two institutions for their initiative, and wish them the best of luck, I feel that the poster has put a little bit of an unfair spin on the news, as if this is something that they just dreamed up entirely on their own.

    IBM, for example, has had a Pervasive Computing Lab [ibm.com] in Austin, TX for several years that has produced several applications [ibm.com] in a multitude of markets [google.com].

    In fact, those of you that are fans of Opera may want to check out Multimodal Browsing [ibm.com] on the Sharp Zaurus [ibm.com]. Those of you with Windows may want to check out IBM's Multimodal Toolkit [ibm.com] for creating these new X+V pages that we might be hearing more about in the future.

    Enjoy the links!
    ~ Mike
  • "Still, we are slaves to our machines."

    Incorrect; the last time I kicked my computer, it didn't hang me or beat me to death. :P
  • by dpbsmith ( 263124 ) on Monday April 12, 2004 @10:35AM (#8837366) Homepage
    Anyone who has tried to work collaboratively on word-processing documents has quickly discovered that it doesn't work, UNLESS a) the collaborative document is almost free of anything above character-level formatting, or b) the collaborators are willing to learn and submit themselves to working within a very complete, rigid, predefined stylesheet that is not changed during the course of the collaboration.

    In the real world, different people achieve the same printed appearance by very different semantic routes, and, as a result, it is almost impossible for person A to edit person B's document, or to cut and paste large portions of material, without messing up the formatting.

    I of course am thinking about Microsoft Word here, but that's just because it's dominant. The same problems occur with virtually any "modern" WYSIWYG word processor. (Although I will say that Word's automatically numbered lists and paragraphs are still a mystery to me, and I have been completely unable to form any mental model that explains their innately perverse behavior.)

    Yes, I have no doubt that there are left-brained people who successfully work collaboratively with markup languages such as TeX, but in the world of casual "computer-literate" users I still frequently encounter paragraphs in which the first line indentation is achieved by typing five spaces.

  • BFD (Score:3, Interesting)

    by fermion ( 181285 ) on Monday April 12, 2004 @10:38AM (#8837396) Homepage Journal
    I think the user interface research is cool, but I really have to say 'so what'. We are a civilization that is dependent on our technology. We use it no matter the consequences or user interface. We use it without understanding of what makes it work. And we don't care. This applies to a shovel, a pencil, a tv, or a computer.

    Right now the computers are in their infancy. The people who will ultimately use these pervasive computing environments, those that are just now in grade school, will be trained to use whatever interface the producers of this technology develop. It is nice to have academic research to back up the production and marketing guys, but which group has the most years of experience getting users to use electronics?

    Take some examples. I never had any trouble learning or figuring out what the dials, yes, the dials, on the TV did. I never had any trouble figuring out that the top dial had to be set to a certain place in order to use the bottom dial. It was actually a complex logic puzzle. I figured it out. The same thing with the VCR. I now see three-year-old children able to navigate the complex buttons of the modern TV with no trouble at all. And they can't even read. They do it by spatial position.

    The same is true for vending machines, microwave ovens, whatever you like. There is no such thing as a truly intuitive interface, although some are more intuitive than others. There is really no reason to make the audio controllers on a computer the same as on a radio, except as a crutch for older users. The young will choose the design that works for them. They will use it in ways that the researchers never thought of. And most will use it without any understanding of the technology, not even the basic notion that the color of the LED is created by quantum mechanics.

    • Take some examples. I never had any trouble learning or figuring out what the dials, yes, the dials, on the TV did. I never had any trouble figuring out that the top dial had to be set to a certain place in order to use the bottom dial. It was actually a complex logic puzzle. I figured it out. The same thing with the VCR. I now see three-year-old children able to navigate the complex buttons of the modern TV with no trouble at all. And they can't even read. They do it by spatial position.

      Once upon a time, though, on

  • It seems like wherever I go these days (coffee houses, the bus, auditoriums, etc.) a large fraction of the population is "lost" in their electronic gizmos. These include music players, cell phones, PDAs, and portables. It's kind of strange: all these people physically in one place, but mentally in completely separate worlds.
    • How is this any different from normal? Walk through a crowded bar and you'll hear a hundred different conversations about a hundred different things happening in the space of a small mobile home. Same location, different worlds. The same thing's true on the highway at rush hour (although I will admit that cell phones are my biggest gripe here). Or if you remember large lecture halls at college, the guy beside you is relating the story that starts "last night I was so blitzed" to the guy beside him, the guy
  • by noidentity ( 188756 ) on Monday April 12, 2004 @10:43AM (#8837431)
    "With every new device, we have to learn new commands, languages or interfaces."

    I agree. I was really annoyed that I had to learn a new interface to drive a car. Why can't it be just like walking? Then there was the TV set. The first time I tried to use one I lit a match, thinking it would work like a fireplace, but nooo, they had to make it different with a huge lighter that supposedly emits invisible light rays. These days I can use a computer, and I can't figure out why they don't make them all just like my desktop machine. Like my cell phone: why doesn't it just have a normal keyboard and mouse, instead of those weird "Talk" and number keys?
  • I propose LCARS (Library Computer Access and Retrieval System) be the operating system of choice for the fleet.
  • Now I'll be able to unplug my teledildonics from my PC, plug it into my Palm Pilot, and hit the road!
  • Most of us are using computers, but also PDAs and cell phones. We might use hundreds of computing devices by the end of this decade.

    Man. I'm gonna need bigger pockets.

  • "Pervasive Computing"

    Let me see if I understand this:

    MIT suggests it: innovative and far-seeing concept for increasing usability, efficiency, and interactivity for humans and their ever-more pervasive electronic devices.

    Bill Gates suggests it (i.e., implementing Windows everywhere, in everything): greedy, self-interested capitalist bastard trying to oppress all of the Open Source Ewoks of Truth and Light.

    Is that pretty much correct?
  • by Syberghost ( 10557 ) <syberghostNO@SPAMsyberghost.com> on Monday April 12, 2004 @12:14PM (#8838251)
    Explain to me again why it's bad if you have an RFID tag in your pocket that "the man" might track, but OK if you have a persistent wireless internet connection in your pocket, that's uniquely identified to you so that the access can be billed?

    Oh, yeah; because the latter can run Linux. NM.
  • Does anyone else feel like it's time for the users to start taking a little responsibility too? Computers are monstrously complex machines. People spend years of their lives studying mere fractions of how computers work. The fact that we've boiled it all down to a smooth, milky interface is absolutely incredible, in my opinion. Windows 3.1 was not there, and KDE is really close. But computer engineers are so used to hearing about how it's their fault that people can't use computers, when really, it's ok to

  • I'm more interested in the Perverse Computing Standards. Who needs another pr0n viewer?
  • "I'm An Idiot, But I Want To Use Another Computer."

    I use many computing devices every day. I program the VCR, the microwave, my cell phone. Also digital watches, game consoles, environmental controls. Not to mention the various pieces of software that mimic 'real' interfaces while at the same time deviating from the OS.
    My point: I use all of these without reading the manuals. I can figure any one of these out, just because the nature of the devices and interfaces is so similar. Has 'Pervasive Computing
  • If this happens then the l33t won't have anyone to look down on!

    Well, except for the guy who still can't figure out the toaster.
  • From the article:

    "It needs to be sentient, loyal, small and low maintenance."

    I can go either way on the sentient part but this sounds like the ideal girlfriend.

  • by Eminor ( 455350 ) on Monday April 12, 2004 @03:19PM (#8840257)
    This "problem" is not limited to computers.

    You have to learn how to use your lawn mower, drive your car, play your guitar, use your dishwasher....

    You cannot expect to get a new appliance without learning how to use it.

  • mya, i saw this a while back; there's tons of projects like this, but few seem to be making it anywhere. perhaps because the problem isn't quite the human/tool interface problem, it's the human/human interface problem.

    anyhoo: MIT Project Oxygen [mit.edu]
    been covered here before, but for the love of redundancy...

  • Although this council thing probably won't be a bad thing, I think that we should let evolution decide the best interfaces. Basically, those that are the best will thrive, and others will copy those characteristics. You can definitely see that with the early Mac OS versus Windows stuff. But I guess this is a step to ensure no one gets sued for plagiarism or whatever.
  • Most of us info-geeks have PDAs of one type or another. Somebody needs to write a simple graphical utility that allows one to read the key codes from a remote control through the PDA's IR port, or download the common table of key functions for most remote controls from an online database.

    Once you have all the information in a nice little table in your PDA, you can throw away all your remotes, and use the PDA to control the entire collection of digital appliances you've accumulated (eg. TV, Surround Sound/Hom
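    The key-code table described above could be as simple as a nested map keyed by device and button. Here's a minimal Python sketch of the idea; all device names and code values below are made-up placeholders, not real consumer IR codes:

    ```python
    # Hypothetical sketch: a nested table mapping device -> button -> IR code.
    # All names and code values here are illustrative placeholders.

    REMOTE_CODES = {
        "tv": {"power": 0x20DF10EF, "vol_up": 0x20DF40BF},
        "amp": {"power": 0x5EA1F807, "mute": 0x5EA138C7},
    }

    def learn(device, button, code):
        """Store a code captured from a real remote via the PDA's IR port."""
        REMOTE_CODES.setdefault(device, {})[button] = code

    def lookup(device, button):
        """Return the stored IR code, or None if it was never learned."""
        return REMOTE_CODES.get(device, {}).get(button)
    ```

    Replaying a looked-up code would then just be a matter of handing it to whatever IR transmit API the particular PDA exposes.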
  • Didn't pervasive computing start years ago at Xerox PARC?

The opposite of a correct statement is a false statement. But the opposite of a profound truth may well be another profound truth. -- Niels Bohr