Communications Biotech Medicine

Sending Messages With Your Brain Via EEG

An anonymous reader writes "From a University of Wisconsin-Madison announcement: 'In early April, Adam Wilson posted a status update on the social networking Web site Twitter — just by thinking about it. Just 23 characters long, his message, 'using EEG to send tweet,' demonstrates a natural, manageable way in which "locked-in" patients can couple brain-computer interface technologies with modern communication tools. A University of Wisconsin-Madison biomedical engineering doctoral student, Wilson is among a growing group of researchers worldwide who aim to perfect a communication system for users whose bodies do not work, but whose brains function normally.' A brief rundown of the system: Users focus on a monitor displaying a keyboard; the interface measures electrical impulses in the brain to print the chosen letters one by one. Wilson compares the learning curve to texting, calling it 'kind of a slow process at first.' But even practice doesn't bring it quite up to texting speed: 'I've seen people do up to eight characters per minute,' says Wilson. See video of the system in action."
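The setup described in the summary is essentially a row/column "speller". As a rough, purely illustrative sketch (all names and numbers are invented here, not the UW-Madison code): once each row and each column of the on-screen keyboard has been flashed and scored from the EEG, the chosen letter is simply the intersection of the best-scoring row and column.

```python
import random

# Hypothetical sketch of a row/column "speller": each row and each column of
# an on-screen keyboard is flashed in turn, and the EEG shows a characteristic
# response only when the flash contains the letter the user is attending to.

GRID = [
    "ABCDEF",
    "GHIJKL",
    "MNOPQR",
    "STUVWX",
    "YZ1234",
    "56789_",
]

def pick_letter(flash_scores):
    """flash_scores maps ('row', i) or ('col', j) to an averaged EEG response
    score for that flash (higher = stronger attention-related response)."""
    best_row = max(range(6), key=lambda i: flash_scores[("row", i)])
    best_col = max(range(6), key=lambda j: flash_scores[("col", j)])
    return GRID[best_row][best_col]

# Simulated scores in which row 2 and column 3 stand out -> letter 'P'.
scores = {("row", i): random.random() * 0.2 for i in range(6)}
scores.update({("col", j): random.random() * 0.2 for j in range(6)})
scores[("row", 2)] = 1.0
scores[("col", 3)] = 1.0
print(pick_letter(scores))  # P
```

Nearly all of the difficulty is in producing reliable per-flash scores from noisy EEG; the lookup step itself is trivial, which is why throughput is still only a few characters per minute.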
  • TCMP? (Score:5, Funny)

    by explosivejared ( 1186049 ) * <hagan@jared.gmail@com> on Tuesday April 21, 2009 @03:35PM (#27665545)
    Is this anything like TCMP? [xkcd.com]
  • by Anonymous Coward on Tuesday April 21, 2009 @03:37PM (#27665579)

    That a brain was involved in the process of Tweeting.

    • by Fred_A ( 10934 )

      That a brain was involved in the process of Tweeting.

      At least with locked-in people it should make for more interesting reading than the rest:

      - I'm in bed
      - In bed today
      - Will spend the rest of the day in bed
      - Still in bed
      - In bed
      - New nurse
      - In bed. Still
      - Today I'm in bed
      - Bed. Again.

  • It may not be as quick as texting yet, but as the interfaces and the technology get better, I don't see why it couldn't be.

    That this is possible at all with such preliminary technology suggests that future iterations could be amazing.

  • This sounds like we're getting that much closer to a human-computer interface. How long till we go a little more invasive and have implants that let us "jack in" a la The Matrix, or Andromeda, or any other sci-fi show or movie, and start interacting merely by thinking? In the next few hundred years we could turn ourselves into fat, unmoving beings plugged from birth to death into computers. Till solar flares overload the system and kill us all, at least... Either way, very interesting to see where this line of research leads.
    • This sounds like we're getting that much closer to a human computer interface.

      Tell him we've already got one.

      Monitor + keyboard.

      Sure, I know what you mean -- a direct neural interface instead of a kinetic input device (like a keyboard).

      I think you're looking at it from the wrong perspective though, in terms of coolness. Instead of implanting a neural interface, wouldn't it be much better if we could just use telekinesis? Then we don't have to deal with extremely messy surgical hardware upgrades.

      Now

    • I think this would also be good for checking whether there is any brain function at all. It could be used to determine the point of brain death. Might also stop all those nasty zombie jokes...

    • Re: (Score:3, Interesting)

      by jd ( 1658 )

      Similar things have been done. Robot arms can be moved by the mind. Rat brains have flown F-14 fighters. EEG sensors placed directly onto the brain (rather than onto the head) produce far more detailed information - it's not a stretch to suggest that some day a sensor layer will be placed on the inside of the skull with a connection to the outside world.

      You could, of course, play with EEG technology yourself. The OpenEEG project details the hardware needed and provides some basic software. See if you can

  • ... I stare at my Windows system all day and frequently send it messages with my mind, and yet the computer still hasn't exploded ...

    In related news (to TFA): This kind of interface was on "House" the other week.

    • by Indiges ( 701323 )
      It was on House indeed; the article even mentions Wilson!!11oneonetwo
    • by geekoid ( 135745 )

      If a device has been on TV, it must be real and proven~

      • by DavidTC ( 10147 )

        The device on House was just a binary interface, and it was entirely plausible.

        The patient could make a cursor go up, or not make it go up, and that was it. He had to make it go up twice for 'no'. And he had a response time measured in seconds just to get the cursor to go up.

        I'm pretty certain that's real. It's actually a good deal less complex than technology that already exists, because I'm pretty certain they've demonstrated full 2-D control of a cursor using brainwaves. Half an axis vs. two full axes.

  • by Finallyjoined!!! ( 1158431 ) on Tuesday April 21, 2009 @03:39PM (#27665605)
    Think how much more Stephen Hawking could give us with this device.

    I know he's in the hossie at the moment and I hope he recovers fully, enough to try this device. :-)

    Send one to him. Now!
    • Re: (Score:2, Funny)

      by Anonymous Coward

      "hossie"? What are you, 6 years old?

    • Re: (Score:3, Insightful)

      by zappepcs ( 820751 )

      If they are not already in contact with him about this, they probably don't deserve to see any profit from it. This is the first thing I thought of since the stories are right next to each other.

      I'm also interested to know if they can improve this to work even when people can't see the keyboard etc.

      • by Penguinshit ( 591885 ) on Tuesday April 21, 2009 @04:56PM (#27666749) Homepage Journal

        Professor Hawking uses a more sophisticated system with word-prediction and a micro-switch activated by a slight motion of his shoulder. He can do much better than 8 cpm. I use a similar system, but use eye gaze on a virtual keyboard rather than a sectoring keyboard.

        Perhaps he's more accustomed to the sectoring keyboard or no longer has the ocular control for the eye gaze system.

        • by jd ( 1658 ) <imipak@ y a hoo.com> on Tuesday April 21, 2009 @07:37PM (#27668907) Homepage Journal

          The eyes probably couldn't be steered accurately enough. His muscular control was a mess when I saw him in person in the late 1980s, and it won't have improved since.

          On the other hand, if they tune into the neurons that control his arm, they may be able to anticipate what he is going to type. That might help accelerate things for him. It's a bit much to be able to decode the language centres sufficiently to record thoughts directly, but it will eventually get to that point.

          Once it is possible to decode his thoughts directly, he would be able to communicate as fast as he can think. Which means that it'll be a babble because he thinks far too fast. On the other hand, it will help him to turn out papers at a fantastic speed.

  • by Anonymous Coward
    If this relies on a screen, why not just use pupil tracking to determine what letter people want to "type", instead of thinking about it, which so far seems slow?
    • Re: (Score:3, Informative)

      by arth1 ( 260657 )

      Because the user might not be able to move his eyes.
      The idea is that no focusing is required, just thought.

      • I don't understand. If only thought is required, then why is a screen even needed? Why can't one think of a letter without seeing it on a keyboard/screen? What am I missing?
        • by arth1 ( 260657 )

          The screen is used to figure out just WHAT letter the user is thinking of, until the technology is there to lift the letter out of someone's head. When the letter that blinks is the same one the user is thinking about, a signal is triggered in the brain, and that can be read by EEG.

          Granted, it might be possible for someone who knows binary ASCII values or Morse code to do without a screen, simply by thinking in two different ways that generate a binary signal, but the screen approach is probably far more feasible.
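          As a purely illustrative sketch of the binary-channel idea above (nothing like this is described in the article), two distinguishable "thought states" could be treated as Morse dots and dashes and decoded without any screen, assuming an upstream classifier that already separates the two states:

```python
# Illustrative sketch only: treat two distinguishable EEG states as Morse dots
# and dashes and decode them into letters. Assumes an upstream classifier has
# already turned brain activity into '.', '-' and ' ' symbols.

MORSE = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

def decode(symbols):
    """symbols: Morse string with letters separated by single spaces."""
    return "".join(MORSE.get(code, "?") for code in symbols.split(" "))

print(decode("- .-- . . -"))  # TWEET
```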

            • I remember hearing of a study once where they found that people could, on average, send text messages faster using Morse code than with a cellphone keyboard. Although AFAIK this was before things like predictive text and such. Nevertheless, you just need to find two muscles that a person is able to control, and from there it would be pretty easy to translate that into Morse code and then text.

            • by arth1 ( 260657 )

              I remember hearing of a study once where they found that people could, on average, send text messages faster using Morse code than with a cellphone keyboard. Although AFAIK this was before things like predictive text and such. Nevertheless, you just need to find two muscles that a person is able to control, and from there it would be pretty easy to translate that into Morse code and then text.

              Be careful what you ask for. You might get to teach someone who can only control his sphincter and prostate...

        • This particular kind of mind-reading functions by detecting electrical impulses created when the chosen letter flashes blue. If row 1 and column 1 both produce impulses, then the desired letter must be A.

          We have a long way to go before you can just think a letter or word and it shows up.

  • by Saint Aardvark ( 159009 ) on Tuesday April 21, 2009 @03:40PM (#27665633) Homepage Journal

    ...can be found here:

    http://nitrolab.engr.wisc.edu/ [wisc.edu]

  • Eye tracking? (Score:5, Insightful)

    by TinBromide ( 921574 ) on Tuesday April 21, 2009 @03:41PM (#27665643)
    So, when the letter being focused on flashes, the EEG picks it up and figures out which row and column are desired...

    So it wouldn't work very well for the blind, and it's not pulling the letters out of the brain; it's just a more sophisticated eye-tracking device, similar to the goggles in Apache helicopters? Why not just fit patients with those for a faster input method?
    • by Chris Burke ( 6130 ) on Tuesday April 21, 2009 @04:02PM (#27665975) Homepage

      So it wouldn't work very well for the blind, and it's not pulling the letters out of the brain; it's just a more sophisticated eye-tracking device, similar to the goggles in Apache helicopters? Why not just fit patients with those for a faster input method?

      Because Apache helicopters are prohibitively expensive even for patients with the best insurance, aside from being illegal for civilians to own. Duh.

      • Re: (Score:3, Funny)

        by RDW ( 41497 )

        'Because Apache helicopters are prohibitively expensive even for patients with the best insurance, aside from being illegal for civilians to own.'

        It would probably be much cheaper to pick up a surplus thought-guided control system from the Soviet MiG-31 project on eBay. The only downside (and this is very important) is that you must think in Russian. You can't think in English and transpose it - you must think in Russian.

      • Stephen Hawking could probably afford an Apache....
        He'd kick some ass too.... not sure about our arms trade agreement with England.... but I'm sure we could work something out where Hawking could get an Apache.. as long as he used it to run a few patrols over pirate-infested waters to protect international trade interests....
    • I would call this a first step. They still don't know how the brain works; they are just guessing right now. They can figure out what part of the brain deals with, say, language, detect what is firing for a given thought, and then adjust the machine: this is what he was thinking about, so this is what the device should do.

      The more we understand the brain, the better a device like this will work. Either way, it may never work for a blind person unless you can somehow figure out how to transmit

    • Because it's not eye tracking. The user sees the letters flash, in sequence, and when the correct one is seen to flash the user changes his/her thoughts in a way detectable by the EEG. The system then inputs that letter. The eyes don't actually have to move (though it can be hard to see a letter since the usable area of the vision range is quite small).
      • While technically you are correct: if I have a device that remotely heats my dog until it changes the channel for me via a sophisticated cabinet of lights and pull levers, I have a whole lot of overkill to achieve the same functionality as an infrared remote control, except it doesn't use the infrared LED or sensors built into standard remotes and TVs.

        Instead, this device figures out when what you're focusing on (looking at, in 99/100 cases) flashes, without interfacing with or reading the eye. If the object of
    • I also wonder why the user is presented with a full alphabet. I would have thought that some form of predictive input, such as T9, would be much faster - it seems pretty much perfect in this scenario, and could easily leverage existing T9 software. I'm going to go ahead and assume I'm missing something. Can someone who is more informed correct me here?
  • by Anonymous Coward on Tuesday April 21, 2009 @03:41PM (#27665651)
    d a m n  t h i s  t h i n g  i s  s l o w
    • by D Ninja ( 825055 )

      Actually, I was sort of wondering about the quote from the summary:

      Wilson compares the learning curve to texting, calling it 'kind of a slow process at first.' But even practice doesn't bring it quite up to texting speed: 'I've seen people do up to eight characters per minute,' says Wilson.

      I don't know if Wilson has seen how fast people really get with texting, but it's fast. This would have to get a lot faster than 8 characters per minute to even be close to texting.

      • by DavidTC ( 10147 )

        Hell, I text that fast, and I suck at it.

        Hell, I actually type, with no predictive stuff, that fast, on a cell phone numeric pad.

  • by Cyberwasteland ( 1467347 ) on Tuesday April 21, 2009 @03:50PM (#27665773) Homepage
    As this technology gets better, isn't there going to be a big chance for really bad Freudian slips? XD
  • by Ungrounded Lightning ( 62228 ) on Tuesday April 21, 2009 @03:52PM (#27665813) Journal

    Instead of flickering one row or column at a time, flicker ALL the letters simultaneously in different patterns. The brainwave trace should follow the one you're watching and the wait for it to be identified and confirmed will be much shorter.
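    A rough sketch of the "flicker everything at once" suggestion above (this resembles code-modulated VEP spellers and is not how the article's system works): assign every letter its own pseudo-random on/off pattern, then pick the letter whose pattern correlates best with the measured response.

```python
import random

# Sketch of "flash every letter at once with its own pattern": each letter
# gets a pseudo-random on/off sequence, and the letter whose sequence
# correlates best with the measured (already filtered) EEG trace wins.

random.seed(0)
LETTERS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
CODES = {c: [random.choice((0, 1)) for _ in range(64)] for c in LETTERS}

def correlate(a, b):
    return sum(x * y for x, y in zip(a, b))

def classify(trace):
    """trace: 64 response samples, one per flash frame."""
    return max(LETTERS, key=lambda c: correlate(CODES[c], trace))

# Simulate a user attending to 'Q': the response roughly follows Q's code.
trace = [bit + random.gauss(0, 0.3) for bit in CODES["Q"]]
print(classify(trace))  # 'Q', with high probability
```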

    = = = =

    How is this better than eye tracking?

    • How is this better than eye tracking?

      What if you're blind? What if you can't move your eyes?

      That's how it's better than eye tracking.

      • How is this better than eye tracking?

        What if you're blind?

        The new technique works by recognizing, from brainwaves, when a letter on a screen is blinked. I doubt that will work for the blind.

        What if you can't move your eyes?

        It's not clear to me whether the EEG device is recognizing the brain-signal changes caused by the letter being concentrated on blinking, or by the letter being looked at blinking. If the former, it may work for someone whose eyes are paralyzed. If the latter, it certainly won't. (I suspect

    • by julesh ( 229690 )

      How is this better than eye tracking?

      I imagine it's substantially cheaper. You can get home EEG devices for about $100 US. The tech in this is probably not much harder.

      • by Amorya ( 741253 )
        Eye tracking's pretty cheap these days. We let undergraduates play with it in my department. All you need is an infrared light source (LEDs), an infrared camera, and some clever software.
    • Re: (Score:1, Informative)

      by Anonymous Coward

      What you're suggesting involves much more sophisticated signal-processing methods based on narrow-band spectral detection. This is just a P300 speller hooked up to Twitter. What they're picking out using EEG is a broad event-related potential known as the P300, which is detectable by averaging traces together. You can find more about how it works here: http://www.gtec.at/products/g.BCIsys/P300_Speller.htm

      And honestly, it's not that much better than an eye tracker. It just uses fancier technology.
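      For the curious, a toy illustration of why "averaging traces together" works (simulated data, invented amplitudes): the event-related bump is small compared with background EEG, but it is time-locked to the flash, so the average over many post-flash epochs keeps the bump and cancels the noise.

```python
import random

# Toy P300 illustration: averaging time-locked epochs suppresses background
# noise (which averages toward zero) while preserving the stimulus-locked bump.

def simulated_epoch(target, n_samples=100, p300_at=30):
    """One post-flash epoch; target flashes carry a small bump part-way in."""
    epoch = [random.gauss(0, 1.0) for _ in range(n_samples)]
    if target:
        for i in range(p300_at, p300_at + 10):
            epoch[i] += 0.5  # small P300-like deflection, invented amplitude
    return epoch

def average(epochs):
    n = len(epochs)
    return [sum(samples) / n for samples in zip(*epochs)]

target_avg = average([simulated_epoch(True) for _ in range(200)])
other_avg = average([simulated_epoch(False) for _ in range(200)])
print(max(target_avg), max(other_avg))  # the target average shows the bump
```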

  • Optimization (Score:5, Insightful)

    by Rival ( 14861 ) on Tuesday April 21, 2009 @03:54PM (#27665849) Homepage Journal
    FTA:

    "The interface consists, essentially, of a keyboard displayed on a computer screen. "The way this works is that all the letters come up, and each one of them flashes individually," says Williams. "And what your brain does is, if you're looking at the 'R' on the screen and all the other letters are flashing, nothing happens. But when the 'R' flashes, your brain says, 'Hey, wait a minute. Something's different about what I was just paying attention to.' And you see a momentary change in brain activity."

    Their "cognitive click from flash recognition" interface sounds an awful lot like the retrace timing system used for the http://en.wikipedia.org/wiki/NES_Zapper [wikipedia.org].

    I'm curious what kind of language optimization has been added, if any. Do they use predictive text of some sort?

    Also, it seems a waste to limit the input to a display of a static keyboard (other than ease of use for people who know where to look for certain letters). Why not have a dynamic interface, something along the lines of http://en.wikipedia.org/wiki/Dasher/ [wikipedia.org]?
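    As a sketch of the predictive-input idea (hypothetical; nothing in the article says the UW system does this), candidate next letters could be ranked by how often they follow the characters typed so far, so the likeliest letters could be flashed or highlighted first.

```python
from collections import Counter

# Hypothetical predictive-input helper: rank the next character by how often
# it follows the last few typed characters in a (tiny, made-up) corpus, so a
# speller could flash or highlight the likeliest letters first.

CORPUS = "using eeg to send tweet using the brain to send a message"

def next_letter_ranking(typed, corpus=CORPUS, context=3):
    """Return next-character candidates ranked by frequency after typed's tail."""
    tail = typed[-context:]
    counts = Counter(
        corpus[i + len(tail)]
        for i in range(len(corpus) - len(tail))
        if corpus[i:i + len(tail)] == tail
    )
    return counts.most_common()

print(next_letter_ranking("twe"))  # [('e', 1)] -> 'e' is the likeliest next letter
```

    A real system would use word-level prediction over a much larger corpus (or Dasher's continuous zooming interface), but the ranking idea is the same.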

  • by jeffmeden ( 135043 ) on Tuesday April 21, 2009 @04:04PM (#27666001) Homepage Journal

    Kif Kroker: One beep for yes, two beeps for no.
    [Fry beeps once] ... [Fry beeps twice]
    Captain Zapp Brannigan: Double yes. Guilty.

  • All he would need to pick up with the EEG are "up" or "down" signals, and it could be used to type very quickly with Dasher.

    http://blog.makezine.com/archive/2009/04/single-finger_text_input_1.html [makezine.com]

    http://www.inference.phy.cam.ac.uk/dasher/Demonstrations.html [cam.ac.uk]

  • P R O N

    in 30 seconds, without tying up my hands..

  • When will Hawking get one?

    Speaking of Hawking, they should change this so that it uses full words. It is probably easier to get the computer to recognize the difference between left and right than between A, B, C, D, .... Use the interface that Hawking has on his computer, where it just narrows down the word groups.

  • This system has been around for a while; I've seen it demonstrated live twice, and it didn't work at all either time. In my opinion, even in the best conditions (bald patient, shit-tons of electrodes, professional setup, well-trained subject) it doesn't work well enough to fuel science-fiction fantasies, and it probably never will. For locked-in patients, who can do nothing but move their eyes, though, it's an awesome technology. They made a movie recently about such a patient who spent years using it to write

    • Whoops, looks like I misremembered - the patient in that movie wrote by blinking at a grid, not via P300. My bad.

  • I so want to do this! (Thinking really hard now...) Is it working?
  • I typed this message with brain waves. I did this by making waves that actuated the input mechanism. The input mechanism is a complex chemical-based detector which translates the waves into physical movements, which it then translates into electrical signals using crude switches. The article describes a device whose mechanism differs in its particulars but gives the exact same result (though slower).
  • by Anonymous Coward

    Perfect for saying "KILL ME " over and over again.

  • A Beowulf cluster of those. I wonder if the on-screen keyboard they are using is Dvorak. The Dvorak keyboard layout is far more efficient. Qwerty was deliberately designed to slow people down. I personally switched to a Dvorak keyboard on my EEG device, and I went from 8 to 12 letters per minute and experience way less eyestrain now.
  • Think about growing a new arm...or a new anything. It might be an appendage you've never imagined before. "Thought" is not the same as motor control. I don't think my fingers into typing this post at 60 wpm-ish. If I had to think about it, it would take forever to type.

    Now think about if you had a third arm growing out of your chest. How would you control it? Without the motor control that has been learned over several years of childhood and adolescence, what good will it do you? A good quest

  • I think I can, I think I cam!
  • He already knows what the keyboard looks like. He should have made it so that he thinks of the character and it appears, rather than focusing the eyes on a keyboard. The problem with the keyboard approach is that when you think of a key on the keyboard, in a sense you think of an image of the area around that key. Say the H key: in reality you visualize the keys around it as well.

    Now you know the shapes of letters. Think of an L and that's about it.

    It would be cool if you could think entire words as well, but it

  • I'd have to say that this is kind of the wrong way to go about expressing yourself directly from your brain. The brain doesn't naturally think in language. Language is man-made and is a "higher-level protocol", if you will, so something that directly accesses brainwaves (via EEG) and tries to output English or another language is kind of dumb IMHO.

    Why not make something that expresses emotion (happy/sad/mad/regretful/passionate) first? It'd probably be way easier to get directly from the ole noggin.

  • TWATTER, Arsebook, Tuesday — A direct neural interface to post on Twitter [today.com] has been created by Adam Wilson of the University of Wisconsin-Madison.

    "We originally hooked it to the brain," said Wilson, "but only a very limited selection of messages came out, that appeared to be coming from somewhere else. So we've just gone directly to the penis without the middleman."

    Male humans suffer from having functional bodies trapped with almost completely paralysed minds. The penis is an organ used by male hu

  • Fortunately someone posted the video on YouTube, so one doesn't have to be able to play .mov videos.

    No sound, but maybe the original .mov doesn't have sound?

    Link [youtube.com]
  • Shouldn't this be under the Borg icon instead of the old phone? After all, this is the very basics of Borg communication... Ahh, but we used that icon for Bill, right?
