CES 2014: Danish Company Promises Low-Cost Eye Trackers For the Masses (Video)

Video no longer available.
Their website's "About" page says, under the headline, "Our Big Mission": "The Eye Tribe intends to become the leading provider of eye control technology for mass market consumer devices by licensing the technology to manufacturers." Their only product at the moment is a $99 development kit ($142.50 with shipping and VAT). Some people may want to say, "This is old news. Wasn't there an open source project called Gaze Tracker that was originally developed to help handicapped people interact with the world?" Yes, there was. The Eye Tribe is an outgrowth of the Gaze Tracker research group, which is still going strong and still offers its software for free download (from SourceForge) under an open source license. The company's funding comes in large part from a government grant. In the interview (below), The Eye Tribe CEO Sune Johansen notes that they have just started shipping their development kit, and that they hope to start selling an eye control kit for tablet computers to the general public before long, but he doesn't want to commit to a specific shipping date because they don't want to sell to end users until "...we have enough applications out there so that it makes sense for the consumers to buy it directly."

Sune Johansen: So this is the world’s first truly affordable eye tracker. We are selling this now. We are shipping today to developers around the world. For $99 you can buy this: a development kit for eye control on laptops, tablets, and desktop computers. We are now shipping for Windows. We will be shipping very soon for Mac OS as well, and then the other major operating systems will be supported.

Timothy Lord: Now talk about the hardware. What exactly do you use to track?

Sune Johansen: So what do we use to actually determine where you look on the screen? We can determine, with accuracy comparable to the size of your fingertip, where you are looking on the screen. And we are doing that with an infrared camera and a set of infrared LEDs. It works by projecting infrared light toward your eye, which then reflects back to the camera. And then we have some advanced algorithms that can detect exactly where you are looking on the screen.
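The Eye Tribe's actual algorithms are proprietary, but a common approach in the eye-tracking literature is to map the pupil-to-glint vector seen by the infrared camera onto screen coordinates using a polynomial fitted during a calibration phase. A minimal sketch of that idea, using entirely synthetic data and invented function names:

```python
import numpy as np

def features(v):
    # Second-order polynomial features of the pupil-to-glint vector (vx, vy).
    vx, vy = v
    return np.array([1.0, vx, vy, vx * vy, vx * vx, vy * vy])

def calibrate(vectors, screen_points):
    # Least-squares fit of one polynomial per screen axis, from samples
    # gathered while the user looks at known calibration targets.
    A = np.array([features(v) for v in vectors])
    coeffs, *_ = np.linalg.lstsq(A, np.array(screen_points), rcond=None)
    return coeffs                      # shape (6, 2): x and y coefficients

def gaze_point(coeffs, v):
    # Map a new pupil-to-glint vector to an estimated on-screen (x, y).
    return features(v) @ coeffs

# Synthetic calibration: nine targets generated from a known mapping.
rng = np.random.default_rng(0)
true = np.array([[400, 300], [900, 0], [0, 700],
                 [0, 0], [0, 0], [0, 0]], dtype=float)
vecs = rng.uniform(-0.5, 0.5, size=(9, 2))
pts = np.array([features(v) for v in vecs]) @ true
coeffs = calibrate(vecs, pts)
print(gaze_point(coeffs, (0.1, -0.2)))  # close to (490, 160), the true value
```

Real systems add head-pose compensation and per-eye models on top of this, which is part of what made earlier trackers expensive.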

Timothy Lord: How can you do that with only one camera as opposed to a stereo set of cameras?

Sune Johansen: The founders of this company have been doing PhD research in this particular area for many years. We actually managed to develop software that can use standard low-cost consumer electronics components. So for the first time, we don’t rely on expensive custom hardware or machine vision cameras. We can do this with the kind of components that are already used in standard mobile devices like tablets and smartphones. This means that the cost has come down so low that it can potentially be integrated very easily into the next generation of devices.

Timothy Lord: Now you are showing us this attached to a tablet. Can you explain what the connection is? Is it USB?

Sune Johansen: So what we have is our development kit here. It is mounted on the tablet with a small mount, which fits most standard tablets. And then we have our USB connection here to the device. So you would be able to order our development kit for $99 and attach it to any computer with a USB port.

Timothy Lord: Now what distance does it need from the user for a desktop?

Sune Johansen: So you can see here that if I hold it like this, I can move down to 45 cm and it is still tracking me. You can see my eyes here; that is the green light. Then I can take it out to 75 cm. If I move beyond 75 cm, at some point it will stop tracking me. But that is the working range we set up. We optimized this particular device for that working range, because that is the typical working distance for a tablet or a desktop computer.

Timothy Lord: Now this sort of technology has been around in much more expensive forms for a while.

Sune Johansen: Sure. We didn’t invent eye tracking. It has been around for years but it has been super expensive. What we are doing is making it available for the consumer market for the first time. The first step in doing that is making it available for thousands of developers around the world so that they can start developing cool applications with eye control. Because that is going to be key—how to use eye control. How do we apply it in a daily setting? So we think that is the key. The next step is when we have a lot of cool applications out there already, then we are going to make it available for the consumers as well.

Timothy Lord: Now how are you encouraging the development of those applications? You have an SDK?

Sune Johansen: We have a full SDK. It means with just a few lines of code, you can actually modify existing apps or games, or you can just develop your own. So everyone can do that today. And it is only 99 bucks. And the price will only go down from here.
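Johansen doesn't show the SDK here, so as an illustration only, this is what a "few lines of code" integration might look like under an invented listener-style interface; the class and method names below are hypothetical, not The Eye Tribe SDK's actual API:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class GazeSample:
    x: float           # gaze x on screen, in pixels
    y: float           # gaze y on screen, in pixels
    fixated: bool      # smoothed "eye is resting here" flag

class GazeTracker:
    # Hypothetical SDK surface: apps register callbacks, and the driver
    # pushes one sample per camera frame.
    def __init__(self) -> None:
        self._listeners: List[Callable[[GazeSample], None]] = []

    def add_listener(self, fn: Callable[[GazeSample], None]) -> None:
        self._listeners.append(fn)

    def push(self, sample: GazeSample) -> None:
        for fn in self._listeners:
            fn(sample)

# The "few lines of code" an existing app would add:
tracker = GazeTracker()
hits = []
tracker.add_listener(lambda s: hits.append((s.x, s.y)) if s.fixated else None)
tracker.push(GazeSample(512.0, 300.0, fixated=True))
tracker.push(GazeSample(514.0, 303.0, fixated=False))
print(hits)  # only the fixated sample was recorded
```

The point of a listener model like this is that a game or browser can react to gaze without owning the camera pipeline, which is what makes retrofitting existing apps cheap.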

Timothy Lord: What sort of applications are you already seeing this being used for?

Sune Johansen: I can show you a couple of examples here; we have a couple of very simple demo applications. I am just going to start this so you can see an example. So this is a browser. It is not a normal browser, because I am controlling it with my eyes: when I look down it scrolls, and when I look back at the text it stops scrolling—completely automated. It is an effortless experience. I don’t have to do anything. I don’t have to move my hands off the device.

If I look up, it scrolls up. If I look down, it scrolls down. And then I can just touch my finger here, and then as I look up the icons on the top here, they are highlighted, and by releasing my finger I select that new page. So this is the absolute quickest way of doing selection on a tablet. I don’t have to take my hands off the device, and do like this, like you usually would. I just have my hands where they should be, and then I just slide my finger to actually do a selection of a different page.
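The gaze-driven scrolling described in the demo can be approximated with a simple margin rule; this is a guess at the logic from the description above, not The Eye Tribe's implementation, and the parameter values are invented:

```python
def scroll_step(gaze_y, viewport_h, scroll_pos, max_scroll,
                margin=0.2, speed=40):
    # Passive scrolling: when the gaze enters the top or bottom margin of
    # the viewport, nudge the scroll position in that direction; gaze in
    # the middle (reading) leaves the page still.
    if gaze_y > viewport_h * (1 - margin):
        scroll_pos = min(scroll_pos + speed, max_scroll)   # looking down
    elif gaze_y < viewport_h * margin:
        scroll_pos = max(scroll_pos - speed, 0)            # looking up
    return scroll_pos

pos = 0
for y in [950, 950, 500, 30]:   # gaze near bottom, bottom, middle, top
    pos = scroll_step(y, 1000, pos, 2000)
print(pos)
```

A production version would smooth the gaze signal and ramp the scroll speed, since raw eye positions jitter far more than a finger on a touchscreen.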

So this is brilliant. One of our developers put his sheet music in here; here he is playing the guitar. So now he can sit in front of his computer and play the guitar, and he doesn’t have to turn the pages with his hands. He can just play hands-free. So this is just one example. And then I can show you a couple of other examples. We have here, this is a game. Some of you might know this game, because normally you would play it by slicing with your fingers on the screen.

What I am doing now is not using my hands; I am using the power of the eyes: just looking at them, and the fruits go boom, just by the look of an eye. These are, again, just examples, very simple applications. This is a hack, an example of how we inject eye movements into the game; it is something every developer can do very easily.

Timothy Lord: Now you have shown it attached to a full-sized tablet here. I saw yesterday you had a demo where you can also run similar technology on a smartphone. Talk about that.

Sune Johansen: Actually we have our colleague [Anas] here holding the device. What you would be able to see, if you can get a picture of the screen, is basically the same technology: we have an infrared mirror here that we plugged in (we call that the [Dice]), and then we are using the integrated camera. So this is a showcase of using a standard mobile camera, the kind already in use today, and just adding infrared light. [Anas] is now looking at the icons to highlight them, and then he can press anywhere on the screen to select that particular item. So you noticed he just pressed; he doesn’t have to move his hand to the particular icon. He finds it with his eyes and then makes the selection by pressing anywhere on the touchscreen.

Timothy Lord: Now for practical purposes, when will this be other than a development kit? When will this be available for ordinary consumers to buy and attach to tablets?

Sune Johansen: So right now, I can’t give you the date obviously because it will be announced as soon as we believe that we have enough applications out there so that it makes sense for the consumers to buy it directly. But we are seeing a lot of traction right now. We just started shipping. And thousands of developers around the world are doing very nice applications—it is very exciting to see. There are a lot of different applications.


Comments Filter:
  • GPLv3 (Score:4, Informative)

    by game kid ( 805301 ) on Wednesday January 15, 2014 @02:57PM (#45968749) Homepage
  • by Anonymous Coward

    Heretoforward I propose that "Danish" shall refer to the tasty filled pastry or things thereto related. Whatever is left that is identified with Denmark but unrelated to the pastry shall be named "Denmarkish".

  • Do my cheap sunglasses work against yet another monitoring tool, or should I wait for the new "Privacy by RayBan" line?

  • Due to the magic of ad targeting (might be through Google; I have no knowledge of Slashdot's ad deals) I see that the ads on this story are for eye trackers and eye tracker developer kits. Amusing.

  • I'm working on a project that involves eye tracking while a person's eyes are closed, or at least mostly closed. Anyone know if any current eye trackers are capable of this?

  • by pcor ( 8413 ) on Wednesday January 15, 2014 @03:38PM (#45969127)

    There is research work from the 1980s and '90s on the use of eye tracking for UI. Eyes tend to naturally scan around a scene; when users become conscious that their gaze will 'affect' control regions in the UI, they try to restrict this natural eye movement, and it yields an uncomfortable sensation known as the 'Midas Touch' problem. This is mentioned and cited in this more recent paper: Real-Time Eye Gaze Tracking for Gaming Design and Consumer Electronics Systems [nuigalway.ie]
    Sorry I'm too lazy to extract the original citation ... but it is worth tracking down and reading about how eye gaze has been well known as a UI technique going back to the 1980s!

    • Changing the dwell time could be a very common action for web browsing (eye tracking interfaces allow you to activate a graphical widget by dwelling on it, and you can set the time that you need to fixate on the target). For example, if you are on a website that you've never been to before, you might want to more carefully and slowly examine the hyperlinks, so you might choose to put a longer dwell time for activating links and other web elements. On another tab, you might be on one of your favorit

    • Active control versus passive control

      In a video of Eye Tribe's presentation at Techcrunch's Hardware Battlefield, they distinguish between active control, where you're using your eyes to manipulate interface elements, and passive control, where, for example, your gaze approaching the bottom of a webpage of text automatically scrolls it down.

      They emphasize passive control in the presentation, but I think that it's because here, you're using your eyes already. If you
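The dwell-time activation these comments describe can be sketched as a small state machine; this is an illustrative model of the technique, not any vendor's API, and the class and parameter names are invented:

```python
class DwellSelector:
    """Fires a target once gaze has rested on it for `dwell_s` seconds."""

    def __init__(self, dwell_s=0.8):
        self.dwell_s = dwell_s       # user-adjustable dwell threshold
        self._target = None          # widget the gaze currently rests on
        self._since = None           # timestamp when that dwell began

    def update(self, target, now):
        # `target` is whatever widget the gaze falls on right now (or None);
        # `now` is the sample timestamp in seconds.
        if target != self._target:   # gaze moved: restart the dwell clock
            self._target, self._since = target, now
            return None
        if target is not None and now - self._since >= self.dwell_s:
            self._since = now        # re-arm: fire once per completed dwell
            return target
        return None

sel = DwellSelector(dwell_s=0.5)
events = [sel.update(t, n) for t, n in
          [("link", 0.0), ("link", 0.3), ("link", 0.6), (None, 0.7)]]
print(events)  # the link fires only after 0.5 s of continuous dwell
```

Making `dwell_s` a per-context setting, longer on unfamiliar pages, shorter on familiar ones, is exactly the adjustment the comment above proposes.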

  • *Advantages:*

    *Comfort*
    I have not seen any examples of a developer doing serious programming on a touchscreen. I've seen programmers that operate in a three-monitor environment, and I don't think that repeatedly reaching their arms across to touch the screens would be comfortable over time.

    Gorilla arm syndrome: "failure to understand the ergonomics of vertically mounted touchscreens for prolonged use. By this proposition the human arm held in an unsupported horizontal position rapidly becomes fatigu

"I got everybody to pay up front...then I blew up their planet." "Now why didn't I think of that?" -- Post Bros. Comics

Working...