Stanford University's EyePoint: Web Surfing With Eye Gaze

August 20, 2007 (Computerworld) -- Increased computing power doesn't just make for better graphics; it opens up new ways to interact with computers. Take, for example, the EyePoint system developed by Stanford University doctoral researcher Manu Kumar.

EyePoint uses a four-step process that incorporates a user's hands and eyes to increase accuracy and eliminate the false positives that come from using eye movements alone. The technique also brings a more natural mode of interaction to a broader range of users.

"Using gaze-based interaction techniques makes the system appear to be more intelligent and intuitive to use," says Kumar. "Several users have reported that it often felt like the system was reading their mind."

Here's how it works: While looking at a screen, the user presses a hot key on the keyboard, magnifying the area being viewed. The user then looks at the link within the enlarged area and releases the hot key, thereby activating the link.
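The press-magnify-look-release flow above can be sketched as a pair of event handlers. This is a minimal illustration of the idea, not Stanford's actual code; the magnification factor, region size, and function names are all hypothetical.

```python
MAGNIFY_FACTOR = 4   # hypothetical enlargement applied to the gazed-at region
REGION = 120         # hypothetical half-width, in pixels, of that region

def on_hotkey_press(gaze_xy):
    """Steps 1-2: the user looks at the screen and presses the hot key;
    the area around the estimated gaze point is selected for magnification."""
    x, y = gaze_xy
    # Bounding box (left, top, right, bottom) the UI would render enlarged.
    return (x - REGION, y - REGION, x + REGION, y + REGION)

def on_hotkey_release(gaze_xy, region):
    """Steps 3-4: the user looks at the link inside the enlarged view and
    releases the hot key; the gaze point in the magnified view is mapped
    back to original screen coordinates, where the click is delivered."""
    cx = region[0] + REGION   # center of the magnified region
    cy = region[1] + REGION
    gx, gy = gaze_xy
    # Undo the magnification to recover the true target position.
    return (cx + (gx - cx) / MAGNIFY_FACTOR,
            cy + (gy - cy) / MAGNIFY_FACTOR)
```

Because the second fixation happens on an enlarged view, a given tracker error corresponds to a proportionally smaller error in original screen coordinates, which is what makes the link selectable.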

Eye tracking, which has been around for decades, typically uses infrared devices embedded into a headset or a monitor frame. The devices track the centers of the user's pupils and then calculate which part of the screen the user is viewing.
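Mapping pupil centers to screen positions typically involves a calibration step in which the user fixates known on-screen points. A common simplification is an independent linear fit per axis; the sketch below uses that approach with ordinary least squares, and all names are illustrative rather than any particular tracker's API.

```python
def fit_axis(pupil_vals, screen_vals):
    """Least-squares fit of screen = a * pupil + b along one axis."""
    n = len(pupil_vals)
    mean_p = sum(pupil_vals) / n
    mean_s = sum(screen_vals) / n
    cov = sum((p - mean_p) * (s - mean_s)
              for p, s in zip(pupil_vals, screen_vals))
    var = sum((p - mean_p) ** 2 for p in pupil_vals)
    a = cov / var
    return a, mean_s - a * mean_p

def calibrate(samples):
    """samples: list of ((pupil_x, pupil_y), (screen_x, screen_y)) pairs
    collected while the user fixates known calibration targets."""
    px = [p[0] for p, _ in samples]
    py = [p[1] for p, _ in samples]
    sx = [s[0] for _, s in samples]
    sy = [s[1] for _, s in samples]
    return fit_axis(px, sx), fit_axis(py, sy)

def gaze_to_screen(pupil_xy, model):
    """Apply the calibrated per-axis mapping to a new pupil position."""
    (ax, bx), (ay, by) = model
    return (ax * pupil_xy[0] + bx, ay * pupil_xy[1] + by)
```

Real trackers use richer models (polynomial or 3-D geometric), but the principle is the same: calibration learns a mapping, and every subsequent pupil measurement is pushed through it.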

This method, however, has been plagued by errors, limiting its use primarily to people whose disabilities prevent them from using a keyboard and mouse.

Eye trackers are accurate to about 1 degree of visual angle. When looking at a 1,280-by-1,024-pixel, 96-dpi screen from a distance of 20 inches, that equates to a spread of about 33 pixels in any direction from where the user is looking. That's not accurate enough to pinpoint a link.
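The 33-pixel figure follows directly from trigonometry: a 1-degree error cone at 20 inches subtends tan(1°) × 20 inches on the screen, which the 96-dpi resolution converts to pixels. A quick check:

```python
import math

DISTANCE_IN = 20   # viewing distance in inches, from the article
DPI = 96           # screen resolution, from the article
ANGLE_DEG = 1      # typical eye-tracker accuracy in degrees

# On-screen size of a 1-degree error at the given distance.
spread_inches = DISTANCE_IN * math.tan(math.radians(ANGLE_DEG))
spread_pixels = spread_inches * DPI   # ~33.5 px, matching the ~33-pixel figure
print(round(spread_pixels, 1))
```

Since typical link text is rendered at roughly 12 to 16 pixels tall, an error of this size can span two or three lines of links, which is why raw gaze alone cannot reliably select one.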

"What is really exciting is that the processing power of today's computers is completely changing the kinds of things we can use for computer interfaces," says Ted Selker, associate professor at the MIT Media Lab and director of its Context-Aware Computing group. "Things like eye tracking are using channels of communication that literally were unavailable to interface designers even five years ago."

"[Kumar's] approach -- using eye movement in a subtle, lightweight way, rather than as a direct mouse substitute -- is exactly the right way to go," says Robert Jacob, a professor of computer science at Tufts University in Medford, Mass.

Selker says eye tracking might become a standard computer interface within the next five years. For now, the primary obstacle is the high cost of eye-tracking hardware, although mass adoption of the technology would drive those costs down.

9.13.09
Eric Layton
