Lifelong Health and Wellbeing

RApid GAze-Based Interaction Techniques (RAGABITS) for online games and virtual communities intended for users with severe motor impairments

Eye gaze

Online virtual worlds and games (MMORPGs) have much to offer users with severe motor disabilities. They give this user group the opportunity to appear as entirely able-bodied to others in the virtual world, if they so wish. The extent to which a user has to reveal their disability becomes a privacy issue. Many of the avatars in Second Life appear as stylized versions of the users that control them, and that stylization is the choice of the user. This choice is equally appropriate for disabled users. While the appearance of a user's avatar may not reveal the disability of the person who controls it, their behavior and speed of interaction in the world may do so.

Many users with severe motor impairments may be unable to operate a keyboard or hand mouse, and may also struggle with speech and head movement. Eye gaze is one method of interaction that has been used successfully to enable access to desktop environments. However, simply emulating a mouse using eye gaze is not sufficient for interaction in online virtual worlds, and the user's privacy can be exposed unless efficient gaze-based interaction techniques, appropriate to activities in online worlds and games, can be provided.
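
As a concrete illustration, the sketch below shows dwell-based selection, the standard way mouse emulation is adapted for gaze: a "click" fires only once the gaze has rested within a small radius for a set time. The dwell time and fixation radius are assumed values for illustration, not parameters from this project.

```python
import time

DWELL_TIME = 0.8      # seconds the gaze must rest before a "click" fires (assumed value)
FIXATION_RADIUS = 40  # pixels of jitter tolerated within one fixation (assumed value)

class DwellClicker:
    """Turns a raw gaze-sample stream into click events via dwell selection."""

    def __init__(self):
        self.anchor = None   # (x, y) where the current fixation began
        self.start = 0.0     # timestamp of the fixation's first sample
        self.fired = False   # suppresses repeat clicks within one fixation

    def update(self, x, y):
        """Feed one gaze sample; returns (x, y) when a dwell click fires, else None."""
        now = time.monotonic()
        moved = (self.anchor is None or
                 (x - self.anchor[0]) ** 2 + (y - self.anchor[1]) ** 2
                 > FIXATION_RADIUS ** 2)
        if moved:
            # Gaze left the fixation zone: start a new dwell window here.
            self.anchor, self.start, self.fired = (x, y), now, False
            return None
        if not self.fired and now - self.start >= DWELL_TIME:
            self.fired = True
            return self.anchor  # dwell complete: emit a click at the fixation point
        return None
```

Even this small loop exposes the trade-off at stake: a short dwell time speeds up interaction, while a long one guards against the "Midas touch" of unintended clicks.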

This genre of gaming (MMORPGs) is constantly evolving, but regardless of the aim of the game, these worlds all involve common tasks such as avatar creation, social interaction (chatting, IM), interaction with in-world objects (pick up, open, shoot, etc.) and navigating and walking around the environment. Our research involves analyzing these common tasks so that suitable gaze-based interaction techniques can be used to support them in place of a mouse and keyboard. These techniques will have different performance/effort trade-offs, and will include extended mouse/joystick emulation, gaze gestures, toolglasses and gaze-aware in-world objects; one of them is sketched below. They need to be integrated into a coherent and efficient user interface suited to the needs of an individual user with a particular disability.
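
As one sketch of the techniques named above, a gaze gesture recognizer can quantize gaze travel into coarse directional strokes and match the stroke sequence against a command table. Everything here, from the gesture-to-command mapping to the stroke threshold, is a hypothetical illustration rather than this project's design:

```python
# Hypothetical gesture alphabet: a command is a sequence of coarse gaze strokes.
GESTURES = {
    ("E", "W"): "open_inventory",  # glance right, then left
    ("N", "S", "N"): "jump",       # up, down, up
}

STROKE_MIN_PX = 120  # gaze travel needed to count as a deliberate stroke (assumed)

def stroke_direction(dx, dy):
    """Quantize a gaze displacement into one of four compass strokes."""
    if abs(dx) >= abs(dy):
        return "E" if dx > 0 else "W"
    return "S" if dy > 0 else "N"  # screen y grows downward

def recognize(samples):
    """samples: list of (x, y) gaze points; returns a matched command or None."""
    if not samples:
        return None
    strokes, (sx, sy) = [], samples[0]
    for x, y in samples[1:]:
        dx, dy = x - sx, y - sy
        if dx * dx + dy * dy >= STROKE_MIN_PX ** 2:
            d = stroke_direction(dx, dy)
            if not strokes or strokes[-1] != d:  # merge same-direction strokes
                strokes.append(d)
            sx, sy = x, y  # re-anchor at the end of the detected stroke
    return GESTURES.get(tuple(strokes))
```

Gestures of this kind trade precision for speed: they need no accurate pointing at small targets, which suits noisy trackers, but each command must first be learned by the user.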

The research aims to model the tasks inherent in using these worlds so that predictions can be made about the most appropriate gaze-based interaction techniques to use. Once these have been identified, they can be assembled into a front end, or user interface. One possible outcome could be a software device that automatically configures a gaze-control interface for new games, drawing on knowledge of a specific user's disability and the eye-tracking equipment they have.
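
In its simplest form, such a configurator might be a rule table that maps a user's abilities and equipment to the candidate techniques listed earlier. The profile fields, thresholds and rules below are illustrative assumptions only, not results from this research:

```python
from dataclasses import dataclass

@dataclass
class Profile:
    """Hypothetical user/equipment profile the configurator would consult."""
    tracker_accuracy_deg: float  # typical gaze error of the user's eye tracker
    can_hold_fixation: bool      # whether sustained dwells are comfortable
    fatigue_sensitive: bool      # whether long sessions are a concern

def choose_technique(task, p):
    """Map one common in-world task to a technique from the candidates above.
    The rules and thresholds are illustrative placeholders, not study results."""
    if task == "navigation":
        return "joystick_emulation"  # continuous steering tolerates gaze noise
    if task == "object_interaction":
        if p.tracker_accuracy_deg <= 1.0 and p.can_hold_fixation:
            return "gaze_aware_objects"  # dwell directly on the object itself
        return "toolglass"  # larger, more forgiving targets
    if task == "chat":
        if p.fatigue_sensitive or not p.can_hold_fixation:
            return "gaze_gestures"  # short strokes are less tiring than dwelling
        return "dwell_keyboard"
    return "mouse_emulation"  # safe default for anything unmodelled

# Example: an accurate tracker, but a user who tires quickly.
p = Profile(tracker_accuracy_deg=0.5, can_hold_fixation=True, fatigue_sensitive=True)
print(choose_technique("object_interaction", p))  # -> gaze_aware_objects
print(choose_technique("chat", p))                # -> gaze_gestures
```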

For further information about this project, please contact:
Stephen Vickers, Computer Human Interaction Research Group, De Montfort University, The Gateway, Leicester, LE1 9BH.
Tel: (+44) 0116 2551 551 ext. 8226
Email: svickers [at] dmu.ac.uk
eyegaze.iblogger.org
