
International Journal of Bioelectromagnetism www.ijbem.org

Vol. 13, No. 1, pp. 44-45, 2011

    Towards a Brain-Activated and Eye-Controlled Wheelchair 

Massimo Gneo, Giacomo Severini, Silvia Conforto, Maurizio Schmid, Tommaso D'Alessio

Applied Electronics Department, Roma Tre University, Rome, Italy

    Correspondence: M Gneo, Applied Electronics Department, Roma Tre University, Via Vasca Navale 84, Rome, Italy.

    E-mail: [email protected], phone +39 6 5733 7057, fax +39 6 5733 7026.

Abstract. All existing methods for controlling an electric-powered wheelchair (EPW) with the user's gaze require a graphical user interface (GUI) to select and confirm commands. Such a GUI can make guidance unnatural and partially obstruct the user's sight. Since additional gaze-independent inputs are needed for safety, we propose a high-level scheme for a system that integrates an eye-gaze tracking system (EGTS), used to select the desired motion command, with a brain-computer interface (BCI) that uses the user's electroencephalogram (EEG) as a motion activation command, yielding a safer, obstruction-free eye- and brain-guided EPW.

Keywords: Brain-Computer Interface (BCI); Hybrid BCI; Human Computer Interface (HCI); Electroencephalogram (EEG); Eye-Gaze Tracking System (EGTS); Neural Networks; Electric-Powered Wheelchair (EPW).

    1. Introduction

A standard electric-powered wheelchair (EPW) is a wheelchair driven by an electric motor, with a hand-operated joystick providing navigational control. Although paralysed users who cannot operate the joystick have other special devices available (touchpad, head/chin/speech control, sip-and-puff), some locked-in patients retain only very limited residual motor abilities, among which oculomotor control is preserved for long periods (e.g. in amyotrophic lateral sclerosis). Two possible approaches that allow those patients to guide EPWs, eye-gaze tracking systems (EGTSs) [Tuisku et al., 2008] and brain-computer interfaces (BCIs) [Millán et al., 2009], have mainly been analysed separately. In [Zander et al., 2010], eye movements select objects and a BCI provides the mouse click on a human-computer interface (HCI). Following a similar philosophy, we propose to integrate an EGTS with an EEG-based BCI to control EPWs.

EGTSs estimate the user's point of gaze (POG), either to provide information on the oculomotor system (e.g. in ophthalmology and neurology) or to drive input devices for HCI. While intrusive EGTSs require physical contact (e.g. contact lenses, electrodes fixed around the eye), video-based EGTSs use eye images captured by cameras [Duchowski, 2002]. No currently available system allows the user to look directly at where he/she wishes to go (an eyes-up interface), since existing systems require the user to continuously look at a graphical user interface (GUI) to select and validate the EPW command during motion (eyes-down interfaces) [Tuisku et al., 2008]. Eye-controlled EPWs thus exhibit two main problems: first, since the user is always gazing somewhere, undesired commands may be activated (the so-called Midas touch); second, the GUI hardware may obstruct visibility, and the driver must constantly keep his/her attention focused on the desired direction.
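As background, a common way for a video-based EGTS to estimate the POG (one of several known techniques, and not necessarily the one adopted here) is to map the vector between the pupil centre and a corneal reflection to screen or scene coordinates through a polynomial fitted during a short calibration. The sketch below illustrates this idea; the second-order feature set and the least-squares fit are illustrative assumptions.

```python
import numpy as np

def design_matrix(v):
    """Second-order polynomial features of pupil-glint vectors, shape (N, 2)."""
    vx, vy = v[:, 0], v[:, 1]
    return np.column_stack([np.ones_like(vx), vx, vy, vx * vy, vx**2, vy**2])

def calibrate(pupil_glint, screen_xy):
    """Fit mapping coefficients from calibration samples.

    pupil_glint: (N, 2) pupil-minus-glint vectors from the eye camera.
    screen_xy:   (N, 2) known target coordinates fixated during calibration.
    """
    A = design_matrix(pupil_glint)
    coeffs, *_ = np.linalg.lstsq(A, screen_xy, rcond=None)
    return coeffs  # shape (6, 2): one column per screen coordinate

def estimate_pog(pupil_glint, coeffs):
    """Map new pupil-glint vectors to estimated points of gaze."""
    return design_matrix(np.atleast_2d(pupil_glint)) @ coeffs
```

In a typical session the user fixates a small grid of known targets (e.g. nine points) to fit the coefficients, after which `estimate_pog` runs once per video frame.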

An EEG-based BCI uses the electric signal measured on the scalp to classify cortical activity and translate it into commands for a given device. Owing to noise and the reduced spatial resolution of scalp recordings, EEG-actuated devices are limited by a low information transfer rate and are generally considered too slow for controlling rapid and complex robot movements. An EEG-based BCI can, however, be used as an effective binary switch for movement activation: for instance, event-related de/synchronization [Pfurtscheller et al., 1998] can be exploited to drive such a switch from simple motor imagery tasks.
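To make the binary-switch idea concrete, the sketch below shows one plausible event-related desynchronization (ERD) detector: band power in the mu band over a sensorimotor channel drops during motor imagery, and a relative drop with respect to a rest baseline toggles the switch. The sampling rate, band edges, filter order, and threshold are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256           # sampling rate in Hz (assumed)
MU_BAND = (8, 13)  # mu rhythm band, a typical ERD range

def band_power(eeg, fs=FS, band=MU_BAND):
    """Mean power of a single-channel EEG epoch after mu-band filtering."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg)
    return np.mean(filtered ** 2)

def erd_switch(epoch, baseline_power, threshold=0.3):
    """Binary activation: ERD measured as relative power drop vs. rest.

    Returns True (activate motion) when the drop exceeds the threshold.
    """
    erd = (baseline_power - band_power(epoch)) / baseline_power
    return erd > threshold
```

In practice the baseline power would be estimated over a rest period, and the decision smoothed over consecutive epochs to avoid spurious activations.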

Since some authors still consider eye control immature [Tuisku et al., 2008] and unsafe for controlling EPWs (independent inputs should be considered), and since the BCI activation command has been shown to be more reliable (though slower) than eye dwell time [Zander et al., 2010], we propose to use an EGTS to select the desired direction and EEG signals to activate the motion along that direction, avoiding both the Midas touch and the need to stare at a GUI. The user can therefore control the EPW naturally, looking at the place to be reached and activating the BCI when motion is desired, leaving his/her sight free.
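The proposed integration can be summarised as a simple control loop in which the EGTS continuously supplies a candidate heading derived from the POG, while the BCI switch gates the actual motion. The sketch below is a minimal rendering of this high-level scheme; the `egts`, `bci`, and `wheelchair` interfaces are hypothetical placeholders, not components of the actual system.

```python
import math

def pog_to_heading(pog_xy, chair_xy):
    """Heading angle (radians) from the wheelchair towards the point of gaze,
    assuming both points are expressed in the same ground-plane coordinates."""
    return math.atan2(pog_xy[1] - chair_xy[1], pog_xy[0] - chair_xy[0])

def control_step(egts, bci, wheelchair):
    """One iteration of the hybrid loop: gaze selects the direction,
    the EEG binary switch decides whether the EPW moves at all."""
    heading = pog_to_heading(egts.read_pog(), wheelchair.position())
    if bci.switch_active():          # ERD detected: the user confirms motion
        wheelchair.move_towards(heading)
    else:                            # switch off: EPW halts, gaze stays free
        wheelchair.stop()
```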
