Using eye tracking to apply focus detection to an adaptive game controller

Blind for review

ABSTRACT 

In the gaming experience, players are used to intuitive interfaces that allow them to jump straight to the entertainment. Modern joysticks are physical hardware components with a fixed layout, serving as the main interface for a large variety of games with different control methods and needs, played by users who also have different ergonomic needs and preferences. This work proposes a new interface based on a touchscreen device. We present a gamepad concept capable of adapting itself dynamically to a user according to their touch and attention focus patterns, aiming to avoid errors and provide a more comfortable experience. We also present the results of our usability tests, with both objective and subjective evaluations, and a discussion of our findings.

CCS Concepts

• Human-centered computing → Human-computer interaction (HCI) → Interaction devices → Touch screens
• Human-centered computing → Ubiquitous and mobile computing → Ubiquitous and mobile devices → Mobile devices

Keywords

Adaptive interfaces; adaptive game control; game input; eye tracking; focus detection.

1. INTRODUCTION

Video games are one of the main entertainment fields nowadays. They are composed of many elements, such as gameplay, audio, graphics, and narrative. When these factors are well executed and combined, the game may produce engagement and immersion for the players. Nowadays, special attention is being given to game interaction, which is an important element in the implementation of the gameplay [7] [3] and rules [11]. Because of this, the controller and the control scheme must match user expectations or even exceed them [12]. To achieve and enhance game interaction, new kinds of inputs and devices have been proposed [ANONYMOUS].

These inputs must be intuitive and provide deep interaction, where sometimes the user can map actions in the real world to the game. A successful control scheme must improve the gaming experience. On the other hand, a game that does not have intuitive controls and makes the user uneasy when playing may diminish the overall entertainment.

To provide a better user experience, we proposed in past works the use of an adaptive controller that uses a mobile device to create a dynamic touchscreen interface. Our controller can present any layout, allowing game designers to design not only their games but also the joystick used to play them. Currently, games are played using physical joysticks that present a generic button configuration and layout, with a fixed number and position of buttons. Our work aims to remove this constraint, allowing each game to use the controller layout that best fits its needs. In follow-up works we proposed the dynamic improvement of the custom interface based on machine learning techniques and data captured from the user's touches on the screen. In this work we improve our proposal by including attention focus elements based on eye tracking approaches, in order to best fit the user's ergonomic needs and the current moment in the game.

2. RELATED WORK

This work presents the use of a mobile device as an adaptive controller for games. There are many different kinds of controllers; to name a few, we can mention gamepads, keyboards, steering wheels, and mobile devices. The study in [4] compared the usability, user experience, functionality, and design of several controllers, but it compared neither mobile devices as inputs nor adaptive devices. Mobile phones have specific hardware (camera, accelerometer, GPS, Bluetooth, and so on), much of it different from what is found in traditional game platforms, like video game consoles and PCs.

For this reason, these devices bring new forms of user interaction, albeit with the drawback of lacking tactile feedback [6].

Adaptive user interfaces are interactive software systems that improve their ability to interact with a user based on partial experience with that user [8]. Some of them are detailed in this section to provide useful background for this work.

Mobile touchscreen devices are very common. Most devices with such features have few physical buttons, with almost all input made by touch. Hence, mobile touchscreen games must be designed to accept most of their input by touch, used in a way similar to mouse clicks on a regular computer; this allows developers to make different types of mobile games based on virtual buttons. Consequently, the screen can draw buttons and use them to simulate button input.

In previous work [ANONYMOUS], the authors developed a controller that adapts itself according to the user's touch presses. The statistical analysis showed strong evidence that the adaptive controller improved the users' accuracy.

This work introduces new heuristics for controller adaptation based on eye tracking, which recognizes where the user is looking and then decides whether an adaptation is necessary. The use of eye tracking to estimate the user's gaze location is present in several works, such as [9]. It has also been used in games, mainly as a control method, a technique used in [14], and to evaluate the user's focus in gaming interfaces [15]. In our case, we track the user's focus to dynamically evaluate the best parameters for our adaptive interface.

Figure 1: A user playing a game with the adaptive controller.

3. PROPOSED INTERFACE

Our adaptive controller consists of a mobile application for a smart device (in our case, an Android smartphone or tablet) that presents a customizable graphical interface, specifically built to meet the game's needs. The controls are presented on the screen, and the user interacts with them by touching buttons to perform actions in a game running on a regular PC or game console. Figure 1 shows the prototype controller in action.

However, the interaction with the controller is not a static experience. Each user has personal ergonomic needs and preferences, so a generic controller is not capable of providing the best experience for the individual. Besides that, the game itself is not constant: the challenges, level design, and even control options change as the player progresses through it. To create a dynamic interface that follows this process, our adaptive controller constantly improves its interface to better fit both the player and the current moment in the game. Adaptations may be triggered by different causes, such as the context of the interaction, the experience of the user, or user behavior [2]. The use of a touchscreen device instead of physical hardware, like a traditional controller, allows a single device not only to provide custom interfaces for each game, but also to change its own layout so that it can respond to new requirements in the interaction.

In [ANONYMOUS], the authors developed agents that monitor player usage during the gameplay experience, tracking the user's touches and button interactions. Based on this data and using machine learning approaches, the system dynamically changed and adapted the buttons in order to better fit the specific player. In this work we propose a novel approach that uses eye tracking to perform better adaptations. In our experiment we focus on changes to the buttons' position and size, but other properties, such as shape and even the way of interacting with a button, can also be altered, since our controller allows any kind of interface element, including dragging on the touch-sensitive screen.

The communication between the mobile app on the smart device and the PC that is running the game is performed over the network, using the standard TCP protocol. Both devices must be on the same network, and an application on the device that is running the game (PC or console) is responsible for receiving the commands from the controller and translating them into local keyboard events that perform actions in the game. In this work we also capture eye tracking data and send it to the controller.
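
To make the link concrete, the following minimal sketch shows one way such a connection could work. The newline-delimited message format (e.g. "PRESS:A"), the port, and the button-to-key mapping are our own illustrative assumptions; the paper does not specify the actual protocol payload or the key-injection mechanism.

```python
import socket

# Assumed wire format: one newline-delimited command per message,
# e.g. "PRESS:A" or "RELEASE:A". The authors' real payload is not
# described in the paper.
BUTTON_TO_KEY = {"A": "z", "B": "x"}  # hypothetical controller-to-keyboard map

def send_button_event(sock, button, pressed):
    """Called by the controller app when a virtual button changes state."""
    command = ("PRESS" if pressed else "RELEASE") + ":" + button + "\n"
    sock.sendall(command.encode("utf-8"))

def receive_loop(host="0.0.0.0", port=5000):
    """Runs on the PC: receives commands and maps them to keyboard events."""
    server = socket.create_server((host, port))
    conn, _ = server.accept()
    buffer = ""
    while True:
        data = conn.recv(1024)
        if not data:
            break
        buffer += data.decode("utf-8")
        while "\n" in buffer:
            line, buffer = buffer.split("\n", 1)
            if not line:
                continue
            action, button = line.split(":", 1)
            key = BUTTON_TO_KEY.get(button)
            # A keyboard-event injection library would synthesize the
            # local key press/release for the game here.
            print(action, button, "->", key)
```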

4. CONTROLLER ADAPTATIONS BASED ON USER BEHAVIOR

4.1 Adaptation Based on the User's Touches

All user touches on the screen are tracked and kept in an internal database, which uses the 10 most recent touched points and the number of presses on each button to evaluate and perform adaptations that improve the user experience. For our tests, we manipulated button sizes and positions.
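
A minimal structure for this internal database could be a fixed-size ring buffer plus per-button counters; the class and field names below are illustrative assumptions, not the authors' implementation.

```python
from collections import defaultdict, deque

class TouchDatabase:
    """Keeps the 10 most recent touch points and per-button press counts."""
    def __init__(self):
        self.recent_points = deque(maxlen=10)   # oldest touches fall off
        self.press_counts = defaultdict(int)

    def record(self, x, y, button=None):
        """Store a raw touch; credit a button if the touch hit one."""
        self.recent_points.append((x, y))
        if button is not None:
            self.press_counts[button] += 1
```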

The size adaptation is based on the following heuristic: the number of touches is stored for each button, and the controller builds an ordered list, where the first button is the one pressed the most times and the last is the least used. The most used third of the buttons have their size increased, while the least used third decrease in size. In our tests we limited the button size increase to a maximum of twice its original size, while the lower bound was the button's own initial size. This list is constantly updated, so if a button stops being used frequently, it naturally moves down the list and decreases in size.
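
The heuristic fits in a few lines. This sketch is our reading of the description above; the data structures and the per-iteration step are illustrative assumptions.

```python
GROWTH_CAP = 2.0  # a button may grow to at most twice its original size

def adapt_sizes(press_counts, buttons, step=0.05):
    """press_counts: {name: presses}; buttons: {name: {"scale": float}},
    where scale 1.0 is the button's initial size (assumed structure)."""
    ranked = sorted(press_counts, key=press_counts.get, reverse=True)
    third = max(1, len(ranked) // 3)
    for name in ranked[:third]:            # most used third grows...
        b = buttons[name]
        b["scale"] = min(GROWTH_CAP, b["scale"] + step)
    for name in ranked[-third:]:           # ...least used third shrinks,
        b = buttons[name]                  # never below the initial size
        b["scale"] = max(1.0, b["scale"] - step)
```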

For the position adaptation, we used the K-means clustering algorithm [13], an unsupervised machine learning method. K-means receives a set of points and tries to cluster them into K classes, each one containing the corresponding points and a centroid. This centroid is the most important result, since it represents the center of all touches in a class.


During the interaction, the user performs correct touches on the buttons and incorrect touches in the surrounding areas. With K-means, these touches are grouped, and the centroid represents a position that allows the button to better cover the area where the user is actually trying to press. Each button is paired with the closest centroid found (if a centroid is close to it), and its position is changed so that the button's center matches the centroid.
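
A compact sketch of this pairing, using scikit-learn's KMeans as one possible implementation (the paper does not name a specific library; K, the pairing distance, and the step size are illustrative assumptions):

```python
import numpy as np
from sklearn.cluster import KMeans

def adapt_positions(touch_points, buttons, k=4, max_pair_dist=80.0, step=3.0):
    """Move each button toward the nearest touch-cluster centroid.

    touch_points: (N, 2) array of recent screen touches.
    buttons: {name: {"center": np.ndarray of shape (2,)}} (assumed structure).
    """
    points = np.asarray(touch_points, dtype=float)
    if len(points) < k:
        return
    centroids = KMeans(n_clusters=k, n_init=10).fit(points).cluster_centers_
    for b in buttons.values():
        dists = np.linalg.norm(centroids - b["center"], axis=1)
        nearest = int(np.argmin(dists))
        d = dists[nearest]
        if 0.0 < d <= max_pair_dist:   # pair only with a nearby centroid
            direction = centroids[nearest] - b["center"]
            # move gradually: at most `step` pixels per iteration,
            # matching the bounded per-iteration change described below
            b["center"] = b["center"] + direction * min(1.0, step / d)
```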

All these changes are made gradually, with each button changing its position or size by at most a constant number of pixels per iteration. The adaptations also follow some rules and constraints. Each time a button has to change its size or position, the controller verifies whether the change would make the button invade another button's area (an intersection). If so, the change is not made. However, it is important to note that the button will grow or move as much as it can without intersecting a neighbor, and that, if future layout changes move the neighbor out of the way, the desired change will then continue.
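
This constraint amounts to an overlap test applied before each step; a sketch, under the assumption that buttons occupy axis-aligned rectangles:

```python
def intersects(a, b):
    """Axis-aligned overlap test; a and b are (x, y, w, h) rectangles."""
    return not (a[0] + a[2] <= b[0] or b[0] + b[2] <= a[0] or
                a[1] + a[3] <= b[1] or b[1] + b[3] <= a[1])

def try_apply(button, proposed_rect, neighbor_rects):
    """Apply the proposed (already bounded) change only if it hits nothing."""
    if any(intersects(proposed_rect, n) for n in neighbor_rects):
        return False    # blocked for now; retried on later iterations
    button["rect"] = proposed_rect
    return True
```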

To better analyze the effect that our improved controller can have on the overall user experience, we used an eye tracking algorithm to collect information about focus during the gameplay experience. This data is used as another input to the adaptation, as explained in the next section.

4.2 Adaptation Based on Focus Detection

Gaming aims to be an immersive experience. In our study, we consider that the ideal controller should not distract the player from the game and must avoid breaking this immersion. As our controller relies on a touchscreen, and the lack of physical feedback can make the user miss buttons, a possible immersion break can be observed. If users cannot feel the position of a button, they have to look at the controller to determine the button's position visually. While the K-means algorithm is used to optimize the buttons' layout and correct the interface to avoid errors, its output is still a guess. With this situation in mind, we used the eye tracking algorithm presented in [16] and modified it to determine if and when the user may be losing focus on the game and looking back at the controller.

These events are used to allow the interface to perform a dynamic self-evaluation. In previous user tests [ANONYMOUS], the authors observed that the user interface would change constantly even if the user's play style was not changing. The feedback received from the users indicated that the adaptation was sometimes too aggressive. This resulted in cases where an interface already close to the optimum just kept changing, sometimes ending up in less adequate configurations. Another interesting observation was that users would look at the controller when they could not find the buttons they were trying to press. As the controller uses a touch interface, the lack of physical feedback, something always present on regular joysticks, means that the only way to locate a button is to look at the controller. But as the adaptive interface results in increased precision, the need to visually check a button's position can be reduced once the interface reaches a configuration closer to the optimal layout for a user.

We decided to track the user's attention focus to provide a new source of data for our adaptations. We used the computer's webcam to track the user's pupils and determine whether they are looking at the screen or at the controller, using the eye tracking method of [16]. Our prototype controller adjusts the aggressiveness of its adaptation according to the user's focus. If the user constantly looks at the controller, the speed of change for the size and position of the buttons is progressively increased. If the interface stabilizes and the user stops looking at the controller for more than 10 seconds, the controller slowly decreases the speed of the adaptation, stabilizing the interface and performing far fewer changes to its layout.
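
The paper adopts the gradient-based eye centre localization of Timm and Barth [16] but does not detail how "screen vs. controller" is decided. One simple possibility, assuming the pupil centre has already been estimated per webcam frame, is a vertical-offset threshold; the threshold and normalization below are purely illustrative.

```python
def looking_at_controller(pupil_y, eye_box_top, eye_box_height, thresh=0.65):
    """Classify a frame as a glance down at the pad (illustrative heuristic).

    pupil_y: vertical pupil-centre coordinate from a detector such as [16].
    The pupil's normalized position within the eye bounding box drops lower
    when the user looks down at the controller held in their hands.
    """
    rel = (pupil_y - eye_box_top) / eye_box_height
    return rel > thresh
```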

We change the speed of the adaptation using two parameters: the maximum change of a button's size and position per iteration (the algorithm is executed twice per second) and the number of points passed to the K-means clustering algorithm. The first parameter, when increased, makes the controller apply changes faster, altering its layout almost immediately, while a lower value results in slower changes. The second parameter, when decreased, results in fewer points being passed to the clustering algorithm, yielding an adaptation focused on the most recent interaction patterns and able to change the layout more dramatically in response to differences in the user's behavior. When this parameter is increased, the controller bases its suggestions on long-term characteristics of the gameplay session and is more conservative when performing changes.
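
Putting the two knobs together, the control loop might look like the following sketch. Only the 2 Hz iteration rate and the 10-second stabilization threshold come from the text above; the concrete step sizes, bounds, and window lengths are illustrative assumptions.

```python
import time

class AdaptationController:
    """Adjusts adaptation aggressiveness from gaze events (illustrative values)."""
    def __init__(self):
        self.max_step_px = 2        # max position/size change per iteration
        self.kmeans_window = 100    # number of touches fed to K-means
        self.last_gaze_at_pad = time.time()

    def on_gaze_at_controller(self):
        # User keeps looking down: adapt faster and favor recent touches.
        self.last_gaze_at_pad = time.time()
        self.max_step_px = min(8, self.max_step_px + 1)
        self.kmeans_window = max(20, self.kmeans_window - 10)

    def tick(self):
        # Executed twice per second (2 Hz), as stated in the paper.
        if time.time() - self.last_gaze_at_pad > 10.0:
            # No glance at the pad for 10 s: calm the adaptation down.
            self.max_step_px = max(1, self.max_step_px - 1)
            self.kmeans_window = min(200, self.kmeans_window + 10)
```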

We expect this approach to help avoid cases where the interface keeps adapting itself after finding an optimal configuration, performing unnecessary changes that could be detrimental to the user's experience.

5. USABILITY TESTS

In order to properly evaluate our proposed adaptation and the interaction with the end user, we conducted a usability test, observing parameters given by the controller and the eye tracking algorithm. To carry out this evaluation, the tests were divided into two stages: the pilot and the final user tests. The pilot was the preliminary test, and it served to set the parameters for the final test, determining the best possible configuration for the game and the adaptation. As the controller does not need to be identical to a physical one, the pilot test also provided insights to define its design.

5.1 Participants and Apparatus

The evaluation tested two different controllers that look alike but run different core algorithms. When the user looks at the screen, the first impression is that both controllers are equal, but one controller is adaptive and the other is not. It is important to note that while the user plays, the adaptive controller changes completely, including its visuals. The functionality of both is the same, so each button corresponds to the same action regardless of the version. Each participant tested both controllers without knowing which version was being tested at the moment.

The group of testers was composed of 8 users, with ages varying between 21 and 51. Our group consisted of 3 women and 5 men. The whole group tested both controller versions with one game, Streets of Rage. The users' levels of experience and profiles varied from experienced gamers, some of whom had played for more than ten years and still play, to casual players and people who usually do not play games at all, covering a wide range of possible player profiles.


subjective evaluation has shown that a final analysis requires a new test that totals the number of gazes the users direct at the controller.

Figure 3: The initial controller layout (top) and the final configuration achieved by the adaptive controller for a user (bottom).

7. CONCLUSION AND FUTURE WORK

Many people avoid starting to play games. One of the reasons is the need to interact with complex devices with large numbers of buttons and combinations, an issue capable of pushing them away from video games.

In this work we provided an experience where each user can have their own personalized game controller, automatically adapted to their ergonomic and personal preferences. Our test results showed that the adaptive controller can increase the user's precision, leading to fewer errors and a more comfortable interface. Additionally, the subjective evaluation demonstrated good acceptance by the users.

After proposing a new controller that can adapt to each user's behavior, our intention in future work is to include new tools in the proposed system that can be used by the game interface designer to devise their own initial interface, with an adequate number and layout of buttons. In this new paradigm, both the game and the machine learning algorithms would work together to personalize the interface. In this case, we would have a game controller for each user and for each game, resulting in unique combinations.

Measuring the user experience is also an area that must be explored. With this in mind, we would like to use an EEG (electroencephalography) headset to evaluate the user's emotions during gameplay, trying to determine the exact impact that a game controller can have on variables such as frustration, engagement, or excitement.

8. REFERENCES

[1] Bangor, A., Kortum, P., and Miller, J. Determining what individual SUS scores mean: Adding an adjective rating scale. Journal of Usability Studies, 4(3), 114-123, 2009.
[2] Bezold, M., and Minker, W. Adaptive Multimodal Interactive Systems. Springer, Boston, 2011.
[3] Brooke, J. SUS: A quick and dirty usability scale. In Usability Evaluation in Industry, 189(194), 4-7, 1996.
[4] Brown, M., Kehoe, A., Kirakowski, J., and Pitt, I. Beyond the gamepad: HCI and game controller design and evaluation. In Evaluating User Experience in Games (pp. 209-219). Springer London, 2010.
[5] [ANONYMOUS], details omitted due to double-blind reviewing.
[6] Joselli, M., Silva Junior, J. R. S., Zamith, M., Clua, E., and Soluri, E. A content adaptation architecture for games. In SBGames. SBC, 2012.
[7] Koster, R. Theory of Fun for Game Design. O'Reilly Media Inc., Sebastopol, 2013.
[8] Langley, P. Machine learning for adaptive user interfaces. In Brewka, G., Habel, C., and Nebel, B. (eds.), KI 1997. LNCS, vol. 1303. Springer, Heidelberg, 1997.
[9] Majaranta, P., and Bulling, A. Eye tracking and eye-based human-computer interaction. In Advances in Physiological Computing (pp. 39-65). Springer London, 2014.
[10] [ANONYMOUS], details omitted due to double-blind reviewing.
[11] Salen, K., and Zimmerman, E. Rules of Play: Game Design Fundamentals. The MIT Press, Cambridge, Massachusetts, 2004. ISBN 978-0-262-24045-1. "Game play is the formalized interaction that occurs when players follow the rules of a game and experience its system through play."
[12] Schell, J. The Art of Game Design: A Book of Lenses. CRC Press, Boca Raton, 2008.
[13] Smola, A., and Vishwanathan, S. Introduction to Machine Learning. Cambridge University Press, UK, 2008.
[14] Sundstedt, V. Gazing at games: using eye tracking to control virtual characters. In ACM SIGGRAPH 2010 Courses. ACM, 2010.
[15] Sundstedt, V., Bernhard, M., Stavrakis, E., Reinhard, E., and Wimmer, M. Visual attention and gaze behavior in games: An object-based approach. In Game Analytics (pp. 543-583). Springer London, 2013.
[16] Timm, F., and Barth, E. Accurate eye centre localisation by means of gradients. In VISAPP (pp. 125-130), 2011.
[17] [ANONYMOUS], details omitted due to double-blind reviewing.
[18] [ANONYMOUS], details omitted due to double-blind reviewing.