
Eye Tracking as Interface for Parametric Design

Henrik Ekeus
Queen Mary University of London
Media and Arts Technology
School of Electronic Engineering and Computer Science
[email protected]

Peter W. McOwan
Queen Mary University of London
Computer Vision Group
School of Electronic Engineering and Computer Science
[email protected]

Mark D. Plumbley
Queen Mary University of London
Centre for Digital Music
School of Electronic Engineering and Computer Science
[email protected]

Copyright is held by the author/owner(s).

CHI 2013 Workshop on "Gaze Interaction in the Post-WIMP World", April 27, 2013, Paris, France.

Abstract
This research investigates the potential of eye tracking as an interface to parameter search in visual design. We outline our experimental framework, in which a user's gaze acts as a guiding feedback mechanism in an exploration of the state space of parametric designs. A small-scale pilot study was carried out in which participants influenced the evolution of generative patterns by looking at a screen while having their eyes tracked. Preliminary findings suggest that although our eye tracking system can be used to effectively navigate small areas of a parametric design's state space, there are challenges to overcome before such a system is practical in a design context. Finally, we outline future directions of this research.

Author Keywords
parametric design, eye tracking, aesthetics, HCI

ACM Classification Keywords
F.1.2 [Computation by Abstract Devices]: Modes of Computation—Interactive and Reactive Computation; H.1.2 [Models and Principles]: User/machine Systems—Human information processing; H.5.2 [Information Interfaces and Presentation]: Input devices and strategies, User-centered design


General Terms
Design, Experimentation, Human Factors, Theory

Background

Parametric Design
The use of generative processes for the discovery of novel forms and patterns is not uncommon in the practices of contemporary artists, composers and designers. In architecture, parametric or 'morphogenetic design' [4] is a 'form finding' strategy where designers initiate (often biologically inspired) computational generative processes to yield novel forms or patterns. Such form finding consists not only of the specification of the generative algorithm, but also of a search for appropriate parameter values to input into the algorithm. This can be difficult when the generative processes are complex or the output is determined by numerous inter-connected components; a small change in parameters can have difficult-to-predict consequences. Moreover, design involves exploration and discovery; designers may not know a priori what the features of a desired output are. The present research explores the potential of eye tracking as an interface in the search for desirable parameter values.


Figure 1: a) Textural output consisting of overlapping entities, b) Output consisting of distinct entities

Eye Tracking as Design Tool
Eye tracking data is often used as feedback to inform the designs of user interfaces, web pages, advertisements and products on the supermarket shelf. Additionally, there has been a considerable amount of research into using gaze tracking as a control interface [1, 6, 7]. Research on eye tracking in image search tasks suggests that information in gaze patterns can be used to make inferences about human interests [3]. Eye tracking as the interface to a genetic algorithmic process has also been explored [8]. In such work, one of a number of discrete stimuli is selected, and these selections determine their evolution. This occurs in temporally discrete steps of selection and generation.

Approach
Our framework seeks to enable a continuous navigation of a generative system's state space. We use real-time eye tracking as the feedback mechanism to 'steer' the navigation of parameter space. This is a variant on gaze-contingent methodology, which is used widely in visual perception research [1].

Instances of a class of visual artefacts populate the screen. The output can either consist of individual, distinct generative artefacts, or can be used to evolve textures that are the result of spatially overlapping entities. To increase variety, we apply noise to the parameter values: a process of differentiation. The feedback from the eye tracker drives a process of integration, in which the parametric variety of the output decreases. The user's gaze counterbalances the noise processes, and this interaction determines the path travelled through state space.

Outputs
Essentially any visual generative system can be used, as long as: a) the output is local to an area of the screen, b) small changes in parameter values do not cause sudden changes in the visual output (otherwise the sudden changes will attract gaze), and c) the output can be rendered in real time.

Figure 1a shows an example of a parametric output that consists of many overlapping simple entities, creating textures. It takes three parameters: the size of a circle, its offset in x, and its offset in y. Figure 1b shows a simple output that consists of a few spatially distinct entities, in this case 'Pepsi' logos, whose parameters define their Bézier curves.
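As an illustration of how few parameters such an output needs, the following sketch draws an overlapping-circle texture of the kind shown in Figure 1a from just a size and an x/y offset. The lattice geometry and the drawCircle callback are our own assumptions for the example, not the authors' rendering code.

```cpp
#include <cstdio>
#include <functional>

// A lattice of circles whose radius and x/y offsets each come from one
// parameter in [0,1]; circles start to overlap once size exceeds 0.5.
void renderCircleTexture(int gridCols, int gridRows, float cellPx,
                         float size, float offsetX, float offsetY,
                         const std::function<void(float, float, float)>& drawCircle) {
    float radius = size * cellPx;                 // size parameter -> pixels
    for (int gy = 0; gy < gridRows; ++gy) {
        for (int gx = 0; gx < gridCols; ++gx) {
            float cx = (gx + offsetX) * cellPx;   // x offset shifts the lattice
            float cy = (gy + offsetY) * cellPx;   // y offset shifts the lattice
            drawCircle(cx, cy, radius);
        }
    }
}

int main() {
    // Print the circle centres instead of drawing them.
    renderCircleTexture(4, 3, 40.0f, 0.6f, 0.25f, 0.5f,
                        [](float x, float y, float r) {
                            std::printf("(%.0f, %.0f) r=%.0f\n", x, y, r);
                        });
}
```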


Parameter Space
In the software, each of the parameters of our generative output is represented as a two-dimensional plane, virtually existing behind the screen. Thus if our artefacts have three parameters, there are three planes. For an artefact at a given screen location, its rendering parameters are drawn from the parameter planes at the same screen position (Figure 4).

Figure 2: Changing parameter values (a) are mapped to a trivial output, the size of circles (b). Temporal blur is applied (c).

Figure 3: Perlin noise changing over a short period of time

Figure 4: Parameter planes - values on the planes map to parameters of artefacts at the corresponding area of the screen

The parameter planes contain floating point values between 0 and 1; however, using them can require scaling to discrete integers (for instance, the radius of a circle is measured in pixels). In these cases, discrete boundaries appear in the visual output across the population of artefacts. Since edges and sudden changes are salient and attract the gaze, we apply a temporal blur to the output, turning the discrete changes into smooth transitions (Figure 2).
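A minimal sketch of these two ideas, assuming a low-resolution grid per parameter and per-frame exponential smoothing as one way to realise the 'temporal blur'; the class and function names are ours, not taken from the system described above.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// One 2D plane of values in [0,1], conceptually sitting behind the screen.
struct ParameterPlane {
    int cols, rows;
    std::vector<float> values;   // row-major

    ParameterPlane(int c, int r) : cols(c), rows(r), values(c * r, 0.5f) {}

    // Sample the plane at a normalised screen position (x, y in [0,1]).
    float sample(float x, float y) const {
        int cx = std::min(cols - 1, static_cast<int>(x * cols));
        int cy = std::min(rows - 1, static_cast<int>(y * rows));
        return values[cy * cols + cx];
    }
};

// An artefact keeps a temporally blurred copy of the value it actually draws.
struct Artefact {
    float x, y;                    // normalised screen position
    float displayedRadius = 0.0f;  // smoothed, so quantisation jumps stay hidden
};

// Per-frame update: quantise the sampled value to whole pixels (the step that
// would otherwise create visible boundaries), then blend towards it slowly.
void updateRadius(Artefact& a, const ParameterPlane& radiusPlane,
                  float maxRadiusPx, float blendPerFrame) {
    float raw = radiusPlane.sample(a.x, a.y);        // value in [0,1]
    float target = std::round(raw * maxRadiusPx);    // discrete pixel radius
    a.displayedRadius += blendPerFrame * (target - a.displayedRadius);
}
```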

Differentiation
At any one time a range of parameter values is represented on screen, the values changing each frame. To ensure that there are no sudden and salient changes in parameter values, three-dimensional Perlin noise [5] (interpreted as two spatial dimensions and one temporal) is used to drive the change. Perlin noise is a method of efficiently generating graphical textures. Often used to model water, smoke and clouds, it is an elegant way to provide both temporally and spatially continuous, yet random, values across a plane (Figure 3). The Perlin noise algorithm relies on a deterministic pseudo-random number generator, allowing for the possibility to replay an exact evolution and repeat experiments.

Each frame, Perlin noise is added to the values in the parameter planes. This ensures that changes in the output are continuous and that neighbouring instances of the output artefacts remain similar. How much the Perlin noise affects an area of the screen at any one time is determined by the eye tracking data, as explained in the next section. When no gaze is detected, the system does a kind of random 'drunken walk' through the space of possible parameter values.
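The per-frame differentiation step might look roughly like the sketch below. It substitutes a simple deterministic value-noise lattice for the 3D Perlin noise the system actually uses, purely so the structure is visible; the weight array anticipates the gaze-derived attenuation described in the next section, and a fixed seed is what would make a run replayable.

```cpp
#include <cmath>
#include <vector>

// Deterministic hash -> pseudo-random float in roughly [-1, 1]; the same seed
// always produces the same sequence, so an evolution can be replayed exactly.
float hashNoise(int x, int y, int t, unsigned seed) {
    unsigned h = seed;
    h ^= 374761393u * static_cast<unsigned>(x);
    h ^= 668265263u * static_cast<unsigned>(y);
    h ^= 2246822519u * static_cast<unsigned>(t);
    h = (h ^ (h >> 13)) * 1274126177u;
    return (h % 20001) / 10000.0f - 1.0f;
}

// Trilinear interpolation of the lattice noise, so neighbouring cells (and
// consecutive frames) change continuously rather than jumping.
float smoothNoise(float x, float y, float t, unsigned seed) {
    int xi = static_cast<int>(std::floor(x));
    int yi = static_cast<int>(std::floor(y));
    int ti = static_cast<int>(std::floor(t));
    float fx = x - xi, fy = y - yi, ft = t - ti;
    float v = 0.0f;
    for (int dx = 0; dx <= 1; ++dx)
        for (int dy = 0; dy <= 1; ++dy)
            for (int dt = 0; dt <= 1; ++dt) {
                float w = (dx ? fx : 1 - fx) * (dy ? fy : 1 - fy) * (dt ? ft : 1 - ft);
                v += w * hashNoise(xi + dx, yi + dy, ti + dt, seed);
            }
    return v;
}

// Per-frame differentiation: nudge every cell of a parameter plane by the
// noise field, scaled by a per-cell weight (1 = full drift, 0 = frozen).
void applyNoise(std::vector<float>& plane, int cols, int rows,
                float time, float stepSize, const std::vector<float>& weight,
                unsigned seed) {
    for (int y = 0; y < rows; ++y)
        for (int x = 0; x < cols; ++x) {
            int i = y * cols + x;
            float n = smoothNoise(x * 0.1f, y * 0.1f, time, seed);
            float v = plane[i] + stepSize * weight[i] * n;
            plane[i] = std::fmin(1.0f, std::fmax(0.0f, v));   // clamp to [0,1]
        }
}
```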

Integration - Interpreting Gaze Data
To determine which parameter values are being attended to more than others, the stream of eye-tracking data is processed to calculate a continually moving 'heat-map' representing the fixation areas in the recent past (approximately 1 second, see Figure 5). This process acts like a de-noising, low-pass filter, smoothing out rapid saccadic eye movements and emphasising fixations. This 'heat-map' is mapped to the parameter planes to determine the parameter values of the artefacts most attended to.
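One way to build such a rolling heat-map is an exponentially decaying accumulation of Gaussian blobs around each gaze sample. This is a sketch under our own assumptions (the tracker's API and the exact kernel are not specified in the text): gaze is assumed to arrive once per frame as normalised screen coordinates.

```cpp
#include <cmath>
#include <vector>

struct HeatMap {
    int cols, rows;
    std::vector<float> heat;     // one cell per screen region
    float decayPerFrame;         // e.g. ~0.95 at 60 fps gives roughly 1 s of memory

    HeatMap(int c, int r, float decay)
        : cols(c), rows(r), heat(c * r, 0.0f), decayPerFrame(decay) {}

    // Call once per frame with the latest gaze point in normalised [0,1] coords.
    void addGazeSample(float gx, float gy) {
        for (float& h : heat) h *= decayPerFrame;   // forget the older past
        // Deposit a small Gaussian blob around the gaze point so fixations
        // near a cell still contribute to it; saccades passing through leave
        // little energy behind once the decay has acted.
        for (int y = 0; y < rows; ++y)
            for (int x = 0; x < cols; ++x) {
                float cx = (x + 0.5f) / cols, cy = (y + 0.5f) / rows;
                float d2 = (cx - gx) * (cx - gx) + (cy - gy) * (cy - gy);
                heat[y * cols + x] += std::exp(-d2 / 0.005f);
            }
    }
};
```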



Each frame, the further an area's parameter values are from these attended values, the more it is affected by the Perlin noise. This has the effect of slowing and eventually freezing the parameters of artefacts similar to the ones most looked at. Additionally, there is a weak 'gravitational pull', or bias, in the change of parameter values towards the average value of each parameter plane. This constrains the range of parameter values represented on screen at any one time.
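Under the same illustrative assumptions as the earlier sketches, the integration step could reduce to the following: the heat-map picks out an 'attended' value per plane, noise weights grow with distance from it, and a weak pull drags every cell towards the plane's mean. Names and constants are ours, not the authors'.

```cpp
#include <cmath>
#include <numeric>
#include <vector>

// plane, noiseWeight and heat are assumed to have the same size and layout.
void integrateGaze(std::vector<float>& plane, std::vector<float>& noiseWeight,
                   const std::vector<float>& heat, float pullStrength) {
    // Heat-weighted average of the plane = the parameter value most attended to.
    float heatSum = std::accumulate(heat.begin(), heat.end(), 0.0f);
    float attended = 0.5f;
    if (heatSum > 0.0f) {
        float acc = 0.0f;
        for (size_t i = 0; i < plane.size(); ++i) acc += heat[i] * plane[i];
        attended = acc / heatSum;
    }
    float mean = std::accumulate(plane.begin(), plane.end(), 0.0f) / plane.size();

    for (size_t i = 0; i < plane.size(); ++i) {
        // Cells far (in parameter value) from the attended value keep drifting;
        // cells near it are progressively frozen.
        noiseWeight[i] = std::fabs(plane[i] - attended);
        // Weak 'gravitational pull' towards the plane's current mean.
        plane[i] += pullStrength * (mean - plane[i]);
    }
}
```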

If gaze is fixated on just one point, the range of parameter values represented gradually decreases until all the artefacts on screen are identical. Gaze fixations thus engender a kind of 'zooming' in the abstract plane of parameter values, enabling a small area of parameter space to be explored in greater detail. A similar zooming process is employed in the text input system Dasher [7] when using gaze for character selection. By gazing towards the desired character, the interface zooms and the area around the fixation grows, making it easier to select the intended character. In our system gaze also drives a zoom, but in parameter space.

Figure 5: Heat map representing a rolling average of gaze fixations in the recent past

Implementation
The eye tracking system used is Face Lab 5 by Seeing Machines, a non-contact optical tracker. It sends the screen coordinates of the subject's gaze in real time to the application driving the visual output, which is developed in C++.

Pilot Study
Gaze is understood to be "both simultaneously bottom-up, stimulus-driven as well as top-down, goal-oriented" [2]. This distinction between volitional and non-volitional gaze informed the design of the study, so that both scenarios could be observed. In the first part of the study, participants passively observed the happenings on screen; in the second part, they were told to look for a particular pattern. The participants were not told that their gaze affected the output until after the end of the experiment.

The output for this study was a very simple parametric design consisting of overlapping circles (see Figure 1a); it has three parameter planes, one for the size of the circles, one for the offset in x, and one for the offset in y. Even though it is a very simple design, it can produce a wide variety of visually distinct outputs as the circles overlap with each other and form patterns.

For all participants the Perlin noise sequence was the same, and the system started in the 'middle' of the state space. The study was carried out with 6 participants, and each task lasted 5 minutes. However, some participants reported getting fatigued, and as a result only 3 of the 6 participants felt comfortable completing all tasks.

Part 1: Passive Observation
In this part, participants were asked to observe a video on screen. This was repeated three times, with the strength with which gaze affected the output varied from 'strongly' to 'moderately' to 'not at all'. In addition to getting some first insights into the dynamics of the system in use, an aim was to see whether multiple subjects end up taking similar routes through state space. Although numerical analysis of this data is not yet complete, users reported finding their gaze clearly attracted to some outputs and repelled by others.

In these five-minute navigations through the parameter space, the participants never travelled far from the starting point in the middle of the state space. The design of the system, which relies on making changes in the output almost imperceptible, made for a slow but detailed exploration of a small subset of the parameter space. An exhaustive exploration of even this modest three-parameter design would take a prohibitively long time with this interface.

Figure 6: Outputs of the parametric design used in the pilot study. (a) The output at the starting position in the middle of the state space. (b) The 'near' search target. (c) The 'middle' search target. (d) The 'far' search target.

Part 2: Search Task
In this part of the study, participants were asked to search for specific patterns. There were three search targets (see Figure 6). Each target is a point in parameter space, and the Euclidean distance between each target and the starting point in the middle of the parameter space varied from 'near' to 'middle' to 'far'. The participants were told that the pattern they were looking for 'might or might not' appear, but that if it did they should try to spot it as soon as possible. Again, they were not told that their gaze affected the output. Each search had a time limit of five minutes; if the target was found, the trial stopped early.
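For concreteness, the 'near'/'middle'/'far' labelling amounts to a Euclidean distance in the three-dimensional parameter space. The target coordinates in the usage comment below are invented placeholders, not the values used in the study.

```cpp
#include <array>
#include <cmath>

using ParamPoint = std::array<float, 3>;   // size, offset-x, offset-y, each in [0,1]

// Euclidean distance between two points in the three-parameter space.
float paramDistance(const ParamPoint& a, const ParamPoint& b) {
    float d2 = 0.0f;
    for (int i = 0; i < 3; ++i) d2 += (a[i] - b[i]) * (a[i] - b[i]);
    return std::sqrt(d2);
}

// Example usage (placeholder coordinates):
//   ParamPoint start{0.5f, 0.5f, 0.5f};      // middle of the state space
//   ParamPoint target{0.7f, 0.4f, 0.5f};     // a hypothetical target
//   float d = paramDistance(start, target);  // ~0.22, i.e. a 'near' target
```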

All four of the participants who undertook this part of the study ended up steering the system to the 'near' search target (though one participant did not notice that they had). None of the participants reached the 'far' target. One reached the 'middle' target, with another getting close.

Because the parametric design used in this experiment consisted of overlapping entities, there is no guarantee that visual similarity coincides with parametric proximity. Consequently, there are cases where the output at a certain point in parameter space is visually similar to one far away. (This is less frequently the case with non-overlapping designs such as that in Figure 1b.) So searching based on visual similarity could lead to a 'dead end' in the navigation, and the number of such 'dead ends' increases with the parametric distance to the search target. However, over short parametric distances visual similarity does coincide with parametric proximity, and our study seems to indicate that at these scales it is easy to lead the system to a target output just by visually searching for it. Designers do not usually have a clear search target in mind when working with parametric designs (if they did, they would have already finished their design!), but if they 'know it when they see it', perhaps our framework could assist in a design context.

Evaluation
Fatigue as an issue has been noted in other studies, and it has been suggested that eye tracking may be better suited as an assistive and supporting interface, rather than as the sole means of control [6]. In our case, it is likely that the choice of parametric output contributed further to the fatigue, as the output in the tasks was reported as being 'trippy' and 'intense' and as consisting of "too much input". Participants suggested that some of the fatigue could have come from the cognitive load of trying to work out what the animation was doing. One participant reported, for instance, that "I knew it was changing, but I couldn't actually see it change, I was trying to work it out". Interestingly, most participants found the search tasks less tiring than the passive observation.

After the experiment, the participants were debriefed, and only then was it revealed how their gaze had been controlling the animation. They were then shown how to use their gaze to consciously 'will' the patterns at one area of the screen to spread to the rest, which all participants could then do easily. This indicates that there is potential for the system to become a consciously controlled design interface; however, it is clear from the pilot study that this interface is not suitable for travelling across a large state space quickly. These considerations inform the direction of future work.


Further Developments
We are currently exploring the potential of this system as a practical tool for parametric design. To achieve this, eye tracking is no longer the sole means of control, but is just one way for the user to interact with the system. A physical controller with motorised sliders (see Figure 7) allows the designer to quickly navigate to any point in the state space with their hands. Each slider is assigned to a parameter value, and can either be moved by the designer or move by itself to reflect the state of the system. The framework will allow the designer to control which parameters they want under eye tracking control and which under manual control. The designer could, for instance, find an approximate design manually, and then put one or more of the parameters under gaze control to explore a detailed area of the state space. Spatially distinct entities will be used as outputs, like the logos in Figure 1b. This is a much less visually busy output, and will help reduce fatigue.

Figure 7: Motorised slider interface to allow for navigation of the state space with the hands.
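As a rough sketch of the mixed control described above, and under our own assumptions throughout (the SliderBoard wrapper and the per-parameter mode flag are hypothetical, not part of the existing implementation): manually controlled parameters follow the hands, while gaze-controlled parameters write their value back to the motorised faders so the hardware always mirrors the system state.

```cpp
#include <vector>

enum class ControlMode { Gaze, Manual };

struct ParameterChannel {
    float value = 0.5f;                 // current parameter value in [0,1]
    ControlMode mode = ControlMode::Gaze;
};

// Hypothetical wrapper around the motorised fader hardware: reading a fader
// gives its position, writing would drive its motor.
struct SliderBoard {
    std::vector<float> pos;             // stand-in for the physical fader positions
    float read(int ch) const { return pos[ch]; }
    void  write(int ch, float v) { pos[ch] = v; }
};

// Per-frame reconciliation; params and gazeDrivenValues are assumed to be the
// same length, one entry per parameter.
void updateParameters(std::vector<ParameterChannel>& params, SliderBoard& board,
                      const std::vector<float>& gazeDrivenValues) {
    for (int i = 0; i < static_cast<int>(params.size()); ++i) {
        if (params[i].mode == ControlMode::Manual) {
            params[i].value = board.read(i);              // hands lead
        } else {
            params[i].value = gazeDrivenValues[i];        // gaze leads
            board.write(i, params[i].value);              // fader follows
        }
    }
}
```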

Conclusions
We outlined an experimental framework where a user's gaze acts as a guiding mechanism in a navigation of the state space of parametric designs. A small-scale pilot study was carried out where participants influenced the evolution of generative patterns with their gaze. Though there were flaws in the study, it seemed to indicate that this system can be used to effectively navigate small areas of a parametric design's state space. We outlined future directions of this work, including developing the system into a more practical design interface with the incorporation of a physical controller.

Acknowledgments
This work is supported by an EPSRC Doctoral Training Centre EP/G03723X/1 (HE), an EPSRC Leadership Fellowship EP/G007144/1 (MDP) and EPSRC IDyOM2 EP/H013059/1.

References
[1] Duchowski, A. A breadth-first survey of eye-tracking applications. Behavior Research Methods 34, 4 (2002), 455–470.
[2] Duchowski, A. Eye tracking methodology: Theory and practice.
[3] Haji Mirza, S., Proulx, M., and Izquierdo, E. Reading users' minds from their eyes: A method for implicit image annotation. IEEE Transactions on Multimedia 99 (2012), 1.
[4] Hensel, M., Menges, A., and Weinstock, M., Eds. Emergence: Morphogenetic Design Strategies (Architectural Design). Academy Press, 2004.
[5] Perlin, K., and Hoffert, E. M. Hypertexture. In SIGGRAPH '89: Proceedings of the 16th Annual Conference on Computer Graphics and Interactive Techniques, ACM (1989).
[6] Stellmach, S., and Dachselt, R. Look & touch: Gaze-supported target acquisition. In CHI '12: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM (2012).
[7] Tuisku, O., Majaranta, P., Isokoski, P., and Räihä, K.-J. Now Dasher! Dash away!: Longitudinal study of fast text entry by eye gaze. In ETRA '08: Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, ACM (2008).
[8] Verma, M., and McOwan, P. W. Generating customised experimental stimuli for visual search using Genetic Algorithms shows evidence for a continuum of search efficiency. Vision Research 49, 3 (Feb. 2009), 374–382.