


Copyright © 2010 by the Association for Computing Machinery, Inc. Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions Dept, ACM Inc., fax +1 (212) 869-0481 or e-mail [email protected]. ETRA 2010, Austin, TX, March 22 – 24, 2010. © 2010 ACM 978-1-60558-994-7/10/0003 $10.00

The Use of Eye Tracking for PC Energy Management

Vasily G. Moshnyaga Department of Electronics Engineering and Computer Science

Fukuoka University, Japan [email protected]

Abstract

This paper discusses a new application of eye-tracking, namely power management, and outlines its implementation in a personal computer system. Unlike existing power management technology, which “senses” a PC user through the keyboard and/or mouse, our technology “watches” the user through a single camera. The technology tracks the user’s eyes, keeping the display active only if the user looks at the screen. Otherwise it dims the display down or even switches it off to save energy. We implemented the technology in hardware and present the results of its experimental evaluation.

CR Categories: K.6 [Management of Computing and Information Systems]; K.6.4 [System Management]

Keywords: Eye tracking, applications, energy reduction

1 Introduction

With the wide popularity of user-centric applications, the role of smart, intelligent devices capable of monitoring human eyes is increasing. To date, eye-tracking has been applied in areas such as HCI, security systems, health care, assistive technologies, and ubiquitous computing [Morimoto 2004]. In this paper, we discuss a new application of eye-tracking, namely power management, and outline its hardware implementation in a personal computer system.

Modern PCs burn about half of their total energy in the display [Mahesri 2005]. To reduce energy consumption, the OS-based Advanced Configuration and Power Interface (ACPI) sets the display to low-power modes after specified periods of inactivity on the mouse and/or keyboard [ACPI 2004]. The efficiency of ACPI strongly depends on the inactivity intervals set by the user. On the one hand, if the inactivity intervals are too short, e.g. 1 or 2 minutes, ACPI can be quite troublesome, shutting the display off when it must be on. On the other hand, if the inactivity intervals are set long, ACPI’s efficiency decreases. Because modifying the intervals requires changing system settings, half of the world’s PC users never adjust the power management of their PCs for fear that it will impede performance [Fujitsu 2007].

Those who do adjust it usually assign long intervals. As HP [Global Citizenship Report 2006] reveals, merely enabling the low-power mode after 20 minutes of inactivity can save up to 381 kWh per PC per year. Clearly, PC power management must employ more efficient user monitoring.

Several approaches have been proposed to improve PC user monitoring, among them extending the touch-pad function for user presence detection [Park 1999] and placing thermal sensors around the display [Dai 2003]. Despite their differences, all these approaches share one drawback: they ignore the viewer’s attention. Paradoxically, while the display is needed only for our eyes, none of the existing approaches, to our knowledge, takes them into account. Neither ACPI nor temperature sensors nor advanced touchpad screeners can distinguish whether the user looks at the screen or not. As a result, they may either switch the display off inappropriately (i.e. when the user looks at the screen without pressing a key) or keep the display active when it is not needed.

We propose to apply eye-tracking to PC energy management. Unlike existing technologies, which “sense” a PC user through the keyboard, touchpad and/or mouse, our technology “watches” the user through a single camera. More precisely, it tracks the user’s eyes to detect whether he or she looks at the screen and, based on that, changes the display brightness and power consumption.

2 The Proposed Technology

2.1 An Overview

The proposed technology is based on the following assumptions:

A. The PC is equipped with a color video camera. The camera is located at the top of the display. When the user looks at the display, the face is frontal to the camera.

B. The display has a number of backlight intensity levels, with the highest level corresponding to the largest power consumption and the lowest level to the smallest, respectively. The highest intensity level is enabled either initially or whenever the user looks at the screen.

The idea is simple: the camera readings determine the display power mode. If no human face is detected in the current video frame, the display is switched off. Otherwise, we track the user’s eye-gaze. If the gaze has been off the screen for more than N consecutive frames, the backlight luminance is dimmed down to the next level. Any on-screen gaze reactivates the initial backlight luminance, moving the display into the powered-up mode. However, if no on-screen gaze has been detected for more than N frames and the backlight luminance has already reached the lowest level, the display enters standby mode. Returning from either standby or off mode is done by pushing the ON button. Below we describe the technology in detail.
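As a concrete illustration, the per-frame decision logic above can be sketched as a small state machine. The constants and names below (N, LEVELS, DisplayPowerManager) are illustrative assumptions, not the authors' implementation:

```python
N = 15                        # frames of off-screen gaze before stepping down (assumed)
LEVELS = [35.0, 25.0, 15.6]   # illustrative backlight power levels in watts, brightest first

class DisplayPowerManager:
    """Sketch of the per-frame power-mode decision of Section 2.1."""

    def __init__(self):
        self.level = 0          # index into LEVELS; 0 = brightest
        self.off_frames = 0     # consecutive frames without an on-screen gaze
        self.state = "active"   # "active", "standby" or "off"

    def press_on(self):
        # Standby/off modes are left only via the ON button.
        self.level, self.off_frames, self.state = 0, 0, "active"

    def update(self, face_present, gaze_on_screen):
        if self.state in ("standby", "off"):
            return self.state                   # wait for the ON button
        if not face_present:
            self.state = "off"                  # no user in front of the camera
        elif gaze_on_screen:
            self.level, self.off_frames = 0, 0  # restore full brightness
        else:
            self.off_frames += 1
            if self.off_frames > N:
                self.off_frames = 0
                if self.level + 1 < len(LEVELS):
                    self.level += 1             # dim to the next backlight level
                else:
                    self.state = "standby"      # lowest level and still no gaze
        return self.state
```

A caller would feed `update()` the face-presence and gaze flags produced for each processed frame and map `level`/`state` to the backlight driver.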

*The work was supported by The Ministry of Education, Culture, Sports, Science and Technology of Japan under the Knowledge Cluster Initiative (The Second Stage) and Grant-in-Aid for Scientific Research (C) No.21500063.

113


2.2 User Presence Detection

The goal of this task is to determine from the camera readings whether or not the user is currently present in front of the display. To detect the user’s presence, we first localize the face search by applying background subtraction and skin-color segmentation to the RGB representation of the input image. Skin is defined by the following criteria [Douxchamps 2008]: 0.55 < R < 0.85, 1.15 < R/G < 1.19, 1.15 < R/B < 1.5 and 0.6 < (R+G+B) < 1.8. To accelerate the face-area extraction, two additional filters are used. The first limits the size of the head to a reasonable range. The second verifies that the face contains a minimum of 25% skin-colored pixels. If the total number of pixels in the derived face area exceeds a given threshold, the user is assumed present.
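A minimal sketch of the skin-color test, assuming (as the criteria above suggest) that R, G, B are normalized so each channel lies in [0, 1]:

```python
def is_skin(r, g, b):
    """Per-pixel skin test from the criteria of [Douxchamps 2008].

    r, g, b are assumed normalized to [0, 1]; the thresholds are
    taken verbatim from the text above.
    """
    if g == 0 or b == 0:
        return False  # guard the ratio tests against division by zero
    return (0.55 < r < 0.85 and
            1.15 < r / g < 1.19 and
            1.15 < r / b < 1.5 and
            0.6 < (r + g + b) < 1.8)
```

In practice this predicate would be applied (vectorized) to every pixel surviving background subtraction, yielding the candidate face area.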

2.3 Eye-Gaze Detection

The eye-gaze detector implements the algorithm proposed by [Kawato 2005], which scans the Six-Segment Rectangular (SSR) filter over the integral representation of the input image to find the Between-The-Eyes (BTE) pattern of a human face (Fig. 1) and then searches regions 1 and 3, on the left and right sides of the BTE pattern, to locate the eyes. The algorithm does not depend on illumination, face occlusion or eye closure. It is more stable, robust and less complex than other eye-tracking formulations. However, it is still very computationally demanding. In a quest to locate all faces in an image (without restriction on face size, motion or rotation), the algorithm scans the whole image six times, performing over 28M operations per 640x480 frame. Though such a full search might be necessary in some applications, it is redundant when tracking the eyes of a PC user.

In our eye-tracking application we can assume that:

1. The target object is a single PC user. The user sits in front of the PC at a relatively close distance of 50-70 cm.
2. The user’s motion is slow relative to the frame rate.
3. The background is stable and constant.

Based on these assumptions, we apply the following algorithmic optimizations to reduce eye-tracking complexity [Yamamoto and Moshnyaga 2009]:

• Fixed SSR filter size: when the user is 50-70 cm from the camera, a BTE interval of 55 pixels and a filter size ratio of 2:3 ensure minimal computational complexity at an almost 100% detection rate.

• Single SSR filter scan: this follows from the single-user assumption and the fixed SSR filter size.

• Pixel displacement of the SSR filter during the scan: experiments showed that the computational complexity decreases by a factor of 3 for a displacement of 2, and by a factor of 4.5 for a displacement of 3, without affecting the detection rate of the original (full-scan) algorithm.

• Low frame processing rate (5-10 fps): because the user's motion is very slow, high processing rates are redundant.
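To see why a larger scan displacement helps, one can simply count filter placements. The sketch below is a rough model only (the 3x and 4.5x factors reported above also account for costs other than placements):

```python
def scan_positions(width, height, fw, fh, step):
    """Number of placements of an fw x fh filter scanned over a
    width x height image with the given displacement (stride)."""
    xs = (width - fw) // step + 1   # placements along x
    ys = (height - fh) // step + 1  # placements along y
    return xs * ys

full = scan_positions(640, 480, 30, 20, 1)
strided = scan_positions(640, 480, 30, 20, 2)
# A displacement of 2 in both directions cuts placements roughly 4x.
```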

Fig. 2 shows the modified algorithm. For the first frame, or any frame in which the search for a BTE candidate was unsuccessful, we search the image area reduced by background and skin-color extraction; otherwise the search is restricted to a small area (S) of ±8 pixels around the previously located BTE pattern. For the chosen area, the algorithm first transforms the green component of the corresponding image into the integral image representation and then scans it with the SSR filter to select a BTE candidate. If a BTE candidate is found, the system takes it as a starting point to locate the eyes. If eyes are detected, the user is assumed to be looking at the screen; otherwise not. If no BTE candidate is found, the user is considered to be not looking at the screen. To detect a BTE pattern in an image, we scan the SSR filter over the search area S in a row-first fashion and at each location compare the integral sums of the rectangular segments corresponding to the eyes, cheeks and nose (i.e. 1 and 2, 1 and 4, 3 and 2, and 3 and 6) as follows:

Sum(1) < Sum(2) & Sum(1) < Sum(4)
Sum(3) < Sum(2) & Sum(3) < Sum(6)     (2)

If the above criteria are satisfied, the SSR is considered a candidate for the BTE pattern (i.e. a face), and two local minimum (i.e. dark) points are extracted from regions 1 and 3 of the SSR for the left and right eyes, respectively. The eye-localization procedure is organized as a scan over the green-plane representation of regions 1 and 3 for a continuous segment of dark pixels (i.e. pixels whose value is lower than a threshold k). During the search for the eyes, we ignore 2 pixels at the border of the regions to avoid effects of eyebrows, hair and beard. Also, because the eyebrows have almost the same grey level as the eyes, the search starts from the lowest positions of regions 1 and 3. Similarly to [Kawato 2000], we assume that the eyes are located if the distance between the located eyes (D) and the angle (A) at the center point of BTE area 2 (see Fig. 3, left) satisfy: 30 < D < 42 & 115° < A < 180°. If both eyes are detected, the user’s gaze is considered to be on the screen. The eye positions found in the current frame are then used to reduce the complexity of processing successive frames: the search in the next frame is limited to a small region which spans 8 pixels in the vertical and horizontal directions around the eye points of the current frame.
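The BTE test above can be sketched with an integral image (summed-area table). The code assumes a segment layout of 1 2 3 on the top row and 4 5 6 on the bottom, which is consistent with criteria (2) (eyes in 1 and 3, between-the-eyes in 2, cheeks in 4 and 6); the function names are mine, not the authors':

```python
import numpy as np

def integral_image(gray):
    # Summed-area table with a zero first row/column for easy box sums.
    ii = np.zeros((gray.shape[0] + 1, gray.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(gray, axis=0), axis=1)
    return ii

def box_sum(ii, x, y, w, h):
    # Sum of pixels in the rectangle [y, y+h) x [x, x+w), in O(1).
    return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]

def is_bte_candidate(ii, x, y, w, h):
    """Apply criteria (2) to an SSR filter placed at (x, y) with size w x h.

    Segments, row-major (assumed layout): 1 2 3 / 4 5 6.
    """
    sw, sh = w // 3, h // 2
    s = {}
    for idx, (cx, cy) in enumerate(
            [(0, 0), (1, 0), (2, 0), (0, 1), (1, 1), (2, 1)], start=1):
        s[idx] = box_sum(ii, x + cx * sw, y + cy * sh, sw, sh)
    # Eye segments 1 and 3 must be darker than the between-the-eyes
    # segment 2 and the cheek segments 4 and 6 below them.
    return s[1] < s[2] and s[1] < s[4] and s[3] < s[2] and s[3] < s[6]
```

Scanning `is_bte_candidate` over the search area S in a row-first fashion, with the stride discussed in Section 2.3, reproduces the candidate-selection step.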

Figure 1: Illustration of the SSR filter



Figure 2: The modified eye-tracking algorithm



Fig. 3 (right) demonstrates the search-area reduction by our algorithm: the dashed line shows the area defined by background extraction; the dotted line depicts the area obtained by skin-color segmentation; the plain (dark) line shows the area around the BTE pattern found in the previous image frame; white crosses show the computed locations of the eyes.

3 Implementation

We implemented the proposed PC display power management system in hardware. Fig. 4 outlines the block diagram of the system. The user-tracking unit receives an RGB color image and outputs two logic signals, u1 and u0. If the user is detected in the image, signal u0 is set to 1; otherwise it is 0. A zero value of u0 forces the voltage converter to shrink the backlight supply voltage to 0 Volts, dimming the display off. If the eye-gaze detector determines that the user looks at the screen, it sets u1=1. When both u0 and u1 are 1, the display operates as usual. If the user’s gaze has been off the screen for more than N consecutive frames, u1 becomes 0. If u0=1 and u1=0, the voltage converter lowers the input voltage (Vb) of the high-voltage inverter by ∆V. This voltage drop lowers the backlight luminance and so shrinks the power consumption of the display. Any on-screen gaze in this low-power mode reactivates the initial backlight luminance and moves the display into normal mode. However, if u0=0 and the backlight luminance has already reached the lowest level, the display is turned off.
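The (u0, u1) signaling above amounts to a small truth table for the voltage converter; a sketch (the action strings are mine, for illustration only):

```python
def converter_action(u0, u1):
    """Map the tracking unit's outputs to the voltage converter's behavior."""
    if u0 == 0:
        return "backlight supply to 0 V (display off)"   # no user detected
    if u1 == 1:
        return "normal backlight voltage Vb"             # user looks at screen
    return "lower Vb by one step dV (dim backlight)"     # user present, gaze off
```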

The user-tracking unit was realized on a single Xilinx FPGA board connected to a VGA camera through a parallel I/O interface; see [Moshnyaga et al. 2009] for details. The unit operates at a 48 MHz frequency and 3.3 V, and provides eye tracking at a 20 fps rate. Due to capacity limitations of the on-chip SRAM memory, input images were 160x120 pixels in size. The SSR filter was 30x20 pixels in size. The total power consumption of the design was 150 mW, which is 35 times less than a software implementation on a desktop PC [Moshnyaga et al. 2009].

4 Experimental Evaluation

4.1 Eye-Detection Accuracy

To evaluate the accuracy of the gaze detector, we ran four different tests, each conducted by a different user. The users were free to look at the camera/display, read materials on the table, type text, wear eyeglasses, move, gesticulate or even leave the PC whenever they wanted. Fig. 5 illustrates the detection results on 4 images. The + marks depict the positions where the system assumes the eyes to be. As we see, even though the lighting conditions of the faces vary, the results are correct. Ordinary pairs of glasses (see Fig. 5, top row) have no adverse effect on performance for frontal faces. In some face orientations, however, the frame of a pair of glasses can hide part of the eyeball, causing the system to lose the eye. Sometimes it also takes an eyebrow or hair as an eye and tracks it in the following frames.

Table 1 summarizes the results. The second column gives the total number of frames considered in the test; the columns marked ‘True’ and ‘False’ give the numbers of true and false detections, respectively, for positive and negative cases. False positives correspond to cases in which one of the eyes is tracked on the eyebrow or on the hair near the eye. False negatives reflect cases in which the user gazed off the screen (both eyes are tracked on the eyebrows). The Accuracy column shows the ratio of true decisions to the total number of decisions made. As the tests showed, the eye-tracking accuracy of the proposed system is quite high (88% on average).
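The accuracy figure is thus the share of true decisions among all decisions made; a one-line computation, using the counts reported for test 1 in Table 1:

```python
def accuracy(true_pos, false_pos, true_neg, false_neg):
    # Percentage of true decisions among all decisions made.
    correct = true_pos + true_neg
    total = true_pos + false_pos + true_neg + false_neg
    return 100.0 * correct / total

# Test 1 of Table 1: 127 true and 0 false positives,
# 6 true and 18 false negatives over 151 frames.
print(round(accuracy(127, 0, 6, 18)))  # → 88, as reported
```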

4.2 Energy Reduction Efficiency

Next, we estimated the energy efficiency of the proposed camera-based power management system by measuring the total power consumption drawn from the wall by the system itself and by the 17” IO-DATA TFT LCD display controlled by the system. Fig. 6 profiles the results measured per frame on a 100 sec (2000 frame) test. In the test, the user was present in front of the display (frames 1-299, 819-1491, 1823-2001); moved away a little from the display but remained in the camera's view (frames 1300-1491); and stepped away from the PC, disappearing from the camera (frames 300-818, 1492-1822).


Figure 3: An illustration of the eye detection heuristics (left) and the search area reduction (right)

Figure 5: Examples of correct eye-detection

Table 1: Results of evaluation on test sequences

Test   Frames   Positives        Negatives        Accuracy (%)
                True    False    True    False
1      151      127     0        6       18       88
2      240      149     1        65      25       89
3      100      74      0        16      10       90
4      180      142     4        18      24       84
Avg    167      123     1        26      19       88


Figure 4: System overview



The system was set to step down from the current power level if an eye-gaze off the screen was continuously detected for more than 15 frames (i.e. almost 1 sec). The ACPI line shows the power consumption level of ACPI.

We see that our technology is very effective. It changes the display power according to the user's behavior, dimming the display when the user's gaze is off the screen and powering the display up when the user looks at it. Changing the brightness from one power level to another in our system takes only 20 ms, which is unobservable to the user. Fig. 7 shows the brightness of the screenshots and the corresponding power consumption level (see the numbers displayed in the lower-right corner of the screenshots; the second row from the bottom shows the power).

The total power overhead of the system is 960 mW. Even though the system takes a little more power than ACPI (see the horizontal line in Fig. 6) in active mode, it saves 36% of the total energy consumed by the display on this short test. In environments where users frequently divert their attention from the screen or leave computers unattended (e.g. schools, universities, offices), the energy savings could be significant.

5 Conclusion

In this paper we presented a novel eye-tracking application, namely display power management, and outlined an implementation technology which makes the application viable. Experiments showed that camera-based display power management is more efficient than the currently used ACPI method due to its ability to adjust the display power adaptively to the viewer's behavior. The application-specific algorithm optimizations and the hardware implementation of eye tracking allowed us to reduce the power overhead below 1 W while satisfying the real-time and high-accuracy requirements of the application. This power could be reduced even further with a custom design.

In the current work we restricted ourselves to the simple case of monitoring a single user. However, when talking about monitoring in general, some critical issues arise. For instance, how should the technology behave when more than one person is looking at the screen? The user might not look at the screen while others do. Concerning this point, we believe that a feasible solution is to keep the display active while someone is looking at the screen. We are currently investigating this issue as well as the influence of camera positioning, user gender/race, etc.

References

ACPI 2004. Advanced Configuration and Power Interface Specification, Rev. 3.0, Sept. 2004. http://www.acpi.info/spec.htm

DAI, X., AND RAYCHANDRAN, K. 2003. Computer screen power management through detection of user presence. US Patent 6650322.

DOUXCHAMPS, D., AND CAMPBELL, N. 2008. Robust real time face tracking for the analysis of human behavior. In Machine Learning for Multimodal Interaction, LNCS 4892, 1-10.

FUJITSU-SIEMENS 2007. Energy savings with personal computers. Fujitsu-Siemens Corp. http://www.fujitsu-simens.nl/aboutus/sor/energy_saving/prof_desk_prod.html

HEWLETT-PACKARD 2006. Global Citizenship Report. www.hp.com/hpinfo/globalcitizenship/gcreport/pdf/hp2006gcreport_lowres.pdf

KAWATO, S., AND OHYA, J. 2000. Two-step approach for real-time eye tracking with a new filtering technique. Proc. IEEE SMC, 1366-1371.

KAWATO, S., TETSUTANI, N., AND OSAKA, K. 2005. Scale-adaptive face detection and tracking in real time with SSR filters and support vector machine. IEICE Trans. Information & Systems E88-D (12), 2857-2863.

MAHESRI, A., AND VARDHAN, V. 2005. Power consumption breakdown on a modern laptop. Proc. Power-Aware Computing Systems, LNCS 3471, 165-180.

MORIMOTO, C., AND MIMICA, M.R.M. 2004. Eye gaze tracking techniques for interactive applications. Computer Vision and Image Understanding 98 (1), 4-24.

MOSHNYAGA, V.G., HASIMOTO, K., SUETSUGU, T., AND HIGASHI, S. 2009. A hardware implementation of the user-centric display energy management. Proc. PATMOS 2009, LNCS 5953, 56-65.

PARK, W.I. 1999. Power saving in a portable computer. EU Patent EP0949557.

YAMAMOTO, S., AND MOSHNYAGA, V.G. 2009. Algorithm optimizations for low-complexity eye tracking. Proc. IEEE SMC, 18-22.

[Figure 6 plot: power (W), scale 0-40, versus frame number, 1-2001; intervals are labeled “User is present / gaze on screen”, “Gaze off screen”, and “No user”, and a horizontal line marks the ACPI power level.]

Figure 6: Display power consumption per frame

Figure 7: Screenshots of the display and the corresponding power consumption: when the user looks at the screen, the screen is bright and the power is 35 W (top picture); otherwise the screen is dimmed and the power is 15.6 W (bottom picture)
