

Development of a Mobile User Interface for Image-based Dietary Assessment

SungYe Kim†, TusaRebecca Schap‡, Marc Bosch†, Ross Maciejewski†, Edward J. Delp†, David S. Ebert†, Carol J. Boushey‡

†Electrical and Computer Engineering, ‡Foods and Nutrition, Purdue University, West Lafayette, IN, USA

{inside, tschap, mboschru, rmacieje, ace, ebertd, boushey}@purdue.edu

ABSTRACT
In this paper, we present a mobile user interface for image-based dietary assessment. The mobile user interface provides a front end to client-server image recognition and portion estimation software. In the client-server configuration, the user interactively records a series of food images using a built-in camera on the mobile device. Images are sent from the mobile device to the server, and the calorie content of the meal is estimated. In this paper, we describe and discuss the design and development of our mobile user interface features. We discuss the design concepts through initial ideas and implementations. For each concept, we discuss qualitative user feedback from participants using the mobile client application. We then discuss future designs, including work on design considerations for the mobile application that allow the user to interactively correct errors in the automatic processing while reducing the user burden associated with classical pen-and-paper dietary records.

Categories and Subject Descriptors
H.4 [Information Systems Applications]: Miscellaneous; H.5.2 [User Interfaces]: Mobile Dietary System

Keywords
Mobile user interfaces, User interface design, Mobile devices, Health monitoring tools, Dietary assessment system.

1. INTRODUCTION
As smart telephone technology continues to advance in sensor capability and computational efficiency, mobile telephones are being used for purposes other than just dialing and receiving calls. Advances in mobile technology have led to enhanced features, such as built-in cameras, games and texting on phones. Furthermore, researchers have employed mobile technologies in a variety of tools ranging in application from personal health and wellness to biological research to emergency response [3, 6, 14, 29]. Our research focuses on employing mobile technology for dietary assessment.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
MUM '10, December 1-3, 2010, Limassol, Cyprus
Copyright 2010 ACM 978-1-4503-0424-5/10/12 ...$10.00.

Given the mounting concerns of childhood and adult obesity [20, 21], advanced tools for keeping personal dietary records are needed as a means for accurately and effectively monitoring energy and nutrient intakes. Such records provide researchers with information for hypothesis generation and assessment regarding health conditions and nutrient imbalances. In light of these issues, research has focused on a means to utilize mobile technology for dietary assessment (e.g., [10, 15, 22, 27]). Previously, we presented a dietary record assessment system [18] focused on creating a server-client architecture in which the client mobile device was used to capture a pair of images (before and after) of meals and snacks. In our system design, the images are sent to the server, where image analysis and volume estimation calculations are performed, and the results are returned to nutrition researchers (or dietitians). In this paper, we discuss the design and development of the mobile interface for our client-server system. The development of the mobile user interface of our dietary assessment system has been a collaboration with researchers in the Department of Foods and Nutrition. We intersperse our discussion with lessons learned based on user feedback, providing guidelines for the development of such applications. Compared to our previous papers, this is the first time our mobile application is described in depth. Contributions of this paper include:

• Field specific design considerations and strategies: We describe design considerations for our mobile user interface for dietary assessment and development strategies based on user evaluation.

• Improved user interface: From the point of view of an integrated system, we present details of the data that is created in the mobile application and describe the communication protocol between the mobile client and the server.

• Flexible eating patterns: We extend a previous design for recording food images to support flexible eating patterns, such as portion refills, so that each eating occasion can include not only a pair but a series of food images.

• Minimal data exposure and user interaction: By filtering eating occasions, we display only those images containing information that needs user confirmation. This process minimizes user interaction, thus reducing user burden.


This paper is organized as follows: Section 2 discusses the related research. Section 3 presents our image-based mobile dietary assessment system and the design considerations of a mobile user interface. Section 4 presents the framework and functionality of our mobile dietary assessment tools. Finally, Section 5 presents conclusions and discussion.

2. RELATED WORK
Many systems incorporating mobile devices (e.g., PDAs, cell phones) for data collection and management have been proposed in various fields [7, 12, 15, 28, 29]. Denning et al. [9] proposed a wellness management system using mobile phones and a mobile sensing platform unit. In this system, users manually input food consumption information with a mobile telephone. The mobile sensing unit computes the user's energy expenditure and transfers this data to the mobile telephone. Arsand et al. [4, 5] studied how to design easily usable mobile self-monitoring tools for diabetes.

Other dietary assessment systems also make use of mobile phones. Wang et al. [27] describe the evaluation of personal dietary intake using mobile devices with a camera. Users took food images and sent them to dietitians via a mobile telephone. Reddy et al. [24] introduced the DietSense architecture, which consists of mobile phones, a data repository, data analysis tools, and data collection protocol management tools. Mobile phones with a GPS receiver were used to capture food images as well as relevant context, such as audio, time and location. The food images were collected into a data repository to be analyzed by dietitians. Donald et al. [10] discussed the effect of mobile phones on dietary assessment by performing a user study with adolescent patients with diabetes. In their work, users were asked to complete food diaries and to capture images of all food and drink using mobile phones. The food images were then downloaded to a computer, and a dietitian estimated portion sizes from the diaries and images.

In first developing our mobile user interface, we looked at research focused on general guidelines and requirements for mobile interface design. Nilsson [19] presented user interface patterns for mobile applications and discussed a series of challenges regarding screen space and interaction mechanisms. Gong and Tarasewich [13] discussed design requirements for mobile user interfaces, compared to those for the desktop environment, based on Shneiderman's Golden Rules of Interface Design [25], and proposed guidelines for mobile devices such as consistency with a desktop counterpart and design for top-down interaction. Chong et al. [8] focused on designing a touch-screen-based user interface. In this work, they proposed several key guidelines for a good user interface, including simplicity, aesthetics, productivity, and customizability.

We also examined research [4, 5, 16] focusing specifically on design concepts for self-health monitoring tools utilizing the capabilities of mobile devices. Designing field-specific interfaces requires careful consideration and even field background knowledge [17]. Based on our initial system prototype and user feedback, we adopted design concepts proposed by Arsand et al. [4]. Our system design was done in conjunction with interactive feedback from nutrition researchers. Our system was developed to provide researchers with an automated analysis for the estimation of daily food intake. The dietary assessment tool uses a built-in camera on a mobile telephone to capture food images, and the user interactions of taking meal images and verifying the assessment demand careful design consideration in terms of usability and flexibility in order to reflect various situations in daily life. Specifically, we incorporate an automatic data transfer scheme (hiding the data transfer paradigm from the users) and an easy data entry scheme through the use of the mobile telephone camera.

Figure 1: Overview of an image-based dietary assessment system in a server-client architecture.

3. SYSTEM DESIGN
The focus of our dietary assessment system is to better capture meal patterns of users using automated food and portion analysis. Our dietary assessment system utilizes a server-client configuration.

3.1 System Overview
In the record method, food images are captured using a built-in camera on a mobile device, and the corresponding management data for the food images is generated. The captured food images are transferred to a server for analysis, where the various types of food in the images are identified and the portion sizes (volumes) of food are estimated to compute the amount of food consumed by the user. This analysis is broken into two subparts: food identification [31] and volume estimation [30]. These results (food type and volume) are returned to the user for review. In the review method, the user confirms or adjusts the results produced on the server. Figure 1 shows the overview of our image-based dietary assessment system. Additionally, an alternate method is provided for cases where the user cannot capture food images.

The record and review methods and the alternate method are deployed on client mobile phones. The analysis components run on the server to minimize the burden of computation on the mobile telephone. This paper focuses on the client side of the system, and we present the design and development of the mobile user interfaces for an image-based dietary assessment system. For details of the analysis components, see Zhu et al. [31] and Woo et al. [30].

3.2 Design Considerations for Mobile Dietary Assessment Tools


Figure 2: (Left) the framework of our mobile dietary assessment tool and (right) an eating occasion structure consisting of a series of food images, eating occasion data and food tags. The framework is broken into two tracks: one uses eating occasions with food images, and the other deals with eating occasions without food images. Each food tag includes food labels and volumes. Note the dashed structure in the eating occasion diagram. This was part of our initial concept framework and was later removed based on user feedback. Please see Section 4.1 for a detailed discussion of the design concept and lessons learned.

Based on design discussions with our nutrition research partners and analysis of various eating patterns in real life, we derived several design considerations for our mobile dietary assessment tools.

1) Easy and fast collection of daily food images: Capturing food images using a mobile telephone with a built-in camera provides users with an easier, faster and potentially more accurate way to record meals than remembering or noting what they ate [1]. Thus, we chose to utilize the built-in cameras on mobile phones as a data collection tool.

2) Minimum user interaction: Since the users of a dietary assessment system are as varied as mobile phone users in general, we need to consider a wide range of users, from adolescents to senior citizens. Hence, the user interface should be easy to use, and the interaction needed to record or review eating occasions should be minimal.

3) Flexible eating patterns: The system should be flexible enough that various eating patterns in real life, such as multiple eating occasions per day and the addition of foods during an eating occasion, can be collected and managed in the system.

4) Protection of personal data: Users' food images and additional private data should be kept private. Hence, food images on the mobile telephone should be hidden from other people.

5) Automatic data processing: A server-client architecture reduces the burden of analysis and computation on the mobile telephone and increases overall system performance. At the same time, the data transfer between a mobile telephone and a server should be hidden from the users. Moreover, users do not need to know the details of server processing.

Figure 3: Data communication between a server and mobile applications (record and review) based on Figure 1. In this figure, step 5 results from user interaction; the other steps are processed automatically. On the right side, an example database on the server is presented.

6) Exceptional situations: Additional methods should provide users with tools to manually save or modify eating occasions. For instance, in cases where users cannot capture food images due to situational or environmental conditions, they should be able to create their eating occasion data without food images. Moreover, if the results from the analysis components contain errors, users should be able to correct them and update the server.


4. MOBILE USER INTERFACE FOR IMAGE-BASED DIETARY ASSESSMENT SYSTEM

Based on the design considerations in Section 3.2, we describe the framework of our mobile dietary assessment tool. In this section, we describe our system tools and implementations. Furthermore, we discuss lessons learned and present qualitative user feedback on each of the deployed components. User studies have included 117 adolescents (10∼18 years) and 57 adults (21∼65 years).

Figure 2 shows the framework for our mobile interface and the basic structure of an eating occasion. As shown in Figure 2 (left), our mobile tool includes three main methods: the record, review and alternate methods. The record and review methods share eating occasion data with food images, whereas the alternate method supports users in creating eating occasions even when they cannot capture food images. Each eating occasion may contain several food images, eating occasion data created by the record method, and food tags including food labels and volumes, as shown in Figure 2 (right).

Due to programming constraints (i.e., no multitasking) with iOS 3.x and earlier, we separated the record and review methods into two different applications. Eating occasion data is created by the record method in one application and transferred to the server. The review method in the other application receives the eating occasion data from the server after the analysis on the server is complete. User feedback indicated that such a separation was sub-optimal. Fortunately, iOS 4.x allows an application to remain in the background by providing multitasking, so that data communication between a mobile phone and a server will be available without such separation. Figure 3 shows a diagram of the data communication between the server and the two separate record and review applications, along with an example database on the server. In the following sections, we present the detailed functionality, development, lessons learned and user feedback for each method of our mobile dietary assessment.
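The exchange between the two client applications and the server can be sketched as a handful of message builders, one per numbered step in Figure 3. This is a minimal illustrative sketch; the field names and payload shapes are assumptions, not the actual wire protocol.

```python
# Hypothetical sketch of the record/review message flow (Figure 3).
# All field names and payload shapes are illustrative assumptions.

def record_upload(device_id, user_id, image_files, rec_data):
    """Step 1: the record application sends food images and the
    eating occasion data (*.rec) to the server."""
    return {"type": "upload", "device": device_id, "user": user_id,
            "images": list(image_files), "rec": rec_data}

def review_request(user_id):
    """Steps 4-1/4-2: the review application requests eating occasion
    data and the analysis results (*.tag, *.vol) from the server."""
    return {"type": "fetch_results", "user": user_id}

def confirm_tags(occasion_id):
    """Step 5-1: report that all food tags in an image are correct."""
    return {"type": "confirm", "occasion": occasion_id}

def adjust_tags(occasion_id, changes):
    """Step 5-2: send locally adjusted food labels/volumes back."""
    return {"type": "adjust", "occasion": occasion_id, "changes": changes}
```

Modeling each step as an explicit message keeps the transfer hidden from the user (design consideration 5): the applications can queue and send these messages automatically.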

4.1 Recording the Eating Occasion
Using the record method, the user captures food images with the built-in camera on the mobile device. While recording food images, the eating occasion data for the food images (*.rec) (see Figure 2 (right)) is generated for each eating occasion, containing the device ID, user ID, date/time and file names of all food images. The date and time represent when the first image was captured. All food images and eating occasion data are transferred to the server (step 1 in Figure 3). The guidelines denoting the top and bottom of the image, superimposed on the camera view, were key to helping users fit foods into the screen. More importantly, the top and bottom notation ensured correct orientation of the mobile phone. Prior to implementing this feature, images were captured in a variety of orientations, confounding image segmentation and volume estimation.
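The eating occasion data described above can be modeled as a small record built from the capture sequence. This is an illustrative sketch; the field names (`device_id`, `image_files`, etc.) are assumptions based on the description of the *.rec contents, not the actual file format.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative model of the eating occasion data (*.rec): device ID,
# user ID, the time of the first capture, and all image file names.
@dataclass
class EatingOccasionRecord:
    device_id: str
    user_id: str
    timestamp: str                 # when the first image was captured
    image_files: List[str] = field(default_factory=list)

def make_record(device_id, user_id, captures):
    """captures: list of (timestamp, filename) pairs. Images are kept
    in capture-time order and the occasion takes the first timestamp."""
    captures = sorted(captures)            # preserve time progression
    first_time = captures[0][0]
    files = [name for _, name in captures]
    return EatingOccasionRecord(device_id, user_id, first_time, files)
```

Sorting by capture time reflects the requirement that images be saved in the order taken, so the server can reconstruct the progression of the eating occasion.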

Handling various eating patterns: To record and manage multiple helpings (portion refills), we allowed users to capture a series of images for a given eating occasion. Each image in the sequence was a before or after image, except for the first image (which can only be a before image) and the last image (which can only be an after image). All these images were then saved in the order of their capture time to reflect the time progression of the eating occasion. Figure 4 shows the food image recording interface, which was composed of three buttons. Capturing the first image creates a new eating occasion, and capturing the last image sends the current eating occasion to the server for analysis.

To implement this concept, we included three buttons: First, In-Between and Last. Adolescent users found the In-Between button to be confusing. Many users interpreted it to mean they should take an image mid-way through their meals. Therefore, the button was disabled, and a series of First/Last images was used to accommodate portion refills and multiple courses. This sequencing was well received.
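The final First/Last sequencing can be sketched as a small state object: a First capture adds a before image (opening the occasion or a new helping), a Last capture adds an after image, and the sequence must end with an after image before upload. The class and method names are illustrative, not taken from the application code.

```python
# Minimal sketch of the First/Last capture sequencing adopted after
# the In-Between button was removed. Names are illustrative.
class EatingOccasion:
    def __init__(self):
        self.images = []            # (filename, kind) in capture order

    def capture_first(self, filename):
        # a First image opens the occasion or a new helping
        # (portion refill, additional course)
        self.images.append((filename, "before"))

    def capture_last(self, filename):
        # a Last image closes the current helping; the occasion
        # must have started with a First (before) image
        assert self.images and self.images[0][1] == "before"
        self.images.append((filename, "after"))

    def finish(self):
        # the sequence must end with an after image before upload
        assert self.images[-1][1] == "after"
        return [name for name, _ in self.images]
```

A refill is then just another First/Last pair appended to the same occasion, keeping the whole sequence in capture-time order.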

Recommendations for training and recording: The user interface development and available technology can provide many solutions to aid in the automated analysis. However, some issues must be addressed through user training. Suggestions to the users for improved training and recording include:

1) Full image acquisition: Foods should be captured large enough and around the center area of the images so that they can easily be recognized during analysis.

2) Highlight and shadow conditions: Highlights and shadows hinder the automatic analysis by making the recognition of food areas difficult.

3) Spatial consistency: When several foods appear in the images, keeping them in the same position helps the analysis components automatically recognize foods across different images of the same eating occasion.

4) Camera setup: Significant perspective distortion can occur in the image depending on the distance and the angle of view. We experimentally found angles between 30° and 45° to be satisfactory.

4.2 Reviewing the Eating Occasion
After recording eating occasions, the users review the analyzed results from the server to either confirm or change food types (food names) and volumes (the amount of intake for each food). As shown in Figure 1, food labels are automatically determined during food identification and saved into food label data (*.tag), and the food volume is algorithmically estimated during volume estimation and saved into food volume data (*.vol) to be returned to the users. The purpose of the review method is to provide user interfaces on mobile phones for reviewing the data transferred from the server. When the user starts the review method, the application requests from the server the eating occasion data for the food images generated in the record method, along with the analysis results (steps 4-1 and 4-2 in Figure 3).

Filtering eating occasions: First, a list of eating occasions is displayed so that the users can select one eating occasion to review, as shown in Figure 5. We delay loading the food images and related context information (i.e., food tag data) into the memory of the mobile device until the user selects one of the eating occasions to review. This lazy loading [11] strategy is applied to all the other methods to reduce memory consumption on the mobile device. Conceptually, users view the images they recorded so that they can review them and make adjustments or corrections to the foods' type or volume as necessary.
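The lazy loading strategy can be sketched as a list item that holds only lightweight metadata and defers loading images and food tags until the occasion is actually selected. The loader callback and field names here are illustrative assumptions.

```python
# Sketch of the lazy loading strategy [11]: the list view keeps only
# light metadata; images and food tags load on first selection.
class OccasionListItem:
    def __init__(self, occasion_id, loader):
        self.occasion_id = occasion_id
        self._loader = loader       # e.g. reads images/tags from storage
        self._detail = None         # nothing heavy is loaded while listed

    @property
    def detail(self):
        if self._detail is None:    # load only on first access
            self._detail = self._loader(self.occasion_id)
        return self._detail
```

Because `detail` is cached after the first access, scrolling the list costs nothing, and selecting the same occasion twice loads its data only once.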


Figure 4: User interface for the record method. The dot indicators were proposed to remind users of which images have been taken. In this example, four red dots indicate that the first image and three in-between images have been taken. The rightmost grey dot shows that the last image has yet to be taken.

If the displayed list included all eating occasions recorded in the record method, it could become extremely long after continuous usage. Although performance issues in handling long lists could be averted with efficient coding of the application, the concern is that the users would have to scroll through the long list of eating occasions to look for a specific one. Hence, we filter the eating occasions displayed in the review method using three strategies:

1) When all the food tags in an image have been confirmed by the user, the corresponding image is hidden from the eating occasion.

2) When all the images in the eating occasion have been confirmed by the user, the corresponding eating occasion is hidden from the user. (The eating occasion is no longer displayed on the list.)

3) Eating occasions whose images are yet to be processed by the server are not displayed on the list.
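The three strategies above amount to a filter over the occasion list. The sketch below assumes a simple dictionary shape for occasions and images; the actual data structures in the application may differ.

```python
# Sketch of the three filtering rules. The occasion shape
# ({"processed": bool, "images": [{"confirmed": bool}]}) is assumed.
def visible_occasions(occasions):
    shown = []
    for occ in occasions:
        if not occ["processed"]:              # rule 3: results not back yet
            continue
        pending = [im for im in occ["images"]
                   if not im["confirmed"]]    # rule 1: hide confirmed images
        if not pending:                       # rule 2: fully confirmed
            continue
        shown.append({**occ, "images": pending})
    return shown
```

Applying the filter on every refresh keeps the review list short: only occasions that still need the user's attention survive, which is exactly the "minimal data exposure" goal.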

Displaying food tags: Once the user selects an eating occasion, food images with food tags (food labels) are displayed in order. Figure 6 (left) shows an image with food labels. Each label is transparent, to reduce occlusion of food areas, and is matched to the food marked by the pin icon of the same color. When there are several food labels, the positions of the labels are rearranged to minimize clutter.

User confirmation and adjustment: In the case that the analysis results from the server contain errors, we engage the user in the confirmation and adjustment of the food tags (step 5 in Figure 3). The confirmation reports to the server that all food tags in the image are correct and updates the server when food tags are changed (step 5-1 in Figure 3), and the adjustment allows users to locally make changes to food labels and volumes (step 5-2 in Figure 3).

Figure 5: A list of eating occasions in the review method with a thumbnail, date, time, the number of food images with food tags and the number of confirmed images for each eating occasion.

To correct wrong food labels in the food tags, we employ a stemming algorithm [23] for keyword search, simplified for the Food and Nutrient Database for Dietary Studies (FNDDS) 3.0 [2]. However, utilizing general keyword search on food data often produces undesired results. For example, a keyword search for 'apple' returns a list of 158 food names from the FNDDS 3.0, and the raw apple is buried at least 50 foods deep. Hence, another potential way to search for foods is to use a set of predefined keywords. Since defining search keywords for foods requires domain-specific knowledge and user evaluation, our nutrition colleagues have helped to define the keywords. In future versions of our user interface, a new search method will be incorporated using these predefined keywords. Further, feedback from adults and adolescents also indicated that a search mechanism needs to accommodate misspellings and synonyms for food names. Integration of these concepts will improve the efficiency of the food search tool.
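The proposed combination of stemming and predefined keywords might look like the following. This is a sketch under stated assumptions: the stemmer is a deliberately crude stand-in for the algorithm in [23], and the food names and keyword table are made-up examples, not actual FNDDS 3.0 entries.

```python
# Sketch of keyword search over food names, combining a crude stemmer
# with a predefined-keyword table so plain foods rank first.
# Food names and the keyword table are made-up examples.

def stem(word):
    # stand-in for a real stemming algorithm: strip a plural 's'
    return word[:-1] if word.endswith("s") and len(word) > 3 else word

# hypothetical table curated with nutrition researchers
PREFERRED = {"apple": "Apple, raw"}

def search(query, food_names):
    q = stem(query.lower())
    hits = [f for f in food_names if q in f.lower()]
    # predefined-keyword matches rank first, the rest alphabetically
    hits.sort(key=lambda f: (f != PREFERRED.get(q), f))
    return hits
```

With such a ranking, a query like "apples" surfaces the curated match at the top of the list instead of burying it among dozens of compound dishes.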

4.3 Alternate Method
The main concept of our dietary assessment system is to utilize food images recorded by the users during eating occasions. However, it may, at times, be impossible for users to take food images. For instance, the user may not have had his or her mobile telephone or may have forgotten to take food images. To support such situations, we also provide an alternate method that is based on user interaction and a manual food search.

This alternate method is developed as an independent track in the framework of the client side, as shown in Figure 2 (left), and provides users with tools for creating eating occasions containing enough information for nutrient analysis, including date and time, food name, measure description and the amount of intake, as shown in Figure 7. In this method, food names are searched using the same method employed for user confirmation in the review method.


Figure 6: Displaying food tags. (Left) food labels on an image and (right) a 3D food volume shape. The 3D shape of the food volume and the corresponding user interfaces for translating (green arrows) and scaling (green minus and plus) are displayed when the user selects a specific food tag.

The portion units for the amount consumed depend on the food that the user selects and come from the FNDDS 3.0. Users have expressed in feedback that the portion units need to be polished (e.g., the number '1' in front of each portion unit is confusing (Figure 7 (right))). This is an issue linked to modification of the FNDDS rather than to user interface design. Our nutrition colleagues have also been working to update the FNDDS. Thus, this issue will be addressed in future versions of the alternate method.

5. CONCLUSION AND DISCUSSION
We have described the design and development of a mobile user interface for an interactive, image-based dietary assessment system using mobile devices. As a self-monitoring system for dietary assessment, our system uses food images captured with a built-in camera on mobile devices and analyzes the images on the server in order to automatically label foods and estimate the amount of food consumed. For flexibility, our system also provides an alternate method based on user interaction and food search instead of food images. Hence, our mobile client system is composed of the record, review and alternate methods.

The current user interface design considerations are used to handle capturing meal images and reporting mistakes during the automatic food identification. We have also begun work on implementing user controls for adjusting errors in the volume estimation. As an initial idea, we are currently proposing a system that provides 3D primitive template shapes and interactions for translating, rotating, and scaling the overlaid volumes. Figure 6 (right) shows an image with a cylindrical food volume. A transparent 3D shape is superimposed on the food area to provide users with tools to either confirm or adjust the food volumes estimated on the server. The user interfaces for translating and scaling the 3D shape are displayed only when the user selects a specific food tag, in order to minimize cluttering the screen.

Figure 7: The alternate method. (Left) a list of eating occasions without food images, (middle) food search results, and (right) the settings for the food portion unit and the amount of food consumed.

While designing and developing our mobile user interface, we collaborated with researchers in the Department of Foods and Nutrition to collect feedback on the interaction design through several user studies [26]. Based on this feedback, we tailored the user interface of our mobile client system to improve usability as well as system flexibility. To demonstrate, we developed and deployed our mobile client tools on iPhone 3GS devices running iOS 4.0.
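A minimal sketch of the cylindrical template-volume idea, under the assumption that a uniform scale gesture rescales all three dimensions of the superimposed shape: scaling the cylinder by a factor s changes the estimated volume by s**3. Function names and units are illustrative, not the system's actual API.

```python
import math

def cylinder_volume(radius_cm: float, height_cm: float) -> float:
    """Volume of the cylindrical template shape, V = pi * r^2 * h."""
    return math.pi * radius_cm ** 2 * height_cm

def rescale(volume_cm3: float, scale: float) -> float:
    """Apply a uniform scale gesture to a template volume.

    A uniform factor s multiplies radius and height alike,
    so the enclosed volume grows by s**3.
    """
    return volume_cm3 * scale ** 3
```

For instance, shrinking the overlay to 80% of its size (`scale=0.8`) cuts the volume estimate roughly in half, which is why even small on-screen adjustments matter for the downstream calorie estimate.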

6. ACKNOWLEDGMENTS

We would like to thank our collaborators and colleagues: Bethany Six of the Department of Foods and Nutrition and Deb Kerr of the School of Public Health, Curtin Institute of Technology, for feedback about the mobile user interfaces through user studies; Insoo Woo, JungHoon Chae, and Fengqing Zhu of the School of Electrical and Computer Engineering for their work in developing food volume estimation and food image analysis on the server; and Ahmad M. Razip of the School of Electrical and Computer Engineering for his help in revising the user interface. Support for this work comes from the National Cancer Institute (1U01CA130784-01) and the National Institute of Diabetes, Digestive, and Kidney Disorders (1R01-DK073711-01A1). More information about our project can be found at www.tadaproject.org.

7. REFERENCES

[1] Diet History Questionnaire: Web-based DHQ, National Cancer Institute. http://riskfactor.cancer.gov/DHQ/webquest/ (accessed 10 September 2010).

[2] The Food and Nutrient Database for Dietary Studies. http://www.ars.usda.gov/services/docs.htm?docid=7673 (accessed 10 September 2010).

[3] A. Ahtinen, M. Isomursu, M. Mukhtar, J. Mantyjarvi, J. Hakkila, and J. Blom. Designing social features for mobile and ubiquitous wellness applications. In MUM '09: Proceedings of the 8th International Conference on Mobile and Ubiquitous Multimedia, pages 1–10, New York, NY, USA, 2009. ACM.

[4] E. Arsand, J. T. Tufano, J. D. Ralston, and P. Hjortdahl. Designing mobile dietary management support technologies for people with diabetes. Journal of Telemedicine and Telecare, 14:329–332, July 2008.

[5] E. Arsand, R. Varmedal, and G. Hartvigsen. Usability of a mobile self-help tool for people with diabetes: the easy health diary. In Proceedings of the IEEE International Conference on Automation Science and Engineering (CASE 2007), pages 863–868, Sept. 2007.

[6] O. Baecker, T. Ippisch, F. Michahelles, S. Roth, and E. Fleisch. Mobile claims assistance. In MUM '09: Proceedings of the 8th International Conference on Mobile and Ubiquitous Multimedia, pages 1–9, New York, NY, USA, 2009. ACM.

[7] D. C. Baumgart. Personal digital assistants in health care: experienced clinicians in the palm of your hand. Lancet, 366:1210–1222, Oct. 2005.

[8] P. Chong, P. So, P. Shum, X. Li, and D. Goyal. Design and implementation of user interface for mobile devices. IEEE Transactions on Consumer Electronics, 50(4):1156–1161, November 2004.

[9] T. Denning, A. Andrew, R. Chaudhri, C. Hartung, J. Lester, G. Borriello, and G. Duncan. Balance: towards a usable pervasive wellness application with accurate activity inference. In HotMobile '09: Proceedings of the 10th Workshop on Mobile Computing Systems and Applications, pages 1–6, 2009.

[10] H. Donald, V. Franklin, and S. Greene. The use of mobile phones in dietary assessment in young people with type 1 diabetes. Journal of Human Nutrition and Dietetics, 22:256–257, June 2009.

[11] M. Fowler. Patterns of Enterprise Application Architecture. Addison-Wesley, 2002.

[12] D. Gammon, E. Arsand, O. A. Walseth, N. Andersson, M. Jenssen, and T. Taylor. Parent-child interaction using a mobile and wireless system for blood glucose monitoring. Journal of Medical Internet Research, 7:e57, Nov. 2005.

[13] J. Gong and P. Tarasewich. Guidelines for handheld mobile device interface design. In Proceedings of the 2004 DSI Annual Meeting, pages 3751–3756, 2004.

[14] S. Kim, R. Maciejewski, K. Ostmo, E. J. Delp, T. F. Collins, and D. S. Ebert. Mobile analytics for emergency response and training. Information Visualization, 7(1):77–88, 2008.

[15] M. J. Kretsch, C. A. Blanton, D. Baer, R. Staples, W. F. Horn, and N. L. Keim. Measuring energy expenditure with simple low cost tools. Journal of the American Dietetic Association, 104:13, Aug. 2004.

[16] T.-H. Lee, H.-J. Kwon, D.-J. Kim, and K.-S. Hong. Design and implementation of mobile self-care system using voice and facial images. In ICOST '09: Proceedings of the 7th International Conference on Smart Homes and Health Telematics, pages 249–252, Berlin, Heidelberg, 2009. Springer-Verlag.

[17] J. Lumsden. Handbook of Research on User Interface Design and Evaluation for Mobile Technology (2-Volume Set). Information Science Reference, 1st edition, January 2008.

[18] A. Mariappan, M. B. Ruiz, F. Zhu, C. J. Boushey, D. A. Kerr, D. S. Ebert, and E. J. Delp. Personal dietary assessment using mobile devices. In Proceedings of the IS&T/SPIE Conference on Computational Imaging VII, volume 7246, pages 72460Z–72460Z-12, Jan. 2009.

[19] E. G. Nilsson. Design patterns for user interface for mobile applications. Advances in Engineering Software, 40:1318–1328, December 2009.

[20] C. L. Ogden, M. D. Carroll, L. R. Curtin, M. A. McDowell, C. J. Tabak, and K. M. Flegal. Prevalence of overweight and obesity in the United States, 1999-2004. JAMA: The Journal of the American Medical Association, 295:1549–1555, Apr. 2006.

[21] C. L. Ogden, M. D. Carroll, and K. M. Flegal. High body mass index for age among US children and adolescents, 2003-2006. JAMA: The Journal of the American Medical Association, 299:2401–2405, May 2008.

[22] K. Patrick, W. G. Griswold, F. Raab, and S. S. Intille. Health and the mobile phone. American Journal of Preventive Medicine, 35:177–181, 2008.

[23] M. F. Porter. An algorithm for suffix stripping. In Readings in Information Retrieval, pages 313–316. Morgan Kaufmann, 1997.

[24] S. Reddy, A. Parker, J. Hyman, J. Burke, D. Estrin, and M. Hansen. Image browsing, processing, and clustering for participatory sensing: lessons from a DietSense prototype. In EmNets '07: Proceedings of the 4th Workshop on Embedded Networked Sensors, pages 13–17, 2007.

[25] B. Shneiderman. Designing the User Interface: Strategies for Effective Human-Computer Interaction. Addison-Wesley, 1998.

[26] B. L. Six, T. E. Schap, F. M. Zhu, A. Mariappan, M. Bosch, E. J. Delp, D. S. Ebert, D. A. Kerr, and C. J. Boushey. Evidence-based development of a mobile telephone food record. Journal of the American Dietetic Association, 110(1):74–79, 2010.

[27] D. H. Wang, M. Kogashiwa, and S. Kira. Development of a new instrument for evaluating individuals' dietary intakes. Journal of the American Dietetic Association, pages 1588–1593, 2006.

[28] D. H. Wang, M. Kogashiwa, S. Ohta, and S. Kira. Validity and reliability of a dietary assessment method: the application of a digital camera with a mobile phone card attachment. Journal of Nutritional Science and Vitaminology, 48:498–504, Dec. 2002.

[29] S. M. White, D. Marino, and S. Feiner. Designing a mobile user interface for automated species identification. In CHI '07: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 291–294, New York, NY, USA, 2007. ACM.

[30] I. Woo, K. Ostmo, S. Kim, D. S. Ebert, E. J. Delp, and C. J. Boushey. Automatic portion estimation and visual refinement in mobile dietary assessment. In Proceedings of the IS&T/SPIE Conference on Computational Imaging VIII, volume 7533, pages 75330O–75330O-10, January 2010.

[31] F. Zhu, M. Bosch, I. Woo, S. Kim, C. J. Boushey, D. S. Ebert, and E. J. Delp. The use of mobile devices in aiding dietary assessment and evaluation. IEEE Journal of Selected Topics in Signal Processing, 4(4):756–766, Aug. 2010.