Using proprioceptive sensors for categorizing interactions

[Extended Abstract]

T. Salter‡, F. Michaud and D. Létourneau
Université de Sherbrooke
Sherbrooke, Quebec, Canada
t.salter | f.michaud | d.letourneau @usherbrooke.ca

D.C. Lee and I.P. Werry
University of Hertfordshire
Hatfield, Hertfordshire, England
d.c.lee | i.p.werry @herts.ac.uk

ABSTRACT
Increasingly, researchers are looking outside of normal communication channels (such as video and audio) to provide additional forms of communication or interaction between a human and a robot, or a robot and its environment. Amongst the new channels being investigated are infrared, proprioceptive and temperature sensors to detect touch. Our work aims at developing a system that can detect natural touch or interaction coming from children playing with a robot, and adapt to this interaction. This paper reports trials done using Roball, a spherical mobile robot, demonstrating how sensory data patterns can be identified in human-robot interaction and exploited for achieving behavioral adaptation. The experimental methodology used for these trials is reported; it validated the hypothesis that human interaction can not only be perceived from proprioceptive sensors on-board a robotic platform, but that this perception can also lead to adaptation.

General Terms
Algorithms, Performance, Design, Experimentation, Human Factors.

Keywords
Human-Robot Interaction (HRI), Adaptive Mobile Robots, Sensor Evaluation, Categorizing Interaction.

‡Also affiliated with University of Hertfordshire

1. INTRODUCTION
Touch is an important form of communication or interaction between robots and humans [16], [17] or between robots and their environment [6]. While video and audio are typically used for communication and interaction between humans and robots, other sensors such as contact, infrared, proprioceptive and temperature sensors can provide additional means of communication related to touch [6], [13].

Our main aim is to develop a system where children can interact with a robot naturally and the robot can adapt to this natural interaction. How children interact with and perceive robots is itself becoming a highly studied area [3], [14], [23], [5], [22]. People working with children or in therapy are beginning to recognize that natural touch is an important form of interaction or communication with a robot [3], [21], [16], [17], [2]. In an effort to register touch or communication, some robotic systems utilize buttons that must be pushed by a person [4], [1], [19]. Salter et al. [16], [17], [15] showed that infrared sensors on-board a mobile robot, typically exploited for navigation purposes, can also be used to record interactions or natural touch coming from children playing with a mobile robot. That research demonstrates that it is even possible to detect personality traits (e.g., boisterous or cautious) of a child interacting with a wheeled robot simply from the analysis of infrared sensor data. At MIT [20], a robotic teddy bear named Huggable is being designed with full-body sensate skin and smooth, quiet voice coil actuators that are able to relate to people through touch. Huggable features a series of temperature, electric field and force sensors which it uses to sense the interactions that people have with it [12]. At NASA, Lumelsky [8] is working on developing a "sensitive skin" that could be used to cover a robot. The skin would include more than 1,000 infrared sensors that would detect an object [7]. In related work, Saraf and Maheshwari [9] claim that their device can give a robot tactile sensitivity equivalent to that of human fingers, and one early use might be in minimally invasive surgery. The system they have developed is based on alternating layers of gold and semiconducting cadmium sulfide nanoparticles separated by nonconducting, or dielectric, films. They hope to coat a robot hand with this film. Another move away from the typical use of microswitches for detecting touch is realized in the robot seal named Paro, which was developed for robot-assisted activity in hospitals or homes for the elderly.


Figure 1: Roball, an autonomous rolling robot.

In this system, physical contact with the robot, for example touching, is recognized by a system based on balloons [21].

Our interests lie in studying how a robot's proprioceptive sensors (which can be used for navigation, control or other purposes) can be exploited as a form of communication and as a way to capture natural touch or interaction between a mobile robot and children.

• First, a series of trials were conducted using Roball (see Figure 1) in laboratory conditions without children and then in real-life settings with children. Data analysis established that it is possible to detect play patterns with accelerometers and tilt sensors.

• Second, another series of trials were designed to follow on from and expand upon this work, pursuing the goal of adapting the robot's behavioral response to the stimulation or interaction it is receiving from children playing with it. These trials were conducted to test whether heuristics identified in the previous trials could be utilized to enable adaptation of Roball to human interaction. Again trials were conducted using Roball (see Figure 1), both in the laboratory without children and in a real-life setting with children.

Our findings are presented in this paper.

2. ROBALL
Shown in Figure 1, Roball is 6 inches in diameter and weighs about 4 pounds [10], [11]. It consists of a plastic sphere (a hamster exercise ball) constructed from two halves that are attached to each other. The plastic sphere is used to house the fragile electronics (sensors, actuators, processing elements), thus making it robust and ideal for interaction with children. The fact that Roball is spherical encourages a wide range of play situations. Movement is achieved through a combination of two propulsion motors that are used to propel the shell of Roball, and a counterweight that is used to control direction by moving the center of gravity towards the required direction (see Figure 2).

Figure 2: Front/back and side views.

Figure 3: Roball's accelerometers, centered above the steering motor.

The propulsion motors are attached to the shell wall. Rotation of these motors causes Roball to move forwards or backwards. The steering motor moves the counterweight from one side to the other in order to move the center of gravity away from the center of the sphere.

Roball's first prototype used binary mercury switches to detect and control lateral and longitudinal inclinations of the internal plateau. To provide a wider set of perceptual states, new sensors were installed on the platform. The version of Roball used in our work, running a PIC18F458 microcontroller with 32 kBytes of internal memory and 1.5 kBytes of RAM, operating at 40 MHz (10 MIPS), has the following special features:

• Accelerometers - Roball has three accelerometers, one for each axis (X, Y and Z). Analog ADXL311 miniature accelerometers are used to measure Roball's acceleration along any of the three axes in the range 0 to 2g (see Figure 3).

• Tilt sensors - There are three tilt sensors, one for left tilt, one for right tilt and one for forward/backward tilt. Sharp GP1S036HEZ miniature photointerrupters are used to detect the tilt direction by sensing the movement of a small ball. The tilt sensors are positioned on Roball's printed circuit board. Two tilt sensors are placed symmetrically on the left/right axis


Figure 4: Roball's tilt sensors: (a) placement; (b) possible values with L = Left, R = Right, B = Back and F = Front; (c) left tilt being registered.

(the axis corresponding to the line between Roball's two propulsion motors). This configuration allows the detection of either left or right tilt, with both sensors giving the same value, and also allows detection of rotation, with the two sensors giving opposite left/right tilt values due to centrifugal acceleration. For example, if the ball is tilted to the left, both the right and the left tilt sensors give a reading of (L). If the ball is tilted to the right, both sensors give a reading of (R). Finally, if the ball is spinning, the right and left tilt sensors give opposite readings: the right sensor gives a reading of (R) and the left sensor gives a reading of (L) (see Figure 4; a minimal sketch of this logic follows).
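As an illustration only (helper and value names are assumptions, not the on-board firmware described in this paper), the tilt interpretation above can be sketched as:

    # Sketch of the left/right tilt interpretation described above.
    # Sensor values are assumed to be the characters 'L' or 'R'.
    def interpret_tilt(left_sensor: str, right_sensor: str) -> str:
        """Classify Roball's lateral state from its two left/right tilt sensors."""
        if left_sensor == right_sensor:
            # Both sensors agree: the ball is tilted to that side.
            return "TILT_LEFT" if left_sensor == "L" else "TILT_RIGHT"
        # Opposite readings indicate rotation (centrifugal acceleration).
        return "SPINNING"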

3. EXPERIMENTAL SETTINGS
In both sets of trials (first and second), the robot was programmed to execute two simple behaviors: wandering and obstacle avoidance. These behaviors were carried out for the duration of each trial. Roball's function was to act as a mobile, moving toy. Four small wooden walls enclose the experimental arena, creating a pen as shown in Figures 5 and 6. The pen is approximately 2.5 m by 2 m. Every experiment was videotaped for verification of sensor readings. Sensor readings were recorded on-board the robot every 0.10 seconds. After the preprogrammed duration of the trial the robot stops by itself, ending the trial. All of the laboratory experiments took place at the Université de Sherbrooke, without children present. In the real-life settings, the children were typically developing boys aged between 5 and 7 years. All the children were treated the same. None of the children had been exposed to robotic devices other than those found in toy stores. When a trial begins, the children are asked to step inside the pen and to play with Roball.

Figure 5: Examples of the experiments in the laboratory. The robot executed a wandering and a simple obstacle avoidance behaviour.

Figure 6: The children playing with Roball in a home setting. Roball is programmed with adaptive behaviours.

The experimenter attempts to have as little effect on the children as possible: the same set of instructions is given to all the children, the experimenter does not engage in conversation with the children, and the children are not prompted to play with the robot once the trial has commenced.

4. FIRST TRIALS: OBTAINING HEURISTICS

The first set of trials involved a series of laboratory experiments, followed by a series of trials held in real-life settings, namely a play group and a school.

4.1 Laboratory Experiments
These were used to investigate whether measurements from two different types of proprioceptive sensors could record events such as jolts to the robot, the robot receiving general interaction, the robot being carried or the robot being spun. The experiments were broken down into seven environmental conditions:

1. Alone (i) - Roball wandering in the laboratory.

2. Alone (ii) - Roball wandering in the experimental pen.

3. Light Boxes - Roball wandering in the pen with boxes present.

4. Heavy Boxes - Roball wandering in the pen with weighted boxes present.

5. Carrying - Experimenter walking whilst carrying Roball.

6. Interaction - Experimenter simulating interaction, e.g. kicking, pushing, banging Roball.

7. Spinning - Experimenter spinning Roball.


Figure 7: The three axes' readings for each of the environmental conditions.


Data analysis was performed after the experiments had been conducted. Analysis was done using simple calculations on the data collected from the accelerometers and tilt sensors during these seven environmental conditions. The data comes from an average of three experiments of 5 minutes each, repeated for each of the seven environmental conditions. Data was recorded at 10 Hz.
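To make the kind of calculation concrete, the following sketch averages one trial's logged accelerometer samples; the log format, field names and helper names are assumptions for illustration, not the analysis scripts actually used.

    # Sketch: averaging logged 10 Hz accelerometer data for one trial.
    import csv
    from statistics import mean

    def condition_signature(log_path: str) -> dict:
        """Average X, Y and Z readings (and the X - Z difference) over one trial."""
        xs, ys, zs = [], [], []
        with open(log_path, newline="") as f:
            for row in csv.DictReader(f):  # assumed columns: x, y, z
                xs.append(float(row["x"]))
                ys.append(float(row["y"]))
                zs.append(float(row["z"]))
        return {
            "x": mean(xs),
            "y": mean(ys),
            "z": mean(zs),
            "x_minus_z": mean(xs) - mean(zs),  # used below to discriminate conditions
        }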

4.1.1 Accelerometer Results
Figure 7 shows that, during the seven environmental conditions, each of the accelerometers produces a different (average) signature. Although there are similarities between the signatures, each is unique in some way.

Figure 8 illustrates that the difference between the X accelerometer and the Z accelerometer (X - Z) also produced very interesting results that could be used to discriminate the robot's interaction status. For instance, only in condition (5) Carrying do we see a negative X - Z difference; also, condition (1) Alone (i) shows the largest difference between these two axes.

Line graphs produced from the trials clearly show a difference between the way human contact with the robot is registered by the accelerometers and what these sensors register when the robot is alone (see Figures 9 and 10). These graphs show a marked difference in sensor readings and indicate that it is definitely possible to use these readings to correctly classify human interaction, and thus to adapt to this information. When the robot has experienced interaction, all three axes show jagged readings that constantly cross each other. When the robot is alone, we see large gaps between the axes.

4.1.2 Tilt Sensor Results
Data was recorded from the tilt sensors during each of the seven environmental conditions. When the robot is spinning, the two tilt sensors should give different values, as described in Section 2 (see Figure 4).

Figure 8: Averaged differences between the X and Z accelerometer readings.

Figure 9: The erratic X, Y and Z axis sensor data from the accelerometers when Roball is in experimental condition (6) Interaction.

Figure 10: Accelerometer data when Roball is in experimental condition (1) Alone (i).


Figure 11: Tilt sensor readings for the seven environmental conditions.

From the results shown in Figure 11, we can see that condition (7) Spinning, as expected, produces the highest number of differing tilt sensor readings.

4.2 Real Life Settings
Two real-life settings were used to confirm that the data found under laboratory conditions would also be found in the real world: a play group and a school. In total eight boys participated in the trial, each playing with Roball at least twice. Trials were initially conducted for 5 minutes, but this seemed too long to hold the children's attention, so the length of the trials was shortened to 4 minutes. The analysis of accelerometers, tilt sensors and line graphs from the real-life settings gave similar results to those found in the lab. As with the trials conducted in the laboratory, we found that:

• Cautious children that did not play with Roball very much produce the largest difference between the X and Z axes (see Figure 8 for the results from the laboratory).

• We get the highest number of differing tilt sensor readings for active children that spin Roball (see Figure 11 for the results from the laboratory).

• When a child plays with Roball, the interaction can be seen as jagged lines. When the child does not play with Roball, we see gaps between the different axes (see Figure 12).

4.3 First Trials: Conclusions
This first set of trials showed that it is possible to detect different environmental conditions through the analysis of proprioceptive sensors (accelerometers and tilt) [18]. Overall, these analyses indicate that different environmental conditions, which can be associated with forms of interaction, can be detected through the analysis of proprioceptive sensor data. Detecting that the robot is being carried is the easiest, followed by detecting that the robot is being spun. Detecting general interaction with a person is not so easy, but still possible; it is easier to detect when the robot is alone, or not receiving any type of interaction from a person.

Figure 12: Accelerometer readings when a child was interacting with Roball in a real-life setting.

To categorize between these states, the sensor reading space can be zoned into regions. The objective is to achieve this categorization using an on-board feature detection algorithm, in real time, and to adapt the robot's behavior to the interaction experienced.

5. HEURISTICS FOR BEHAVIORAL ADAPTATION

It was found in the first trials that it is possible to detect different environmental conditions. More specifically, as related to interaction with people, accelerometer and tilt readings can be classified into zones which detect four modes of interaction: ALONE, GENERAL INTERACTION, CARRYING and SPINNING. Another condition, named NO CONDITION, is necessary for situations not covered by the other four. By detecting these states, the robot's behavior can be changed or adapted to respond in a particular fashion to interaction with children. The algorithm developed uses the following five rules, which are specific instantiations of the heuristics derived from the analysis presented in Section 4 and in [18].

A Being Alone. If the average difference between the X and Z accelerometer readings is above 0.05, set the current condition to 'ALONE'.

B Receiving General Interaction. If the average difference between the X and Z accelerometer readings is below 0.03 and above zero, set the current condition to 'GENERAL INTERACTION'.

C Being Carried. If the average difference between the X and Z accelerometer readings is negative, set the current condition to 'CARRYING'.

D Being Spun. If the tilt sensors show different readings (see Figure 11), set the condition to 'SPINNING'. Another way to detect spinning is if the average reading for the Z axis is positive, coupled with an average Y axis reading of above 0.05 (see Figure 7).

E No Condition. If the sensor readings do not fall into one of the above categories, set the condition to 'NO CONDITION'.


Different from the first trials, the algorithm uses a temporal window of 4 seconds to calculate an average of the sensor readings and thus derive which condition it believes the robot is currently experiencing. This window is moved forward in time in 0.10 s increments.
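As a minimal sketch of this moving average (names and structure are assumptions, not the authors' on-board code): at 10 Hz a 4-second window holds 40 samples, and advancing by 0.10 s means dropping the oldest sample and adding one new sample.

    # Sketch of the 4-second temporal averaging window described above.
    from collections import deque

    WINDOW_SAMPLES = 40  # 4 seconds at 10 Hz

    class MovingAverage:
        def __init__(self) -> None:
            self.samples = deque(maxlen=WINDOW_SAMPLES)  # oldest sample drops out automatically

        def update(self, value: float) -> float:
            """Add the newest 0.10 s sample and return the current window average."""
            self.samples.append(value)
            return sum(self.samples) / len(self.samples)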

After examining the heuristics, it was decided to develop an algorithm that detects conditions in order of the ease with which it was believed they could be detected, i.e., checking for the conditions that are easiest to classify first. The final flow of the algorithm was determined by various tests conducted in the laboratory, which checked the ability of the algorithm to detect the differing conditions until the optimal flow was achieved.

The algorithm is designed to first attempt to detect the condition SPINNING, by looking at the difference in the tilt sensor readings. If this condition is found to be true, no analysis of the accelerometers is carried out. However, if it is found to be false, the analysis of the accelerometers is then performed. For this analysis, the condition CARRYING is first checked by looking for a negative (X - Z) average. Next, the condition GENERAL INTERACTION is looked for by checking whether the Z axis average is below zero. Then, the condition SPINNING is analysed by checking for a positive Z axis average coupled with an average Y axis reading of above 0.05. The condition ALONE is then identified if the X - Z average is above 0.05. Next, the condition GENERAL INTERACTION is classified if the X - Z average is below 0.03 but above zero. If, after all this, the current average within the four-second window does not fall into any of the categories, NO CONDITION is set as the output. The resulting pseudo-code of the algorithm is:

IF tilt sensors are different
    output SPINNING
ELSE    % Check accelerometers
    IF x-z average < 0
        output CARRYING
    ELSE IF z average < 0
        output GENERAL INTERACTION
    ELSE IF z average > 0 && y average > 0.05
        output SPINNING
    ELSE IF x-z average > 0.05
        output ALONE
    ELSE IF x-z average > 0 && x-z average < 0.03
        output GENERAL INTERACTION
    ELSE
        output NO CONDITION
    END IF
END IF
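The same decision flow can be written as a small Python sketch; the thresholds come from rules A-E above, but the function signature and names are illustrative assumptions rather than the firmware running on Roball's microcontroller:

    # Python sketch of the classification pseudo-code above.
    def classify(tilt_left: str, tilt_right: str,
                 x_avg: float, y_avg: float, z_avg: float) -> str:
        """Classify one 4-second window of averaged sensor readings."""
        if tilt_left != tilt_right:            # opposite tilt readings
            return "SPINNING"
        xz = x_avg - z_avg                     # X - Z average
        if xz < 0:
            return "CARRYING"
        if z_avg < 0:
            return "GENERAL INTERACTION"
        if z_avg > 0 and y_avg > 0.05:
            return "SPINNING"
        if xz > 0.05:
            return "ALONE"
        if 0 < xz < 0.03:
            return "GENERAL INTERACTION"
        return "NO CONDITION"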

6. SECOND TRIALS: BEHAVIORAL ADAPTATION

A second series of trials was conducted that mimicked those carried out in the first set. Again, trials were conducted in the laboratory and in a real-life setting.

6.1 Laboratory Experiments
The experiments were broken down into the four modes of interaction listed below:

• Being Alone - Roball wandered in the pen by itself, no objects or humans present.

• Being Carried - The experimenter walked whilst carrying Roball for the duration of the experiment.

• Receiving General Interaction - The experimenter simulated interaction from a child, for example, pushing, banging and getting in the way of the robot.

• Being Spun - The experimenter purposely spun the robot for the duration of the experiment.

Three separate trials were conducted for each of the four conditions, and each individual trial again lasted 4 minutes. Thus, in total, 12 experiments were carried out, lasting a total of 48 minutes. Each trial produced 2360 interaction classifications (10 per second for 240 seconds, less 40 for the four seconds during which the algorithm initializes its temporal averaging window). The adaptation algorithm was implemented on-board Roball, and the identified states were recorded as interaction occurred. Table 1 presents the observed results of the identified states (A = ALONE, B = GENERAL INTERACTION, C = CARRYING, D = SPINNING, E = NO CONDITION) in relation to the four modes of interaction.

6.1.1 Algorithm Results
The results represent the percentage of time that the state was identified during the trial. The objective is to maximize valid identification and minimize false detection. As can be seen from the leading diagonal of Table 1, the robot can identify the following with reasonable accuracy:

• Being Alone (97%)

• Being Carried (92%)

• Being Spun (77%)

However, identifying Receiving General Interaction (10%) is revealed to be more difficult.

A probable cause for this is that at times the robot is in fact SPINNING, or is effectively ALONE, during the "General Interaction" trials. Such periods would therefore be identified under the corresponding categories: (D) SPINNING 45% and (A) ALONE 19% of the time. Therefore, adding the results for conditions (A) ALONE, (B) GENERAL INTERACTION and (D) SPINNING, a total of 74% of classifications were correctly identified by the algorithm during the GENERAL INTERACTION experiment.
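The percentages in Table 1 can be thought of as simple tallies of the per-window classifier outputs over a trial; the following sketch shows one way such a tally could be computed (names and log format are assumptions, not the analysis code used for the paper):

    # Sketch: turning a trial's sequence of classified states into percentages.
    from collections import Counter

    def tally(outputs: list) -> dict:
        """Return the share (in %) of each identified state over one trial."""
        counts = Counter(outputs)
        total = len(outputs)
        return {state: 100.0 * n / total for state, n in counts.items()}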

It should be noted that the experimenter's simulation of general interaction was fairly vigorous. For example, the experimenter pushed and kicked the robot with quite some force, which did cause the robot to spin. This is not always the case with children, as can be seen from the line graph of a child's interaction in Figure 12. In general, from observations of the first set of trials, children seem to interact with the robot in a more punctuated manner, so that continuous general interaction is rather rare. For example, they may push the robot and then wait, or spin the robot without straight away pushing and kicking it.


Therefore, it is believed that recording this varying interaction may be easier in a real-life setting than in the laboratory.

The misclassification of the condition CARRYING whilst receiving general interaction, as shown in Table 1, requires further discussion. It was discovered that when the robot hits the wall of the pen, it actually records the same readings as being carried. This is because as the robot hits the pen wall, it slightly rolls up the wall, and this causes similar accelerometer readings as when the robot is picked up.

This therefore helps to explain why, when the robot is Receiving General Interaction, the condition CARRYING was registered 19% of the time, and similarly why, when the robot was Being Spun, the condition CARRYING was registered 17% of the time. During both trials the robot did roll up the wall of the pen. Whereas, when the robot was Alone it did not roll up the wall, and thus we see the condition CARRYING recorded only 0.5% of the time. The problem of the robot rolling up the wall was not considered significant, as the pen was only being used specifically for this research work. In other real-life settings the robot would not be so confined and therefore this phenomenon would not happen so often.

6.2 Adding Adaptation and Testing in a Real Life Setting

Three adaptive behaviors were added to the robot: two behaviors involved vocals and one behavior involved motion coupled with vocals.

1. When the robot classifies its sensor readings as SPINNING, it produces the sound 'weeeeeeeeeeeeee'.

2. When the robot classifies its sensor readings as CARRYING, it stops all motion and says 'put me down'.

3. When the robot classifies its sensor readings as ALONE, it says 'play with me'.

Both the sounds and the movement response are repeated until the condition that initiated the behavior changes. Audible responses are easily perceived by an external observer, which facilitates post-experiment analysis compared to a change in the behavioral control of the robot. In this work it was decided to begin simply, with limited adaptation, to allow a clear and precise evaluation of the algorithm's performance in the child-robot trials.
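A minimal sketch of this condition-to-response mapping follows; the helper functions are hypothetical placeholders, since the actual speech and motor interfaces on Roball are not described in this paper:

    # Sketch of the adaptive responses listed above. The response repeats
    # for as long as the classified condition holds.
    def stop_all_motion() -> None:
        pass  # placeholder for halting Roball's propulsion and steering motors

    def speak(phrase: str) -> None:
        print(phrase)  # placeholder for Roball's audio output

    RESPONSES = {
        "SPINNING": "weeeeeeeeeeeeee",
        "CARRYING": "put me down",
        "ALONE": "play with me",
    }

    def react(condition: str) -> None:
        phrase = RESPONSES.get(condition)
        if phrase is None:
            return  # GENERAL INTERACTION and NO CONDITION trigger no added response
        if condition == "CARRYING":
            stop_all_motion()
        speak(phrase)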

The experiments in a real-life setting took place at a house, as shown in Figure 6. All the children were known to the experimenter in a social context. Unlike in the first trials, the children were told two things about the robot: 1) the robot could be spun; 2) they could pick the robot up, but they must not then drop it and should put it gently down on the floor. Each trial with the children lasted for four minutes.

From the preliminary results observed with children interacting with the robot in these trials, we clearly observe that the robot did react to the children, and that when the robot did react there was an increased level of involvement by the children. However, it was noticed that at times the robot did not react correctly. One clear case is the identification of Carrying when the robot hit the wall of the pen, causing it to stop all motion and say 'put me down'. Interestingly, as a side effect, this caused an even higher level of engagement and interaction from the children: e.g., a child looking at the experimenter, saying "it is asking me to put it down", and then proceeding to aid the robot by moving it so that it could progress on its way. Overall, the response of the children was very encouraging. It shows that in principle it is possible to adapt the behavior of a robot using sensor readings. Evident was an increase in the level of the children's engagement and interaction in the second trials compared to the first trials, which seems to be due to the more complex behavior. Increasing the level of engagement and interaction with children was the ultimate goal of the robot. A further and detailed analysis of the child-robot study needs to be done. We also need to conduct further trials with a larger sample of children, and to fully investigate the reasons for any incorrect categorization by the algorithm, which ultimately leads to incorrect reactions of the robot.

7. CONCLUSIONS
This work demonstrates that proprioceptive sensors are capable of detecting and recording human interaction. Through the analysis of accelerometers and tilt sensors (proprioceptive sensors) on the Roball platform, it was found that it is possible to detect the robot being carried, being spun, being alone and also interacting with a person. The analysis required is simple, works in real time on a small embedded microcontroller, and does not require complex processes. Carrying out preliminary trials in the laboratory without children helped in providing baseline and benchmark readings for successive child-robot trials. Further work will include a more detailed analysis of the adaptation algorithm used with Roball interacting with children. Also planned is the use of the adaptation algorithm to change the robot's navigational behavior, and to see the effects on engagement and interaction with children.

8. ACKNOWLEDGMENTS
F. Michaud holds the Canada Research Chair (CRC) in Mobile Robotics and Autonomous Intelligent Systems. This work is funded by the CRC and the Canadian Foundation for Innovation.

9. REFERENCES
[1] A. Billard. Robota: Clever toy and educational tool. Robotics and Autonomous Systems, 42:259–269, 2003.

[2] A. Duquette, H. Mercier, and F. Michaud. Investigating the use of a mobile robotic toy as an imitation agent for children with autism. In International Conference on Epigenetic Robotics, Paris, France, 2006.

[3] T. Ito and N. P. R. Center. How children perceive robots. http://www.incx.nec.co.jp/robot/english/univ/05/univ e05.html, last accessed 06/10/04, 2003.


Table 1: The rows of the table correspond to the four tested environmental conditions. The columns of the table correspond to the five categories derived from the sensor readings: A = ALONE, B = GENERAL INTERACTION, C = CARRYING, D = SPINNING, E = NO CONDITION. The percentage of correctly classified interaction is therefore shown in the leading diagonal. For example, the robot being alone is correctly classified 97% of the time.

                                   A      B      C      D      E
Being Alone                       97%    0.5%   0.5%   1%     1%
Receiving General Interaction     19%    10%    19%    45%    7%
Being Carried                     0.5%   3%     92%    3%     1.5%
Being Spun                        2%     3%     17%    77%    1%

[4] B. Jensen, G. Froidevaux, X. Greppin, A. Lorotte, L. Mayor, M. Meisser, G. Ramel, and R. Siegwart. The interactive autonomous mobile system RoboX. In IROS 2002, IEEE/RSJ International Conference on Intelligent Robots and Systems, pages 1221–1227. IEEE Press, 2002.

[5] T. Kanda, T. Hirano, D. Eaton, and H. Ishiguro. Interactive robots as social partners and peer tutors for children: A field trial. Human-Computer Interaction, 19:61–84, 2004.

[6] O. Kerpa, K. Weiss, and H. Worn. Development of a flexible tactile sensor system for a humanoid robot. In IEEE International Conference on Intelligent Robots and Systems (IROS 2003). IEEE Press, 2003.

[7] V. Lumelsky. NASA - http://www.nasa.gov/vision/earth/everydaylife/vladskin.html, last accessed 14-06-06.

[8] V. Lumelsky, M. Shur, and S. Wagner. Sensitive skin. IEEE Sensors Journal, 1, 2001.

[9] V. Maheshwari and R. Saraf. High-resolution thin-film device to sense texture by touch. Science, pages 1501–1504, June 2006.

[10] F. Michaud and S. Caron. Roball, the rolling robot. Autonomous Robots, 12(2):211–222, 2002.

[11] F. Michaud, J.-F. Laplante, H. Larouche, A. Duquette, S. Caron, D. Letourneau, and P. Masson. Autonomous spherical mobile robot for child-development studies. Systems, Man, and Cybernetics, 35:471–480, 2005.

[12] MIT. http://www.media.mit.edu/research/ResearchPubWeb.pl?ID=53, last accessed 24/08/06.

[13] T. Miyashita, T. Tajika, H. Ishiguro, K. Kogure, and N. Hagita. Haptic communication between humans and robots. In 12th International Symposium of Robotics Research, San Francisco, CA, USA, 2005.

[14] B. Robins, K. Dautenhahn, R. te Boekhorst, and A. Billard. Effects of repeated exposure of a humanoid robot on children with autism. Universal Access and Assistive Technology (CWUAAT), pages 225–236, 2004.

[15] T. Salter and K. Dautenhahn. Guidelines for robot-human environments in therapy. In IEEE RO-MAN 2004, 13th IEEE International Workshop on Robot and Human Interactive Communication, pages 41–46, Kurashiki, Okayama, Japan, 2004. IEEE Press.

[16] T. Salter, K. Dautenhahn, and R. te Boekhorst. Learning about natural human-robot interaction. Robotics and Autonomous Systems, 54(2):127–134, 2004.

[17] T. Salter, K. Dautenhahn, and R. te Boekhorst. Robots moving out of the laboratory - detecting interaction levels and human contact in noisy school environments. In IEEE RO-MAN 2004, 13th IEEE International Workshop on Robot and Human Interactive Communication, pages 563–568, Kurashiki, Okayama, Japan, 2004. IEEE Press.

[18] T. Salter, F. Michaud, K. Dautenhahn, D. Letourneau, and S. Caron. Recognizing interaction from a robot's perspective. In RO-MAN 05, 14th IEEE International Workshop on Robot and Human Interactive Communication, Nashville, TN, USA, 2005.

[19] Sony. http://www.aibo-europe.com, last accessed 06/10/04.

[20] W. D. Stiehl, J. Lieberman, C. Breazeal, L. Basel, L. Lalla, and M. Wolf. The design of the Huggable: A therapeutic robotic companion for relational, affective touch. In AAAI Fall Symposium on Caring Machines: AI in Eldercare, Washington, D.C., 2005.

[21] K. Wada, T. Shibata, T. Saito, and K. Tanie. Robot assisted activity for elderly people and nurses at a day service center. In IEEE International Conference on Robotics and Automation, pages 1416–1421, Washington, DC, 2002.

[22] T. Watanabe, R. Danbara, and M. Okubo. InterActor: Speech-driven embodied interactive actor. In IEEE RO-MAN 2002, 11th International Workshop on Robot and Human Interactive Communication, pages 430–435, Berlin, Germany, 2002. IEEE Press.

[23] S. Woods, K. Dautenhahn, and J. Schulz. Child and adults' perspectives on robot appearance. In AISB '05 Symposium on Robot Companions: Hard Problems and Open Challenges in Robot-Human Interaction, Hertfordshire, England, UK, 2005.