Virtual World simulations to support Robot-Mediated Interaction
Dr. Michael Vallance, Future University Hakodate, Japan
http://www.mvallance.net


DESCRIPTION

Invited keynote presentation given at the Virtual Worlds Best Practices in Education conference, July 2013. Website: http://www.vwbpe.org/ai1ec_event/keynote-speaker-michael-vallance-sl-dafydd-beresford?instance_id=413

TRANSCRIPT

Page 1: Virtual World simulations to support Robot-Mediated Interaction

Virtual World simulations to support Robot-Mediated Interaction.

Dr. Michael Vallance, Future University Hakodate, Japan

http://www.mvallance.net

Page 2

Research aim (long term): to design an evidence-based framework of learning when undertaking tasks of measurable complexity in a 3D virtual world.

How? (i) procedural processes, (ii) learning reflections, (iii) collate data of students collaborating in-world when programming a robot.

# A successful task consists of a robot and program solution to solve specified circuit challenges.

In this presentation, the focus is upon the development of measuring the complexity of tasks involving robot-mediated interactions (RMI).

Page 3

March 11, 2011: Fukushima, Japan, nuclear plant disaster.

Earthquake and tsunami damaged cooling systems to reactors.

Four reactors exploded and radioactivity was released to the atmosphere.

Currently: evacuees cannot return home and depression is becoming prevalent among the strained residents [1]; the Japanese government has changed its criteria for dangerous levels of radioactivity, leaving residents confused [2]; workers are struggling to maintain the safety of the plant [3]; deformities have been discovered in local wildlife [4].

Why?  Our motivation for context

Page 4

Lack of robots in Japan to assist with the recovery operations!

Within less than a week, iRobot (USA) donated two PackBot 510 robots and Warrior 710 robots, and iRobot engineers trained Japanese operators.

It took three weeks for TEPCO to authorize their use [5].

Page 5

1. People need to be better informed and equipped to make sense of information.

Give students learning opportunities: reflecting, organizing, negotiating and creating.

A challenging project like programming robots also provides opportunities for learning content in the Science, Technology, Engineering and Maths (STEM) subjects.

2. International collaboration is essential for communication now and in the future.

A virtual world as a future 3D space. A safe medium for communication and experiential learning.

The tasks in this research aim to support (1) and (2).

As educators, what can we learn from this disaster?

Page 6

The students’ aim is to communicate solutions to problems which involve the programming of a LEGO robot to follow specific circuits.

This is undertaken by:
1. designing circuits, with robot maneuvers and sensors
2. experiencing collaboration, with students in Japan and the UK within a 3D space.

Experiences lead to personal strategies for teamwork, planning, organizing, applying, analyzing, creating and reflecting.

# Measured as Essential Skills for the Wales Baccalaureate Qualification, UK. Evidence is required by the Education Authority for the post-16 qualification.

About the research ...

Page 7

Literature review of task complexity involving robots

Page 8

CTC = Σ (d + m + s + o)

for example, CTC = Σ (4 + 3 + 2 + 2) = 11

There is no consensus in the discipline of Robotics or Human-Robot Interaction for accurately measuring task complexity [6].

Given the specific purposes of the robot in our research, task complexity was calculated according to the number of sections that make up a given maze [7] [8].

Circuit Task Complexity (CTC) = number of directions + number of maneuvers + number of sensors + number of obstacles.

Circuit Task Complexity
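The CTC calculation above can be sketched in a few lines of code. This is a minimal illustration, not software from the research; the function name and parameter names are ours.

```python
def circuit_task_complexity(directions, maneuvers, sensors, obstacles):
    """Circuit Task Complexity: the sum of circuit elements (CTC = d + m + s + o)."""
    return directions + maneuvers + sensors + obstacles

# Worked example from the slide: d = 4, m = 3, s = 2, o = 2
print(circuit_task_complexity(4, 3, 2, 2))  # 11
```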

Page 9

We found that the logic of assigning task complexity to circuits was inadequate.

For instance, complexity values were assigned to distinct maneuvers such as forward – turn – back.

Over the course of our previous research, as circuits became more challenging, the NXT programming became more complex, especially when adding sensors to maneuver around and over obstacles. Simply counting the number of obstacles in the circuit task complexity was flawed because the programming required to maneuver over a bridge using touch sensors, for instance, was far more complex than maneuvering around a box using touch sensors.

CTC = Σ (d + m + s + o)

Circuit Task Complexity

Page 10

In the NXT Mindstorms software, the Move block controls the LEGO robot direction and turns.

The Move block contains 6 variables: NXT ‘brick’ port link, direction, steering, power, duration, and next action.

In other words, the students have to make 6 specific decisions about the values which make up the programmable block. Therefore, we assign v1 a value of 6.

This was repeated for sensor, switch and loop.

Robot Task Complexity

RTC = Σ Mv1 + Σ Sv2 + Σ SW + Σ Lv3

Page 11

RTC = Σ Mv1 + Σ Sv2 + Σ SW + Σ Lv3

where:
M = number of moves (direction and turn)
S = number of sensors
SW = number of switches
L = number of loops

v = number of decisions required by the user for each programmable block
v1 = 6, v2 = 5, v3 = 2

for example:
RTC = Σ Mv1 + Σ Sv2 + Σ SW + Σ Lv3
RTC = (8 x 6) + (3 x 5) + 0 + 3
RTC = 66

Robot Task Complexity

We acknowledge that, at present, our modified Robot Task Complexity metric applies only to the Mindstorms NXT software and LEGO robot, but it does provide a useful indicator in our attempts to analyze the experiential learning during the collaborative tasks. The CTC problem can now be evaluated against the RTC solution.
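A minimal sketch of the RTC formula as defined above. The weights v1 = 6, v2 = 5, v3 = 2 come from the slides; the function and its example program counts are our own illustration, not data from the study.

```python
V1, V2, V3 = 6, 5, 2  # user decisions per Move, Sensor and Loop block (from the slide)

def robot_task_complexity(moves, sensors, switches, loops):
    """Robot Task Complexity: RTC = sum(M*v1) + sum(S*v2) + sum(SW) + sum(L*v3)."""
    return moves * V1 + sensors * V2 + switches + loops * V3

# Hypothetical NXT program: 4 Move blocks, 2 Sensor blocks, 1 Switch, 1 Loop
print(robot_task_complexity(4, 2, 1, 1))  # 24 + 10 + 1 + 2 = 37
```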

Page 12

Students in one country:
1. are provided with a task specification
2. work on a solution to the task
3. construct their circuit in the virtual world and in their real-world lab
4. develop an NXT program to maneuver the physical LEGO robot appropriately.

The problem and the proposed solution are then communicated in real time to students in the other country via the 3D virtual world.

Task implementation

Page 13

Task specification examples

Columns: Task; Robot actions; CTC (target CTC only; the objective is to iteratively increase CTC); Collaboration; STEM (anticipated); Essential Skills, Wales Baccalaureate (anticipated); RTC (post-task calculation based upon students’ solution).

T1
Robot actions: Movement: follow the line. Sensors: light and touch.
CTC: CTC = Σ (d + m + s + o) = 1 + 2 + 2 + 1 = 7
Collaboration: Japan teach UK
STEM: S: Recognition of light sensor values. What happens when the trigger point is increased/decreased? T: Learn how to organise NXT program blocks logically. E: Construct a robot. Connect software to hardware. M: Recognise spatial movements and the problem of friction. Change the surface to see if the robot works the same. Calculate the coefficient of friction.
Essential Skills: Identify; Plan/manage; Explore/Analyse (organize); Evaluate (checking); Reflect

T2
Robot actions: Movement: follow the line. Sensors: colour and action.
CTC: CTC = 1 + 2 + 2 + 2 = 8
Collaboration: UK teach Japan
STEM: S: Recognition of light sensor values. What happens when the trigger point is increased/decreased? How does the NXT sensor recognise colour R, G or B? Try different colour variations and observe subsequent robot actions. T: Learn how to organise NXT program blocks logically. E: Construct a robot. Connect software to hardware. M: —
Essential Skills: Identify; Plan/manage; Explore/Analyse (organize); Evaluate (checking); Reflect

T3
Robot actions: Movement: square. Sensors: touch and sound.
CTC: CTC = 4 + 3 + 1 + 1 = 9
Collaboration: Japan teach UK
STEM: S: — T: Learn how to organise NXT program blocks logically. E: Construct a robot. Connect software to hardware. M: Calculate distance, speed and force (touch).
Essential Skills: Identify; Plan/manage; Explore/Analyse (organize); Evaluate (checking); Reflect

Page 14

Resources:
• LEGO Mindstorms NXT software version 2.1
• LabView 2010 with NXT module
• LEGO robot 8527 kit
• LEGO blocks and similar workspaces/labs in the Japan university + 2 UK schools
• All have the same Apple technologies (MacBook Pro + OS X 10.7)

Page 15

Image labels: reactor off switch, radioactive bins, control station.

Page 16
Page 17
Page 18
Page 19

Resources:
• Aurora-Sim hosted by Firesabre (www.firesabre.com)

Page 20
Page 21
Page 22
Page 23
Page 24
Page 25
Page 26

Virtual Fukushima in JIBE, hosted by Reaction Grid (reactiongrid.net)

Page 27

Virtual Fukushima in JIBE, hosted by Reaction Grid (reactiongrid.net)

Page 28

telerobotics “for the rest of us”

inspired by

Page 29

Task flow chart for simulation

Page 30

Task Task description
T1 Assemble LEGO robots. JPN + UK students introductions
T2 NXT program + circuit. JPN teaching UK
T3 NXT program + circuit (90 degree turns + measured length). UK teaching JPN
T4 Circuit + NXT program. Move. Touch sensor. Turn 90 degrees. JPN teaching JPN.
T5 Circuit + NXT program. Around obstacles. JPN teaching JPN.
T6 Circuit + NXT program. Around obstacles. JPN teaching JPN.
T7 NXT program + touch sensors + circuit. Locate and press switch off. JPN teaching JPN.
T8 Over an obstacle. NXT program + sensors + bridge building (cardboard). JPN teaching JPN.
T9 Over an obstacle. NXT program + sensors + bridge building (wood). JPN teaching JPN.
T10 Robot arm + scoop. UK teaching JPN
T11 Robot arm + NXT program. JPN preparation
T12 Robot arm + scoop + NXT program. Streaming video. JPN teaching UK.
T13 Programming LabView for remote control.
T14 Programming LabView for remote control.
T15 Programming LabView for remote control.
T16 UK teaching Japan. Robot construction + NXT program + stop and swing arm to hit ball.
T17 Suika robot. Rotate + follow line + sensor + chop down. Japan preparation 1.
T18 Suika robot. Rotate + follow line + sensor + chop down. Japan preparation 2.
T19 Suika robot. Rotate + follow line + sensor + chop down. Japan preparation 3.
T20 Robot construction + NXT program + obstacles + sensors.
T21 Suika robot. Rotate + follow line + sensor + chop down. Japan teach UK.
T22 Programming LabView for remote control.
T23 Programming LabView for remote control.
T24 Remote control for search & rescue circuit A.
T25 Remote control for search & rescue circuit B.
T26 Remote control for search & rescue circuit C.
T27 Remote control for search & rescue circuit D.
T28 Move to black line, stop and throw ball to hit over obstacle. UK teaching Japan.

Tasks

Page 31

CTC = Σ (d + m + s + o)

RTC = Σ Mv1 + Σ Sv2 + Σ SW + Σ Lv3

Task CTC RTC
T2 0.56 0.22
T3 0.50 0.42
T4 0.81 0.22
T5 0.81 0.57
T6 1.00 0.85
T7 0.69 1.00
T8 0.25 0.39
T9 0.31 0.33
T10 0.19 0.20
T11 0.63 0.76
T12 0.63 0.84
T16 0.56 0.83
T17 0.25 0.22
T18 0.31 0.65
T19 0.31 0.65
T20 0.69 0.48
T21 0.31 0.65
T28 0.25 0.17

Graph of Task ~ Task Complexity, plotting the Circuit Task Complexity and Robot Task Complexity series (axes: Task, Task Complexity).

Page 32

Graph of Task ~ Task Fidelity. Task Fidelity (TF) is the difference between a task’s normalized CTC and RTC values (TF = CTC − RTC).

Task TF
T2 0.34
T3 0.08
T4 0.59
T5 0.24
T6 0.15
T7 -0.31
T8 -0.14
T9 -0.02
T10 -0.01
T11 -0.13
T12 -0.21
T16 -0.27
T17 0.03
T18 -0.34
T19 -0.34
T20 0.21
T21 -0.34
T28 0.08
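The Task Fidelity values above are consistent with taking the difference between each task’s normalized CTC and RTC. A minimal check, assuming TF = CTC − RTC (the dictionary holds a subset of the pairs from the CTC/RTC table):

```python
# Normalized (CTC, RTC) pairs taken from the preceding table (subset)
scores = {"T2": (0.56, 0.22), "T7": (0.69, 1.00), "T28": (0.25, 0.17)}

def task_fidelity(ctc, rtc):
    """Task Fidelity as the gap between circuit-task and robot-task complexity."""
    return round(ctc - rtc, 2)

for task, (ctc, rtc) in scores.items():
    print(task, task_fidelity(ctc, rtc))  # T2 0.34, T7 -0.31, T28 0.08
```

These computed values match the TF table for every task listed.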


Page 33

Immersion (flow): how immersed students become within the process of each task.

To record immersion (or flow), a virtual FlowPad appears in front of the virtual world avatars.

At regular intervals during the task procedures each avatar has to answer two questions, with four options:

Q1. How challenging is the activity?
• Difficult (score 4)
• Demanding (score 3)
• Manageable (score 2)
• Easy (score 1)

Q2. How skilled are you at the activity?
• Hopeless (score 1)
• Reasonable (score 2)
• Competent (score 3)
• Masterful (score 4)

These questions were chosen based upon research in flow by Pearce et al. [9].
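The FlowPad scoring can be sketched as follows. The mapping of the mean response onto a 0–1 range is our assumption (to match the scale of the graphed values on the next slide); the scales themselves are from the questions above, and the example responses are hypothetical.

```python
# FlowPad answer scales from the two questions on the slide
CHALLENGE = {"Difficult": 4, "Demanding": 3, "Manageable": 2, "Easy": 1}
SKILL = {"Hopeless": 1, "Reasonable": 2, "Competent": 3, "Masterful": 4}

def normalized_score(responses, scale):
    """Mean of the responses mapped onto a 0-1 range (assumed normalization)."""
    values = [scale[r] for r in responses]
    return round(sum(values) / (len(values) * max(scale.values())), 2)

# Hypothetical responses gathered at one FlowPad prompt
print(normalized_score(["Demanding", "Difficult", "Manageable"], CHALLENGE))  # 0.75
```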

Page 34

Graph of task immersivity (flow)

Task Challenge Skill
T2 0.5 1
T3 0.75 0.5
T4 0.5 0.75
T5 1 0.67
T6 0.8 0.67
T7 0.67 0.8
T8 0.67 0.5
T9 0.42 0.92
T10 0.42 0.5
T11 0.8 0.5
T12 0.58 0.58
T16 0.8 0.45
T17 0.25 1
T18 0.7 0.7
T19 0.25 1
T20 0.94 0.5
T21 0.75 0.75
T28 0.75 0.58

Page 35

Looking at the Task Fidelity and immersivity data together, we suggest that T10 and T28 would be considered the most successful tasks when students are engaged in robot-mediated interactions.

The TF value for T28 was only +0.08, slightly above the optimal Task Fidelity line; T28 was also slightly below the optimal path of immersivity.

Similarly for T10, with immersivity slightly above the optimal path of immersivity and Task Fidelity at -0.01.

The challenge for instructors is to seek tasks similar to T28 and T10, where immersivity is close to or on the optimal path of immersivity, and task complexity is close to or on the optimal line of Task Fidelity.

The challenge for researchers is to seek ways to transfer these observations to further tasks with different participants in order to develop more reliable optimal learning tasks when engaged in robot-mediated interactions in a virtual space [10].

Page 36

This applied research is developing metrics for learning when conducting virtual world tasks. The motivation to implement this research was the nuclear disaster of 3-11. A virtual Fukushima nuclear plant and an OpenSim training space have been iteratively designed and built. International collaboration by students as non-experts has highlighted the benefits and challenges posed when engaged in constructing robot-mediated interactions (RMI) within the context of distance-based communication in 3D spaces. Students’ immersion (or flow), Circuit Task Complexity, and Robot Task Complexity have been calculated. Optimal learning tasks have been highlighted. A new metric is suggested for measuring tasks involving robots, which we term Task Fidelity [10].

Conclusion

Next question: how can a better taxonomy be designed to identify specific learning when students are engaged in mixed reality (real and 3D virtual world) Robot-Mediated Interactions?

Acknowledgements

Many thanks to UK collaborators and students at the University of South Wales and Cynon Valley schools, my students at Future University, Japan, and the metaverse designers at Firesabre and Reaction Grid.

Page 37

References

[1] T. Morris-Suzuki, D. Boilley, D. McNeill and A. Gundersen. Lessons from Fukushima. Netherlands: Greenpeace International, February 2012.

[2] J. Watts. “Fukushima parents dish the dirt in protest over radiation levels.” The Guardian, May 2, 2011. [Online]. Available: http://www.guardian.co.uk/world/2011/may/02/parents-revolt-radiation-levels [Accessed August 20, 2012].

[3] L. W. Hixson. “Japan’s nuclear safety agency fights to stay relevant.” Japan Today. [Online]. Available: http://www.japantoday.com/category/opinions/view/japans-nuclear-safety-agency-fights-to-stay-relevant [Accessed August 20, 2012].

[4] N. Crumpton. “Severe abnormalities found in Fukushima butterflies.” BBC Science & Environment. [Online]. Available: http://www.bbc.co.uk/news/science-environment-19245818 [Accessed August 20, 2012].

[5] E. Guizzo. “Fukushima Robot Operator Writes Tell-All Blog.” IEEE Spectrum, August 23, 2011. [Online]. Available: http://spectrum.ieee.org/automaton/robotics/industrial-robots/fukushima-robot-operator-diaries [Accessed August 20, 2012].

[6] M. Vallance and S. Martin. “Assessment and Learning in the Virtual World: Tasks, Taxonomies and Teaching for Real.” Journal of Virtual Worlds Research, Vol. 5, No. 2, 2012.

[7] S. B. Barker and J. Ansorge. “Robotics as means to increase achievement scores in an informal learning environment.” Journal of Research in Technology and Education, Vol. 39, No. 3, pp. 229-243, 2007.

[8] D. R. Olsen and M. A. Goodrich. “Metrics for evaluating human-robot interactions.” [Online]. Available: http://icie.cs.byu.edu/Papers/RAD.pdf [Accessed March 14, 2009].

[9] M. Pearce, M. Ainley and S. Howard. “The ebb and flow of online learning.” Computers in Human Behavior, Vol. 21, pp. 745-771, 2005.

[10] M. Vallance, C. Naamani, M. Thomas and J. Thomas. “Applied Information Science Research in a Virtual World Simulation to Support Robot Mediated Interaction Following the Fukushima Nuclear Disaster.” Communications in Information Science and Management Engineering (CISME), Vol. 3, Issue 5, pp. 222-232.

(10) M. Vallance, C. Naamani, M. Thomas and J. Thomas. “Applied Information Science Research in a Virtual World Simulation to Support Robot Mediated Interaction Following the Fukushima Nuclear Disaster.” Communications in Information Science and Management Engineering (CISME). Vol. 3 Issue 5, pp. 222-232.