

Wearable Haptics and Hand Tracking via an RGB-D Camera for Immersive Tactile Experiences∗

Leonardo Meli 1,2, Stefano Scheggi 1, Claudio Pacchierotti 1,2, Domenico Prattichizzo 1,2 †

1 Dept. of Information Engineering and Mathematics, University of Siena, Siena, Italy. 2 Dept. of Advanced Robotics, Istituto Italiano di Tecnologia, Genova, Italy.

∗ The research has received funding from the European Union Seventh Framework Programme FP7/2007-2013 with project "WEARHAP" (grant 601165).
† e-mail: {meli, scheggi, pacchierotti, prattichizzo}@dii.unisi.it

Figure 1: Tactile interaction in a virtual environment. User's fingertips are tracked using an RGB-D camera, while three wearable haptic interfaces (a)-(b) provide the human operator with the compelling illusion of touching a virtual object (c).

1 Introduction

In 1997 Sony revolutionized the gaming industry by introducing simple but effective vibrotactile feedback in its DualShock controller for the PlayStation. By 2013, more than 400M units had been sold. Nowadays, the Wii Remote motion controller provides a similar feature, but wirelessly, and can be considered the most popular portable haptic interface, with over 100M sales. However, its force feedback is still limited to vibrations, which rules out simulating any rich contact interaction with virtual and remote environments. Towards a more realistic feeling of interacting with virtual objects, researchers have focused on glove-type haptic displays such as the Rutgers Master II and the CyberGrasp, which provide force sensations to all the fingers of the hand simultaneously. However, although they provide compelling force feedback, these displays are complex and very expensive: the CyberGrasp, for instance, costs more than 60,000 US dollars.

It thus becomes crucial to find a trade-off between a realistic feeling of touch and the cost and portability of the system. In this regard, we found tactile technologies very promising. Tactile devices are haptic interfaces that provide tactile force feedback only (they do not provide any kind of kinesthetic force). This property makes it possible to dramatically simplify their form factor while still providing a compelling and realistic feeling of touching virtual objects [Pacchierotti et al. 2014].

2 Contribution

In this work we present a wearable and portable fingertip tactile device for immersive virtual reality interaction. It consists of two platforms, one fixed to the back of the finger and one in contact with the fingertip, connected by three cables (see Fig. 1a). Three small electric motors, equipped with position encoders, control the length of the cables, tilting the mobile platform and moving it towards or away from the fingertip. As a consequence, a 3-D force can be displayed to the user. The force to be provided is estimated according to a mathematical model of the finger, which assumes a linear relationship between the resultant wrench at the fingertip and the displacement of the device's platform. Each device connects to a wrist bracelet, which provides power and a wireless connection to an external computer. Up to five devices can be connected to a single bracelet, making the solution flexible and modular, as depicted in Fig. 1b.

We validated this tactile interface in a virtual reality scenario (see Fig. 1c). Users were asked to wear three tactile devices on one hand and interact with different virtual objects. The hand pose was tracked with a model-based hand tracker via an RGB-D camera [Oikonomidis et al. 2011], and a virtual hand [Šarić 2011] mimicked the user's hand pose. As soon as the hand came in contact with a virtual object, the tactile devices applied the requested force to the user's fingertips, providing them with the compelling sensation of touching the virtual environment. The proposed tactile system is extremely portable, cost-effective, and completely wireless; there are virtually no workspace restrictions apart from those imposed by the gesture recognition technique. A representative video can be found at http://goo.gl/t25yc8.
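To make the actuation principle concrete, the following minimal sketch (Python, not from the paper) illustrates how a desired 3-D fingertip force could be converted into a platform displacement through a linear fingerpad model and then into the three cable lengths. The stiffness values, anchor geometry, and names (FINGERPAD_STIFFNESS, REST_GAP, etc.) are illustrative assumptions, not the parameters of the actual device, and platform tilt is ignored.

    import numpy as np

    # Assumed linear fingerpad model: f = K * d, so d = inv(K) * f.
    # Stiffness values are hypothetical, not those identified in the paper.
    FINGERPAD_STIFFNESS = np.diag([0.5, 0.5, 1.0])    # N/mm

    # Hypothetical cable attachment points (mm): three on the platform fixed to
    # the back of the finger, three on the mobile platform against the fingerpad.
    FIXED_ANCHORS  = np.array([[ 8.0,  0.0, 0.0],
                               [-4.0,  6.9, 0.0],
                               [-4.0, -6.9, 0.0]])
    MOBILE_ANCHORS = np.array([[ 5.0,  0.0, 0.0],
                               [-2.5,  4.3, 0.0],
                               [-2.5, -4.3, 0.0]])
    REST_GAP = 12.0                                    # assumed nominal platform distance (mm)

    def platform_displacement(desired_force):
        """Platform displacement (mm) that renders the desired 3-D force (N)."""
        return np.linalg.solve(FINGERPAD_STIFFNESS, desired_force)

    def cable_lengths(displacement):
        """Cable lengths (mm) for a purely translating mobile platform.

        The mobile platform starts REST_GAP below the fixed one and is pulled
        toward it (pressing against the fingerpad) by the commanded displacement.
        """
        center = np.array([0.0, 0.0, -REST_GAP]) + displacement
        return np.linalg.norm((MOBILE_ANCHORS + center) - FIXED_ANCHORS, axis=1)

    # Example: command 1 N of normal force plus a small shear component.
    lengths = cable_lengths(platform_displacement(np.array([0.2, 0.0, 1.0])))
    print("commanded cable lengths (mm):", np.round(lengths, 2))

Shortening all three cables presses the platform harder against the fingerpad; shortening them asymmetrically tilts the platform and produces the shear components of the force.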
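The loop coupling the tracked hand with the tactile devices can be summarized with a simple penalty-based rendering scheme. The sketch below is an illustrative reconstruction, not the authors' implementation: the virtual object is reduced to a sphere, CONTACT_STIFFNESS is a made-up constant, and tracker.fingertip_positions() / device.send_force() are placeholder interfaces standing in for the RGB-D hand tracker and the wireless bracelet.

    import numpy as np

    CONTACT_STIFFNESS = 0.3    # N/mm, hypothetical stiffness of the virtual object

    def contact_force(fingertip_pos, sphere_center, sphere_radius):
        """Penalty-based contact force for one fingertip against a virtual sphere.

        Returns zero when the fingertip is outside the object; otherwise a force
        along the outward surface normal, proportional to the penetration depth.
        """
        offset = fingertip_pos - sphere_center
        dist = np.linalg.norm(offset)
        penetration = sphere_radius - dist
        if penetration <= 0.0 or dist == 0.0:
            return np.zeros(3)
        return CONTACT_STIFFNESS * penetration * (offset / dist)

    def haptic_step(tracker, devices, sphere_center, sphere_radius):
        """One rendering step: read the tracked fingertips, send one force per device."""
        for device, tip in zip(devices, tracker.fingertip_positions()):
            device.send_force(contact_force(np.asarray(tip), sphere_center, sphere_radius))

    # Example: a fingertip 2 mm inside a sphere of radius 30 mm centered at the origin.
    print(contact_force(np.array([0.0, 0.0, 28.0]), np.zeros(3), 30.0))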

Experiment. Ten participants tried our immersive virtual reality system. They were asked to wear the three tactile devices and interact with the virtual environment for 10 minutes: first with no force feedback (tactile devices switched off) and then with force feedback. Participants were then asked to fill in a 6-item questionnaire using bipolar Likert-type seven-point scales. It contained a set of assertions, where a score of 7 meant complete agreement and a score of 1 complete disagreement with the assertion. It was divided into two sections addressing (i) the realism of the interaction and (ii) the portability of the system. Participants considered the system portable and very easy to wear. Moreover, they found the tactile feedback natural and compelling. The tactile information was considered paramount to a realistic experience.

References

OIKONOMIDIS, I., KYRIAZIS, N., AND ARGYROS, A. 2011. Efficient model-based 3D tracking of hand articulations using Kinect. In Proc. 22nd BMVC.

PACCHIEROTTI, C., TIRMIZI, A., AND PRATTICHIZZO, D. 2014. Improving transparency in teleoperation by means of cutaneous tactile force feedback. ACM Trans. Appl. Percept. 11, 1, 4:1–4:16.

ŠARIĆ, M., 2011. LibHand: A library for hand articulation. Version 0.9.