
Discrimination of Springs with Vision, Proprioception, and Artificial Skin Stretch Cues

Netta Gurari1, Jason Wheeler2,3, Amy Shelton4, and Allison M. Okamura1,2

1 Department of Mechanical Engineering, 4 Department of Psychological and Brain Sciences, Johns Hopkins University, Baltimore, MD, 21218, USA

2 Department of Mechanical Engineering, Stanford University, Stanford, California, 94305, USA

3 Intelligent Systems Controls, Sandia National Laboratories, Albuquerque, NM, 87185, USA

Abstract. During upper-limb prosthesis use, proprioception is not available, so visual cues are used to identify the location of the artificial limb. We investigate the efficacy of a skin stretch device for artificially relaying proprioception during a spring discrimination task, with the goal of enabling the task to be achieved in the absence of vision. In this study, intact users perceive the location of a virtual prosthetic limb using each of four sensory conditions: Vision, Proprioception, Skin Stretch, and Skin Stretch with Vision. For the conditions with skin stretch, a haptic device stretches the forearm skin by an amount proportional to the angular rotation of a virtual prosthetic limb. Sensory condition was not found to significantly influence task performance, exploration methods, or perceived usefulness. We conclude that, in the absence of vision, artificial skin stretch could be used by prosthesis wearers to obtain position/motion information and identify the behavior of a spring.

Keywords: Proprioception, Compliance, Sensory Substitution, Prosthetics

1 Introduction

Proprioception, the perception of the location of one’s limbs in space and how they are moving, is necessary for intuitive and fluid motor control [5, 24]. Trained movements can be performed with minimal or no proprioception, possibly using internal models, low-level controllers (an open-loop control strategy), and/or visual cues. However, when users interact with new environments, particularly in the absence of sight, proprioception is necessary to close the feedback loop and provide real-time haptic information about the limb.

Currently, when an upper-extremity amputee controls an artificial limb, proprioceptive information is primarily relayed through sight (Figure 1). However, for scenarios such as when lights are turned off in a room or at night, vision is not available. Also, upper-limb amputees desire that the visual demand of a prosthesis be reduced [1]. State-of-the-art commercially available upper-extremity prostheses enable only minimal non-visual position feedback, via auditory cues and socket force/torque cues. Several groups are investigating methods for relaying proprioceptive information neurally [8, 14], but decades may pass before such technology is available to safely convey information to humans. Researchers are also investigating methods for relaying proprioceptive information noninvasively using sensory substitution [2, 4, 20–22]. For example, Bark et al. developed a novel skin stretching device for this purpose [2, 29].


Fig. 1. Description of how users of myoelectric upper-limb prostheses and our system may perceive the location of the artificial limb. [Diagram: during myoelectrically controlled upper-limb prosthesis use, location is conveyed by vision, corollary discharge, and socket forces/torques; in the experimental sensory conditions (Vision, Proprioception, Skin Stretch, Skin Stretch with Vision), by corollary discharge and finger/hand forces/torques combined with vision, proprioceptive motion, and/or artificial position/motion.]

Here, we test the feasibility of the skin stretch device [2, 29] to artificially convey proprioception during an object interaction task: pressing on a linear spring. This simple task characterizes the effect of proprioception during activities such as identifying the ripeness of a piece of fruit or the air pressure in a tire. The stiffness of a spring is defined by the relationship between the amount the object compresses and the applied force, or from the human’s perspective, the relationship between the amount a body part travels and the force the human applies. Prior work showed that when position/motion cues were perceived proprioceptively, spring discrimination performance did not significantly differ from when position/motion cues were perceived visually [12]. This study is the first to compare task performance in a scenario relaying artificial proprioception (via skin stretch feedback), natural proprioception, and visual feedback. Our goal is to determine whether the skin stretch device is suitable for relaying positional cues to an upper-extremity amputee using an artificial limb, so that a prosthetic arm can be used in the absence of sight.

2 Background

2.1 Spring Perception

Proprioceptive, force, cutaneous, and/or visual cues combine to define an object’s stiffness [15, 19, 25, 27, 28] by integrating information from sources such as muscle spindle fibers, Golgi tendon organs, joint angle receptors, cutaneous mechanoreceptors, and corollary discharges (a copy of the efferent command that is sent to the sensory perception area of the brain) [6, 7, 9, 10, 17]. To discriminate springs, humans use both force and position cues when the distance the limb travels is variable, whereas humans use only force cues when the distance the limb travels is fixed [18, 23, 26]. Position cues are always sensed to identify the distance traveled. Spring discrimination capabilities may improve if visual cues are relayed in addition to proprioception [28]; however, the visual cues may dominate over kinesthetic cues when a visual-haptic mismatch exists [25].

2.2 Sensory Substitution for Upper-Limb Prosthesis Users

To relay proprioceptive information artificially, it is necessary to consider which sensory modality to stimulate (e.g., audio, visual, haptic) and the location to stimulate (e.g., foot, arm). Additionally, the sensory display should be unobtrusive and provide background information that does not interfere with one’s daily activities [16]. We aim to provide haptic feedback on a part of the body that is not actively used, and provide noninvasive stimulation for reasons of safety and user comfort.

Electrocutaneous and vibrotactile stimulation have been explored for haptically and noninvasively relaying information during upper-limb prosthesis use [2, 4, 13, 20–22]. A major concern for electrocutaneous feedback is that the cues may be uncomfortable, or even painful, if the stimulus is not properly designed. For vibratory feedback, a major concern is that the stimulation may be uncomfortable and/or ignored if the signal is continuous. More recently, skin stretching devices were designed for artificially relaying information [2, 11]. These devices can relay positional cues by giving a direct mapping between skin stretch amount and the artificial limb’s location. Skin stretch feedback was shown to give superior targeting capabilities to vibratory feedback and no feedback [2].

3 Experimental Setup

3.1 Hardware

A custom user interface and skin stretch device, as shown in Figure 2, are used for testing. Haptic and visual rendering updates occur at 1 kHz and 33 Hz, respectively.

Custom User Interface The apparatus was first described in [12] and was slightly modified for this study. It is a one-degree-of-freedom, impedance-controlled haptic device that allows control of a virtual spring by rotating one’s right index finger. The mechanical workspace of the device is approximately −10° to 60° of rotation. The user’s right index finger attaches to the finger plate by a velcro strap, and the user grasps the cylindrical tube, with an outside diameter of 3.2 cm, so that motion is solely about the metacarpophalangeal (MCP) joint. Visual feedback is provided on a Dell 1908WFPt flat panel monitor (Texas, United States) with a resolution of 0.2 mm/pixel. The horizontal placement of the monitor was chosen by visual inspection so that the real finger is aligned with the virtual finger, and the vertical placement was chosen so that the real finger is offset from the virtual finger by approximately 0.09 m.

An ATI Nano-17 6-axis force/torque sensor (North Carolina, United States), with a resolution of 0.0017 N along the testing axis, measures applied finger force. It is affixed to the finger plate at an adjustable distance of $l_f - 0.015$ m, where $l_f$ is the distance between the MCP joint and finger tip. The finger plate is connected to a capstan drive with a ratio of 10:1, which is driven by a non-geared, backdrivable Maxon RE 40 DC motor (Sachseln, Switzerland). The capstan drive minimizes effects of friction and backlash in the system, and increases the maximum torque output to 1.8 N·m. A HEDS 5540 encoder, with a resolution of 0.018°, is attached to the motor.

Fig. 2. Setup for spring discrimination experiment on a custom haptic interface using skin stretch feedback to the ipsilateral arm. [Photograph labels: graphical display of virtual finger, motor with encoder, force sensor, finger plate, skin stretch device, cylindrical tube, headphones playing white noise.]

Skin Stretch Device A characterization of the skin stretch device and results on its performance during a targeting task are given in [29]. Here, the device conveys proprioception by stretching the skin clockwise and counterclockwise. The maximum comfortable range-of-motion in each direction is 40°. A Shinsei USR30-B3 non-backdrivable ultrasonic motor (Himeji, Japan), with a capstan/cable transmission that has a speed ratio of 6:1, applies a torque to the end-effector, and an optical encoder measures the angular rotation. The end-effector is comprised of two cylindrical disks with a diameter of 1.4 cm, spaced 2.6 cm apart. The body of the device was created using Shape Deposition Manufacturing [3] and is fastened to the arm with velcro straps.

3.2 Sensory Conditions

The angular rotation of the virtual prosthetic limb is relayed by Vision, Proprioception, Skin Stretch, or Skin Stretch with Vision. For each condition, 11 virtual springs are rendered, displaying stiffnesses of approximately $k_{des_{1,\ldots,11}}$ = 250 N/m to 330 N/m in increments of 8 N/m. For all conditions aside from Vision, the stiffnesses commanded to the motor, $k_{cmd_{1,\ldots,11}}$, were offset from $k_{des_{1,\ldots,11}}$ to compensate for friction and inertial forces in the system. Additionally, for all conditions the low-pass-filtered force measurement, $\hat{F}_{fp,meas}$, is used in the controllers.
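The stiffness sets and per-condition command offsets described in this section can be summarized in a short sketch (illustrative Python, not the authors' software; the Skin Stretch with Vision offset is assumed to match the Skin Stretch condition, since both drive the skin stretch controller):

```python
# Illustrative sketch (not the authors' software) of the rendered spring
# stiffnesses and the per-condition offsets described in this section.

def desired_stiffnesses():
    """k_des_1..11: 250 N/m to 330 N/m in 8 N/m increments."""
    return [250.0 + 8.0 * i for i in range(11)]

# Offset [N/m] added to k_des to obtain the commanded stiffness k_cmd.
# The Skin Stretch with Vision offset is assumed equal to Skin Stretch.
CONDITION_OFFSET = {
    "Vision": 0.0,
    "Proprioception": 5.0,
    "Skin Stretch": -7.0,
    "Skin Stretch with Vision": -7.0,
}

def commanded_stiffnesses(condition):
    return [k + CONDITION_OFFSET[condition] for k in desired_stiffnesses()]
```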

Proprioception The user’s finger is permitted to rotate (Proprioception On), or it isheld stationary by mechanically locking the finger plate at 0◦ (Proprioception Off). The

Page 5: Discrimination of Springs with Vision, Proprioception, and ...nettagurari.com/Research/Gurari12-EH.pdf · Discrimination of Springs with Vision, Proprioception, and Artificial Skin

Discrimination of Springs with Vision, Proprioception, and Artificial Skin Stretch 5

former motion is elastic and the latter is isometric. The user presses on the finger plateand feels a rotation of the finger about the MCP joint by an amount, θ f , as measured bythe encoder, and a resistive torque from the finger plate. The desired torque output bythe motor to the finger plate, τ f p,out put , is:

τ f p,out put =π(l f −0.015)

180◦· kcmdiθ f (l f −0.015), (1)

where the commanded stiffnesses are kcmd1,...,11 = (kdes1,...,11 +5) N/m. A low-level pro-portional controller with error gain of kp = 5.0 enforces the desired haptic response;thus, the commanded motor torque, τ f p, is:

τ f p = τ f p,out put + kp[τ f p,out put − F̂f p,meas(l f −0.015)]. (2)
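As a concrete reading of Eqs. (1)-(2), the torque loop can be sketched as follows (illustrative Python under our own naming, not the authors' code; $\theta_f$ is in degrees and the force sensor sits on a lever arm of $l_f - 0.015$ m):

```python
import math

def finger_plate_torque(k_cmd, theta_f_deg, f_meas, l_f, kp=5.0):
    """Sketch of Eqs. (1)-(2); names are illustrative.

    k_cmd: commanded spring stiffness [N/m]
    theta_f_deg: measured finger rotation [deg]
    f_meas: low-pass-filtered force measurement [N]
    l_f: MCP-joint-to-fingertip distance [m]
    """
    r = l_f - 0.015  # lever arm from MCP joint to force sensor [m]
    # Eq. (1): desired torque = stiffness * arc-length displacement * lever arm
    tau_out = (math.pi * r / 180.0) * k_cmd * theta_f_deg * r
    # Eq. (2): proportional correction on the torque error
    return tau_out + kp * (tau_out - f_meas * r)
```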

Skin Stretch The skin stretch device rotates on the user’s forearm to indicate the location of the virtual finger (Skin Stretch On), or it is held stationary (Skin Stretch Off). When Skin Stretch is On, the user’s real finger is fixed in place by mechanically locking the finger plate at 0°. The desired rotation amount for the skin stretch device, $\theta_{ss_{des}}$, is defined by the user-selected maximum force, $F_{max}$, and the commanded stiffnesses, $k_{cmd_{1,\ldots,11}} = (k_{des_{1,\ldots,11}} - 7)$ N/m, giving:

$$\theta_{ss_{des}} = \begin{cases} 40^\circ \, \dfrac{\hat{F}_{fp,meas}}{F_{max}} \, \dfrac{k_{cmd_1}}{k_{cmd_i}}, & \text{if } \hat{F}_{fp,meas} \le F_{max} \\[1ex] 40^\circ, & \text{if } \hat{F}_{fp,meas} > F_{max} \end{cases} \qquad (3)$$

The rotation amount of the device, $\theta_{ss_{meas}}$, is obtained from encoder measurements, and a low-level proportional controller with an error gain of $k_{p_{ss}} = 2.0$ s⁻¹ commands the rotational speed, $\omega_{ss_{com}}$, as:

$$\omega_{ss_{com}} = \begin{cases} 0, & \text{if } |\theta_{ss_{des}} - \theta_{ss_{meas}}| < 0.08^\circ \\ k_{p_{ss}} (\theta_{ss_{des}} - \theta_{ss_{meas}}), & \text{otherwise} \end{cases} \qquad (4)$$

where a deadband of 0.08° was implemented to eliminate chatter.

For the Skin Stretch with Vision condition, angular rotation of the virtual finger, $\theta_{vf}$, on the monitor is displayed as:

$$\theta_{vf} = \frac{\theta_{ss_{meas}} F_{max}}{40^\circ \, k_{cmd_1}} \, \frac{180^\circ}{\pi l_f}. \qquad (5)$$
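Eqs. (3)-(4) amount to a saturating force-to-rotation mapping followed by a proportional speed loop with a small deadband; a minimal sketch (illustrative Python, not the authors' code):

```python
def skin_stretch_target(f_meas, f_max, k_cmd_1, k_cmd_i):
    """Eq. (3): desired device rotation [deg], saturating at 40 degrees."""
    if f_meas > f_max:
        return 40.0
    return 40.0 * (f_meas / f_max) * (k_cmd_1 / k_cmd_i)

def skin_stretch_speed(theta_des, theta_meas, kp_ss=2.0, deadband_deg=0.08):
    """Eq. (4): proportional speed command with a deadband to avoid chatter."""
    err = theta_des - theta_meas
    if abs(err) < deadband_deg:
        return 0.0
    return kp_ss * err
```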

Vision A graphical representation of the virtual finger is either displayed on the monitor (Vision On), or it is not displayed (Vision Off). The virtual finger is rendered as a vertical line the length of the user’s real finger, $l_f$. The lower endpoint of the virtual finger is connected to a stationary vertical line that represents the subject’s hand, and both lines are solid black with a thickness of 2.0 mm. The commanded stiffnesses are $k_{cmd_{1,\ldots,11}} = k_{des_{1,\ldots,11}}$ N/m, and the amount the virtual finger rotates about its lower endpoint in the counter-clockwise direction from the 0° vertical position, $\theta_{vf}$, is:

$$\theta_{vf} = \frac{\hat{F}_{fp,meas}}{k_{cmd_i}} \, \frac{180^\circ}{\pi (l_f - 0.015)}. \qquad (6)$$


3.3 Relationship to Myoelectrically Controlled Upper-Limb Prosthesis Use

The goal of this study was to assess the feasibility of using skin stretch for relaying proprioceptive information under idealized control conditions. The primary application we focus on is myoelectrically controlled upper-limb prosthesis use; however, this study can extend to the role of artificially relaying proprioception in other scenarios, including teleoperating a system. Figure 1 displays how a myoelectrically controlled upper-limb prosthesis user and a user of our experimental setup perceive the location of the artificial limb. The efferent commands are similar for all scenarios: for our study, the force sensor measures the user’s intentions, and for upper-limb prosthesis use, EMG sensors measure the user’s intentions. Measurements of the user’s intended motion are cleaner and clearer with our system than with myoelectric sensing, and the ability to control the virtual limb with our system is less subject to the real-world effects of friction and inertia that are witnessed when using a real artificial limb. For this reason, we conduct testing with intact individuals and not upper-extremity amputees.

Differences in sensing occur in the afferent pathways, specifically in how motion is relayed. For all scenarios, corollary discharges convey position/motion information. Additionally, forces convey location: for our study, forces are perceived at the right index finger and the palm of the hand grasping the cylindrical tube, and for artificial limbs, forces are perceived throughout the socket, the location where the prosthesis attaches to the user. Motion is also relayed visually, proprioceptively, and/or through skin stretch. During upper-limb prosthesis use and our Vision condition, the motion of the artificial limb and of the virtual line, respectively, is perceived through sight. During Proprioception, motion is perceived through the movement of the real finger. During Skin Stretch, motion is perceived through the stretching of the user’s skin. Last, during Skin Stretch with Vision, motion is perceived through both skin stretching and sight. For this study, the real finger is fixed in space, and for upper-limb prosthesis use, the phantom limb may be at a fixed location. Thus, for all scenarios aside from Proprioception, stationary positional cues are relayed and sensory integration likely occurs.

4 Methods

4.1 Task

The method of constant stimuli was used to quantify perceptual performance. Participants completed a two-alternative, forced-choice task by pressing on two springs, a comparison ($k_{cmd_{1,\ldots,5,7,\ldots,11}}$) and a standard ($k_{cmd_6}$), and indicating which is more compliant. Pilot testing indicated that discriminating compliance was more intuitive than stiffness (the more compliant the spring, the more the skin stretch device rotates).

To begin a trial, the participant presses a specified key on the keyboard. All keyboard presses were made using the left hand. The participant uses the right index finger to press on the finger plate, rotating about the MCP joint only. To switch between springs, he or she applies less than a threshold force and then presses the space bar. The threshold force ensures that when the springs change, the participant does not perceive a visual and/or haptic change. The speed of rotation of the natural and artificial fingers is limited to less than 100°/s to ensure that linear virtual springs with comparable stiffnesses are rendered across all conditions. Also, the maximum applied force allowed is set to $F_{max}$ to ensure that the whole workspace of the skin stretch device is used. If the participant’s speed or applied force exceeds a limit, a visual indicator is posted on the monitor and is only removed by pressing the space bar. Once finished exploring the springs, the participant presses the ‘1’ or ‘2’ key on the keyboard to identify the spring that is more compliant; this marks the end of the trial. The participant is limited to 30 s of exploration time for each trial, excluding time during which error messages are displayed. This was shown to be a reasonable length of time to comfortably explore the springs and respond [12].
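The trial guards above can be summarized as follows (a sketch with our own names and return values, not the experiment's actual software):

```python
# Illustrative sketch of the trial guards; limits are from the paper,
# function names and messages are ours.

MAX_SPEED_DEG_S = 100.0   # finger/device rotation speed limit [deg/s]
TRIAL_TIME_S = 30.0       # exploration time limit, excluding error displays

def violation(speed_deg_s, force_n, f_max):
    """Return an error message to display, or None if the motion is legal."""
    if speed_deg_s > MAX_SPEED_DEG_S:
        return "too fast"
    if force_n > f_max:
        return "too much force"
    return None

def may_switch_springs(force_n, threshold_n):
    """Springs may only be swapped below a threshold force, so that the
    swap produces no perceptible visual and/or haptic jump."""
    return force_n < threshold_n
```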

4.2 Procedures

The experiment was conducted in two sessions to minimize fatigue and boredom. Sessions were held on consecutive days, and each lasted approximately 90 min. During each session, the participant was presented with two of the four conditions. Presentation order of conditions, presentation order of comparison springs, and placement location of the comparison spring (i.e., Spring 1 versus 2) were randomized for all participants across all conditions.

The first session began by adjusting the haptic interface for the participant’s comfort and use. The participant was outfitted with the skin stretch device, which was to be worn throughout the entire experiment, on the right upper forearm. Red-e Tape™, a strong skin-safe adhesive, maintained contact between the participant’s arm and the device’s end-effector. Next, the participant calibrated the device by pressing the force sensor, which was locked in place, with a moderately high force, $F_{max}$, that could be sustained throughout the session. Then the skin-stretch controller was turned On, and the participant pressed on the force sensor to learn the mapping between the input force and the output device rotation. Four trials were performed for each condition, for a total of 16 trials, with the softest spring presented twice followed by the stiffest spring. Presentation order of the conditions during this learning phase was always Proprioception, Vision, Skin Stretch, and Skin Stretch with Vision. During Skin Stretch with Vision, the participant was instructed to attend to both sensory modalities equally so that the weighting of each would be equal.

At this time, the participant began the practice trials. His/her hand was concealed so that visual feedback from the real finger’s motion was not possible. The participant’s measured finger speed was indicated by a visual display. The discrimination task was performed once for each comparison spring for a total of ten trials, and feedback was given on whether the response was correct so that the range of stiffnesses was learned. The participant then began the testing trials. Headphones playing white noise were worn to mask auditory cues from the apparatus and distractions in the room. The discrimination task was performed 10 times for each of the 10 comparison springs, for a total of 100 trials, with a 90 s rest break after every block of 25 trials. Feedback was not provided as to whether the participant responded correctly so that learning no longer occurred. Last, the participant completed a questionnaire, commenting on the methods used to discriminate between springs (e.g., pressing the spring and holding at a constant force, pressing continuously) and rating the difficulty of the task. After completing the questionnaire for the first condition, the participant then completed the practice, testing, and questionnaire phases for the second condition. This marked the completion of the first session.

Fig. 3. Psychometric curve for an example condition of an example participant. Proportion of “stiffer than” responses for each comparison spring (a) as data was originally collected, and (b) in the flipped- and folded-over condition. [Plot axes: comparison stiffness in N/m; proportion of “stiffer than” responses from 0.0 to 1.0.]

The second session was nearly identical to the first, with the participant completing the practice, testing, and questionnaire phases for the two remaining conditions. Then the participant filled out an additional questionnaire, giving his/her age, gender, experience with haptic virtual environments, health concerns, experience using the right finger in dexterous motions, a ranking of the usefulness of each condition for discriminating, and additional comments.

4.3 Subjects

Approval was obtained from the Johns Hopkins University Homewood Institutional Review Board to collect data from human subjects. Six female and two male volunteers, all healthy with no neurological illnesses or right hand impairments, participated with informed consent. They ranged in age from 19 to 27 years, in right index finger length from 7.8 cm to 9.4 cm, and in self-reported virtual environment experience from ‘None’ to ‘Some’. Participants were monetarily compensated for their time.

4.4 Data Analysis

Participant responses were first converted from compliance to stiffness to be consistent with prior work, in which the study is discussed using stiffness. The proportion of responses in which the user states that the ith comparison spring is stiffer than the standard spring, $PS_i$, was then calculated. The average stiffness of all springs, $k_{1,\ldots,11}$, was estimated using the method of total least squares, in which a best-fit line quantified the relationship between the force and position data acquired when the user pushed the spring [19]. Figure 3(a) plots the psychometric curve of $PS_i$ versus $k_i$ for an example participant. Next, we quantified perceptual performance by calculating the Area Under the Normalized Curve (AUNC), since it gives a cumulative description of discrimination capabilities and allows perceptual performance to be compared across conditions. The more traditional estimation of the Weber fraction to compare perceptual performance was not used since we were not able to get a goodness-of-fit for a number of the conditions and subjects with this analysis method.
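The total-least-squares stiffness estimate can be sketched with the standard closed-form slope for an orthogonal line fit (the paper only cites the method [19]; this formulation and all names are our own):

```python
import math

def tls_stiffness(position_m, force_n):
    """Slope [N/m] of a total-least-squares (orthogonal) line fit to
    (position, force) samples -- a sketch of the stiffness estimate;
    assumes force actually varies with position (sxy != 0)."""
    n = len(position_m)
    mx = sum(position_m) / n
    my = sum(force_n) / n
    sxx = sum((x - mx) ** 2 for x in position_m)
    syy = sum((y - my) ** 2 for y in force_n)
    sxy = sum((x - mx) * (y - my) for x, y in zip(position_m, force_n))
    # Closed-form TLS slope: minimizes orthogonal distance to the line.
    return (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4.0 * sxy ** 2)) / (2.0 * sxy)
```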


Fig. 4. Mean and standard error of AUNC across all sensory conditions for all subjects. [Bar plot: Area Under the Normalized Curve in N/m, from 0.0 (poor perception) to 0.8 (excellent perception), for Vision, Proprioception, Skin Stretch, and Skin Stretch with Vision.]

To obtain AUNC, the psychometric curve, e.g., Figure 3(a), was flipped- and folded-over (FFO), e.g., Figure 3(b), to account for perceptual biases about the standard spring. Thus, $k_{FFO_{1,\ldots,5,7,\ldots,11}}$ and $PS_{FFO_{1,\ldots,5,7,\ldots,11}}$ were obtained respectively by:

$$k_{FFO_i} = \begin{cases} k_6 + (k_6 - k_i), & \text{for } i = 1,\ldots,5 \\ k_{i+1}, & \text{for } i = 6,\ldots,10 \end{cases} \qquad (7)$$

$$PS_{FFO_i} = \begin{cases} \tfrac{1}{2} + \left(\tfrac{1}{2} - PS_i\right), & \text{for } i = 1,\ldots,5 \\ PS_{i+1}, & \text{for } i = 6,\ldots,10 \end{cases} \qquad (8)$$

Then $k_{FFO_{1,\ldots,10}}$ and the corresponding $PS_{FFO_{1,\ldots,10}}$ were sorted from lowest to highest stiffness, and AUNC, $A_{UNC}$, was estimated by:

$$A_{Rectangle_j} = PS_{FFO_j} (k_{FFO_{j+1}} - k_{FFO_j}) \qquad (9)$$

$$A_{Triangle_j} = \tfrac{1}{2} (PS_{FFO_{j+1}} - PS_{FFO_j}) (k_{FFO_{j+1}} - k_{FFO_j}) \qquad (10)$$

$$A_{UNC} = \sum_{j=1}^{9} (A_{Rectangle_j} + A_{Triangle_j}). \qquad (11)$$
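Eqs. (7)-(11) amount to a flip-and-fold about the standard spring followed by trapezoidal integration; in sketch form (illustrative, zero-indexed Python, not the authors' code):

```python
def aunc(k, ps):
    """Area Under the Normalized Curve via Eqs. (7)-(11).

    k:  estimated stiffnesses k_1..k_11; k[5] (i.e. k_6) is the standard
    ps: proportion of "stiffer than" responses PS_1..PS_11
    """
    k6 = k[5]
    # Eqs. (7)-(8): flip and fold the lower half about the standard spring
    k_ffo = [k6 + (k6 - k[i]) for i in range(5)] + [k[i] for i in range(6, 11)]
    ps_ffo = [0.5 + (0.5 - ps[i]) for i in range(5)] + [ps[i] for i in range(6, 11)]
    # Sort by stiffness, then apply the trapezoid rule, Eqs. (9)-(11)
    pairs = sorted(zip(k_ffo, ps_ffo))
    area = 0.0
    for (k0, p0), (k1, p1) in zip(pairs, pairs[1:]):
        area += p0 * (k1 - k0) + 0.5 * (p1 - p0) * (k1 - k0)
    return area
```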

5 Results

5.1 Task Performance

A within-subjects one-way analysis of variance (ANOVA) with a Geisser-Greenhouse epsilon-hat adjustment to correct for violations of sphericity evaluated the effect of condition on task performance and exploration methods. AUNC results for all subjects across all conditions are summarized in Figure 4; sensory condition was not found to significantly affect perceptual performance [F(3,21) = 0.84, ε̂ = 0.5134, p = 0.43]. This finding indicates that the spring discrimination task may be achieved using position/motion feedback from the skin stretch device. Sensory type was also not found to affect the number of spring presses [F(3,21) = 0.42, ε̂ = 0.5800, p = 0.64], the total time spent pressing springs [F(3,21) = 1.54, ε̂ = 0.6964, p = 0.25], or the maximum penetration depth [F(3,21) = 1.18, ε̂ = 0.5191, p = 0.33]. This suggests that the manner in which the discrimination task is performed may not significantly change if the skin stretch device is used to convey positional cues.


Fig. 5. Median, lower, and upper quartiles across all sensory conditions and subjects for (a) usefulness rankings and (b) task difficulty ratings. [Panels compare Vision, Proprioception, Skin Stretch, and Skin Stretch with Vision; difficulty runs from Very Easy to Very Difficult, and usefulness from Least Useful to Most Useful.]

5.2 Questionnaire

The Friedman test, a nonparametric version of the within-subjects one-way ANOVA, identified the effect of condition on perceived task difficulty and usefulness rankings. User ratings for task difficulty across all subjects and conditions are given in Figure 5(a). Sensory condition was found to have a significant effect [χ²(3) = 9.64, p = 0.022]; however, a post-hoc analysis using the Wilcoxon Signed-Rank Test and a Bonferroni correction (significance level of p < 0.008) did not find any pairs to significantly differ. Usefulness rankings across all conditions and subjects are shown in Figure 5(b); sensory condition was not found to have an effect [χ²(3) = 4.95, p = 0.176].
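For reference, the Friedman statistic for a within-subjects design can be computed as below (a stdlib sketch assuming no tied ratings within a subject's row; in practice a library routine such as scipy.stats.friedmanchisquare, which also returns a p-value, would typically be used):

```python
def friedman_statistic(ratings):
    """Friedman chi-square statistic with df = k - 1.

    ratings: k per-condition lists, each holding one value per subject.
    Assumes no ties within a subject's row (average-rank tie handling is
    omitted for brevity). Compare against the chi-square critical value,
    e.g. 7.815 for df = 3 at the 0.05 level.
    """
    k = len(ratings)       # number of conditions
    n = len(ratings[0])    # number of subjects
    rank_sums = [0.0] * k
    for s in range(n):
        # Rank this subject's k condition values from 1 (lowest) to k.
        order = sorted(range(k), key=lambda c: ratings[c][s])
        for rank, c in enumerate(order, start=1):
            rank_sums[c] += rank
    return (12.0 / (n * k * (k + 1))) * sum(r * r for r in rank_sums) \
        - 3.0 * n * (k + 1)
```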

6 Discussion

The feasibility of using a skin stretch device to replace visual cues during upper-limb prosthesis use was investigated. The skin stretch device was not found to alter task performance or exploration method when compared to natural vision or proprioception. A possible reason why statistical significance was not observed is the limited number of subjects tested. However, the means in AUNC differ by such a small amount across the conditions that, practically, in the absence of sight, the force cues felt at the right index finger and hand, along with the positional cues relayed by the skin stretch device, provide sufficient information for discriminating springs. Thus, the results indicate that the device is suitable for relaying position information during the tested object manipulation task.

Subjective responses gave similar results to the quantitative findings. Usefulness rankings for each condition were not found to significantly differ. The lack of significant differences may be because of the limited information conveyed by the data collected; ratings (e.g., on a scale of 1 through 10), rather than rankings, may have been more informative. A statistically significant difference was found for task difficulty, but the post-hoc test used for nonparametric data is very conservative and did not find any significantly different pairs. Based on the median values in Figure 5, it seems that the task is perceived as more difficult when using the skin stretch device than when using sight or proprioception; additional subject testing will illuminate whether this is true.

The addition of vision to skin stretch was not found to enhance perceptual performance or the subjective experience over skin stretch alone. Even though a goal is to reduce the visual demand while using a prosthesis [1], based on our results we do not recommend the skin stretch device for use in addition to sight during similar manipulation tasks. Obtaining positional cues from the skin stretch device may make the task feel more difficult than using sight alone.

Future research could test the effect of longer training time with the device on task performance and subjective ratings. Additional studies could also investigate the feasibility of using several skin stretch devices simultaneously to convey richer, higher-dimensional information about hand configuration.

Acknowledgments. This work was supported by the JHU Applied Physics Laboratory under the DARPA Revolutionizing Prosthetics program, contract N66001-06-C-8005, an NSF Graduate Research Fellowship, Johns Hopkins University, the Brain Science Institute, a travel award from the IEEE Technical Committee on Haptics, and Stanford University. We thank Mark Cutkosky and Karlin Bark for their help.

References

1. Atkins, D.J., Heard, D.C.Y., Donovan, W.H.: Epidemiologic overview of individuals with upper-limb loss and their reported research priorities. Journal of Prosthetics and Orthotics 8(1), 2–11 (1996)

2. Bark, K., Wheeler, J.W., Premakumar, S., Cutkosky, M.R.: Comparison of skin stretch and vibrotactile stimulation for feedback of proprioceptive information. In: Proceedings of the 16th International Symposium on Haptic Interfaces for Virtual Environments and Teleoperator Systems. pp. 71–78 (2008)

3. Binnard, M., Cutkosky, M.R.: Design by composition for layered manufacturing. Journal of Mechanical Design 122(1), 91–101 (2000)

4. Cipriani, C., Zaccone, F., Micera, S., Carrozza, M.C.: On the shared control of an EMG-controlled prosthetic hand: Analysis of user-prosthesis interaction. IEEE Transactions on Robotics 24(1), 170–184 (2008)

5. Cole, J.: Pride and a Daily Marathon. The MIT Press (1995)

6. Collins, D.F., Prochazka, A.: Movement illusions evoked by ensemble cutaneous input from the dorsum of the human hand. Journal of Physiology 496(3), 857–871 (1996)

7. Collins, D.F., Refshauge, K.M., Todd, G., Gandevia, S.C.: Cutaneous receptors contribute to kinesthesia at the index finger, elbow, and knee. Journal of Neurophysiology 94(3), 1699–1706 (2005)

8. Dhillon, G.S., Horch, K.W.: Direct neural sensory feedback and control of a prosthetic arm. IEEE Transactions on Neural Systems and Rehabilitation Engineering 13(4), 468–472 (2005)

9. Edin, B.B.: Quantitative analyses of dynamic strain sensitivity in human skin mechanoreceptors. Journal of Neurophysiology 92(6), 3233–3243 (2004)

10. Gandevia, S.C., Smith, J.L., Crawford, M., Proske, U., Taylor, J.L.: Motor commands contribute to human position sense. Journal of Physiology 571(3), 703–710 (2006)

11. Gleeson, B.T., Horschel, S.K., Provancher, W.R.: Communication of direction through lateral skin stretch at the fingertip. In: Proceedings of the Third Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. pp. 172–179 (2009)

12. Gurari, N., Kuchenbecker, K.J., Okamura, A.M.: Stiffness discrimination with visual and proprioceptive cues. In: Proceedings of the Third Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. pp. 121–126 (2009)


13. Kaczmarek, K.A., Webster, J.G., Bach-y-Rita, P., Tompkins, W.J.: Electrotactile and vibrotactile displays for sensory substitution systems. IEEE Transactions on Biomedical Engineering 38(1), 1–16 (1991)

14. Kuiken, T.A., Marasco, P.D., Lock, B.A., Harden, R.N., Dewald, J.P.A.: Redirection of cutaneous sensation from the hand to the chest skin of human amputees with targeted reinnervation. Proceedings of the National Academy of Sciences of the United States of America 104(50), 20061–20066 (2007)

15. Kuschel, M., Di Luca, M., Buss, M., Klatzky, R.L.: Combination and integration in the perception of visual-haptic compliance information. IEEE Transactions on Haptics 3(4), 234–244 (2010)

16. MacLean, K.E.: Putting haptics into the ambience. IEEE Transactions on Haptics 2(3), 123–135 (2009)

17. McCloskey, D.I.: Kinesthetic sensibility. Physiological Reviews 58(4), 763–820 (1978)

18. Paljic, A., Burkhardt, J.M., Coquillart, S.: Evaluation of pseudo-haptic feedback for simulating torque: a comparison between isometric and elastic input devices. In: Proceedings of the 12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. pp. 216–223 (2004)

19. Pressman, A., Welty, L.J., Karniel, A., Mussa-Ivaldi, F.A.: Perception of delayed stiffness. International Journal of Robotics Research 26(11-12), 1191–1203 (2007)

20. Pylatiuk, C., Kargov, A., Schulz, S.: Design and evaluation of a low-cost force feedback system for myoelectric prosthetic hands. Journal of Prosthetics and Orthotics 18(2), 57–61 (2006)

21. Riso, R.R., Ignagni, A.R.: Electrocutaneous sensory augmentation affords more precise shoulder position command generation for control of FNS orthoses. In: Proceedings of the Annual Conference on Rehabilitation Technology. pp. 228–230 (1985)

22. Rohland, T.A.: Sensory feedback for powered limb prostheses. Medical and Biological Engineering and Computing 13(2), 300–301 (1975)

23. Roland, P.E., Ladegaard-Pedersen, H.: A quantitative analysis of sensations of tension and of kinaesthesia in man. Brain: A Journal of Neurology 100(4), 671–692 (1977)

24. Sainburg, R.L., Ghilardi, M.F., Poizner, H., Ghez, C.: Control of limb dynamics in normal subjects and patients without proprioception. Journal of Neurophysiology 73(2), 820–835 (1995)

25. Srinivasan, M.A., Beauregard, G.L., Brock, D.L.: The impact of visual information on the haptic perception of stiffness in virtual environments. In: Proceedings of the 5th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, American Society of Mechanical Engineers Dynamic Systems and Control Division. vol. 58, pp. 555–559 (1996)

26. Tan, H.Z., Durlach, N.I., Beauregard, G.L., Srinivasan, M.A.: Manual discrimination of compliance using active pinch grasp: the roles of force and work cues. Perception & Psychophysics 57(4), 495–510 (1995)

27. Tiest, W.M.B., Kappers, A.M.L.: Cues for haptic perception of compliance. IEEE Transactions on Haptics 2(4), 189–199 (2009)

28. Varadharajan, V., Klatzky, R., Unger, B., Swendsen, R., Hollis, R.: Haptic rendering and psychophysical evaluation of a virtual three-dimensional helical spring. In: Proceedings of the 16th International Symposium on Haptic Interfaces for Virtual Environments and Teleoperator Systems. pp. 57–64 (2008)

29. Wheeler, J., Bark, K., Savall, J., Cutkosky, M.: Investigation of rotational skin stretch for proprioceptive feedback with application to myoelectric prostheses. IEEE Transactions on Neural Systems and Rehabilitation Engineering 18(1), 58–66 (2009)