
DEVELOPMENT OF A 3D PRINTED PROSTHETIC MYOELECTRIC HAND DRIVEN BY DC ACTUATORS

Emanuel de Jesus Lima∗, Armando S. Sanca∗, Adam Arabian†

∗Technology Department, State University of Feira de Santana

Feira de Santana, BA, Brazil

†Engineering Department, Seattle Pacific University

Seattle, WA, USA

Emails: [email protected], [email protected], [email protected]

Abstract— Upper limb amputees receiving myoelectric devices are currently limited to either highly limiting single-actuator devices or extremely expensive multi-finger grasping designs. In this paper, we show that it is possible to construct a hand prosthesis with a varied, multi-finger gesture set that is affordable for low-income amputees. The system includes an innovative, noninvasive device placed on the forearm muscles to capture surface electromyography (sEMG) signals and a unique intelligent system for classification of hand gestures, packaged with a prosthetic device that is easy to use, assemble, and maintain. This work describes a real-time, portable system based on the Myo armband and a 3D printed prosthesis. The results demonstrate that this approach represents a significant step towards more intuitive, low-cost myoelectric prostheses, with possible extension to other assistive robotic devices.

Keywords— Hand Prosthesis, Myoelectric Signals, k-NN, 3D Printing.

Resumo— Upper limb amputees who receive myoelectric prostheses are currently limited to single-actuation devices, or restricted to high-cost designs that provide multi-finger actuation. In this article, we show that it is possible to build a multi-digit hand prosthesis with a varied gesture set that can be affordable for amputees with limited economic resources. The system includes an innovative, noninvasive device positioned on the forearm muscles to capture surface electromyography (sEMG) signals, a dedicated intelligent system for hand gesture classification, and a prosthesis prototype that is easy to use and assemble. This work describes a real-time portable system based on the Myo armband and a 3D printed prosthesis. The results demonstrate that this approach represents a significant step in the development of more intuitive, low-cost myoelectric prostheses, with possible extension to other assistive robotic devices.

Keywords— Hand Prosthesis, Myoelectric Signals, k-NN, 3D Printing.

1 Introduction

In medicine, a prosthesis is an artificial device that replaces a missing body part, which may be lost through trauma, disease, or congenital conditions. These devices can be designed to provide a better aesthetic appearance and a psychological sense of wholeness to the amputee, to improve the functions of lost limbs, or some combination of these goals. For years, development was stymied by the limitations of available technology (Miklós, 2017). However, in the late twentieth and early twenty-first centuries, upper limb prosthetic devices that provide natural control based on remaining neuromuscular connections became commercially available, first using analog control systems (Cordella et al., 2016) and, more recently, digital signal processing systems (Mendes-Jr. et al., 2016).

Modern prosthetists can offer a range of devices that leverage different, and in many cases highly advanced, technologies, but in many cases these same prostheses have a high cost, severely limiting people's access to this type of equipment. According to ABOTEC (the Brazilian Association of Technical Orthopedics), less than 3% of Brazilian disabled people have access to high-tech prostheses (Garcia, 2012), even though Decree No 3.298, of December 20, 1999, of the Brazilian government provides for access to assistive devices and technologies for persons with disabilities. There are now numerous devices available to upper limb amputees (arms and hands) that use sensors to capture information from the muscle contractions responsible for activating human motor units and send this information to a control system that activates the electro-mechanism of the prosthesis. Those devices are commonly referred to as myoelectric arms and myoelectric hands (Peerdeman et al., 2011). In Brazil, researchers from several universities are developing 3D printed prosthetic devices and sEMG signal processing, ranging from low-cost to high-tech gadgets (Mendes-Jr. et al., 2016). In other countries, prosthetics projects like the Open Hand Project and the Enable Community Foundation (ECF) also develop assistive technologies. The latter provides free 3D prototype project files so that anyone with access to a 3D printer can make parts for prosthetics.


The remainder of this paper is organized as follows. Section 2 presents a literature review addressing existing prosthetic hands in both academic research and the commercial marketplace, signal processing using machine learning algorithms, and the benefits of 3D printing. Section 3 describes the components and the system used to build the prototype. Section 4 presents numerical analysis and experimental results showing the performance of the system, including data collection and gesture recognition. Finally, Section 5 provides conclusions and potential future work.

2 Literature Review

2.1 Prosthetic Hands in the World

Upper limb prostheses are used following amputation at any level from the hand to the shoulder. The major goals of an upper limb prosthesis are to restore natural appearance and function. Reaching these goals also requires sufficient comfort and ease of use for continued acceptance by the user. The level of amputation (fingers, hand, wrist, elbow, or shoulder), and therefore the complexity of joint movement, results in significant increases in technical challenge for higher-level amputations (Cordella et al., 2016). There are three different kinds of prosthetic limbs: aesthetic, mechanical, and myoelectric prostheses, whose design often depends on the needs of the amputee and the site of the amputation. Aesthetic prostheses are designed to help patients cope with the traumatic experience. Mechanical prostheses rely on cables to carry out movements, typically actuated by the trapezius muscles, to control the function of the terminal device. Myoelectric prostheses are devices controlled through muscular activity in a way that mimics how the subjects used to activate their muscles before limb loss (Peerdeman et al., 2011; Mendes-Jr. et al., 2016).

2.2 Machine Learning Algorithms

Figure 1: Supervised learning model. Training gestures produce sensor data vectors and gesture labels, which are fed to a machine learning algorithm to build a predictive model; the model then maps a new gesture to an expected gesture.

Figure 2: k-NN classification procedure. To classify a new point into one of the potential classes: (0) look at the data; (1) calculate the distances between the new point and all other points; (2) find the nearest neighbours by ranking points by increasing distance; (3) vote on the predicted class label based on the classes of the k nearest neighbours (here, k = 3).

Recent innovations in signal processing techniques and mathematical models have made it practical to develop advanced electromyography (EMG) recognition and analysis methods (Peerdeman et al., 2011). Different mathematical description methods and machine learning techniques, such as Artificial Neural Networks (ANN), fuzzy systems, probabilistic model algorithms, metaheuristic and swarm intelligence algorithms, and some hybrid algorithms, are used for the characterization of EMG signals (Cordella et al., 2016). In the past couple of decades, machine learning has become a common tool in almost any task that requires information extraction from large data sets (Shalev-Shwartz and Ben-David, 2014). In this development we focus on the supervised learning algorithm k-Nearest Neighbors (k-NN), Figure 1, in which the algorithm generates a function that maps inputs to desired outputs. One standard formulation of the supervised learning task is the classification problem: the learner is required to learn (to approximate the behavior of) a function that maps a vector into one of several classes by looking at several input-output examples of the function. Its simplicity and effectiveness have led it to be widely used in a large number of classification problems (Shalev-Shwartz and Ben-David, 2014). The k-NN algorithm classifies objects based on the closest points in the feature space of the training set (Figure 2). The training sets are mapped into a multi-dimensional feature space, which is partitioned into regions based on the categories of the training set. A point in the feature space is assigned to a particular class if that is the most frequent class among the k nearest training data points. Generally, the Euclidean distance is used to compute the distance between the vectors (Pedregosa et al., 2011).

The scikit-learn project (Pedregosa et al., 2011) provides an open source machine learning library for the Python programming language, and it was used to classify the sEMG data in this development (Scikit-Learn, 2016).
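As an illustration of this pipeline, the sketch below classifies 8-channel sEMG feature vectors with scikit-learn's KNeighborsClassifier. The random placeholder data, array shapes, and labels are our assumptions for the example, not the paper's dataset.

    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    # Placeholder training data: each row is one reading from the Myo's
    # eight sEMG sensors; each label is a gesture class number (0..3).
    rng = np.random.default_rng(0)
    X_train = rng.random((900, 8)) * 1000
    y_train = rng.integers(0, 4, size=900)

    # k-NN with the default Euclidean distance metric; k = 15 as in this work.
    clf = KNeighborsClassifier(n_neighbors=15)
    clf.fit(X_train, y_train)

    # Classify a new 8-channel reading.
    new_reading = rng.random((1, 8)) * 1000
    print("Predicted gesture class:", clf.predict(new_reading)[0])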

3 System Development Description

3.1 Prototype components

Developed by Thalmic Labs, Myo (Figure 3) is a lightweight elastic armband that registers gesture commands. Myo consists of a number of medical-grade stainless steel sEMG sensors and a highly sensitive Inertial Measurement Unit (IMU) that measure the electrical activity in the forearm muscles and transmit the gestures made with the hand to a connected device via Bluetooth. To access the sEMG signals and motion parameters of the worn arm, Thalmic Labs provides a Software Development Kit (SDK) and the Myo Connect API, which offer complete facilities for writing applications that make use of the Myo armband's capabilities on different platforms (MyoTM, 2016).

Figure 3: Myo™ Armband by Thalmic™ Labs.

The 3D printed hand used in this development is an adaptation of the Raptor Hand created by the e-NABLE project (eNABLE, 2015). Features include 3D printed snap pins, a modular tensioning system, and compatibility with both velcro and leather palm enclosures. The Raptor Hand is licensed under the Creative Commons Attribution-ShareAlike license, which allows the material to be transformed and built upon for any purpose, even commercially.

The system is composed of these main elements: the Myo armband (sEMG sensors); the Intel Edison compute module with a GPIO block; the signal conditioning board, powered by batteries; and a 3D printed prosthetic hand actuated by DC servomotors, which are commonly used in prosthetics (Figure 4). The solution allows the user to operate the prosthesis by contracting the forearm muscles in an intuitive way. The prototype was tested to demonstrate high classification success rates and support for multiple gestures at low cost.

Figure 4: Myoelectric hand prototype at UEFS: 3D printed hand prosthesis, Intel Edison, battery, and Myo armband.

3.2 Software processing

In this subsection, we describe the procedures for capturing, conditioning, and processing the sEMG signals, and for operating the servomotors using PWM signals. Figure 5 presents the software overview.

Figure 5: Software overview. A calibration stage (reading sEMG data from the Myo, labeling it, and recording it to a file) is followed by real-time control (reading, classification, and generation of output commands for the servo movements).

The myo-raw library provides an interface to communicate with the Myo, giving access to the sEMG sensor data at 200 Hz and the IMU data at 50 Hz (MyoTM, 2016). The main goal here is to determine the performed hand gesture, based on the sEMG data received from the forearm, while maintaining real-time response. The signal classification is done with the k-NN implementation available in the Python machine learning module scikit-learn. The k-NN algorithm (Scikit-Learn, 2016) implements learning based on the k nearest neighbors of each query point, where k is an integer value specified by the user. The optimal choice of the value of k is highly data-dependent: in general, a larger k suppresses the effects of noise but makes the classification boundaries less distinct. In the literature, there is no consensus on how this value should be calculated. The alternative adopted in this work follows a common recommendation: set k to the square root of the size of the training base divided by 2, adjusted to an odd value to decrease the chances of a tie (Scikit-Learn, 2016). Since the training base is defined in the calibration stage, where approximately 900 samples are collected for each gesture, the value of k in this case was set to 15.
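A minimal sketch of this k selection rule follows; the helper name is ours, not from the paper's code.

    import math

    def choose_k(training_base_size: int) -> int:
        """k = sqrt(training base size) / 2, forced odd to reduce ties."""
        k = round(math.sqrt(training_base_size) / 2)
        return k if k % 2 == 1 else k + 1

    print(choose_k(900))  # -> 15, the value used in this work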

To classify the data, we first train the system by defining gestures and assigning numbers to them. The Myo is placed at the top of the subject's forearm and the subject is instructed to execute each gesture (resting, making a fist, extending the index finger, hiding the thumb) for approximately twenty seconds. Recording must start while the gesture is being held, not while the limb is moving to or from the gesture, and the subject should move the limb around a little while holding the gesture, to give the program a more flexible idea of what the gesture is. While the algorithm receives a gesture number as an argument, which is done by holding down a number key on the keyboard, the current sEMG readings are labeled and recorded as belonging to the gesture with that number.
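A simplified sketch of this labeling step, assuming a generator of 8-value sEMG readings that yields only while the corresponding number key is held; this interface and the CSV file format are our assumptions, not the paper's code.

    import csv

    def record_labeled_samples(emg_stream, gesture_number, path="training.csv"):
        """Append each 8-channel sEMG reading with its gesture label."""
        with open(path, "a", newline="") as f:
            writer = csv.writer(f)
            for reading in emg_stream:  # yields while the number key is held
                writer.writerow(list(reading) + [gesture_number])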

With the program running, any time a new reading comes in, the program classifies it based on the trained values to determine which gesture it looks most like. If running in an environment with a screen, the screen can be used to display the number of samples currently labeled as belonging to each gesture, along with a histogram displaying the classifications of the last 25 inputs. The most common classification among the last 25 is displayed and should be taken as the program's best estimate of the current gesture.
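This majority vote over the last 25 classifications can be sketched as below; the function name is ours.

    from collections import Counter, deque

    recent = deque(maxlen=25)  # the 25 most recent classifications

    def smoothed_gesture(new_prediction: int) -> int:
        """Return the most common class among the last 25 predictions."""
        recent.append(new_prediction)  # the oldest entry drops out
        return Counter(recent).most_common(1)[0][0]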

When the system is started, three threads are created, one to manage each servo unit. Once a gesture is classified, the number assigned to that gesture in the calibration stage is sent to the threads; each thread then checks whether the current servo position needs to be changed based on the number received (Figure 6). The servomotors then pull the cords to reach the desired position. The servomotor units are operated by commands received from the Intel Edison board in the form of Pulse-Width Modulation (PWM) signals.

Figure 6: Servo control workflow. After a gesture number is received, the thread checks whether the gesture has changed and whether the servo angle has changed; only then is the PWM signal sent.
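A minimal sketch of the per-servo thread logic in Figure 6, assuming a queue of classified gesture numbers and a caller-supplied PWM function; the gesture-to-angle mapping here is a placeholder (the actual angles appear in Table 1).

    import queue
    import threading

    GESTURE_TO_ANGLE = {0: -90, 1: 90, 2: -90, 3: -90}  # placeholder angles

    def servo_worker(gestures: queue.Queue, set_pwm_angle) -> None:
        """Send a PWM command only when the target angle actually changes."""
        current = None
        while True:
            gesture = gestures.get()      # wait for a classified gesture
            target = GESTURE_TO_ANGLE[gesture]
            if target != current:         # "servo angle has changed?"
                set_pwm_angle(target)     # emit the PWM signal
                current = target

    # One thread per servo unit, as in this design:
    # threading.Thread(target=servo_worker, args=(q, pwm_fn), daemon=True).start()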

4 Experiments and Results

Data collection experiments were carried out in order to evaluate the streaming performance of the device and to compare different sets of data for the same gesture in different arm positions, Figure 7. To collect data, the device (Myo) was placed on the forearm and the gestures were then executed in three different arm positions (pointing up, down, and forward, first with extended and relaxed fingers, then making a fist and slowly closing the hand), Figure 8.

Figure 7: Arm positions for data collection: up, down, and forward.

Figure 8: sEMG sensor responses (S1-S8) versus sample number for the closing hand movement, with the arm pointing down, forward, and up.

Figure 8 shows in detail the sensor responses for the slowly closing hand gesture with the arm positioned in three directions: up, down, and forward. The blue line shows the response when the arm is pointing down, the green line when it is pointing forward, and the red line when it is pointing up. Si, where i = 1, · · · , 8, are the sensor numbers. Although there are differences between the lines when a gesture is executed in different arm positions, the classifier can handle them and predict the correct gesture, as we demonstrate in Figure 10. After data collection was complete, experiments were conducted using the classifier to recognize gestures. In all the experiments described here, the gestures were performed by a subject without amputation or malformation of the arm, and with a reasonably hair-free arm (hair on the arm makes the readings less accurate, since it reduces the contact of the sensors with the skin). The Myo was placed at the top of the subject's forearm and the subject was instructed to execute the commands as described in each experiment. Figure 9 presents the implemented gestures, which include (from top to bottom) the rest position, making a fist, extending the index finger, and hiding the thumb.


Figure 9: Hand gesture set: rest (0), fist (1), index (2), and thumb (3).

Figure 10: sEMG sensor readings (S1-S8) and the predicted gesture class over time (s) for the test sequence rest (0), fist (1), index (2), thumb (3), performed twice.

Figure 10 shows the results of experiments in which all four gestures (rest, making a fist, index finger, thumb finger) were performed for a period of time while the input signals from the eight sensors and the predicted class were recorded. In the tests performed, the effect of the arm position on gesture identification was not highly significant. Furthermore, we notice that as the hand closes, which is done by increasing the muscular contraction, the sensor values increase as well. It is also visible that some sensors contribute (increase in value) to a given gesture more than others. The blue lines in Figure 10 represent the sensor values over time. The yellow, red, blue, and black lines in the gesture plot represent the predicted class as each gesture was performed, and the black circles mark the transitions between gestures, when the arm is moved or repositioned. These values may change if the sensor is moved or placed on a different portion of the arm, thus necessitating the calibration stage every time the sensor is moved or removed from the arm.

Performed Gesture   Predicted Gesture   Accuracy   Servo1   Servo2   Servo3
Rest (0)            Rest                96.5%       90       -90      -90
Fist (1)            Fist                92.5%      -90        90       90
Index (2)           Index               89.5%      -90       -90       90
Thumb (3)           Thumb               91.5%      -90       -90      -90

Table 1: Predicted gestures, classification accuracy, and servomotor constraint angles (degrees).

When the recording began, the hand was in the "rest" position (class 0); a few seconds later, the "making a fist" gesture (class 1) was performed and, as can be seen, there is a change in the sensor level (blue line) and in the predicted class (yellow, red, blue, and black lines); the same occurs as the other gestures are executed over time. We also notice some peaks and fast predicted-class changes (black circles). They happen mostly in the transitions between gestures, because the algorithm faces an unknown gesture and tries to classify that input as belonging to the class most likely to match it. This effect is filtered in the output signal to the servos by sending the signal only when a class is the most common among the last twenty-five classified classes. Because the sample rate is high relative to the speed of operation, this did not affect the response.

The k-NN algorithm computes the k nearest neighbors for the current reading, which means that if an untrained pose is performed, the algorithm will estimate the one in the training set that is most likely to match it. Future work may address this issue by, for example, limiting the distance range between two points.
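One possible form of this distance limiting, sketched under our own assumptions (a fitted scikit-learn classifier and an application-chosen threshold):

    import numpy as np

    def classify_with_rejection(clf, reading, max_distance):
        """Reject a reading whose nearest neighbors are too far away."""
        distances, _ = clf.kneighbors(np.atleast_2d(reading))
        if distances.mean() > max_distance:
            return None              # untrained pose: keep the previous gesture
        return clf.predict(np.atleast_2d(reading))[0]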

The final experiment was performed in the Intel Edison environment. At this point we chose the gestures to be implemented and put all parts of the system together. In the calibration stage, the implemented gestures were performed and each was assigned a number identifying the class to which it belongs: the rest position was labeled as number 0, making a fist as number 1, extending the index finger as number 2, and hiding the thumb as number 3. The algorithm was then executed, and as each gesture was performed, the hand prosthesis responded with the same gesture. A video with the demonstration is available at the following link: (https://youtu.be/W8blCFG8PAI).

Using the same gesture set that had already been calibrated, we executed each gesture for approximately five seconds and recorded the number of samples taken in that time, the number of times the most common gesture was predicted, and the servo angle commands sent to the actuator units. For example, the rest gesture (class 0) was executed for 5 seconds; during this time, 235 samples were taken, and 227 of them were predicted as belonging to class 0, a success rate of approximately 96.5%. The same calculation was made for the other gestures, and the results are presented in Table 1. Servoi, where i = 1, 2, 3, controls the thumb, the index finger, and the three other fingers (middle, ring, and pinky), respectively. For Servo1 and Servo2, the −90◦ angle means the finger is open, while 90◦ means it is closed. Due to mechanical constraints, Servo3 operates in the opposite direction relative to Servo1 and Servo2. Table 1 shows that when the rest gesture was identified, all the fingers were open at the considered time.
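For reference, Table 1's angle commands can be transcribed into a lookup table like the one below (our transcription, not the paper's code):

    # Target angles in degrees per gesture class, from Table 1:
    # (Servo1 = thumb, Servo2 = index, Servo3 = middle/ring/pinky).
    # For Servo1 and Servo2, -90 is open and 90 is closed; Servo3 is
    # mechanically reversed relative to the other two.
    SERVO_ANGLES = {
        0: (90, -90, -90),   # rest
        1: (-90, 90, 90),    # fist
        2: (-90, -90, 90),   # index
        3: (-90, -90, -90),  # thumb
    }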

5 Conclusion and Future Works

The Myo armband is a promising interface for the development of prosthetic devices. The investment costs, for implementation only, appear in Table 2. Our results show that the quality of the motion and muscle sensing data is attractive for sEMG signal classification. However, the unofficial support for the Linux platform resulted in a significant increase in debugging time, which would be addressed by commercial support through a manufacturer-provided SDK. In this project, our goal was to design an affordable solution to the hand prosthesis problem. The experimental results of the trials described in this paper demonstrate that this myoelectric interface and control system have great potential to become a usable means for amputees to achieve both ease of use and dexterous functionality, by being affordable for low-income individuals and by allowing them at last to control their hand prosthesis in a more intuitive and natural way. In the video that can be accessed at this link (https://youtu.be/9s-8xCSUViU), we show an amputee using the system. Future work will address known weaknesses of this solution. First, we plan to build the entire system on an embedded platform for aesthetic purposes. We additionally plan to examine different classification algorithms to achieve higher gesture recognition accuracy. Further, we plan to use the sEMG signals to find mathematical models that regulate the movements, rather than just identify them, and to extend this solution so that each finger can be controlled separately.

Parts                               Cost (R$)
3D printed hand                        200.00
Actuator units (MG90S Servo)            65.70
Sensor Myo Armband                    1090.00
Microprocessor Intel Edison            380.30
Base, GPIO, and Battery blocks         254.90
TOTAL                                 1990.90

Table 2: Investment cost of the prototype.

References

Cordella, F., Ciancio, A. L., Sacchetti, R., Davalli, A., Cutti, A. G., Guglielmelli, E. and Zollo, L. (2016). Literature review on needs of upper limb prosthesis users, Frontiers in Neuroscience 10(209): 1–14.

eNABLE (2015). The Raptor Hand by e-NABLE, Technical report, enablingthefuture.org, http://enablingthefuture.org/upper-limb-prosthetics/the-raptor-hand/.

Garcia, V. (2012). Veja os primeiros resultados do Censo 2010 sobre Pessoas com Deficiência, Technical report, http://www.deficienteciente.com.br/veja-os-primeiros-resultados-do-censo-2010-sobre-pessoas-com-deficiencia.html.

Mendes-Jr., J. J. A., Pires, M. B., Okida, S. and Jr., S. L. S. (2016). Robotic Arm Activation using Surface Electromyography with LabVIEW, IEEE Latin America Transactions 14(8): 3597–3605.

Miklós, V. (2017). The history of prosthetics reveals a long tradition of human cyborgs, Technical report, io9, http://io9.gizmodo.com/the-history-of-prosthetics-reveals-a-long-tradition-of-1552921361.

MyoTM (2016). Myo gesture control armband, Technical report, Thalmic Labs Inc., https://www.myo.com/.

Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M. and Duchesnay, E. (2011). Scikit-learn: Machine Learning in Python, Journal of Machine Learning Research 12: 2825–2830.

Peerdeman, B., Boere, D., Witteveen, H., Velt, R. H., Hermens, H., Stramigioli, S., Rietman, H., Veltink, P. and Misra, S. (2011). Myoelectric forearm prostheses: State of the art from a user-centered perspective, Journal of Rehabilitation Research and Development 48(6): 719–738.

Scikit-Learn (2016). Nearest Neighbors, Technical report, http://scikit-learn.org/stable/modules/neighbors.html.

Shalev-Shwartz, S. and Ben-David, S. (2014). Understanding Machine Learning: From Theory to Algorithms, Cambridge University Press.
