
GESTURE REGULATED ANIMATRONIC ROBOTIC HAND

Rahul Gowtham Poola 1, Mrs. A. Anilet Bala (Assistant Professor) 2,

Rajesh Peddikuppa 3, Kasireddy Mahanth Reddy 4

Dept. of Electronics and Communication Engineering

SRM Institute of Science and Technology, Chennai, India

June 25, 2018

Abstract

Humans face innumerable challenges in a rapidly growing world and often encounter difficulties in accomplishing tasks. These can be surmounted using an animatronic hand, which can play a major role in military, industrial, and medical exercises. Animatronics involves the use of electro-mechanical devices to design systems that replicate human bodily movements; it is an interdisciplinary field requiring robotics, electronics, and mechanical studies. This paper showcases various practical applications of an animatronic model of a hand using MATLAB. Gesture-based communication is utilized in this design: a gesture is recognized through a webcam and the image is grabbed using the Image Acquisition Toolbox of MATLAB. The image is further processed and the output is interfaced to the animatronic hand, which can then be controlled by the human hand. MATLAB and Arduino are the key elements of the proposed model. The practical design of the robotic hand is made feasible for real-time applications in automation and rescue tactics.

International Journal of Pure and Applied Mathematics, Volume 120, No. 6, 2018, pp. 1335-1346. ISSN: 1314-3395 (on-line version). Special Issue. URL: http://www.acadpubl.eu/hub/

Keywords: Animatronics, Gesture Recognition, Image Processing, MATLAB, Arduino, Robotic Hand

1 INTRODUCTION

The role of an efficient robotic hand controller unit is to create an interactive medium between the human and the animatronic robotic hand using sign gestures. The idea led to the development of an animatronic robotic hand that is physically identical to the human hand and expands the abilities of the human hand by means of an image processing approach. The motive of human gesture recognition and modeling is to transform a gesture into a message to be conveyed. This inspired us to make an animatronic robot hand that can play the role of the human hand in antagonistic conditions without the actual presence of the human hand in that environment. Gestures are expressive, meaningful body actions that are used in daily communication [1]. The human hand is versatile in nature but is not capable of working under antagonistic conditions [2]. Gestures are a basic mode of non-oral and wordless communication in which visible body activities convey messages [3].

Modeling and recognizing gestures involve strong challenges due to variation in size and shape. Human gestures are classified as stationary gestures and non-stationary gestures. A stationary gesture is a fixed image position and posture, represented by an individual image, while a non-stationary gesture is an ambulant type, represented by a chain of images [3]. The motive behind this project is to build a human-robot interactive controller that uses human hand gestures and relates the gestures to an appropriate action of the robotic hand. The gestures are used to steer a movable robotic hand and the applications are fulfilled accordingly. The entire activity is implemented in real time, employing efficient programming and algorithms. Gesture recognition and modeling enable the interface between human and computer controllers [3]. Gesture recognition is implemented by employing modes and tools associated with image processing.


Enormous research work has been done in the field of the human-robot interface. Miscellaneous articles have been surveyed regarding gesture detection for the real-time movement of a robotic hand [4]. The diverse fields related to the same techniques involve sign language, computer graphics, and automated robots used to substitute humans. A novel method of gesture modeling and gesture recognition using flex sensors, employing wireless communication, has been proposed. The gesture modeling involves three criteria, namely hand position, hand orientation, and hand movement. The flex sensors measure the amount of bend the human finger has produced and transmit it to the robotic controller unit. The program relates the amount of bend of the flex sensors with the rotation of the robotic hand, so that different bending angles replicate different movements of the robotic hand [1].

Most hand-interactive systems comprise three stages: detecting, coursing, and perceiving. The detection stage defines and extracts the optical features that describe the gestures with respect to the camera. The coursing stage is responsible for achieving a correspondence between consecutive frames of the image; it is done to make the controller aware of "what is where". The perceiving stage involves grouping the temporal data extracted in the previous stages and assigning the groups specific labels associated with particular gestures [3].

2 DESIGN METHODOLOGY

The design methodology of the robotic hand using image processing involves several stages. The image processing flow diagram (Fig. 1) describes the different stages involved in converting gestures into the real-time movement of the robotic hand [5].


Fig. 1. Project Flow Diagram

A. Image Processing

Input gestures are captured using a webcam and images are acquired through MATLAB programmed with the Image Acquisition Toolbox. The gestures are captured with a limited-time grab interval and the buffered frames are then flushed.

B. Image Acquisition

The video input is acquired and processed in this stage. In order to send the image information to the controller, the gesture images captured by the laptop camera have to be processed. The acquired images are converted into an appropriate configuration and coded accordingly. The crucial tool for acquiring an image in MATLAB is the Image Acquisition Toolbox [6].
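
A minimal sketch of this step is given below, assuming the Image Acquisition Toolbox is installed; the 'winvideo' adaptor name and device ID 1 are assumptions and should be matched to the hardware reported by imaqhwinfo.

    % Grab a single RGB frame from the webcam for gesture processing
    vid = videoinput('winvideo', 1);     % adaptor and device ID are assumptions
    vid.ReturnedColorSpace = 'rgb';      % return frames as RGB colour images
    preview(vid);                        % optional: live view while gesturing
    img = getsnapshot(vid);              % grab one frame
    flushdata(vid);                      % discard any buffered frames (grab-and-flush)
    delete(vid); clear vid;              % release the camera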

C. Pre-Processing

Image pre-processing does not improve the image information content, and can even decrease it if entropy is taken as the measure of information. The pre-processing stage adjusts the image parameters to suppress undesirable distortions and to augment the image characteristics essential for the further processing stages [6].

D. Noise Reduction

Noise reduction involves removing unwanted noise from the signal, i.e. the unwanted background [6].

E. Resizing

Reducing the size of the acquired image to the required size and parameters [6].

F. Pruning

Pruning is a type of morphological process used to remove unwanted objects from the image [6].
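
Sections C-F can be sketched in MATLAB as follows, assuming the RGB frame img from the acquisition step and a recent Image Processing Toolbox; the 3x3 median filter, the 200x200 working size, and the spur length are illustrative choices rather than values fixed by this design.

    gray  = rgb2gray(img);               % discard colour, keep intensity for shape analysis
    den   = medfilt2(gray, [3 3]);       % D: median filter suppresses background noise
    small = imresize(den, [200 200]);    % E: resize to a fixed working size
    bw    = imbinarize(small);           % provisional binary mask for morphological work
    mask  = bwmorph(bw, 'spur', 3);      % F: pruning removes small spurious branches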

G. Segmentation

This stage sectionalizes an image into multiple sections and converts the physical representation of the image into a more meaningful and simplified representation by locating the objects and their relative boundaries with accuracy. The segmentation stage assigns a specific tag to each pixel in the image, so that pixels with the same tag share common characteristics [7].
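
A minimal segmentation sketch, assuming the binary mask from the pre-processing sketch above; the 100-pixel area threshold and the largest-region heuristic are our assumptions, not part of the original design.

    mask   = bwareaopen(mask, 100);      % drop objects smaller than 100 pixels
    lbl    = bwlabel(mask);              % tag each connected region with its own label
    stats  = regionprops(lbl, 'Area');   % area of every labelled region
    [~, k] = max([stats.Area]);          % assume the largest surviving region is the hand
    hand   = (lbl == k);                 % binary mask of the segmented hand only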

H. Feature Extraction

When the image itself cannot be used as a signal for transmission, the features of the image carrying the message that the image is meant to convey can be transmitted instead. The feature is the output of the image processing [6].
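
One way to realise this stage is sketched below, assuming the segmented mask hand from the previous step; the particular shape descriptors and the final mapping to a gesture code are illustrative, since the exact feature set and classifier are not specified here.

    props = regionprops(hand, 'Area', 'Perimeter', 'Solidity', 'Eccentricity');
    feat  = [props.Area, props.Perimeter, props.Solidity, props.Eccentricity];
    % A classifier (e.g. nearest neighbour against stored gesture templates)
    % would map feat to a gesture code in the range 1..5; the placeholder
    % below simply stands in for that decision.
    gestureCode = 3;                     % hypothetical: classifier recognised gesture "3"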


TABLE I. MATLAB CODE WORD AND OPERATION

The extracted gesture feature is transferred to an XBee transceiver through the RS232 port, which is a serial communication port. This XBee acts as a transmitter and transmits the signal to another XBee that acts as a receiver. The signals are received through the wireless medium, and the received signals are then passed to the Arduino for interfacing with the robotic hand.
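
On the MATLAB side, handing the gesture code to the transmitting XBee amounts to a one-byte serial write, sketched below; the port name "COM3" and the 9600 baud rate are assumptions to be matched to the XBee configuration (MATLAB releases older than R2019b would use the serial/fwrite interface instead of serialport).

    xb = serialport("COM3", 9600);            % serial port wired to the transmitting XBee
    write(xb, uint8(gestureCode), "uint8");   % send the recognised gesture as a single byte
    clear xb;                                 % clearing the object closes the port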

3 RESULTS AND DISCUSSION

Gestures are captured as RGB color images. The image undergoes processing and features are extracted as the input to be interfaced with the robotic hand. When a gesture for the number "3" is shown, it is captured using the camera and image processing takes place (Fig. 2). The feature extracted from gesture number 3 is transmitted by the XBee transmitter and received by another XBee receiver. The received signal is transferred to the Arduino for interfacing with the robotic hand. As a result, the third LED glows, indicating that the motion of the robotic hand related to gesture "3" is working in accordance with the input gesture. The robotic hand then performs a combination of movements such as open, down, close, up, and left (Fig. 3 and Fig. 4).


Fig. 2. Image Processing Stages

Fig. 3. Robotic Hand (Left movement)


Fig. 4. Robotic Hand (Right movement)

TABLE II. GESTURES AND ROBOTIC HAND MOVEMENTS

The robotic hand can be programmed with different combinations of motions for different human gestures. The motion of the robotic hand involves a single action, a two-action combination, a three-action combination, and so on accordingly. The existing model has a success percentage of 80%, with 8 successful detections out of 10 random tests performed. The proposed model with combination movements of the robotic hand has a success percentage of 90%, with 9 successful detections out of 10 random tests.


TABLE III. TEST ANALYSIS

The graphs (Fig. 5 and Fig. 6) illustrate the time taken for the system to acquire and process the image, transmit the signal, and communicate with the robotic hand to perform the appropriate motion or combination of motions. The four-combination movements of the robotic hand have a larger time delay than the three-combination movements for the different gestures.

Fig. 5. Three Combination Movements


Fig. 6. Four Combination Movements

4 ACKNOWLEDGEMENT

The authors would like to earnestly acknowledge the efforts and contributions made by the teammates in transforming a visionary idea into a realistic design. The authors would also like to sincerely thank the project mentor for the valuable guidance, consistent encouragement, and timely help in making this research work successful.

References

[1] Yuanhao Wu, Ken Chen and Chenglong Fu, Natural Gesture Modelling and Recognition Approach Based on Joint Movements and Arm Orientations, IEEE Sensors Journal, Vol. 16, November 1, 2016.

[2] Abdullah Shaikh, Gandhar Khaladkar, Rhutuja Jage, Tripti Pathak and Javed Taili, Robotic Arm Movement Wirelessly Synchronized with Human Arm Movements Using Real-Time Image Processing, 2013 Texas Instruments India Educators Conference.


[3] Gaurav Chauhan and Prasad Chaudhari, Gesture-Based Wireless Control Using Image Processing, 5th Nirma University International Conference on Engineering, 2015.

[4] Pankaj S. Lengare and Milind E. Rane, Human Hand Tracking Using MATLAB to Control an Arduino-Based Robotic Arm, International Conference on Pervasive Computing, 2015.

[5] Panth Shah and Tithi Vyas, Interfacing of MATLAB with Arduino for Object Detection Algorithm Implementation Using Serial Communication, International Journal of Engineering Research and Technology (IJERT), Vol. 3, Issue 10, October 2014.

[6] C. Theis, I. Iossifidis and A. Steinhage, Image Processing Methods for Interactive Robot Control, Proceedings of the 10th IEEE International Workshop on Robot and Human Interactive Communication, 2001.

[7] Operating Two Servo Motors with Arduino, http://www.robotoid.com/appnotes/arduino-operating-two-servos.html

[8] Bhumeshwari Basule and Shubhangi Borkar, Robotic Four Finger ARM Controlling Using Image Processing, IJESC, Vol. 7, Issue 7.
