DOI: 10.23883/IJRTER.2018.4361.BNRM9
DESIGN AND IMPLEMENTATION OF ARDUINO-BASED GESTURE-
CONTROLLED SYSTEM WITH ACCELEROMETER
Nduanya, U. I., Oleka, C. V., Orji, E. Z.
Department of Computer Engineering, Enugu State University of Science and Technology (ESUT), Enugu State, Nigeria.
Abstract – Robots are designed to perform tasks that are difficult or impossible for humans, and they are programmed to access areas that humans cannot reach. This paper presents the design of a robot that takes direct, real-time instructions from a human operator through gesture control. The system consists of a transmitter part and a receiver part and was programmed using the Arduino platform. It was developed with an Arduino Mega, an accelerometer and a robot arm, which moves according to the signals generated by the operator's hand gestures.
Keywords: Arduino-Mega, Accelerometer, Robot-arm, Gesture control.
I. INTRODUCTION
Today, human-machine interaction is moving away from the keyboard, mouse, pen drive and touch screen, and is becoming pervasive and much more compatible with the physical world.
With each passing day, the gap between machines and humans is being reduced by the introduction of new technologies that improve the standard of living. A future goal is an advanced robotic arm, designed like the human hand itself, that can be controlled using hand gestures alone.
Gesture recognition systems, which are used to control memory and display devices in remote environments, are examined for the same reason. People frequently use gestures to communicate: gestures serve for everything from pointing at a person to get their attention to conveying information about spatial and temporal characteristics (Aquib, 2015).
Gesture recognition is a topic in computer science, engineering and language technology whose goal is to interpret human gestures via mathematical algorithms (self-contained, step-by-step sets of operations to be performed) (Daksh, 2002). Gestures have long been used for communication: from the first century to the present, humans have communicated with hand and body gestures such as waving goodbye, nodding and winking. Gesture-based systems take advantage of this fact by copying and replicating human gestures in the system being controlled. Following the definition of the Merriam-Webster dictionary, a gesture is a movement of the body (especially of the hands and arms) that shows or emphasizes an idea or a feeling (Gartner, 2007).
The identification and recognition of posture, gait, proxemics and human behaviour are also subjects of gesture recognition techniques. Gesture recognition can be seen as a way for computers to begin to understand human body language, building a richer bridge between machines and humans than primitive text interfaces or even Graphical User Interfaces (GUIs), which still limit the majority of input to the keyboard, mouse and touch screen. In this paper the various terminologies used in the project are defined (Pinto, 2012).
International Journal of Recent Trends in Engineering & Research (IJRTER) Volume 04, Issue 07; July - 2018 [ISSN: 2455-1457]
@IJRTER-2018, All Rights Reserved
II. RELATED LITERATURE
The 2016 BMW 7 Series, with its "gesture control" system, was designed by BMW (Bayerische Motoren Werke), the German multinational automotive manufacturer. The 2016 model set its sights on introducing a few more hand gestures to a driver's lexicon. This standard system allows the driver to forego pressing buttons or turning knobs for actions such as increasing or decreasing the audio volume. Making any of six available gestures in the vicinity of the centre console allows 3-D sensors to register and complete the requested action. Whenever a gesture is available, a small icon appears on the 10.2-inch high-resolution touch screen on the dashboard (Messenbaugh, 2016). About seven classes of operation can be carried out using gestures.

Wachs et al. (2008), researchers at Ben-Gurion University (BGU), Beer-Sheva, Israel, developed a vision-based gesture recognition/capture system that enables doctors to manipulate digital images during medical procedures using hand gestures instead of touch screens or computer keyboards. It interprets the user's gestures in real time to manipulate objects in an image visualization environment. The system was tested during a neurosurgical brain biopsy at Washington Hospital Center (WHC; Washington, DC, USA).

Licsár and Szirányi (2004) presented a human-computer interface for a virtual mouse system in a projector-camera configuration. The system applies a boundary-based method to recognize static hand gesture poses. The virtual mouse application is controlled by the detected hand poses and palm positions. The virtual user interface can be displayed on the projected background image, so the user controls and interacts directly with the projected interface, realizing an augmented reality. The system detects hand poses by shape analysis, resulting in a larger vocabulary set, and correct shape detection is achieved in the presence of a cluttered background using a segmentation-based method.

Purkayastha et al. (2014) focused on the development of a robotic arm using a flex sensor, ZigBee and three servo motors connected to an Arduino Uno, controlled by Processing software and a computer mouse. The robotic arm is cheap, built from easily available parts, and free from unnecessary wired connections, reducing its complexity.

Kaura et al. (2013) implemented a system through which a user can give commands to a wireless robot using gestures. The user controls or navigates the robot using palm gestures: command signals are generated from these gestures using image processing and passed to the robot to navigate it in the specified direction.

Jena et al. (2015) proposed a gesture-controlled robot using an Arduino Uno and an accelerometer. The Arduino Uno reads the analog output values (the x-axis and y-axis values) from the 3-axis accelerometer and converts each analog value to its respective digital value. The digital values are processed by the Arduino Uno and sent to the RF transmitter; the receiver picks them up, processes them at the receiving end and drives the motors in the corresponding direction.
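The tilt-to-command mapping that Jena et al. describe can be sketched as follows. This is an illustrative C++ model, not the authors' code; the mid-scale rest value and the dead band are assumptions for a 10-bit ADC, not their calibration.

```cpp
#include <string>

// Illustrative model of mapping 10-bit ADC readings (0-1023) from the
// accelerometer's x and y outputs to direction commands. The rest value
// and dead band are assumptions for illustration only.
std::string directionFromTilt(int x, int y) {
    const int rest = 512;  // assumed reading when the hand is held level
    const int dead = 100;  // assumed dead band to ignore small jitters
    if (y < rest - dead) return "forward";   // -Y tilt
    if (y > rest + dead) return "backward";  // +Y tilt
    if (x > rest + dead) return "right";     // +X tilt
    if (x < rest - dead) return "left";      // -X tilt
    return "stop";
}
```

In practice the command string would be replaced by a single byte sent over the RF link.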
III. ARDUINO MEGA-BASED GESTURE-CONTROLLED SYSTEM WITH
ACCELEROMETER DESIGN
Figure 1. The System Structure (block diagram: Sensing Unit → Transmitter Unit → Receiver Unit → Motor Drivers, with a common Power Supply)
Figure 2. Pin description of the Arduino Mega R3
The Arduino Mega 2560 is a microcontroller board based on the ATmega2560. It has 54 digital input/output pins (of which 15 can be used as PWM outputs), 16 analog inputs, 4 UARTs (hardware serial ports), a 16 MHz crystal oscillator, a USB connection, a power jack, an ICSP header and a reset button. It contains everything needed to support the microcontroller: simply connect it to a computer with a USB cable, or power it with an AC-to-DC adapter or battery, to get started. The Mega is compatible with most shields designed for the Arduino Duemilanove or Diecimila.
The Arduino Mega-controlled robot was designed using the following modules/units: the transmitter module, sensor module, receiver module, motor drivers, power supply and software module.
3.1. Sensor Module
The design of the sensing part of this project started with interfacing the sensors with the Arduino to get their response and monitoring the output values in the Arduino IDE serial monitor as the sensors were tilted to different planes.
The sensor module consists of the MPU 6050 accelerometer, an accelerometer of the type that also has a gyroscope in it. To pick up the coordinates, its Interrupt, SDA and SCL pins are connected to the Arduino. The measured values appear as changes in voltage at three output pins with respect to a common ground. The sensor measures acceleration with the help of a layer of polysilicon suspended above a silicon wafer on polysilicon springs. The motion of this proof mass is translated into motion of the plates of a differential capacitor, thereby providing an output proportional to acceleration. The pinout has more pins than one would find on a commercially available breakout board; the familiar ones are the three output pins and the pins for VCC and GND.
The ST pin supports a self-test of the accelerometer: when a small voltage of less than 3.6 V is applied to it, the accelerometer's output is affected by the resulting electrostatic force. Otherwise it is left open-circuit, like the No Connection (NC) pins (Grumman, 2012). The COM pins are common grounds and are generally short-circuited together; a phase-sensitive demodulation technique is then used to determine the magnitude and direction of the acceleration.
Figure 3. Pinout of MPU 6050
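As a rough illustration of how such a ratiometric analog output maps to acceleration: assuming a 3.3 V supply, a zero-g bias of half the supply voltage, and a 300 mV/g sensitivity (figures typical of ADXL3xx-class parts, used here only for illustration and not taken from the datasheet of the module in this paper), the conversion is:

```cpp
// Illustrative conversion from an analog accelerometer output voltage to
// acceleration in g. The zero-g bias (Vcc/2) and the 300 mV/g sensitivity
// are assumed values typical of ratiometric parts, not datasheet figures
// for the module used in this design.
double voltsToG(double vout, double vcc = 3.3) {
    const double zeroG = vcc / 2.0;          // output voltage at 0 g
    const double sensitivityVperG = 0.300;   // volts per g (assumed)
    return (vout - zeroG) / sensitivityVperG;
}
```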
3.2. The Transmitter Module
The transmitter part uses the accelerometer and the RF module. The MPU 6050 communicates with the Arduino through the I2C protocol. If the MPU 6050 module has a 5V pin, it is connected to the Arduino's 5V pin; if not, it is connected to the 3.3V pin. Next, the GND of the Arduino is connected to the GND of the MPU 6050.
The program that runs here also takes advantage of the Arduino's interrupt pins: the Arduino's digital pin 2 (interrupt pins 0 and 1) is connected to the pin labelled INT on the two MPU 6050 accelerometers. Next, the I2C lines need to be set up. To do this, the pin labelled SDA on the MPU 6050 is connected to the Arduino's analog pin 4 (SDA) and the pin labelled SCL to the Arduino's analog pin 5 (SCL); the SDA and SCL of the second MPU 6050 are then connected to analog pins 6 and 7 respectively.
Figure 4. Interfacing Arduino to MPU 6050
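Each 16-bit axis sample the Arduino reads over I2C arrives as two bytes, high byte first, per the MPU 6050 register map. Combining them into a signed reading can be sketched as:

```cpp
#include <cstdint>

// The MPU 6050 delivers each 16-bit axis sample as two I2C bytes, high
// byte first (per its register map). This combines them into the signed
// reading that the firmware would then compare against calibrated tilts.
int16_t combineBytes(uint8_t high, uint8_t low) {
    return static_cast<int16_t>((static_cast<uint16_t>(high) << 8) | low);
}
```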
3.3. The Receiver Module
This consists of the RF receiver module, the motor actuator (L293D), the robotic arm and the Arduino board. The receiver receives the serial information, and the data is sent to the microcontroller unit, which in turn drives the motor actuator and the robotic arm depending on the data received. The geared motors are connected so as to respond to the signal output from the motor actuator.
The RF module, as the name suggests, operates at radio frequency; the corresponding frequency range lies between 30 kHz and 300 GHz. In this RF system, the digital data is represented as variations in the amplitude of a carrier wave, a kind of modulation known as Amplitude Shift Keying (ASK). Transmission through RF is better than IR (infrared) for several reasons. Firstly, RF signals can travel larger distances, making RF suitable for long-range applications. Also, while IR mostly operates in line-of-sight mode, RF signals can travel even when there is an obstruction between transmitter and receiver. Finally, RF transmission is stronger and more reliable than IR transmission: RF communication uses a specific frequency, unlike IR signals, which are affected by other IR-emitting sources.
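A minimal sketch of ASK in its simplest on-off-keying form, where each data bit gates the amplitude of a carrier. The carrier period and samples-per-bit count below are arbitrary illustrative choices, not parameters of the RF module used here.

```cpp
#include <cmath>
#include <vector>

// Minimal illustration of Amplitude Shift Keying (on-off keying): each
// data bit scales the amplitude of a carrier sine wave -- full amplitude
// for a 1 bit, zero for a 0 bit.
std::vector<double> askModulate(const std::vector<int>& bits,
                                int samplesPerBit = 8) {
    const double pi = std::acos(-1.0);
    std::vector<double> wave;
    for (int bit : bits) {
        for (int n = 0; n < samplesPerBit; ++n) {
            double carrier = std::sin(2.0 * pi * n / samplesPerBit);
            wave.push_back(bit ? carrier : 0.0);  // gate by the data bit
        }
    }
    return wave;
}
```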
3.4. The Motor Driver
The L293D motor driver IC drives the motors. The L293D is a monolithic, high-voltage, high-current, 4-channel driver: with this chip one can use DC motors and power supplies of up to 36 V, and it can supply a maximum current of 600 mA per channel. The L293D is also a type of H-bridge. An H-bridge is an electrical circuit that enables a voltage to be applied across a load in either direction, which means one can reverse the direction of current and thus reverse the direction of the motor. It works by having four elements in the circuit, commonly known as corners: high-side left, high-side right, low-side right and low-side left. By using combinations of these, one is able to start, stop and reverse the current. One could build this circuit out of relays, but it is easier to use an IC: the L293D chip contains two H-bridge circuits, one per side of the chip, i.e. one per motor. The part that is really needed in all of this is the two input pins per motor.
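The behaviour of the two input pins per motor reduces to a small truth table, which can be sketched as:

```cpp
#include <string>

// Truth table of one L293D H-bridge channel pair: the two input pins
// select the direction of current through the motor; equal inputs give
// no net current across the load, so the motor stops.
std::string hBridgeState(bool in1, bool in2) {
    if (in1 && !in2) return "forward";
    if (!in1 && in2) return "reverse";
    return "stop";  // both high or both low
}
```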
Voltage regulation is not a concern here because the chip allows for two power sources: one direct source of up to 36 V for the motors, and a 5 V supply to power the IC itself, which can come from the same supply used for the microcontroller. The only thing to remember is that the ground connection must be shared/common between both supplies.
Figure 5. The Motor driver with the motors connected to the Arduino
3.5. The Power Supply Unit
In this section a 5 V power supply unit was designed for the transmitter and the receiver. The 5 V is also stepped down to 3.3 V for the accelerometer, since 3.3 V is specified for the sensor to avoid damage. There is an external 12 V battery supply for the motors, with the motor actuator enabled by 5 V. For the microcontroller, another 9 V battery was used, as the Arduino board steps it down to 5 V; the step-down to 5 V was actualized using a 5 V regulator.
3.6. Mechanical Part
The receiving section is made of a thermoplastic material that is lightweight and very durable, since a lighter body allows the geared motors to move effectively and efficiently. The system measures about 22 cm by 15 cm by 5 cm. The body was made of a moderately weighted material to enable good movement of the motors: if it is too light, the motors will just spin in place, and if it is too heavy, the robot will not move. Below is the robot arm at different stages:
Figure 6. (a) Left-hand movement (b) Upward movement of the arm (c) Ready to grip (d) Forward movement
3.7. Software Module
The software was written using the Arduino IDE, which allows one to write code and burn it into the microcontroller using the Arduino board and/or an external code burner, by installing a bootloader into the microcontroller for compatibility. The microcontroller is programmed so that when the sensor tilts within a certain range, it transmits signals that move the motors in the required direction. These tilt ranges were determined by calibrating the sensor and recording the values at which a particular direction reads high; for example, the +x direction reads high at about 1.8 V. With this knowledge, the programs were written with reference to the digital pins of the microcontroller. But since the comparator is active-high, that is, it gives a high output instead of a low, the code had to be written in the reverse sense, as shown in Table 1:
Table 1: Robot Movement

Robot movement  | Input 1 (D11) | Input 2 | Input 3 | Input 4
Forward (-Y)    |       1       |    0    |    1    |    0
Backwards (+Y)  |       0       |    1    |    0    |    1
Right (+X)      |       1       |    1    |    0    |    0
Left (-X)       |       1       |    0    |    0    |    1
Stop (Z)        |       1       |    1    |    1    |    1
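The pin patterns of Table 1 can be expressed directly in code. This sketch simply encodes the table's values; the movement names are illustrative labels:

```cpp
#include <array>
#include <string>

// Pin patterns copied from Table 1: each robot movement maps to logic
// levels on the four L293D inputs (Input 1 .. Input 4).
std::array<int, 4> movementPins(const std::string& movement) {
    if (movement == "forward")   return {1, 0, 1, 0};  // -Y tilt
    if (movement == "backwards") return {0, 1, 0, 1};  // +Y tilt
    if (movement == "right")     return {1, 1, 0, 0};  // +X tilt
    if (movement == "left")      return {1, 0, 0, 1};  // -X tilt
    return {1, 1, 1, 1};                               // stop (Z)
}
```

On the receiver, each element of the returned pattern would be written to the corresponding driver input with `digitalWrite`.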
IV. CONCLUSION
The objective of this work, to design an Arduino-based gesture-controlled system with an accelerometer, was achieved. The system was developed successfully; its movement is easy to control and user-friendly. The arm has six (6) degrees of freedom and can grip and pick objects. This design can be upgraded for more enhanced use in hazardous environments and minefields.
REFERENCES
I. Aquib, J. K. (2015, January). Do-It-Yourself, Electronics for You. EFYMAG, pp. 84-86.
II. Daksh. (2002). Retrieved March 9, 2017, from Wikipedia: http://en.wikipedia.org/wiki/DRDO.
III. Gartner. (2007). Retrieved March 9, 2017, from Gartner: http://www.gartner.com/it-glossary-control
IV. Kaura, H. K., Honrao, V., Patil, S. & Shetty, P. (2013). Gesture Controlled Robot using Image Processing. International Journal of Advanced Research in Artificial Intelligence, 2(5), 69-77.
V. Licsár, A. & Szirányi, T. (2004). Hand Gesture Recognition in Camera-Projector System. Proceedings of the ECCV Workshop on HCI, Prague, Czech Republic. doi:10.1007/978-3-540-24837-8_9
VI. Messenbaugh, C. (2016). USA Today: BMW's 7-Series 'gesture controls' work pretty well. Retrieved March 9, 2017, from http://www.usatoday.com/cars
VII. Jena, S., Nayak, S. K., Sahoo, S. K., Sahoo, S. R. & Dash, S. (2015). Accelerometer Based Gesture Controlled Robot Using Arduino. International Journal of Engineering Sciences & Research Technology, 4(4).
VIII. Pinto, J. (2012). Robotics Technology Trends. Retrieved May 2017, from http://www.automation.com/library/articles-white-papers/articles-by-jim-pinto/robotics-technology-trends
IX. Purkayastha, A., Prasad, A. D., Bora, A., Gupta, A. & Singh, P. (2014). Hand Gestures Controlled Robotic Arm. Journal of International Academic Research for Multidisciplinary, 2(4), 234-240.
X. Wachs, J., Stern, H., Edan, Y., Gillam, M., Feied, C., Smith, M. & Handler, J. (2008). Real-time hand gesture interface for browsing medical images. International Journal of Intelligent Computing in Medical Sciences & Image Processing, 2(1), 15-25. doi:10.1080/1931308X.2008.10644149