Hemant Sonawane report


Posted on 14-Apr-2017



ABSTRACT: -

This project focuses on research into human-robot interaction through tele-operation with the help of brain-computer interfaces (BCIs). To accomplish that, a working system was created from off-the-shelf components. The experimental prototype uses the basic movement operations and obstacle detection of a Lego Mindstorms NXT robot. There are two versions of this prototype, both taking readings from the user's brain electrical activity in real time. The first version uses the NeuroSky Mindset and is based on the attention levels of the user: as attention rises or falls, the robot accelerates or decelerates. The second version uses the Emotiv EPOC headset, which is the one actually used in this project; it takes readings from 14 sensors and is able to control the robot fully.


INTRODUCTION: -

Robotics is a rapidly developing field that draws on several scientific disciplines, including electronics, mechatronics, software engineering, physiology, human-machine interaction and psychology. Brain-computer interface (BCI) technology, in turn, is a fast-growing field of research with applications ranging from modern computer games, prosthetics and control systems through to medical diagnostics. Although research on BCI started in the 1970s, only in the last few years has it been possible to introduce brain-computer interfacing to ordinary users, mostly through serious games built on commercial products.

In the past, a number of research groups have investigated various ways of controlling robotic platforms, mainly electric wheelchairs or robotic arms, for people suffering from a diverse range of impairments. A very good example that combines both platforms is the 9-degree-of-freedom (DoF) Wheelchair-Mounted Robotic Arm System from the University of South Florida, Tampa. This kind of research has had a profound impact on a person's independence, social activity and self-esteem. The technology was launched commercially with headsets that use a simplified version of an electroencephalograph. Examples include the mind-controlled electric carts using NeuroSky's technology, with the drivers wearing Mindset headsets, demonstrated on "The Doctors" show on CBS. Another relevant project originated from the BCI research group of the University of Essex and demonstrated the NXT robot as the main controlled object on Channel 4 TV in 2007. A further example of a non-medical EEG controlling a remote device is the Emotiv EPOC headset moving a robotic arm. Using its 16 electrodes (14 recording channels plus two references), the EPOC can measure levels of attention as well as facial expressions. The Emotiv EPOC was launched in the same period as NeuroSky's headset.

The aim of this research is to introduce the field of human-robot interaction using BCIs to tele-operate a robotic unit. In particular, it focuses on how a robotic unit operated through brainwaves can overcome the kinetic constraints of the user. The prototypes use the basic movement operations and obstacle detection of a Lego Mindstorms NXT robot. The robot is controlled through a NeuroSky Mindset for the first prototype and an Emotiv EPOC for the second, taking readings from the user's brain electrical activity in real time. This activity, produced by the firing neurons of the brain, is recorded through electroencephalography (EEG), and it can serve as the basis for serious applications ranging from computer games to computer simulations.


SYSTEM ARCHITECTURE: -

For the development of the experimental prototype, a desktop PC running Windows 8 was used as the main development system, and a netbook running Windows 10 was used for demonstration purposes. The basic hardware components are a Lego Mindstorms NXT robot, the NeuroSky Mindset headset and the Emotiv EPOC headset, as illustrated in Figure 1.

Figure: 1 System Architecture

Raw EEG is usually described in terms of frequency bands: GAMMA (greater than 30 Hz), BETA (13-30 Hz), ALPHA (8-12 Hz), THETA (4-8 Hz) and DELTA (less than 4 Hz) [12]. An onboard chip in the Mindset processes the raw EEG to produce the eSense meters (Attention and Meditation) and outputs them to the PC [5]. The Emotiv EPOC headset, on the other hand, is a wireless neuro-signal acquisition and processing neuroheadset with 14 saline sensors, able to detect not only brain signals but also the user's facial expressions. As for the robot, the Lego Mindstorms NXT kit was used: a cross-platform programmable robotics unit released by Lego in 2006. The version used includes a 32-bit AT91SAM7S256 (ARM7TDMI) main microprocessor at 48 MHz, with 256 KB of flash memory and 64 KB of RAM. It connects to three servo motors and four types of sensors: touch, light, sound and ultrasonic.
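The frequency-band boundaries quoted earlier in this section can be captured in a small helper. This is an illustrative sketch in Python (the project itself used Java and C#), with the cut-offs taken directly from the text:

```python
def eeg_band(freq_hz: float) -> str:
    """Classify a frequency into the EEG bands listed in the text [12]."""
    if freq_hz < 4:
        return "DELTA"   # less than 4 Hz
    if freq_hz < 8:
        return "THETA"   # 4-8 Hz
    if freq_hz < 13:
        return "ALPHA"   # 8-12 Hz
    if freq_hz <= 30:
        return "BETA"    # 13-30 Hz
    return "GAMMA"       # greater than 30 Hz
```

Note that the text gives ALPHA as 8-12 Hz and BETA as starting at 13 Hz; the sketch assigns the gap at 12-13 Hz to ALPHA.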

The three identical servo motors have built-in reduction gear assemblies with internal optical rotary encoders that sense their rotations to within one degree of accuracy. The touch sensor detects whether it is currently pressed, has been bumped, or has been released; it is used in the prototype for collision detection. The light sensor detects the light level in one direction and also includes an LED for illuminating an object. It can sense reflected light values (using the built-in red LED) or ambient light on a scale of 0 to 100, where 100 is very bright and 0 is dark; if calibrated, it can also be used as a distance sensor. On this model the light sensor is used as a height detector.

The programming language used is Java, with the Eclipse IDE. LeJOS, a tiny Java Virtual Machine ported to the NXT brick in 2006, was installed on the NXT robot as its main OS so that the robot could be programmed in Java; the main advantage of LeJOS NXJ is that it provides the extensibility and adaptability of Java [13]. Furthermore, the pccomm and BlueCove libraries were used for the Bluetooth communication between the computer and the robot. Finally, for the computer-headset communication, the ThinkGear library from the Mindset development tools was used for the Mindset, and the Emotiv Development Kit for the Emotiv EPOC headset.


BRAIN-COMPUTER INTERFACE: -

Human-machine interfaces have been one of the fastest-growing areas of research and development in recent years. Most of the effort has gone into designing user-friendly and ergonomic systems through innovative interfaces such as voice recognition or virtual reality. A brain-computer interface (BCI), sometimes called a direct neural interface or brain-machine interface, is a direct communication path between the brain of a human or animal (or a brain-cell culture) and an external device. In a one-way BCI, the computer either accepts commands from the brain or sends signals to it; both variants exist. A two-way BCI would allow the brain and an external device to exchange information in both directions, but so far such interfaces have only been successfully implanted in animals or humans experimentally. In this definition, "brain" means the brain or nervous system of an organic life form, rather than the mind, and "computer" refers to any processing or computing device, from simple circuits to silicon chips (including hypothetical future technologies such as quantum computers).

Figure: 2 BCI System


USED SOFTWARE AND EQUIPMENT: -

1) Emotiv EPOC: -

Emotiv EPOC is a wireless EEG system for research, enabling entertainment, market research and usability testing. Emotiv is used in more than 100 countries, and work in many different areas of research on this topic has been published. The device is intended for practical research applications, providing access to high-quality raw EEG data through its software and the Software Development Kit (SDK). Research can be conducted through an Application Programming Interface (API) and detection libraries that recognize facial expressions, mental commands and performance metrics.

Figure: 3 Emotiv EPOC Headset

The headset can connect to a PC, tablet or smartphone over a protected 2.4 GHz wireless link. The USB dongle is compatible with standard USB and requires no custom drivers. The headset is powered by a lithium battery that provides 12 hours of continuous operation. The Emotiv EPOC has 14 EEG channels plus 2 references, positioned for accurate spatial resolution. The channel names follow the international 10-20 system of electrode placement: AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8 and AF4, with reference electrodes at the P3/P4 locations. The device uses sequential sampling with a single ADC, at a resolution of 14 bits per channel and a frequency response of 0.16-43 Hz.

Figure: 4 Location of Electrode


2) EmoComposer: -

The software shipped with the Emotiv EPOC includes EmoComposer version 1.0.0.5-PREMIUM, which serves as an Emotiv EPOC simulator. This program was used while developing the application, because it can send the same signals as the Emotiv EPOC itself.

Figure: 5 EmoComposer

3) Emotiv Control Panel: -

Emotiv Control Panel version 1.0.0.5-PREMIUM (build 116) is also part of the Emotiv EPOC package. Unlike EmoComposer, which transmits simulated signals, this program directly detects the signals broadcast by the headset. It can communicate with either the Emotiv EPOC or EmoComposer.

The program also contains instructions on how to properly fit the Emotiv EPOC, and it differentiates the signal strength of the individual sensors by colour:

Black - no signal
Red - very weak signal
Orange - weak signal
Yellow - decent signal
Green - good signal

Figure: 6 Emotiv Control Panel
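The colour scale above maps naturally onto an ordered quality level, for example to decide when the headset is fitted well enough to start a session. The following Python sketch is illustrative only — the dictionary and the "at least yellow" policy are assumptions, not part of the Emotiv software:

```python
# Ordered contact-quality levels matching the colour coding described above.
QUALITY = {"black": 0, "red": 1, "orange": 2, "yellow": 3, "green": 4}

def headset_ready(sensor_colours, minimum="yellow"):
    """True when every sensor reports at least the given quality level."""
    return all(QUALITY[c] >= QUALITY[minimum] for c in sensor_colours)
```

For example, `headset_ready(["green", "yellow"])` holds, while a single red sensor fails the check.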

4) LEGO Mindstorms: -

LEGO MINDSTORMS EV3 combines the universal LEGO building system with advanced robotics technology. With this kit one can build robots that walk or drive, speak, and do whatever their builder can devise. The kit includes 3D step-by-step tutorials for building TRACK3R, R3PTAR, SPIK3R, EV3RSTORM and GRIPP3R. These robots can be brought to life through a simple, intuitive, icon-based programming interface, and can be controlled via a remote control or through a free application on a smartphone or tablet.


Figure: 7 Controller EV3

5) LEGO Digital Designer: -

Figure: 8 LEGO Digital Designer


SELECTING THE APPROPRIATE SIGNAL: -

The Emotiv EPOC can recognize these three types of signals:

Expressive,
Affective,
Cognitive.

1) Expressive signals

Expressive signals are signals emitted by the brain based on eye movement and facial expression. EmoComposer is capable of sending, and Emotiv Control Panel of detecting, these expressive signals:

Blink with both eyes (Blink)
Blink of the left eye (Left Wink)
Blink of the right eye (Right Wink)
Look to the left (Look Left)
Look to the right (Look Right)
Raising the eyebrows (Raise Brow)
Frown (Furrow Brow)
Smile (Smile)
Clench of the teeth (Clench)
Smirk with the right corner of the mouth raised (Right Smirk)
Smirk with the left corner of the mouth raised (Left Smirk)
Laugh (Laugh).

From this menu, all options were selected except the smirk to the right or left corner. The expressive signals are well suited to control, because it is enough to perform the corresponding action with the face; the user does not need to concentrate on a specific image in the mind.

Figure: 9 Expressive Signals
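A dispatch table is a natural way to bind detected expressive signals to robot actions, in the spirit of the switch in Source Code 6. The following Python sketch is purely illustrative; the signal names and the chosen actions are assumptions, not the project's actual bindings:

```python
# Hypothetical mapping from expressive signals to robot actions.
ACTIONS = {
    "blink":       "stop",
    "left_wink":   "turn_left",
    "right_wink":  "turn_right",
    "raise_brow":  "forward",
    "furrow_brow": "backward",
}

def robot_command(signal: str) -> str:
    # Unknown or unmapped signals leave the robot idle rather than guessing.
    return ACTIONS.get(signal, "idle")
```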


2) Affective Signals

Figure: 10 Affective Signals

Affective signals are signals emitted by the brain based on emotion or meditation. EmoComposer is capable of sending, and Emotiv Control Panel of detecting, these affective signals:

Excitement (Excitement)
Engagement/Boredom (Engagement/Boredom)
Meditation (Meditation)
Frustration (Frustration).

3) Cognitive signals

In this case, the brain sends cognitive signals based on imagination: the user pictures a cube in the mind and moves it in a certain direction. Based on the direction of movement, EmoComposer is capable of sending, and Emotiv Control Panel of detecting, these cognitive signals:

Push

Pull

Lift

Drop

Left

Right

Rotate Left

Rotate Right

Rotate Clockwise

Rotate Counter-clockwise

Rotate Forward

Rotate Reverse.


Figure: 11 Cognitive Signals
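Cognitive detections come with a power value between 0 and 1 (the range that Source Code 5 validates). A common precaution is to act only when the power clears a threshold, so that weak or incidental detections are ignored. An illustrative Python sketch; the default threshold of 0.3 is an assumption, not a value taken from the project:

```python
def act_on_cognitive(signal: str, power: float, threshold: float = 0.3):
    """Return the signal to act on, or None when the detection is too weak.

    `power` is expected in the 0-1 range used by the detections; the
    threshold value is illustrative.
    """
    if not 0.0 <= power <= 1.0:
        raise ValueError("power must be between 0 and 1")
    return signal if power >= threshold else None
```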

MODEL FOR TESTING: -

Figure: 12 Physical Test Model


Figure: 13 Physical Test Model

Figure: 14 Physical Test Model


Flow Chart: -

Figure: 15 Flow Chart


CODE: -

private void Pripoj_Emotiv_Click(object sender, EventArgs e)
{
    if (this.Pripoj_Emotiv.Text == "Připojit")            // button reads "Connect"
    {
        this.Pripoj_Emotiv.Text = "Odpojit";              // switch label to "Disconnect"
        //engine.RemoteConnect("127.0.0.1", 1726);        // alternative: connect to EmoComposer
        engine.Connect();                                 // connect to the physical headset
        if (Stav_pripojeni_Lego.Text == "Připojeno")      // is the Lego already connected?
        {
            Uloz.Enabled = true;                          // enable the Save button
        }
        Emotiv_obrazek.Visible = true;
    }
    else
    {
        this.Pripoj_Emotiv.Text = "Připojit";             // back to "Connect"
        engine.Disconnect();
        if (this.Pripoj_Lego.Text == "Odpojit")
        {
            Motory_Off();                                 // stop the motors before leaving
        }
        Uloz.Enabled = false;
        Emotiv_obrazek.Visible = false;
    }
    Vlakno = new Thread(new ThreadStart(this.UpdateEmotiv));
    Vlakno.Start();                                       // poll the headset on a background thread
}

Source Code 1: Connecting and Disconnecting EPOC

Brick<IRSensor, Sensor, Sensor, IRSensor> Lego;  // the Lego's sensors
EmoState Stavy;                                  // expressive states transmitted from the headset
EmoState StavyKognitiv;                          // cognitive states transmitted from the headset

Source Code 2: Lego sensors and storage of the current expressive and cognitive states


try
{
    Lego = new Brick<IRSensor, Sensor, Sensor, IRSensor>(CisloComu.Text);
    // For connecting via Bluetooth
    Lego.Connection.Open();
    Lego.Beep(1, 1000);                           // audible confirmation of the connection
    this.Stav_pripojeni_Lego.Text = "Připojeno";  // "Connected"
    this.Pripoj_Lego.Text = "Odpojit";            // "Disconnect"
    if (Stav_pripojeni.Text == "Připojeno")       // is the headset already connected?
    {
        Uloz.Enabled = true;
    }
    Lego_obrazek.Visible = true;
}
catch
{
    this.Stav_pripojeni_Lego.Text = "Odpojeno";   // "Disconnected"
    Uloz.Enabled = false;
}

Source Code 3: Connecting the Lego Mindstorms EV3

private void VyberAkceRobota(int a)
{
    switch (a)
    {
        case 1:  // forward: both motors at the forward-slider speed
            MotorB_Rychlost = Vpred_rychlost.Value;
            MotorC_Rychlost = Vpred_rychlost.Value;
            Vpred_aktivita.Value = Vpred_rychlost.Value;
            timerRobot.Start();
            break;

        case 2:  // backward: both motors negated
            MotorB_Rychlost = -Vzad_rychlost.Value;
            MotorC_Rychlost = -Vzad_rychlost.Value;
            Vzad_aktivita.Value = Vzad_rychlost.Value;
            timerRobot.Start();
            break;

        case 3:  // turn right: inner motor C runs 20 units slower, floor of 2
            MotorB_Rychlost = ZataceniVpravo_rychlost.Value;
            if (ZataceniVpravo_rychlost.Value <= 21)
            {
                MotorC_Rychlost = 2;
            }
            else
            {
                MotorC_Rychlost = ZataceniVpravo_rychlost.Value - 20;
            }
            ZataceniVpravo_aktivita.Value = ZataceniVpravo_rychlost.Value;
            timerRobot.Start();
            break;

        case 4:  // turn left: mirror of case 3
            if (ZataceniVlevo_rychlost.Value <= 21)
            {
                MotorB_Rychlost = 2;
            }
            else
            {
                MotorB_Rychlost = ZataceniVlevo_rychlost.Value - 20;
            }
            MotorC_Rychlost = ZataceniVlevo_rychlost.Value;
            ZataceniVlevo_aktivita.Value = ZataceniVlevo_rychlost.Value;
            timerRobot.Start();
            break;

        case 5:  // reverse while turning right: same rule with negated speeds
            MotorB_Rychlost = -ZataceniVpravo_rychlost.Value;
            if (ZataceniVpravo_rychlost.Value <= 21)
            {
                MotorC_Rychlost = -2;
            }
            else
            {
                MotorC_Rychlost = -(ZataceniVpravo_rychlost.Value - 20);
            }
            CouvaniVpravo_aktivita.Value = CouvaniVpravo_rychlost.Value;
            timerRobot.Start();
            break;

        case 6:  // reverse while turning left
            if (ZataceniVlevo_rychlost.Value <= 21)
            {
                MotorB_Rychlost = -2;
            }
            else
            {
                MotorB_Rychlost = -(ZataceniVlevo_rychlost.Value - 20);
            }
            MotorC_Rychlost = -ZataceniVlevo_rychlost.Value;
            CouvaniVlevo_aktivita.Value = CouvaniVlevo_rychlost.Value;
            timerRobot.Start();
            break;

        default:
            break;
    }
}

Source Code 4: Switch setting the engine speeds
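The turning cases in Source Code 4 all follow one rule: the outer wheel runs at the slider value, while the inner wheel runs 20 units slower, with a floor of 2 so it never stalls. That rule can be isolated as a small function; the following is a Python sketch of the same arithmetic, for illustration only:

```python
def turn_speeds(outer: int) -> tuple:
    """Differential-drive speeds used by the turning cases in Source Code 4.

    The inner wheel is 20 units slower than the outer wheel, but never
    drops below 2, so the robot keeps moving while it turns.
    """
    inner = 2 if outer <= 21 else outer - 20
    return outer, inner
```

At an outer speed of 50 the inner wheel runs at 30; at 10 or below the inner wheel is clamped to 2.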

private void Uloz_Click(object sender, EventArgs e)
{
    int[] Emo = new int[7];
    float[] Num = new float[6];
    int i, j;
    bool k = false;  // error flag

    if (!SpousteniMotoru)  // only while the motors are not running
    {
        Emo[0] = NabidkaEmotiv1.SelectedIndex;

        Emo[1] = NabidkaEmotiv2.SelectedIndex;
        Num[0] = (float)FloatEmotiv2.Value;

        Emo[2] = NabidkaEmotiv3.SelectedIndex;
        Num[1] = (float)FloatEmotiv3.Value;

        Emo[3] = NabidkaEmotiv4.SelectedIndex;
        Num[2] = (float)FloatEmotiv4.Value;

        Emo[4] = NabidkaEmotiv5.SelectedIndex;
        Num[3] = (float)FloatEmotiv5.Value;

        Emo[5] = NabidkaEmotiv6.SelectedIndex;
        Num[4] = (float)FloatEmotiv6.Value;

        Emo[6] = NabidkaEmotiv7.SelectedIndex;
        Num[5] = (float)FloatEmotiv7.Value;

        if (Emo[0] == -1)
        {
            k = true;
            MessageBox.Show("Není vybrána položka Zastavit!");  // "No Stop item is selected!"
        }

        // Reject assignments where two controls share the same signal.
        for (i = 0; i < 6; i++)
        {
            if (Emo[i] != -1)
            {
                for (j = i + 1; j < 7; j++)
                {
                    if (Emo[i] == Emo[j])
                    {
                        k = true;
                        MessageBox.Show("Dva stejné ovládací parametry!");  // "Two identical control parameters!"
                    }
                }
            }
        }

        // Each power value must lie in the 0-1 range.
        for (i = 0; i < 5; i++)
        {
            if (Num[i] < 0 || Num[i] > 1)
            {
                k = true;
                MessageBox.Show("Síla není v rozmezí 0 - 1!");  // "The power is not in the range 0-1!"
            }
        }

        if (!k)
        {
            timer.Start();
        }
    }
}

Source Code 5: Checking the selected items and error messages

switch (NabidkaEmotiv1.SelectedIndex)
{
    case 0:
        Zastaveni = Stavy.ExpressivIsBlink();
        break;

    case 1:
        Zastaveni = Stavy.ExpressivIsLeftWink();
        break;

    case 2:
        Zastaveni = Stavy.ExpressivIsRightWink();
        break;

    case 3:
        Zastaveni = Stavy.ExpressivIsLookingLeft();
        break;

    case 4:
        Zastaveni = Stavy.ExpressivIsLookingRight();
        break;
}

Source Code 6: Switch over expressive states used to stop the robot


APPLICATIONS IN THE EDUCATION FIELD: -

Challenging problems such as this one can develop pupils' interest in science and research, which matters for attracting students to professional activity in the area. The output of this work is an accurately described procedure, applicable in school practice, for using EEG signals to control external devices. The source code presented here can be used as a whole, or it can be tweaked to suit the needs of future users.


CONCLUSIONS AND FUTURE WORK

It was an interesting international experience to be part of this project at Tomas Bata University (TBU).

Controlling external devices using thoughts (and the facial muscles) sounds like science fiction, but progress in this area has already made it possible. It is good that researchers are interested in this topic, because it can help people with handicaps to control various devices.

This project describes the design of a solution for controlling an external robotic system using brain signals scanned with an EEG device. The system is based on processing the EEG signals and converting the output into instructions the external device can understand.

The project presented an affordable human-robot interaction system based on tele-operation with the help of brain-computer interfaces. The experimental prototype allows a robot to be controlled successfully through the user's brain electrical activity in real time, allowing for the development of serious applications. This research focuses on the ways a robotic platform can be controlled for future applications, and studies the way the human brain works (in terms of attention and meditation levels). This kind of research can transform the way we live today by enhancing human potential. Future work will include combining different readings using a more sophisticated device equipped with more electrodes. Moreover, a large sample of users will be tested to qualitatively measure the effectiveness of the system. Furthermore, a third prototype will be developed in which the robot performs more functions with the help of custom BCI devices [16]. Finally, other physiological data will be extracted, including body posture and facial expressions, so that it will be possible to determine the user's visual attention.


References: -

[1] PALANIAPPAN, Ramaswamy. Biological Signal Analysis [online]. Ramaswamy Palaniappan & bookboon.com, 2010 [cited 2016-01-11]. ISBN 978-87-7681-594-3. http://bookboon.com/cs/introduction-to-biological-signal-analysisebook

[2] Recording electrodes. In: Indiana.edu [online]. [cited 2016-04-07]. http://cognitrn.psych.indiana.edu/busey/eegseminar/pdfs/EEGPrimerCh2.pdf

[3] Interpreting EEG Functional Brain Activity. In: FIU [online]. 2004 [cited 2016-04]. https://cate.fiu.edu/sites/default/files/Publications/01266932.pdf

[4] KUMAR, Nitish. Brain Computer Interface. In: Digital Library CUSAT [online]. 2008 [cited 2016-01-13]. http://dspace.cusat.ac.in/jspui/bitstream/123456789/2621/1/Brain%20computer%20interface.pdf

[5] Emotiv EPOC / EPOC+. In: EMOTIV [online]. 2014 [cited 2016-02-01]. https://emotiv.com/epoc.php

[6] 31313 MINDSTORMS EV3. In: Lego Mindstorms [online]. 2016 [cited 2016-02-01]. http://www.lego.com/cs-cz/mindstorms/products/31313-mindstormsev3