
DESIGN AND CONSTRUCTION OF A BRAIN COMPUTER INTERFACE FOR

BI-DIRECTIONAL MOVEMENTS OF A ROBOT

BY

ADEBAYO OLAMIDE

MATRIC NO: 11/ENG02/001

SUBMITTED TO

THE DEPARTMENT OF ELECTRICAL/ELECTRONICS AND COMPUTER

ENGINEERING

AFE BABALOLA UNIVERSITY, ADO-EKITI

EKITI STATE.

IN PARTIAL FULFILMENT OF THE REQUIREMENTS FOR THE AWARD

OF BACHELOR OF ENGINEERING (B.ENG) DEGREE IN COMPUTER

ENGINEERING.

JUNE 2016.
ABSTRACT

A brain-computer interface (BCI) is a system that allows direct communication between a computer

and the human brain, bypassing the body’s normal neuromuscular pathways. The signals recorded

by the system are processed and classified to recognize the intent of the user. Though the main

application for BCIs is in rehabilitation of disabled patients, they are increasingly being used in other

application scenarios as well. One such application is the control of wheelchair movement.

In this project, an effective BCI application is developed to help the physically challenged lead more independent lives using their brain signals, acquired through non-invasive techniques.

DEDICATION

This report is dedicated to Almighty God for His grace, mercy, protection and divine strength from above. I also dedicate it to my family, most especially my father and mother, for their moral and financial support.

ACKNOWLEDGEMENTS

Firstly, I am forever grateful to God for his mercy, grace and guidance to finish this project. I would

like to thank my parents for their continuous love and support.

I would like to thank my supervisors, Dr. Tunde Adegbola and Engr. S. O. Akinola, for their valuable

help and advice throughout this project. They encouraged me, helped me stay focused and

motivated.

I am also thankful to the lecturers of the College of Engineering, who gave up their valuable time to share useful ideas and requirements. They also provided valuable feedback and were very encouraging throughout.

CERTIFICATION

This is to certify that this report was prepared and presented by ADEBAYO, OLAMIDE with

student matriculation number 11/ENG02/001 in the Department of Computer Engineering, College

of Engineering, Afe Babalola University Ado-Ekiti Nigeria during the 2015/2016 academic session

under my supervision.

_______________________ _______________________

Student Date and Sign

_______________________ _______________________

Dr. Tunde Adegbola Date and Sign

_______________________ _______________________

Engr. S. O. Akinola Date and Sign

_______________________ _______________________

Head of Department (HOD) Date and Sign

_______________________ _______________________

External Examiner Date and Sign

TABLE OF CONTENTS

TITLE PAGE ......................................................................................................................................... i

ABSTRACT ..........................................................................................................................................ii

DEDICATION .....................................................................................................................................iii

ACKNOWLEDGEMENTS ................................................................................................................. iv

CERTIFICATION ................................................................................................................................. v

LIST OF FIGURES ............................................................................................................................... x

CHAPTER ONE.................................................................................................................................... 1

INTRODUCTION ................................................................................................................................. 1

1.1 Background ............................................................................................................................... 1

1.1.1 Brain Computer Interface (BCI) System ............................................................................ 2

1.2 Problem Statement .................................................................................................................. 3

1.3 Aim and Objectives ..................................................................................................................... 3

1.3.1 Aim ....................................................................................................................................... 3

1.3.2 Objectives ............................................................................................................................. 3

1.4 Scope of Project...................................................................................................................... 4

1.5 Methodology ........................................................................................................................... 4

CHAPTER TWO ................................................................................................................................... 5

LITERATURE REVIEW ...................................................................................................................... 5

2.1 Introduction ................................................................................................................................. 5

2.2 Brainwaves .................................................................................................................................. 6

2.2.1 Infra-Low.............................................................................................................................. 6

2.2.2 Delta Waves ......................................................................................................................... 7

2.2.3 Theta Waves ......................................................................................................................... 7

2.2.4 Alpha Waves ........................................................................................................................ 7

2.2.5 Beta Waves ........................................................................................................................... 7

2.2.6 Gamma Waves .................................................................................................................... 8

2.3 Neurofeedback............................................................................................................................. 8

2.4 Electroencephalography (EEG) ................................................................................................... 9

2.4.1 Sensorimotor activity ......................................................................................................... 10

2.4.2 P300 .................................................................................................................................... 11

2.4.3 Steady State Visual Evoked Potentials (SSVEPs) ............................................................. 11

2.4.4 Slow Cortical Potentials ..................................................................................................... 11

2.4.5 Responses to Mental Tasks ................................................................................................ 12

2.5 Brain Computer Interface (BCI) ................................................................................................ 12

2.5.1 Milestones in BCI Development ........................................................................................ 13

2.6 Components of a BCI System ................................................................................................... 14

2.6.1 Signal Acquisition ......................................................................................................... 15

2.6.2 Feature Extraction ......................................................................................................... 15

2.6.3 Feature Translation ............................................................................................................. 16

2.6.4 Device Output..................................................................................................................... 16

2.7 The Future of BCIS: Problems and Prospects ....................................................................... 17

2.7.1 Signal-Acquisition Hardware ........................................................................................ 18

2.7.2 Validation and Dissemination ....................................................................................... 18

2.7.3 Reliability ...................................................................................................................... 20

CHAPTER THREE ............................................................................................................................. 22

METHODOLOGY .............................................................................................................................. 22

3.1 Introduction ............................................................................................................................... 22

3.2 Hardware Design ....................................................................................................................... 22

3.2.1 Arduino Uno ....................................................................................................................... 22

3.2.2 An EEG Headset ................................................................................................................ 23

3.2.3 HC-05 Module .................................................................................................................... 24

3.3 Software Design.................................................................................................................... 25

3.3.1 LabVIEW ...................................................................................................................... 25

3.3.2 Arduino IDE ....................................................................................................................... 27

3.3.3 BCI Communication System .............................................................................................. 28

CHAPTER FOUR ............................................................................................................................... 36

RESULTS AND DISCUSSION ..................................................................................................... 36

4.1 Introduction ........................................................................................................................... 36

4.2 Testing of Mindwave Headset ............................................................................................... 36

4.3 Testing of Arduino Uno ............................................................................................................ 37

4.4 Testing of HC-05 module .......................................................................................................... 37

4.5 Testing of LabVIEW ................................................................................................................. 38

4.6 Testing of Motors ...................................................................................................................... 39

4.7 Testing of Overall project.......................................................................................................... 40

CHAPTER FIVE ................................................................................................................................. 42

CONCLUSION, CHALLENGES ENCOUNTERED AND RECOMMENDATION ................... 42

5.1 Conclusion ................................................................................................................................. 42

5.2 Challenges Encountered ............................................................................................................ 42

5.3 Recommendation ....................................................................................................................... 43

REFERENCES .................................................................................................................................... 44

APPENDIX A ..................................................................................................................................... 47

APPENDIX B...................................................................................................................................... 51

LIST OF TABLES

Table 4.1 Mindwave eSense table …………………………………………………………………..42

LIST OF FIGURES

Figure 2.1: Basic design and operation of a BCI system………………………………………….17

Figure 3.1: An Arduino Uno Board showing its properties……………………………………...…..22

Figure 3.2 A NeuroSky Headset (BCI device)……………………..………………………………..24

Figure 3.3 Rear view of HC05 Module……...……………………………………………...……….25

Figure 3.4 LabVIEW front panel used to interact with Mindwave headset………………………....26

Figure 3.4 An Arduino IDE showing a sketch………………………………………………………..28

Figure 3.5 Blink occurring in the second waveform….......……………………………………….....29

Figure 3.6 sketch used to program the HC05 module…….………………………………………....30

Figure 3.7 Sketch showing initial declaration of variables.……………………………………….....32

Figure 3.8 Sketch showing exact code for feature extraction and translation…….…………..……..33

Figure 3.9 Sketch showing code for controlling arduino pins based on attention levels……...……..34

Figure 3.10 Block Diagram of project design.……………………………………………………….34

Figure 3.11 Project Flowchart……………….………………………………………..……………….34

Figure 4.1 Headset’s manufacturer software used in setting up the device………………………….36

Figure 4.2 eSense readings from the arduino serial monitor………………………………………...38

Figure 4.3 DC motors with attached tires……………………………………………………………40

Figure 4.4 Final Project Design……………………………………………………………………...41

CHAPTER ONE

INTRODUCTION

1.1 Background

Until recently, controlling digital devices with human thought belonged to the realm of science fiction. Advances in technology, however, have brought a new reality: today, humans can use brainwaves to interact with devices and even affect their environments. Brain-computer interface (BCI) technology, an emerging field, may allow individuals unable to speak and/or use their limbs to once again communicate or operate assistive devices for walking and manipulating objects.

A BCI is a computer-based system that acquires brain signals, analyzes them, and translates them

into commands that are relayed to an output device to carry out a desired action. Mind-controlled

technology uses this brain-computer interface (BCI) to establish a pathway of communication

between the user’s brain and an external device. It has the potential to augment or even repair

patients’ damaged hearing, sight, or movement (Allison, et al., 2007).

In order to acquire these brain signals, a basic understanding of these signals and the anatomy of the

brain is highly essential. Generally when the brain neurons fire, certain signals are emitted called

brainwaves. These brainwaves are produced by synchronized electrical pulses from masses of

neurons communicating with each other. They are also divided into bandwidths to describe their

functions. Various kinds of brainwaves include alpha waves (8-12 Hz), beta waves (12-38 Hz), gamma waves (38-42 Hz), theta waves (3-8 Hz), and delta waves (<3 Hz) (Lang & Bradley, 2007).

Alpha brainwaves are dominant during quietly flowing thoughts and in some meditative states.

Alpha is the resting state for the brain. Alpha waves aid overall mental coordination, calmness,

alertness, mind/body integration and learning. Beta brainwaves dominate our normal waking state of

consciousness when attention is directed towards cognitive tasks and the outside world. Beta is fast

activity, present when we are alert, attentive, engaged in problem solving, judgement, decision

making, and engaged in focused mental activity (Symphonic Mind LTD, 2009).

1.1.1 Brain Computer Interface (BCI) System

A BCI is a computer-based system that acquires brain signals, analyzes them, and translates them

into commands that are relayed to an output device to carry out a desired action. Thus, BCIs do not

use the brain's normal output pathways of peripheral nerves and muscles. This definition strictly

limits the term BCI to systems that measure and use signals produced by the central nervous system

(CNS). Furthermore, an electroencephalogram (EEG) machine alone is not a BCI because it only

records brain signals but does not generate an output that acts on the user's environment.

BCIs do not read minds in the sense of extracting information from unsuspecting or unwilling users

but enable users to act on the world by using brain signals rather than muscles. The user and the BCI

work together. The user, often after a period of training, generates brain signals that encode

intention, and the BCI, also after training, decodes the signals and translates them into commands to

an output device that accomplishes the user's intention.

In 2006, a microelectrode array was implanted in the primary motor cortex of a young man with

complete tetraplegia after a C3-C4 cervical injury. Using the signals obtained from this electrode

array, a BCI system enabled the patient to open simulated e-mail, operate a television, open and

close a prosthetic hand, and perform rudimentary actions with a robotic arm (Hochberg, et al., 2006).

This reflects the rich promise of BCIs. They may eventually be used routinely to replace or restore useful function for people severely disabled by neuromuscular disorders; they might also improve rehabilitation for people with strokes, head trauma, and other disorders. This exciting future, however, can come about only if BCI researchers and developers continue to engage with and advance this promising research.

1.2 Problem Statement

BCI technology is still the subject of active research and therefore still has many limitations. In Nigeria today, only a handful of people are familiar with BCIs. This project examines how a BCI can be trained to decode brain signals so that people with limited mobility can control movement through a motorized BCI system.

The target users are, in particular, people who cannot move their muscles or who do not have full control over their bodies, but who can still modulate signals produced by the central nervous system (CNS) with their minds. This project aims to devise a promising way for paralyzed patients to control devices such as computers or wheelchairs: by wearing a headset and undergoing training, a user can learn to control a device like a wheelchair by imagining movement of a part of the body, or by triggering commands with specific mental tasks.

1.3 Aim and Objectives

1.3.1 Aim

The aim of this project is to design and implement a brain-computer interface that restores useful function to people severely disabled by neuromuscular disorders, demonstrated through bi-directional movement of a robot.

1.3.2 Objectives

The objectives of this project include the following:

i) To design a BCI system for controlling a robot.

ii) To implement the design for a BCI system.

iii) To implement control processes with LabVIEW and Arduino.

iv) To test the design with a model robot.

1.4 Scope of Project

This project consists of a BCI system which measures mainly alpha and beta brainwaves and sends them to a processing system. The processing system translates the EEG data into output commands that move the robot, which is programmed to move forward and backward. Sensors designed to read delta, gamma, and theta waves are very expensive; for this reason, only alpha and beta waves are used.

1.5 Methodology

The NeuroSky Mindwave BCI (NeuroSky, 2016) sends brainwave and debugging information wirelessly over Bluetooth to the computer system. LabVIEW communicates with the BCI and displays the signals on the computer screen. LabVIEW is a visual programming environment that seamlessly performs signal processing and permits easy development of complex features along with error-handling capabilities. The processed signal is then sent as an output which controls the wheelchair's movement depending on the specific mental task.

Another option is for the Mindwave BCI to send brainwave information wirelessly over Bluetooth to an Arduino microcontroller. The information is processed and debugged in the Arduino IDE, and the Arduino then sends the processed signal out to other digital devices.
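As a rough illustration of the Arduino path just described, the sketch below assumes that an attention value (0-100) has already been extracted from the headset data and uses it to drive two motor-direction pins; the pin numbers, the threshold of 60 and the readAttention() stub are placeholders rather than the final design.

// Minimal sketch of the Arduino control path described above.
// Assumes an attention value (0-100) has already been parsed from the
// headset stream; pin numbers and the threshold are illustrative placeholders.

const int FORWARD_PIN  = 8;            // drives the "move forward" side of the motor driver
const int BACKWARD_PIN = 9;            // drives the "move backward" side of the motor driver
const int ATTENTION_THRESHOLD = 60;

void setup() {
  pinMode(FORWARD_PIN, OUTPUT);
  pinMode(BACKWARD_PIN, OUTPUT);
  Serial.begin(9600);                  // debugging output to the serial monitor
}

void loop() {
  int attention = readAttention();            // placeholder for the real parsing code
  if (attention >= ATTENTION_THRESHOLD) {     // high attention -> move forward
    digitalWrite(FORWARD_PIN, HIGH);
    digitalWrite(BACKWARD_PIN, LOW);
  } else {                                    // low attention -> move backward
    digitalWrite(FORWARD_PIN, LOW);
    digitalWrite(BACKWARD_PIN, HIGH);
  }
  Serial.println(attention);
  delay(100);
}

// Stub so the sketch compiles on its own; the real project replaces this
// with data parsed from the Mindwave stream.
int readAttention() {
  return 0;
}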

CHAPTER TWO

LITERATURE REVIEW

2.1 Introduction

For many years it has been speculated that electroencephalographic activity or other

electrophysiological measures of brain function might provide a new pathway for

sending messages and commands to the external world – a brain–computer interface (BCI). Over the

past 15 years, productive BCI research programs have arisen. This was encouraged by new

understanding of brain function, the advent of powerful low-cost computer equipment and by

growing recognition of the needs and potentials of people with disabilities. Research has

concentrated on developing new communication and control technology for those with severe

neuromuscular disorders.

BCIs today determine the intent of the user from a variety of different electrophysiological signals.

These signals include slow cortical potentials, P300 potentials, and mu or beta rhythms recorded

from the scalp, and cortical neuronal activity recorded by implanted electrodes. They are translated

in real-time into commands that operate a computer display or other device. Successful operation

requires that the user encode commands in these signals and that the BCI derive the commands from

the signals. Thus, the user and the BCI system need to adapt to each other both initially and

continually so as to ensure stable performance.

Current BCIs have maximum information transfer rates up to 10–25 bits/min (Allison, et al., 2007).

This limited capacity can be valuable for people whose severe disabilities prevent them from using

conventional augmentative communication methods. At the same time, many possible applications

of BCI technology, such as neuroprosthesis control, may require higher information transfer rates.

Future progress will depend on recognition that BCI research and development is an

interdisciplinary problem, involving neurobiology, psychology, engineering, mathematics, and

computer science (Shih, et al., 2012). Development of BCI technology will benefit from greater

emphasis on peer-reviewed research publications. BCI systems could eventually provide an

important new communication and control option for those with motor disabilities and might also

give those without disabilities a supplementary control channel or a control channel useful in special

circumstances (Wolpaw, et al., 2002).

2.2 Brainwaves

At the root of all our thoughts, emotions and behaviors is the communication between neurons

within our brains. Brainwaves are produced by synchronized electrical pulses from masses of

neurons communicating with each other. Brainwaves are detected using sensors placed on the scalp.

They are divided into bandwidths to describe their functions but are best thought of as a continuous

spectrum of consciousness; from slow, loud and functional - to fast, subtle, and complex.

It is a handy analogy to think of brainwaves as musical notes - the low frequency waves are like a

deeply penetrating drum beat, while the higher frequency brainwaves are more like a subtle high

pitched flute. Like a symphony, the higher and lower frequencies link and cohere with each other

through harmonics. Our brainwaves change according to what we’re doing and feeling. When

slower brainwaves are dominant we can feel tired, slow, sluggish, or dreamy. The higher frequencies

are dominant when we feel wired or hyper-alert. The descriptions that follow are only broad descriptions - in practice things are far more complex, and brainwaves reflect different aspects when

they occur in different locations in the brain.

Brainwave speed is measured in hertz (cycles per second), and brainwaves are divided into bands delineating slow, moderate, and fast waves (Symphonic Mind LTD, 2009).

2.2.1 Infra-Low

Infra-Low brainwaves are thought to be the basic cortical rhythms that underlie our higher brain

functions. Very little is known about infra-low brainwaves. Their slow nature makes them difficult to

detect and accurately measure, so few studies have been done. They appear to take a major role in

brain timing and network function.

2.2.2 Delta Waves

Delta brainwaves are slow, loud brainwaves (low frequency and deeply penetrating, like a drum

beat). They are generated in deepest meditation and dreamless sleep. Delta waves suspend external

awareness and are the source of empathy. Healing and regeneration are stimulated in this state, and

that is why deep restorative sleep is so essential to the healing process.

2.2.3 Theta Waves

Theta brainwaves occur most often in sleep but are also dominant in deep meditation. They act as our

gateway to learning and memory. In theta, our senses are withdrawn from the external world and

focused on signals originating from within. It is that twilight state which we normally only

experience fleetingly as we wake or drift off to sleep. In theta we are in a dream; vivid imagery,

intuition and information beyond our normal conscious awareness. It’s where we hold our ‘stuff’,

our fears, troubled history, and nightmares.

2.2.4 Alpha Waves

Alpha brainwaves are dominant during quietly flowing thoughts, and in some meditative states.

Alpha is ‘the power of now’, being here, in the present. Alpha is the resting state for the brain. Alpha

waves aid overall mental coordination, calmness, alertness, mind/body integration and learning.

2.2.5 Beta Waves

Beta brainwaves dominate our normal waking state of consciousness when attention is directed

towards cognitive tasks and the outside world. Beta is a ‘fast’ activity, present when we are alert,

attentive, engaged in problem solving, judgment, decision making, and engaged in focused mental

activity. Beta brainwaves are further divided into three bands: Lo-Beta (Beta1, 12-15Hz) can be thought of as a 'fast idle', or musing. Beta (Beta2, 15-22Hz) is high engagement or actively figuring

something out. Hi-Beta (Beta3, 22-38Hz) is highly complex thought, integrating new experiences,

high anxiety, or excitement. Continual high frequency processing is not a very efficient way to run

the brain, as it takes a tremendous amount of energy.

2.2.6 Gamma Waves

Gamma brainwaves are the fastest of brain waves (high frequency, like a flute), and relate to

simultaneous processing of information from different brain areas. They pass information rapidly,

and as the most subtle of the brainwave frequencies, the mind has to be quiet to access

it. Gamma was dismissed as 'spare brain noise' until researchers discovered it was highly active

when in states of universal love, altruism, and the ‘higher virtues’.

Gamma is above the frequency of neuronal firing, so how it is generated remains a mystery. It is

speculated that Gamma rhythms modulate perception and consciousness, and that a greater presence

of Gamma relates to expanded consciousness and spiritual emergence (Symphonic Mind LTD,

2009).
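Purely as an illustration of the band boundaries listed above, the following small C++ helper maps a frequency in hertz to the band names used in this section; the infra-low boundary of 0.5 Hz is an assumption, since the text gives no exact figure, and the code is not part of the project design.

#include <cstdio>

// Illustrative helper only: map a frequency (Hz) to the band names used in
// this section. Band edges follow the ranges quoted above; the infra-low
// boundary (0.5 Hz) is an assumption.
const char* brainwaveBand(double hz) {
  if (hz < 0.5)   return "infra-low";
  if (hz < 3.0)   return "delta";
  if (hz < 8.0)   return "theta";
  if (hz < 12.0)  return "alpha";
  if (hz < 38.0)  return "beta";
  if (hz <= 42.0) return "gamma";
  return "outside the ranges discussed here";
}

int main() {
  double examples[] = {1.0, 5.0, 10.0, 20.0, 40.0};
  for (double hz : examples) {
    std::printf("%.1f Hz -> %s\n", hz, brainwaveBand(hz));
  }
  return 0;
}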

2.3 Neurofeedback

Neurofeedback is a way to quantify and train brain activity; it is brainwave biofeedback. The basic

principles of how neurofeedback works are deceptively simple. Communication between groups of

cells in the brain generates thoughts, sensations, actions and emotions. This activity is detectable in

the form of brainwaves - electrical impulses generated by your brain activity.

During a neurofeedback session, sensors detect your brainwaves to see your brain in action. A

computer compares your brain activity to targets or goals for you to reach. Sounds and images tell

you immediately when your brain reaches your goal and when not - when you are activating or

suppressing the target area of the brain. Through this simple method, you learn how to quiet

brainwaves associated with low performance and increase brainwaves associated with optimal brain

function. Much like physical exercises strengthen and develop specific muscles, the more your brain

is exercised into reaching a new more comfortable, more efficient position, the better and stronger it

gets at it (neuroplasticity). Like any new skill, it simply requires feedback and practice (Symphonic

Mind LTD, 2009).

2.4 Electroencephalography (EEG)

Electroencephalography (EEG) is an electrophysiological monitoring method to record electrical

activity of the brain. It is typically noninvasive, with the electrodes placed along the scalp, although

invasive electrodes are sometimes used in specific applications. EEG measures voltage fluctuations

resulting from ionic current within the neurons of the brain. In clinical contexts, EEG refers to the

recording of the brain's spontaneous electrical activity over a period of time as recorded from

multiple electrodes placed on the scalp. Diagnostic applications generally focus on the spectral

content of EEG, that is, the type of neural oscillations (popularly called "brain waves") that can be

observed in EEG signals. EEG is most often used to diagnose epilepsy, which causes abnormalities

in EEG readings. It is also used to diagnose sleep disorders, coma, encephalopathies, and brain

death. EEG used to be a first-line method of diagnosis for tumors, stroke and other focal brain

disorders, but this use has decreased with the advent of high-resolution anatomical imaging

techniques such as magnetic resonance imaging (MRI) and computed tomography (CT). Despite

limited spatial resolution, EEG continues to be a valuable tool for research and diagnosis, especially

when millisecond-range temporal resolution (not possible with CT or MRI) is required.

Because of high time resolution, noninvasiveness, ease of acquisition, and cost effectiveness, the

electroencephalogram (EEG) is the preferred brain monitoring method in current BCIs.

Electroencephalography (EEG) is the most studied non-invasive interface, mainly due to its

fine temporal resolution, ease of use, portability and low set-up cost. The technology is somewhat

susceptible to noise however. EEG sensors are used to monitor physiological processes reflected in

brain waves which are then translated into control signals for external devices or machines. EEG

sensors have been incorporated into gaming systems that enable a player to control what happens

onscreen with a headset, EEG-controlled exoskeletons translate users’ brain signals into movements,

and implanted electrodes enable patients to control bionic limbs (Niedermeyer & Lopes da Silva, 2004; Coenen, et al., 2014).

Mental activities and their corresponding EEG correlates are termed electrophysiological sources

of control (Nunez & Srinivasan, 1981). The main sources are listed below:

2.4.1 Sensorimotor activity

Mu and beta rhythms (8-12 Hz & 13-30 Hz respectively) originate in the primary sensorimotor

cortex area of the brain and are more prominent when a person is not engaged in processing

sensorimotor inputs or in producing motor outputs. A voluntary movement results in a

desynchronization in the mu and beta bands which is termed ‘event related desynchronization’

(ERD). It begins in the contralateral rolandic region of the brain about 2 seconds prior to the onset of

a movement and becomes bilateral before execution of movement. After the movement, the power in

the brain rhythm increases (event-related synchronization, ERS).

The imagination of motor movements, in particular limb movements is used in several BCIs which

identify the type of motor imagery (right/left hand/foot-movement) using a classification algorithm

that takes as features the power in the mu and beta bands at electrodes located over the primary

sensorimotor cortex. For convenience, we refer hereafter to BCIs relying on motor imagery as

ERD/ERS based BCIs (Friman, et al., 2007).

2.4.2 P300

Infrequent but significant auditory or visual stimuli, when interspersed with frequent stimuli, typically evoke a positive peak at about 300 milliseconds after stimulus presentation in the EEG over the

parietal cortex. This peak is called P300. A P300 appears in the user’s EEG when her/his selected

choice is highlighted. Detecting the choice for which a P300 was elicited allows the BCI to know the

user’s selected choice and execute the corresponding action. The P300 detection algorithm takes as

features the signal samples (after band-pass filtering and subsampling) at parietal electrode sites

(Friman, et al., 2007).

2.4.3 Steady State Visual Evoked Potentials (SSVEPs)

When subjects are presented with repetitive visual stimuli at a rate greater than 5 Hz, a continuous

oscillatory response at the stimulation frequency and/or harmonics is elicited in the visual cortex.

This response is termed steady-state visual evoked potential (SSVEP). SSVEP based BCIs operate

by presenting the user with a set of repetitive visual stimuli at different frequencies which are

associated with actions. To select a desired action, the user needs to focus her/his attention on the

corresponding stimulus. The SSVEP corresponding to the focused stimulus is more prominent and

can be automatically detected by the BCI. Detection of SSVEPs in current BCIs relies on the

application of spatial filters (across electrodes) and temporal filters (Friman, et al., 2007).
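As a simplified illustration of one ingredient of SSVEP detection, estimating the signal power at a known stimulation frequency, the C++ snippet below applies the Goertzel algorithm to a buffer of samples. It is not the full spatial- and temporal-filtering pipeline described by Friman et al.; the 256 Hz sampling rate, one-second window and 12 Hz target are assumed values.

#include <cmath>
#include <cstdio>

const double PI = 3.14159265358979323846;

// Goertzel power estimate at a single target frequency. A real SSVEP
// detector would combine several electrodes, harmonics and spatial filters.
double goertzelPower(const double* x, int n, double targetHz, double fs) {
  double k = 2.0 * std::cos(2.0 * PI * targetHz / fs);
  double s1 = 0.0, s2 = 0.0;
  for (int i = 0; i < n; ++i) {
    double s0 = x[i] + k * s1 - s2;          // Goertzel recurrence
    s2 = s1;
    s1 = s0;
  }
  return s1 * s1 + s2 * s2 - k * s1 * s2;    // squared magnitude at targetHz
}

int main() {
  const int fs = 256, n = 256;               // assumed: one second of data at 256 Hz
  double x[n];
  for (int i = 0; i < n; ++i)
    x[i] = std::sin(2.0 * PI * 12.0 * i / fs);   // synthetic 12 Hz "SSVEP"
  std::printf("power at 12 Hz: %g\n", goertzelPower(x, n, 12.0, fs));
  std::printf("power at 15 Hz: %g\n", goertzelPower(x, n, 15.0, fs));
  return 0;
}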

2.4.4 Slow Cortical Potentials

SCPs are slow non-movement potential changes voluntarily generated by the subject. They reflect

changes in cortical polarization of the EEG lasting from 300ms up to several seconds. Operation of

SCP based BCIs is often of binary nature and relies on the subject’s ability to voluntarily shift

her/his SCP (Friman, et al., 2007).

2.4.5 Responses to Mental Tasks

BCI systems based on non-movement mental tasks assume that different mental tasks (e.g., solving

a multiplication problem, imagining a 3D object, and mental counting) lead to distinct, task-specific

distributions of EEG frequency patterns over the scalp (Friman, et al., 2007).

2.5 Brain Computer Interface (BCI)

The advance of technology has brought a new reality: Today, humans can use the electrical signals

from brain activity to interact with, influence, or change their environments. The emerging field of

BCI technology may allow individuals unable to speak and/or use their limbs to once again

communicate or operate assistive devices for walking and manipulating objects. Brain-computer

interface research is an area of high public awareness. Videos on YouTube as well as news reports in

the lay media indicate intense curiosity and interest in a field that hopefully one day soon will

dramatically improve the lives of many disabled persons affected by a number of different disease

processes.

A brain–computer interface (BCI), sometimes called a mind-machine interface (MMI), direct neural

interface (DNI), or brain–machine interface (BMI), is a direct communication pathway between an

enhanced or wired brain and an external device. BCIs are often directed at researching, mapping,

assisting, augmenting, or repairing human cognitive or sensory-motor functions. (Shih, et al., 2012)

Research on BCIs began in the 1970s at the University of California, Los Angeles (UCLA) under a

grant from the National Science Foundation, followed by a contract from DARPA (Vidal, 1973).

The papers published after this research also mark the first appearance of the expression brain–

computer interface in scientific literature (Vidal, 1977). The field of BCI research and development

has since focused primarily on neuroprosthetics applications that aim at restoring damaged hearing,

sight and movement. Thanks to the remarkable cortical plasticity of the brain, signals from

implanted prostheses can, after adaptation, be handled by the brain like natural sensor or effector

channels (Levine, et al., 2000). Following years of animal experimentation, the first neuroprosthetic

devices implanted in humans appeared in the mid-1990s.

A BCI is a computer-based system that acquires brain signals, analyzes them, and translates them

into commands that are relayed to an output device to carry out a desired action. Thus, BCIs do not

use the brain's normal output pathways of peripheral nerves and muscles. This definition strictly

limits the term BCI to systems that measure and use signals produced by the central nervous system

(CNS). Thus, for example, a voice-activated or muscle-activated communication system is not a

BCI.

Furthermore, an electroencephalogram (EEG) machine alone is not a BCI because it only records

brain signals but does not generate an output that acts on the user's environment. Brain-computer

interfaces do not read minds in the sense of extracting information from unsuspecting or unwilling

users but enable users to act on the world by using brain signals rather than muscles. The user and

the BCI work together. The user, often after a period of training, generates brain signals that encode intention, and the BCI, also after training, decodes the signals and translates them into commands to an output device that accomplishes the user's intention (Wolpaw & Wolpaw, 2012).

2.5.1 Milestones in BCI Development

A question was posed by Vidal in 1977; “Can observable electrical brain signals be put to work as

carriers of information in person-computer communication or for the purpose of controlling devices

such as prostheses?” (Vidal, 1977). His Brain-Computer Interface Project was an early attempt to

evaluate the feasibility of using neuronal signals in a person-computer dialogue that enabled

computers to be a prosthetic extension of the brain. Although work with monkeys in the late 1960s

showed that signals from single cortical neurons can be used to control a meter needle, (Fetz, 1969)

systematic investigations with humans really began in the 1970s.

Initial progress in human BCI research was slow and limited by computer capabilities and our own

knowledge of brain physiology. In 1980, Elbert et al. (1980) demonstrated that persons given

biofeedback sessions of slow cortical potentials in EEG activity can change those potentials to

control the vertical movements of a rocket image traveling across a television screen.

In 1988, Farwell and Donchin (1988) showed how the P300 event-related potential could be used to

allow normal volunteers to spell words on a computer screen. Since the 1950s, the mu and beta

rhythms (i.e., sensorimotor rhythms) recorded over the sensorimotor cortex were known to be

associated with movement or movement imagery (Gastaut, 1952). In the late 1970s, Kuhlman (1978) showed that the mu rhythm can be enhanced by EEG feedback training. Starting from this information, Wolpaw and colleagues (Wolpaw, et al., 2002; Wolpaw & Wolpaw, 2012) trained volunteers to control

sensorimotor rhythm amplitudes and use them to move a cursor on a computer screen accurately in 1

or 2 dimensions.

In 2011, Krusienski and Shih (2011) demonstrated that signals recorded directly from the cortical

surface (electrocorticography [ECoG]) can be translated by a BCI to allow a person to accurately

spell words on a computer screen. Brain-computer interface research is growing at an extremely

rapid rate, as evidenced by the number of peer-reviewed publications in this field over the past 10

years.

2.6 Components of a BCI System

The purpose of a BCI is to detect and quantify features of brain signals that indicate the user's

intentions and to translate these features in real time into device commands that accomplish the

user's intent. To achieve this, a BCI system consists of 4 sequential components (Wolpaw, et al.,

2002): (1) signal acquisition, (2) feature extraction, (3) feature translation, and (4) device output.

These 4 components are controlled by an operating protocol that defines the onset and timing of

operation, the details of signal processing, the nature of the device commands, and the oversight of

performance. An effective operating protocol allows a BCI system to be flexible and to serve the

specific needs of each user.

2.6.1 Signal Acquisition

Signal acquisition is the measurement of brain signals using a particular sensor modality (e.g., scalp

or intracranial electrodes for electrophysiological activity, fMRI for metabolic activity). The signals

are amplified to levels suitable for electronic processing (and they may also be subjected to filtering

to remove electrical noise or other undesirable signal characteristics, such as 60-Hz power line

interference). The signals are then digitized and transmitted to a computer.

2.6.2 Feature Extraction

Feature extraction is the process of analyzing the digital signals to distinguish pertinent signal

characteristics (i.e., signal features related to the person's intent) from extraneous content and

representing them in a compact form suitable for translation into output commands.

These features should have strong correlations with the user's intent. Because much of the relevant

brain activity is either transient or oscillatory, the most commonly extracted signal features in

current BCI systems are time-triggered EEG or ECoG response amplitudes and latencies, power

within specific EEG or ECoG frequency bands, or firing rates of individual cortical neurons.

Environmental artifacts and physiologic artifacts such as electromyographic signals are avoided or

removed to ensure accurate measurement of the brain signal features.

2.6.3 Feature Translation

The resulting signal features are then passed to the feature translation algorithm, which converts the

features into the appropriate commands for the output device (i.e. commands that accomplish the

user's intent). For example, a power decrease in a given frequency band could be translated into an

upward displacement of a computer cursor, or a P300 potential could be translated into selection of

the letter that evoked it. The translation algorithm should be dynamic to accommodate and adapt to

spontaneous or learned changes in the signal features and to ensure that the user's possible range of

feature values covers the full range of device control.

2.6.4 Device Output

The commands from the feature translation algorithm operate the external device, providing

functions such as letter selection, cursor control, robotic arm operation, and so forth. The device

operation provides feedback to the user, thus closing the control loop.

Figure 2.1 Basic design and operation of a BCI system. Source: (Levine, et al., 2000)

Figure 2.1 shows the basic design and operation of a BCI system. Signals from the brain are

acquired by electrodes on the scalp or in the head and processed to extract specific signal features

(e.g. amplitudes of evoked potentials or sensorimotor cortex rhythms, firing rates of cortical

neurons) that reflect the user’s intent. These features are translated into commands that operate a

device (e.g. a simple word processing program, a wheelchair, or a neuroprosthesis). Success depends

on the interaction of two adaptive controllers, user and system (Wolpaw & Wolpaw, 2012).
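To make the four components concrete, the sketch below strings them together as four functions in the order described above. Every stage body is a placeholder, and the translation rule (a feature value above a threshold means "move forward") is only an assumed example, not the method of any particular BCI.

// Skeleton of the four BCI components described above, in order.
// Each stage is a placeholder; the translation rule is an assumed example.

int acquireSample() {                    // 1. Signal acquisition: read one digitized sample
  return analogRead(A0);                 //    analog pin used as a stand-in for a real amplifier
}

int extractFeature(int sample) {         // 2. Feature extraction: reduce raw samples to a feature
  static long sum = 0; static long count = 0;
  sum += sample; count++;
  return (int)(sum / count);             //    crude running average as a stand-in feature
}

char translateFeature(int feature) {     // 3. Feature translation: feature -> device command
  return (feature > 512) ? 'F' : 'S';    //    'F' = forward, 'S' = stop (assumed rule)
}

void driveDevice(char command) {         // 4. Device output: act on the command, give feedback
  digitalWrite(13, command == 'F' ? HIGH : LOW);
  Serial.println(command);               //    feedback to the user closes the control loop
}

void setup() {
  pinMode(13, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  driveDevice(translateFeature(extractFeature(acquireSample())));
}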

2.7 The Future of BCIs: Problems and Prospects

Brain-computer interface research and development generates tremendous excitement in scientists,

engineers, clinicians, and the general public. This excitement reflects the rich promise of BCIs. They

may eventually be used routinely to replace or restore useful function for people severely disabled

by neuromuscular disorders; they might also improve rehabilitation for people with strokes, head

trauma, and other disorders.

At the same time, this exciting future can come about only if BCI researchers and developers engage

and solve problems in 3 critical areas: signal-acquisition hardware, BCI validation and

dissemination, and reliability.

2.7.1 Signal-Acquisition Hardware

All BCI systems depend on the sensors and associated hardware that acquire the brain signals.

Improvements in this hardware are critical to the future of BCIs. Ideally, EEG-based (noninvasive)

BCIs should have electrodes that do not require skin abrasion or conductive gel (i.e., so-called dry

electrodes); be small and fully portable; have comfortable, convenient, and cosmetically acceptable

mountings; be easy to set up; function for many hours without maintenance; perform well in all

environments; operate by telemetry instead of requiring wiring; and interface easily with a wide

range of applications.

In principle, many of these needs could be met with current technology, and dry electrode options

are beginning to become available (e.g. from NeuroSky). The achievement of good performance in

all environments may prove to be the most difficult requirement.

2.7.2 Validation and Dissemination

As work progresses and BCIs begin to enter actual clinical use, two important questions arise: how

good a given BCI can get (e.g. how capable and reliable) and which BCIs are best for which

purposes. To answer the first question, each promising BCI should be optimized and the limits on

users' capabilities with it should be defined. Addressing the second question will require consensus

among research groups in regard to which applications should be used for comparing BCIs and how

performance should be assessed. The most obvious example is the question of whether the

performance of BCIs that use intracortical signals is greatly superior to that of BCIs that use ECoG

signals, or even EEG signals. For many prospective users, invasive BCIs will need to provide much

better performance to be preferable to noninvasive BCIs. It is not yet certain that they can do so. The

data to date do not give a clear answer to this key question (Shih, et al., 2012).

On the one hand, it may turn out that noninvasive EEG- or fNIR-based BCIs are used primarily for

basic communication, while ECoG- or neuron-based BCIs are used for complex movement control.

On the other hand, noninvasive BCIs may prove nearly or equally capable of such complex uses,

while invasive BCIs that are fully implantable (and thus very convenient to use) might be preferred

by some people even for basic communication purposes. At this point, many different outcomes are

possible, and the studies and discussions necessary to select among them have just begun.

The development of BCIs for people with disabilities requires clear validation of their real-life value

in terms of efficacy, practicality (including cost-effectiveness), and impact on quality of life. This

depends on multidisciplinary groups able and willing to undertake lengthy studies of real-life use in

complicated and often difficult environments. Such studies, which are just beginning, are an essential

step if BCIs are to realize their promise. The validation of BCIs for rehabilitation after strokes or in

other disorders will also be demanding and will require careful comparisons with the results of

conventional methods alone.

2.7.3 Reliability

The future of BCI technology certainly depends on improvements in signal acquisition, clear

validation studies and viable dissemination models. However, these issues pale next to those associated with

the problem of reliability. In all hands, no matter the recording method, the signal type, or the signal-

processing algorithm, BCI reliability for all but the simplest applications remains poor. BCIs suitable

for real-life use must be as reliable as natural muscle-based actions. Without major improvements,

the real-life usefulness of BCIs will, at best, remain limited to only the most basic communication

functions for those with the most severe disabilities.

Solving this problem depends on recognizing and engaging 3 fundamental issues: the central role of

adaptive interactions in BCI operation; the desirability of designing BCIs that imitate the distributed

functioning of the normal CNS; and the importance of incorporating additional brain signals and

providing additional sensory feedback.

The acquisition and maintenance of BCI-based skills such as reliable multidimensional movement control require plasticity comparable to that underlying natural muscle-based skills (as described by various investigators). Brain-computer

interface operation rests on the effective interaction of two adaptive controllers, the CNS and the

BCI. The BCI must adapt so that its outputs correspond to the user's intent. At the same time, the

BCI should encourage and facilitate CNS plasticity that improves the precision and reliability with

which the brain signals encode the user's intent. In summary, the BCI and CNS must work together

to acquire and maintain a reliable partnership under all circumstances. The work needed to achieve

this partnership has just begun. It involves fundamental neuro-scientific questions and may yield

important insights into CNS function in general.

Finally, current BCIs provide mainly visual feedback, which is relatively slow and often imprecise.

In contrast, natural muscle-based skills rely on numerous kinds of sensory input (e.g. proprioceptive,

cutaneous, visual, and auditory). Brain-computer interfaces that control applications involving high-

speed complex movements (e.g. limb movement) are likely to benefit from sensory feedback that is

faster and more precise than vision. Efforts to provide such feedback via stimulators in cortex or

elsewhere have begun. The optimal methods will presumably vary with the BCI, the application, and

the user's disability (e.g. peripheral inputs may often be ineffective in people with spinal cord

injuries).

In conclusion, many researchers throughout the world are developing BCI systems that a few years

ago were in the realm of science fiction. These systems use different brain signals, recording

methods, and signal-processing algorithms. They can operate many different devices, from cursors

on computer screens to wheelchairs to robotic arms. A few people with severe disabilities are

already using a BCI for basic communication and control in their daily lives. With better signal-

acquisition hardware, clear clinical validation, viable dissemination models, and, probably most

important, increased reliability, BCIs may become a major new communication and control

technology for people with disabilities—and possibly for the general population also.

CHAPTER THREE

METHODOLOGY

3.1 Introduction

This chapter presents the methodology used in the design and implementation of a BCI for directional movement of a robot. It consists of a hardware phase and a software phase, which are explained below.

3.2 Hardware Design

The major hardware used for this project includes a computer system, an HC-05 Bluetooth module, an EEG headset and an Arduino Uno microcontroller.

3.2.1 Arduino Uno

The Arduino Uno is a microcontroller board based on the ATmega328, with 32 KB of flash memory and a clock speed of 16 MHz. It has 14 digital input/output pins (of which 6 can be used as PWM outputs) and 6 analog inputs. It also has an on-board 16 MHz ceramic resonator, a USB connection, a power jack, and a reset button. It contains everything needed to support the microcontroller; simply connect it to a computer with a USB cable, or power it with an AC-to-DC adapter or a battery, to get started.
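A minimal sketch, shown below, exercises the features just listed (a digital output, one of the PWM outputs and the USB serial link); the pin choices are arbitrary examples, not part of the project wiring.

// Minimal example exercising the Uno features described above:
// a digital output, a PWM output and the USB serial connection.
// Pin choices are arbitrary examples.

const int LED_PIN = 13;   // on-board LED (digital output)
const int PWM_PIN = 9;    // one of the six PWM-capable pins

void setup() {
  pinMode(LED_PIN, OUTPUT);
  pinMode(PWM_PIN, OUTPUT);
  Serial.begin(9600);             // serial link over the USB connection
}

void loop() {
  digitalWrite(LED_PIN, HIGH);    // digital output on
  analogWrite(PWM_PIN, 128);      // ~50% duty cycle on the PWM pin
  Serial.println("LED on");
  delay(500);

  digitalWrite(LED_PIN, LOW);     // digital output off
  analogWrite(PWM_PIN, 0);
  Serial.println("LED off");
  delay(500);
}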

Figure 3.1: An Arduino Uno Board showing its properties. Source: (SparkFun, 2016)

3.2.2 An EEG Headset

The EEG headset used in this project is provided by NeuroSky and it is called the Neurosky

Mindwave Mobile (NeuroSky, 2016). Measuring EEG activity has traditionally required complex

equipment costing a lot of money, but after a lot of research an embeddable biosensor has been

created by NeuroSky. This precisely accurate, portable and noise filtering EEG biosensor collects

electrical signals (not actual thoughts) from the brain to translate brain activity into action. This BCI

device digitizes and amplifies raw analog brain signals to deliver concise input. It features a direct connection to a dry electrode, one EEG channel (plus reference and ground), extremely low-level signal detection, an advanced filter with high noise immunity, and raw EEG output at 512 Hz. The device is also able to read low alpha, high alpha, low beta and high beta waves.

The device consists of eight main parts: ear clip, flexible ear arm, battery area, power switch, adjustable headband, sensor tip, sensor arm and the internal ThinkGear chipset. Figure 3.2 presents the device design. The principle of operation is quite simple. Two dry sensors are used to detect and filter the EEG signals. The sensor tip detects electrical signals from the forehead. At the same time, this sensor picks up ambient noise generated by human muscle, computers, light bulbs, electrical sockets and other electrical devices. The second sensor, the ear clip, serves as ground and reference, which allows the ThinkGear chip to filter out the electrical noise. The device measures the raw

signal, power spectrum (alpha, beta, delta, gamma, theta), attention level, meditation level and blink detection. The raw EEG data is received at a rate of 512 Hz, while the other measured values are updated once per second. Raw EEG data is therefore the main source of information on EEG signals from the Mindwave.
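The headset streams these measurements as ThinkGear packets over its Bluetooth serial link. The sketch below is a hedged example of how an Arduino could pull the attention and meditation values out of that stream; the packet layout (two 0xAA sync bytes, a payload length, the payload and an inverted-sum checksum) and the 57600 baud rate follow NeuroSky's published protocol but should be treated as assumptions to verify against the current documentation.

// Hedged sketch: parse ThinkGear packets from the Mindwave over a serial link.
// Assumed packet layout: 0xAA 0xAA <payloadLength> <payload...> <checksum>,
// where the checksum is the inverted low byte of the payload sum.

byte attention = 0;
byte meditation = 0;

void setup() {
  Serial.begin(57600);                     // assumed baud rate of the Bluetooth link
}

void parsePayload(const byte* payload, int length) {
  int i = 0;
  while (i < length) {
    byte code = payload[i++];
    if (code == 0x04) {                    // attention eSense value (0-100)
      attention = payload[i++];
    } else if (code == 0x05) {             // meditation eSense value (0-100)
      meditation = payload[i++];
    } else if (code >= 0x80) {             // multi-byte value (e.g. raw wave): skip it
      byte len = payload[i++];
      i += len;
    } else {                               // other single-byte values (e.g. poor signal): skip
      i++;
    }
  }
}

void loop() {
  if (Serial.available() < 3) return;
  if (Serial.read() != 0xAA) return;       // first sync byte
  if (Serial.read() != 0xAA) return;       // second sync byte
  int payloadLength = Serial.read();
  if (payloadLength > 169) return;         // invalid length, resynchronize

  byte payload[170];
  int sum = 0;
  for (int i = 0; i < payloadLength; i++) {
    while (!Serial.available()) {}         // wait for each payload byte
    payload[i] = Serial.read();
    sum += payload[i];
  }
  while (!Serial.available()) {}
  byte checksum = Serial.read();
  if (((~sum) & 0xFF) != checksum) return; // discard corrupted packets

  parsePayload(payload, payloadLength);
}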

Figure 3.2 A NeuroSky Headset (BCI device)

3.2.3 HC-05 Module

This is a Bluetooth serial module used to convert a wired serial port into a Bluetooth link. The module has two modes: master and slave. The main function of a Bluetooth serial module is to replace a serial cable in the following cases:

1. Two MCUs (microcontroller units) need to communicate with each other. One connects as the Bluetooth master device while the other connects as the slave device. Once the pair is made, the connection behaves like a serial cable carrying the RXD and TXD signals, and the two modules can use it to communicate with each other.

2. When the MCU is connected to a Bluetooth slave module, it can communicate with the Bluetooth adapter of a computer or smartphone. This creates a virtual serial port between the MCU and the computer or smartphone.

Most Bluetooth devices on the market are slave devices, such as Bluetooth printers and Bluetooth GPS receivers. The HC-05 can be switched between master and slave mode, which allows it to pair with and communicate with slave devices. Communication between two Bluetooth modules requires at least two conditions: (1) the communication must be between a master and a slave, and (2) the password (PIN) must match.
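Before the HC-05 can connect to a slave such as the headset, it has to be put into master mode and given the slave's address and PIN. Below is a hedged example of doing this from an Arduino over a SoftwareSerial link; the 38400 baud AT-mode rate and the command set follow common HC-05 documentation but should be checked against the datasheet for the firmware in use, and the address and PIN shown are placeholders, not real values.

// Hedged example: configure an HC-05 as a Bluetooth master bound to a
// specific slave device. Assumes the module is in AT (command) mode at
// 38400 baud and wired to pins 10/11; address and PIN are placeholders.

#include <SoftwareSerial.h>

SoftwareSerial bt(10, 11);               // RX, TX to the HC-05

void sendCommand(const char* cmd) {
  bt.println(cmd);                       // HC-05 AT commands end with CR/LF
  delay(500);
  while (bt.available()) {
    Serial.write(bt.read());             // echo the module's reply for debugging
  }
}

void setup() {
  Serial.begin(9600);
  bt.begin(38400);                       // assumed AT-mode baud rate

  sendCommand("AT");                     // should answer "OK"
  sendCommand("AT+ROLE=1");              // 1 = master mode
  sendCommand("AT+PSWD=0000");           // pairing PIN expected by the slave (placeholder)
  sendCommand("AT+CMODE=0");             // connect only to the bound address
  sendCommand("AT+BIND=1234,56,ABCDEF"); // placeholder slave address in NAP,UAP,LAP form
}

void loop() {
  // Nothing to do; configuration happens once in setup().
}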

Figure 3.3 Rear view of HC05 Module

3.3 Software Design

The major software used for this project includes the Arduino IDE and LabVIEW.

3.3.1 LabVIEW

LabVIEW (short for Laboratory Virtual Instrument Engineering Workbench) is a system-design platform and development environment for a visual programming language from National Instruments. It is commonly used for data acquisition, instrument control and industrial automation on a variety of platforms.

LabVIEW ties the creation of user interfaces (called front panels) into the development cycle.

LabVIEW programs/subroutines are called virtual instruments (VIs). Each VI has three components:

a block diagram, a front panel and a connector panel. The last is used to represent the VI in the block

diagram of other calling VIs. The front panel is built using controls and indicators. Controls are inputs

– they allow a user to supply information to the VI. Indicators are outputs – they indicate or display

the results based on the inputs given to the VI. The back panel, which is a block diagram contains

the graphical source code. All of the objects placed on the front panel will appear on the back panel

as terminals. The back panel also contains structures and functions which perform operations on

controls and supply data to indicators. The structures and functions are found on the functions palette

and can be placed on the back panel.

A VI can either be run as a program, with the front panel serving as the user interface, or be dropped as a node onto the block diagram of another VI, in which case the front panel defines the inputs and outputs for the node through the connector pane (National Instruments, 2015).

LabVIEW includes extensive support for interfacing with instruments, cameras and other hardware devices. Users interface with hardware either by writing direct bus commands or by using high-level, device-specific drivers that provide native LabVIEW function nodes for controlling the device.

Figure 3.4 LabVIEW front panel used to interact with Mindwave headset. Source: (NI, 2016)

3.3.2 Arduino IDE

The Arduino Integrated Development Environment (IDE) makes it easy to write code and upload it to any board. It runs on Windows, Mac OS X and Linux. The environment is written in Java and based on Processing and other open-source software. The Arduino IDE contains a text editor for writing code, a message area, a text console, a toolbar with buttons for common functions and a series of menus. It connects to the Arduino and Genuino hardware to upload programs and communicate with them. Programs written using Arduino are called sketches. These sketches are written in the text editor and are saved with the file extension .ino. The editor has features for cutting, pasting, searching and replacing text. The message area gives feedback while saving and exporting, and also displays errors. The console displays text output by the Arduino software, including complete error messages and other information. The bottom right-hand corner of the window displays the configured board and serial port. The toolbar buttons allow users to verify and upload programs and to create, open and save sketches. They can also be used to open the serial monitor (Arduino Inc, 2016).

Figure 3.4 an Arduino IDE showing a sketch

3.3.3 BCI Communication System

The BCI system was created using two methods: 1) LabVIEW and 2) Arduino.

1) LabVIEW Method

As described earlier, LabVIEW is a graphical programming environment that performs all sorts of operations, from data acquisition to industrial automation. After the installation of LabVIEW, the next step is to create a block diagram (BD), which contains the graphical code, and then the front panel (FP), which allows a user to interact with the virtual instrument (VI). (Figure 3.4) above provides a good visualization of neurofeedback, which is a very useful tool for the user.

Figure 3.5 Blink occurring in the second waveform. Source: (NI, 2016)

After the Mindwave headset has been paired with the computer system, communication begins. A dynamic link library (DLL), the ThinkGear communications driver, allows LabVIEW to communicate with the headset by listening on a specific virtual serial port and acquiring information from that port; this is the signal acquisition phase. The raw EEG data sent from the headset is a summation of various waves, and LabVIEW sees it as a cluster. The NeuroSky driver for LabVIEW enables this cluster to be broken down into smaller pieces of data; this is the feature extraction phase. Notable information that is extracted includes alpha waves, beta waves and EOG signals (eye blinks). The alpha waves are translated into meditation data and the beta waves into attention data. Blinking causes a spike in the raw EEG of appreciable magnitude compared to the baseline signal without the blink artifact, which makes it possible to detect eye blinks by thresholding the EEG signal. This is the feature translation phase. (Figure 3.5) shows the raw wave as opposed to when a blink occurred.
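The blink-detection step itself is implemented graphically in the LabVIEW block diagram, so it cannot be reproduced here verbatim. Purely as an illustration of the thresholding idea, a minimal Arduino-style sketch might look like the one below; the threshold value, the refractory period and the rawEeg() helper are assumptions for the example, not values taken from the project.

// Illustrative blink detection by amplitude thresholding (not the project's LabVIEW code).
// rawEeg() is a hypothetical stand-in for one raw EEG sample from the headset stream.

const int BLINK_THRESHOLD = 300;            // assumed spike amplitude (device units)
const unsigned long REFRACTORY_MS = 500;    // ignore further spikes briefly after a blink
unsigned long lastBlink = 0;

int rawEeg() {
  return analogRead(A0) - 512;              // placeholder signal source for this sketch
}

bool detectBlink(int sample) {
  // A blink appears as a spike well above the baseline EEG amplitude.
  if (abs(sample) > BLINK_THRESHOLD && millis() - lastBlink > REFRACTORY_MS) {
    lastBlink = millis();
    return true;
  }
  return false;
}

void setup() {}

void loop() {
  if (detectBlink(rawEeg())) {
    // treat the detected blink as a translated control event
  }
}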

The translated data is then used as a control output, which can be sent to a device such as the Arduino Uno microcontroller. This is the device output phase.

2) Arduino Method

This method first involves interfacing the Mindwave headset with the Arduino board. For this to happen, the Arduino board needs Bluetooth functionality, which is where the HC05 serial Bluetooth module comes into play. The HC05 is programmed to act as the master device while the Mindwave headset acts as the slave device. To pair these two devices, the HC05 initiates a Bluetooth connection. This is done by sending AT commands to the HC05 using the Arduino serial monitor; AT commands are instructions used to control a modem. (Figure 3.6) below shows the sketch uploaded to the Arduino to program the HC05.

Figure 3.6 Sketch used to program the HC05 module
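Because Figure 3.6 is reproduced only as a screenshot, a minimal sketch of the same idea is given below as an illustration: it simply bridges the Arduino serial monitor to the HC05 so that AT commands typed by the user are forwarded to the module and its replies are echoed back. The pin assignments and the 38400 baud AT-mode rate are assumptions, and the HC05 must be powered up with its KEY/EN pin held high so that it starts in AT-command mode.

#include <SoftwareSerial.h>

// Assumed wiring: HC05 TX -> Arduino pin 10, HC05 RX -> Arduino pin 11 (through a voltage divider).
SoftwareSerial btSerial(10, 11);   // RX, TX

void setup() {
  Serial.begin(9600);              // serial monitor
  btSerial.begin(38400);           // HC05 AT-command mode commonly uses 38400 baud
  Serial.println("Enter AT commands:");
}

void loop() {
  if (btSerial.available()) {
    Serial.write(btSerial.read()); // echo HC05 replies to the serial monitor
  }
  if (Serial.available()) {
    btSerial.write(Serial.read()); // forward typed characters to the HC05
  }
}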

After the sketch is uploaded to the Arduino, the following AT commands can then be used to give the HC05 its instructions (an example sequence is shown after the list):

 AT+ROLE=0 [Set the module to slave mode. The default mode is slave.]

 AT+ROLE=1 [Set the module to master mode.]

 AT+CMODE=0 [Set the module to pair only with a Bluetooth module at the address specified with AT+BIND. This is the default mode and the one preferred here, since the module keeps searching for the bound device until it is found.]

 AT+CMODE=1 [Set the module to pair with any Bluetooth module (no specified address).]

 AT+PSWD=XXXX [Set the module pairing password. The password must be 4 digits.]

 AT+UART=<baud>,<stop bit>,<parity> [Set the baud rate and serial framing.]

 AT+NAME=<XXXXXX> [Set the name of the device.]

 AT+BIND=<MINDWAVE UNIQUE NUMBER> [Pair only with this address.]
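As an illustration only, a typical configuration sequence for setting the HC05 up as a master bound to the headset could look like the following; the password, baud rate and address shown are placeholders rather than the exact values used in this project:

AT                        (check communication; the module replies OK)
AT+ROLE=1                 (master mode)
AT+PSWD=0000              (pairing password, placeholder)
AT+UART=57600,0,0         (baud rate, stop bit, parity - placeholder values)
AT+CMODE=0                (connect only to the bound address)
AT+BIND=XXXX,XX,XXXXXX    (address of the Mindwave headset, placeholder)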

The HC05 can now initiate a Bluetooth connection to the Mindwave headset, and once this succeeds a connection has been established. The Arduino Uno can now receive raw EEG data from the headset: this is the data acquisition phase.

The Arduino Uno can now receive data, but this raw byte stream is not directly meaningful to the Arduino. For the data to be processed, some computation and programming are required to carry out feature extraction and feature translation. Sample code that performs the feature extraction is provided licence-free by NeuroSky (NeuroSky, 2016). Based on this sample code, an overall sketch was created to implement the feature extraction and translation stages. This sketch is then used for control output.

Figure 3.7 Sketch showing initial declaration of variables

(Figure 3.7) shows the initial declaration of variables needed for the whole BCI system.

(Figure 3.8) shows the exact code used in parsing the raw EEG data coming from the headset. This code should ordinarily reside in a library and be called from there, but due to certain difficulties it was extracted from the library and implemented directly in the Arduino IDE. A simplified outline of the parsing logic is given after the figure caption below.

Figure 3.8 Sketch showing exact code for feature extraction and translation
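Since Figure 3.8 is also reproduced as a screenshot, the sketch below gives a simplified outline of how a ThinkGear packet is typically read (two sync bytes, a payload length, the payload, then a checksum) and how the attention and signal-quality values are picked out of the payload. It is an illustration based on NeuroSky's published stream format, not the exact code used in the project.

// Simplified ThinkGear stream parsing (illustrative; based on NeuroSky's documented packet format).
// A packet looks like: 0xAA 0xAA <length> <payload bytes...> <checksum>.
// Serial is assumed to be connected to the HC05, which relays the headset's byte stream.

int attention = 0;
int poorQuality = 200;

byte readOneByte() {
  while (!Serial.available()) { }            // block until the next byte arrives
  return Serial.read();
}

void readPacket() {
  if (readOneByte() != 0xAA) return;         // first sync byte
  if (readOneByte() != 0xAA) return;         // second sync byte

  byte len = readOneByte();                  // payload length
  if (len > 169) return;                     // invalid length, drop packet

  byte payload[170];
  unsigned int sum = 0;
  for (byte i = 0; i < len; i++) {
    payload[i] = readOneByte();
    sum += payload[i];
  }
  byte checksum = readOneByte();
  if ((byte)(~sum) != checksum) return;      // checksum failed, drop packet

  for (byte i = 0; i < len; i++) {           // walk the payload and pick out values
    switch (payload[i]) {
      case 0x02: poorQuality = payload[++i]; break;   // signal quality (0 means good contact)
      case 0x04: attention   = payload[++i]; break;   // eSense attention (0-100)
      case 0x05: ++i; break;                          // eSense meditation (skipped here)
      case 0x80: i += 1 + payload[i + 1]; break;      // raw wave value: skip length byte + data
      default: break;
    }
  }
}

void setup() {
  Serial.begin(57600);                       // assumed baud rate of the relayed headset stream
}

void loop() {
  readPacket();                              // attention/poorQuality are updated as packets arrive
}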

Figure 3.9 Sketch showing code for controlling arduino pins based on attention levels

(Figure 3.9) shows the code implementing the device output stage; the output device used here is a DC motor. Due to certain difficulties, the stepper motor could not be controlled. A minimal sketch of this control logic is shown below.
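Likewise, since Figure 3.9 is a screenshot, a minimal version of the control logic is sketched below. It maps the attention bands later summarised in Table 4.1 (50-79% forward, 80-100% backward, inactive whenever the poor-quality value is above zero) onto two H-bridge input pins; the pin numbers are assumptions for the illustration, and attention and poorQuality are the values produced by the parsing stage above.

// Illustrative device-output stage: drive the H-bridge inputs from the attention level.
const int MOTOR_FORWARD_PIN  = 5;   // assumed pin
const int MOTOR_BACKWARD_PIN = 6;   // assumed pin

int attention = 0;                  // updated by the feature extraction stage
int poorQuality = 200;              // 0 means a good signal

void driveMotors() {
  if (poorQuality > 0) {                      // poor signal: keep the motors inactive
    digitalWrite(MOTOR_FORWARD_PIN, LOW);
    digitalWrite(MOTOR_BACKWARD_PIN, LOW);
  } else if (attention >= 80) {               // 80-100%: move backwards
    digitalWrite(MOTOR_FORWARD_PIN, LOW);
    digitalWrite(MOTOR_BACKWARD_PIN, HIGH);
  } else if (attention >= 50) {               // 50-79%: move forwards
    digitalWrite(MOTOR_FORWARD_PIN, HIGH);
    digitalWrite(MOTOR_BACKWARD_PIN, LOW);
  } else {                                    // below threshold: stop
    digitalWrite(MOTOR_FORWARD_PIN, LOW);
    digitalWrite(MOTOR_BACKWARD_PIN, LOW);
  }
}

void setup() {
  pinMode(MOTOR_FORWARD_PIN, OUTPUT);
  pinMode(MOTOR_BACKWARD_PIN, OUTPUT);
}

void loop() {
  driveMotors();
}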

Figure 3.10 Block diagram of the project design (brain waves → EEG signal acquisition unit → Bluetooth → signal processing unit → microcontroller → DC motors)
Figure 3.11 Project flowchart (Start → power switch on? If no, stop; if yes, acquire EEG signals using the Brainwave Starter Kit → transmit the digitized values to the Arduino or LabVIEW via Bluetooth → perform signal processing to obtain the corresponding control signals → activate the respective motors)
CHAPTER FOUR

RESULTS AND DISCUSSION

4.1 Introduction

This chapter discusses the various tests carried out in this project. Testing is an important part of any project and is done to uncover unwanted behaviour. Both the hardware and the software were tested, and the results of the tests are discussed in this chapter.

4.2 Testing of Mindwave Headset

The Mindwave headset was first set up using a computer system and the manufacturer's software. It was paired with the computer over Bluetooth, after which the procedures in the manufacturer's software manual were followed to ensure optimal performance from the headset.

Figure 4.1 Headset’s manufacturer software used in setting up the device

4.3 Testing of Arduino Uno

To gain full confidence in the Arduino Uno embedded system, it was first tested using the Arduino IDE, a breadboard, a coloured LED and a resistor. After all the necessary drivers had been installed on the computer system, a simple sketch was uploaded to the board to toggle the LED on and off; a minimal version of such a sketch is shown below. A successful run meant that the embedded system was ready for the next set of operations.
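The sketch below is a minimal illustration of this test; the LED pin number is an assumption, and the LED is wired through a series resistor on the breadboard.

// Minimal LED toggle used to confirm that the board, drivers and IDE work together.
const int LED_PIN = 8;              // assumed digital pin

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  digitalWrite(LED_PIN, HIGH);      // LED on
  delay(500);
  digitalWrite(LED_PIN, LOW);       // LED off
  delay(500);
}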

4.4 Testing of HC-05 module

The HC05 module is a Bluetooth serial module that is expected to provide Bluetooth functionality to the Arduino Uno so that it can interface with the Mindwave headset. The HC05 looks and functions similarly to the HC06 module, so appropriate tests took place to ensure that the right module was being used. While the HC05 can switch between master and slave mode, the HC06 can only function in slave mode.

The module was first programmed from the Arduino so that it could accept AT commands. As AT commands were sent to it from the serial monitor, the device was not able to switch between master and slave mode. This test showed that the module in use was not actually an HC05. A genuine HC05 was later acquired and, after a series of tests and commands, it was able to switch between master and slave mode. After the tests and the series of AT commands sent to the HC05, it paired successfully with the Mindwave headset. This proved the series of testing protocols to be very effective. eSense values were then acquired successfully, as shown in (Figure 4.2) below.

Figure 4.2 eSense readings from the Arduino serial monitor

4.5 Testing of LabVIEW

LabVIEW is a programming environment like the Arduino IDE. To ensure it was up and running, some tests were carried out. LabVIEW was first made to perform basic signal generation and processing tasks. After that was successful, the next test was to ensure that LabVIEW could communicate with external devices. The Mindwave headset was first paired to the computer system over Bluetooth, which creates a virtual serial port; LabVIEW was then expected to listen to the data coming from that port. LabVIEW was not able to listen to the port and therefore failed the test. After debugging and some research, it was discovered that the ThinkGear library had to be loaded inside LabVIEW and that NI-VISA (the LabVIEW driver for communicating with external devices) also had to be installed on the computer system. After these errors were corrected, LabVIEW was able to interact with the Mindwave headset. The NeuroSky driver for LabVIEW was also installed for efficient programming in its environment.

LabVIEW was also tested for interaction with the Arduino, but this initially failed because the Arduino Uno was lacking the LabVIEW firmware. After the firmware was uploaded to the Arduino, interaction with LabVIEW became straightforward. This session of tests also proved very effective in achieving the overall project.

Five participants were assessed, and readings from the attention meter indicate that four of them surpassed the minimum thresholds of 50% for forward motion and 80% for backward motion respectively. These were participants 1, 2, 4 and 5. Participant 3 was unable to attain the attention level required for the set thresholds (see Appendix B for the amplitude and attention meter readings).

4.6 Testing of Motors

Both the DC and stepper motors were tested for precision and control using the Arduino Uno and an H-bridge motor controller. A stepper motor was initially intended to be used, but unlike the DC motor, which only requires highs and lows on its control pins, the stepper motor presented too many difficulties in control and hence was not the preferred choice of motor. The DC motors were tested using the Arduino and a motor controller; a minimal test sketch of the kind used is shown below.
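The sketch below is only an illustration of this test; the two H-bridge input pins are assumptions, and it simply runs one motor channel forward, stops it, then runs it backward.

// Simple DC motor test through an H-bridge: forward, stop, backward, stop.
const int IN1 = 5;                  // assumed H-bridge input pins for one motor channel
const int IN2 = 6;

void setup() {
  pinMode(IN1, OUTPUT);
  pinMode(IN2, OUTPUT);
}

void loop() {
  digitalWrite(IN1, HIGH); digitalWrite(IN2, LOW);   // forward
  delay(2000);
  digitalWrite(IN1, LOW);  digitalWrite(IN2, LOW);   // stop
  delay(1000);
  digitalWrite(IN1, LOW);  digitalWrite(IN2, HIGH);  // backward
  delay(2000);
  digitalWrite(IN1, LOW);  digitalWrite(IN2, LOW);   // stop
  delay(1000);
}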

Figure 4.3 DC motors with attached tires

4.7 Testing of Overall project

After all the individual components had been tested, it was time to test them together as a unit. The Arduino method of the BCI communication system was the one mainly implemented in this project. It involves first interfacing the HC05 with the Arduino Uno to make one unit, then interfacing that unit with the Mindwave headset. Once the Mindwave headset is powered on, the Arduino is powered on and the HC05 initiates a connection to the headset using its unique identifier. The HC05 should blink twice per second to indicate pairing, while the Mindwave headset should flash blue steadily to confirm that both devices have paired. The TX LED on the Arduino should then start blinking to show that the headset is transmitting data to the Arduino. The Arduino then controls the motors based on the sketches that have been uploaded to the board. (Table 4.1) shows the thresholds at which the motors go forward, go backward, or are either active or inactive.

Figure 4.4 Final Project Design

S/N   eSense Active State   Reading (%)   Motion/State
1     Attention 1           50 - 79       Forwards
2     Attention 2           80 - 100      Backwards
3     Poor Quality 1        > 0           Inactive
4     Poor Quality 2        0             Active

Table 4.1 Mindwave eSense table for attention, meditation and poor quality based on the final code

CHAPTER FIVE

CONCLUSION, CHALLENGES ENCOUNTERED AND RECOMMENDATION

5.1 Conclusion

This chapter concludes this report. The project has been an interesting piece of work to manage, requiring a substantial amount of effort, but it resulted in learning many skills that will be essential in the future. The project demonstrates how to develop a fully functional BCI system from scratch, and it made use of techniques such as embedded systems programming, wireless communication and graphical programming, among others. The project can be considered successful, as it accomplished its set objectives.

5.2 Challenges Encountered

During the course of development, the following issues were encountered:

 Programming in LabVIEW was quite challenging. It took a lot of time and energy to write the final code to interact with the Mindwave headset. Downloading the LabVIEW installer was an even bigger challenge because no fast network connection was available.

 Interfacing the Mindwave headset with LabVIEW was also challenging, as there was insufficient documentation and prior knowledge of what needed to be done.

 Interfacing the HC05 with the Mindwave headset was the most challenging task of all. Firstly, the HC06 was acquired and tested with the headset, but it did not work; after some testing it was realised that an HC05 was needed. Getting the HC05 was quite hard because it was not available in any market in Nigeria. Another option was the RN-42 Bluetooth module, but it had to be shipped from overseas and the shipping cost was more expensive than the module itself. The HC05 was fortunately acquired, and it connected to the Mindwave headset.

 The Arduino code constantly produced compiler errors. It took a while, but after much work and diligent research the code compiled and was used to achieve the set objectives.

 The initial setup of the Mindwave headset was highly challenging and frustrating. A counterfeit Bluetooth dongle was first acquired for the computer system, and with it pairing the Mindwave headset with the PC proved impossible. After that set of frustrating events, a new Bluetooth dongle was acquired; it took a while, but the Mindwave headset eventually connected.

 A lot of resources, including time, were needed to carry out this project. Substandard batteries had to be bought every day to power the headset, as that was the only available option. Mobile data had to be bought to download the needed software because no wireless local area network was available. The motors and the Arduino board used for this project also came with quite an expensive price tag.

5.3 Recommendation

This project has substantial room for improvement. To make it more robust and suitable for further applications in the future, the following should be considered:

 Signal-acquisition hardware: All BCI systems depend on the sensors and associated

hardware that acquire the brain signals. Improvements in this hardware are critical to the

future of BCIs. Ideally, EEG-based (noninvasive) BCIs should have electrodes that do not

require skin abrasion or conductive gel (i.e., so-called dry electrodes); be small and fully

portable; have comfortable, convenient, and cosmetically acceptable mountings; be easy to

set up; function for many hours without maintenance; perform well in all environments;

operate by telemetry instead of requiring wiring; and interface easily with a wide range of

applications.

 Reliability: The BCI to be used must be very reliable. The future of BCI technology certainly depends on improvements in signal acquisition. The system should be able to acquire more precise data and transmit it at a higher rate without failing.

 Home automation BCI: The processed brain signals can be used as control signals for different home appliances, e.g. turning on lights, placing an emergency call, etc.

 Control of robot movement in 2D: More sensors can be used to acquire brain signals from different portions of the brain. These signals can then be processed in a similar manner and mapped to control signals to obtain robotic movement in the other dimension (left, right, etc.).

REFERENCES

Akhtari, M., Bryant, C. H. & Mamelak, N. A., 2000. Conductivities of three-layer human skull. Brain Topography, 13(1), pp. 29-42.

Allison, B., Wolpaw, E. & Wolpaw, J., 2007. Brain computer interface systems: Progress and prospects. Expert Review of Medical Devices, 4(4), pp. 463-474.

Anon., 2008. The Human Brain in Numbers, s.l.: s.n.

Arduino Inc, 2016. Arduino software IDE. [Online]


Available at: www.arduino.cc/en/guide/environment
[Accessed 2016].

Berger, H., 1929. Über das Elektrenkephalogramm des Menschen. Arch Psychiatr Nervenkr, Volume 87, pp. 527-570.

Coenen, A., Fine, E. & Zayachkivska, O., 2014. Pioneer in electroencephalography. Journal of the History of the Neurosciences, Issue 23, pp. 276-286.

Elbert, T., Rockstroh, B., Lutzenberger, W. & Birbaumer, N., 1980. Biofeedback of slow cortical potentials. Electroencephalography and Clinical Neurophysiology, 48(3), pp. 293-301.

Niedermeyer, E. & Lopes da Silva, F., 2004. Electroencephalography: Basic Principles, Clinical Applications, and Related Fields. s.l.: s.n.

Farwell, L. & Donchin, E., 1988. Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials. Electroencephalography and Clinical Neurophysiology, 70(6), pp. 510-523.

Fetz, E. E., 1969. Operant conditioning of cortical unit activity. Science, 163(3870), pp. 955-958.

Friman, O., Volosyak, I. & Gräser, A., 2007. Multiple channel detection of steady-state visual evoked potentials for brain-computer interfaces. IEEE Transactions on Biomedical Engineering, 54(4), pp. 742-750.

Gastaut, H., 1952. Electrocorticographic study of the reactivity of rolandic rhythm. Volume 87, p.
176–182.

Guger Technologies, 2011. Guger Tech. [Online]


Available at: http://www.gtec.at
[Accessed 5 September 2011].

Heinrich, H., Gevensleben, H. & Strehl, U., 2007. Neurofeedback – train your brain to train behaviour. Journal of Child Psychology and Psychiatry, 48(1), pp. 3-16.

Hochberg, L., Serruya, M. D. & Friehs, G. M., 2006. Neuronal ensemble control of prosthetic
devices by a human with tetraplegia.. Nature, 442(7099), pp. 164-171.

Krusienski, D. J. & Shih, J. J., 2011. Control of a visual keyboard using an electrocorticographic brain-computer interface. Neurorehabilitation and Neural Repair, 25(4), pp. 323-331.

Kuhlman, W., 1978. EEG feedback training: enhancement of somatosensory cortical activity. Electroencephalography and Clinical Neurophysiology, 45(2), pp. 290-294.

Lang, P. J. & Bradley, M. M., 2007. The handbook of emotion elicitation and assessment, England: Oxford University Press.

Levine, S. et al., 2000. A direct brain interface based on event-related potentials. IEEE transactions
on rehabilitation engineering : a publication of the IEEE Engineering in Medicine and Biology
Society, 8(2), pp. 180-5.

Michele, B. et al., 2008. System that allows monkeys to control robot arms via brain signals. s.l.: Pitt Chronicle.

National Instruments, 2015. NI.com. [Online]
Available at: www.ni.com

NeuroSky, I., 2016. Mindwave mobile. [Online]


Available at: http://developer.neurosky.com/docs/doku.php?id=mindwave_mobile_and_arduino
[Accessed 2016].

NeuroSky, I. A., 2010. ThinkGear Socket Protocol (Tech. Rep.). [Online]


Available at: http://www.neurosky.com
[Accessed 24 03 2016].

NI, 2016. The mastermind project. [Online]


Available at: decibel.ni.com
[Accessed 2016].

Nunez, P. L. & Srinivasan, R., 1981. Electric fields of the brain: The neurophysics of EEG., s.l.:
Oxford University Press.

Schlögl, A., Slater, M. & Pfurtscheller, G., 2002. Presence research and EEG, s.l.: s.n.

Shih, J. J. et al., 2012. Brain-computer interfaces in medicine. Mayo Clinic Proceedings, 87(3), pp. 268-279.

SparkFun, 2016. What is an Arduino?. [Online]
Available at: https://learn.sparkfun.com/tutorials/what-is-an-arduino

Symphonic Mind LTD, 2009. What are Brainwaves?. [Online]
Available at: http://www.brainworksneurotherapy.com/what-are-brainwaves

Symphonic Mind LTD, 2009. What is Neurofeedback?. [Online]


Available at: http://www.brainworksneurotherapy.com/what-is-neurofeedback

Vidal, J., 1973. Toward direct brain-computer communication. Annual Review of Biophysics and
Bioengineering, 2(1), pp. 157-80.

Vidal, J., 1977. "Real-Time Detection of Brain Events in EEG". IEEE Proceedings, 65(5), pp. 633-
641.

Whittingstall, K. & Logothetis, N. K., 2009. Frequency-band coupling in surface EEG reflects spiking activity in monkey visual cortex. Neuron, 64(2), pp. 281-289.

Wolpaw, J. et al., 2002. Brain-computer interfaces for communication and control. Clinical
neurophysiology, 113(6), pp. 767-791.

Wolpaw, J. & Wolpaw, E., 2012. Brain-computer interfaces: Something new under the sun. England: Oxford University Press.

APPENDIX A

Arduino program to acquire data from the Mindwave headset and send control output to respective

motors is shown below:

(The full program listing appears as code screenshots on pages 47-50 of the original report.)
APPENDIX B
Diagrams showing attention analysis and their corresponding raw data for five different people

Participant 1

Participant 2

Participant 3

Participant 4

Participant 5

