Hand Gesture Based Robotic Mobile Loader

AHMED DAUD (DDP-FA10-BTE-002)
AISHA BATOOL (DDP-FA10-BTE-001)
HAMAYUN KHAN (DDP-FA10-BTE-020)

SPRING 2014

Project Advisor
Dr. Mujtaba Hussain Jaffery
PROJECT REPORT

PROJECT ID: 1905
TITLE: Hand Gesture Based Robotic Mobile Loader
NUMBER OF MEMBERS: 3
TERM: SPRING 2014
SUPERVISOR NAME: Dr. Mujtaba Hussain Jaffery

MEMBER NAME | REG. NO. | EMAIL ADDRESS
Ahmed Daud | CIIT/DDP-FA10-BTE-002 | ahmeddaud@hotmail.com
Aisha Batool | CIIT/DDP-FA10-BTE-001 | aishasheikh@yahoo.co.uk
Hamayun Khan | CIIT/DDP-FA10-BTE-020 | Hamayunkhan@ymail.com
CHECKLIST:
Number of pages in this report: 56
YES / NO
YES / NO
We confirm that this project is free from any type of plagiarism and misuse of copyrighted material. YES / NO
MEMBERS' SIGNATURES
Supervisor's Signature
This work, entitled "Hand Gesture Based Robotic Mobile Loader", has been approved for the award of
SPRING 2014
External Examiner:
Head of Department:
Declaration
No portion of the work referred to in the dissertation has been submitted in support of an
application for another degree or qualification of this or any other university/institute or
other institution of learning.
MEMBERS' SIGNATURES
Acknowledgements
We are grateful to ALLAH Almighty, the most kind and most merciful, who provides us with resources of every kind so that we can make proper use of them for the benefit of mankind. We would like to thank our parents for their prayers and moral support; they kept backing us up at all times, both financially and morally. We are thankful to our worthy Project Advisor, Dr. Mujtaba Hussain Jaffery, who helped, encouraged, guided and motivated us all the way through the difficult journey of accomplishing this project. We eulogize Dr. Mujtaba's motivation and kind guidance, which transformed the dream of this project into reality. Without his supervision we would have been nowhere. Moreover, we would like to thank the faculty members of the Electrical Engineering Department for their endless and valuable support during our final year of the degree programme. Furthermore, we would like to thank the project lab staff for their help and support in completing this project.
Abstract
The Kinect sensor is widely used for human motion recognition in video gaming. In this project, the Kinect sensor is used as a ubiquitous human-machine interface for the motion control of a mobile robot according to the movements of the hands (gestures). The robot is in the form of a loader vehicle, which can be used for the transportation of goods. After decision making based on gesture recognition, the corresponding command is sent wirelessly to the robot, and the action is performed according to the movement of the hands. Moreover, a Graphical User Interface (GUI) for monitoring and controlling the robot from a computer is also implemented in this project. Obstacle avoidance using sonar sensors is employed to avoid hurdles in the robot's path. The robot can also be controlled using speech recognition.
Table of Contents
1 INTRODUCTION........................................................................................................................ 1
2.1 ROBOTICS .............................................................................................................................. 5
2.4 WIRELESS COMMUNICATION................................................................................................... 7
3.1.1 Requirements .................................................................................................................. 12
3.1.2 Design ............................................................................................................................. 12
3.1.3 X-CTU .............................................................................................................................. 12
3.1.4 Functionality.................................................................................................................... 13
4.2 COMPONENTS ....................................................................................................................... 23
4.2.2 Arduino ........................................................................................................................... 25
4.2.4 H-Bridge .......................................................................................................................... 27
4.2.6 Batteries.......................................................................................................................... 30
4.2.7 Distance sensor................................................................................................................ 31
4.2.8 Motors ............................................................................................................................ 32
4.2.9 LCD Display...................................................................................................................... 33
4.2.10 Gas Sensor................................................................................................................... 34
4.3 ROBOT DESIGN..................................................................................................................... 34
4.3.3 Functionality.................................................................................................................... 37
4.3.4 Dimensions ...................................................................................................................... 37
5 IMPLEMENTATION ................................................................................................................ 39
5.1 TOOLS.................................................................................................................................. 39
5.2.8 System Integration........................................................................................................... 42
5.3 HURDLES ............................................................................................................................. 43
6 TESTING ................................................................................................................................... 44
6.5 SLIPPING .............................................................................................................................. 45
7 CONCLUSIONS ...................................................................................................................... 46
REFERENCES ................................................................................................................................... 47
APPENDIX A: SOURCE CODE ....................................................................................................... 52
APPENDIX B: HARDWARE SCHEMATICS .................................................................................. 53
APPENDIX C: LIST OF COMPONENTS ........................................................................................ 55
Table of Figures
FIGURE 1-1 A STANDARD KINECT .................................................................................................................... 2
FIGURE 1-2 OVERVIEW OF ENTIRE SYSTEM ........................................................................................................ 3
FIGURE 3-1 X-CTU ................................................................................................................................... 13
FIGURE 3-2 THE VIRTUAL REALITY INTERFACE .................................................................................................. 15
FIGURE 3-3 UML USE-CASE DIAGRAM .......................................................................................................... 16
FIGURE 3-4 IMAGE PROCESSING DIAGRAM...................................................................................................... 16
FIGURE 3-5 IMAGE PROCESSING DIAGRAM FOR DECISION MAKING ....................................................................... 17
FIGURE 3-6 C SHARP BASED GUI.................................................................................................................. 17
FIGURE 3-7 C SHARP BASED SPEECH CONTROL ................................................................................................ 19
FIGURE 3-8 VR MODULE ............................................................................................................................ 20
FIGURE 3-9 VR COMMANDER SOFTWARE ....................................................................................................... 21
FIGURE 4-1 ARCHITECTURE OVERVIEW OF SYSTEM ............................................................................................ 23
FIGURE 4-2 KINECT INTERNAL HARDWARE ..................................................................................... 24
FIGURE 4-3 ARDUINO UNO R3 FRONT ........................................................................................................... 25
FIGURE 4-4 ARDUINO MEGA 2560 R3 FRONT................................................................................................. 26
FIGURE 4-5 ARDUINO SOFTWARE ................................................................................................................. 27
FIGURE 4-6 L298 MOTOR DRIVER ................................................................................................................. 28
FIGURE 4-7 L298 CIRCUIT DIAGRAM ............................................................................................................. 28
FIGURE 4-8 XBEE MODULE .......................................................................................................................... 29
FIGURE 4-9 12V LEAD ACID BATTERY ............................................................................................................ 30
FIGURE 4-10 HC-SR04 SENSOR................................................................................................................... 31
FIGURE 4-11 PITTMAN GEAR MOTOR ......................................................................................................... 32
FIGURE 4-12 BASIC 20X4 CHARACTER LCD .................................................................................................... 33
FIGURE 4-13 GAS SENSOR - MQ-4 ............................................................................................................... 34
FIGURE 4-14 ROBOT BASE DESIGN................................................................................................................ 35
FIGURE 4-15 STRUCTURAL DESIGN................................................................................................................ 36
FIGURE 4-16 ROBOT TOP VIEW.................................................................................................................... 37
FIGURE 4-17 LIFTER OF ROBOT .................................................................................................................... 38
FIGURE 5-1 DEVELOPING STAGE ................................................................................................................... 39
FIGURE 5-2 ROBOT STRUCTURE .................................................................................................................... 40
FIGURE 5-3 GUI ....................................................................................................................................... 41
FIGURE 5-4 INTEGRATION OF ALL MODULES ..................................................................................................... 42
FIGURE 6-1 X-CTU, ZIGBEE COORDINATOR, ZIGBEE ROUTER AND ITS CONNECTION WITH ARDUINO .............................. 44
FIGURE 10-1 SCHEMATIC H BRIDGE .............................................................................................................. 53
FIGURE 10-2 H BRIDGE PCB DESIGN............................................................................................................. 53
Table of Tables
TABLE 2-1 COMPARISON OF BLUETOOTH, WI-FI, AND ZIGBEE PROTOCOLS ............................................................... 8
TABLE 4-1 KINECT SPECIFICATIONS ................................................................................................................ 25
TABLE 4-2 ARDUINO UNO SPECIFICATIONS...................................................................................................... 26
TABLE 4-3 ARDUINO MEGA SPECIFICATIONS .................................................................................................... 26
TABLE 4-4 XBEE SERIES 2 SPECIFICATIONS ....................................................................................................... 29
TABLE 4-5 LEAD ACID BATTERY SPECIFICATION................................................................................................. 30
TABLE 4-6 DISTANCE SENSOR SPECIFICATIONS ................................................................................................. 32
TABLE 4-7 ROBOT DIMENSIONS ................................................................................................................... 38
Chapter 1
1 Introduction
In today's era, the robotic industry has been evolving many new trends to increase the efficiency, approachability and accuracy of its systems. Robots are used to do jobs that are hazardous to humans, and repetitive jobs that are tedious or hectic. Although robots can be used to replace humans, they still need to be controlled by humans. Besides controlling a robotic system through physical devices, recent techniques of controlling robotic systems through gesture and speech have become very popular. The core purpose of using gestures and speech is that they are a more natural way of controlling and provide an intuitive form of interface with the robotic system. Automation is an essential part of robotics today. Automated robots can perform the tasks of loading and unloading weights according to need. The motivation behind the development of this project was to create a prototype for testing different human-machine interfaces. This project contributes to the field of robotics by integrating the following areas:
i. Speech Recognition
ii. Gesture Recognition
iii.
iv. Remote Monitoring
During the development of this project we followed a modular approach, because modularity provides flexibility for further expansion, research and testing through hardware and software changes in the existing system. The main module used for gesture and speech recognition was the Kinect sensor, which is equipped with an RGB camera, a depth sensor and an array of microphones. A 3D camera enables it to act as a depth sensor as well. It can also perform colour recognition. The algorithm used in this project enables it to act both as a depth sensor and an RGB colour sensor [2].
Kinect's gesture recognition is also used in the gesture-based control module of the project. Full details of the Kinect's specifications and functionality are discussed in a later chapter.
Commands are sent wirelessly to the robot from the computer; the robot processes those commands and performs the particular task accordingly.
An on-robot obstacle detection and avoidance mechanism is also implemented. When an obstacle comes in front of the robot, the robot stops to avoid a collision. Moreover, the robot doesn't move backwards when an obstacle is present behind it, it doesn't turn left if an obstacle is present on its left side, and in the same way it doesn't turn right if an obstacle is present on its right side.
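This avoidance behaviour can be sketched as a small gating function; the names, the safety threshold, and the four-sensor layout here are illustrative assumptions rather than the project's actual Arduino firmware:

```cpp
#include <cassert>

// Commands the operator can request.
enum Command { FORWARD, BACKWARD, LEFT, RIGHT, STOP };

// Latest readings of the four sonar sensors, in centimetres.
struct Sonar { int front, back, left, right; };

const int SAFE_CM = 30;  // assumed safety threshold, not the project's tuned value

// A requested move is replaced by STOP when an obstacle blocks that
// direction; STOP itself always passes through.
Command gate(Command requested, Sonar s) {
    switch (requested) {
        case FORWARD:  return s.front > SAFE_CM ? FORWARD  : STOP;
        case BACKWARD: return s.back  > SAFE_CM ? BACKWARD : STOP;
        case LEFT:     return s.left  > SAFE_CM ? LEFT     : STOP;
        case RIGHT:    return s.right > SAFE_CM ? RIGHT    : STOP;
        default:       return STOP;
    }
}
```

In this design the operator's command is filtered on every control cycle, so the robot stops as soon as any sensor reports an obstacle in the requested direction.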
The robot can also be controlled through a graphical user interface developed in C#.
Wireless monitoring of the robot's status is also implemented. The current status of the robot, the battery voltage, obstacle warnings, natural gas warnings, distance sensor data and the orientation of the robot can all be monitored remotely. An overview of the entire system in block diagram form is displayed in figure 1-2.
ii. Motion Detection
iii.
iv. Speech Recognition
Chapter 3:
This chapter explains the software architecture of the project, including computer vision, the different control techniques, the communication system, and the various Human-Machine Interface modules deployed in the project.
Chapter 4:
This chapter explains the hardware architecture of the project, giving a brief introduction to the different hardware components used and describing the construction of the robot.
Chapter 5:
This chapter details the implementation of this project in the field of control.
Chapter 6:
This chapter includes the testing of robot.
Chapter 7:
This chapter concludes the project and gives a brief overview of everything that has been achieved
through it. It also discusses the future work related to this project.
Chapter 2
2 Literature Review
Lately, a significant amount of the time and resources spent in the field of computing has been directed into vision and speech. What has drawn so much interest into these subfields is the immeasurable ease and accuracy with which the human brain accomplishes certain tasks related to vision and speech. The underlying idea is that the human brain is essentially an information processing device, much like the modern computer. Computer scientists have based many of their ideas and inspirations on research into speech and computer vision. Although biologists have long delved into and unravelled a proportion of the mysteries of the brain, we are not yet capable of reproducing its functionality with a machine. This project aims to have some of the human brain's most important image processing and speech recognition operations accomplished by a computer. It provides some insight into several basic techniques used in fields of computing such as image processing and speech recognition, along with a few more advanced approaches to perform basic obstacle detection for a vehicle [3].
2.1 Robotics
A robot can be defined as a mechanical device which performs automated tasks, either under direct human supervision, according to a pre-defined program, or following a set of general guidelines using artificial intelligence techniques. The first commercial robot was used in the automotive industry by Ford in 1961. Robots were predominantly intended to substitute for humans in repetitive, heavy and lethal processes. Nowadays, for economic reasons, industrial robots are purposely used in many diverse applications [4].
Robotics is one of the most exciting fields, and it has undergone constant development and advancement. Robotics is interdisciplinary and has become ever more a part of our lives; robots are no longer a dream for the future but a reality of the present. In our everyday lives, robots fill many essential roles, from mowing the lawn, guarding our unoccupied houses and feeding our pets to building our cars. It is clear that they are everywhere, and in the long run they will become even more present [5].
Within robotics, special consideration is given to mobile robots, since they have the ability to navigate their surroundings and are not fixed to one physical location. The tasks they can perform are endless, and the potential ways for them to help our lives are incalculable (e.g., unmanned aerial vehicles, autonomous underwater vehicles). Recently we have seen a real growth in mobile robots. Nowadays they are the subject of significant research activity, being present in industry, military, security and even home environments as consumer products for entertainment and home-maintenance tasks. The mobile robotics field has many challenges that need to be addressed: navigation (mapping, localization, path planning), environment observation (sensors), autonomous behaviour (actuators, behaviour rules, control mechanisms), two-legged locomotion (balance), and the ability to operate for an extended period without human intervention (batteries, human help) are examples of open research opportunities, among many others. Some of these challenges are addressed in this thesis [6].
Each gesture corresponds to a different robot control command. The wireless module is then used to send these commands to the robot controller. Accordingly, the robot performs actions, and thus human-robot interaction is achieved. The gesture recognition system is developed with the help of the OpenNI libraries. Moreover, each speech command corresponds to a different robot control command, and the wireless module is used in the same way to send these commands to the robot controller to perform the desired actions.
We have selected ZigBee for communication between the computer and the robot. ZigBee is the only standard wireless technology intended to address the exclusive requirements of low-cost, low-power wireless sensor and control networks in just about any market. ZigBee can be utilized almost anywhere, is easy to implement, and requires little power to operate. As our robot is a mobile system, our main concerns were power consumption and range, so ZigBee was the best choice under the given conditions.
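For reference, a point-to-point ZigBee link of the kind used here is typically configured in X-CTU with a handful of settings; the values below are an illustrative sketch (the PAN ID is arbitrary, but must match on both modules), not the project's actual configuration:

```text
# Typical X-CTU settings for a point-to-point XBee S2 link (illustrative)

Coordinator (ZigBee Coordinator AT firmware)
  ID (PAN ID)         = 2014
  DH / DL (dest.)     = 0 / FFFF    # broadcast, or the router's SH/SL

Router (ZigBee Router AT firmware)
  ID (PAN ID)         = 2014
  DH / DL (dest.)     = 0 / 0       # address the coordinator
  JV (channel verify) = 1           # rejoin the coordinator's channel
```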
Voice recognition and speech recognition are two rapidly improving technologies that are strongly linked in their intended purpose, yet the differences between the two are often confused. In general terms, the key distinction between voice and speech recognition lies in the analysis of the collected data and the output of that analysis. Speech recognition captures the spoken word, then analyses it and presents the result as information, whereas voice recognition is concerned with identifying the individual producing the spoken input [11]. In other words, voice and speech recognition differ in the way the input is analysed. Both technologies work with the human voice, converting it into a data stream that can be analysed. Both are rapidly advancing, with numerous applications that can improve convenience, security, law enforcement efforts, and more. Although "speech recognition" and "voice recognition" are often used interchangeably, they are distinct technologies with drastically different objectives.
Remaining challenges for speech recognition software include limited support for slang terms and conversational language, and the accurate representation of input from people with speech impediments.
The depth image generated by the depth sensor is a simplified 3D description; however, we only treat the depth image as an additional dimension of information and still implement the recognition process in 2D space. The input data for the Kinect are streams of vectors of twenty body-joint locations acquired through the standard Application Programming Interface (API) of the Kinect Software Development Kit (SDK). These joints represent the human body captured by the Kinect sensor. We used the Kinect sensor because it is the latest consumer-market gaming camera, reasonably priced and extremely practical, and because it is a multipurpose device that can be used for both gesture and speech recognition.
Chapter 3
3 Software Architecture
This chapter details the complete software architecture of the project, including the details of the
different control techniques associated with this project and detail of the communication system.
It also discusses the various HMI modules.
3.1.1 Requirements
A ZigBee coordinator communicates with one ZigBee router attached to the robot to ensure its real-time operation.
3.1.2 Design
One of the XBee S2 modules is configured as a Coordinator and the other as a Router for point-to-point two-way communication. The software used for configuring the XBees is X-CTU, a tool from Digi International that can be used to configure a variety of modems manufactured by Digi International, including XBee, Wi-Fi, etc. XBees can be configured either with this software or directly from any terminal program.
3.1.3 X-CTU
XCTU, shown in figure 3-1, is a free multi-platform application intended to enable developers to interact with Digi RF modules through a simple-to-use graphical interface. XCTU contains tools that allow developers to set up, configure and test XBee RF modules [14].
3.1.4 Functionality
The two ZigBee modules are connected directly to each other and are in constant communication. The coordinator sends commands to the router, which forwards them to the Arduino Mega present on the robot, where they are decoded and the respective action is performed.
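The decode step on the Arduino can be sketched as below, assuming a simple single-character command protocol; the actual byte values used in the project are not stated here, so this mapping is a hypothetical illustration:

```cpp
#include <cassert>

enum Action { FORWARD, BACKWARD, LEFT, RIGHT, STOP, LIFT_UP, LIFT_DOWN, NONE };

// Decode one byte received from the XBee router into a robot action.
// The character-to-action mapping is an assumption for illustration.
Action decode(char c) {
    switch (c) {
        case 'F': return FORWARD;
        case 'B': return BACKWARD;
        case 'L': return LEFT;
        case 'R': return RIGHT;
        case 'S': return STOP;
        case 'U': return LIFT_UP;
        case 'D': return LIFT_DOWN;
        default:  return NONE;  // ignore noise on the serial line
    }
}
```

On the robot, such a function would be called for each byte read from the XBee's serial stream, with the returned action driving the motor and lifter routines.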
3.2 HMI Modules
This section discusses the different Human-Machine Interaction modules present in the project. The modules include gesture-based control and speech recognition. These modules are very important to the development of this project, as they serve greatly in testing and implementing algorithms.
left
vi. Right hand in the upper rectangle and left hand in the middle triangle corresponds to lifter up.
vii. Left hand in the upper rectangle and right hand in the middle triangle corresponds to lifter down.
The rectangles are created after six seconds of program run time, to give the user time to settle. The position of the interface and the size of the rectangles depend upon the position of the user's hands and the distance between them at the moment of rectangle creation.
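The zone test behind such gestures reduces to point-in-rectangle checks on the tracked hand joints. A minimal sketch follows; the coordinate convention (y growing downward) and the zone geometry are assumptions for illustration, not the project's actual OpenNI code:

```cpp
#include <cassert>

struct Point { float x, y; };
// A zone in screen space; top < bottom because y is assumed to grow downward.
struct Rect { float left, top, right, bottom; };

enum Gesture { LIFTER_UP, LIFTER_DOWN, NONE };

// True when a tracked hand joint lies inside a zone.
bool inZone(Point hand, Rect zone) {
    return hand.x >= zone.left && hand.x <= zone.right &&
           hand.y >= zone.top  && hand.y <= zone.bottom;
}

// Classify the lifter gestures vi and vii from the two hand positions.
Gesture classify(Point leftHand, Point rightHand, Rect upper, Rect middle) {
    if (inZone(rightHand, upper) && inZone(leftHand, middle)) return LIFTER_UP;
    if (inZone(leftHand, upper) && inZone(rightHand, middle)) return LIFTER_DOWN;
    return NONE;
}
```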
The Graphical User Interface (GUI), developed in Microsoft Visual Studio, runs on the remote laptop or computer. It is a multipurpose interface for both monitoring and control of the robotic loader. The GUI sends and receives data to and from the robotic loader via an XBee module connected to the serial port of the laptop/computer.
Following are the main features of the GUI:
I. Controlling of robot
i. Using the up arrow key or by clicking the forward button, the robot moves forward.
ii. Using the down arrow key or by clicking the backward button, the robot moves backward.
iii. Using the left arrow key or by clicking the left button, the robot moves left.
iv. Using the right arrow key or by clicking the right button, the robot moves right.
v. Using the shift key or by clicking the stop button, the robot stops.
vi. By pressing 8 on the numeric keypad or by clicking the up button, the robot's lifter moves up.
vii. By pressing 2 on the numeric keypad or by clicking the down button, the robot's lifter moves down.
viii. By pressing 5 on the numeric keypad or by clicking the lifter stop button, the lifter stops.
II. Monitoring of robot
i.
ii. Obstacle status
iii.
iv. Accelerometer reading
v.
vi.
vii.
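The keyboard side of the control feature amounts to a key-to-command lookup. The sketch below illustrates the idea in C++; the key names and command strings are hypothetical, as the actual GUI was written in C#:

```cpp
#include <cassert>
#include <string>

// Map a GUI key press to the command string sent over the serial
// port. Key names and command strings are illustrative assumptions.
std::string keyToCommand(const std::string& key) {
    if (key == "Up")    return "forward";
    if (key == "Down")  return "backward";
    if (key == "Left")  return "left";
    if (key == "Right") return "right";
    if (key == "Shift") return "stop";
    if (key == "Num8")  return "lifter_up";
    if (key == "Num2")  return "lifter_down";
    if (key == "Num5")  return "lifter_stop";
    return "";  // unmapped keys are ignored
}
```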
One of the key features of the Kinect for Windows natural user interface is speech recognition. We have developed a speech-based control application using the Kinect sensor that responds to spoken commands. The Kinect sensor contains a microphone array consisting of four linearly arranged microphones. The microphone array is an exceptional input device for speech-recognition-based applications because it delivers enhanced sound quality, through noise suppression and acoustic echo cancellation, compared to a single microphone. The Kinect for Windows Runtime Language Pack (included in the Kinect for Windows SDK) includes a custom acoustic model that is optimized for the Kinect sensor's microphone array.
To recognize speech, we created a Speech Recognizer Engine object, then created a grammar and loaded it into the engine. Finally, we created event handlers to handle recognition events and set the input audio stream to be the sensor's audio source stream. In the event handler, the event arguments contain a confidence level indicating the quality of the recognition; we use this confidence level to accept or reject the recognized speech as a command to our robot control application.
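The accept/reject step is a simple threshold check, sketched below; the 0.7 cut-off is an assumption for illustration, not the value tuned in the project (the actual handler was written in C# against the Kinect SDK):

```cpp
#include <cassert>
#include <string>

// Minimum recognition confidence below which a phrase is discarded.
// 0.7 is an illustrative assumption, not the project's tuned value.
const double MIN_CONFIDENCE = 0.7;

// Mirror of the event-handler check: accept a recognized phrase as a
// robot command only when it is non-empty and confident enough.
bool acceptCommand(const std::string& phrase, double confidence) {
    return !phrase.empty() && confidence >= MIN_CONFIDENCE;
}
```

Raising the threshold reduces false activations at the cost of rejecting more genuine commands, so it is usually tuned against the ambient noise of the deployment environment.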
Following are the main features of the speech application
i.
ii.
iii.
iv.
v.
vi.
vii.
viii.
Moreover, if the user speaks a correct command, the corresponding command is highlighted on the screen and the animated turtle moves in that direction.
The module supports both Speaker Independent commands and up to 32 user-defined Speaker Dependent (SD) commands, triggers and voice passwords; Speaker Dependent commands can be in any language.
The EasyVR Commander software was used to configure our VR module, connected to the PC through the microcontroller host board. EasyVR Commander is software for recording voice commands into the module. We defined a group of commands and a password and generated a basic code template to handle them. We then edited the generated template to implement our robot application logic; the template already contained all the subroutines and functions needed to handle the speech recognition jobs.
The VR module is used in our project for on-robot processing of voice commands to control the robot. Following are the main features of the voice application:
i.
ii.
iii.
iv.
v.
vi.
vii.
viii.
Chapter 4
4 Hardware Architecture
This chapter discusses in detail the hardware architecture of the project. It includes the specifications of the components used and sheds light on the various aspects associated with the design and construction of the robot.
4.2 Components
Following are the hardware components used in the project.
i. An RGB camera for capturing colour images. It can capture images at a resolution of 1280x960.
ii. An IR emitter and IR sensor for depth sensing. The emitter projects a beam of infrared light, which is reflected by the scene and picked up by the sensor. This information is then converted into a depth image.
iii. A multi-array microphone, containing four microphones, for both recording the audio and localizing its source.
iv. A 3-axis accelerometer for determining the orientation of the Kinect. It is configured for a 2g range, where g is the acceleration due to gravity.
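From such accelerometer readings, the sensor's static tilt can be estimated with standard trigonometry. A minimal sketch follows; the axis conventions are an assumption for illustration:

```cpp
#include <cassert>
#include <cmath>

const double PI = 3.14159265358979323846;

// Estimate static pitch and roll (in degrees) from 3-axis
// accelerometer readings given in units of g. Valid when the device
// is at rest, so gravity is the only measured acceleration.
double pitchDeg(double ax, double ay, double az) {
    return std::atan2(-ax, std::sqrt(ay * ay + az * az)) * 180.0 / PI;
}

double rollDeg(double ay, double az) {
    return std::atan2(ay, az) * 180.0 / PI;
}
```

For a device lying flat (reading roughly (0, 0, 1) g), both angles are near zero; tilting it changes the gravity components and hence the angles.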
4.2.2 Arduino
All the processing on the robot is done by the Arduino microcontroller. It takes input from all the sensors and from the XBee module and makes decisions after processing all the data. It is the brain of the robot.
Arduino is an open-source electronics prototyping platform based on flexible, easy-to-use
hardware and software [17].
Arduino is the first widespread Open Source Hardware platform. It was launched in 2005 to simplify the process of electronic prototyping, and it enables everyday people with little or no technical background to build interactive products.
The Arduino system is a fusion of two different elements:
i. A small electronic board containing a microcontroller, which makes it easy and affordable to program a microcontroller, a type of tiny computer found inside millions of ordinary items.
ii.
4.2.4 H-Bridge
An H-bridge is used to drive high-current motors. The L298, shown in figure 4-6, is a high-current, high-voltage dual full-bridge driver intended to accept standard TTL logic levels and drive inductive loads such as solenoids, relays, DC and stepping motors. The emitters of the lower transistors of each bridge are connected together, and the corresponding external terminal can be used for the connection of an external sensing resistor. An additional supply input is provided so that the logic works at a lower voltage [19]. To obtain a higher current of 4 A, we paralleled the outputs of the L298: channel 2 with channel 3, and channel 1 with channel 4. The circuit diagram of the L298 is shown in figure 4-7.
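The logic-level drive states for one (paralleled) motor channel can be sketched as below. The state-to-motion mapping is the conventional one for a full bridge and is an illustrative assumption, not the project's exact wiring:

```cpp
#include <cassert>

// Logic-input states driving one paralleled L298 channel pair.
// in1/in2 set current direction through the motor; enable gates the outputs.
struct Bridge { bool in1, in2, enable; };

Bridge forward() { return {true,  false, true};  }  // current flows one way
Bridge reverse() { return {false, true,  true};  }  // current reversed
Bridge brake()   { return {false, false, true};  }  // both inputs low: fast stop
Bridge coast()   { return {false, false, false}; }  // outputs disabled: free-wheel
```

Driving the enable pin with PWM instead of a constant high is the usual way to add speed control on top of this direction logic.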
We used Xbee modules for communication between the robot and the computer. The Xbee, shown in figure
4-8, is a plug-and-play wireless module that works on the IEEE 802.15.4 standard. It is a simple
and reliable communication medium and supports multiple topologies.
Xbee Series 2 transceivers have been used because they are low-power devices, ideal for
embedded systems with limited battery resources. They provide fast and reliable point-to-point
communication with low response time. Alternatives such as Bluetooth and Wi-Fi have a higher cost
per node and consume more power, which would compromise battery life. Xbee has the flexibility to
connect up to 65000 nodes, compared to 7 nodes for a Bluetooth scatternet [20].
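Because the modules present the link as a transparent serial stream, the robot-side firmware can consume control traffic byte by byte. The single-character command set below is purely hypothetical, sketched to show how such a link is typically decoded; the project's actual command protocol is not documented here.

```cpp
#include <string>

// Hypothetical single-character command set carried over the Xbee link.
std::string decodeCommand(char c) {
    switch (c) {
        case 'F': return "forward";
        case 'B': return "backward";
        case 'L': return "left";
        case 'R': return "right";
        case 'U': return "lifter up";
        case 'D': return "lifter down";
        case 'S': return "stop";
        default:  return "ignore";  // drop unknown bytes (noise, framing errors)
    }
}
```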
4.2.6 Batteries
For its power requirements, the robot is equipped with five rechargeable lead-acid
batteries, shown in figure 4-9.
i.
Two 12V batteries connected in parallel to supply the power circuit for the
microcontroller, sensors, LCD, etc.
ii.
Two 12V batteries connected in series to get 24V for driving the motors of the robot.
iii.
We have used four distance sensors, one on each side of the robot, for detecting obstacles on
every side. The HC-SR04 ultrasonic sensor uses sonar to determine the distance to an object by
timing an echo, just like dolphins or bats. The module comprises an ultrasonic transmitter, an
ultrasonic receiver and a control circuit. Interfacing of the sonar sensor with the Arduino
controller is shown in figure 4-10. The basic principle of operation:
A burst of eight 40 kHz pulses is sent by the module automatically, and the returning echo is
detected.
If an echo returns, the duration of the high level on the output IO pin equals the time from
sending the ultrasonic burst to receiving its return.
The main reason for choosing the HC-SR04 ultrasonic ranging module is its high
ranging accuracy and stable performance. Compared with IR ranging modules, the
HC-SR04 is less expensive, yet it offers the same ranging accuracy and a longer
ranging distance. Its operation is not affected by sunlight or black material, though acoustically
soft materials like cloth can be difficult to detect using sonar.
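The echo timing above converts directly to a range: sound travels roughly 343 m/s (0.0343 cm/µs) at room temperature, and the pulse covers the round trip, so the distance is half the product. A minimal conversion, with the commonly quoted 58 µs/cm constant falling out of the arithmetic (pin handling omitted):

```cpp
// Convert an HC-SR04 echo pulse width (microseconds) to centimetres.
// Speed of sound ~0.0343 cm/us; the echo covers the round trip,
// so the one-way distance is half of (time x speed).
double echoUsToCm(double echoUs) {
    return echoUs * 0.0343 / 2.0;  // equivalently, about echoUs / 58.3
}
```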
4.2.8 Motors
We have used two PITTMAN GM8224 brushed DC gearmotors, shown in figure 4-11, to drive the
robot: one on the left side and one on the right side of the base of the robot.
Specifications:
We have also used one Jye Maw gear motor for the lifter.
Specifications:
Voltage: 12 VDC
Specifications:
5x8 dots
The LCD is interfaced with the Arduino Mega and displays the following information:
The obstacle distances from all four sonar sensors are displayed on the second row of the LCD.
Skid Steering
Skid steering is a close relative of the differential drive system. It is a driving mechanism,
implemented on vehicles with either tracks or wheels, that uses the differential drive concept.
It is mostly used in tracked machines, e.g. tanks, but can also be used in four- or six-wheeled
robots. In skid steering, or simply tank-style steering, each set of wheels or tracks is
independently powered: the right and left wheels are driven separately. Steering is accomplished
by actuating each side in a different direction or at a different rate, causing the wheels or tracks
to skid, or slip, on the ground.
To drive forward, both the right and left sets must be powered forward.
To drive backward, both the right and left sets must be powered backward.
To steer left, or to pull an on-the-spot spin, the left side should be put into reverse and the
right into forward, or the right side must run faster than the left.
To steer right, or to pull an on-the-spot spin, the right side should be put into reverse
and the left into forward, or the left side must run faster than the right.
The advantage of skid steering is that the vehicle can turn on the spot. Skid-steer loaders are
capable of zero-radius, pirouette turning, which makes them exceptionally controllable and valued
for certain applications.
The disadvantage is that, for a human used to a steering-wheel system, it is a little
non-intuitive. Moreover, motors are often not identical, even within the same batch, so some
drift may occur and need adjustment. A skid-steering vehicle is turned by producing a velocity
differential between the opposite sides of the vehicle. Ground friction can be reduced by using
specially designed wheels such as the Mecanum wheel.
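The steering rules above reduce to a simple mixing formula: add the turn command to one side, subtract it from the other, then clamp. This is a generic skid-steer sketch with speeds normalised to [-1, 1], not the project's exact control law.

```cpp
#include <algorithm>

struct WheelSpeeds { double left, right; };

// Skid-steer mixing: throttle and turn in [-1, 1] combine into
// per-side wheel speeds. turn > 0 steers right, i.e. the left
// side runs faster than the right.
WheelSpeeds mix(double throttle, double turn) {
    auto clamp = [](double v) { return std::max(-1.0, std::min(1.0, v)); };
    return { clamp(throttle + turn), clamp(throttle - turn) };
}
```

With zero throttle and full turn, the sides run in opposite directions, giving the on-the-spot spin described above.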
A custom-made frame was used for the body of the robot. The frame has two sections. The lower
section holds the batteries and motors. The upper section holds the controller, sensors, LCD and
power circuit.
In the front, the robot has a sonar pair used to detect obstacles. When the sonar pair is
active, the robot automatically stops when it detects an obstacle in its path. In the same manner,
there is a sonar sensor on each side of the robot for obstacle detection.
The following figure 4-16 shows the top view of the robotic loader.
4.3.3 Functionality
The robot can perform different movements: moving forwards, moving backwards, turning left,
turning right, and raising and lowering the lifter.
Moving forwards and backwards is achieved by running both DC motors in the same direction,
while rotations are achieved by differential drive, i.e. running them in opposite directions.
The lifter's up and down functionality has been added to allow the robot to pick up goods and
place them at the destination.
Obstacle detection is also implemented on the robot. The robot automatically stops when a hurdle
is present in front of it.
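The obstacle-stop behaviour can be sketched as a guard applied before any forward command is executed. The 30 cm threshold below is an assumed value for illustration, not a figure taken from the report.

```cpp
// Hypothetical safety guard: suppress forward motion when the front
// sonar reports an obstacle closer than a threshold (assumed 30 cm).
bool allowForward(double frontDistanceCm) {
    const double kStopThresholdCm = 30.0;  // assumed value, not from the report
    return frontDistanceCm >= kStopThresholdCm;
}
```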
4.3.4 Dimensions
The robot has the following dimensions:
Specification      Value
Total Length       50 cm
Total Height       40 cm
Total Width        30 cm
Total Weight       10 kg approx.
Lifter Length      20 cm
                   22 cm
                   32 cm
Table 4-7 Robot Dimensions
Chapter 5
5 Implementation
5.1 Tools
The following tools have been used to develop the system:
i.
Arduino IDE
ii.
X-CTU
iii.
Processing 2
5.2 Development Stage
When we started our project, we initially divided it into different phases. The following distinct
phases were carried out incrementally to complete the project in the given time.
Accelerometer reading
was to develop a speech-based control application using the Kinect sensor that responds to spoken
commands for controlling the robotic loader. We developed a C#-based speech-recognition
application for controlling the robot.
i.
ZigBee
ii.
Arduino controller
iii.
H-Bridge
iv.
Kinect Sensor
v.
Sonar Sensor
vi.
VR Module
vii.
Gas Sensor
viii.
5.3 Hurdles
During the implementation of our project we faced different issues. The problems that we
faced were:
5.4
There were many ways to solve these problems. We evaluated different approaches for the
implementation of high-current H-bridge circuits. Selecting a wireless communication
protocol was also a difficult task; we evaluated different protocols and selected ZigBee. Even so,
while implementing Xbee communication we faced issues in the configuration of the Xbee modules,
which were solved by troubleshooting.
Chapter 6
6 Testing
6.1 Wireless Modules Testing
The configuration of the wireless modules has already been discussed in detail. Here, the
connections of the modules to the microcontrollers and their operation are described. The
ground pins of the Xbee S2 were connected to the ground pins of the Arduino, and the 3.3V pin of
the Xbee was connected to the 3.3V pin of the Arduino. The Data-out and Data-in pins of the Xbees
were connected to the Rx and Tx pins of the Arduino respectively. During transmission, a green
light blinks on the coordinator as well as the router, which is the sign of successful data
transmission. The following Figure 6-1 shows how X-CTU can be used to transmit signals.
Figure 6-1 X-CTU, Zigbee Coordinator, Zigbee Router and its connection with Arduino
Hence the total run time of the system is 55 minutes on a single recharge of the batteries.
6.5 Slipping
When the robot is running at full speed in the forward direction:
Total slipping = 1 cm
CHAPTER 7
7 Conclusions and Future Work
7.1 Conclusions
The robotics field is quite promising and requires a lot of effort to develop an intelligent robot.
The ultimate goal of robotics is a super-human system that embodies all the skills of humans,
such as touch, intelligence and sensitivity, without any of their limitations, such as ageing and
finite strength. Regardless of the massive work put in by prior researchers, there is enormous
scope for further growth in the field of robotics.
The use of the Kinect sensor has helped greatly in testing various guidance, navigation and
control techniques, and has contributed computer vision techniques that serve the eventual goal
of robotics. Speech and gesture control give an alternative way of operating robots; they are
more natural ways of controlling devices and make the control of robots easier
and more efficient.
References
[1] L. Holmquest, Listening with Kinect, Microsoft, 12 December 2012. [Online]. Available:
http://msdn.microsoft.com/en-us/magazine/jj884371.aspx. [Accessed 25 March 2014].
[2] Microsoft, Kinect for Windows Sensor Components and Specifications., Microsoft, 06
December 2012. [Online].
Available: http://msdn.microsoft.com/en-us/library/jj131033.aspx. [Accessed 25 May
2014].
[3] Singh, S.; Keller, P., "Obstacle detection for high speed autonomous navigation," Robotics
and Automation, 1991. Proceedings., 1991 IEEE International Conference on , vol., no.,
pp.2798,2805 vol.3, 9-11 Apr 1991
[4] L. Brèthes, P. Menezes, F. Lerasle, and J. Hayet. Face tracking and hand gesture recognition
for human robot interaction. In International Conference on Robotics and Automation,
pages 1901-1906, 2004. New Orleans, Louisiana.
[5] Geng Yang, Yingli Lu, motor and motion control system, Tsinghua University
Press,2006.3, pp.85-110.
[6] D. Press, Sci Sports: Killer Robots, 11 March 2013. [Online]. Available:
http://press.discovery.com/us/sci/programs/sci-sports-killer-robots. [Accessed 21 May
2014].
[7] C. Janssen, Wireless Communications, techopedia, 2 June 2006. [Online]. Available:
http://www.techopedia.com/definition/10062/wireless-communications. [Accessed 25 May
2014].
[8] Jin-Shyan Lee; Yu-Wei Su; Chung-Chou Shen, "A Comparative Study of Wireless
Protocols: Bluetooth, UWB, ZigBee, and Wi-Fi," Industrial Electronics Society, 2007.
IECON 2007. 33rd Annual Conference of the IEEE , vol., no., pp.46,51, 5-8 Nov. 2007
[9] J.-S. Lee, Y.-W. Su and C.-C. Shen, A Comparative Study of Wireless Protocols:
Bluetooth, UWB, ZigBee, and Wi-Fi, in 33rd Annual Conference of the IEEE, Taipei,
2007.
[10] R. H. Dictionary, speech recognition, Random House, Inc. , 21 June 2013. [Online].
Available: http://dictionary.reference.com/browse/speech+recognition. [Accessed 25 May
2014].
[11] Luo Zhizeng, Zhao Jingbing, Speech Recognition and Its Application in Voice-based Robot
Control System, International Conference on Intelligent Mechatronics arid Automation,
960-963, 2004.
[12] N. Unuth, What is Speech Recognition?, About.com, 22 May 2014. [Online]. Available:
http://voip.about.com/od/voipbasics/a/What-Is-Speech-Recognition.htm. [Accessed 28
May 2014].
[13] Yuan Meng, "Speech recognition on DSP: Algorithm optimization and performance
analysis", The Chinese University of Hong Kong, July 2004
[14] DIGI, Digi your M2M EXPERT. XBEE/XBEE-PRO., Digi International Inc., 3 April
2014. [Online]. Available:
http://www.digi.com/support/productdetail?pid=3430&osvid=0&type=documentation.
[Accessed 25 May 2014].
[15] EasyVR Arduino Shield 2.0, TIGAL KG, 6 June 2013. [Online]. Available:
http://www.veear.eu/products/easyvr-arduino-shield/. [Accessed 11 May 2014].
[16] Kinect for Windows Sensor Components and Specifications, Microsoft, 21 March 2013.
[Online]. Available: http://msdn.microsoft.com/en-us/library/jj131033.aspx. [Accessed 25
May 2014].
[17] Arduino, 22 February 2012. [Online]. Available: http://arduino.cc/. [Accessed 12 May
2014].
[18] Arduino, Arduino Uno, SmartProjects, [Online]. Available:
http://arduino.cc/en/Main/arduinoBoardUno. [Accessed 25 May 2014].
[19] l298 Dual Full Bridge Driver, STMicroelectronics, 15 July 2011. [Online]. Available:
http://www.st.com/web/catalog/sense_power/FM142/CL851/SC1790/SS1555/PF63147.
[Accessed 25 May 2014].
[20] XBee / XBee-PRO ZB (S2) Modules, Digi International Inc., 14 August 2012. [Online].
Available:
http://www.digi.com/support/productdetail?pid=3430&osvid=0&type=documentation.
[Accessed 25 May 2014].
[21] ZigBee Technology, ZigBee Alliance, 1 January 2008. [Online]. Available:
http://www.zigbee.org/About/AboutTechnology/ZigBeeTechnology.aspx. [Accessed 16
April 2014].
[22] IEEE Computer Society, IEEE ICCV Workshop on Recognition, Analysis, and Tracking of
Faces and Gestures in Real-Time Systems, IEEE Computer Society, 2001.
[23] S. Lee, H. Cho, K.-J. Yoon and J. Lee, Intelligent Autonomous Systems 12, in
Proceedings of the 12th International Conference IAS-12, Jeju Island, Korea, June 26-29,
2012.
[24] S. A. Wilson, A real-time obstacle detection vision system for autonomous high speed
robots, University of Louisiana at Lafayette , Lafayette , 2006.
[25] Kinect for Windows, Microsoft, 21 March 2013. [Online]. Available:
[36] T.C. Lueth, Th. Laengle, G. Herzog, E. Stopp, and U. Rembold. KANTRA - Human-Machine
Interaction for Intelligent Robots using Natural Language. In IEEE International
Workshop on Robot and Human Communication, volume 4, pages 106-110, 1994.
[37] Jagdish Raheja, Radhey Shyam and Umesh Kumar, Hand Gesture Capture and Recognition
Technique for Real-time Video Stream, In The 13th IASTED International Conference on
Artificial Intelligence and Soft Computing (ASC 2009), September 7-9, 2009 Palma de
Mallorca, Spain.
[38] Peter X. Liu, A. D. C. Chan, R. Chen, K. Wang, Y. Zhu, Voice Based Robot Control,
International Conference on Information Acquisition,543-547, 2005.
[39] W.S.H. Munro, S. Pomeroy, M. Rafiq, H.R. Williams, M.D. Wybrow and C. Wykes,
Ultrasonic Vehicle Guidance Transducer, Ultrasonics, Vol. 28, pp. 349-354, 1990.
[40] Reza Hassanpour, Stephan Wong, Asadollah Shahbahrami, Vision-Based Hand Gesture
Recognition for Human Computer Interaction: A Review, IADIS International Conference
on Interfaces and Human Computer Interaction, 25-27 July 2008, Amsterdam, Netherlands.
[41] Jagdish Raheja, Radhey Shyam and Umesh Kumar, Hand Gesture Capture and Recognition
Technique for Real-time Video Stream, In The 13th IASTED International Conference on
Artificial Intelligence and Soft Computing (ASC 2009), September 7-9, 2009 Palma de
Mallorca, Spain.
[42] Raheja, J.L.; Shyam, R.; Kumar, U.; Prasad, P.B., "Real-Time Robotic Hand Control Using
Hand Gestures," Machine Learning and Computing (ICMLC), 2010 Second International
Conference on , vol., no., pp.12,16, 9-11 Feb. 2010
[43] Takai, H.; Miyake, M.; Okuda, K.; Tachibana, K., "A simple obstacle arrangement detection
algorithm for indoor mobile robots," Informatics in Control, Automation and Robotics
(CAR), 2010 2nd International Asia Conference on , vol.2, no., pp.110,113, 6-7 March 2010
[44] N. Harper, and P. McKerrow, Detecting plants for landmarks with ultrasonic sensing,
Proceedings of the International Conference on Field and Service Robotics, pp. 144-149,
1999.
[45] Discant, A.; Rogozan, A.; Rusu, C.; Bensrhair, A., "Sensors for Obstacle Detection - A
Survey," Electronics Technology, 30th International Spring Seminar on , vol., no.,
pp.100,105, 9-13 May 2007
Kinect Sensor
Arduino Uno
Arduino Mega
LCD 4x20
EasyVR module
Circuit Board
DATE
PROJECT ID
TITLE
TOTAL NUMBER
OF WEEKS IN
PLAN
05
33536
No.  STARTING WEEK  DESCRIPTION OF MILESTONE   DURATION
1    Week 1         Literature review          4 weeks
2    Week 5         Working strategy           4 weeks
3    Week 9                                    3 weeks
4    Week 12        Wireless Communication     2 weeks
5    Week 14                                   3 weeks
6    Week 17                                   4 weeks
7    Week 21                                   2 weeks
8    Week 23                                   3 weeks
9    Week 26        Wireless Monitoring        2 weeks
10   Week 28        Speech Recognition         3 weeks
11   Week 31        Testing / Troubleshooting  2 weeks
12   Week 33        Thesis                     3 weeks