
Practical Workbook

BM-423 Introduction to Robotics

Name: ___________________________
Year: ___________________________
Batch: ___________________________
Roll No: ___________________________
Department: ___________________________

Department of Biomedical Engineering


NED University of Engineering & Technology
LEJ Campus, Karachi, 75270, Pakistan
CERTIFICATE

Certified that Mr. /Ms. ______________________________ of class


Final Year Biomedical Engineering bearing seat no. ______________
has completed the course in Introduction to Robotics Practical as
prescribed by the NED University of Engineering & Technology, Karachi
for the Academic Session 2019.

Date: _______________________ __________________________


Lab. Teacher

LIST OF EXPERIMENTS

Lab No. Object

01 Explore LEGO Mindstorms NXT

02 Build and program a Bumper Car using Touch sensor with LEGO Mindstorms NXT 1.0 Kit

03 Explore LEGO Mindstorms EV3

04 Build and program an Obstacle Avoiding Car using Ultrasonic sensor with LEGO
Mindstorms EV3 Kit

05 Build and program a Line Following Bot Car using Color sensor with LEGO Mindstorms
EV3 Kit

06 OPEN ENDED LAB: Using the background information on LEGO Mindstorms EV3,
design and conduct a LEGO Robot experiment related to self-driving cars

07 OPEN ENDED LAB: Using the background information on LEGO Mindstorms EV3,
design and conduct a LEGO Robot experiment related to mimicking human motion and
function

08 Explore RHINO Robot System and RoboAnalyzer

09 Explore, operate, and program Robotis GP Humanoid Robot

10 Explore and operate Robotis OP2 Humanoid

11 Explore Robotis OP2 Framework and Tutorial Programs

12 Explore and run Action Editor Program of Robotis OP2

13 Explore, operate, and program NAO V5 Evolution Humanoid Robot

14 Explore PeopleBot Telepresence Mobile Robot

15 Review and explore programming in ROS

COMMITMENT PER WEEK

Week | Object | Date | Commitment | Grading
1 | Explore LEGO Mindstorms NXT 1.0 | | |
2 | Build and program a Bumper Car using Touch sensor with LEGO Mindstorms NXT 1.0 Kit | | |
3 | Explore LEGO Mindstorms EV3 | | |
4 | Build and program an Obstacle Avoiding Car using Ultrasonic sensor with LEGO Mindstorms EV3 Kit | | |
5 | Build and program a Line Following Bot Car using Color sensor with LEGO Mindstorms EV3 Kit | | |
6 | OPEN ENDED LAB: Using the background information on LEGO Mindstorms EV3, design and conduct a LEGO Robot experiment related to self-driving cars | | |
7, 8 | OPEN ENDED LAB: Using the background information on LEGO Mindstorms EV3, design and conduct a LEGO Robot experiment related to mimicking human motion and function | | |
9 | Explore RHINO Robot System and RoboAnalyzer | | |
10 | Explore, operate, and program Robotis GP Humanoid Robot | | |
11 | Explore and operate Robotis OP2 Humanoid Robot | | |
12 | Explore Robotis OP2 Framework and Tutorial Programs | | |
13 | Explore and run Action Editor Program of ROBOTIS OP2 | | |
14 | Explore, operate, and program NAO V5 Evolution Humanoid Robot | | |
15 | Explore PeopleBot Telepresence Mobile Robot | | |
16 | Review and explore programming in ROS | | |

LAB PERFORMANCE RUBRICS

All three outcomes below map to CLO 3 and PLO 5 (Psychomotor domain, Level 3).

Understand Working and Operation of Robots
• Competent (10-7): Understands principles behind functioning of robots
• Developing (6-4): Understands principles behind functioning of robots under guidance
• Needs Work (3-1): Finds it difficult to understand principles behind functioning of robots despite guidance

Identify, Operate, and Use Robots in Multiple Applications
• Competent (10-7): Identifies and conducts experiments on supplied robots
• Developing (6-4): Identifies and conducts experiments on supplied robots under guidance
• Needs Work (3-1): Finds it difficult to identify and conduct experiments on supplied robots despite guidance

Evaluate Limitations of Robots and Apply Appropriate Assumptions
• Competent (10-7): Evaluates limitations and highlights assumptions for optimized robot performance
• Developing (6-4): Evaluates limitations and highlights assumptions for optimized robot performance under guidance
• Needs Work (3-1): Finds it difficult to evaluate limitations and highlight assumptions for optimized robot performance despite guidance

LAB # 01

OBJECT: Introduction to LEGO Mindstorms NXT

THEORY:

Introduction to LEGO Mindstorms NXT

The NXT Brick

The NXT is the brain of a MINDSTORMS robot. It is an intelligent, computer-controlled LEGO brick that lets a MINDSTORMS robot come alive and perform different operations.

Motor Port: The NXT has three output ports for attaching motors - Ports A, B and C.
Sensor Port: The NXT has four input ports for attaching sensors - Ports 1, 2, 3 and 4.
USB Port: Connect a USB cable to the USB port and download programs from your computer to the
NXT (or upload data from the robot to your computer). You can also use the wireless Bluetooth
connection for uploading and downloading.
Loudspeaker: Make a program with real sounds and listen to them when you run the program.
NXT Buttons:
Orange button: On/Enter/Run
Light grey arrows: Used for moving left and right in the NXT menu
Dark grey button: Clear/Go back

Touch Sensor

The Touch Sensor gives your robot a sense of touch. It detects when it is being pressed, when it is released again, and when it is bumped (pressed and then released). It is connected to Input port 1 of the NXT.

Pressed Released Bumped

Sound Sensor

Sound Sensor makes your robot hear. It is connected to Input port 2 of NXT. The
Sound Sensor can detect both decibels [dB] and adjusted decibel [dBA]. A
decibel is a measurement of sound pressure.

dBA: In detecting adjusted decibels, the sensitivity of the sensor is adapted to the
sensitivity of the human ear. In other words, these are the sounds that your ears can hear.
dB: In detecting standard [unadjusted] decibels, all sounds are measured with equal sensitivity. Thus,
these sounds may include some that are too high or too low for the human ear to hear.

The Sound Sensor can measure sound pressure levels up to 90 dB – about the level of a lawnmower.
Sound pressure levels are extremely complicated, so the Sound Sensor readings on the
MINDSTORMS NXT are displayed in percent [%]. The lower the percentage the quieter the sound.
For example:
• 4-5% is like a silent living room
• 5-10% would be someone talking some distance away
• 10-30% is normal conversation close to the sensor or music played at a normal level
• 30-100% would be people shouting or music being played at a high volume.

Light Sensor

The Light Sensor is one of the two sensors that give your robot vision (Ultrasonic Sensor is the other
one). Light Sensor enables your robot to distinguish between light and dark. It can read the light
intensity in a room and measure the light intensity of colored surfaces. It is connected to Input port 3
of NXT.

Detecting reflected light: The Light Sensor can read reflected light when placed close to various colored surfaces. We get different readings for different colors, based on how the robot views them.

Red Blue Yellow

(This is how your robot will view the colors using the light sensor)

Detecting ambient [surrounding] light: Light Sensor can also read ambient light, i.e. the light
present in its surrounding. The differences in readings can be observed by taking readings in different
locations of the room. For example, first hold the sensor against the window, and then hold it under
the table.

Ultrasonic Sensor

The Ultrasonic Sensor is one of the two sensors that give your robot vision. The
Ultrasonic Sensor enables your robot to see and detect objects. You can also use
it to make your robot avoid obstacles, sense and measure distance, and detect
movement. It is connected to Input port 4 of NXT.

The Ultrasonic Sensor measures distance in centimeters and in inches. It is able to measure distances
from 0 to 255 centimeters with a precision of +/- 3 cm.

The Ultrasonic Sensor uses the same scientific principle as bats: it measures distance by calculating
the time it takes for a sound wave to hit an object and return – just like an echo.
Large sized objects with hard surfaces return the best readings. Objects made of soft fabric or those
that are curved (like a ball) or are very thin or are small can be difficult for the sensor to detect.

*Note that two or more Ultrasonic Sensors operating in the same room may interrupt each other’s readings.
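The distance calculation behind this echo principle can be sketched in a few lines of Python. This is only an illustration of the time-of-flight arithmetic, not the firmware the NXT actually runs, and the 343 m/s figure assumes sound travelling in air at roughly room temperature:

# Sketch of the time-of-flight calculation used by ultrasonic ranging.
SPEED_OF_SOUND_CM_PER_S = 34300  # approx. speed of sound in air at ~20 degrees C

def echo_time_to_distance_cm(echo_time_s):
    """Convert a round-trip echo time (seconds) to a one-way distance (cm)."""
    # The wave travels to the object and back, so halve the total path.
    return SPEED_OF_SOUND_CM_PER_S * echo_time_s / 2

# Example: an echo received after 6 ms corresponds to roughly 103 cm.
print(round(echo_time_to_distance_cm(0.006)))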

Interactive Servo Motors

The three Servo Motors give your robot the ability to move. If you use the Move block in the LEGO
MINDSTORMS NXT software to program your motors, the two motors will automatically
synchronize, so that your robot will move in a straight line. Using the Motor block, however, only
moves a single motor at a time.

Built-in Rotation Sensor: Each motor has a built-in Rotation Sensor. This lets you control your robot's movements precisely. The Rotation Sensor measures motor rotations in degrees or full
rotations [accuracy of +/- one degree]. One rotation is equal to 360 degrees, so if you set a motor to
turn 180 degrees, its output shaft will make half a turn. The built-in Rotation Sensor in each motor
also lets you set different speeds for your motors [by setting different power parameters in the
software].
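As an illustration of how the degree count relates to travel distance, the short Python sketch below computes how many degrees a drive motor must turn to roll a given distance. The 56 mm wheel diameter is an assumed example value, not a measurement of your kit's wheels:

import math

WHEEL_DIAMETER_MM = 56.0                      # assumed example wheel size
WHEEL_CIRCUMFERENCE_MM = math.pi * WHEEL_DIAMETER_MM

def degrees_for_distance(distance_mm):
    """Motor degrees needed to roll the wheel a given distance."""
    rotations = distance_mm / WHEEL_CIRCUMFERENCE_MM
    return rotations * 360

# Example: driving 500 mm needs roughly 1023 degrees (about 2.8 rotations).
print(round(degrees_for_distance(500)))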

Motors are connected to the output ports, A-C.


• Port A: Motor used for an extra function.
• Port B: Motor for movement.
• Port C: Motor for movement.

Brick SubMenu

View

The View submenu lets you carry out a quick test of your sensors and motors, observing real-time data from each unit. Simply connect a sensor or motor to its allocated port and select an option such as motor rotations or reflected light to get a live reading from that unit.

Try Me

The Try Me submenu lets you test your sensors and motors with fun output results. Explore the various built-in programs to hear fun sounds, display images, and experience different motor reactions.

NXT Block Programs

You don’t need a computer to program your NXT. Using the NXT
Program submenu, you can make several different programs using
blocks.

Software User Interface

Using the MINDSTORMS NXT software, you can build programs using G (graphical) programming on your computer and download them to your NXT.

1. Robo Centre Window: Here you can find building and programming instructions for robot
models.
2. My Portal: Here you can access www.MINDSTORMS.com while programming your robots.
3. The Tool Bar: The tool bar includes the most recently used commands from the menu bar in an
easy-to-reach location.
4. Work Area: This is the space on screen where programming takes place. Drag programming
blocks from the programming palette to the work area and attach the blocks to the sequence
beam.
5. Little Help Window: Here you can always get help if needed.
6. Work Area Map: Use the pan tool on the toolbar to move around the work area – and use the
work area map (tab in the lower right corner) to get an overview.
7. Programming Palette: It contains all the blocks you will need to create your programs. The tabs
at the bottom of the palette let you switch between the common palette (containing the most
frequently used blocks), the complete palette (containing all the blocks) and the custom palette
(containing blocks that you download or create yourself).
8. Configuration Panel: Each programming block has a configuration panel that lets you
customize the block for the specific input and output that you want.
9. Controller: The five buttons on the controller let you download programs (or parts of programs)
from your computer to the NXT. With the controller you can also change the settings of your
NXT.
10. NXT Window: This pop-up window will give you information about your NXT’s memory and
communication settings.

By using the LEGO MINDSTORMS NXT software, you can build programs and download them to the NXT brick, so that you can program the robot model you have built according to your requirements.

EXERCISES:

1) Test the different sensors and motors connected to appropriate ports on NXT by using View and
Try Me submenu. Explain the output of each program of the submenu.

Sensor/Motor | Submenu | Program | Output
Touch Sensor | View | Touch |
Touch Sensor | Try Me | Try-Touch |
Sound Sensor | View | Sound dB |
Sound Sensor | View | Sound dBA |
Sound Sensor | Try Me | Try-Sound |
Light Sensor | View | Reflected Light |
Light Sensor | View | Ambient Light |
Light Sensor | Try Me | Try-Light |
Ultrasound Sensor | View | Ultrasonic cm |
Ultrasound Sensor | View | Ultrasonic inch |
Ultrasound Sensor | Try Me | Try-Ultrasonic |
Motors | View | Motor Rotations |
Motors | View | Motor Degrees |
Motors | Try Me | Try-Motor |

2) Build the following programs in NXT using NXT program submenu. Explain their output in a
single line.

3) Using NXT block programming, program your robot to perform any function of your liking and
draw the block structure below.

Block Structure Output Concept

4) Using NXT software, program your robot to perform the same function as in (3) with added
customizations.

LAB # 02

OBJECT: Build and program a Bumper Car using touch sensor with LEGO Mindstorms NXT 1.0
Kit.

REQUIREMENTS:
• Mindstorms NXT 1.0
• LEGO Mindstorms NXT-G

THEORY:

Concept

This robot has a bumper in front that triggers a touch sensor to tell the robot when it has run into
something. The program will make the robot drive around the room, turning each time it bumps into
something.

Building Instructions

Start by building the Castor Bot that is shown below from fig.1 to fig. 11.
Use two medium length wires to connect the two drive motors to ports B and C on the NXT.

Important: Keep the left wire on the left and the right wire on the right (do not cross the wires).

Use the shortest wire for this step.
Connect the wire from the touch sensor to port 1 on the NXT. You can route the wire around the
round cross brace under the robot to keep it out of the way.

EXERCISES:

1) Design a program that tells the robot to go straight until the bumper hits something, then back up
a little, turn right, then go back to going straight again (repeating forever).
2) Modify this program with your own ideas of what to do when the robot hits something.
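The behavior asked for in exercise 1 can be sketched as a control loop in Python-style pseudocode. Every hardware helper below (drive_forward, drive_backward_for, turn_right_for, bumper_pressed) is a hypothetical stand-in for the NXT-G Move and Touch Sensor blocks; the sketch only makes the repeat-forever structure explicit and is not the NXT-G program itself:

import time

def drive_forward():        pass          # hypothetical: both drive motors on (ports B and C)
def drive_backward_for(s):  pass          # hypothetical: reverse both motors for s seconds
def turn_right_for(s):      pass          # hypothetical: spin right for s seconds
def bumper_pressed():       return False  # hypothetical: read the touch sensor on port 1

while True:                          # repeat forever, as in the NXT-G loop block
    drive_forward()                  # go straight...
    if bumper_pressed():             # ...until the bumper hits something
        drive_backward_for(0.5)      # back up a little
        turn_right_for(0.4)          # turn right, then resume going straight
    time.sleep(0.01)                 # small polling delay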

LAB # 03

OBJECT: Introduction to LEGO Mindstorms EV3

THEORY:

Introduction to LEGO Mindstorms EV3

The EV3 Brick

The EV3 is the brain of a MINDSTORMS robot. It is an intelligent, computer-controlled LEGO brick
that lets an EV3 robot come alive and perform different operations.

Motor Port: Output Ports A, B, C, and D are used to connect motors to the EV3 Brick.
Sensor Port: Input Ports 1, 2, 3, and 4 are used to connect sensors to the EV3 Brick.
PC Port: The Mini-USB PC Port, located next to the D port, is used to connect the EV3 Brick to a
computer.
USB Host Port: The USB Host Port can be used to add a USB Wi-Fi dongle for connecting to a
wireless network, or to connect up to four EV3 Bricks together (daisy chain).
SD Card Port: The SD Card Port increases the available memory for your EV3 Brick with an SD
card.
Loudspeaker: All sounds from the EV3 Brick come through this speaker, including any sound effects used in programming your robots. When the quality of the sound is important to you, try to leave the speaker uncovered while designing your robot. Check out the cool sound files that can be used when programming in the EV3 Software.
Brick Buttons:
Back button: This button is used to reverse actions, to abort a running program, and to shut down.
Left, Right, Up, Down Arrows: These four buttons are used to navigate through the contents of the Brick.
Center button: Pressing the Center button says “OK” to various questions—to shut down, to select
desired settings, or to select blocks in the Brick Program App.
Brick Status Light: The Brick Status Light that surrounds the Brick Buttons tells you the current
status of the EV3 Brick. It can be green, orange, or red, and can pulse. Brick Status Light codes are
the following:
• Red = Startup, Updating, Shutdown
• Red pulsing = Busy
• Orange = Alert, Ready
• Orange pulsing = Alert, Running
• Green = Ready
• Green pulsing = Running program

You can also program the Brick Status Light to show different colors and to pulse when different
conditions are met.

Interactive Servo Motors

The Large Motor is a powerful “smart” motor. It has a built-in Rotation Sensor with 1-degree
resolution for precise control. The Large Motor is optimized to be the driving base on your robots. By
using the Move Steering or Move Tank programming block, the Large Motors will coordinate the
action simultaneously.

The Medium Motor also includes a built-in Rotation Sensor (with 1-degree resolution), but it is
smaller and lighter than the Large Motor. That means it is able to respond more quickly than the
Large Motor. The Medium Motor can be programmed to turn on or off, control its
power level, or to run for a specified amount of time or rotations.

The motors are connected on the following ports:


• Port A: Medium Motor

• Port B and C: Two Large Motors
• Port D: Large Motor

Comparing the two motors: Large Motor runs at 160–170 rpm, with a running torque of 20 Ncm
and a stall torque of 40 Ncm (slower, but stronger). The Medium Motor runs at 240–250 rpm, with a
running torque of 8 Ncm and a stall torque of 12 Ncm (faster, but less powerful). Both motors are
Auto ID supported.

Touch Sensor

The Touch Sensor is an analog sensor that can detect when the sensor's red button has been pressed and when it is released. That means the Touch Sensor can be programmed to act on three conditions—pressed, released, or bumped (both pressed and released). It is connected to Port 1 of the EV3 Brick.

Pressed Released Bumped

Infrared Sensor

The Infrared Sensor is a digital sensor that can detect infrared light reflected from solid objects. It can
also detect infrared light signals sent from the Remote Infrared Beacon. The Infrared Sensor can be
used in three different modes: Proximity Mode, Beacon Mode, and Remote Mode. It is connected on
Port 4 of the EV3 Brick.

The current LEGO kit includes an Ultrasonic Sensor instead of an Infrared Sensor for object detection.

Ultrasonic Sensor

The digital EV3 Ultrasonic Sensor generates sound waves and reads their echoes to detect and measure distance from objects. It can also send single sound waves to work as sonar, or listen for a sound wave that triggers the start of a program. It measures distances between 1 and 250 cm (1 to 100 in.), with an accuracy of +/- 1 cm (+/- 0.394 in.). To use this sensor in the EV3 Software, the Ultrasonic Sensor block needs to be imported. It can be connected to Port 4 of the EV3 Brick.

Color Sensor

The Color Sensor is a digital sensor that can detect the color or intensity of light that enters the small
window on the face of the sensor. This sensor can be used in three different modes: Color Mode,

Reflected Light Intensity Mode, and Ambient Light Intensity Mode. For the
best accuracy, the sensor must be held at a right angle, close to - but not
touching - the surface it is examining. It is connected on Port 3 of the EV3
Brick.

Color Mode: The Color Sensor recognizes seven colors—black, blue, green, yellow, red, white, and brown—plus No Color. This ability to differentiate between colors means your robot might be programmed to sort colored balls or blocks, speak the names of colors as they are detected, or stop action when it sees red.

Ambient Light Intensity Mode: Color Sensor measures the strength of light
that enters the window from its environment, such as sunlight or the beam of a
flashlight. The sensor uses a scale of 0 (very dark) to 100 (very light). This
means your robot might be programmed to set off an alarm when the sun rises in
the morning or stop action if the lights go out.

Reflected Light Intensity Mode: Color Sensor measures the intensity of light
reflected back from a red light–emitting lamp. The sensor uses a scale of 0
(very dark) to 100 (very light). This means your robot might be programmed to
move around on a white surface until a black line is detected, or to interpret a
color-coded identification card.

Gyro Sensor

The digital EV3 Gyro Sensor measures the robot’s rotational motion and
changes in its orientation. It can be used to measure angles, create balancing
robots, and explore navigation systems and game controllers. The angle mode
measures angles with an accuracy of +/- 3 degrees, whereas the Gyro mode has
a maximum output of 440 degrees/second and a sample rate of 1 kHz. To use this sensor in the EV3 Software, the Gyro Sensor block needs to be imported. It can be connected to Port 2 of the EV3 Brick.
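If the EV3 Brick is running the community python-ev3dev2 environment instead of the graphical EV3 Software (an assumption, not part of the kit's standard workflow), the two Gyro Sensor readings described above can be accessed as simple properties, as in this minimal sketch:

from ev3dev2.sensor import INPUT_2
from ev3dev2.sensor.lego import GyroSensor

gyro = GyroSensor(INPUT_2)   # Gyro Sensor connected to Port 2, as in this manual

print("Accumulated angle (deg):", gyro.angle)   # angle mode
print("Rotation rate (deg/s):  ", gyro.rate)    # rate ("Gyro") mode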

Brick Apps

The EV3 Brick comes with four brick applications preinstalled and ready to
use. In addition, you can also make your own apps in the EV3 Software.
Once downloaded to the EV3 Brick, the homemade apps will be displayed
here.

Port View

On the first screen in the Port View, you will see, at a glance, which
ports have sensors or motors attached. Use the EV3 Brick Buttons to
navigate to one of the occupied ports and you will see the current
readings returned from the sensor or motor. Attach some sensors and
motors and experiment with the different settings. Press the Center
button to see or change the current settings for the attached motors and
sensors. Press the Back button to get back to the Brick Apps main
screen.

Motor Control

Control the forward and reverse movement of any motor connected to one of the four output ports. There are two different modes. In one
mode, you will be able to control motors connected to Port A (using the
Up and Down buttons) and to Port D (using the Left and Right buttons).
In the other mode, it is motors connected to Port B (using the Up and
Down buttons) and Port C (using the Left and Right buttons) that you control. Use the Center button
to toggle between the two modes. Press the Back button to get back to the Brick Apps main screen.

IR Control

Control the forward and reverse motion of any motor connected to one
of the four output ports using the Remote Infrared Beacon as remote
control and the Infrared Sensor as receiver (the Infrared
Sensor must be connected to Port 4 in the EV3 Brick). There are two
different modes. In one mode, you will be using Channels 1 and 2 on
the Remote Infrared Beacon. On Channel 1, you will be able
to control motors connected to Port B (using Buttons 1 and 2 on the
Remote Infrared Beacon) and to Port C (using Buttons 3 and 4 on the
Remote Infrared Beacon). On Channel 2, you will be able to control motors connected to Port A
(using Buttons 1 and 2) and to Port D (using Buttons 3 and 4). In the other mode, you can control
your motors in the exact same way by using Channels 3 and 4 on the Remote Infrared Beacon instead.
Use the Center button to toggle between the two modes. Press the Back button to get back to the
Brick Apps main screen.

Brick Program

The EV3 Brick comes with an on-brick programming application that lets you program your robot without needing any external software.

The Start screen provides you with a Start and a Loop block that are connected via a Sequence Wire. The vertical broken Add Block line in the middle indicates that you can add more blocks to your program. Press the Up button to add a new block from the Block Palette.

In the Block Palette, you can choose which new block to add by navigating using the Left, Right, Up, and Down buttons. Generally, there are two types of blocks—Action and Wait. The Action Block Indicator is a small arrow at the top right of the block. The Wait Block Indicator is a small hourglass. In total, there are six different Action blocks and eleven different Wait blocks to choose from. Pressing the Center button selects the block. Customization options are also available with each block; for example, you can select the direction of motor movement in the motor blocks.

Software User Interface

Using the MINDSTORMS EV3 software, you can build programs on your computer and download them to your EV3 Brick.

Every time you open the EV3 Software, you will automatically start out in the Lobby area. The
Lobby makes it easy to locate and work with the software and gives you access to everything you
need. In the Lobby you will find the following options and resources:

1. Lobby Tab: This button always returns you to the Lobby.


2. Add Project: Here you add a new project so that you can start programming your own robot.
3. Robot Missions: Here you can get started with building and programming the five main models.
4. Open Recent: Get easy access to the latest projects you have worked with.
5. Quick Start: Support resources such as short introductory videos, User Guide, and Software
Help.
6. News: Small stories and news splashes from LEGO.com/mindstorms (Internet connection is
required).

7. More Robots: Access to building and programming more models (Internet connection is
required).

Program your robot in the intuitive icon-based Programming Environment. Drag and drop the actions
that you want into the programming window and adjust them to suit your robot’s behavior. The EV3
Programming Environment consists of the following main areas:

1. Programming Canvas: Lay out your program here.


2. Programming Palettes: Find the building blocks for your program here.
3. Hardware Page: Establish and manage your communication with the EV3 Brick here and
see what motors and sensors are connected where. This is also where you download programs
to the EV3 Brick.
4. Content Editor: A digital workbook integrated into the software. Get instructions or
document your project using text, images, and videos.

5. Programming Toolbar: Find basic tools for working with your program here.

EXERCISES:

1) Test the different sensors and motors connected to appropriate ports on the EV3 by using the Port View and Motor Control apps. Explain the output of each mode/program of the apps, or the lack thereof in certain cases.

Sensor/Motor | Brick App | Modes/Program | Output
Touch Sensor | Port View | |
Infrared Sensor | Port View | |
Infrared Sensor | IR Control | |
Ultrasonic Sensor | Port View | |
Color Sensor | Port View | |
Gyro Sensor | Port View | |
Motors | Port View | |
Motors | Motor Control | |

2) Using EV3 brick programming app, program your robot to perform any function of your liking
and draw the block structure below.

Block Structure Output Concept

3) Using EV3 software, program your robot to perform the same function as in (2) with added
customizations.

LAB # 04

OBJECT: Build and program an Obstacle Avoiding Car using Ultrasonic sensor with LEGO
Mindstorms EV3 Kit.

REQUIREMENTS:
• LEGO Mindstorms EV3 kit
• Mindstorms EV3 Software

THEORY:

Concept

This robot has an ultrasonic sensor that tells the robot when it is about to hit an obstacle. The program
will make the robot drive around the room, turning each time it is close to hitting any object.

Building Instructions

Follow instructions given below, from fig.1 to fig.45, to build a driving base for your bot car.
You can now add specific components to this driving base to build an obstacle-avoiding bot car with an ultrasonic sensor up front. Follow the instructions shown in fig.1 to fig.8.

EXERCISES:

1) Design a program that tells the robot to go straight ahead until a certain distance to an obstacle is
encountered, then back up a little, turn right, then go back to going straight again (repeating
forever).
2) Modify this program with your own ideas of what to do when the robot is about to hit something.
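For reference, the logic of exercise 1 can be sketched in Python using the community python-ev3dev2 library. This is an assumption for illustration only (the lab itself uses the graphical EV3 Software), and the 20 cm threshold and the backing/turning durations are example values to be tuned:

from ev3dev2.motor import MoveTank, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor import INPUT_4
from ev3dev2.sensor.lego import UltrasonicSensor

tank = MoveTank(OUTPUT_B, OUTPUT_C)      # two Large Motors on Ports B and C
eyes = UltrasonicSensor(INPUT_4)         # Ultrasonic Sensor on Port 4

while True:                                          # repeat forever
    tank.on(left_speed=40, right_speed=40)           # go straight ahead
    if eyes.distance_centimeters < 20:               # obstacle closer than 20 cm
        tank.on_for_seconds(-40, -40, 0.7)           # back up a little
        tank.on_for_seconds(40, -40, 0.5)            # turn right on the spot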

LAB ASSESSMENT

Each outcome below maps to CLO 3 and PLO 5 (Psychomotor domain, Level 3) and is graded as Competent (10-7), Developing (6-4), or Needs Work (3-1).

Understand Working and Operation of Robots: __________
Identify, Operate, and Use Robots in Multiple Applications: __________
Evaluate Limitations of Robots and Apply Appropriate Assumptions: __________

LAB # 05

OBJECT: Build and program a Line Following Bot Car using Color sensor with LEGO
Mindstorms EV3 Kit.

REQUIREMENTS:
• LEGO Mindstorms EV3 kit
• Mindstorms EV3 software

THEORY:

Concept

This robot has a Color Sensor that searches for a black line to follow. The "Two-State" or "Zig-Zag" method of line following is very simple: the robot constantly turns left and right as it sees either side of the color boundary. Thus, it is never actually moving straight ahead, even when the line is straight.
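The two-state idea can be sketched in a dozen lines of Python. This sketch assumes the community python-ev3dev2 library rather than the graphical EV3 Software used in the lab, and the 50% reflectance threshold is an example value to be tuned for your mat:

from ev3dev2.motor import MoveTank, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor import INPUT_3
from ev3dev2.sensor.lego import ColorSensor

tank = MoveTank(OUTPUT_B, OUTPUT_C)
eye = ColorSensor(INPUT_3)               # Color Sensor on Port 3, facing the mat

THRESHOLD = 50                           # % reflected light separating black from white

while True:                              # two-state ("zig-zag") follower
    if eye.reflected_light_intensity < THRESHOLD:
        tank.on(left_speed=30, right_speed=10)   # on the black line: curve one way
    else:
        tank.on(left_speed=10, right_speed=30)   # off the line: curve back the other way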

Building Instructions

Follow instructions given in lab#04 to build a driving base for the bot car. Then, implement the given
steps, from fig.1 to fig.6., to successfully design a line following robot.


EXERCISES:

1) Design a program that tells the robot to follow a black colored path, turning left/right when
encountering a non-black region (repeating forever).
2) Modify this program with your own ideas of what to do when the robot is deviating from its path.


LAB ASSESSMENT

Each outcome below maps to CLO 3 and PLO 5 (Psychomotor domain, Level 3) and is graded as Competent (10-7), Developing (6-4), or Needs Work (3-1).

Understand Working and Operation of Robots: __________
Identify, Operate, and Use Robots in Multiple Applications: __________
Evaluate Limitations of Robots and Apply Appropriate Assumptions: __________


LAB # 06 (OPEN ENDED LAB)

OBJECT: Using the background information on LEGO Mindstorms EV3, design and
conduct a LEGO Robot experiment related to self-driving cars.

EQUIPMENT:
• LEGO Mindstorms EV3 kit
• Mindstorms EV3 Software

TASK:
Use the given equipment to design a robot that can work as a self-driving car by virtue of the
following characteristics:
• Follows a black line using input from Color Sensor
• Detects obstacles in paths using input from Ultrasonic Sensor
• Plots course to avoid obstacle and find its way back to the original path using input from Gyro
Sensor
• Manually overrides all processes using input from Touch Sensor
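One possible way to organize these four behaviors is a single priority loop, sketched below in Python-style pseudocode. Every helper name here is hypothetical and simply stands for the sensor readings and motions listed above; the actual design is left to you:

# Hypothetical skeleton of a priority-based control loop for the self-driving task.
def touch_override_active():   return False  # Touch Sensor: manual override
def obstacle_ahead():          return False  # Ultrasonic Sensor: obstacle in the path
def follow_line_step():        pass          # Color Sensor: one zig-zag step along the line
def detour_and_return():       pass          # Gyro Sensor: steer around, then rejoin the path
def stop_all_motors():         pass

while True:
    if touch_override_active():      # highest priority: manual override
        stop_all_motors()
    elif obstacle_ahead():           # next: plot a course around the obstacle
        detour_and_return()
    else:                            # default behavior: follow the black line
        follow_line_step()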

LAB REPORT:
Your lab report should include the following sections:

Purpose
This is a statement of the problem that your robot is going to solve. It should include all the challenges your robot is capable of meeting.

Equipment
A detailed list of all the laboratory equipment used in your experiment.

Procedure
• Step-by-step instructions of your design process, carefully explained in a numbered sequence
• Screenshot of the program that was implemented for execution on the EV3 brick

Conclusion
Discuss any questionable or surprising results. Explain the possible source of any error or objectionable
results. Suggest changes in experimental design that might test your explanations.


LAB ASSESSMENT

Each outcome below maps to CLO 3 and PLO 5 (Psychomotor domain, Level 3) and is graded as Competent (10-7), Developing (6-4), or Needs Work (3-1).

Understand Working and Operation of Robots: __________
Identify, Operate, and Use Robots in Multiple Applications: __________
Evaluate Limitations of Robots and Apply Appropriate Assumptions: __________


LAB # 07 (OPEN ENDED LAB)

OBJECT: Using the background information on LEGO Mindstorms EV3, design and
conduct a LEGO Robot experiment related to mimicking human motion and function.

EQUIPMENT:
• LEGO Mindstorms EV3 kit
• Mindstorms EV3 Software

TASK:
Use the given equipment to design a robot that mimics a specific body part by virtue of the following characteristics:
• Mimics human motion providing a minimum of 3 Degrees of Freedom (DOF)
• Mimics human function (such as gripping, eating, kicking, etc.) using a combination of inputs
from at least two sensors

LAB REPORT:
Your lab report should include the following sections:

Purpose
This is a statement of the problem that your robot is going to solve. It should include all the challenges your robot is capable of meeting.

Equipment
A detailed list of all the laboratory equipment used in your experiment.

Procedure
• Step-by-step instructions of your design process, carefully explained in a numbered sequence
• Screenshot of the program that was implemented for execution on the EV3 brick

Conclusion
Discuss any questionable or surprising results. Explain the possible source of any error or objectionable
results. Suggest changes in experimental design that might test your explanations.


LAB ASSESSMENT

Each outcome below maps to CLO 3 and PLO 5 (Psychomotor domain, Level 3) and is graded as Competent (10-7), Developing (6-4), or Needs Work (3-1).

Understand Working and Operation of Robots: __________
Identify, Operate, and Use Robots in Multiple Applications: __________
Evaluate Limitations of Robots and Apply Appropriate Assumptions: __________


LAB # 08
OBJECT: Explore RHINO Robot System and RoboAnalyzer

REQUIREMENTS:
• RoboAnalyzer software
• RHINO XR4 Robot System
• LabView

THEORY:

RHINO Robot System

RHINO robot system has all the elements of an industrial robot, which include:
• XR-4 Robotic Arm
• MARK IV Controller
• Electrically Controlled Gripper
• RHINO Teach Pendant
• Storage on Host Computer

RHINO Robot system consists of a microprocessor-based robot controller plus a Host Computer for
Arm Motion, programming options and permanent program storage.

Robotic Arm

• Five motion axes are provided which include rotation at waist, shoulder and elbow for
positioning of hand and flex and rotation motion in the wrist for orientation of the gripper.
• Axis Drive motors are DC servo motors. Motion is transferred from these motors to joints by
chains and lever linkages.
• Axis drive motors are provided with optical encoders to determine the position of joint angles and
arm position. Encoders tell the distance traveled and direction of rotation. Encoders can be
accessed by removal of black covering on motor's ends.
• Ribbon cables are used to supply power to the drive motors and to carry data from each drive motor's encoder. A sixth drive motor is used to operate the gripper (the gripper's servo drive motor does not cause motion in any arm axis).

MARK IV Controller

• The main power switch applies 120 volts AC power to the controller.


• Connector labeled "Pendant" is for the connection of Teach pendant cable.
• RS-232C serial port labeled as "HOST" is used for serial communication between computer and
robot controller.


• Connectors A through F are used to connect six arm servo motors. Two additional ports, G and H
can be used to connect auxiliary motors. These ports provide 20 volts dc pulses to drive motors.
• 16 input ports are provided (state of 8 inputs is indicated by LEDs and other 8 are in the form of
switches) and 8 output ports. I/O ports are discrete (either "on" or "off") signal ports. These ports
are current devices where input or output is considered 'ON' when passing current and 'OFF'
otherwise. These ports allow many industrial devices to interface directly with the controller.
• Opto-isolators are used on inputs and outputs to provide protection to external devices and the
controller.
• I/O power connector can provide 12 volts at up to 3 amps. It can be used to power external
electronics or sensors used.
• Auxiliary port terminal block provides switched voltage output (-20 to +20 volts) under robot
program control.
• An 80188 microprocessor, power supply, and solid-state memory are the major components of the microcomputer inside the controller.
• Operating system software in the controller's memory allows it to send and receive information through its interfaces, change arm position, and operate work cell hardware.

MARK IV Teach Pendant

• Teach pendant is a hand-held device that aids in programming the robot arm and is connected to
the controller by a round cable.
• It is a 29-key microprocessor-controlled programming unit with a 2 line by 16-character LCD
display.
• There is an Emergency STOP key provided on it which will immediately stop all motors.
• The Teach Pendant allows program development in two ways: 1) complete programs can be generated using the Teach Pendant keys, and 2) arm positions written on the host computer are taught using the Teach Pendant.
• The Teach Pendant operates in 4 major modes:
1. PLAY: allows manipulation of arm and various interfaces without storing information
2. EDIT: allows you to create a program that is stored and can be replayed. It also allows
modifying existing Teach pendant programs.
3. RUN: replays or executes a previously taught program
4. HOST: transfers control from pendant to a host or external computer.

Special Function Keys

Teach Pendant has several special function or multiple purpose keys.


1. ENTER: It is used to execute a command that was created by other key functions.
2. ESCAPE: It is primarily used to abort any command before using ENTER key. In this case it is
like backspace except rather than erasing a character, the entire command is erased. It is also used
to prematurely abort any ongoing function. It also terminates certain modes of pendant operation.
3. UP, DOWN: These are used to answer a YES or NO, to provide OFF or ON state or to scroll
through a program. They are also used to move individual motors to some position.


4. CONFIG: It is used to configure the MARK IV controller, to invoke some seldom used functions
and to change the sign of some given command value.

Home Position

• HOME is a defined or known position in the work envelope from which all new programs and
stored programs start.
• RHINO robot system has two types of Home positions.
o HARD HOME is a mechanical home position determined by limit switches located on the waist, shoulder, elbow, wrist flex, and wrist rotate axes. A Hard Home must be performed at least once before a new move sequence is taught or a stored sequence is loaded and executed. To begin the Hard Home procedure, press the CONFIG key. At this point you can either press ESCAPE to abort the function or press ENTER to execute it. Once pressed, the robot will start to move and search for the limit switches on all axes.
o SOFT HOME is a software home and can be set at any position in the work envelope.
The controller saves the position of the robot and after the Soft home key is pressed, the
robot is commanded to return to the saved position. Soft home is reached much more
quickly than Hard home.

Program Storage Device

• The hard drive on the computer is the permanent storage device for work cell programs produced
on the RHINO robot system.
• The second type of permanent storage is present in the MARK IV controller: the robot operating system is permanently stored on a solid-state memory device called an EPROM.
• The third storage device is the double EEPROM in the controller, used to hold a Teach Pendant program.

RoboAnalyzer

RoboAnalyzer is a 3D-model-based robotics learning software. It has been developed to help students learn the concepts of robotics.
1. 3D Model: This tab helps you visualize the 3D model of the robot that you build using the D-H parameters of the robot. It is present on the top right side of the main window.

2. Graphs: Also present on the top right side of the main window, this tab helps you visualize graphs of various joint parameters, such as joint values (degrees) and joint velocities. It provides information on individual joints and links. This option remains disabled until an analysis has been performed. You can change the appearance (color and line type) of any parameter (Joint Value, Joint Velocity, etc.) on the graph by using the tool palette in its lower left corner.

3. Zoom: This option helps you zoom in and out on the robot model that you build. It can be found in the 3D Model tab.


4. Select View: Helps you toggle between multiple views of the simulation, such as top, front, side, etc. It can be found in the 3D Model tab.
5. Analytical Functions: This allows you to
perform the Forward Kinematics, Inverse
Kinematics, Forward Dynamics and
Inverse Dynamics analysis.
• Forward Kinematics Analysis: For this, input the joint variable(s) in the Link Config section at the bottom right. Then hit the FKin button and press the Play button; the simulated robot will move according to the joint variables you entered. After the simulation you can observe the End Effector configuration (EE Config in the bottom right), which is shown as a homogeneous transformation matrix (a numerical sketch of this matrix is given after this list).
• Inverse Kinematics Analysis: Can be accessed using the IKin button in the 3D Model section. This option is available only for a limited set of robots and robotic configurations. You can select your robot from the Select Robot option. By selecting a particular robot, you will see different options appearing in the Inverse Kinematics window.

6. Virtual Robots: This allows you to simulate some of the commonly found robots.

7. Custom Robot: Accessible through the bottom left corner under the DH Parameter tab, this allows you to build robots for simulation. It allows you to select the DOF and various robotic configurations. You can build your model by just selecting the DOF and robotic configuration (joint types) and pressing the OK button.

8. D-H Parameter Table: From the Custom Robot section, this allows you to input the D-H parameters of the robot that you want to simulate. It also allows you to input the joint variables for visualizing the movements of various joints.

9. Visualizing Tools: The Visual D-H tab at the bottom right helps you visualize the D-H parameters and the End Effector configuration after running a Forward Kinematics analysis. This will help you understand D-H parameters, which are difficult to grasp without 3D visualization.
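As referenced in the Forward Kinematics item above, the EE Config matrix is the product of one homogeneous transformation per joint, each built from that joint's D-H parameters. The Python/NumPy sketch below shows the standard D-H transform; it is a generic illustration (not RoboAnalyzer's own code), and the two sample joints use arbitrary example values, not the RHINO's actual parameters:

import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard D-H homogeneous transform for one joint (angles in radians)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# End-effector configuration = chained product of the per-joint transforms.
# Arbitrary two-joint example: (theta, d, a, alpha) per row.
joints = [(np.deg2rad(30), 100.0,   0.0, np.deg2rad(90)),
          (np.deg2rad(45),   0.0, 200.0, 0.0)]
T = np.eye(4)
for theta, d, a, alpha in joints:
    T = T @ dh_transform(theta, d, a, alpha)
print(np.round(T, 3))    # homogeneous transformation of the end effector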

EXERCISES:

1) Explore RHINO hardware and suggest one improvement for better functioning.

______________________________________________________________________________
______________________________________________________________________________


______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________

2) What is the difference between Hard Home and Soft Home?


______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________

3) Simulate any available robot and attach its screenshot.

4) Simulate RHINO XR4 using given D-H parameters and verify the end effector configuration (EE
Config) using RoboAnalyzer and LabView (to operate the bot).

(D-H parameter table for the RHINO XR4 as provided with the lab sheet; the legible entries are 95 mm on axis 1, 230 mm on axes 2 and 3, 15 on axis 4, and 135 on axis 5, with the remaining listed entries zero.)


LAB # 09

OBJECT: Explore, operate, and program ROBOTIS GP Humanoid Robot

THEORY:

Introduction to ROBOTIS GP Humanoid

• Humanoid robot with high-quality Dynamixel AX-18A (legs) and AX-12A actuators
• Easy assembling of multiple parts through
nut and horn
• Excellent mobility such as turning during
walking or high-speed walking
• Basic humanoid motions such as combat and
soccer modes provided
• Self-position-correcting using Gyro sensor
• Gripper set and sensors included for various missions (in advanced versions)
• Wireless remote included (Zigbee installed)
• Up-to-date version of RoboPlus programming software included
• Digital Packet communication and simple Daisy Chain cable arrangement

Key Components & Functions

ROBOTIS BIOLOID GP comes with a variety of open components with convenient nut assembly.
Following is a brief review of major components:
1. Dynamixel AX-12A: Robot exclusive actuators operating on a torque of 15 kgf-cm at 12V, and
with a movement range of 300 degrees.
2. Dynamixel AX-18A: Robot exclusive actuators operating on a torque of 18 kgf-cm at 12V, and
with a movement range of 300 degrees. Present in legs.
3. Gyro Sensor Module: Used to adjust posture while walking, by observing angular velocity on 2 axes.
4. CM-530: Robot exclusive controller, with 6 ports for user created sensors and 5 ports for AX-12
actuators. Also has sound sensor, buzzer, and fuse installed.
5. RC-100B: Robot exclusive remote controller with Zigbee installed.
6. Bluetooth (BT-410 Set): Provides non-interference signal control and allows simultaneous
control of multiple robots.
7. Rechargeable Battery & Charger: Lithium rechargeable battery can be charged with an input
voltage of 100-240V.


8. Aluminum Frames: Optimized, lightweight frames.


Operating ROBOTIS GP

Switching the Robot On

• Turn the power switch on and the LED will start blinking.
• Use the MODE button to move the LED to “PLAY” (The LED will move each time you press the
MODE button).
• Press the START button (Check whether the LED on “PLAY” is blinking).
• If the LED does not turn on, check the power cable/connection. If there are no problems with the cable, recharge your battery.

Checking Robot’s Posture

• When the robot is turned on, it defaults to the pose shown below.
• Check the angles of the arms and legs. If they are different from the picture, go back to "Check Assembly Mode".
• Check whether the cables have been assembled on the outer part of the leg.
• If the robot has been assembled incorrectly to an extent where it may be severely damaged, a warning sound will be activated. The LED of the motor with the problem will also turn on, and that motor will release its torque to prevent damage.


Robot in Action

• The robot can be operated by the RC-100 remote controller.


• First step includes pressing either U, D, L, or R buttons at the back of the robot to select action
modes, given below:
U = Soccer Mode
D = Battle Mode
L or R = Performance Mode
• Operating the robot with the RC-100 without selecting the mode will automatically start the
soccer mode.
• Second step includes turning on the remote controller by pressing the power button for 2 seconds.

• Pressing buttons or combinations of buttons on the remote controller then activates actions belonging to the selected mode.

Buttons Motion Buttons Motion

Walking (during Soccer/Battle)


U Forward D Backward
L Turn Left R Turn Right
U+L Walk Forward + Left U+R Walk Forward + Right
L+5 Left Sidestep L+5+6 Fast Left Sidestep
L+U+5 Left Forward Diagonal Step L+D+5 Left Backward Diagonal Step
R+5 Right Sidestep R+5+6 Fast Right Sidestep
R+U+5 Right Forward Diagonal Step R+D+5 Right Backward Diagonal Step

Standard
1+U Gets up facing up 1+D Gets up facing down
5+6+U+1 Soccer mode (Change mode) 5+6+D+3 Battle mode (Change mode)
5+6+L+2 Performance mode (Change mode) - -

Soccer Mode
2+U Left Leg + Forward Kick 4+U Right Leg + Forward Kick
2+D Left Leg + Back Kick 4+D Right Leg + Back Kick
2+L Left Leg + Left Kick 4+L Right Leg + Left Kick
2+R Left Leg + Right Kick 4+R Right Leg + Right Kick
3 Defense Standby 3+L Block Ball + Left
3+U Defense 3+R Block Ball + Right


Battle Mode
2+U Hit + Forward Attack 3 Defense
2+L Hit + Left Attack 2+R Hit + Right Attack
4+U Upper Body Tackle 4+D Lower Body Tackle
4+L Left Strong Tackle 4+R Right Strong Tackle

Performance Mode
2+U Gritting 2+R Handstand
2+D Clap (Twice) 3+U Roll on Side
2+L Clap (337) 3+D Push ups

Programming ROBOTIS GP

Using the RoboPlus programming software, you can easily make your own motions and control the
robot.
Introduction to RoboPlus

RoboPlus is a software suite meant to enable a smooth interface between any hardware obtained from the ROBOTIS company and one's personal computer. The suite consists of the following four basic programming tools:


1. RoboPlus Task: Used to develop custom programs easily by color coding


2. RoboPlus Motion: Used to create animations by adjusting position and velocity of motors
3. RoboPlus Manager: Used to manage controllers and components
4. RoboPlus Terminal: Used to manage controller firmware

Making a Basic Program

Let’s make a program or task code that prints numbers on the output
screen as shown in figure. We’ll be using RoboPlus Task for this
purpose.

1. Execute RoboPlus Task Program: As seen in the picture below, go to Start > All Programs > ROBOTIS > RoboPlus > Software > RoboPlus Task to execute RoboPlus Task.

2. Select a Controller: Double click an empty line or press Enter. In the Select Control window, select the controller to use, then press the OK button.

3. Generating Start Program: Select Start Program from the Select Instruction Type window. Start Program will be automatically generated in RoboPlus Task.


4. Input Endless Loop command: To print the numbers on the screen endlessly, use the Endless
Loop command (Create a command line). Double click or press Enter on an empty line between
{} of Start Program to invoke the Select Instruction Type window. Select Loop > Endless Loop
(while (1)) from the list.

5. Input Load command: Use Load command to input a Print command, which is needed to print
numbers on the screen. Insert Execute > Load (Assignment value) into an empty line between {}
of Endless Loop.


6. Load 1 into Print: Choose the left parameter (?) among the Load parameters (Explanation on the
parameter). The left parameter receives input from the right parameter. Double click the left
parameter (?), or press Enter key after clicking it once to invoke the Select Parameter Window.
Select Controller > Print then press OK.

Select Constant Numbers > Number > 1 for the right parameter (?) in the same way.

When both parameters of the Load command have been set, it should look like below.


7. Load 2 into Print with Line: Select } under Print command (at the end of the endless loop
section), and add new lines by pressing the Space key. Repeat Steps 5 and 6 to input the Load
command and to input Controller > Print with Line and 2. The final task code is shown below.

8. Save Task Code: Press Ctrl + S or click the Save icon.
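For reference, the task code assembled in steps 4 to 7 amounts to the following logic, sketched here in Python rather than RoboPlus Task syntax (Print writes without ending the line, Print with Line ends it):

# Equivalent logic of the RoboPlus Task example (not RoboPlus syntax).
while True:            # Endless Loop, while(1)
    print(1, end="")   # Load 1 into Print           -> prints "1" with no line break
    print(2)           # Load 2 into Print with Line -> prints "2" and ends the line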


Executing Task Code

Once a program has been successfully developed, it must be uploaded and executed by the robot
controller.

1. Open the Program Output Monitor: To see the output of the program, you must open the Program Output Monitor BEFORE executing the program. There are three ways to open the Program Output Monitor:
• Click on the View Print of Program in the Download Program window
• Click on the View Print of Program button in TOOLS.
• Press F5 or click on View Print of Program (V) menu under Program (P).
2. Executing the Program: When you turn on the controller, the LED will blink, showing it is in
standby mode. Press the MODE button to move it to PLAY, then press START to execute the
downloaded task code. You should see “1” and “2” being printed on the Program Output
Monitor.

EXERCISES:

1) Observe and state 3 limitations in mobility of robot as seen when executing motion through
remote controller.
a. _______________________________________________________________________
b. _______________________________________________________________________
c. _______________________________________________________________________

2) Write a task code using RoboPlus Task Program that performs the following three actions:
a. U button pressed = AUX LED turns on
b. D button pressed = AUX LED turns off
c. Start button pressed = Program terminated
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
3) Write a task code using RoboPlus Task Program, performing an action of your choosing. Also
create a concept map explaining the output. Attach screenshots where necessary.


Concept Map:

______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
___________________________________________________________________________


LAB # 10

OBJECT: Explore and operate ROBOTIS OP2 Humanoid Robot

THEORY:

Introduction to ROBOTIS OP2 Humanoid

• Affordable, miniature-humanoid-robot platform to enable many exciting research, education, and outreach activities
• Built-in PC: Intel Atom N2600 @ 1.6 GHz dual core, 4GB DDR3 RAM, 32 GB mSATA, with high payload capacity and advanced computational power
• Management controller (CM-730) with sophisticated
sensors
• 20 actuators modules providing dynamic motion ability
(6 DOF leg x2 + 3 DOF arm x2 + 2 DOF neck)
• 1 Mbps high-speed Dynamixel bus for joint controller

Key Components & Functions

Following is a brief review of major components:


1. Main Controller: The main controller of the robot is built-in and powered by an Intel Atom N2600
dual-core, dual-threaded processor with 4 GB RAM. It also enables Wi-Fi connectivity, along with
Ethernet and mini HDMI.
2. Management Controller: The management controller is the CM-730, powered by an
STMicroelectronics ARM Cortex-M3 STM32F103RE. Though the manual states the controller to be
the CM-740, the model we have received uses a different IC (CM-730) but with the same
functionalities. This IC provides the interface to the various motors and sensors, including the
gyroscope, FSR, and accelerometer, while also driving the LEDs, RGB LEDs, push buttons, buzzer,
and external ADC port.
3. Dynamixel MX-28T: Robot exclusive actuators with durable metallic gears and high-speed
operations. Each actuator has its own ID or unique name that can be used to identify it in coding
scenarios for motion manipulation.
4. Camera: The 2 MP HD webcam with 1600x1200 resolution at 10-30 fps is used to collect
sensory data.
5. Rechargeable Battery & Charger: 1800mAh LIPO Battery (30 minutes of operations), charger
and external power adapter
6. Other Components: Additional components are shown in the accompanying figure.


Operating ROBOTIS OP2 Humanoid

ROBOTIS OP2 comes pre-configured with the following 4 modes of operation:


1. Demonstration Ready Mode
2. Autonomous Soccer Mode
3. Interactive Motion Mode
4. Vision Processing Mode
OP2 defaults to Demonstration Ready mode when turned on. To switch between modes, press the
"MODE" button; OP2 announces each mode with each press. Each mode has its own indicating LED.
To run a mode, press the "START" button. After pressing START, OP2 will stand up and begin
operations.


1. Demonstration Ready Mode

The Demonstration Ready Mode is the default mode activated when you turn OP2 on. LED 1 (red),
LED 2 (blue), and LED 3 (green) are on; the head LED changes from green to amber, and OP2
announces, "Demonstration-ready mode". OP2 is ready for action! OP2 remains in kneeling position
and does not move under this mode.

2. Autonomous Soccer Mode

OP2 follows and kicks a red ball (you can change the ball color) and plays soccer by itself. When
OP2 falls down (either on its back or belly), it gets up and resumes the ball search. To start the
Autonomous Soccer mode:
• Press the "MODE" button until LED 1 (red) is on. OP2 announces, "Autonomous soccer mode."
• Press the "START" button to begin. OP2 will stand up and announce either "Sensor calibration
failed" or "Sensor calibration complete."
• OP2 announces "sensor calibration failed" if the robot cannot calibrate its IMUs prior to walking.
OP2 will keep announcing "sensor calibration failed" until it achieves sensor calibration.
• To minimize or prevent OP2 from announcing "sensor calibration failed", place OP2 on a stable
surface. Do not shake OP2 while it is standing up, as this may result in a failure to calibrate its
sensors.
• OP2 announces "sensor calibration complete" when calibration is complete and begins soccer
mode. OP2 makes this announcement only once.
• When OP2 sees the ball, it walks towards the ball.
• Once the ball is close enough, OP2 kicks it with either the left or right foot. If OP2 falls during
pursuit or kick, it gets back up.
To stop Autonomous Soccer mode, press the "MODE" button, and OP2 returns to Demonstration-
Ready mode.


3. Interactive Motion Mode

OP2 performs pre-programmed motions sequentially while talking. To start Interactive Motion mode:
• Press the "MODE" button until LED 2 (blue) is on. OP2 announces, "Interactive motion mode".
• Press the "START" button to begin. OP2 will stand up and announce, "Start motion demonstration".
• OP2 then performs its pre-programmed actions sequentially.

To stop Interactive Motion mode, press the "MODE" button, and OP2 returns to Demonstration-Ready
mode.

4. Vision Processing Mode

OP2 performs the same motions as in Interactive Motion mode, but individually, depending on the
color card(s) shown. Color cards are supplied with the robot.


To start Vision Processing mode:

• Press the "MODE" button until LED 3 (green) is on. OP2 announces, "Vision processing mode".
• Press the "START" button to begin. OP2 announces, "Start vision processing demonstration" and
gets up.
• Select a color card and place it in front of OP2. The color card should be approximately 15 cm
(about 6 in) in front of OP2.
To stop Vision Processing mode, press the "MODE" button, and OP2 returns to Demonstration-Ready
mode. If OP2 has difficulty reading the card, you may need to adjust the color and white balance.


EXERCISES:

1) Start the Autonomous Soccer Mode and observe the limitations of the robot when following the ball.
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________

2) Start the Interactive Motion Mode, and observe the limitations of the following movements
performed by the robot:
a. Clap please ______________
b. Oops! ______________
c. Bye bye! ______________

3) Start Vision Processing Mode and observe limitations of card detection by the robot.
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________

4) The robot performs the same actions in Interactive Motion and Vision Processing Modes. State the
difference in their approaches.
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________


LAB # 11

OBJECT: Introduction to ROBOTIS OP2 framework and tutorial files.

THEORY:

The following flowchart diagram represents class breakdown and data pipelines. You may modify the
framework at /robotis/Framework


Source Code

You may find the source code directory at '/robotis' from ROBOTIS OP2's PC.

The pre-installed source code may be updated without prior notice. Please check for updates
periodically. You may obtain updated source code at the following website:
https://sourceforge.net/projects/darwinop/files/

Tutorials

The included tutorial programs can be found at '/robotis/Linux/project/tutorial' on OP2's PC. The
following tutorial programs are available for a better understanding of OP2 coding and
programming:
• Action_Script (‘/robotis/Linux/project/tutorial/action_script’)
• Ball_Following (‘/robotis/Linux/project/tutorial/ball_following’)
• Camera (‘/robotis/Linux/project/tutorial/camera’)
• Color_Filtering (‘/robotis/Linux/project/tutorial/color_filtering’)
• FSR (‘/robotis/Linux/project/tutorial/fsr’)
• Head_Tracking (‘/robotis/Linux/project/tutorial/head_tracking’)
• Read_Write (‘/robotis/Linux/project/tutorial/read_write’)

EXERCISES:

1) Run the Head Tracking program and state the color it is currently detecting. How can we change
that color?
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________

2) Run the Read and Write program from the tutorials and write the values in the table below

Left Motors     Values              Right Motors    Values

ID 1            ____________        ID 2            ____________
ID 3            ____________        ID 4            ____________
ID 5            ____________        ID 6            ____________


LAB # 12
OBJECT: Explore and run the Action Editor Program.

THEORY:

Introduction to Action Editor

Action Editor allows the user to control and edit OP2's motions and poses via the command line. This
process is done by manipulating values of the MX-28 actuator(s). Each actuator has its own ID map.
Before getting into Action Editor, be aware of the motion data file residing in the source code.

Motion File

The motion file contains OP2's poses and motion data. The data is read and written as positions of
the MX-28; thus manipulating/editing the file is a robot-low-level task. Since the motion file is a
binary file, you cannot view its contents directly. You can, however, view its contents with Action
Editor.

ROBOTIS currently supplies 2 motion files with the source code. They are in /robotis/Data directory.
These are:
• ‘motion_1024.bin’ for the MX-28 position sensor at 10-bit resolution (300 degrees available)
• ‘motion_4096.bin’ for the MX-28 position sensor at 12-bit resolution (360 degrees available)


The motion file contains 256 pages. Each page can store up to 7 stages (or steps) of motion data. In
the basic motion file provided, not all pages are used. You may add your own motions by making use
of the empty pages.

1. Page number: This is the listed page number. If you want to create new motion poses you may
use any empty page:
2. Page title: We recommend you use a page title if you will make use of an empty page.
3. Current position: This is the current position of the MX-28 for each ID. This data is represented
by STP7 in Action Editor. Sometimes the position may read as ???? in Action Editor. This means
position of the MX-28 is not being read (and torque is off). If you turn an MX-28 off, you will not
get current position reading until you turn it back on. You can turn off any or all MX-28 at will.
This is very convenient to make robot poses rather than entering position values. For example, if
you want to make a new robot pose simply turn any MX-28 off, make the pose physically on the
robot by yourself, and turn the MX-28(s) back on at that robot pose. Once turned on, you’ll get
the pose values.
4. Steps or stages: Each page can store up to 7 steps, from STP0 to STP6. However, some motions
may require more than 7 stages to perform completely. Simply use multiple pages and link them
with Next.
5. Next: This is a link indicating whether or not the motion continues on a different page. To continue
a motion, just list the page number where the motion is to be continued. Number 0 indicates the
motion does not continue onto another page (default value). Linked pages do not have to be in
numerical order, allowing you to link from one page to any other page.
6. Play Count: This is the number of times the motion of the page is to be played. For example, if
page 239 has a play count of 4 and a link to page 240, the motions on page 239 will be executed 4
times and playback will then move on to page 240 and continue with the motions there.
7. Exit: There may be times when a motion is stopped. In that case the robot may be in an unstable
position. Exit works much like Next, so Exit should be linked to a page where the robot can return
to a stable pose. Number 0 indicates that no exit page is linked (default value).

Although many pages are occupied with data, not all pages are actually used in OP2's motions.
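
To make the page linking described above concrete, the sketch below models a motion page in Python. This is purely illustrative: the class and field names are invented for explanation only and are not part of the ROBOTIS framework, which stores this data in the binary motion file edited through Action Editor.

class MotionPage:
    """Illustrative model of one motion-file page (hypothetical names)."""
    def __init__(self, title="", steps=None, play_count=1, next_page=0, exit_page=0):
        self.title = title            # 2. Page title
        self.steps = steps or []      # 4. Up to 7 steps (STP0..STP6), each a map of {servo ID: position}
        self.play_count = play_count  # 6. How many times this page is repeated
        self.next_page = next_page    # 5. Next: 0 (default) means the motion ends here
        self.exit_page = exit_page    # 7. Exit: page used to recover a stable pose if playback is stopped

def play_motion(pages, start_index):
    """Follow the Next links starting from page start_index (simulation only)."""
    index = start_index
    while index != 0:                     # page number 0 terminates the chain
        page = pages[index]
        for _ in range(page.play_count):  # repeat the page Play Count times
            for step in page.steps:
                print("page", index, "step", step)
        index = page.next_page            # continue on the linked page, or stop at 0

With this model, a page holding play_count=4 and next_page=240 reproduces the behaviour described for page 239 above: its steps run four times before playback continues on page 240.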

Getting Started with Action Editor

Action editor can be found at /robotis/Linux/project/action_editor. You can modify OP2 motion
data in a terminal window.

1. To read and write data, go to the directory: /robotis/Linux/project/action_editor

2. Make sure that there is an executable file named "action_editor". Please note that when running
Action Editor, the program will open the file motion_4096.bin by default. The illustrations are
from the motion files 'motion_1024.bin' and 'motion_4096.bin'. Remember that the motion files are
located at /robotis/Data
3. If the executable is not there, create it by typing make. The compiler will automatically generate
the file.
4. Run the program by typing ./action_editor. You will notice OP2's head LED changes from green
to amber. Remember the current angle resolution for the actuators.

5. Once in the program, type help for further information.


6. From there you may follow the options given to you.


7. To exit the program, type exit.

Please note the following:


• STP7 is the current value of the actuators. ???? means that torque has been released.
• PauseTime is the pause for motion playback for step STP[x].
• Time (x 8msec) is the time period for OP2 to complete step STP[x]. Each time unit accounts for
8 ms of time.


After typing help, the following list will appear:

The menu options are very extensive, so you may not be able to memorize every command. At any
time, you may type help to invoke the options list.
• exit: This exits the program. After exiting the program, press the "RESET" button on OP.
• re: Refreshes the screen.
• b: Moves to the previous page.
• n: Moves to the next page.
• page [index]: Moves to the [index] page. For example, typing page 5 outputs data from page 5
on screen.
• list: Outputs a list of pages.
• new: Initializes current page by clearing all actuator position data.
• copy [index]: Copies data from page [index] to current page. For example, if you are on page 5
and want to copy page 9 then type copy 9.
• set [value]: Sets position value on chosen actuator. For example, if you want ID19 (head pan) to
have a value of 512 then using the keyboard's directional keys place the cursor on ID19 and
type set 512.
• save: Saves any changes you've made. The saved motion file (motion.bin) can be found
at /robotis/Data/.
• play: Plays motion(s) of current page.
• name: Changes the name of the current page. You can view the name of the page at the top
right portion of the screen.
• w [index]: Overwrites STP [index] with data from STP7 (the very first column on the page).
• i: Inserts data from STP7 to STP0. Moves data from STP [x] to STP [x + 1] if any.
• i [index]: Inserts data from STP7 to STP [index]. Moves data from STP [index] to STP [index
+ 1] if any.
• m [index] [index2]: Moves data from [index2] to [index].


• d [index]: Deletes data from STP [index]. Moves data from STP [index] to STP [index - 1].
• on/off: Turns on/off torque from all Dynamixels.
• on/off [index1] [index2] [index3] …: Turns torque on/off from ID [index1] ID[index2]
ID[index3].

Motion Editing with Action Editor

Let's modify OP2’s pose when kneeling. Let's change the position of the left arm during kneeling.
Dynamixels for the left arm are at ID 2, 4, and 6. Before you begin, you may want to make a copy of
"motion_4096.bin" file and save it elsewhere. If you don't like the changes you've made you can
always revert to original data by overwriting the file.
1. Run Action Editor


2. Find the page where the kneeling (sit down) motion is by typing list. Notice that the motion data
is on page 15.

3. Exit the list and go to page 15 by typing page 15. With the current data values from page 15,
OP2's pose will look like this. Check by typing play.
4. Once on page 15, edit the values on ID 2, 4, 6. One of the easiest ways to edit values is to release
the torque on the Dynamixels of the left arm.
5. Release the torque on ID 2, 4, and 6 by typing off 2 4 6


6. After getting the desired pose, turn torque on again by simply typing on. Afterwards, make the
values for ID2, ID4, and ID6 on STP0 match those from STP7 (save your work).
7. Type play and you will notice the newly updated values for ID 2, 4, and 6.

8. Type save if you want this pose to be the new sitting pose whenever OP2 is kneeling (sit down).


EXERCISES:

1) Get the ROBOTIS OP2 to salute when it stands by following the given steps, and make
observations.
a. Go to page number ____ for the stand-up motion of the robot
b. Release torque in the motors of the right hand and set it to the salute position. Note down the
values obtained.

Left Motors     Values              Right Motors    Values

ID 1            ____________        ID 2            ____________
ID 3            ____________        ID 4            ____________
ID 5            ____________        ID 6            ____________
c. Mention any additional steps you performed for a complete motion.

___________________________________________________________________________
___________________________________________________________________________
___________________________________________________________________________
___________________________________________________________________________
___________________________________________________________________________


LAB ASSESSMENT

Outcome                                      CLO   PLO   Domain        Mapping Level   Competent (10-7)   Developing (6-4)   Needs Work (3-1)

Understand Working and Operation of Robots    3     5    Psychomotor         3

Identify, Operate, and Use Robots in
Multiple Applications                         3     5    Psychomotor         3

Evaluate Limitations of Robots and Apply
Appropriate Assumptions                       3     5    Psychomotor         3


LAB # 13

OBJECT: Explore, operate, and program NAO Evolution V5 Humanoid Robot

THEORY:

Introduction to NAO Evolution V5 Humanoid

• NAO is a programmable platform that comes with a fully humanoid hardware structure and is
increasingly becoming a standard in education and research.
• Having multiple applications, NAO is also being used as an assistant by companies and healthcare
centers to welcome, inform and entertain visitors.
• Provides 25 degrees of freedom which enable the robot to move and adapt to its environment.
• Houses 7 touch sensors located on the head, hands and feet, sonars and an inertial unit to perceive
its environment and locate itself in space.
• Has 4 directional microphones and speakers to interact
with humans.
• Speech recognition and dialogue feature available in 20
languages.
• Its two 2D cameras help recognize shapes, objects and
even people.
• NAO is an open and fully programmable platform.

Key Components & Functions

Following is a brief review of major components:


1. CPU: The main controller of the robot is built-in and powered by Intel Atom N2600 dual-core,
dual threaded, with 4GB RAM. It also enables Wi-Fi connectivity, along with ethernet and mini
HDMI.
2. Battery: Rechargeable battery providing 60-90 minutes of usage. A charger and external power
adapter are available.
3. Interaction Unit: Loudspeakers, microphones, 1.22-megapixel video cameras, infrared and
LEDs aid interaction with the robot.
4. FSR: Force Sensitive Resistors are located on the feet and have a working range from 0 N to 25
N. These sensors measure a resistance change according to the pressure applied.
5. Inertial Unit: Made up of a 3-axis accelerometer and gyroscope, its output data enables an
estimation of the torso speed and attitude (Yaw, Pitch, Roll). The Inertial unit is located in the
torso with its own processor.


6. Sonar: NAO is equipped with two ultrasonic sensors (or sonars) in the chest which allow it to
estimate the distance to obstacles in its environment; together they comprise two emitters and two
receivers.
7. Joint Position Sensors: 36 MRE (Magnetic Rotary Encoders) using Hall-effect sensor technology.
Each sensor provides 12-bit precision, i.e. 4096 values per turn (360°/4096 ≈ 0.09°), corresponding to
about 0.1° precision. Located at the joints.
8. Contact & Tactile Sensor: Capacitive tactile sensors located at head and hands. Bumpers acting
as on and off switch for contact present on each foot.
9. Motors: A total of 25 motors are present at important joint locations, each of varying type based
on functional and structural requirements.

Operating NAO Evolution V5

1. Default Position: Crouching position is the default and most stable position for NAO.
2. Turning On: Pressing the Chest Button once turns on the bot. The LEDs, blinking and fading,
inform you about the progress of the startup. The boot process is completed when NAO says
“OGNAK GNOUK”.
3. Accessing Web Page: Press the Chest Button once again. NAO says the four numbers of its IP
address: note them. Open a web browser and enter this IP address in the address bar. An
authentication window appears. Enter the username and password set by your university, and the
official NAO web page, customized to your robot, will appear.

4. Turning Off: Before doing so, make sure NAO is in a safe position, or it may fall. You could
also place a hand on its back to keep it in position. Press and hold the Chest Button for 3 seconds,
until NAO says “GNUK GNUK”. The shutdown process is completed when all the LEDs are off.

Programming NAO Evolution V5 Humanoid

Introduction to Choregraphe

Choregraphe is a multi-platform desktop application, allowing you to:

• Create animations, behaviors and dialogs,
• Test them on a simulated robot, or directly on a real one,
• Monitor and control your robot,
• Enrich Choregraphe behaviors with your own Python code.

Choregraphe allows you to create applications containing dialogs, services and powerful behaviors,
such as interaction with people, dance, and e-mail sending, without writing a single line of code. This
works on all robots by Aldebaran Robotics.


Choregraphe Interface

1. Main window: At startup, the following interface is displayed.


2. Panels: By default, few main panels are displayed:
A. Project files panel
B. Box libraries panel
C. Flow diagram panel
D. Robot View and Video monitor panel
E. Inspector panel and Robot applications panel

3. Menu Bar: Following items are displayed.


➢ File menu: This menu enables you to create, open and save a Project; add new content to the
current Project, modify its Project Properties; export the current Project as a CRG file or
build it to an Application. Finally, you can exit the application through this menu.
➢ Edit menu: This menu enables you to undo and redo the last actions made in the diagram. You
can also access the preferences of the application through this menu.
➢ Connection menu: This menu enables you to connect to and disconnect from your Aldebaran
robot. Once you are connected to your Aldebaran robot, you can also:
• play and stop the opened behavior.
• display or hide the Log viewer when an error occurs in the behavior.
• transfer files between your computer and your Aldebaran robot.
• update robot system (in Connection > Advanced) with a new version of its software.


➢ View menu: This menu enables you to manage the displayed panels. You can select the ones
you want to hide or display; you can also save and load named workspace Layouts or reset
your workspace layout to default.
➢ Help menu: This menu enables you to see the statistics of the current behavior (number of
boxes, number of lines, etc.) and to access the general documentation, the documentation
about Choregraphe and the API reference. You can also get information about the version of
Choregraphe you are currently using through this menu.
4. Toolbar: These buttons are shortcuts to actions you will often need while creating behaviors.
Note that keyboard shortcuts are also available.

Button(s) and Functions (the button icons appear on the toolbar):

• Create a New project, Open or Save a project.
• Undo and Redo the last actions made in the diagram.
• Connect, Disconnect or Try to reconnect your Aldebaran robot.
• Play or Stop the current Behavior.
• See the warnings and the errors that can occur during the execution of your behavior.
• When you click the Play button, the Progress bar shows the status of the Behavior loading. It can be:
  ▪ Grey: the behavior is not loaded.
  ▪ Moving green and grey: the behavior is loading.
  ▪ Green: the Behavior is loaded.
• Enables you to set the volume of NAO's speakers.
• Activate / deactivate the Animation Mode, which enables you to easily manipulate your Aldebaran
robot and store its position. This button can be:
  ▪ Green: the Animation Mode is deactivated.
  ▪ Orange: intermediate state where the Animation Mode is either loading or unloading.
  ▪ Red: the Animation Mode is activated.
• Turns on and off the Autonomous Life on the robot.
• Rest button. Sets the Stiffness off. If your Aldebaran robot is standing, before setting the Stiffness
off, he goes to the Crouch posture.
• Wake Up button. Sets the Stiffness on. Additionally, if your Aldebaran robot is crouched, he also
goes to the StandInit posture.
• Battery indicator. Indicates the level of battery of the connected Aldebaran robot. This indicator can be:
  ▪ Green: the level of the battery is almost at its maximum.
  ▪ Orange: the level of the battery is medium.
  ▪ Red: the level of the battery is very low and your Aldebaran robot is going to shut down in a
    few minutes if you do not plug it in.

Connecting to Choregraphe

1. Start Choregraphe on your computer.
2. Click the indicated "Connect" button from the toolbar.
3. The connection window will appear, with your robot listed in the table on the left.
4. Select your robot in the list and click the "Connect To" button.
5. Now you should be connected to NAO. Try moving some of NAO's body parts by hand. The
window to the right in Choregraphe will update to reflect the changes in NAO's joint positions.

Programming NAO Evolution V5 Humanoid

1. In Choregraphe, look at the "Box List" to the left. Navigate to Audio > Voice, then drag and drop
a "Say" box to the central area. This central area is called the workspace and contains the
commands that the robot executes.


2. To execute the "Say" box, we must connect it into the program "flow". Click on the small arrow
to the left of the workspace and drag a line to the arrow on the left of the "Say" box. The boxes
will be executed sequentially in the order that they are connected. The last box should connect to
the global stop [x] on the right-hand side of the work area.
3. Finally, click the green "Play" button next to the "Connect" button. NAO should say "Hello."

EXERCISES:

1) Explore NAO and Choregraphe and make a simple program that involves a sensor.
2) Explore NAO and Choregraphe and make a simple program that involves a motor.


LAB ASSESSMENT

Outcome                                      CLO   PLO   Domain        Mapping Level   Competent (10-7)   Developing (6-4)   Needs Work (3-1)

Understand Working and Operation of Robots    3     5    Psychomotor         3

Identify, Operate, and Use Robots in
Multiple Applications                         3     5    Psychomotor         3

Evaluate Limitations of Robots and Apply
Appropriate Assumptions                       3     5    Psychomotor         3


LAB # 14
OBJECT: Explore PeopleBot Telepresence Mobile Robot

THEORY:

Introduction to PeopleBot Telepresence Robot

PeopleBot is a differential-drive human-interaction robot built on the robust P3-DX base, with a
chest-level extension to facilitate interaction with people. It is capable of traversing low sills and
household power cords and of climbing most wheelchair ramps.

Components

• Struts and Decks


• Body and Nose
• Batteries and Power
• Position Encoders
• Sonar Arrays
• Protective IRs, Bumpers
• Emergency STOP
• Sick Laser
• User Control Panel
• Integrated PC
• Radio Ethernet
• Color PTZ Camera
• Sound System

Features

• Equipped with infrared table sensors to detect when it approaches a table and an optional 3DOF
gripper with sensors to detect and pick up a cup or other tabletop object.
• Furnished with an onboard computer and a SICK LMS-200 laser for highly precise navigation, plus
built-in sonar sensors.
• Can navigate autonomously and avoid obstacles with precision.
• Can be used for object and people recognition and tracking by utilizing the PTZ camera controlled
via ACTS.
• With the audio and speech package, it can record and play back sound, perform speech
recognition and speech synthesis.


PeopleBot Software Package

MobileRobots provides many software tools along with its robot base platforms and accessories, useful
for several levels of development – from basic communication with the robot all the way up to
complete navigation and localization packages.

1. ARIA:

ARIA provides an interface and framework for controlling and receiving data from all MobileRobots
(ActivMedia) robot platforms, as well as from many accessory devices, e.g. gripper, camera, etc.

2. MobileSIM Simulator:

MobileSIM is software for simulating mobile robots, for debugging and experimentation with
ARIA.

3. MobileEyes:

It is a graphical user interface client for remote operation and monitoring of the robot.

4. Mapper3:

It is a tool for creating and editing map files for use with ARIA, MobileSIM and navigation
software.

5. ACTS Color Tracking System

The Advanced Color-Tracking System (ACTS) provides easy-to-use and easy-to-configure
color-based visual object tracking for PeopleBot with a supported camera and image acquisition
interface on an onboard computer.

6. Speech Software

Mobile robots offer three libraries that add speech recognition and synthesis capabilities, including:
• Arspeechrec_Sphinx, an easy-to-use wrapper for the open-source Sphinx2 speech recognition
system.
• Arspeechsynth_Cepstral, which performs high-quality speech synthesis (text-to-speech) using the
Cepstral synthesizer.

EXERCISES:

1) What is the purpose of the fixed IRs in the PeopleBot, and up to what distance can they sense?

______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________


______________________________________________________________________________
______________________________________________________________________________

2) What is the range of the sonar sensor and the area coverage of the sensor?

______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________

3) Observe working of PeopleBot and list down possible applications.

______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________

4) PeopleBot is a discontinued product. Research potential reasons and explain one.

______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________


LAB # 15
OBJECT: Review and explore programming in ROS (Robot Operating System)

REQUIREMENT:

A PC or Laptop having Ubuntu operating system with ROS installed.

THEORY:

Introduction to ROS

Robot Operating System (ROS) can be called a robotics middleware. That is, it is a
collection of multiple software frameworks or applications required for robot software
development. Although ROS is not an operating system, it provides services designed for
heterogeneous computer cluster such as hardware abstraction, low-level device control,
implementation of commonly used functionality, message-passing between processes, and
package management. Despite the need for speedy responses and low latency in robot
control, ROS itself is not a real-time OS (RTOS), though it is possible to integrate ROS
with real-time code.

ROS releases may be incompatible with other releases and are often referred to by code
names rather than their version number. The major releases so far are:
1. Melodic Morenia
2. Lunar Loggerhead
3. Kinetic Kame
4. Jade Turtle
5. Indigo Igloo
6. Hydro Medusa
7. Groovy Galapagos
8. Fuerte Turtle
9. Electric Emys
10. DiamondBack
11. C Turtle
12. Box Turtle

We will be exploring the ROS Kinetic Kame version. All ROS versions require Linux rather than
Windows to function, for which we have Ubuntu Linux installed. Note that ROS can run programs
written in multiple programming languages, including C++ and Python.

Major Concepts

Following is a brief overview of a few basic concepts and commands of the
middleware.


Nodes

A node is an executable file within a ROS package. It uses ROS client libraries to
communicate with sister nodes, no matter which of the two languages they are written in.
Following are the client libraries:
• rospy = Python client library
• roscpp = C++ client library

Nodes can publish messages to a topic or subscribe to a topic, and can also provide a service.
To view a list of running nodes in ROS, the rosnode command is used. To run a single node,
the rosrun command is used in combination with the roscore command, which starts the Master
service. To run multiple nodes simultaneously, however, the roslaunch command is used, which
starts the Master service by itself. The text below further elaborates these concepts.

Messages

Messages are the data communicated between nodes. They basically make up a data type
that is used when subscribing or publishing to a topic. For a subscriber and publisher in a
program to communicate, they must send and receive the same type of message. The type
of message therefore determines the type of topic and can be found out using the rostopic type
command.

Topics

Nodes can publish messages to a topic as well as subscribe to a topic to receive messages.
Topic is basically a pathway for the messages to travel between publishing and subscribing
nodes. Information about topics can be obtained using a bunch of rostopic commands, as
listed:
• rostopic bw – displays bandwidth used by a topic
• rostopic echo – prints messages to a screen
• rostopic hz – displays publishing rate of topic
• rostopic list – prints information about active topics
• rostopic pub – publishes data to a topic
• rostopic type – prints topic/message type
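
To tie nodes, messages and topics together, here is a minimal rospy sketch of a publisher and a subscriber exchanging std_msgs/String messages. The node and topic names (talker, listener, chatter) are arbitrary examples chosen for illustration, not names defined by this lab.

#!/usr/bin/env python
import rospy
from std_msgs.msg import String

def talker():
    rospy.init_node('talker')                                # register the node with the Master
    pub = rospy.Publisher('chatter', String, queue_size=10)  # publish std_msgs/String on the chatter topic
    rate = rospy.Rate(1)                                     # 1 Hz
    while not rospy.is_shutdown():
        pub.publish(String(data="hello"))
        rate.sleep()

def listener():
    rospy.init_node('listener')
    # The callback runs for every String message published on the chatter topic.
    rospy.Subscriber('chatter', String, lambda msg: rospy.loginfo(msg.data))
    rospy.spin()                                             # keep the node alive until Ctrl+C

if __name__ == '__main__':
    talker()   # run listener() from a second terminal instead to receive the messages

Because both nodes agree on the message type (std_msgs/String) and on the topic name, rostopic echo /chatter would show the same data while talker is running.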

Master

Master is a name service for ROS that helps nodes find each other. It is started when the
roscore command is run and remains active throughout a single work session in ROS.
Therefore, running roscore is the first step to running ROS. However, when running
multiple nodes at the same time, roscore does not need to be run to start the Master service;
instead, the roslaunch command is used.

Basic Programs

Running the Tutorial turtlesim Program

1. Start ROS with the roscore command in a terminal window.


2. Open a new terminal and use rosnode to see what running roscore did. Bear in mind to keep the
previous terminal open either by opening a new tab or simply minimizing it. You’ll now see
information printed on the screen regarding rosnode commands.
3. Type rosnode list to view a list of running nodes. You’ll get /rosout written on the output screen.
This node is always running as it collects and logs nodes' debugging output.
4. Open another new terminal, this time for the rosrun command, which allows you to use the
package name to directly run a node within a package (without having to know the package path).
Syntax is $ rosrun [package_name] [node_name], meaning now we can run the turtlesim_node
in the turtlesim package.
5. Type in $ rosrun turtlesim turtlesim_node and a blue colored window for the turtlesim program
will pop up. This node will function as a subscribing node, i.e. where we see the output.
6. Running rosnode list in another terminal will now show two running nodes – rosout and
turtlesim. We can change the name of the node by altering the previous command as $ rosrun
turtlesim turtlesim_node __name:=my_turtle
7. We'll also need something to drive the turtle around with. Start up a publishing node that allows
us to give directions to alter the output obtained from the subscriber node, using the command
$ rosrun turtlesim turtle_teleop_key


8. Now we can use the arrow keys of the keyboard to drive the turtle around. The turtlesim_node
and the turtle_teleop_key nodes are communicating with each other over a ROS Topic. That is,
the turtle_teleop_key is publishing the keystrokes on a topic, while turtlesim subscribes to the
same topic to receive the keystrokes.
9. You can end the node by typing Ctrl+C or by closing the terminal windows.
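
The keyboard teleoperation above can also be replaced by a node of your own. The sketch below publishes geometry_msgs/Twist messages on the /turtle1/cmd_vel topic that turtlesim subscribes to; the velocity values are arbitrary and simply make the turtle trace a circle. Run roscore and the turtlesim_node first.

#!/usr/bin/env python
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('turtle_driver')                                 # arbitrary node name
pub = rospy.Publisher('/turtle1/cmd_vel', Twist, queue_size=10)  # topic the turtlesim node listens on
rate = rospy.Rate(10)                                            # publish at 10 Hz

cmd = Twist()
cmd.linear.x = 1.0    # forward speed
cmd.angular.z = 1.0   # turning rate, so the turtle traces a circle

while not rospy.is_shutdown():
    pub.publish(cmd)
    rate.sleep()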

Running a Custom Program

To run a custom program in ROS, in whichever language, a catkin workspace needs to
be created. This is possible through the catkin package, which comes preinstalled with
ROS. It automatically generates a Makefile – a computer-readable build file derived
using the cmake command – for the code written in either C++ or Python. With the
workspace up and running, various programs can be successfully compiled and executed.
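
As a rough sketch (the folder and package names below are the conventional catkin defaults plus a hypothetical package, not paths taken from this manual), a catkin workspace is usually laid out like this:

catkin_ws/                 the workspace folder (any name works)
    src/                   your packages go here
        my_package/        a hypothetical package created with catkin_create_pkg
            package.xml    package manifest
            CMakeLists.txt build rules read by catkin/CMake
            scripts/       Python nodes (must be marked executable)
    build/                 generated when the workspace is built
    devel/                 generated when the workspace is built; contains setup.bash

After building the workspace with catkin_make from its top folder and sourcing devel/setup.bash, nodes placed under src/ can be started with rosrun just like the turtlesim example above.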

EXERCISES:
1) To run any node in ROS, list the initial steps performed in sequence.
a. _____________
b. _____________

2) If you are to run multiple nodes in ROS at the same time, would the above-mentioned
steps be followed? If yes, explain why. If no, state the alternative steps.
______________________________________________________________________
______________________________________________________________________
______________________________________________________________________

3) Observe working of the turtle program. Mention a single limitation.

______________________________________________________________________
______________________________________________________________________


LAB ASSESSMENT

Outcome                                      CLO   PLO   Domain        Mapping Level   Competent (10-7)   Developing (6-4)   Needs Work (3-1)

Understand Working and Operation of Robots    3     5    Psychomotor         3

Identify, Operate, and Use Robots in
Multiple Applications                         3     5    Psychomotor         3

Evaluate Limitations of Robots and Apply
Appropriate Assumptions                       3     5    Psychomotor         3

