
Advanced Robotics

General concept on mobile robotics


David FOLIO
INSA Centre Val de Loire, 5A MRI & M2 2EA

2019 – 2020
I. Introduction

Definitions

What is Robotics?
Robotics is an interdisciplinary branch of engineering and science that includes mechanical
engineering, electronic engineering, information engineering, computer science, and so on… It
deals with the design, construction, operation, and use of robots, as well as computer systems for
their control, sensory feedback, and information processing.

Many types and definitions of robots:

different environments and uses;
various application fields: industry, agriculture, medical, domestic, military, space, etc.

Basic robotic primitives:

Sense: use of sensors to perceive the environment;
Plan: interpret, resolve, control, etc.;
Act: interact with the environment.

(Software: cognition; hardware: sensors and actuators; both closing the loop with the real-world environment.)

Figure 1: Autonomous Robots

Automation vs. Robotics

Automation:
“performed with minimal human assistance”
e.g. manufacturing processes, automatic machines, CNC, computer programs, etc.

Autonomous Robots need:
a sensory system to check their status and perceive the surrounding environment;
tools, actuators, effectors to interact with the environment;
a degree of autonomy (i.e. AI): programmable, adaptable, etc.

Figure 2: What is a robot? (from automatic devices and smart machines to autonomous robots)

“I can’t define a robot, but I know one when I see one.”
– Joseph Engelberger (1925–2015), a father of robotics

Classification by domains/applications

Industrial robots
∘ e.g. manipulators
Service robots
∘ hard/painful/dangerous tasks: security, military, delivery, transport, cleaning…
Assistance robots
∘ e.g. for elderly or disabled people…
Medical robots
∘ surgery (e.g. MIS), bio-micromanipulation…
Field robotics, etc.
Exploration robots
∘ hazardous or confined environments, sea and space exploration…

Figure 3: Examples of applications

Classification by types

Stationary robots (e.g. manipulation);
Mobile robots (i.e. classified by locomotion type):
∘ wheeled robots;
∘ walking robots;
∘ aerial robots (e.g. drones);
∘ underwater robots, etc.
Multiscale robotics:
∘ macro-, meso-, micro-, nano-robotics…

Figure 4: Types of robots

Brief history of robots

Half a century of research on autonomous robots:

Unimate (1961): 1st digitally operated and programmable industrial robot
Hilare (LAAS, 1977): 1st French mobile robot

[Figure: timeline of milestone robots — Unimate (1961), Hilare (1977), Asimo (1993), Pepper (2014), Waymo (2016) — and trends since ~2010: telemanipulators, compliant co-workers, emotional companions, cobotics, small-scale robotics]

Traditional industrial robots:
repetitive, high-frequency, high-precision tasks
∘ e.g. pre-computed motions, pre-planned paths…
few human–robot interactions
∘ new trend: cobotics
Towards AI Robots…

GOAL: Autonomous Intelligent Robotic System

Autonomy:

“Autonomous robots operate independently, without the support of a human operator”
– Murphy (2001)

Autonomous robots act in real-world environments for some longer time without external control – Bekey (2005)

Intelligent:

Intelligent robots are machines that perceive, think and act – Bekey (2005)

A rational agent acts to maximize its performance measure, given the evidence provided by a perception sequence and built-in knowledge – Russell, Norvig (2003)

Some key issues

Mobile robotics:
Numerous systems for a wide range of applications;
∘ issue: complex tasks remain challenging;
Many problems in various scientific fields are still unsolved.

[Figure: word cloud of open topics — computer vision, manipulation, grasping, navigation, mapping, path planning, SLAM, localization, knowledge representation, decision making, learning, object recognition, human interaction, augmented reality, dependability, and many more…]

Important economic issues and opportunities:
∘ assistance/service robots, new health/biomedical applications, etc.;
∘ outdoor robots (e.g. military, urban, aerial, etc.),
↪ esp. autonomous (self-driving) cars;
∘ environmental monitoring, response after disasters (e.g. Fukushima), etc.
Robot example: Atlas

Atlas, by Boston Dynamics <https://www.bostondynamics.com/atlas>


robot for the DARPA Robotics Challenge (DRC)
advanced humanoid robot
∘ joints: 28; weight: 75 kg; height: 1.5 m
remarkable capabilities
∘ walking, running, climbing stairs, obstacle avoidance…
no high-level control


Robot example: Europa

Europa: European Robotic Pedestrian Assistant


EU project, FP7/2007-2013
Service robot
remarkable capabilities
∘ Navigation in densely populated urban environments
∘ World modeling (SLAM)
∘ Advanced object recognition (sidewalks, cars, pedestrians…)

Robot example: Self-driving car

Google self-driving car (now as Waymo, since 2016)


Autonomous electric car;
Numerous sensors: camera, high precision 3D laser, IMU, GPS…
remarkable capabilities
∘ cars are able to drive autonomously in urban environments,
∘ very detailed maps,
∘ locations of traffic lights, signs, pedestrians…
∘ remission map of road markings, etc.

Road-map to Autonomous Intelligent Robotics

The design of advanced robots involves numerous functionalities:

Understanding and modeling the system
∘ kinematics, dynamics, motion…
∘ reliable feedback,
∘ task planning, etc.
Integration of sensors, actuators, power…
Perception and interaction with the environment
∘ knowledge representation
∘ understanding the “robot environment”
Decision making
∘ coping with noise and uncertainty
∘ intuitive Human–Robot Interfaces (e.g. “natural commands” for inhabitants)
Creation of flexible/robust control policies
∘ control has to deal with new situations
∘ learning capabilities
∘ planning and problem solving

[Figures: word cloud of robotics research topics; typical control architecture — Sense (sensors, filtering, tracking, world model), Think (mission planner, navigation, decision making, human interaction), Act (low-level controller, actuators) within a real environment]
Robotic paradigm

Paradigm
a philosophical and theoretical framework of a scientific school or discipline within which theories, laws, and generalizations and the experiments performed in support of them are formulated.

How to organize the functions/intelligence?
↪ robot paradigms are used for this organization

3 major paradigms for intelligent robots:

1. Reactive Paradigm
∘ direct coupling of sensors and actuators
∘ comparable to a reflex (i.e. sensorimotor loop)
2. Hierarchical Paradigm
∘ uses a planning and reasoning component
∘ abstract description of goals, tasks, capabilities…
3. Hybrid (deliberative/reactive) Paradigm
∘ synthesis of the reactive and deliberative concepts
∘ able to maintain different abstraction levels
∘ able to maintain different temporal granularities
II. Modeling

Informal definitions

Motion: study of the process of causing a robot to move.

Kinematics: study of the motion of rigid bodies that are connected with joints.
Deals with the geometric relationships that govern the robotic system;
Deals with the relationship between control parameters and the behavior of a system in state space;
Does not consider the forces that affect the motion.

Dynamics: study of motion in which the forces are modeled;
Includes the energies and speeds associated with these motions.

Informal definitions (from robotic manipulation)

Workspace: space (Ω) in which the robot can move or reach;

To Ω is attached a reference frame ℱ₀:
∘ 2D: Ω ⊂ ℝ², with ℱ₀ : {O, x, y};
∘ 3D: Ω ⊂ ℝ³, with ℱ₀ : {O, x, y, z}; 4D, etc.

Task space (or Cartesian space): space (ℳ) where the robot posture is expressed.

For mobile robots, often:
∘ in 2D: ℳ = SE(2) ≡ ℝ² × 𝕊¹
  (𝕊¹ ≅ SO(2), i.e. the unit circle {(x, y) ∈ ℝ² | x² + y² = 1})
∘ in 3D: ℳ = SE(3) ≡ ℝ³ × SO(3)
  (SO(3) is the 3D rotation group)

Robot posture: vector ξ ∈ ℳ corresponding to the robot's position and orientation.

For wheeled mobile robots, often:
∘ 2D: ξ = {x, y, θ}; 3D: ξ = {x, y, z, θ, ϕ, ψ}, etc.

Informal definitions (from robotic manipulation)

Configuration space (or C-space): space (𝒞) describing uniquely the state of a robot;

Cartesian product of the state spaces of each joint (i.e. moving part);
Initially used for robotic arms, but also useful for mobile-robot planning.

Robot configuration: vector q ∈ 𝒞 (i.e. generalized coordinates) of independent parameters uniquely specifying the robot's state.

Examples                                              Ω     C-space (𝒞)
Differential drive (i.e. unicycle)                    ℝ²    SE(2)
Car (1 steering angle)                                ℝ²    SE(2) × SO(1)
Space rover (i.e. rough terrain; 6 steering angles)   ℝ²    SE(2) × SO(6)
Tractor-trailer (steering + trailer angles)           ℝ²    SE(2) × SO(2)
Airplane, submarine, satellite, etc.                  ℝ³    SE(3)
Legged robot (e.g. HRP2: 30 joints)                   ℝ²    SE(2) × SO(30)

Informal definitions

Degrees of freedom (DoF)

For a single joint:
∘ number of independent directions of motion;
∘ knee, elbow: 1; ankle: 2; etc.
For a manipulator (sequence of joints):
∘ sum of the DoF of each joint.
For a moving vehicle without joints:
∘ possible directions of motion;
∘ locomotive: 1; unicycle: 3; helicopter: 6…
For a moving vehicle with joints: sum them all
∘ car: 4; car + trailer: 4; space rover: 9+; etc.

The maneuverability of a mobile robot combines the mobility allowed by the sliding constraints plus the additional freedom contributed by the steering.

Informal definitions

Holonomic system: constraints can be integrated;

The pose of the robot is not constrained by its velocity;
The robot can instantaneously move in any direction of the workspace (2D/3D);
These are omnidirectional robots:
∘ the constraint acts on the robot velocity;
∘ the velocity v(t) can be integrated if there exists a trajectory s(t) that depends only on the posture, i.e. s(ξ);
↪ equivalently: ds = (∂s/∂x) dx + (∂s/∂y) dy + (∂s/∂θ) dθ

Non-holonomic system: constraints cannot be integrated:
The constraint acts on the robot velocity: v(t) is NOT integrable;
∘ the posture is constrained by the velocity (i.e. not reachable in all directions);
There is no trajectory s(t) that depends only on the posture;
There is a constraint on the velocity (directions) that the robot can reach;
∘ the robot cannot instantaneously move in every direction of the workspace
(a short worked check follows below).
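As an added illustration (not from the original slides), take the classic no-side-slip constraint of a unicycle, ẋ sin θ − ẏ cos θ = 0, and check whether it could be the differential of some posture function s(ξ):

```latex
\text{Assume } ds = \sin\theta\,dx - \cos\theta\,dy \text{ for some } s(x, y, \theta).
\;\Rightarrow\;
\frac{\partial s}{\partial x} = \sin\theta,\qquad
\frac{\partial s}{\partial y} = -\cos\theta,\qquad
\frac{\partial s}{\partial \theta} = 0.
\quad\text{But}\quad
\frac{\partial}{\partial \theta}\!\left(\frac{\partial s}{\partial x}\right) = \cos\theta
\;\neq\; 0 =
\frac{\partial}{\partial x}\!\left(\frac{\partial s}{\partial \theta}\right).
```

The mixed partial derivatives disagree, so no such s(ξ) exists (fully excluding an integrating factor takes a Frobenius-type argument): the constraint is non-holonomic.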

Robot Kinematics

Robot manipulator:
fixed to the environment (i.e. fixed base).
Mobile robot:
not fixed to its environment;
possesses locomotion:
∘ the ability to move from one place to another;
∘ it depends on the environment (e.g. ground, air, water, etc.);
∘ it is hard to imitate nature…

Kinematics objective
Description of the mechanical behavior of the robot for design and control;
Similar to robot-manipulator kinematics;
However, mobile robots move unbounded wrt. their environment:
no direct (i.e. instantaneous) way to measure the robot's state (esp. its position),
the position must be integrated over time,
which leads to inaccuracies in the position (motion) estimate
↪ one of the main challenges in mobile robotics.

Kinematic Model

Robot speed ξ̇ as a function of the inputs u (e.g. wheel speeds and steering):
Forward kinematics: ξ̇ = f(q, u)
Inverse kinematics: u = g(q, ξ̇)
↪ required for motion control and motion planning.

Non-holonomic robots:
the differential equations are not integrable to the final position;
measuring the traveled distance is not sufficient to calculate the final position;
the temporal evolution of the motion must also be known;
this is in stark contrast to manipulator arms;
mobile-robot non-holonomic constraints:
∘ in mobile robotics, differential (inverse) kinematics is used;
∘ i.e. transformations between velocities instead of positions.

To understand mobile-robot motion (kinematics), the constraints imposed by the locomotion system (e.g. wheels) need to be analyzed.

Kinematics: the differential drive case

Representing the robot within an arbitrary initial frame:

In 2D, i.e. C-space 𝒞 = SE(2):
Reference frame: ℱ₀ : {O, x₀, y₀}
Robot frame: ℱ_R : {O, x_R, y_R}
Robot posture: ξ₀ = (x, y, θ)ᵗ

Mapping between the two frames:
∘ ξ̇_R = R(θ) ξ̇₀,
∘ with R an orthogonal rotation matrix:

           ⎛  cos θ   sin θ   0 ⎞
  R(θ)  =  ⎜ −sin θ   cos θ   0 ⎟
           ⎝    0       0     1 ⎠
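A minimal numerical sketch of this frame mapping (an added example; function and variable names are mine, not from the slides):

```python
import numpy as np

def rot_z(theta: float) -> np.ndarray:
    """Planar rotation matrix R(theta) mapping xi_dot_0 (frame F_0) to xi_dot_R (robot frame)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[ c,   s,  0.0],
                     [-s,   c,  0.0],
                     [0.0, 0.0, 1.0]])

# Example: a robot heading at +90 deg that translates along the world x-axis
# sees that motion along its own -y_R axis.
xi_dot_0 = np.array([1.0, 0.0, 0.0])      # (x_dot, y_dot, theta_dot) in F_0
xi_dot_R = rot_z(np.pi / 2) @ xi_dot_0    # same velocity expressed in F_R
print(xi_dot_R)                           # ~[0, -1, 0]
```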

Kinematics: the differential drive case

Forward kinematics:

What are the inputs (i.e. the control variables)?
∘ They relate the motors (wheels) to the Cartesian space (position and orientation):
∘ φ̇_r(t) and φ̇_l(t): right and left wheel angular velocities;
∘ v_r(t) and v_l(t): right and left wheel linear velocities;
∘ r: radius of each wheel; l: distance between each wheel and P;
∘ Robot velocities (in ℱ_R):

  ω(t) = (1/(2l)) (v_r − v_l) = (r/(2l)) (φ̇_r − φ̇_l)
  v(t) = (1/2) (v_r + v_l) = (r/2) (φ̇_r + φ̇_l)

Motion control (using a geometric approach):
∘ (posture) kinematic model: ξ̇₀ = C_ξ(q) u

  ⎛ ẋ(t) ⎞   ⎛ cos θ   0 ⎞
  ⎜ ẏ(t) ⎟ = ⎜ sin θ   0 ⎟ ⎛ v(t) ⎞
  ⎝ θ̇(t) ⎠   ⎝   0     1 ⎠ ⎝ ω(t) ⎠

∘ (configuration) kinematic model: q̇ = C_q(q) u
(a small numerical sketch follows below)
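A minimal sketch of this forward kinematic model with a simple Euler integration of the posture (an added example; the parameter values, names and time step are assumptions, not from the slides):

```python
import numpy as np

R_WHEEL = 0.05   # wheel radius r [m] (assumed)
L_HALF  = 0.20   # distance l between each wheel and P [m] (assumed)

def wheel_to_body(phi_dot_r: float, phi_dot_l: float):
    """Map wheel angular velocities to the body velocities (v, omega) expressed in F_R."""
    v = 0.5 * R_WHEEL * (phi_dot_r + phi_dot_l)
    omega = R_WHEEL * (phi_dot_r - phi_dot_l) / (2.0 * L_HALF)
    return v, omega

def posture_derivative(xi: np.ndarray, v: float, omega: float) -> np.ndarray:
    """Posture kinematic model xi_dot_0 = C_xi(q) u, with xi = (x, y, theta)."""
    theta = xi[2]
    return np.array([v * np.cos(theta), v * np.sin(theta), omega])

# Dead-reckoning example: constant wheel speeds integrated for 2 s.
xi, dt = np.array([0.0, 0.0, 0.0]), 0.01
for _ in range(200):
    v, omega = wheel_to_body(phi_dot_r=4.0, phi_dot_l=3.0)
    xi = xi + dt * posture_derivative(xi, v, omega)   # explicit Euler step
print(xi)   # final (x, y, theta) estimate
```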


Wheeled Mobile Robots

Wheels are the most appropriate basic solution for most applications:
energetically efficient; good balance;
e.g. simple mechanical implementation and easy to control.

Basic wheel types:
a) Standard wheel (2 DoF): rotation around the (actuated) wheel axis and around the contact point (if steered);
b) Castor wheel (3 DoF): rotation around the wheel axis, the castor axis, and the contact point;
c) Swedish wheel (3 DoF): rotation around the (actuated) wheel axis, the roller axes, and the contact point;
d) Ball or spherical wheel: suspension technically not solved.

Bigger wheels allow overcoming higher obstacles, but require more torque;
Combining actuation and steering on one wheel makes the design complex and adds additional odometry errors.

Wheeled Mobile Robots

Different arrangements of wheels:

[Figure: common arrangements for 3 and 4 (and more) wheels]

3 wheels are sufficient to guarantee stability:
with more than 3 wheels stability is improved,
but an appropriate suspension is required (e.g. hyperstatic system);
Most arrangements are non-holonomic;
which implies control issues.
The selection of wheels depends on the application.
Wheeled Mobile Robots

Idealized rolling wheel:

If the wheel is free to rotate about its axis (x-axis), the robot exhibits preferential rolling motion in one direction (y-axis) and a certain amount of lateral slip.
For low velocities, pure rolling is a reasonable wheel model.

Basic simplifications:
the wheel stays in a vertical plane;
the wheel is connected to the chassis by a rigid frame;
single point of contact C between wheel and ground;
no friction for rotation around the contact point;
pure rolling: v_C = 0;
no slipping, skidding nor sliding;
not deformable: constant shape, etc.

Wheel kinematic constraints

A typical wheel cannot achieve any motion in the C-space:

Rolling without slipping (vector equation at the contact point C):

  v_C = V_A + φ̇ ∧ AC = 0

Fixed standard wheel case:
no vertical axis of rotation → no steering;
the wheel/chassis angle β is fixed;
velocity of the wheel: v = (v_x = 0, v_y = r φ̇, v_z = 0)ᵗ

Rolling constraint:

  (sin(α + β)   −cos(α + β)   −l cos β) R(θ) ξ̇₀ − r φ̇ = 0

No-sliding constraint:

  (cos(α + β)   sin(α + β)   l sin β) R(θ) ξ̇₀ = 0
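A small numerical check of these two constraints (an added sketch; the wheel placement values in the example are arbitrary):

```python
import numpy as np

def rot_z(theta: float) -> np.ndarray:
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

def fixed_wheel_constraints(alpha, beta, l, r, theta, xi_dot_0, phi_dot):
    """Residuals (rolling, sliding) of a fixed standard wheel; both should be ~0 for a consistent motion."""
    xi_dot_R = rot_z(theta) @ xi_dot_0
    rolling = np.array([np.sin(alpha + beta), -np.cos(alpha + beta), -l * np.cos(beta)]) @ xi_dot_R - r * phi_dot
    sliding = np.array([np.cos(alpha + beta),  np.sin(alpha + beta),  l * np.sin(beta)]) @ xi_dot_R
    return rolling, sliding

# Example: a wheel with alpha = -pi/2, beta = pi, on a robot translating forward at 0.2 m/s
# with the matching wheel spin (phi_dot = v / r): both residuals vanish.
r, l = 0.05, 0.2
print(fixed_wheel_constraints(alpha=-np.pi / 2, beta=np.pi, l=l, r=r,
                              theta=0.0, xi_dot_0=np.array([0.2, 0.0, 0.0]),
                              phi_dot=0.2 / r))
```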


Wheel kinematic constraints

Steered wheel case:

steering: the wheel/chassis angle β(t) is actuated;
velocity of the wheel: v = (v_x = 0, v_y = r φ̇, v_z = 0)ᵗ

Rolling constraint:

  (sin(α + β(t))   −cos(α + β(t))   −l cos β(t)) R(θ) ξ̇₀ − r φ̇ = 0

No-sliding constraint:

  (cos(α + β(t))   sin(α + β(t))   l sin β(t)) R(θ) ξ̇₀ = 0

Wheel kinematic constraints

Castor wheel case:

off-centered orientable wheel (offset d between the castor axis B and the wheel axis A);
e.g. free (passive) rotations φ̇ and β̇;
omnidirectional wheel.

Rolling constraint:

  (sin(α + β)   −cos(α + β)   −l cos β) R(θ) ξ̇₀ − r φ̇ = 0

No-sliding constraint:

  (cos(α + β)   sin(α + β)   d + l sin β) R(θ) ξ̇₀ + d β̇ = 0

Wheel kinematic constraints

Swedish wheel case:

omnidirectional wheel;
wheel rotation φ̇ (active);
rotation of the small rollers φ̇_sw (passive);
∘ r_sw: radius of the small rollers.

Rolling constraint:

  (sin(α + β + γ)   −cos(α + β + γ)   −l cos(β + γ)) R(θ) ξ̇₀ − r φ̇ cos γ = 0

No-sliding constraint:

  (cos(α + β + γ)   sin(α + β + γ)   l sin(β + γ)) R(θ) ξ̇₀ − r φ̇ sin γ − r_sw φ̇_sw = 0

Robot kinematic constraints

Given a robot with m wheels:

each wheel imposes zero or more constraints on the robot motion;
∘ only fixed and steerable standard wheels impose constraints;
∘ castor, Swedish and spherical wheels impose no kinematic constraints on the robot chassis:
  ξ̇₀ can change freely.
What is the maneuverability of a robot combining different wheels?
↪ Combine the constraints that arise from all the wheels, based on their placement on the robot chassis.

Example
Consider a robot with a total of m = m_f + m_s standard wheels (fixed + steerable), with φ(t) = (φ_f, φ_s)ᵗ. The kinematic constraints in matrix form:

Rolling:  J₁(β_s) R(θ) ξ̇₀ − J₂ φ̇ = 0,   with J₁(β_s) = ( J_{1f} ; J_{1s}(β_s) ) stacked, and J₂ = diag(r₁, …, r_m)

Sliding:  C₁(β_s) R(θ) ξ̇₀ = 0,   with C₁(β_s) = ( C_{1f} ; C_{1s}(β_s) ) stacked
Kinematics: the differential drive case

Wheel constraints:

left wheel: α_l = −π/2, β_l = π;   right wheel: α_r = −π/2, β_r = 0;

  A ξ̇_R = B φ̇,   with

      ⎛ 1   0    l ⎞        ⎛ r_l   0  ⎞
  A = ⎜ 1   0   −l ⎟ ,  B = ⎜  0   r_r ⎟ ,  φ̇ = ( φ̇_l , φ̇_r )ᵗ
      ⎜ 0   1    0 ⎟        ⎜  0    0  ⎟
      ⎝ 0   1    0 ⎠        ⎝  0    0  ⎠

Forward kinematics: ξ̇ = f(φ̇)
  ξ̇_R = A⁺ B φ̇
  ξ̇₀ = R⁺(θ) ξ̇_R  (R⁺ = Rᵗ, since R is orthogonal)

Inverse kinematics: φ̇ = g(ξ̇)
  φ̇ = B⁺ A ξ̇_R

(A numerical sketch using the pseudo-inverse is given below.)
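A numerical sketch of this constraint-based formulation (an added example; it assumes equal wheel radii, and the row/wheel ordering is chosen so that the result matches the v and ω formulas seen earlier, so it may differ from the slide's exact matrices):

```python
import numpy as np

r, l = 0.05, 0.20                     # assumed wheel radius and half axle length [m]

# Rolling constraints (rows 1-2) and no-sliding constraints (rows 3-4): A @ xi_dot_R = B @ phi_dot
A = np.array([[1.0, 0.0,   l],
              [1.0, 0.0,  -l],
              [0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0]])
B = np.array([[r, 0.0],
              [0.0, r],
              [0.0, 0.0],
              [0.0, 0.0]])

phi_dot = np.array([4.0, 3.0])        # wheel angular velocities [rad/s]

# Forward kinematics through the (left) pseudo-inverse: xi_dot_R = A+ B phi_dot
xi_dot_R = np.linalg.pinv(A) @ B @ phi_dot
print(xi_dot_R)                       # [v, 0, omega]

# Consistency check against the closed-form differential-drive formulas
v = r * (phi_dot[0] + phi_dot[1]) / 2.0
omega = r * (phi_dot[0] - phi_dot[1]) / (2.0 * l)
assert np.allclose(xi_dot_R, [v, 0.0, omega])

# Inverse kinematics: phi_dot = B+ A xi_dot_R
print(np.linalg.pinv(B) @ A @ xi_dot_R)   # recovers ~[4, 3]
```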

Towards real mobile robots

In the real world:
no planar workspace: rough terrain, obstacles, etc.;
wheels are deformable (e.g. deflated, worn…):
∘ no constant radius r;
wheels can slip, skid, slide:
∘ the wheel/ground contact needs to be modeled;
at high speed, dynamics becomes important!

Introduction to dynamic modeling

Euler–Newton formulation:

  M(q) q̈ + V(q, q̇) q̇ + F(q̇) + G(q) + τ_d = B(q) τ − Λᵗ(q) λ

M(q): inertia matrix;
V(q, q̇): centripetal and Coriolis matrix;
F(q̇): surface friction matrix;
G(q): gravitational vector;
τ_d: vector of bounded unknown disturbances;
B(q): input matrix, τ: input vector;
Λ(q): matrix associated with the kinematic constraints (λ: constraint multipliers).

Lagrange formalism:

  d/dt (∂ℒ/∂q̇)ᵗ − (∂ℒ/∂q)ᵗ = F − Λᵗ(q) λ

ℒ = T − U is the Lagrangian function, with
∘ T: the kinetic energy of the system,
∘ U: the potential energy of the system;
F: generalized force vector
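As an added worked example (not from the slides): for a planar differential-drive base of mass m and moment of inertia I about the vertical axis, with q = (x, y, θ) and no potential energy on flat ground, the Lagrange formalism gives

```latex
\mathcal{L} = T - U = \tfrac{1}{2}\, m\,(\dot{x}^2 + \dot{y}^2) + \tfrac{1}{2}\, I\,\dot{\theta}^2, \qquad U = 0
\;\;\Rightarrow\;\;
M(q) = \operatorname{diag}(m,\, m,\, I), \qquad V(q,\dot{q}) = 0, \qquad G(q) = 0,
```

while the no-side-slip constraint ẋ sin θ − ẏ cos θ = 0 enters through Λ(q) = (sin θ, −cos θ, 0) and its multiplier λ.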
Introduction to legged locomotion

Why legged robots?

They try to imitate nature;
They can cross obstacles;
They are energetically inefficient: they consume energy even in the absence of movement;
They require overcoming a lot of control issues…

With fewer legs:
∘ less stable;
∘ require fast control;
∘ e.g. 2 legs for humanoid robots…
With more legs:
∘ more stable;
∘ require more coordination;
∘ e.g. 4–6 legs used.

Typical leg structure (at least 3 DoF): hip abduction, hip bending, knee bending.

[Figure: the StarlETH quadruped; a typical 3-DoF leg with abduction and bending joints (θ, ϕ, ψ)]
III. Sensors and Actuators

Perceive the environment

Perception is the organization, identification, and interpretation of sensory information in order to represent and understand the presented information, or the environment.

Sensory perception is the “immediate” perception that the senses provide, like direct information.
Different levels of perception:
global/local perception;
self-perception…

Robotic perception refers to the ability to collect, process and format information useful to the robot to act and react in the world around it.

The autonomy of a robot strongly relies on its capability to perceive its environment efficiently: to perceive, robots use sensors.
The robot environment can be:
unstructured: indoor, outdoor, road, etc.;
static/dynamic, etc.
e.g. several sensors → redundancy of information;
sensor choice, data processing, knowledge representation, etc.
What is sensing?

[Figure: a sensor transduces a physical change (the measurand) into an output signal (the measure).]

A sensor is a device whose purpose is to detect events or changes in its environment and send the information to another system. It transforms the state of an observed physical quantity (the measurand) into a signal that can be used by the system: the measure.

∃ a relationship (i.e. a functional) between the measure and the measurand.

Understanding the physical principles behind sensors enables:
to properly select the sensors for a given application;
to properly model the sensor system, e.g. resolution, bandwidth, uncertainties.

Different sensors

Classifying sensors by:


Type of information (analog, digital…)
Physical principle (resistive, capacitive…)
Amount of information (bandwidth)
Low and high reading (dynamic range, resolution…)
Absolute vs. derivative, digital vs. analog…
Accuracy and precision, etc.
Functional classification:
How?
∘ Passive sensors: Measure energy coming from the environment;
∘ Active sensors: emit their proper energy and measure the reaction;
What?
∘ Proprioceptive sensors (internal): measure the state of the system (robot)
e.g. motor speed, wheel load, heading of the robot, battery status, etc.
∘ Exteroceptive sensors (external): information from the robot's environment,
e.g. distances to objects, intensity of the ambient light, unique features, etc.

Robotic sensing

Robotic peculiarities (e.g. embedded system):


data volume and processing;
power consumption;
dimension/mass;
life span, robustness, cost, etc.
Important design issues:
What kind of sensors to use? Where to place them?
What are the sources of error? noises? disturbances?
«What are we measuring?» vs. «What do we really want to know?»

Sensors for Robotics

Example: Wifibot, by Nexter Robotics https://www.wifibot.com
2 odometers; 4 IR sensors; 1 laser; 1 GPS; 1 IMU; 1 camera.

Example: Pepper, by Softbank Robotics https://www.softbankrobotics.com
2 US sensors; 6 lasers; 3 bumpers; 2 gyros; 1 HD camera; 3 touch sensors; 4 microphones.

Rotary encoder

Convert angular position into output signals.

Use case: wheel/motor encoders
measure the position or speed of the wheels or steering (proprioceptive sensors);
integrate wheel movements to get an estimate of the position → odometry;
typical resolutions: 64–2048 increments per revolution.

Working principle:
Incremental encoders (low cost):
∘ regular: counts the number of transitions, but cannot tell the direction of motion…
∘ quadrature: uses two sensors in quadrature-phase shift (a minimal decoder is sketched below);
Absolute encoders (more expensive):
∘ a (Gray) code is used to maintain position information;
∘ more complex and expensive…

Figure 5: Incremental encoder.  Figure 6: 3-bit Gray-code encoder.

Pros: simple, high-frequency, cost effective (e.g. incremental type)…
Cons: limited resolution, drift, wheel sliding/skidding…
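A minimal quadrature-decoding sketch (an added illustration, not from the slides; it assumes ideal, glitch-free A/B signals):

```python
# Valid quadrature transitions move one step around the Gray cycle 00 -> 01 -> 11 -> 10 -> 00.
_STEP = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

class QuadratureDecoder:
    """Count signed encoder ticks from the two quadrature channels A and B."""
    def __init__(self):
        self.state = 0b00
        self.ticks = 0

    def update(self, a: int, b: int) -> int:
        new_state = (a << 1) | b
        self.ticks += _STEP.get((self.state, new_state), 0)  # ignore invalid or repeated states
        self.state = new_state
        return self.ticks

# Example: one full forward cycle of the two channels = 4 ticks.
dec = QuadratureDecoder()
for a, b in [(0, 1), (1, 1), (1, 0), (0, 0)]:
    dec.update(a, b)
print(dec.ticks)   # 4
```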

Heading Sensors

Heading sensors determine the robot’s orientation and inclination wrt. a given
reference.

proprioceptive types: e.g. gyroscope, accelerometer;


exteroceptive types: e.g. compass, inclinometer;

With velocity information, they allow integrating the movement into a position estimate.
This procedure is called dead (deduced) reckoning, as in ship navigation.

Heading Sensors: exteroceptive types

Compass: provides the direction relative to the geographic cardinal directions

absolute measure of the heading (e.g. wrt. the north);


large variety of solutions: mechanical, magnetic field measure (e.g. Hall-effect),
gyrocompass…
Drawbacks:
weakness of the Earth's magnetic field (~30 µT);
disturbed by magnetic objects or other sources;
bandwidth limitations;
not suitable for absolute orientation in indoor environments.

Inclinometer: measures the angle of slope, elevation or depression of an object wrt. gravity's direction.

common solutions: accelerometer, liquid capacitive, electrolytic, gas bubble in liquid, and pendulum;
disturbed by inertia, temperature, vibration…

Heading Sensors: Gyroscope

Gyroscopes provide an absolute measure of the heading of a mobile system wrt. a fixed frame.

Main categories:
Mechanical gyroscopes:
∘ standard gyro (angle);
∘ rate gyro (speed);
∘ very expensive, difficult to miniaturize…
Optical gyroscopes:
∘ rate gyro (speed);
∘ very expensive, difficult to miniaturize…
MEMS vibrating-structure gyroscopes: measure the Coriolis force;
∘ low cost;
∘ coarser precision (but sufficient in robotics).

[Figure: MEMS vibrating-structure gyro — proof mass m driven along one axis, Coriolis force sensed along the orthogonal axis]

Accelerometer

Accelerometers measure all external forces acting upon them, …including gravity:
∘ to obtain the inertial acceleration (due to motion alone), gravity must be subtracted;
∘ conversely, the device's output will be zero during free fall!

They act like a spring–mass–damper system:

  f_applied = f_inertia + f_damping + f_spring = m ẍ + c ẋ + k x

∘ at steady state: a_applied = k x / m

They measure only the linear acceleration along a single axis:
∘ omnidirectional accelerometer: 3 accelerometers in 3 orthogonal directions.

Main characteristics:
∘ bandwidth: up to 50 kHz;   ∘ accelerations up to 50 g;   ∘ disturbed by temperature, vibration…

Common applications:
∘ dynamic acceleration;   ∘ static acceleration (inclinometer, see the sketch below);
∘ airbag sensors (±35 g);   ∘ control of video games (Wii), smartphones, etc.
Inertial Measurement Unit (IMU)

IMU (Inertial Measurement Unit): a device that uses measurement systems (e.g. gyroscopes and accelerometers) to estimate the relative position (x, y, z), orientation (roll, pitch, yaw), velocity, and acceleration of a moving object wrt. an inertial frame.

Figure 7: Basic principle — the rate-gyroscope output ω is integrated into an orientation estimate θ; the accelerometer output is transformed from the local to the navigation frame, gravity is subtracted, and the result is integrated twice (acceleration → velocity → posture ξ₀), starting from known initial values.

The initial values of velocity, position and orientation must be known;
∫ ⇔ strongly subject to drift!
An external reference (e.g. GPS, vision, etc.) is needed to correct drift errors (a dead-reckoning sketch follows below).
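An added planar (2D) sketch of this integration chain with explicit Euler steps (the flat-ground assumption, names and example values are mine, not from the slides):

```python
import numpy as np

def imu_dead_reckoning(gyro_z, accel_xy, dt, theta0=0.0, v0=(0.0, 0.0), p0=(0.0, 0.0)):
    """Integrate a planar gyro (yaw rate) and body-frame accelerations into a posture estimate.

    gyro_z:   sequence of yaw rates [rad/s]
    accel_xy: sequence of (ax, ay) body-frame accelerations [m/s^2], gravity already removed
    """
    theta, v, p = theta0, np.array(v0, float), np.array(p0, float)
    for wz, (ax, ay) in zip(gyro_z, accel_xy):
        theta += wz * dt                               # orientation from the rate gyro
        c, s = np.cos(theta), np.sin(theta)
        a_world = np.array([c * ax - s * ay,           # rotate the acceleration into the
                            s * ax + c * ay])          # navigation (world) frame
        v += a_world * dt                              # integrate once: velocity
        p += v * dt                                    # integrate twice: position
    return p, v, theta

# Example: constant forward acceleration of 1 m/s^2, no rotation, for 1 s.
n, dt = 100, 0.01
p, v, theta = imu_dead_reckoning([0.0] * n, [(1.0, 0.0)] * n, dt)
print(p, v, theta)   # ~[0.5, 0], [1, 0], 0 — any sensor bias makes the position error grow quadratically
```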

Range sensors

Rangefinder: measures the distance from the observer to a target, in a process called ranging.

Principle: measure the traveled distance of a sound or electromagnetic wave, basically given by:

  d = c · t

∘ propagation speed of sound: c ≅ 0.3 m/ms;
∘ propagation speed of EM signals: c ≅ 0.3 m/ns.
(A small numerical example is given below.)

The quality of time-of-flight (ToF) range sensors mainly depends on:
uncertainties about the exact time of arrival of the reflected signal;
inaccuracies in the ToF measurement (laser range sensors);
the opening angle of the transmitted beam (US range sensors);
the interaction with the target (surface, specular reflections, etc.);
variations of the propagation speed c;
the motion of the mobile robot/target (if not at a standstill)…
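An added numerical example of the time-of-flight principle (it assumes a round-trip measurement, so the target distance is half of the traveled distance d = c · t):

```python
def tof_distance(t_round_trip_s: float, c: float) -> float:
    """Target distance from a round-trip time of flight: d_target = c * t / 2."""
    return 0.5 * c * t_round_trip_s

print(tof_distance(10e-3, 343.0))     # ultrasound, 10 ms  -> ~1.7 m
print(tof_distance(66e-9, 3.0e8))     # laser pulse, 66 ns -> ~9.9 m
```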
Different groups:
∘ proximity sensors: sonar (US), IR, etc.;
∘ laser/LIDAR rangefinders;
∘ ToF cameras, etc.
Different methods:
∘ direct pulse measurement,
∘ wave modulation,
∘ hybrid methods…
Range sensors: Proximity Sensors

Ultrasonic sensors
Basic principle: emit an ultrasonic pulse wave (20 kHz to >2 MHz).
Main characteristics:
∘ sensitivity to air density: c = √(γ R T / M), with
  ∘ γ: heat capacity ratio (e.g. air: γ = 1.4);
  ∘ R: the gas constant (8.314 J/(mol·K));
  ∘ M: molar mass of the gas (e.g. air: M = 0.028 kg/mol);
  ∘ T: the temperature;
∘ the sound beam propagates in a measurement cone of roughly ±20°;
∘ precision influenced by the angle to the object;
∘ d_US < d_real: proximity/obstacle detection.
(A quick numerical check of c follows below.)
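A quick added check of this formula, plugging in the rounded constants listed above (M is slightly low for air, hence the result comes out a little above the usual ~343 m/s):

```python
import math

def speed_of_sound(gamma: float, R: float, T: float, M: float) -> float:
    """Speed of sound c = sqrt(gamma * R * T / M) for an ideal gas."""
    return math.sqrt(gamma * R * T / M)

# Air at 20 degC (T = 293.15 K) with the constants from the slide.
print(speed_of_sound(gamma=1.4, R=8.314, T=293.15, M=0.028))   # ~349 m/s
```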

Light sensors:
Basic principle: a collimated beam (e.g. focused IR, laser, etc.) is transmitted toward the target.
Main characteristics:
∘ sensitivity to ambient conditions (e.g. temperature, light), specular surfaces, reflections…
∘ short distances (<2 m);
∘ simple, compact, low cost…
Range sensors: Laser/LIDAR

LASER: acronym for «Light Amplification by Stimulated Emission of Radiation»; a device that emits light through a process of optical amplification.

LIDAR: acronym for «LIght Detection And Ranging»; a method that measures the distance to a target by illuminating the target with laser light and measuring the reflected light with a sensor.

[Figure: the laser beam is transmitted via a rotating mirror; the reflected light returns to a detector.]

Operating principles:
pulsed laser (today the standard);
phase-shift measurement.

Range sensors: Laser/LIDAR

Main characteristics:
A mechanical mechanism sweeps a mirror to obtain 2D/3D measurements;
∘ limited angular range: e.g. 100°, 180°, 270° → blind spot!
∘ cumbersome, fragile, expensive…
Good stability/precision, long range (up to 10 m, 100 m…).

Geolocation

Geolocation: characterization of the real-world geographic location of an object.

Satellite navigation: a system that uses satellites to provide autonomous geo-spatial positioning. A satellite navigation system with global coverage is termed a global navigation satellite system (GNSS).

Operational GNSS (as of 2018):
∘ United States' Global Positioning System (GPS): 31 satellites, accuracy 5 m;
∘ Russia's GLONASS: 24 satellites, accuracy 7.4–4.5 m;
Scheduled in 2020:
∘ China's BeiDou Navigation Satellite System (BDS/COMPASS): 23 (35) satellites, accuracy 10 m (0.1 m);
∘ European Union's Galileo: 22 (28) satellites, accuracy 1 m (0.01 m).

[Figure: orbital altitudes and periods of the GNSS constellations (GPS, GLONASS, Galileo, BeiDou/COMPASS), between low Earth orbits (ISS, Hubble, Iridium) and the geostationary orbit]
Geolocation: Satellite navigation principle

Working principle: a positioning beacon system (i.e. trilateration).

Satellites send signals containing their orbital location (ephemeris) and the emission time;
The receiver computes its location through trilateration and time correction, since the signals are received with a delay:

  ρ = (t_r − t_s) · c
  ⇝ ρ ≈ ‖r_s(t_s) − r_r(t_r)‖ + (ω_e / c) (x_s y_r − y_s x_r)

(ω_e: Earth rotation angular velocity)
∘ ρ is the pseudorange (i.e. the «pseudo-distance») between satellite and receiver;
∘ at least 3 satellites are required for the position calculation;
∘ in practice, at least 4 are used to also resolve the receiver clock offset.

[Figure: satellites S1…S4 at positions r_i(t_si), pseudoranges ρ_i to the receiver, signals crossing the ionosphere (60–800 km)]
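An added toy sketch of how a receiver can solve for its position and clock bias from pseudoranges, using a Gauss–Newton least-squares fit (synthetic satellite positions; no ionospheric or Earth-rotation corrections):

```python
import numpy as np

C = 299_792_458.0   # speed of light [m/s]

def gnss_fix(sat_pos, rho, iters=15):
    """Estimate receiver position p (3,) and clock bias b [s] from pseudoranges rho_i = ||s_i - p|| + C*b."""
    x = np.zeros(4)                                           # state (px, py, pz, b), start at Earth's center
    for _ in range(iters):
        p, b = x[:3], x[3]
        d = np.linalg.norm(sat_pos - p, axis=1)               # geometric ranges
        residual = rho - (d + C * b)
        # Jacobian of the predicted pseudorange wrt. (p, b)
        J = np.hstack([-(sat_pos - p) / d[:, None], C * np.ones((len(rho), 1))])
        x += np.linalg.lstsq(J, residual, rcond=None)[0]      # Gauss-Newton update
    return x[:3], x[3]

# Synthetic, noise-free test: 5 satellites ~20000 km away, known receiver position and clock bias.
dirs = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0],
                 [1.0, 1.0, 1.0], [-1.0, 1.0, 1.0]])
sats = 2.0e7 * dirs / np.linalg.norm(dirs, axis=1, keepdims=True)
p_true, b_true = np.array([1.2e6, -2.3e6, 5.0e6]), 1.5e-3
rho = np.linalg.norm(sats - p_true, axis=1) + C * b_true

p_est, b_est = gnss_fix(sats, rho)
print(np.linalg.norm(p_est - p_true), abs(b_est - b_true))    # both ~0 in this ideal case
```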

Geolocation: Satellite navigation

Main characteristics:
Frequency: 5 Hz;
Nominal accuracy: 1–5 m
∘ error sources: ephemeris data errors, tropospheric delays, unmodeled ionospheric delays, multipath…
Higher accuracy: GNSS enhancement
∘ Satellite-based augmentation system (SBAS): use of additional satellite-broadcast messages + reference stations,
  e.g. WAAS (North America), EGNOS (EU), GAGAN (India), SNAS (China), etc.;
∘ DGPS: uses a static receiver at a known, exact position;
∘ A-GPS: uses a stationary GPS receiver + an (A-GPS) data server;
Only for outdoor applications!
∘ satellites/signals must be accessible…

Other sensors…

Miscellaneous sensors can be used:


touch/tactile sensors (e.g. H/R interactions);
hearing/microphone sensors (e.g. H/R interactions);
force/torque sensors (e.g. manipulation);
proximity (i.e. capacitive) sensors (e.g. micro/nano-robotic);
barometric/pressure sensors (e.g. aerial/underwater robots);
Gas/odor/chemical sensors, temperature sensors…
and also vision system:
camera, webcam, microscope, etc.
(very) rich information;
specific image processing methods…

Act/interact with the environment

A robot must be able to interact physically with the environment in which it is operating (cf. the sense–plan–act loop of Figure 1).

Effector: a device that makes an impact/influence on the environment, i.e. legs, wheels, arms, fingers, etc.

Actuator: a component that transforms an energy into a physical phenomenon which changes the behavior or state of a system;
the component by which a system acts upon its environment;
it enables the effectors to perform actions.
(Common) examples:
mechanical: motors, hydraulic cylinders, etc.;
thermal: heating resistors, etc.;
light: lighting, LEDs, screens…

Different actuators

Classifying actuators by type:
Hydraulics/pneumatics:
∘ based on fluid/air pressure: the pressure changes and the actuator moves;
∘ powerful and precise, but large and dangerous.
Chemically reactive materials:
∘ respond to chemical reactions.
(Electric) motors (most common):
∘ affordable and simple;
∘ use electric current (simple to control);
∘ well suited for wheels.

Functional classification:
All actuators need a power source.
Active: power consumption.
Passive: no power consumption;
∘ uses potential energy to interact with the environment.
Motors

Motor: a system designed to convert one form of energy (e.g. electrical) into mechanical energy.

Electric motors are the most common source of torque for mobility and/or manipulation in robotics:
spinning at some speed Ω (or n),
with some amount of torque T.
Transducer relation: i · v = T · Ω

Main characteristics:
easy to control: accurate servo control,
excellent efficiency,
from mW to MW,
mainly rotating, but linear ones are also available,
∘ common velocities: 1000–10000 rpm,
several types (DC, brushless, AC synchronous/asynchronous, etc.),
main issue: the autonomous power source (recharging)…

DC Motors

Direct Current (DC) motors

Simple, inexpensive, easy to find and use;
Need DC electrical power to run,
∘ i.e. a constant voltage in the proper range;
Variety of sizes and packaging:
∘ low voltage means low power (smaller motors),
∘ high voltage means high power (wear and tear occur faster);
Brushed motors: provide electric current to the rotor (through brushes and a commutator);
∘ the brushes wear down and require replacement;
Brushless DC motors: synchronous motors powered by DC current;
∘ more expensive, but more reliable…

[Figure: DC motor — stator magnets (N/S), rotor coil, commutator and brushes, supply voltage V_m]

Speed: E_b = K_b(Φ) Ω
∘ E_b: induced or counter-EMF,
∘ K_b: counter-EMF constant;
Torque: T_m = K_i(Φ) I_a (e.g. K_i = K_b)

Combined equations of motion:

  L_wind. (di/dt) + R_wind. i + K_b Ω = V_m
  J_rot. (dΩ/dt) + k_frict. Ω = K_i i
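An added simulation sketch of these two coupled equations using explicit Euler integration (the numerical parameter values are illustrative assumptions, not taken from the slides):

```python
# Illustrative motor parameters (assumed values)
L_WIND, R_WIND = 0.5e-3, 1.0        # winding inductance [H] and resistance [ohm]
K_B = K_I = 0.02                    # counter-EMF and torque constants [V.s/rad], [N.m/A]
J_ROT, K_FRICT = 1.0e-5, 1.0e-5     # rotor inertia [kg.m^2] and viscous friction [N.m.s/rad]

def simulate_dc_motor(v_m: float, t_end: float = 0.5, dt: float = 1e-5):
    """Integrate the electrical and mechanical equations for a constant supply voltage v_m."""
    i, omega = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        di = (v_m - R_WIND * i - K_B * omega) / L_WIND      # L di/dt + R i + Kb*Omega = Vm
        domega = (K_I * i - K_FRICT * omega) / J_ROT        # J dOmega/dt + kf*Omega = Ki*i
        i += dt * di
        omega += dt * domega
    return i, omega

i_ss, omega_ss = simulate_dc_motor(v_m=12.0)
print(i_ss, omega_ss)   # near steady state: Kb*omega + R*i ~= 12 V and Ki*i ~= kf*omega
```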
Servo-Motors

Servo-motors: motors that can turn their shaft to a specific position;
e.g. made from a DC motor by adding:
a gear reduction,
a position (encoder) sensor,
an electronic circuit that tells the motor how much to turn, and in which direction.

[Figure: exploded view of a servo/gear motor — housing, magnets, ironless winding, commutator and brushes, encoder, planetary gear train, output shaft and ball bearings]

Motor loading: motors apply torque in response to loading.

The higher the load on the output:
the more the motor will “fight back” with an opposing torque;
the more current the motor draws;
if the load keeps increasing, the motor may stop spinning, i.e. it stalls.

References

Adams, Martin David (1999). Sensor modelling, design and data processing for autonomous navigation. Vol. 13. World Scientific.
Borenstein, J., H. R. Everett, and L. Feng (1996). “Where am I?” Sensors and methods for mobile robot positioning. Tech. rep. University of Michigan.
Corke, Peter (2017). Robotics, vision and control: fundamental algorithms in MATLAB®. 2nd ed. Vol. 118. Springer. ISBN: 9783319544137.
Craig, J.J. (2018). Introduction to Robotics: Mechanics and Control. Pearson. ISBN: 978-0-13-348979-8.
Everett, H.R. (1995). Sensors for mobile robots. AK Peters/CRC Press. ISBN: 978-1-4398-6348-0.
Lynch, Kevin M. and Frank C. Park (2017). Modern Robotics. Cambridge University Press. ISBN: 9781107156302.
Murphy, R. (2000). Introduction to AI Robotics. Ed. by R.C. Arkin. A Bradford book. MIT Press. ISBN: 9780262133838.

References

Russell, S. and P. Norvig (2016). Artificial Intelligence: A Modern Approach. Always Learning. Pearson. ISBN: 9781292153964.
Siegwart, Roland, Illah Reza Nourbakhsh, and Davide Scaramuzza (2011). Introduction to autonomous mobile robots. 2nd ed. MIT Press.
