
PREPRINT

Risk Assessment for Human-Robot Collaboration in an automated warehouse scenario
Rafia Inam∗, Klaus Raizer∗, Alberto Hata∗, Ricardo Souza∗, Elena Fersman∗,
Enyu Cao∗‡, Shaolei Wang∗‡
∗Ericsson Research, {firstname.lastname}@ericsson.com
‡ Royal Institute of Technology (KTH), {caoe, shaolei}@kth.se

Abstract—Collaborative robotics is recently taking an ever-increasing role in modern industrial environments like manufacturing, warehouses, mining, agriculture and others. This trend introduces a number of advantages, such as increased productivity and efficiency, but also new issues, such as new risks and hazards due to the elimination of barriers between humans and robots. In this paper we present risk assessment for an automated warehouse use case in which mobile robots and humans collaborate in a shared workspace to deliver products from the shelves to the conveyor belts. We provide definitions of specific human roles and perform risk assessment of human-robot collaboration in these scenarios and identify a list of hazards using Hazard Operability (HAZOP). Further, we present safety recommendations that will be used in the risk reduction phase. We develop a simulated warehouse environment using the V-REP simulator. The robots use cameras for perception and dynamically generate scene graphs for semantic representations of their surroundings. We present our initial results on the generated scene graphs. This representation will be employed in the risk assessment process to enable the use of contextual information of the robot's perceived environment, which will be further used during the risk evaluation and mitigation phases and then on the robots' actuation when needed.

Keywords-Collaborative robotics, safe human-robot collaboration, warehouse, safety, risk assessment, safety standards, ISO/TS 15066, hazard identification, HAZOP.

I. INTRODUCTION

Robots are playing a viable role in manufacturing, warehouses and industries by performing operations in a shorter time and in a more precise way as compared to humans. However, there are some tasks where humans can't be replaced due to their complexity. Collaborative robotics, in which robots and humans work together to accomplish common tasks, demands additional safety requirements as compared to traditional safeguarding measures [1]. The autonomous operation of mobile robots in human-shared environments introduces new advantages in terms of productivity and efficiency, as the abilities of human and machine complement one another, but it is subject to hard safety constraints. It also introduces new risks and hazards, like an increased possibility of collision with workers. Due to the elimination of barriers around the robot in this new collaborative situation, a robot should interact with other robots and workers at different levels. It is crucial to ensure the correct and safe operation of the robot, so that it cannot cause injuries or damages to the workers, to other objects or to itself [2]. This issue is aggravated in an automated warehouse scenario, where mobile robots can navigate autonomously together with human workers and other moving robots.

Safety requirements for robot interaction were introduced in the international standard ISO 10218 [3], [4] and, for human-robot collaboration (HRC) operations, in the relatively recent technical specification ISO/TS 15066:2016 [1]. Currently, the expected requirements of safe and unhindered human-robot collaboration are under development. The scope of the standard is not limited to the development of new sensors, robots or intelligent control systems, but includes risk analysis techniques, which are a fundamental requirement for collaborative robot applications.

ISO standards alone are not enough for collaborative systems to ensure safety. A dedicated risk management approach (including risk assessment and risk reduction) is vital, even for those robots that are specifically designed for human-robot collaboration (HRC) [5], [6]. An experiment was performed in [7] to check safe collaborative operations by applying the "power and force limitation" specified in the ISO/TS 15066 standard for a pick-and-place task, while taking the maximum permissible values for pressure and force from the standard. The results indicate that the current specification is not sufficient, even though ISO/TS 15066 was applied reasonably, and that there is a vital need for risk assessment. As a result, additional actions to reduce risk must be taken for collaborative scenarios to be carried out in a safe manner.

In this paper we present a systematic description of collaborative scenarios for our use case, from the safety perspective, by first identifying different human roles, their collaborative interactions, and unsafe scenarios along with safety issues. Secondly, we perform a risk assessment for HRC that includes hazard identification using the Hazard Operability (HAZOP) technique coupled with the Unified Modeling Language (UML) [8], and risk estimation. Lastly, we present our simulation setup along with the scene graph representation that will be used to evaluate the proposed risk assessment, and initial results of the dynamically generated scene graph for each robot in our scenario¹.

¹ Code available at https://github.com/EricssonResearch/scott-eu/tree/simulation-ros/simulation-ros
Through the scene graph, the robot perception is converted into a semantic representation that contributes to adding relevant environment information to the risk assessment.

Paper Outline: Section II presents related work in risk assessment for HRC. Section III describes an automated warehouse use case using collaborative robots and the HRC scenarios along with the safety issues that could arise. It further presents hazards and risk assessment for the use case. The implementation setup, along with the semantic representation of the environment and its initial results, is presented in Section IV. Section V presents a discussion and finally, Section VI concludes the paper with a description of ongoing and future works.

II. RELATED WORK

Traditionally, when robots and humans share the same working space, they are separated by barriers to avoid direct contact with each other and consequently prevent possible injuries, as seen in the Kiva project [9]. There are some initiatives to enable the interaction between humans and robots to increase efficiency in production. Despite requiring risk assessment, some works do not follow any safety standard. For instance, in the work of [10] the authors employ a robot equipped with cameras and sensors to detect objects and people. In this case, the robot simply modifies its path or velocity to prevent collisions, but does not perform any risk assessment.

Regulations that incorporate robot related risks for human workers include the international standards ISO 10218 [3], [4] and ISO 13849-1 [11]. Based on these standards, several works were done in industrial robotics to maximize productivity while sharing the same space. In [12], the authors present a kinematic control strategy to enforce safety for such a robot through an optimization-based real-time path planning algorithm. During planning, a tractable set of constraints on the robot's velocity is used to keep the minimum separation distance to the human. As compared to our work, [12] does not contemplate risk assessment to ensure safety. The work of [13] performs hazard analysis and risk assessment in cable harness assembly and evaluates the safety design before and after the implementation, which resulted in the reduction of the potential collaboration risks. However, different from our proposed solution, the previous two works only consider fixed robot manipulators and not mobile robots. In [14] the operators use safety vests (ANSI/ISEA 107-2004 standard) to facilitate their detection by the robot's cameras. Moreover, the robot complies with the EN 1525 standard, which enables contact with humans by limiting the force and power [15]. Despite all these works following some safety standard, they are not in accordance with the most recent technical specification ISO/TS 15066:2016, since they were in fact done before the technical specification was defined.

The ISO/TS 15066:2016 [1] standard introduces collaborative robotics concepts and four collaborative operating modes in detail and thus supplements the requirements of ISO 10218 in order to develop safe collaborative robot applications. It enforces power, speed and movement limitations on robots, according to the level of the risk that they bring to humans. Further, ISO 13849-1 [11] and IEC 62061 [16] provide guidelines to determine the safety level based on the severity, the frequency of exposure and the possibility to manage the hazard to an acceptable level by reducing risk.

Robotic companies such as ABB [6] and SICK [5] are developing robots and sensors that follow these standards for use in industrial automation. They present safety requirements and emphasize the need for a risk assessment and risk reduction approach for collaborative robots using these standards. A similar need of applying standards for risk assessment and risk reduction, specially focused on collaborative robots, has also been observed in research institutes [17]. Recent works, from 2016 onwards, have made improvements in HRC risk assessment and reduction by identifying the operation modes according to the standard, presenting collaborative scenarios, and identifying different human body areas that could potentially get injured. The work of [6] presents a structured description of HRC scenarios and performs risk assessment and hazard analysis using the Failure Modes and Effects Analysis (FMEA) method [18] for industrial robots used in manufacturing. This work is quite similar to ours in its approach, but different in terms of the use case, robot mobility and the operations performed by the robots. We also define different human roles in our use cases and our HRC scenarios. Further, we present the details of the implementation and how the risk assessment will be evaluated, which is completely missing in [6].

Askarpour et al. [19] present a methodology to perform semi-automated safety analysis of HRC applications. The methodology aims at applying formal verification methods to HRC tasks to identify possible hazardous situations and mitigate them. The proposal performs offline verification of such tasks and assumes a 'human-in-the-loop' for providing the mitigation strategies for unsafe situations. In [20], the authors further extend the work by adding a model of the operator's behaviour as an attempt to deal with unpredictable human behavior. The authors employ a cognitive model of the operator, capturing erroneous human behavior driven by the operator's perception of the environment and mental decisions. Although the approach of having the operator's cognitive model is extremely interesting, we argue that the complexity of modeling human behaviour, and more specifically possible erroneous behaviors, makes it an almost insurmountable task.

We use HAZard OPerability (HAZOP) [21] in this work, which is a guideword-based technique to identify hazards/risks in the risk identification phase. The technique mainly focuses on operational hazards. It is relatively new as compared to other techniques like FMEA and Preliminary Hazard Analysis (PHA), with the advantages of controlling the model complexity and delivering a safety document for certification. Further, HAZOP-UML analysis can identify all hazards that PHA covers, and additional ones [21]. The work of [22] tested this technique with two systems: a robotic mobile manipulator and a robot that assists disabled people. A HAZOP-UML tool has been developed and used for a walking assistance robot [21]. The tool was also used for an airport light measurement robot [23].
Fig. 1: Illustration of human-robot interactions in the warehouse. The small boxes in red, green and yellow are products on the shelves and on the conveyor belts, white squares on the floor (next to the conveyor belts and to the shelves) are waypoints and blue cylinders are the robots' recharging stations.

III. DESCRIPTION OF USE CASE AND HUMAN-ROBOT COLLABORATION

The first part of this section presents the details of an automated warehouse use case and describes collaborative and non-collaborative scenarios to be performed safely inside the warehouse. The later parts of this section identify the different human workers that interact with the robots and describe their hazard exposure, their skill level, their frequency of collaboration, and unsafe scenarios that could arise, along with safety recommendations. The unsafe scenarios lead to hazard identification and to safety recommendations that will later be used in the risk mitigation step.

We consider a use case of an automated warehouse where autonomous mobile robots and humans share a common place and work together to move products to the delivery truck. Multiple mobile robots perform pick-and-place operations by picking up products from the shelves and delivering them to conveyor belts, which in turn take the products to the trucks. Each robot is equipped with a robotic arm for the pick-and-place operation. Human workers interact with the shelves by placing or moving products on them. In particular, placing the products is a complex task that involves choosing the ordered products, therefore it is performed by the human workers. Figure 1 shows the simulated warehouse with the collaborative robots and a human worker.

A. Human-robot collaboration scenarios and their safety requirements

A human worker and a robot can come into close interaction with each other around the shelf when the former is placing the product and the latter is picking it up, thus leading to severe safety risks. Other situations can include human intervention when a product is dropped by the robot and a worker comes to remove it, when a worker enters the warehouse for the maintenance of a broken robot while other robots are moving, or when a visitor (e.g. an external worker) enters the warehouse. Thus, most of the collaborations will happen around the shelves and on the warehouse floor. Proper safety measures need to be adopted and safety must be ensured in all these scenarios.

We have provided an overall architecture containing the major components of an automated warehouse, including a warehouse controller, a planning service, and a two-layered safety strategy, in [24]. The presented safety strategy consists of an offline safety analysis (performed before sending the tasks to the robot) and an online safety analysis (performed at runtime and inside the robot's control loop). The warehouse controller is a digital actor in the system that receives high-level goals from the warehouse manager through a graphical user interface (GUI), uses the planning service to generate a high-level plan for the warehouse to accomplish the goal, and checks the generated plan for safety constraints using the offline safety analysis before sending it to the robots. For instance, the plan ensures that two or more robots will not be at the same position at a particular time (see the sketch below). If the plan is correct then the controller assigns the resources (e.g. number of robots and number of conveyor belts) to be used to fulfill the current plan and then finally sends the verified plan (tasks) to the robots. The task assignment to the robots is shown in Figure 2. The warehouse controller, planning and offline safety analysis are briefly presented here only to provide an overview of the complete system and are not the focus of this paper.

As mentioned before, each robot receives a high-level plan (task) that has already been checked in the warehouse controller (offline) for safety constraints. However, different situations can still arise when the products are placed closely and there is a chance of collision with nearby robots. Close encounters can also happen during navigation from the shelves to the conveyor belts and when the robots are coming back towards the shelves after delivering the products. Online safety analysis is performed at runtime for this purpose using the risk management process and is the main focus of this paper.

Fig. 2: Use cases for collaborative scenarios in the automated warehouse (UC01: Collaborative operation, UC02: Placing/replacing products on shelf, UC03: Cleaning, UC04: Assigning replacing tasks, UC05: Updating software and monitoring robot behavior, UC06: Monitoring, UC07: Visiting the area), involving the Robot, Collaborative Worker, Co-existing Worker, External Worker/Visitor, Engineer, Manager and Warehouse Controller.
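As a concrete illustration of the offline position-conflict check mentioned above, the following minimal sketch assumes that each robot's plan is a list of (time step, waypoint) pairs; this data layout is an assumption for illustration, not the actual plan format used by the warehouse controller.

```python
# Hypothetical sketch of the offline safety check: verify that no two robots
# are assigned the same waypoint at the same time step.
# Plan format (robot id -> list of (time_step, waypoint)) is an assumption.

from collections import defaultdict
from typing import Dict, List, Tuple

Plan = List[Tuple[int, str]]  # (time step, waypoint id)

def find_position_conflicts(plans: Dict[str, Plan]) -> List[Tuple[int, str, List[str]]]:
    occupancy = defaultdict(list)  # (time step, waypoint) -> robots claiming it
    for robot, plan in plans.items():
        for t, waypoint in plan:
            occupancy[(t, waypoint)].append(robot)
    # Keep only the slots claimed by more than one robot.
    return [(t, wp, robots) for (t, wp), robots in occupancy.items() if len(robots) > 1]

plans = {
    "robot_0": [(0, "shelf_0"), (1, "conveyor_1")],
    "robot_1": [(0, "shelf_1"), (1, "conveyor_1")],  # clash at t=1
}
print(find_position_conflicts(plans))  # [(1, 'conveyor_1', ['robot_0', 'robot_1'])]
```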
We consider the following robot states in our use case:
• Manipulation: The robot arm picks up or places a product.
• Navigation: The robot's platform is moving towards a waypoint (i.e. a shelf or conveyor belt). For safety reasons and to reduce hazard complexity, we assume that the robot platform stands still during manipulation, and vice versa.
• Idling: The robot is standing still because it is either waiting for the next task or has had some technical problem (e.g. battery out of charge).
• Charging: The robot recharges itself at the charging station.

Fig. 3: Risk management process, its components and inputs/outputs. The ROS nodes that are executed in the robot are depicted in the Robot Node. The scene graph produces semantic information about the environment which is used in the risk analysis.

Before quantifying the risks associated with our collaborative robots, we present a systematic description of the collaborative scenarios from the safety perspective. We first identify the different roles and their collaborative interactions, and then explore unsafe scenarios and safety issues.

1) Description of the roles in our collaborative scenario: In order to find hazards that could arise, we first need to identify all humans who might be exposed to a hazard, their skill levels and the frequency of exposure. For our use case, we describe the different human roles in Table I, with the respective explanations about their skill levels and frequency of exposure in each column. In the table, by a "skilled" person we mean that the necessary safety certification courses have been undertaken, and by "trained" we mean that training to work collaboratively with robots has been attained. This includes instructions on coming close to the robot, understanding robot behavior (e.g. gradual speed reduction when the human gets closer) or even making physical interactions with it.

2) Description of use cases and unsafe scenarios: After describing the different roles and their exposure levels, we present their interactions with the robots in Figure 2. We briefly describe the use cases in Table II. The table further explains unsafe interaction scenarios and safety issues in the use cases, and presents safety requirements and recommendations that will be used in the risk reduction phase.

B. Risk Assessment

This section presents the concept of the risk management process that has the capability of managing (identifying, assessing and mitigating/reducing) safety risks for our collaborative scenarios. The four main phases of a classical Risk Management Process are 1) Hazard and Risk Identification; 2) Risk Analysis; 3) Risk Evaluation; and 4) Risk Mitigation (also called Risk Reduction or Treatment) [25], as shown in the red boxes of Figure 3. The main focus of this paper is on risk assessment, which is the overall process of hazard identification and risk analysis (i.e. the first two phases of risk management). Risk assessment enhances the understanding of risks, their causes, frequencies, consequences, and probability.

The first phase of the risk assessment is hazard and risk identification. We conduct it manually by identifying and then describing all possible existing threats in our HRC scenario. Additionally, the possible consequences and damages to the human and to other objects are also catalogued. There are several methods to perform this phase, such as Preliminary Hazard Analysis (PHA), HAZard OPerability analysis (HAZOP), Fault Tree Analysis (FTA), and Failure Modes and Effects Analysis (FMEA). PHA is a simple but inductive method in which hazards for a specific scenario are identified from hazard checklists of a standard (e.g. ISO 12100 [26] Annex B: Examples of hazards, hazardous situations and hazardous events). However, the robotic standards [3], [4], [1] do not include hazards for the HRC scenario. We apply the HAZOP method, which is a structured and systematic examination approach to identify hazards, and is suitable for our HRC use case. HAZOP first models the scenarios by use case diagrams, sequence diagrams and state-machine diagrams. Then, attributes and guidewords are used to generate deviations. The hazards list is obtained after merging redundant deviations and removing meaningless deviations (see the sketch below). A detailed list of hazards for our HRC use case, along with the identification of their type, consequences and affected human body area, is presented in Table III.

The last three phases of the risk management process are performed inside the robot's control loop, as shown in Figure 3. The risk analysis phase comprehends the nature of risks and determines the level of risk, including risk estimation [26]. In this phase we identify the key entities, the attributes of the entities, and the relationships among the attributes, and then perform risk estimation. The key entity in our case is the shared workspace, which consists of some static objects (e.g. shelves, products, conveyor belts and dock stations) and some dynamic objects (other robots and human workers) (see Figure 3). We use the scene graph in this phase to identify the entities and their attributes based on the identified hazards. This consists of gathering sensor data from the warehouse and then processing the camera image through the scene graph module (details in Section IV-B), which outputs the corresponding scene graph.
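To make the guideword step described above concrete, a minimal sketch of how deviations can be enumerated from attributes and HAZOP guidewords is given below; the attribute and guideword lists shown are illustrative examples only, not the full set used in the analysis.

```python
# Illustrative sketch of HAZOP deviation generation: combine attributes of a
# use-case step with guidewords, then drop the combinations the analyst has
# judged meaningless. The lists below are examples, not the full study set.

ATTRIBUTES = ["robot speed", "separation distance", "gripping force"]
GUIDEWORDS = ["no", "more", "less", "other than"]

MEANINGLESS = {("separation distance", "no")}  # filtered out by the analyst

def generate_deviations(attributes, guidewords, meaningless):
    deviations = []
    for attribute in attributes:
        for guideword in guidewords:
            if (attribute, guideword) in meaningless:
                continue
            deviations.append(f"{guideword} {attribute}")
    return deviations

for deviation in generate_deviations(ATTRIBUTES, GUIDEWORDS, MEANINGLESS):
    print(deviation)  # e.g. "more robot speed", "less separation distance", ...
```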
TABLE I: Description of the different roles in our use case and their collaboration with the robots

Collaborative Worker. Expertise: skilled, trained and experienced. Description: has close collaboration with the robots; has proper training on working collaboratively with the robots and understanding robot behavior. Degree of collaboration with robot: close collaboration. Frequency of collaboration: regular, daily work.

System Engineer. Expertise: skilled, trained and experienced. Description: has technical knowledge on how the robot works and performs, understands its behavior, and is responsible for the development/maintenance of the robots. Degree of collaboration with robot: close collaboration. Frequency of collaboration: occasional, only for updates or when some error occurs.

Manager. Expertise: non-skilled, untrained. Description: responsible for administrative operations and management of the warehouse and the assets; has low knowledge of the robot behavior and rarely interacts physically with the robots. Degree of collaboration with robot: no interaction. Frequency of collaboration: no collaboration.

Co-existing Worker. Expertise: skilled, trained. Description: shares the same place as the robots but has occasional interaction with them; has proper training and a shallow understanding of the robot behavior. Degree of collaboration with robot: close collaboration. Frequency of collaboration: occasional.

External Worker / Visitor. Expertise: untrained. Description: workers that do not pertain to the warehouse but have access to this place. Degree of collaboration with robot: no close collaboration. Frequency of collaboration: very rare.
TABLE II: Description of use cases, unsafe scenarios and safety recommendations

UC01. Use case and unsafe scenario: The worker interacts with the robots in a collaborative way (working very closely to a robot) while placing products on the shelf. Safety requirements/recommendations: The robot must adjust its behavior and always keep the necessary safety distance from the worker according to the standards; the adjustments can include reducing the robot's speed and stopping the robot if the worker is within a certain range. The worker must be trained and have knowledge about the robot's behaviour, such as how much distance to keep from the robot while working safely and efficiently. Feedback (visual and/or auditory) must be provided to inform the worker about the current robot behaviour, which can help to anticipate the robot's movement; e.g. the robot's stopped state can be presented as a red light on top of the robot and its slow movement as a yellow light.

UC02. Use case and unsafe scenario: The worker takes different products from the storage and places them on the shelf, from where the robot will pick up the products and deliver them to the conveyor belt. The products should be carefully positioned so that the robot can easily pick them up; if a product is placed at an unusual position or shifted, the robot may have difficulties or may be unable to pick it up. Safety requirements/recommendations: All recommendations from UC01. Products are to be placed at specific positions. The worker is trained to identify the reason why the robot is not able to pick up the product. Continuous information about robot functionality is provided, which can help to identify the reason.

UC03. Use case and unsafe scenario: A worker needs to remove items that the robot has dropped on the floor. When this happens, the worker enters the collaborative area and goes towards the dropped item. He/she is comfortable coming close to the robots in a safe way to perform this task, without interfering with the robots' activities. Safety requirements/recommendations: All recommendations from UC01.

UC04. Use case and unsafe scenario: If a robot breaks down during its operation or a deadlock occurs in the system, then the manager sends a technician. The worker enters the collaborative area to replace or move out the robot in the presence of other working robots. Safety requirements/recommendations: All recommendations from UC01. The technician identifies the problem (e.g. a drained-out battery) and logs it.

UC05. Use case and unsafe scenario: The collaborative worker informs the engineer about a problem in the robot's normal behavior. Sometimes the robot initiates actions which were not anticipated by the worker; this could be a problem due to an error in the algorithm or due to incorrect parameters of the algorithms. Safety requirements/recommendations: All recommendations from UC01 for the collaborative worker. The engineer must perform systematic tests of the algorithms before deploying them in the robots. The engineer can work together with the worker in order to be informed about the most common problems that the worker notices while working with the robots; this helps to adjust the robot's software and to identify the need for additional training of the workers.

UC06. Use case and unsafe scenario: The manager's interaction with the robots happens mostly through the warehouse management interface. The manager verifies and approves the high-level plans being sent to the robots and expects them to be accomplished in a safe and timely manner. Safety requirements/recommendations: Continuous information about the robots' functionality must be displayed through some user interface.

UC07. Use case and unsafe scenario: A visitor gets inside the warehouse and moves around along with the mobile robots. He/she also observes the pick-up operation around the shelf and may want to place products for the robot or wish to touch or come close to the robot. Safety requirements/recommendations: All recommendations from UC01. The visitor should be provided with some basic information or training about the collaborative robots, so that he/she can anticipate the robot's behaviour (e.g. the distances at which the robot will slow down or stop completely to keep him/her safe).
To formalize this problem, the obstacles are initially classified as static objects, mobile objects, humans and one special object: the dock station. The first three require increased function complexity and performance. The last one, the dock station, breaks the common object strategy because it is used to charge the robot, where the robot must park closely and in a proper direction.

For the risk evaluation and risk mitigation phases, online safety analysis is to be followed and implemented. We propose dynamically changing, three-layered safety fields/zones around
TABLE III: Description of hazards for collaborative operations

HN1. Task: pick-up operation. Problem: the product is not properly placed and the robot fails. Hazard: the robot cannot pick up the product because either the product is not present on the shelf or it is not placed at a proper place. Type: temporal. Consequence: time loss. Body area: none.

HN2. Task: pick-up operation while the worker is placing/replacing products. Problem: the human is very close to the robot. Hazard: physical human injury from transient contact between the gripper and the hand, followed by clamping and dragging along of the hand while the planned pick-and-place task continues. Type: mechanical. Consequence: human injury (gripping). Body area: back of the worker's hand.

HN3. Task: the robot navigates its arm to pick up a product. Problem: the human is very close to the robot. Hazard: physical human injury; the robot's moving arm can hit the worker's body. Type: mechanical. Consequence: human injury (impact). Body area: upper part of the body.

HN4. Task: the manipulator drops the product and a worker is nearby. Problem: the product is not held properly and the robot drops the product close to the human, who can get hurt. Hazard: physical human injury on the worker's foot or leg due to the fallen product. Type: mechanical. Consequence: human injury (crushing). Body area: foot / leg.

HN5. Task: robot navigation while a worker/visitor is moving close by. Problem: the human is very close to the moving robot. Hazard: physical human injury on the worker's body due to the moving robot. Type: mechanical. Consequence: human injury (impact). Body area: lower part of the body.

HN6. Task: the robot navigates while a worker is cleaning the floor close by or comes to replace the robot. Problem: the human is very close to the moving robot. Hazard: physical human injury on the worker's body due to the moving robot. Type: mechanical. Consequence: human injury (impact). Body area: lower and/or upper parts of the body.

HN7. Task: place operation. Problem: the product cannot be placed properly. Hazard: no place for the product because the conveyor belt is not moving. Type: temporal. Consequence: time loss. Body area: none.

HN8. Task: place operation. Problem: the product cannot be placed properly. Hazard: the robot is not able to release its grip on the product properly. Type: mechanical. Consequence: financial loss. Body area: none.

HN9. Task: change in the robot's behavior due to new/updated software. Problem: the robot does not behave as anticipated by the worker. Hazard: physical injury or stress to the collaborative worker due to unexpected behavior. Type: communication. Consequence: human physical/mental injury. Body area: any body area.

HN10. Task: multiple robots are moving close to each other. Problem: proximity sensor failure or software error. Hazard: damage due to robot collision. Type: mechanical. Consequence: financial loss. Body area: none.

HN11. Task: pick-up and/or place operations. Problem: improper force limitation or force control failure. Hazard: property damage on fragile products due to the robot. Type: mechanical. Consequence: financial loss. Body area: none.

HN12. Task: the robot is performing a task. Problem: software error. Hazard: failure to switch modes when a reaction is needed. Type: software. Consequence: financial loss. Body area: none.

HN13. Task: the robot is performing a task. Problem: software or hardware error. Hazard: false emergency stop. Type: software/hardware. Consequence: financial loss. Body area: none.

HN14. Task: the robot is performing a task. Problem: software or hardware error. Hazard: robot shutdown during a task. Type: software/hardware. Consequence: time loss. Body area: none.

HN15. Task: the robot is performing a task. Problem: software error. Hazard: false alarm or indicator light. Type: software. Consequence: time loss; physical/mental injury. Body area: any body area.
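A few of the identified hazards from Table III, encoded as simple records that the later risk analysis phase could consume; the record layout is our sketch (an assumption), while the values come from the table.

```python
# Sketch of entries of Table III as structured records; the dataclass layout is
# an assumption, the values are taken from the table.
from dataclasses import dataclass

@dataclass
class Hazard:
    hazard_id: str
    task: str
    hazard_type: str
    consequence: str
    body_area: str

HAZARDS = [
    Hazard("HN2", "Pick-up while the worker places/replaces products",
           "Mechanical", "Human injury: gripping", "Back of the hand"),
    Hazard("HN5", "Robot navigation near a moving worker/visitor",
           "Mechanical", "Human injury: impact", "Lower part of the body"),
    Hazard("HN10", "Multiple robots moving close to each other",
           "Mechanical", "Financial loss", "None"),
]
```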

the robot for its safe navigation and manipulation. The fields/zones are categorized as red, yellow and green. The sizes of the zones will be taken from the standards [1]. If an object (obstacle) is identified far from the robot (in the green zone) then we evaluate it as safe and there is no risk; if the object is identified a bit closer (in the yellow zone) then there is a moderate level of risk and the robot may need to reduce its speed depending on the object type and its distance; and when the object is identified very close (in the red zone) then the risk is high and the robot must stop immediately. This information will be used in the risk evaluation phase to calculate the magnitude of the risk (i.e. no risk, low, moderate, high, very high). Based on this risk magnitude, the safety rules will be generated and used in the next risk mitigation phase. To implement the online safety analysis, we intend to use an Artificial Intelligence (AI) based algorithm (e.g. fuzzy logic or a neuro-fuzzy algorithm) in the future.
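A minimal sketch of the three-zone evaluation just described is given below. The numeric zone boundaries are placeholders (the actual sizes are to be taken from ISO/TS 15066), and the zone-to-action mapping is our illustration rather than the final risk evaluation algorithm.

```python
# Minimal sketch of the proposed three-layered safety zones. The numeric zone
# boundaries are placeholders; the paper takes the actual sizes from the
# standards. The zone-to-action mapping is illustrative only.

RED_ZONE_M = 0.5      # placeholder boundary, not from the standard
YELLOW_ZONE_M = 1.5   # placeholder boundary, not from the standard

def evaluate_zone(distance_m: float) -> str:
    if distance_m <= RED_ZONE_M:
        return "red"
    if distance_m <= YELLOW_ZONE_M:
        return "yellow"
    return "green"

def speed_scale(zone: str, is_human: bool) -> float:
    """Return a velocity scaling factor for the robot platform."""
    if zone == "red":
        return 0.0                       # stop immediately
    if zone == "yellow":
        return 0.3 if is_human else 0.6  # slow down more for humans
    return 1.0                           # green: no restriction

# Example using the distances of Figure 5 at time t0: the worker at 0.82 m
# from Robot#0 falls into the (assumed) yellow zone, so the robot slows down.
zone = evaluate_zone(0.82)
print(zone, speed_scale(zone, is_human=True))  # yellow 0.3
```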
IV. IMPLEMENTATION SETUP

This section presents our implementation setup along with the semantic description of the environment in the form of a scene graph. The scene graph is used to represent the knowledge of the robot's visual perception and to enable environment analysis at a semantic level² (Figure 3).

² The simulated warehouse prototype and the code are available at https://github.com/EricssonResearch/scott-eu/tree/simulation-ros/simulation-ros/

A. Simulated Warehouse

For simulation purposes, we use the Virtual Robot Experimentation Platform (V-REP) [27] to model all the above mentioned collaborative scenarios of our use case. The simulator comes with an integrated development environment in which the physical models can be created and controlled. Figure 1 depicts the simulated warehouse that was modeled with all its physical components, i.e. shelves, products, conveyor belts, robots, charging stations, and the human workers. The simulated robot is a Turtlebot2i³, which is equipped with a robotic arm and two 3D cameras.

³ Turtlebot2i robot specifications: http://www.trossenrobotics.com/interbotix-turtlebot-2i-mobile-ros-platform.aspx

Fig. 4: ROS architecture employed to simulate the warehouse and the robots. The main components of the architecture are the V-REP simulator (red box) and the ROS nodes (gray box).
We use the Robot Operating System (ROS), which is a flexible and widely used framework for developing robot software. The main advantages of ROS are code reusability, the abstraction of the low-level code and support for several robot models. The data generated by V-REP is converted to ROS messages using the ROS interface.

In the simulation environment, we use a single ROS master that centralizes the communication between the simulated robots and V-REP. The V-REP remote API is specifically used to control the robotic arms through socket communication. The main components of the warehouse simulation architecture are V-REP, the Robot ROS nodes and the ROS master. Figure 4 presents an overview of the architecture with its components and the communication between them. Details of these components and a description of their functionalities are as follows:

V-REP: models all the physical components of the warehouse. It also simulates the behavior of the warehouse and produces visualization through its GUI. It is important to highlight that the robots' control logic is implemented using ROS and is therefore not coupled to V-REP; V-REP only controls the behaviors of the conveyor belts, shelves, trucks, and workers. The main motivation for keeping the robot code separately in ROS nodes is to increase code reusability between the simulation and real-world scenarios.
Robot ROS Nodes: contain all the algorithms responsible for processing the data coming from the sensors in the robot's control loop. All methods were modeled using ROS libraries and all data is formatted in the ROS message structure. Some of the ROS nodes are: the scene graph module, the risk reduction or mitigation algorithm, mapping/localization, obstacle detection, and path planning. Path planning, based on the Navigation Function 1 (NF1) algorithm [28], performs the robot navigation by combining the localization, mapping and obstacle detection nodes. During navigation, the obstacles' boundaries are inflated proportionally to the robot size for a safer robot movement. The navigation module also relies on a local path planner to deal with dynamic obstacles. Thus, the robot can make local changes without modifying the global path.
ROS Master: centralizes the communication between the components of the architecture and is responsible for establishing the communication between node pairs.
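A minimal sketch of how one of these robot-side nodes could be wired up with rospy is shown below; the topic names and the string serialization of the scene graph are assumptions for illustration, not the actual interfaces of the repository.

```python
#!/usr/bin/env python
# Hypothetical sketch of a robot-side ROS node: subscribe to the camera image,
# build/update the scene graph and publish it for the risk analysis node.
# Topic names and message types are assumptions, not the repository's real API.
import rospy
from sensor_msgs.msg import Image
from std_msgs.msg import String

class SceneGraphNode(object):
    def __init__(self):
        self.pub = rospy.Publisher("/robot0/scene_graph", String, queue_size=1)
        rospy.Subscriber("/robot0/camera/rgb/image_raw", Image, self.on_image)

    def on_image(self, image_msg):
        # Placeholder: real code would run object detection/classification here
        # and rebuild the scene graph from the detected objects.
        graph_description = "warehouse -> floor -> [detected objects]"
        self.pub.publish(String(data=graph_description))

if __name__ == "__main__":
    rospy.init_node("scene_graph_node")
    SceneGraphNode()
    rospy.spin()
```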
B. Semantic Representation of the Environment and Initial Results
In this work, instead of analyzing only the object detection output, we include contextual information with the detected objects in order to get a richer representation of the environment. We use a scene graph [29] for this purpose, which incorporates the semantic relationships between the objects. Additionally, scene graphs include positional information of the objects, which further enhances the contextual information. We use the scene graph for two main purposes: first, for environment perception (as mentioned before) and second, for risk mitigation, instead of using raw sensor data.

The scene graph is represented as a directed graph where nodes denote objects (e.g. conveyor belt, shelf and robot) or human workers, and the edges denote the spatial (e.g. on, below, beside) or other semantic relationships between two nodes. The nodes are obtained after performing object detection and classification on the robot's camera images. These nodes store static (e.g. shape and size) or dynamic (e.g. pose, velocity and acceleration) properties of the objects. The root node is associated with a location (e.g. office, floor, factory) and subsequent child nodes represent the objects present in this location. To construct the graph from the robot's list of detected objects, the objects that are in contact with the leaf nodes are added as child nodes and the process is repeated until all elements in the list have been analyzed. Using this method, a separate scene graph is generated for each robot and the graph is dynamically updated whenever any change is observed in the robot's detections.
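A minimal sketch of the graph construction just described is given below, simplified to a single level (every detected object sits "on" the floor). The dictionary layout and the warehouse-to-floor edge label are assumptions; the example detections reproduce Robot#0 at time t0 in Figure 5.

```python
# Sketch of per-robot scene graph construction: the root is the location,
# "floor" connects all objects, and each detected object is added below the
# floor node with its distance to the robot. Data layout is an assumption.

def build_scene_graph(location, floor_size, detections):
    graph = {
        "root": location,
        "nodes": {location: {}, "floor": {"size": floor_size}},
        "edges": [(location, "floor", "has")],  # edge label assumed
    }
    for obj in detections:
        graph["nodes"][obj["name"]] = {"distance_m": obj["distance_m"]}
        graph["edges"].append(("floor", obj["name"], "on"))
    return graph

# Detections of Robot#0 at time t0, with the distances shown in Figure 5b.
robot0_t0 = [
    {"name": "Worker", "distance_m": 0.82},
    {"name": "ConveyorBelt#1", "distance_m": 2.13},
]
print(build_scene_graph("warehouse", "25x25", robot0_t0))
```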
Figure 5 presents the dynamically generated scene graphs at two different times (t0 and t1) for two robots navigating in the simulated warehouse. The root node is always the "warehouse" and has a child node "floor", which is the element that connects all the objects in the scene. The objects detected by the robot are added below the "floor" node and the edge is labeled with "on", which represents the placement of the object. With this representation the robot can pay special attention when the "worker" node is added, and the risk assessment can use the contextual information provided by this graph to generate safe behavior.
Fig. 5: Dynamically generated scene graphs by two robots, labeled as Robot#0 and Robot#1, based on the objects detected at two different time stamps (t0 and t1) during the simulation. Distance, size and velocity units are in m, m² and m/s, respectively. Panels: (a) scenario at time t0; (b) scene graph from Robot#0 at t0; (c) scene graph from Robot#1 at t0; (d) scenario at time t1; (e) scene graph from Robot#0 at t1; (f) scene graph from Robot#1 at t1.

In the presented graphs, each child node has a distance attribute, which corresponds to the distance of the object to the corresponding robot. The robots used a monocular RGB camera for object detection. At time t0, the robots are stopped (their velocities are 0.00) and the worker is passing in front of ConveyorBelt#1 (Figure 5a). In this setting, Robot#0 detects the

Worker and ConveyorBelt#1, while Robot#1 detects Shelf#0, Shelf#1 and DockStation#1. These detections are reflected in the generated scene graphs presented in Figures 5b and 5c, respectively. At t1, Robot#0 is moving towards ConveyorBelt#1 and Robot#1 towards Shelf#1 (Figure 5d). At this moment, Robot#0 can no longer observe the Worker and Robot#1 stops detecting DockStation#1, thus these nodes are removed from the graphs (Figures 5e and 5f).

Regarding the computational performance, a single scene graph construction takes approximately 150 ms. Currently, the risk reduction is based on adjusting the robot's direction and velocity, taking into account its distances to the perceived objects. The robots' distances to objects and their velocities are illustrated in Figure 5.

V. DISCUSSION

Conducting risk related testing experiments directly in real environments can be dangerous and consumes a lot of time and resources. The simulated warehouse setup makes it possible to conduct these experiments in a safe and efficient manner before deploying the algorithms in real robots.

The presented setup enables the risk management process for safe HRC. Currently, we are using the scene graph, which provides a contextual and semantic based representation of the environment. It still requires some investigation on how to add safety and risk related information to this representation to leverage the risk analysis. After implementing an AI-based algorithm, we intend to use this setup to evaluate our safety approach. In this way, the proposed safety aspects will be checked by verifying the robot behavior in the presence of humans, both in simulated and real-world scenarios.

We have identified a limitation in obtaining the complete list of hazards. The HAZOP method is employed as it can identify more hazards than other methods, such as Preliminary Hazard Analysis [21], but it cannot identify all possible hazards.

Our goal is to evaluate the robot system using a set of Key Performance Indicators (KPIs) based on the safety requirements and the overall performance of the warehouse (e.g. the number of delivered products). An interesting topic is to study the trade-off for questions like: What is the effect on the robot's performance when using the safety analysis approach? How much is the safety of the system improved by using the safety analysis approach? Has the number of possible collisions/risky situations been reduced?

Studying human trust in machines is also an important aspect. Human trust in the automated system should not be blind. Excessive trust could be as harmful (or even more so) as a lack of it. Therefore, the concept of calibrated trust [30] should be explored, along with how this calibrated trust can be developed and achieved.

VI. CONCLUSIONS AND FUTURE WORK

Human-robot collaboration (HRC) is expected to increase both productivity and performance. However, it causes new hazardous situations that must be avoided through proper risk assessment and risk reduction, without compromising human or robot productivity. In this perspective, we have presented a systematic risk assessment approach applied to an
automated warehouse use case. We have identified different humans working at different interaction levels with robots and we have presented their respective safety requirements. We identified a list of hazards for possible HRC scenarios using the HAZOP method and presented a risk analysis based on these hazards. Additionally, we presented our simulation setup based on V-REP along with the proposed ROS architecture, and described the usage and advantage of the scene graph for the risk management process.

Although this work focuses on an automated warehouse scenario, most of the techniques and algorithms can be applied to different contexts. The overall safety solution coupled with the scene graph generation could be used in any scenario where humans and robots need to coexist (e.g. an office environment or health care). Furthermore, all basic navigation, planning and obstacle avoidance strategies are agnostic to the context and could be used generally.

Currently, we are looking into a suitable AI based algorithm for the risk reduction phase. This will be implemented as a ROS node and the output of this algorithm will be the basis to implement and evaluate the three-layered zone safety strategy. We also intend to set up the real robots and test our risk management in the real environment. Another future direction could be to work on trust aspects to bring the safety evaluation closer to the real world.

ACKNOWLEDGEMENT

SCOTT (www.scott-project.eu) has received funding from the Electronic Component Systems for European Leadership Joint Undertaking under grant agreement No 737422. This Joint Undertaking receives support from the European Union's Horizon 2020 research and innovation programme and Austria, Spain, Finland, Ireland, Sweden, Germany, Poland, Portugal, Netherlands, Belgium, Norway.

REFERENCES

[1] ISO. ISO/TS 15066:2016 Robots and robotic devices – Collaborative robots. International Organization for Standardization, Geneva, Switzerland, February 2016.
[2] S. Robla-Gómez, V. M. Becerra, J. R. Llata, E. González-Sarabia, C. Torre-Ferrero, and J. Pérez-Oria. Working together: A review on safe human-robot collaboration in industrial environments. IEEE Access, PP(99):1–1, 2017.
[3] ISO. ISO 10218-1 (2011): Robots and robotic devices – Safety requirements for industrial robots – Part 1: Robots. International Organization for Standardization, Switzerland, July 2011.
[4] ISO. ISO 10218-2 (2011): Robots and robotic devices – Safety requirements for industrial robots – Part 2: Robot systems and integration. International Organization for Standardization, Switzerland, July 2011.
[5] Fanny Platbrood and Otto Görnemann. Safe robotics – Safety in collaborative robot systems. SICK AG white paper, 2017.
[6] B. Matthias, S. Kock, H. Jerregard, M. Källman, and I. Lundberg. Safety of collaborative industrial robots: Certification possibilities for a collaborative assembly robot concept. In 2011 IEEE International Symposium on Assembly and Manufacturing (ISAM), May 2011.
[7] M. J. Rosenstrauch and J. Krüger. Safe human-robot-collaboration – introduction and experiment using ISO/TS 15066. In 2017 3rd International Conference on Control, Automation and Robotics (ICCAR), pages 740–744, April 2017.
[8] Jérémie Guiochet. Hazard analysis of human-robot interactions with HAZOP-UML. Safety Science, 84:225–237, 2016.
[9] E. Guizzo. Three engineers, hundreds of robots, one warehouse. IEEE Spectrum, 45(7):26–34, July 2008.
[10] L. Sabattini, M. Aikio, P. Beinschob, M. Boehning, E. Cardarelli, V. Digani, A. Krengel, M. Magnani, S. Mandici, F. Oleari, C. Reinke, D. Ronzoni, C. Stimming, R. Varga, A. Vatavu, S. Castells Lopez, C. Fantuzzi, A. Mayra, S. Nedevschi, C. Secchi, and K. Fuerstenberg. The PAN-Robots project: Advanced automated guided vehicle systems for industrial logistics. IEEE Robotics & Automation Magazine, PP(99):1–1, 2017.
[11] ISO. ISO 13849-1:2016 Safety of machinery – Safety-related parts of control systems – Part 1: General principles for design. International Organization for Standardization, Geneva, Switzerland, January 2016.
[12] A. M. Zanchettin, N. M. Ceriani, P. Rocco, H. Ding, and B. Matthias. Safety in human-robot collaborative manufacturing environments: Metrics and control. IEEE Transactions on Automation Science and Engineering, 13(2):882–893, April 2016.
[13] J. T. Chuan Tan, F. Duan, Y. Zhang, R. Kato, and T. Arai. Safety design and development of human-robot collaboration in cellular manufacturing. In 2009 IEEE International Conference on Automation Science and Engineering, pages 537–542, Aug 2009.
[14] R. Krug, T. Stoyanov, V. Tincani, H. Andreasson, R. Mosberger, G. Fantoni, and A. J. Lilienthal. The next step in robot commissioning: Autonomous picking and palletizing. IEEE Robotics and Automation Letters, 1(1):546–553, Jan 2016.
[15] MTCOS. Safety of industrial trucks – Driverless trucks and their systems. Technical report, EN 1525, 1998.
[16] IEC. IEC 62061: Safety of machinery – Functional safety of safety-related electrical, electronic and programmable electronic control systems. International Electrotechnical Commission, Geneva, Switzerland, 2016.
[17] Safety requirements and standardisation for robots: Software dos and don'ts. https://rosindustrial.squarespace.com/s/ROS-I-Conf2016-day2-06-jacobs.pdf. Accessed: 2018-04-19.
[18] Z. De Lemos. FMEA Software Program for Managing Preventive Maintenance of Medical Equipment. 2004.
[19] Mehrnoosh Askarpour, Dino Mandrioli, Matteo Rossi, and Federico Vicentini. SAFER-HRC: Safety analysis through formal verification in human-robot collaboration. In Computer Safety, Reliability, and Security, pages 283–295, Cham, 2016. Springer International Publishing.
[20] Mehrnoosh Askarpour, Dino Mandrioli, Matteo Rossi, and Federico Vicentini. Modeling operator behavior in the safety analysis of collaborative robotic applications. In Computer Safety, Reliability, and Security, pages 89–104, Cham, 2017. Springer International Publishing.
[21] Jérémie Guiochet, Quynh Anh Do Hoang, Mohamed Kaâniche, and David Powell. Model-based safety analysis of human-robot interactions: The MIRAS walking assistance robot. In Rehabilitation Robotics (ICORR), 2013 IEEE International Conference on, pages 1–7. IEEE, 2013.
[22] Damien Martin-Guillerez, Jérémie Guiochet, David Powell, and Christophe Zanon. A UML-based method for risk analysis of human-robot interactions. In Proceedings of the 2nd International Workshop on Software Engineering for Resilient Systems, pages 32–41. ACM, 2010.
[23] L. Masson, J. Guiochet, H. Waeselynck, A. Desfosses, and M. Laval. Synthesis of safety rules for active monitoring: Application to an airport light measurement robot. In 2017 First IEEE International Conference on Robotic Computing (IRC), pages 263–270, April 2017.
[24] Rafia Inam, Elena Fersman, Klaus Raizer, Ricardo Souza, Amadeu Nascimento Junior, and Alberto Hata. Safety for Automated Warehouse exhibiting collaborative robots. In 28th European Safety and Reliability Conference (ESREL'18), pages 2021–2028, Trondheim, Norway, June 2018. Available at https://www.taylorfrancis.com/books/9781351174657.
[25] ISO. ISO 31000:2018 Risk management – Guidelines. International Organization for Standardization, Geneva, Switzerland, 2018.
[26] ISO. ISO 12100:2010 Safety of machinery – General principles for design – Risk assessment and risk reduction. International Organization for Standardization, Geneva, Switzerland, November 2010.
[27] E. Rohmer, S. P. N. Singh, and M. Freese. V-REP: A versatile and scalable robot simulation framework. In 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Nov 2013.
[28] O. Brock and O. Khatib. High-speed navigation using the global dynamic window approach. In Proceedings 1999 IEEE International Conference on Robotics and Automation, volume 1, 1999.
[29] Michael Ying Yang, Wentong Liao, Hanno Ackermann, and Bodo Rosenhahn. On support relations and semantic scene graphs. ISPRS Journal of Photogrammetry and Remote Sensing, 131:15–25, 2017.
[30] John D. Lee and Katrina A. See. Trust in automation: Designing for appropriate reliance. Human Factors, 46(1):50–80, 2004.
