
Accessible and Assistive ICT

VERITAS
Virtual and Augmented Environments and Realistic User Interactions To achieve
Embedded Accessibility DesignS

247765

Simulation models for the automotive scenario

Deliverable No. D2.2.1

SubProject No.: SP2    SubProject Title: Innovative VR models, tools and simulation environments

Workpackage No.: W2.2    Workpackage Title: Automotive Solutions

Activity No.: A2.2.2    Activity Title: Generation of simulation models

Authors: Hans-Joachim Wirsching (HS), Nicola Cofelice, Hunor Erdélyi (LMS), Francesco Palma, Giuseppe Varalda (CRF), Irene Ducci, Marco Pieve (PIAGGIO)

Status F (Final)

Dissemination Level PU (Public)

File Name: VERITAS_D2.2.1_final

Project start date and duration: 01 January 2010, 48 Months

Version History table


Version no.   Dates and comments
1             25 October 2011; Initial version based on ID2.2.1 (HS)
2             9 November 2011; LMS contribution
3             15 November 2011; CRF contribution
4             23 November 2011; HS contribution to simulation methods
5             7 December 2011; HS contribution to RAMSIS task files; USTUTT contributions
6             12 December 2011; USTUTT contribution; Piaggio contribution
8             2 December 2011; (CRF) Executive summary and general review and editorial revision


Table of Contents
Version History table ............................................................................................... 3
Table of Contents ..................................................................................................... 4
List of Figures .......................................................................................................... 6
List of Tables ............................................................................................................ 7
Abbreviations list ..................................................................................................... 8
Executive Summary ................................................................................................. 9
1 UCD in the automotive industry..................................................................... 10
1.1 Overview ............................................................................................... 10
1.2 UCD design in the automotive industry: the simulation approach ......... 11
1.3 Integration of VERITAS tools ................................................................ 12
2 Industrial needs............................................................................................... 14
3 Application requirements ............................................................................... 15
3.1 Use case analysis ................................................................................. 15
3.2 Automotive domain requirements ......................................................... 19
4 Simulation models .......................................................................................... 21
4.1 General approach ................................................................................. 21
4.2 Car interior applications ........................................................................ 23
4.2.1 Overall simulation task table.......................................................................................... 23
4.2.2 Identification of object parameters ................................................................................ 29
4.2.3 Definition of success criteria for each task .................................................................... 32
4.2.4 Generation of UsiXML files ............................................................................................ 41
4.2.5 Transfer of simulation models into RAMSIS tasks ........................................................ 44
4.3 Motor cycle applications........................................................................ 49
4.3.1 Overall simulation task table.......................................................................................... 49
4.3.2 Identification of object parameters ................................................................................ 53
4.3.3 Definition of success criteria for each task .................................................................... 56
4.3.4 Generation of UsiXML files ............................................................................................ 58
4.3.5 Transfer of simulation models into RAMSIS tasks ........................................................ 65
5 Tools requirements ......................................................................................... 66
5.1 Overview of tool requirements .............................................................. 66
5.1.1 Multibody methodology (MBS) ...................................................................................... 67
5.1.2 Finite element Method (FEM) ........................................................................................ 69
5.1.3 Video Motion Capture (MC) ........................................................................................... 69
5.2 RAMSIS tool ......................................................................................... 70
5.3 LMS Virtual.Lab tool.............................................................................. 73
5.3.1 Vibrational analysis methodology .................................................................................. 76
5.4 Lightning tool ........................................................................................ 76
5.4.1 Event propagation ......................................................................................................... 78
5.4.2 Multilanguage/multiparadigm programming .................................................................. 79
5.4.3 Superposition of virtual space and user space.............................................................. 79
5.4.4 Extensibility .................................................................................................................... 80
6 Integrated environment simulation specifications ....................................... 81


6.1 Use case and feasibility analysis .......................................................... 81


6.2 Integration platform RAMSIS ................................................................ 83
6.2.1 Human simulation parameters ...................................................................................... 83
6.2.2 Accessibility simulation methods ................................................................................... 86
6.3 Integration platform LMS virtual lab ...................................................... 90
6.4 Integration platform Lightning ............................................................... 90
6.4.1 RAMSIS simulation core ................................................................................................ 90
6.4.2 VERITAS simulation core .............................................................................................. 92
7 Specification of simulation methods / techniques ....................................... 94
7.1 Integration platform RAMSIS ................................................................ 94
7.1.1 Posture prediction subjected to new applications and human parameters ................... 94
7.1.2 Posture assessment ...................................................................................................... 97
7.1.3 Vision assessment ......................................................................................................... 99
7.1.4 Simulation model execution......................................................................................... 101
7.2 Integration platform LMS Virtual.Lab................................................... 102
7.2.1 Vibrational analysis methodology ................................................................................ 103
7.2.2 Car environment representation and transmissibility analysis .................................... 103
7.2.3 Motorcycle environment representation ...................................................................... 105
7.3 Integration platform Lightning ............................................................. 108
7.3.1 User interaction ........................................................................................................... 108
7.3.2 Vision assessment ....................................................................................................... 109
8 Conclusions................................................................................................... 110
9 References ..................................................................................................... 111


List of Figures
Figure 1-1 - Simulation approach in automotive industry .................................................. 11
Figure 1-2 - Integration of VERITAS tools ......................................................................... 13
Figure 3-1 - Use case analysis process ............................................................................ 15
Figure 3-2 - Ergonomic Design Process example in the Automotive field ......................... 19
Figure 4-1 - Generation of overall simulation task table .................................................... 21
Figure 4-2 - Success criteria in task table ......................................................................... 22
Figure 4-3 - Generation of UsiXML files ............................................................................ 22
Figure 4-4 - Transfer of simulation models into RAMSIS tasks ......................................... 23
Figure 4-5 - Navigation Information Accessibility Simulation Model .................................. 43
Figure 4-6 - Listen Navigation System Audio Cues Task Model ....................................... 43
Figure 4-7 - Listen Navigation System Audio Cues Interaction Model............................... 44
Figure 4-8 - Motorcycle Simulation Model ......................................................................... 60
Figure 4-9 - Get on Scooter Task Model ........................................................................... 62
Figure 4-10 - Motorcycle OBIS Simulation Model ............................................................. 63
Figure 4-11 - Warning Recognized Task Model ................................................................ 63
Figure 4-12 - Become aware of the situation Interaction Model ........................................ 64
Figure 5-1 - 3D CAD manikin RAMSIS.............................................................................. 70
Figure 5-2 - Posture prediction method ............................................................................. 71
Figure 5-3 - Vision Analysis ............................................................................................... 72
Figure 5-4 - Force Analysis ............................................................................................... 72
Figure 5-5 - RAMSIS body dimensions ............................................................................. 73
Figure 5-6 - LMS Virtual Dummy ....................................................................................... 75
Figure 5-7 - Structure of Lightning tool .............................................................................. 76
Figure 5-8 - Object pool and dataflow in Lightning ............................................................ 78
Figure 5-9 - Execution of Lightning applications ................................................................ 79
Figure 6-1 - Use case and feasibility analysis process ...................................................... 81
Figure 6-2 - Joint range limits ............................................................................................ 83
Figure 6-3 - Human maximum torque data........................................................................ 84
Figure 6-4 - Eye motion limits and visual field (gaze, perception) ..................................... 85
Figure 6-5 - Visual acuity................................................................................................... 85
Figure 6-6 - Posture prediction on joint range restrictions ................................................. 86
Figure 6-7 - Generation of posture distribution functions................................................... 87
Figure 6-8 - Posture modeling from experiments .............................................................. 87
Figure 6-9 - Joint force assessment .................................................................................. 88
Figure 6-10 - Posture assessment .................................................................................... 88
Figure 6-11 - Vision assessment ....................................................................................... 89
Figure 6-12 - RAMSIS nice skin integrated in Lightning .................................................... 91
Figure 6-13 - RAMSIS standard mode integrated in Lightning .......................................... 91
Figure 6-14 - VERITAS simulation core integrated in Lightning ........................................ 92
Figure 6-15 - Third person perspective ............................................................................. 93


Figure 6-16 - First person perspective............................................................................... 93


Figure 7-1 - Camera parameters ....................................................................................... 95
Figure 7-2 - Image import in 3D environment .................................................................... 95
Figure 7-3 - Superimposed manikin in separate image work spaces ................................ 96
Figure 7-4 - Joint angle histogram ..................................................................................... 96
Figure 7-5 - Joint angle distribution function...................................................................... 97
Figure 7-6 - Design accessibility simulation ...................................................................... 98
Figure 7-7 - Comparison of task imposed and preferred posture ...................................... 98
Figure 7-8 Comparison of task imposed and preferred joint angle .................................... 99
Figure 7-9 - Display of body part ratings ........................................................................... 99
Figure 7-10 - Vision control mask .................................................................................... 100
Figure 7-11 - Acute vision field ........................................................................................ 101
Figure 7-12 - Visual field, cone (left) and circle (right) representation ............................. 101
Figure 7-13 - Simulation model execution ....................................................................... 102
Figure 7-14 - Dummy in the driver position ..................................................................... 103
Figure 7-15 - Process flow for Frequency Domain Simulation ........................................ 105
Figure 7-16 - Multibody motorcycle model used for the analysis ..................................... 106
Figure 7-17 - Motorcycle modes of plane vibration (left) and dummy positioning
(right) ........................................................................................................................ 107
Figure 7-18 - Tracking cube ............................................................................................ 108
Figure 7-19 - Example eye cataract shader .................................................................... 109

List of Tables
Table 1 - Use case analysis results (desktop application) ................................................. 17
Table 2 - Use case analysis results (immersive application) ............................................. 18
Table 3 - Overall simulation task table for car interior use cases ...................................... 23
Table 4 - Object parameters for car interior use cases...................................................... 29
Table 5 - Success criteria for car interior use cases .......................................................... 32
Table 6 - RAMSIS task and use case / simulation model files for car interior use
cases .......................................................................................................................... 45
Table 7 - Overall simulation task table for motor cycle use cases ..................................... 49
Table 8 - Object parameters for motor cycle use cases .................................................... 53
Table 9 - Success criteria for motor cycle use cases ........................................................ 56
Table 10 - RAMSIS task and use case / simulation model files for motor cycle use
cases .......................................................................................................................... 65
Table 11 - Overview of the existing physical models for motor and vision impairment ...... 66
Table 12 - Use case and feasibility analysis results (desktop application) ........................ 82
Table 13 - Use case and feasibility analysis results (immersive application) .................... 82


Abbreviations list

Abbreviation   Explanation
A              Activity
API            Application Programming Interface
C++            A statically typed, free-form, multi-paradigm, compiled, general-purpose programming language
D              Deliverable
DOF            Degree of freedom
ICT            Information and Communication Technology
ID             Internal Deliverable
DoW            Description of Work
OpenGL         Open Graphics Library; a standard specification defining a cross-language, cross-platform API for writing applications that produce 2D and 3D computer graphics
OSG            OpenSceneGraph, an open-source 3D graphics application programming interface
PTW            Powered Two Wheeler
SP             Subproject
UCD            User Centred Design
WP             Work package


Executive Summary
D2.2.1 describes the activities and results of A2.2.2 to A2.2.4, starting from ID2.2.1, an internal deliverable produced as a result of the initial activities of A2.2.1.

Starting from the results of the task analysis and use case descriptions performed in WP1.7 and from the virtual user model file generated by the VERITAS model platform (WP1.6), the requirements of end-users and of technology providers for ergonomic design with respect to humans with limited functionalities are analyzed.

In the first part of the document an overview of the automotive (i.e. car and motorcycle)
needs and requirements is presented together with a description of the methodologies and
related software for human physical modeling which are expected to be used for the
simulation of the functional impairments.

In the second part, the simulation models are described in detail for the two application
domains of passenger car and motorcycle, in terms of:

- An overall simulation task table which, for each use case, describes the primitive tasks used to perform a use-case level task (a subtask in the use case language);

- A table reporting, for each use case / subtask, the parameters identified as descriptive of the performance of the subtask;

- A table reporting, for each use case, the subtask success criteria, which are one basis for evaluating the effectiveness of the accessible solutions developed with the VERITAS tools;

- A table describing the transfer of the simulation models into the target simulation environments.

In the last section, details on the integration of each identified VERITAS tool are provided, in terms of a review of the requirements for these tools, followed by the integration specifications, which will be implemented and documented in D2.2.2 "VR and integrated solutions for the automotive scenario" (M36).


1 UCD in the automotive industry


1.1 Overview
The performance and technical development of road vehicles is strongly based on translating user expectations and interaction behavior into technical requirements, relying both on criteria and requirements and on the evaluation of user-vehicle interaction by means of testing with subjects. Necessarily, these criteria and evaluations refer to a reference population which is described in measurable terms wherever possible (e.g. static and dynamic anthropometrics, visual acuity) or by requirements and guidelines based on best practices [10]. Currently these principles are based on the able-bodied population which, while covering almost 90% of the general population, does not necessarily cover the needs of users with special needs. Such users are increasingly accessing road transport, and especially personal vehicles, due both to consolidated habits and to increasing opportunities and needs for personal mobility.

Disabled users are mainly addressed through vehicle adaptation, while the elderly are taken into account at the design stage either by increasing the personalization features of the vehicles or by addressing design issues such as ingress/egress or instrument cluster legibility.

Since it has been demonstrated in many cases that targeting the population with special needs can also benefit the general population (for example, designing the instrument cluster with characters and a layout that take into account the visual and cognitive characteristics of the elderly), the VERITAS design approach is expected to support accessibility to the vehicle by:

- including the special population in the mainstream design process through proper methodologies and reference data referring to the enlarged population of potential users;

- providing design and evaluation tools which can be used by designers from the early engineering stages and during design refinement, either by simulation or by testing with potential users in virtual environments.

In such a context, simulation plays a key role, since the fundamental set-up of vehicles is consolidated early in the process by vehicle modeling and virtual evaluation of performance aspects such as styling, ergonomics, aerodynamics and safety.

Therefore, widening the scope of early-stage simulation tools to natively include the characteristics of the special population is the key asset for structuring and making efficient the universal design approach addressed in VERITAS.


1.2 UCD design in the automotive industry: the simulation approach
Digital human modeling is widely used in the automotive industry (OEMs and suppliers) for the design of cars, for ergonomics, comfort and passive safety. For many years, design engineers have evaluated the human-machine interaction in automotive vehicles through digital human models at early design stages (Figure 1-1). For physical ergonomics design, the engineer first inserts a digital human representing one critical target user into the CAD environment of the vehicle. Then the human model simulates critical postures and motions, which are finally used to evaluate the interaction of the human with the environment with respect to the following ergonomic aspects (accessibility simulation):

- Visibility: Can the target user see all necessary visual information? This includes the recognition of ICTs as well as classical aspects such as sight limits inside and outside the vehicle.

- Clearance and accessibility: Does the vehicle offer sufficient room and solutions so that the target user can carry out all necessary movements freely, without collision and in comfortable conditions?

- Ease of use: Can the target user reach and manipulate the controls easily within his/her physical capacity (strength, maximum joint range of motion, equilibrium keeping)?

Since a vehicle can be used by thousands or even millions of people, the design engineer has to consider the variability of the user population in the digital human model system. People vary not only in body dimensions and shape (anthropometry), but also in physical capacity (body strength, joint mobility and vision).

[Figure content: input parameters feed a probability-based posture prediction for tasks such as sitting, grasping the steering wheel and pressing the pedal, which in turn feeds the accessibility simulation.]

Figure 1-1 - Simulation approach in automotive industry


For the purpose of assessing the ergonomics performance even before the availability of the first physical prototype or test rig, digital human models have been developed by manufacturers, providing key input to vehicle development in the early stages. Digital human models are virtual models that can represent the behavior of the human body from different perspectives (kinematics, dynamics, ergonomics, crash, etc.).

Another key functional performance aspect in automotive design and ergonomics analysis is the assessment of vibrational comfort. Since exposure to vibrations may have adverse effects on human health, the comfort performance of the product must be carefully evaluated and optimized in the virtual engineering process, in order to guarantee the comfort of the final product that hits the road. Short-term human discomfort may lead to annoyance, temporary hearing threshold shift (temporary hearing loss), reduced motion control, impaired vision, discomfort and fatigue ([1], [2], [3]). Long-term and extended exposure to whole-body vibration has been linked to chronic back pain. These cumulative effects can be even more dangerous, due to the risk of negatively influencing the driver's physical and cognitive performance.

1.3 Integration of VERITAS tools


A sound understanding of the capabilities of existing digital human models and of the functional decline and limitations of the older and disabled population is essential to a successful integration of the VERITAS tools. The digital human models and environments currently available in the automotive industry allow, among others, for the simulation of basic human perception (e.g. vision) and of interaction with the environment (e.g. reach). These models are representative of normal human functionalities, in which limitations coming in particular from age or functional impairment are not considered.

Older and disabled people suffer deterioration in muscle strength, joint flexibility, sense of balance, motion control, and visual and auditory acuity. Work package WP2.2 will take such deterioration into account and develop new models with the aim of extending the capabilities of the available human models to the simulation of impaired human functions. The new models will be integrated into Virtual Reality environments and CAD/CAE tools to support the ergonomic design of cars which are accessible to people with impaired mobility and to make it possible to validate the (physical) accessibility in the mainstream development process.

Figure 1-2 shows how the VERITAS tools will be integrated directly into the current human simulation process of the automotive industry. The process is fed by input parameters such as the task description (from WP1.7) and the virtual user model file generated by the VERITAS model platform (from WP1.6). The accessibility simulation is extended by new simulation models created in A2.2.2, which are based on the simulation aspects identified in the internal deliverable ID2.2.1. Finally, the extended accessibility simulations are integrated into VR (A2.2.3) and desktop applications (A2.2.4).


[Figure content: the environmental geometry file, the VERITAS user model file (WP1.6) and the task file (WP1.7) feed the probability-based posture prediction and accessibility simulation, which is extended by the simulation models of A2.2.2 (based on the simulation aspects of A2.2.1) and integrated into the VR / desktop applications of A2.2.3 / A2.2.4.]

Figure 1-2 - Integration of VERITAS tools


2 Industrial needs
With reference to [11], the assessment of the industrial needs towards the VERITAS solutions has allowed the identification of a series of points and issues, which are summarized here with a particular focus on digital analysis based on human modeling:

- VERITAS tools and associated methods should integrate smoothly into the mainstream development process, which consists of a specific reading of the general User Centered Design approach, based on target setting in terms of user requirements and expectations, and on their deployment into performance requirements and eventually technical specifications.

- As a consequence, integration with, or at least compatibility with, the operational and format standards and mainstream tools should be ensured. VERITAS functions should eventually become part of industry-grade virtual ergonomics environments.

- Time or cost savings are not expected to constitute a major outcome of the use of the VERITAS tools, while enlarging the scope of virtual analysis towards accessibility will have to be. The tools should anyway not imply longer development times or costs, nor cause significant changes in the engineering process and in the engineers' operational practices. As a consequence, ergonomics evaluation with a virtual environment incorporating the VERITAS tools should be accomplished by selecting configurations and parameters in a way similar to the current implementations.

- Effectiveness has to be demonstrated as a prerequisite for implementing the VERITAS tools in the design process; therefore, there is a relevant expectation towards the VERITAS pilots' results, which should also give indications about these aspects.

- The application of the VERITAS tools should at least allow for designing special applications (e.g. for disabled users).

These requirements and expectations are intended as a relevant factor of the industrial feasibility of the VERITAS approach; therefore, their implementation should result in the expected benefits described above.


3 Application requirements
3.1 Use case analysis
Within Work Package 1.7, the use cases and application scenarios for the automotive field were defined. The use case identification process (Figure 3-1) was applied, and use cases for four different subjects (car, motorcycle, ADAS-IVIS systems, ARAS-OBIS systems) and the related specific scenarios were identified.

[Figure content: an actor (e.g. healthy, with limited force capability or limited mobility) and a use case from D1.7.1 (e.g. hand brake operation: pull or push the hand brake lever) are combined with human parameters (joint strength, joint mobility) and design questions (is the lever reachable? is the lever movable?) to define the accessibility simulation (posture simulation, force simulation).]

Figure 3-1 - Use case analysis process

In the following tables, for each automotive scenario, the design questions and impairments are shown for the desktop application (Table 1) and for the immersive application (Table 2).

Table 1 - Use case analysis results (desktop application)

Subject | Use Case (D1.7.1) | Design Question | Impairments
Car interior | 2.1.a.1 Central rear view mirror tuning | Reachability | Upper limb movement limitations
Car interior | 2.1.a.2 Lateral mirror tuning | Reachability | Upper limb movement limitations
Car interior | 2.1.a.3 Hand brake activation/deactivation | Reachability; force capability | Upper limb movement limitations; upper limb traction/pushing force limitations
Car interior | 2.1.a.4 Gear changing (manual/automatic) | Reachability | Upper limb movement limitations
Car interior | 2.1.a.5 Accessing interior storage compartments | Reachability | Upper limb movement limitations
Motorcycle | 2.2.a.1 Riding position | Adequate position | Upper limb impairment
Motorcycle | 2.2.a.2 Usage on bumpy road | Adequate articular response | Upper limb impairment
Motorcycle | 2.2.a.3 Central stand | Ability to place the vehicle on the stand | Limb impairments; reduced force capabilities
Motorcycle | 2.2.a.4 Motorcycle handling | Reachability; force capability; visibility | Upper limb impairment (flexibility, force); visual impairments
ADAS / IVIS | 2.3.a.1 On-board navigation system programming | Reachability; visibility | Upper limb limitations; limited head rotation; visual limitations (tunnel vision)
ADAS / IVIS | 2.3.a.2 Audio system programming and tuning | Reachability | Upper limb limitations
ADAS / IVIS | 2.3.a.3 Navigation information accessibility | Accessible informatory content; support for colour-blindness; correct speech characteristics | Low reaction time; visual acuity; visual limitations; hearing limitations
ARAS / OBIS | 2.4.a.1 Collision Avoidance System (CAS) | Warning reaction time; character size; position in vision field | Reduced reaction time; visual acuity
ARAS / OBIS | 2.4.a.2 Navigation | Character size; reachability; correct speech characteristics | Reduced peripheral field of vision; limited mobility; hearing limitations


Table 2 - Use case analysis results (immersive application)

Application | Subject | Use Case (D1.7.1) | Design Question | Impairments
Immersive | Car interior | 2.1.b.1 Central rear view mirror tuning | Reachability | Upper limb movement limitations
Immersive | Car interior | 2.1.b.2 Lateral mirror tuning | Reachability | Upper limb movement limitations
Immersive | Car interior | 2.1.b.3 Hand brake activation/deactivation | Reachability; force capability | Upper limb movement limitations; upper limb traction/pushing force limitations
Immersive | Car interior | 2.1.b.4 Gear changing (manual/automatic) | Reachability | Upper limb movement limitations
Immersive | Car interior | 2.1.b.5 Accessing interior storage compartments | Reachability | Upper limb movement limitations
Immersive | ADAS / IVIS | 2.3.b.1 On-board navigation system programming | Reachability; visibility | Upper limb limitations; limited head rotation; visual limitations (tunnel vision)
Immersive | ADAS / IVIS | 2.3.b.2 Audio system programming and tuning | Reachability | Upper limb limitations
Immersive | ADAS / IVIS | 2.3.b.3 Navigation information accessibility | Accessible informatory content; support for colour-blindness; correct speech characteristics | Low reaction time; visual acuity; visual limitations; hearing limitations
Immersive | ARAS / OBIS | 2.4.b.1 Collision Avoidance System (CAS) | Warning reaction time; character size; position in vision field | Reduced reaction time; visual acuity
Immersive | ARAS / OBIS | 2.4.b.2 Navigation | Character size; reachability; correct speech characteristics | Reduced peripheral field of vision; limited mobility; hearing limitations


3.2 Automotive domain requirements


Ergonomics is the scientific discipline dedicated to the study of the interaction between humans and technological systems (e.g. a vehicle). Within the automotive domain, ergonomic design can be considered a crucial part of comfort assessment, and it is intended to optimize the system performance and the well-being of the user.

Within the design domain, ergonomics usually refers to the study of the physical aspects of the interaction between man and machine, i.e. the geometric and physiological study of habitability (postural comfort), but also to cognitive aspects such as the behavioral approach to the system (commands and controls, instruments and accessories, driving support systems) and to auxiliary aspects such as the protection offered by the vehicle against bad weather and for the comfort and safety of the driver.

The ergonomic design of a vehicle (Figure 3-2) starts with the definition of the characteristics of the target user: data such as the markets to be addressed, the customer profile, the type of vehicle and the competitors' offer in the reference market segment are important to identify the ergonomics requirements to be satisfied by the vehicle. Simulation tools based on virtual models (dummies) are used for posture analysis on sketches and on the vehicle layout designed with CAD. The virtual vehicle model is then approved by means of body appearance evaluation, aerodynamic comfort analysis and ergonomic assessment.

Figure 3-2 - Ergonomic Design Process example in the Automotive field

Currently available simulation models mainly allow investigating the geometrical habitability of a vehicle, so that the designer can intervene accordingly on the layout. Additional functionalities, such as the visual perception of the driver and the interaction with the environment, are also useful to check ergonomics during the design phase.

Apart from computer simulation, physical simulations with test riders (using vehicle ergonomic simulators and ergonomic test benches, for instance for the evaluation of the dashboard and controls) and experimental tests on vehicle prototypes are usually carried out in the subsequent phases of product development.

Even if the ergonomic design approach is quite similar, different aspects have to be taken into account when designing a car rather than a motorcycle. First of all, designing a car means designing a stable and closed vehicle, while in the motorcycle case the riding conditions (i.e. the rider must keep the equilibrium in a dynamic way) and the exposure of the rider to the weather influence the process. Additionally, ergonomics concerns not only the physical design of a vehicle but also cognitive aspects related to the human-machine interface field, which basically consists of the study of the interaction between the rider and the on-board application tailored to the specific requirements of the vehicle, which are completely different for a car and a motorcycle (e.g. accessing the information from the application while riding a motorcycle, and therefore while keeping the hands on the handlebar, or listening to audio messages in an open and noisy environment). Regarding the automotive application, two specific tasks, A3.1.1 "Car interior" and A3.1.2 "Motorcycle handling", will be carried out in Work Package 3.1.

Additionally, from the simulation point of view, a specific posture model for the motorcycle field, which should assign to the dummy a physiological position on the vehicle, has to be defined. Therefore, a specific posture model will be assessed with the aim of reproducing a realistic posture, through the implementation of an experimental campaign (see also paragraph 6.2.2), and it will constitute the basis for the subsequent accessibility simulation.

Simulation models in automotive design are representative of normal human functionalities and usually do not account for limitations related to age or diseases. The intention of extending the investigation of ergonomics to the restricted mobility, visibility and anthropometrics of older people in the VERITAS project will then result in new simulation models integrated into the current design instruments.


4 Simulation models
4.1 General approach
All simulation models and related information are created by the following general process:

1. Generation of overall task table (Figure 4-1)


All necessary data are collected from the deliverables D1.7.1 (use cases), D1.7.2 (task models) and ID2.8.3 (interaction modalities). Finally, they are structured in an Excel table as illustrated in Figure 4-1.

[Figure content: the overall simulation task table combines the D1.7.1 use case definitions, the D1.7.2 task analysis and the ID2.8.3 multimodal interaction data; its columns are Use Case, Task, Subtask, Primitive tasks, Modality, Task object, Disability, Alternative task(s), Alternative modality and Alternative task object / assistive device, illustrated with the handbrake use case 2.1.a.3.]

Figure 4-1 - Generation of overall simulation task table

2. Identification of object parameters


For all relevant task objects the variable parameters are identified; for example, a hand brake has the parameters rotation angle and resistance force.
3. Definition of success criteria for each task
For each primitive task a measurable success criterion is defined. In addition to the criterion, a corresponding (quantitative) threshold is specified, which indicates whether the criterion is fulfilled or not. The criteria and thresholds are integrated as new columns in the table of Figure 4-1, addressing the tasks as well as the alternative tasks, as illustrated in Figure 4-2.


[Figure content: the task table of Figure 4-1 extended with "Success criteria" and "Success threshold" columns for both the primitive tasks and the alternative tasks.]

Figure 4-2 - Success criteria in task table

4. Generation of corresponding UsiXML files (Figure 4-3)


The UsiXML files for the simulation model, the task model and the multimodal interaction model are created from the overall simulation task table (a minimal illustrative sketch of this step is given at the end of this section).

[Figure content: from the overall simulation task table, three UsiXML files are generated: the simulation model file (*.usi), the task model file (*.usi) and the multimodal interaction model file (*.usi).]

Figure 4-3 - Generation of UsiXML files

5. Transfer of simulation models into RAMSIS tasks (Figure 4-4)


Finally, all simulation models are transferred into the RAMSIS-specific task description environment. At the first level, the primitive task descriptions are modeled in specific RAMSIS task files (extension *.tsk). From each task file a posture of the manikin will be calculated. At the second level, these task files are ordered so as to represent the sequence of primitive tasks in the subtask. This ordering is modeled in specific RAMSIS simulation model / use case files (extension *.aan). These files contain references to the task files and can be loaded by RAMSIS to generate a sequence of manikin postures through the automatic analysis function.


[Figure content: each primitive task of the overall simulation task table is mapped to a RAMSIS task file, and the task files are ordered in a task list per subtask.]

Figure 4-4 - Transfer of simulation models into RAMSIS tasks

This process is applied to the car interior and motorcycle application domains, and the results are reported in the following sections.
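
To make steps 3 and 4 above more concrete, the following minimal Python sketch shows how one row of the overall simulation task table could be turned into a task-model XML file. It is only an illustration: the element and attribute names, the row fields, the threshold value and the output file name are assumptions made for this sketch and do not reproduce the actual UsiXML schema or the VERITAS file conventions.

```python
# Illustrative sketch only: element names, fields and values are placeholders,
# not the actual UsiXML schema or the VERITAS file conventions.
import xml.etree.ElementTree as ET

def row_to_task_model(row):
    """Convert one row of the overall simulation task table into a small XML tree."""
    root = ET.Element("taskModel", useCase=row["use_case"], subtask=row["subtask"])
    task = ET.SubElement(root, "primitiveTask",
                         name=row["primitive_task"],
                         modality=row["modality"],
                         taskObject=row["task_object"],
                         disability=row["disability"])
    # Success criterion and quantitative threshold as defined in step 3.
    criterion = ET.SubElement(task, "successCriterion", threshold=str(row["threshold"]))
    criterion.text = row["success_criterion"]
    # Optional alternative task / assistive device, as defined in the table.
    if row.get("alternative_task"):
        ET.SubElement(root, "alternativeTask",
                      name=row["alternative_task"],
                      modality=row["alternative_modality"],
                      assistiveDevice=row["alternative_object"])
    return ET.ElementTree(root)

# Example row loosely based on the handbrake use case of Table 3 (threshold is arbitrary).
example_row = {
    "use_case": "2.1.a.3", "subtask": "Parking brake deactivation",
    "primitive_task": "Push (thumb)", "modality": "Motor",
    "task_object": "Handbrake lever unlock button", "disability": "Upper limb impairment",
    "success_criterion": "The user can push the button in order to release the lever",
    "threshold": 1.0,
    "alternative_task": "Pull (hand)", "alternative_modality": "Motor",
    "alternative_object": "Handbrake lever with pull unlock handle",
}
row_to_task_model(example_row).write("parking_brake_task_model.usi",
                                     encoding="utf-8", xml_declaration=True)
```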

4.2 Car interior applications


4.2.1 Overall simulation task table
Table 3 - Overall simulation task table for car interior use cases
(In the use case numbering, x = {a|b} refers to the desktop (a) and immersive (b) application variants; numbers appended to subtasks indicate alternatives. Each row lists: Primitive task | Modality | Task object | Disability | Alternative task | Alternative modality | Alternative task object / assistive device; "=" indicates the same entry as for the primitive task, "n.a." indicates that no alternative is foreseen.)

Use case 2.1.x.1 Central rearview mirror tuning; Task: Conducting a car; Subtask: Viewing backwards (inside car)
  Look (eyes) | Vision | Central rearview mirror | = | = | = | =
  Grasp (hand) | Motor | Central rearview mirror | Upper limb impairment | Speak | Voice control | Voice controlled mirror
  Turn (hand) | Motor | Central rearview mirror | Upper limb impairment | Speak | Voice control | Voice controlled mirror
  Push (hand) | Motor | Central rearview mirror | Upper limb impairment | Speak | Voice control | Voice controlled mirror
  Look (eyes) | Vision | Central rearview mirror | = | = | = | =

Use case 2.1.x.2 Lateral mirror tuning; Task: Conducting a car; Subtask: Viewing backwards (outside car, left) 1
  Look (eyes) | Vision | Left lateral mirror | = | = | = | =
  Grasp (hand) | Motor | Left lateral mirror knob | Upper limb impairment | Speak | Voice control | Voice controlled mirror
  Turn (hand) | Motor | Left lateral mirror knob | Upper limb impairment | Speak | Voice control | Voice controlled mirror
  Push (hand) | Motor | Left lateral mirror knob | Upper limb impairment | Speak | Voice control | Voice controlled mirror
  Look (eyes) | Vision | Left lateral mirror | = | = | = | =

Use case 2.1.x.2 Lateral mirror tuning; Task: Conducting a car; Subtask: Viewing backwards (outside car, left) 2
  Look (eyes) | Vision | Left door mirror control switch (4 way) | = | = | = | =
  Reach (hand) | Motor | Left door mirror control switch (4 way) | Upper limb impairment | Speak | Voice control | Voice controlled mirror
  Push (hand) | Motor | Left door mirror control switch (4 way) | Upper limb impairment | Speak | Voice control | Voice controlled mirror
  Push (hand) | Motor | Left door mirror control switch (4 way) | Upper limb impairment | Speak | Voice control | Voice controlled mirror
  Look (eyes) | Vision | Left lateral mirror | = | = | = | =

Use case 2.1.x.3 Handbrake activation / deactivation; Task: Conducting a car; Subtask: Parking brake deactivation
  Push (thumb) | Motor | Handbrake lever unlock button | Upper limb impairment | Pull (hand) | Motor | Handbrake lever with pull unlock handle
  Controlled release (hand) | Motor | Handbrake lever | Upper limb impairment | Controlled release (hand) | Motor | Handbrake lever (*)
  Release (thumb) | Motor | Handbrake lever unlock button | Upper limb impairment | Release (hand) | Motor | Handbrake lever with pull unlock handle

Use case 2.1.x.4 Gear changing (manual / automatic); Task: Conducting a car; Subtask: Driving backwards 1 (Manual)
  Push (left foot) | Motor | Clutch | Lower limb impairment | Push (hand / foot) | Motor | Hand button / foot button clutch activation
  Grasp (right hand) | Motor | Manual gear lever | Upper limb impairment | Reach (hand) | Motor | Button gear selector
  Push (hand) | Motor | Manual gear lever | Upper limb impairment | Push (hand) | Motor | Button gear selector
  Pull (hand) | Motor | Manual gear lever | Upper limb impairment | n.a. | n.a. | n.a.

Use case 2.1.x.4 Gear changing (manual / automatic); Task: Conducting a car; Subtask: Driving backwards 2 (Automatic)
  Grasp (right hand) | Motor | Automatic gear lever | Upper limb impairment | Reach (hand) | Motor | Button gear selector
  Push (hand) | Motor | Automatic gear lever unlock button | Upper limb impairment | n.a. | n.a. | n.a.
  Push (hand) | Motor | Automatic gear lever | Upper limb impairment | Push (hand) | Motor | Button gear selector

Use case 2.1.x.5 Accessing interior storage compartments; Task: Using interior equipment (NEW); Subtask: Accessing interior storage
  Look (eyes) | Visual | Storage door | Recognition (elderly / cognitive) | Look (eyes) | Visual | Enhanced recognition labels
  Reach (hand) | Motor | Storage door | Upper limb impairment (elderly) | Reach (hand) | Motor | Easy reach storage
  Pull (hand) | Motor | Storage door handle | Upper limb impairment (elderly) | Push (hand) | Motor | Push-to-open storage door
  Look (eyes) | Visual | Storage compartment | Perception (elderly / cognitive) | Look (eyes) | Visual | Illuminated storage compartment
  Reach (hand) | Motor | Object in storage compartment | Upper limb impairment (elderly) | Reach (hand) | Motor | Easy reach storage

Use case 2.3.x.1 On-board nav. system programming; Task: Conducting a car; Subtask: Destination selection 1
  Reach (hand) (multiple) | Motor | Destination select buttons | Upper limb impairment | Speak | Voice | Vocal commands (to select destination)

Use case 2.3.x.1 On-board nav. system programming; Task: Conducting a car; Subtask: Destination selection 2
  Reach (hand) (multiple) | Motor | Destination select buttons | Upper limb and speech impairment | Move left - right (head) | Motor | Scanning switch interface

Use case 2.3.x.2 Audio system programming and tuning; Task: Conducting a car; Subtask: Reaching car radio 1
  Look (eyes) | Vision | Knob | Upper limb impairment | n.a. | n.a. | n.a.
  Reach (hand) | Motor | Knob | Upper limb impairment | Speak | Voice | Voice controlled radio
  Grasp (hand) | Motor | Knob | Upper limb impairment | Speak | Voice | Voice controlled radio
  Rotate (hand) | Motor | Knob | Upper limb impairment | Speak | Voice | Voice controlled radio

Use case 2.3.x.2 Audio system programming and tuning; Task: Conducting a car; Subtask: Reaching car radio 2
  Look (eyes) | Vision | Radio push button | Upper limb impairment | n.a. | n.a. | n.a.
  Reach (hand) | Motor | Radio push button | Upper limb impairment | Speak | Voice | Voice controlled radio
  Position (hand) | Motor | Radio push button | Upper limb impairment | Speak | Voice | Voice controlled radio
  Tactile verify (cognitive) | Motor | Radio push button | Upper limb impairment | Speak | Voice | Voice controlled radio
  Grasp (hand) | Motor | Radio push button | Upper limb impairment | Speak | Voice | Voice controlled radio

Use case 2.3.x.3 Navigation information accessibility; Task: Conducting a car; Subtask: Listen - Navigation system audio cues
  Listen 1 | Audition | Navigation audio cues | Hearing impaired | Look (eyes) | Vision | Navigation system visual cues
  Listen 2 | Audition | Navigation audio cues | Hearing impaired | Look (eyes) + touch | Vision + haptics | Navigation system visual cues + haptic prompt

4.2.2 Identification of object parameters


The table that follows reports the objects to be interacted with by the user and evaluated, associated with a set of relevant parameters affecting their interaction performance. These parameters, which can be parameterized totally or partially (i.e. only some of them, such as the actuation force law), constitute the test independent variables, to be added to the solutions (primitive vs. alternative objects of Table 4) being tested with different user profiles.
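
As an illustration of how the parameterised task objects below could be represented in a simulation environment, the following minimal Python sketch defines a generic task-object record. The field names, units and example values are assumptions made for this sketch only and do not prescribe the VERITAS or RAMSIS data model.

```python
# Illustrative sketch only: field names, units and values are assumptions,
# not the VERITAS or RAMSIS data model.
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class TaskObject:
    name: str
    position_xyz_mm: Tuple[float, float, float]                  # position in x, y, z
    parameters: Dict[str, float] = field(default_factory=dict)   # ranges, force laws, ...

# Example: the handbrake lever parameterised as in Table 4 (numbers are arbitrary).
handbrake_lever = TaskObject(
    name="Handbrake lever",
    position_xyz_mm=(350.0, -150.0, 250.0),
    parameters={
        "lever_length_mm": 180.0,
        "lever_range_deg": 45.0,
        "activation_force_N": 120.0,    # pull force law reduced to a peak value
        "deactivation_force_N": 90.0,   # resistance force law reduced to a peak value
    },
)
```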

Table 4 - Object parameters for car interior use cases


(For each use case the table lists the task objects with their parameters and, where applicable, the alternative task objects / assistive devices with their parameters.)

Use case 2.1.x.1 Central rearview mirror tuning
  Central rearview mirror: position in x, y, z; mirror dimensions; surface grip; movement friction. Alternative: voice controlled mirror (vocal command typology; recognition rate; control law).

Use case 2.1.x.2 Lateral mirror tuning
  Left lateral mirror knob: position in x, y, z; knob dimensions; up-down and left-right forces and knob movement range.
  Left door mirror control switch (4 way): position in x, y, z; knob dimensions; control law (left-right and up-down).

Use case 2.1.x.3 Handbrake activation / deactivation
  Handbrake lever: position in x, y, z; lever length; lever range; handle diameter and grip; pull force law (activation); resistance force law (deactivation). Alternative: handbrake lever with pull unlock handle (position in x, y, z; handle length; handle range; handle grip; pull force law, applicable to either mechanical or electrical control).
  Handbrake lever unlock button: button dimensions; movement range; unlock force law.

Use case 2.1.x.4 Gear changing (manual / automatic)
  Clutch: position in x, y, z; movement range; actuation / deactuation force law. Alternative: hand button / foot button clutch activation (position in x, y, z; button dimensions and grip; control force law; labelling).
  Manual gear lever: position of knob in x, y, z; left-right movement range; fore-aft movement range; left-right and fore-aft force law. Alternative: button gear selector (position in x, y, z; button dimensions and grip; control force law; labelling).
  Automated gear lever unlock button: position in x, y, z; size of the button; force law.
  Automated gear lever: position of knob in x, y, z; fore-aft movement range; fore-aft force law.

Use case 2.1.x.5 Accessing interior storage compartments
  Storage door (label): position in x, y, z; label specifications. Alternative: enhanced recognition labels (design; intelligibility).
  Storage door handle: position in x, y, z; handle dimensions; movement range; movement force law. Alternative: push-to-open storage door (position in x, y, z; force law).
  Storage compartment: position in x, y, z; dimensions and physical accessibility. Alternatives: storage compartment illumination, easy reach storage compartment (storage interior visibility; storage physical accessibility).
  Object in storage compartment: dimensions; weight.

Use case 2.3.x.1 On-board nav. system programming
  Destination select buttons: position in x, y, z; button dimensions and grip; control force law; labelling. Alternatives: vocal commands (recognition rate; semantics; interaction tree); scanning switch interface (tbd).

Use case 2.3.x.2 Audio system programming and tuning
  Radio knob: position in x, y, z; knob dimensions and grip; control force law; labelling. Alternative: voice controlled radio (recognition rate; semantics; interaction tree).
  Radio push button: position in x, y, z; button dimensions and grip; control force law; labelling. Alternative: voice controlled radio (recognition rate; semantics; interaction tree).

Use case 2.3.x.3 Navigation information accessibility
  Navigation audio cues: vocal message quality (audio and speech); semantics; timing. Alternatives: navigation visual cues (position in x, y, z; legibility / intelligibility; timing); navigation haptic cues (intelligibility; timing).

4.2.3 Definition of success criteria for each task


The table that follows reports the success criteria for each use case, expressed both at sub-task level (i.e. a complete operation) and at primitive / alternative task level.

In this way, the interaction performance can be described at the complete operation level and/or at the single action level, in order to sort out the critical factors that affect the interaction performance.
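
A minimal sketch of how such criteria could be combined is shown below: each primitive task is checked against its quantitative threshold, and the sub-task is considered successful only if all of its primitive tasks succeed, or if the complete alternative task sequence succeeds instead. The measures, threshold values and function names are assumptions made for this sketch and are not part of the VERITAS specification.

```python
# Illustrative sketch only: measures, thresholds and names are assumptions.
def primitive_task_ok(measured_value, threshold, larger_is_better=True):
    """Check one primitive-task success criterion against its quantitative threshold."""
    return measured_value >= threshold if larger_is_better else measured_value <= threshold

def subtask_ok(primitive_results, alternative_results=None):
    """A sub-task succeeds if every primitive task succeeds, or if the whole
    alternative task sequence succeeds instead."""
    if all(primitive_results):
        return True
    return alternative_results is not None and all(alternative_results)

# Example: handbrake activation, with a simulated reach distance and pull force.
reach_ok = primitive_task_ok(0.58, threshold=0.60, larger_is_better=False)  # metres
pull_ok = primitive_task_ok(95.0, threshold=120.0)                          # newtons
print(subtask_ok([reach_ok, pull_ok]))  # False: the pull force is below the threshold
```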

Table 5 - Success criteria for car interior use cases

(In the use case identifiers, x = {a|b}; numbers after subtasks or primitive tasks indicate alternatives. For each primitive task the entries give: primitive task | alternative task(s) | primitive task level criterion | alternative task level criterion. "=" means the same as the corresponding entry above; n.a. = not applicable.)

Use case 2.1.x.1 Central rearview mirror tuning | Task: Conducting a car | Subtask: Viewing backwards (inside car)
Sub-task level criterion: The user can complete the task to adjust the mirror to obtain a suitable rear vision.
- Look (eyes) | alternative: = | criterion: = | alternative criterion: =
- Grasp (hand) | alternative: n.a. | criterion: The user can reach and effectively grasp the mirror | alternative criterion: n.a.
- Turn (hand) | alternative: Speak | criterion: The user can turn the mirror left-right to reach the desired position | alternative criterion: The user can bring the mirror to the desired left-right position through voice control
- Push (hand) | alternative: Speak | criterion: The user can turn the mirror up-down to reach the desired position | alternative criterion: The user can bring the mirror to the desired up-down position through voice control
- Look (eyes) | alternative: = | criterion: = | alternative criterion: =

Use case 2.1.x.2 Lateral mirror tuning | Task: Conducting a car | Subtask: Viewing backwards (outside car, left) 1
Sub-task level criterion: The user can complete the task to adjust the mirror to obtain a suitable rear vision.
- Look (eyes) | alternative: = | criterion: = | alternative criterion: =
- Grasp (hand) | alternative: n.a. | criterion: The user can reach and effectively grasp the lateral mirror knob | alternative criterion: n.a.
- Turn (hand) | alternative: Speak | criterion: The user can actuate the knob in order to turn the mirror left-right, to reach the desired position | alternative criterion: The user can bring the mirror to the desired left-right position through voice control
- Push (hand) | alternative: Speak | criterion: The user can actuate the knob in order to turn the mirror up-down, to reach the desired position | alternative criterion: The user can bring the mirror to the desired up-down position through voice control
- Look (eyes) | alternative: = | criterion: = | alternative criterion: =

Use case 2.1.x.2 Lateral mirror tuning | Task: Conducting a car | Subtask: Viewing backwards (outside car, left) 2
Sub-task level criterion: The user can complete the task to adjust the mirror to obtain a suitable rear vision.
- Look (eyes) | alternative: = | criterion: = | alternative criterion: =
- Reach (hand) | alternative: n.a. | criterion: The user can reach and grasp the lateral mirror control | alternative criterion: n.a.
- Push (hand) | alternative: Speak | criterion: The user can actuate the control switch in order to turn the mirror left-right, to reach the desired position | alternative criterion: The user can bring the mirror to the desired left-right position through voice control
- Push (hand) | alternative: Speak | criterion: The user can actuate the control switch in order to turn the mirror up-down, to reach the desired position | alternative criterion: The user can bring the mirror to the desired up-down position through voice control
- Look (eyes) | alternative: = | criterion: = | alternative criterion: =

Use case 2.1.x.3 Handbrake activation / deactivation | Task: Conducting a car | Subtask: Parking brake activation
Sub-task level criterion: The user can complete the task to secure the handbrake control in the stop position.
- Push (thumb) | alternative: Pull (hand) | criterion: The user can push the button in order to release the lever | alternative criterion: The user can pull the unlock lever in order to release the brake lever
- Pull (hand) | alternative: Pull (hand) | criterion: The user can pull the lever to the desired position | alternative criterion: The user can pull the brake lever to the desired position
- Release (thumb) | alternative: Release (hand) | criterion: The user can release the button to successfully lock the lever in the stop position | alternative criterion: The user can release the unlock lever in order to successfully lock the brake lever in the desired position

Use case 2.1.x.3 Handbrake activation / deactivation | Task: Conducting a car | Subtask: Parking brake deactivation
Sub-task level criterion: The user can complete the task to release the brake lever from the stop to the idle position.
- Push (thumb) | alternative: Pull (hand) | criterion: The user can push the button in order to release the lever | alternative criterion: The user can pull the unlock lever in order to release the brake lever
- Controlled release (hand) | alternative: Controlled release (hand) | criterion: The user can control the lever movement of the brake release | alternative criterion: The user can control the brake lever movement of the brake release
- Release (thumb) | alternative: Release (hand) | criterion: The user can release the button to successfully lock the lever in the idle position | alternative criterion: The user can release the unlock lever in order to successfully lock the brake lever in the idle position

Use case 2.1.x.4 Gear changing (manual / automatic) | Task: Conducting a car | Subtask: Driving backwards 1 (Manual)
Sub-task level criterion: The user can complete the task of engaging the reverse gear.
- Push (left foot) | alternative: Push (hand / foot) | criterion: The user can successfully actuate the clutch pedal | alternative criterion: The user can successfully actuate the clutch button
- Grasp (right hand) | alternative: Reach (hand) | criterion: The user can successfully reach and grasp the gear lever | alternative criterion: The user can reach the button gear selector
- Push (hand) | alternative: Push (hand) | criterion: The user can push the gear lever right | alternative criterion: The user can push the button
- Pull (hand) | alternative: n.a. | criterion: The user can pull the gear lever aft | alternative criterion: n.a.

Use case 2.1.x.4 Gear changing (manual / automatic) | Task: Conducting a car | Subtask: Driving backwards 2 (Automatic)
Sub-task level criterion: The user can complete the task of engaging the reverse gear.
- Grasp (right hand) | alternative: Reach (hand) | criterion: The user can reach and grasp the automatic gear lever | alternative criterion: The user can successfully actuate the clutch button
- Push (hand) | alternative: n.a. | criterion: The user can push the gear lever right | alternative criterion: The user can push the button
- Push (hand) | alternative: Push (hand) | criterion: The user can pull the gear lever aft | alternative criterion: n.a.

Use case 2.1.x.5 Accessing interior storage compartments | Task: Using interior equipment (NEW) | Subtask: Accessing interior storage
Sub-task level criterion: The user can successfully extract an object from the storage compartment.
- Look (eyes) | alternative: Look (eyes) | criterion: The user can correctly identify the actuator and the operation to perform | alternative criterion: The user can correctly identify the actuator and the operation to perform
- Reach (hand) | alternative: Reach (hand) | criterion: The user can reach the actuator | alternative criterion: The user can reach the actuator
- Pull (hand) | alternative: Push (hand) | criterion: The user can open the compartment | alternative criterion: The user can open the compartment
- Look (eyes) | alternative: Look (eyes) | criterion: The user can identify the object | alternative criterion: The user can identify the object
- Reach (hand) | alternative: Reach (hand) | criterion: The user can reach the object | alternative criterion: The user can reach the object

Use case 2.3.x.1 On-board nav. system programming | Task: Conducting a car | Subtask: Destination selection
Sub-task level criterion: The user can successfully select a destination from a list of stored destinations.
- Reach (hand) 1 | alternative: Speak | criterion: The user can reach and actuate the selection button | alternative criterion: The user can select the destination through voice control
- Reach (hand) 2 | alternative: Move left-right (head) | criterion: The user can reach and actuate the selection button | alternative criterion: The user can select the destination by gazing at the destination label (at vehicle standstill)

Use case 2.3.x.2 Audio system programming and tuning | Task: Conducting a car | Subtask: Reaching car radio 1
Sub-task level criterion: The user can successfully adjust the radio volume at the desired level.
- Look (eyes) | alternative: n.a. | criterion: The user can recognize the volume knob | alternative criterion: n.a.
- Reach (hand) | alternative: Speak | criterion: The user can reach the volume knob | alternative criterion: The user can adjust the radio volume through voice control
- Grasp (hand) | alternative: Speak | criterion: The user can grasp the volume knob | alternative criterion: The user can adjust the radio volume through voice control
- Rotate (hand) | alternative: Speak | criterion: The user can rotate the volume knob to reach the desired sound level | alternative criterion: The user can adjust the radio volume through voice control

Use case 2.3.x.2 Audio system programming and tuning | Task: Conducting a car | Subtask: Reaching car radio 2
Sub-task level criterion: The user can successfully select a radio station from a set of stored stations.
- Look (eyes) | alternative: n.a. | criterion: The user can recognize the desired station | alternative criterion: n.a.
- Reach (hand) | alternative: Speak | criterion: The user can reach the station button bank | alternative criterion: The user can select the desired radio station by pronouncing its name or a list ordinal
- Position (hand) | alternative: Speak | criterion: The user can reach the desired station button | alternative criterion: The user can select the desired radio station by pronouncing its name or a list ordinal
- Tactile verify (Cognitive) | alternative: Speak | criterion: The user can recognize the button | alternative criterion: The user can select the desired radio station by pronouncing its name or a list ordinal
- Grasp (hand) | alternative: Speak | criterion: The user can push the button and select the station | alternative criterion: The user can select the desired radio station by pronouncing its name or a list ordinal

Use case 2.3.x.3 Navigation information accessibility | Task: Conducting a car | Subtask: Listen - Navigation system audio cues
Sub-task level criterion: The user can successfully and correctly recognize an indication for a navigation manoeuvre.
- Listen 1 | alternative: Look (eyes) | criterion: The user can correctly listen to and interpret a navigation manoeuvre prompt | alternative criterion: The user can correctly read and interpret a navigation manoeuvre prompt
- Listen 2 | alternative: Look (eyes) + touch | criterion: The user can correctly listen to and interpret a navigation manoeuvre prompt | alternative criterion: The user can correctly perceive and interpret a navigation manoeuvre prompt

4.2.4 Generation of UsiXML files


Based on the simulation task table defined for the automotive use cases, simulation models and the corresponding task model and multimodal interaction model files have been developed on a per-use-case basis and stored on the ftp site of the VERITAS project, in the folder /Deliverables/D2.2.1. The following files were uploaded:

Simulation models

SimulationModel_Central_rearview_mirror_tuning.usi
SimulationModel_Lateral_mirror_tuning_manual.usi
SimulationModel_Lateral_mirror_tuning_electric.usi
SimulationModel_Handbrake_activation_deactivation.usi
SimulationModel_Gear_changing_manual.usi
SimulationModel_Gear_changing_automatic.usi
SimulationModel_Accessing_interior_storage_compartments.usi
SimulationModel_On-boardNavSystemProgramming.usi
SimulationModel_Audio_system_programming_and_tuning_1.usi
SimulationModel_Audio_system_programming_and_tuning_2.usi
SimulationModel_Navigation_information_accessibility.usi

Task models

TaskModel_Viewing_backwards_inside_car.usi
TaskModel_Viewing_backwards_outside_car.usi
TaskModel_Viewing_backwards_outside_car_left_electric.usi
TaskModel_Parking_brake_activation.usi
TaskModel_Parking_brake_deactivation.usi
TaskModel_Driving_backwards_manual.usi
TaskModel_Driving_backwards_automatic.usi
TaskModel_Accessing_Interior_Storage.usi


TaskModel_Destination_selection.usi
TaskModel_Reaching_car_radio_1.usi
TaskModel_Reaching_car_radio_2.usi
TaskModel_Listen_navigation_system_audio.usi

Multimodal interaction models

MultimodalInteractionModel_Grasp_hand_central_mirror.usi
MultimodalInteractionModel_Look_central_mirror.usi
MultimodalInteractionModel_Push_hand_central_mirror.usi
MultimodalInteractionModel_Turn_hand_central_mirror.usi
MultimodalInteractionModel_Grasp_hand_left_mirror.usi
MultimodalInteractionModel_Look_left_mirror.usi
MultimodalInteractionModel_Push_hand_left_mirror_knob.usi
MultimodalInteractionModel_Turn_hand_left_mirror_knob.usi
MultimodalInteractionModel_.usi
MultimodalInteractionModel_Look_left_mirror_control_switch.usi
MultimodalInteractionModel_Push_hand_left_mirror_control_switch.usi
MultimodalInteractionModel_Reach_hand_left_mirror_control_switch.usi
MultimodalInteractionModel_Controlled_release_hand_brake.usi
MultimodalInteractionModel_Pull_hand_brake.usi
MultimodalInteractionModel_Push_parking_brake_release_button.usi
MultimodalInteractionModel_Release_parking_brake_release_button.usi
MultimodalInteractionModel_Grasp_right_hand_gear_handle.usi
MultimodalInteractionModel_Push_left_foot_gear_pedal.usi
MultimodalInteractionModel_Push_right_hand_gear_handle.usi
MultimodalInteractionModel_Grasp_right_hand_automatic_gear.usi
MultimodalInteractionModel_Push_right_hand_automatic_gear_handle.usi
MultimodalInteractionModel_Push_right_hand_automatic_gear_handle_unlock_button.usi
MultimodalInteractionModel_Look_storage_compartment.usi
MultimodalInteractionModel_Look_storage_door.usi
MultimodalInteractionModel_Pull_hand_storage_door_handle.usi
MultimodalInteractionModel_Reach_hand_object_in_storage.usi
MultimodalInteractionModel_Reach_hand_storage_door.usi
MultimodalInteractionModel_Push_hand_navigation_system_buttons.usi
MultimodalInteractionModel_Grasp_hand_radio_knob.usi
MultimodalInteractionModel_Look_radio_knob.usi
MultimodalInteractionModel_Reach_hand_radio_knob.usi
MultimodalInteractionModel_Turn_hand_radio_knob.usi
MultimodalInteractionModel_Grasp_hand_radio_push_button.usi
MultimodalInteractionModel_Look_radio_push_button.usi
MultimodalInteractionModel_Position_hand_radio_push_button.usi
MultimodalInteractionModel_Reach_hand_radio_push_button.usi
MultimodalInteractionModel_Tactile_verify_hand_radio_push_button.usi
MultimodalInteractionModel_Listen_navigation_system_audio_cues_1.usi
MultimodalInteractionModel_Listen_navigation_system_audio_cues_2.usi


Below, example diagrams and corresponding UsiXML code for Simulation, Task and
Multimodal Interaction models for car interior use cases are displayed:

Figure 4-5 - Navigation Information Accessibility Simulation Model

<?xml version="1.0" encoding="UTF-8"?>

<taskmodel>
<!--Tareas-->
<task id="st0task0" name="Navigation information accessibility"
type="abstraction">
<task id="st0task1" name="Conducting_a_car" type="abstraction">
<task id="st0task2" name="Listen_navigation_system_audio_cues"
type="abstraction"/>
</task>
</task>
<!--Relaciones entre tareas-->
</taskmodel>

Figure 4-6 - Listen Navigation System Audio Cues Task Model


<?xml version="1.0" encoding="UTF-8"?>


<taskmodel>
<!--Tareas-->
<task id="st0task0" name="Listen_navigation_system_audio_cues"
type="abstraction">
<task id="st0task1" name="Listen_navigation_system_audio_cues_1"
type="interaction"/>
<task id="st0task2" name="Listen_navigation_system_audio_cues_2"
type="interaction"/>
</task>
<!--Relaciones entre tareas-->
<deterministicChoice>
<source sourceId="st0task1"/>
<target targetId="st0task2"/>
</deterministicChoice>
</taskmodel>

Figure 4-7 - Listen Navigation System Audio Cues Interaction Model

<?xml version="1.0" encoding="UTF-8"?>


<taskmodel>
<!--Tareas-->
<task id="st0task0" name="Listen_navigation_system_audio_cues_1" type="abstraction">
<task id="st0task1"
name="Listen(modality:audition)(object:navigation_system_audio_cues)"
type="interaction"/>
<task id="st0task2" name="Look(modality:vision)(object:navigation_system_visual_cues)"
type="interaction"/>
</task>
<!--Relaciones entre tareas-->
<deterministicChoice>
<source sourceId="st0task1"/>
<target targetId="st0task2"/>
</deterministicChoice>
</taskmodel>
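
The UsiXML task model files follow the simple structure shown above: a tree of <task> elements and a list of temporal relations between them. As an illustration of how such a file could be consumed by downstream tooling, the following is a minimal sketch (not part of the deliverable tool chain; the helper functions and the chosen file name are only examples) that loads one of the task models listed above with Python's standard ElementTree parser and prints the task hierarchy and the temporal relations:

import xml.etree.ElementTree as ET

def print_tasks(task, indent=0):
    # Recursively print the task hierarchy (id, name, type).
    print("  " * indent + f"{task.get('id')}: {task.get('name')} [{task.get('type')}]")
    for child in task.findall("task"):
        print_tasks(child, indent + 1)

def print_relations(taskmodel):
    # Print the temporal relations (e.g. enabling, deterministicChoice) between tasks.
    for element in taskmodel:
        if element.tag == "task":
            continue
        src = element.find("source")
        tgt = element.find("target")
        if src is not None and tgt is not None:
            print(f"{element.tag}: {src.get('sourceId')} -> {tgt.get('targetId')}")

if __name__ == "__main__":
    root = ET.parse("TaskModel_Listen_navigation_system_audio.usi").getroot()
    for top_level_task in root.findall("task"):
        print_tasks(top_level_task)
    print_relations(root)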

4.2.5 Transfer of simulation models into RAMSIS tasks


The use cases and tasks in the overall simulation task table (section 4.2.1) were transferred into RAMSIS task (*.tsk) and use case / simulation model (*.aan) definition files. The assignment of these files to the use cases and tasks is given in the following table. The use case 2.3.x.3 (Navigation information accessibility) is addressed by the VERITAS simulation core and is omitted here.


Table 6 - RAMSIS task and use case / simulation model files for car interior use cases

(In the use case identifiers, x = {a|b}; numbers after subtasks or primitive tasks indicate alternatives. For each primitive task the corresponding RAMSIS task file (*.tsk) is given; the RAMSIS use case / simulation model file (*.aan) is listed once per subtask. A dash means that no separate task file is assigned.)

Use case 2.1.x.1 Central rearview mirror tuning | Task: Conducting a car | Subtask: Viewing backwards (inside car) | Simulation model file: 21x1_central-mirror.aan
- Look (eyes): 21x1_viewing-backwards_1_look.tsk
- Grasp (hand): 21x1_viewing-backwards_2_grasp.tsk
- Turn (hand): 21x1_viewing-backwards_3_turn.tsk
- Push (hand): 21x1_viewing-backwards_4_grasp.tsk, 21x1_viewing-backwards_5_push.tsk
- Look (eyes): -

Use case 2.1.x.2 Lateral mirror tuning | Task: Conducting a car | Subtask: Viewing backwards (outside car, left) 1 | Simulation model file: 21x2_lateral-mirror.aan
- Look (eyes): 21x2_viewing-backwards-1_1_look.tsk
- Grasp (hand): -
- Turn (hand): 21x2_viewing-backwards-1_3_turn.tsk
- Push (hand): -
- Look (eyes): -

Use case 2.1.x.2 Lateral mirror tuning | Task: Conducting a car | Subtask: Viewing backwards (outside car, left) 2 | Simulation model file: 21x1_central-mirror.aan
- Look (eyes): 21x2_viewing-backwards-2_1_look.tsk
- Reach (hand): -
- Push (hand): 21x2_viewing-backwards-2_3_push.tsk
- Push (hand): -
- Look (eyes): -

Use case 2.1.x.3 Handbrake activation / deactivation | Task: Conducting a car | Subtask: Parking brake activation | Simulation model file: 21x3_hand-brake.aan
- Push (thumb): -
- Pull (hand): 21x3_parking-brake-activation_2_pull.tsk
- Release (thumb): -

Use case 2.1.x.3 Handbrake activation / deactivation | Task: Conducting a car | Subtask: Parking brake deactivation | Simulation model file: 21x3_hand-brake.aan
- Push (thumb): -
- Controlled release (hand): 21x3_parking-brake-deactivation_2_pull.tsk
- Release (thumb): -

Use case 2.1.x.4 Gear changing (manual / automatic) | Task: Conducting a car | Subtask: Driving backwards 1 (Manual) | Simulation model file: 21x4_gear-manual.aan
- Push (left foot): 21x4_driving-backwards-1_0_reach.tsk, 21x4_driving-backwards-1_1_push.tsk
- Grasp (right hand): 21x4_driving-backwards-1_2_grasp.tsk
- Push (hand): 21x4_driving-backwards-1_3_push.tsk
- Pull (hand): -

Use case 2.1.x.4 Gear changing (manual / automatic) | Task: Conducting a car | Subtask: Driving backwards 2 (Automatic) | Simulation model file: 21x4_gear-automatic.aan
- Grasp (right hand): 21x4_driving-backwards-2_1_grasp.tsk
- Push (hand): 21x4_driving-backwards-2_2_push.tsk
- Push (hand): -

Use case 2.1.x.5 Accessing interior storage compartments | Task: Using interior equipment (NEW) | Subtask: Accessing interior storage | Simulation model file: 21x5_storage-compartment.aan
- Look (eyes): 21x5_interior-storage_1_look.tsk
- Reach (hand): 21x5_interior-storage_2_reach.tsk
- Pull (hand): 21x5_interior-storage_3_pull.tsk
- Look (eyes): -
- Reach (hand): 21x5_interior-storage_5_reach.tsk

Use case 2.3.x.1 On-board nav. system programming | Task: Conducting a car | Subtask: Destination selection | Simulation model file: 23x1_onboard-navigation.aan
- Push (hand) 1: 23x1_dest-selection_1_push.tsk
- Push (hand) 2: -

Use case 2.3.x.2 Audio system programming and tuning | Task: Conducting a car | Subtask: Reaching car radio 1 | Simulation model file: 23x2_audio-system-1.aan
- Look (eyes): 23x2_radio-1_1_look.tsk
- Reach (hand): -
- Grasp (hand): 23x2_radio-1_3_grasp.tsk
- Rotate (hand): 23x2_radio-1_4_rotate.tsk

Use case 2.3.x.2 Audio system programming and tuning | Task: Conducting a car | Subtask: Reaching car radio 2 | Simulation model file: 23x2_audio-system-2.aan
- Look (eyes): 23x2_radio-2_1_look.tsk
- Reach (hand): -
- Position (hand): 23x2_radio-2_3_position.tsk
- Tactile verify (Cognitive): -
- Grasp (hand): -

All files are provided in the archive VERITAS_D2.2.1_RAMSIS-simulation-model-files.zip in the folder of this deliverable on the ftp server (/Deliverables/D2.2.1).


4.3 Motor cycle applications


4.3.1 Overall simulation task table
Table 7 - Overall simulation task table for motor cycle use cases

(For each primitive task the entries give: primitive task | modality | task object | disability | alternative task(s) | alternative modality | alternative task object / assistive device. Numbers after subtasks or primitive tasks indicate alternatives; n.a. = not applicable.)

Use case: Riding a Powered Two Wheeler | Task: Riding a Powered Two Wheeler | Subtask: Steering
- Grasp | Motor | Handle bars | Limb Impairment | n.a. | n.a. | n.a.
- Turn | Motor | Handle bars | Limb Impairment | n.a. | n.a. | n.a.

Use case: Motorcycle Handling | Task: Motorcycle Handling
Subtask: Starting the engine - 1
- Turn | Motor | Ignition switch | Limb Impairment | n.a. | n.a. | n.a.
- Push | Motor | Electric start button | Limb Impairment | n.a. | n.a. | n.a.
Subtask: Starting the engine - 2
- Turn | Motor | Ignition switch | Limb Impairment | n.a. | n.a. | n.a.
- Push | Motor | Kick-start lever | Limb Impairment | n.a. | n.a. | n.a.
Subtask: Decelerating
- Grasp | Motor | Throttle | Limb Impairment | n.a. | n.a. | n.a.
- Move upwards | Motor | Throttle | Limb Impairment | n.a. | n.a. | n.a.
Subtask: Accelerating
- Grasp | Motor | Throttle | Limb Impairment | n.a. | n.a. | n.a.
- Move downwards | Motor | Throttle | Limb Impairment | n.a. | n.a. | n.a.
Subtask: Gearing (motorbike)
- Grasp | Motor | Clutch | Limb Impairment | n.a. | n.a. | n.a.
- Pull | Motor | Clutch | Limb Impairment | n.a. | n.a. | n.a.
- Push | Motor | Shift lever | Limb Impairment | n.a. | n.a. | n.a.
Subtask: Braking
- Grasp | Motor | Dual lever system (front and rear brake) or front brake lever | Limb Impairment | n.a. | n.a. | n.a.
- Pull | Motor | Dual lever system (front and rear brake) or front brake lever | Limb Impairment | n.a. | n.a. | n.a.
Subtask: Indicate direction
- Push | Motor | Turn signal | Limb Impairment | n.a. | n.a. | n.a.
Subtask: Locating and adjusting the lateral mirror (left)
- Locate | Vision | Left lateral mirror | Vision Impairment | n.a. | n.a. | n.a.
- Reach | Motor | Left lateral mirror | Limb Impairment | n.a. | n.a. | n.a.
- Grasp | Motor | Left lateral mirror | Limb Impairment | n.a. | n.a. | n.a.
- Turn | Motor | Left lateral mirror | Limb Impairment | n.a. | n.a. | n.a.
- Locate | Vision | Left lateral mirror | Vision Impairment | n.a. | n.a. | n.a.

Use case: Usage of scooter on bumpy roads | Task: Usage of scooter on bumpy roads | Subtask: Riding Posture
- Grasp | Motor | Handle bars | Limb Impairment | n.a. | n.a. | n.a.
- Put feet on footrest | Motor | Footrest | Limb Impairment | n.a. | n.a. | n.a.
- Sit | Motor | On seat | Limb Impairment | n.a. | n.a. | n.a.

Use case: Parking the motorcycle/scooter | Task: Parking the motorcycle/scooter
Subtask: Get on a motorbike (driver)
- Grasp | Motor | Handle bars | Limb Impairment | n.a. | n.a. | n.a.
- Lean over | Motor | Body posture | Limb Impairment | n.a. | n.a. | n.a.
- Lift | Motor | Over seat | Limb Impairment | n.a. | n.a. | n.a.
- Swing | Motor | Over seat | Limb Impairment | n.a. | n.a. | n.a.
- Stand | Motor | On ground | Limb Impairment | n.a. | n.a. | n.a.
- Sit | Motor | On seat | Limb Impairment | n.a. | n.a. | n.a.
Subtask: Get on a scooter (driver)
- Grasp | Motor | Handle bars | Limb Impairment | n.a. | n.a. | n.a.
- Lift | Motor | Over footrest | Limb Impairment | n.a. | n.a. | n.a.
- Swing | Motor | Over footrest | Limb Impairment | n.a. | n.a. | n.a.
- Stand | Motor | On ground | Limb Impairment | n.a. | n.a. | n.a.
- Sit | Motor | On seat | Limb Impairment | n.a. | n.a. | n.a.
Subtask: Park a powered two wheeler vehicle onto a side stand
- Push | Motor | The side stand | Limb Impairment | n.a. | n.a. | n.a.
- Pull/push | Motor | Handle bars | Limb Impairment | n.a. | n.a. | n.a.
- Stand | Motor | Feet | Limb Impairment | n.a. | n.a. | n.a.
- Grasp | Motor | Handle bars | Limb Impairment | n.a. | n.a. | n.a.
- Lift | Motor | Seat height | Limb Impairment | n.a. | n.a. | n.a.
- Swing | Motor | Seat height | Limb Impairment | n.a. | n.a. | n.a.
- Lean over | Motor | Body posture | Limb Impairment | n.a. | n.a. | n.a.
Subtask: Park a powered two wheeler vehicle onto a center stand (if possible start with the vehicle on its side stand)
- Grasp | Motor | Left handle bar | Limb Impairment | n.a. | n.a. | n.a.
- Grasp | Motor | Frame member | Limb Impairment | n.a. | n.a. | n.a.
- Lift | Motor | Onto center stand lever | Limb Impairment | n.a. | n.a. | n.a.
- Push down | Motor | Center stand | Limb Impairment | n.a. | n.a. | n.a.
- Pull backwards | Motor | Handle bars | Limb Impairment | n.a. | n.a. | n.a.
Subtask: Get off a powered two wheeler vehicle (driver)
- Stand | Motor | On ground | Limb Impairment | n.a. | n.a. | n.a.
- Grasp | Motor | Handle bars | Limb Impairment | n.a. | n.a. | n.a.
- Swing | Motor | Over seat | Limb Impairment | n.a. | n.a. | n.a.
- Lean over | Motor | Over the vehicle | Limb Impairment | n.a. | n.a. | n.a.
- Release | Motor | Handle bars | Limb Impairment | n.a. | n.a. | n.a.

Use case: Navigation | Task: Receiving information from the device while driving
Subtask: Speed limit exceeded during PTW riding
- Listening | Audition | The sounds in the helmet | Auditory Impairment | Looking | Vision | The visual cues on the dashboard
Subtask: To get aware about low fuel level during regular riding
- Listening | Audition | The sounds in the helmet | Auditory Impairment | Looking | Vision | The visual cues on the dashboard

4.3.2 Identification of object parameters


Table 8 - Object parameters for motor cycle use cases

(For each task object the entries give: task object | task object parameters | alternative task(s) | alternative / assistive parameters; n.a. = not applicable.)

Use case 2.2.a.1 Riding a Powered Two Wheeler
- Handle bars | Position in x,y,z of the handle bars | n.a. | n.a.
- Handle bars | Position in x,y,z of the handle bars | n.a. | n.a.

Use case 2.2.a.2 Motorcycle Handling
- Ignition switch | Position in x,y,z of the ignition switch | n.a. | n.a.
- Electric start button | Position in x,y,z of the electric start button | n.a. | n.a.
- Ignition switch | Position in x,y,z of the ignition switch | n.a. | n.a.
- Kick-start lever | Position in x,y,z of the kick-start lever | n.a. | n.a.
- Throttle | Position in x,y,z of the throttle | n.a. | n.a.
- Throttle | Rotation of the throttle | n.a. | n.a.
- Throttle | Position in x,y,z of the throttle | n.a. | n.a.
- Throttle | Torque | n.a. | n.a.
- Clutch | Position in x,y,z of the clutch | n.a. | n.a.
- Clutch | Activation force | n.a. | n.a.
- Shift lever | Activation force | n.a. | n.a.
- Dual lever system (front and rear brake) or front brake lever | Position in x,y,z of the lever | n.a. | n.a.
- Dual lever system (front and rear brake) or front brake lever | Activation force | n.a. | n.a.
- Turn signal | Position in x,y,z of the turn signal button | n.a. | n.a.
- Left lateral mirror | Locate | n.a. | n.a.
- Left lateral mirror | Position in x,y,z of the left lateral mirror | n.a. | n.a.
- Left lateral mirror | Position in x,y,z of the left lateral mirror | n.a. | n.a.
- Left lateral mirror | Position in x,y,z of the left lateral mirror | n.a. | n.a.
- Left lateral mirror | Locate | n.a. | n.a.

Use case 2.2.a.3 Usage of scooter on bumpy roads
- Handlebars | Vibrational comfort index for the hands | n.a. | n.a.
- Footrest | Vibrational comfort index for the feet | n.a. | n.a.
- On seat | Vibrational comfort index for the buttock and the head | n.a. | n.a.

Use case 2.2.a.4 Parking the motorcycle/scooter
- Handle bars | Position in x,y,z of the handle bars | n.a. | n.a.
- Body posture | Body posture | n.a. | n.a.
- Over seat | Position in x,y,z of the right leg | n.a. | n.a.
- Over seat | Position in x,y,z of the right leg | n.a. | n.a.
- On ground | Position in x,y,z of the feet | n.a. | n.a.
- On seat | Position in x,y,z of the buttock | n.a. | n.a.
- Handle bars | Position in x,y,z of the handle bars | n.a. | n.a.
- Over footrest | Position in x,y,z of the right leg | n.a. | n.a.
- Over footrest | Position in x,y,z of the right leg | n.a. | n.a.
- On ground | Position in x,y,z of the feet | n.a. | n.a.
- On seat | Position in x,y,z of the buttock | n.a. | n.a.
- The side stand | Position in x,y,z of the side stand | n.a. | n.a.
- Handle bars | Position in x,y,z of the handle bars | n.a. | n.a.
- Feet | Position in x,y,z of the feet | n.a. | n.a.
- Handle bars | Position in x,y,z of the handle bars | n.a. | n.a.
- Seat height | Position in x,y,z of the right leg | n.a. | n.a.
- Seat height | Position in x,y,z of the right leg | n.a. | n.a.
- Body posture | Body posture | n.a. | n.a.
- Left handlebar | Position in x,y,z of the handle bar | n.a. | n.a.
- Frame member | Position in x,y,z of the frame member | n.a. | n.a.
- Onto center stand lever | Position in x,y,z of the foot | n.a. | n.a.
- Center stand | Position in x,y,z of the center stand | n.a. | n.a.
- Handlebars | Position in x,y,z of the handle bar | n.a. | n.a.
- On ground | Position in x,y,z of the feet | n.a. | n.a.
- Handle bars | Position in x,y,z of the handle bars | n.a. | n.a.
- Over seat | Position in x,y,z of the right leg | n.a. | n.a.
- Over the vehicle | Body posture | n.a. | n.a.
- Handle bars | Position in x,y,z of the hands | n.a. | n.a.

Use case 2.4.a.1 Navigation
- Listening to the sounds in the helmet | Reaction to the warning (press a button) | Looking | Reaction to the warning (press a button)
- Listening to the sounds in the helmet | Reaction to the warning (press a button) | Looking | Reaction to the warning (press a button)


4.3.3 Definition of success criteria for each task


Table 9 - Success criteria for motor cycle use cases

(In the use case identifiers, x = {a|b}; numbers after subtasks or primitive tasks indicate alternatives. For each primitive task the entries give: primitive task | alternative task(s) | primitive task level criterion | alternative task level criterion; n.a. = not applicable.)

Use case 2.2.a.1 Riding a Powered Two Wheeler | Task: Riding a Powered Two Wheeler | Subtask: Steering
Sub-task level criterion: The rider can steer the handlebar.
- Grasp | n.a. | All fingers from both hands touch and get wrapped around the handlebars | n.a.
- Turn | n.a. | New position in x,y,z of the handle bars | n.a.

Use case 2.2.a.2 Motorcycle Handling | Task: Motorcycle Handling
Subtask: Starting the engine - 1. Sub-task level criterion: The rider can reach the commands and activate them.
- Turn | n.a. | New position in x,y,z of the ignition switch | n.a.
- Push | n.a. | New position in x,y,z of the electric start button | n.a.
Subtask: Starting the engine - 2. Sub-task level criterion: The rider can reach the commands and activate them.
- Turn | n.a. | New position in x,y,z of the ignition switch | n.a.
- Push | n.a. | New position in x,y,z of the kick-start lever | n.a.
Subtask: Decelerating. Sub-task level criterion: The rider can reach the throttle and move it upwards.
- Grasp | n.a. | All fingers touch and get wrapped around the throttle | n.a.
- Move upwards | n.a. | New position of the throttle | n.a.
Subtask: Accelerating. Sub-task level criterion: The rider can reach the throttle and move it downwards.
- Grasp | n.a. | All fingers touch and get wrapped around the throttle | n.a.
- Move downwards | n.a. | Minimum torque to be provided to accelerate | n.a.
Subtask: Gearing (motorbike). Sub-task level criterion: The rider can reach the levers, pull the clutch and push the shift lever.
- Grasp | n.a. | All fingers touch and get wrapped around the clutch | n.a.
- Pull | n.a. | Minimum force to be provided to pull | n.a.
- Push | n.a. | Minimum force to be provided to push | n.a.
Subtask: Braking. Sub-task level criterion: The rider can reach the lever and pull it to brake.
- Grasp | n.a. | All fingers touch and get wrapped around the clutch | n.a.
- Pull | n.a. | Minimum force to be provided to pull | n.a.
Subtask: Indicate direction. Sub-task level criterion: The rider can reach the command and activate it.
- Push | n.a. | New position in x,y,z of the button | n.a.
Subtask: Locating and adjusting the lateral mirror (left). Sub-task level criterion: The rider can locate the mirror and adjust it.
- Locate | n.a. | Locate the left lateral mirror | n.a.
- Reach | n.a. | The hand is stretched until it reaches the position in x,y,z of the left lateral mirror | n.a.
- Grasp | n.a. | At least 3 of the five fingers grasp the left lateral mirror | n.a.
- Turn | n.a. | New position in x,y,z of the mirror | n.a.
- Locate | n.a. | Locate the central rearview mirror | n.a.

Use case 2.2.a.3 Usage of scooter on bumpy roads | Task: Usage of scooter on bumpy roads | Subtask: Riding Posture
Sub-task level criterion: The rider rides in a comfortable situation.
- Grasp | n.a. | Maximum value of vibrational comfort for the hands | n.a.
- Put feet on footrest | n.a. | Maximum value of vibrational comfort for the feet | n.a.
- Sit | n.a. | Maximum value of vibrational comfort for the buttock and the head | n.a.

Use case 2.2.a.4 Parking the motorcycle/scooter | Task: Parking the motorcycle/scooter
Subtask: Get on a motorbike (driver). Sub-task level criterion: The rider is able to get on a motorbike.
- Grasp | n.a. | All fingers from both hands touch and get wrapped around the handlebars | n.a.
- Lean over | n.a. | New body posture | n.a.
- Lift | n.a. | New position in x,y,z of the right leg, over the seat | n.a.
- Swing | n.a. | New position in x,y,z of the right leg, over the seat | n.a.
- Stand | n.a. | New position in x,y,z of the feet | n.a.
- Sit | n.a. | Position in x,y,z of the buttock on the seat | n.a.
Subtask: Get on a scooter (driver). Sub-task level criterion: The rider is able to get on a scooter.
- Grasp | n.a. | All fingers from both hands touch and get wrapped around the handlebars | n.a.
- Lift | n.a. | New position in x,y,z of the right leg, over the footrest | n.a.
- Swing | n.a. | New position in x,y,z of the right leg, over the footrest | n.a.
- Stand | n.a. | New position in x,y,z of the feet | n.a.
- Sit | n.a. | Position in x,y,z of the buttock on the seat | n.a.
Subtask: Park a powered two wheeler vehicle onto a side stand. Sub-task level criterion: The rider is able to park the vehicle on the side stand.
- Push | n.a. | New position in x,y,z of the side stand | n.a.
- Pull/push | n.a. | New position in x,y,z of the handle bars | n.a.
- Stand | n.a. | New position in x,y,z of the feet | n.a.
- Grasp | n.a. | New position in x,y,z of the handle bars | n.a.
- Lift | n.a. | New position in x,y,z of the right leg, the same as the seat height | n.a.
- Swing | n.a. | New position in x,y,z of the right leg | n.a.
- Lean over | n.a. | New body posture | n.a.
Subtask: Park a powered two wheeler vehicle onto a center stand (if possible start with the vehicle on its side stand). Sub-task level criterion: The rider is able to park the vehicle on the center stand.
- Grasp | n.a. | All fingers touch and get wrapped around the handlebars | n.a.
- Grasp | n.a. | The hand touches and grasps the frame member | n.a.
- Lift | n.a. | New position in x,y,z of the foot on the center stand | n.a.
- Push down | n.a. | New position in x,y,z of the center stand | n.a.
- Pull backwards | n.a. | New position in x,y,z of the handle bars | n.a.
Subtask: Get off a powered two wheeler vehicle (driver). Sub-task level criterion: The rider is able to get off from the vehicle.
- Stand | n.a. | New position in x,y,z of the feet | n.a.
- Grasp | n.a. | New position in x,y,z of the hands | n.a.
- Swing | n.a. | New position in x,y,z of the right leg | n.a.
- Lean over | n.a. | New body posture | n.a.
- Release | n.a. | New position in x,y,z of the hands | n.a.

Use case 2.4.a.1 Navigation | Task: Receiving information from the device while driving
Subtask: Speed limit exceeded during PTW riding. Sub-task level criterion: The rider can understand and react in a short time.
- Listening | alternative: Looking | criterion: Press a button in 2 sec after the first time the warning is notified | alternative criterion: Press a button in 2 sec after the first time the warning is notified
Subtask: To get aware about low fuel level during regular riding. Sub-task level criterion: The rider can understand and react in a short time.
- Listening | alternative: Looking | criterion: Press a button in 2 sec before the 2nd time the warning is notified | alternative criterion: Press a button in 2 sec before the 2nd time the warning is notified

4.3.4 Generation of UsiXML files

4.3.4.1 Use case 2.2 Simulation model


Based on the simulation task table defined for the motorcycle applications, the simulation model, task model and multimodal interaction model files were created and stored on the ftp site of the VERITAS project. In particular, the following files were uploaded to the ftp site (/Deliverables/D2.2.1):

A simulation model related to UC 2.2.a.1, 2.2.a.2, 2.2.a.3 and 2.2.a.4 (Veritas_Motorcyle_simulation_model.usi)

Relative to the simulation model for Use Case 2.2., 15 task model files were created:

Veritas_automotive_motorcycle_task_model_accelerating.usi
Veritas_automotive_motorcycle_task_model_adjust_mirror.usi


Veritas_automotive_motorcycle_task_model_braking.usi
Veritas_automotive_motorcycle_task_model_decelerating.usi
Veritas_automotive_motorcycle_task_model_gearing.usi
Veritas_automotive_motorcycle_task_model_get_off_PTW.usi
Veritas_automotive_motorcycle_task_model_get_on_motorbike.usi
Veritas_automotive_motorcycle_task_model_get_on_scooter.usi
Veritas_automotive_motorcycle_task_model_indicate_direction.usi
Veritas_automotive_motorcycle_task_model_park_center_stand.usi
Veritas_automotive_motorcycle_task_model_park_side_stand.usi
Veritas_automotive_motorcycle_task_model_riding_posture.usi
Veritas_automotive_motorcycle_task_model_start_engine1.usi
Veritas_automotive_motorcycle_task_model_start_engine2.usi
Veritas_automotive_motorcycle_task_model_steering.usi

Relative to the simulation model for Use Case 2.2, 32 multimodal interaction model files were created:

Veritas_automotive_motorcycle_interaction_model_feet_on_footrest.usi
Veritas_automotive_motorcycle_interaction_model_grasp_clutch.usi
Veritas_automotive_motorcycle_interaction_model_grasp_frame.usi
Veritas_automotive_motorcycle_interaction_model_grasp_handlebars.usi
Veritas_automotive_motorcycle_interaction_model_grasp_left_handlebar.usi
Veritas_automotive_motorcycle_interaction_model_grasp_lever.usi
Veritas_automotive_motorcycle_interaction_model_grasp_mirror.usi
Veritas_automotive_motorcycle_interaction_model_grasp_throttle.usi
Veritas_automotive_motorcycle_interaction_model_lean_over.usi
Veritas_automotive_motorcycle_interaction_model_lift_onto_center_stand.usi
Veritas_automotive_motorcycle_interaction_model_lift_over_footrest.usi
Veritas_automotive_motorcycle_interaction_model_lift_over_seat.usi
Veritas_automotive_motorcycle_interaction_model_locate_mirror.usi
Veritas_automotive_motorcycle_interaction_model_move_downwards.usi
Veritas_automotive_motorcycle_interaction_model_move_upwards.usi
Veritas_automotive_motorcycle_interaction_model_pull_backwards.usi
Veritas_automotive_motorcycle_interaction_model_pull_clutch.usi
Veritas_automotive_motorcycle_interaction_model_pull_lever.usi
Veritas_automotive_motorcycle_interaction_model_push_button.usi
Veritas_automotive_motorcycle_interaction_model_push_center_stand.usi
Veritas_automotive_motorcycle_interaction_model_push_shift_lever.usi
Veritas_automotive_motorcycle_interaction_model_push_side_stand.usi
Veritas_automotive_motorcycle_interaction_model_push_signal.usi
Veritas_automotive_motorcycle_interaction_model_reach_mirror.usi
Veritas_automotive_motorcycle_interaction_model_release_handlebars.usi
Veritas_automotive_motorcycle_interaction_model_sit.usi
Veritas_automotive_motorcycle_interaction_model_stand.usi
Veritas_automotive_motorcycle_interaction_model_swing_over_footrest.usi
Veritas_automotive_motorcycle_interaction_model_swing_over_seat.usi
Veritas_automotive_motorcycle_interaction_model_turn_handlebars.usi
Veritas_automotive_motorcycle_interaction_model_turn_key.usi
Veritas_automotive_motorcycle_interaction_model_turn_mirror.usi


Below are the diagram representations of the Simulation Model and the UsiXML code for Use Case 2.2, as well as example diagrams and code for a Task Model and an Interaction Model of this Use Case:

Figure 4-8 - Motorcycle Simulation Model


<?xml version="1.0" encoding="UTF-8"?>


<taskmodel>
<!--Tareas-->
<task id="st0task0" name="Automotive_simulation_Motorcycle" type="abstraction">
<task id="st0task1" name="riding_position" type="abstraction">
<task id="st0task2" name="steering" type="abstraction"/>
</task>
<task id="st0task3" name="usage_on_bumpy_road" type="abstraction">
<task id="st0task4" name="riding_posture" type="abstraction"/>
</task>
<task id="st0task5" name="parking" type="abstraction">
<task id="st0task6" name="get_on_motorbike" type="abstraction"/>
<task id="st0task7" name="get_on_scooter" type="abstraction"/>
<task id="st0task8" name="park_side_stand" type="abstraction"/>
<task id="st0task9" name="park_center_stand" type="abstraction"/>
<task id="st0task10" name="get_off_PTW" type="abstraction"/>
</task>
<task id="st0task11" name="motorcycle_handling" type="abstraction">
<task id="st0task12" name="start_engine1" type="abstraction"/>
<task id="st0task13" name="starting_engine2" type="abstraction"/>
<task id="st0task14" name="decelerating" type="abstraction"/>
<task id="st0task15" name="accelerating" type="abstraction"/>
<task id="st0task16" name="gearing" type="abstraction"/>
<task id="st0task17" name="braking" type="abstraction"/>
<task id="st0task18" name="indicate_direction" type="abstraction"/>
<task id="st0task19" name="adjust_mirror" type="abstraction"/>
</task>
</task>
<!--Relaciones entre tareas-->
<deterministicChoice>
<source sourceId="st0task1"/>
<target targetId="st0task3"/>
</deterministicChoice>
<deterministicChoice>
<source sourceId="st0task3"/>
<target targetId="st0task5"/>
</deterministicChoice>
<deterministicChoice>
<source sourceId="st0task5"/>
<target targetId="st0task11"/>
</deterministicChoice>
<deterministicChoice>
<source sourceId="st0task6"/>
<target targetId="st0task7"/>
</deterministicChoice>
<deterministicChoice>
<source sourceId="st0task7"/>
<target targetId="st0task8"/>
</deterministicChoice>
<deterministicChoice>
<source sourceId="st0task8"/>
<target targetId="st0task9"/>
</deterministicChoice>
<deterministicChoice>
<source sourceId="st0task9"/>
<target targetId="st0task10"/>
</deterministicChoice>
<deterministicChoice>
<source sourceId="st0task12"/>
<target targetId="st0task13"/>
</deterministicChoice>
<deterministicChoice>
<source sourceId="st0task13"/>
<target targetId="st0task14"/>
</deterministicChoice>
<deterministicChoice>
<source sourceId="st0task14"/>
<target targetId="st0task15"/>
</deterministicChoice>
<deterministicChoice>
<source sourceId="st0task15"/>
<target targetId="st0task16"/>
</deterministicChoice>
<deterministicChoice>
<source sourceId="st0task16"/>
<target targetId="st0task17"/>
</deterministicChoice>
<deterministicChoice>
<source sourceId="st0task17"/>
<target targetId="st0task18"/>
</deterministicChoice>
<deterministicChoice>
<source sourceId="st0task18"/>
<target targetId="st0task19"/>
</deterministicChoice>
</taskmodel>

Figure 4-9 - Get on Scooter Task Model

<?xml version="1.0" encoding="UTF-8"?>


<taskmodel>
<!--Tareas-->
<task id="st3task0" name="get_on_scooter" type="abstraction">
<task id="st3task14" name="grasp_handlebars" type="interaction"/>
<task id="st3task16" name="lift_over_footrest" type="interaction"/>
<task id="st3task17" name="swing_over_footrest" type="interaction"/>
<task id="st3task19" name="stand" type="interaction"/>
<task id="st3task20" name="sit" type="interaction"/>
</task>
<!--Relaciones entre tareas-->
<enabling>
<source sourceId="st3task14"/>
<target targetId="st3task16"/>
</enabling>
<enabling>
<source sourceId="st3task16"/>
<target targetId="st3task17"/>
</enabling>
<enabling>
<source sourceId="st3task17"/>
<target targetId="st3task19"/>
</enabling>
<enabling>
<source sourceId="st3task19"/>
<target targetId="st3task20"/>
</enabling>
</taskmodel>

4.3.4.2 Consolidated Use case 2.4 Simulation model


A simulation model dedicated to the OBIS case, and in particular to UC 2.4.a.1, was created (Veritas_Motorcyle_OBIS_simulation_model.usi). For UC 2.4.a.1, one task model file (Veritas_Motorcyle_OBIS_task_model.usi) and one multimodal interaction file (Veritas_Motorcyle_OBIS_interaction_model.usi) were developed. Below are diagram representations of the simulation, task and interaction models for Use Case 2.4 as well as the UsiXML code that defines them:


Figure 4-10 - Motorcycle OBIS Simulation Model


<?xml version="1.0" encoding="UTF-8"?>
<taskmodel>
<!--Tareas-->
<task id="st0task0" name="Motorcycle Desktop simulation" type="abstraction">
<task id="st0task3" name="Low fuel level during PTW riding" type="abstraction">
<task id="st0task11" name="Warning recognized" type="abstraction"/>
<task id="st0task12" name="Get suitable speed" type="abstraction"/>
</task>
<task id="st0task8" name="Speed Limit exceeded during PTW riding" type="abstraction">
<task id="st0task9" name="Warning recognized" type="abstraction"/>
<task id="st0task10" name="Get suitable speed" type="abstraction"/>
</task>
</task>
<!--Relaciones entre tareas-->
<deterministicChoice>
<source sourceId="st0task3"/>
<target targetId="st0task8"/>
</deterministicChoice>
<enabling>
<source sourceId="st0task9"/>
<target targetId="st0task10"/>
</enabling>
<enabling>
<source sourceId="st0task11"/>
<target targetId="st0task12"/>
</enabling>
</taskmodel>

Figure 4-11 - Warning Recognized Task Model


<?xml version="1.0" encoding="UTF-8"?>


<taskmodel>
<!--Tareas-->
<task id="st0task0" name="Warning recognized" type="abstraction">
<task id="st0task13" name="Become aware of the situation" type="interaction"/>
</task>
<!--Relaciones entre tareas-->
</taskmodel>

Figure 4-12 - Become aware of the situation Interaction Model

<?xml version="1.0" encoding="UTF-8"?>


<taskmodel>
<!--Tareas-->
<task id="st0task0" name="Become aware of the situation" type="interaction">
<task id="st0task1" name="Listening modality audition means ears object
additional audio warning" type="interaction"/>
<task id="st0task2" name="modality visual means eyes object cues"
type="interaction"/>
<task id="st0task3" name="modality tactile means vibration object helmet"
type="interaction"/>
</task>
<!--Relaciones entre tareas-->
<deterministicChoice>
<source sourceId="st0task1"/>
<target targetId="st0task2"/>
</deterministicChoice>
<deterministicChoice>
<source sourceId="st0task2"/>
<target targetId="st0task3"/>
</deterministicChoice>
</taskmodel>


4.3.5 Transfer of simulation models into RAMSIS tasks


The use cases and tasks in the overall simulation task table (Table 7) were transferred into RAMSIS task (*.tsk) and use case / simulation model (*.aan) definition files. The assignment of these files to the use cases and tasks is given in the following table. The use case 2.4.a.2 (Navigation-1) is addressed by the VERITAS simulation core and is omitted here.

Table 10 - RAMSIS task and use case / simulation model files for motor cycle use cases

(In the use case identifiers, x = {a|b}; numbers after subtasks or primitive tasks indicate alternatives. For each primitive task the corresponding RAMSIS task file (*.tsk) is given; the RAMSIS use case / simulation model file (*.aan) is listed once per subtask. A dash means that no separate task file is assigned.)

Use case 2.2.a.1 Riding position | Task: Riding a Powered Two Wheeler | Subtask: Steering | Simulation model file: 22a1_motor-cycle_riding-position.aan
- Grasp (hands): 22a1_motor-cycle_steering_1_grasp.tsk
- Turning (hands): -

Use case 2.2.a.1 Riding position | Task: Riding a Powered Two Wheeler | Subtask: Steering | Simulation model file: 22a1_scooter_riding-position.aan
- Grasp (hands): 22a1_scooter_steering_1_grasp.tsk
- Turning (hands): 22a1_scooter_steering_2_turn.tsk

Use case 2.2.a.4 Motorcycle Handling | Task: Riding a Powered Two Wheeler | Subtask: Looking at and adjusting the lateral mirror (left) 1 | Simulation model file: 22a4_scooter_motorcycle-handling.aan
- Look (eyes): 22a4_scooter_lateral-mirror_1_look.tsk
- Grasp (hand): 22a4_scooter_lateral-mirror_2_grasp.tsk
- Turn (hand): 22a4_scooter_lateral-mirror_3_turn.tsk
- Look (eyes): -

Use case 2.4.a.2 Navigation | Task: Riding a Powered Two Wheeler | Subtask: Navigating 2 | Simulation model file: 24a2_scooter_navigation-2.aan
- Look (eyes): 24a2_scooter_navigating-2_1_look.tsk
- Reach (hand): -
- Push (hand): 24a2_scooter_navigating-2_3_push.tsk

All files are provided in the archive VERITAS_D2.2.1_RAMSIS-simulation-model-files.zip in the folder of this deliverable on the ftp server (/Deliverables/D2.2.1).


5 Tools requirements
5.1 Overview of tool requirements
Existing physical models applied to simulate motor and vision impairment across various application domains have been reviewed and categorised. An overview is presented in Table 11. The following four methodologies, which have been commonly used for human physical modelling in the automotive application domain, are to be employed. Details of these methodologies are described in the following sections:

Virtual reality (VR, described below)

Multibody methodology (MBS, Section 5.1.1)

Finite element method (FEM, Section 5.1.2)

Video motion capture (MC, Section 5.1.3)

Once the methodologies have been defined, the most important modelling frameworks are analyzed. Close attention should be paid to the state of the art in human physical models for disabled and elderly people whose ability to control and handle a vehicle is affected by functional decline. The tools developed in VERITAS should be able to simulate the reduced range of limb movement, the reduced push and traction force, and the reduced visual acuity, so as to give designers and developers the opportunity to gain a better understanding of impaired mobility and vision and to assess whether the design is accessible to the targeted users.

Table 11 - Overview of the existing physical models for motor and vision impairment


Virtual reality models (VR)


Virtual reality is a term that applies to computer-simulated environments that can simulate
places in the real world as well as in imaginary worlds. Within the context of the simulation of
motor impairment, Virtual Reality is often used to describe a wide variety of applications
commonly associated with immersive, highly visual, 3D environments. In the context of this
activity, existing X3D and HANIM physical models and their application potential within VERITAS have been studied as well.

5.1.1 Multibody methodology (MBS)


A multibody system is used to model the dynamic behavior of interconnected rigid or flexible
bodies, each of which may undergo large translational and rotational displacements.

The systematic treatment of the dynamic behavior of interconnected bodies has led to a large number of important multibody formalisms in the field of mechanics. The simplest bodies or elements of a multibody system were already treated by Newton (free particle) and Euler (rigid body). Euler already introduced reaction forces between bodies. Later on, a series of formalisms have been derived, to mention only Lagrange's formalism based on minimal coordinates and a second formulation that introduces constraints.

Basically, the motion of bodies is described by their kinematic behavior. The dynamic behavior results from the equilibrium of the applied forces and the rate of change of the momentum. Nowadays, the term multibody system is related to a large number of engineering fields of research, especially the biomechanics field. As an important feature, multibody system formalisms usually offer an algorithmic, computer-aided way to model, analyze, simulate and optimize the arbitrary motion of possibly thousands of interconnected bodies.

While single bodies or parts of a mechanical system are studied in detail with finite
element methods, the behavior of the whole multibody system is usually studied with
multibody system methods. In fact a body is usually considered to be a rigid or flexible part
of a mechanical system (not to be confused with the human body). An example of a body is
the arm of a robot, a wheel or axle in a car or the human forearm. A link is the connection of
two or more bodies, or a body with the ground. The link is defined by certain (kinematical)
constraints that restrict the relative motion of the bodies. Typical constraints are:

spherical joint; constrains relative displacements in one point, relative rotation is


allowed; implies 3 kinematical constraints

revolute joint; only one relative rotation is allowed; implies 5 kinematical


constraints; see the example above

prismatic joint; relative displacement along one axis is allowed, constrains relative
rotation; implies 5 kinematical constraints

There are two important terms in multibody systems: degree of freedom and constraint
condition.


The degrees of freedom denote the number of independent kinematical possibilities to move.
A rigid body has six degrees of freedom in the case of general spatial motion, three of them
translational degrees of freedom and three rotational degrees of freedom. In the case of
planar motion, a body has only three degrees of freedom with only one rotational and two
translational degrees of freedom.

The degrees of freedom in planar motion can be easily demonstrated using e.g. a computer
mouse. The degrees of freedom are: left-right, up-down and the rotation about the vertical
axis.

A constraint condition implies a restriction in the kinematical degrees of freedom of one or


more bodies. The classical constraint is usually an algebraic equation that defines the
relative translation or rotation between two bodies. There are furthermore possibilities to
constrain the relative velocity between two bodies or a body and the ground. This is for
example the case of a rolling disc, where the point of the disc that contacts the ground has
always zero relative velocity with respect to the ground. In the case that the velocity
constraint condition cannot be integrated in time in order to form a position constraint, it is
called non-holonomic. This is the case for the general rolling constraint. In addition to that
there are non-classical constraints that might even introduce a new unknown coordinate,
such as a sliding joint, where a point of a body is allowed to move along the surface of
another body. In the case of contact, the constraint condition is based on inequalities and
therefore such a constraint does not permanently restrict the degrees of freedom of bodies.
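
To make these two notions concrete, the following small sketch (an illustrative example only, not part of the VERITAS tools) counts the remaining degrees of freedom of a spatial multibody system from the number of moving bodies and the kinematical constraints implied by the joints listed above; this simple count assumes that no constraints are redundant:

# Kinematical constraints implied by common joints in spatial motion, as listed above.
JOINT_CONSTRAINTS = {
    "spherical": 3,   # constrains relative displacements in one point
    "revolute": 5,    # only one relative rotation is allowed
    "prismatic": 5,   # only one relative translation is allowed
}

def degrees_of_freedom(n_moving_bodies, joints):
    # Each free rigid body contributes 6 degrees of freedom in spatial motion;
    # every joint removes the number of degrees of freedom it constrains.
    return 6 * n_moving_bodies - sum(JOINT_CONSTRAINTS[j] for j in joints)

# Example: a two-link arm connected to the ground and between the links by
# revolute joints keeps 6*2 - 5 - 5 = 2 degrees of freedom.
print(degrees_of_freedom(2, ["revolute", "revolute"]))  # -> 2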

The equations of motion are used to describe the dynamic behavior of a multibody system. Each multibody system formulation may lead to a different mathematical appearance of the equations of motion, while the underlying physics is the same. The motion of the constrained bodies is described by means of equations that result basically from Newton's second law. The equations are written for the general motion of the single bodies with the addition of constraint conditions. Usually the equations of motion are derived from the Newton-Euler equations or from Lagrange's equations.

The motion of rigid bodies is described by means of

M(q) q̈ − Qv + Cq^T λ = F (Equation 1)

C(q, q̇) = 0 (Equation 2)

This type of equations of motion (Equation 1 & Equation 2) is based on so-called redundant coordinates, because the equations use more coordinates than there are degrees of freedom of the underlying system. The generalized coordinates are denoted by q, the mass matrix is represented by M(q), which may depend on the generalized coordinates. C represents the constraint conditions and the matrix Cq (sometimes termed the Jacobian) is the derivative of the constraint conditions with respect to the coordinates. This matrix is used to apply constraint forces to the corresponding equations of the bodies. The components of the vector λ are also denoted as Lagrange multipliers. In a rigid body, the coordinates q can be split into two parts,

q = [u, Ψ]^T (Equation 3)

where u represents the translations and Ψ describes the rotations.

In the case of rigid bodies, the so-called quadratic velocity vector Qv is used to describe the Coriolis and centrifugal terms in the equations of motion. The name reflects the fact that Qv includes quadratic terms of the velocities; it results from partial derivatives of the kinetic energy of the body.

The Lagrange multiplier λi is related to a constraint condition Ci = 0 and usually represents a force or a moment, which acts in the direction of the constrained degree of freedom. The Lagrange multipliers do not perform "work", as opposed to external forces that change the potential energy of a body.
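
As a minimal, self-contained illustration of Equations 1 to 3 (a sketch with assumed parameter values, not one of the VERITAS simulation tools), the following code integrates a planar pendulum written in the redundant coordinates q = (x, y) with the single constraint C(q) = x^2 + y^2 - L^2 = 0. At every time step the accelerations and the Lagrange multiplier are obtained from the augmented linear system formed by the mass matrix and the constraint Jacobian; Baumgarte terms are added to keep the constraint from drifting during the numerical integration:

import numpy as np

# Pendulum parameters: point mass m on a massless rod of length L under gravity g.
m, g, L = 1.0, 9.81, 1.0

def accelerations(q, qd):
    # Solve [M, Cq^T; Cq, 0] [qdd; lambda] = [F; rhs] for the accelerations and multiplier.
    M = m * np.eye(2)
    F = np.array([0.0, -m * g])                 # applied (gravity) force
    Cq = np.array([[2.0 * q[0], 2.0 * q[1]]])   # constraint Jacobian of C = x^2 + y^2 - L^2
    C = q[0] ** 2 + q[1] ** 2 - L ** 2
    Cdot = 2.0 * (q[0] * qd[0] + q[1] * qd[1])
    # Differentiating C twice gives Cq * qdd = -2*(xd^2 + yd^2); the two extra terms
    # are Baumgarte stabilization of the position and velocity constraint errors.
    alpha, beta = 10.0, 10.0
    rhs = -2.0 * (qd[0] ** 2 + qd[1] ** 2) - 2.0 * alpha * Cdot - beta ** 2 * C
    A = np.block([[M, Cq.T], [Cq, np.zeros((1, 1))]])
    b = np.concatenate([F, [rhs]])
    sol = np.linalg.solve(A, b)
    return sol[:2], sol[2]                      # (accelerations, Lagrange multiplier)

# Semi-implicit Euler integration starting from a horizontal position at rest.
q, qd, dt = np.array([L, 0.0]), np.zeros(2), 1.0e-3
for _ in range(2000):
    qdd, lam = accelerations(q, qd)
    qd = qd + dt * qdd
    q = q + dt * qd

print("position:", q, "last multiplier:", lam, "constraint violation:", q @ q - L ** 2)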

5.1.2 Finite element Method (FEM)


The finite element method (FEM) (its practical application often known as finite element
analysis (FEA)) is a numerical technique for finding approximate solutions of partial
differential equations (PDE) as well as of integral equations. The solution approach is based
either on eliminating the differential equation completely (steady state problems), or
rendering the PDE into an approximating system of ordinary differential equations, which are
then numerically integrated using standard techniques such as Euler's method, Runge-
Kutta, etc.

In solving partial differential equations, the primary challenge is to create an equation that
approximates the equation to be studied, but is numerically stable, meaning that errors in the
input and intermediate calculations do not accumulate and cause the resulting output to be
meaningless. There are many ways of doing this, all with advantages and disadvantages.
The Finite Element Method is a good choice for solving partial differential equations over
complicated domains (like cars and oil pipelines), when the domain changes (as during a
solid state reaction with a moving boundary), when the desired precision varies over the
entire domain, or when the solution lacks smoothness. For instance, in a frontal crash
simulation it is possible to increase prediction accuracy in "important" areas like the front of
the car and reduce it in its rear (thus reducing cost of the simulation).
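
As a compact, self-contained illustration of the method (again a sketch only, unrelated to the VERITAS tool chain), the following code assembles linear finite elements for the one-dimensional Poisson problem -u'' = 1 on the interval [0, 1] with u(0) = u(1) = 0 and solves the resulting linear system; the result can be checked against the exact solution u(x) = x(1 - x)/2:

import numpy as np

def fem_poisson_1d(n_elements):
    # Linear finite elements for -u'' = 1 on [0, 1] with homogeneous Dirichlet conditions.
    n_nodes = n_elements + 1
    x = np.linspace(0.0, 1.0, n_nodes)
    h = x[1] - x[0]
    K = np.zeros((n_nodes, n_nodes))    # global stiffness matrix
    f = np.zeros(n_nodes)               # global load vector
    ke = (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])   # element stiffness matrix
    fe = (h / 2.0) * np.array([1.0, 1.0])                   # element load vector (f = 1)
    for e in range(n_elements):
        dofs = [e, e + 1]
        K[np.ix_(dofs, dofs)] += ke
        f[dofs] += fe
    # Impose u(0) = u(1) = 0 by solving only for the interior nodes.
    interior = np.arange(1, n_nodes - 1)
    u = np.zeros(n_nodes)
    u[interior] = np.linalg.solve(K[np.ix_(interior, interior)], f[interior])
    return x, u

x, u = fem_poisson_1d(10)
print("max error vs exact solution:", np.max(np.abs(u - 0.5 * x * (1.0 - x))))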

5.1.3 Video Motion Capture (MC)


Motion capture, motion tracking, or mocap are terms used to describe the process of
recording movement and translating that movement onto a digital model. It is used in
military, entertainment, sports, and medical applications. In filmmaking it refers to recording
actions of human actors, and using that information to animate digital character models in
2D or 3D computer animation. When it includes the face and fingers or captures subtle
expressions, it is often referred to as performance capture.

In motion capture sessions, movements of one or more actors are sampled many times per
second, although with most techniques (recent developments from Weta use images for 2D
motion capture and project into 3D) motion capture records only the movements of the actor,
not his/her visual appearance. This animation data is mapped to a 3D model so that the
model performs the same actions as the actor. This is comparable to the older technique of
rotoscoping, such as the 1978 "The Lord of the Rings" animated film, where the motion of an
actor was filmed and the footage was then used as a guide for the frame-by-frame motion of a
hand-drawn animated character. Camera movements can also be
motion captured so that a virtual camera in the scene will pan, tilt, or dolly around the stage
driven by a camera operator, while the actor is performing and the motion capture system
can capture the camera and props as well as the actor's performance. This allows the
computer generated characters, images and sets, to have the same perspective as the video
images from the camera. A computer processes the data and displays the movements of the
actor, providing the desired camera positions in terms of objects in the set. Retroactively
obtaining camera movement data from the captured footage is known as match moving or
camera tracking.

5.2 RAMSIS tool


The 3D CAD manikin RAMSIS (Figure 5-1) is a simulation software program for a wide
range of design and construction analyses. RAMSIS addresses demands on ergonomics,
comfort and safety as early as the planning stage. The system captures and analyzes
physical body dimensions just as dependably as the space, force and vision
requirements of a target group.

Figure 5-1 - 3D CAD manikin RAMSIS

RAMSIS creates a manikin for the detailed simulation of vehicle occupants. To do this, the
virtual human being is positioned in a CAD environment which conforms exactly to the future
vehicle in space and design. This forms the basis for precise analyses. In order to design the
vehicle for the complete physical spectrum of future users, RAMSIS works with
anthropometric scaling and changes smoothly between male and female models. Task-related
postures and motions and individual motion sequences are calculated automatically.
The user sees the vehicle interior as RAMSIS sees it. Thus outward visibility and the view of
each individual instrument can be optimally evaluated.

The human model RAMSIS is based on anthropometrical and kinematical accessibility


simulation technologies, which require a set of specific input parameters and provide task
specific output parameters. Since the principal technologies should not be modified in
VERITAS, the identified simulation aspects have to be addressed by these basics.
Additionally the RAMSIS simulation input and output parameters should be compatible with
the needs of the simulation aspects. In particular the following application specific RAMSIS
tool requirements have to be fulfilled:

Posture Prediction

The prediction of task specific postures in an environment should be based on the
following method (Figure 5-2). The static task is transferred into geometrical
constraints for body elements of the manikin. A kinematical model of the manikin is
driven by an optimization method (inverse kinematics) to calculate the joint angles of a
posture which fulfils the geometrical constraints. The optimization method guarantees
that the calculated joint angles stay within given anatomical limits and are close to
a task specific preferred reference posture; a minimal illustrative sketch of this principle
is given after Figure 5-2.

Figure 5-2 - Posture prediction method (tasks such as sitting, grasping the steering wheel or pressing a pedal are translated into input parameters for the probability based posture prediction and subsequent analyses)

Task Analysis

The principal task analysis process should be valid (Figure 1-1). For a given vehicle
cabin design, a static task (for example driving, closing the door, pulling hand brake)
and a set of body dimensions (for example stature, sitting height) the posture of the
manikin is calculated and visualized. Afterwards the posture is analyzed with respect
to the task constraints and design geometry (for example distances, collision,
clearance).

Vision Analysis

The vision analysis should be performed on a given posture and manikin size.
Depending on the head and eye orientation different vision areas in the environment
are calculated and analyzed, in particular areas of preferred view or of obscured view
(Figure 5-3).

Figure 5-3 - Vision Analysis

Force Analysis

The force analysis should be performed on a given static posture, manikin size,
gender and age (group). For a given direction the maximal possible force is
calculated based on measured human maximum joint torque data (Figure 5-4).

Figure 5-4 - Force Analysis

Anthropometry Analysis


Since a RAMSIS manikin can be scaled according to 22 specific body dimensions,
this set of body dimensions or a non-empty subset of it has to be provided in the
application. Missing body dimension values are calculated from the given body
dimension values by statistical models.

Figure 5-5 - RAMSIS body dimensions

The 22 specific RAMSIS dimensions (Figure 5-5) are:

Pelvis & Hip Width

Chest Width & Depth

Foot Width, Height & Length

Knee Height Sitting & Buttock-Knee Length

Head Width, Height & Length

Body Height & Sitting Height

Upper Arm Length & Forearm Length with Hand

Bideltoid Shoulder Width

Waist Circumference

Forearm & Upper Arm Circumference

Calf & Thigh Circumference

5.3 LMS Virtual.Lab tool


For vibrational comfort evaluations for vehicles or Powered Two-Wheelers (PTW, i.e.
motorcycles), three main techniques are distinguished in the literature to develop virtual
human models:


Linear models ([4]): these models are easy to implement and can be solved in
real-time, but, due to their simplicity, they cannot handle non-linearities of the
contact between the dummy and the environment (either PTW or car) and the
different posture of the subject.

Finite Element (FE) models ([5], [6]): these models can predict the local
behaviour of the contact tissue-environment. A disadvantage is the typical size of
FE models that results in large computational efforts. Moreover, the FE models
have many parameters and material properties to be specified (for which
accurate data is not always available).

MBS models ([6], [7], [8]): these represent the state-of-the-art in this sector. The
MBS models offer an efficient and effective modeling & simulation strategy: quite
efficient and not too complex to set up (as compared to the FE virtual dummy
models), and sufficiently accurate to predict the global behaviour of the
rider/driver due to whole body vibrations. The multibody simulation (MBS)
methodology can be considered an important tool for virtual prototyping, which
is very useful to substitute expensive test-rigs and to drastically reduce the time-to-market
of new products. In the development engineering process of
road transportation systems (vehicles, motorcycles, etc.), one can apply MBS
simulations to predict and optimize the dynamics, handling, safety and ride
comfort, all of which have become key strategic factors for successful product
positioning in the market.

Within LMS, a three dimensional model of a virtual human dummy is proposed to represent
the biomechanical response due to whole body vibration. The model has been developed
using the MBS modeling & simulation environment LMS Virtual.Lab Motion, taking into
account a detailed spine sub-assembly in order to accurately evaluate the human frequency
response in the entire range of interest of the whole body vibration. The model has been
completely parameterized, and it can be set up automatically, for instance to define the
percentile of the dummy (height and weight) and the initial position. Special attention has
been paid to the posture of the dummy, which can assume three different configurations:
standing position, driver position (representing a car occupant) and rider position. The
model can be used for both frequency-domain and time-domain analysis; in the frequency
domain, the dummy can be used to compute human vibrational modes and transmissibility
functions; in the time domain, it can be used for the estimation of the vertical transmissibility
and the comfort assessment against the ISO 2631-1 for the whole body vibration analysis.
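A simple numerical sketch of the basic evaluation quantity behind such an ISO 2631-1 assessment is given below: the root-mean-square value of the frequency-weighted acceleration at the human/vehicle interface. The frequency weighting is assumed to have been applied beforehand (for example with the standard Wk weighting filter); the signal, amplitude and sampling values are purely illustrative.

```python
# Minimal sketch (illustration only): r.m.s. value of an already frequency-weighted
# acceleration signal, the basic quantity used in ISO 2631-1 comfort evaluation.
import numpy as np

def weighted_rms(a_weighted):
    """Frequency-weighted r.m.s. acceleration [m/s^2] from uniformly sampled values."""
    return float(np.sqrt(np.mean(np.asarray(a_weighted) ** 2)))

# Example with a synthetic, already weighted 4 Hz vertical signal of amplitude 0.5 m/s^2
t = np.arange(0.0, 10.0, 0.001)
a_w = 0.5 * np.sin(2 * np.pi * 4.0 * t)
print(weighted_rms(a_w))   # approximately 0.35 m/s^2
```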

To allow considering different rider and driver models, the multibody model has been
parameterized to different anthropometric characteristics. Two relations have been used to
allow automating this process: one for the percentile of the size and one for the percentile of
the mass ([9]). Here, it is noted that dummy models are related to the human population with
a percentile measure. For height, the p-th percentile dummy denotes a dummy that is taller
than p percent of the human population; for weight, the p-th percentile dummy is heavier than
p percent of the human population.


Figure 5-6 - LMS Virtual Dummy

The relation for the size, valid for human subjects between the 20th and 80th percentile, is given by

h = 1.7515 + (11/6000) (perc − 20)  [m]     Equation 4

where h denotes the height of the dummy and perc represents the percentile of the dummy.

The correlation for the variation of the weight (valid between the 5th and 95th percentile) is given by

m = 64.21 + (22/90) (perc − 5)  [kg]     Equation 5

where m denotes the weight of the dummy and perc represents the percentile of the dummy.
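The sketch below scripts these two relations in the way the automatic parameterization might use them; it is written in Python purely for illustration, whereas the actual model update in LMS Virtual.Lab is driven by the VBA script mentioned in the following paragraph.

```python
# Minimal sketch of the dummy scaling relations (Equations 4 and 5), illustration only.
def dummy_height(perc):
    """Height in metres, valid for percentiles between 20 and 80 (Equation 4)."""
    if not 20 <= perc <= 80:
        raise ValueError("height relation is only valid between the 20th and 80th percentile")
    return 1.7515 + 11.0 / 6000.0 * (perc - 20)

def dummy_mass(perc):
    """Mass in kilograms, valid for percentiles between 5 and 95 (Equation 5)."""
    if not 5 <= perc <= 95:
        raise ValueError("mass relation is only valid between the 5th and 95th percentile")
    return 64.21 + 22.0 / 90.0 * (perc - 5)

# Example: parameters for a 50th-percentile dummy
print(dummy_height(50), dummy_mass(50))
```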

To obtain a straightforward and fast control of the model parameters, it has been ensured
that one can automatically change the settings of the entire MBS model by using a VBA
script developed for this purpose, based on the above-mentioned relations (Equations 4 and
5); the user can make use of Graphical User Interface (GUI) windows, in which the value of
the percentile can be changed to update the model definition. Once the dummy has been
created and parameterized, special attention has been paid to its posture, which can
assume three different configurations: standing position, driver position (representing a car
occupant) and rider position (representing a motorcycle user, i.e. a rider model oriented to
the Powered Two Wheelers industry).

5.3.1 Vibrational analysis methodology


In order to evaluate how the vibrations are transmitted from the environment to the occupant,
the definition of the MBS manikin alone is not sufficient, and further parameters have to be
described.

Dummy percentile and position: The vibration transmission chain and the
absorbed vibration dose are strongly influenced by the posture of the occupant
and by the dummy percentile. Therefore, a detailed set of angles and
inertial parameters must be identified for each analysis.

Vehicle parameters: within the transmissibility analysis, the vibrations are mainly
transmitted to the human body through the contact points between manikin and
vehicle. In the car environment, these points are the seat surface, the seat back,
the pedals and the steering wheel, while in the motorcycle environment, they are
represented by the handlebar, the footrests and the saddle.

5.4 Lightning tool


Figure 5-7 - Structure of Lightning tool (VR application scripts and data on top of the Lightning kernel with its Tcl interpreter, class factory, modules and simulation pool of dataflow objects, behavior routes and cluster replication/synchronization, built on the Lightning library)

The Lightning virtual reality system was first presented in 1996 [10] as a rapid prototyping
tool for VR applications, particularly in the architectural and presentation domain. Since then
it has evolved into a mature tool for interactive engineering environments, where prototyping of
3D user interfaces becomes an important issue.

Its core component is a database called object pool. Application developers define the
environment merely by creating or deleting objects within this database. The interactivity and
the behavior can be introduced with function objects and communication channels, so-called
routes. These communication channels define application specific event propagation and
therefore the interactivity. A basic object type is provided with several properties:

Unified interface: slots, so-called fields, provide a standardized interface for the event
propagation module.

A uniform update function, which changes the internal state according to the data
at the input fields and the internal state of the object.

The update function is executed only if the input data has changed or if the object
has been handed to a specialized administration module, which always calls the
update function. This is a generalized implementation of the so-called sensor type
in VRML 2.0 [11].

Objects have a certain internal linkage, which connects them to the Abstract Device Layer.
System or application developers can extend the system simply by providing a new object
with the described field interface. Generic render devices exist for the visual and audio
renderers. The specific implementation is filled in via inheritance; in the case of the visual
renderer, the actual implementation is based on the OPENSCENEGRAPH graphical
render library. The specific OPENSCENEGRAPH renderer links the visual scene graph to
Lightning objects and reports changes of the geometry of a visual object to the underlying
library and, as a result, to the graphics device hardware.


5.4.1 Event propagation

Figure 5-8 - Object pool and dataflow in Lightning (a sensor object feeding a directed graph of linked objects)

Objects within the pool and their links form a directed graph (link graph) which is evaluated
at each simulation step. Starting with the output from sensor objects, information is propagated
along the links. The behavior description code of a behavior object node is re-executed as soon
as all nodes prior to this node (along the path of links) have produced new output or at least had
the chance to do so. Once all graphs have been evaluated, the procedure repeats. Output is
defined by objects such as cameras, which are part of the object pool but are sampled
asynchronously by the various renderers.
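The following conceptual sketch (in Python, purely for illustration; it is not Lightning's actual API) mimics this field/route dataflow: objects expose named fields, routes copy values between fields at each simulation step, and an object's update function is only re-executed when one of its input fields has received new data.

```python
# Conceptual sketch of a field/route dataflow with update-on-change (illustration only).
class Node:
    def __init__(self, **fields):
        self.fields = dict(fields)    # field name -> value
        self.dirty = True             # new input data available?

    def set_field(self, name, value):
        if self.fields.get(name) != value:
            self.fields[name] = value
            self.dirty = True

    def update(self):                 # overridden by concrete behavior objects
        self.dirty = False

class Scale(Node):
    """Example behavior object: out = gain * in."""
    def update(self):
        self.fields["out"] = self.fields["gain"] * self.fields["in"]
        self.dirty = False

def simulation_step(nodes, routes):
    """One evaluation of the link graph: update changed nodes, then propagate along routes."""
    for node in nodes:
        if node.dirty:
            node.update()
    for (src, src_field, dst, dst_field) in routes:
        dst.set_field(dst_field, src.fields[src_field])

# Example: a "sensor" value is scaled and routed into a second object over a few frames.
sensor = Node(value=0.0)
scaler = Scale(**{"in": 0.0, "gain": 2.0, "out": 0.0})
sink = Node(value=0.0)
routes = [(sensor, "value", scaler, "in"), (scaler, "out", sink, "value")]
sensor.set_field("value", 1.5)
for _ in range(3):
    simulation_step([sensor, scaler, sink], routes)
print(sink.fields["value"])           # 3.0
```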


Figure 5-9 - Execution of Lightning applications (per simulation step/frame: gather and filter user input, compute the new system state in Lightning, and present the updated state, which becomes the input for the next frame)

5.4.2 Multilanguage/multiparadigm programming


Lightning does not rely on a specific language but supports a heterogeneous, mixed
language approach. That is, behavior objects written in, for instance, C++ can be freely
replaced by objects written in an interpreted language and vice versa, even at runtime.

So far, the Tcl programming language [8] has been used, as it is easy to integrate into other
systems and, being a free software package, has many resources and extensions. With its
self-evaluating property (the "eval" command), it also has an expressive power similar to the
"lambda" of Lisp or other AI languages. This enables a compact description of high-level
behavior.

5.4.3 Superposition of virtual space and user space


One basic approach to designing a projection model is to achieve an exact match of virtual
and physical space. That is, the apparent size of an object should be independent of the
projection system: the handle of a teapot would have exactly the same size whether seen on a
CRT screen or on a wall projection. Consequently, the user can interact naturally with the
teapot without the need to scale the geometry model. In Lightning the projection system is
specified simply by providing a geometric model (with exact physical dimensions) of the
projection environment. With this data, the projection is automatically configured by the
system. Calibration of the location measurements is also an important factor for exact spatial
matching; in Lightning this feature is provided at the device module level.


5.4.4 Extensibility
A key feature of Lightning is its extensibility. All application objects are accessible
via the Tcl interface. Behavior scripts can be developed to define the functionality of
applications. New applications can be built completely on this layer. On the core system
layer we use a combination of object oriented techniques and operating system features for
decoupling the modules. Application objects inherit the interface and communication
properties from a base class. The communication process operates only on base class
features. The configuration and initialization communication is strictly string based to enable
run time coupling. All Lightning system libraries are so-called shared objects, which are
linked at runtime.

This feature is used primarily for ease of maintenance. Application objects are by default
shared objects and can be accessed immediately by the Tcl interface without coding or
recompiling. This is a major advantage for the extensibility of the system. Application objects
can easily be developed; only common system interfaces have to be included. The
extension on C++ level is also independent of static linking with the system.


6 Integrated environment simulation specifications


This section includes a high level specification of the feasible tools addressing the identified
use cases. These specifications are the basis for the more detailed simulation models
provided in this deliverable (D2.2.1).

6.1 Use case and feasibility analysis


The use case analysis in Section 3.1 is continued in order to identify and specify the relevant
simulation methods. In particular, the use case specific parameters actor, action and design
question (Figure 3-1) are analyzed to derive the corresponding human simulation parameters
(e.g. joint strength and mobility) and accessibility simulation methods (e.g. posture and force
simulation) (Figure 6-1). Finally, the tool is identified that is provided by a partner in this
work package and can handle the required human simulation parameters and
accessibility simulation methods.

Figure 6-1 - Use case and feasibility analysis process (example: hand brake operation; the actor, e.g. healthy or with limited force capability or mobility, the action, e.g. pulling or pushing the hand brake lever, and the design questions, e.g. is the lever reachable and movable, are mapped to human simulation parameters such as joint strength and joint mobility and to accessibility simulation methods such as posture and force simulation)

The results for the desktop application use cases are given in Table 12, as a continuation of the use case analysis results in Section 3.1.

Table 12 - Use case and feasibility analysis results (desktop application)


Scenario | Use Case (D1.7.1) | Design Question | Impairments | Tool | Human Simulation Parameter
Car interior | 2.1.a.1 Central rear view mirror tuning | Reachability | Upper limb movement limitations | RAMSIS | Joint range limits
Car interior | 2.1.a.2 Lateral mirror tuning | Reachability | Upper limb movement limitations | RAMSIS | Joint range limits
Car interior | 2.1.a.3 Hand brake activation/deactivation | Reachability | Upper limb movement limitations | RAMSIS | Joint range limits
Car interior | 2.1.a.3 Hand brake activation/deactivation | Force capability | Upper limb traction/pushing force limitations | RAMSIS | Joint strengths
Car interior | 2.1.a.4 Gear changing (manual/automatic) | Reachability | Upper limb movement limitations | RAMSIS | Joint range limits
Car interior | 2.1.a.5 Accessing interior storage compartments | Reachability | Upper limb movement limitations | RAMSIS | Joint range limits
Motorcycle | 2.2.a.1 Riding position | Adequate position | Upper limb impairment | RAMSIS | Joint range limits
Motorcycle | 2.2.a.2 Usage on bumpy road | Adequate articular response | Upper limb impairment | LMS Virtual.Lab | Joint range limits, joint stiffnesses
Motorcycle | 2.2.a.3 Central stand | Ability to place the vehicle on stand | Limb impairments, reduced force capabilities | VERITAS Core | Joint range limits, joint strengths
Motorcycle | 2.2.a.4 Motorcycle handling | Reachability, force capability, visibility | Upper limb impairment (flexibility, force), visual impairments | RAMSIS | Joint range limits, joint strengths, visual field limits
ADAS / IVIS | 2.3.a.1 On-board navigation system programming | Reachability, visibility | Upper limb limitations, limited head rotation, visual limitations (tunnel vision) | RAMSIS | Joint range limits, visual field limits
ADAS / IVIS | 2.3.a.2 Audio system programming and tuning | Reachability | Upper limb limitations | RAMSIS | Joint range limits
ADAS / IVIS | 2.3.a.3 Navigation information accessibility | Accessible informatory content | Low reaction time, visual acuity | RAMSIS | Visual acuity
ADAS / IVIS | 2.3.a.3 Navigation information accessibility | Support colour-blindness | Visual limitations | N.A. | N.A.
ADAS / IVIS | 2.3.a.3 Navigation information accessibility | Correct speech characteristics | Hearing limitations | N.A. | N.A.
ARAS / OBIS | 2.4.a.1 Collision Avoidance System (CAS) | Warning reaction time | Reduced reaction time | N.A. | N.A.
ARAS / OBIS | 2.4.a.1 Collision Avoidance System (CAS) | Character size | Visual acuity | RAMSIS | Visual acuity
ARAS / OBIS | 2.4.a.1 Collision Avoidance System (CAS) | Position in vision field | Visual acuity | RAMSIS | Visual acuity
ARAS / OBIS | 2.4.a.1 Collision Avoidance System (CAS) | Character size | Reduced peripheral field of vision | RAMSIS | Visual field limits
ARAS / OBIS | 2.4.a.1 Collision Avoidance System (CAS) | Reachability | Limited mobility | RAMSIS | Joint range limits
ARAS / OBIS | 2.4.a.2 Navigation | Correct speech characteristics | Hearing limitations | N.A. | N.A.

In the same way, the results for the immersive application use cases are given in Table 13, as a continuation of the corresponding use case analysis results in Section 3.1.

Table 13 - Use case and feasibility analysis results (immersive application)

Scenario | Use Case (D1.7.1) | Design Question | Impairments | Tool | Human Simulation Parameter
Car interior | 2.1.b.1 Central rear view mirror tuning | Reachability | Upper limb movement limitations | RAMSIS | Joint range limits
Car interior | 2.1.b.2 Lateral mirror tuning | Reachability | Upper limb movement limitations | RAMSIS | Joint range limits
Car interior | 2.1.b.3 Hand brake activation/deactivation | Reachability | Upper limb movement limitations | RAMSIS | Joint range limits
Car interior | 2.1.b.3 Hand brake activation/deactivation | Force capability | Upper limb traction/pushing force limitations | N.A. | N.A.
Car interior | 2.1.b.4 Gear changing (manual/automatic) | Reachability | Upper limb movement limitations | RAMSIS | Joint range limits
Car interior | 2.1.b.5 Accessing interior storage compartments | Reachability | Upper limb movement limitations | RAMSIS | Joint range limits
ADAS / IVIS | 2.3.b.1 On-board navigation system programming | Reachability, visibility | Upper limb limitations, limited head rotation, visual limitations (tunnel vision) | RAMSIS | Joint range limits, visual field limits
ADAS / IVIS | 2.3.b.2 Audio system programming and tuning | Reachability | Upper limb limitations | RAMSIS | Joint range limits
ADAS / IVIS | 2.3.b.3 Navigation information accessibility | Accessible informatory content | Low reaction time, visual acuity | RAMSIS | Visual acuity
ADAS / IVIS | 2.3.b.3 Navigation information accessibility | Support colour-blindness | Visual limitations | N.A. | N.A.
ADAS / IVIS | 2.3.b.3 Navigation information accessibility | Correct speech characteristics | Hearing limitations | N.A. | N.A.
ARAS / OBIS | 2.4.b.1 Collision Avoidance System (CAS) | Warning reaction time | Reduced reaction time | N.A. | N.A.
ARAS / OBIS | 2.4.b.1 Collision Avoidance System (CAS) | Character size | Visual acuity | RAMSIS | Visual acuity
ARAS / OBIS | 2.4.b.1 Collision Avoidance System (CAS) | Position in vision field | Visual acuity | RAMSIS | Visual acuity
ARAS / OBIS | 2.4.b.1 Collision Avoidance System (CAS) | Character size | Reduced peripheral field of vision | RAMSIS | Visual field limits
ARAS / OBIS | 2.4.b.1 Collision Avoidance System (CAS) | Reachability | Limited mobility | RAMSIS | Joint range limits
ARAS / OBIS | 2.4.b.2 Navigation | Correct speech characteristics | Hearing limitations | N.A. | N.A.

The details on the human simulation parameters and the accessibility simulation methods
are given in the next section for each identified tool integration platform.

6.2 Integration platform RAMSIS


6.2.1 Human simulation parameters
According to the use case feasibility results in Table 12 and Table 13, the following human
simulation parameters are required and can be provided by the human model RAMSIS:

Joint range limits

The kinematical model already provides joint specific anatomical limits for the joint
ranges of motion (Figure 6-2). They are used to ensure that manikin motions always
stay within anatomically reasonable limits. In the standard application the limits
correspond to healthy humans, but they can be reduced according to specific joint mobility
restrictions.

Figure 6-2 - Joint range limits

Joint strengths

Joint strength is currently modeled based on measured maximum joint torques of
human beings (Figure 6-3). These models take into account joint specific
characteristics, such as posture dependent strength and degrees of freedom, as well
as gender and age effects on strength. They are used to evaluate task specific
required forces and torques against the human force capabilities. In standard
applications the limits correspond to healthy humans, but can be scaled according to
specific strength restrictions.

Figure 6-3 - Human maximum torque data

Visual field limits

Visual limits are provided in two ways: first, the eye motion range and, second, the
visual gaze and perception areas relative to the head and eye position are assigned
specific limits (Figure 6-4). They are used to cluster the visual space according to
visibility capabilities. In standard applications the limits correspond to healthy humans,
but can be reduced according to specific visual restrictions.


Figure 6-4 - Eye motion limits and visual field (gaze, perception)

Visual acuity

The visual acuity is expressed as the reciprocal value of the size of the gap
(measured in arc minutes) of the smallest Landolt C that can be reliably identified.
It can be modelled geometrically as a cone with the corresponding opening angle at
the eye position, aligned with the line of vision. According to a user-defined acuity,
analysis symbols can be visualized at potential information device locations. They
can be used to check that character sizes are large enough to be recognized by
humans. The user-defined acuity can also be reduced according to age effects.

Figure 6-5 - Visual acuity
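The geometric relationship just described can be made concrete with the following minimal sketch (illustration only): the acuity value defines the opening angle of the acuity cone, from which a lower bound on character size at a given viewing distance follows. The factor of 5 between critical detail and character height is a common convention assumed here, not a value taken from this deliverable.

```python
# Minimal sketch of the visual acuity geometry: acuity = 1 / (Landolt C gap in arc minutes).
import math

def acuity_cone_angle_deg(acuity):
    """Full opening angle (in degrees) of the cone spanned by the smallest resolvable gap."""
    gap_arcmin = 1.0 / acuity            # gap size of the smallest recognizable Landolt C
    return gap_arcmin / 60.0             # arc minutes -> degrees

def min_character_height(acuity, distance_m, detail_factor=5.0):
    """Smallest recognizable character height (in metres) at a given viewing distance."""
    gap_rad = math.radians(acuity_cone_angle_deg(acuity))
    return distance_m * math.tan(detail_factor * gap_rad)

# Example: a dashboard symbol viewed from 0.8 m with normal acuity 1.0 and with a
# reduced (age- or impairment-related) acuity of 0.5
print(min_character_height(1.0, 0.8), min_character_height(0.5, 0.8))
```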


6.2.2 Accessibility simulation methods


According to the use case feasibility results in Table 12 and Table 13, the following
accessibility simulation methods are required and can be provided by the human
model RAMSIS:

Posture prediction subjected to mobility limits

The current posture prediction takes individual joint limits, joint value distributions and
preferred postures into account. The final predicted posture lies within the joint angle
limits and close to the preferred posture. This method can be applied straightforwardly
to limited eye and joint motion ranges and to restricted preferred postures (Figure 6-6).
Figure 6-6 - Posture prediction on joint range restrictions (limited eye motion range, limited joint ranges and restricted preferred postures enter the probability based posture prediction for tasks such as sitting, grasping the steering wheel or pressing a pedal)

Posture prediction subjected to new applications and human parameters

A more sophisticated but extensive way of adapting the posture prediction to new
applications (e.g. scooter) and human parameters (e.g. age) is the consideration of
these conditions in the posture model, which is the technological base for the
accessibility assessment in Figure 1-1. In order to implement a new posture model,
posture experiments have to be performed which address the specific application
and human parameters. The posture results are used to generate the posture
distribution functions and the neutral (reference) posture of the posture model (Figure
6-7).


Figure 6-7 - Generation of posture distribution functions (joint angle distribution functions and the neutral posture derived for each joint angle)

A critical step in this generation process is the transformation of the real postures observed
in the experiments into postures modeled by a digital human model (e.g. RAMSIS). In
addition to marker-based optical systems (e.g. VICON), the posture modeling can be
done with markerless camera systems. In that process, standard 2D images of the
postures are taken and superimposed with the digital human model scaled to the
individual body dimensions of the test subject (Figure 6-8). An operator manually
articulates the human model posture until the manikin fits the image.

Figure 6-8 - Posture modeling from experiments (superimposing of the 3D RAMSIS model on the 2D image of the subject)

Joint force assessment

The assessment of forces is based on joint torque calculations and measured human
maximum joint torques depending on gender and age. Based on a given posture and
external forces, the joint loads are calculated as the ratio of joint torques to maximum
joint torques (Figure 6-9); a minimal sketch of this ratio is given after Figure 6-9. This
method can be applied to virtual users having reduced joint strengths due to age or
impairments.

Figure 6-9 - Joint force assessment (force calculation and joint loads from maximum forces and human joint torque data)
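The following minimal sketch (illustration only; the torque values are hypothetical) expresses the joint load as the ratio of the task-required joint torque to a maximum joint torque that can be scaled down for age- or impairment-related strength reductions.

```python
# Minimal sketch of the joint load ratio: required torque / (possibly reduced) maximum torque.
def joint_load(required_torque, max_torque_healthy, strength_scale=1.0):
    """Return the joint load ratio; values above 1.0 exceed the user's capability."""
    max_torque = max_torque_healthy * strength_scale
    return required_torque / max_torque

# Example: 18 Nm elbow flexion required by the task against a 60 Nm healthy maximum,
# evaluated for a healthy user and for a user with 40 % reduced strength
print(joint_load(18.0, 60.0))                        # 0.30
print(joint_load(18.0, 60.0, strength_scale=0.6))    # 0.50
```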

Posture assessment

The assessment of a given posture is based on the simple assumption that joint
rotations at or near the joint angle limits are less preferred than rotations in the middle
of the joint range. Hence, a rating of the joint angles with respect to the limits is calculated
(Figure 6-10). Finally, these ratings can be compared for different designs or for
users of different age or with disabilities, in order to analyze the influence of the design
or user parameters on the posture.

Figure 6-10 - Posture assessment (joint angle capacity rating over the joint range)

Vision assessment

The assessment of vision is provided by symbol size analysis and visual field
clustering (Figure 6-11). In the first case, optimal symbol sizes based on the human
visual acuity are displayed to provide benchmarks for the character design. In the
second case, the space is clustered into areas of direct vision (objects can be fixated
by the eyes) and of perceptional vision (objects can be perceived but not recognized)
in order to locate information devices in the vehicle. These methods can be applied
straightforwardly to user models subjected to reduced acuity or visual fields.

Figure 6-11 - Vision assessment (symbol size assessment from acuity data and visual field clustering from the visual field definition)


6.3 Integration platform LMS virtual lab


According to the use case feasibility results in Table 12 and Table 13, the following human
simulation parameters are required and can be provided by the human model LMS Virtual
Dummy.

In the framework of VERITAS, the LMS Virtual Dummy can be characterized to reproduce
different disabilities by tuning specific parameters, such as the joint rotational stiffness. The
results of the simulation will provide simple and relevant indications about how accessible a
vehicle is in terms of ergonomics and vibrational comfort. This is innovative for this field:
considerable experience exists at the experimental level, but simulation-based assessment is
still at an early stage. The LMS Virtual Dummy is complementary to the VERITAS core
simulation engine, since it addresses an additional aspect compared to that engine.

6.4 Integration platform Lightning

6.4.1 RAMSIS simulation core


The RAMSIS simulation core with several VERITAS related add-ons will be integrated into
Lightning as an additional module. In Lightning, the RAMSIS module is linked to the VERITAS
simulation core and to other modules such as the tracking system.

In this way, an immersive application based on Lightning, RAMSIS and the VERITAS
simulation core will be developed. Details of the architecture are described in the following
section on the VERITAS simulation core.

Lightning is used to connect the simulation cores of RAMSIS and VERITAS with input
devices and to provide the user interfaces. The user is tracked and directly provides the input
for the simulation cores. The posture of the RAMSIS model is calculated and Lightning
visualises the results of the simulation:

Input

o The digital human model of RAMSIS is scaled to the user's measurements

o The user is tracked and the tracking system provides tracking data for the
simulation core

Function

o The RAMSIS human model is moved by inverse kinematics.

o Joint angles are blocked at limits given by the VERITAS simulation core.


Output

o Human model appearance

o Visualisation of simulation

o Forces

o Visual acuity and visual fields can be visualized in virtual reality using
shaders, so that the user sees the virtual environment as people with
impairments might see the world. For instance, a limited field of view can be
simulated by displaying only the part of the virtual world inside the corresponding
viewing cone.

Mobility

o The kinematical model of the articulated digital human model is equipped with
corresponding joint angle ranges, so the simulated motions can be restricted
by the VERITAS simulation core.

o The human model will only perform those movements which match the limits
given by the VERITAS simulation core.

The RAMSIS human model is displayed in Lightning in two modes, nice skin and
standard skin:

Figure 6-12 - RAMSIS nice skin integrated in Lightning
Figure 6-13 - RAMSIS standard mode integrated in Lightning


6.4.2 VERITAS simulation core

The VERITAS simulation core has to be included into Lightning; to this end, the simulation
core is wrapped into Lightning. In this way, an immersive Virtual Reality platform is developed
based on Lightning and the VERITAS simulation core. This platform allows designers or
developers to experience a virtual simulation of their design and to interact with the
environment as a user with disabilities would.

Figure 6-14 - VERITAS simulation core integrated in Lightning (the virtual environment combines the core simulation (A2.1.1), the immersive simulation viewer (A2.1.2) and the interaction manager (A2.1.3), connected via the common VR integration interface (A2.4.3) and the application support and programming interface to the user/task models of SP1, the interaction tools of WP2.7 and the domain simulation models of SP2: automotive (WP2.2), smart living (WP2.3) and workplace (WP2.4) applications)

The immersive application will allow two modes:

Third person perspective:

o The user is an observer of the VERITAS manikin in VR.

o The manikin is under the control of the VERITAS core simulation with an
application-specific simulation model.


Figure 6-15 - Third person perspective

First person perspective:

o The user controls the VERITAS manikin directly using motion capture techniques
such as tracking of the hands and head.

o The manikin is under the user's direct control with minimal core simulation influence.

o The user's interaction is filtered by the VERITAS simulation core to simulate
disabilities.

Figure 6-16 - First person perspective

An immersive user interface has to be developed for the first and third person modes. The
immersive GUI allows loading VERITAS use cases and controlling the VERITAS manikin.


7 Specification of simulation methods / techniques


This section reports the simulation environment specifications of Section 6 in more detail in
order to prepare the software implementation.

7.1 Integration platform RAMSIS


Section 6.2.2 describes all simulation methods that are relevant in VERITAS for the
simulation platform RAMSIS. This section specifies in more detail those simulation methods
that are currently not available in RAMSIS and will be implemented during the project.

7.1.1 Posture prediction subjected to new applications and human parameters
In the automotive domain several use cases consider the design of scooters and
motorcycles. Since a posture model is not available for these applications, the corresponding
models are created during the project. In Section 6.2.2 the main steps of a posture model
generation are described:

1. Measurement of real postures on motorcycles / scooters

2. Modeling of postures in RAMSIS

3. Analysis of joint angle distributions

4. Integration in RAMSIS

In the following, these steps are described in more detail for the generation of a specific
scooter / motorcycle posture model in RAMSIS.

7.1.1.1 Measurement of real postures on motorcycles / scooters


In the posture measurement experiments the following configurations were defined:

Vehicles: 5 vehicles were selected, of which 3 were scooters and 2 motorcycles

Test subjects: 23 females and 25 males aged between 22 and 60 years

Tasks: riding position, mirror adjustment, foot on the ground, looking at the dashboard

The body dimensions of all subjects were measured, in particular including the RAMSIS
body dimensions given in Section 5.2. All subjects (i.e. 48 people) performed all tasks (i.e. 4
tasks) on all vehicles (i.e. 5 vehicles), and for every situation camera images were
simultaneously taken from the front, top and side view (i.e. 3 photos for each situation). In the
case of the riding task, three repetitions were considered so as to take into account the
variability of the posture (i.e. 3 repetitions x 3 views for the riding task).


In addition, the parameters of each camera were determined (Figure 7-1). The identified
parameters are used to set up the image superimposition described in the next
paragraph.

Figure 7-1 - Camera parameters (image width and height in pixels, vertical and horizontal camera opening angles, camera up/normal vector, 3D camera position and 3D center of the view plane in mm)
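The sketch below (illustration only; a simple pinhole camera model is assumed, and all numeric values are made up) shows how the camera parameters listed in Figure 7-1 can be used to project a 3D manikin point into 2D pixel coordinates, which is the geometric basis of the superimposition described in the next section.

```python
# Minimal pinhole-projection sketch using the camera parameters of Figure 7-1 (illustration only).
import numpy as np

def project_point(p_world, cam_pos, view_center, up, h_fov_deg, v_fov_deg, width, height):
    """Return (pixel_x, pixel_y) of a world point for a simple pinhole camera."""
    forward = view_center - cam_pos
    forward = forward / np.linalg.norm(forward)
    right = np.cross(forward, up); right /= np.linalg.norm(right)
    true_up = np.cross(right, forward)
    d = p_world - cam_pos
    x_c, y_c, z_c = d @ right, d @ true_up, d @ forward     # camera-frame coordinates
    x_n = x_c / (z_c * np.tan(np.radians(h_fov_deg) / 2.0)) # normalized image coordinates
    y_n = y_c / (z_c * np.tan(np.radians(v_fov_deg) / 2.0))
    return 0.5 * (x_n + 1.0) * width, 0.5 * (1.0 - y_n) * height

# Example: a knee joint position (mm) projected into a 1600 x 1200 side-view image
px = project_point(np.array([500.0, 300.0, 800.0]),
                   cam_pos=np.array([3000.0, 300.0, 800.0]),
                   view_center=np.array([0.0, 300.0, 800.0]),
                   up=np.array([0.0, 0.0, 1.0]),
                   h_fov_deg=60.0, v_fov_deg=46.0, width=1600, height=1200)
print(px)
```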

7.1.1.2 Modeling of postures in RAMSIS


Every image is displayed in a separate 3D work space from the camera position point of
view (Figure 7-2).

Figure 7-2 - Image import in 3D environment

In the next step a manikin is created according to the body dimensions of the subject, which
were taken during the experiments in Section 7.1.1.1. This manikin is embedded in the same
3D environment as the imported images and is displayed in every image specific work space
as illustrated in the next figure.


Figure 7-3 - Superimposed manikin in separate image work spaces

In a last step, this manikin is articulated joint by joint until the manikin posture matches
all three images sufficiently well. The posture angles are stored and provided for the analysis
in Section 7.1.1.3.

7.1.1.3 Analysis of joint angle distributions


In this step, the joint angles of each DOF are analyzed with respect to their frequency
distribution over the test subjects. In particular, the histograms for all DOFs are
calculated (Figure 7-4).

Figure 7-4 - Joint angle histogram (experimental joint angle distribution: relative frequency over joint angle)

Based on the joint angle histograms, a unimodal distribution function is fitted to the data. This
function is zero at the ends of the joint angle definition interval and has one maximum at the
joint angle value with the largest frequency (Figure 7-5).


Figure 7-5 - Joint angle distribution function (modelled distribution function compared with the experimental joint angle distribution)

The support points for the joint angle distribution functions of all joint DOF are determined
and provided for the integration into the RAMSIS human model in Section 7.1.1.4.
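As an illustration of this fitting step, the following minimal sketch (assumptions: a simple triangular, piecewise-linear shape and synthetic measurement data) derives the support points of a unimodal distribution function from a joint angle histogram.

```python
# Minimal sketch: support points of a unimodal, piecewise-linear distribution function that is
# zero at both ends of the joint angle interval and peaks at the most frequent bin.
import numpy as np

def fit_distribution(angles, lower, upper, n_bins=12):
    """Return support points (angle, value) of a triangular-shaped distribution function."""
    counts, edges = np.histogram(angles, bins=n_bins, range=(lower, upper))
    centers = 0.5 * (edges[:-1] + edges[1:])
    peak_angle = centers[np.argmax(counts)]      # mode of the experimental distribution
    peak_value = counts.max() / counts.sum()     # normalized peak frequency
    return [(lower, 0.0), (peak_angle, peak_value), (upper, 0.0)]

# Example with synthetic joint angle measurements (degrees) from a group of test subjects
rng = np.random.default_rng(0)
measured_angles = rng.normal(loc=5.0, scale=3.0, size=48)
print(fit_distribution(measured_angles, lower=-6.0, upper=16.0))
```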

7.1.1.4 Integration in RAMSIS


The human model RAMSIS provides a software interface structure to define posture models
supporting the posture simulation process in Figure 1-1. The support points for all joint angle
distribution functions derived in Section 7.1.1.3 are integrated into the corresponding
parameter files. As a consequence, the new posture model "scooter" can be activated in the
GUI.

7.1.2 Posture assessment


The accessibility of a given design is simulated in two stages (Figure 7-6). First, a task
specific posture is predicted using the experiment-based approach described in
Section 6.2.2. In a second step, the predicted posture is assessed with respect to visibility,
force, reachability and posture criteria.


Figure 7-6 - Design accessibility simulation (experimental preferred posture distributions and the neutral posture feed the posture prediction; the task specific posture is then assessed with respect to visibility, force, reachability and posture to derive the accessibility)

In VERITAS a new posture assessment approach is implemented. It is based on the
comparison of the task specific (imposed) posture and the preferred posture determined
from experiments (Figure 7-7).

Figure 7-7 - Comparison of task imposed and preferred posture (the deviation of the task specific posture from the optimal / preferred posture is determined)

This comparison is calculated on the normalized joint angle distributions of the related
posture model (Figure 7-8). The deviation of the task specific joint angle from the optimal /
preferred joint angle is rated by the value of the joint angle distribution function at the task
specific joint angle. The rating is close to 1 (best) when the deviation is small and close to 0
(worst) when the deviation is large. The weighting of the rating is given by the shape of the
joint angle distribution.

Figure 7-8 - Comparison of task imposed and preferred joint angle (the distribution function value at the task specific joint angle, e.g. 0.7, rates its deviation from the optimal / preferred joint angle between the lower and upper limits)

The joint DOF ratings are merged into a joint rating, the joint ratings are merged into a body
part rating, and the body part ratings are merged into a whole body rating by multi-linear
blending techniques. All body part ratings are displayed in the GUI as bar charts (Figure 7-9).

Figure 7-9 - Display of body part ratings
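The following minimal sketch (illustration only; a triangular distribution and a simple averaging merge are assumed in place of the actual distribution functions and multi-linear blending) shows the rating of individual joint angles and their aggregation into joint, body part and whole-body ratings. All numeric angle values are made up.

```python
# Minimal sketch of the posture rating: distribution value at the task-specific joint angle
# (1 = optimal, 0 = at the limits), merged upwards to joint and body part ratings.
def rating(angle, lower, preferred, upper):
    """Normalized triangular distribution value at the task-specific joint angle."""
    if not lower <= angle <= upper:
        return 0.0
    if angle <= preferred:
        return (angle - lower) / (preferred - lower)
    return (upper - angle) / (upper - preferred)

def merge(ratings):
    """Merge a list of ratings into one rating (simple average as a placeholder)."""
    return sum(ratings) / len(ratings)

# Example: two DOFs of an elbow joint and one DOF of a shoulder joint
elbow = merge([rating(40.0, 0.0, 70.0, 145.0), rating(10.0, -80.0, 0.0, 80.0)])
shoulder = merge([rating(35.0, -40.0, 20.0, 120.0)])
arm = merge([elbow, shoulder])     # body part rating
print(elbow, shoulder, arm)
```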

7.1.3 Vision assessment


The vision assessments are based on new functions developed in WP2.7 for the simulation
of physical interaction tools addressing visual impairments. The following functions are
already described in ID2.7.2 "Preliminary prototype of interaction tools", but they are
documented again here for the sake of clarity.

The vision assessment is controlled by the mask displayed in Figure 7-10.


Figure 7-10 - Vision control mask

The mask provides the following elements:

Option Acute Vision Field

This option activates the generation of the visual acuity cone with respect to the user
defined flare angle (Figure 7-11). The default value of the opening angle represents
the acuity of average people according to the literature [1, B-5.1.6] and can be
overwritten by specific values from the impairment metrics.

Option Binocular Visual Field

This option activates the generation of the visual field cone with respect to the user
defined horizontal and vertical opening angles (Figure 7-12). The default values of
the opening angles represent the optimal visual field of average people according to
the literature [1, B-5.1.2] and can be overwritten by specific values from the
impairment metrics.

Option Maximum Binocular Field

This option activates the generation of the visual field cone with respect to the user
defined horizontal and vertical opening angles (Figure 7-12). The default values of
the opening angles represent the maximal visual field of average people according to
the literature [1, B-5.1.2] and can be overwritten by specific values from the
impairment metrics.

Options Visual Field Representation

The visual fields can be generated as a cone (Figure 7-12, left) and as a circle in the
fixation plane perpendicular to the vision line (Figure 7-12, right), which can be
activated independently by these options.

Buttons


The mask functionalities are controlled by the following buttons.

o Create

This button generates the visual field objects according to the user defined
opening angles and options (Figure 7-11 and Figure 7-12).

o Reset to Default

This button resets all user defined opening angles to the initial default values.

o Close

This button closes the mask without further action.

Depending on the selected options the mask creates the acuity vision field as displayed in
Figure 7-11 and the visual field as displayed in Figure 7-12.

Figure 7-11 - Acute vision field

In Figure 7-12 the different visual field representations (cone, circle) are shown.

Figure 7-12 - Visual field, cone (left) and circle (right) representation

7.1.4 Simulation model execution


A simulation model consists of a set of tasks which have to be executed for a manikin and a
design in a specific order. Each task requires a posture prediction and a subsequent posture,
vision or force assessment (Figure 7-13).


Figure 7-13 - Simulation model execution (a use case / simulation model consists of tasks 1 to n; for each task a posture prediction is followed by posture, vision and force assessments, based on the design and the VERITAS user model)

The execution of the consecutive task specific posture predictions and assessments is
controlled by a software module. In particular the tasks are loaded from files in the specified
order and are automatically executed on the current VERITAS user model and design. The
calculated postures are used for the various assessments.

7.2 Integration platform LMS Virtual.Lab


For the purpose of assessing the ergonomics even before the availability of the first physical
prototype or test-rig, digital human models have been developed by manufacturers, thus
providing key input to the vehicle development at an early stage. Digital human models are
virtual models that can represent the behaviour of the human body from different
perspectives (kinematics, dynamics, ergonomics, crash, etc.).

A key functional performance aspect in motorcycle design and ergonomics analysis is the
assessment of vibrational comfort. Since exposure to vibrations may have adverse
effects on human health, the comfort performance of the final product must be carefully
evaluated and optimized in the virtual engineering process, in order to guarantee the comfort
of the product that finally hits the road. Short-term exposure may lead to
annoyance, temporary hearing threshold shift (temporary hearing loss), reduced motion
control, impaired vision, discomfort and fatigue ([1], [2], [3]). Long-term and extended
exposure to whole body vibration has been linked to chronic back pain, and its effects can be
even more serious.


7.2.1 Vibrational analysis methodology


In order to evaluate how vibrations are transmitted from the environment to the occupant, the
definition of the MBS manikin alone is not sufficient, and further parameters have to be
described.

Dummy percentile and position: The vibration transmission chain and the absorbed
vibration dose are strongly influenced by the posture of the occupant and by
the dummy percentile. Therefore, a detailed set of angles and inertial parameters must
be identified for each analysis.

Vehicle parameters: within the transmissibility analysis, the vibrations are mainly
transmitted to the human body through the contact points between manikin and vehicle. In
the car environment, these points are the seat surface, the seat back, the pedals and the
steering wheel, while in the motorcycle environment, they are represented
by the handlebar, the footrests and the saddle.
The following sections explain how the model is adapted to the automotive and motorcycle
simulation environments.

7.2.2 Car environment representation and transmissibility analysis

To represent the car occupant configuration, a vehicle interior interface has been
defined which consists of three main elements: the seat (composed of seat surface and
backrest), the pedals and the steering wheel (Figure 7-14). These elements have been
rigidly connected in order to transmit the vibration as a uniform signal. Note that the car
environment can only transmit vertical vibrations, which is not a severe limitation for engineering
design purposes, since the human comfort level is mainly affected by signals along the
vertical direction [6].

Figure 7-14 - Dummy in the driver position


For the car driver, the reference configuration that the dummy assumes has been taken from
the literature [7]. In line with the literature, the angle between thigh and leg has been set close to
117°, the angle between arm and forearm has been set to 150° and the angle between foot
and floor has been given a value of about 45°. The inclination of the backrest, which can be
adjusted by the occupant, has been offset by 15° with respect to the vertical plane.

The connection between the automotive environment and the human model has been made
in a way that replicates the real contact between seat and person. In reality, the comfort is
strongly influenced by the features of the seat, because the seat materials normally have a
non-linear behaviour (depending both on the frequency and on the applied pre-load). In
this study, a linear approximation has been made, which is considered acceptable for the
purpose of the analysis carried out further on. Accordingly, the contact between car and
dummy has been created by means of translational spring-damper elements with linear
behaviour.

Once the model has been created, the transmissibility functions can be evaluated. By
definition, transmissibility is the ratio of the displacement of an isolated system to the input
displacement, and it is used to describe the effectiveness of a vibration isolation system. In the
analysis considered here, the transmissibility varies with the frequency, and it can be used to
analyze how the dummy amplifies or damps the input signal along a specific vibration
transmission path. The expression of the transmissibility is given by Equation 4:

TR(f) = a_out(f) / a_in(f)     Equation 4

where a_out(f) is the magnitude of the output acceleration in the frequency domain and a_in(f) is
the magnitude of the input acceleration in the frequency domain.
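The following minimal sketch (illustration only, with synthetic signals) evaluates the transmissibility of Equation 4 numerically as the ratio of the output to the input acceleration magnitude in the frequency domain, e.g. between the seat input and the sacrum or head output.

```python
# Minimal sketch: transmissibility TR(f) = |a_out(f)| / |a_in(f)| from sampled accelerations.
import numpy as np

def transmissibility(a_in, a_out, dt):
    """Return frequencies [Hz] and the magnitude ratio for equally sampled signals."""
    freqs = np.fft.rfftfreq(len(a_in), d=dt)
    A_in = np.abs(np.fft.rfft(a_in))
    A_out = np.abs(np.fft.rfft(a_out))
    return freqs, A_out / np.maximum(A_in, 1e-12)   # avoid division by zero in empty bins

# Example with synthetic signals: the "body" amplifies a 4 Hz seat input by a factor 1.4
dt = 0.001
t = np.arange(0.0, 20.0, dt)
a_seat = np.sin(2 * np.pi * 4.0 * t)
a_head = 1.4 * np.sin(2 * np.pi * 4.0 * t - 0.3)
f, TR = transmissibility(a_seat, a_head, dt)
print(TR[np.argmin(np.abs(f - 4.0))])   # close to 1.4 at the excitation frequency
```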

Regarding the analysis, the input signal is the vertical acceleration of the car interface, while
the output signal can be measured at two different points: the centre of mass of the
sacrum and the centre of mass of the head. The frequency range of interest in whole-body
vibration extends from 0 to 30 Hz, with the most important range between 0 and 10 Hz.
Measuring the acceleration in the pelvis is important, since all the standards that deal with
objective comfort assessment take into account the analysis of the acceleration at the
interface between human and car environment [12], [13]; on the other hand measuring the
acceleration in the head can provide useful information about how the human spine damps
or amplifies the signal. One can linearize the model around the equilibrium position (i.e.
extract the A-B-C-D matrices) and thus obtain the description of the model in the state-space
form. We can then define two transmissibility functions, considering two different
transmission paths.

In the analysis, the transmissibility computation is carried out on the linearized model,
varying the height and the mass of the dummy. This means that the non-linearities of
the model are not taken into account; this assumption is valid only for small displacements
(as is the case in vibrational comfort analysis). The complete process
flow for the calculation (from MBS model to results) is shown in Figure 7-15.


Figure 7-15 - Process flow for Frequency Domain Simulation

7.2.3 Motorcycle environment representation


In this section, a methodology for rider comfort assessment on a motorcycle is explained. In
order to represent the rider configuration, an MBS model of the motorcycle has been created.
The model, described in detail in [14], represents a sport motorcycle that consists of the
following 6 main rigid bodies: the front wheel, the front suspension (composed of lower fork
and upper fork), the chassis, the swinging arm, the constraint with the rear suspension, and
the rear wheel. The above-mentioned bodies have been connected by several (kinematic and
dynamic) constraints, so that the assembled motorcycle model can be used to simulate the
actual behaviour of a motorcycle (Figure 7-16).


Figure 7-16 - Multibody motorcycle model used for the analysis


RSDA: Rotational Spring Damper Actuator
TSDA: Translational Spring Damper Actuator

Once the model has been generated and the inertial properties have been assigned, the
multibody simulation methodology allows studying the in-plane dynamics of the motorcycle
before running any dynamic simulation. From a dynamics point of view, a motorcycle can be
considered as a suspended mass connected by means of two suspensions to the
unsuspended masses. Usually, the suspended mass (or sprung mass) is composed of
engine, chassis, saddle, rider and steering head, while the unsuspended mass (or unsprung
mass) is composed of the wheels and the brakes. For performing an in-plane dynamics
study, the motorcycle must be placed in the upright position, with the longitudinal and lateral
velocities and also the roll and yaw velocities all set to zero. Using LMS Virtual.Lab Motion in
a procedure similar to the one described for the vehicle drive comfort modal analysis, one
can then linearize the behaviour of the motorcycle around the upright position, resulting in
the resonance frequency and the damping ratio of each vibration mode, and an animation of
the mode shape. Figure 7-17 shows a representation of each mode shape of the
multibody model generated in LMS Virtual.Lab Motion, where the rider is taken into account
by adding a rigid mass and rotational inertia to the motorcycle frame. The modes correspond
very well to the literature [15].


Figure 7-17 - Motorcycle modes of plane vibration (left) and dummy positioning (right)

After this preliminary analysis, the dummy can be positioned on the motorcycle. As
previously for the car occupant case, the MBS model has again been fully parameterized,
allowing the automatic set-up and modification of the parameters that are relevant for the
model definition and the comfort assessment (i.e. different posture of the dummy,
mechanical behaviour of the seat foam, etc.).

Once the model has been created, the input and the output should be defined. For comfort assessment in the motorcycle field, the inputs are the vertical displacements of the two-post test rig (representing the road profile), while the outputs are the acceleration levels measured at the interface between the rider and the PTW. After that, one can linearize the model around the equilibrium position (i.e. extract the A-B-C-D matrices) and obtain the description of the model in state-space form. Little experimental data on the vertical transmissibility of motorcycles has been reported in the literature. As a result, it has not been possible to compare the results of the modal analysis against literature references.
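
A minimal sketch of this linearization step is given below, assuming a single-degree-of-freedom rider/saddle approximation: the model is written directly in A-B-C-D state-space form with the test-rig vertical acceleration as input and the rider acceleration as output, and the vertical transmissibility is read from its frequency response. All parameter values are illustrative assumptions, not identified motorcycle data.

```python
import numpy as np
from scipy import signal

# Illustrative single-DOF rider/saddle model in state-space (A, B, C, D) form.
# Input u: vertical base (test-rig) acceleration; output y: rider absolute acceleration.
# States: z = x_rider - x_base (relative displacement) and its time derivative.
m, k, c = 75.0, 60e3, 1.8e3        # assumed rider mass, interface stiffness and damping

A = np.array([[0.0, 1.0],
              [-k / m, -c / m]])
B = np.array([[0.0],
              [-1.0]])
C = np.array([[-k / m, -c / m]])
D = np.array([[0.0]])

sys = signal.StateSpace(A, B, C, D)

# Frequency response -> acceleration transmissibility between test rig and rider
w = 2.0 * np.pi * np.linspace(0.5, 20.0, 400)   # [rad/s]
_, H = signal.freqresp(sys, w)
T = np.abs(H)
print(f"peak transmissibility {T.max():.2f} at {w[np.argmax(T)] / (2 * np.pi):.2f} Hz")
```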


7.3 Integration platform Lightning


In addition to the simulation methods in section 6.4, some technical details of the implementation are described here. The documentation of these tasks will be given in deliverable D2.2.2: VR and integrated solutions for the automotive scenario, in M36 of the project.

7.3.1 User interaction


The user can interact directly with the RAMSIS manikin. The ARTTRACK2 system from ART is used to track the user's movements and poses. At the premises of the University of Stuttgart, the system is set up with 8 cameras and the tracking area is about 2.40 x 2.40 meters.

Figure 7-18 - Tracking cube

The user is equipped with tracking targets for the hands, the feet and the head. Using inverse kinematics, the posture of the manikin can be determined.
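
The following sketch illustrates the inverse-kinematics principle on a toy two-link planar chain: given a tracked target position (e.g. a hand target), a least-squares solver finds the joint angles whose forward kinematics reach it. The RAMSIS posture engine solves a far richer, anatomically constrained problem; the link lengths and target position below are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy 2-link planar "arm": link lengths are assumptions, not RAMSIS data.
L1, L2 = 0.30, 0.28                      # upper arm / forearm lengths [m]

def forward_kinematics(q):
    """End-effector position of the 2-link chain for joint angles q = [q1, q2]."""
    q1, q2 = q
    x = L1 * np.cos(q1) + L2 * np.cos(q1 + q2)
    y = L1 * np.sin(q1) + L2 * np.sin(q1 + q2)
    return np.array([x, y])

def solve_posture(target, q0=(0.1, 0.1)):
    """Joint angles that bring the end effector onto the tracked target."""
    res = least_squares(lambda q: forward_kinematics(q) - target, q0)
    return res.x

target = np.array([0.35, 0.25])          # hypothetical tracked hand position [m]
q = solve_posture(target)
print("joint angles [rad]:", q, " reached position:", forward_kinematics(q))
```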


7.3.2 Vision assessment


Lightning uses OpenSceneGraph, based on OpenGL and C++, for its rendering tasks. The modules for the simulation of visual disabilities will be implemented using shaders. Shaders allow the generation of non-static filters, for example to simulate an eye cataract:

Figure 7-19 - Example eye cataract shader
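
The filter itself is GPU shader code; the Python sketch below only approximates the visual effect of such a cataract filter on the CPU, using a blur for light scattering together with reduced contrast and a veiling luminance. The severity parameters are assumptions and do not correspond to a calibrated impairment model.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def cataract_filter(image, blur_sigma=3.0, contrast=0.6, veil=0.2):
    """CPU approximation of a cataract-like filter: Gaussian blur for light
    scattering, contrast reduction and a uniform veiling luminance.
    Parameter values are illustrative only."""
    img = image.astype(float)
    blurred = gaussian_filter(img, sigma=blur_sigma)
    mean = blurred.mean()
    degraded = mean + contrast * (blurred - mean)          # contrast reduction
    degraded = (1.0 - veil) * degraded + veil * img.max()  # veiling glare
    return np.clip(degraded, 0.0, img.max())

# Example on a synthetic grey-scale test image
scene = np.zeros((64, 64)); scene[16:48, 16:48] = 1.0
impaired_view = cataract_filter(scene)
```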

Vision cones are generated using the parameters described in section 7.3.2. In addition to these vision cones, shaders will be used to visualize the vision fields. The environment outside of the vision field will be greyed out.
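
As a rough CPU illustration of this greying-out behaviour (the real implementation runs in a shader and derives the field shape from the user's vision-cone parameters), the sketch below masks everything outside a circular vision field in a synthetic image.

```python
import numpy as np

def grey_outside_vision_field(image, centre, radius):
    """Grey out every pixel outside a circular vision field, mimicking what the
    vision-field shader does on the GPU (illustrative CPU version)."""
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    outside = (xx - centre[0])**2 + (yy - centre[1])**2 > radius**2
    result = image.astype(float).copy()
    result[outside] = result.mean()          # replace with a uniform grey level
    return result

# Example: a 30-pixel-radius field centred on a synthetic 64x64 scene
scene = np.random.rand(64, 64)
restricted_view = grey_outside_vision_field(scene, centre=(32, 32), radius=30)
```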


8 Conclusions
The analysis reported in this deliverable has made it possible to identify which categories of simulation (as described in the VERITAS use cases for the automotive domain) are feasible with respect to the industrial application needs and requirements, and how they will be implemented.

To this purpose, USIXML files have been created, based on the Use Cases, to describe the user interaction (including the relevant impairments) for the identified Use Cases; these files will be read by the simulation environments to be integrated in the context of WP2.2.

Based on (i) industrial needs, (ii) application domains and (iii) features of the existing tools and of the VERITAS tools under development, an analysis has been performed of the high-level specifications for:

- the physical and basic perceptual ergonomics analysis related to habitability, accessibility, and reach and use of vehicle controls;

- the vibrational comfort analysis.

From here, a high-level architecture and specifications are given for the integration of the VERITAS human model specifications (users' physical and perceptual limitations), to be used in desktop applications, and of the VERITAS core simulation platform in immersive applications for designers, for both first- and third-person evaluation.

These specifications constitute the basis for providing industrially feasible applications of the VERITAS concept in the automotive domain, to be implemented in the developments within the project and to be refined and evaluated in the pilot application phases.

The integration activity will be performed in the continuation of WP2.2 and will be
documented in D2.2.2 VR and integrated solutions for the automotive scenario (M36).


9 References
[1] Harris, C.M., Shock and Vibration Handbook, McGraw-Hill (1998)
[2] Griffin, M.J., Handbook of Human Vibration, Academic Press (1990)
[3] Mansfield, N.J., Human Response to Vibration, CRC Press (2005)
[4] Xuting, W. et al., Study of human-seat interactions for dynamic seating comfort
analysis, SAE Paper 1999-01-1303
[5] Kitazaki, S. and Griffin, M.J., Resonance behaviour of the seated human body:
a summary of experimental data, Journal of Biomechanics, Vol. 31, No. 2, 12 May
1997, pp. 143-149
[6] Pennestri, E., Valentini, P.P., Vita, L., Comfort Analysis of Car Occupant:
Comparison between multibody and finite element models, International Journal for
Vehicle Systems Modeling and Testing, Vol. 1, Nos. 1/2/3, pp. 68-78 (2005)
[7] Valentini, P.P., Virtual dummy with spine model for automotive vibrational comfort
analysis, Int. J. Vehicle Design, Vol. 51, Nos. 3/4, pp. 261-277 (2009)
[8] Valentini, P.P., Vita, L., DaviD: A Multibody Virtual Dummy for Vibrational Comfort
Analysis of Car Occupants, Virtual Nonlinear Multibody Systems, NATO Science
Series, Kluwer Academic Publishers
[9] Valentini, P.P., Modelli virtuali predittivi del comfort vibrazionale degli occupanti di
autovetture (Predictive virtual models of the vibrational comfort of car occupants),
PhD thesis, University of Rome Tor Vergata (2004)
[10] Official Journal of the European Commission, Updated Version of the European
Statement of Principles on Human Machine Interface (HMI) for In-Vehicle
Information and Communication Systems (2007)
[11] VERITAS Deliverable D1.1.1 - UCD-based user requirements extraction (2011)
[12] International Organization for Standardization, ISO 2631-1 (1997), Mechanical
vibration and shock - Evaluation of human exposure to whole-body vibration -
Part 1: General requirements
[13] British Standards Institution, BS 6841 (1987), Measurement and Evaluation of
Human Exposure to Whole-Body Mechanical Vibration
[14] Cossalter, V., Lot, R., A motorcycle multi-body model for real time simulations
based on the natural coordinates approach, Vehicle System Dynamics,
pp. 423-447 (2002)
[15] Cossalter, V., Motorcycle Dynamics, published by Race Dynamics
[16] Cofelice, N., Zanni, R., Locatelli, D., Toso, A., Moreno Giner, D., Kang, J.,
Donders, S., Vibrational analysis of a multibody virtual dummy for car and
motorcycle users, 1st Joint International Conference on Multibody System
Dynamics, May 25-27, 2010, Lappeenranta, Finland

