I. INTRODUCTION
Advances in networking, computing, sensor technology
and robotics allow us to create more convenient
environments for humans. In this context the Intelligent
Space (iSpace) concept was proposed. The iSpace is a
space that has ubiquitous distributed sensory intelligence
and actuators for manipulating the space and providing
useful services (Fig. 1). It can be regarded as a system that
is able to support humans, i.e. the users of the space, in various
ways. Actuators provide physical services as well as
information to people in the space, while sensors are used
for observing the space and gathering information [1].
The iSpace comprises three functions: observing,
understanding, and acting. The observing function
is the most important one, because it delivers the
information needed to know what kind of services are required.
Conventionally, observation has been limited to
humans and robots, but there are a large number of objects in
our living environment, so we also need to analyze the
relations the users have with them [2]. In order to offer the
appropriate services using the objects, not only the physical
information of an object but also the kind of use the human
gives to it needs to be known. Such information cannot be
written beforehand; it can only be obtained by observing the
interaction. In other words, the relations among humans and
objects are important. Hence we adopt 4W1H, a paradigm in which
the when, who, what, where and how variables of the
environment are sensed and used to determine the
interaction information of the elements within the space [3].
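As a minimal sketch of what one 4W1H observation could look like as a data structure (the class and field names here are our own illustration, not the paper's implementation):

```python
from dataclasses import dataclass

@dataclass
class Event4W1H:
    """One observation under the 4W1H paradigm (field names are illustrative)."""
    when: str   # e.g. "Morning"
    who: str    # user identity
    what: str   # object being used
    where: str  # place within the space
    how: str    # action label inferred from the motion data

# example observation: User 1 drinking from a cup at the desk in the morning
event = Event4W1H(when="Morning", who="User 1", what="Cup",
                  where="Desk", how="Drink")
print(event.who, event.how, event.what)
```

Grouping the five variables into one record like this makes it straightforward to cluster observations around any single variable later.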
Within the whole scope of human activity sensing, the focus
has mostly been on sensing broad movements.
X_{id}(t + 1) = X_{id}(t) + V_{id}(t + 1)                                (3)

and,

d_{max} = \max_{k \in \{1, 2, \ldots, K\}} \sum_{X_p \in C_{i,k}} \frac{d(X_p, V_{i,k})}{n_{i,k}}     (4)
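Read as PSO-based clustering, equation (3) moves each particle (a candidate set of cluster centres) by its freshly updated velocity, and equation (4) scores a particle by its worst mean intra-cluster distance. A minimal NumPy sketch of the two rules (the function names and the toy data are our own, not the authors' implementation):

```python
import numpy as np

def pso_position_update(x, v):
    """Eq. (3): move each particle by its freshly updated velocity."""
    return x + v

def dmax_fitness(centroids, data, labels):
    """Eq. (4): the worst (largest) mean intra-cluster distance over the
    K clusters encoded by one particle."""
    worst = 0.0
    for k in range(len(centroids)):
        members = data[labels == k]           # the X_p assigned to C_{i,k}
        if len(members) == 0:
            continue
        dists = np.linalg.norm(members - centroids[k], axis=1)
        worst = max(worst, dists.mean())      # sum of d(X_p, V_{i,k}) / n_{i,k}
    return worst

# toy check: two tight, well-separated blobs give a small d_max
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0.0, 0.1, (20, 2)),
                  rng.normal(5.0, 0.1, (20, 2))])
centroids = np.array([[0.0, 0.0], [5.0, 5.0]])
labels = np.argmin(np.linalg.norm(
    data[:, None, :] - centroids[None, :, :], axis=2), axis=1)
print(dmax_fitness(centroids, data, labels))
```

Minimising d_max pushes the swarm toward compact clusters, since only the worst cluster of a particle determines its score.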
ALGORITHM DESCRIPTION

[Figure: algorithm architecture — WHEN, WHERE, WHO, WHAT and HOW inputs from the sensors feeding the PSO clustering]
A. Hardware Description
To perform the experiments we used an MTx sensor
from the company Xsens, which is a small and accurate
3-DOF inertial orientation tracker. It provides drift-free 3D
orientation as well as kinematic data: 3D acceleration, 3D
rate of turn (rate gyro) and 3D earth-magnetic field [9]. The
system contains nine sensors which can be interlinked with
each other in order to obtain a richer set of data out
of one specific object, as well as to provide a good
architecture for setting up referenced kinematic systems [9].
The data is retrieved using the Matlab toolbox that comes
with the product, which allows us to acquire all
the needed data from the sensors in real time.
The computer used had an Intel Core 2
Duo processor running Windows XP Professional Edition
and Matlab 7.0.
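As a hedged illustration of how one such reading could feed the later clustering stage (this is not the Xsens toolbox API; the function and the example values are invented for the sketch), the three 3D measurements can be packed into a single feature vector:

```python
import numpy as np

def pack_sample(acc, gyro, mag):
    """Concatenate one 3D acceleration, rate-of-turn and magnetic-field
    reading into a single 9-dimensional feature vector."""
    sample = np.concatenate([acc, gyro, mag])
    assert sample.shape == (9,)
    return sample

# hypothetical reading: gravity on the z axis, no rotation, weak field on x
v = pack_sample(np.array([0.0, 0.0, 9.81]),
                np.zeros(3),
                np.array([0.4, 0.0, 0.0]))
print(v.shape)  # (9,)
```

Fixed-length vectors like this are what distance-based methods such as the PSO clustering and the SOM expect as input.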
B. Hardware Disposition
We also use RFID tags, each equipped with a coarse
accelerometer, for the implementation of the 4W1H
architecture.
[Figure: hardware disposition — MTx and ZPS sensors with X, Y, Z reference axes]
V. EXPERIMENTS
A. Experiment Design
When designing the sensing experiments, we defined a
set of actions that were found to be the most common ones
in the sets we selected. As justified before, we use
a small set of actions, since the objective is to test the
training capabilities of the system.
Table 1 shows the users, actions, places, times and
objects that we chose for the experiments. This data is
used to train the SOM [5] for the How segment of the
algorithm, as well as for the final training.
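For reference, a self-organizing map of the kind used for the How segment can be sketched in a few lines (a minimal from-scratch version under our own parameter choices, not the paper's trained configuration):

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=300, lr0=0.5, sigma0=1.5, seed=0):
    """Minimal self-organizing map: nodes on a small grid are pulled toward
    samples near their best-matching unit (BMU)."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w),
                                  indexing="ij"), axis=-1)
    for t in range(epochs):
        frac = 1.0 - t / epochs          # learning rate and radius decay
        lr, sigma = lr0 * frac, sigma0 * frac + 1e-2
        x = data[rng.integers(len(data))]
        bmu = np.unravel_index(
            np.argmin(np.linalg.norm(weights - x, axis=-1)), (h, w))
        # Gaussian neighbourhood around the BMU shrinks over time
        g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=-1)
                   / (2 * sigma ** 2))
        weights += lr * g[..., None] * (x - weights)
    return weights

def bmu_of(weights, x):
    """Grid coordinates of the node whose weights best match sample x."""
    return np.unravel_index(
        np.argmin(np.linalg.norm(weights - x, axis=-1)), weights.shape[:2])
```

After training, two well-separated motion patterns map to different grid nodes, which is what lets the How segment turn raw motion features into discrete action labels.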
We had each user perform a set of each action with
different objects at different times in different places, in
such a way that no two users have the same set of
activities. The objective is to recognize data from a user in a
place he did not sample data for: for example, if User 1
never performed a reading action in the Bed, the system
would still be able to recognize it, since it can detect the
reading action from that user regardless of the place where
he performs it.
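The evaluation idea above amounts to holding out one (user, action, place) combination and training on everything else. A hedged sketch of that split (the tuples stand in for real sensor samples and the helper name is ours):

```python
def leave_combo_out(samples, user, action, place):
    """Hold out every sample where `user` performs `action` at `place`;
    train on the rest, so recognition must generalise across places."""
    test = [s for s in samples if (s[0], s[1], s[2]) == (user, action, place)]
    train = [s for s in samples if (s[0], s[1], s[2]) != (user, action, place)]
    return train, test

# toy samples: (user, action, place, time, object)
samples = [
    ("User 1", "Read", "Desk", "Morning", "Book"),
    ("User 1", "Read", "Table", "Night", "Magazine"),
    ("User 1", "Read", "Bed", "Noon", "Book"),
    ("User 2", "Drink", "Desk", "Morning", "Cup"),
]
train, test = leave_combo_out(samples, "User 1", "Read", "Bed")
print(len(train), len(test))  # 3 1
```

If the system classifies the held-out samples correctly, the action model has decoupled the How variable from the Where variable.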
TABLE 1. VARIABLE SELECTION

User:   User 1, User 2, User 3, User 4, User 5, User 6, User 7, User 8, User 9, User 10
Action: Drink, Work, Texting, Read
Place:  Desk, Bed, Table
Time:   Morning, Noon, Afternoon, Night
Object: Cup, Glass, Mouse, Keyboard, Mobile, Book, Magazine
C. Results
Table 2 shows the results of the system: the first
column is the variable we clustered around, the second
the number of users the system was trained
with, then the error rate, and finally the time elapsed for the
data to be clustered.
TABLE 2. RESULT TABLE USING DIFFERENT VARIABLES

Cluster   #Users   Error %   Time [s]
Time         5        0         3
Time        10        0         4
Object       5        5         3
Object      10        2         4
User         5       10         3
User        10        0         4
Action       5        7         3
Action      10        2         4
Places       5        0         3
Places      10        0         4