
Computers in Industry 60 (2009) 114–125. doi:10.1016/j.compind.2008.09.001

Contents lists available at ScienceDirect

Computers in Industry
journal homepage: www.elsevier.com/locate/compind

Tangible augmented prototyping of digital handheld products


Hyungjun Park a,*, Hee-Cheol Moon a, Jae Yeol Lee b
a Department of Industrial Engineering, Chosun University, 375 Seosuk-dong, Dong-gu, Gwangju 501-759, South Korea
b Department of Industrial Engineering, Chonnam National University, 300 Yongbong-dong, Buk-gu, Gwangju 500-757, South Korea
* Corresponding author. Tel.: +82 62 230 7039; fax: +82 62 230 7128. E-mail address: hzpark@chosun.ac.kr (H. Park).

A R T I C L E I N F O

A B S T R A C T

Article history:
Received 3 December 2007
Received in revised form 4 July 2008
Accepted 6 September 2008
Available online 17 December 2008

Proposed in this paper is a novel approach to virtual prototyping of digital handheld products using
augmented reality (AR)-based tangible interaction and functional behavior simulation. For tangible user
interaction in an AR environment, we use two types of tangible objects: one is for a product, and the other
is for a pointer. The user can create input events by touching specied regions of the product-type
tangible object with the pointer-type tangible object. Rapid prototyping and paper-based modeling are
adopted to fabricate the AR-based tangible objects which play an important role in improving the
accuracy and tangibility of user interaction. For functional behavior simulation, we adopt a state
transition methodology to capture the functional behavior of the product into an information model, and
build a nite state machine (FSM) to control the transition between states of the product based on the
information model. The FSM is combined with the AR-based tangible objects whose operations in the AR
environment facilitate the tangible interaction, realistic visualization and functional simulation of a
digital handheld product. Based on the proposed approach, a prototyping system has been developed and
applied for the design evaluation of various digital handheld products with encouraging feedback from
users.
2008 Elsevier B.V. All rights reserved.

Keywords:
Virtual prototyping
Tangible objects
Augmented reality
User interaction
Functional behavior simulation

1. Introduction
For most digital handheld products such as mobile phones and MP3 players, the functional behavior is very complicated and is nearly all expressed as human–machine interaction (HMI) tasks, each of which may trigger a transition between the states of the product. For successful entry of a new product into the competitive world market, it is imperative to reduce time to market as much as possible while precisely converting market demands into actual product forms, features, and functions [1,2]. An essential activity required is the efficient and extensive use of prototypes during the product development process [1,2].
With recent advances in computer technology, virtual prototyping (VP) has been considered a new and powerful prototyping solution to overcome the shortcomings of conventional prototyping methods. The concept of VP has been widely employed and implemented in many industrial fields including the automotive and airplane industries [7,8], but most works have been based on using virtual reality (VR) techniques [3–10], and they have been focused on visualization [2,9], assembly and


disassembly testing [10–12], manufacturing process simulation [13,14], structural analysis [2,6], and ergonomic analysis [2,9].
Some works have been conducted on capturing and simulating the functional behaviors of digital handheld products in VP applications [15,16]. In VR-based prototyping solutions, it is not easy to build a virtual environment of fine quality (e.g. making detailed and realistic three-dimensional models) and to acquire tangible user interaction with low cost VR devices. Recently, augmented reality (AR) approaches have been applied as alternatives for developing VP solutions to overcome these shortcomings [17–23].
In order to realize faithfully the virtual design and prototyping of digital handheld products such as mobile phones and MP3 players, it is very important to provide the people involved in product development with tangible user interaction, the realistic visualization of the products, and the vivid simulation of their functional behaviors in a virtual environment. In this paper, we propose a novel approach to virtual prototyping of digital handheld products, which can satisfy such requirements by combining AR-based tangible interaction with functional behavior simulation. We call it tangible augmented prototyping.
The proposed approach does not require high-cost devices such as data gloves and haptic devices for user interaction. Rapid prototyping (RP) and paper-based modeling are properly adopted in building AR-based tangible objects whose manipulation in an AR


environment can improve the accuracy and tangibility of interaction with the products. Rapid prototyping is a manufacturing technology to generate physical objects (so-called RP models) directly from geometric data, without traditional tooling, easily and rapidly [2,14]. An RP model usually serves the purpose of communicating information and demonstrating ideas. It can also support various kinds of tangibility for experiments and interactions, which gives rapid and critical feedback to product development and evaluation.
The tangible objects, composed of paper and RP models without
any hardwired connection using electronic components, are easily
available at low cost. This makes the AR environment more
accessible to developers, stakeholders, and even consumers.
Moreover, the proposed approach suggests how to combine the
forms, functions, and interactions of digital handheld products
physically and virtually at the same time.
The rest of the paper is organized as follows: Section 2
summarizes previous work related to virtual prototyping. In
Section 3, the proposed approach is described with its key
components. Section 4 explains the operations of the virtual
product model in a tangible AR environment. Section 5 addresses
the implementation and application of the product design
evaluation system based on the proposed approach. Section 6
describes a preliminary user study to show the usefulness of the
approach. Section 7 closes the paper with some concluding
remarks and future work to be done.
2. Previous work
Early attempts at supporting VP were based on CAD and VR
systems. Powerful tools including stereoscopic display systems,
head mounted displays (HMD), data gloves and haptic devices have
been introduced [9] and combined to construct VP systems that
provide realistic display of products in a simulated environment
and offer various interaction and evaluation means. Bochenek et al.
compared the performance of four different VR displays in a design
review setting and mentioned that the best approach for design
review activities could be a combined technology approach [24].
Park et al. suggested virtual prototyping of consumer electronic
products by embedding HMI functional simulation into VR
techniques for design evaluation [15,16]. As it is not easy to build
a virtual environment of ne quality and to acquire tangible
interaction with VR-based systems, many alternative solutions
have been proposed.
Greenberg and Fitchett presented toolkits called Phidgets that
allow designers to explore a tangible user interface (TUI) for
interactive product design [25]. Hartmann et al. presented similar
toolkits called d.tools for visually prototyping physical user
interfaces [26]. In TUI, physical objects and ambient spaces are
used to interact with digital information [27]. Hardwired connection
is often employed using electronic components. Tangible interfaces
are quite useful because the physical objects used in them have
properties and physical constraints that restrict how they can be
manipulated. However, it is difficult to change and evaluate an object's physical properties dynamically. The human–computer
interfaces and interaction metaphors originating from AR research
have proven advantageous for a variety of applications [17,18]. AR
techniques can naturally complement physical objects by providing
an intuitive interface to a three-dimensional information space
embedded within physical reality. However, although an AR
interface provides a natural environment for viewing spatial data,
it is often challenging to interact with and change the virtual content.
To overcome the limitations of the AR and TUI approaches while
retaining their benets, tangible AR has been suggested [19,20].
Verlinden et al. suggested the concept of augmented prototyping that projects the perspective images of the product on the physical object made by rapid prototyping techniques [21]. The concept of integrating hardware and software in AR environments has been presented [22,23]. Basically, it augments a virtual display onto the soft mockup of a product by incorporating simple switches as basic input interfaces. Prototypes with hardwired connection can provide direct and accurate interfaces, but significant efforts are usually required to implement and build them. Moreover, it is not easy to make them available and accessible to many people who are located at different places.
Although various ways have been proposed to support virtual
prototyping of digital products, more research is still needed in the
following aspects. The interaction should be intuitive and tangible
to help developers and users in product design evaluation to make a product of interest more complete and malfunction-free before production. The prototyping environment should be available at low cost without strong restriction of its accessibility to users. Moreover, for effective evaluation of the product, we need to define
its behavior through forms, functions and interactions, and to
develop a proper way of integrating them in a virtual environment.
In this paper, we address these aspects by proposing a prototyping
approach called tangible augmented prototyping.
3. Proposed approach
Fig. 1 shows the overall process of tangible augmented
prototyping proposed in this paper. There are five main tasks required for relevant prototyping and downstream applications: creation of a product model, acquisition of multimedia contents data, generation of an HMI functional model, construction of an FSM,
and fabrication of AR-based tangible objects.
Fig. 2 shows a graphical diagram depicting key components
used for the proposed approach. In the diagram, a game phone is
used as an example of a digital handheld product. A product model,
multimedia contents data, an HMI functional model, and an FSM
constitute a virtual product model whose operations combined
with tangible objects in an AR environment facilitate tangible
interaction, realistic visualization, and functional simulation of the
product. The visualization of the product in the AR environment is
obtained by overlaying the rendered image of the product on the
real world environment in real time [17,18]. For tangible user
interaction, we play with the AR-based tangible objects to create
input events by touching specified regions of the product-type
object with the pointer-type object. For functional behavior
simulation, we adopt a state transition methodology to capture
the functional behavior of the product into the HMI functional
model, and build the FSM to control the transition between states
of the product using the model. RP and paper-based modeling are
properly adopted to build the AR-based objects which support
good tangibility for experiments and interactions.
During the process of tangible augmented prototyping, users
may detect any problems in the overall appearance, the assembly
structure, or the functional behavior of the product. In such cases,
product designers correct the problems and update the product
model or the HMI functional model. As shown in Fig. 1, the users
and the product designers can promote the product design and
development by repeating the process with the product model and
the HMI functional model updated. In the following subsections,
we describe how to acquire the key components used for the
proposed tangible augmented prototyping.
3.1. Product model creation
Creating a product model is the most basic step for constructing
the virtual product model. The product model includes the


Fig. 1. Overall process of tangible augmented prototyping.

geometry, the attributes of material and color, the assembly structure, and the kinematics of the part components of the product [2]. In general, geometric models of the part components can be created with CAD software. In the case that only physical prototypes or soft mockups are available, the geometric models can be created with reverse engineering (RE) tools [28]. In this work, we approximate the geometric models by triangular meshes that are fine enough to provide realistic visualization, and we store them in OBJ and STL formats [2]. The STL format is used to fabricate the AR-based tangible objects of the product.
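As a purely illustrative sketch (not part of the authors' system), a triangulated OBJ file of the kind described above can be read into simple vertex and triangle arrays as follows; the reader assumes that faces are triangles and ignores texture and normal indices.

#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
struct Tri  { int a, b, c; };                      // 0-based vertex indices

// Minimal OBJ reader: keeps only "v x y z" vertices and triangular "f i j k"
// faces. Faces written as "f i/j/k ..." and quads are not handled here.
bool loadObj(const char* path, std::vector<Vec3>& verts, std::vector<Tri>& tris)
{
    FILE* fp = std::fopen(path, "r");
    if (!fp) return false;
    char line[512];
    while (std::fgets(line, sizeof(line), fp)) {
        Vec3 v; int i0, i1, i2;
        if (std::sscanf(line, "v %f %f %f", &v.x, &v.y, &v.z) == 3)
            verts.push_back(v);
        else if (std::sscanf(line, "f %d %d %d", &i0, &i1, &i2) == 3) {
            Tri t = { i0 - 1, i1 - 1, i2 - 1 };    // OBJ indices are 1-based
            tris.push_back(t);
        }
    }
    std::fclose(fp);
    return true;
}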

basically require three kinds of multimedia contents data:


graphical images, audio sounds, and video animations. Nearly
every digital handheld product has LCD display(s) to show visual
information (images and animations) related to its specic states.
It also has audio output devices or speakers to output auditory
information, that is, audio sounds specic to its states. We use the
JPEG le format for graphical images, MP3 or WAV le formats for
audio sounds, and the AVI le format for video animations. The
multimedia contents data can be acquired by audio/video
recording.

3.2. Multimedia contents data acquisition

3.3. Generation of HMI functional models

We acquire multimedia contents data to create visual and


auditory display information required for realistic visualization of
a product and vivid simulation of its functional behavior. We

The HMI functional model of a product is an information model


that represents the HMI-related functional behavior of the product.
In this work, we adopt a state transition methodology [26,2931]
to capture the functional behavior by breaking it down into the
following entities: (1) all objects related to HMI tasks, (2) all HMI
events occurring in the product, (3) all states in which the product
can be, (4) activity information related with each state, (5) state
transitions occurring in each state, and (6) event-condition-action
information related with each state transition.
Nearly every digital handheld product has some part components (i.e. lamps, switches, buttons, sliders, displays, timers, and speakers) involved in the interaction between the user and the product. They are called objects, making up the basic building blocks of functional simulation. Every object has a pre-defined set of properties and functions that describe everything it can do in a real-time situation. The overall functional behavior of a product can be broken down into separate units called states. Every state transition is triggered by one or more events associated with it. As shown in Fig. 3, the transition occurs only when one of these events is activated and some specified conditions are met. Tasks called actions can be performed before transition to a new state. Tasks called activities are performed in each state, and they only occur when their state becomes active. Actions and activities are constructed using the objects' properties and functions. Each action or activity consists of a set of statements, each of which can be the assignment of some value to a variable, the calling of a function of an object, or a composite statement with a conditional statement.

Fig. 2. Key components of the proposed approach and their relations.

Fig. 3. State transition.
In this work, we have used a simple markup language to represent the functional behavior of the product [16]. After analyzing and gathering the above-mentioned entities of the functional behavior, we write them into a set of text files based on the markup language to build the HMI functional model. In the HMI functional model, each action or activity consists of a set of logical statements expressed with the objects' properties and functions. The concrete execution according to each action or activity is conducted during functional simulation.
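The entities listed above could be held in memory roughly as follows once the text files are parsed. This is an illustrative sketch only; the names used here and the actual markup language of [16] are assumptions, not the authors' definitions.

#include <functional>
#include <map>
#include <string>
#include <vector>

// Hypothetical in-memory form of a parsed HMI functional model.
struct Statement {                        // one logical statement of an action/activity
    std::function<void()> execute;        // e.g. assign a value or call an object function
};

struct Transition {
    std::string event;                    // triggering HMI event, e.g. "BTN_PLAY_SHORT"
    std::function<bool()> condition;      // guard: the transition fires only if this holds
    std::vector<Statement> actions;       // performed before entering the target state
    std::string targetState;
};

struct State {
    std::string name;                     // e.g. "MP3_PLAY"
    std::vector<Statement> activities;    // performed while the state is active
    std::vector<Transition> transitions;
};

// The whole model: every state of the product, reachable by name.
typedef std::map<std::string, State> HmiFunctionalModel;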
3.4. Construction of a finite state machine

The FSM is used to control the transition between states of the product based on the HMI functional model. In this work, we developed a module to compile the text files for the HMI functional model and to operate the FSM. Fig. 4 shows the control flow of the FSM module. Note that actions or activities, logically defined in the HMI functional model, are invoked during functional simulation. As they may contain statements invoking functions related to image synthesis and display, playing audio sounds, and handling video animation data, some function libraries for processing audio/image/video data are often required.
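A minimal version of such an FSM control module, written against the structures sketched in Section 3.3, could look like the following. It is an assumption-laden sketch rather than the actual module, and the compilation of the markup text files is omitted.

// Minimal finite state machine over the HmiFunctionalModel sketched above.
// handleEvent() mirrors Fig. 3: find a transition of the current state whose
// event and condition match, run its actions, switch states, then run the
// activities of the new state; otherwise keep running the current activities.
class FiniteStateMachine {
public:
    FiniteStateMachine(const HmiFunctionalModel& m, const std::string& initial)
        : model(m), current(initial) {}

    void handleEvent(const std::string& event)
    {
        const State& s = model.at(current);
        for (size_t i = 0; i < s.transitions.size(); ++i) {
            const Transition& t = s.transitions[i];
            if (t.event == event && (!t.condition || t.condition())) {
                runStatements(t.actions);                     // actions of the transition
                current = t.targetState;                      // state change
                runStatements(model.at(current).activities);  // activities of the new state
                return;
            }
        }
        runStatements(s.activities);                          // no transition: stay in this state
    }

private:
    static void runStatements(const std::vector<Statement>& list)
    {
        for (size_t i = 0; i < list.size(); ++i)
            if (list[i].execute) list[i].execute();
    }

    HmiFunctionalModel model;
    std::string current;
};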
3.5. Fabrication of AR-based tangible objects
To improve the accuracy and tangibility of the interaction
between the user and the product in an AR environment, we use
two types of AR-based tangible objects: one is for the product, and
the other is for a pointer. The user creates HMI events by touching
specified regions of the product-type tangible object with the
pointer-type tangible object. Each tangible object has at least one
AR marker used to augment the image of the real world with its
rendered image [17,18].

Fig. 4. Control flow of the finite state machine.

For the AR-based tangible object of the product, we build an RP model using rapid prototyping with the STL format of the product [2,14], and paste AR markers on the specified regions of the RP model. As nearly every digital handheld product has at least one LCD display, the AR markers are pasted on the LCD display(s). Fig. 5 shows the AR-based tangible object for a game phone. Although the product consists of part components, it is enough to make the RP model as a single component since the necessary information for tangible interaction is the position of the pointer-type object with respect to the RP model, as shown in Fig. 5(b).
For the AR-based tangible object of the pointer, we apply paper-based modeling as follows: we generate a polygonal mesh composed of a cube and a square pyramid, develop its unfolded sheet, cut out the sheet, and build the paper model with the cut-out sheet. Fig. 6 shows the AR-based tangible object for the pointer. Note that four AR markers are included in the unfolded sheet. The paper model actually can be replaced with any physical model (i.e. RP model or plastic object) that satisfies the following conditions:
- Its shape and size are nearly the same as those of the geometric model of the pointer.
- It is easily fabricated at low cost.
- It is rigid enough to keep its overall shape and the accuracy of picking operations when users grasp it in their hands and touch rigid objects with its tip.
Paper-based modeling provides low-cost benefits and ease of fabrication. Moreover, it can guarantee good rigidity if a paper model is made of thick and sturdy paper. In this work, the geometric model of the pointer is defined as the outward offset of the polygonal mesh by a small distance. This offsetting is helpful to overlay properly the rendered image of the pointer on the image of the real paper model. The use of the AR-based tangible objects can greatly improve the accuracy and tangibility of user interaction.

Fig. 5. AR-based tangible object for a game phone: (a) the RP model with an AR marker and (b) the augmented image of the game phone.

Fig. 6. AR-based tangible object for a pointer: (a) an unfolded sheet for a paper model; (b) the paper model with four AR markers; (c) the augmented image of the pointer.
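For illustration, a small outward offset of a closed triangular mesh can be approximated by moving every vertex along an averaged normal of its incident triangles. The sketch below reuses the Vec3 and Tri types from the OBJ example in Section 3.1 and is only an approximation of the offsetting step, not the authors' implementation.

#include <cmath>
#include <vector>

// Approximate outward offset: move each vertex by 'dist' along the normalized
// sum of the (area-weighted) face normals of its incident triangles.
void offsetMesh(std::vector<Vec3>& verts, const std::vector<Tri>& tris, float dist)
{
    std::vector<Vec3> n(verts.size(), Vec3());
    for (size_t f = 0; f < tris.size(); ++f) {
        const Vec3& p0 = verts[tris[f].a];
        const Vec3& p1 = verts[tris[f].b];
        const Vec3& p2 = verts[tris[f].c];
        Vec3 e1 = { p1.x - p0.x, p1.y - p0.y, p1.z - p0.z };
        Vec3 e2 = { p2.x - p0.x, p2.y - p0.y, p2.z - p0.z };
        Vec3 fn = { e1.y * e2.z - e1.z * e2.y,     // cross product = face normal
                    e1.z * e2.x - e1.x * e2.z,
                    e1.x * e2.y - e1.y * e2.x };
        int idx[3] = { tris[f].a, tris[f].b, tris[f].c };
        for (int k = 0; k < 3; ++k) {
            n[idx[k]].x += fn.x; n[idx[k]].y += fn.y; n[idx[k]].z += fn.z;
        }
    }
    for (size_t i = 0; i < verts.size(); ++i) {
        float len = std::sqrt(n[i].x * n[i].x + n[i].y * n[i].y + n[i].z * n[i].z);
        if (len < 1e-8f) continue;                 // isolated or degenerate vertex
        verts[i].x += dist * n[i].x / len;
        verts[i].y += dist * n[i].y / len;
        verts[i].z += dist * n[i].z / len;
    }
}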

4. AR-based tangible interaction and functional simulation

After obtaining the product model, multimedia contents data, the HMI functional model, the finite state machine, and the AR-based tangible objects, we can start the design evaluation of the product by operating them in the AR environment. In this work, ARToolKit [17] is adopted to construct a computer vision-based AR environment. Fig. 7 shows a schematic diagram for AR-based tangible interaction and functional simulation. When the user manipulates AR-based tangible objects to touch specified regions of a digital handheld product, the AR engine recognizes HMI events based on the spatial relations between the AR-based tangible objects, reacts to the events, and sends the proper results to output devices. It may change states of the product and activate associated actions or activities. Through the output devices, the user can experience the appearance, kinematics animation, and functional behavior of the product.

Fig. 7. Control flow in AR-based interaction and simulation.

4.1. Tangible user interaction

Most digital handheld products have buttons to push or sliders to move. In order to create HMI events, the user holds the pointer-type tangible object and touches specified regions (for example, buttons or sliders) of the product-type tangible object with the tip of the pointer-type tangible object. We consider a button (or slider) to be pushed or moved (that is, an HMI event occurs) if the following conditions are satisfied:
- The distance from the tip to the button or slider is the shortest among the distances from the tip to the other buttons and sliders.
- The distance is kept smaller than a tolerance during a specified time period.

According to the length of the specified time period, the HMI events related to push-type buttons can be recognized differently as short or long pushing. As the distance computation should be performed in a reference coordinate frame, it is required to transform the tip, the buttons, and the sliders into the camera coordinate frame for each video image. Using the camera calibration information acquired by ARToolKit [17], we can easily acquire coordinate transformations between the camera and the AR markers associated with their tangible objects, as shown in Fig. 8(a). A point $p_k$ in a local coordinate frame $OXYZ_k$ defined by the $k$th AR marker can be expressed as a point $p_c$ in the camera coordinate frame $OXYZ_c$ as follows: $p_c = R_c^k p_k + d_c^k$, where the rotation matrix $R_c^k$ and the translation vector $d_c^k$ constitute a transformation from $OXYZ_k$ to $OXYZ_c$. The distance $d(p_1, p_2)$ between $p_1$ in $OXYZ_1$ and $p_2$ in $OXYZ_2$ can then be computed as $d(p_1, p_2) = \| R_c^1 p_1 + d_c^1 - (R_c^2 p_2 + d_c^2) \|$ (see Fig. 8(b)). To reduce the computational load, we can simply compute the distances from the center of the tip to the centers of buttons or sliders.

Fig. 8. Distance computation in the AR environment: (a) coordinate transformation between a camera and an AR marker; (b) distance between two points using the camera coordinate frame.
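The computation above can be written down directly once every marker's pose with respect to the camera is available as a 3x4 transform (ARToolKit supplies such a transform per detected marker). The following sketch uses an assumed row-major matrix layout and assumed threshold parameters; it is illustrative rather than the system's actual code.

#include <cmath>

// Pose of an AR marker: a camera-frame point is p_c = R * p_local + d, with the
// rotation R in columns 0-2 and the translation d in column 3 of T (row-major).
struct MarkerPose { double T[3][4]; };

// Transform a point given in a marker's local frame into the camera frame.
void toCameraFrame(const MarkerPose& m, const double p[3], double out[3])
{
    for (int i = 0; i < 3; ++i)
        out[i] = m.T[i][0] * p[0] + m.T[i][1] * p[1] + m.T[i][2] * p[2] + m.T[i][3];
}

// d(p1, p2) = || (R1 p1 + d1) - (R2 p2 + d2) ||, both expressed in the camera frame.
double distanceInCameraFrame(const MarkerPose& m1, const double p1[3],
                             const MarkerPose& m2, const double p2[3])
{
    double a[3], b[3];
    toCameraFrame(m1, p1, a);
    toCameraFrame(m2, p2, b);
    double dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Per-button touch test, called once per video frame: the pointer tip must be
// closest to this button (distToThis < distToNearestOther) and stay within
// 'tolerance' for at least 'shortFrames' frames; after 'longFrames' frames the
// same contact is reported as a long push.
struct TouchState { int framesHeld; TouchState() : framesHeld(0) {} };

bool updateTouch(TouchState& ts, double distToThis, double distToNearestOther,
                 double tolerance, int shortFrames, int longFrames, bool& longPush)
{
    if (distToThis < tolerance && distToThis < distToNearestOther)
        ++ts.framesHeld;
    else
        ts.framesHeld = 0;
    longPush = (ts.framesHeld >= longFrames);
    return ts.framesHeld >= shortFrames;           // a (short) push event is recognized
}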
4.2. Functional behavior simulation
The functional simulation of the product is completed as
follows: when the user creates an input event with AR-based
tangible objects, the AR engine checks if the event is related to the
functional behavior of the product or not. If so, the FSM module
refers to the HMI functional model of the product and determines if
the event triggers a state transition. If the state transition is
confirmed, the FSM module does the specified actions and changes
the state to a new one, and performs the activities of the new state
(see Fig. 3). Otherwise, it keeps conducting the activities of the
current state. These actions and activities include tasks such as, for
example, changing the position and orientation of buttons and
switches, playing or pausing MP3 music, turning on or off the lamp,
increasing or decreasing the volume. The execution of the actions
and activities yields state-specic visual and auditory data. The
state-specic visual data are sent to the AR engine to update the
visual image of the product, and the state-specic auditory data are
sent directly to auditory output devices.
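One possible way to keep this separation of outputs explicit (a hypothetical interface, since the paper does not describe its modules at this level of detail) is to let action and activity statements talk to a small output interface whose visual calls go to the AR engine and whose audio calls go straight to the sound device:

#include <string>

// Hypothetical output interface used by action/activity statements.
class OutputSink {
public:
    virtual ~OutputSink() {}
    virtual void showLcdImage(const std::string& jpegFile) = 0;   // AR engine swaps the LCD texture
    virtual void playAnimation(const std::string& aviFile) = 0;   // AR engine plays an AVI on the LCD
    virtual void playSound(const std::string& soundFile) = 0;     // routed directly to the speakers
    virtual void stopSound() = 0;
};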

5. Implementation and application

Based on the proposed approach, a product design evaluation system has been implemented in the C and C++ languages on a Windows-based IBM-compatible personal computer. As input devices, we used two types of AR-based tangible objects and a PC camera with 640 x 480 resolution. As output devices, we used an LCD monitor and a pair of speakers. VR-oriented devices such as an HMD can also be integrated into the AR environment. Fig. 9 shows the system environment.
For product model generation, we used CAD software called Rhino3D version 3.0 and RE software called RapidForm version 2004. We stored the HMI functional models of digital handheld products into markup language-based text files and used them in the FSM module. Graphical images, video animations, and audio sounds required as multimedia contents data were acquired by recording. The AR engine is based on ARToolKit [17] and includes additional modules for visualization, I/O interface handling, sound play, LCD image display, and environment parameter setting. ARToolKit was used for camera calibration, marker recognition, and 3D object augmentation. The LCD image display module is used to create state-specific images which appear on the LCD display of each product [15,16]. For the visualization module, we used OpenGL and GLUT as graphics libraries. For the sound play module, we used DirectShow for MP3 decoding. We developed the other modules by writing our own source code.

Fig. 9. System environment for tangible augmented prototyping.
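The augmentation step can be sketched as follows. It assumes a 3x4 marker-to-camera transform such as the one produced by ARToolKit's marker tracking (converting it into a column-major 4x4 modelview matrix is essentially what ARToolKit's OpenGL utilities do), and drawProductMesh() is an assumed helper standing in for the actual mesh and LCD-texture rendering module.

#include <GL/gl.h>

void drawProductMesh();   // assumed helper: issues the OpenGL calls for the product mesh

// Convert a row-major 3x4 marker transform into the column-major 4x4 matrix
// expected by OpenGL.
static void toGlMatrix(const double T[3][4], double gl[16])
{
    for (int col = 0; col < 4; ++col) {
        for (int row = 0; row < 3; ++row)
            gl[col * 4 + row] = T[row][col];
        gl[col * 4 + 3] = (col == 3) ? 1.0 : 0.0;
    }
}

// Draw the virtual product in the coordinate frame of its AR marker.
void drawAugmentedProduct(const double markerT[3][4])
{
    double gl[16];
    toGlMatrix(markerT, gl);
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
    glMultMatrixd(gl);       // pose of the marker relative to the camera
    drawProductMesh();
    glPopMatrix();
}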


Figs. 10 and 11 show the virtual product models (i.e. virtual prototypes) of an MP3 player and a game phone in four different states and their operations in the tangible augmented prototyping environment, respectively. The MP3 player is made by iRiver and the game phone is made by LG Electronics. In some states, video animations are displayed and sounds (i.e. button click or music sounds) are played. While performing the design evaluation of any product using the system, users may detect problems in the overall appearance, the assembly structure (if the prototyping system is elaborated further), or the functional behavior of the product. In such cases, product designers correct the problems and update the product model or the HMI functional model (i.e. feedback occurs during the iterative process in Fig. 1). The users and the product designers can promote the product design and development by repeating the product design evaluation in this manner.
6. Preliminary user study
Usability testing is used for ensuring that the intended users of a product can carry out the intended tasks efficiently, effectively and satisfactorily. In usability testing, users are asked to perform certain tasks in an effort to measure the product's ease-of-use, task time, and the users' perception of the experience [1,32,33]. To investigate the usefulness and quality of the proposed approach, we carried out a preliminary user study of the MP3 player and the game phone with a subject group consisting of 10 university students. Of the 10, 8 had learned the basics of 3D geometric modeling from CAD/CAM courses. Simple task performance measures and
questionnaires were used to evaluate the approaches using four
different virtual prototypes: traditional 2D screen prototypes
(2DSCR), 3D stereoscopic prototypes (3DSTR), 3D augmented
prototypes (3DAR), and 3D tangible augmented prototypes
(3DTAR). As we have built the virtual prototypes from commercial
products based on reverse engineering, we could include the use of
real products (REAL) in the tests as the target reference of the
prototyping approaches. Obviously, using a real product is the best
but most costly approach to its design evaluation.
As shown in Fig. 12, 2D screen prototypes present front and/or
side views of products, but the image of the views usually has a
limited resolution. 2D screen prototypes were built by combining
the images of views with multimedia contents data, functional
models, and FSMs. These 2D screen prototypes allow users to
perform the image-based functional simulation by clicking buttons
on the views. 3DTAR corresponds to the approach proposed in this
paper and 3DSTR corresponds to the VR-based prototyping
approach proposed by Park et al. [15,16]. Similar to the 3DTAR
approach, each virtual model in 3DSTR also consists of a product
model, multimedia contents data, a functional behavior model, and
an FSM.

Fig. 10. MP3 player in four different states: (a) MP3 Play; (b) Mode Select; (c) FM Radio; (d) Hold; and (e) its tangible augmented prototyping.


Fig. 11. Game phone in four different states: (a) On; (b) Calling; (c) Multimedia menu; (d) Movie; and (e) its tangible augmented prototyping.

Fig. 12. 2D screen prototypes for (a) the MP3 player and (b) the game phone.


Fig. 13. VR-based prototyping of (a) the MP3 player and (b) the game phone.

As shown in Fig. 13, a user using 3DSTR wears an HMD and


interacts with the virtual model by using some keys and a mouse to
experience the realistic appearance and functional behavior of a
product. 3DAR is the same as 3DTAR except that, instead of
tangible objects made by RP and paper-based modeling, traditional
AR markers shown in Fig. 14 are used for the augmentation of 3D
virtual objects and the interaction between users and products.
3DAR was included in order to show the advantages that AR-based
tangible objects have over traditional AR markers in aspects such
as tangibility and ease-of-use.
Note that a 3D screen prototype (3DSCR) working on a single
monitor screen without an HMD can be considered as an alternative
approach. However, 3DSCR has been compared with 3DSTR in the
literature (for example, see [15,16,32]), and it was known that
3DSCR and 3DSTR show similar results with some tradeoffs between
them. As advanced HMDs have been introduced, 3DSTR produces
better results in many aspects than 3DSCR. Based on this rationale,
3DSCR was not included in the user study of the paper.
In order to evaluate the task performance of the prototyping
approaches, we asked the subjects to complete two tasks for the
MP3 player and three tasks for the game phone. Details of the tasks
are described in Table 1. We first introduced the subjects to the four kinds of prototypes (2DSCR, 3DSTR, 3DAR, 3DTAR) of the two products for 15 min. Each subject was given some time (about 20 min) to learn how to manipulate them (i.e. how to click buttons, how to move, scale, and rotate 3D prototypes, and how to manipulate tangible objects or AR markers). The subject was then asked to conduct the five tasks using the four prototypes in the following order: {2DSCR, 3DSTR, 3DAR, 3DTAR} -> REAL. For each prototype, the tasks were assigned to the subject in the following order: T1 -> T2 -> T3 -> {T4, T5}. The prototypes and the tasks in braces {} were ordered randomly. This random ordering was used to minimize the learning effect that occurred during the repetitive

tasks. Before performing each task, the subject could have access to
a simple graphical manual describing the steps required to
complete the task. The results of the performance measures are
summarized in Table 1 and plotted in Fig. 15.
After completing all the tasks, each subject was asked to fill in questionnaires in order to capture qualitative aspects (i.e. understandability of functions, ease-of-use, tangibility, sense of realism, legibility) of his or her experience of the four prototyping approaches (2DSCR, 3DSTR, 3DAR, 3DTAR). Verlinden et al. and Park et al. performed similar qualitative comparisons of their virtual prototyping approaches [16,33]. The questions asked are summarized in Table 2. All responses were scored on a five-point scale and each question included a field to add some comments. Fig. 16 shows the analysis results of the questionnaires collected from all the subjects.
From the results of task performance, we found that the
overall ranking of task performance was as follows:
REAL > 3DSTR > 2DSCR > 3DTAR > 3DAR. We found that the ease
of button clicks is the most dominant factor affecting the task
performance. Button clicks are done with some keystrokes and the
mouse in 2DSCR and 3DSTR, with simple AR markers in 3DAR, and
with AR-based tangible objects in 3DTAR. Using the keystrokes and
the mouse was faster and easier than using the AR-based markers
or tangible objects. Especially, all the subjects felt severe
inconvenience in clicking buttons with traditional AR markers.
When they tried to click buttons with simple AR markers, they
often encountered confusing situations in which 3D virtual objects
overlapped each other (i.e. the pointer object penetrated the
product object). This can explain why the 3DAR approach required
much more time to complete the tasks than the others. On the
other hand, the subjects mentioned that they did not experience
any object overlap and penetration and felt the sense of touch
when using tangible objects in the 3DTAR approach.

Fig. 14. AR-based prototyping: (a) simple AR markers; (b) manipulation of the MP3 player; and (c) manipulation of the game phone.


Table 1
Task descriptions and performance measures. The average time required to complete each task (in s) is given for the 2DSCR, 3DSTR, 3DAR, and 3DTAR prototypes and for the real product (REAL).

MP3 Player
(T1) Play the 3rd MP3 music with volume level 30.
Task steps: 1. Turn on the MP3 player. 2. Play the music. 3. Move to the 3rd music. 4. Increase the volume up to 30 levels. 5. Hold on the buttons (move hold slider to the right).
Average time (s): 2DSCR 6.5; 3DSTR 5.9; 3DAR 42.4; 3DTAR 9.2; REAL 5.3.

(T2) Play the 2nd FM radio station with volume level 20.
Task steps: 1. Hold off the buttons (move hold slider to the left). 2. Enter the menu. 3. Select the FM radio submenu to play FM radio. 4. Move to the 2nd FM radio station. 5. Decrease the volume down to 20 levels.
Average time (s): 2DSCR 8.8; 3DSTR 8.1; 3DAR 45.8; 3DTAR 13.0; REAL 7.5.

Game Phone
(T3) Make a call to the 1st tester.
Task steps: 1. Turn on the game phone. 2. Enter the phone number by using number buttons. 3. Press call button.
Average time (s): 2DSCR 10.0; 3DSTR 7.7; 3DAR 47.8; 3DTAR 13.3; REAL 6.4.

(T4) Play an animation.
Task steps: 1. Enter the main menu. 2. Move to the multimedia submenu (press 6 button). 3. Select the 2nd bin for animation files. 4. Play the animation by pressing OK button.
Average time (s): 2DSCR 9.6; 3DSTR 7.3; 3DAR 42.4; 3DTAR 10.3; REAL 6.7.

(T5) Change the caller ring into the 3rd one.
Task steps: 1. Enter the main menu. 2. Move to the caller ring submenu (press 2 button). 3. Select the 3rd one. 4. Set it to the current caller ring by pressing OK button.
Average time (s): 2DSCR 9.0; 3DSTR 6.6; 3DAR 35.2; 3DTAR 8.8; REAL 5.6.

We also found that the complexity of button layout and the visibility of prototypes affect the task performance. The MP3 player has only a few buttons, but their layout in the 2D screen prototype is rather confusing as the buttons were distributed into three views (one front view and two side views). This often caused the subjects to make mistakes when picking buttons. The game phone has over 30 buttons whose layout is rather complex, and
some buttons are small and compact. Moreover, the visibility of the 2D screen prototype in either case is not good. Some subjects commented that they felt inconvenience and made mistakes when clicking buttons with the mouse pointer in the 2DSCR approach. In the 3DSTR approach, subjects can go close to the prototypes to see them in finer detail. This helps the subjects click buttons more easily and accurately. As Kuutti et al. pointed out [32], it might be unnatural to move closer to an object than where the eye can be focused. Nonetheless, it must be one of the advantages of using 3D virtual prototypes that they can be manipulated (moved, scaled, and rotated) freely in a VR environment. In the AR-based approaches, subjects can also have a close look at the prototypes as long as the AR markers associated with the prototypes are captured and identified. The visibility of prototypes in the AR-based approaches is not better than that in the 3DSTR approach since the resolution of the PC camera is lower than that of the HMD. On the other hand, some subjects without experience of using an HMD commented that in the 3DSTR approach they had some inconvenience in viewing the 3D prototypes and felt unnatural weight on their heads while wearing the HMD.
From the analysis results of the questionnaires, we found advantages of the 3D-based approaches over the 2D-based approach in most aspects, significant advantages of 3DTAR over 3DAR, and some trade-offs between 3DSTR and 3DTAR. As the same HMI functional models were integrated into all four kinds of prototypes, there were no significant differences between the four approaches in understandability of the product functions. The 3D-based (3DSTR, 3DAR, 3DTAR) approaches gave the subjects a better sense of realism than the 2DSCR approach.
Fig. 15. Graphical plots of task performance measures of (a) AR-based approaches and (b) four approaches.

Table 2
Questionnaire contents (translated from Korean).
Q1 Can you understand the product functions by using the virtual prototype?
Q2 Is it easy to click buttons when using the virtual prototype?
Q3 Does the virtual prototype make you feel as if you push real buttons?
Q4 Does the virtual prototype look like the real product?
Q5 Can you figure out the size of the product with the virtual prototype?
Q6 Can you feel a three-dimensional effect when using the virtual prototype?
Q7 Is the liquid crystal display (LCD) of the product legible?
Q8 Do you think the use of the virtual prototype is interesting?
Q9 Does the virtual prototype offer enough information for a decision to buy the product?


Fig. 16. Graphical plots of questionnaire results for all subjects.

Regarding the three-dimensional effect, the subjects perceived


depth the most vividly in the 3DSTR approach. As these
approaches allow the subjects to go close to the prototypes
and to see the display of the products more clearly, they got
higher points than the 2DSCR approach in legibility. The 3DAR
approach received the lowest points in the ease of button clicks as
mentioned above. The 3DAR and 3DTAR approaches showed
similar results except that the 3DTAR approach received much
higher points than the 3DAR approach especially in tangibility
and the ease of button clicks. The 3DSTR and 3DTAR approaches
received higher points than the 2DSCR approach in most aspects,
but we found some tradeoffs between the two: 3DTAR received
lower points than 3DSTR in the ease of button clicks but higher
points in the tangibility and the sense of realism. Some of the
subjects felt some difficulty in figuring out the size of the product
with the 2DSCR and 3DSTR approaches. However, all the subjects
could easily estimate the product size in the AR-based
approaches since they felt like holding the product with their
hands (especially in 3DTAR). Most of the subjects answered that
3D-based prototypes were more appealing to customers than the
2D screen prototype.
Some subjects pointed out that they sometimes felt confusion and difficulty in their tasks when the virtual objects disappeared in the AR-based approaches. This problem usually occurs when some AR markers fail to be recognized due to bad lighting conditions or occlusions of the markers by the users' hands. Some subjects expressed inconvenience as they had to watch the LCD screen (not their hands) while manipulating tangible objects or AR markers. Nonetheless, most of the subjects felt that it was very nice to touch and grasp the prototypes in the 3DTAR approach. Some subjects commented that it would be better if they could click buttons with their fingers, not with the pointer-type tangible object. They also added that the 3D-based prototypes, if they can be accessed via the Internet, will draw great interest from a number of remote customers. Based on the preliminary user study, we found that the subjects' feedback about the 3DTAR approach was encouraging since it could provide them with more tangibility and a better sense of realism while allowing them to experience the functional behavior and the visual appearance of the product easily and vividly.
7. Concluding remarks and future work
Functions represent what a product does to satisfy customers. These functions usually differ from product to product. Generally, there is no prototyping tool that can make a prototype represent all the functions of every kind of product. It is common to use various prototypes, each of which can represent specific functions of a group of target products.

In this paper, we have proposed a novel approach to virtual prototyping of digital handheld products, which is called tangible augmented prototyping. The primary function of these products is mostly considered as sending users visual or auditory information in response to user inputs. The proposed approach is aimed at generating and utilizing prototypes that can represent not only the primary function but also other functions such as presenting an aesthetically pleasing shape and maintaining a sound overall structure. In the approach, a product model, multimedia contents data, an HMI functional model, and an FSM are combined with AR-based tangible objects whose operations in an AR environment facilitate the tangible interaction, realistic visualization, and functional behavior simulation of a digital handheld product. We presented how to adopt rapid prototyping and paper-based modeling properly in building the AR-based tangible objects, and thereby to realize accurate and tangible interaction between the user and the product. We also suggested how to combine the forms, functions, and interactions of digital handheld products physically and virtually at the same time.
The AR-based tangible objects are composed of paper and RP models without any hardwired connection. If drawing files for paper models and STL files for RP models are sent (via the Internet), anyone can easily obtain the tangible objects by simple paper crafting and with the help of an RP service bureau. The AR environment described in this paper is easy to implement, available at low cost, and accessible to developers, stakeholders, and even consumers.
The paper model used as a pointing tool can be replaced by a
haptic device in the AR environment. With the haptic device, we
can enhance the sense of touch and improve the accuracy of button
clicks during design evaluation. However, using the haptic device
makes object manipulation rather uncomfortable due to increased
spatial constraints. It also increases hardware costs directly, which
significantly restricts the availability of the AR environment to
users.
Based on the proposed approach, a prototyping system has been developed and applied for the design evaluation of various digital products such as MP3 players and mobile phones, and it has obtained highly encouraging feedback from users. We found the potential of the prototyping system to serve as an important tool for design review and evaluation of digital handheld products. We also found that the proposed approach can be applicable to any product (or system) that satisfies the following guidelines:
- The primary function of the product is featured as sending users visual or auditory information in response to user inputs.
- Users can create HMI inputs by touching, moving, or pressing the specific components of the product.
- The product has a flat region in which an AR marker can be placed.
- The tangible physical model (i.e. RP model) of the product can be made easily available at low cost.
These guidelines are loose enough that the approach is applicable to various kinds of products, including digital handheld products. We are currently expanding the application of the approach to various products (or even systems) used for entertainment, education, and training.
From the user study, the prototyping system revealed some points which guide the directions of our future research for improving the proposed approach. Firstly, we will improve compatibility between the manipulation and the viewing of tangible objects. In our present AR environment, the direction from the user's eyes to the LCD screen is different from the direction from the camera to an object of interest. This tends to make the user's cognitive process inconvenient. We expect to alleviate this problem greatly by adopting an optical or video see-through HMD system in the AR environment. Secondly, we will make the marker recognition module more robust to reduce the occasions where the virtual objects disappear during their augmentation. Thirdly, we will incorporate computer vision techniques to recover the image of real objects (e.g. fingers) occluded by the image of virtual objects and thereby make the visualization more natural and realistic. Fourthly, we will make user interaction more tangible by devising a picking mechanism with which the user can push or select buttons with his or her fingertips. Lastly, we will expand the system to run in web-based environments that allow easy access to AR-based product design evaluation via the Internet.
Acknowledgements
The authors are grateful to all the anonymous referees for their
helpful comments on this paper. This work was supported by the
Korea Research Foundation Grant (KRF-2008-013-D00152).
References
[1] K.T. Ulrich, S.D. Eppinger, Product Design and Development, McGraw Hill, New York, 2004.
[2] K. Lee, Principles of CAD/CAM/CAE Systems, Addison Wesley, Berkeley, 1999.
[3] M.J. Clayton, J.C. Kunz, M.A. Fischer, Rapid conceptual design evaluation using a virtual product model, Engineering Applications of Artificial Intelligence 9 (4) (1996) 439–451.
[4] H.J. Bullinger, R. Breining, W. Baucer, Virtual prototyping – state of the art in product design, in: Proceedings of the 26th International Conference on Computers & Industrial Engineering, 1999, pp. 103–107.
[5] S. Ottosson, Virtual reality in the product development process, Journal of Engineering Design 13 (2) (2002) 159–172.
[6] F. Zorriassantine, C. Wykes, R. Parkin, N. Gindy, A survey of virtual prototyping techniques for mechanical product development, Journal of Engineering Manufacture 217 (2003) 513–530.
[7] F. Dai, W. Felger, T. Fruhauf, Virtual prototyping examples for automotive industries, in: Proceedings of Virtual Reality World, 1996, pp. 1–13.
[8] M. Soderman, Virtual reality in product evaluations with potential customers: an exploratory study comparing virtual reality with conventional product representations, Journal of Engineering Design 16 (3) (2005) 311–328.
[9] G.C. Burdea, P. Coiffet, Virtual Reality Technology, John Wiley & Sons, USA, 2003.
[10] S. Jayaram, H.I. Connacher, K.W. Lyons, Virtual assembly using virtual reality techniques, Computer-Aided Design 29 (8) (1997) 575–584.
[11] N. Shyamsundar, R. Gadh, Collaborative virtual prototyping of product assemblies over the Internet, Computer-Aided Design 34 (2002) 755–768.
[12] Z. Siddique, D.W. Rosen, A virtual prototyping approach to product disassembly reasoning, Computer-Aided Design 29 (12) (1997) 847–860.
[13] S.K. Ong, L. Jiang, A.Y.C. Nee, An internet-based virtual CNC milling system, International Journal of Advanced Manufacturing Technology 20 (1) (2002) 20–30.
[14] S.H. Choi, A.M.M. Chan, A virtual prototyping system for rapid product development, Computer-Aided Design 36 (2004) 401–412.
[15] H. Park, C.Y. Bae, K.H. Lee, Virtual prototyping of consumer electronic products by embedding HMI functional simulation into VR techniques, Transactions of the Society of CAD/CAM Engineers 12 (2007) 87–94.
[16] H. Park, J.S. Son, K.H. Lee, Design evaluation of digital consumer products using VR-based functional behavior simulation, Journal of Engineering Design 19 (2008) 359–375.
[17] ARToolKit, http://www.hitl.washington.edu/ARToolKit.
[18] R.T. Azuma, A survey of augmented reality, Presence: Teleoperators and Virtual Environments 6 (1997) 355–385.
[19] H. Kato, M. Billinghurst, I. Poupyrev, K. Imamoto, K. Tachibana, Virtual object manipulation on a table-top AR environment, in: Proceedings of the International Symposium on Augmented Reality, 2000, pp. 111–119.
[20] M. Billinghurst, H. Kato, I. Poupyrev, Collaboration with tangible augmented reality interfaces, in: Proceedings of HCI International, 2001, pp. 234–241.
[21] J. Verlinden, A. de Smit, A.W.J. Peeters, M.H. van Gelderen, Development of a flexible augmented prototyping system, Journal of WSCG 11 (2003) 496–503.
[22] T.J. Nam, Sketch-based rapid prototyping platform for hardware-software integrated interactive products, in: Proceedings of the Conference on Human Factors in Computing Systems (CHI), 2005, pp. 1689–1692.
[23] W. Lee, J. Park, Augmented foam: touchable and graspable augmented reality for product design simulation, Bulletin of Japanese Society for the Design Science 52 (2006) 17–26.
[24] G.M. Bochenek, J.M. Ragusa, L.C. Malone, Integrating virtual 3-D display systems into product design reviews: some insights from empirical testing, International Journal of Technology Management 21 (2001) 340–352.
[25] S. Greenberg, C. Fitchett, Phidgets: easy development of physical interfaces through physical widgets, in: Proceedings of the ACM UIST, 2001, pp. 209–218.
[26] B. Hartmann, S.R. Klemmer, M. Bernstein, N. Mehta, d.tools: visually prototyping physical UIs through statecharts, in: Proceedings of the ACM Symposium on User Interface Software and Technology (UIST), 2005.
[27] H. Ishii, B. Ullmer, Tangible bits: towards seamless interfaces between people, bits and atoms, in: Proceedings of the Conference on Human Factors in Computing Systems, 1997, pp. 234–241.
[28] T. Varady, R. Martin, J. Cox, Reverse engineering of geometric models – an introduction, Computer-Aided Design 29 (1997) 255–268.
[29] J. Rumbaugh, State trees as structured finite state machines for user interfaces, in: Proceedings of the ACM SIGGRAPH Symposium on User Interface Software, 1988, pp. 15–29.
[30] D. Harel, Statecharts: a visual formalism for complex systems, Science of Computer Programming 8 (1987) 231–274.
[31] H. Feng, DCharts, a formalism for modeling and simulation-based design of reactive software systems, Master's Thesis, McGill University, 2004.
[32] K. Kuutti, K. Battarbee, S. Sade, T. Mattelmaki, T. Keinonen, T. Teirikko, A. Tornberg, Virtual prototypes in usability testing, in: Proceedings of the 34th Hawaii International Conference on System Sciences, 2001, pp. 1–7.
[33] J. Verlinden, W. van den Esker, L. Wind, I. Horvath, Qualitative comparison of virtual and augmented prototyping of handheld products, in: Proceedings of the International Design Conference, 2004, pp. 533–538.
Hyungjun Park is an associate professor at the Department of Industrial Engineering, Chosun University, Korea. He received his BS, MS, and PhD degrees in Industrial Engineering from Pohang University of Science and Technology (POSTECH), Korea, in 1991, 1993, and 1996, respectively. From 1996 to 2001, he worked as a senior researcher at Samsung Electronics, Korea, where he was involved in developing commercial CAD/CAM software and in-house software for modeling and manufacturing aspheric lenses used in various optical products. Since 2001, he has been a faculty member of Chosun University. His current research interests include geometric modeling, virtual prototyping of engineered products, 3D shape reconstruction using reverse engineering, biomedical engineering, and CAD/CAM/CG applications.
Hee-Cheol Moon received his BS and MS degrees in
Industrial Engineering from Chosun University, Korea,
in 2005 and 2007, respectively. He is currently a PhD
student at Chosun University. His main research topic is
virtual prototyping of portable electronic products
using augmented reality and CAD/CAM techniques.

Jae Yeol Lee is an associate professor at the Department of Industrial Engineering, Chonnam National University, Korea. Before joining the faculty of Chonnam National University, he was a senior researcher on the Distributed Collaboration Technology Research Team at the Electronics and Telecommunications Research Institute (ETRI). He received his BS, MS and PhD degrees in Industrial Engineering from Pohang University of Science and Technology (POSTECH), Korea, in 1992, 1994, and 1998, respectively. His current research interests include collaborative virtual engineering, collaborative product commerce, and distributed computing for product development.
