
VIRTUAL REALITY AND VIRTUAL ENVIRONMENTS
INTRODUCTION

• Personal computers are now used for:
– home accounts
– desktop publishing
– letter writing
– student homework
– games
• Imaginative writing has long promised a synthetic 3D universe that is as believable as the real physical universe.
• Such virtual reality (VR) systems create a ‘cyberspace’ where it is possible to interact with anything and anyone on a virtual level.
• The key technologies behind such imaginative writing are real-time computer graphics, colour displays and simulation software.
• Computer graphics provides the basis for creating the synthetic images, while a head-mounted display (HMD) supplies the user’s eyes with a stereoscopic view of the computer-generated world.
• Complex software creates the virtual environment (VE), which could be anything from a collection of 3D objects to an abstract database.
• In the 1980s embryonic VR systems became commercially available, and on a wave of media hype one was led to believe that the future had already arrived.
• In the past few years, VR has matured considerably.
• New hardware platforms and software environments have appeared, supported by a young, dynamic, professional workforce.
• Now that the potential of VR systems has been appreciated, research organizations around the world have embarked upon projects to perfect the primary technologies and to investigate the complex issues of human factors, the human-computer interface and real-time hardware.
• Simultaneous with these developments, the term ‘virtual reality’ has been quietly set aside by some in preference for the term ‘virtual environment’ (VE) system.
Generic VR System
INTRODUCTION:
This chapter explains the elements, building blocks, roles and functions of a generic VR system.

A VR system has four system elements:

The VE
The Computer environment
VR Technology
Modes of Interaction
The VE:
• It covers ideas such as:
Model building
Introducing dynamic features
Physical constraints
Illumination
Collision detection
The Computer environment:
It includes:
Processor configuration
I/O channels
The VE database
The real-time operating system
VR Technology:
It encompasses
The hardware used for head tracking
Image display
Sound
Haptics
Hand-tracking
Modes of Interaction:
It involves:
Hand gestures
The 3D interface
Multi-participant systems
Virtual Environment:
• A VE can take many forms, such as a realistic representation of some physical environment.
• For example, the interior of a building or a kitchen, or even an object such as a car.
• Interactive 3D modeling software is used to construct such environments.
• A VE could be used to evaluate some physical simulation where the accuracy of physical behavior is more important than visual fidelity.
• Here a geometric database must be built to represent the environment, and stored such that it can be retrieved and rendered in real time when needed.
Virtual Object:
• We describe objects in the real world using a rich vocabulary of adjectives; similarly, many attributes are used to build virtual objects.
• A VR system provides a domain where the virtual world can be modelled, simulated, visualized and even experienced using an immersive display.
• A VE consists of a collection of objects and sources which are manipulated by animation and physical simulation procedures, while collision detection algorithms monitor collisions between specified objects.
• The state of the VE is influenced by input signals coming from the head tracker, hand tracker and speech channel, while the outputs from the system are in the form of visual, audio and haptic data channels.
• The elements present are:
• Static and dynamic features
• Physical constraints
• Level of detail
• Surface attributes
• Acoustics
Static and dynamic features:
Objects within a VE can be divided into two groups
Static
Dynamic
In the case of an architectural interior, the static features are the floor, walls, ceilings, stairs, etc., whereas a door whose position can change needs to be a dynamic feature.
Physical constraints:
• Dynamic objects may be defined without any constraints being placed upon their spatial behavior, whereas others can be physically constrained to move within prescribed limits.
• An object can be constrained to limited translations and rotations about the axes of its local frame of reference.
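Such a hinge-style constraint can be sketched as a clamp on the requested rotation; the class name and the 0-110 degree limits below are illustrative assumptions, not part of any particular VR toolkit:

```python
# Sketch: constraining a dynamic object's rotation about one local axis.
# The limits (a door hinge restricted to 0..110 degrees) are illustrative.

def clamp(value, low, high):
    """Restrict value to the closed interval [low, high]."""
    return max(low, min(high, value))

class ConstrainedRotation:
    def __init__(self, min_deg, max_deg):
        self.min_deg = min_deg
        self.max_deg = max_deg
        self.angle = min_deg  # current rotation about the local axis

    def rotate(self, delta_deg):
        # Any requested rotation is clamped to the prescribed limits.
        self.angle = clamp(self.angle + delta_deg, self.min_deg, self.max_deg)
        return self.angle

door_hinge = ConstrainedRotation(0.0, 110.0)
door_hinge.rotate(150.0)   # request exceeds the limit...
print(door_hinge.angle)    # ...so the door stops at 110.0 degrees
```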
Level of detail:
• The database can hold different levels of detail for specific objects.
• The real-time operating system automatically selects the model description that matches the current view and mode of operation.
• High levels of detail can be selected on a range basis, i.e. according to the object's distance from the viewer.
• Changes in the level of detail are controlled using dynamic levels of transparency.
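A minimal sketch of range-based LOD selection follows; the distance thresholds and model names are assumed for illustration:

```python
import math

# Sketch: range-based level-of-detail (LOD) selection. Each entry pairs
# a maximum viewing distance with a model description; the thresholds
# and model names are illustrative.
LOD_TABLE = [
    (10.0, "high-detail model"),
    (50.0, "medium-detail model"),
    (float("inf"), "low-detail model"),
]

def select_lod(viewer_pos, object_pos):
    distance = math.dist(viewer_pos, object_pos)
    for max_range, model in LOD_TABLE:
        if distance <= max_range:
            return model

print(select_lod((0, 0, 0), (3, 4, 0)))    # distance 5  -> high-detail model
print(select_lod((0, 0, 0), (30, 40, 0)))  # distance 50 -> medium-detail model
```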
Surface attributes:
• At some stage each object is defined together with its surface attributes.
• Colour parameters might be in the form of red, green and blue (RGB) components, or hue, saturation and value (HSV).
• Other attributes, such as texture maps, require scaling and orientation parameters.
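Both colour forms carry the same information and can be converted between; Python's standard colorsys module does this (all components normalized to the 0..1 range):

```python
import colorsys

# Hue, saturation, value -> red, green, blue (all components in 0..1).
r, g, b = colorsys.hsv_to_rgb(0.0, 1.0, 1.0)
print(r, g, b)  # 1.0 0.0 0.0 -- pure red

# And back again.
h, s, v = colorsys.rgb_to_hsv(1.0, 0.0, 0.0)
print(h, s, v)  # 0.0 1.0 1.0
```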
Acoustics:
• Sound plays an important role in virtual simulations.
• A collision could arise between two independent objects, in which case an accompanying sound could enhance the event with some suitable noise.
• With head-related transfer functions (HRTFs), the 3D location of the sound source is communicated to the sound hardware subsystem, which filters the sound signal to incorporate its spatial origin.
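A measured HRTF filter cannot be reproduced here, but one cue an HRTF encodes, the interaural time difference, can be approximated with the classic spherical-head formula; the head radius and speed of sound below are assumed values:

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, assumed
HEAD_RADIUS = 0.0875     # m, an assumed average head radius

def interaural_time_difference(azimuth_deg):
    """Approximate extra time (s) for sound to reach the far ear.

    Uses the classic spherical-head approximation
    ITD = (r / c) * (angle + sin(angle)); a real HRTF additionally
    filters the signal's spectrum.
    """
    a = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (a + math.sin(a))

# A source 90 degrees to one side produces the largest delay:
print(f"{interaural_time_difference(90.0) * 1e6:.0f} microseconds")
```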
Virtual Lights:
• A VE can be illuminated in various ways.
• When the radiosity algorithm has computed the discrete changes in light levels across the various surfaces, the VE database is prepared with this information.
• When extra computing power is available, it is possible to implement a complete illumination model incorporating several light sources.
• Some VR graphics systems are even able to compute the specular highlights that arise when moving about an environment.
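Such an illumination model typically combines a diffuse and a specular term per light source; the following is a minimal single-channel Phong-style sketch, not the renderer of any particular VR system:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def phong_intensity(normal, to_light, to_viewer, kd=0.7, ks=0.3, shininess=32):
    """Diffuse + specular intensity for one light (single channel)."""
    n, l, v = normalize(normal), normalize(to_light), normalize(to_viewer)
    diffuse = kd * max(0.0, dot(n, l))
    # Reflect the light direction about the normal, then compare with
    # the view direction to obtain the specular highlight.
    r = tuple(2 * dot(n, l) * nc - lc for nc, lc in zip(n, l))
    specular = ks * max(0.0, dot(r, v)) ** shininess
    return diffuse + specular

print(phong_intensity((0, 0, 1), (0, 0, 1), (0, 0, 1)))  # head-on: 0.7 + 0.3
```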
Physical Simulation:
• Computer animation has long been used to explore the simulation of various physical systems.
• Linked structures, fabrics, human motion and natural phenomena have all been animated with great accuracy.
• Traditionally, such dynamic simulations had to be animated on a stop-frame basis.
• Creating these simulations in real time is a challenge for VR systems.
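The real-time requirement means the simulation state must be advanced once per rendered frame rather than precomputed frame by frame; a minimal sketch using explicit Euler integration (the 30 Hz rate and restitution factor are assumptions):

```python
# Sketch: real-time simulation of a bouncing object via explicit Euler
# integration, stepped once per rendered frame.

GRAVITY = -9.81  # m/s^2
DT = 1.0 / 30.0  # assumed 30 Hz visual update rate

height, velocity = 2.0, 0.0
for frame in range(120):                      # four seconds of simulation
    velocity += GRAVITY * DT                  # integrate acceleration
    height += velocity * DT                   # integrate velocity
    if height < 0.0:                          # crude floor collision response
        height, velocity = 0.0, -velocity * 0.8  # lose energy on each bounce
print(f"height after 4 s: {height:.2f} m")
```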
Animation:
• In the case of certain animation sequences, a complex animation can be simulated without invoking extra procedures.
• The technique requires the animation to be incorporated into the database as a sequence of discrete key models.
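Given two key models stored in the database, an in-between model can be produced at run time by interpolating corresponding vertices; the shapes below are invented for illustration:

```python
# Sketch: interpolating between two discrete key models stored in the
# database. Each model is a list of (x, y, z) vertices; the shapes here
# are illustrative.

key_a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0)]
key_b = [(0.0, 0.0, 1.0), (1.0, 0.5, 1.0), (2.0, 1.0, 1.0)]

def interpolate(model_a, model_b, t):
    """Blend corresponding vertices: t=0 gives model_a, t=1 gives model_b."""
    return [
        tuple(a + t * (b - a) for a, b in zip(va, vb))
        for va, vb in zip(model_a, model_b)
    ]

print(interpolate(key_a, key_b, 0.5))  # the half-way in-between model
```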
Collision Detection:
• Objects can be enclosed within bounding volumes; if an intersection condition is detected between two bounding volumes, there is a chance that the enclosed objects intersect.
• To confirm this, the objects' individual surfaces are compared for an actual intersection. If one is found, the event is communicated to the real-time operating system and other actions are initiated.
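The two-stage test described above, a cheap bounding-volume check followed by an exact surface test only when necessary, can be sketched with axis-aligned bounding boxes:

```python
# Sketch: broad-phase collision test using axis-aligned bounding boxes
# (AABBs). Only if the boxes overlap is the expensive surface-level
# intersection test worth running.

def aabb_overlap(box_a, box_b):
    """Each box is ((min_x, min_y, min_z), (max_x, max_y, max_z))."""
    (a_min, a_max), (b_min, b_max) = box_a, box_b
    return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i] for i in range(3))

box1 = ((0, 0, 0), (2, 2, 2))
box2 = ((1, 1, 1), (3, 3, 3))
if aabb_overlap(box1, box2):
    # The enclosed objects *might* intersect; compare their actual
    # surfaces and, if they touch, notify the real-time operating system.
    print("possible collision - run exact surface test")
```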
User Inputs:
• The basic user input signals consist of the position and
orientation of the head and hand.
Head Position and Orientation:
• This is a vital input signal to the system.
• It determines the viewpoint from which the virtual world is rendered.
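In essence, each frame the tracker's reported pose is inverted to obtain the viewing transform; the sketch below shows only translation plus yaw (a real system uses all six degrees of freedom, and sign conventions depend on the chosen coordinate system):

```python
import math

# Sketch: deriving a view transform from the head tracker's position and
# yaw angle. A full system would use 4x4 matrices or quaternions.

def world_to_view(point, head_pos, head_yaw_deg):
    """Express a world-space point in the viewer's frame of reference."""
    # Translate so the head is at the origin...
    x = point[0] - head_pos[0]
    z = point[2] - head_pos[2]
    # ...then rotate by the inverse of the head's yaw.
    a = math.radians(-head_yaw_deg)
    return (x * math.cos(a) - z * math.sin(a),
            point[1] - head_pos[1],
            x * math.sin(a) + z * math.cos(a))

# An object 1 m in front of a viewer looking straight down +z:
print(world_to_view((0.0, 1.6, 1.0), (0.0, 1.6, 0.0), 0.0))  # (0.0, 0.0, 1.0)
```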
Hand Position and Orientation:
• This is used to interact with the VE.
• When an interactive glove is used, hand gestures can be used to initiate or terminate system processes.
System Output:
The output signals from the VR system carry
Visual information
Audio information
Haptic information
Visual Signals:
• The primary output from the system is a visualization of the VE displayed upon the user's HMD.
• This may comprise a single view displayed upon one display device.
Audio Signals:
• The audio signal is derived from the acoustic subsystem for simulating sound events in the VE.
• These signals allow the user to localize sounds in space using headphones.
Haptic Signals:
• These carry signals describing forces to be transmitted back to the user's body.
• This assumes that the user is either wearing some sort of glove, holding a reactive joystick or is attached to some force-feedback device.
THE COMPUTER ENVIRONMENT
• Input Channels:
Two primary input channels are used here:
1. Head
2. Hand
The head data is used to control the dynamic behavior of the VO (virtual observer) and the VE.
The hand data supplies the position and orientation of a 3D mouse.
• Output Channels:
Two primary output channels are used here:
1. Graphics
2. Sound
– The graphics channel provides two independent images for the user's left and right eyes, so that a binocular view of the scene is perceived.
– The sound channel provides the user with acoustic feedback relating to events taking place at a virtual level.
• The VE database
– The VE database describing the virtual objects is stored on disk and loaded into main memory whenever required.
– The loading of this data may take only a few seconds, but when large numbers of texture maps are required the load can take considerably longer.
• Run-time services
– The real-time run-time services are central to any VR system, because their task is to coordinate all of the other system components and make them behave coherently.
VR TECHNOLOGY
• VR technology embraces all of the hardware
utilized by the user to support a VR task.

• It includes:
Head-mounted displays
Head-coupled displays
3D interactive devices
Headphones and 3D trackers
• 3D trackers
– Various techniques have been developed to monitor the position and orientation of objects in 3D space, but the method that has been used in VR systems for a number of years works electromagnetically.
– A stationary transmitter radiates electromagnetic signals that are intercepted by a mobile detector attached to the user's head.
– The 3D tracking system should operate at least at the visual update rate, with the lowest possible latency.
– The active range of electromagnetic trackers is restricted to 1-2 m, which confines the VR user to a relatively small physical floor space in which to navigate the VE.
• Head-mounted displays
– An HMD isolates the user from the real world and substitutes binocular views of the VE.
– One approach to HMD design involves two small flat-panel liquid crystal displays, in front of which optics are mounted to provide a comfortable focal point.
• 3D mice
– A 3D mouse is employed by the user to direct an icon in the user's 3D graphic interface. Its position and orientation are monitored in a similar way to the user's head.
– The mouse also has various buttons whose status is continuously sampled and used to signal the real-time operating system, for example to move forwards or backwards within the VE.
• Gloves
– The glove developed by VPL can monitor the status of the user's fingers.
– This is achieved through the use of thin fibre-optics attached to the back of the glove's fingers.
• Headphones
– One approach to developing a virtual 3D sound stage is through the use of head-related transfer functions (HRTFs) measured in our ear canals.
– This data encodes how sound waves interact with our ears, and characterizes how we exploit signal propagation delays between our two ears to localize sound sources.
• Haptic devices
– Haptics encompasses the rich sensory information we glean from holding an object. This includes the ability to discern surface textures as well as attributes such as temperature, wetness, stickiness, tension and force.
Modes of Interaction
• Computers are interactive machines and some of the interactive
devices include paper-tape readers, punched-card readers,
character recognition units, visual display units, touch screens,
light pens, joysticks, thumbwheels, pressure-sensitive pens,
tablets, digitizers, microphones and, recently, all the technology
associated with VR.
• One example of such an interface is found on Quantel's Paintbox, where a cordless pressure-sensitive pen is used to direct a real-time painting system.
• Using natural hand gestures, the user selects screen-based menus from which colors are mixed and drawing modes selected.
• Pen pressure is used intuitively to control color density and line thickness.
Immersive interaction

• A typical VR session begins by loading a VE from disk into the host computer's memory. The initial view is determined by the resting position of the HMD relative to the stationary tracking system.
• Flying
• As the user is physically restrained by the HMD's cable harness connecting it with the host computer, navigation of the VE is achieved by instructing the system to 'fly' the user's viewpoint in a specific direction.
• This is communicated to the OS via the 3D mouse or a glove, which identifies both the direction of travel and when to start and stop the flight.
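A sketch of the flight mechanism: while the device's button signals travel, the viewpoint advances along the indicated direction each frame; the speed and update rate below are assumed values:

```python
# Sketch: 'flying' the viewpoint through the VE. Each frame, while the
# 3D mouse button is held, the viewpoint advances along the direction
# the device points.

FLY_SPEED = 2.0   # metres per second, assumed
DT = 1.0 / 30.0   # assumed 30 Hz update

def fly_step(viewpoint, direction, button_pressed):
    if not button_pressed:
        return viewpoint  # flight stops when the button is released
    return tuple(p + d * FLY_SPEED * DT for p, d in zip(viewpoint, direction))

pos = (0.0, 1.6, 0.0)
forward = (0.0, 0.0, 1.0)       # unit vector from the 3D mouse's orientation
for _ in range(30):             # one second of flight
    pos = fly_step(pos, forward, button_pressed=True)
print(pos)                      # moved ~2 m along +z
```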
• Teleporting
– Flying can be time-consuming and can induce motion sickness; teleporting avoids both problems.
– In flight simulation, the simulator is relocated
to a new position by supplying the OS with
the longitude, latitude, height and heading.
– The user identifies the new location by
pointing to the model and is teleported there
by the system.
• Object Picking
– A 3D mouse or an interactive glove can be used to explore
and interact with any of the objects that have been assigned
dynamic properties.
– If the VE contains a car, the car doors can be given dynamic
properties and constrained to allow them to rotate about
their hinges through a specified angle.
– The user opens a door by moving the icon of the 3D mouse
towards the position of the door’s handle.
– When a collision is detected, the user confirms the selection
of the door by activating a button on the mouse. This is
called picking.
– If the VE contains objects that are totally unconstrained, the
user can ‘pick’ them using the mouse icon.
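Picking can be sketched as a proximity test between the mouse icon and the handle, confirmed by a button press; the pick radius and positions below are illustrative:

```python
import math

# Sketch: picking a door by moving the 3D mouse icon to the handle and
# confirming with a button press.

PICK_RADIUS = 0.05  # metres, assumed

def try_pick(icon_pos, handle_pos, button_pressed):
    close_enough = math.dist(icon_pos, handle_pos) <= PICK_RADIUS
    return close_enough and button_pressed  # collision + confirmation

handle = (1.20, 0.95, 0.40)
icon = (1.22, 0.94, 0.41)
if try_pick(icon, handle, button_pressed=True):
    print("door selected - it may now rotate about its hinge")
```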
• Gesture Recognition
– Hand gestures appear to be an excellent VR interface;
simply by tracking the user’s hand position and finger
joint angles it is possible to give commands through an
accepted vocabulary of signs.
– Where gesture accuracy is vital, the system can be taught to recognize the individual characteristics of a user by having the user repeat gestures until an average profile is obtained.
– Whenever the user makes a gesture, this table of data is automatically scanned to identify what action to take.
– Neural nets can also be used effectively to recognize hand gestures.
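The table-scanning approach can be sketched as a nearest-profile match over finger-joint angles; the gesture vocabulary and tolerance below are invented for illustration:

```python
# Sketch: recognizing a hand gesture by scanning a table of averaged
# profiles. Each profile is a tuple of finger-joint angles (degrees).

GESTURE_TABLE = {
    "fist":  (90.0, 90.0, 90.0, 90.0, 90.0),
    "point": (10.0, 90.0, 90.0, 90.0, 90.0),
    "open":  (5.0, 5.0, 5.0, 5.0, 5.0),
}
TOLERANCE = 20.0  # maximum average per-joint error, degrees (assumed)

def recognize(joint_angles):
    best, best_err = None, float("inf")
    for name, profile in GESTURE_TABLE.items():
        err = sum(abs(a - b) for a, b in zip(joint_angles, profile)) / len(profile)
        if err < best_err:
            best, best_err = name, err
    return best if best_err <= TOLERANCE else None

print(recognize((12.0, 85.0, 92.0, 88.0, 91.0)))  # -> 'point'
```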
• The user interface
– For the 3D VR interface, Bryson defines his own
principles of use as follows:
– Each object should have its own virtual controllers
associated with it.
– The meaning of a hand gesture depends on which object the hand is touching.
– Avoid global environment control directly through the
hardware interface.
– Hide virtual controls as much as possible, but make
them easy to get at.
• Through- the-window systems
– It is sometimes argued that if a VR system is not immersive, it is not VR.
– In the case of desktop screens, the viewer's head position and orientation can be used to control the viewpoint of the computer-generated image.
– This creates the impression of looking into a window, beyond which exists some volume of space. These systems are called 'through-the-window' and require only a graphics workstation and a simple head-tracker.
• Non-immersive VR systems
– It cannot be denied that 3D immersion transcends all other modes of human-computer interaction.
– Recreating object constancy, together with all of the other visual cues found in illumination, texture, shadows and so on, reinforces our acceptance of the VE as a believable experience.
– Nevertheless, the non-immersive approach to 3D interaction is just as valid as any other form of interaction.
• Multi-participant system
– The ability to support multiple participants in a logical and scalable way is a key issue for future VR systems.
– It is possible to network a group of people
such that they can interact with one another
in a virtual domain, and be aware of one
another in their interaction.
VR systems
• As VR technology evolves, totally new system approaches will appear. These might include multimedia interfaces, communication networks and hybrid configurations.
