
Echology: An Interactive Spatial Sound and Video Artwork

Meghan Deutscher (1), Reynald Hoskinson (1), Sachiyo Takahashi (2) and Sidney Fels (1,2)

(1) Dept. of ECE, UBC, Vancouver, BC, Canada, +1 604 822-5338, {deutscher, hoskinson, ssfels}@ece.ubc.ca
(2) Media and Graphics Interdisciplinary Centre (MAGIC), UBC, Vancouver, BC, Canada, +1 604 822-8990, taksaci@telus.net

ABSTRACT

We present a novel way of manipulating a spatial soundscape, one that encourages collaboration and exploration. Through a table-top display surrounded by speakers and lights, participants are invited to engage in peaceful play with Beluga whales shown through a live web camera feed from the Vancouver Aquarium in Canada. Eight softly glowing buttons and a simple interface encourage collaboration with others who are also enjoying the swirling Beluga sounds overhead.

Categories and Subject Descriptors


J.5 ARTS AND HUMANITIES

General Terms
Design, Experimentation, Human Factors

Keywords
Mediascape, sound spatialization, interactive art, Beluga whale

1. INTRODUCTION

The playful, graceful motions of Beluga whales swimming in water create a mesmerizing motion space to watch. These highly communicative creatures use a variety of vocalizations, physical expressions and physical contact in their navigation, social interaction and survival. Beluga whales have fatty structures on the top of their heads; it is hypothesized that these structures, called melons, act as acoustical lenses for focusing and directing sounds for echolocation. We have created an interactive sound and video installation for participants to play with directional sounds initiated by the playful movement of Beluga whales in the water captured with a single live webcam provided by the Vancouver Aquarium (www.vanaqua.org/belugacam/index.html). The piece allows participants to experience a sonic aquarium while feeling linked to the live, organic and mesmerizing movements of the Beluga whales. Our artistic motivations stem from creating an interactive soundscape that provides a representation of the space the Belugas inhabit. Figure 1 depicts our initial concept of bringing the Belugas into an atrium space to create an interactive soundscape. The single webcam feed provided by the Vancouver Aquarium provides an effective means to bring the visual elements of the Beluga whales into the piece as well as driving the sound space using video processing. The amount of motion resulting from processing the video is mapped to the amplitudes of several sound sources. The sounds of the Beluga whales, recorded over 3 years using underwater microphones, are remixed to provide the base sounds that are spatialized interactively. Enjoyment, understanding and engagement are key elements of the expressive meaning of the interactive installation.

Figure 1: Beluga webcam, sketch of the interaction space concept, and artist's rendition of the atrium space for the installation.

While our table-top interactive installation was created as an artwork, the exploration of the table-top interaction space contributes to the understanding of how people work together and communicate in a shared interactive space. We reflect upon our observations about how our design decisions facilitated social interaction using imagery linked to a spatial sound space. We noticed that people developed a sense of ownership of part of the shared interface when they were standing around it. They used this territory to collaborate with other people around the installation to play with the sounds. The blend of nature-oriented video and spatialized sound used in the installation helped people standing around the exhibit to overcome awkward moments when there was silence during discussion around the table. These awkward moments were avoided by providing an unobtrusive, pleasing media experience when participants focus their attention on the work, without being distracting while they are in conversation. These observations are related to research in human-computer interaction (HCI) and computer-supported cooperative work (CSCW) using shared displays.

We discuss our work in the context of other table-top interactive works before describing details of how the installation works. We exhibited the work from May 26-28, 2005 at the New Interfaces for Musical Expression interactive installation program (NIME'05, www.nime.org) and report on our observations of participants' experiences.
2. RELATED WORK

Our interactive installation uses a combination of a table-top interaction environment, web camera and spatialized musical sounds as its main technical elements and whales as our main theme. Each element by itself has formed the basis of other interactive art installations; however, the integration of the different media in our piece provides a unique expressive experience. Artworks based on wildlife are very popular in traditional forms such as painting, sculpture, music, movies, computer animation and performance. Humans' attraction to animals is exploited in computer interfaces such as Microsoft's Bob, the dog icon used in Softwork's Fetch program and many games such as Nintendo's Nintendog™ (http://www.nintendog.com/) and Sunstorm Interactive's Deer Hunter™. Interactive artworks that use wildlife forms are still relatively new, but have appeared in works such as Alpha Wolf [1] and A-Volve [2]. In Alpha Wolf, participants interact with a synthetic wolf that is simulated using an artificial intelligence engine. The representations and processes involved in the formation of hierarchical social structures provide the basis of the interactive experience. A-Volve uses the simulation of genetic processes to drive the simulation of swimming creatures who live and die in a virtual fish-tank. Participants create the genetic codes that are converted to a unique creature.

Table-top interaction surfaces have formed the basis of interactive artworks. Toshio Iwai's Composition on the Table [3] provided inspiration artistically and technically for the sound and interaction method that we use. Iwai's work uses a large horizontal projection surface with a light grid on the display. At each node of the grid a button allows the player to change the direction of the arrow associated with the node. A colored ball of light moves along the lines of the grid and then follows a path decided upon by the direction of the arrow at each node it encounters. When a node is hit, a specific MIDI note plays. There are four lights moving at different speeds. Thus, with careful selection of the direction of the arrows, complex loops and rhythms can be created. The table is large enough for multiple people to change the patterns. Working with multiple players allows very complex sounds that cannot be made as effectively by a single person, since there are too many lights active at the same time. In Echology, we use four circular graphics to represent four different sound tracks that are reflected spatially when they hit a node rather than triggering a MIDI note, as described below. Furthermore, the movements of Beluga whales control our sounds. Other table-top sound installations include Augmented Groove [4], Jamodrum [5], and Jamoworld [6].

Blaine and Fels [7] have identified many of the qualities by which to organize multi-person interactive sound installations, and guidelines based on these qualities to make the sound interaction successful.

Figure 2: System diagram showing data flow and the relationship between components.
Accordingly, we place our work in their classification system along with integrating their guidelines into the design of the work. The relevant qualities and the recommendations we followed in our design (from [7]) are:

• capacity – multiplayer, single interface; use turn-taking protocols as well as having clear relationships between action and sound with a multi-person, single interface.
• aptitude – novice; prioritize that the sound be engaging at the expense of virtuosity.
• media – sound and video; use video to strengthen the relationship between action and sound without being too distracting.
• player interaction – participants use 8 buttons arranged in a circle to reflect sounds; each player has the same type of interaction so that they can learn from each other.
• musical range – players control spatial parameters of music; restrict the set of sound controls to help create an engaging, satisfying experience for novices.
• physical interface – buttons; provide Norman-style affordances [9] to engage novices and make it easy for them to join the group.
• directed interaction – low; attendants explicitly direct novices so they understand how the interaction works.
• learning curve – fast; use a direct mapping between gesture and sound to speed up learning.
• path to expert – no; compromise virtuosity for ease-of-use as the sound installation is intended for novices.
• levels of physicality – high; encourage high levels of physical interaction that "lays the foundation for developing intimate personal connections with other players and their instruments over relatively short periods of time, which can also lead to a sense of community."
Spatial sound has been the topic of much research and commercial application. If not using headphones, two speakers (stereo) can be used to provide a sound source that moves along a path between the two speakers through adjusting the sound mix levels. This has been extended to quadraphonic (4 speakers) and octophonic (8 speakers) systems. Dolby 5.1™ surround sound systems use 5 mid-range speakers and a sub-woofer to provide spatial sound effects. They take advantage of the fact that human perception does not localize bass sounds very well, so only one sub-woofer is used. The mid-range speakers are spatially arranged around the listener. Works such as InTheMix [8] have used virtual spatial sounds based on techniques such as Crystal River Engineering's Convolvotron™. These works are well suited for single-person sound spatialization in virtual environments. de Nijs and van der Heide [10] present a unique way to incorporate spatial sound in their work. They use a speaker on a counter-balanced arm to physically move the sound source around at different speeds so that participants see and hear the sound moving. Echology also uses spatial sound as one of its core elements. We use an octophonic spatial sound system that allows all participants to experience the movement of sounds in a plane parallel to the floor formed above their heads. As Echology does not have a strong bass component, we do not use a subwoofer as would be done with a Dolby 5.1™ surround sound system.

3. THE ECHOLOGY SYSTEM

The installation consists of an interaction table, 8 surround speakers, a PC running Max/MSP/Jitter [11], 8 lights and a light controller, a live webcam at the Vancouver Aquarium, and the atrium projection space. The system diagram in Figure 2 shows the paths between inputs, the Max/MSP/Jitter software modules, and outputs.

Input to the Echology system consists of the Vancouver Aquarium webcam [12], which is maintained by the aquarium, and the 8 interaction table buttons. These large arcade-style buttons represent the reflection points on the edge of the soundscape (i.e. the eight loudspeakers). The 8 buttons are arranged symmetrically around the perimeter of the table as shown in Figure 7. These buttons are used to control the movement of the sounds in the sonic aquarium.

The interaction table also houses a monitor to display our visualization of the Beluga webcam and sound space for the participants. The display gives visual feedback to the participants so they can see where the sounds are and where they will go. This same display is also projected on an overhead rear projection screen mounted in the truss for the spectators. The screen can be seen at the top of Figure 3. The sound space is created using 8 speakers distributed symmetrically just above head level. Eight lights with blue filters are mounted below each speaker and light up momentarily to indicate when a sound passes by a speaker to enhance the spatialized sound.

All controls between the inputs and outputs are written in the Max/MSP/Jitter graphical programming language by Cycling '74 [11]. Below is a brief overview of the software components; a sketch of how they fit together follows the list.

• The video and motion module processes the live webcam feed. Motions in the webcam view are mapped to the amplitude of 4 sound sources managed by the sound controller.
• The sound controller sends the 4 sound signals to the panner object.
• The panner can take up to 8 sound signals as inputs. It spatializes the sound by controlling the amplitudes of 8 output channels connected to the speakers. Our spatialization model is of a two-dimensional plane surrounded by 8 "reflection points". Sound signals (represented by coloured circles in our model) can travel a path from one reflection point to any other, dictated by the buttons. The panning controller manages the position of each sound signal in the two-dimensional space. The coordinates are sent to the panner to do the spatialization.
• The paths of the sounds in the sound space are determined by the directions of the reflection points. The direction of reflection from any reflection point can be to any other of the 7 points, or to itself; in the latter case the sound doesn't move. The interaction table buttons control the directions of the reflection points. Button presses are monitored by the panning controller through a data acquisition interface called a phidget [13], discussed below.
• The panning controller also switches the 8 lights on and off via the phidgets interface.
• The graphics controller generates the output display. It blends the live webcam feed with iconic representations of the reflection points and coloured circles for the sounds.
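As an illustration of the data flow in Figure 2, the following Python sketch strings components of this kind together for a single update. The function names, the smoothing step and the distance-based gain model are our own placeholder assumptions, not the authors' Max/MSP/Jitter objects; the actual motion analysis and panning are detailed in the following sections.

    import numpy as np

    def process_step(frame, prev_frame, positions, targets):
        """One hypothetical pass through the pipeline: webcam motion -> layer
        amplitudes -> sound positions -> per-speaker gains and light flags."""
        # video and motion module: crude per-layer motion from frame differencing
        layers = np.array_split(np.abs(frame - prev_frame), 4, axis=0)
        amplitudes = np.array([layer.mean() for layer in layers])

        # panning controller: nudge each sound toward its current reflection point
        positions = [p + 0.1 * (t - p) for p, t in zip(positions, targets)]

        # panner: placeholder distance-based gains for the 8 speakers on a circle
        speakers = np.array([[np.cos(a), np.sin(a)]
                             for a in np.linspace(0, 2 * np.pi, 8, endpoint=False)])
        gains = np.array([[1.0 / (1.0 + np.linalg.norm(s - p)) for s in speakers]
                          for p in positions])

        # lighting controller: flag speakers that a sound is currently passing
        lights = (gains > 0.8).any(axis=0)
        return amplitudes, positions, gains, lights

    # toy example with random frames and four sounds heading toward four speakers
    prev, cur = np.random.rand(240, 320), np.random.rand(240, 320)
    pos = [np.zeros(2) for _ in range(4)]
    dest = [np.array([1.0, 0.0]), np.array([0.0, 1.0]),
            np.array([-1.0, 0.0]), np.array([0.0, -1.0])]
    amps, pos, gains, lights = process_step(cur, prev, pos, dest)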
Figure 3: A photo taken from above of the Echology installation as presented for NIME 2005. The overhead truss has 8 speakers attached and a rear projection screen. The participants are standing around the white interaction table.

Echology was produced as an interactive multimedia installation piece for the Open Media Environment (OME) situated in the atrium of a building at the University of British Columbia. The OME incorporates a center stage area and a movable, suspended, circular theatrical truss that has eight attached speakers and can support various lighting attachments. The circular truss has a rear projection screen stretched inside it. The complete installation is shown in Figure 3.

The following sections discuss the component details as they were implemented for the OME installation of Echology.

Beluga Video and Motion Control

The Vancouver Aquarium has six Belugas in the tank shown through the webcam. From the webcam, a variety of different Beluga activities can be observed. They have various swimming patterns, sometimes play with bubble rings, do headstands, bob their tails up in the air and perform trained movements during the aquarium show times.

The Echology installation uses Jitter to process this live webcam feed. Motions of whales change given the different areas of the camera's view. For example, the top portion of the view is of the Beluga tank surface. The Belugas spend most of their time near this surface and are often bobbing around in one place; their activity is particularly high during the training shows that occur every 1.5 to 2 hours during opening hours. We used this idea of different areas to divide the view into four "layers"; the motion detected in each controls one sound signal. These four layers are evenly spaced from top to bottom as shown in Figure 4. The layers were given labels of Splash, Play, Swim, and Deep for layers 1 to 4 respectively. The 4 sound signals associated with these four layers are composed to be aesthetically and conceptually characteristic of Beluga motions within the given layer. For example, we have a playful motif for the sound associated with the Play layer. The nature of the sounds will be discussed further in the section on Echology's musical elements.

Beluga motion in each layer is mapped to the amplitude of the associated sound signals. Motion is calculated by subtracting grayscale pixel values in consecutive frames of the Beluga webcam feed. If the difference between 2 frames is greater than a threshold value, the intensity of motion (or magnitude of difference) is used to modify the sound's amplitude. Motion values are averaged over a sliding window of 10 frames; the averages are then normalized and sent to the sound controller.
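To make this mapping concrete, here is a small Python/NumPy sketch of this kind of frame-differencing motion measure; it is a stand-in for the authors' Jitter processing, and the threshold value and peak normalization are assumptions.

    import numpy as np
    from collections import deque

    NUM_LAYERS = 4
    WINDOW = 10          # sliding window of frames, as described above
    THRESHOLD = 0.1      # per-pixel difference threshold (assumed value)

    history = deque(maxlen=WINDOW)   # recent per-layer motion measurements

    def layer_motion(prev_frame, frame):
        """Per-layer motion: mean absolute grayscale difference of pixels that
        changed by more than the threshold, for 4 horizontal slices of the view."""
        diff = np.abs(frame.astype(float) - prev_frame.astype(float))
        diff[diff < THRESHOLD] = 0.0
        layers = np.array_split(diff, NUM_LAYERS, axis=0)   # Splash, Play, Swim, Deep
        return np.array([layer.mean() for layer in layers])

    def smoothed_amplitudes(prev_frame, frame):
        """Average motion over the last WINDOW frames and normalize to [0, 1]."""
        history.append(layer_motion(prev_frame, frame))
        avg = np.mean(history, axis=0)
        peak = avg.max()
        return avg / peak if peak > 0 else avg

    # example with synthetic frames whose pixel values lie in [0, 1]
    prev, cur = np.random.rand(240, 320), np.random.rand(240, 320)
    print(smoothed_amplitudes(prev, cur))   # one amplitude per layer, sent to the sound controller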
Figure 4: Beluga webcam view catching a whale swimming close to the camera, divided into 4 horizontal slices or "layers" (Splash, Play, Swim and Deep; close-up image courtesy of the Vancouver Aquarium website).

Sound Controller

The sound controller manages the Echology audio media. It uses the MSP portion of the Cycling '74 environment to open, play and loop prerecorded samples.

The amplitudes of the 4 sound signals are controlled by the normalized motion values, and the resulting audio signals are sent to the panner for output. Often the motion of Belugas in the webcam view lasts only a short time. This makes sounds disappear too quickly to interact with after entering the soundscape. To remedy this we add a 10 second sustain to keep the sounds playing after they pass a threshold volume level.
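A minimal sketch of such a sustain stage is shown below, in plain Python rather than MSP; the control rate and onset threshold are assumed values for illustration.

    SUSTAIN_SECONDS = 10.0
    CONTROL_RATE = 30.0            # assumed control updates per second
    SUSTAIN_FRAMES = int(SUSTAIN_SECONDS * CONTROL_RATE)
    ONSET_THRESHOLD = 0.3          # assumed volume level that triggers the sustain

    class SustainedAmplitude:
        """Holds a sound's level up for 10 seconds after it crosses a threshold,
        so short bursts of Beluga motion remain long enough to play with."""
        def __init__(self):
            self.hold_frames = 0
            self.held_level = 0.0

        def update(self, amplitude):
            if amplitude >= ONSET_THRESHOLD:
                self.hold_frames = SUSTAIN_FRAMES          # (re)start the sustain
                self.held_level = max(self.held_level, amplitude)
            elif self.hold_frames > 0:
                self.hold_frames -= 1
            else:
                self.held_level = 0.0
            return max(amplitude, self.held_level if self.hold_frames > 0 else 0.0)

    # example: a brief burst of motion keeps sounding instead of dropping straight back to zero
    env = SustainedAmplitude()
    print([env.update(a) for a in [0.0, 0.6, 0.0, 0.0]])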
Musical Elements

As already discussed, whale activity captured with the webcam, coupled with the participants' interaction, directs a spatialized soundscape. To enrich this soundscape musically, we carefully selected various sound sources, including actual Beluga sounds and composed segments. Although each sound source had to be elementally simple enough to achieve successful interaction and spatial perception, we also wanted the soundscape to be rich enough to give the audience the feel of playing with the Belugas underwater.

We use two categories of sounds in Echology. The first consists of short samples of Beluga whale voices received from the Vancouver Aquarium research team. The second category is a number of composed synthesizer tracks meant to create an aquatic atmosphere, and provide continuity and context to the soundscape.

The sound signals of the four layers in Echology's soundscape are as follows:

1. Splash contains a synthesizer motif creating an aquatic atmosphere when a whale moves across the top of the aquarium.
2. Play loops several short samples of Beluga voices that are randomly triggered by the Beluga movements.
3. Swim consists of additional Beluga voice samples accompanied by the drone of a synthesizer sound.
4. Deep contains low frequency mass to create a drone.

We selected high frequency Beluga voices since they provide better cues for spatial perception. Atmospheric background synthesizer sounds are designed to give continuity to the entire soundscape.
In general, there is enough variation in the soundscape to be engaging and interesting because of the variety of motion of whales in the aquarium.

Spatialized Sound

The four sound signals are spatialized around the installation space using a spatial sound engine based on the Max/MSP plugin for Vector Base Amplitude Panning (VBAP), developed by Ville Pulkki [15][16]. VBAP provides sound spatialization using an arbitrary specification of speaker positions in two or three dimensions. This is an essential feature for installations such as ours that require a high degree of control over sound positions, and are set up in disparate environments that we have little advance control over.

Amplitude panning involves sending the same sound signal to a number of different loudspeakers, with amplitudes based on the loudspeaker positions and the location of the intended sound signal. Using VBAP allows us to easily fit the loudspeaker setup to the space our installation will inhabit. With the VBAP plugin, the speaker placements can be changed on the fly in a modular fashion that does not affect any of our other software components.
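For intuition, the sketch below shows the core idea of pair-wise amplitude panning in two dimensions, roughly what the VBAP plugin computes for each sound position. It is a simplified Python stand-in for illustration, not the plugin itself, and the function name is our own.

    import numpy as np

    def pairwise_pan_gains(source_angle, speaker_angles):
        """Return one gain per loudspeaker for a source direction on the horizontal
        plane, using the adjacent speaker pair that encloses the source (2-D VBAP idea)."""
        p = np.array([np.cos(source_angle), np.sin(source_angle)])
        n = len(speaker_angles)
        gains = np.zeros(n)
        for i in range(n):
            j = (i + 1) % n                                   # adjacent pair (i, j)
            L = np.column_stack((
                [np.cos(speaker_angles[i]), np.sin(speaker_angles[i])],
                [np.cos(speaker_angles[j]), np.sin(speaker_angles[j])],
            ))
            g = np.linalg.solve(L, p)                         # g[0]*l_i + g[1]*l_j = p
            if g.min() >= -1e-9:                              # source lies between this pair
                g = g / np.linalg.norm(g)                     # constant-power normalization
                gains[i], gains[j] = g
                return gains
        return gains

    # eight speakers spaced evenly around the listener, as in Echology's octophonic ring
    speakers = np.deg2rad(np.arange(0, 360, 45))
    print(np.round(pairwise_pan_gains(np.deg2rad(30), speakers), 3))

In the installation this calculation is handled by the VBAP plugin itself, which also accepts arbitrary two- or three-dimensional speaker layouts [15][16].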

Figure 5: Sound signals travelling between reflection points A, C, and E (the eight reflection points are labelled A-H around the table). Pressing the buttons rotates the reflection points to direct the sounds to other locations.

This method of spatialization is used in the 8-channel panner that we use. The panner accepts up to eight channels of input sound signals that can each exist in a unique soundscape position at the same time. We use four of these, one for each sound signal. The panner accepts x and y coordinates for each sound signal.

The panning controller calculates and maintains the coordinates for each of the sound signals. These coordinates are sent to the panner, as well as to the graphics and lighting controllers discussed below. The panning controller also accepts button press inputs from the interaction table via a phidgets interface. When a participant presses a button on the interaction table, the panning controller adjusts the direction of reflection for the associated reflection point. The new direction is shown on the display by the graphics controller.

For example, consider the configuration of the reflection points as shown in Figure 5. When the sound hits reflection point A it reflects towards reflection point C. When it hits reflection point C, it reflects to point E. When it reaches point E, it returns to point A. It will continue in this triangle pattern until a participant presses one of the buttons associated with reflection points A, C or E, which will change the direction of the reflection.

Graphics / Interaction

Graphical visualization of the reflection directions and sound signal paths is created as feedback for participants. The graphics are intentionally kept simple so as not to detract from the live Beluga webcam feed in the background. All graphics are generated with Jitter's OpenGL capabilities.

Four circles of different colours move about the interaction table display. Each circle directly represents the location and amplitude of the sound signal associated with a "layer". The colour mapping is as follows: Splash is red, Play is yellow, Swim is orange and Deep is green. The amplitude is mapped onto the alpha value of the circles, so that they fade in and out of the background as the level of activity of the Belugas increases or decreases in the corresponding layer of the tank. Figure 6 shows a view of the table display with all 4 sound channels at peak amplitude.

Figure 6: Screen capture of the interaction screen, showing the sounds (circles) orbiting the reflection points (pies), and a Beluga whale in the background.

Eight blue circles resembling a pie with a slice removed represent the reflection points. The sound reflects in the direction of the pie slice. Each press of the associated button rotates the pie slice so that the sound reflects to the next point. For example, in Figure 5, pressing the button associated with point A changes the sound direction from going to point C to point D.

Reflection points cycle through 8 positions, the default scenario being each pointing to its neighbouring reflection point so that sounds travel in a circle around the soundspace. Each press of the button rotates the reflection point direction by one in a counter-clockwise direction, including pointing to itself. When this happens, the pie becomes a torus, indicating that it will now 'catch' sounds that hit it. To release the sounds, the participant hits the button again, letting caught sounds go on their path towards the next reflection point.

The blue lights suspended from the truss turn on when a sound hits its associated reflection point. We use the phidget interface to control the lights as described below.
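The following Python sketch reconstructs this reflection-point behaviour for illustration only (it is not the authors' Max patch): each point stores a target that a button press rotates, a point whose target is itself catches sounds, and each sound's (x, y) coordinates step along a straight path toward its current destination. The step size and the rotation direction are assumptions.

    import numpy as np

    NUM_POINTS = 8
    # reflection points on a unit circle around the table, indexed 0-7 (A-H)
    POINT_POSITIONS = np.array([[np.cos(a), np.sin(a)]
                                for a in np.linspace(0, 2 * np.pi, NUM_POINTS, endpoint=False)])

    # default: each point reflects to its neighbour, so sounds circle the space
    targets = [(i + 1) % NUM_POINTS for i in range(NUM_POINTS)]

    def press_button(i):
        """Rotate reflection point i's direction by one step; when it ends up
        pointing at itself, it 'catches' sounds that arrive there."""
        targets[i] = (targets[i] - 1) % NUM_POINTS    # rotation direction assumed

    class Sound:
        """One of the four coloured circles: a position moving toward a reflection point."""
        def __init__(self, start_point):
            self.position = POINT_POSITIONS[start_point].copy()
            self.destination = targets[start_point]

        def update(self, step=0.02):
            goal = POINT_POSITIONS[self.destination]
            delta = goal - self.position
            dist = np.linalg.norm(delta)
            if dist > step:
                self.position += step * delta / dist          # move along the straight path
                return self.position                          # x, y sent to the panner
            if dist > 0:                                      # just arrived at a reflection point
                self.position = goal.copy()
                print(f"flash light {self.destination}")      # stand-in for the blue lamp
            if targets[self.destination] != self.destination: # forward unless the point catches
                self.destination = targets[self.destination]
            return self.position

    # example: redirect point 0 so it catches, then let one sound travel toward it
    press_button(0)                     # point 0 now points at itself
    s = Sound(start_point=7)            # a sound leaving point 7, headed for point 0
    for _ in range(4):
        print(np.round(s.update(step=0.5), 2))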
Figure 7: The interaction table, showing multiple participants jointly creating the soundscape.

We placed the buttons, speakers, suspended lights and reflection points on radially collinear axes distributed symmetrically around the space, centred on the table. This placement makes it easy for participants to know where a sound is and where it will be reflected.

Interaction Table

The functions of the interaction table buttons and display have been discussed in the previous sections, but it is important to quickly note the design of the table itself. In [14] the importance of the aesthetics of technological art installations is stressed. No matter how novel the underlying technology, appearance is paramount. Aesthetics aid in drawing people into participating with the piece. Also, installations are set in public places, so the surroundings must be taken into account during aesthetic design.

As shown in Figures 3 and 7, the table-top is circular, allowing a larger number of participants to comfortably stand around it and see what is going on. The large glowing buttons were chosen to draw people towards the table and entice pressing. The buttons, purchased from Happ Controls Inc., are light sea blue, 3 inches in diameter and illuminated inside with a small DC lamp. When a button is pressed, the light turns off to provide immediate, direct, visual feedback. The large buttons also fulfilled the design requirement of having an interface robust enough to withstand sustained use by the general public. We chose to cover the table base and top in a fluffy white fabric to appear white like a Beluga's skin. The table-top itself has a clear plastic cover to feel like a Beluga's skin and protect the monitor surface. The table-top sits on a metal barrel. The Phidget interface boards and cabling are inside the barrel to protect them and hide them from view.

Phidgets Interface

A Phidget interface [13] is used to detect button presses and to send output to the lighting controller. Phidgets are inexpensive digital acquisition (DAQ) boards that provide simple mechanisms to connect and control various sensors and actuators to a computer via USB. The Phidgets 8/8/8 board allows for 8 analog inputs and 8 analog outputs.

To access these input and output channels, a Max/MSP external Phidget object was written. A button press causes the object to send a signal to the panning controller software module, changing the direction of a reflection point. This same software module sends a signal to the Phidget outputs to turn on lights when a sound hits a reflection point.

Lighting

The Phidget interface output is connected to a lighting controller board that allows eight 110V light bulb sockets to be switched under computer control. The lighting controller is made from a Light-o-Rama 8 Channel Triac Board (MC-TB08).
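As a rough illustration of this glue layer, the Python sketch below polls eight button inputs for new presses and pulses a light output. The board wrapper is a purely hypothetical stand-in, not the Phidgets API and not the authors' Max/MSP external.

    class FakeInterfaceKit:
        """Hypothetical stand-in for an 8-input / 8-output acquisition board."""
        def __init__(self):
            self.outputs = [False] * 8

        def read_inputs(self):
            return [False] * 8          # replace with real hardware reads

        def set_output(self, channel, state):
            self.outputs[channel] = state
            print(f"light {channel} {'on' if state else 'off'}")

    def watch_buttons(board, previous, on_press):
        """Poll the 8 button inputs and report rising edges (new presses) only."""
        current = board.read_inputs()
        for channel, (was, now) in enumerate(zip(previous, current)):
            if now and not was:
                on_press(channel)       # e.g. rotate the matching reflection point
        return current

    def pulse_light(board, channel):
        """Switch a lamp on briefly when a sound reaches its reflection point."""
        board.set_output(channel, True)
        # in the installation the lamp is switched off again a moment later
        board.set_output(channel, False)

    board = FakeInterfaceKit()
    state = [False] * 8
    state = watch_buttons(board, state, on_press=lambda ch: print(f"button {ch} pressed"))
    pulse_light(board, 3)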
4. INTERACTION

Participants can play with the spatialized sounds initiated by Beluga whale motion. The Beluga web feed appears in the background of the graphic display on the table as well as on the video projection overhead. Using the buttons, participants control the path of the sound overhead, or if desired, they can just watch and listen to the sounds move along their predetermined course. The melon of the Beluga (the fleshy part on the top of the head) inspires this use of redirected sound. We intentionally made the interaction simple, so that participants need only focus on the spatial aspects of the sound rather than the tonal qualities. The imagery of Beluga whales swimming blends with the sound spatialization and visualization to make a rich, playful and enjoyable mediascape. The whales' play results in movement patterns that are fairly repeatable, but unpredictable and organic. The audience member (or participant) can listen to and watch the mediascape via the overhead speaker truss and video projection. However, the "sweet spot" is at the interaction table located in the centre of the raised platform of the atrium.

Using a webcam feed of live Belugas raises interesting ethical questions we have tried to address in our piece. Though they are in an aquarium environment, the Belugas are wild animals that cannot be fully controlled. They are there for us to learn from, but they demand proper respect. For this reason, we gave participants only partial control over the sounds. Sounds fade in and out according to the activity of the Belugas in each layer; if there is no Beluga activity at all for a certain amount of time, the installation will become silent until the Belugas return. Only when there is a significant level of activity in all layers will all the sounds be present simultaneously. This aspect of the interaction encourages participants to reflect on the Beluga whales as living creatures seen in real time. As such, participants are not able to demand performance from the whales, but must patiently wait for them to swim into view.

Experience Goals

There were three main experience goals we kept in mind when designing Echology: simplicity, approachability, and collaboration. Mimicking the gentle nature of Beluga whales, interaction with Echology was designed to be simple and playful. For this reason, we attempted to make the mapping between the Beluga webcam video, the soundscape, and control through the interaction table as straightforward as possible. Each button only controls one reflection direction, and each reflection is only in one of 8 states. This makes the control of sound spatialization simple and predictable.

To create an inviting, approachable space, we constructed the interaction table to appear fun and exciting. We did this through our choice of covering it in a fluffy white fabric and using large, lighted, colorful buttons to entice spectators over to the table. Our design includes a projection of the Beluga webcam feed so that those not at the table could observe and appreciate the mapping between Belugas and sound.
The last experience goal of our piece was to allow for collaboration among participants at the interaction table. Our circular table-top is large enough that up to 8 people may comfortably stand around the table at a time. Given the large table size, it is difficult for one person to control all of the buttons at once because of the distance they must reach over the table. Complex control over the movement of the sounds is only achievable through the coordinated efforts of multiple participants.

5. OBSERVATIONS

The installation was exhibited for the first time in a public setting during the 5th International Conference on New Interfaces for Musical Expression (NIME05) in Vancouver, Canada. It was running for one day, and during that time we received a substantial amount of feedback from conference attendees and members of the general public. More than one hundred people were observed interacting with our exhibit.

Our observations reflect the goals presented in the previous section and how well they were accomplished. Many of our observations are also found in other collaborative musical experiences [7].

We achieved our goal of approachability by creating an inviting space. Participants would immediately come to the interaction table upon walking into the installation space. They did not hesitate to try pressing the buttons.

When the Belugas were inactive, people who had not yet experienced the soundscape would wait around patiently. It was quite similar to a guest at the aquarium quietly waiting for an aquatic creature to swim into a more visible area.

Moments of silence when there were no whales did not feel awkward amongst participants at the table. Participants were comfortable being silent for a few minutes or spent time conversing about the Belugas to easily break the silence when desired. This allowed some participants to stand around the table and chat for lengths of time up to 20 or 30 minutes. Thus, the piece works very well as a "calm" technology [20].

For some, the connection between the Beluga and the mode of interaction was not well understood. The abstraction between how Belugas communicate, their actions, and what participants could do was not obvious to them. Usually an attendant had to explicitly state that the Beluga webcam images were indeed live and not prerecorded. It is difficult to determine the importance of the audience understanding that the Belugas are live and how much that understanding changes the metaphor. Because of the live nature of the Beluga camera, some people wanted to take the next step – to be able to communicate directly with the Belugas. Many people asked, "Can the Belugas hear the music too?"

There was a prominent pattern for a participant's visit in the installation space. The participant walked up to the table, glanced at the display, and then looked at the overhead lights and speakers. He or she would then focus on the display and either press a button to receive a reaction, or watch what other people were doing. Most people asked questions about the installation, how it worked, how it was made, etc. After discussing the installation itself, the conversation often opened up to other topics; other times participants would spend a few quiet moments by themselves, enjoying the Belugas and soundscape.

6. DISCUSSION

Some of the most interesting observations made in the installation space were of the behaviour between the people around the table. A substantial amount of literature has been developed in the area of table-top displays for computer-supported collaborative work, such as [17,18]. This research sheds light on some of our observations of the people interacting with each other while experiencing the work. Researchers have consistently seen that the area immediately in front of a participant is naturally considered their own. This notion is encapsulated by the idea of territoriality, which can be defined as

"an interlocking system of attitudes, sentiments, and behaviours that are specific to a particular, usually delimited, site or location which, in the context of individuals, or a small group as a whole, reflect and reinforce for those individuals or group some degree of excludability of use, responsibility for, and control over activities in these specific sites." [19]

In Echology, without explicit instruction, participants chose a button to stand in front of, and subsequent use of that button was solely by that participant. That button became their territory. It was often the case, however, that not all of the buttons had people in front of them. Because of the control each button had on the overall path of the sounds, these unmanned buttons became collaborative territory. Another effect of territoriality, described by Taylor, is that it helps people mediate their social interaction through the very act of laying claim to a space [19]. Unmanned buttons, and even the occasional co-opting of manned buttons by other enthusiastic patrons wanting to achieve a specific effect, created dynamic changes in territory that encouraged conversation and cooperation.

Because of these dynamics, we found Echology was successful at supporting social interaction between participants. The availability of territory, collaborative activities, and the draw of the spatial sound, graphics and lights all helped to create a novel experience that people felt comfortable interacting around.

7. FUTURE WORK

The installation has proven very successful in its current form; however, there are a number of changes we are considering to further improve the user experience and better achieve our artistic objectives.

Changes to the physical installation involve a reworking of the look of the interaction table. It is currently constructed from a custom-made circular table-top that sits on a metal barrel. Both are covered with white cotton and the table-top itself is wrapped in a heavy-duty transparent plastic cladding to protect the cotton and provide a Beluga-like skin feeling. A different material covering the barrel might prove more robust for long-term installations where there are many children. The eight dynamically-changing lights hanging from the speakers surrounding the installation also add to the sense of immersion for participants, and we are currently experimenting with different coloured filters to enhance their feel.

Changes to the Echology Max/MSP components may include making the graphics less abstract; for instance, more like Belugas. We may integrate them more with the background webcam video by applying dynamic textures. There are also interesting opportunities to experiment with spatialization techniques. Currently, the speakers are all arranged in a 2-D plane, but VBAP also supports 3D soundscapes. For example, it might be interesting to arrange the eight speakers in a hemisphere. These experiments will present challenges, as any changes made to Echology must be balanced against the current simplicity, which in itself was a design goal.
The next iteration of Echology will also include improvements in the video processing and sound control modules based on further experimentation. Overall, people generally felt a relationship between the soundscape and the Beluga motion, but we feel this relationship could be more tightly coupled. We believe that the lack of tight coupling in the mapping is largely due to variations in the webcam video quality. Video conditions vary greatly throughout the day and according to weather, making it impossible to have one video processing setting. For example, the best conditions for processing the video are in the mid afternoon on a cloudy day. However, if there is direct sun, it causes ripples on the water's surface to glare brightly. Their motion is picked up by the current video processing module, causing sound to be played even with no whales in the scene. The current system also has no way to keep track of the number of Belugas in the scene at one time. The addition of this functionality is desirable, but again difficult to implement due to the colour and quality of the webcam feed.

Once changes are made to the installation we hope to conduct a more extensive, formal user study. The study will be based on video observation and questionnaires to answer questions like the following:

- Did participants find the spatialization control simple to understand? Could they hear the spatialization?
- Did participants find the space at all intimidating or unapproachable?
- What patterns of spatialization are most commonly attempted with the interaction table?
- How does perception of the installation differ between someone who knows the Beluga feed is live compared to someone who does not?
- How long is the average time spent at the interaction table?
- How do people feel about socializing around the table?
- How do they feel after visiting the table alone?
- Would people visit the installation more than once?

In addition to our artistic motivations, through our creation of Echology and our observations of participants, we hope to contribute to the pool of knowledge concerning the use of table-top displays in interactive arts. As well, since this work provides an effective means to facilitate social interaction, it could be used in applications that help groups communicate better, such as team building applications.

8. CONCLUSIONS

Overall, Echology was well liked by participants. Many spent considerable time watching, playing or just being close to it while in discussion. People didn't always recognize the relationship between the coloured balls moving around and the spatialized sound. However, the exhibit was engaging and enjoyable and facilitated social interaction. Thus, at this stage of creation we believe that the piece works well as an interactive artwork and meets our artistic objectives.

9. ACKNOWLEDGEMENTS

Our gratitude to Valeria Vergara for providing Beluga sounds, and to the Vancouver Aquarium for their support. This project received funding from the Natural Sciences and Engineering Research Council (NSERC), the Media and Graphics Interdisciplinary Centre (MAGIC) and the Institute for Computing, Information and Cognitive Systems (ICICS).

10. REFERENCES

[1] Tomlinson, B., Downie, M., Berlin, M., Gray, J., Lyons, D., Cochran, J., and Blumberg, B. Leashing the AlphaWolves: Mixing User Direction with Autonomous Emotion in a Pack of Semi-Autonomous Virtual Characters. Proceedings of the 2002 ACM SIGGRAPH Symposium on Computer Animation, San Antonio, TX, 2002.
[2] Sommerer, C. and Mignonneau, L. A-Volve: a real-time interactive environment. http://www.iamas.ac.jp/~christa/WORKS/FRAMES/TOPFRAMES/A-VolveTop.html, 1994.
[3] Iwai, T. Composition on the Table. In ACM SIGGRAPH '99 Electronic Art and Animation Catalog, Los Angeles, CA, pg. 10, 1999.
[4] Poupyrev, I., Berry, R., Billinghurst, M., Kato, H., Nakao, K., Baldwin, L., and Kurumisawa, J. Augmented Reality Interface for Electronic Music Performance. Proceedings of the 9th International Conference on Human-Computer Interaction (HCI International 2001), New Orleans, LA, USA, August 2001, 805-808.
[5] Blaine, T. and Perkis, T. The Jam-O-Drum Interactive Music System: A Study in Interaction Design. DIS2000 Conference Proceedings, New York, NY, August 2000, 165-173.
[6] Blaine, T. and Forlines, C. JAM-O-WORLD: Evolution of the Jam-O-Drum into the Jam-O-Whirl Gaming Interface. Proceedings of the 2nd International Conference on New Interfaces for Musical Expression (NIME02), Dublin, Ireland, May 24-26, 2002, 17-22.
[7] Blaine, T. and Fels, S.S. Collaborative Musical Experiences for Novices. Journal of New Music Research, Vol. 32, No. 4, Dec. 2003, 411-428.
[8] Chapin, W. InTheMix. ACM SIGGRAPH 2000, Conference Abstracts and Applications, 2000.
[9] Norman, D. The Design of Everyday Things. New York: Currency/Doubleday, 1990, 79-80.
[10] de Nijs, M. and van der Heide, E. Spatial Sounds 100 dB/100 km/u, sound/kinetic installation, 2000, 2001.
[11] http://www.cycling74.com/
[12] http://www.vanaqua.org/belugacam/
[13] http://www.phidgets.com/
[14] Omojola, O., Post, E. R., Hancher, M. D., Maguire, Y., Pappu, R., Schoner, B., Russo, P. R., Fletcher, R., and Gershenfeld, N. An installation of interactive furniture. IBM Systems Journal, Vol. 39, No. 3-4 (Jul. 2000), 861-879.
[15] Pulkki, V. Virtual Sound Source Positioning Using Vector Base Amplitude Panning. Journal of the Audio Engineering Society, Vol. 45, No. 6, 1997.
[16] Pulkki, V. Generic panning tools for MAX/MSP. Proceedings of the International Computer Music Conference 2000, Berlin, Germany, August 2000, 304-307.
[17] Scott, S. D., Carpendale, M. S. T., and Inkpen, K. M. Territoriality in collaborative tabletop workspaces. In Proceedings of the 2004 ACM Conference on Computer Supported Cooperative Work (Chicago, Illinois, USA, November 06-10, 2004), CSCW '04, ACM Press, New York, NY, 294-303.
[18] Tang, J. C. Findings from observational studies of collaborative work. International Journal of Man-Machine Studies, Vol. 34, No. 2, 1991, 143-160.
[19] Taylor, R. B. Human Territorial Functioning: An Empirical, Evolutionary Perspective on Individual and Small Group Territorial Cognitions, Behaviors, and Consequences. New York: Cambridge University Press, 1988.
[20] Weiser, M. and Brown, J. S. Designing calm technology. PowerGrid Journal, Vol. 1, No. 1, 1996.
