
www.cgw.com   November 2008

Dogged
Pursuit
Disney's R&D develops new
tools and techniques for the
loose-brush style in Bolt

$6.00 USA $8.25 Canada


November 2008 Volume 31 Number 11

Innovations in visual computing for the global DCC community

Features

10  Rethinking Moviemaking
Animation and visual effects studios, including Disney, DreamWorks, and others, are using stereoscopic 3D as a new creative tool.
By Barbara Robertson

20  Back to the Future (COVER STORY)
Disney establishes two unique looks for Bolt: one when the superdog is in the TV world and another when he is in the real world. For the latter, Disney invented new technology.
By Barbara Robertson

26  Workstations on the Move
What is a mobile workstation, and are the recent entries into this market segment truly worthy of this label?
By Jon Peddie

32  Zero to Hero
A multinational group overcomes a number of obstacles to create its first animated feature film, Dragon Hunters.
By Karen Moltenbrey

Departments

Editor's Note: Viewer Choices
Moviegoers today have many choices when it comes to a CG animated feature film as studios large and small, domestic and abroad, offer up their latest creations.

47  Backdrop
Jean-Philippe Agati of Sparx Animation discusses the issues his studio faced in order to bring its first animated feature, Igor, to big screens across the US.
By Karen Moltenbrey

4  Spotlight
Products: AMD's ATI FirePro V5700 and V3700. Boxx's VizBoxx. Sony's Cell platform.
News: Autodesk signs an agreement to purchase Softimage.

8  News Analysis
A look at Softimage's past and present as a DCC company, and how the vendor's products and technology may fit within the Autodesk portfolio.

@CGW.com
Restoring classic (and not-so-classic) films.
VFX supes: Getting involved at the board stage saves time and money.
Director Alan Ball discusses his new film, Towelhead.
Visit www.cgw.com to listen to an audio interview with Autodesk and Softimage execs concerning the acquisition.

40  Knowledge & Career
Autodesk Animation Academy integrates the ABCs of 3D within core academic subjects such as math and science.

44  Back Products
Recent software and hardware releases from SIGGRAPH 2008.

ON THE COVER
Dog gone it, Disney's done it again. Disney Animation Studios' Bolt adds a unique bite to CG, offering a novel look achieved with several new technologies. For added dimensionality, the film also will be shown in stereo 3D.

Editor's Note

Viewer Choices


When I was a child, going to the movies on a Saturday afternoon was a treat. That's because films geared to youngsters were few and far between. In 1971, for instance, Disney released two family movies: The Million Dollar Duck and Bedknobs and Broomsticks. In fact, between 1960 and 1970, the studio rolled out fewer than 10 films for this audience, and not even half of those were (traditionally) animated features. And back then, Disney had a tight grip on the animated film genre, as only a few 2D animated movies from other studios, such as Hanna-Barbera, made it into theaters.
Today, the situation couldn't be more different. Old-fashioned theaters with a single screen and single projector have given way to multiplexes, so moviegoers can choose from among 10, 20, or even more films at any given time. With all this opportunity, studios, including Disney/Pixar, Sony Pictures Imageworks, PDI/DreamWorks, and others, have churned out animated hit after hit. This year alone, audiences have been entertained by the lovable robot Wall-e, the pampered zoo crew from Madagascar: Escape 2 Africa, the stuffy martial artists of Kung Fu Panda, the Seussian characters from Horton Hears a Who!, and more. Far more. In 2008 alone, we were treated to nearly the same number of animated theatrical releases that were available during the decade from 1960 to 1970.
This month, Disney has unleashed its latest film star, Bolt, who stars in a feature with the same name (see "Back to the Future," pg. 20). Bolt has a unique look. It is not exactly the same style as the traditional 2D Disney films from the studio's second Golden Age. Yet, considering that Byron Howard, whose credits include Brother Bear, Lilo & Stitch, and Mulan, co-directed, the influence becomes clear. Also wielding great influence, however, was John Lasseter, the masterful director of Pixar's Toy Story, A Bug's Life, Toy Story 2, and Cars. Achieving the movie's hybrid look also demanded new CG technology across four distinct areas.
It's not just the industry giants who are entertaining us. Last month, Sparx Animation (France, Vietnam) released Igor, a twisted tale about a lovable hunchback and his motley group of friends (see the Backdrop, pg. 47). Next month, Fathom Studios (Atlanta) will debut its magical adventure film Delgo, featuring epic battles, bizarre monsters, compelling characters, and stunning environments. And, finally, the storybook-come-to-life The Tale of Despereaux, Framestore's solo entry into the feature-film arena, will debut in theaters this holiday season. Another striking animated feature from France, Dragon Hunters, has made its mark abroad but is still hoping for a US release (see "Zero to Hero," pg. 32). In this futuristic/medieval world, the characters boast a stylized look and the environments are truly different.
Audiences indeed have choices, not just among the animated features, but also among the live-action movies. So, with so many options, is the market becoming too diluted with these films? Is an animated movie no longer the special treat it once was? Perhaps. But as long as studios deliver compelling stories augmented with equally compelling visuals, then there can never be too much of a good thing!

Karen Moltenbrey
Chief Editor
karen@cgw.com  (603) 432-7568


36 East Nashua Road
Windham, NH 03087

Contributing Editors

Courtney Howard, Jenny Donelan,


Audrey Doyle, Evan Marc Hirsch, George Maestri,
Kathleen Maher, Martin McEachern,
Stephen Porter, Barbara Robertson

WILLIAM R. RITTWAGE

Publisher, President and CEO,


COP Communications

SALES
Lisa Black
National Sales Manager
Classifieds | Education | Recruitment

lisab@copcomm.com (877) CGW-POST [249-7678]


fax: (214) 260-1127

Editorial Office / LA Sales Office:

620 West Elk Avenue, Glendale, CA 91204


(800) 280-6446

PRODUCTION
Kath Cunningham
Production Director

kcunningham@cgw.com (818) 291-1113

MICHAEL VIGGIANO
Art Director

mviggiano@copcomm.com

Chris Salcido

Account Representative

csalcido@copprints.com (818) 291-1144

Computer Graphics World Magazine


is published by Computer Graphics World,
a COP Communications company.
Computer Graphics World does not verify any claims or
other information appearing in any of the advertisements
contained in the publication, and cannot take any
responsibility for any losses or other damages incurred
by readers in reliance on such content.
Computer Graphics World cannot be held responsible for
the safekeeping or return of unsolicited articles,
manuscripts, photographs, illustrations or other materials.
Address all subscription correspondence to: Computer Graphics World, 620 West Elk Ave., Glendale, CA 91204. Subscriptions are available free to qualified individuals within the United States. Non-qualified subscription rates: USA, $72 for 1 year, $98 for 2 years; Canadian subscriptions, $98 for 1 year and $136 for 2 years; all other countries, $150 for 1 year and $208 for 2 years. Digital subscriptions are available for $27 per year. Subscribers can also contact customer service by calling (800) 280-6446, opt 2 (publishing), opt 1 (subscriptions), or by sending an email to csr@cgw.com. Changes of address can be made online at http://www.omeda.com/cgw/ (click on customer service assistance).

Postmaster: Send Address Changes to

Computer Graphics World, P.O. Box 3551,


Northbrook, IL 60065-3551
Please send customer service inquiries to
620 W. Elk Ave., Glendale, CA 91204

Karen Moltenbrey
CHIEF EDITOR
karen@CGW.com


Full Throttle. No Resistance.

"With LightWave 3D I can build anything I can think of... FAST. It's unbelievable to see my dream machine go from a LightWave object to rolling drop-dead coolness. If I can dream it, I can create it, with LightWave."
Tim Cameron, Visualization Artist, Designer of the V-Rex Motorcycle

LightWave v9. Get it done.
Visit www.lightwave3d.com/getitdone to download a white paper on how LightWave gets it done for visualization and design.

LightWave and LightWave 3D are registered trademarks of NewTek Inc. NewTek Inc. 2008. All rights reserved.
www.newtek.com

AMD Rolls Out New Brand, Releases Accelerator Pair


AMD recently unveiled a new brand for its professional graphics accelerators, ATI FirePro, and then announced two new professional graphics accelerators, the ATI FirePro V5700 and V3700, as the first offerings in that new line.
The ATI FirePro V5700 graphics accelerator, priced at $599, delivers application performance that is well suited for CAD and DCC professionals. With the new card, users will experience increased performance for shader-intensive applications, as much as two times more than the previous generation, according to the company.
The V5700 also features 512MB of frame-buffer memory and dual-link DVI and DisplayPort connections, while its true 30-bit display engine produces more than one billion colors at any given time, empowering designers to see more of their data. Unified Video Decoder 2.0 provides full Blu-ray feature support, including dual-stream and picture-in-picture capabilities, and handles the decoding of various formats in the GPU, which helps free up the CPU to handle other tasks.
The ATI FirePro V3700, meanwhile, establishes a new, low entry price for professional 3D graphics. For CAD professionals who are migrating from 2D, the V3700 provides 3D performance and application certification at a price of $99. The graphics accelerator features 256MB of frame-buffer memory, two dual-link DVI connectors, and VGA-mode support on all display outputs.
The ATI FirePro V5700 and V3700 are available now.

PRODUCT: GRAPHICS CARDS

Boxx Showcases Viz Solution

Sony Unveils Cell Platform

Boxx celebrated its 10th anniversary at SIGGRAPH 2008,


but instead of blowing out candles, the company demonstrated a number of its recently released products, including
the VizBoxx, its first offering to power virtual-reality systems
and very large displays, and take advantage of the expanding possibilities of GP-GPU massively parallel computing.
VizBoxx features a balance of CPU and GPU processing
power in a hyper-dense 2-by-4U form factor designed to
drive high-end simulation and visualization systems. With five
modules fitting in 4Us of a standard rack, VizBoxx reportedly
packs the graphics power of five Nvidia Quadro FX professional-class graphics cards for a total of as many as 20 DVI
outputs. Five VizBoxx modules also contain 10 Quad-Core
Intel Xeon CPUs, enough processing power to perform the
intense real-time computations required by immersive virtual-reality implementations.
The VizBoxx platform delivers visualization power in a
format that inserts itself easily in high-performance computing environments used by scientists and engineers working on complex challenges. VizBoxx clusters can help trigger new insights and foster a collaborative approach to
problem-solving.

Sony Electronics showed a new workflow solution for faster


processing of high-resolution effects and computer graphics when it unwrapped Zego.
This technology platform is based on the Cell/BE
(Cell Broadband Engine) and RSX technologies, and is
designed to eliminate bottlenecks that can occur during
postproduction, especially during the creation and rendering of visual effects.
The first product to utilize the Zego hybrid multi-core Cell platform is the company's BCU-100 computing unit. The Cell-type architecture, known as on-chip parallelism, packages a collection of processors and co-processors in a combination that is specifically designed to optimize specific types of applications. It incorporates a low-power-consumption design with the added benefit of reduced operational costs when many units are run in a clustered configuration.
Sony has been working with software companies that specialize in video production, including Nvidia's Mental Images and Side Effects Software, to create applications that take advantage of the Zego platform.
The BCU-100 computing unit is expected to ship later this
year. Pricing has not yet been announced.

PRODUCT: COMPUTING

PRODUCT: PROCESSING


HP recommends Windows Vista Business.

When DreamWorks decided to turn a 260-pound panda into a kung fu warrior, there was only one computer up to the task: the HP Workstation.
Not only do HP Workstations have the memory, processors, and graphics power required for DreamWorks' visual storytelling, they're available with HP's performance-tuning software so all your applications, operating systems, and hardware work together efficiently. Put that kind of power to work on your next big idea.
HP Workstations, starting at $639.* Learn more at hp.com/go/workstationspeed

*See hp.com for pricing on the xw8600 model shown above; reseller price may vary. Certain Windows Vista product features require advanced or additional hardware. See http://www.microsoft.com/windowsvista/getready/hardwarereqs.mspx and http://www.microsoft.com/windowsvista/getready/capable.mspx for details. Windows Vista Upgrade Advisor can help you determine which features of Windows Vista will run on your computer. To download the tool, visit www.windowsvista.com/upgradeadvisor. Kung Fu Panda (TM) & (c) 2008 DreamWorks Animation LLC. All Rights Reserved. Copyright 2008 Hewlett-Packard Development Company, L.P. The information contained herein is subject to change without notice. Simulated images. Microsoft and Windows are U.S. registered trademarks of Microsoft Corporation. Windows Vista is either a registered trademark or trademark of Microsoft Corporation in the United States and/or other countries.

Autodesk Signs Agreement with Avid to Buy Softimage

Acquisition to accelerate Autodesk's games strategy; complements digital entertainment and visual communications offering.
Late last month, Autodesk and Avid Technology signed a definitive agreement for Autodesk to acquire substantially all the assets of Avid's Softimage business unit for approximately $35 million (see "Changing Hands," pg. 8).
Softimage, which develops 3D technology for the film, television, and games markets, was founded in 1986 by Daniel Langlois and is headquartered in Montreal, Canada. Its flagship product, Softimage XSI, is an extensible 3D animation software solution used by leading media and entertainment companies, including Digital Domain, Ubisoft, Sega, CapCom, Animal Logic, and The Mill. Meanwhile, Autodesk Media and Entertainment provides animation, visual effects, editing/finishing, and color-grading solutions for the 3D market, including the entertainment and design industries.
"Softimage has been developing state-of-the-art 3D technology for more than 20 years, and its products are recognized as best-of-breed in the entertainment industry," says Marc Petit, senior vice president, Autodesk Media and Entertainment.


"Upon the completion of this acquisition, we will be adding Softimage technology and products to our portfolio, and welcoming one of the most talented teams in the industry to Autodesk Media and Entertainment. Both will help us accelerate the work of our Games Technology Group, as we build the next generation of real-time, interactive 3D authoring tools for games, film, and television."
Gary Greenfield, CEO and chairman of Avid Technology, adds: "We are excited about what this transaction means for customers of Softimage. The Softimage 3D product line has performed well in the video games market, a sector where Autodesk has a track record of success. Autodesk will provide a great home for the business."

Softimage Product Integration

Upon completion of the acquisition, Autodesk states that it intends to continue developing and selling Softimage's core product line, while integrating certain Softimage technology into future versions of Autodesk solutions and products. According to Autodesk, it plans to acquire and continue developing the following Softimage products:
Softimage XSI: Includes XSI Essentials, XSI Advanced, XSI Academic, XSI Mod Tool, and the XSI software development kit. XSI is production-proven 3D animation software, offering a complete 3D modeling, animation, rendering, and development environment for VFX and custom tools.
Softimage Face Robot: A complete software solution for easily rigging and animating 3D faces. Face Robot enables studios to create lifelike facial animation at incredible speeds.
Softimage CAT: This advanced character animation system is, ironically, a plug-in for Autodesk 3ds Max software. It is intended to be integrated into the 3ds Max product line.
Softimage Crosswalk: This interoperability solution is intended to be integrated with Autodesk's own interoperability technology.

NEWS: ACQUISITION

Petit comments: "As we have demonstrated since the acquisition of Alias in 2006, we're committed to giving our customers choices when it comes to their 3D tools. We plan to maintain and grow the Softimage product line, and, through Autodesk FBX, provide better interoperability between Softimage products, 3ds Max, and Autodesk Maya. FBX also provides interoperability between Softimage products and our specialized applications, such as Autodesk Mudbox, Autodesk MotionBuilder, Autodesk ImageModeler, and Autodesk Stitcher, as well as numerous third-party applications."
Petit concludes: "At Autodesk, we care deeply about 3D technology; we know users invest a lot of time and energy into mastering their favorite 3D product. To all 3D artists out there, I want to tell you that we understand your passion for the creative tools you use daily, and that with Autodesk, you can choose your passion."

More at CGW.com

Go to www.cgw.com for an audio interview with


Autodesk's Marc Petit and Softimage's Marc Stevens
concerning this breaking industry news.


News Analysis

"Autodesk to buy Softimage." The news came during a wild and scary late October week in the world economy, and in certain corners of the content creation world, those words were as surprising and game-changing as a stock market plunge (or rise, or rise and plunge).
There are Softimage customers who are not going to be happy about this news, but Autodesk, for its part, promises to do right by Softimage. The company plans to take advantage of Softimage's technology to build out its game development tools. In a phone interview, Marc Petit, Autodesk senior vice president and head of the Media and Entertainment Group, contends that this deal comes at the perfect time for Autodesk, as the 3D industry is undergoing revolutionary changes.
What is definitely a sign of tough times is the size of the deal. Autodesk is acquiring Softimage for $35 million. This isn't a huge amount for Autodesk, but it does put some dollars in Avid's pocket. To keep things in perspective, Autodesk acquired Alias, in another deal that surprised the industry, in 2006 for $197 million.

Past to Present
Things weren't always this way for the company. Softimage has long been considered the boutique tool for 3D content creation. Daniel Langlois founded the company in Montreal in 1986, and Montreal is where it has stayed through several upheavals. In the early days, it seemed like all glamour as the sophisticated Canadians sold their high-priced tools to the people who really needed them and appreciated what they could do. Softimage brought new animation capabilities to 3D modeling; it was one of the first to introduce inverse kinematics, the ability to animate characters by connecting rigid segments and joints, that is, connecting the thigh bone to the leg bone.

Softimage's XSI 3D content creation software has loyal followers in the entertainment markets.
But then the first wave of democratization started to happen, and the boutique was under siege. The largest upheaval was the acquisition of Softimage by Microsoft for $130 million in 1994. At that time, Microsoft was battling SGI on the workstation front and hoping to entice professionals off RISC-based workstations with Unix operating systems and on to Windows NT. Softimage, which was sold with SGI workstations, was bait. And, it was one of the first high-end content creation tools to be ported to Windows NT.
There, it went head-to-head with 3D Studio, which was ported to Windows NT in 1995 (from MS-DOS, no less) and renamed 3D Studio MAX at the time. Microsoft was working on a grand vision to have end-to-end products in content development, especially television production, as visions of digital television began to dance in industry heads. The company was promoting Softimage tools for content creation in TV, combining 3D content, video, audio, and HTML.
But in the end, Microsoft really didn't know what to do with a 3D modeling and animation company. However, it did do a good job of selling the idea of intertwined tools for audio, video, 3D content, HTML, and whatever else might come along, and Avid bought Softimage in 1998. At the time, the deal was reported to be worth $285 million, and Microsoft took a stake in Avid.
That was then. In recent years, Softimage has done a lot with very little, thanks to a hardworking team in Montreal who happen to love the city, and Softimage. Softimage has been willing to make big bets to keep its code up to date with the competition, and Avid has supported the group. Softimage's engineering talent runs deep. As with all big bets, the Softimage division has taken some hits, but its engineers believe they have built a solid development platform for future upgrades that can better take advantage of hardware advances, including multiple CPUs and advanced GPUs.
Just this past summer at SIGGRAPH 2008, Softimage introduced ICE (Interactive Creative Environment). ICE is designed to meet the demands of production managers and artists who need to develop and manage their own tools. Maya has led in this market, but it, too, is getting a little long in the tooth, and developers complain about the difficulty of programming in Maya. Softimage hoped to win over not just users, but entire teams, with its node-based development platform.
Meanwhile, Autodesk acquired French AI company Kynogon, which the company wanted to use to expand its role in the game development pipeline. By around SIGGRAPH 2008, says Marc Stevens, general manager of Softimage, he and Petit realized they were both heading in the same direction. "We had the same vision, and we were going at it from two different directions," says Stevens during the joint phone interview with Petit.
About nine months ago or so, the Softimage executive team took stock. They were doing so as their parent company, Avid, under new management, was going through the process of analyzing the company. Stevens and Softimage product manager Bill Roberts realized that the opportunities for 3D modeling and the animation market were well defined but limited. However, the opportunities within the entire development pipeline, especially the game development pipeline, were quite large.
"We needed an investment to buy middleware and game-engine technology," says Roberts. Avid, at the same time, decided it needed to focus on its core video and broadcast products, and the wheels were set in motion to find Softimage a home where it could grow.
The rumor mills have been pumping out theories about where Softimage might go. Obviously, given today's economic climate and tight money, a leveraged buyout was unlikely. According to Stevens, several companies were interested. And within the industry, rumors have been flying for months, though most of the speculation has been around Apple; Autodesk was not mentioned much. "This isn't what we thought would happen," admits Roberts when asked about the deal.
Autodesk's Petit points out that once he became aware of the opportunity, "we could not pass up that talent."
Stevens is going to take Softimage into Autodesk, and he'll join Autodesk's executive team. Autodesk says it will maintain the Softimage XSI product line, along with Autodesk's current stable of 3D content creation tools, including 3ds Max and Maya.
Max is increasingly becoming an engine for visualization, and Autodesk is strengthening the links to Revit, its architectural product, via the common, open FBX format (originally from Kaydara, which was bought by Alias in 2004). In fact, says Petit, the real opportunity is in interactive 3D storytelling on all fronts, including moviemaking, Internet interactive applications, virtual worlds, and even engineering.
Making the 3D assets is just a tiny part of the job. Stevens and Petit want to improve development tools, expand their middleware products, create game-like interfaces, and expand the entire 3D market.
Maybe the acquisition of Softimage by Autodesk isn't what anyone expected, and certainly there will be people who do not want this. But the ideas and talents that these two companies can bring together could blossom into a broad range of new products and capabilities. For many Softimage employees, there is probably a sense of relief now that the waiting is over and the big question has been answered.

Just this past summer, Softimage added ICE, a transformative open platform, to XSI Version 7.
Stereoscopy is a hot area right now in terms of entertainment, and Autodesk is addressing this area in its Maya 2009 product.

Kathleen Maher is a contributing editor to CGW, a senior analyst at Jon Peddie Research, a Tiburon, CA-based consultancy specializing in graphics and multimedia, and editor in chief of JPR's TechWatch. She can be reached at Kathleen@jonpeddie.com.

Stereoscopy

Could we be about to witness a revolution in filmmaking as profound as the introduction of sound and color? Proponents of stereoscopic 3D films think so, and it looks like they might be right.
In 2005, when Disney created a stereo 3D version of its first CG feature, Chicken Little, the stereo 3D version played in only 84 theaters equipped with the new, digital projector-based RealD systems, which use disposable, polarized glasses. The following year, stereo 3D versions of Sony's Monster House and Disney's The Nightmare Before Christmas landed in 200 RealD theaters. By November 2007, when Paramount Pictures released Beowulf in stereo 3D, the number of RealD theaters had grown to 900. Now, the chickens and the eggs, that is, the theaters and the content, are quickly moving into place for a major revolution.
Digital projection is the key to stereo 3D's theatrical success. The stereo 3D systems, such as those from RealD, that sit in front of the digital projectors control the dual images (left eye, right eye) with split-second accuracy. This has helped eliminate the headache-producing misalignments of left-eye/right-eye frames that can result when two sprocket-based projectors put images on screen (see "Supersized," January 2007).

DreamWorks Animation SKG plans to author all its movies in stereo 3D, starting with Monsters vs. Aliens, scheduled for release in March 2009.
Director Henry Selick is using varying depths to dramatize story points for Focus Features' Coraline, a stop-motion animation created at Laika and scheduled for release in February 2009.

However, the slow adoption of digital projectors by movie theaters has slowed the adoption of stereo 3D. That's about to change. On October 1, a consortium of Hollywood studios, including Disney, Paramount Pictures, Universal Pictures, Twentieth Century Fox, and Lions Gate, announced their pledge of $1 billion-plus to upgrade 20,000 North American movie theaters to digital projector systems.
"Digital conversion is the major cost," says Joe Daley of MarketSaw.blogspot.com, a Web site focused on 3D movies. "Once that's done, it's a relatively minor cost for a theater to move to 3D."
The digital projection installation project will unspool during the next three years, but already 1300 theaters show 3D films, with RealD, the leader in this field, boasting of deals in place for future installations that would bring its total to 5000 theaters. Further increasing the potential number of theaters is Dolby, which entered the picture in 2007. Dolby's stereo 3D systems, which don't require special screens but need reusable polarized glasses, have now landed in 150 US theaters and an additional 350 around the world. And, just last month, Sony announced a 3D adapter for its 4K-resolution movie-theater projectors that it plans to ship in March 2009.
The 5000-theater number would be

a milestone: When a film studio releases a so-called tent-pole film (a movie, for example, like Iron Man, Wall-e, Indiana Jones, or The Dark Knight), it lands in approximately 4000 theaters on opening day in the US. So, having 5000 potential 3D-capable theaters has sparked excitement in the major studios.
"We anticipate there will be between 2500 and 3000 screens in North America by the first quarter next year," stated John Batter, co-president of production at DreamWorks Animation SKG, speaking at The Conversation, a forum co-hosted by Scott Kirsner in early October. As for content, such prominent tent-pole filmmakers as Steven Spielberg, James Cameron, Tim Burton, Robert Zemeckis, and Peter Jackson have 3D films in the works.
In fact, Daley lists on his Web site 21 3D films scheduled for release in 2009 (CG features, stop-motion animation, and live-action films), including Zemeckis's 3D animation A Christmas Carol, Pixar's Up, Sony Pictures Imageworks' Cloudy With a Chance of Meatballs, Fox's Ice Age: Dawn of the Dinosaurs, and Focus Features' Coraline (wide release). Cameron's 3D live-action science-fiction thriller Avatar opens in December 2009. Spielberg's 3D animation Tintin, produced by Peter Jackson, is scheduled for 2010. Moreover, Disney plans to author and release all its CG films in 3D.

Disney's Meet the Robinsons was the second CG animation that the studio converted into stereo 3D.
Belgian director Ben Stassen floated the characters in his stereo 3D film Fly Me to the Moon in front of the screen, over the heads of the audience. Stassen created the film exclusively in 3D.

So does DreamWorks Animation. "We're looking at the point where there's going to be 3D content coming out every couple of weeks now," says Phil "Captain 3D" McNally, stereoscopic supervisor at DreamWorks Animation. McNally supervised the conversion of Disney's Chicken Little into 3D at Industrial Light & Magic and was stereoscopic supervisor at Disney for Meet the Robinsons.

Fundamental Changes
The studios' drive to release 3D films is causing a fundamental change in filmmaking that is ripping through the production process. "Every movie we release starting in March 2009 will be authored in 3D," says Batter, alluding to the March release of Monsters vs. Aliens. "We'll conceive in 3D and shoot in 3D. It's the next great frontier."
That's the tipping point: Filmmakers are now considering stereoscopy from the beginning. "Chicken Little and Meet the Robinsons were post conversions," says Robert Neuman, stereoscopic supervisor at Walt Disney Animation Studios. "Bolt is the first film in which 3D is part of production. As we're laying out the films and setting out the cameras for 2D, we're building the 3D version."
Similarly, at the visual effects studio Sony Pictures Imageworks, where artists created the animated films The Polar Express, Monster House, and Beowulf, all shown in 3D, stereo is moving further upstream, according to Rob Engle, senior stereographer and digital effects supervisor. "With Polar and Monster House, it was sort of an afterthought," he says. "We are much more embedded into production from the outset now."
Engle finds that this is also true for live-action directors. "Filmmakers are asking us how they can adjust their films for the stereoscopic medium," he says. "They're talking to us about which camera angles and compositions work better. These conversations didn't happen two years ago."
And, that's why the practitioners of stereoscopy compare stereo 3D today to the introduction of color and sound in years past.
"If you look at the advances in the history of cinema, they all had technical hurdles to overcome," says Neuman. "Color had to have an emulsion developed that conveyed the spectrum properly. And, in the earliest color movies, filmmakers were using color more for spectacle than for storytelling. Color was so saturated reviewers said it hurt." Much like color, Neuman explains, 3D had false starts because of the technology and the spectacle of 3D that seduced early filmmakers: "The studios would go


New Multibridge Pro has SDI, HDMI and Analog editing with multi channel audio for only $1,595

Multibridge Pro is the most sophisticated editing solution available. With a huge range of video and audio connections and the world's first 3 Gb/s SDI, advanced editing systems for Microsoft Windows and Apple Mac OS X are now affordable.

World's Highest Quality
Multibridge Pro includes 3 Gb/s SDI and Dual Link 4:4:4 SDI for connecting to decks such as HDCAM SR. Unlike FireWire, Multibridge Pro has a 10 Gb/s PCI Express connection for powerful HD real time effects in compressed or uncompressed video file formats.

Connect to any Deck, Camera or Monitor
Multibridge Pro is the only solution that features SDI, HDMI, component analog, NTSC, PAL and S-Video for capture and playback in SD, HD or 2K. Also included are 8 channels of XLR AES/EBU audio, 2 channels of balanced XLR analog audio and 2 channel HiFi monitoring outputs. Connect to HDCAM, Digital Betacam, Betacam SP, HDV cameras, big-screen TVs and more.

Microsoft Windows or Apple Mac OS X
Multibridge Pro is fully compatible with Apple Final Cut Pro, Adobe Premiere Pro, Adobe After Effects, Adobe Photoshop, Fusion and any DirectShow or QuickTime based software. Multibridge Pro instantly switches between feature film resolution 2K, 1080HD, 720HD, NTSC and PAL for worldwide compatibility.

Advanced 3 Gb/s SDI Technology
With exciting new 3 Gb/s SDI connections, Multibridge Pro allows twice the SDI data rate of normal HD-SDI, while also connecting to all your HD-SDI and SD-SDI equipment. Use 3 Gb/s SDI for 4:4:4 HD or edit your latest feature film using real time 2048 x 1556 2K resolution capture and playback.
The Drawn Together images are courtesy of Comedy Partners.

Multibridge Pro: $1,595
Learn more today at www.blackmagic-design.com


for the gimmicks, the 'throw everything at the audience' approach."
Now, filmmakers working with animated and live-action features are using 3D more creatively. To learn what they are discovering, we talked with several 3D pioneers creating animated CG films, stop-motion features, and live-action films.

Sculptural Movies
"I talk about stereo 3D now as spatial moviemaking," says McNally. "The difference is like comparing painting to sculpture. We have the potential to conceive the whole storytelling art form as a spatial art form, especially when we combine CG with stereoscopic moviemaking."
Close-up shots provide one example of how 3D might differ from traditional moviemaking. In the latter, when a director shoots a star for his or her close-up, the character in the film appears closer because it looks bigger. "It's part of the illusion you create when you're working in a medium that doesn't support distance," McNally says. "But, in a 3D movie, it's possible that the close-up could be shot with the framing quite wide, and with the character moving closer to us. Is that more powerful, or is it distracting? These are questions we're trying to understand."
When stereographers structure a 3D experience, they work within certain limits. "We have something we call the stereo budget, or the parallax budget," Engle says. "That's the technical limitation for how far away or close something can be before it becomes uncomfortable [to look at]. So, the creative aspect is in using that space to give the audience the best experience."
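The parallax budget Engle describes can be made concrete with a little camera geometry. The Python sketch below is only an illustration of the idea, not any studio's tool: it assumes a parallel stereo rig whose convergence is set by a horizontal image shift, and every number in it (lens, sensor, budget limits) is made up for the example.

# Minimal stereo-parallax estimate for a parallel rig with horizontal image shift.
# Illustrative only: a real pipeline also accounts for screen size, seating
# distance, floating windows, and per-shot creative choices.

def parallax_percent(interaxial_m, focal_mm, sensor_width_mm,
                     convergence_m, depth_m):
    """Signed parallax as a percentage of image width.

    Negative values play in front of the screen, positive behind it,
    and zero sits on the screen plane (depth == convergence distance).
    """
    shift = (interaxial_m * focal_mm / sensor_width_mm) * (1.0 / convergence_m - 1.0 / depth_m)
    return 100.0 * shift

def within_budget(depths_m, rig, near_limit=-1.0, far_limit=2.0):
    """Check a shot's nearest/farthest depths against a parallax budget
    (here -1% of image width in front, +2% behind; purely illustrative)."""
    values = [parallax_percent(depth_m=d, **rig) for d in depths_m]
    return all(near_limit <= v <= far_limit for v in values), values

# Example: a 65mm interaxial, 35mm lens on a 24mm-wide sensor, converged at 6m.
rig = dict(interaxial_m=0.065, focal_mm=35.0, sensor_width_mm=24.0, convergence_m=6.0)
ok, values = within_budget([4.0, 6.0, 40.0], rig)
print(ok, [round(v, 2) for v in values])

In this toy setup, the nearest object floats slightly in front of the screen and the distant background sits a little behind it, which is the kind of check a stereographer makes before the budget is spent creatively.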
Within that limitation, a director can bring objects out into the audience or use stereo 3D as a window into which the audience looks deeply. "I've often heard James Cameron say he's keeping the subject of interest on the screen," McNally notes. "He's shooting live action, and you can understand why you would want to edit without worrying about the subjects jumping around. The other extreme is Fly Me to the Moon. The space of the movie is detached from wherever the screen is, almost nothing plays at the screen, and almost every character is within arm's reach. We're using both techniques and working between [them]."

Director Eric Brevig filmed the live-action Journey to the Center of the Earth in stereo 3D. The July 2008 release into multiplex theater chains earned $100 million at the box office.
Don Hahn, who was the producer at Disney for a number of feature animations, including Beauty and the Beast, The Lion King, The Nightmare Before Christmas, and The Emperor's New Groove, is currently the executive producer for Tim Burton's 3D Frankenweenie, scheduled for December 2009. "Directors are entering a whole new space with new rules," he says. "For example, you might have a deep set, like a big ballroom, but you want the audience to focus downstage or way upstage. So, you converge the eyes to different points on the screen. Also, you can play with distance to create a child-like wide-angle perspective or a more normal adult sense of a room. But, I suppose pacing is the biggest difference. When you convert flat animation to 3D, there are times when you wish you could linger and let the eye adjust to a scene. The [stereo 3D] directors have extra time to follow objects to the z plane and back again."
The possibility of such leisurely pacing is something that Adam Holmes, vice president of Wide Band Entertainment, finds enticing. Holmes is currently executive producer for a stereo 3D feature animation, an independent production made feasible by the promise of 5000 theaters. "If we can design stereo 3D films in which we hold on to the scenes, we might be able to bring some artistry back into our ADD world," he says. "We may be able to extract emotion from films, not just fast cuts and fast pacing."

Emotional Depth
In designing the overall look through the arc of a film, the stereoscopic teams frequently compare the conceptual use of 3D to that of a sound track. "We constantly play with the depth like a symphony," McNally says. "We might start quiet, build to a crescendo, and then fall back so that depth is something that flows." The stereoscopic artists consider depth within individual scenes as well.
"When filmmakers went for the gimmick of hurtling stuff at the audience, they pulled the audience out of the film, instead of creating an immersive experience," Neuman says. "We're taking a more mature approach." For example, in the film Bolt (see "Back to the Future," pg. 20), Disney stretches the environment fully for action scenes, but when the dog Bolt talks with the cat Mittens, and it's a lighthearted moment, the crew tones down the depth.
The instruments that stereographers


Finally, converters that auto switch SD and HD and include AES/EBU and analog audio!

Build your studio with the world's most advanced converters. Only Mini Converters include auto SD/HD switching, redundant input, AES/EBU and analog audio on 1/4 inch jack connections, plus advanced 3 Gb/s SDI. There are 4 great models to choose from depending on the conversion you need, plus a sync generator model!

Auto Switching SD and HD
Mini Converters instantly switch between all SD and HD formats, including NTSC, PAL, 1080i/59.94, 1080i/50, 1080PsF/23.98, 1080PsF/24, 720p/59.94, 720p/50. Updates can be loaded via USB.

New 3 Gb/s SDI Technology
Mini Converters include the latest 3 Gb/s SDI technology, so you're always future proofed! 3 Gb/s SDI is also fully compatible with all your existing standard definition and high definition SDI equipment.

Broadcast Quality
Mini Converters are built to the highest quality standards with low SDI jitter, so you get the longest SDI cable lengths combined with ultra low noise broadcast quality analog video and audio.

Redundant SDI Input
Mini Converters feature a redundant input and loop through SDI output. Connect a redundant SDI cable to the second input, and if the main SDI input is lost, Mini Converters will automatically switch over in an instant. That's great for mission critical tasks such as live events.

Pro Analog and AES/EBU Audio
Standard 1/4 inch jacks are built in to each Mini Converter for professional balanced audio that switches between AES/EBU or analog. Unlike other converters, you don't need expensive custom audio cables.

Five Exciting Models
Choose either SDI to HDMI, HDMI to SDI, SDI to Analog or Analog to SDI models for only $495. Reference all your studio equipment with Sync Generator for $295.

Mini Converters: $495
Sync Generator: $295
Learn more today at www.blackmagic-design.com


use to move objects toward the audience or push them far away are the distance between the two cameras (one for each eye) and the convergence point for the two images. The choice of camera lens affects depth as well.
"Let's say you want to use a long lens, maybe a 100mm lens," Neuman says. "That gives you an unsatisfactory result in 3D: There is a large separation in depth between the characters from foreground to background, but each character looks like cardboard. To compensate, you can increase the interocular or interaxial distance [between the cameras], which increases the internal volume in the characters, but the gaps between the characters are also magnified. And that could quickly use up the stereo budget." To solve this problem, Disney and other studios use multiple sets of cameras (perhaps one for the foreground, one for the midground, and one for the background) and dial in the depth to create the internal volume and roundness for the character. Then, they composite the layers together.
With that kind of control available, directors can choose where to place characters in scenes and how much depth to give characters based on the story they're telling. "One metaphor we build on is equating the emotional depth of a scene to the depth of the character," Neuman says. "We might increase the depth until we get a nice, round, full character for an emotional beat. A second, literal metaphor we use is equating emotional separation to depth."
For example, when directors want the audience to connect with a character, they might place that character on the audience's side of the frame; that is, bring the character out into the audience a little. When they want the audience to feel detached from the character, they might push the character back.
To do this without changing other characteristics in the scene, Neuman sometimes uses another stereoscopic technique called the floating window: The stereoscopy crew puts a black mask on the edges of the images and then varies the thickness of that mask between the two eyes to change the perceived location of the theater screen. "We create a virtual screen (a window) that we can float into the theater, push back into the screen space, or change the orientation of the image," Neuman says. This helps them fix problems, perhaps a frame edge that crosses in front of an object appearing in front of the screen. Moving the frame puts the depth cues back in sync.
But, he also has begun using the floating window more creatively, to temper the balance between emotional and stereo depth. "Sometimes if we can just pull out one corner, we can retain the proximity we want," Neuman says. "And, we're experimenting with tilting the window during action sequences to create tension. We're sculpting a 3D environment artistically based on where the eyes are drawn and where the action is taking place."
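In image terms, the floating window is nothing more than asymmetric black borders on the two eyes. The short sketch below illustrates that idea only; the mask widths are invented for the example, and production crews animate (and tilt) the window per shot inside their own tools.

# Illustrative floating-window mask, not production code: widths are made up.
import numpy as np

def float_window(left_img, right_img, shift_px, base_px=20):
    """Black out asymmetric strips on the frame edges of a stereo pair.

    A wider strip on the left edge of the left eye and on the right edge of
    the right eye gives both frame edges negative parallax, so the perceived
    window floats off the physical screen toward the audience.
    """
    left, right = left_img.copy(), right_img.copy()
    # Left edge: thicker in the left eye.
    left[:, :base_px + shift_px] = 0
    right[:, :base_px] = 0
    # Right edge: thicker in the right eye.
    left[:, left.shape[1] - base_px:] = 0
    right[:, right.shape[1] - (base_px + shift_px):] = 0
    return left, right

# Example with dummy 1080p frames; the 12-pixel offset is arbitrary.
h, w = 1080, 1920
left_eye = np.full((h, w, 3), 0.5, dtype=np.float32)
right_eye = np.full((h, w, 3), 0.5, dtype=np.float32)
left_out, right_out = float_window(left_eye, right_eye, shift_px=12)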
New tools from several software companies now help artists working in 3D. For example, Autodesk's Lustre speeds the work of colorists who are grading 3D films. The Foundry's Ocular plug-in for Nuke helps compositors who paint one eye to create the second eye in the correct depth. Quantel's Pablo system aids artists matching the colors on the left and right. And, Tweak's RV software gives artists the ability to see images in stereo 3D on standard monitors within the RV viewing software.
The animation and visual effects studios are also creating 3D-specific tools, and those tools provide a glimpse into the state of the art for stereoscopy.
"We've had to come up with a whole new workflow," says Matt Welford, head of compositing at Weta Digital, where work is under way on James Cameron's Avatar and Steven Spielberg's Tintin. "We use [Apple's] Shake and Nuke for compositing, which are both node-based. Initially, we made a left tree for the left eye and a right tree for the right eye, so we had two trees per shot rather than one, as in traditional films. To reduce the number of files and make file management neater for artists, we worked on a way to represent stereo images in a single image file. To do that, we formalized the names of the left and right eye within the EXR file format."
Weta called the new format SXR, and worked with the OpenEXR community to make it broadly useful. "Over the last six months, we've seen companies starting to adopt, or at least support, this format," Welford says. "Now, within Nuke, for example, there's a left-eye and a right-eye button in a single file."
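The basic idea behind a single-file stereo EXR can be sketched in a few lines. The example below is an assumption-laden illustration, not Weta's SXR specification: it simply tags channels with "left." and "right." prefixes (the exact naming convention is not described in the article) and uses the open-source Python OpenEXR bindings to write them into one container.

# Sketch of packing both eyes into one OpenEXR file via view-prefixed channels.
# The "left."/"right." prefixes are an assumption for illustration only.
import numpy as np
import OpenEXR, Imath  # pip install OpenEXR

def write_stereo_exr(path, left_rgb, right_rgb):
    """left_rgb/right_rgb: float32 arrays of shape (height, width, 3)."""
    height, width = left_rgb.shape[:2]
    float_chan = Imath.Channel(Imath.PixelType(Imath.PixelType.FLOAT))

    header = OpenEXR.Header(width, height)
    names = [f"{view}.{c}" for view in ("left", "right") for c in "RGB"]
    header["channels"] = {n: float_chan for n in names}

    pixels = {}
    for view, img in (("left", left_rgb), ("right", right_rgb)):
        for i, c in enumerate("RGB"):
            pixels[f"{view}.{c}"] = img[:, :, i].astype(np.float32).tobytes()

    out = OpenEXR.OutputFile(path, header)
    out.writePixels(pixels)
    out.close()

# Example with small dummy frames; the file name is hypothetical.
l = np.zeros((270, 480, 3), dtype=np.float32)
r = np.zeros((270, 480, 3), dtype=np.float32)
write_stereo_exr("shot010_stereo.exr", l, r)

However the views are named, the payoff is the one Welford describes: a compositor loads one file and flips between a left-eye and a right-eye view instead of juggling two parallel trees of images.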

"We have a whole suite of tools that take care of the nuts and bolts of stereo for the artists," says Phil McNally, stereoscopic supervisor at DreamWorks. "If an animator has a scene with a character running up a road straight to camera, we would want the character to feel distant at the beginning and literally get closer to us at the end. We give the artists controls so they can dial the stereo as they watch the performance. Artists can have a different stereo setting on every keyframe without worrying about distortions, and without knowing the nuts and bolts."
"We also have something called the multi-rig," McNally adds. "In CG, it's easy to have three, four, five stereo rigs all pointing to the same scene, in much the way we have multiple lights putting rim lights on one character, for example, but not another. Unless you're doing a 3D science project, why not have everything up for manipulation and artistic interpretation?"
"We developed a camera rig that uses a results-driven paradigm. I can define how far out from the screen I want the closest part of the scene to be, how deep into the farthest part of the scene, and where I want the convergent point to be," says Robert Neuman, stereoscopic supervisor at Disney. "Based on those three things, the system sets up the camera positions for us."
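A toy version of that results-driven idea can be written as a small inverse problem: pick where the nearest and farthest parts of the scene should sit, and solve for the interaxial separation and convergence distance that hit those targets. The sketch below uses the same simplified single-rig model as the earlier parallax example and is not Disney's tool; all numbers are illustrative.

# Solve a simplified stereo rig from near/far parallax targets.
# p_near/p_far are fractions of image width (negative = in front of the screen).

def solve_rig(z_near_m, z_far_m, p_near, p_far, focal_mm=35.0, sensor_width_mm=24.0):
    """Return (interaxial_m, convergence_m) that place the nearest and farthest
    scene depths at the requested screen parallaxes, for a parallel rig with
    horizontal image shift."""
    k = focal_mm / sensor_width_mm
    interaxial = (p_far - p_near) / (k * (1.0 / z_near_m - 1.0 / z_far_m))
    inv_conv = p_far / (k * interaxial) + 1.0 / z_far_m
    return interaxial, 1.0 / inv_conv

# Example: the nearest object at 4m should float 1% of the image width in
# front of the screen; the 40m background should sit 2% behind it.
b, c = solve_rig(4.0, 40.0, -0.01, 0.02)
print(f"interaxial {b*100:.1f} cm, converge at {c:.2f} m")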

Neuman continues: "We also have a tool to visualize the floating window within a 3D scene. We can see exactly where the floating window lies. So, if I have an over-the-shoulder shot with a character breaking the frame on frame right, I can grab that floating window within the camera space and pull it out until it's in front of the character."
To avoid the cardboarding that can happen when the crew uses a telephoto lens, they have tools that tell them the internal volume of characters. "If a character has a factor of 1.0, it's a nice, round character the audience relates to. If it's .1, or only 10 percent round, that's a problem, and we might need to use multiple camera rigs," Neuman explains. "We also have tools that allow us to view what it would look like with multiple rigs. All the tools we developed in-house work within [Autodesk's] Maya."
"We have a huge suite of tools, from image viewing to dialing the depth within Maya, that we're constantly refining," says Rob Engle, senior stereographer at Imageworks. "But one area we're particularly interested in moving forward is the 2D-to-3D conversion. The most relevant example is our work on G-Force." In this film, which is scheduled for July 2009, a specially trained squad of guinea pigs becomes a force for doing good. "We're taking 2D plates, isolating elements that would be at different depths within the plate, and producing the other eye's point of view. We slide the elements over to create depth and then fill in the holes that the other eye would see. We're developing a wide variety of tools that allow us to do that. And then, we have the added challenge of putting elements from the virtual world into the plates. And, we're still investigating the techniques we'll need for Tim Burton's Alice in Wonderland. Imagine Beowulf but with live people in a 3D virtual world; that's Tim Burton's vision of Alice in Wonderland."
Frantic created 200 shots for Journey to the Center of the Earth using Autodesk's 3ds Max and Mudbox, Pixologic's ZBrush, Nvidia's Gelato, and several proprietary tools, including the studio's water-simulation software. For compositing, particularly to work with stereo, the studio created a series of scripts and plug-ins for Eyeon's Digital Fusion that they've named Awake. The studio sells the tools it developed to work on Journey and other films on its FranticFilms.com Web site.
"We wrote a completely new pipeline for Journey," says Chris Harvey, co-visual effects supervisor at Frantic with Mike Shand. "The entire movie was shot in stereo, so we had double plates for everything. We could have decided to work on one eye and then ask artists to create the second eye, or we could have had scripts generate the second eye. But, we came up with a third approach. Our artists worked in stereo all the way from tracking to finishing the shots. We literally worked on the shots with both eyes at the same time, stacked side by side or vertically. We called it stereo stacking. It was a huge timesavings."
Barbara Robertson

Imageworks turned the CG film Beowulf into a stereo 3D version using techniques honed on The Polar Express and Monster House.

At Laika, Coraline director Henry Selick is using stereo 3D to enhance a main story point for the stop-motion animation. "Our main character, Coraline, is a little girl who lives in a bland, overcast place," says Brian Van't Hul, VFX supervisor, who won an Oscar for visual effects in King Kong while he was at Weta Digital. "In that real world, we keep the stereo flat. But when she goes to the magical land, we increase the depth to enhance the move. We're using stereo 3D in an intentionally dramatic fashion."
Because stop-motion animators shoot their films one frame at a time, in order to create the stereo 3D version, they simply shoot two stills. "We shoot the left eye with the digital still camera and then move it over and shoot the right eye," explains Van't Hul. Compositors layer the stereo images into backgrounds moments after an animator shoots the frame.

Before beginning the project, Selick consulted with Lenny Lipton, CTO at RealD and a renowned stereo 3D pioneer. The studio brought in other stereo experts, as well. "They taught us all the rules, and it's good to know the rules," Van't Hul says. "But many of us are from creative backgrounds. We're trying to break as many rules as we can."
Depth of field is one point of contention. "The experts say everything should be as sharp as possible to make stereo work," Selick says. "That way there's no eye strain and you can see everything. I say, 'Yeah, but the whole point is that we don't want the audience to look everywhere.' The easiest thing when you have so much in frame is to go a bit wider and throw the background out of focus so the foreground pops."
Thus, Van't Hul and his crew have decided to flout conventional wisdom when they can. "We don't launch an animator on a shot lightly," he says. "We don't have time to go back. But sometimes, given the energy of a shot, we can be bold if the shots aren't on the screen long enough to cause eye strain."

Live Action
One of the biggest 3D success stories for 2008 has been the feature film Journey to the Center of the Earth, directed by Eric Brevig, who received two Oscar nominations for best visual effects while at ILM (Pearl Harbor, Hook) and a Special Achievement Award from the Academy for Total Recall. This Walden Media/Warner Bros. film, which Brevig shot in 3D, has grossed $178 million worldwide.
Frantic Films, one of several studios that worked on the feature, created approximately 200 VFX shots, including those in which they integrated footage of actor Brendan Fraser and others in a raft that floated on CG water. "We had to track two cameras for every shot," says Chris Harvey, co-VFX supervisor with Mike Shand.

Stereo 3D at Home
Although movie theaters are excited that stereoscopic 3D promises to pull people away from their screens at home, consumer products are in the works for viewing 3D at home and even on the road. Samsung and Philips have demonstrated no-glasses 3D TVs. Hyundai sells a 46-inch 3D TV that requires glasses. At the Ceatec show in Japan recently, Panasonic showed a 3D high-definition home theater, which requires 3D glasses; NEC demoed a nine-inch glasses-less 3D LCD; and KDDI showed a 3.1-inch glasses-less 3D LCD display. And Fuji has announced a pocket-sized camera that shoots 3D movies.
In addition, Nvidia is introducing new glasses and a transmitter that work with existing CRT displays, high-definition DLPs, and the new 120Hz LCD monitors. Software in the system converts a standard 3D video game into a stereoscopic 3D game. "Since all the data is coming down the pipe in real time, we don't have to pre-author," says Andrew Fear, product manager. "We have a predefined depth amount, but if the end user wants to adjust the depth, he or she can."
Although the target for the glasses is gamers, Nvidia expects that when a Blu-ray or other standard emerges for high-definition 3D movies, the DVD crowd will want to view stereo 3D movies at home, too. "I think we're probably six months to a year away from having a rigid format," Fear predicts. "In the meantime, we're building our architecture to play back whatever they adopt."
Also aimed at gamers is iZ3D's $599 22-inch LCD monitor that displays stereo 3D with the help of a pair of glasses. The display, which gamers can also use as a standard monitor on PC systems equipped with a dual-output video card, displays 1680x1050-resolution images and stereo with a 170-degree viewing angle.
Meanwhile, such display manufacturers as Alioscopy are readying glasses-free 3D systems. Adam Holmes, producer of an announced stereo 3D feature animation, has been using these monitors for pre-production and looks forward to the day when glasses-free displays will find a home in living rooms. "For now, though, I'm most excited about the possibility of monitors like these being used as digital billboards," Holmes says. "That could really help advertise our independent movie."
Barbara Robertson

And for every frame, we had to create two separate images."
Typically, the crew would track one eye view and then use the studio's software tools to generate the second eye view. Even so, the artists needed to tweak the result. "The physical cameras caused discrepancies," says Harvey, "and the two tracks had to match perfectly."
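Generating a second eye from a single plate is, at its simplest, the layered "slide and fill" approach Engle describes in the sidebar. The sketch below is a toy illustration under that assumption, not Frantic's or Imageworks' pipeline: each depth-sorted RGBA layer gets a horizontal offset and is composited over the result, and any transparent pixels left behind are the holes an artist would still have to fill.

# Toy second-eye synthesis from depth-sorted 2D layers. Illustrative only:
# real conversions add per-pixel depth, hole filling, and hand paint fixes.
import numpy as np

def shift_layer(rgba, disparity_px):
    """Shift an RGBA layer horizontally; newly exposed columns stay transparent."""
    out = np.zeros_like(rgba)
    d = int(disparity_px)
    if d > 0:
        out[:, d:] = rgba[:, :-d]
    elif d < 0:
        out[:, :d] = rgba[:, -d:]
    else:
        out[:] = rgba
    return out

def synthesize_eye(layers):
    """layers: list of (rgba, disparity_px), ordered back to front.
    A negative disparity pulls a layer toward the viewer in the new eye."""
    h, w = layers[0][0].shape[:2]
    canvas = np.zeros((h, w, 4), dtype=np.float32)
    for rgba, disparity in layers:
        shifted = shift_layer(rgba, disparity)
        a = shifted[:, :, 3:4]
        canvas[:, :, :3] = shifted[:, :, :3] * a + canvas[:, :, :3] * (1 - a)
        canvas[:, :, 3:4] = a + canvas[:, :, 3:4] * (1 - a)
    return canvas  # remaining transparent pixels are the holes to fill

# Example: a background plate (no shift) and a foreground card shifted 8 px.
bg = np.ones((540, 960, 4), dtype=np.float32)
fg = np.zeros((540, 960, 4), dtype=np.float32); fg[200:340, 400:560] = 1.0
right_eye = synthesize_eye([(bg, 0), (fg, -8)])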


Similarly, rotoscoped images needed to
match perfectly, even though the original
images might not. For example: The camera views of Brendon Frasers hair might be
slightly different, says Shand. You might
see a curl of hair in the left-eye image but
not in the right-eye image. That confuses
the viewers brain and causes what I call
buzzing. So, the artists had to tweak the
images by hand.
Specular highlights captured by the
camera were often different for the left and
right eyes, as were lens distortions and colors. The biggest things that would make
this process easier would be if the camera
systems produced footage that was more
identical in color, Shand says, and, if we
could improve the interocular tracking.
As do filmmakers working with 3D animation, Frantic's visual effects crew used the distance between the cameras as a storytelling device for their largely digital shots. "If you increase the distance, things effectively look small, as if you were a giant," Harvey says. "If you do the reverse, everything feels bigger and more imposing. So, we constantly shifted interocular distance and convergence to tell the story. We did a lot of adjusting, literally sliding the images left or right."
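To make the storytelling role of those two parameters concrete, here is a minimal sketch, not Frantic's actual tooling, of how interocular distance and convergence distance translate into on-screen parallax for a stereo camera pair; the camera numbers and function names are illustrative assumptions only.

def screen_parallax_px(depth_m, interocular_m, convergence_m,
                       focal_mm=35.0, sensor_width_mm=36.0, image_width_px=2048):
    """Horizontal offset, in pixels, between the left- and right-eye views of a
    point at depth_m for a converged (shifted-sensor) rig. Positive values sit
    behind the convergence plane, negative values in front of it."""
    # Similar triangles: parallax on the sensor = focal * interocular * (1/C - 1/Z).
    parallax_mm = focal_mm * interocular_m * (1.0 / convergence_m - 1.0 / depth_m)
    return parallax_mm / sensor_width_mm * image_width_px

# Widening the interocular exaggerates depth (the "giant" feeling Harvey describes);
# narrowing it flattens the scene. Changing convergence slides the scene forward or back.
for interocular in (0.030, 0.065, 0.120):
    print(interocular, round(screen_parallax_px(10.0, interocular, convergence_m=4.0), 1))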
In doing so, the crew quickly discovered that making those adjustments on a computer screen wasn't sufficient. "You have to look at these images big," Harvey says. "To get things to sit properly in stereo, we were moving things at a sub-pixel level."

For example, differences in the live-action images sometimes caused ghosting. "Ghosting happens when high-contrast areas close to each other are at different depths," Harvey says. "It causes buzzing in your eyes, and it's hard to resolve. If you're working only in CG, you can move the images onto a zero plane with no convergence, where the left and right images are the same. In live action, what was shot is what was shot."
At Weta Digital, where work on Cameron's Avatar is ongoing, Matt Welford, compositing head, is running into some of the same challenges. "Often, we find that pulling a key for the left eye will not work for the right because of differences in lighting," he says. "We have to make sure the quality is perfect for both eyes, but the cameras are looking in slightly different areas."
That makes the job of painting the images much more difficult. "Hand painting on a single view is already difficult," Welford says. "But now, if we have to remove something from a scene and hand-paint the image, we're painting at a very specific depth. If we put something at a slightly different depth for one eye, we throw the stereo off. We had to toss a lot of the tips and tricks we used to use out the window. However, it's good to be doing something new and different."


Moving Deeper

Disney's 2005 animation Chicken Little was the first stereo 3D film shown in 84 theaters newly equipped with RealD projectors. Experts estimate that in the first quarter of 2009, approximately 3000 North American theaters will be stereo 3D-ready.

Welford is certain that although the studio will solve some of the complex problems in the next few years, creating hybrid live-action/CG films, such as those he's working on, won't become a push-button process anytime soon. "I think maybe for people working in fully CG movies, it will be easier to make 3D a process, but it will take a long time for people working with live-action stereo films to develop automated processes that will be at a standard that's acceptable for final production."

Even so, Imageworks' Engle believes that the choice to make a 3D film will eventually become as insignificant as the choice between filming in color or black and white. "We have three hurdles to overcome," Engle says. "One, audience acceptance among adults as much as kids. Two, filmmaker acceptance; not every filmmaker is as excited as Burton, Spielberg, Zemeckis, and Cameron. It remains to be seen at what stage everyone jumps onboard, and that may never happen. And three, making the decision a no-brainer. We're not there yet. We have a lot of work to do with tools, but what we're really missing right now is experience. The visual language is still evolving."

What might we see in the future as stereographers such as Engle, McNally, Neuman, and various directors and visual effects supervisors continue exploring the possibilities? Engle offers a few thoughts: "So far, everyone is using 3D as a representational experience," he says. "They're mimicking reality. Why do we need to do that? I think there is a filmmaker out there who will make a revolutionary movie, someone who will really push the technology. Someone who will break the rules."

McNally is looking in that direction, too. "You don't expect a painting to be a photograph," he says. "You expect to see the medium. What does it mean when we're in a full three-dimensional spatial delivery of a story? That's what is so exciting. We have an undiscovered medium."

But with 5000 3D theaters in the offing and directors readying new content, it's a medium that's ready for its close-up.

Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at BarbaraRR@comcast.net.
Images ©2008 Disney Enterprises, Inc.

as the little American white shepherd dog with the black lightning bolt on his side discovers in Bolt, Disney's latest animated film. Bolt, the dog, stars in an action-packed TV show, and the CG feature opens with an amazing chase scene. Bolt leaps through the air, stops the bad guys in their tracks with his superbark, and, most important, rescues his co-star, Penny, from the green-eyed villain. Then, it's over. The director yells, "Cut," the crew takes Bolt back to his trailer, and Penny goes home.

The production crew, we learn, wants little Bolt to believe he really does have superpowers; they won't let him live a dog's life with Penny. And, he does believe. So, when a series of mishaps puts him outside the trailer where he spots the green-eyed man, he believes the villain has kidnapped Penny again. He races to her rescue, only to land in the back of a truck heading for New York City. And that's when Bolt's real adventure begins.
Directed by Chris Williams and Byron Howard, Bolt stars the voice talents of John Travolta as Bolt
and Miley Cyrus as Penny. Its the first film shepherded through Disney Animation Studios by John
Lasseter, who arrived as chief creative officer at Disney in 2006 following the Pixar acquisition.
Every artist in this building is operating at a level theyve never operated at before because of John
Lasseter, says Clark Spencer, Bolt producer. Four years ago, this was a completely different film, with
a different title and different directors. This movie would not be what it is without John Lasseters
involvement in it.
Four hundred artists worked on the feature for 88 weeks, producing approximately four seconds of the film per week. Lasseter chose the directors: Williams directed Disney's first 3D short film, Glago's Guest, wrote the story for Mulan and The Emperor's New Groove, and received an Annie nomination for the latter; Howard was a supervising animator on Lilo & Stitch and Brother Bear, an in-between artist on Pocahontas, an animator on Mulan, and received an Annie nomination for his character animation in Brother Bear. For Bolt, Howard tended to concentrate on the animation side, while Williams focused on the story.
John [Lasseter] loved the essence of the story
we had, that a dog lives on a TV show and only
knows the TV show, and is thrown into the real
world and discovers [his previous life] is a lie,
Clark says. And then he returns to the little
girl to find out if she was acting.
Bolt, the dog star of the film Bolt, owes the look of the painterly CG background behind him (at left) to new technology invented at Disney Animation Studios for this project.

In the film, once Bolt is in the real world, he tangles with a street cat named Mittens. Later, a hamster named Rhino, who lives inside a glass ball, rolls into the adventure. Here's the dynamic among the three: Bolt believes Mittens will lead him to the green-eyed villain and, thus, to Penny, so he leashes the cat to his collar. Mittens quickly realizes who Bolt is and ridicules his supposed superpowers. Rhino, on the other hand, a nerdy fanboy who has memorized every Bolt TV show, totally believes in his hero and encourages Bolt to display his superskills. When Bolt warns Rhino that their journey could be dangerous, the little hamster replies from inside his glass bowl, "I eat danger for breakfast."
Rhino is like the kid who would throw
a towel around his neck like Superman and
jump off a roof, says Nathan Greno, super
vising storyboard director, who joined the
crew as Lasseter and Ed Catmull took the
reins at Disney Animation. Although Bolt
was always a dog and Mittens always a cat,
originally Rhino was a rat. Everything was
up for grabs, Greno says. Rhino turned
into a hamster during a retreat.
We were with about 35 or 40 directors
from Pixar and Disney, and someone said,
I always wanted to do a hamster in a ball,
Clark says. We all knew, at that moment,
wed have a hamster in a ball.

Animal Truth
Disney casts animators based on their skills,
and organizes the supervisors by character.
Sixty-two animators worked on Bolt, led
by six supervisors. Clay Kaytis led the team
of animators working on Rhino.
Of all the characters, Rhino has the
most controls, Kaytis says. Hes the most
complicated because he is fat and round,
and volumes want to crash. The anima
tors worked in Autodesks Maya using rigs
that allow for squash and stretch. Deform
ers rode on the topology to simulate what
a muscle system would do, but calculated
the blendshapes only on surface changes.
Kaytis moved Rhinos belly with a control
button; the hamsters whiskers, though,
moved automatically.
All the animals talk, except when hu
mans are on screen, and the animators
started working on shots by listening to the
dialog. I started with the character in a set,
in a layout pose, says Kaytis. Then I lis
tened to the audio over and over, to under
stand what the throat is doing and when to
take a breath. I did five frames of poses to
show the director. And then I shut the door
and worked for a couple days.

Rhino, the hamster in the ball, tells his hero Bolt that he'd be a big help. Mittens, the cat leashed to Bolt, isn't convinced. About 400 artists, including 62 animators, worked for 88 weeks on the film.


Rhino is the most anthropomorphic of
all the animals because he stands up inside
his bubble; otherwise, all the four-legged
animals stay on all fours. John [Lasseter]
has certain rules, Kaytis says. Well, maybe
not rules, but consistencies. His mantra is
truth in materials; something has to move
the way it looks. So, if an animal looks like
a dog, it should move like a dog.
Although Bolt looks cartoony in his TV
show, when hes outside, hes in the real
world, which meant he needed to move
realistically. To help the animators under
stand real-world animal behavior for the
dog, cat, hamster, and other critters in the
film, Disney had trainers bring various
animals into the studio and had Dr. Stuart
Sumida, a biology and anatomy professor
at California State University in San Bernardino, teach classes. In addition, Rhino
animators could observe a real hamster
named Doink, which joined the crew and
had his own glass ball.
Sometimes, though, the filmmakers
needed to fudge accuracy. For example:
We reached a point when we thought we
had nailed the dog, Kaytis says. We had
animated a couple sequences. But he wasnt
appealing. So, we looked at the dogs in Lady
and the Tramp, at the shape of their brows,
how their muzzles worked, their ears, and
their eyes-to-face relationship. And then

we remodeled Bolts face. They gave him a


larger brow ridge, made his eyes bigger and
rounder, and changed the fur color on parts
of his face.

Dogged Artists
Art director Paul Felix, who came to Bolt with credits as a production designer for Lilo & Stitch and The Emperor's New Groove, visual development artist for Brother Bear and Mulan, location designer for Tarzan, and character designer for Mulan, also considered Lady and the Tramp when he began working on the film.
The journey that Bolt, Mittens, and Rhino take from New York City to Hollywood sends them trotting (and rolling) across the US, which meant the crew needed to design and build what Adolph Lusinsky, the film's look and lighting director, calls "200 environments on a 30-environment budget."
To design the environments, Felix refer
enced early Disney films and the Ashcan
School of art. I liked the sense of place in
Disney films like Mickey and the Beanstalk,
Peter Pan, and Pinocchio, and the dirty
quality and looseness in American painter
Edward Hoppers architectural settings,
Felix says. Hoppers watercolors and paint
ings often include Impressionist back
ground buildings and simplified geometry,
which, while accurate in detail, is nonetheless abstract. "I like the idea of simple characters as a good foil against painterly backgrounds," Felix says.
Achieving painterly backgrounds with
computer graphics, however, isnt always
easy, and the job of translating Felixs vision
fell to Lusinsky. We worked with R&D
for a year to develop tools and processes to
achieve the loose brush style in the early
Disney films, Spencer says. The result:
new technology; so new that Disney has
filed for patents on it.
Lusinsky groups the new technology into
four areas: ray painting, normal mapping,
texture painting, and painterly shadows.

Loose Edges
In 1999, Disney animators sent Tarzan, a 2D character, swinging through painterly 3D jungles with the help of new technology called Deep Canvas (see "Deep Background," July 1999). Background artists painted on 2D plots of 3D scenes. Deep Canvas then applied information about each brushstroke to positions in 3D space. But Deep Canvas relied on a proprietary renderer, and for Bolt, the crew wanted to use Pixar's RenderMan.

Bolt, who believes he really is a superdog, thinks he's helping Penny escape from the villains, but it's only television fakery. The Bolt look and lighting team based color curves and contrast ranges for the movie on two film stocks, one for scenes in the TV show and another for the real world.

"We came up with new ideas and techniques that involved raytracing, to put the brushstrokes in 3D space so we can use the brushstrokes on silhouettes in CG models," Lusinsky explains. "We call it ray painting."

The process starts with painters working in Adobe Photoshop or Corel Painter to create and store libraries of brushstrokes. For Bolt, to create the Edward Hopper-like backgrounds, they used a dry brush.

"We're only interested in the quality of the edges," Lusinsky points out. "We want the edges to feel like brushstrokes. If we had painted on the geometry, the sides would recede in space. In contrast, our brushstrokes never recede in space; they stay on the edges because they can track with the camera and stay flat to the image."

To make this possible, the studio's expression-based XGen software grows virtual cards (2D planes) on the geometry in the scene. For each card, XGen associates particular brushstrokes from the reference library.

"That gives us a way to get the brushstrokes onto a piece of geometry in 3D space," Lusinsky says. "We never render the brushstrokes. We render the final model and then send a raytrace. When the ray hits a brushstroke, it sends information back to the geometry. It says, this is what brushstroke I am, and this is how I'm going to look."
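Disney's XGen and RenderMan implementation is proprietary, so the following is only a hedged sketch of the idea described above: flat cards grown on the geometry, each bound to a brushstroke from the painted library, answering a ray query at render time. All class and field names here are hypothetical.

from dataclasses import dataclass

@dataclass
class BrushStroke:
    name: str         # key into the painted stroke library, e.g. "dry_brush_04"
    width_px: float   # how wide the stroke should read on screen
    opacity: float

@dataclass
class Card:
    position: tuple   # point on the surface where the 2D card was grown
    stroke: BrushStroke

def stroke_at_hit(cards, hit_point):
    """Stand-in for the raytrace query that 'sends information back to the
    geometry': return the stroke bound to the card nearest the ray hit."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(cards, key=lambda c: dist2(c.position, hit_point)).stroke

library = [BrushStroke("dry_brush_01", 14.0, 0.9), BrushStroke("dry_brush_02", 9.0, 0.7)]
cards = [Card((0.0, 1.2, 0.3), library[0]), Card((0.4, 1.1, 0.2), library[1])]
print(stroke_at_hit(cards, (0.05, 1.15, 0.25)).name)   # nearest card wins: dry_brush_01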


Brushing Light
In addition to XGen, Disneys look development tool kit includes a 3D paint
program and shader expressions, all rolled
into a powerful system that allows artists to
create procedural expressions without having to write procedural shaders (see Fast
Forward, April 2007, and The Skys the
Limit, November 2005).
Our Paint3D program for painting
textures is powerful, Lusinsky says. You
dont just paint in it; it has an expression
library. Therefore, artists using the program can easily and quickly move back
and forth between procedural and handpainted textures.
For example: You can bring different
properties of the geometry into Paint3D,
Lusinsky says. If you pull the normals in
as images and paint on those images, theyll
remap to the geometry, so the geometry
thinks it has a new normal.
The new normal mapping technology allowed artists working on Bolt to give the surfaces of the painted geometry a brushstroke quality by working with surface normals, the imaginary lines perpendicular to the surface of geometry that renderers use to calculate specular reflections. "We remapped the normals to brushstrokes so the surface appears to be made of many brushstrokes," Lusinsky explains. "As the specular light falls across the surface, it gets broken up and takes on a brushstroke quality."
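As a rough, hypothetical illustration of that remapping (generic shader math, not Disney's Paint3D code), perturbing a smooth surface normal toward a normal stored in a brushstroke texture is enough to break up a Blinn-style specular highlight:

import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v)) or 1.0
    return tuple(c / length for c in v)

def perturb_normal(surface_n, stroke_n, amount=0.6):
    """Blend the geometric normal toward a 'brushstroke' normal sampled from a
    painted map (stroke_n is a stand-in value here), then renormalize."""
    blended = tuple((1.0 - amount) * s + amount * b for s, b in zip(surface_n, stroke_n))
    return normalize(blended)

def blinn_specular(n, half_vector, shininess=32.0):
    return max(0.0, sum(a * b for a, b in zip(n, half_vector))) ** shininess

smooth_n = (0.0, 0.0, 1.0)
stroke_n = normalize((0.25, -0.10, 0.96))   # normal encoded by one brushstroke texel
half_v = normalize((0.1, 0.1, 1.0))
print(blinn_specular(smooth_n, half_v), blinn_specular(perturb_normal(smooth_n, stroke_n), half_v))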

Impressionist Painting
To further help the artists create painterly
backgrounds in Edward Hoppers style,
the technical crew devised technology they
call Look A, Look B for texture painting.
Lusinsky uses a brick wall to explain how the new tools worked in practice. First, the artists painted and saved an Impressionistic layer in which, for example, one brushstroke might represent 20 bricks at a time. Then, they painted individual grout lines between the bricks to create and save a more detailed layer. The painters created these two types of layers for each material in Bolt.

Animators gave Bolt cartoony superpowers when the dog starred in his TV show. When Bolt travels through the real world, though, the animators followed John Lasseter's maxim of truth to materials, and made sure the little dog behaved as a real dog would.


"When we rendered the materials, we could decide whether we wanted a looser, abstract interpretation or a more detailed look," Lusinsky says. Sometimes, the artists made the decisions arbitrarily; other times, they set up the system to make procedural decisions based on depth cues: for example, to render less detail when a material is some distance away from the camera or, similarly, to render looser details in shadows.
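A minimal sketch of that kind of procedural decision, assuming a simple linear blend driven by distance to camera; the thresholds and names are invented for illustration, since the actual system is expression-based and not public.

def look_blend(look_a_texel, look_b_texel, distance, near=5.0, far=40.0):
    """look_a_texel: the detailed layer (individual grout lines);
    look_b_texel: the impressionistic layer (one stroke standing in for 20 bricks).
    The result gets looser as the surface recedes from the camera."""
    t = min(max((distance - near) / (far - near), 0.0), 1.0)   # 0 = near, 1 = far
    return tuple((1.0 - t) * a + t * b for a, b in zip(look_a_texel, look_b_texel))

print(look_blend((0.55, 0.20, 0.15), (0.60, 0.30, 0.25), distance=8.0))
print(look_blend((0.55, 0.20, 0.15), (0.60, 0.30, 0.25), distance=60.0))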
Lastly, to create painterly shadows, the
technical team developed new technology
that works within RenderMan to fringe the
shadow edges. We developed tools and
techniques that give the edges a different
color than the interior, Lusinsky notes. To
do that, the shaders referenced libraries of
painted brushstrokes.
For the deep shadows in the characters
hair, though, the crew leveraged the capabilities of technology already within XGen.
XGen keeps track of where each groom
(guide) hair grows, how many hairs are associated with it, how far away it is from the
camera, how big, and so forth, Lusinsky
says. Because it is open-ended and anyone
can write expressions or plug any kind of
code into it, we used XGen to shade the
hair in a more art-directable way. Rather
than being dependent on volumetric shaders, we derived volumes of hair using information from XGen, which helped speed
up hair renders quite a bit.

Believable Light
For cinematography, Felix turned away from animated films and toward live-action movies of the early '70s. In particular, he liked Vilmos Zsigmond's cinematography in Heaven's Gate and the soft sensibility of light in Robert Altman's McCabe & Mrs. Miller. Ridley Scott's photography in the traveling picture Thelma & Louise also inspired him.
On the technical side, Lusinsky's team studied film stocks, particularly older ones. "We developed film curves that all the renders used on our show," he says. "That gave us a painterly, textural world that seems photographic because the light is photographic." One film stock provided the color curves and contrast range for the sequences in which Bolt stars in the TV show; the other provided correct exposures for the rest of the movie, when Bolt rumbles through the real world.
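The measured curves Disney derived are not public, so the following is only a generic stand-in showing the mechanism: a log-space S-curve with a different exposure and contrast per "stock," applied to linear render values. The parameter values are assumptions.

import math

def film_curve(linear, exposure=1.0, contrast=1.6, pivot=0.18):
    """Map a linear radiance value through a generic film-like S-curve:
    exposure scales the input, contrast steepens the response around a
    mid-gray pivot, and tanh rolls the toe and shoulder off softly."""
    value = max(linear * exposure, 1e-6)
    return 0.5 * (math.tanh(contrast * math.log(value / pivot)) + 1.0)

# Two hypothetical "stocks," echoing the article's split: a punchier response for
# the TV-show sequences and a softer one for the real world.
TV_SHOW = dict(exposure=1.2, contrast=2.0)
REAL_WORLD = dict(exposure=1.0, contrast=1.4)
for gray in (0.05, 0.18, 0.75):
    print(gray, round(film_curve(gray, **TV_SHOW), 3), round(film_curve(gray, **REAL_WORLD), 3))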
To accurately light Bolt's journey across the country, Lusinsky and others traveled his route themselves.
We wanted to have a naturalistic feel
for the lighting, so we spent a lot of time
studying light across the country, Lusin-

sky says. We wanted to capture the quality of the light in each area and pull color
palettes from each area.
To light shots in the garment district of
New York City, for example, they developed
a desaturated color palette that mimicked
humid afternoons when the sun is soft and
muted. In Ohio, they discovered that the
humidity in the air tended to cast the sky a
turquoise color. For Los Angeles, they rendered the sun hot and hard.
We really tried to capture the subtleties
that make each place different, Lusinsky
says. The light, the architecture, the foliage. We wanted to be true to the locations
so you felt like Bolt was in the real world.
We thought if we made it too stylized, people wouldnt believe hes in a real world; it
would feel like fantasy. We wanted it to be
believable. And, John Lasseter is really big
on details in the environments.
When Lasseter and Catmull arrived at
Disney Animation, they tore down walls,
literally and figuratively. Formerly walled-off offices became an open conversation
pit with a jukebox, magazines, and a cereal
bar. Layers of management peeled away,
too, which invigorated the studio with
new, creative energy.
Before John [Lasseter] and Ed [Catmull] arrived, I kept saying how great
it would be if we had a boss like John,
Howard says. Hes like a big kid who
loves animation, and that enthusiasm filters through the crew.
Williams adds, John brought so much
energy and excitement. He raised the quality bar. He gave us a real creative surge.
Bolt was the first lightning rod for that
renewed energy, and it could hardly be
more fitting. In the film, the star learns that
he doesnt have superpowers. In making
the film, the animation studio remembers
it does. n
Barbara Robertson is an award-winning writer and a
contributing editor for Computer Graphics World. She
can be reached at BarbaraRR@comcast.net.


Workstations on the Move

Today's latest mobile devices are truly worthy of the workstation label

By Jon Peddie

Dell recently unveiled a new line


of Precision mobile workstations.
For users who value fashion as well as
function, the M6400 Covet features a classy
Vibrant Orange chassis.


This year, 2008, marks the year Moore's law gave us the most mobile workstations ever. They are laptop computers with 17-inch or greater screens, powerful dual- and quad-core processors with gigabytes of fast DDR2 DRAM, powerful GPUs with lots of graphics memory, high-speed and large-capacity disks with RAID, optical drives
with Blu-ray, every kind of I/O you can
imagine, and more features than you can
find on many desk-bound machines. Those
are the facts. But what does it all mean?
Aside from the sheer pleasure of being
able to take your work with you all the
time, what would be the motivation for
having a portable workstation, and how
portable are we talking? What about the
workstation side? Are there compromises?
It seems like there are some incontrovertible conflicts in the premise.

What Is A Workstation?
A workstation is distinguished from an ordinary PC by a few critical items. Theres
a fairly regular debate about what a workstation is and what it isnt. And too many
vendors try to market high-end PCs as a
workstation in order to get a higher price
for them.
Workstations cost more than a typical
PC, and there are good reasons for that. For
one thing, a workstation is often used in a
mission-critical situation and, therefore,
has to work 24/7, 365 days a year with no
hiccups or glitches. Workstations are used
with high-performance and often expensive software, and have to be qualified by
the software vendor. That is done to give
assurance to the buyer that the machine is
going to work, and work well.
Generally speaking, a workstation has a
high-resolution display and high-performance graphics, a large capacity of high-speed (high-bandwidth) memory, top-of-the-line CPUs, and fast high-capacity disk
drives. Furthermore, a workstation will have
application-certified drivers that guarantee

[Chart: Workstation segment worldwide market share by class (mobile, entry, midrange, high-end, ultra high-end), Q3 2004 through Q2 2008. Courtesy Jon Peddie Research.]

reliable operation with an application. Also,


the certified drivers are tuned to the specific
applications, a qualification that is not just
a label but really means something.
One of the distinguishing hardware features of a workstation is its error-correction
memory (ECM). However, in the case of
mobile workstations, there is no ECM. Additionally, a workstation is equipped with a
power supply that contains extra capacity.
Therefore, cooling and noise design is critical, and more care is put into baffles and
routing than is found in an ordinary PC.
These are some of the most challenging
aspects in building a mobile workstation
that, ultimately, limit the systems abilities.

Categories
Jon Peddie Research (JPR) categorizes the
workstation market into five classes, or segments: entry, midrange, high-end, ultra
high-end, and mobile. The entry level has
no 3D capability, whereas the ultra high-end is, as you can imagine, the most powerful machine that can be built with dual
graphics boards, dual processors, and zillions of megabytes of memory.
Somewhere in between is the mobile
workstation, which, like the mobile PC, is

the fastest-growing segment.


The workstation market in terms of unit
shipments is about the same size as the enthusiast gamer market, which is an interesting correlation with regard to the users
demand for high performance and a willingness to pay for that performance.

Can I Take It with Me?


So, considering all the above information,
why would a person want a workstation to
be mobile? Collaboration is a big buzzword
in the industry, and among distributed
workforces and partners, being able to share
a design or a problem is critical. But what if
youre on a campus or in an aircraft manufacturing facility? In those situations, being
able to grab the workstation and scoot off to
a meeting, or jump on a jitney and go to the
other end of the production line, would be
an enormous advantage. However, working
on a design while traveling in an airplane,
although often mentioned, isnt a realistic
application. For one reason, these portable
workstations are big, and for another, the
design work is so sensitive youd never want
to risk exposing it in public. That tends to
mitigate the argument about battery life.
And for those workstation users who fly in

business class or better, there is usually available power on the plane.

The Lenovo ThinkPad W700 has some unique features, including a built-in Wacom tablet below the keyboard.


Lets get back to the initial question,
though: Can a workstation actually be portable? Display size, resolution, and graphics
performance are usually critical to a workstation user, and those features are major
power drains. So, if a portable workstation
can survive on battery operation for more
than two hours, thats considered pretty
good. And when you pack that much power
into a mobile machine, it ends up weighing
as much as eight pounds (3.6 kg) or more.

Workstation Displays
The new crop of portable workstations has a
17-inch WUXGA (1920x1200) screen with
400-nit brightness. (More about nits later.)
This year, LCD display technology has
improved to the point where users can get
an almost-perfect color representation. In
the film industry, judge the color correctness by comparing it to Adobes RGB color
space, developed in 1998 to encompass
most of the colors achievable on CMYK
color printers.
Today, some of the new mobile workstations are expressing their display capability
in terms of a percentage of Adobe color
space. This is an awkward description be-


cause of the seven-dimensional


nature of the color space, and it
begs the question: Which part of
that color space is compromised?
That undoubtedly will be a marketing debate among the suppliers.
No matter, Lenovo maintains that
it supports 72 percent of the Adobe
color space, and HP maintains it supports 100 percent, albeit with an external DreamColor monitor. The idea is to
be able to pick a color on the screen and
expect that color to show up on the printed
page. Remember WYSIWYG? Well, were
getting pretty close to it now.
The Lenovo ThinkPad W700 workstation has a unique built-in color calibrator.
A small sensor is located in the base, just
below the space bar. When you close the
lid, the display runs through a series of test
colors and patterns; this process is monitored by the sensor, and the system then
calibrates the screen so it is as color-accurate as possible.
HP, on the other hand, is promoting the
DreamColor aspect of its EliteBook 8730w
mobile workstation. DreamColor is a perfectly color-corrected monitor that HP introduced recently, starting with the LP2480zx
monitor for the film and photo market.
The LP2480zx external stand-alone
monitor has a set of RGB LEDs to extend
the color range. The LED panels fit as part
of HPs DreamColor technology initiative,
which encompasses higher-precision (true
30-bit, 3x10-bit per channel) and colorcalibration technology.
In the context of HPs mobile workstations, the DreamColor external monitor
provides calibration technology, but on a
foundation of more conventional 24-bit
(3x8-bit) color precision. Nevertheless,
they are equipped with a WUXGA RGB
LED backlight panel, and while HP is still
tweaking the color spaces, the company is
targeting 132 percent of the Adobe space
(and 151 percent of RGB). The DreamColor panel option on the mobile worksta-

tions just recently began shipping.


In addition to color balance, the screen's brightness and contrast are important as well, especially for DCC applications. Brightness, or more correctly, luminance, is expressed in nits; one nit is equal to one candela (a unit of measure of the light's intensity) per square meter (1 cd/m2). And bigger is better, so you want the most nits, or candelas, you can get for the dollar when selecting displays, whether as a stand-alone or on a laptop.

Contrast, meanwhile, is a ratio and is based on black as the reference point. However, LCDs don't project a very good black; typically, the best they can offer is a very dark gray. So polarizing and wave filters are placed in front of them to achieve a blacker black. This requires a brighter backlight, which in turn drives up power consumption. Contrast ratios for a good home theater TV will be between 5000- and 10,000-to-1. A midrange laptop will have a ratio of between 500- and 1000-to-1, and it is important to note that the contrast ratio impacts the display's ability to reproduce the color spectrum.
The HP 8730w delivers 300 nits and an
800-to-1 contrast ratio, while the Lenovo
W700 offers 400 nits with a 500-to-1 contrast ratio.
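A quick bit of arithmetic on those published specs shows why both numbers matter: a panel's approximate black level is its peak luminance divided by its contrast ratio, so the brighter panel with the lower ratio still ends up with the milkier black. This is a worked example only, using the figures quoted above.

panels = {
    "HP EliteBook 8730w": (300, 800),     # (peak nits, contrast ratio)
    "Lenovo ThinkPad W700": (400, 500),
}
for name, (nits, contrast) in panels.items():
    print(f"{name}: black level ~ {nits / contrast:.2f} cd/m^2")
# HP: ~0.38 cd/m^2; Lenovo: ~0.80 cd/m^2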

Powering the Displays


The new mobile workstations are equipped
with powerful GPUs, but have limited
memory compared to their desktop counterpart. Typically a desktop will have a
graphics add-in board with 1gb of video
RAM, whereas a mobile workstation will
typically have a maximum of 512mb.
In the evolving mobile workstation market, Nvidia has jumped to an early lead,
with vendors such as Dell, Fujitsu, HP, and
Lenovo usually positioning the Quadro
FX3700M and FX2700M in the standard
configuration, with the AMD ATI FireGL
V5725 offered as an option. GPUs from
both companies support OpenGL 2.1, but




HP has added a DreamColor option to its


8730w mobile workstation, which is already
nicely equipped with a Montevina-class
Intel Core 2 Duo processor and more.

additionally, ATI delivers DirectX 10.1 support (Nvidia is at DirectX 10.0). The Nvidia FX3700 is capable of supporting 1GB of video memory, and both HP's EliteBook 8730w and Lenovo's ThinkPad W700 use that GPU; thus, the workstations can be equipped with 1GB of video memory.

Even though Nvidia currently has the


standard position in these mobile workstations, that situation could change. Given recent evidence both from SIGGRAPH
and from vendor comments, we can expect
to see a surge on the part of AMD mobile
workstation graphics in the coming year.
The company has clearly renewed its efforts with respect to workstation graphics,
and vendors have taken note.
Other issues to consider are the user interface and input devices. The mouse and
trackball still reign supreme (see Invaluable
Input, October 2008), but there is also a
large segment in the DCC world who work
with a tablet, using a pen for input. With
that in mind, one supplier, Lenovo, has
cleverly built in a Wacom tablet just below
the keyboard of the ThinkPad.

Processors and Memory

Workstations, mobile or fixed, are expected to have the most powerful CPUs and the fastest and most RAM. However, in a mobile workstation, or any mobile device for that matter, power and thermal management is crucial: super-fast CPUs and GPUs get super hot, and there's scant space to put in fans. Also, the limited space restricts the amount of memory that can be packed in, and it has to be cooled as well.

HP is equipping its EliteBook 8730w mobile workstation with a Montevina-class Intel Core 2 Duo (up to 3.06GHz) and up to 8GB of 800MHz DDR2 memory. Lenovo configures its W700 with

an Intel Core 2 Extreme quad-core CPU and up to 8GB of high-speed DDR3 memory. Lenovo also offers optional dual hard drives with RAID configurations, and an optional Blu-ray DVD burner/player, while HP can provide up to 640GB when a second drive is added (displacing an optical drive).

Are They for Real?


So, are these new monster machines real battery-powered workstations? The answer is a slightly qualified yes. Qualified because they don't offer ECM, and they don't have the most powerful GPUs. But those are the trade-offs for size, power consumption, and cooling, and they are only slight compromises at that. When you think back to just a few years ago, the idea of having a portable workstation (a real workstation-class machine, not just some high-end branded PC) was on the verge of absurdity. It would be like putting a Ferrari inside a Smart car: not only physically impossible, but ridiculous in concept. And look at us today. We have it, with gigantic screens and super-high resolution, super-fast multicore processors, powerful GPUs with a gigabyte of fast memory, and special features like Blu-ray disk drives.
Having been a workstation builder and
now a workstation user over the past few
decades, I find it amazing to think about,
let alone imagine it possible a decade ago.
And, that's not all. Dell unveiled a new line of Precision mobile workstations, and got the ball rolling with the 17-inch M6400, which supports up to 16GB of RAM, a 1GB graphics card, an Intel Core 2 Extreme quad-core QX9300 processor, four memory slots, and RAID capability. So there are even more choices for professional users on the go.
Jon Peddie is president of Jon Peddie Research, a
Tiburon, CA-based consultancy specializing in graphics
and multimedia that also publishes JPRs TechWatch.
He can be reached at jon@jonpeddie.com.



They can be found in every society, con men who take advantage of others while capitalizing on their victims' weaknesses and fears.
In real life, their deceitful deeds are
no laughing matter. But in the
world of make-believe, their actions and
consequences can be hilarious, and the
characters themselves endearing. Such is
the case with Gwizdo, a medieval swindler
who stars in Dragon Hunters, a European
CG feature film.
Dragon Hunters follows the adventures
of Gwizdo and his partner, Lian-Chu, both
hunters for hire in a futuristic/medieval
world of floating land masses terrorized by
various monsters, also known as dragons. In
this timeless universe, dragon hunting is not
a fairy tale, but a real job involving contracts,
money, and problems with clients who dont
want to pay. Knights work for free but are
not always available when needed, thus providing opportunity for commercial hunters.
Small in size, cowardly, and less-than-honorable, Gwizdo is the brains of the
operation, negotiating contracts with the
helpless villagers, while the hulking but
kindhearted warrior Lian-Chu provides
the brawn (it is his task to slay the beasts).

However, that seldom happens. Both pretend to be brave and collect advance payments for dragon kills that never occur.
While opposites in stature and personality,
the two have been longtime friends, having been raised together in an orphanage.
Often, they are accompanied by their pet
dragon, Hector, which is more like a pet
dog than the large, menacing dragons they
hunt. Then, one day, they meet little Zoe,
a believer in fairy tales and the niece of a
wealthy nobleman frightened by the return of the fiercest dragon of them all, the
World Eater. Soon, the fly-by-night dragon
hunters find themselves on an exciting adventure: their first actual hunt.
The underlying story is universal in
nature, a description that also holds true
for the multinational group of nearly 450
people behind this film. Mac Guff Ligne
in France, Trixter Film in Germany, and
LuxAnimation in Luxemburg were responsible for the movies creation. French
graphic novelist Arthur Qwak developed
the original story and served as co-director alongside Guillaume Ivernel, with the
French company Futurikon producing.
The movie found eager audiences throughout Europe but is still waiting for possible

The two con men-turned heroes face their biggest fear,


literally, in the film as they chase down the fierce
World Eater, whose immense size (450 feet tall)
challenged the hunters as well as the artists.

North American distribution.


Nevertheless, some folks in the US may
recognize the story line and the characters.
Thats because the property began in 2004
as a television series on Cartoon Network.
From the beginning, Dragon Hunters was
conceived as a concept that could be a series, a video game, a feature film, and so
forth, says Caroline Blin, marketing director at Futurikon. To that end, Futurikon
recently completed a second season (52
half-hour episodes) of the series and rolled
out a video game for the Nintendo DS to
coincide with the film debut.
Even though the concept was imagined
as both a feature and a TV series from the
start, one of the biggest hurdles was getting
the producers to pony up the necessary
funds for a 3D film infused with a tremendous amount of effects. We were like
Gwizdo and Lian-Chu: two lonely rogues
who needed credibility! jokes Qwak.

Film Production
Blin is quick to point out that the movie, a prequel to the series, a "how it all began," is more than a mere extension of the TV show in story, tone, scope, and character development. The biggest difference, though, is the
animation itself: The series has a traditional
2D cel look, while the film is 3D CGI.
Therefore, no assets from the series could
be reused for the film. Yet, the 3D style is
atypical of the CG films produced in
the US, with their rounded edges,
glossy imagery, and tighter frames.


Most unusual are the films environments, which are in constant motion and float in midair, as
do elements and debris, created with a particle-based system and turbulence field.

VFX producer Jean-Jacques Benhamou


from Mac Guff describes the fairy-tale style
of Dragon Hunters as a Miyazaki mood
mixed with German romantics influences.
In contrast, the worlds of Dragon Hunters contain realistic surfaces (grass, water,
stones, fire) but with unique physics in
which everything is moving all the time.
The worlds had to feel real in order for the
adventure to be credible, says Qwak. So
we approached it similar to a live-action
film. That meant realistic rendering, matte
paintings, and a tremendous amount of
work on compositing. The characters, while
cartoon-shaped, have realistic textures.
According to executive producer Michael
Coldewey from Trixter, the shape of the
characters and the entire production design
is one of immense contrasts. If we have to
show something big, it is big in this movie.
If there is a happy moment, we show little,
fluffy, white bunnies flying through the
sky, he adds. Coldewey also believes the
film gives audiences time to enjoy the artwork: We have some fast-edited sequences
and action, but we also provide emotional
pauses to relax and enjoy the art.

Character Building
In all, the film took more than two and a
half years to complete: one year of development (storyboards, designs, references,

and a 2D animatic) and another year and


a half of CGI production. The majority
of the films CG work was done by Mac
Guff, which created the characters, rigs,
textures, and some animation. The facility also did all the background renderings
and composites of the final images. Trixter,
meanwhile, was responsible for the character animation and the musical score. LuxAnimation modeled the props, sets, and
backgrounds.
In addition to an occasional language barrier among the three studios, the facilities had to ensure that their production
pipelines were connected and compatible
for asset sharing. Also, the studios had to
work as a single production unit.
Using sketches, paintings, and clay models as references, the crew at Mac Guff used
Autodesks Maya, along with a good amount
of in-house software and special scripts, to
bring the Dragon Hunters characters to 3D
life. Character layouts soon followed to validate the 2D-to-3D transfer in terms of the
look, proportions, and so forth.
One of the biggest challenges, says modeling and lighting supervisor Nicolas Brach,
was creating some very stylized characters
that would be integrated in very realistic
environments, a look they dubbed "illustrated realism." This required the modelers
to focus on adding fine details to the mod-

els while retaining their uncluttered style.


Often, this meant adding more modeling
detail to the characters props and suits
rather than to the cartoonish heads and
faces. For example, Knight Lens Flair and
his horse have very realistic, detailed armor on more stylized and cartoonish body
structures and proportions.
While all the characters had their unique
qualities, the most difficult to model, says
Brach, was Hector the blue dog-dragon
not because of his shape, but his topology.
From the beginning, we knew we had
to consider all the motion possibilities he
should have, he says. We had to be able
to squash and stretch all parts of his body,
including his eyes and teeth, without crashing the skinning and blendshape process.
Another was the World Eater because of its
size (450 feet tall); also, it was to appear in
both wide shots and close-ups.
For the hair and fur, from Zoe's braids and Lian-Chu's ponytail to the fur trim on the clothes, the artists at Mac Guff used the studio's proprietary tool called Symbor. With the tool, they generated guide splines on each vertex of the fur, then brushed them into the modeling layout with special tools, akin to using combs. Afterward, they created a large number of hairs to fill the surface using those guides. For every character, the team had at least three hair systems for different densities and thicknesses, and then chose the most appropriate system based on the location of the character on the screen and its proximity to the camera. "We didn't want to unnecessarily overload the memory," Brach adds.
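Symbor itself is proprietary, but the selection logic described above can be sketched generically; the density numbers and distance cutoffs below are invented for illustration.

HAIR_SYSTEMS = [
    {"name": "hero",   "max_distance": 5.0,  "hairs_per_guide": 120},
    {"name": "medium", "max_distance": 20.0, "hairs_per_guide": 40},
    {"name": "light",  "max_distance": float("inf"), "hairs_per_guide": 10},
]

def pick_hair_system(distance_to_camera):
    """Choose the lightest groom that still holds up at this distance, so characters
    far from camera do not pay for hero-density hair."""
    for system in HAIR_SYSTEMS:
        if distance_to_camera <= system["max_distance"]:
            return system
    return HAIR_SYSTEMS[-1]

print(pick_hair_system(3.0)["name"], pick_hair_system(45.0)["name"])   # hero light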
The hair and fur, as well as the clothes, moved with the help of dynamics, Syflex's cloth-simulation tool, and the plug-in from Mac Guff.
For the most part, the group hand-painted the textures in Adobes Photoshop and
used Pixologics ZBrush as well to create
displacement maps and deep UVs to deal
with the films complex texture projections.
To ensure that the stylized characters fit well


aesthetically into the backdrops, the artists


used realistic textures, acquiring basic materials from photographs and then mixing
the textures in Photoshop. They used this
process to create the clothes, leather goods,
metal objects, skin, and other materials.
With Symbor, the artists then built
complex shaders driven by mathematical
parameters such as facing ratio and distance to the camera. They used these Blinn
shaders for basic surfaces such as leather
and metals, some velvet shaders for clothes
and horse hair, and subsurface scattering
for the skin.
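Facing ratio is simply the dot product of the surface normal with the view direction. As a generic example (not Symbor's actual expressions), it can drive a velvet-style rim term that brightens toward silhouettes:

import math

def facing_ratio(normal, view_dir):
    n_len = math.sqrt(sum(c * c for c in normal))
    v_len = math.sqrt(sum(c * c for c in view_dir))
    return max(0.0, sum(a * b for a, b in zip(normal, view_dir)) / (n_len * v_len))

def velvet_rim(normal, view_dir, edge_color=(0.80, 0.75, 0.90), power=2.0):
    """Bright at grazing angles, dark when the surface faces the camera."""
    rim = (1.0 - facing_ratio(normal, view_dir)) ** power
    return tuple(rim * c for c in edge_color)

print(velvet_rim((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))   # facing camera: no rim
print(velvet_rim((0.9, 0.0, 0.44), (0.0, 0.0, 1.0)))  # near the silhouette: strong rim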
We had a lot of subtle shading details
on things like Zoes clothes or her Uncle
Arnolds clothes: Both wear garments embroidered with gold and have different velvet shaders to give them a sense of fineness
and richness, Brach says. Nevertheless, the
artists used what Brach describes as a fat
S shader for the hairs instead of a traditional hair shader, for a soft look achieved
with a short render time.

Lighting and rendering was done with Mac Guff's Symbor as well, combined with MGLR, the studio's rendering engine. The light rig was based on ambient occlusion, which allowed the group to control the mood of the sequence; sometimes image-based lighting was used, along with some raytraced light sources.


Mac Guff's R&D department created a multilayer system based on various outputs that enabled the group to generate basic layers (occlusion pass, diffuse pass, normal camera pass, and so forth), as well as the possible mask and shader output needed by the compositing department, all in a single process. "We even rendered a layer for each light source, with all the possible information (diffuse, spec, shadow) so we could refine the lighting color in real time within 2D," Brach says.
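The compositing package and layer names Mac Guff used are not specified, so this is only a toy illustration of why per-light layers are useful: each light's contribution can be regraded and summed in 2D without touching the renderfarm. Everything below is a hypothetical stand-in, with a pixel reduced to an RGB tuple.

def comp_lights(light_layers, gains):
    """light_layers: {light_name: (r, g, b) contribution for one pixel};
    gains: {light_name: (r, g, b) multiplier chosen in compositing}."""
    out = [0.0, 0.0, 0.0]
    for name, rgb in light_layers.items():
        gain = gains.get(name, (1.0, 1.0, 1.0))
        for i in range(3):
            out[i] += rgb[i] * gain[i]
    return tuple(out)

pixel = {"key": (0.30, 0.25, 0.20), "fill": (0.05, 0.06, 0.08), "rim": (0.10, 0.10, 0.12)}
print(comp_lights(pixel, {"key": (1.0, 0.9, 0.8)}))   # warm the key light only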

Animated Characters
After the modelers were finished, the files were handed off to the rigging department. Each biped character rig, set up in Maya and managed with InK, the studio's in-house tool, contained approximately 75 joints, with 100 or so controllers. "Because this is cartoonish animation, everything had to be squashable, stretchable, bendable," says Mathieu Trintzius, character rigger,


"but I wouldn't say that the animation falls into the cartoonish style; it is more realistic animation, with the exception of Hector."
Even some of the dragons were rigged
with the base system, including the World
Eater, although the group devised special
rigs for creatures that had unique designs.


This also held true for the props.

The characters, including the hulking Lian-Chu and little Zoe, are clearly stylized, yet to fit them into the realistic environments, artists added fine details to the characters' clothing and props.


All the characters were then hand-animated in Maya, reflecting a style that animation supervisor Kyle Balda describes as somewhere between full and a hybrid limited animation in 3D, hardly surprising given Qwak's traditional animation background. "There is a lot of snappy pose-to-pose animation, but for the most part, the style of the animation is driven more from the personality of the characters," explains Balda. "For example, Lian-Chu hardly moves except for when he is in
an action sequence, so his acting is subtle
compared to Gwizdo, who moves a lot as
he talks, but more or less freezes up when
its time for action.
According to Laurent de la Chapelle, animation director, the team was tasked with creating a full feature's worth of animation in about nine months, and with 26 animators divided between the Mac Guff and Trixter facilities, about half the time and resources of a Hollywood animated feature.
The groups commonly employed video
reference, which enabled the directors to
communicate the acting/action intentions
for a sequence that was being animated
long distance by Trixter in Munich. This
was simply a way to communicate the
characters motivation for a sequence, as
opposed to using a detailed guideline of the
exact poses we wanted, notes Balda.
The facial setups were based on muscular movements in the face, rather than the
creation of shapes that reference particular
phonemes or emotional expressions. In this
way, the animators were not limited to the
same F-shape or the same surprised eyebrow pose. It was up to them to sculpt the
poses themselves based on a combination
of the anatomy of the character and their
own graphic sensibilities, Balda says.
Each character model contained several
facial controllers on the eyes and mouth,
giving the animators a good deal of freedom in creating facial expressions. This system negated any need to use facial poses.


No motion capture was used, though some
characters were influenced by real-life actors: Gwizdo (Joe Pesci) and Zoe (Kate
Hudson from Almost Famous, and Miranda Cosgrove from School of Rock).
Another challenge was that, due to the
production schedule, the teams began
animating before the final dialog was recorded, limiting them to only animating portions of sequences that were not
dialog-driven for the first few months.
Fortunately, there were plenty of action
sequences in Dragon Hunters, but not all
the animators are necessarily differentiated
toward action-based animation; some are
natural actors who may struggle a bit with
the concrete and technical nature of physicality, explains Balda.
In addition, casting was still going on
in the early months of the film, but that
limitation proved beneficial when the crew
moved to the acting sequences: By that
time, the animators had become familiar
with every joint and anatomical movement
of the characters, Balda points out.
Lip synch proved especially challenging,
as the dialog was created in English to attract a widespread international audience,
while the animators were predominantly
French. To make this process go more
smoothly, Balda accompanied the directors to the voice-recording sessions in Los
Angeles, to offer a native perspective. For
an animation supervisor, this is an unusual
practice, but now I am convinced that it
should be part of the process, Balda says.
Within eight days we recorded all the
character voices, focusing on one character at a time, from start to finish, from the
perspective of that specific character. Later,
this proved invaluable when I was working one-on-one with the animators to help
them understand where their character was
in his or her respective arc.
Balda also gave the animators a video
reference of him lip-synching the final recorded dialog, so the French animators had

a reference of the mouth poses that were not obvious to a non-English speaker; this was particularly helpful for Gwizdo and Zoe, who talk very fast and use a lot of slang.

Environmental Issues
One unique aspect of the film is the floating islands. We had to take into account
the fact that all the animations were made
before the modeling of the high-definition sets even began, and that the animators worked on low-definition scenes made
solely for them, explains Franck Clarenc,
modeling supervisor working on the films
sets and environments. As a consequence,
all the assets in our pipeline were created
according to that low-definition layout,
which was divided into arbitrary parts that
were much too heavy when turned into
high definition.
In the case of the floating village, the
artists created it as a single scene, which,
according to Clarenc, turned into a huge liability when they reached the critical polygon mass that made working in real time within Maya a distant memory. So, the group cut the huge data into smaller bits, and with help from the development team, automated the gathering of those sub-scenes into a single, massive one, thereby enabling the artists to work on the various individual parts in a more convenient way.
"Basically, we visualized in our asset manager the sub-scenes and the global scene in the form of a nodal tree, where we could connect, disconnect, and replace at will any part we wanted," explains Clarenc. "This was a major achievement in flexibility, work division, and efficiency for us."
Throughout the film, there is a lot of
debris flying around, some of which are
just rocks, but others are part of destroyed
buildings. To achieve the floating effect,
the group devised a particle-based system to
generate a turbulence field; that, along with
a direction-based rotation script, moved the
flying rocks, vegetation, and more.
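No details of Mac Guff's setup are published beyond that description, so the following is a deliberately tiny, hypothetical stand-in: debris advected by a smooth pseudo-random velocity field, with each piece's spin keyed to its direction of travel.

import math, random

random.seed(7)
FIELD = [(random.uniform(0.0, 6.28), random.uniform(0.2, 0.8)) for _ in range(3)]

def turbulence(position, t):
    """Cheap layered-sine 'noise' standing in for a real turbulence field (m/s)."""
    return tuple(0.3 * math.sin(freq * (position[i] + t) + phase)
                 for i, (phase, freq) in enumerate(FIELD))

def step(debris, t, dt=0.04):
    for piece in debris:
        velocity = turbulence(piece["pos"], t)
        piece["pos"] = tuple(p + v * dt for p, v in zip(piece["pos"], velocity))
        piece["spin"] = math.atan2(velocity[1], velocity[0])   # direction-based rotation
    return debris

rocks = [{"pos": (0.0, 1.0, 2.0), "spin": 0.0}, {"pos": (3.0, 0.5, -1.0), "spin": 0.0}]
print(step(rocks, t=1.0)[0])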
There are also a number of buildings in
various states of disarray during different
chapters of the story. Because of the tight
production schedule, the group had to find
a way to optimize the creation of those
heavy models; in the end, they designed
most of the structures using modules that
could be reused and, with some modifications, would allow the group to craft as many different buildings as needed.

Artists surrounded each set with a sphere mapped with a high-resolution matte-painted sky.
Most of the reflections are actual blurred reflections; sometimes they are environmental maps.

"We originally thought about using the same nodal techniques as we used for the flying village in our asset manager, but that turned out to be impractical: It was too complex a structure for a single building," explains Clarenc. "Instead, we created a library of parts, like rooftops, walls, and blocks, and assembled them by hand."
The actual building destruction was done
with a rigid-body dynamics system. If you
want to break a building into two pieces,
you must also model the interior and break
everything into small pieces, explains
Emilien Dessons, special effects supervisor.
We tried to use an algorithm-based shattering technique, but the shattering was
equal and ugly, so we looked for a chaotic
shatter with a graphically appealing shape.
We manually modeled everything, shattering each piece with a reference drawing.
We also added classically instanced particle
systems to the destruction scenes to bring
life to them, with a large amount of compositing and different smoke techniques
added in as well.
Furthermore, the crew used two modeling LODs, one with as few faces as possible, and another with beveled angles so
the team could automatically subdivide the
objects depending on the needs during rendering. You could probably say we had a
third level, where the objects were detailed
according to the textures for extreme closeups, Clarenc says.

Devil in the Details


While atypical, the environments nevertheless had to look realistic, in contrast to
the character models, in keeping with the
style of the movie. Guillaume Ivernels references were very detailed, and we had to
translate that level of detail in every texture
we created, says Fabien Polack, texturing
supervisor.
The environmental textures were fairly
traditional, a combination of texture maps
with simple shaders. The challenge for us
was more artistic than technical; we had to
match Ivernels vision and enhance it whenever possible, says Polack. Due to time limitations, the artists created ground shaders
that worked for almost every shot, from expansive shots to close-ups.

Hector the blue dragon proved difficult to model. He had to squash and stretch all parts of his body, including his eyes and teeth, without adversely affecting the skinning and blendshapes.

They also


painted detailed textures but were careful to
not fall into a common trap by over-texturing objects. Like painters, we removed
some material at the end of a texture to
make it look more natural, he adds.
Every set, Polack explains, was surrounded by a sphere mapped with a highresolution matte-painted sky, provided by
the matte department or Ivernel himself.
Most of the time the reflections were real
blurred reflections, and less often, environmental maps.
Every map was created, at minimum, at
4k resolution, sometimes 8k, and on occasion, 16k. The group provided the traditional set of texturescolor, specular,
bump, reflectivity, and, depending on the
context, roughness, displacement (usually
at 16 bits), and normal. Then, each highdefinition shader was converted to mid(2k) and low-def (1k) depending on the
shots. The bump factors were also driven
by a distance-to-camera node.
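As a rough illustration of that kind of distance-driven switching, and not a reconstruction of the actual node network, the snippet below picks one of the pre-converted 4k/2k/1k texture sets and fades a bump multiplier with camera distance; every threshold in it is invented for the example.

# Illustrative only: hypothetical thresholds, not the production values.
import math

RES_BY_DISTANCE = [              # (max distance in scene units, texture set tag)
    (15.0, "hi_4k"),
    (60.0, "mid_2k"),
    (float("inf"), "low_1k"),
]

def camera_distance(cam_pos, obj_pos):
    return math.dist(cam_pos, obj_pos)      # Python 3.8+

def pick_texture_set(distance):
    for max_dist, tag in RES_BY_DISTANCE:
        if distance <= max_dist:
            return tag

def bump_factor(distance, full_strength=1.0, fade_start=10.0, fade_end=80.0):
    # Full bump up close, fading linearly to zero far from the camera.
    if distance <= fade_start:
        return full_strength
    if distance >= fade_end:
        return 0.0
    t = (distance - fade_start) / (fade_end - fade_start)
    return full_strength * (1.0 - t)

d = camera_distance((0, 5, 40), (2, 0, -10))
print(pick_texture_set(d), round(bump_factor(d), 3))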
Despite using what Clarenc calls a classic modeling setup, the group still ran into some issues. "For instance, our renderer doesn't have an angle-threshold limit to calculate the Phong smoothing, which means that all our low-def models were totally out of shape and all the details were completely lost. So we developed a tool to convert the Maya hard edges, which are nothing but an edge-bound angle limit for the Phong smoothing, to real hard edges that were vertex-based, with doubled adjacent edges to stop the Phong from averaging values across them. And, voila, we had our low-def Maya objects in their crisp, detailed glory. While it doesn't sound like a big deal, it soon became vital for the completion of the huge sets at the end of the movie."
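The principle behind such a converter can be sketched in a few lines: instead of letting the renderer decide smoothing from an angle threshold, duplicate each vertex once per cluster of incident faces whose normals agree, so shading normals can no longer be averaged across a hard edge. The toy Python below demonstrates only that idea (it ignores UVs, winding, and degenerate faces) and is not the studio's tool.

# Toy hard-edge splitter: duplicate vertices so faces across a hard edge no
# longer share them, which stops normal averaging at render time.
import math

def face_normal(verts, face):
    (ax, ay, az), (bx, by, bz), (cx, cy, cz) = (verts[i] for i in face[:3])
    ux, uy, uz = bx - ax, by - ay, bz - az
    vx, vy, vz = cx - ax, cy - ay, cz - az
    nx, ny, nz = uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx
    length = math.sqrt(nx * nx + ny * ny + nz * nz) or 1.0
    return (nx / length, ny / length, nz / length)

def split_hard_vertices(verts, faces, angle_deg=40.0):
    # Returns (new_verts, new_faces) with one vertex copy per normal cluster.
    cos_limit = math.cos(math.radians(angle_deg))
    normals = [face_normal(verts, f) for f in faces]
    new_verts, new_faces = [], [list(f) for f in faces]
    incident = {}                                   # vertex -> [(face, corner)]
    for fi, f in enumerate(faces):
        for corner, vi in enumerate(f):
            incident.setdefault(vi, []).append((fi, corner))
    for vi, uses in incident.items():
        clusters = []                               # [representative normal, members]
        for fi, corner in uses:
            n = normals[fi]
            for cluster in clusters:
                rep = cluster[0]
                if rep[0] * n[0] + rep[1] * n[1] + rep[2] * n[2] >= cos_limit:
                    cluster[1].append((fi, corner))
                    break
            else:
                clusters.append([n, [(fi, corner)]])
        for rep, members in clusters:
            new_index = len(new_verts)
            new_verts.append(verts[vi])             # duplicated position
            for fi, corner in members:
                new_faces[fi][corner] = new_index
    return new_verts, new_faces

# Three faces of a unit cube: corners shared by perpendicular faces get split.
cube_verts = [(0,0,0), (1,0,0), (1,1,0), (0,1,0), (0,0,1), (1,0,1), (1,1,1), (0,1,1)]
cube_faces = [(0,1,2,3), (4,5,6,7), (0,1,5,4)]
v, f = split_hard_vertices(cube_verts, cube_faces)
print(len(v), f)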
To render the images, Mac Guff had a 650-processor renderfarm available (small in comparison to Hollywood standards), though on average only 200 to 300 processors were used per shot. To keep the rendering times manageable, the crew was encouraged to keep each shot to a rendering time of less than two hours per frame, no matter the number of layers. Nevertheless, working in layers gave the artists more control over the final result.
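A quick back-of-the-envelope script makes that budget concrete; the per-layer timings and shot length below are hypothetical, and only the two-hour-per-frame target and the 200-to-300-processor slice per shot come from the article.

# Hypothetical numbers except the two-hour/frame target and ~250 procs per shot.
layer_minutes = {"beauty": 55, "fur": 30, "volumetrics": 25, "shadows": 8}
frames = 120                      # an invented shot length
procs_for_shot = 250              # within the 200-300 range cited above

per_frame_minutes = sum(layer_minutes.values())          # 118 minutes, under budget
assert per_frame_minutes <= 120, "shot blows the two-hour-per-frame target"

total_cpu_hours = per_frame_minutes / 60.0 * frames      # roughly 236 CPU-hours
# Even with perfect parallelism, wall-clock time cannot beat the slowest layer.
wall_clock_hours = max(total_cpu_hours / procs_for_shot,
                       max(layer_minutes.values()) / 60.0)
print(per_frame_minutes, round(total_cpu_hours, 1), round(wall_clock_hours, 2))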
Despite the nodal structure for the large
scenes, an ever-growing library of modules
and buildings in the asset manager, and
a way to maintain the sharp angles of its
low-def models during rendering, these solutions were not enough when it came to
addressing the biggest problem of all: the
world cemetery, where 200 shots occurred.
Like the floating village, the world cemetery at the end of the movie was a single,
huge set but contained dozens of detailed
buildings, and the camera continuously moved through them.

The team used Mac Guff's Symbor to generate the characters' hair and the fur on their costumes. For each actor, they had at least three hair systems of varying densities and thicknesses.
"The animation was done, and there was no way we could change the layout, but the number of buildings was so large that none of our machines could render the scene," explains Clarenc.
The team had to drastically reduce the polygon count; they did so by baking the high-definition models onto lower-definition models, sometimes as much as 10 times lighter. This involved both normal and color maps so that no one would be able to tell the difference if the objects were far enough from the camera.
However, that was the easy part. Afterward, the developers had to find a way to switch from the low-def models to the high-def versions, and vice versa, both in Maya and in the studio's proprietary MGLR rendering software. This enabled the artists to arrange the layout for each shot and add details where necessary while keeping the non-important zones as light as possible.
"If our developers had not succeeded in doing so, the last sequence would have looked simple and empty," says Clarenc.
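One way to assist that kind of per-shot decision, shown here purely as an illustration and not as how MGLR or the layout artists actually made the call, is to estimate how much of the frame an object covers from its bounding sphere and the camera's focal length, then fall back to the baked low-def stand-in below some coverage threshold.

# Hypothetical LOD chooser based on approximate screen coverage.
import math

def projected_height_fraction(bound_radius, distance, focal_mm=35.0, film_back_mm=24.0):
    # Approximate fraction of the frame height the object covers (pinhole model).
    if distance <= bound_radius:            # camera inside the object: always hi-def
        return 1.0
    fov = 2.0 * math.atan(film_back_mm / (2.0 * focal_mm))
    visible_height = 2.0 * distance * math.tan(fov / 2.0)
    return min(1.0, (2.0 * bound_radius) / visible_height)

def choose_lod(bound_radius, distance, threshold=0.15):
    frac = projected_height_fraction(bound_radius, distance)
    return "high_def" if frac >= threshold else "low_def_baked"

for dist in (5.0, 40.0, 400.0):
    print(dist, choose_lod(bound_radius=3.0, distance=dist))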

VFX and More


According to Dessons, the crew used a significant amount of simulation throughout the movie. For instance, the team animated groups of bats and rabbits using a behavior-based system. On the few occasions where there was water, the artists used simple transparent and refractive shaders.
Mac Guff also developed new techniques to calculate clouds, reducing the overall time. This proved invaluable for a cloud sea sequence (in which the flying village is riding the cloud sea and dragons are crossing back and forth) containing a number of camera moves and set interactions. Here, the crew redeveloped the old-school ray-marching technique from the raytracer to get a real volumetric effect, although it resulted in huge rendering times (about 20 hours a frame).
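Ray marching itself is conceptually simple: step along the eye ray, sample a density field, and attenuate transmittance with Beer-Lambert absorption. Costs balloon when that is done at high sample density, with lighting, for every pixel, which is consistent with the 20-hour frames mentioned above. The minimal sketch below shows only the core loop on a toy density field and is in no way the renderer's implementation.

# Minimal ray march through a toy density field; returns accumulated opacity.
import math

def cloud_density(x, y, z):
    # Soft spherical puff centered at the origin, fading to zero at radius 2.
    d = math.sqrt(x * x + y * y + z * z)
    return max(0.0, 1.0 - d / 2.0)

def march(origin, direction, step=0.1, max_dist=10.0, extinction=1.5):
    transmittance = 1.0
    t = 0.0
    while t < max_dist and transmittance > 0.01:
        px = origin[0] + direction[0] * t
        py = origin[1] + direction[1] * t
        pz = origin[2] + direction[2] * t
        rho = cloud_density(px, py, pz)
        transmittance *= math.exp(-extinction * rho * step)   # Beer-Lambert step
        t += step
    return 1.0 - transmittance

# One ray straight through the puff, one grazing past it.
print(round(march((0.0, 0.0, -5.0), (0.0, 0.0, 1.0)), 3))
print(round(march((3.0, 0.0, -5.0), (0.0, 0.0, 1.0)), 3))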
The same ray-marching technique was also used for the dragon trails, enhanced with a 2D motion-blur process, and it provided the opaque smoke throughout the film. "We also needed a smoke that reacted precisely with the environment. The ray-marching is a sphere-based technique whereby you build your smoke with spheres, so we couldn't get that interaction," says Dessons. So, the group used a voxel-based technique for a volumetric effect. Building on that technique, they generated a particle system that emitted voxels for a waterfall scene; this enabled the artists to control the water animation.

A Small World

The exciting and universal story line, the large personalities of the characters, and the stunning visuals make Dragon Hunters a movie for everyone. "This film is an example of why I am working in the animation business," says Coldewey. "And it is a great example of how Europe works together. Think about it: The French and the Germans produce something in English for the international market. How cool is that?" Technically, the film utilizes the best techniques available, borrowing from US animation schools, French artwork, and Japanese image layouts and atmosphere. Will that be enough to satisfy US audiences? "That's difficult to say," says Coldewey. "To get a family feature to US theaters, you need to invest more than we had to make this movie. Yet, I think there is a big audience in the US asking for this type of story."
Perhaps so. As Qwak points out, the movie is full of contrasts. "The tone of the characters is cartoony, even though they're textured like real ones. We like the contrast between the adventure/suspense, and the sad/funny scenes," he says. Emphasizing this contrast, Qwak describes Dragon Hunters as "Tom and Jerry go to Sauron's place" (in The Lord of the Rings). "Everything is blended, just as it is in real life," Qwak adds.
Karen Moltenbrey is chief editor of Computer
Graphics World.


The A,B,Cs of 3D

Autodesk Animation Academy teaches students about 3D within the context of other subjects and topics, including environmental awareness


Autodesk Animation Academy is a program designed to teach


secondary school students about the companys Maya, 3ds
Max, Mudbox, and MotionBuilder software products in the
classroom. But this doesnt mean that the curriculum has to focus
solely on rote tasks. Instead, the Animation Academy expands education horizons, introducing students to 3D animation and visual
effects technology while immersing them in core academic subjects
such as science, math, language, and art.
To further pique the interest of this demographic, Autodesk
Animation Academy has included a Pollution module as part of
the program, whereby students can create their own mini-games
focused on environmental concerns.
But the program is not just fun and games. The Animation Academy curriculum adheres to numerous academic standards, including those of the International Technology Education Association and the US STEM standards for science, technology, math, and engineering. The program is the Autodesk Media & Entertainment division's primary offering for K-12 students. With a focus on the high school level, it introduces students to 3D technology via core course subjects.

Curriculum modules that will be released with the 2009 version of Autodesk Animation Academy include reconstructing the Parthenon, learning the digestive system, studying weather systems with a focus on tornadoes, and experiencing astronomy and phases of the moon. Also new is the Pollution module, a supplement intended as an introduction to the program.

Previous versions of Animation Academy were available for purchase as separate programs with 10 seats of either Maya or 3ds Max software. However, for the first time with the 2009 release, the six animation curriculum modules included with Animation Academy will come with 10 seats each of 3ds Max, Maya, MotionBuilder, and Mudbox. The curriculum is directly tied to the Autodesk products, and is designed so that instructors who aren't necessarily well versed in 3D can still easily get their students up and running via the step-by-step instructions.

Autodesk has integrated its Pollution module as part of the company's Animation Academy curriculum. The goal is to teach students the fundamentals of 3D content creation as they craft mini-games that have an environmental focus.

"Animation Academy is exciting because it gives students a chance to further engage with science and technology subjects while also exposing them to 3D technology at a young age. They're learning about topics like pollution and the digestive system by building their own 3D models, exposing them to these subjects in a way that would be impossible in the traditional classroom setting," says Alice Palmer, Autodesk's Education marketing manager. "Furthermore, fields where 3D technology is being applied are exploding, so whether these students go on to work in entertainment, architecture, manufacturing, or design, we're exposing them to skill sets that could benefit their career choice."


The Pollution module, included in the 2009 Animation Academy curriculum, also can be
downloaded free of charge so educators can see what the program offers.


Going Green in 3D
With the introduction of its Animation
Academy Pollution module, Autodesk
is taking part in raising environmental
awareness in schools while introducing
students to 3D animation using 3ds Max.
Although it was just recently released,
Pollution has been successful, with nearly
1000 downloads to date, and is a fitting
complement to the Animation Academy
program.
Utilizing Autodesk's 3ds Max modeling and animation software, students
are taught about the effects of pollution
as they build games that are focused on
cleaning up a city environment, doling
out points for picking up garbage, cleaning oil spills, and changing cars to hybrids
within an allotted amount of time.
The game promotes environmental stewardship while reinforcing essential science
and math concepts. The instructor lecture
notes included with the module provide
teachers with a guide to the principles of
creating a simple game level using 3ds Max
software. Following those guidelines, students are able to build and design their own

levels, and understand the building blocks


of creating digital art and environments in
3D. The student workbook and datasets
deliver a framework that explores sustainable design concepts through the creation
of a game level emphasizing the environment. A completed dataset is also included,
enabling students to check their work once
they have finished the exercise.
Existing Autodesk Animation Academy
customers can use the Pollution module
with 3ds Max software Versions 2008
and above. Teachers who are considering
incorporating Animation Academy into
their academic programs can purchase the
solution or try out this module through a
free trial download at www.applied-ideas.com/pollution_downloads.html.
"While Animation Academy isn't intended to be as comprehensive or rigorous as 3D programs offered at the university or professional level, it provides an engaging and valuable introduction to 3D technology through subjects that are already being taught at the high school level," concludes Palmer. "Our hope is that Animation Academy will continue to spark students' interest in traditional science, math, and language arts subjects while simultaneously developing their 3D skills for application in a wide variety of professional fields."


For additional product news


and information, visit CGW.com

Served up at SIGGRAPH
This month's Products section is dedicated to product introductions and upgrades
unveiled and demonstrated during SIGGRAPH 2008 in Los Angeles. For more news
from the show, visit www.cgw.com.

SOFTWARE

Simulation
Vue 7 Previews
E-on Software previewed its Vue 7 xStream and Vue 7 Infinite professional software solutions for natural 3D environments. Vue 7 xStream is the latest version of E-on's tool set for creating natural environments and rendering them in Autodesk 3ds Max and Maya, Softimage XSI, Maxon Cinema 4D, and NewTek LightWave. Version 7 is designed to deliver greater integration and workflow than previous versions. All Vue tools are now integrated in the host application, providing users direct access to 3D environment-creation technologies, such as EcoSystem Painting. Vue 7 Infinite, a solution for creating, animating, and rendering natural 3D environments, is targeted at studios, animators, illustrators, architects, matte painters, and CG professionals. It boasts a number of new features, including third-generation patented EcoSystem technology, the Spectral 2 atmospheric engine, SolidGrowth 4 HD, and a new indoor Radiosity engine.
E-on Software; www.e-onsoftware.com WIN MAC

Compositing

Fusion 6
SIGGRAPH attendees were the first to witness Fusion 6, an upgrade to Eyeon Software's flagship compositing application. Version 6 offers various new and enhanced features, including a stereoscopic and multi-layer imaging system, stereoscopic views, 3D in stereo view, and multi-layer imaging. Fusion 6 is designed to deliver seamless image quality across multiple platforms, including Linux, Windows x64, Windows 32-bit desktop PCs, and Intel-based Macs. Fusion 6 is available to anyone with a valid subscription license at no additional charge. Others can upgrade to Fusion 6 for $695, which also includes 12 months of subscription upgrades.
Eyeon Software; www.eyeonline.com WIN MAC LINUX

Middleware
GeForce Experience Pack
Nvidia debuted its first online GeForce Experience Pack, which takes advantage of Nvidia PhysX technology, at SIGGRAPH. The pack includes: the full Warmonger action game, Unreal Tournament 3 PhysX Mod Pack boasting three maps with impressive effects, and sneak peeks at the upcoming game Metal Knight Zero and Nurien social-networking service based on Unreal Engine 3. Also offered are demonstrations of Nvidia's The Great Kulu, which showcases the use of PhysX soft bodies in a real game-play environment, and Fluid, which simulates realistic fluid effects with a variety of liquids. Nvidia also released new WHQL-certified drivers that enable PhysX acceleration for all GeForce 8, 9, and GTX 200 Series GPUs. After installing these new drivers, users are said to gain higher levels of interactivity, special effects, and realism on their PC, as well as the ability to run PhysX-accelerated applications faster on their GeForce GPU. The entire pack can be downloaded for free from Nvidia's Web site.
Nvidia; www.nvidia.com WIN

Allegorithmic ProFX
Allegorithmic, developer of procedural texturing tools for real-time 3D content, released ProFX. ProFX middleware enables game developers to professionally render high-quality procedural textures. It is designed to quickly produce small texture files while keeping high visual quality intact. ProFX 2.6 procedural texture files are small enough for use in massively multiplayer online games (MMOs), virtual communities, and casual and downloadable games.
Allegorithmic; www.allegorithmic.com WIN

Motion tracking
Imagineer Systems in LA
During SIGGRAPH, Imagineer Systems
demonstrated Mocha for After Effects,
the newest addition to its product portfolio. A planar tracking tool, Mocha-AE
was developed specifically for Adobe

After Effects artists and modeled after


Imagineer's Mocha stand-alone tracking
solution. Imagineer also demonstrated
the latest versions of its other VFX
desktop tools, including: Motor rotoscoping tools, Mocha stand-alone tracking station with 2.5D planar tracking
technology, Monet 2.5D tracking and
element replacement tool, and Mokey
for tracking, wire and rig removal, keying,
image stabilization, noise reduction, and
film-grain management.
Imagineer Systems;
www.imagineersystems.com WIN MAC LINUX

HARDWARE
Motion tracking
Intersense VCam
InterSense introduced a new virtual
production system, the IS-900 VCam
Virtual Camera Tracking System. Referred to as the InterSense Camera, early versions have been used to streamline pre- and postproduction in animated features and game cinematics. The new system, combining motion-tracking technology with virtual production software, is designed to help producers of animated and 3D feature films, games, and videos to increase productivity and streamline workflow. The IS-900 VCam integrates InterSense's MicroTrax 6-DOF motion-tracking sensors into a production camera body with Autodesk MotionBuilder functions mapped to the camera's interface controls. A three-axis analog joystick added to the VCam enables translation, rotation, and scaling of the virtual environment in MotionBuilder, helping set up or scout a virtual shot. Camera buttons deliver zoom, play, record, and jog, transferring production control to the camera operator.

InterSense; www.isense.com


Jean-Philippe Agati is general manager of Sparx Animation, a production studio founded in 1995 and specializing in keyframe animation, with facilities in France and Vietnam.

Interview with Chief Editor

Karen Moltenbrey

In the recently released CG feature Igor, a hunchbacked lab assistant from Malaria has big
dreams of becoming a mad scientist and winning the first-place prize at the annual Evil Science
Fair. He finally gets his chance when his cruel master kicks the bucket a week before the big event.
Igor, with the help of his two experimental creations (Brain, a not-so-bright organ confined to a jar, and Scamper, a cynical rabbit that was once roadkill), embarks on building the most evil
invention of all time: a huge, ferocious monster. Rather than evil, the monster turns out to be Eva,
a gentle giant who aspires to be an actress. Soon these misfits uncover a truly evil plot that threatens
their world, and they spring into action to save it.
Similarly, with the feature Igor, Sparx Animation got a chance to prove itself in the CG world
with its huge animated creation. Unlike with Igor, in this instance mostly everything went
according to plan, and it was Sparx that succeeded in world dominance. Well, nearly so, if you
consider how difficult it is for a European-created CG feature to break into US theaters!
Here, Sparx general manager Jean-Philippe Agati provides details about this engaging film.
How long was Igor in production?
Igor was in production for 22 months, starting in November 2006, with CG being done in less than 18 months. Postproduction finished mid-August 2008.

Is this Sparx's first feature animation?
Yes, this is our first one. Igor, which is 86 minutes in length, is Sparx's first feature. Prior to this, Sparx did some TV series and direct-to-videos, such as Rolie Polie Olie and Mickey's Twice Upon a Christmas.

Why did you undertake such an ambitious project?
Sparx has been doing animation for 10 years, and we felt that it was the right time and a natural evolution after our work on Twice Upon a Christmas.

How was the work divvied up among the groups?
Visual development was carried out in Paris for the first six months. At the same time, we started to enhance our Maya pipeline and create all the tools that would be necessary for our Vietnam studio to do the animation. Character modeling and rigging were done in Paris, while set and prop modeling was done in Vietnam. We went straight to a 3D animatic with a team of six animators and two cameramen. The 3D animatic was done in Paris in four months, with all box modeling made in Vietnam. We spent two and a half months revising the animatic prior to layout and animation. In Vietnam, 50 animators had six months to create the animation. We went for a three-step process: blocking, inter, and final (being mostly facial animation). At the end of animation, the movie was brought back to Paris for the lighting and final compositing.

What challenges surfaced as a result?
We have extensive experience sharing work between Vietnam and Paris. Nevertheless, given the ambition of the movie, communication and data synchronization were the keys to make things happen in such a short period. That said, we would have loved to have had the director in both locations at the same time.

Work on Sparx's CG feature Igor was split between the studio's Paris and Vietnam facilities.


Compared to US-made CG
features, what makes Igor
different?
Everything, and nothing, I guess.
Igor has its own look and story, but like any
other animated movie, it aims to give the
audience a good time. Technically, I don't
think it differs. The tools and the software
are the same. However, the size of the
budget and the schedule were those of an
independent movie.
What were your biggest
challenges?
Animation is and will always be the
biggest challenge. How do you make
CG puppets lovable? How will they bring
emotion to the audience? In this regard, I
am proud of our work. We did a lot of training, looking at Pixar's and DreamWorks' best
animation moments. We had the director
repeatedly act out the 1450 scenes that are
in the movie, and we have some talented
people that just love acting.
Did you encounter any
modeling issues?
The movie has 120 characters,
65 locations, and 250 props. In
Malaria, everything is asymmetrical, which
created quite a modeling challenge. Animation was the main focus of our attention.
Everything we did from the beginning was
done with the final animation in mind. At
the end, we averaged 0.6 second of animation per animator per day, which is not a lot,
but it's what it takes to get to feature quality.
What tools did you use?
We used Maya for modeling, rigging, texturing, and animation, and Digital Fusion for compositing. We used some other software for specific needs, including our own tools for rigging and data management. For hardware, we have a partnership with HP, so everything at Sparx is HP, from workstations to the renderfarm, and I never heard an artist complaining about [the hardware]. First time in my life.

As its main content-creation tool, Sparx used Autodesk's Maya running on HP workstations.
Did you have any particular character-modeling
concerns?
The high complexity of the models
and the unique look in 2D were the
biggest challenges. It turned out that the
3D models came out nicely with unique
personalities, and all in record time.
How did you bring out the
emotions?
This all comes from having the
right acting on great voice recordings. We got lucky because Exodus (the
production company) picked a great cast.
Who would not want to animate on Steve
Buscemi, Molly Shannon, or Eddie Izzard's
voices? And then, it is all about the skills of
the animators. As for the facial expressions,
Sparx has a long history of developing its
own tools for body and facial animation.
We had to go the extra mile on Igor and
bring these tools to the next level. They're
top secret but allow you to do great things
in a highly customizable way.
Which characters were
the most difficult to
model/animate?
Probably Igor. It seems that the hero
of a movie is always the most difficult to
animate. Maybe because he is at the center
of all the attention.

Were there rendering


issues?
For rendering, we used Mental Ray
for Maya. It was an obvious choice
for both technical and financial reasons.
In the first half of the production, we got
support from the Maya team at Autodesk
in using all the Mental Ray efficiencies and
improving the render time. This was profitable for both companies, as we each learned
a lot of what was needed to optimize
rendering. I am proud to say we provided
some informative data to Autodesk as well.
What about the
environments?
Director Tony Leondis went asymmetrical on everything, and with
irregular shapes on top of that. That is what
Malaria is all about.
There was a big battle
scene?
Yes, and both the crowd and the
choreography of the fight itself
were tricky. How do you make a big fight
and a big crowd without spending half
your budget? It's all in the magic. I love the phrase: "Magic is not perfection. Magic we can afford, perfection is out of reach."
So, what's next?
When do we do the sequel? The
team had such a good time working
on this movie that this pretty much
sums up our feelings right now.
