
LINE ITEMS

Is FCP X the Future of Editing?


David Leitner assesses Apple's innovation.

[Photo courtesy of Apple: Final Cut Pro X]

Editing is older than motion pictures. The ordering and pacing of dialogues, scenes, entrances and exits to build conflict and resolution have long defined Western theater, from Aeschylus's Oresteia to Wagner's The Ring of the Nibelung [Der Ring des Nibelungen]. It was the insertion of first-person thoughts into dialogue and plot that modernized 18th- and 19th-century novels, and clever sequencing of mechanically animated magic lantern glass slides that thrilled Victorian audiences to popular epics like Ben-Hur.

Nevertheless, as Walter Murch likes to point out, film editing was invented 14 years after motion pictures. Uncut reels of onrushing locomotives, sneezing and kissing were gripping, profitable entertainments, such were the frisson and novelty of realistic moving images. (Vitascope was the YouTube of its day in this regard.) It would take Georges Méliès's camera to jam and restart in a Parisian street in 1896, and a similar mishap at a 1901 horse race in Bristol, England, to reveal, respectively, the trick of editing within shots (effects) and between shots (cutting). Disruption of time, place and point of view through film editing would soon yield a new dramatic art: Cinema. It could as easily have been called Cubism.

As film syntax matured, technique formed and a profession emerged. A century on celluloid carried us from scissors and glue to guillotine tape splicers and upright Moviolas, then flatbeds. Video brought big-iron linear editing systems for online, then finally PC-based nonlinear editing systems (NLEs), all mostly operated by professionals, at least through the late 1990s, when hardware-based Avids still cost tens of thousands of dollars. What cracked the door to editing for the rest of us was the introduction of software-based Final Cut Pro 1.0 at NAB in 1999, coincident with the arrival of FireWire-enabled DV camcorders. All dismissed as amateur, naturally. We've seen how that turned out.

Digital democratization spread inexorably. It's not hard to draw straight lines from FCP + DV to HDV and DVCPRO HD, to the rise of small camcorders, Internet streaming, cheap SD cards to record on, RED usurping film and cheap HDSLRs usurping RED. (With a regrettable drop in pay rates along the way.) As a consequence, there are exponentially more people, professional and nonprofessional alike, of all ages, in all countries, now creating, editing and distributing digital movies. Everyone with a point-and-shoot or smartphone in their pocket is a potential HD source. (Today's equivalent of a Kodak Brownie: "You push the button, we do the rest." Insert irony here.)

And yet, though we've embraced digital code as the motion picture medium of our time, the technology of nonlinear editing remains very much a work in progress. For one thing, picture, sound, music and effects continue to invite the forces of invention. For instance, we now need torrents of metadata (data about data) simply to keep track of everything, not only during editing but to manage future access and archiving. And another thing: nonprofessionals (whatever this distinction signifies in an information age rife with underemployment) have vaulted forward in technical savvy and technique thanks to the explosion of shared knowledge on the Internet, plus the extensive capabilities of their low-cost digital tools.

As an NLE designer today, where would you draw the line between professional and nonprofessional? Which features would you include or deny? Wouldn't you wish to meet the high-end needs of the workplace, yet attract that vast center of the bell curve of potential users? If only to carve out the largest possible market share?

As it happens, this past year ushered in a crop of powerful, affordable, newly 64-bit professional NLEs, including Final Cut Pro X, Avid Media Composer 6.5, Adobe Premiere Pro CS6, Sony Vegas Pro 12, Grass Valley Edius Pro 6.5 and the resurrected Lightworks. (Media 100 Suite remains 32-bit. Another dozen Windows-based NLEs exist for under $100.) 64-bit architecture introduces dramatically faster importing, transcoding, rendering and output. It demolishes the old 32-bit performance barrier of 4GB RAM, replacing it with a theoretical 17 billion GB, just in time to meet the coming decade's demand for rock-solid stability, instant timeline loading and flawless playback of real-time effects in H.265 (twice as efficient as H.264), 3D, 4K and beyond.

Details vary, but on the whole these NLEs offer the latest camera codecs; codecs for proxy editing and finishing; timelines that accept mixed codecs, resolutions and frame rates; motion effects; image stabilization; primary and secondary color correction; audio mixing; effects plugins; sophisticated titling; support for third-party hardware (Matrox, AJA, Blackmagic, MOTU, Bluefish); support for multicamera editing; support for stereoscopic 3D editing; extensive metadata tagging of clips; media management across myriad drives and sources; output compressions; and project/timeline interchange with other apps, NLEs and audio editing programs. All you need is a credit card, not a guild card. What's not to like?

What these NLEs don't aspire to, with the exception of FCP X, is evolution. Despite constant churn in the technology of creation and consumption of digital moving images (viewing now entails phones, tablets, laptops, TVs, cinemas), editing hardware remains tied to a mouse-driven desktop environment conceived decades ago. On the software side, why perpetuate dual source/record windows from 1970s tape editing, or interface metaphors adapted in the 1980s from film editing (Avid Media Composer), or 1990s timeline design (Final Cut Pro 7)? Why not exploit this 64-bit great leap forward in speed and processing to rethink, perhaps even reinvent, editing for the coming file-based century?

In introducing its mobile operating system, iOS, five years ago, Apple seized an opportunity to innovate new file systems (hidden), control interfaces (touchscreen), gestures (multitouch), screen displays (full), app switching (fast) with saved states (flash memory), Internet upgrades (App Store), and voice commands (Siri). And let's not forget erasing pixels (Retina display). iOS is an offshoot of OS X: 32-bit, but written in Objective-C like OS X. Both operating systems possess a layer-cake architecture with a dedicated media layer that contains graphics, audio and video frameworks such as Core Animation (fluid icons, controls that fade), Core Audio and Core Media. Frameworks are collections of functions that can be shared by different apps in a modular fashion, without having to be rewritten each time.
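For a concrete feel of what one of these shared functions looks like, here is a minimal sketch using Core Media's time type (illustrative only; Swift is shown for brevity, though apps of this era were written in Objective-C). Any app on the platform can call it rather than reinventing timecode math:

```swift
import CoreMedia

// Core Media represents time as a rational number (value/timescale), so a
// frame boundary at 24fps is exact rather than a rounded floating-point second.
let oneFrame = CMTime(value: 1, timescale: 24)      // 1/24 of a second
let tenSeconds = CMTime(value: 240, timescale: 24)  // 240 frames at 24fps
let total = CMTimeAdd(oneFrame, tenSeconds)         // 241/24, still exact
print(CMTimeGetSeconds(total))                      // 10.041666...
```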


With iOS 4 (2010) and OS X 10.7 Lion (2011), the media layer of each OS gained a new framework, AV Foundation, the engine of FCP X. A big advantage of conjoined operating systems is that user-interface breakthroughs on mobile devices such as the iPad can readily migrate to Mac apps like FCP X, for instance, use of animation, multitouch, auto-saving, full-screen display, Retina display and integration with flash architecture, all of which in turn optimize FCP X for use on portable MacBook Pros with trackpads. On the latest MacBook Pro with Retina display, for example, you can view full 1080p in FCP X's small Viewer window.

Of particular significance: the 64-bit AV Foundation found in OS X supplants the now-legacy 32-bit QuickTime framework (video files will continue to sport QuickTime extensions). AV Foundation brings, at last, multi-core and GPU-assisted speed to Final Cut Pro rendering tasks (using OS X's Grand Central Dispatch and OpenCL), as well as full color management from input to output and finer time accuracy for subframe events.
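What multi-core dispatch buys is easy to sketch. This is not FCP X's actual render code, just an illustration of the Grand Central Dispatch pattern it can draw on; renderFrame is a hypothetical stand-in:

```swift
import Dispatch

// Hypothetical per-frame render task: decode, apply effects, encode.
func renderFrame(_ index: Int) {
    // ... real work would happen here ...
}

// GCD fans the 240 iterations out across every available core and
// returns only when all of them have finished.
DispatchQueue.concurrentPerform(iterations: 240) { frame in
    renderFrame(frame)
}
```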

Of course, the broad gibe against FCP X at its introduction in June 2011 was that it represented nothing more than a pro version of iMovie, which, not surprisingly, also relies on AV Foundation. Apple's senior vice president of Industrial Design Jony Ive is a devotee of German designer Dieter Rams's "Less, but better" philosophy, evident in all Apple products. FCP X chief architect Randy Ubillos was the creator of the first three versions of Adobe Premiere in the early 1990s, while senior product manager Steve Bayes, a working editor for years, was once Avid's principal product designer for Media Composer, Symphony and DS Nitris. He also wrote the essential The Avid Handbook. As futurists endeavoring to envision the shape of tomorrow's pro editing, they're not exactly chopped liver. So why the virulent public protest?

In addition to incorporating OS innovations and building out extensive control of metadata and media management, the FCP X team sought to directly address several prominent trends in production: Digital cameras generate endlessly more footage than film cameras ever did, which must be readily reviewable and searchable. Multiple cameras are now common and often wild (no sync). Democratization encourages (see page 83)

Where would an editor fit in this mix? While Telltale doesn't hire anyone for that specific job title, Parson says, filmmakers are certainly in demand. "We've hired cinematic artists who came from film backgrounds and didn't always have experience working in 3D software. Of course, any of that you do know would only make you more qualified. Teaching software packages is pretty easy. Teaching talent and creativity is hard." In discussing the future of gaming, he adds, "I also fully expect to see more integration between games and other media, especially filmmaking. I think it will become more and more common for ideas, talent and properties to cross-pollinate between filmmaking and game development."

FCP X AND THE FUTURE OF EDITING


(from page 71) Democratization encourages many to edit regardless of experience; at the same time, audiences expect perfect finished quality regardless of budget. FCP X's solutions, in order: fast Skimming with pitch-corrected audio, Keywords & Smart Collections, Multicam (introduced in January in FCP X's third upgrade in a year), and a friendlier, less cluttered interface for those with less experience, with deep controls located just below the surface for experienced editors.

The uncluttered interface is key to understanding how radically innovative FCP X truly is. Conventional timelines resemble orchestral scores, with dozens of staffs representing myriad instruments and sections, each charted across time. In a conventional NLE timeline, video and audio tracks can similarly number in the dozens, overflowing even the largest display. In many cases, these tracks are mostly empty, containing only a handful of clips. Arguably, a massive waste of precious screen real estate is the result.

FCP X has no tracks. It adopts a different metaphor, one that Aeschylus would recognize. Instead of a timeline with tracks above and below, FCP X provides a single primary storyline that serves as a narrative spine, with a beginning, middle and end. Individual clips are connected at points along the storyline, floating just above (video) or below (audio) it. A complex stack or sequence of clips can be collapsed and nested into a simple compound clip that can be edited like a single clip or momentarily reopened into its own storyline for internal editing.

Sync relationships are preserved by a Magnetic Timeline. Since clips and compound clips are attached to points along the storyline, it's impossible to knock them out of sync in the course of inserting or deleting other clips. If two clips happen to collide in the course of an edit, one slips above or below the other (literally, using animation), preserving the relationship of both clips to the storyline. The editor, free from worry about accidentally knocking clips or complex sequences out of sync, can playfully shuffle clips and sequences, focusing entirely on story structure. Dispensing with the clutter of conventional tracks also favors use of FCP X on mobile devices and compact laptops with smaller screens, a clear nod to the future.
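For the technically curious, that sync guarantee falls out of the data model. Here is a toy sketch of the idea as I understand it from use (the type names are mine, not Apple's): because connected clips anchor to times on the spine rather than to tracks, an insertion shifts the spine and the anchors together.

```swift
struct Clip {
    var name: String
    var duration: Double
}

// A connected clip floats above or below the spine, anchored to a time on it.
struct ConnectedClip {
    var clip: Clip
    var anchorTime: Double
}

struct PrimaryStoryline {
    var spine: [Clip] = []
    var connected: [ConnectedClip] = []

    // Inserting into the spine shifts every later anchor by the inserted
    // duration, so nothing downstream can drift out of sync.
    mutating func insert(_ clip: Clip, at time: Double) {
        var elapsed = 0.0
        var index = spine.count
        for (i, c) in spine.enumerated() {
            if elapsed >= time { index = i; break }
            elapsed += c.duration
        }
        spine.insert(clip, at: index)
        for i in connected.indices where connected[i].anchorTime >= time {
            connected[i].anchorTime += clip.duration
        }
    }
}

var timeline = PrimaryStoryline()
timeline.spine = [Clip(name: "Interview", duration: 60)]
timeline.connected = [ConnectedClip(clip: Clip(name: "Music", duration: 30), anchorTime: 20)]
timeline.insert(Clip(name: "B-roll", duration: 10), at: 0)
// Music's anchor is now 30: it moved with the footage it was cut against.
```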

When 64-bit FCP X displaced 32-bit FCP 7, which was summarily discontinued, there's no question pro editors whose livelihoods depended on FCP 7 were deeply shaken. But many experienced editors groused because, I believe, FCP X was strangely unfamiliar territory. Others faulted FCP X for features that were, at first, missing, instead of lauding innovations like the Magnetic Timeline and the fact that, with OS X 10.7 Lion's autosaving and Resume, a power loss or unlikely crash no longer means loss of work. FCP X projects reopen exactly where they left off, like magic.

Rome wasn't built in a day, and neither was what became FCP 7 (10 years). In the course of FCP X's first year, five free upgrades have arrived via the App Store (no need to change out of your bathrobe), including support for Multicam, XML, media relinking and broadcast output. Features to arrive later this year include multichannel audio editing tools, dual viewers and support for MXF plug-ins and RED. And then there are useful OS features like voice dictation, which arrived in July with OS X 10.8 Mountain Lion. Double-click the Function key and you never have to type an event label or clip description in FCP X again. Just speak.

Concision, after all, is the soul of editing, and the film-editing project begun a century ago with scissors and glue may yet reclaim its own simplicity. What we really want, if we're unashamedly honest, is facility of editing with the ease of dreaming. To say, "wouldn't it be cool if..." and see instant results on the screen. Maybe someday we can tell Siri where to make that cut and how long to extend that dissolve. Change the tint, add vignetting, a little more saturation... With FCP X, we're taking first baby steps in that direction.

Final note: I cut a 17-minute documentary one evening last summer (footage I shot) using FCP X on a 17-inch MacBook Pro with an internal 500GB SSD and a fast G-Tech 8TB RAID with Thunderbolt. I loaded files, reviewed footage, cut picture, sound and music, added titles and credits, and finished in 11 hours straight. There was no initial rough cut, then fine cut. I edited carefully along the way, with utmost precision. The finished results were projected before an audience the following morning. I couldn't have pulled this off using pokey old FCP 7. In other words, if this is the future of editing, I'm loving it.

EDITING LIFE OF PI
(from page 73) "...behind the screen. The way that we wanted to do the fade-up is that the sun comes up first, and then everything else follows. If you just fade out, where infinity's at 20 pixels, and then you fade up the sun at 32 pixels, your eyes are still thinking infinity's at 20 pixels, so the sun feels like it's in a hole. It feels very strange. So what we did is this: when the sun comes up first, it comes up at 20 pixels back, and then, as the rest of the scene comes in, the sun drops back to 32 pixels. You don't feel it [when you watch the movie], but if we did it differently, you would feel it, and it would feel strange.

"For every dissolve you have to think about how the shots interact. For some of our transitions, the outgoing shot has to drop back as the dissolve is starting, or even before the dissolve starts. There's one dissolve where we cut from a wide shot of the boat at night to a close shot of Pi writing. If you just do the dissolve, this little tiny boat looks like it's floating right on Pi's nose; it just looks dumb. Leading into that cut, we had to drop the boat back, so [coming in] it feels like it's behind his head. [In 3D] you have to carefully consider every dissolve, every transition. So [the stereography] really is part of the editor's job."

That's fascinating. And I'm sorry, but can you explain how pixels are the unit of measurement here?

"Yes. That's the offset between the left-eye image and the right-eye image. Let's say you have an image of a star. If you look at it on the screen without the 3D glasses, you'll see two stars. When you put the glasses on, each eye only sees one, and they automatically reconverge. If the right-eye image is to the right and the left-eye image is to the left, [the star] feels like it's behind the screen. If it's the opposite, it feels like it's in front of the screen. And so, by repositioning the images left and right, you can control what's behind the screen and what's in front of the screen, and you can do that in postproduction."
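The convention he describes is easy to state in code. This sketch is my reading of it, not the actual pipeline used on the film: a positive left/right separation places an object behind the screen plane, a negative one in front.

```swift
// Horizontal screen positions for each eye, given a parallax offset in pixels.
// Positive offset: the right-eye image sits to the right, so the object reads
// as behind the screen. Negative offset: the reverse, the object floats in front.
func eyePositions(centerX: Double, offsetPixels: Double) -> (leftEye: Double, rightEye: Double) {
    (leftEye: centerX - offsetPixels / 2, rightEye: centerX + offsetPixels / 2)
}

let infinity = eyePositions(centerX: 960, offsetPixels: 20)  // "infinity's at 20 pixels"
let sun = eyePositions(centerX: 960, offsetPixels: 32)       // the sun parked deeper, at 32
// Easing offsetPixels from 20 to 32 over the fade-up is exactly the
// "sun drops back" move the editor describes.
```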
