FEATURES - Issue 1.08, December 1995

The New Hollywood: Silicon Stars

By Paula Parisi

More and more movies are being made on location - in cyberspace with synthespians like T. rex, Casper, and now the entire cast of Toy Story. Proving that the only major new talent in Tinseltown this year is technology.



The business of computer imaging has become so hot that the top animators in the field can name their price. SGIs are being uncrated by the dozen at companies all over Hollywood - the only problem is finding enough qualified people to run them. The Spielberg/Katzenberg/Geffen venture DreamWorks SKG plans to hire 100 computer animators, and people who might normally make US$50,000 (around £30,000) a year can expect that to more than double. RAM jammers are outearning MBAs. They're being referred to by the establishment as "talent," a term normally reserved for actors. Even when used derisively - and Hollywood is probably the only place it can be - the title carries a grudging respect.

The digital artisan is a class of collaborator so new it's still being defined, drawing film industry gurus, computer science graduates, video postproduction staff, photo retouchers, and traditional fine artists. As Scott Ross, CEO and partner of Digital Domain, a nearly 3-year-old special effects facility, told The Hollywood Reporter, "The kind of artistic digital specialists the technology requires don't really exist yet. We're basically asking a lot of people to learn Esperanto and write poetry at the same time." The ultimate goal: to create life itself.

A decade ago, only an intrepid few, led by George Lucas's Industrial Light & Magic, were doing high-quality digital work. Now computer imaging is considered an indispensable production tool for all films, from the smallest drama to the largest visual extravaganza. Leaping from 2-D (think Bugs Bunny) to 3-D, animators can now concoct digital sets, enhancing natural environments or crafting fantastic ones - a point-and-click version of what used to be known as matte painting. With a little techspertise you can pull off awesome fly-bys in three-dimensional cityscapes. When it comes to lifelike locales, the digital studio is fast becoming a reality.

"We're on the threshold of a moment in cinematic history that is unparalleled," says the techno-trailblazing filmmaker James Cameron, who uses Digital Domain, where he's chair, as a personal R&D lab for cinematic stunts. "Anything you imagine can be done. If you can draw it, if you can describe it, we can do it. It's just a matter of cost."

The most difficult, hence ambitious, work is in character development. An organic creature that moves, changing perspective in three-dimensional space, is the greatest challenge facing Hollywood's hard drives. The absolute imperfection of living things is a renderer's nightmare. For many, the ability to generate a photorealistic human, an "artifactor," remains the elusive goal. While animators have been developing a lively tradition of computer-generated "characters" in the form of animals, aliens and others, the ability to conjure a convincing human from a synthetic source has hovered tantalisingly out of reach. But that's changing. "We can make an animal, and if you do that, you can make a human," says Lucas, whose personal toy box, ILM, is now immersed in creating an anthropomorphised beast for Universal's Dragonheart, set to début in the summer of 1996.

There are some who would argue that there have been synthetic film actors around since Walt Disney released Snow White in 1937. They would be right, but the photorealistic quality of today's computer animations has raised the stakes considerably. The new generation started in 1989 with the slinky, translucent water snake in Cameron's The Abyss; then in 1991, we got our first truly believable computer-generated character in the morphing metal cyborg of Cameron's Terminator 2: Judgment Day. Two years later, Steven Spielberg's Jurassic Park created an atmosphere in which audiences quite literally could not believe their eyes.

This season, moviegoers will be shocked to learn that the jungle herds of TriStar's Jumanji stampeded in off the cybersavannah. In Jumanji, based on the book by Chris Van Allsburg, a young boy gets trapped inside a board game and grows up to be Robin Williams, lonely, rambunctious, and spinning around in a parallel universe that wreaks havoc when it crosses paths with the real world. "We have a shot in which we've got a wild animal stampede flying through a room, suspended in air, like on a vortex," says director Joe Johnston, an alumnus of ILM who helmed Honey, I Shrunk the Kids. "There's no other way to do that besides computer-generated imagery. You could try stop-motion animation, but it's not going to be photorealistic. You're certainly not going to suspend animals by cable on a blue screen. The thing is, computers are doing stuff that you couldn't do before - at any price." (They still can't do hair, though. "The big challenge was fur," recalls Johnston. "It needed to be matted down, with knots in it, burrs and things animals would have. When our first tests came back on the lion, he was totally groomed with this big mane of perfect hair. He looked like Tina Turner. They spent quite a bit of time dirtying it down.")

Dragonheart, which stars a fully realised reptile-as-thespian, will push a little further toward crossing the line between human and humanoid talent, a heady progression fraught with its own implications for the industry. "This is not just a dinosaur that moves," says the director, Rob Cohen. "This creature emotes, feels, is threatening, and has the voice of Sean Connery - and, we hope, some of his presence and wisdom, too. What we're trying to do is create the first computer-generated actor. He's a co-star to Dennis Quaid, scene for scene."

Jim Cameron remembers that when he wrote The Terminator in 1980 and '81, the concept of the next-generation cyborg, T-1000, was impossible to pull off - unheard of. Dinosaurs at least had historical precedent in the "Superdynamation" stop-motion technique of Ray Harryhausen, who wowed audiences in the '50s and '60s with movies like One Million Years B.C. and Jason and the Argonauts. Cameron contemplated his silvery cyborg and scratched his head. "I knew there was just no way to get this thing on the screen," he says. "I was actually prevented from creating an image that was preexisting in my mind by the inadequacy of the technology." Cameron had to put T-1000 on the shelf for five years. To communicate the ecstatic experience of having his fantasy fulfilled in Terminator 2, he invokes the 1956 science fiction classic Forbidden Planet and its ephemeral Krell, a race of beings who evolve beyond the limitations of their bodies, existing as pure thought. Equating computer imaging with "the Krell dream of pure creation," Cameron recalls the thrill of watching his ideas travel "from imagination to the screen with no visible intermediate."

The intermediaries may be invisible, but they control a good portion of the process. On Dragonheart, for instance, ILM's troops built a "puppet" of Draco the dragon, orchestrated his every move, and then dropped him seamlessly - at 20 feet tall and 40 feet long - into a pre-shot live-action scene. (In the fine-tuning stages, Cohen coached the animation supervisors on sharpening expressions - giving an eyebrow just the proper lift, concocting the perfect smirk, tilting the head just so.) While the modellers were at work back at the ranch in San Rafael, California, Cohen was out on location in the hills of Slovakia, shooting with the human components of his principal cast. The analogue star, Quaid, played his big moments against empty space. Elaborate rigs were constructed to shake up rocks in spots where Draco's feet would later be inserted, and sway vegetation that would later bend to his enormous bulk and the beat of his 75-foot wingspan.

Cohen predicts this type of collaboration - "the blending of something that's imagined with something that's really there" - will become "a new specialty art form." The dance with an invisible partner was clearly the most challenging aspect of the production, and it was achieved only by meticulous planning and collaboration between director and animators. "It's not an arena where you do a lot of improvisation," Cohen says, laughing. After the lengthy preparation, and much mental energy spent imagining his digital star springing to life at ILM, Cohen describes his first glimpse of Draco as "a moment I'll never forget." In the scene he watched in the screening room, the dragon delivers a nice, wry line in a sunlit close-up. "He was as real as the rocks and the trees behind him," enthuses Cohen. "I felt like shouting, 'Don't show this to anybody, they'll burn us at the stake! It's witchcraft!'"

You'll love what it does for you

In the last two years, dozens of actors have gone "under the beam," among them Jim Carrey, Arnold Schwarzenegger, and Denzel Washington. The technique is called scanning, and it involves running a laser beam over a person or object, feeding the minutest details of shape, texture, and colour into a computer. A mainstay of the computer-aided-design industry (and an outgrowth of military R&D), scanning was first used on actors in 1986, when ILM digitised the principals of Star Trek IV: The Voyage Home for a short scene in which their heads dissolve into particles. So far, actor scanning has been confined almost exclusively to the head, though that's likely to change with the introduction of the first full-body scanner from Cyberware Inc.

When Denzel Washington's head shatters in Virtuosity, it's by means of his digital image, giving a much more realistic effect than a model would. Digital also offers predictive control: you can know exactly how his head will fly apart, down to each jettisoned nostril. Once the 3-D digital "data set" of an actor is inside the computer, it can be manipulated at will, made to do literally anything.

Meanwhile, the digital stunt doubles in Outbreak, Batman Forever, and Judge Dredd allowed filmmakers to fake action more convincingly than before. Synths will uncomplainingly take those suicide leaps (take after take after take). And crude camera tricks that boost the egos of vertically challenged stars will soon be replaced by a synthetic stretch. Already, real-time animation devices allow cartoon characters to "live." Systems such as Vactor and Alive propel toons onto talk shows and into interactive installations at theme parks. Eventually, there will come a day when none of us will be really sure that the image on our screens is real. William Gibson's Mona Lisa Overdrive proved prescient in its vision of "entertainers," celebrities assembled by committee and marketed by corporation. The trend is already foreshadowed in real life in the idoru, or idol singer, craze in Japan, which Gibson has recently stated will be a feature of his next book.

As cartoon characters get increasingly real, actors may get more cartoonlike - and heroic: the possibility for a developmental curve that exceeds one's natural limitations is truly tantalising. Digital artistry will allow actors to bioengineer themselves, or else be unwittingly bioengineered, to perfection. A performer with no aptitude for dance, for example, can have all the right moves programmed in. Stars will be constructed from the choicest body parts, in the same way dozens of animators work in concert to create a Disney character like Aladdin or Pocahontas, each injecting his or her little contribution. Early screen tests featuring Casper as a biped rather than trailing a genie-like wisp prompted comments like, "The legs - lose 'em!" - the shape of things to come.

Digital Frankenstein

Scott Billups is the first person - in Hollywood, at least - to reach deep into the heart of his bit-circuited incubator and pull out something imbued with a spark of electronic life. Billups's courtly manner, springy step, and tidy hair call to mind a gearheaded Cary Grant. But he's a special-effects meister with an attitude, sourly complaining that "carbon-based" actors are glamorous "only until you've had to work with them." The first postmodern effects cowboy, he talks about a filmmaking "shift from the organic bias to the inorganic," and exhibits a healthy scepticism of commonly held beliefs ("Let's face it, a set is little more than a synthetic representation of an actual or imagined environment rendered in organic materials").

When it comes to the movie industry's special-effects mainstream, Billups is almost as isolated as those cinematic mad scientists of the 1930s, who repaired to remote mountaintops in Transylvania for their pioneering studies. He is more interested in challenging the status quo than joining it and is generally content to puzzle through off-beat projects like Pterodactyl Woman of Beverly Hills (hysterical housewife morphs into historical reptile) and Really Big Bugs (oversized insects invade LA). But Billups occasionally goes commercial, too - he built one of the most ambitious cyberstars to date, a virtual actress designed for telco subsidiary GTE Interactive Media using Marilyn Monroe (who else?) as a template.

Virtual Marilyn was put together using five actresses and models, a Cyberware scanning machine, Alias modelling software, and Wavefront's motion-software package, Kinemation. While virtually all of Hollywood's computer-generated characters have been designed with one goal in mind - to deliver the maximum bang per megabuck during a few precious seconds of screen time - Marilyn was engineered from the ground up with a nominal capacity for interactivity and an eye towards growing her "intelligence," with the digital equivalent of a primitive nervous system stuffed inside her slinky shell. She's no dead ringer for Marilyn (and her hair, incidentally, is as caramelised as Dolly Parton's exotic coiffures), but at certain angles she bears an uncanny resemblance to the original. She demonstrates admirable if not entirely desirable range, with a propensity to slip at a moment's notice from strikingly beautiful to alarmingly grotesque. Her attempts at motion are as endearing as an infant's first feeble gestures; her awkward grace is as inspiring as it is frightening. Watching Marilyn recalls the chilly seduction of the first artificial flirt, captured so precisely in the classic climax to James Whale's 1935 Bride of Frankenstein. Elsa Lanchester swoons; the scientists gasp.

In a way, not much has changed in the 60 years since Whale conjured those emotions using carbon-based actors on a very real soundstage. Observers at Billups's studio are experiencing the same feelings. Physically, it is still a two-dimensional exchange. Nevertheless, something more is beginning to happen in several f/x houses, as special-effects firms are known in the industry: slight intelligences - instincts, really - are being instilled through a process known as inverse kinematics.

Kinemation represents a huge breakthrough in motion animation. It used to be that when computer animators wanted to move something - specifically an organic creature - they would have to create the motion themselves, body part by body part, detailing by hand every nuance of movement. With Kinemation, Wavefront began building certain "instincts" into the software - so that when, for example, a hand moves, the muscles on the forearm will flex automatically. "We are writing software that not only will allow you to create objects that have geometric qualities, colour qualities, and textural qualities, but will also allow you to teach them to be objects," says Alias/Wavefront president Rob Burgess. "When a ball hits the wall, it compresses, and when it moves away from the wall it uncompresses. When a foot hits the ground, it knows to bend." Everyone at ILM will tell you that without this software, it would have been next to impossible to animate Jurassic Park, because it would have taken so long to get the dinosaurs to move properly.
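
Kinemation's internals aren't described here, but the "hand moves, forearm follows" behaviour Burgess describes is the textbook inverse-kinematics problem. Below is a minimal two-joint sketch in Python; the function name, link lengths, and target point are illustrative assumptions, not Wavefront's actual code. The solver works backwards from where the hand should be to the angles the shoulder and elbow must take.

```python
import math

def two_link_ik(target_x, target_y, upper=1.0, fore=1.0):
    """Solve shoulder and elbow angles so the "hand" of a two-link arm
    reaches (target_x, target_y). A toy stand-in for the kind of built-in
    "instinct" described above, not Kinemation's actual code."""
    dist = math.hypot(target_x, target_y)
    # Clamp the target to the arm's reachable range so the maths never fails.
    dist = max(abs(upper - fore) + 1e-9, min(dist, upper + fore - 1e-9))
    # Law of cosines gives the elbow bend needed to span that distance.
    cos_elbow = (dist ** 2 - upper ** 2 - fore ** 2) / (2 * upper * fore)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Aim the shoulder at the target, then correct for the elbow's bend.
    shoulder = (math.atan2(target_y, target_x)
                - math.atan2(fore * math.sin(elbow),
                             upper + fore * math.cos(elbow)))
    return shoulder, elbow

# Move the hand; the elbow "knows" how far to flex to follow it.
print(two_link_ik(1.2, 0.8))
```

Scale the same idea up to a full skeleton, add rules like "a foot that hits the ground bends," and you get the sort of built-in instincts the animators are describing.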

The lines of code so painstakingly developed for a smart software package like Kinemation are more than mere conveniences. In celluloid terms, they're digital DNA, the very fabric of computer-generated life. And like many of the film industry's silicon-based breakthroughs, the core technology was appropriated from other fields. Computerised creatures created for medical training at the University of Pennsylvania Center for Human Modelling and Simulation were originally designed to mimic "gross body responses," according to Dr. Norman Badler. "Their blood pressure, respiration, and neurological responses are all programmed in," explains Badler. "So if there's no oxygen and the synthetic doesn't get proper treatment within a logical response time, its brain will die."
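
Badler's oxygen rule is, at bottom, a small piece of simulation logic. A minimal Python restatement follows; the class, the one-second time step, and the four-minute window are guesses made for illustration, not the Penn lab's model.

```python
from dataclasses import dataclass

@dataclass
class SyntheticPatient:
    """A minimal restatement of the "no oxygen, no treatment, brain dies"
    rule quoted above. The time step and the four-minute window are
    illustrative guesses, not the Penn lab's actual parameters."""
    oxygen: bool = True
    seconds_without_oxygen: float = 0.0
    brain_alive: bool = True

    BRAIN_DEATH_WINDOW = 240.0  # assumed "logical response time", in seconds

    def tick(self, dt: float, treated: bool) -> None:
        if self.oxygen or treated:
            self.seconds_without_oxygen = 0.0
            return
        self.seconds_without_oxygen += dt
        if self.seconds_without_oxygen > self.BRAIN_DEATH_WINDOW:
            self.brain_alive = False

patient = SyntheticPatient(oxygen=False)
for _ in range(300):               # five untreated minutes, one-second steps
    patient.tick(1.0, treated=False)
print(patient.brain_alive)         # False: help arrived too late
```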

The puppet masters

There are three ways to map movement onto a computer-generated creature: digital animation, which amounts to computer drawing - a frame-by-frame process much quicker than, but not all that different from, the way Mickey Mouse was locomoted; capturing the motion from a person who has been scanned or outfitted with sensors that feed data into a computer; and building a mechanical model that is similarly rigged.
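
To make the second route concrete - a person wired with sensors feeding data into a computer - here is a toy Python sketch in which made-up per-frame joint angles drive a two-bone arm. Everything in it is an invented assumption; it simply shows how captured readings become a pose.

```python
import math

# Toy motion capture: made-up per-frame joint angles (the "sensor" data)
# drive a simple two-bone arm. Bone names, lengths, and readings are all
# invented for illustration; no studio's actual rig format is implied.
BONES = [("upper_arm", 0.30), ("forearm", 0.25)]   # name, length in metres

def pose_arm(joint_angles):
    """Forward kinematics: accumulate each captured joint angle down the
    chain and return the world-space tip of every bone."""
    x = y = angle = 0.0
    tips = []
    for (name, length), theta in zip(BONES, joint_angles):
        angle += theta
        x += length * math.cos(angle)
        y += length * math.sin(angle)
        tips.append((name, round(x, 3), round(y, 3)))
    return tips

captured_frames = [(0.0, 0.0), (0.3, 0.6), (0.6, 1.1)]   # radians per joint
for frame in captured_frames:
    print(pose_arm(frame))
```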

To animate Sil, Natasha Henstridge's synthetic alter ego in Species, Richard Edlund, founder of Boss Film Studios, invented an elaborate puppet system. His crew began by building an actual-size plastic-rubber model (nearly 7 feet tall), plus a 2-foot-tall version which could be manipulated from a distance using joysticks and keyboards. Next, they traced a grid with lines intersecting every half-inch or so and scanned the model's data set into the computer. This gave them the digital outline that would become the character's skeleton. "Each one of those intersections is a polygon," explains Edlund. "All the polygons have to match, and there has to be elasticity to the skin. It's an arithmetical nightmare," he says, noting that since the skeletal Sil was transparent, the process was complicated by interior as well as exterior shapes. The low-resolution Sil comprised about 5,000 polygons, while the final film image went as high as 500,000 polygons. (By comparison, the per-picture element for Jurassic Park's dinosaurs was in the region of 50,000 polygons.)
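
The article doesn't describe how Boss got from the 5,000-polygon stand-in to the 500,000-polygon film image, but a crude back-of-the-envelope sketch, assuming uniform subdivision, shows how fast such counts climb:

```python
# A rough feel for how quickly polygon counts climb. Uniform subdivision
# (each face split into four per pass) is assumed purely for illustration;
# the article doesn't say how Boss actually refined Sil's mesh.
count = 5_000                      # the low-resolution puppet stand-in
for level in range(1, 4):
    count *= 4
    print(f"subdivision pass {level}: {count:,} polygons")
# Three passes already reach 320,000 polygons; a fourth would overshoot
# the roughly 500,000-polygon final-film figure quoted above.
```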

A team of puppeteers manipulated the model's movement, which was translated instantly into a low-resolution image of Sil displayed on a video monitor. The huge advantage of Edlund's system was that it allowed Species director Roger Donaldson to direct his creature in something approximating real time. He could see his computer-generated character (albeit a low-resolution, grainy version) right there on the set, composited into the scene as he directed. (This was a competitive breakthrough for Boss - and it nearly broke the company; the R&D was so expensive that Boss announced its "retirement" from features, at least for now.)

The hulking dinos snarl, the alien bitches hiss. But the A-list animators are not content. What do they want now? Facial capture. Sensors - from as few as five or six to as many as twenty-five, depending on how much detail is required - are positioned on an actor's face. Taking direction, the performer will move his or her face, and particular expressions will be recorded and mapped onto a digital character. Edlund's team has created a system that is basically a library of captured facial images, "a visual saxophone," as he calls it. "It has all these keys and twists. They correspond to the eyebrows and jaws moving, one lip lifting, then the other. Being able to close one eye. Through this complex switching system, we could 'play' a facial performance and do takes on the face just like we did takes on the body."
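
Edlund doesn't spell out how the stored expressions are combined, but one common way to "play" such a library, then and now, is blend shapes: the final face is the neutral pose plus a weighted sum of the differences each captured pose contributes. A toy Python sketch, with invented pose names and weights:

```python
import numpy as np

# A toy blend-shape "saxophone": the final face is the neutral pose plus a
# weighted sum of the differences each captured pose makes to it. Pose names,
# vertex counts, and weights are invented, not Boss Film's actual data.
neutral  = np.zeros((4, 3))                  # toy face: 4 vertices, xyz
brow_up  = neutral + [0.00, 0.10, 0.00]      # captured "keys"
jaw_open = neutral + [0.00, -0.20, 0.00]
lip_curl = neutral + [0.05, 0.00, 0.02]

def play_face(weights):
    """Blend captured poses: neutral + sum(w_i * (pose_i - neutral))."""
    out = neutral.copy()
    for w, pose in zip(weights, (brow_up, jaw_open, lip_curl)):
        out += w * (pose - neutral)
    return out

# A wry half-smirk: a little brow, a touch of jaw, most of the lip curl.
print(play_face([0.3, 0.1, 0.8]))
```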

The first feature film to dabble in facial capture was Casper, which also showcased the first speaking synthespian. Director Brad Silberling had initially hoped facial capture could be used extensively to save drawing time, but he changed his mind. "In the end," says ILM's Dennis Muren, the digital-character supervisor for the Casper project, "we could get a better performance out of an animator than a person." Body capture, which essentially enlarges the scope of what is recorded to include every movement of a human model, was also tried and rejected. Muren traces part of the dissatisfaction with both techniques to the fact that "we were dealing with ghosts - they fly, and they aren't shaped like humans."

For his part, Edlund chose a puppet for Species because "humans didn't have the athletic capability of the character we had in mind. You would have had to hang them from bungee cords, and it would have been a nightmare." Similarly, Jurassic Park and Jumanji were powered by puppets. "Real animals are just too unpredictable," says Johnston of his raging herd. Explains Edlund, "The puppet becomes infinitely manipulatable. We could do 100 takes an hour. It's really facile."

One of the most striking capabilities of digital animation may be its capacity to endow human and nonhuman beings with each other's abilities, traits, and "character." The computer diminishes the importance of natural or inherent differences; each is useful for some things - the animator can pick, choose, borrow, and still end up morphing. As John Dykstra, a visual-effects supervisor who has worked on films from Star Wars to Batman Forever, quite simply puts it, "We're learning the personality of motion."

The money trail

It's no accident that Silicon Graphics Inc., the de facto hardware standard in film imaging, named its top-of-the-line machine the RealityEngine. Machines like these are the motors driving the industry towards a new visual frontier. SGI has also moved to diversify in the past few years, entering into strategic alliances with both ILM and DreamWorks and purchasing the two leading manufacturers of entertainment imaging software: Alias Research and Wavefront Technologies. While all the top digital houses write their own software, virtually all also use popular off-the-shelf programs marketed by these companies, as well as the competing, Microsoft-owned Softimage. (The official line at ILM is that the company's breakthroughs have been achieved using proprietary software, with maybe a little off-the-shelf thrown in. But unless you're some kind of code warrior with access to this jealously guarded material, it is less than easy to know what has made it into the home brew.)

The ground-up engineering of not only programs but new characters - some recognisably human, some not - has a potentially huge upside and promises to shift the balance of power in the film industry. Where visual effects houses were once relegated to the second-class status of "vendor," that is changing as the top shops transform themselves into digital production studios. The new paradigm, and one of the premises on which Digital Domain was founded, would see f/x houses sharing character ownership. Digital characters lend themselves easily to merchandise spinoffs - a strategy Disney is counting on for Toy Story. The synth boom has also stirred up much hype about trading profits. Pixar, Disney's partner in Toy Story, is poised to go public when the movie opens. (Stock in Alias, whose software was used to model Terminator 2's T-1000, actually dipped after the movie came out.) A boutique digital effects firm, Kleiser Walczak Construction Co., believes there is enough of a future here to have trademarked the word "Synthespian."

"Digital content is a return-on-assets gold mine, because once you create Terminator 3, the character, it can be used in movies, theme-park rides, videogames, books, and educational products," said Lucie Fjeldstadt, then IBM new business chief, in early 1993 shortly after her company made a bold strategic investment with Cameron, Stan Winston, and Scott Ross in Digital Domain. "Not only that, they're reusable assets. You can take that hand, or any part of that body, and put it on another part. Change it a little, and it becomes new. Think digital back lot." Or, think action-figure heaven. The numeric controls used in designing computer-generated characters could easily be ported to factories milling plastic toys - a fact that has given pause to computer artists operating under the naive assumption they were designing film elements, not Toys R Us inventory. Reportedly simmering on the burner at DreamWorks is a huge, 600-shot film called Small Soldiers. Hear that assembly line hum!

Once upon a time, actors were human

"I'm not afraid of the technology," said Tom Cruise - the highest profile actor to go public with his views on the digital revolution - addressing the audience at the first Artists Rights Digital Technology Symposium back in April 1994. "I think it is important not to restrict the creative aspect of what digital can do and to keep that growing. But in terms of limiting the use, in terms of redefining who we are ... I don't want anybody else playing the roles I play, and I don't want to play anybody else's roles." Cruise called for the establishment of laws to govern the new territory, although he expressed concern over whether such a delicate task could safely be entrusted to the federal government, which "doesn't understand what this technology is capable of doing." Summing up, he said, "It's quite terrifying."

Though Cruise may be unusual among fellow actors in the degree to which he's weighed the ramifications of digitisation, many instinctively share his trepidation. Information about imaging technology is blowing through town like a chill wind; people are talking about the parallel to the revolution of sound, which broke many a career in the '30s. Aside from the immediate fear of being "replaced," issues of rights and ownership loom ominously. The Tom Cruises and Tom Hankses of the world may always have negotiating clout, but thousands of others are probably not just paranoid in visualising themselves as tomorrow's "digital assets," techno plankton that will be swallowed up by the studio sharks swimming hungrily toward the new media future.

Who will own the digital databases? In the past, all imagery has been the property of the copyright holder, generally the production entity - in other words, a motion picture studio. In the future, who can say?

Screen Actors Guild rules prohibit the reuse of actors' images if it would substitute for hiring the actor, but that's small protection. It's reasonable to speculate that the studios will negotiate for ownership of digital rights, either during an actor's lifetime or posthumously. Already a grey market exists in the reuse of body parts, either to augment existing actors, à la Robert Patrick in T2, or to construct, à la Scott Billups, new ones.

"The big challenge is going to be in detecting it," says Joseph J. Beard, a law professor at St. John's University School of Law in New York. Why bother with cyberborrowing? "Because, if you already have the data, it's cheaper than building new body parts," Beard speculates. Roughly 80 countries - but not the United States - are signatories to something called the Berne Convention, a treaty that offers film artists certain protections. The 1990 Visual Rights Act, however, does extend some safeguards to painters, photographers, and sculptors under US law.

"We're cheap actors is what we are!" quips ILM animation supervisor Steve Williams, the man behind The Mask. Not so cheap, as a matter of fact. Per second of screen time, Cameron estimates, it cost more to cast the digital T-1000 than to hire Arnold Schwarzenegger.

While the corporate overlords may be conspiring to capitalise on digital, the creative community is making soothing noises. "Synthetic characters are fine, as long as they don't take work away from real actors," says Spielberg, adding with a laugh, "I couldn't find an actor to play a dinosaur, so I cast it in the computer."

But some aspiring actor is never going to get the chance to play a young Sean Connery. Connery will be digitally "youthified" in a movie called Do Not Go Gentle, which will reportedly be directed by Dragonheart helmsman Rob Cohen. It's about an aged astronaut struggling with the ignominious fact that, on the eve of his moon mission, he got cold feet and was grounded. Flashback! Connery, now 65 years old, will revisit his 30s without having to resort to a stand-in or those generic tricks-that-don't-work like soft lighting and younger-looking clothes. Animators will simply take a scan of his face and tweak off the years.

From there, it may not be a long leap to cyberstardom. Even as you read this, GTE's Marilyn is receiving instruction from an f/x director on precisely how to swing her voluptuous hips and pout her irresistible lips. Billups says that she's a fast learner.

Paula Parisi (pparisi@aol.com) covers technology for The Hollywood Reporter.