
ON THIS PAGE:

Getting more for less
Optical flow
The all-in-one solution
Invisible effects
Blurring the lines
The end of film?
Motion capture

Digital Effects

by David Fox

The rapid progress of digital effects technology means that the mantra "we'll fix it in post" can now apply to even low-budget film and TV productions. [This article was written midway through 2000].

Getting More For Less

"The biggest, most important thing for me is the rate of change of processor speeds," says Dave Throssell, animator, The Mill/Mill Film, who works mainly on commercials. Ten years ago a fast SGI machine had 25 MHz, today it is 1,000 MHz, 40 times faster (and ten times cheaper). "The faster the processor, the quicker and better the rendering." As they continue to get faster and less expensive, he says this will make it easier to do work impossible before. At the same time, disk storage has increased 100-fold, from 500 MB ten years ago to 100 GB now, "meaning we don't have to worry about storage anymore," he says.

Throssell, who has been working on digital effects for more than 20 years, believes technology is no longer a key issue. Previously, "the software for computer animation was way ahead of the hardware; now it is at the point where the computer is fast enough and cheap enough to make it work," he says. He rates all the major off-the-shelf software packages as "very good and very cheap."

He believes one of the biggest developments has been the advance of motion tracking, making it easy to precisely locate individual elements and stick them together. "Everyone can now do good compositing and good tracking," he says. With the latest virtual camera tracking, you "can very easily drop things into live action plates and they don't slide around everywhere."
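
To make this concrete, here is a minimal sketch of 2D tracking and corner-pinning, assuming OpenCV and NumPy, with placeholder file names and hand-picked reference points (it is an illustration under those assumptions, not The Mill's actual toolset): four points on the plate are tracked from one frame to the next, and an inserted element is warped to follow them so it stays locked to the live action rather than sliding.

# Sketch: lock a 2D element to a live-action plate by tracking four
# reference points frame to frame and corner-pinning the element to them.
# Assumes OpenCV + NumPy; file names and point positions are placeholders.
import cv2
import numpy as np

frame_a = cv2.imread("frame_a.png")      # plate, frame N
frame_b = cv2.imread("frame_b.png")      # plate, frame N+1
element = cv2.imread("element.png")      # graphic to drop into the plate

gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)

# Four hand-picked reference points on the plate (pixel coordinates).
pts_a = np.float32([[100, 120], [300, 115], [310, 260], [95, 255]]).reshape(-1, 1, 2)

# Track the points into the next frame with pyramidal Lucas-Kanade.
pts_b, status, err = cv2.calcOpticalFlowPyrLK(gray_a, gray_b, pts_a, None)

# Corner-pin: map the element's corners onto the tracked points.
h, w = element.shape[:2]
corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
H = cv2.getPerspectiveTransform(corners, pts_b.reshape(-1, 2))
warped = cv2.warpPerspective(element, H, (frame_b.shape[1], frame_b.shape[0]))

# Crude composite: wherever the warped element has non-black pixels, use them.
mask = (warped.sum(axis=2) > 0)[..., None]
comp = np.where(mask, warped, frame_b)
cv2.imwrite("comp.png", comp)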

Talking animals, as in Babe 2 and Animal Farm, have also been a big growth area, but The Mill's other main work is set extension and set creation, such as on Ridley Scott's Gladiator, for which Mill Film built a Colosseum full of people, only a small portion of which was real. It also built Rome as a digital matte painting in Photoshop. Combining this with 3D techniques, with people walking around, makes Rome look very realistic, he claims.

On the 2D side, crowd replication, in which a small number of extras is shot in separate passes with moving cameras and then composited together, makes big battle scenes believable on limited budgets. "Competition in digital film is growing rapidly, and producers want and get more for less," he says.
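
As a rough illustration of the compositing step (an assumption-laden sketch, not The Mill's pipeline), suppose each pass of extras has already been keyed into an RGBA image; each pass is then laid over the background plate with the standard "over" operation at several offsets, so a handful of extras reads as a crowd. File names and offsets below are placeholders.

# Sketch: crowd replication by layering a few pre-keyed passes of extras,
# each with its own alpha matte, over a background plate.
# Assumes NumPy + OpenCV; file names and offsets are placeholders.
import cv2
import numpy as np

def over(bg, fg_rgba, x, y):
    """Composite an RGBA foreground over bg at offset (x, y), clipping at the edges."""
    fh, fw = fg_rgba.shape[:2]
    region = bg[y:y + fh, x:x + fw]
    rh, rw = region.shape[:2]                     # actual area inside the frame
    fg = fg_rgba[:rh, :rw, :3].astype(np.float32)
    alpha = fg_rgba[:rh, :rw, 3:4].astype(np.float32) / 255.0
    bg[y:y + rh, x:x + rw] = (fg * alpha + region.astype(np.float32) * (1.0 - alpha)).astype(np.uint8)
    return bg

plate = cv2.imread("battlefield_plate.png")
passes = [cv2.imread(f"crowd_pass_{i}.png", cv2.IMREAD_UNCHANGED) for i in range(3)]

# Repeat each small group of extras at several offsets to fill the frame.
for i, layer in enumerate(passes):
    for x in range(0, plate.shape[1], 400):
        plate = over(plate, layer, x, 50 + 120 * i)

cv2.imwrite("crowd_comp.png", plate)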


Optical Flow

Frank Kitson, 3D technical director at the Kodak-owned Cinesite, worked on Mission Impossible II. He believes the main long-term trend in special effects is the hybrid use of live film effects and 3D technology. This mixes in "a lot of the traditional skills of practical effects and practical photography," with an attitude that no longer cares how an effect is achieved. "3D can't be isolated. It has to work with traditional film making to get the best results," he says.

He sees the most significant advancement in 3D as global illumination, which allows you to capture the environment, deconstruct it, put all the attributes in and light it using radiosity or other effects, then put it back together.
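
As a toy illustration of the radiosity part of that idea (the patches, reflectances and form factors below are invented, and a production renderer is far more involved): the environment is broken into patches, and light is bounced between them through a form-factor matrix until the values settle.

# Toy radiosity solve: B = E + rho * (F @ B), iterated to convergence.
# Patches, emission, reflectance and form factors here are made-up values;
# a real scene would compute F from the captured geometry.
import numpy as np

n = 4                                   # number of surface patches
E = np.array([10.0, 0.0, 0.0, 0.0])     # only patch 0 emits light
rho = np.array([0.0, 0.8, 0.5, 0.7])    # diffuse reflectance per patch
F = np.array([                          # form factors: how much of patch j's
    [0.0, 0.3, 0.3, 0.3],               # light reaches patch i (rows sum < 1)
    [0.3, 0.0, 0.3, 0.3],
    [0.3, 0.3, 0.0, 0.3],
    [0.3, 0.3, 0.3, 0.0],
])

B = E.copy()                            # start with direct emission only
for _ in range(100):                    # Jacobi iteration; converges because
    B_new = E + rho * (F @ B)           # rho times the row sums of F stays below 1
    if np.max(np.abs(B_new - B)) < 1e-6:
        break
    B = B_new

print("patch radiosities:", np.round(B, 3))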

This flows from one trend started by The Matrix, that of "reverse engineering reality to produce more realistic effects," such as extracting the lighting model existing in real elements, then enhancing it and being able to play with it "to produce something that didn't exist before," he says.

Kitson is doing a lot of work with "optical flow", a process of examining an image or series of images and tracking the behaviour of all the pixels, so you can do different things with them. For example, it allows you to lift objects off a background without using a blue screen, or to use the physical properties of an explosion to ripple through the wings on a character to make them look more realistic.
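
A hedged sketch of the first of those uses, with OpenCV's Farneback dense flow standing in for whatever Cinesite actually used (file names and the motion threshold are placeholders): the per-pixel motion between two frames is converted into a soft matte, so a moving subject can be pulled off a static background without a blue screen.

# Sketch: use dense optical flow to pull a soft matte of whatever moves
# between two frames, instead of keying against a blue screen.
# Assumes OpenCV + NumPy; file names and the motion threshold are placeholders.
import cv2
import numpy as np

prev = cv2.imread("frame_010.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_011.png", cv2.IMREAD_GRAYSCALE)

# Per-pixel (dx, dy) motion vectors from Farneback's dense flow.
# Arguments: pyramid scale, levels, window size, iterations, poly_n, poly_sigma, flags.
flow = cv2.calcOpticalFlowFarneback(prev, curr, None, 0.5, 3, 15, 3, 5, 1.2, 0)

# Motion magnitude -> soft matte: static pixels stay black, moving ones go white.
mag = np.linalg.norm(flow, axis=2)
matte = np.clip(mag / 2.0, 0.0, 1.0)              # ~2 px of motion = fully opaque
matte = cv2.GaussianBlur(matte, (5, 5), 0)        # soften the matte edge slightly

cv2.imwrite("motion_matte.png", (matte * 255).astype(np.uint8))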


The All-in-one Solution

As the technology becomes more sophisticated, will it mean that users will no longer have to specialise? Kitson thinks so. "In some ways, there is a lot of blurring of those traditional lines." He believes that "the days of just having people doing one thing will end. Compositing will become so easy. Being optical, you can create such sophisticated edge detection that it can produce edges where resolution is no longer an issue."

However, while the digital workstations may in future be able to do everything, Throssell believes people will still specialise, whether in editing, compositing or 3D, and maintains that demarcation between jobs is a good idea, "because everybody knows who is responsible for what." When he does some compositing himself, he makes sure a specialist compositor oversees it.

However, he believes special effects people, and post-production staff in general, will be working a lot more on location, especially with 24p and digital video, as they can easily edit or check effects on set - if that's how the director wants to work.


Invisible Effects

So widespread has the use of digital effects become in film that viewers hardly notice them anymore - mainly because they are not supposed to. Invisible effects, such as set replication, can both enhance the story and reduce costs, according to Alex Patrick Smith, managing director of the MAX Post Production Alliance, a partnership of five independent digital film companies across Europe.

On Wim Wenders' Million Dollar Hotel, 30 minutes of its two hours uses digital effects, "but you don't see a single effect in the movie," says Patrick Smith. That is because the effects include motion tracking to stabilise long helicopter shots, wire removal, and making broken neon signs light up. "It really enhances the story telling, while cutting costs," he says.

At the Computer Film Company, senior visual effects designer Paddy Eason also tries to ensure that people don't notice his work.

On Tim Burton's Sleepy Hollow, they did about 130 shots (Industrial Light and Magic did the other 260 or so), mainly removing the Headless Horseman's head - "an effect no one seemed to notice, which we took as a backhanded compliment," says Eason. It wasn't just a case of adding background where the stuntman's blue-covered head had been, but also where the head had covered the moving axe and, especially, his high coat collar. They had to create a 3D model of the collar and track it into each shot.

Every Headless Horseman shot also has lightning strikes, which had to be animated by hand, and reflected off the collar and axe. They also had drifting smoke which had to go through where the head had been. In total, there were 100 shots involving either CFC's own compositing system or Cineon, Maya or Houdini, and Painter with Commotion.

CFC also did some (particularly gory and realistic) decapitations, and a few pieces of standard digital effects work, such as replacing sky, painting out vapour trails, removing lighting rigs from long shots - "an example of visual effects being able to save time and money for a production rather than having to go through contortions to hide or remove lighting rigs physically," he says.

The effects used hardly any bluescreen or tracking markers, to make life as simple as possible for the director. Once he got a shot he was happy with, they simply shot a clean pass (to extract the background from), keeping it as close as possible to the original camera move, rather than using motion control. CFC then used auto-tracking software, and where it failed, such as during lightning flashes, they tracked by hand (using trees or buildings to fix to).
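
A simplified sketch of how such a clean pass can be used, assuming the action and clean plates are already aligned and a rough hand-drawn garbage matte limits the repair area (file names and thresholds are placeholders, not CFC's actual system): difference the two plates to find the unwanted object, then fill that region from the clean plate.

# Sketch: remove an unwanted object by patching it from a clean pass.
# Assumes the action plate and clean plate are already aligned (tracked or
# stabilised) and a rough hand-drawn garbage matte limits the repair to the
# area around the object. File names and thresholds are placeholders.
import cv2
import numpy as np

action = cv2.imread("action_plate.png").astype(np.float32)
clean = cv2.imread("clean_plate.png").astype(np.float32)
garbage = cv2.imread("garbage_matte.png", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0

# Difference matte: pixels that differ strongly from the clean pass.
diff = np.linalg.norm(action - clean, axis=2)
matte = np.clip((diff - 20.0) / 30.0, 0.0, 1.0)   # soft threshold, in 8-bit units

# Only repair inside the garbage matte, and soften the edge.
matte = cv2.GaussianBlur(matte * garbage, (9, 9), 0)[..., None]

# Fill the matted region from the clean plate.
patched = action * (1.0 - matte) + clean * matte
cv2.imwrite("patched.png", patched.astype(np.uint8))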

CFC also worked on Mission Impossible II and Aardman Animations' Chicken Run, which, despite being shot on film and output to film, had all 90 minutes go through the computers in between.


Blurring The Lines

Indeed, an increasing number of animated productions are using digital effects, such as the animated movie Miracle Maker (released at Easter), which used a mix of 2D and 3D stop motion and 2D and 3D traditional animation to chart the life of Christ.

The Welsh/Russian co-production was originally envisaged as four half-hour TV episodes, but they decided it would work just as well as a movie.

According to Ed Hawkins, senior compositor at Digital Film, which did the post-production on Miracle Maker, it involved a whole range of techniques, including bluescreen, motion tracking, coping with lighting changes in the middle of a shot, flickering problems with the film, and problems with the puppets (such as rotting from handling by the puppeteers). They also had to digitally create landscapes for wide shots.

"Normally you are making things look real, but here you are making them real only in a puppet sense," says Hawkins.

Computer-generated water plays an important part in the film, as there are lots of fishing scenes, including one with a fish's-eye view. They also used particle effects for rain and dust, and to cope with the transitions between the stop-motion and 2D Animo animation.
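
A bare-bones sketch of that kind of particle effect, with invented frame sizes, particle counts and physics (it is not Digital Film's actual setup): each raindrop is a particle with a position and velocity, stepped forward every frame and drawn as a short streak, producing a rain element that could then be composited over the animation.

# Sketch: a minimal 2D particle system for a rain element.
# Frame size, particle count and physics constants are arbitrary.
import cv2
import numpy as np

W, H, N = 720, 576, 800
rng = np.random.default_rng(1)

pos = rng.uniform([0, 0], [W, H], size=(N, 2))    # (x, y) per drop
vel = np.tile([1.5, 18.0], (N, 1))                # slight wind, fast fall
vel += rng.normal(0.0, 1.0, size=(N, 2))          # per-drop variation

for frame in range(50):
    pos += vel                                    # integrate one time step
    fell = pos[:, 1] > H                          # respawn drops that left the frame
    pos[fell, 1] = 0.0
    pos[fell, 0] = rng.uniform(0, W, size=int(fell.sum()))

    canvas = np.zeros((H, W, 3), np.uint8)        # black background: rain element only
    for x, y in pos.astype(int):
        # Draw each drop as a short streak roughly along its velocity.
        cv2.line(canvas, (int(x), int(y)), (int(x) + 2, int(y) + 8), (200, 200, 200), 1)
    cv2.imwrite(f"rain_{frame:03d}.png", canvas)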


The End Of Film?

With the rise of 24p, which is perfect for special effects, "I don't think we will see film five years down the line, even if some directors disagree," says Throssell.

"We are just at the beginning of understanding what effect digital technology will have on film production," says Patrick Smith, more cautiously, although he believes the acceptance of digital video production for film making is lowering the barriers to entry;

In particular, the success of The Blair Witch Project has given people confidence to shoot films without film, he believes. Wenders' Buena Vista Social Club was shot on Digital Betacam, partly because Wenders only realised once it was shot and edited that it could be a feature film. By using HD post-production at Das Werk (a member of MAX partly owned by Wenders), they were able to make it look better than video before transferring it to film for projection. Besides, even if Wenders had wanted to shoot it on film originally, he wouldn't have got the budget. By shooting on video he could work to a 30:1 shooting ratio, three times what he would have had with film.

Patrick Smith believes film will take time to disappear as an acquisition format, "but now there is a choice: 24p. This looks like offering a real alternative and a way of reducing costs."

Wenders was one of the first to use 24p, using it to shoot a U2 video in Ireland, and Patrick Smith says "it's significantly better than Digital Betacam. If you are viewing it on a big screen you'd probably not know it wasn't shot on film. I see this as something that's really going to move very fast and opens up this sort of film making to a much wider audience."

However, with advances in telecine, such as Spirit, the cost of film scanning is also falling considerably, making it more accessible to go from film to digital post production and back to film, he adds.


Motion Capture

Motion capture specialist AudioMotion worked on both Gladiator, with Mill Film, and Jason And The Argonauts, with Framestore, as well as providing smaller amounts of motion capture data for other productions. It works with full body capture, facial capture and hand capture.

It involves putting a person in a suit or capture apparatus, capturing exactly what they do in the bluescreen studio, and turning the data into a skeleton which can be used for "very realistic human character animation or realistic facial animation," says AudioMotion's head of motion capture, Richard Hince.

Its two optical motion capture rigs (one for the body, the other for the face) are large and expensive, and can track markers on a person or anything else, locating them in 3D space so the computer animation system can take the data and use it to generate models which move realistically.
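
The geometry behind that step can be sketched as follows, assuming just two calibrated cameras with known 3x4 projection matrices (the matrices and marker coordinates below are invented, and a real rig uses many more cameras): a marker seen in both views is triangulated back to a single 3D point, which is what the animation system then binds to a skeleton joint.

# Sketch: recover a marker's 3D position from its 2D positions in two
# calibrated camera views via direct linear triangulation (DLT).
# The projection matrices and image coordinates below are invented values
# in normalised image coordinates.
import numpy as np

P1 = np.hstack([np.eye(3), np.zeros((3, 1))])                  # camera 1 at the origin
P2 = np.hstack([np.eye(3), np.array([[0.2], [0.0], [0.0]])])   # camera 2, offset along the baseline

def triangulate(P1, P2, uv1, uv2):
    """Solve for the 3D point X with uv1 ~ P1 @ X and uv2 ~ P2 @ X."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)       # least-squares solution is the last right
    X = Vt[-1]                        # singular vector, defined up to scale
    return X[:3] / X[3]

# The same marker as seen by each camera.
marker = triangulate(P1, P2, uv1=(0.10, 0.05), uv2=(0.30, 0.05))
print("marker position (x, y, z):", np.round(marker, 3))   # approx (0.1, 0.05, 1.0)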

Computer games currently account for 65 to 70% of its business, because games need a lot of realistic motion, and AudioMotion has captured some 25,000 individual moves for about 200 different games. However, it is also seeing increasing use of motion capture for pop promos, such as one which used transparent 'glass' characters for a dance sequence.

In films, it is usually used for characters in the background. In Gladiator it was used to help populate a 3D set. "It means the character can be moved with the set more easily than if they used 2D characters", says Hince.

"Very realistic [human] characters which are full screen are very difficult," he says, but aliens and other non-human creations are easier (such as some of the characters in Phantom Menace or The Mummy).

They did 180 minutes of motion capture for Gladiator, with six or seven people in shot (as elements to be used as part of the background in scenes). Hince says such multiple-character shots would have been impossible to do with motion capture only a year ago.


© 2000 - 2010


David Fox