
Lucasfilm will combine video games and movies to axe post-production process

Customise movies while watching
Fri Sep 20 2013, 14:59
Lucasfilm talks up the value of real-time video production

OVER THE NEXT DECADE video game engines will be used in film-making, with the two disciplines combining to eliminate the movie post-production process.

That rather ambitious claim comes from Lucasfilm, the California production company responsible for the Star Wars franchise. Speaking at the Technology Strategy Board event at BAFTA in London this week, the company's chief technology strategy officer Kim Libreri said that developments in computer graphics have allowed Lucasfilm to transfer its techniques to film-making, shifting video game assets into movie production.

Real-time motion capture and the graphics of video game engines, Libreri claimed, will increasingly be used in movie creation, allowing post-production effects to be overlaid in real time.

Real-time motion capture refers to the use of a special suit covered in reflective markers along with specialised cameras, so that computers can calculate the motion of the performer's underlying skeleton and use it to drive a computer-generated character.
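As a rough illustration of the idea - and only as a sketch, not anything Lucasfilm or ILM described in detail - the short Python example below converts a single frame of hypothetical, already-triangulated marker positions into joint angles that could drive a rigged character. The marker names, numbers and the two-marker solver are invented purely for illustration; real systems solve full 3D skeletons from dozens of markers.

```python
# Illustrative sketch only: turning tracked marker positions into joint
# rotations that could drive a computer-generated character's skeleton.
import math

def joint_angle(parent_marker, child_marker):
    """Estimate a joint rotation in the camera plane (radians) from two markers."""
    dx = child_marker[0] - parent_marker[0]
    dy = child_marker[1] - parent_marker[1]
    return math.atan2(dy, dx)

def solve_frame(markers):
    """Convert one frame of marker positions into a simple arm pose."""
    return {
        "upper_arm": joint_angle(markers["shoulder"], markers["elbow"]),
        "forearm": joint_angle(markers["elbow"], markers["wrist"]),
    }

# One frame of hypothetical, already-triangulated marker positions (metres).
frame = {"shoulder": (0.0, 1.5), "elbow": (0.3, 1.3), "wrist": (0.6, 1.35)}
pose = solve_frame(frame)
print({joint: round(math.degrees(angle), 1) for joint, angle in pose.items()})
```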

Extracting and visualising these performances in real time enables interactive virtual production and allows shots to be framed through virtual lenses on virtual scenes.

Apparently this technology will provide the means to do away with the post-production process.

"Everyone has seen what we can do in movies, and I think most people will agree the video game industry is catching up quite quickly, especially in the next generation of console titles. I'm pretty sure within the next decade, we're going to see a convergence in terms of traditional visual effects capabilities - [such as] making realistic fire, creatures, and environments - but working completely interactively," Libreri said.

"We think that computer graphics are going to be so realistic in real time computer graphics that, over the next decade, we'll start to be able to take the post out of post-production; where you'll leave a movie set and the shot is pretty much complete," Libreri said.

Lucasfilm is confident in the concept because it has been testing it through a series of prototypes created with the team at its motion picture visual effects company, Industrial Light & Magic (ILM).

The first was a short film created in eight weeks, with Lucasfilm and ILM working together to heavily modify the Lucasarts gaming engine. They changed the rendering techniques to produce a video that wasn't rendered in the traditional visual effects way at 10 hours a frame, but generated at 24 frames a second - that's 41 milliseconds per frame, generated on a games engine with a lot of games hardware.
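For context, the 41 millisecond figure follows straight from the 24 frames per second target. The small Python sketch below works through that arithmetic and the resulting speed-up over a 10-hour-per-frame offline render; the figures are the ones quoted above, while the calculation itself is only a back-of-the-envelope illustration.

```python
# Back-of-the-envelope arithmetic for the frame times quoted above.
offline_seconds_per_frame = 10 * 60 * 60   # 10 hours per frame, traditional VFX rendering
realtime_frames_per_second = 24            # cinema frame rate targeted by the game engine

realtime_ms_per_frame = 1000 / realtime_frames_per_second
speedup = offline_seconds_per_frame * realtime_frames_per_second

print(f"Real-time budget: {realtime_ms_per_frame:.1f} ms per frame")  # ~41.7 ms
print(f"Speed-up over the 10-hour render: {speedup:,}x")              # 864,000x
```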

"The prototype was a film created on a games engine and a vision statement for where ILM would like to go in the future, and at the same time how Lucasfilm is getting into the same generation of console hardware," Libreri said.

After the prototype movie, Lucasfilm and ILM worked on a Lucasarts Star Wars video game project called 1313, which was shown off at the E3 gaming conference in 2012. The game was in development for around two years using Nvidia gaming hardware, before it was cancelled when Lucasarts was shut down by Disney in April this year. However, 1313 has been used by Lucasfilm to demonstrate real-time motion capture, giving it the confidence to believe that video game engines could be used in movies and could one day replace the post-production process.

"I think that the current way that we make movies is very pipeline stage process, takes away a little bit of the organic nature of a movie set or real environment. I'm hoping real time graphics technology brings back the creative possibilities that we have in the real world," Libreri said.

"Let's not dismiss the artistry you put into a final shot, we do spend a lot of time steadily tweaking blooms and lens flares or the lighting in a shot, but we'll be able to get a lot closer so that more run of the mill windows replacements will be created interactively on stage."

Lucasfilm believes that over the next ten years, this concept of exchanging assets between movies and video games will also pave the way for viewers to customise movies in real time.

Libreri used the example of a future animated Disney film that could be streamed live to an iPad from the cloud, allowing anyone who watches it to customise it - changing the costumes of the princesses, or putting a friend of their own in the background.

"There's so many things that you can do with the fact that video graphics is going to be real-time and not this post-process that we've had traditionally," he added.

"If you combine video games with film-making techniques, you can start to have these real deep, multi-user experiences. Being able to animate, edit and compose live is going to change the way we work and it's really going to bring back the creative experience in digital effects.

To wrap up, Libreri showed off a video demonstrating Lucasfilm's "performance capture stage" driving the game engine for 1313. The video shows the possibilities of this converging world of video games and movies and can be viewed below. µ

 
