Fun and Games

March 30th, 2019 3D Artist, Film, Tech, Tech Features

When you consider the effect game engines will have on moviemaking, one term that might spring to mind is ‘flattening’.

Not having to wait for render farms to spit out shots to review (and change – or not, when you’ve finally run out of time or budget) means the production is ‘live’ from day one. It blurs the lines between pre, post and everything in between, flattening the whole workflow into one long creative input party.

Real-time VFX rendering in game engines actually seems like the perfect way to design and build a CGI world for cinema screens. Where a single frame in a movie has traditionally taken hours or days of data crunching, games have to do it all in milliseconds as a player navigates and looks around an entire virtual world, doing it constantly with every frame refresh.

Deploying a game engine-defined universe completely upends the VFX workflow where a character designer hands a shot off to a lighting artist, who then hands it off to a colour grader, etc. In fact it flattens the whole pipeline so much it’s not unusual for shots constructed for previz to find their way into final frames.

It’s actually a bit like the old way of doing things where you build a set, dress and design characters and then figure out the best way to light it and shoot the action in it. The only difference is it’s all virtual.

Game engines have the potential to do away with the notion of upstream/downstream VFX workflows completely. Think of working with far-flung colleagues in a Google Docs or Office 365 file, everyone’s work assimilated and saved every time a change is made.

Building the environment of your movie – maybe just the set of a single scene – lets everyone dive in and do their part concurrently, all of it rendered in real-time just like playing a video game. It solves what Tom Box, co-founder and general manager of London animation studio Blue Zoo, calls ‘a big conveyor belt of stages where any stoppage causes a bottleneck further down the line.’

“When you’re looking at a preview monitor or in some cases seeing things on set in real-time, you’re able to make all the creative judgments at the same time,” adds Rob Bredow, Executive Creative Director and Head of Industrial Light and Magic (ILM).

Systems with benefits

A good example of having the real environment available to everyone comes from the Millennium Falcon flight sequences ILM built for Solo: A Star Wars Story. A rear projection movie screen was set up in front of the cockpit set so the actors were actually looking at the full resolution render of the jump to hyperspace, affecting and heightening their performances.

The advantage was what Bredow calls ‘better representation earlier in the creative process.’ If the Solo actors had been looking at a green screen for ILM to paint out and insert the VFX sequence later it would have been fine (and it’s the way we’ve been doing VFX for 30 years) but having the final render right there changed and even influenced the shots themselves.

For example, instead of cutting from the actors to the iconic hyperspace starfield over their heads like every other Star Wars film has done, the camera panned slowly from the space outside onto actor Alden Ehrenreich’s face. “You could actually see the reflection of hyperspace in [his] eye,” Bredow says. “You get a different response when you have more elements available.”

Extrapolate that further and you can surround the actors not with green screens but with projections of the final render of the world around them as they work. As Kim Libreri, CTO of Unreal Engine publisher Epic Games, puts it: “Instead of animating shots in a car chase you can drive a virtual car around a scene, filming the action just like you would in the real world but iterating and reshooting it as many times as you need.”

As you can imagine, it stands to make the whole process quicker and cheaper, but to most creatives it’s about the potential for better storytelling. Box says he’s seen productions go from eight versions of a shot per day to 100 using real-time technologies.

Libreri agrees, saying it will simply make moviemaking ‘better’ rather than just cheaper – especially with the capability to integrate technology like performance capture and VR exploration directly into the Unreal Engine.

But Isabelle Riva, Head of Made with Unity at the game engine’s publisher, has also seen quantifiable results from real-time VFX, saying short films made using Unity have seen time savings of up to 50 percent.

Then there’s the data portability. Make a hit superhero movie these days and a raft of sequels and licensed products, from the video game tie-in to the Netflix series, is a certainty.

You can spend a fortune on every project engaging a new crop of VFX vendors whose only reference material is the finished movie, or simply export and share your game engine data – the landscapes, weather behaviour, populations of characters and every other detail – with the relevant company and save them (and your client) a bundle.

For a company like ILM, sitting on 40 years of materials and data about Star Wars worlds and characters, such a system is critical – and the company has its own framework to maintain and share it.

But even at a studio like Blue Zoo there’s time and effort to be saved. Box talks about several TV projects the company has worked on that have attendant game titles. The workflow calls for two character rigs suited to each output, the original animation simply exported and mapped to each one. In a multi-platform world, Box and his team can find ‘massive’ efficiencies.

Beyond that, the benefits are nearly endless. Sketch out dangerous stunt sequences (‘stuntviz’) to make sure they match what the director wants before committing resources to setting up the sequence – 2015’s Mad Max: Fury Road used the process extensively (https://www.unrealengine.com/en-US/blog/virtual-production-lights-cameras-action). Let a director walk through the fully rendered world using VR to find the shots he or she wants just like a gamer, and much more.

Good enough for…

The big question mark looming over all this is whether game engines produce high enough visual quality for cinema. Despite the jaw-dropping graphics in big name videogames, they’re still not quite as photoreal as cinema VFX.

Except that (depending on what you read) they are. Epic Games claims Unreal Engine can not only deliver what it calls ‘the right visual fidelity to match the aesthetic style of your creative vision’, it lets you capture footage in formats right up to 8K.

And plenty of VFX companies have already dipped their toes into the game engine waters on feature films. As far back as Rogue One: A Star Wars Story, as many as 12 shots of the heroic reprogrammed droid K-2SO (played in motion capture by Alan Tudyk) were done using real-time VFX and composited into scenes afterwards.

Riva points to a bevy of award-winning short films made in the Unity engine, and a story on production technology website Redshark as long ago as 2014 said we could ‘nearly’ make movies with game engines.

Tom Box says we’re now at that crossroads. “The technology is matching what you traditionally take hours to render for the first time because of the increase in what the graphics cards can do,” he says.

There might be a few final frontiers like realistic humans and water effects, but it’s all coming to a very exciting head this year when the 2019 Unreal Engine release will include real-time ray tracing, something Box calls the ‘holy grail’ and which new-generation GPUs will make mincemeat of. Epic uploaded the short film Reflections last March to showcase what would soon be possible.

But it’s also true that as technology improves, we ask more of it. Bredow says today’s limitations are among the things he’s been concerned with for about five years. “Take simple surface shading, which we can do very well on the GPU today,” he says. “The inner reflections and complicated shadowing we expect in a photoreal modern feature film are still pretty challenging.”

Not if, when

So when does that mean we might see the world’s first global blockbuster made completely with real-time VFX? Bredow doesn’t think it’s years away anymore (but he admits he’s been saying that for 10 years), and Libreri says we’ll see ‘tremendous’ adoption over the next five years.

Inasmuch as the industry as a whole can have collective intent, at the moment it’s just to let the technology slowly catch up to and surpass existing and established workflows as people see the benefits.

But whenever it happens, game engines are set to enjoy a second life in a market that might meet or eclipse the influence they’ve had so far in our game consoles. As Riva says, “From a publishing standpoint films and games are apples and oranges, but from an authoring and technical perspective they’ve been converging for a long time. In film, we want to revolutionise the creative process by making a complex production chain simple.”

Boxout – Everybody’s talking about it, but who’s doing it? Adoption of game engine technology in movies.

Epic Games has been selling filmmakers on using the Unreal Engine since 2014, and the company’s Kim Libreri says its adoption in areas beyond games has ‘skyrocketed’. Tom Box of Blue Zoo says studios are excited about seeing what game engines can offer, but that ultimately it’s going to be up to the creative arbiters of filmmaking – directors – to want to use them.

Isabelle Riva of Unity says VFX vendors are also starting to make the approach from their side to see how game engines can help. In many ways, she sees adoption as a process of change management rather than technology. “Given the complexity of a feature film there are groups at many levels that have to manage how new workflows impact them and change how they do their jobs,” she says.

Among the biggest changes will be the point at which artists join a project. VFX won’t just come in after principal photography any more – in a world without notions of post- or pre-production, CG assets need to be built in advance, and everyone from the DP, art director and production designer on down will need to be very digitally savvy, building virtual props and sets alongside real ones.

