Epic Games shows off lifelike "digital humans" with Star Wars and Andy Serkis demos

At GDC today, the Unreal Engine creator demonstrated how real-time raytracing will help games finally cross the uncanny valley

The introduction of real-time raytracing will move Epic Games one step closer to crossing the uncanny valley, with the engine company using GDC as a platform to showcase the latest techniques for creating "digital humans."

Epic's annual State of Unreal GDC session arrived the same week that Nvidia revealed its RTX raytracing technology, and Microsoft announced it would add the same to DirectX 12 with DXR. Both companies touted the huge improvement to 3D graphics that would result, and a significant chunk of State of Unreal was devoted to doing the same.

"They only came up with these acronyms last week," CTO Kim Libreri told GamesIndustry.biz earlier this week, an excited smile spreading across his face. If Libreri seemed excited, though, Tim Sweeney was nothing short of ecstatic, exclaiming the words "It's real-time raytracing!" with the sort of unrestrained glee not normally associated with Epic's gently spoken CEO.

According to Sweeney, the advent of real-time raytracing in games is "a breakthrough that we've been waiting on for 20 years now", and one that happened in the film industry as far back as The Matrix Reloaded in 2003. In the era of the first Toy Story, Libreri added, the most common technique for rendering images was rasterization, and that is still the basic idea used in games today. With raytracing, it's possible to create much more accurate reflections, giving objects a definition and dimension that isn't possible with rasterized images. It also allows for a "more intuitive way of lighting", one that closely resembles the lighting used on real movie sets and actors.
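To make that distinction concrete, here is a minimal sketch - written for illustration, not code from Epic or Nvidia - of the step a ray tracer performs where a rasterizer only approximates: computing a true mirror-reflection direction and following it back into the scene, rather than reusing whatever happens to be visible on screen. The vector type and the view and normal values are assumptions made for the example.

```cpp
// Illustration only (not Epic or Nvidia code): the mirror-reflection step
// that lets a ray tracer show off-screen and occluded objects in reflections.
#include <cstdio>

struct Vec3 { float x, y, z; };

Vec3 sub(Vec3 a, Vec3 b)    { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
float dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Mirror reflection of an incoming direction d about a unit normal n:
// r = d - 2(d . n)n. A ray tracer then intersects r with the whole scene.
Vec3 reflectDir(Vec3 d, Vec3 n) {
    return sub(d, scale(n, 2.0f * dot(d, n)));
}

int main() {
    Vec3 view   = {0.0f, -0.7071f, 0.7071f}; // assumed incoming view direction
    Vec3 normal = {0.0f, 1.0f, 0.0f};        // assumed floor normal
    Vec3 r = reflectDir(view, normal);
    std::printf("reflection ray direction: (%g, %g, %g)\n", r.x, r.y, r.z);
    return 0;
}
```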

"You don't see these infinitely small lights with sharp shadows," Libreri said of film sets, which he knows from his time working at LucasArts. "You see these beautiful, soft, diffuse lights that they use - you need raytracing to be able to do that."

Nvidia approached Epic many months ago, proposing a collaboration in which the effect of raytracing on video game graphics could be demonstrated to the world. Libreri contacted his friends at Lucasfilm and ILMxLAB, and the result is "a little sketch" set in the Star Wars universe, created in the Unreal Engine.

"We just wanted to show the world what would be possible, and in the next year we'll follow with what we're actually going to do in the engine," Libreri told us of the short film, Reflections. "It's amazing quality for a game engine, and it all runs in real-time.

"[Raytracing] brings a new dimension to it; it almost looks 3D, even though you're seeing it on a 2D screen."

The effect is undeniably impressive, and the demo shown at the State of Unreal session showcased a variety of sophisticated lighting effects and their impact on reflective surfaces. The result is a much more powerful sense of a realistic world, in which objects have definition and position and weight.

Reflections runs on an Nvidia DGX Station, a very powerful workstation with four GPUs; evidence of Libreri's admission that "the hardware has to catch up a little bit" before we can expect to see such visual fidelity in an actual game. The goal of the piece, though, is to demonstrate that "this is going to happen - gaming is going to change."

"Parts of that will come much sooner," Sweeney added. "Parts of this will come in the next couple of years. It's moving very quickly now."

"Honestly, between five and ten years from now, I don't think you're going to be able to tell the difference between the real and the virtual world," Libreri continued. "You'll see hardware that can support these kinds of capabilities pretty shortly, and then, finally, the greatest blockbuster with the most complicated effects, within ten years, you'll be able to do that in real-time."

The point was further proved with two demos, both of them showing progress made in creating "digital humans" since the live Hellblade demo that impressed so many at GDC in 2016. The first was a collaboration between Cubic Motion, 3Lateral, Tencent and Vicon: Siren, a digital human driven by a live actress on the show floor. Attendees can ask the actress questions and see her performance rendered in real-time in the form of a virtual character. As with Reflections, the effect is undeniably impressive, even if it doesn't represent the very near future of games.

"We're pretty happy," Libreri told us. "We're getting close to crossing the uncanny valley. We're not totally there yet, but we're getting close."

Raytracing is key to producing the fidelity evident in Siren, Libreri said; it allows for the kind of lighting that lets a human face look real as it moves, picking out the innumerable quirks and crenellations that make each face distinct. The second demo was even more impressive: a soliloquy from Macbeth, performed by the actor Andy Serkis and captured with a combination of 3Lateral's '4D Capture' scanning technique and the Unreal Engine.

Serkis' delivery was full of rage and sorrow and elaborate facial expressions, all of it captured and rendered in pin-sharp detail. A representative from 3Lateral then showed how the data could be manipulated using an iPad running ARKit: moving the position of Serkis' eyes, raising the eyebrows, and fundamentally changing a performance we'd seen given only a few minutes before. Finally, the visage of an alien was laid over Serkis' performance, resulting in what looked for all the world like a Mass Effect character giving an impassioned reading of Shakespeare.

When these technologies are available to game developers, Libreri pointed out, they will open up new creative avenues for storytelling, in much the same way that Ninja Theory convincingly explored mental illness in Hellblade despite having a limited budget and a small team.

"It's exciting times," he said. "The possibilities are crazy"

Matthew Handrahan

Editor-in-Chief

Matthew Handrahan joined GamesIndustry in 2011, bringing long-form feature-writing experience to the team as well as a deep understanding of the video game development business. He previously spent more than five years at award-winning magazine gamesTM.
