There are myriad reasons to be excited about a new generation of hardware. For players, it might be the prospect of new entries in a beloved series or easier opportunities to play co-op with their friends.
But on the development front, a new generation means new tools to explore, which can bring potential solutions to the current bottlenecks content creators are facing.
"The real-time rendering needs of video games mean that we don't have access to a lot of the pre-render processing power that something like film can access," says Corinne Scrivens, artist at VR developer Polyarc. "In addition to this, VR's need to render two screens means we have even fewer resources to pull from. The hardware and engine improvements give us the capability to do more. It lets us add more to the worlds we create and leverage the advantages of virtual reality to make them even more immersive."
"One of the core goals of Unreal Engine is to develop features that make the creation of high-quality 3D content much simpler"
Enabling the creation of more immersive worlds is exactly the challenge Epic Games set out to tackle when it started working on the latest version of its game engine. The company showcased Unreal Engine 5 earlier this year, with a real-time gameplay demo running on PlayStation 5. And making things easier for artists was at the heart of Epic's vision.
"One of the core goals of Unreal Engine is to develop features that make the creation of high-quality 3D content much simpler," says Epic Games' product director Arjan Brussee. "To create compelling virtual worlds across all kinds of consumer hardware ranging from phones to upcoming new consoles, more high-end 3D content that can 'scale' is needed. This process needs to become much more automated and efficient, and we are working on a variety of solutions to reduce the cost and time needed to make photorealistic, and also stylised, scalable 3D content with any sized teams."
Among Unreal's upcoming tools to make things more efficient for artists is Nanite's virtualized micropolygon geometry, which provides massive visual quality improvements through increased polygon detail. Lumen's fully dynamic real-time global illumination immediately reacts to scene and light changes, removing the need to wait for lightmap bakes to finish or to author lightmap UVs. Plus, Unreal Engine now has a full suite of built-in animation tools, such as Control Rig and Full Body IK, to create sophisticated animation much faster.
"The quality that can be achieved by any team is now much higher than it has ever been before," Brussee continues. "[Thanks to] Niagara particles and visual effects, our Chaos physics and destruction system, strand-based hair and fur, nondestructive landscape creation and editing tools, a physically-based SkyAtmosphere component, and more."
With each new generation, access to development tools becomes easier, and the next one should be no exception, Scrivens reckons. It should have the domino effect of making the whole industry much more accessible to newcomers as well.
"The quality that can be achieved by any team is now much higher than it has ever been before"
"Back when I started learning 3D, the cost of programs and the difficulty of finding relevant information made it not very accessible to learn," she says. "Now I can download a program like Blender for free and learn up-to-date workflows on how to model, rig, animate or whatever else I want to know from a quick search on YouTube. It's an exciting time to be a new artist."
Maintaining the growing accessibility of games tech is very much at the core of Epic's vision too, with executive producer Julie Truong saying that the company's technology "is and will remain freely available on the internet."
"[This includes] the source code and a lot of learning resources and sample projects so that anybody can tap into it," she adds. "We are very focused on usability and abstracting the complexity of game creation; with tools such as Blueprint or Niagara, artists can achieve a lot on their own.
"And as the underlying technologies are getting much more complex, it's actually possible to make amazing fidelity content without deep knowledge of underlying hardware constraints."
Looking back over the past 20 years of 3D game development, Brussee points out that a lot of time has been spent making sure assets work on target hardware, with all kinds of tricks used to work around limitations.
"Tricks such as LOD creation, destructive 'baking' of high fidelity meshes down to models with bump/normal maps, packaging assets to stream, and so on. Now, we're on the verge of not having to do this anymore as artists and developers will be able to throw photorealistic assets in a level, and the rendering and streaming part of Unreal Engine 5 will do the heavy lifting to make it all work."
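The LOD chains Brussee mentions are exactly this kind of hand-authored scalability work: artists build several versions of each mesh at decreasing detail, and the engine picks one at runtime based on distance. As a rough illustration of that legacy workflow (the function name `selectLod` and the thresholds are hypothetical, not Unreal API), a classic distance-based LOD pick looks like:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical sketch of traditional distance-based LOD selection.
// Given the camera-to-object distance and ascending distance thresholds,
// return which pre-authored mesh LOD to render (0 = full detail).
std::size_t selectLod(float distance, const std::vector<float>& thresholds) {
    std::size_t lod = 0;
    for (float t : thresholds) {
        if (distance < t) break; // close enough for this detail level
        ++lod; // farther than this threshold: drop one level of detail
    }
    return lod;
}
```

Every one of those LOD meshes has to be authored (or generated and checked) by hand, which is the cost Nanite's virtualized geometry is meant to eliminate by letting the renderer scale detail automatically from a single high-fidelity source asset.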
Next-gen hardware should push photorealism closer than ever to what the eye can see in real life, potentially resolving a decades-old issue: the uncanny valley.
"With the new graphics cards coming out and advances in rendering, I think we will definitely be seeing characters that look more photoreal than ever before," Scrivens says. "The trick with removing the uncanny valley will be getting animations and deformations to match the visual fidelity of the characters themselves. Correctly portraying how a smile raises the cheeks to cause a slight happy squint in the eyes, or how a downturned brow changes the upper eyelid and causes deformation that can radiate up a forehead, will be more important the more realistic the character becomes."
Brussee adds that, with real-time rendered 3D scenes, we're already getting very close to crossing the uncanny valley, as shown by more and more movies and TV shows that are using "real-time generated 'final pixels' which are indistinguishable from reality." And Epic Games is of course exploring more solutions in this area.
"On animation and dynamics, we're doing a lot of development in areas such as our Chaos physics simulation system, from destruction, cloth physics and fluid simulations to simulating realistically styled and moving hair," Brussee continues. "We're doing a huge amount of R&D with our 3Lateral and Cubic Motion studios on raising the quality bar for digital humans to get through the uncanny valley."
The only caveat to a new generation of powerful machines is the potential for higher art costs.
"Yes, that's a fundamental issue here that we are working on mitigating," Truong replies. "With photogrammetry, the cost of creating photorealistic assets is much lower and with Quixel Megascans, these assets are game ready. Our Quixel Mixer tool can be used to art direct and stylise these assets to fit any game. A lot of ready-made content is also available within the Unreal Engine Marketplace."
Overall, it's an exciting time for games technology, with development tools also becoming more widely used in TV and film. There's a lot these industries can learn from each other and we could be seeing games converging with other entertainment forms more and more as the next generation progresses.
"We certainly feel that Unreal Engine is at the forefront of pushing this convergence," Brussee says. "With the upcoming UE5 tools, as demoed in 'Lumen in the Land of Nanite', we are now able to take movie-quality assets and not just run them on a high-end PC for offline post-process movie-production, but actually run them on consumer hardware in real-time. The dream is that there will be cross-media projects sharing assets between movies and games, and beyond, without recreation or manual optimisation of content."
Want to hear more about what next-gen gaming will mean for developers and players? Watch the November 11 episode of The Pulse, a video series hosted by Epic Games. Games journalist Brian Crecente got together with Epic's Zak Parrish and the developers of Kena: Bridge of Spirits and Kine to share their insights. Watch the episode now: 'The Next-Gen Gaming Revolution'