Unreal Engine 4.20’s creativity, scalability, and cross-platform game capabilities

With all of the recent Fortnite headlines, it’s easy to forget that Epic Games is also the creator of the Unreal Engine. But today, Epic is launching Unreal Engine 4.20, the latest version of the game engine that powers some of the most realistic games around.

The new version of the entertainment creation tool will make it easier for developers to create more realistic characters and immersive environments for use in games, film, television, virtual reality, mixed reality, augmented reality, and enterprise applications. The new engine combines real-time rendering improvements with enhanced creativity and productivity features.

The release makes it easier to ship blockbuster games across platforms: while building Fortnite, Epic had to create hundreds of improvements for iOS, Android, and the Nintendo Switch, and those advanced features are now available to all Unreal Engine 4.20 developers. I chatted with Epic Games’ chief technology officer, Kim Libreri, about the upgrade and how the firm figured out what it needed to deliver in the engine during Fortnite’s development.

According to Libreri, new features will promote photorealistic real-time visualization across automobile design, architecture, manufacturing, and other non-game enterprise applications. Among other additions, a new level-of-detail system scales a game’s visual fidelity to the platform it runs on, from mobile devices to high-end PCs.

Over 100 mobile improvements created just for Fortnite will be available to all 4.20 users, making it easier to ship the same game across several platforms. There are also tools for “digital humans” that get lighting and reflections convincing enough that it becomes difficult to tell a digital human from a real one.

The same techniques used in Epic’s “Siren” and “Digital Andy Serkis” demos, exhibited at this year’s Game Developers Conference, are now available to all customers. The engine supports mixed reality, virtual reality, and augmented reality, and it can be used to make content for the Magic Leap One and Google ARCore 1.2, among other platforms.

Libreri filled me in on these specifics, but we also discussed Epic’s competition with Unity and how Epic can leverage what it learns from making its own games to try to outdo its rival.

Kim Libreri: We’re talking about Unreal Engine 4.20. The majority of the themes for this release come down to doing what we said we’d do at GDC. This version includes many of the things we showed, demonstrated, and spoke about at GDC this year. Fortnite’s mobile improvements are a big part of it, now that the game is available on iOS and Android. We’ve put in a lot of effort to improve the engine so that you can develop one game and scale it across all of those platforms. All of our customers will benefit from significant improvements in mobile device performance.

One of the reasons for that is the proxy LOD system we added to the engine, which makes intelligent automatic LOD generation possible. It’s similar to what Simplygon does, but with our own twist, and it’s now shipping. Another interesting feature is that Niagara, our new visual effects particle system, is now shipping in a usable state. We’re getting a lot of fascinating feedback on how customers find it and how much it will improve the look of their games’ visual effects. Unreal Engine previously used a system called Cascade, and Niagara will eventually replace Cascade.
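
For developers curious what working with Niagara from gameplay code can look like, here is a minimal sketch of spawning a Niagara system at a world location. It assumes the Niagara plugin is enabled, uses a hypothetical asset path, and relies on the UNiagaraFunctionLibrary::SpawnSystemAtLocation helper, whose exact signature has shifted between engine versions.

```cpp
// Minimal sketch: spawning a Niagara effect from C++ gameplay code.
// Requires the Niagara plugin and a module dependency on "Niagara".
#include "NiagaraFunctionLibrary.h"
#include "NiagaraSystem.h"

void SpawnImpactSparks(UWorld* World, const FVector& Location)
{
    // Hypothetical asset path for illustration only.
    UNiagaraSystem* Sparks = LoadObject<UNiagaraSystem>(
        nullptr, TEXT("/Game/FX/NS_ImpactSparks.NS_ImpactSparks"));

    if (World && Sparks)
    {
        // Fire-and-forget spawn at the given location; the component
        // cleans itself up when the effect finishes.
        UNiagaraFunctionLibrary::SpawnSystemAtLocation(
            World, Sparks, Location, FRotator::ZeroRotator);
    }
}
```

In practice you would typically reference the system through a UPROPERTY on an actor rather than loading it by path at runtime; the hard-coded path above is only to keep the sketch self-contained.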

We did a handful of digital human demos at GDC, which you may have seen. Thanks to our work with Mike Seymour, we’re releasing the code, the functionality, and some demo content. He was gracious enough to allow us to distribute his digital likeness so that others might learn how to create a lifelike person. On the rendering side, we now have improved skin shaders that allow for multiple specular highlights on skin. Light can also pass through the sides of the nose and the backs of the ears, so subsurface scattering realism is improved. In addition, we have screen-space indirect illumination for skin, which lets light bounce off cheekbones and into the eye sockets, matching what happens in real life.
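
As a rough illustration of the math behind that subsurface look: real-time skin shading typically spreads incoming light according to a diffusion profile. One widely used approximation is the Christensen-Burley profile (a general industry reference, not necessarily the exact profile Epic ships):

$$R(r) = \frac{e^{-r/d} + e^{-r/(3d)}}{8\pi d r}$$

Here $r$ is the distance along the surface from where light enters the skin and $d$ is a per-color-channel scattering distance. Because the red channel’s $d$ is the largest in skin, red light travels farthest under the surface, which is what gives backlit ears and nostrils their translucent glow.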

We also demonstrated several virtual production features in the engine, including cinematography in a real-time environment. The tool we showed, which was used in a number of our GDC demos, now ships with 4.20 along with sample content. An iPad can be used to control a virtual camera.

A new cinematic depth of field has been added; it was created for the Star Wars “Reflections” demo. We now have the best depth of field in the industry. Not only is it accurate to what real cameras can accomplish, it also handles out-of-focus objects in the foreground or background of a shot gracefully, which has typically been a challenge in real-time engines. It’s also faster than the previous Circle DOF method. It will greatly improve the realism of games and cinematic content.
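
For context on “accurate to what real cameras can accomplish”: a physically based depth of field is usually judged against the thin-lens circle-of-confusion model. For an aperture diameter $A$ (focal length divided by f-number), focal length $f$, focus distance $S_1$, and subject distance $S_2$, the blur diameter is roughly

$$c = A \,\frac{\lvert S_2 - S_1\rvert}{S_2}\,\frac{f}{S_1 - f},$$

which is the kind of camera-matching behavior a cinematic DOF mode aims to reproduce rather than approximating it with an ad hoc blur.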

The area lights that we demonstrated at GDC are also now available. They allow you to light characters and objects in a way that’s much closer to how you would do it in a real-world setting with real-world lights. You get accurate-looking diffuse illumination and accurate-looking reflections. It much more closely matches what you get when lighting with a softbox or diffusion in front of a light, as you would for a fashion shoot, a movie, or whatever. All of that can now be replicated in the engine.
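
As a hedged sketch of what driving one of these lights from C++ could look like, the snippet below spawns a rectangular area light sized like a small softbox. It assumes the rectangular light classes are ARectLight and URectLightComponent; the class names, setters, and default units may differ slightly in a given 4.20 build, and the intensity and size values are illustrative rather than tuned numbers.

```cpp
#include "Engine/World.h"
#include "Engine/RectLight.h"
#include "Components/RectLightComponent.h"

// Sketch: spawn a rectangular area light approximating a small softbox.
ARectLight* SpawnSoftboxLight(UWorld* World, const FVector& Location, const FRotator& Facing)
{
    ARectLight* LightActor = World->SpawnActor<ARectLight>(Location, Facing);
    if (LightActor)
    {
        if (URectLightComponent* Rect =
                Cast<URectLightComponent>(LightActor->GetLightComponent()))
        {
            Rect->SetIntensity(5000.0f);   // overall brightness (illustrative value)
            Rect->SourceWidth  = 120.0f;   // emitter size in centimeters
            Rect->SourceHeight = 80.0f;
            Rect->MarkRenderStateDirty();  // pick up the direct property edits
        }
    }
    return LightActor;
}
```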

On the enterprise side, the engine now supports video input and output. Suppose you’re a professional broadcaster producing a green-screen television presentation, such as for weather or sports. You can now feed live video from a professional-grade digital video camera directly into the engine, pull a chroma key, and composite it over a mixed reality scene. The Weather Channel, for example, is now doing some interesting things with this. We’ve seen many broadcasters use the Unreal Engine for live streaming, and we’ve just made it even easier with support for AJA video I/O cards.
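
That video feed flows through the engine’s Media Framework. A minimal, hedged sketch of kicking off a pre-configured input source from C++ might look like the following; the AjaStudioFeed asset path is hypothetical, and the AJA source itself would be authored in the editor via the AJA media plugin.

```cpp
#include "MediaPlayer.h"
#include "MediaSource.h"

// Sketch: open a media source configured in the editor (for example, an
// AJA SDI input) so its frames land in any bound UMediaTexture.
void StartLiveFeed(UMediaPlayer* Player)
{
    // Hypothetical asset path for illustration only.
    UMediaSource* Feed = LoadObject<UMediaSource>(
        nullptr, TEXT("/Game/Broadcast/AjaStudioFeed.AjaStudioFeed"));

    if (Player && Feed)
    {
        Player->OpenSource(Feed);
    }
}
```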

On the enterprise side, we also have nDisplay, which allows you to render a scene across multiple display outputs using several PCs. Let’s say you have one of those large tiled LCD video walls that needs a particularly high-resolution feed. To drive those tiled video walls, you can now use multiple Unreal Engine PCs that are all synchronized.

Shotgun is a common production management tool that many film, television, and game studios use to manage their artists: what assets they need to create, when those assets are due, and what dependencies they have. We’ve integrated it into Unreal Engine, making it very simple to track assets in and out of the engine as your production progresses.

That’s the primary narrative of what we’re doing. There will be a lot of small details in the release, but that’s the important stuff.