Published on May 28th, 2020 | by Emergent Enterprise

Epic Games’ Insane Video Game Graphics Demo Explained in Simple Terms

Emergent Insight:
It may seem odd for emergent-enterprise.com to share an article about a video game engine, but these are the same engines that drive business-related VR applications. Unreal Engine, one of the most widely used engines, recently gave a sneak peek of its upcoming version 5, and Aaron Frank of Singularity Hub shared this insightful interview about the new product. Two of the biggest challenges in VR are fidelity and throughput: how can a VR software developer create photo- or video-realistic experiences that also run on hardware, ideally untethered from a PC, that is affordable and accessible to the average company or individual? It appears that Unreal Engine 5 takes big steps toward that goal. The sequences shown in the accompanying video suggest that incredible, lifelike VR experiences lie ahead not only for gamers but for businesses, too.

Original Article:
Image credit: Epic Games

Most of the words that follow aren’t necessary to see why the reveal of Unreal Engine 5, an Epic Games video game graphics engine set for release in 2021, blew up the internet last week.

You can watch the following video, know that this is a demo of a video game and not footage of anything real, forget the technical aspects being discussed, and simply see that video game graphics are about to take a big leap forward.

Video games during the 2020s could look almost like scenes ripped straight from real life. The overwhelming response from game developers was like a collective jaw dropping to the floor.

The current generation of gaming consoles, including Sony’s PlayStation 4 and Microsoft’s Xbox One, is built on seven-year-old hardware first released in 2013. Both Sony and Microsoft are releasing their next generation of consoles this year. Tim Sweeney, the CEO of Epic Games, credits the upcoming PlayStation 5, the system the demo runs on, with providing the capability to render much of the scene. He also says that the features described in the demo will work on all next-generation consoles.

To be sure, this video is aimed at game developers, 3D artists, and those who Epic Games hopes to convince to use their technology instead of a competitor’s. While it’s full of technical details, that didn’t prevent millions of people from watching the demo within the first few hours of its release, and the response was overwhelmingly positive.

As I followed the meme-soaked conversation online, I sensed 3D artists were enthusiastic, but not being a 3D artist myself, I wanted to know exactly what made this so special. So, I reached out to one of the experts I saw tweeting about it to see if she could help me better understand.

Estella Tse is an augmented reality and virtual reality (AR/VR) creative director and artist based in Oakland, California, and has worked as an artist-in-residence with Google, Adobe, and others.

She was kind enough to address my questions and explain the demo in simple terms.

Aaron Frank: Can we start at the beginning? What is a game engine?

Estella Tse: A game engine is a computer program developers use to make games, interactive experiences, and AR/VR apps. You can place assets like 3D models, 3D environments, images, sound effects, and music into a game engine and add interactions to those elements. It can be as simple as controlling a ball to collect points or as involved as adding the complex interactions we see in major release titles.

In the AR/VR industry, we create most experiences with one of two game engines: Unity or Unreal Engine.
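
To make the “place assets, add interactions” idea concrete, here is a minimal, engine-agnostic sketch in plain Python of the ball-collecting-points example Tse mentions. It is not Unity or Unreal code; the names and numbers are invented purely for illustration.

```python
# Illustrative only: a ball "asset" plus one interaction (collect a coin on touch).
from dataclasses import dataclass

@dataclass
class Ball:
    x: float
    y: float

@dataclass
class Coin:
    x: float
    y: float
    collected: bool = False

def update(ball: Ball, coins: list[Coin], score: int, pickup_radius: float = 0.5) -> int:
    """One tick of the game loop: check each coin against the ball's position."""
    for coin in coins:
        if not coin.collected:
            dist_sq = (ball.x - coin.x) ** 2 + (ball.y - coin.y) ** 2
            if dist_sq <= pickup_radius ** 2:
                coin.collected = True
                score += 1
    return score

# Simulate the ball rolling to the right past three coins.
ball = Ball(0.0, 0.0)
coins = [Coin(1.0, 0.0), Coin(2.0, 0.0), Coin(3.0, 1.0)]  # the last coin sits off the ball's path
score = 0
for step in range(40):
    ball.x += 0.1                     # the "interaction": move the ball each frame
    score = update(ball, coins, score)
print(f"Final score: {score}")        # -> 2 (the off-path coin is never collected)
```

A real engine wraps this same loop in an editor, a renderer, physics, and input handling, but the core idea of assets plus scripted interactions is the same.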

AF: I discovered your work when I came across your enthusiastic tweet about the number of “triangles” that can be rendered in the engine. The video mentions “hundreds of billions” of triangles rendered across the whole demo. What does that mean exactly, and what are triangles?

ET: For cameras, “megapixels” indicate how detailed a photo can be. For Photoshop artists, “pixels per inch” is an indicator of image resolution. And in 3D modeling terms, “polygons” are the faces a model is built from. A cube, for instance, has six faces.

These polygon faces, commonly triangles (which is how the Unreal demo refers to them), are a major factor in graphics quality for games and AR/VR experiences. The more polygons a model has, the smoother and more realistic it can look. It’s a bit like how you can see the pixels in a low-resolution photograph, while in a high-resolution photograph the pixels are nearly undetectable. The more polygons a 3D model has, the more information and detail it holds.
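
To get a feel for how quickly those numbers grow, here is a rough, back-of-the-envelope sketch in Python. The figures are illustrative, not from the demo: a triangulated cube starts at 12 triangles, and each round of subdivision, roughly what happens as an artist adds detail, multiplies the count by four.

```python
# Illustrative only: how triangle counts explode as a model gains detail.
# A cube has 6 square faces; triangulated, that is 12 triangles. Each
# subdivision step splits every triangle into 4 smaller ones.
base_triangles = 12
for level in range(9):
    count = base_triangles * 4 ** level
    print(f"subdivision level {level}: {count:>12,} triangles")

# level 0:           12 triangles
# level 8:      786,432 triangles -- a single asset already well past what many
# mobile scenes can afford, which is why models get optimized.
```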

You might want the highest-quality models possible in your game, but hardware is a limiting factor. A mobile phone, for instance, can’t handle intricate 3D models with millions of polygons. Even most high-end VR experiences require 3D models to be optimized, reducing their polygon counts so that more bandwidth is available for interactivity and for other models to load at the same time. This is all done so that everything runs without lag.

You may have heard the term low-poly models. These are models with very few polygons, which tend to create an aesthetic reminiscent of classic video games. Here is an example of a low-poly model made with Google Blocks. A model like this is versatile and can be integrated into experiences built for anything from mobile devices up to high-fidelity games.

On the other hand, here is an example of a very high-poly painting made with Google Tilt Brush. My Tilt Brush paintings won’t load on mobile devices without optimization, or “decimation,” and even then will likely crash the device because of the millions of polygons. There are easily over a million polygons in this painting, and as a 3D artist, polygon count is something I pay close attention to when I make AR/VR creations. I have to keep in mind that not every device or experience can handle my work at its maximum fidelity.
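
For readers curious what “optimization” or “decimation” can look like under the hood, here is a minimal Python sketch of one simple approach, vertex clustering. This is not the algorithm any particular tool or engine actually uses (production decimators are far more sophisticated); it only illustrates the idea of trading polygons for performance.

```python
# Minimal vertex-clustering decimation sketch: snap every vertex to a coarse
# grid, merge vertices that land in the same cell, and drop triangles that
# collapse as a result. Fewer polygons, roughly the same silhouette.

def decimate(vertices, triangles, cell_size):
    """vertices: list of (x, y, z); triangles: list of (i, j, k) vertex indices."""
    cell_to_new_index = {}   # grid cell -> index into the simplified vertex list
    old_to_new = {}
    new_vertices = []

    for old_index, (x, y, z) in enumerate(vertices):
        cell = (round(x / cell_size), round(y / cell_size), round(z / cell_size))
        if cell not in cell_to_new_index:
            cell_to_new_index[cell] = len(new_vertices)
            new_vertices.append((x, y, z))        # keep the first vertex seen in this cell
        old_to_new[old_index] = cell_to_new_index[cell]

    new_triangles = []
    for i, j, k in triangles:
        a, b, c = old_to_new[i], old_to_new[j], old_to_new[k]
        if a != b and b != c and a != c:          # drop triangles that collapsed
            new_triangles.append((a, b, c))
    return new_vertices, new_triangles
```

The trade-off is exactly the one Tse describes: the simplified mesh keeps its overall shape but loses fine detail, in exchange for far fewer triangles for the device to draw.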

AF: Another concept I routinely saw is captured in this tweet, referencing “normal” and “occlusion” baking. (Also this one and this one.) Can you explain what “baking” means?

ET: Another incredible piece of news from the UE5 demo, on top of the capacity for high-fidelity models, was Lumen’s real-time lighting capability. This is really powerful!

Often, developers have to “bake” lighting into 3D models to create the illusion of light in an environment; the baked result is essentially faked lighting that saves on graphics processing. Baking cuts processing and loading time, prevents lag, and allows for more interactivity and smoother experiences. Here is an example of how lighting is baked onto a “normal” map.
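
As a rough illustration of why baking helps, here is a minimal Python sketch of baked ambient occlusion. It is not Unreal’s Lumen or any engine’s actual pipeline, and the geometry is a toy: the point is only that the expensive visibility estimate runs once, offline, while runtime shading is reduced to a lookup and a multiply.

```python
# Toy illustration: "bake" an expensive ambient-occlusion estimate once,
# then reuse the stored per-vertex value every frame.
import math
import random

def bake_occlusion(vertex, blockers, samples=256):
    """Offline step: estimate how much open sky a vertex can see by firing
    random rays and testing them against blocker spheres ((center, radius))."""
    hits = 0
    for _ in range(samples):
        d = [random.gauss(0.0, 1.0) for _ in range(3)]       # random direction
        norm = math.sqrt(sum(c * c for c in d))
        d = [c / norm for c in d]
        for (cx, cy, cz), radius in blockers:
            to_center = (cx - vertex[0], cy - vertex[1], cz - vertex[2])
            t = sum(a * b for a, b in zip(to_center, d))      # closest approach along the ray
            if t > 0.0:
                closest = [to_center[i] - t * d[i] for i in range(3)]
                if sum(c * c for c in closest) < radius * radius:
                    hits += 1
                    break
    return 1.0 - hits / samples        # 1.0 = fully open, 0.0 = fully blocked

# Bake once, before the game ships.
vertices = [(0.0, 0.0, 0.0), (5.0, 0.0, 0.0)]
blockers = [((0.0, 2.0, 0.0), 1.5)]    # a sphere hanging above the first vertex
baked_ao = [bake_occlusion(v, blockers) for v in vertices]

# Runtime, every frame: no ray casting, just a lookup and a multiply.
def shade(vertex_index, light_intensity):
    return light_intensity * baked_ao[vertex_index]

print([round(a, 2) for a in baked_ao])  # the first vertex comes out darker
```

Real-time systems like the one shown in the demo aim to do this kind of lighting work live, every frame, instead of relying on a precomputed result.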

To continue reading, go here…



About the Author

Emergent Enterprise

The Emergent Enterprise (EE) website brings together current and important news in enterprise mobility and the latest in innovative technologies in the business world. The articles are hand selected by Emergent Enterprise and not the result of automated electronic aggregating. The site is designed to be a one-stop shop for anyone who has an ongoing interest in how technology is changing how the world does business and how it affects the workforce from the shop floor to the top floor. EE encourages visitor contributions and participation through comments, social media activity and ratings.


