GamingGPUsNews

Lumen vs Ray Tracing: Is Ray Tracing Better?

Epic’s Unreal Engine 4 is one of the most popular game engines. It has been widely used by AAA giants like EA, Ubisoft, and Microsoft, as well as tons of indie studios. With Unreal Engine 5 and its Lumen and Nanite technologies, Epic is looking to extend its dominance in the video game industry. In this post, we compare Lumen, the global illumination solution used by Unreal Engine 5, to conventional ray tracing and see how it holds up.

What is Global Illumination?

Global Illumination is the process of lighting a scene by calculating the light emitted by light sources both on and off the screen, either by approximation or by tracing its path. With ray and path tracing, light rays are cast from these sources, reaching various objects in the scene and illuminating them.

The rays behave differently depending on the nature of the objects they encounter. For example, glossy (shiny) objects reflect the ray at a well-defined angle, while rough, matte ones scatter it in many directions. This redirection of light rays by objects is known as indirect or diffuse lighting, and the redirected rays are called diffuse rays.
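The difference between a mirror-like bounce and a diffuse bounce can be sketched in a few lines of Python. This is a toy illustration, not engine code; `reflect` is the standard specular reflection formula, and `diffuse_bounce` is one common way (rejection sampling) to pick a random scatter direction:

```python
import math
import random

def reflect(ray, normal):
    """Mirror (specular) reflection: r = d - 2*(d . n)*n."""
    dot = sum(d * n for d, n in zip(ray, normal))
    return tuple(d - 2 * dot * n for d, n in zip(ray, normal))

def diffuse_bounce(normal):
    """Scatter into a random unit direction in the hemisphere around the normal."""
    while True:
        v = tuple(random.uniform(-1, 1) for _ in range(3))
        length = math.sqrt(sum(c * c for c in v))
        if 0 < length <= 1:  # rejection-sample the unit ball, then normalize
            v = tuple(c / length for c in v)
            break
    if sum(a * b for a, b in zip(v, normal)) < 0:  # flip into the upper hemisphere
        v = tuple(-c for c in v)
    return v

# A ray travelling down-right hits a floor whose normal points straight up:
print(reflect((1, -1, 0), (0, 1, 0)))  # mirrored about the normal: (1, 1, 0)
```

A glossy surface uses something close to `reflect`; a matte surface uses something close to `diffuse_bounce`, which is why its lighting looks soft and directionless.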

Ray Traced Global Illumination (RTGI)

These indirect or diffuse rays act as newly cast rays, striking other objects and illuminating them in the process, with the object that redirected them now behaving as a light source. When a ray finally reaches the camera (the screen), the information it has gathered is used to calculate the lighting of the scene.

In most cases, the color of the ray is determined by the color of the surface reflecting it. To save performance, the rays directly hitting the screen (carrying data like color, intensity, etc.) are calculated with complex algorithms, while the rest of the diffuse rays are approximated with simpler equations.

So, now you may be wondering how the rays are cast and how the ray count is determined. That is handled by probes: light sources placed across the scene at runtime. They act as point sources of light, casting rays radially and illuminating their surroundings. Each probe can cast one or more rays.
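As a toy 2D illustration (hypothetical, not Epic's code), a probe casting n rays radially just distributes unit directions evenly around a circle:

```python
import math

def probe_ray_directions(n):
    """Evenly spaced 2D unit directions for n rays cast radially from a probe."""
    step = 2 * math.pi / n
    return [(math.cos(k * step), math.sin(k * step)) for k in range(n)]

dirs = probe_ray_directions(8)   # 8 rays, 45 degrees apart
```

A real engine would distribute directions over a sphere (and often importance-sample them), but the idea is the same: the probe's ray count is simply how finely you subdivide the full circle (or sphere) of directions.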

The rays cast by these probes are traced and shaded, and data such as irradiance and distance to geometry is used to calculate the final lighting for the scene. In early ray-traced titles, most developers used just one or two light probes to calculate the diffuse lighting. In the case of Metro Exodus, these were the sun and the sky textures.

The Enhanced Edition increased the probe count to 256 light sources. Overall, that meant 256 light sources, plus the rays from the sun and the sky, all used to calculate the lighting of each pixel. A 1080p display has roughly 2 million pixels, a 1440p display about 3.7 million, and a 4K display over 8 million!

To ensure that contemporary hardware could run the game satisfactorily, the developers used grid cells or clusters to partition the scenes. Then, similar to screen-space effects, only the light probes in range (in the grid) were invoked to calculate the lighting. The primary difference is that screen space is divided into sections depending on their position in the Z-buffer, while here the game world itself is partitioned, avoiding the coverage issues of screen-space methods.
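The grid-cell optimization boils down to a spatial hash: bucket probes by cell, then query only the cells around the shaded point instead of looping over all 256 probes. A minimal sketch (hypothetical names, 3D integer cells):

```python
def cell_of(pos, cell_size):
    """Map a world-space position to its integer grid cell."""
    return tuple(int(c // cell_size) for c in pos)

def probes_in_range(probes, pos, cell_size):
    """Bucket probes into grid cells, then gather only the ones in the
    27 cells surrounding the queried position."""
    grid = {}
    for p in probes:
        grid.setdefault(cell_of(p, cell_size), []).append(p)
    cx, cy, cz = cell_of(pos, cell_size)
    nearby = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                nearby.extend(grid.get((cx + dx, cy + dy, cz + dz), []))
    return nearby

probes = [(0.5, 0.5, 0.5), (10.0, 0.0, 0.0)]
near = probes_in_range(probes, (0.0, 0.0, 0.0), 2.0)   # only the close probe
```

A real implementation builds the grid once per frame and reuses it for every pixel, but the lookup cost per pixel is the point: constant, regardless of the total probe count.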

Another optimization involves accumulating rays from previous frames and using them for additional diffuse light bounces, much like temporal upscaling. This is used to generate the lighting grid which can then be reused over consecutive frames, allowing nearly infinite bounces of light rays. You’re essentially casting diffuse rays temporally across multiple frames to reduce the performance impact and make the method more feasible for real-time use.
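Accumulating lighting across frames is, at its core, an exponential moving average: each frame contributes a small fraction, and the history carries the rest. A toy sketch (the blend factor is an assumption for illustration):

```python
def accumulate(history, current, blend=0.1):
    """Blend this frame's lighting into the running history (exponential average)."""
    return tuple(h * (1 - blend) + c * blend for h, c in zip(history, current))

lighting = (0.0, 0.0, 0.0)          # RGB lighting for one grid cell
for frame in range(100):            # converges toward the steady value (1.0, 0.5, 0.25)
    lighting = accumulate(lighting, (1.0, 0.5, 0.25))
```

Because old frames never fully vanish from the average, each frame effectively adds one more bounce on top of the accumulated result, which is how "nearly infinite" bounces become affordable in real time.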

Unreal Engine 5: Lumen vs Ray Tracing

Okay, so before we begin, let’s make one thing very clear: Lumen is a form of ray tracing, albeit a more optimized and flexible one, designed for widespread adoption across different graphics architectures without the need to own a $1,000 GPU.

Lumen is Unreal Engine 5’s new Fully Dynamic Global Illumination and reflections system that is designed for next-generation consoles. Lumen is the default global illumination and reflections system in Unreal Engine 5. It renders diffuse interreflection with infinite bounces and indirect specular reflections in large, detailed environments at scales ranging from millimeters to kilometers.

From the developers

By default, Lumen uses software ray tracing (it doesn’t utilize RT cores/Ray Accelerators). It leverages multiple forms of ray tracing, including screen tracing (SSRT), Signed Distance Fields (SDFs), and Mesh Distance Fields (MDFs), in parallel to calculate the global illumination of the scene, depending on the objects, their distance from the screen, and certain other factors.

Signed Distance Fields and Mesh Distance Fields

A Signed Distance Field (SDF) stores, for any point in space, the distance to the nearest surface, with the sign indicating whether the point lies inside (negative) or outside (positive) the geometry. Let’s take the below example to illustrate this:

SDF

A ray is cast from the camera, passes through the screen, and then approaches the circular surface. With ray tracing, the most important part is figuring out which rays hit objects in the scene and which ones miss. The SDF is used to find out, for a ray traveling in a particular direction, how far it can safely advance before it might intersect the surface of an object.
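For a circle, the SDF has a closed form: distance from the center, minus the radius. A minimal sketch (hypothetical function name):

```python
import math

def sdf_circle(point, center, radius):
    """Signed distance to a circle: negative inside, positive outside."""
    dx, dy = point[0] - center[0], point[1] - center[1]
    return math.hypot(dx, dy) - radius

print(sdf_circle((3, 0), (0, 0), 1))   # 2.0: two units outside the surface
print(sdf_circle((0, 0), (0, 0), 1))   # -1.0: at the center, one unit inside
```

Real scenes combine many such primitive fields (or precomputed volume textures), but each query still answers the same question: how far is the nearest surface?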

Ray Marching

The green circle represents the SDF. At each point along the ray, evaluating the SDF tells us the minimum distance we can travel (in any direction) without hitting a surface. The ray advances by that distance, and the SDF is re-evaluated. This entire process is called ray marching. In this case, it’s a miss, so the process ends after a predetermined number of steps. In the case of a hit, the SDF evaluations would have been used to calculate the lighting.
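The marching loop above (often called sphere tracing) fits in a dozen lines. This is a generic sketch of the technique, not Lumen's implementation; step counts and thresholds are illustrative assumptions:

```python
import math

def ray_march(origin, direction, sdf, max_steps=64, hit_eps=1e-3, max_dist=100.0):
    """Sphere tracing: repeatedly step forward by the SDF value until the
    ray gets close enough to a surface (hit) or runs out of budget (miss)."""
    t = 0.0
    for _ in range(max_steps):
        point = tuple(o + t * d for o, d in zip(origin, direction))
        dist = sdf(point)
        if dist < hit_eps:
            return point          # close enough to the surface: a hit
        t += dist                 # safe to advance this far without crossing anything
        if t > max_dist:
            break
    return None                   # a miss

circle = lambda p: math.hypot(p[0] - 5, p[1]) - 1   # circle at (5, 0), radius 1
hit = ray_march((0.0, 0.0), (1.0, 0.0), circle)     # aimed straight at it
miss = ray_march((0.0, 0.0), (0.0, 1.0), circle)    # pointing away: None
```

Note how few SDF evaluations the hit takes: each step jumps the full safe distance, so empty space is skipped in large strides rather than sampled densely.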

Lumen uses Global Distance Fields, Mesh Distance Fields, and Screen Traces to calculate the lighting in the software/hybrid tracing pipeline.

Global Distance Fields, Mesh Distance Fields, and Screen Tracing

Global Distance Fields are the fastest but least accurate. This works to their advantage, as they are used for the coarse skeleton of the scene, such as the walls, the floor, and large but simple shapes like cushions. Mesh Distance Fields are more detailed but only hold sparse detail near the surfaces of objects. These SDFs are used with mipmaps, depending on the distance of the objects from the camera.

Screen tracing (screen-space ray tracing) is the first step in the Lumen pipeline. It is conducted against geometry visible in the depth buffer (i.e., on screen). It is often leveraged for edges and crevices, essentially the geometry where screen-space ambient occlusion (SSAO) would apply. After screen tracing, the SDFs take over: first Mesh Distance Fields for nearby objects, followed by Global Distance Fields for the rest of the scene.
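The ordering described above is a fallback chain, which can be sketched as follows (a simplification with hypothetical names, not Epic's API; each tracer is passed in as a stub):

```python
def trace(ray, screen_trace, mesh_df_trace, global_df_trace, near_limit=2.0):
    """Fallback chain sketch: screen trace first, then mesh distance
    fields for nearby geometry, then the coarse global distance field."""
    hit = screen_trace(ray)
    if hit is not None:
        return hit                         # geometry found in the depth buffer
    if ray["depth"] <= near_limit:         # "detailed tracing" up close
        hit = mesh_df_trace(ray)
        if hit is not None:
            return hit
    return global_df_trace(ray)            # "global tracing" for everything else

# Stub tracers: both rays miss in screen space, so they fall through to the SDFs.
near = trace({"depth": 1.0}, lambda r: None, lambda r: "mdf", lambda r: "gdf")
far = trace({"depth": 5.0}, lambda r: None, lambda r: "mdf", lambda r: "gdf")
```

The design rationale: each stage is cheaper than the next-most-accurate one, so most rays resolve early and only the leftovers pay for the more detailed (or more distant) representations.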

Mesh Distance Fields (called detailed tracing) are traced for objects up to about 2 meters from the camera, while the rest of the scene is traced using Global Distance Fields (called global tracing). Each method has its own voxelized representation of the scene. GDFs are less accurate, as they trace against coarse object silhouettes, but they are faster than MDFs; the nature and distance of the objects from the camera determine which one is used. MDFs, on the other hand, trace relatively low-poly versions of the various objects in the scene to calculate the lighting.

To further speed up the process, Lumen uses a Surface Cache. The Surface Cache captures the geometric and material properties of objects from multiple angles and stores them in an atlas (cache). Captures happen as the player moves around: at higher resolution as you move closer to an object and at lower resolution as you move farther away.
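The distance-dependent capture idea can be sketched as a tiny cache that halves its capture resolution as the distance doubles. Everything here is illustrative (the names, the base resolution, the halving rule), not Lumen's actual data layout:

```python
def capture_resolution(distance, base=256, min_res=16):
    """Halve the capture resolution each time the distance doubles."""
    res, d = base, distance
    while d >= 2.0 and res > min_res:
        res //= 2
        d /= 2.0
    return res

class SurfaceCache:
    """Toy atlas keyed by (mesh, card); re-captures only when the wanted
    resolution changes, e.g. because the player moved closer or farther."""
    def __init__(self):
        self.atlas = {}
    def capture(self, mesh_id, card, distance):
        res = capture_resolution(distance)
        key = (mesh_id, card)
        if key not in self.atlas or self.atlas[key][0] != res:
            self.atlas[key] = (res, "card@%dx%d" % (res, res))
        return self.atlas[key]

cache = SurfaceCache()
close = cache.capture("chair", 0, 1.0)    # captured at full resolution
far = cache.capture("chair", 0, 8.0)      # re-captured at lower resolution
```

The payoff is that a ray hitting a cached surface can read its lighting from the atlas instead of re-shading the hit point from scratch.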

However, this only works well for meshes with simple interiors. Caching covers a limited number of polygons per object (within a budget of a few hundred MB of VRAM) and relies on the LODs of various objects/sections for effective utilization.

Drawbacks of Lumen

One of the primary drawbacks of Lumen’s software RT pipeline is that it doesn’t work with skinned (skeletal) meshes, as they’re dynamic and change shape every frame (deformation, movement, etc.). The acceleration structures for these objects would need to be rebuilt every frame, which isn’t feasible with Lumen’s software ray tracer. Lumen builds these structures for static meshes only once at runtime, which speeds up the process but renders it useless for dynamic meshes.

Lumen also comes with hardware ray tracing, but most developers will stick with the software path, as the hardware version is roughly 50% slower, even with dedicated hardware such as RT cores. Furthermore, you can’t have overlapping or masked meshes with hardware ray tracing, as they greatly slow down the ray traversal process. Software ray tracing merges all the overlapping meshes into a single distance field, as explained above.

Overall, Lumen looks phenomenal, but its primary drawback is that it’s limited to Unreal Engine 5. This means it will never see the same breadth of adoption as openly available techniques (FXAA, SMAA, or even TAA). Additionally, as evident from recent releases, games leveraging Unreal Engine 5 and its technologies can run very poorly even on high-end hardware.

Areej Syed

Processors, PC gaming, and the past. I have been writing about computer hardware for over seven years with more than 5000 published articles. Started off during engineering college and haven't stopped since. Mass Effect, Dragon Age, Divinity, Torment, Baldur's Gate and so much more... Contact: areejs12@hardwaretimes.com.