The Unreal Engine 5 demo recently shown running on the PS5 sparked a slew of discussions across forums and media, covering the demo’s resolution and frame rates, the technologies used, and how it compares against contemporary PC hardware:
- The PS5 Demo (Running @ 1440p) Proves Next-Gen Consoles’ll be All About Render Optimizations: VRS, Temporal & AI Upscaling
- Tim Sweeney Explains Why PS5’s 5.5GB/s SSD Speeds are Important
Today we are going to look at another facet of the PS5 demo, namely the lighting. Epic has been touting its Lumen GI as a next-generation global illumination technology, and now we have a better idea of how it actually works. In an interview, Daniel Wright, the Technical Director of Graphics at Epic, said:
Lumen uses ray tracing to solve indirect lighting, but not triangle ray tracing. Lumen traces rays against a scene representation consisting of signed distance fields, voxels, and height fields. As a result, it requires no special ray tracing hardware.
Lumen uses a combination of different techniques to efficiently trace rays. Screen-space traces handle tiny details, mesh signed distance field traces handle medium-scale light transfer and voxel traces handle large scale light transfer.
The important revelation here is that while Lumen GI does use ray tracing for indirect lighting, it’s not the same as NVIDIA’s RTX implementation, which relies on ray-triangle intersections against a bounding volume hierarchy (BVH). Lumen instead leverages three different methods to trace rays and gather the required information.

In NVIDIA’s implementation, the scene geometry is organized into a BVH and the RT Cores accelerate ray traversal and triangle intersection against it. In Unreal’s Lumen GI, the scene is instead represented using voxels, SDFs (signed distance fields), and height fields. Tracing against these simplified proxies is far less intensive than full triangle ray tracing and therefore doesn’t require dedicated hardware support.
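To put that contrast in concrete terms, here is a rough C++ sketch of what a triangle BVH involves. This is simplified for illustration only; the BVH that DXR and the RT Cores actually consume is a driver-managed structure the application never lays out by hand. The idea, though, is the same: rays are tested against nested bounding boxes, and only at the leaves against individual triangles.

```cpp
#include <vector>
#include <algorithm>

// Simplified triangle BVH (illustrative only, not NVIDIA's or DXR's real layout).
struct Tri  { float v0[3], v1[3], v2[3]; };
struct AABB { float lo[3], hi[3]; };
struct Node { AABB box; int left = -1, right = -1; std::vector<int> tris; };

// Slab test: does the ray pass through the box? (Degenerate axis-aligned rays
// are glossed over here for brevity.)
static bool rayHitsBox(const AABB& b, const float o[3], const float d[3])
{
    float tmin = 0.0f, tmax = 1e30f;
    for (int i = 0; i < 3; ++i) {
        float inv = 1.0f / d[i];
        float t0 = (b.lo[i] - o[i]) * inv, t1 = (b.hi[i] - o[i]) * inv;
        if (inv < 0.0f) std::swap(t0, t1);
        tmin = std::max(tmin, t0);
        tmax = std::min(tmax, t1);
        if (tmax < tmin) return false;
    }
    return true;
}

// Moeller-Trumbore ray/triangle intersection, reduced to a hit/miss answer.
static bool rayHitsTri(const Tri& t, const float o[3], const float d[3])
{
    auto sub   = [](const float a[3], const float b[3], float r[3]) {
        r[0] = a[0]-b[0]; r[1] = a[1]-b[1]; r[2] = a[2]-b[2]; };
    auto cross = [](const float a[3], const float b[3], float r[3]) {
        r[0] = a[1]*b[2]-a[2]*b[1]; r[1] = a[2]*b[0]-a[0]*b[2]; r[2] = a[0]*b[1]-a[1]*b[0]; };
    auto dot   = [](const float a[3], const float b[3]) {
        return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; };

    float e1[3], e2[3], p[3], s[3], q[3];
    sub(t.v1, t.v0, e1);  sub(t.v2, t.v0, e2);
    cross(d, e2, p);
    float det = dot(e1, p);
    if (det > -1e-8f && det < 1e-8f) return false;   // ray parallel to the triangle
    float inv = 1.0f / det;
    sub(o, t.v0, s);
    float u = dot(s, p) * inv;
    if (u < 0.0f || u > 1.0f) return false;
    cross(s, e1, q);
    float v = dot(d, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return false;
    return dot(e2, q) * inv > 1e-8f;                 // hit must lie in front of the ray
}

// Classic traversal: skip whole subtrees whose bounding box the ray misses, and only
// run the exact triangle test at the leaves. These two tests are what RT Cores accelerate.
static bool traverse(const std::vector<Node>& nodes, const std::vector<Tri>& tris,
                     int idx, const float o[3], const float d[3])
{
    const Node& n = nodes[idx];
    if (!rayHitsBox(n.box, o, d)) return false;
    if (n.left < 0) {                                // leaf node: exact triangle tests
        for (int i : n.tris)
            if (rayHitsTri(tris[i], o, d)) return true;
        return false;
    }
    return traverse(nodes, tris, n.left, o, d) || traverse(nodes, tris, n.right, o, d);
}
```

Traversing a structure like this for millions of triangles is the work the RT Cores accelerate; Lumen sidesteps it by tracing the simpler proxies described below instead.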

Think of voxels as 3D pixels: small cubes that approximate the scene’s geometry and are used to work out which parts of it block light and which let it through. A signed distance field (SDF) is a much simpler stand-in for the actual mesh: for any point in space it reports the distance to the nearest surface (negative inside, positive outside), so a ray can be marched toward that surface and the hit point and angle recovered without ever touching a triangle. Height fields are similar in spirit, except they describe a surface as elevation values over a 2D grid, whereas an SDF stores distances to the surface.
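To make the distance-field idea more concrete, here is a minimal sphere-tracing loop in C++. The scene function and constants are illustrative stand-ins, not Epic’s code: the ray repeatedly steps forward by the distance the SDF reports, which is always a safe step, until it either lands on a surface or runs out of range.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3  add(Vec3 a, Vec3 b)  { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  mul(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }
static float len(Vec3 v)          { return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z); }

// Illustrative scene SDF: distance from point p to the nearest surface,
// negative inside, positive outside. Here, a single unit sphere at the origin.
static float sceneSDF(Vec3 p) { return len(p) - 1.0f; }

// Sphere tracing: step the ray forward by the reported distance each iteration.
// That step is always safe, because nothing in the scene is closer than the SDF value.
static bool traceSDF(Vec3 origin, Vec3 dir, float maxDist, float& hitDist)
{
    const float epsilon = 1e-3f;          // how close counts as "on the surface"
    float t = 0.0f;
    for (int i = 0; i < 128 && t < maxDist; ++i)
    {
        float d = sceneSDF(add(origin, mul(dir, t)));
        if (d < epsilon) { hitDist = t; return true; }   // surface reached
        t += d;                                          // safe step toward the surface
    }
    return false;                                        // ray escaped or ran out of steps
}
```

Lumen’s per-mesh distance fields play roughly the role of `sceneSDF` here, only evaluated against real geometry rather than a toy sphere.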
As Wright states, the finest details are traced in screen space. This is very similar to how the ReShade ray-tracing shader (commonly called RTGI, for ray-traced global illumination) works: it marches rays against data from the depth buffer, a technique generally known as screen-space ray tracing (SSRT).
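Reduced to its essence, a screen-space trace looks roughly like this. The depth-buffer layout and names below are assumptions for illustration, not Unreal’s internals: the ray is stepped across the depth buffer until it passes just behind a visible surface, which counts as a hit.

```cpp
#include <vector>

// Minimal depth buffer: linear view-space depth per pixel (illustrative layout only).
struct DepthBuffer {
    int width = 0, height = 0;
    std::vector<float> depth;
    float at(int x, int y) const { return depth[y * width + x]; }
};

// March a ray across the screen in fixed steps and report the first pixel where the
// ray has dipped just behind a visible surface, i.e. a screen-space hit.
static bool traceScreenSpace(const DepthBuffer& db,
                             float x, float y, float rayDepth,   // start pixel and depth
                             float dx, float dy, float dDepth,   // per-step increments
                             int maxSteps, int& hitX, int& hitY)
{
    const float thickness = 0.05f;        // assumed surface thickness to reject far misses
    for (int i = 0; i < maxSteps; ++i)
    {
        x += dx;  y += dy;  rayDepth += dDepth;
        int px = static_cast<int>(x), py = static_cast<int>(y);
        if (px < 0 || py < 0 || px >= db.width || py >= db.height)
            return false;                 // the ray left the screen: no data to test against
        float sceneDepth = db.at(px, py);
        if (rayDepth > sceneDepth && rayDepth < sceneDepth + thickness)
        {
            hitX = px;  hitY = py;        // occluder found in screen space
            return true;
        }
    }
    return false;
}
```

The sketch also shows the technique’s main limitation: anything off-screen or hidden behind other geometry simply doesn’t exist to the trace, which is why it only handles the finest details and the other two methods pick up from there.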

Furthermore, mesh SDF traces handle medium-scale light transfer, while voxel traces cover large-scale light transport. Overall, the Unreal Engine 5-powered Lumen GI that ran on the PS5 uses a fairly complex hybrid lighting method that arguably can’t be classified as ray tracing in the traditional sense. Still, the results are impressive, and that’s what matters.
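For the large-scale voxel traces, a deliberately simplified occupancy-grid march gives the flavor. Again, this is an illustrative sketch rather than Lumen’s actual data structures (which store lighting information and are far more elaborate): the point is that the cost depends on the grid resolution and ray length, not on how many triangles the original scene contains.

```cpp
#include <vector>

// Coarse voxelization of the scene: each cell only records whether geometry is present.
// (Illustrative only; Lumen's voxel representation is far richer than an occupancy bit.)
struct VoxelGrid {
    int nx = 0, ny = 0, nz = 0;
    float cellSize = 1.0f;
    std::vector<bool> occupied;
    bool at(int x, int y, int z) const {
        if (x < 0 || y < 0 || z < 0 || x >= nx || y >= ny || z >= nz) return false;
        return occupied[(z * ny + y) * nx + x];
    }
};

// Fixed-step march through the grid: cheap and coarse, which is why this kind of trace
// suits large-scale light transfer rather than fine detail.
static bool traceVoxels(const VoxelGrid& grid,
                        float ox, float oy, float oz,   // ray origin
                        float dx, float dy, float dz,   // normalized ray direction
                        float maxDist)
{
    const float step = grid.cellSize * 0.5f;            // march half a cell at a time
    for (float t = 0.0f; t < maxDist; t += step)
    {
        int x = static_cast<int>((ox + dx * t) / grid.cellSize);
        int y = static_cast<int>((oy + dy * t) / grid.cellSize);
        int z = static_cast<int>((oz + dz * t) / grid.cellSize);
        if (grid.at(x, y, z)) return true;               // ray blocked by an occupied voxel
    }
    return false;                                        // clear line of sight
}
```

The trade-off is obvious from the sketch: a coarse grid can’t capture fine detail, which is exactly why the finer scales fall back to mesh SDF and screen-space traces.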

This also points to a key trend we’ll be seeing in the coming months and years: the definition of “ray-tracing” is going to vary from developer to developer and engine to engine. Furthermore, not all implementations will be equally accurate, and some won’t strictly qualify as ray tracing at all. You can be fairly certain that the ray-tracing implementations on the PS5 and the Xbox Series X will differ as well, so comparisons between them won’t be apples-to-apples. As such, any performance claims or ray counts shared by Microsoft and Sony will mean very little. Ray tracing is about to get a lot more complicated.