AMD Might be Working on AI Accelerators for RX 7000 GPUs to Tackle NVIDIA Tensor Cores

Upscaling techniques, especially AI-based algorithms such as NVIDIA's DLSS, have become one of the most critical parts of games this generation. Both NVIDIA and Intel have a cutting-edge upscaler in their arsenal. AMD, though (for the time being), is relying on its spatial FSR upscaler. Unfortunately, FSR almost always falls behind DLSS in supported titles. Team Red might change this with its next generation of RDNA 3 graphics cards. A patent spotted by Coreteks indicates that the Radeon RX 7000 GPUs may feature a machine learning accelerator die for specialized upscaling algorithms.

Keep in mind that just because a patent for a particular technology exists doesn't mean it'll come to fruition or be featured in the next generation of AMD's graphics cards. It's just one of many possibilities.

The memory and the accelerator die appear to be either one and the same or stacked on top of one another. The flowchart explains how the ML accelerator works: the shader is first executed by the graphics core, after which the machine learning ALUs perform the requested ML tasks via one or more inter-die (Infinity Fabric) interconnects.

It's worth noting that, as per this patent, the machine learning tasks are performed on the memory/ML die rather than the graphics core. This makes the design seem impractical due to the added latency and the physical distance from the primary die. Both NVIDIA's and Intel's machine learning hardware (Tensor Cores and XMX matrix units, respectively) sit alongside the primary FP32/INT32 shaders rather than on a separate die. The presence of the Infinity Cache may help mitigate the latency to an extent, but I still think this design is highly unlikely to find its way to the final floorplan.
Computer hardware enthusiast, PC gamer, and almost an engineer. Former co-founder of Techquila (2017-2019), a fairly successful tech outlet. Been working on Hardware Times since 2019, an outlet dedicated to computer hardware and its applications.