
CPU vs GPU: What’s the Difference Between the Two and Their Functions?

When building a new gaming rig, there are a number of important questions you’ll have to address. What’s the best case for optimal airflow? How much RAM is enough? What’s a good monitor size, and what panel tech is ideal? However, these are all minor quibbles compared to the two big purchase decisions you need to make: your CPU and your GPU. What roles do these two components play? Why are they so important? And which, if either, has a bigger impact on your gaming experience? Let’s take a look.

Introduction and History: CPUs vs GPUs

It’s a good idea to first cover the history of CPUs and GPUs in the context of gaming. Right up until the late 1990s, there was no such thing as a GPU. The term was, in fact, coined by Nvidia in 1999 to promote the GeForce 256, which it marketed as the first dedicated graphics processing unit. Before hardware 3D accelerators arrived, graphics rendering was handled entirely in software, on the CPU. For the first few decades of gaming, the CPU handled virtually every function of a game, from AI to game logic to audio to rendering the actual visuals.


In the mid-90s, with the arrival of Quake, Descent, and a number of other advanced, fully three-dimensional games, computational requirements shot up dramatically. At the same time, commercial 3D rendering tools, from Maya to AutoCAD, were expanding their market presence.

CPUs, which were single-core components at the time, simply did not have the resources to juggle game logic and rendering simultaneously. Quake’s software renderer is a case in point: even powerful CPUs of the era, like the Pentium II, struggled to run the game at 640×480 at frame rates above 30 FPS.

World’s First GPU: NVIDIA GeForce 256

To address this issue, 3dfx, Nvidia, ATI, Matrox, and others built hardware graphics accelerators as add-on cards for PCs. These early graphics chips were built in a fundamentally different way from the CPUs of the time. Unlike single-core CPUs, they were massively parallel, with a large number of fixed-function processing units that were very good at the specific math operations used for rendering, but little else. APIs (application programming interfaces) such as DirectX, Glide, and OpenGL were built to let CPUs talk to these add-on accelerators and leverage their capabilities in supported software.

From this, it’s evident that GPUs and CPUs have historically played different roles in gaming: GPUs handled visual rendering while CPUs worked on world-building. In short, the CPU tells the GPU, via draw calls, what to render and where to render it, and the GPU does the actual work of turning those polygons into pixels.
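
To make that division of labor concrete, here’s a minimal sketch of a per-frame render loop using OpenGL, one of the APIs mentioned above. The GL calls are real, but everything else (the Mesh struct, shader compilation, window and context setup) is assumed to happen elsewhere, and the names are illustrative:

```cpp
#include <glad/glad.h>  // GL function loader; window/context setup (e.g., GLFW) assumed elsewhere
#include <vector>

// Hypothetical per-object data prepared at load time.
struct Mesh {
    GLuint  vao;         // vertex array object holding this mesh's geometry buffers
    GLsizei indexCount;  // number of indices to draw
    float   mvp[16];     // model-view-projection matrix placing the object on screen
};

// One frame of rendering: the CPU walks the scene and issues one draw call
// per object; the GPU does the actual work of turning polygons into pixels.
void renderFrame(GLuint shaderProgram, GLint mvpLocation,
                 const std::vector<Mesh>& sceneObjects) {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glUseProgram(shaderProgram);  // select the shaders the GPU will run
    for (const Mesh& mesh : sceneObjects) {
        glUniformMatrix4fv(mvpLocation, 1, GL_FALSE, mesh.mvp);  // where to render it
        glBindVertexArray(mesh.vao);                             // what geometry to use
        glDrawElements(GL_TRIANGLES, mesh.indexCount,
                       GL_UNSIGNED_INT, nullptr);                // the draw call itself
    }
    // Presenting the finished frame (the buffer swap) is handled by the windowing layer.
}
```

Note that the CPU’s job here is orchestration: scene traversal, state setup, and draw submission. The heavy per-pixel math all happens on the GPU.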

With the advent of unified shader architectures, though, GPUs became substantially more versatile. GPU capabilities were no longer limited to fixed-function rendering tasks. Instead, general-purpose shader cores could be leveraged anywhere massively parallel, compute-intensive hardware would be of use.
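
Here’s what “general-purpose” looks like in practice: with a modern API, the same shader cores that rasterize triangles can be dispatched over arbitrary buffers of data. Below is a minimal sketch using an OpenGL compute shader (available since OpenGL 4.3) that simply doubles every element of a buffer; shader compilation and buffer creation are assumed to happen elsewhere, and the names are illustrative:

```cpp
#include <glad/glad.h>  // GL function loader; an OpenGL 4.3+ context is assumed

// A tiny GLSL compute shader: each GPU thread doubles one array element.
// Nothing here is graphics-specific; the shader cores are just doing math.
const char* kDoubleKernelSrc = R"(
    #version 430
    layout(local_size_x = 64) in;
    layout(std430, binding = 0) buffer Data { float values[]; };
    void main() {
        values[gl_GlobalInvocationID.x] *= 2.0;
    }
)";

// Dispatch the kernel over a shader storage buffer holding the data.
// Assumes `program` is the compiled/linked compute shader above, and that
// `elementCount` is a multiple of the 64-thread workgroup size, for simplicity.
void runDoubleKernel(GLuint program, GLuint ssbo, GLuint elementCount) {
    glUseProgram(program);
    glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 0, ssbo);
    glDispatchCompute(elementCount / 64, 1, 1);      // one GPU thread per element
    glMemoryBarrier(GL_SHADER_STORAGE_BARRIER_BIT);  // make writes visible before readback
}
```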

GPGPUs, Accelerators and HPC

There was an issue, though. GPUs tend to rely on single-precision, 32-bit floating-point math for rendering operations. While this is fine for graphics, computational workloads such as physics modeling, scientific simulation, and high-accuracy financial computation require far greater precision. Workstation and HPC-centric GPUs arrived on the scene to fill this gap: they offered support for high-precision (64-bit, double-precision) math, accelerating certain types of calculations by orders of magnitude compared to CPUs.
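
The gap is easy to demonstrate with nothing more than standard C++: accumulate ten million small steps in single and double precision and compare. The exact answer is 1,000,000, and the float result drifts visibly because FP32 carries only about 7 significant decimal digits, while FP64 carries 15 to 16:

```cpp
#include <cstdio>

int main() {
    float  f = 0.0f;
    double d = 0.0;
    // Sum 0.1 ten million times in both precisions.
    for (int i = 0; i < 10'000'000; ++i) {
        f += 0.1f;
        d += 0.1;
    }
    std::printf("float  sum: %.2f\n", f);  // visibly off from 1,000,000
    std::printf("double sum: %.2f\n", d);  // correct to within rounding error
    return 0;
}
```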

Outside of the HPC scene, GPGPU (general-purpose GPU) techniques caught on in the gaming space, too. Many tasks that were traditionally CPU-bound, such as AI, physics, and game logic, could be offloaded to GPUs. This was especially important early in the eighth console generation, as Sony and Microsoft paired frankly terrible CPUs with midrange 2012-era GPUs in the PS4 and Xbox One. GPGPU was extensively utilized in titles like Horizon Zero Dawn to offset the consoles’ lack of CPU muscle.

As we can see, there’s been a considerable amount of convergence between CPUs and GPUs. GPUs have become more adept at general-purpose workloads, while CPUs have gone wider, with 6- and 8-core designs now the norm. Even so, the CPU and GPU do not play an equal role in gaming workloads. So what is the right CPU to buy for gaming? And what is the right GPU?

How to Select the Right CPU and GPU for Your PC

When selecting the right GPU and the right CPU, you need a clear idea of your gaming goals. Are you looking, for instance, to run AAA games at the highest possible resolution with the settings cranked up? Or do you want to run competitive eSports titles at high framerates? Do you have a standard 60 Hz monitor, or are you looking to leverage the high refresh rate of a 144 Hz panel?

If you want to enable the highest quality settings at the highest resolutions, the GPU needs to be your purchase priority. Because games are still built around those wimpy eighth-gen console CPUs, the vast majority of entry-level to midrange processors available today allow for a 60 FPS experience, assuming you have the requisite GPU grunt. If a maxed-out, high-res 60 FPS experience is all you’re after, the Ryzen 5 3600 is a great option for the here and now, and for the next couple of years. In our benchmark tests, the 3600 delivers framerates well above 100 FPS in titles like Shadow of the Tomb Raider, Deus Ex: Mankind Divided, and The Division 2.

Moreover, the presence of 6 cores and 12 threads means you get a measure of future-proofing as games become increasingly multithreaded. If you’re short on cash and really just want 60 FPS right now, the Ryzen 5 1600 AF is available for as little as $85. You get 6 previous-gen Zen+ cores running at 3.7 GHz. Frametimes and minimums will take a hit relative to 3rd-gen Ryzen, but this is a far more future-proof part than anything else that currently retails for under $100.

Gaming @ 4K 60Hz vs 1440p 144Hz

You’ll want to use those cost savings to make a better GPU purchase. For 4K gaming, the only real no-compromises option is the NVIDIA GeForce RTX 2080 Ti; no other card consistently hits 60 FPS in AAA titles. If you’re fine with dialing down the quality settings and dropping the resolution scale to 0.8, the RTX 2070 Super and RX 5700 XT are solid alternatives. Pair either with a midrange CPU and you’ll have an excellent 60 FPS experience.

What if you’re looking to maximize framerates, though? High-framerate gaming is tricky because it demands lots of both CPU and GPU power. The only GPU even remotely capable of driving a high-refresh 4K monitor today is, again, the RTX 2080 Ti.
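
The arithmetic behind those demands is worth spelling out: a 60 Hz target gives the CPU and GPU a budget of 1000/60 ≈ 16.7 ms to prepare and render each frame, while 144 Hz shrinks that budget to 1000/144 ≈ 6.9 ms. Every millisecond of simulation, draw-call submission, and rendering has to fit inside that window, every single frame, which is why high-refresh gaming stresses both components at once.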

If you’re looking to game at 1440p 144 Hz, though, your options open up considerably on the GPU front, and even more so at 1080p 144 Hz. For 144 Hz 1440p gaming, the RTX 2060 Super and RX 5700 are the bare minimum. These cards have a large 8 GB framebuffer to accommodate 1440p, and they also have the raw pixel-pushing power to deliver framerates well above 60 FPS. At 1080p, the RX 5600 XT and RTX 2060 are enough for the eSports crowd.

If you want to make full use of a 144 Hz 1440p panel in AAA titles, though, RTX 2080 Super-class hardware or better is a must; eSports titles like Overwatch are far less demanding.

You’ll need a stronger CPU, too. The Ryzen 5 3600 is a decent starting point, but it can choke at ultra-high framerates. From Team Red, look towards the Ryzen 7 parts to power 144 Hz experiences. If you want an Intel part instead, the Core i7-9700K is your best bet, though it’ll cost a pretty penny.

As you can see, there is no easy answer to whether the CPU or GPU is more important in your rig. If you’re aiming to maximize quality at a relatively lower framerate, get yourself the best possible GPU and pair it with a CPU that won’t bottleneck you at your target framerate. If you want to play at the highest possible refresh rates, though, you’ll need to invest in a high-end CPU, too, with eight or more cores and a high clock speed.

Arjun

Penguin-published author and journalist. Loves PC hardware but has terrible hand-eye coordination. Most likely to be found playing Total War or watching weird Russian sitcoms.
