In a gaming PC, the monitor is easily the single most important component. No matter how expensive your fancy GPU may be, as long as you don’t have a decent monitor, the output will look sub-par. While picking a display for your PC, there are some key points that need to be kept in mind. While it’s best to go to a physical store when buying a monitor, if that’s not an option for you, this guide will come in handy.
The screen resolution and refresh rate may seem like the only specs that matter (and they do matter), but there are other factors that are just as important. Many gamers ignore them, and the results can range from washed-out colors to poor viewing angles and even outright incompatibility.
This is an obvious point, but we can’t overstate how important it is to get a monitor with a screen size that works for you. I currently use a 55″ 4K TV mounted on my desk as a monitor. Will that work for everyone? Absolutely not. Might it work for you? Maybe. As a rule of thumb, don’t look at monitors below 24 inches: below that size, the display won’t fill your field of view even if you’re sitting very close. Monitors in the 27″ to 32″ range are a great option for desks: you can sit close enough, and they tend to be large enough to deliver an immersive experience. A couple of vendors like Philips offer monitors (not TVs) with larger screen sizes, such as the 43″ Philips Momentum.
These are considerably larger than your typical desktop monitor, so make sure you have enough space on your desk before making the purchase. Do you have room for a large monitor as well as your peripherals? Are you comfortable with a large screen? Do you tend to sit up close or far away? Consider these factors when looking at monitor size. Another thing to note is that, at a fixed resolution, pixel density drops as screen size increases. This means that 1080p, for instance, looks sharper on a 24″ monitor than on a 27″ one. Higher resolutions look better on larger screens, but higher-resolution panels add to the cost.
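To put numbers on that, pixel density (PPI, pixels per inch) follows directly from the resolution and the diagonal size. A quick sketch (the sizes and resolutions here are just illustrative examples):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: pixel diagonal divided by physical diagonal."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

# The same 1080p image is noticeably denser on a 24" panel than a 27" one,
# while 4K at 27" is roughly double the density of either:
print(round(ppi(1920, 1080, 24)))  # ~92 PPI
print(round(ppi(1920, 1080, 27)))  # ~82 PPI
print(round(ppi(3840, 2160, 27)))  # ~163 PPI
```

This is why the same resolution can look crisp on one screen size and soft on another.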
What resolution works for you? The higher the resolution, the sharper the image will be at a given screen size. High-resolution 1440p and 4K monitors can have a significant impact on visual fidelity, especially texture detail. The drawback, of course, is that price generally scales with resolution: a good 4K monitor can cost as much as a high-end CPU.
Another factor is that it takes significantly more GPU power to run high-resolution displays at their native resolutions: delivering a 4K image at 60 FPS takes four times the pixel throughput of a 1080p image at 60 FPS. In most cases, though, the performance penalty is well worth it. The difference between 1080p and 4K is clear as day: sharper textures, more detail, notably less aliasing, and more vibrant colors.
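That four-times figure falls straight out of the pixel counts; a rough illustration:

```python
# Pixel throughput scales linearly with pixel count at a fixed framerate.
def pixels_per_second(width, height, fps):
    return width * height * fps

fhd = pixels_per_second(1920, 1080, 60)  # 1080p at 60 FPS
uhd = pixels_per_second(3840, 2160, 60)  # 4K at 60 FPS
print(uhd // fhd)  # 4 -- 4K pushes exactly 4x the pixels of 1080p
```

Real-world GPU scaling isn't perfectly linear (shaders, memory bandwidth, and CPU limits all play a part), but pixel count is a decent first approximation of the extra load.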
If you want to make the most of a high-res monitor, you’ll need capable hardware. If you’re planning on using a 4K panel, you’ll need a GeForce RTX 2070 Super at the very least, and an RTX 2080 Super (or better yet, the RTX 2080 Ti) for the ideal experience. For 1440p, the Radeon RX 5700 XT or, again, the RTX 2070 Super is ideal. 1080p (FHD) works well with a large number of mid-range, entry-level, and previous-gen graphics cards.
An important point to note here is that the more GPU-bound your gaming load gets, the less important CPU power becomes. This means that if you plan on gaming at higher resolutions and lower framerates, you might be able to save cash by investing in a cheaper CPU. Keep in mind, though, that CPU power doesn’t just impact framerates, but frame times as well. Even at your target framerate, a weaker CPU could offer worse frame pacing for a perceptually worse experience.
The higher the refresh rate, the smoother movement appears onscreen. Most monitors run at a standard 60 Hz refresh rate, which means the image onscreen is refreshed up to 60 times a second. 60 Hz is plenty smooth in most cases. However, if you’re an eSports gamer or just really like the feel of high refresh rate gaming, you’ll want to opt for a higher refresh rate panel at 120 Hz, 144 Hz, or above.
It’s important to note that you need to hit high framerates to make good use of a high refresh rate panel, and the higher the framerate, the harder your CPU and GPU have to work. To game at 144 Hz, you’ll need a powerful processor and graphics card. Manufacturers also tend to treat high refresh rates as a luxury value-add, which means these panels almost always cost more than comparable 60 Hz ones.
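The flip side of a higher refresh rate is a shrinking time budget: your CPU and GPU must finish every frame inside one refresh interval. A quick sketch of the math:

```python
def frame_budget_ms(refresh_hz):
    """Milliseconds available to render each frame at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 144, 240):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.2f} ms per frame")
# 60 Hz  -> 16.67 ms; 144 Hz -> 6.94 ms; 240 Hz -> 4.17 ms
```

Going from 60 Hz to 144 Hz cuts the per-frame budget by more than half, which is why high refresh rate gaming demands strong hardware.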
A lot of the time, manufacturers also opt to use lower quality TN panels on their high refresh rate monitors. These have significantly worse contrast and black levels than IPS and VA panels. If you’re buying a high refresh rate monitor, make sure to avoid TN panels unless you really don’t have a choice budget-wise.
Many panels are also advertised as featuring variable refresh rate technology (e.g. FreeSync and G-Sync). These are great add-ons. A variable refresh rate monitor can sync its refresh rate with the framerate your GPU outputs, which means you won’t have to deal with stutter and screen tearing. While variable refresh rate tends to carry a premium, many entry-level monitors now sport this feature. Keep an eye out for it.
Closely related to the refresh rate is the monitor’s response time, typically measured in milliseconds. This is how long it takes a pixel to change color (typically measured grey-to-grey). A low response time, in the sub-4 ms range, is ideal. A slow response time means pixels are still mid-transition when the next frame arrives, which shows up as ghosting and smearing in fast motion.
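As a rough illustration (the numbers are illustrative, not measurements of any real panel), you can estimate what fraction of each frame interval a pixel spends mid-transition, which is why response time matters more as refresh rates climb:

```python
def transition_share(response_ms, refresh_hz):
    """Fraction of one frame interval spent on a grey-to-grey transition."""
    frame_ms = 1000.0 / refresh_hz
    return response_ms / frame_ms

# A 4 ms panel is fine at 60 Hz but spends over half of every
# 144 Hz frame still transitioning; a 1 ms panel stays comfortable:
print(f"{transition_share(4, 60):.0%}")   # 24%
print(f"{transition_share(4, 144):.0%}")  # 58%
print(f"{transition_share(1, 144):.0%}")  # 14%
```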
We’ve covered monitor panel types in detail before. Long story short: TN panels cost the least, but deliver the worst image quality. IPS panels offer the best all-round package of viewing angles, response times, and color accuracy, while costing more. VA panels offer very high contrast and rich colors, but they tend to have slow response times. A general rule of thumb is to look for the best IPS panel you can find in a given price range. If you can’t find an IPS panel that meets your needs, look to VA panels with faster response times.
The type of panel you pick isn’t the only determinant in image quality. Your monitor’s color gamut and contrast ratio are very important measures to look at. The color gamut refers to the range of colors that a monitor can display. Conventional 8-bit panels can display around 16 million colors.
Color accuracy refers to how closely the colors your monitor displays line up with standards such as sRGB, Adobe RGB, and DCI-P3. For this, good color calibration is a must, so that your monitor not only shows 16 million colors but shows the right 16 million colors. Any decent monitor should offer 99 percent or better sRGB coverage. Professional monitors should cover a good part of the wider gamuts; if possible, look for monitors with 90 percent or greater Adobe RGB coverage.
Compared to the wider gamuts, sRGB has limited green coverage, and its light blue coverage isn’t the best either. NTSC and DCI-P3 are much wider. A wide-gamut monitor (even one at 95% DCI-P3) will be better than an sRGB-only monitor. Adobe RGB is traditionally considered the best, but these displays tend to be on the expensive side. You can, however, grab a decently priced one during sales or rebates.
What about the contrast ratio? The contrast ratio is the ratio between the brightest and dimmest light output a monitor can manage. Because of the way the tech works, TN panels tend to have the worst contrast, with sub-1000:1 ratios the norm. Look for a 1000:1 contrast ratio at minimum; most decent IPS panels are in this range. VA panels, for all their other trade-offs, have excellent contrast: it’s not uncommon for even budget VA panels to hit 3000:1 or higher. At a given budget, get the most color-accurate, highest-contrast monitor available.
HDR and SDR
HDR stands for high dynamic range. It refers both to a monitor’s ability to display a wide range of colors (10-bit color, meaning over 1 billion shades onscreen) and its ability to dynamically deliver a wide range of brightness values (up to 1000 nits). SDR (standard dynamic range) monitors offer 8-bit color, with around 16 million shades available, and typically top out around 400 nits of brightness.
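Those shade counts come straight from the bit depths; a quick check:

```python
# Total displayable colors for a given bit depth per channel (R, G, B).
def total_colors(bits_per_channel):
    shades_per_channel = 2 ** bits_per_channel
    return shades_per_channel ** 3

print(f"{total_colors(8):,}")   # 16,777,216   (~16 million, 8-bit SDR)
print(f"{total_colors(10):,}")  # 1,073,741,824 (~1 billion, 10-bit HDR)
```

Each extra bit per channel multiplies the total palette by eight, which is why 10-bit panels can render gradients without the visible banding you sometimes see on 8-bit ones.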
Personal opinion: everyone should at least try an HDR monitor. Even baseline HDR10 is a big step up from SDR: the colors are much richer, and the contrast and gamma are notably better. It’s like 4K; once you get used to it, you won’t go back.
HDR panels dynamically adjust brightness based on the scene to mimic real-life brightness levels. An important point here is that lower HDR tiers (such as DisplayHDR 400) might actually look worse than SDR on certain monitors, especially those without local dimming. What’s local dimming? It’s the use of small LED zones that adjust brightness in local areas of the display instead of brightening or dimming the entire screen.
This is something people tend to overlook. It’s critical to have a monitor that accepts what your GPU outputs. If your GPU outputs over DVI-D and your monitor doesn’t have a DVI-D input, you’re out of luck (unless of course, you have a different cable lying around). Make sure that your monitor has display inputs that work with your GPU.
There are tons of factors to consider when buying a monitor, and we’ve barely scratched the surface here. But as a rule of thumb, look for the highest-resolution IPS panel you can find, at a screen size that’s right for you, with variable refresh rate technology. If that combination isn’t available, consider which trade-offs are easiest for you to make.