Variable refresh rate, also known as Adaptive Sync, is one of the most important technologies in modern gaming. Both NVIDIA and AMD have their own implementations of adaptive sync. Although they accomplish the same thing (synchronizing your monitor's refresh rate to the game's frame rate for smoother gameplay), the way they work is quite different.
More importantly, which one is superior for gamers: G-Sync or FreeSync? Should you buy a FreeSync monitor or one that supports NVIDIA’s G-Sync technology?
Variable refresh rate, in the form of FreeSync or G-Sync, is used to tackle screen tearing and, in some cases, the input lag that can have a decisive impact in competitive gaming. Screen tearing occurs when parts of multiple frames are displayed simultaneously within a single refresh.
The image looks as though the frame has been stretched and torn into parts, hence the name “tearing”. Input lag is rather straightforward: you press a key and there’s a delay before you see the result. These two problems, although they may not seem like much of an issue, can be the difference between victory and defeat in eSports and competitive titles.
What is Adaptive-Sync (Variable Refresh Rate)?
Traditional monitors come with a fixed refresh rate, most commonly 60 Hz. This is the rate at which the monitor redraws the screen before displaying the next frame, and it tells you how many frames your monitor can display per second without tearing. However, as I’m sure you already know, games don’t run at a fixed frame rate. Sometimes the GPU ends up rendering more frames than your monitor can display, sometimes fewer. The former results in screen tearing, the latter in stutter.
The easiest way to tackle this is to either turn down some of the more intensive graphics options or enable V-Sync. However, the latter can cause input lag, as it essentially caps the frame rate, generally to the refresh rate, but in some cases to half of it.
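To make that capping behavior concrete, here’s a tiny Python sketch of an idealized double-buffered V-Sync model (the function name is illustrative, not any real driver API): when a frame misses the refresh deadline, it waits for the next refresh, so the effective frame rate snaps down to an integer divisor of the refresh rate.

```python
def vsync_effective_fps(render_fps, refresh_hz):
    """Effective FPS under double-buffered V-Sync (idealized model)."""
    if render_fps >= refresh_hz:
        return refresh_hz              # capped at the refresh rate
    # A late frame waits for the next refresh, so FPS snaps to
    # refresh_hz / 2, refresh_hz / 3, ... whichever it can sustain.
    divisor = 2
    while refresh_hz / divisor > render_fps:
        divisor += 1
    return refresh_hz / divisor

print(vsync_effective_fps(120, 60))    # capped to 60
print(vsync_effective_fps(55, 60))     # snaps to 30.0, not 55
```

This is why V-Sync feels so punishing: a game that dips just below 60 FPS on a 60 Hz panel gets knocked all the way down to 30 FPS.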
What G-Sync and FreeSync compatible monitors do is vary the refresh rate of the monitor according to your game’s performance. Say you are running a game at 50 FPS. If your monitor supports adaptive sync, it’ll scale its refresh rate down to 50 Hz for smoother gameplay. On the flip side, if you’re getting more frames per second than your monitor can display, this technology can’t help you.
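As a rough mental model, that behavior can be sketched in a few lines of Python (the function name and the example VRR range are illustrative assumptions, not any vendor’s API):

```python
def adaptive_refresh(fps, vrr_min, vrr_max):
    """Refresh rate an adaptive-sync monitor would pick (idealized)."""
    if fps > vrr_max:
        return vrr_max   # above the range: VRR can't help you here
    if fps < vrr_min:
        return vrr_min   # below the range: stutter returns
    return fps           # in range: refresh tracks the frame rate

print(adaptive_refresh(50, 48, 144))   # 50 -> panel refreshes at 50 Hz
```

The interesting case is the middle one: inside the supported range, every rendered frame is shown exactly once, which is what eliminates tearing.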
AMD FreeSync vs NVIDIA G-Sync
One of the main differences between FreeSync and G-Sync is that the former, like many of AMD’s technologies, is open and royalty-free. It leverages the VESA Adaptive-Sync standard that comes with DisplayPort 1.2a. As such, there are no fees or royalties to be paid to implement FreeSync, allowing OEMs to integrate it into even the cheapest of monitors. The lower-end FreeSync models cost less than $120.
G-Sync, on the other hand, is NVIDIA’s own proprietary technology. Originally there was only one version, which required a dedicated hardware module from NVIDIA inside the monitor. More recently, NVIDIA embraced the VESA standard, dubbing monitors that support it “G-Sync Compatible”. G-Sync Compatible displays are essentially FreeSync monitors vetted and tested by NVIDIA. All that testing requires manpower and resources, so even the third-tier G-Sync Compatible monitors are fairly expensive: the cheapest ones cost almost twice as much as some of the entry-level FreeSync monitors.
When it comes to quality, G-Sync monitors take the cake. Of course, they often cost more than many modern graphics cards, and that’s the drawback. Low-end FreeSync monitors “get the job done”. That’s not to say they are inferior per se, but most of the cheap ones only support adaptive sync between 48 Hz and 75 Hz. If your frame rate drops below 48 FPS, you’re back to stuttering and tearing.
AMD has come up with something called Low Framerate Compensation (LFC) to deal with this, but only the higher-end models support it. When the frame rate falls below the monitor’s minimum, LFC displays each frame multiple times so that the effective refresh rate stays within the supported range. Say you’re getting 25 FPS in a game and your monitor’s FreeSync range starts at 50 Hz: LFC will display each frame twice, keeping the monitor refreshing at 50 Hz.
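The arithmetic behind LFC is simple frame multiplication, sketched here in Python (an illustrative model of the technique, not AMD’s actual driver logic):

```python
def lfc_refresh(fps, vrr_min, vrr_max):
    """Return (multiplier, effective refresh rate) under a simple LFC model."""
    if fps >= vrr_min:
        return 1, fps                      # already in range, no duplication
    # Show each frame 2x, 3x, ... until the refresh rate re-enters the range.
    mult = 2
    while fps * mult < vrr_min:
        mult += 1
    return mult, min(fps * mult, vrr_max)

print(lfc_refresh(25, 48, 144))   # (2, 50): each frame shown twice, 50 Hz
print(lfc_refresh(15, 48, 144))   # (4, 60): each frame shown four times
```

Note that duplicated frames don’t add any new information; LFC only keeps the panel inside its variable-refresh window so tearing and stutter don’t return at low frame rates.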
Looking at G-Sync displays, there are three tiers: G-Sync Compatible, G-Sync, and G-Sync Ultimate.
- As already mentioned, G-Sync Compatible is NVIDIA’s FreeSync equivalent, but keep in mind that these are on par with the more expensive FreeSync models, the ones that come with LFC. NVIDIA claims that these monitors undergo several dozen validation tests, and that vetting is their main advantage.
- G-Sync is the good old proprietary variant that NVIDIA charges a premium for. These monitors, although they feature the best implementation of adaptive sync, cost more than a pretty penny: well over $500, and they’re often some of the top-spec monitors on the market. As such, if you are buying a G-Sync monitor, you can be sure you’re getting a top-end display.
- Lastly, there are FreeSync 2 and G-Sync Ultimate. These come with advanced features such as HDR and LFC.
AMD has another advantage when it comes to connectivity. Traditional G-Sync monitors only work over DisplayPort. Many G-Sync Compatible monitors support HDMI as well, but the more expensive ones are largely limited to DisplayPort. Both FreeSync and FreeSync 2 monitors come with HDMI as well as DisplayPort support, providing more versatile connectivity options.
So there you have it: G-Sync vs FreeSync. The main difference used to be that FreeSync was for the masses (not the best, but affordable), while G-Sync was largely limited to enthusiasts with deep pockets. The differences are more subtle now. FreeSync 2 is much improved, and older FreeSync screens with LFC are mostly on par with G-Sync Compatible models, though they often can’t be had for cheap either. With NVIDIA’s adoption of the VESA standard, the playing field has mostly leveled. However, you can still find FreeSync monitors much cheaper than rival G-Sync screens. They’re not the best, but considering the dirt-cheap prices, they offer more than you’d expect.