
Don’t Panic Over Cyberpunk 2077 Specifications Just Yet: Revisiting the Witcher 3 Launch

These days many outlets have started posting “expected” system requirements for CD Projekt Red’s upcoming RPG, Cyberpunk 2077. These specs are usually quite intimidating, with CPU recommendations as high as the Core i7-10700K and GPU recommendations topping out at NVIDIA’s flagships, the RTX 2080 Super and the RTX 2080 Ti. Let me make it clear: don’t fall for these reports. They are plainly misleading.

Let’s revisit the launch of the masterpiece that skyrocketed CDPR’s popularity, The Witcher 3. If you look at the initial benchmarks, you’ll see that it was impossible to run the game at ultra settings with reasonable frame rates:

[Benchmark chart: The Witcher 3 performance at ultra settings. Source: Techspot]

Even the then top-end GTX 980 failed to hit 30 FPS at 4K ultra. However, turning the quality settings down to medium or high made things much less scary:

[Benchmark chart: The Witcher 3 performance at medium/high settings. Source: Techspot]

The GTX 980 was able to hit a meaty 100 FPS at 1080p medium and 30 FPS at 4K. The later-launched GTX 980 Ti netted around 40 FPS, and nearly 60 FPS at ultra with 2-way SLI. Gradually, with patches and the launch of the 10-series Pascal lineup, even the GTX 1080 could run The Witcher 3 at 4K ultra with reasonable frame rates, while the 1080 Ti delivered a silky-smooth 60 FPS experience.

I expect a similar scenario with Cyberpunk 2077. The GeForce RTX 2070 Super and RTX 2080 won’t be able to max out the game at 1080p ultra with ray tracing, even with DLSS 2.0 turned on. The RTX 2080 Ti should manage it, but that’s hardly a card you’d buy for 1080p gaming. Turning ray tracing off and dialing the raster settings down to medium-high should, however, let most graphics cards run the game at reasonable frame rates (50-ish).

The upcoming Ampere (RTX 3000) and AMD Navi 2x GPUs should be able to handle the game just fine, though, even with the ray-tracing settings cranked up to ultra. That’s one of the reasons the launch has been pushed to November: otherwise, the ray-tracing options would look obsolete on day one, and of course, folks would rise up calling the game unoptimized, similar to what we saw at The Witcher 3’s launch.

As far as the CPU side is concerned, you don’t need Intel’s 10th Gen CPUs at all; they’re basically better-binned 9th Gen chips. The Core i5-9600K is what I’d suggest for optimal performance, and on AMD’s side, the Ryzen 5 3600X. Like The Witcher 3, I expect Cyberpunk to make proper use of multi-core CPUs, but proper optimization paired with DX12’s reduced CPU overhead means the game should also run well on quad-core chips such as the Ryzen 3 3300X. The recently demoed preview further backs this up:

Cyberpunk 2077 Preview Ran @ 1080p 60 FPS w/ DLSS 2.0 with Drops to 30 FPS

Areej

Computer Engineering dropout (3 years), writer, journalist, and amateur poet. I started Techquila while in college to pursue my passion for hardware. Although largely successful, it suffered from many internal weaknesses. I left and am now working on Hardware Times, a site purely dedicated to processor architectures and in-depth benchmarks. That's what we do here at Hardware Times!