In a new series of features - 2020 Vision - Alex Battaglia returns to genuine PC classics, re-evaluating their technology and seeing how well these games perform on modern, mainstream PC hardware. In this initial piece, Alex revisits the astonishing The Witcher 2: Assassins of Kings. Think you can run it maxed at 1080p60 on a mainstream gaming PC? Well, let's talk cinematic depth of field and ubersampling...
will try to do a summary if necessary (and if I'm not still playing the Temtem stress test)
- The Witcher 2 came at a time when consoles were hitting their peak and PC was treated as a secondary platform
- minor environmental elements have high geometric detail for the time
- NPCs use the same models in cutscenes and in gameplay on the highest settings
- NPCs have texture resolutions and polycounts on par with main characters
- Ultra texture settings use 600MB of VRAM alone
- DX9 game, lacked multi-threading, lots of pop-in in the environment
- deferred rendering, lots of dynamic lights with realtime shadows
- GI faked via point lights
- AI and Quests affected by the realtime time of day
- the GTX 480 performed poorly in this game at launch
- point light shadows are very low res
- 580/1060 can't max the game at 1080p/60 (with ubersampling)
- 580 has higher framerate, 1060 more stable
- a 2080 Ti drops under 60fps at 4K with ubersampling (effectively an 8K render)
- cinematic depth of field is "insane" (direct words from the person who made it)
- cinematic DoF chokes even a 2080 Ti like crazy
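On the deferred rendering point: the appeal of deferred shading for a game like this is that lots of dynamic lights become affordable, because lighting cost scales with pixels × lights rather than geometry × lights. A minimal sketch of the idea, assuming simple Lambertian point lights with inverse-square falloff and no shadows (the game's actual shading is of course far more involved):

```python
import math

def geometry_pass(scene):
    """Rasterize once: store position, normal, albedo per pixel (the G-buffer)."""
    return [{"pos": p, "normal": n, "albedo": a} for (p, n, a) in scene]

def lighting_pass(gbuffer, lights):
    """Accumulate each point light's Lambertian contribution per G-buffer pixel."""
    out = []
    for px in gbuffer:
        color = 0.0
        for light in lights:
            to_light = [light["pos"][i] - px["pos"][i] for i in range(3)]
            dist = math.sqrt(sum(c * c for c in to_light)) or 1e-6
            ldir = [c / dist for c in to_light]
            ndotl = max(0.0, sum(px["normal"][i] * ldir[i] for i in range(3)))
            atten = light["intensity"] / (1.0 + dist * dist)  # inverse-square falloff
            color += px["albedo"] * ndotl * atten
        out.append(color)
    return out

# One pixel facing +Z, one point light 2 units straight ahead.
scene = [((0, 0, 0), (0, 0, 1), 0.8)]
lights = [{"pos": (0, 0, 2), "intensity": 10.0}]
print(lighting_pass(geometry_pass(scene), lights))  # [1.6]
```

The same structure also explains the "GI faked via point lights" bullet: once the lighting pass just loops over a light list, artists can scatter cheap bounce-approximating point lights around a scene.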
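Ubersampling is essentially supersampling: render the frame at a multiple of the target resolution, then filter each N×N block down to one output pixel, which is why 4K plus ubersampling behaves like an ~8K render. A sketch of the downsample step, assuming a plain box filter (the game's exact filter isn't documented in these notes):

```python
def downsample(hi_res, factor):
    """Average factor x factor blocks of a 2D list of luminance values."""
    h, w = len(hi_res), len(hi_res[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [hi_res[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A 4x4 "render" downsampled 2x to a 2x2 output; note the shading cost
# already grew with factor squared before this cheap averaging step.
frame = [[0, 4, 8, 8],
         [4, 0, 8, 8],
         [1, 1, 2, 2],
         [1, 1, 2, 2]]
print(downsample(frame, 2))  # [[2.0, 8.0], [1.0, 2.0]]
```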
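As for why a "cinematic" depth of field chokes high-end GPUs: the per-pixel blur radius comes from the circle of confusion, and gather-style DoF cost grows roughly with the square of that radius. A minimal sketch using the standard thin-lens formula (the camera parameters below are illustrative assumptions, not the game's values):

```python
def circle_of_confusion(depth, focus_dist, focal_len, aperture):
    """Thin-lens CoC diameter; all distances in the same unit (e.g. metres)."""
    return abs(aperture * focal_len * (depth - focus_dist) /
               (depth * (focus_dist - focal_len)))

# Wide aperture, focused at 2 m: the subject is sharp, but a background
# object at 20 m gets a large CoC, so each of its pixels must gather many
# neighbours -> cost grows quadratically with blur radius.
print(circle_of_confusion(depth=2.0,  focus_dist=2.0, focal_len=0.05, aperture=0.025))  # 0.0
print(circle_of_confusion(depth=20.0, focus_dist=2.0, focal_len=0.05, aperture=0.025))
```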