Something is up with Nvidia's drivers or how the game scales. Older architectures falling behind I could maybe understand, but a 590 beating a 2060!?
I question how much value 'future proofing' really has, though, when the 'Ultra' settings on offer are ones people have to strain to notice in still shots (if at all), yet they cause such a huge performance hit. When Ultra means something like ray tracing or massively improved texture quality/lighting, sure. But slightly more accurate fog? That's not going to make this game look next-gen five years from now when midrange cards can run it at 4K 60fps (if even). All it does is put the onus on the user, who, with a modern system, is justified in immediately slamming all the sliders to the right out of habit. Maybe I'm being overly cynical, but it often seems like the 'Ultra' setting exists so publishers/developers can say "See, PC gamers, we hear you!" when very little optimization has gone into actually making it worthwhile; it's just a notch on the slider so us PC gamers can feel like we're being catered to.
Even taking that into account and ignoring Ultra settings, I think the performance overall right now seems a little subpar. Remember how well GTA5 was optimized for PC; it just blew away the console versions in performance and graphics. In most multiplatform titles, a 590/580 can usually come close to or match (and in some cases exceed) the Xbox One X version running at the same quality settings and resolution. Yet as of now, it doesn't look like you're getting a locked 30fps even on medium (?) with those cards at native 4K like the X is doing. Not even with a 1660 Ti, which in my experience easily outpaces the X in virtually every multiplatform title I've seen; hell, from the benchmarks I'm seeing, it would be lucky to break 20fps at 4K/medium.