> Christ, the sheer amount of graphics options in this game makes my skin crawl.

How so?
> Rich already made a video with RX 580/GTX 1060 btw. IIRC the 580 was very close to X1X considering you can't go as low on all the settings.

Yeah I know, but it was very brief and the final settings weren't confirmed at that point. I generally preferred Alex's videos when he used a 1070 in his system, as I think that was far more in line with what your average PC gamer had, and even if you didn't have one, it was easier to extrapolate.
> I finally got this game working after updating my BIOS and completely redoing all my computer's settings afterwards. With my 2080 Ti I'm able to get it mostly all ultra and still get 4K60, so I'm happy.

What CPU do you have?
Yeah, I get that
> I'm going to use this video as a guide to get the highest settings I can @1080p/60fps on my 9900k/1080 Ti.

Should be easily done tbf.
> As I was saying in the other topic, the texture VRAM requirement doesn't make any sense and needs to be fixed.
> Ultra quality (Xbox One X setting) takes 6-8GB of VRAM... and that's mental, especially if you think that the Xbox One X uses 8GB of shared RAM, so both RAM and VRAM.
> This NEEDS to be fixed by R*.

The X has 12GB of RAM.
I'm holding out for the Steam version. Hopefully, there will be a patch or two by then to iron some of the issues out.
> Christ, the sheer amount of graphics options in this game makes my skin crawl.

I mean, ultimately it's better with more granularity, but yeah, I get it; sometimes I just wanna set a preset and go.
4GB is taken by the system, so it's 8GB left for the game.
> I'm going to use this video as a guide to get the highest settings I can @1080p/60fps on my 9900k/1080 Ti.

I would also check out the Gamers Nexus GPU video. With their High settings the 1080 Ti was getting 97fps at 1080p. You can use that to tweak things up to Ultra from there.
Ok, so I quickly tested the optimized XBX-equivalent settings and got the following rough numbers at 4k with the benchmark (using DSR on my 1080p monitor):
1st scene: 46 fps
2nd scene: 52-58 fps
3rd scene: 52 fps
4th scene: 53 fps
5th scene: 42-55 fps (drops to 42+ on horseback scene. Lowest drop is 32 during final shootout. Interestingly, this final scene had snow in it so not sure if that affected performance)
Min: 33.76 fps
Avg: 45.24 fps
This is on a 1080 Ti (oc'ed a bit), stock 3700x and 32GB ram.
To be honest, I was expecting much higher numbers than that with my PC. My initial expectation was hitting 60 fps with similar console settings at 4k, which I am so far able to achieve with other games (usually with even better fidelity too)
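For a rough sense of how far off that run is from a 4K60 target, here's a quick back-of-the-envelope sketch. It only uses the min/avg figures quoted above; the uplift percentages are derived arithmetic, not measurements from the thread:

```python
# Gap-to-60fps arithmetic for the benchmark run quoted above.
# min/avg come from the post (4K, XBX-equivalent settings on a 1080 Ti);
# the uplift percentages are just derived numbers, not measurements.
target_fps = 60.0
avg_fps = 45.24
min_fps = 33.76

uplift_avg = (target_fps / avg_fps - 1) * 100  # extra perf needed on average
uplift_min = (target_fps / min_fps - 1) * 100  # extra perf needed at the worst dip

print(f"Average {avg_fps} fps -> ~{uplift_avg:.0f}% more performance needed for a 60 fps average")
print(f"Minimum {min_fps} fps -> ~{uplift_min:.0f}% more needed to never dip below 60")
```

That works out to roughly a third more performance on average, and closer to 80% more to hold 60 through the worst drop, which is why matching console-equivalent settings at 4K60 is a stretch on a 1080 Ti here.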
Nvidia doesn't perform well in this game. At this point people shouldn't be surprised anymore. Nvidia cards just won't age well in the big titles as long as both consoles use AMD. The more impressive the game is, the worse Nvidia typically performs.
This could have an explanation.
Nvidia is superior to AMD in video drivers. Vulkan/DX12 let developers bypass most of that driver-level optimization and optimize directly "to the metal".
On the other hand, Nvidia has shown it can deliver much better performance on DX11, because its engineers still have a better knowledge of the hardware's internal behavior.
So when you take that driver-level boost Nvidia still retains out of the equation, things get equalized with AMD.
> I finally got this game working after updating my BIOS and completely redoing all my computer's settings afterwards. With my 2080 Ti I'm able to get it mostly all ultra and still get 4K60, so I'm happy.

Love to see your settings.
> Yeah, I am just now messing with it. If I want 60fps at 4k with my 2080ti, I pretty much have to match the Xbox X settings provided here. Anything more and 60FPS is not happening.

I wanna see where I might be going wrong with trying to maintain 60fps at 1440p with mostly ultra, with my 3700x and 2080ti.
> So which settings could I lower to high from ultra and not really notice a difference but gain framerate?

If only there was a video you could watch to answer this.
Btw everyone, thanks for such kind words. Taking a day off tomorrow, but will reply to everyone here very soon.
The game runs quite poorly on the 1000 series sadly, not sure why.
3GB not 4GB
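To put the numbers from this exchange side by side (12GB total on the One X, roughly 3GB reserved for the system, versus the 6-8GB the Ultra texture setting reportedly wants on PC), here's a minimal sketch; the split is the one quoted in the posts above, not an official breakdown:

```python
# Xbox One X memory budget vs. the quoted Ultra-texture VRAM figures.
# Numbers are the ones used in this exchange: 12GB unified memory,
# ~3GB reserved by the system, Ultra textures said to want 6-8GB on PC.
total_gb = 12
system_gb = 3
game_budget_gb = total_gb - system_gb  # shared between "RAM" and "VRAM" duties

for tex_gb in (6, 8):
    left_gb = game_budget_gb - tex_gb
    print(f"Ultra textures at {tex_gb}GB would leave {left_gb}GB "
          f"of the {game_budget_gb}GB game budget for everything else")
```

Which is why the 6-8GB figure for textures alone looks so out of line with what the console can plausibly be spending.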
> Why are you comparing specs between two different architectures??? (Pascal ≠ Turing)
> And seriously, where do you get your info to think the Ti is faster than a 2080?

Because I can.
And the GTX 1080 Ti, specs wise, is faster in basically every single way. How many times must I repeat myself?
So factual information doesn't make sense to you. Ok :)
> Yeah, I am just now messing with it. If I want 60fps at 4k with my 2080ti, I pretty much have to match the Xbox X settings provided here. Anything more and 60FPS is not happening.

Here goes:
The only mechanism we have to measure the relative power of GPUs is by benchmarking them, and different benchmarks run differently on different architectures. We saw on day 1 that Turing thrived on games that were traditionally said to be AMD-favoring - things like Wolfenstein 2, or Strange Brigade or whatever, all run great on both Turing and GCN.
At the time of the 2080's release, aggregate game benchmarks put it roughly on par with the 1080 Ti - a touch slower depending on which specific games you looked at. The 1080 Ti has slightly more memory bandwidth than the 2080, and it has more overall memory. It has roughly the same FP32 compute. It has more ROPs and CUDA cores, but fewer transistors. Architectural differences are what allow the 2080 to compete with the 1080 Ti despite paper specs where the numbers are higher on the 1080 Ti. Considering the transistor counts, that doesn't necessarily mean it's much more efficient, but it is different.
Unless we get a patch or driver update tomorrow that totally improves performance on Pascal cards, we won't be able to confirm whether the game is designed in a way that heavily utilizes Turing's differences, whether it's some random last-minute glitch, whether these cards simply haven't had much attention from Rockstar, whether Nvidia hasn't put much effort into performance on those cards for this game, or whatever. There are lots of things it could be (possibly multiple things in conjunction).
I will say that this game is a very abnormal data point in terms of the sheer difference between Pascal and Turing. There's a ~25% advantage for the 2080 over the 1080 Ti here, whereas Wolfenstein 2 (a game famous for running much better on Turing, generally the one with the largest difference when the reviews came out) only shows a ~12-15% advantage for the 2080 over the 1080 Ti.
The performance advantage using Vulkan in Wolfenstein 2 is around 35% in favor of the 2080 over the 1080 Ti.
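For reference, here's the paper-spec comparison that argument is about, as a small sketch. The figures are the commonly cited reference-card launch specs for the two GPUs (my addition, not from this thread), so treat them as approximate:

```python
# Approximate reference specs: GTX 1080 Ti (Pascal) vs RTX 2080 (Turing).
# These are the usual launch-spec figures, included only to illustrate the
# "faster on paper" argument; real performance depends on architecture.
specs = {
    "GTX 1080 Ti": {"vram_gb": 11, "bandwidth_gb_s": 484, "cuda_cores": 3584,
                    "rops": 88, "fp32_tflops": 11.3, "transistors_b": 12.0},
    "RTX 2080":    {"vram_gb": 8,  "bandwidth_gb_s": 448, "cuda_cores": 2944,
                    "rops": 64, "fp32_tflops": 10.1, "transistors_b": 13.6},
}

for name, s in specs.items():
    print(name, s)

# The 1080 Ti leads on every line except transistor count, which is exactly
# the point above: paper specs alone don't predict in-game performance.
ti, rtx = specs["GTX 1080 Ti"], specs["RTX 2080"]
gap = (ti["fp32_tflops"] / rtx["fp32_tflops"] - 1) * 100
print(f"FP32 paper advantage for the 1080 Ti: ~{gap:.0f}%")
```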
I mean, I just recently updated Windows to 1903 and my performance has dropped: boot to Windows takes a minute longer, and I'm getting lower benchmark scores in 3DMark on both CPU and GPU with the newest Nvidia drivers. We're only talking about a 4-5fps difference, but it's still lower and slower than it was before, and I can see that when comparing benchmarks. Some people might be inclined to update their hardware when they notice slowdowns, that's all I am saying.
Slightly OT but not really... over a minute longer to boot Windows? That's insane. If you're still on a regular HDD you NEED to go SSD; it'll accelerate everything, including games like RDR2. If you already did, something else might be wrong. Newer Windows versions have the patches for all those CPU vulnerabilities integrated, which will hit older CPUs, so slight performance degradation is expected if you're coming from an unpatched version, but it shouldn't be that bad.