
empo

Member
Jan 27, 2018
3,127
Rich already made a video with RX 580/GTX 1060 btw. IIRC the 580 was very close to X1X considering you can't go as low on all the settings.
 

Flappy Pannus

Member
Feb 14, 2019
2,343
Rich already made a video with RX 580/GTX 1060 btw. IIRC the 580 was very close to X1X considering you can't go as low on all the settings.
Yeah I know, but it was very brief and the final settings weren't confirmed at that point. I generally preferred Alex's videos when he used a 1070 in his system, as I think that was far more in line with what the average PC gamer had, and even if you didn't have one it was easier to extrapolate from.

I get having the 2080 Ti in there, I want to see its performance as what's possible, but it's so far out of the range of anyone but the most hardcore enthusiasts, and it also has more VRAM than any other card. So in a video focusing on what you can get with the 'PC' and how it relates to a console, it's a little too tightly focused on hardware the vast, vast majority don't have.
 

Polyh3dron

Prophet of Regret
Banned
Oct 25, 2017
9,860
I finally got this game working after updating my BIOS and completely redoing all my computer's settings afterwards. With my 2080 Ti I'm able to get it mostly all ultra and still get 4K60, so I'm happy.
 

SixelAlexiS

Member
Oct 27, 2017
7,743
Italy
As I was saying in the other topic, the texture VRAM requirement doesn't make any sense and needs to be fixed.
Ultra quality (Xbox One X setting) takes 6-8GB of VRAM... and that's mental, especially when you consider that the Xbox One X uses 8GB of shared RAM, so both RAM and VRAM.

This NEEDS to be fixed by R*.
 

SleepSmasher

Banned
Oct 27, 2017
2,094
Australia
Lovely video, although I personally couldn't care less about "Optimised Xbox One X settings". I'd like it even more if this were a side-by-side comparison between Low and Ultra and the FPS impact of each; then it'd be up to the viewer to choose the best settings for their scenario. The summary at the end of the video with the settings that reflect X1X IQ is great though.

I do understand why his videos use such a template, not complaining, just my personal opinion.
 

Deleted member 224

Oct 25, 2017
5,629
As I was saying in the other topic, the texture VRAM requirement doesn't make any sense and needs to be fixed.
Ultra quality (Xbox One X setting) takes 6-8GB of VRAM... and that's mental, especially when you consider that the Xbox One X uses 8GB of shared RAM, so both RAM and VRAM.

This NEEDS to be fixed by R*.
The X has 12GB of RAM.
 

sugarmonkey

Banned
Oct 27, 2017
515
Damn, can't watch it right now. Can someone do a summary?

Using the settings in the attached image is the baseline on PC if you have similar hardware. Better hardware with these settings will give you a better image at 60+ FPS, depending on how much better your hardware is. Some of the settings marked as "Low" are even lower on Xbox One X, but DF wasn't sure exactly how much lower.
[attached image: settings chart]
 
Oct 25, 2017
2,945
This game can eat GPUs for breakfast, wow. Really hope high-end Ampere or RDNA 2 cards are here by the time I want to upgrade my PC.

Thanks for your hard work Dictator!
 

Tahnit

Member
Oct 25, 2017
9,965
So which settings could I lower from ultra to high and not really notice a difference, but gain framerate?
 

icecold1983

Banned
Nov 3, 2017
4,243
Ok, so I quickly tested the optimized XBX-equivalent settings and got the following rough numbers at 4k with the benchmark (using DSR on my 1080p monitor):

1st scene: 46 fps
2nd scene: 52-58 fps
3rd scene: 52 fps
4th scene: 53 fps
5th scene: 42-55 fps (drops to 42+ on horseback scene. Lowest drop is 32 during final shootout. Interestingly, this final scene had snow in it so not sure if that affected performance)

Min: 33.76 fps
Avg: 45.24 fps

This is on a 1080 Ti (oc'ed a bit), stock 3700x and 32GB ram.

To be honest, I was expecting much higher numbers than that with my PC. My initial expectation was hitting 60 fps with similar console settings at 4k, which I am so far able to achieve with other games (usually with even better fidelity too)

Nvidia doesn't perform well in this game. At this point people shouldn't be surprised anymore. Nvidia cards just won't age well in the big titles as long as both consoles use AMD. The more impressive the game is, the worse Nvidia typically performs.
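As a side note on the numbers above, here's a minimal Python sketch of how per-scene fps readings like those can be rolled up into a min/avg summary. The scene values are representative placeholders, not the benchmark tool's own output (which works per-frame):

# Roll up per-scene fps readings into a summary (placeholder values, one reading per scene).
# Averaging frame times (the harmonic mean of fps) keeps slow scenes from being under-weighted.
scene_fps = [46, 55, 52, 53, 48]

frame_times_ms = [1000.0 / fps for fps in scene_fps]
avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
min_fps = min(scene_fps)

print(f"Min: {min_fps} fps, Avg: {avg_fps:.2f} fps")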
 

Kadath

Member
Oct 25, 2017
621
Nvidia doesn't perform well in this game. At this point people shouldn't be surprised anymore. Nvidia cards just won't age well in the big titles as long as both consoles use AMD. The more impressive the game is, the worse Nvidia typically performs.

This could have an explanation.

Nvidia is superior to AMD in video drivers. Vulkan/DX12 let developers bypass most of the driver-level optimization and go directly to "the metal".

On the other hand, Nvidia has shown it can deliver much better performance on DX11, because their engineers still have better knowledge of the internal behavior of the hardware.

So when you remove this driver-level boost that Nvidia still retains from the equation, things get equalized with AMD.
 

icecold1983

Banned
Nov 3, 2017
4,243
This could have an explanation.

Nvidia is superior to AMD in video drivers. Vulkan/DX12 let developers bypass most of the driver-level optimization and go directly to "the metal".

On the other hand, Nvidia has shown it can deliver much better performance on DX11, because their engineers still have better knowledge of the internal behavior of the hardware.

So when you remove this driver-level boost that Nvidia still retains from the equation, things get equalized with AMD.

Not really, even under DX11 Nvidia falls behind.
 

F34R

Member
Oct 27, 2017
12,007
Yeah, I am just now messing with it. If I want 60fps at 4k with my 2080ti, I pretty much have to match the Xbox X settings provided here. Anything more and 60FPS is not happening.
I wanna see where I might be going wrong with trying to maintain 60fps at 1440p with mostly ultra, with my 3700x and 2080ti.
 

Hero_of_the_Day

Avenger
Oct 27, 2017
17,380
I wanna see where I might be going wrong with trying to maintain 60fps at 1440p with mostly ultra, with my 3700x and 2080ti.

I didn't fuck with it a ton, but in my short struggle to 60FPS, TAA was the deciding factor. I went from maxed to off, so there might be an in between that would do it, too. Didn't do a ton of testing.
 

R0C

Member
Oct 28, 2017
5
Can't get fullscreen working at startup despite setting -fullscreen in the launch arguments in DX12.
 

JahIthBer

Member
Jan 27, 2018
10,395
Ok, so I quickly tested the optimized XBX-equivalent settings and got the following rough numbers at 4k with the benchmark (using DSR on my 1080p monitor):

1st scene: 46 fps
2nd scene: 52-58 fps
3rd scene: 52 fps
4th scene: 53 fps
5th scene: 42-55 fps (drops to 42+ on horseback scene. Lowest drop is 32 during final shootout. Interestingly, this final scene had snow in it so not sure if that affected performance)

Min: 33.76 fps
Avg: 45.24 fps

This is on a 1080 Ti (oc'ed a bit), stock 3700x and 32GB ram.

To be honest, I was expecting much higher numbers than that with my PC. My initial expectation was hitting 60 fps with similar console settings at 4k, which I am so far able to achieve with other games (usually with even better fidelity too)
The game runs quite poorly on the 1000 series sadly, not sure why.
 

Cripterion

Banned
Oct 27, 2017
1,104
Great video. I run the game without any performance issues on my rig and I'm only annoyed about the constant crashing, but your vid highlighted pretty much everything.

Though my personal opinion is that the game on ultra settings doesn't differ that much from the Xbox One X if you aren't playing in first person and looking for the obvious differences.
In the end, having finished the game on PS4 Pro and now playing it on PC, the visual differences are mostly in lighting (though I do remember the resolution being fuzzy on PlayStation), since I don't have HDR going from TV to monitor. I'm enjoying the smoothness of the game with higher framerates and G-Sync, plus the much faster loading times of course.
 

Md Ray

Member
Oct 29, 2017
750
Chennai, India
User banned (2 weeks): Trolling over a series of posts. History of related behavior.
Why are you comparing specs between two different architectures??? (Pascal ≠ Turing)
And seriously where do you get your info to think the Ti is faster than a 2080?
Because I can.

And the GTX 1080 Ti, specs-wise, is faster in basically every single way. How many times must I repeat myself?
 
Nov 8, 2017
13,135
The only mechanism we have to measure the relative power of GPUs is benchmarking, and different benchmarks run differently on different architectures. We saw on day 1 that Turing thrived on games that were traditionally said to be AMD-favoring - things like Wolfenstein 2 or Strange Brigade all run great on both Turing and GCN.

At the time of the 2080's release, aggregate game benchmarks put it roughly on par with the 1080ti - a touch slower depending on which specific games you looked at. The 1080ti has slightly more memory bandwidth than the 2080, and it has more overall memory. It has roughly the same FP32 compute. It has more ROPs and CUDA cores, but fewer transistors. Architectural differences are what allow the 2080 to compete with the 1080ti despite paper specs where the numbers are higher on the 1080ti. Considering transistor counts, that doesn't necessarily mean it's much more efficient, but it is different.

Unless we get a patch or driver update tomorrow that totally improves performance on Pascal cards, we won't be able to confirm whether the game is designed in a way that heavily utilizes Turing's differences, or whether it's some random last-minute glitch, or whether these cards simply haven't had much attention from Rockstar, or whether Nvidia hasn't put much effort into performance on those cards for this game, or whatever. There are lots of things it could be (possibly multiple things in conjunction).

I will say that this game is a very abnormal data point in terms of the sheer difference between Pascal and Turing. There's a ~25% advantage for the 2080 over the 1080ti here, whereas Wolfenstein 2 (a game famous for running much better on Turing, generally the one with the largest difference when the reviews came out) only shows a ~12-15% advantage for the 2080 over the 1080ti.
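To put rough numbers on the paper-specs-vs-observed-performance point, here's a small Python sketch. The spec figures are quoted from memory and the fps values are placeholders standing in for the ~25% gap described above, so treat all of it as approximate:

# Headline specs, quoted from memory -- treat as approximate.
gtx_1080_ti = {"cuda_cores": 3584, "rops": 88, "mem_gb": 11, "bandwidth_gbs": 484, "fp32_tflops": 11.3}
rtx_2080    = {"cuda_cores": 2944, "rops": 64, "mem_gb": 8,  "bandwidth_gbs": 448, "fp32_tflops": 10.1}

# On paper the 1080 Ti leads in every one of these categories...
for key in gtx_1080_ti:
    lead = (gtx_1080_ti[key] - rtx_2080[key]) / rtx_2080[key] * 100
    print(f"{key}: 1080 Ti ahead by {lead:.0f}%")

# ...yet the measured result in this game runs the other way.
# Placeholder averages illustrating a ~25% advantage for the 2080:
fps_2080, fps_1080_ti = 50.0, 40.0
print(f"2080 advantage: {(fps_2080 / fps_1080_ti - 1) * 100:.0f}%")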
 

Polyh3dron

Prophet of Regret
Banned
Oct 25, 2017
9,860
Love to see your settings.
Yeah, I am just now messing with it. If I want 60fps at 4k with my 2080ti, I pretty much have to match the Xbox X settings provided here. Anything more and 60FPS is not happening.
Here goes:

Resolution: 3840x2160
Screen Type: Windowed Borderless
VSync: On
Triple Buffering: On
Texture Quality: Ultra
Anisotropic Filtering: X16
Lighting Quality: Ultra
Global Illumination Quality: Ultra
Shadow Quality & Far Shadow Quality: High
SSAO: High
Reflection & Mirror Quality: Ultra
Water, Volumetrics, Particle, Tessellation, TAA: All High
FXAA & MSAA: Off
Graphics API: DirectX 12
Near & Far Volumetric Resolutions: High
Volumetric Lighting Quality: High
Unlocked Volumetric Raymarch Resolution: Off
Particle Lighting Quality: Medium
Soft Shadows: High
Grass Shadows: Medium
Long Shadows: On
Full Resolution SSAO: Off
Water Refraction & Reflection Quality: High
Water Physics Quality: 80%
Resolution Scale: Off
TAA Sharpening: Full
Motion Blur: Off
Reflection MSAA: Off
Geometry LOD: Full
Grass LOD: 80%
Tree, Parallax Occlusion, Decal, Fur Quality: All High

My GPU is also boosted a bit by MSI Afterburner's overclock feature.
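If anyone wants to line a profile like this up against another settings list, here's a minimal Python sketch of the idea. The keys are abridged and the second profile's values are made up for illustration (it is not DF's actual Xbox One X list):

# Diff two settings profiles (keys abridged; "x1x_equiv" values are placeholders).
ultra_4k60 = {"Texture Quality": "Ultra", "Lighting Quality": "Ultra",
              "Shadow Quality": "High", "Water Physics Quality": "80%"}
x1x_equiv  = {"Texture Quality": "Ultra", "Lighting Quality": "High",
              "Shadow Quality": "High", "Water Physics Quality": "Medium"}

for setting, value in ultra_4k60.items():
    if value != x1x_equiv.get(setting):
        print(f"{setting}: {value} here vs {x1x_equiv.get(setting)} on the other profile")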
 

Monster Zero

Member
Nov 5, 2017
5,612
Southern California
The only mechanism we have to measure the relative power of GPUs is benchmarking, and different benchmarks run differently on different architectures. We saw on day 1 that Turing thrived on games that were traditionally said to be AMD-favoring - things like Wolfenstein 2 or Strange Brigade all run great on both Turing and GCN.

At the time of the 2080's release, aggregate game benchmarks put it roughly on par with the 1080ti - a touch slower depending on which specific games you looked at. The 1080ti has slightly more memory bandwidth than the 2080, and it has more overall memory. It has roughly the same FP32 compute. It has more ROPs and CUDA cores, but fewer transistors. Architectural differences are what allow the 2080 to compete with the 1080ti despite paper specs where the numbers are higher on the 1080ti. Considering transistor counts, that doesn't necessarily mean it's much more efficient, but it is different.

Unless we get a patch or driver update tomorrow that totally improves performance on Pascal cards, we won't be able to confirm whether the game is designed in a way that heavily utilizes Turing's differences, or whether it's some random last-minute glitch, or whether these cards simply haven't had much attention from Rockstar, or whether Nvidia hasn't put much effort into performance on those cards for this game, or whatever. There are lots of things it could be (possibly multiple things in conjunction).

I will say that this game is a very abnormal data point in terms of the sheer difference between Pascal and Turing. There's a ~25% advantage for the 2080 over the 1080ti here, whereas Wolfenstein 2 (a game famous for running much better on Turing, generally the one with the largest difference when the reviews came out) only shows a ~12-15% advantage for the 2080 over the 1080ti.


The performance advantage using Vulkan in Wolfenstein 2 is around 35% in favor of the 2080 over the 1080ti.
 

Alvis

Saw the truth behind the copied door
Member
Oct 25, 2017
11,237
Spain
Jesus christ, I didn't know the game had THAT many options.

Waiting for the Steam release... This will probably be the first game I throw at my new laptop (i7 9750H + RTX 2070)
 

JesseDeya

Member
Oct 27, 2017
164
Ok, so I quickly tested the optimized XBX-equivalent settings and got the following rough numbers at 4k with the benchmark (using DSR on my 1080p monitor):

1st scene: 46 fps
2nd scene: 52-58 fps
3rd scene: 52 fps
4th scene: 53 fps
5th scene: 42-55 fps (drops to 42+ on horseback scene. Lowest drop is 32 during final shootout. Interestingly, this final scene had snow in it so not sure if that affected performance)

Min: 33.76 fps
Avg: 45.24 fps

This is on a 1080 Ti (oc'ed a bit), stock 3700x and 32GB ram.

To be honest, I was expecting much higher numbers than that with my PC. My initial expectation was hitting 60 fps with similar console settings at 4k, which I am so far able to achieve with other games (usually with even better fidelity too)

Yeah that's bad. I mean it seems the 1080Ti (Pascal) has been nerfed in this game anyway, but your results are very low.

By comparison, I am running a 1080Ti (oc'ed a good amount: 2076MHz core, +750 on the RAM) but with a much worse CPU (6700K @ 4.6GHz), and at 4K with these XBX settings I get (two-run average):

Min : 45.211083
Avg : 54.299125

Absolutely no way my fps drops to 32 (did you mean 42?) in the final scene.
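For anyone lining up their own runs against these, a quick Python sketch of the run-to-run delta, using the Min/Avg figures from the two posts above:

# Min/Avg from the two 1080 Ti runs quoted above.
run_a = {"min": 33.76, "avg": 45.24}   # stock 3700X, 32GB RAM, mild GPU OC
run_b = {"min": 45.21, "avg": 54.30}   # 6700K @ 4.6GHz, heavier GPU OC

for key in ("min", "avg"):
    delta = (run_b[key] / run_a[key] - 1) * 100
    print(f"{key}: {run_a[key]:.2f} vs {run_b[key]:.2f} fps ({delta:+.0f}%)")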
 

Detail

Member
Dec 30, 2018
2,948
The only mechanism we have to measure the relative power of GPUs is benchmarking, and different benchmarks run differently on different architectures. We saw on day 1 that Turing thrived on games that were traditionally said to be AMD-favoring - things like Wolfenstein 2 or Strange Brigade all run great on both Turing and GCN.

At the time of the 2080's release, aggregate game benchmarks put it roughly on par with the 1080ti - a touch slower depending on which specific games you looked at. The 1080ti has slightly more memory bandwidth than the 2080, and it has more overall memory. It has roughly the same FP32 compute. It has more ROPs and CUDA cores, but fewer transistors. Architectural differences are what allow the 2080 to compete with the 1080ti despite paper specs where the numbers are higher on the 1080ti. Considering transistor counts, that doesn't necessarily mean it's much more efficient, but it is different.

Unless we get a patch or driver update tomorrow that totally improves performance on Pascal cards, we won't be able to confirm whether the game is designed in a way that heavily utilizes Turing's differences, or whether it's some random last-minute glitch, or whether these cards simply haven't had much attention from Rockstar, or whether Nvidia hasn't put much effort into performance on those cards for this game, or whatever. There are lots of things it could be (possibly multiple things in conjunction).

I will say that this game is a very abnormal data point in terms of the sheer difference between Pascal and Turing. There's a ~25% advantage for the 2080 over the 1080ti here, whereas Wolfenstein 2 (a game famous for running much better on Turing, generally the one with the largest difference when the reviews came out) only shows a ~12-15% advantage for the 2080 over the 1080ti.

Let's be honest, most companies these days cannot be trusted (from my experience at least). It seems like they want to force consumers to upgrade on a yearly cycle. Now, I don't want to accuse Nvidia of nerfing, I am simply saying I would not put it past them, just like I wouldn't put it past any company (Samsung with TVs and the automatic software updates, for example, which still update even when you turn them off, with no ability to roll back and no patch notes to see what changed).

I mean, I just recently updated Windows to 1903 and my performance has dropped: 1 minute longer to boot into Windows, and lower benchmark scores on 3DMark for both CPU and GPU with the newest Nvidia drivers. We are only talking about a 4-5fps difference, but it's still lower and slower than it was before, and I can see that when comparing benchmarks. Some people might be inclined to upgrade their hardware when they notice slowdowns, that's all I am saying.
 

Nintendo

Prophet of Regret
Member
Oct 27, 2017
13,388
I posted this in the performance thread but I think it fits this thread better.

I was testing the settings and found something interesting. Reflection setting isn't only for the resolution of reflections on windows and stuff like I thought.

You can clearly see the difference in quality on the wooden surface of the wagon. There's also a blue light bounce/reflection on the ceiling of the barn which is more prominent on ultra settings. Global illumination is the same (High) for both screenshots.

ULTRA reflection quality:
[screenshot]


LOW reflection quality:
[screenshot]


I hope Dictator can explain what's happening here.
 

thirtypercent

Member
Oct 18, 2018
680
I mean, I just recently updated Windows to 1903 and my performance has dropped: 1 minute longer to boot into Windows, and lower benchmark scores on 3DMark for both CPU and GPU with the newest Nvidia drivers. We are only talking about a 4-5fps difference, but it's still lower and slower than it was before, and I can see that when comparing benchmarks. Some people might be inclined to upgrade their hardware when they notice slowdowns, that's all I am saying.

Slightly OT but not really... over 1 minute for a Windows boot? That's insane. If you're still on a regular HDD you NEED to go SSD, it'll accelerate everything, including games like RDR2. If you already did, something else might be wrong. Newer Windows versions have the patches against all those CPU vulnerabilities integrated, which will hit older CPUs, so slight performance degradation is expected if you're coming from an unpatched version, but it shouldn't be that bad.
 

Detail

Member
Dec 30, 2018
2,948
Slightly OT but not really... over 1 minute for a Windows boot? That's insane. If you're still on a regular HDD you NEED to go SSD, it'll accelerate everything, including games like RDR2. If you already did, something else might be wrong. Newer Windows versions have the patches against all those CPU vulnerabilities integrated, which will hit older CPUs, so slight performance degradation is expected if you're coming from an unpatched version, but it shouldn't be that bad.

Yup, over 1 minute, moving from 1607 (didn't upgrade for a long time for this very reason, but was essentially forced to for RDR2).

Got it on my SSD already, plus I have already disabled the Spectre mitigations because they were hammering my CPU and degrading performance.

I have isolated startup programs as well and they aren't the issue; I honestly cannot figure out what is causing the boot-up issues. My boot time on the older version of Windows was 6 seconds from BIOS to login, so yeah, 6 seconds to 68.5 seconds.