
Trisc

Member
Oct 27, 2017
6,489
Not really keen on playing any more of this until the crashes are resolved and performance is up to snuff. It's hard to hit 60FPS at 1440p on med-high settings on a 1080 Ti, and that doesn't sit well with me at all.

I can't fathom why the 5700 XT has such an enormous lead on the 1080 Ti, either. The 1080 Ti typically leads by around 8-10FPS in any benchmark against the 5700 XT, but in RDR2 it's absolutely smoked.
 

Guffers

Member
Nov 1, 2017
384
Not really keen on playing any more of this until the crashes are resolved and performance is up to snuff. It's hard to hit 60FPS at 1440p on med-high settings on a 1080 Ti, and that doesn't sit well with me at all.

I can't fathom why the 5700 XT has such an enormous lead on the 1080 Ti, either. The 1080 Ti typically leads by around 8-10FPS in any benchmark against the 5700 XT, but in RDR2 it's absolutely smoked.
Bizarre situation. I'm not complaining because I run a 5700 XT, but it's usually 10% behind the 1080 Ti. Now that situation is reversed. It's just odd. Likely to be fixed in a driver update, I'm guessing.
 

Lethologica

Member
Oct 27, 2017
1,178
+1 to the "RGL crashes on startup, so I can't even launch the game" crowd. Reinstall did not fix it.

2700X
1080Ti
16 GB DDR4
970 Evo NVMe
 

Trisc

Member
Oct 27, 2017
6,489
Bizarre situation. I'm not complaining because I run a 5700 XT, but it's usually 10% behind the 1080 Ti. Now that situation is reversed. It's just odd. Likely to be fixed in a driver update, I'm guessing.
It's not like the 1080 Ti is merely 10% behind now, either. Where the 1080 Ti is running at around 55-60FPS, the 5700 XT is getting nearly 20-25% better performance!
 

leng jai

Member
Nov 2, 2017
15,119
Back in 2007 60fps wasn't the absolute PC standard like it is now. People were happy to play Crysis at 25fps.
 

Psyrgery

Member
Nov 7, 2017
1,745
I am baffled.

The XBX runs this game at 4K@30fps pretty much locked.

Any i5/i7 with a 1060/1070/1080 should be able to run the game at the same settings, res and framerate.

The fact that people with GPUs that exceed what the cut-down 580 in the XBX can do still have to lower the resolution to 1080p is downright insulting.

I'm skipping this poopy game until R* decides to patch it.

And I am not talking about running it on Ultra, I am talking about running it at the same specs as the XBX.
 

DAHGAMING

Member
Oct 26, 2017
519
I love PC gaming and have quite a decent PC (1080 Ti, 16GB RAM and a 3600). I love it when you get games like Gears 5: it's just good to go. Then you get this game. I don't have it on PC, but it looks a right fuck around for you lot. Hopefully they optimise it. I mean, I'm not expecting Ultra 4K or anything, but I'd expect highish settings at 1440p/60fps from a 1080 Ti.
 

Linus815

Member
Oct 29, 2017
19,802
I am baffled.

The XBX runs this game at 4K@30fps pretty much locked.

Any i5/i7 with a 1060/1070/1080 should be able to run the game at the same settings, res and framerate.

The fact that people with GPUs that exceed what the cut-down 580 in the XBX can do still have to lower the resolution to 1080p is downright insulting.

I'm skipping this poopy game until R* decides to patch it.

And I am not talking about running it on Ultra, I am talking about running it at the same specs as the XBX.

I don't get this post; no one knows yet what settings the X runs it at. It's perfectly possible that a 1070 could hit 4K30 at the right settings.


Almost nobody here is targeting 30 fps, so lowering resolution to get higher framerates makes sense.
 

DonMigs85

Banned
Oct 28, 2017
2,770
I am baffled.

The XBX runs this game at 4K@30fps pretty much locked.

Any i5/i7 with a 1060/1070/1080 should be able to run the game at the same settings, res and framerate.

The fact that people with GPUs that exceed what the cut-down 580 in the XBX can do still have to lower the resolution to 1080p is downright insulting.

I'm skipping this poopy game until R* decides to patch it.

And I am not talking about running it on Ultra, I am talking about running it at the same specs as the XBX.
It's actually not a cut-down 580 at all - it has 320GB/sec bandwidth vs 256GB/sec on the 580, more Texture Mapping Units (160 vs 144) and 2560 shaders versus 2304 (though of course the clock speed is lower so an RX 580 can still get higher peak TFLOPS and fillrate).
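For what it's worth, you can sanity-check that last point with the usual GCN peak-throughput formula (shaders x 2 FLOPs per clock x clock speed). The clock figures below are approximate reference numbers from memory (~1172MHz for the XBX GPU, ~1340MHz boost for the RX 580), so treat this as a rough back-of-the-envelope sketch:

# Peak FP32 throughput for a GCN GPU: shaders * 2 FLOPs/clock (FMA) * clock.
# Clock speeds are approximate reference figures, not measured values.
def peak_tflops(shaders: int, clock_mhz: float) -> float:
    return shaders * 2 * clock_mhz / 1_000_000

print(f"XBX GPU: {peak_tflops(2560, 1172):.2f} TFLOPS")  # ~6.00
print(f"RX 580:  {peak_tflops(2304, 1340):.2f} TFLOPS")  # ~6.17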
 

leng jai

Member
Nov 2, 2017
15,119
I am baffled.

The XBX runs this game at 4K@30fps pretty much locked.

Any i5/i7 with a 1060/1070/1080 should be able to run the game at the same settings, res and framerate.

The fact that people with GPUs that exceed what the cut-down 580 in the XBX can do still have to lower the resolution to 1080p is downright insulting.

I'm skipping this poopy game until R* decides to patch it.

And I am not talking about running it on Ultra, I am talking about running it at the same specs as the XBX.

Has anyone tried using those cards to run at 4K/30fps on low settings? You can't compare, because no one on PC wants to gimp everything else just to reach an arbitrary resolution number.
 

Steev-veetS

Member
Nov 26, 2017
7
France
Can't even get past the launch screen. I launch the game, hear a "chk" sound, get a black screen with smoke for a second, then the game crashes and the launcher asks me to retry or start in safe mode. Safe mode doesn't work.

I'm probably not the only one with this issue - any fix for this?

2080ti, 9700k 4.8ghz, updated drivers...
My anti-virus (Avast) was causing exactly this.
Set the RDR2 install directory in the exception list and it worked.
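Same idea if you're on Windows Defender instead of Avast: the exclusion can be scripted. A minimal sketch (the install path is an assumption, point it at your own, and it needs an elevated shell):

# Adds a Windows Defender scan exclusion for the RDR2 install folder.
# Run from an elevated prompt; the path below is an assumption.
import subprocess

rdr2_dir = r"C:\Program Files\Rockstar Games\Red Dead Redemption 2"
subprocess.run(
    ["powershell", "-Command", f'Add-MpPreference -ExclusionPath "{rdr2_dir}"'],
    check=True,
)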
 

My Name is John Marston

Alt account
Banned
Oct 27, 2017
111
I am baffled.

The XBX runs this game at 4K@30fps pretty much locked.

Any i5/i7 with a 1060/1070/1080 should be able to run the game at the same settings, res and framerate.

The fact that people with GPUs that exceed what the cut-down 580 in the XBX can do still have to lower the resolution to 1080p is downright insulting.

I'm skipping this poopy game until R* decides to patch it.

And I am not talking about running it on Ultra, I am talking about running it at the same specs as the XBX.

No one here knows the graphics settings that are equivalent to the Xbox One X and probably no one here is targeting 30 fps.
 
Oct 27, 2017
3,933
So I am having a lot of issues launching this game too. It just launches to a black screen with sound. I have to kill it from the task manager and relaunch in safe mode to get it going. But then a lot of the textures don't seem to load correctly, the menus are slow and the graphics settings seem to keep reverting. This is disappointing...
I was having the same issue as you. Turns out the HDR is fucked for whatever reason. Launching in safe mode and disabling HDR in the display settings allows the game to display correctly.

As for the textures, I too noticed a load of them were not loading correctly. Switching to DirectX 12 fixes that, but performance is noticeably worse than with Vulkan.

I think I'm gonna wait for this game to be patched. It's clearly not up to snuff right now.
 

Linus815

Member
Oct 29, 2017
19,802
this is straight up not true

It kinda is.

the "60 fps or die" mentality wasn't nearly as prevalent in the mid 2000's and before. It's a relatively recent thing.
The biggest releases, even with high end hardware, regularly didn't reach 60 fps, or at least, not consistently. I remember being VERY happy with 35-40 fps in general. I played through Crysis at 28-35 fps and considered it a good experience.



Also,

[Benchmark screenshots: oblivion-highend-bloom.png, DX10_High_00.png, cod2low.jpg]
 

My Name is John Marston

Alt account
Banned
Oct 27, 2017
111
you're not wrong in what you've written but there are some really noticeable hits to the overall look of the game here (the water, texture tiling, the general texture resolution off to the left side of the frame)

True but that's because of the low grass draw distance and the lack of anisotropic filtering. If I turn these settings up a bit, it will look much better but I wanted the settings to be as low as possible.
 

Ockui

Member
Oct 28, 2017
76
Maybe this has already been said, but if anyone is experiencing weird white pixel artefacts around characters, turn down MSAA.
 

Gobsmack

Alt Account
Banned
Nov 5, 2019
16
What do you mean? Crysis 2 DX11 was almost a proto-next-gen game; hell, it was the first game to use SSR, which became standard later on.
He means that, compared to other games, Low = High, Medium = Ultra, and High = Extreme Ultra or beyond.

Yes, you are running shadows or LoD on Medium, but they are at levels other open-world games would typically consider maxed, for example.
 

Sgs2008

Member
Mar 25, 2019
531
What do you mean? Crysis 2 DX11 was almost a proto-next-gen game; hell, it was the first game to use SSR, which became standard later on.

From his Twitter post, he seems to be indicating that the higher PC settings actually make more of a visual impact than in a standard PC port. As in, High on PC is significantly above console-equivalent settings. At least that's how I'm reading it.
 

Gobsmack

Alt Account
Banned
Nov 5, 2019
16
I posted on Twitter, but remember how Crysis 2 was last gen and how its settings aligned with console?
Yeah, about that.
Yes, people are acting like something must be wrong with RDR2 because they can run, idk, AC:O on High at 60fps/1080p on a GTX 1070, yet here they get that same framerate at Medium, because High > Medium. When really, Medium here is like max settings in most other open-world games in terms of assets, LoD etc.
 

Javier23

Member
Oct 28, 2017
2,904
Back in 2007 60fps wasn't the absolute PC standard like it is now. People were happy to play Crysis at 25fps.
2007 isn't ancient history that we've already completely forgotten about; why would you even make up silly stuff like this? I don't get it. 60FPS was absolutely the standard then. Performance threads like this one on games such as Crysis, STALKER or Bioshock were very popular. Things haven't changed much; we just fortunately have a few more tools at our disposal now. PhysX became a big thing a year afterwards with the release of Mirror's Edge and Arkham Asylum, and how that affected performance was also a big point of contention.
 

Isee

Avenger
Oct 25, 2017
6,235
Vulkan is fundamentally broken for me.

"Oh, you dare to look into the wrong direction Partner. Allow me to annoy you with micro stuttering."

[Image: stutterdxk95.jpg]
 

Sanctuary

Member
Oct 27, 2017
14,236
Wouldn't The Witcher 2 and its "ubersampling" be the better example here? It wasn't actually meant to be used with then-current hardware. IIRC, all it was really doing anyway was some massive downsampling.

Back in 2007 60fps wasn't the absolute PC standard like it is now. People were happy to play Crysis at 25fps.

It's almost like Counter-Strike and Quake did not exist or something, nor did Voodoo cards and their version of SLI target higher frame rates in the mid-to-late '90s.
 

dodo

Member
Oct 27, 2017
3,997
It kinda is.

the "60 fps or die" mentality wasn't nearly as prevalent in the mid 2000's and before. It's a relatively recent thing.
The biggest releases, even with high end hardware, regularly didn't reach 60 fps, or at least, not consistently. I remember being VERY happy with 35-40 fps in general. I played through Crysis at 28-35 fps and considered it a good experience.

I also played through Crysis at around 30-40 FPS on a brand new card at the time, but that was an exception, considering the game was such a demanding one. Doom 3 was another one where it was generally accepted that 60 was a pipe dream, and MMOs were another story entirely, but I vividly remember scouring tech sites for the best graphics tweaks and settings to hit 60fps in games like CoD2, Republic Commando, Oblivion, Half-Life 2, FEAR, etc.

In the mid-'00s we were all moving to 60Hz LCD flatscreen monitors, and anything below 60 was as miserable to play on for most things (especially competitive) then as it is now.
 

leng jai

Member
Nov 2, 2017
15,119
2007 isn't ancient history that we've already completely forgotten about; why would you even make up silly stuff like this? I don't get it. 60FPS was absolutely the standard then. Performance threads like this one on games such as Crysis, STALKER or Bioshock were very popular. Things haven't changed much; we just fortunately have a few more tools at our disposal now. PhysX became a big thing a year afterwards with the release of Mirror's Edge and Arkham Asylum, and how that affected performance was also a big point of contention.

I probably just remembered my timeline wrong, my bad. There was definitely a period of time when people were playing games at 30-40fps on PC, but that was probably when we had CRTs, in the late '90s and early 2000s.

Definitely remember most people playing Crysis at 25-40fps, which is unthinkable now.
 

Deleted member 34873

User-requested account closure
Banned
Nov 29, 2017
1,460
So Vulkan just straight up causes the game to crash every time it tries to actually load any assets (benchmark, starting Story mode). DX12 runs fine. Is anyone else having this problem? I wonder if it's related to the stuttering issues people are having with Vulkan.
 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
Yes, people are acting like something must be wrong with RDR2 because they can run, idk, AC:O on High at 60fps/1080p on a GTX 1070, yet here they get that same framerate at Medium, because High > Medium. When really, Medium here is like max settings in most other open-world games in terms of assets, LoD etc.
Do you have any proof of that? We saw a screenshot comparison on the previous page and there is no difference between the Xbox One X and PC that would justify the enormous hardware hunger. This is a badly optimized port, period. When even the Assassin's Creed games are better optimized, you know something is wrong...
 

Guffers

Member
Nov 1, 2017
384
Vulkan isn't crashing for me, but I can't try out DX12. Each time I change it, save it and restart the game, it just resets to default settings.
 

Ostron

Member
Mar 23, 2019
1,954
It kinda is.

the "60 fps or die" mentality wasn't nearly as prevalent in the mid 2000's and before. It's a relatively recent thing.
The biggest releases, even with high end hardware, regularly didn't reach 60 fps, or at least, not consistently. I remember being VERY happy with 35-40 fps in general. I played through Crysis at 28-35 fps and considered it a good experience.
60FPS was absolutely something PC gamers were aware of, and it was a top priority for certain games (in fact, many went even higher than 60): CS, Unreal Tournament, Quake etc., all pre-2000. Though back then we had it easy with our CRTs and could adjust resolution accordingly. The debate today is different because we are limited to 1080p monitors as a minimum and people know that sub-60FPS is not ideal. Look at older arcade vs home ports, or PAL vs NTSC. By 2007 it was absolutely a thing, and any other view is detached from reality.

Today 240Hz is the high end for relevant games, and while few people (let alone engines) can reach that target, people are still very much aware of it.
 

Rams

Member
Dec 13, 2017
49
Played around for several hours last night. Pushed past the intro to get into locations where I knew I could push the performance.

Short version: Lots of problems related to settings, startup, HDR, etc. But once you get past all that, and find the sweet spot for your hardware, it actually runs beautifully and looks great.

Long version:

Problems I had to deal with:

-Game starting into a blank, black screen. Alt-tabbing out and back in fixes it.
-HDR being disabled despite the setting being enabled. Toggling some graphics setting fixes it.
-Toggling graphics settings sometimes just stops working. The setting is saved, but nothing changes. Restarting the game fixes it.
-Benchmark stops working after one run, just jumps to main menu after that. Restarting the game fixes it.
-Settings sometimes get toggled by themselves when you're changing an entirely different setting.
-Sometimes after toggling settings some illogical stuttering appears and won't go away without restarting the game.

I had decided from the get-go that I was going to achieve a locked 60 fps, no matter the IQ cost. I have the game on XBX so I've already played the game at 30Hz, and wanted that smooth experience. Bad news is that a 2080 Ti doesn't get even close to handling 4K@60 on Ultra settings. Here's where I ended up, using Vulkan:

-Set resolution to 1440p
-Set the preset slider to Max
-Set everything that says Ultra to High, except leave texture quality to Ultra.

This allowed me to keep a rock solid 60fps in all locations I could find. Even though it's a shame going to 1440p on a 4K TV, it still looks very good. The AA is good enough so it's not jaggy at all, and the loss of detail, while certainly noticeable, isn't bad. All in all, it's by far the best looking game running at 60fps I have ever seen. And it really runs beautifully too, with very little asset-streaming stutter or any other sort of hitching. With an open world engine like this, that's impressive as hell.

And that's why I don't really agree with the people saying this is a poorly optimized game. Yes, the GPU demands are super, super high if you want to play at 4K/60. So high that there's no hardware out there that can do it. But what matters at the end of the day, when all settings are where they are supposed to be, is how the game looks, and how it runs. And like I said, I haven't seen a prettier game at 60fps, and the performance is nearly fault-free.

Guess I got lucky with Vulkan as I have no stutters or any other issues.

edit: system:
9900K, 2080 Ti OC, 16GB DDR4/3200, game on NVMe, Win10 latest version
 

Braag

Member
Nov 7, 2017
1,908
At 1440p Ultra settings on a 2080 Ti, I got a 59fps average in the benchmark.
I've only just finished chapter 1, but it has been smooth sailing so far.
I did have to change to DX12 as I was getting some weird graphical glitches in the menus and in-game with Vulkan...
 

laxu

Member
Nov 26, 2017
2,782
Reading through the thread I'm probably going to refund and play something else if they don't get this shit fixed very quickly. It just sounds too broken to play for tons of hours.
 

RedSwirl

Member
Oct 25, 2017
10,064
On the history of running PC games, I think something did change around 2007, after which a lot of developers switched to working with the console versions as the "original" and then porting them up to PC, making it so cards like the 8800GT could comfortably get 60fps in almost everything. Arguably, very few games were a struggle anymore until 2011, when Crysis 2 and The Witcher 2 came along.

Now most of the time, PC gamers are accustomed to having the elbow room to get even higher framerates. It's rarer since then that a game comes along with features that legitimately force people with even high-end cards to negotiate all the different settings, or accept framerates in the 40s or whatever. Maybe this is the point where we start seeing Crysis 2 and Witcher 2-like situations. Maybe people just didn't expect that to happen until, like, Cyberpunk. We already got Control and Metro though.
 

Mutagenic

Member
Oct 30, 2017
2,317
I probably just remembered my timeline wrong, my bad. There was definitely a period of time when people were playing games at 30-40fps on PC, but that was probably when we had CRTs, in the late '90s and early 2000s.

Definitely remember most people playing Crysis at 25-40fps, which is unthinkable now.
This was absolutely not the case for me. It's why ReForce was such a popular program back then with CRTs. Playing Quake 2 and CS at 120fps/120Hz was amazing.

I just made it to Valentine and this game runs at a locked 60 in town. I traveled there immediately once the game let me. The only time it has ever seemed to dip for me was during the carriage ride on the way to set up Horseshoe Overlook, when the sunset lighting was on point and I was crossing a river.
 

Bluelote

Member
Oct 27, 2017
2,024
It kinda is.

the "60 fps or die" mentality wasn't nearly as prevalent in the mid 2000's and before. It's a relatively recent thing.
The biggest releases, even with high end hardware, regularly didn't reach 60 fps, or at least, not consistently. I remember being VERY happy with 35-40 fps in general. I played through Crysis at 28-35 fps and considered it a good experience.



Also,

[Benchmark screenshots: oblivion-highend-bloom.png, DX10_High_00.png, cod2low.jpg]

Crysis was a PC exclusive really pushing the latest tech, and the other two were released close to the Xbox 360's launch window, so also the latest tech. Basically, a high-end PC was much closer to a 360 when Oblivion and CoD2 were released than a 2080 is to consoles now.

Things have changed massively: console games perform more stably now than they did in 2006, and PCs tend to perform a lot better in games too. So it's no surprise that people are surprised by the poor performance they get in RDR2; expectations have changed.

AFAIK RDR2 was designed for the now-aging consoles first, so I don't find it all that comparable.

That said, I played Crysis with a cut-down 8600 GT (like 1/4 of the 8800 GTX from that graph); I just had to drop the res and settings to Medium :)
I think if you go down a bit on settings in RDR2 you will be fine. But I have to wonder how it performs on PC with settings matching PS4 image quality; that would be more telling IMO.
 

c0Zm1c

Member
Oct 25, 2017
3,206
It kinda is.

the "60 fps or die" mentality wasn't nearly as prevalent in the mid 2000's and before. It's a relatively recent thing.
The biggest releases, even with high end hardware, regularly didn't reach 60 fps, or at least, not consistently. I remember being VERY happy with 35-40 fps in general. I played through Crysis at 28-35 fps and considered it a good experience.



Also,

[Benchmark screenshots: oblivion-highend-bloom.png, DX10_High_00.png, cod2low.jpg]
Those were outliers; they weren't the norm. We put up with the framerates our hardware could manage at whatever visual quality we could tolerate, just as we sometimes have to do now (I can't get a locked 60fps in No Man's Sky but I still play it). But I remember being very happy with Call of Duty 2's performance. You could tank its framerate to single digits by throwing all the settings up, but on more sensible settings it still looked and ran great. The same couldn't be said for Oblivion, where low settings put you in the middle of thick fog!
 

Mecha Meister

Next-Gen Guru
Member
Oct 25, 2017
2,805
United Kingdom
Eh, Fraps isn't working in RDR2 :( What standalone fps counter is recommended?

I'm using MSI Afterburner with RivaTuner Statistics Server's On-Screen Display.
Here's a link to it on Guru3D: MSI Afterburner 4.6.2 Stable/Final Download

RivaTuner Statistics Server's OSD is included in the installer.

It's been working for me so far on a laptop with a Ryzen 5 3550H and a GTX 1650, as well as a desktop equipped with an i7 2600K and an RX 570. I'm also about to begin some tests with a GTX 1080 Ti.


 

Isee

Avenger
Oct 25, 2017
6,235
The game scales indeed. Even with some advanced features on, like Volumetric Raymarch Resolution or Long Shadows, just turning every ULTRA down to HIGH (except Textures and AF) pushes average performance from 62 FPS to 80 FPS for me in the benchmark (+29%).

That's the problem with us PC gamers: we tend to die over ULTRA settings.
Some examples of ULTRA vs HIGH:

HIGH (77 FPS) vs ULTRA (63 FPS): +22%
HIGH (83 FPS) vs ULTRA (64 FPS): +29%
HIGH (84 FPS) vs ULTRA (64 FPS): +31%

And I'm pretty sure some things can be left on ULTRA in the first place. Some settings seem to be performance killers; Water Physics is one of them. Whatever you do, do not set the slider for that over 50%.

Ultra + Water Physics 100% (40 FPS)
Ultra + Water Physics 50% (56 FPS): +40%
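Those uplift figures are just (fast / slow - 1); a quick sketch to reproduce them from the numbers above:

# Percentage uplift going from the slower setting to the faster one.
def uplift(fast_fps: float, slow_fps: float) -> float:
    return (fast_fps / slow_fps - 1) * 100

# (HIGH, ULTRA) pairs from the benchmark, plus the water physics pair.
pairs = [(80, 62), (77, 63), (83, 64), (84, 64), (56, 40)]
for fast, slow in pairs:
    print(f"{fast} vs {slow} FPS: +{uplift(fast, slow):.1f}%")
# Reproduces the roughly +22% to +40% gains quoted above.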
 
Dec 20, 2018
310
I'm using MSI Afterburner with RivaTuner Statistics Server's On-Screen Display.
Here's a link to it on Guru3D: MSI Afterburner 4.6.2 Stable/Final Download

RivaTuner Statistics Server's OSD is included in the installer.

It's been working for me so far on a laptop with a Ryzen 5 3550H and a GTX 1650, as well as a desktop equipped with an i7 2600K and an RX 570. I'm also about to begin some tests with a GTX 1080 Ti.




That CPU temperature