
jerfdr

Member
Dec 14, 2017
702
I hope the 3080Ti will release at least reasonably close to the Cyberpunk 2077 launch (16-04-2020). It's still unclear if the 2080Ti can do 4K Ultra + RTX. I really want to play Cyberpunk at launch, but it'll suck if 4K users need to wait for the 3080Ti to release :(
I wouldn't expect the 3000 series until August or September at the earliest, realistically.
Well, I personally try to console myself with the thought that by August or September the inevitable bugs will (mostly) have been patched already, which is a big plus. Still, the wait will be excruciating.
I'll probably also have to take a break from all gaming forums to avoid spoilers.

And by the way, I'm quite sure that the 2080Ti won't be able to do even 1440p Ultra+RTX at 60fps, which is what I'm aiming for; I think that it's rather unlikely even for the 3080Ti to be able to do 4K@60 Ultra+RTX.

For reference, the 2080Ti gives you 45-50fps in Control with RTX High enabled (without DLSS, as it still results in questionable image quality), and Control is a much more confined game than Cyberpunk. So I would be glad if the 3080Ti allows for 1440p@60 Ultra+RTX. Most likely, at 4K you can only hope for "cinematic" 30fps if you want that Ultra+RTX quality, even with the 3080Ti.
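
Here's the back-of-envelope version of that as a quick Python sketch. It crudely assumes frame time scales linearly with pixel count, that the Control figure above was measured at 1440p, and a hypothetical ~35% uplift for the 3080Ti; all three are assumptions, not measurements.

```python
# Crude projection: assume GPU frame time scales linearly with pixel count
# (real scaling is messier), then rescale a measured framerate to a new
# resolution and a hypothetically faster GPU.

PIXELS = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}

def projected_fps(measured_fps, measured_res, target_res, gpu_speedup=1.0):
    return measured_fps * PIXELS[measured_res] / PIXELS[target_res] * gpu_speedup

base = 47.5  # 2080Ti in Control, RTX High, no DLSS (resolution assumed 1440p)
print(projected_fps(base, "1440p", "1440p", 1.35))  # ~64 fps: 1440p@60 plausible
print(projected_fps(base, "1440p", "4K", 1.35))     # ~28 fps: "cinematic" 30 at 4K
```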
 

Pall Mall

Member
Oct 25, 2017
2,424
Hmm, echoing that, I'm wondering if this will be good enough for 4K/60/ultra etc. in any next-gen game. It's obviously insanely speculative given we know nothing, but do you guys think it'll end up being more prudent to go with a 1440p build with a hypothetical 3080Ti at these rumored gains, or does 4K seem appreciably possible?
 

Dries

Banned
Aug 19, 2019
309
I wouldn't expect the 3000 series until August or September at the earliest, realistically.
Likely a year or so later than that.

Fuck, I can't wait that long. Guess I'll be trying to get 4K Ultra@30 fps with some small amount of RTX with a 2080Ti :-/

Edit: I of course know this is all speculation though. We know nothing yet officially about the RTX features the game will have. Let alone the performance costs. Wish CDPR would clarify some of this stuff.
 

Mullet2000

Member
Oct 25, 2017
5,907
Toronto
Fuck, I can't wait that long. Guess I'll be trying to get 4K Ultra@30 fps with some small amount of RTX with a 2080Ti :-/

Edit: I of course know this is all speculation though. We know nothing yet officially about the RTX features the game will have. Let alone the performance costs. Wish CDPR would clarify some of this stuff.

I wanted to wait too but caved and got a 2070 Super for my build earlier this month. Rather than get a 2080 Ti now, I figured it would end up costing a similar amount to buy a 2070 Super now and then sell it for a 3080 or something if I really want to upgrade by the time the new cards come out.

Pretty happy with the 2070 Super as is, though. I'm getting great framerates at 1440p.
 

SpotAnime

Member
Dec 11, 2017
2,072
I wanted to wait too but caved and got a 2070 Super for my build earlier this month. Rather than get a 2080 Ti now, I figured it would end up costing a similar amount to buy a 2070 Super now and then sell it for a 3080 or something if I really want to upgrade by the time the new cards come out.

Pretty happy with the 2070 Super as is, though. I'm getting great framerates at 1440p.

I'm going to get a 2070 Super as well, to replace the POS 5700 card I just returned (don't get me started). Keep in mind the 9-teraflop performance of a 2070 Super is on par with the rumored PS5 performance.
 

dgrdsv

Member
Oct 25, 2017
11,885
Guess I'm pessimistic then lol. I just find even a June release hard to picture.
Why? N7+ is in mass production and the designs are ready. I mean, they can wait it out till Sep if there's no threat from competitors, but with summer GPU sales being relatively low anyway, it won't make much difference.
 

Mullet2000

Member
Oct 25, 2017
5,907
Toronto
Why? N7+ is in mass production and the designs are ready. I mean, they can wait it out till Sep if there's no threat from competitors, but with summer GPU sales being relatively low anyway, it won't make much difference.

Literally just gut feeling not based on anything. Again might just be pessimism lol.

In the past I've always found rumors tend to lean towards wanting to believe something is coming sooner than it really is, because obviously we all want the cards to be out sooner rather than later.
 

Laiza

Member
Oct 25, 2017
2,171
Fuck, I can't wait that long. Guess I'll be trying to get 4K Ultra@30 fps with some small amount of RTX with a 2080Ti :-/

Edit: I of course know this is all speculation though. We know nothing yet officially about the RTX features the game will have. Let alone the performance costs. Wish CDPR would clarify some of this stuff.
I'm sorry, I don't understand. The fascination with 4k is just absolutely baffling to me.

There's nothing wrong with just running 1080p with good TAA and ray-tracing. Good TAA alone eliminates the single biggest reason for going for a higher resolution, which is to mitigate the horrible aliasing that occurs with inadequate anti-aliasing on lower resolutions. Once you've eliminated that from the equation the only reason to go for 4k is for additional detail - detail that the vast majority of people will not benefit from at typical screen-viewing distances.

On top of that, the cost of going from 1080p to 4k is extreme, as you're literally quadrupling the number of pixels required to render the entire scene (and thus also quadrupling the compute expense of all per-pixel effects, including ray-tracing)... any sort of basic analysis should tell you that it's just not worth it, even before considering the fact that literally no technology released over the next two years will be capable of 4k60 with ray-tracing. You must temper your expectations based on what is realistically possible. These pie-in-the-sky aspirations will only set you up for inevitable disappointment.
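
The quadrupling part is plain arithmetic, if anyone wants to check it:

```python
# 4K (2160p) has exactly 4x the pixels of 1080p, so every per-pixel cost
# (shading, rays cast per pixel, etc.) scales by roughly 4x as well.
print((3840 * 2160) / (1920 * 1080))  # 4.0
print((2560 * 1440) / (1920 * 1080))  # ~1.78 -> 1440p as a middle ground
```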
 

RedSwirl

Member
Oct 25, 2017
10,061
I'm sorry, I don't understand. The fascination with 4k is just absolutely baffling to me.

There's nothing wrong with just running 1080p with good TAA and ray-tracing. Good TAA alone eliminates the single biggest reason for going for a higher resolution, which is to mitigate the horrible aliasing that occurs with inadequate anti-aliasing on lower resolutions. Once you've eliminated that from the equation the only reason to go for 4k is for additional detail - detail that the vast majority of people will not benefit from at typical screen-viewing distances.

On top of that, the cost of going from 1080p to 4k is extreme, as you're literally quadrupling the number of pixels required to render the entire scene (and thus also quadrupling the compute expense of all per-pixel effects, including ray-tracing)... any sort of basic analysis should tell you that it's just not worth it, even before considering the fact that literally no technology released over the next two years will be capable of 4k60 with ray-tracing. You must temper your expectations based on what is realistically possible. These pie-in-the-sky aspirations will only set you up for inevitable disappointment.

Watching that new DF Crysis 1 video makes me think we're gonna be going back to a time when everyone made their own decisions on PC performance on a game-to-game basis, because even top-of-the-line systems didn't run everything at super-high framerates with max graphics.

That video reminded me of how I had to run Crysis at 1024x768. Of course, back then some people still had CRTs, and now I'm playing my PC games on a TV, which looks worse at sub-native res. This is probably why Nvidia and AMD are implementing integer scaling in their cards now (and why individual games are using upscaling solutions).

We might have games coming up that won't look the best they can possibly look until we're playing them on 2025 PCs or PS6s.
 

tuxfool

Member
Oct 25, 2017
5,858
I'm sorry, I don't understand. The fascination with 4k is just absolutely baffling to me.

There's nothing wrong with just running 1080p with good TAA and ray-tracing. Good TAA alone eliminates the single biggest reason for going for a higher resolution, which is to mitigate the horrible aliasing that occurs with inadequate anti-aliasing on lower resolutions. Once you've eliminated that from the equation the only reason to go for 4k is for additional detail - detail that the vast majority of people will not benefit from at typical screen-viewing distances.

On top of that, the cost of going from 1080p to 4k is extreme, as you're literally quadrupling the number of pixels required to render the entire scene (and thus also quadrupling the compute expense of all per-pixel effects, including ray-tracing)... any sort of basic analysis should tell you that it's just not worth it, even before considering the fact that literally no technology released over the next two years will be capable of 4k60 with ray-tracing. You must temper your expectations based on what is realistically possible. These pie-in-the-sky aspirations will only set you up for inevitable disappointment.
I disagree.

A good TAA solution does not exist at 1080p.

1440p is the minimum I'd consider workable for TAA, or any other postprocess anti-aliasing solution. I'm far from a "sharpness" stickler, but lower resolutions just look too smudgy.
 

MrH

Banned
Nov 3, 2017
3,995
I sold my Switch today, I'm very tempted to put the £300 towards a new GPU. My GTX 1080 is still very good so I'll have to see some benchmarks, and honestly it'll come down to how Cyberpunk runs.
 

Laiza

Member
Oct 25, 2017
2,171
Watching that new DF Crysis 1 video makes me think we're gonna be going back to a time when everyone made their own decisions on PC performance on a game-to-game basis, because even top-of-the-line systems didn't run everything at super-high framerates with max graphics.

That video reminded me of how I had to run Crysis at 1024x768. Of course, back then some people still had CRTs, and now I'm playing my PC games on a TV, which looks worse at sub-native res. This is probably why Nvidia and AMD are implementing integer scaling in their cards now (and why individual games are using upscaling solutions).

We might have games coming up that won't look the best they can possibly look until we're playing them on 2025 PCs or PS6s.
That's an interesting thought.

Personally, I have absolutely no idea what to expect beyond the coming generation. We've hit a hard barrier on transistor shrinkage so it's anyone's guess how things will go for the generation after the next. All I know is that 4k60 with ray-tracing is gonna have to wait for a long, long time.
I disagree.

A good TAA solution does not exist at 1080p.

1440p is the minimum I'd consider workable for TAA, or any other postprocess anti-aliasing solution. I'm far from a "sharpness" stickler, but lower resolutions just look too smudgy.
Fair enough. 1440p is certainly an improvement in that regard, without the outsized performance hit that 4k will bring (or being just outright unrealistic when combined with ray-tracing).

I just find that 1080p with good TAA and some moderate sharpening works for me, and the effect of ray-tracing is so obvious and so massive that I'm more than willing to sacrifice some clarity for the sake of superior lighting and material shading.
 

asmith906

Member
Oct 27, 2017
27,404
Well, I personally try to console myself with the thought that by August or September the inevitable bugs will (mostly) have been patched already, which is a big plus. Still, the wait will be excruciating.
I'll probably also have to take a break from all gaming forums to avoid spoilers.

And by the way, I'm quite sure that the 2080Ti won't be able to do even 1440p Ultra+RTX at 60fps, which is what I'm aiming for; I think that it's rather unlikely even for the 3080Ti to be able to do 4K@60 Ultra+RTX.

For reference, the 2080Ti gives you 45-50fps in Control with RTX High enabled (without DLSS, as it still results in questionable image quality), and Control is a much more confined game than Cyberpunk. So I would be glad if the 3080Ti allows for 1440p@60 Ultra+RTX. Most likely, at 4K you can only hope for "cinematic" 30fps if you want that Ultra+RTX quality, even with the 3080Ti.
If you've got a 2080 ti I'd still grab Cyberpunk and play it. Seems like you'd be punishing yourself trying to not see any spoilers for months.
 

m_shortpants

Member
Oct 25, 2017
11,246
I'm sure they will inflate the price even more this generation

Hoping AMD can force them to be more competitive.
 

Deleted member 13560

User requested account closure
Banned
Oct 27, 2017
3,087
I'm going to guess that MCM GPUs will be the next step in raw performance increases. I might wait on Hopper. Maybe AMD will beat them to the punch, though I don't think I've heard anything about AMD bringing MCM over to their line of GPUs. It has done wonders on their CPUs, though... and I think that is still their main point of focus right now.
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Yeah, 2080 Ti Founders Edition is 14.2tf and OEMs go much higher again. We can expect the same with the new cards.
allegedly (emphasizing this because of rumors) overclocking potential has gone down with the ampere cards. partner cards might not be that much higher than reference
 

kostacurtas

Member
Oct 27, 2017
9,065
allegedly (emphasizing this because of rumors) overclocking potential has gone down with the ampere cards. partner cards might not be that much higher than reference
It's not about the overclocking potential but about the boost clock. Nvidia is always conservative about the true boost clock.

For example the official boost clock of the Nvidia 1080 Ti is 1582 MHz but my Asus 1080 Ti Strix is usually over 1900 MHz (without any overclock).
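
If anyone wants to check the math: peak FP32 throughput is just 2 FLOPs per CUDA core per cycle times the clock, and the 1080 Ti has 3584 CUDA cores (its published spec). A quick sketch:

```python
# Peak FP32 throughput: 2 FLOPs per core per cycle (one fused multiply-add).
def fp32_tflops(cuda_cores, clock_mhz):
    return 2 * cuda_cores * clock_mhz / 1e6

print(fp32_tflops(3584, 1582))  # ~11.3 TF at the official 1080 Ti boost clock
print(fp32_tflops(3584, 1900))  # ~13.6 TF at the ~1900 MHz I actually see
```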
 

JudgmentJay

Member
Nov 14, 2017
5,222
Texas
It's not about the overclocking potential but about the boost clock. Nvidia is always conservative about the true boost clock.

For example the official boost clock of the Nvidia 1080 Ti is 1582 MHz but my Asus 1080 Ti Strix is usually over 1900 MHz (without any overclock).

Yup. Same for my Titan X Pascal. That's a ~2.5 TF increase without even touching overclocking. My overclock only gives another ~180 MHz before power throttling kicks in. I feel like the high-end cards haven't been very overclockable for a while now.
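
Same math here: the Titan X Pascal has 3584 CUDA cores and an official 1531 MHz boost, so assuming a sustained clock around 1880 MHz (a ballpark figure, not a measurement):

```python
# Titan X Pascal: 3584 CUDA cores, 2 FLOPs per core per cycle.
tf = lambda clock_mhz: 2 * 3584 * clock_mhz / 1e6
print(tf(1880) - tf(1531))  # ~2.5 TF gained from the higher sustained clock
```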
 

GameAddict411

Member
Oct 26, 2017
8,521
allegedly (emphasizing this because of rumors) overclocking potential has gone down with the ampere cards. partner cards might not be that much higher than reference
Like other posters have said, Nvidia makes its calculations on the base clock speed, if not the boost clock, and those are very low. Even the most basic GPU with good cooling can hit 1800 MHz, and you can get a 2 GHz overclock with ease. My GPU wasn't supposed to be the OC model, but I managed to hit 2000-2050 MHz. At those clock speeds, the RTX 2080 Ti is hitting close to 18 TFLOPs or more.
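
The numbers check out with the usual peak-FP32 formula; the 2080 Ti has 4352 CUDA cores:

```python
# 2 FLOPs per CUDA core per cycle (one fused multiply-add).
tf = lambda cores, clock_mhz: 2 * cores * clock_mhz / 1e6
print(tf(4352, 2000))  # ~17.4 TF
print(tf(4352, 2050))  # ~17.8 TF -> "close to 18 TFLOPs" at those clocks
```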
 

Metroidvania

Member
Oct 25, 2017
6,772
25% increase is not enough?

Eh.....25% isn't a whooooole lot considering a die shrink and the supposed boosts that come with it (Unless ray-tracing gets a massive upgrade in terms of performance, at least)

But hopefully if it's only a 25% boost, it'll at least be more aggressively priced - and to be fair, it's all relative, I suppose.

Who am I kidding, Nvidia will price it whatever they want, AMD ain't up to competing
 

Javier23

Member
Oct 28, 2017
2,904
I'm sorry, I don't understand. The fascination with 4k is just absolutely baffling to me.

There's nothing wrong with just running 1080p with good TAA and ray-tracing.
Jaggies aside, the difference in IQ between 1080p and 4k in detail and clarity is outstanding.

With 1440p vs. 4k, you might have a point.
 

JahIthBer

Member
Jan 27, 2018
10,382
The RTX 3080 being around 30% better than the 2080 Ti would be amazing if it's $699. They better not make the xx80 $999, though.