Uhtred

Alt Account
Banned
May 4, 2020
1,340
The IQ shortcomings are easy to notice, but I don't want to derail this further. My whole original point is that I'm extremely skeptical that even the 2080 Ti is going to be able to get 60 FPS at 4K with DLSS in Cyberpunk.
This "DLSS is magic" meme sucks. Like anytime someone says it's pretty good but not always a miracle-worker, you get like ten people jumping up to say it's magic. Makes talking about it impossible because you're not willing to accept any position beyond "Yup it's magic dude, you're right. Pure wizardry. Zap.".

But yeah, must avoid the derail.

Not sure why it would be derailing, but ok. I'll just say this: what's more noticeable, the minor artifacts of DLSS 2.0+, the significant artifacts of any other reconstruction we know of, or the doubling of your frame rate?
 

Jedi2016

Member
Oct 27, 2017
16,060
Pretty sure the demo they're talking about was capped to 30fps, so it's pointless to speculate that the PCs "couldn't" get any more out of it.
 

EatChildren

Wonder from Down Under
Member
Oct 27, 2017
7,057
The 20XX series was literally early adopter ray tracing technology, unfortunately for those who bought it (though that should have been obvious). Ray tracing, like any facet of a game's rendering, will have a performance hit entirely relative to the scope of its application and optimisation, the quality at which it is rendered, and the capacity of the player's GPU. You can take two games with ray tracing features enabled and the performance hit will not be uniform, due to the extent to which the ray tracing is implemented. Control, for example, seems to be the most comprehensive and demanding of all currently available games. It also depends on how intensive the effect is; Battlefield V, for example, allows you to scale the quality of the ray tracing no different to any other rendering effect.

While it sucks to consider the 20XX, especially something as high end as a 2080 Ti, not performing adequately in ray tracing, there are so many variables to consider with Cyberpunk 2077's specific implementation of ray tracing and its scalability that it's really impossible to make any judgement call yet. And, as noted, it's also important to remember that irrespective of the 20XX's raw performance and cost, it is and always will be the worst performing ray tracing series of cards. By virtue of its market entry point it is not going to set the standard; it is going to be the one series of NVIDIA cards that struggles to keep up as ray tracing becomes more widely used and uniform, because it arrived at a time when ray traced games did not exist at all, the technology was not implemented anywhere, and there was no standard or uniformity in what kind of ray tracing coverage games implement.

Ray tracing also isn't free, as we all know, and like any rendering effect it ties into everything else as well. If Cyberpunk is an inherently demanding game on the GPU irrespective of ray tracing, then enabling ray tracing will come at an even more noticeable cost because the performance headroom is smaller. The impact will be more noticeable if the GPU already struggles to keep pace with the game on its highest settings, made worse by trying to render what is looking to be fairly comprehensive ray tracing coverage.
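
A quick back-of-the-envelope illustration of that headroom point, in Python, with completely made-up frame times (none of these are real Cyberpunk numbers):

# Hypothetical frame times, purely to show why RT hurts more when
# the GPU is already near its limit. Every number here is invented.
def fps(frame_time_ms):
    # Convert per-frame render time in milliseconds to frames per second
    return 1000.0 / frame_time_ms

base = 16.7      # ~60 FPS with the game maxed out, no RT
rt_cost = 8.0    # assumed extra milliseconds of RT work per frame

print(f"Maxed, no RT:   {fps(base):.0f} FPS")            # ~60
print(f"Maxed, with RT: {fps(base + rt_cost):.0f} FPS")  # ~40
print(f"Headroom + RT:  {fps(8.0 + rt_cost):.0f} FPS")   # ~125 drops to ~62

The same 8 ms of ray tracing work drags a maxed-out 60 FPS well below target, while a game running with headroom at ~125 FPS still lands above 60.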

And I mean, that "highest settings" factor is one to consider. The 20XX series is not going to last through an entire generation running games maxed out with ray tracing at 4K, DLSS or otherwise. AFAIK the 2080 Ti can't even hit 60fps in Control maxed out with full ray tracing at 1440p. That shit's demanding.

End of the day, though, with Cyberpunk 2077 I wouldn't jump to conclusions just yet. Too many factors to consider, too many scalable variables, all pertaining to a game that is still technically in development and using what is still considered emergent technology that gets regular optimisation.
 

SimpleCRIPPLE

Member
Oct 28, 2017
4,231
I want a 3080, but it looks like I'd be upgrading solely for Cyberpunk at this point. I was hoping that my 2070S would be able to push decent FPS at 1440p using DLSS to upscale from 720p, but I'm not confident of that.
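
(For what it's worth, 720p to 1440p is exactly DLSS performance mode, which renders at half resolution on each axis. A quick sketch using the commonly reported per-axis scale factors; the exact ratios can vary per game:)

# Internal render resolutions for DLSS 2.x modes at 1440p output.
# Scale factors are the commonly reported per-axis ratios.
MODES = {
    "Quality":           2 / 3,   # ~0.667x per axis
    "Balanced":          0.58,
    "Performance":       0.50,    # 2560x1440 -> 1280x720 internal
    "Ultra Performance": 1 / 3,
}

out_w, out_h = 2560, 1440
for mode, scale in MODES.items():
    print(f"{mode:>17}: {round(out_w * scale)}x{round(out_h * scale)}")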

And yes, I need RTX on. CDPR games don't come around very often, and I want the best possible first playthrough.
 

EggmaniMN

Banned
May 17, 2020
3,465
The game can run on an Xbone; the 2000 series will be just fine with it. Stop throwing everything on maximum. We'll see how things shake out with its day one drivers.
 

Timu

Member
Oct 25, 2017
15,822
The game can run on an Xbone; the 2000 series will be just fine with it. Stop throwing everything on maximum. We'll see how things shake out with its day one drivers.
Yeah, I hardly max out current games unless they aren't demanding; I'm more of a high-medium guy for that framerate!
 

Rpgmonkey

Member
Oct 25, 2017
1,355
Realistically, few if any of the GPUs available by release will be maxing out the game (at higher resolutions, anyway), assuming CDPR makes their highest settings futureproof enough.

So all you can do is play the game and see if optimal settings for your current hardware strike a good enough balance for you, or spend hundreds or thousands of dollars to push the sliders up (and not even necessarily to the max). The former is probably a better use of most people's time and money than buying new hardware because of what you think an unreleased, unoptimized game is going to need.

1) I think that games will be better optimized to use the hardware later on. In other words, future games are likely to perform better at raytracing on a 2080 Ti than this first set of games tackling the technology. The tech is new, the drivers are still new, and the tech is being added on top of existing rendering technology. When devs and engines have things more figured out, you'll probably see a 2080 Ti do better.

Maybe it's because of the audience most focused on ray tracing at the moment (it tends to attract people who are into top-of-the-line, often fairly expensive hardware), but at times I feel like there's too much focus on brute forcing things with new GPUs and not enough on how the software factors into this stuff. It's early, so it's understandable to expect and want better hardware, but software improvements go right along with it. No one's just using a naive algorithm from an undergrad computer graphics course and calling it a day; hardware alone isn't going to get us where we need to be (not at a fast enough rate, anyway).

Even besides stuff like DLSS, there's quite a bit of research going into improving the performance of ray tracing while improving or maintaining visual quality, or into developing techniques that follow similar concepts to get reasonably close, hopefully without as big of a performance hit.
 

Poimandres

Member
Oct 26, 2017
6,958
The questionable ray tracing capabilities of the next gen consoles might be a blessing in disguise for the 20 series cards. There are likely to be some lower-demand ray tracing options designed primarily for the consoles that will make their way to PC, and these cards should keep up well at those settings (provided DLSS is an option). You'll likely need a 30 series for the more comprehensive ray tracing options at a decent frame rate.
 

Deleted member 20986

Oct 28, 2017
4,911
A little bit. But even when pushed to the max to tout the RTX 3000 series, the effects are what I expected: just prettier light reflections in the environments. I'm fine with it scaled down on my 2000 series card.
 

VoidCommunications

Alt Account
Banned
Aug 2, 2020
199
Not sure why it would be derailing, but ok. I'll just say this: what's more noticeable, the minor artifacts of DLSS 2.0+, the significant artifacts of any other reconstruction we know of, or the doubling of your frame rate?
Ok, I'll bite. I think DLSS is a wonderful, amazing technology that can deliver massive frame rate improvements in most titles. It also induces some artifacts (more noticeable in some titles and with certain effects than others), and is not totally perfect. Like any other form of reconstruction, it's forced to make tradeoffs. This is why I said the "DLSS is magic" meme is frustrating: I like the tech, but it's not foolproof, and any acknowledgement of DLSS's technical flaws is taken as a criticism of the tech as a whole. Or you're told that you're crazy, that they just don't see the artifacts, so it must be as good as 4K (I've seriously had that argument here before).

Consider a hammer. The hammer is really good at hammering nails. Getting a nail through some foundation will take some work, though. That isn't a criticism of the hammer; it's not made to get a nail through foundation, and a good nail gun does the job much better. The hammer has one set of tradeoffs, the nail gun another.
 

Mr.Deadshot

Member
Oct 27, 2017
20,285
I mean, fucking Quake 2 RTX runs at like 45 FPS on my 2070S at 1440p. What did you expect, OP? RTX is a performance killer. Just turn it down a bit and you'll be fine.
 

Uhtred

Alt Account
Banned
May 4, 2020
1,340
Ok, I'll bite. I think DLSS is a wonderful, amazing technology that can deliver massive frame rate improvements in most titles. It also induces some artifacts (more noticeable in some titles and with certain effects than others), and is not totally perfect. Like any other form of reconstruction, it's forced to make tradeoffs. This is why I said the "DLSS is magic" meme is frustrating: I like the tech, but it's not foolproof, and any acknowledgement of DLSS's technical flaws is taken as a criticism of the tech as a whole. Or you're told that you're crazy, that they just don't see the artifacts, so it must be as good as 4K (I've seriously had that argument here before).

Consider a hammer. The hammer is really good at hammering nails. Getting a nail through some foundation will take some work, though. That isn't a criticism of the hammer; it's not made to get a nail through foundation, and a good nail gun does the job much better. The hammer has one set of tradeoffs, the nail gun another.

I guess my question is... what is your point, exactly?

That it's not perfect is not really contributing much to the discussion. We don't praise it because it's perfect; we praise it because it's head and shoulders better than anything else at doing what it does. There's definitely room for improvement, and we expect to see that improvement in the coming years. But right now, the difference in image quality vs native + TAA is something I'd say 99% of people would be happy with when it's nearly doubling their frame rate.

I mean, look at consoles. Most console games are not 4K, even though they say 4K on the box, and inferior reconstruction there means playable frame rates. Just going off what I read on this forum, I'd say most PS4 gamers are happier with it than without it. So to me it's not surprising that PC gamers, who get a better performance delta and better IQ from DLSS, are happy too.
 

Pakesaker

Member
Oct 25, 2017
569
Omaha, NE
I"m hoping for 60fps at 1440p with RTX and DLSS set to performance and some settings turned down a bit on my 2080TI. I'm trying to wait to do a whole new PC build until next year.
 

Deleted member 1238

User requested account closure
Banned
Oct 25, 2017
3,070
This is one of those games I might wait a few years to play, either on a next gen console or on a beefy PC build with a 30 series (or maybe better, depending on how far in the future). I really want to experience it at its best. I mean, I still have to play The Witcher 3, so I think I can fill the time lol.
 
Nov 2, 2017
2,275
All the performance previews we've gotten show that Ampere isn't a lot better than Turing in terms of raytracing performance. Yes, the 3080 delivers better performance, but it's just generally more powerful.

This is what I expected, because multiple devs have mentioned that the bottlenecks in raytracing are not in the RTX hardware but in shading and probably bandwidth.
 

Jtrizzy

Member
Nov 13, 2017
622
I'm also sure that Nvidia has had its best people working with CDPR for quite a while to make sure this runs well on their cards. It's the biggest game of the year, and they clearly have a lot invested.
 

leecming

Banned
Oct 31, 2017
74
CP2077 was slated to launch earlier in the year, before Ampere was going to be a thing.
If you have a decent Turing card and are running it at a sane resolution (e.g. 1440p), I don't think you should worry too much.
 

Tovarisc

Member
Oct 25, 2017
24,532
FIN
Strangely enough, I heard that might be one of the requirements for them to issue PC review codes. If not, reviewers will get console codes instead.

Where did you hear these tales? From your ass?

I mean, fucking Quake 2 RTX runs at like 45 FPS on my 2070S at 1440p. What did you expect, OP? RTX is a performance killer. Just turn it down a bit and you'll be fine.

Isn't Quake 2 RTX a fully path traced game when you enable RTX?
 

Tremorah

Member
Dec 3, 2018
4,974
I sure hope I can DLSS my way to a decent enough framerate on a 1440p monitor with a 2080. I don't even care about all the RTX candy, just a stable, decent framerate.

I remember The Witcher 2 melting my GPU back in the day.
 

-Amon-

Banned
Oct 28, 2017
572
Having a 1440p monitor and a 2080 Ti, I'm not that worried about performance in Cyberpunk, tbh. I don't see it as a game that absolutely needs high frame rates to be played decently, so dropping ray tracing should be enough to get good performance.
 

kiguel182

Member
Oct 31, 2017
9,487
The 2000 series was clearly a first gen product. Overall, ray tracing performance seems to be acceptable only on the 3000 series and up. I bet even the 3060 will be better in terms of ray tracing.
 

Tovarisc

Member
Oct 25, 2017
24,532
FIN
Lol, what? That would be extremely sad if true. I mean, that card is equivalent to like 3 PS5s.

If you didn't notice, people are blatantly making shit up. It's amazing to me how much pushback and shitposting ray tracing generates on a gaming/tech enthusiast site like ERA.

We are getting official system specs "soon", so we will have an actual guideline for just how demanding the game is.
 

Jedi2016

Member
Oct 27, 2017
16,060