Is there any chance DLSS will one day be applicable to any game without specific development, the same way you can force antialiasing in the Nvidia settings, for example? Or will all future games systematically implement it?
New upscaling techniques could be in the works. I'm not saying we're getting DLSS, but maybe something that can achieve a similar boost.

That seems almost impossible, because there's no way a software solution can match one that is run by dedicated hardware. That wouldn't make any sense. DLSS isn't magic; it's run by Tensor cores which are otherwise doing nothing.
It seems impossible because it is. nVidia has dedicated tons of R&D to getting DLSS running on dedicated hardware (Tensor cores), which leaves the CUDA and RT cores free to focus on graphics processing instead of upscaling. A software solution would just be what we have now with checkerboarding, which is not at all the same thing.
It really comes down to what kind of RT performance RDNA 2 packs. Those cards can run some of those RT effects, but the lack of a scaling tech like DLSS 2 will limit the options.
Hmm ok I'm super new to PC gaming so excuse me if this is obvious, but I'm using an RX 5700XT. Does that support RDNA 2?
Thanks. GeForce Now it is. Control with full RT looks great on the service.
This game requires a 3080Ti or higher, clearly!

What are you talking about? The 2080Ti was running it at 60+ fps with DLSS 2 at 1080p, with all the RT effects and graphics settings cranked up, at the local events. Not ideal, but it is achievable on today's hardware.
And even then it will be barely enough for 1080p@60 FPS w/ DLSS2 apparently.
nVidia is going to sell lots of cards bundled with this game. Lots and lots and lots.
I have a 1080 Ti and I'm growing to resent it purely for the lack of DLSS hahaha. God I want that feature so bad. But I absolutely cannot justify upgrading my GPU when next-gen consoles are a few months from release.

So few games support it right now, why even bother? I love the tech too, but it is definitely not the reason why I hate my 1080. (I actually hate my 1080 because I want more VRAM.)
SkillUp claims he had RT in the demo he played (and that it's the "best implementation he's ever seen").

PCGH had the same build, which did not have RT reflections, but I don't doubt that even so it looks dramatically better and different with the other RT effects on.
I really hope my poor 2070 can run this at 60fps with everything on. Give me those futuristic puddle reflections!

With DLSS 2.0 you maybe could, just not at a crazy high resolution.
I'm gunning for twice that framerate, at least. I'm probably not going to get what I want, but damn it I'm gonna try. After playing DOOM Eternal and Gears 5 at 120fps+, super high framerate is the thing I prioritize above all else these days.

Get a CRT; they look good at lower resolutions and you get crazy fps. You just have to have a giant monitor on your desk.
If I could find a high quality widescreen CRT for my PC, locally, I'd be on it no matter the price. That shit is my white whale.
That's fine, my monitor is 1080p and far from getting retired. Heck, I may even try 720p with DLSS 2.0 if it looks passable.

You can try 540p with DLSS and it will look just as good as native 1080p.
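For anyone wondering what those resolutions actually mean, here's a rough back-of-the-envelope sketch in Python. The per-axis scale factors are just the commonly cited DLSS 2.0 presets, so treat the exact numbers as an assumption rather than anything confirmed in this thread.

# Assumed per-axis render-scale factors for DLSS 2.0 presets (illustrative only).
DLSS_PRESETS = {
    "Quality": 2 / 3,       # ~67% per axis
    "Balanced": 0.58,       # ~58% per axis
    "Performance": 0.5,     # 50% per axis
}

def internal_resolution(out_w, out_h, preset):
    # Approximate resolution the game renders at before DLSS upscales it.
    scale = DLSS_PRESETS[preset]
    return round(out_w * scale), round(out_h * scale)

print(internal_resolution(1920, 1080, "Performance"))  # (960, 540)  -> the "540p with DLSS" case
print(internal_resolution(1920, 1080, "Quality"))      # (1280, 720) -> roughly the "720p" case

So "540p with DLSS" is really just 1080p output with the Performance preset, and a 720p internal resolution is roughly what the Quality preset uses for a 1080p target.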
3440 x 1440 / (Very) High settings / RT / DLSS 2.0
Yeah... I think 2080Ti is toast, give me that 3090.
I have never used streaming - so on GeForce Now is the game maxed at 1080p, but running at 60 FPS with what I'm assuming is all the settings cranked, including the ray tracing etc.? I could forgo getting a 3080 this fall and just play on this? I have gigabit internet, so I'm assuming that will work well enough for this.

In theory, yes. Although from what you have mentioned, it seems like you haven't used a service like this before?
In practice, there are quite often issues with these streaming services, whether that be degraded quality (macroblocking etc.), stuttering, or heavy latency.
Connection quality often makes surprisingly little difference. I advise testing the waters first, to check whether you would actually be happy with the experience. Native hardware definitely still has quite the edge for me.
DLSS and integer scaling too. I want that clean 1080p 120hz on my 4K TV while I wait for HDMI 2.1 sets to arrive.
Oh, I'm actually not familiar with integer scaling. Could you tell me a bit about it?

In a nutshell, integer scaling makes it so that 1080p looks native on a 4K display. Every 1 pixel becomes 4 pixels, so it looks exactly like a native image without any upscaling blur. It's a feature exclusive to RTX cards (why this wasn't possible before, I don't know).
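If it helps to picture the "every 1 pixel becomes 4 pixels" part, here's a minimal sketch of 2x integer (nearest-neighbour) upscaling in Python with NumPy. The function name and the tiny test image are made up for illustration; this is just the pixel-duplication idea, not NVIDIA's actual driver code.

import numpy as np

def integer_upscale(image, factor=2):
    # Duplicate every pixel into a factor x factor block: no interpolation, no blur.
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

src = np.array([[10, 20],
                [30, 40]], dtype=np.uint8)   # a pretend 2x2 "image"
print(integer_upscale(src, 2))
# [[10 10 20 20]
#  [10 10 20 20]
#  [30 30 40 40]
#  [30 30 40 40]]

Bilinear scaling would instead blend neighbouring pixels to fill the gaps, which is exactly where the usual upscaling blur comes from.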
Holy shit, I had no idea it didn't already work like that. I've always felt 1080p looked especially blurry on my 4K display. Like, more so than it would have on a 1080p one. Thought it was just my imagination though.
For people like me who like to run at 1080p/120fps on 4K TVs, it's a game changer. I've just been hesitant to upgrade my GPU since the 3000 series has been right around the corner for seemingly forever.
As long as your output resolution is 1080p it'll evenly multiply to a crisp image on a 4K display. Same thing for a 720p image on a 1440p display.
So would this also be applied after DLSSing from 720p to 1080p on a 4K display? ("Applied" might be the wrong terminology if this is just the integrated output method, but I don't know.)
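Just on the arithmetic side (no claim about what the driver actually does after a DLSS pass), integer scaling only has a clean result when the display resolution is an exact whole-number multiple of the image. A quick hypothetical check:

def integer_scale_factor(src, dst):
    # Whole-number scale factor if the source fits the display evenly, else None.
    sw, sh = src
    dw, dh = dst
    if dw % sw == 0 and dh % sh == 0 and dw // sw == dh // sh:
        return dw // sw
    return None

print(integer_scale_factor((1920, 1080), (3840, 2160)))  # 2    -> 1080p on a 4K panel
print(integer_scale_factor((1280, 720),  (2560, 1440)))  # 2    -> 720p on a 1440p panel
print(integer_scale_factor((1280, 720),  (3840, 2160)))  # 3    -> 720p on a 4K panel
print(integer_scale_factor((1920, 1080), (2560, 1440)))  # None -> 1080p doesn't divide 1440p evenly

So a DLSS pass that outputs 1080p would at least be a clean 2x fit on a 4K display; whether the two features actually chain like that in the driver is a separate question.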
It uses the tensor cores pretty extensively, which aren't available on pre-RTX cards.

Right, but... why? I don't understand why it's so intensive. Seems like it should be easier and cheaper than literally any other upscaling method.
I don't know why it wasn't possible before, but it's a new setting on RTX cards.
Why does it need tensor cores for this task? I'm genuinely curious as to how the tech works.
And if it uses the tensor cores extensively, what's the impact on performance? (If you were simultaneously running other tensor core tasks like DLSS and RT.)
I've always wondered for like 4 years now why it wasn't doable. It seems like a rather straightforward approach, since you are effectively making 1 pixel the size of 4 pixels by duplication rather than by approximation like linear scaling (which seems less straightforward in comparison), at the very least when talking about 2x and 4x scaling. And today I find out that it's doable, but only on Turing.