Based on those slides, my thoughts:
- Ampere being a meaningful arch change, not just a shrink - likely
- DLSS 3.0 working out of the box with any game that has TAA, with the implication that you can just enable it in the new control panel - very unlikely
- RTX Voice being upgraded / currently only in beta - this isn't a rumour, it's confirmed by Nvidia
- Encoder improvements - likely
- 4x better RT performance - hard to say
- "RTX on shouldn't lower the performance in games" - extremely unlikely, since the limiting factor according to devs is not the RT core but the shading
- Tensor-accelerated VRAM compression - seems implausible
- Lower power consumption per tier - plausible
- No logins for GeForce thingies - hell-freezes-over tier (but I want it)
If they get rid of needing to log in to GFE, that's really all I want.
Why doesn't the program remember my login details? Pretty much every time I update to test new features, I'm logged out, and said new features don't work unless you log in.
Since I have auto-login set, it's the last thing I check, so I end up panicking and trying practically everything else before even bothering to log in.
Worse still, when GFE was introduced I found it useless, so my first three or so accounts were throwaways, and some settings are pretty much lost to me forever.
Get rid of that shit.
DLSS 3.0 working across the board is actually a lot more likely than you think, but it definitely isn't going to be an automatic control panel thing. The game would still need to support DLSS, so devs would still have to let their game feed some information out to the upscaler (see the sketch below). I think what they're really implying is that a lot more games are jumping on board because Nvidia plans to make the DLSS submission process easy as pie: basically, when a dev submits a game for driver optimization, DLSS gets implemented at the same time.
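For context on what "feed some information out" means in practice, here's a rough, hypothetical sketch of the per-frame data an engine hands a temporal upscaler. The names are illustrative, not Nvidia's actual SDK, but motion vectors and sub-pixel jitter are exactly what TAA already produces, which is why "any game with TAA" keeps coming up in these rumours.

```cpp
#include <cstdio>

// Hypothetical per-frame inputs for a temporal upscaler such as DLSS.
// Field names are made up for illustration -- not Nvidia's real API.
struct UpscalerFrameInputs {
    const void* lowResColor;    // jittered low-resolution color target
    const void* depth;          // scene depth buffer
    const void* motionVectors;  // per-pixel motion data (the TAA prerequisite)
    float jitterX, jitterY;     // sub-pixel camera jitter applied this frame
    float renderScale;          // internal res / output res, e.g. 0.5 in a "performance" mode
};

int main() {
    // The engine, not the driver, has to produce these buffers every frame,
    // which is why some developer-side integration is always needed.
    UpscalerFrameInputs frame{nullptr, nullptr, nullptr, 0.25f, -0.25f, 0.5f};
    printf("upscaling from %.0f%% internal resolution\n", frame.renderScale * 100);
    return 0;
}
```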
An absolute godsend if true, and it might actually push me to go 4K. The perf impact of 4K is basically what stopped me from getting a 4K screen; I'd rather have a few more bells, whistles, and frames at 1440p than go 4K.
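Rough numbers on that perf impact, counting raw pixels only (actual scaling also depends on bandwidth, geometry, and so on):

```cpp
#include <cstdio>

int main() {
    // Back-of-the-envelope pixel counts behind the 4K hesitation.
    const long px1440 = 2560L * 1440;  // 3,686,400 pixels
    const long px4k   = 3840L * 2160;  // 8,294,400 pixels
    printf("4K shades %.2fx the pixels of 1440p per frame\n",
           (double)px4k / px1440);     // -> 2.25x
    return 0;
}
```

So the GPU has to shade roughly 2.25x the pixels per frame at 4K, which is where those lost frames go.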
If they really are pushing that many RT cores, then the increase in RT performance should be totally doable and the overall impact should be lowered substantially... but there will definitely still be a hit.
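To put a number on why the hit shrinks but doesn't vanish, here's a quick Amdahl's-law sketch; the 40% share of frame time spent in ray traversal is my assumption for illustration, not a measured figure:

```cpp
#include <cstdio>

int main() {
    // Amdahl's law: speeding up only the RT portion caps the whole-frame gain.
    const double rtShare   = 0.40;  // ASSUMED fraction of frame time in RT work
    const double rtSpeedup = 4.0;   // the rumoured 4x RT improvement
    double newFrameTime = (1.0 - rtShare) + rtShare / rtSpeedup;  // = 0.70
    printf("whole-frame speedup: %.2fx\n", 1.0 / newFrameTime);   // ~1.43x
    return 0;
}
```

The shading that goes untouched is exactly the bottleneck the devs point at, so "RTX on with zero cost" stays extremely unlikely even with 4x faster RT cores.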
The only thing they need to announce to completely win this battle is that the 2080 Ti-beating xx70, with its better RT performance, costs ~500 dollars. Do that, and I'd be hard-pressed to justify getting an XSX for the same money.