
ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
I assume there's zero reason for nvidia to build an SoC for Nintendo on anything older than Ampere right?
Assuming 2022 Switch 2 at the earliest, the custom Tegra has to be based on Ampere?
The only reason would be cost, but with Turing on 12nm there wouldn't be enough performance gains, and the density would be too similar. And it costs money to port a uarch to a different node. So yeah, anything older than Ampere is very unlikely.
 
Oct 25, 2017
2,937
Anyone know if the channel Moore's Law is Dead has any track record of good leaks? He says he has a new source for Ampere.

summary article

www.notebookcheck.net

NVIDIA Ampere to offer 10-20% IPC increase over Turing, 4x RT performance with minimal FPS impact, up to 2 GHz OC clocks, and an overhauled software stack to take on AMD RDNA 2.0

New information pertaining to NVIDIA's upcoming Ampere architecture was revealed by Tom from the YouTube channel Moore's Law is Dead claiming to have access to insider sources associated with the launch. As per the sources, Turing is just a guinea pig experiment with Ampere expected to offer a...
The thing is that he's the source of this information, and it hasn't been corroborated by other tech news sites; they're just reporting what he's saying.

As a tech speculation channel, I think he and his brother are a fun enough team. Like the recent console discussion podcast with that blogger, or the video about sticking an NVMe drive on a video card (which was a real thing AMD did in the enterprise space a couple of years ago). But for hard leaks, he's less reputable than AdoredTV. Make of that what you will.

A good rule of thumb is to not get hyped for 'out there' GPU leaks unless videocardz.com starts talking. I remember when they dished the bad news on Turing's pricing before anyone else. We only have a couple more days to wait on Ampere anyway. I'd be more than happy to give MLID credit if he does turn out to be right though.

EDIT: and part 2 of MLID's Ampere leaks came out, and it's even more specific. I'm trying really hard not to get excited. It sounds freaking NUTS, but in a good way. Especially after watching DF's coverage of the Ghostrunner demo.
 
Last edited:

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
According to ChinaTimes, Hopper is coming next year on 5nm. Definitely not a consumer product in that case.

wccftech.com

TSMC 5nm Products Leaked: AMD Zen 4 CPUs, RDNA3 GPUs, NVIDIA Hopper and Potentially An 'Intel Xe' GPU

Products manufactured on the upcoming TSMC 5nm have already been confirmed in a leak from ChinaTimes, and while it had products that we were already expecting: AMD Zen 4 CPUs, AMD Radeon RDNA 3 GPUs and NVIDIA Hopper GPUs, it also had an unexpected entry: Intel's Xe graphics. There have been a...
 

Avenger

Alt Account
Banned
Mar 31, 2019
592
If specs are real for 3080, what percentage increase in performance are we expected to have compared to 2080?
 

SolidSnakeUS

Member
Oct 25, 2017
9,616
If specs are real for 3080, what percentage increase in performance are we expected to have compared to 2080?

Posted this a few pages ago:

2080:

GPU Cores: 2944
RT Cores: 46
Tensor Cores: 368
Boost Clock: 1710 MHz

3080:

GPU Cores: 4608 (56.5% increase over 2080)
RT Cores: 144 (3.13x as many as the 2080)
Tensor Cores: 576 (56.5% increase over 2080)
Boost Clock: 2000 MHz (17% increase over 2080)
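
If you want to sanity-check those figures, here's a minimal sketch of the arithmetic (pure ratios over the rumored specs above, which remain unconfirmed):

```python
# Ratios of rumored 3080 specs to the 2080's, reproducing the percentages above.
specs_2080 = {"GPU cores": 2944, "RT cores": 46, "Tensor cores": 368, "Boost MHz": 1710}
specs_3080 = {"GPU cores": 4608, "RT cores": 144, "Tensor cores": 576, "Boost MHz": 2000}

for name in specs_2080:
    ratio = specs_3080[name] / specs_2080[name]
    print(f"{name}: {ratio:.2f}x ({(ratio - 1) * 100:.1f}% increase)")
```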
 

GrrImAFridge

ONE THOUSAND DOLLARYDOOS
Member
Oct 25, 2017
9,675
Western Australia
That 3080 ti is going to achieve sentience if it's real lmao

 
Oct 25, 2017
2,937
Very exciting stuff. But a reminder that he is the source and nobody else has corroborated it. If he is correct, I do appreciate the insight into how internal GPU development is handled at NVIDIA. It sounds obtuse as hell, and having no personal experience in the field, I wonder how anything actually gets done. Also, there's a possibility that DLSS 3.0 news gets put off until the consumer GPUs release; it seems too gaming-focused to be a GTC announcement.
 

Laiza

Member
Oct 25, 2017
2,171

That sounds much more plausible than the previous leaks. 5K CUDA cores is well within reasonable expectations for the 3080 Ti (though I suspect we'll still be waiting some months for it after the launch of the 3060/3070/3080), and DLSS 3.0 requiring a Game Ready driver and some game-specific tweaks to be enabled makes far more sense than it simply being enabled in all games with TAA.

Pretty exciting stuff. Shame we still gotta wait two months and change.
 

signal

Member
Oct 28, 2017
40,199
Kind of vague, but it matches what people were expecting in another DLSS thread: that it wouldn't be able to just apply itself to games without specific drivers or dev implementation.

 

Metroidvania

Member
Oct 25, 2017
6,772
A 70% increase is pretty bonkers (RIP Big Navi if true), but this guy is laying literally all of his credibility on the line with this, lol.

We've only got two days and change to find out, at least.
 

BreakAtmo

Member
Nov 12, 2017
12,838
Australia
I assume there's zero reason for nvidia to build an SoC for Nintendo on anything older than Ampere right?
Assuming 2022 Switch 2 at the earliest, the custom Tegra has to be based on Ampere?

I would hope it would be much newer, honestly. The Maxwell architecture in the Switch is a bit old because they were using a stockpile of 2015 Tegra X1s that needed to be used up, but I would think that the Switch 2 would involve a more dedicated partnership. Sony and Microsoft worked with AMD and are launching the PS5 and Series X with RDNA2 mere months after it appears in the PC space. I'm guessing that if the Switch 2 launches in 2023 (2022 is much too early IMO), it will include whatever the 2022 5nm Ampere successor architecture is (assuming the 2021 GPUs are just more Ampere). It would be nice if it was basically like a 2060 when docked, but I won't hold my breath for that.

Kind of vague, but it matches what people were expecting in another DLSS thread: that it wouldn't be able to just apply itself to games without specific drivers or dev implementation.

This means the dream is dead, right? What percentage of devs are going to go back and patch their games with TAA to use DLSS 3.0, regardless of how much easier it is?
 
Last edited:

Fachasaurus

Member
Oct 27, 2017
1,355
They need to start making 32-inch 4K monitors with G-Sync. These cards would be impressive even at half their expected performance.

I am especially interested in the SSD usage.
 

kami_sama

Member
Oct 26, 2017
7,006
Posted this a few pages ago:

2080:

GPU Cores: 2944
RT Cores: 46
Tensor Cores: 368
Boost Clock: 1710 MHz

3080:

GPU Cores: 4608 (56.5% increase over 2080)
RT Cores: 144 (3.13x as many as the 2080)
Tensor Cores: 576 (56.5% increase over 2080)
Boost Clock: 2000 MHz (17% increase over 2080)
That'd be a ~2.8x increase over my 1070; I think it's gonna be a little overkill lol
And considering I game at 1080p 144Hz, I think it's too much lol

Edit: But then I remember I also play on an Index and the upgrade might be worth it. 144Hz might be possible with such a card. Currently I need to put it at 90 :(
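
For what it's worth, here's a minimal sketch of where that ~2.8x comes from: just scale CUDA cores by boost clock (using the GTX 1070's stock specs of 1920 cores at a 1683 MHz boost; this naive estimate ignores any IPC differences between architectures):

```python
# Naive throughput estimate: CUDA cores x boost clock, ignoring IPC changes.
gtx_1070 = 1920 * 1683  # stock GTX 1070: cores * boost MHz
rtx_3080 = 4608 * 2000  # rumored 3080 figures from the quote above
print(f"~{rtx_3080 / gtx_1070:.2f}x")  # ~2.85x
```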
 

Mark It Zero!

Member
Apr 24, 2020
494
This means the dream is dead, right? What percentage of devs are going to go back and patch their games with TAA to use DLSS 3.0, regardless of how much easier it is?
Yeah, it's pretty dead, though that previous rumor was always kind of a fool's dream. It would've been pretty cool: I played Deus Ex MD a few weeks ago and its TAA implementation smeared the skin textures during some conversations. Perhaps DLSS would've helped get some of the detail back.
 
Nov 8, 2017
13,111
Kind of vague, but it matches what people were expecting in another DLSS thread: that it wouldn't be able to just apply itself to games without specific drivers or dev implementation.


Yeah, but the thing is, the source that implied it could just be applied without dev implementation... was this guy's last video. Even if you wanted to go through his last vid with a fine-toothed comb and say "oh, he never said the exact words 'it won't require any dev implementation'", he definitely implied it, because he made out like "it works with any game that has TAA" was a differentiating feature of "DLSS 3.0". But you'd never say that in reality because... that's already true of DLSS 2.0. He's updated the new video with the clarifications you're posting here, but now all it says is "uhh, 3.0 better than 2.0". So why make the original claims?

For all we know, he saw people on the internet nitpicking all his errors and the things that seemed very unlikely or impossible, then released this new video today with "patch notes" so that he seems less off the mark when it inevitably turns out not to be that. Another odd thing: last time he said only the highest-end chips would be on 7nm; now he's saying 7nm will cover the bulk of the lineup. I get that you might have competing sources telling you different things... but you're supposed to vet this stuff before you go to press with it.

The thing about hardware leaks is that they frequently come from sources that lack holistic context, and either the initial source or the person publishing adds editorializing and interpretation, which is often wrong. Even if this YouTuber has access to real info, either he or the person passing it along to him is obviously misinterpreting (or misunderstanding) things on at least some level. And of course, it could also be complete bollocks.

I'd also like to say that "They're considering overriding settings in some games to force it on even if you don't push it on" is the kind of thing that will immediately be discovered by benchmarking sites and would be a PR disaster.

Edit:

Omg he still says "GeForce Experience is being merged with GeForce Now." Lmao.
 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
Seems like he was reading the message boards where knowledgeable people quickly debunked his rumors, and he has now made some adjustments based on that to make them more believable.

Also, some of these info snippets contradict each other and make no sense whatsoever.

"They won't mandate any low RT settings, they are going to crank up RT to make you buy a new card"
"cards all the way down to MX550 will feature RT"

um what?

I also feel like I have heard that 4x intersections figure before... oh yes, straight from the Xbox Series X, which does 4x the intersections of Turing.

Here's also a video that really gives away that he has no clue what he's talking about: he says stuff like Vega will be better at RT than Turing because it has FP16 (lol) and demonstrates it with the Crytek Noir demo, which doesn't even use the RT cores. He also attacks DF.



Please don't trust this guy. For some reason he is pushing a "Turing RT will be obsolete" agenda lol
 
Last edited:

dgrdsv

Member
Oct 25, 2017
11,885
Kind of vague, but it matches what people were expecting in another DLSS thread: that it wouldn't be able to just apply itself to games without specific drivers or dev implementation.

This is about as possible as forcing NV's TAA at the driver level in all games that have their own TAA. Meaning: likely impossible.
The last point is also very dubious; NV is fairly conservative in its benchmarking guides.
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Several supercomputers slated for this summer have been announced to use Ampere. We're definitely getting architecture details sooner rather than later.
 

GrrImAFridge

ONE THOUSAND DOLLARYDOOS
Member
Oct 25, 2017
9,675
Western Australia
When is (or was) Computex scheduled? It seems the perfect place to announce consumer cards, but I don't think Taiwan will let it happen.

Originally the beginning of June, now the end of September. But if new outbreaks continue to crop up, I imagine it's going to be cancelled altogether. Plus, the current date is too late anyway, assuming Nvidia does indeed want to at least paper launch consumer cards in time for Cyberpunk.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,931
Berlin, 'SCHLAND
Seems like he was reading the message boards where knowledgeable people quickly debunked his rumors, and he has now made some adjustments based on that to make them more believable.

Also, some of these info snippets contradict each other and make no sense whatsoever.

"They won't mandate any low RT settings, they are going to crank up RT to make you buy a new card"
"cards all the way down to MX550 will feature RT"

um what?

I also feel like I have heard that 4x intersections figure before... oh yes, straight from the Xbox Series X, which does 4x the intersections of Turing.

Here's also a video that really gives away that he has no clue what he's talking about: he says stuff like Vega will be better at RT than Turing because it has FP16 (lol) and demonstrates it with the Crytek Noir demo, which doesn't even use the RT cores. He also attacks DF.



Please don't trust this guy. For some reason he is pushing a "Turing RT will be obsolete" agenda lol

I am pretty sure NV will target 1080p DLSS on the RTX 2060 as the viable low-end tier for games' RT implementations, even after Ampere comes out.

And since that means 540p or 720p internally in reality, that makes a lot of sense. Turing is most definitely not immediately obsolete, and RT games that come out later will still be playable on it. Bad rumours and a bad sentiment from that video.
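
For reference, a quick sketch of where 540p and 720p come from, assuming DLSS 2.0's usual per-axis scale factors (0.5x for Performance mode, 2/3x for Quality):

```python
# Internal render resolution behind a 1080p DLSS output, per mode.
out_w, out_h = 1920, 1080
for mode, scale in [("Quality", 2 / 3), ("Performance", 0.5)]:
    print(f"{mode}: {round(out_w * scale)}x{round(out_h * scale)}")
# Quality: 1280x720, Performance: 960x540
```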
 

kami_sama

Member
Oct 26, 2017
7,006
Originally the beginning of June, now the end of September. But if new outbreaks continue to crop up, I imagine it's going to be cancelled altogether. Plus, the current date is too late anyway, assuming Nvidia does indeed want to at least paper launch consumer cards in time for Cyberpunk.
Oh yeah, September is too far out. I also think they need to be out before CP2077 is out. Considering partners already had custom 2080 Ti cards ready for the old launch date, it seems reasonable to expect the same with the new ones.
 

alienups

Member
Oct 31, 2017
387
Had €1,500 saved up for an Ampere card, but COVID-19 cut that almost in half. Hope I can still get a 3080.
 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
I am pretty sure NV will target 1080p DLSS on the RTX 2060 as being the viable low end scale for the RT implementations for games, even after Ampere comes out.

And since that means 540p or 720p in reality, that makes a lot of sense. Turing is most definitely not immediately obsolete and RT games that come out later will be unplayable on it. Bad rumours and a bad sentiment from that video.
Yep, I think so too; it should do fine at 1080p with DLSS in next-gen RT games.

As the 2060 is likely going to be the base performance target in the future, I do wonder if Nvidia will finally deliver RTX across the whole Ampere lineup. I could see Nvidia releasing an RTX 3050 with similar performance to a 2060, making for a cheap, small, and efficient entry point into hardware-accelerated ray tracing in next-gen games! Should be possible thanks to 7nm efficiency. But a 3060 matching a 2080 Ti doesn't sound plausible at all.
 

BreakAtmo

Member
Nov 12, 2017
12,838
Australia
I am pretty sure NV will target 1080p DLSS on the RTX 2060 as the viable low-end tier for games' RT implementations, even after Ampere comes out.

And since that means 540p or 720p internally in reality, that makes a lot of sense. Turing is most definitely not immediately obsolete, and RT games that come out later will still be playable on it. Bad rumours and a bad sentiment from that video.

Will that work if not all games offer DLSS, though?

Also, do you think it's likely we'll soon see Ampere cards that match the 2060 for a much lower price? Something like a $249-299 RTX 3030?
 