
ShadowFox08

Banned
Nov 25, 2017
3,524
I would love for history to vindicate your view - but I think it helps to keep expectations for next-gen hardware low, especially when the company making it errs on the side of building products around manufacturing cost and mark-up.

The production timelines could definitely have made it so the X2 was in the Switch. I mean, consoles have debuted new architectures throughout their history.
I am positive the reason for the Switch's hardware is very much related to the idea that NV had a bunch of HW and a design which did not sell as much as they wanted, and Nintendo was happy to grab that up. It was about price.
I would have loved Nintendo to release Mariko on 16nm day one (essentially a TX2 without the Denver cores, and double the RAM and bandwidth of the TX1), but that's 20/20 hindsight for Nvidia. Word has been that Nvidia wanted to get rid of their millions of 20nm TX1 chips anyway, and Nintendo came around and got a sweet discount. Truth be told, even the 20nm TX1 in March 2017 was near the top in mobile GPU performance versus flagship smartphones outside of Apple's.

Anyway, Nintendo has been consistently 2-3 years behind in tech since the 3DS/Wii U era at least, so expecting a 2023 Switch 2 to have a 7nm (or even 5nm) Ampere GPU with DLSS, even 2-2.5 TFLOPs, with A78 cores (or whatever the latest one that pairs with Ampere ends up being called) isn't far-fetched at all.
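
Back-of-the-envelope, that 2-2.5 TFLOPs guess is easy to sanity-check: FP32 throughput is just shader cores × 2 ops per clock (FMA) × clock speed. A rough sketch with illustrative numbers (the Ampere core count and clock below are guesses for illustration, not leaks):

```python
# FP32 throughput: CUDA cores * 2 ops/clock (FMA) * clock (GHz) = GFLOPs.
def gflops(cuda_cores: int, clock_ghz: float) -> float:
    return cuda_cores * 2 * clock_ghz

print(gflops(256, 0.768))  # TX1 docked (Switch today): ~393 GFLOPs
print(gflops(1536, 0.75))  # illustrative Ampere config: ~2300 GFLOPs (~2.3 TFLOPs)
```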
 

Vash63

Member
Oct 28, 2017
1,681
What res are you playing on, and are the halo effects noticeable at higher res (4K)?

1440p. The halo effects are definitely noticeable in Metro Exodus & BF5. In Tomb Raider less so but there's artifacting on the ray traced shadows so I disabled it. This is the older DLSS though, I don't own any 2.0 games (Youngblood got shit reviews and Control is Epic exclusive).

DLSS is independent of ray tracing. It does, however, require DX12.

There isn't really anything exclusive to DX12. Every major feature DX12 has offered to date has been done first in Vulkan (whether by KHR or vendor extensions). VRS was especially hilarious as it was in a shipping Vulkan game already (Wolf 2) when Microsoft claimed they invented and patented it for DX12 Ultimate.

Well, you don't have to break the algorithm and try to operate it at unsupported and malfunctioning levels to demonstrate that. The existing in-game controls allow going as low as a 1280*720 screen resolution (while maintaining a 16:9 aspect ratio), with a 640*360 input.

www.youtube.com

Control DLSS demonstration -- 640*360 upscaled to 1280*720

Note that many artifacts resulting from the upscaling are only evident when selecting the high bitrate options available on the player.

This is amazing. If Switch 2 can do this, like 540p native in handheld upscaled to 1080p, I think that would be plenty to justify a new console.
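
For scale: 640×360 → 1280×720 is a 2× per-axis (4× pixel) upscale, the same ratio as DLSS 2.0's Performance mode, and 540p → 1080p would be the same factor. A rough sketch of the per-mode input resolutions (the scale factors are the commonly cited DLSS 2.0 values, so treat them as approximate):

```python
# Approximate per-axis scale factors commonly cited for DLSS 2.0 modes (not official constants).
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def dlss_input(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution DLSS would upscale from for a given output."""
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

print(dlss_input(1280, 720, "Performance"))   # (640, 360)  -- the clip above
print(dlss_input(1920, 1080, "Performance"))  # (960, 540)  -- the hypothetical handheld case
```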
 

dgrdsv

Member
Oct 25, 2017
11,878
Microsoft claimed they invented and patented it for DX12 Ultimate
MS does this with every tech they include in DX, since it lets them then give that tech to every IHV for free through DX. That's been the DX model for making new tech available to everyone for some time now, I believe.
 

AllBizness

Banned
Mar 22, 2020
2,273
More games need Ray Tracing. This game is nearly a year old and we're still talking about its Ray Tracing because newer AAA games on PC aren't doing Ray Tracing. Such a bummer.
 
OP

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
More games need Ray Tracing. This game is nearly a year old and we're still talking about its Ray Tracing because newer AAA games on PC aren't doing Ray Tracing. Such a bummer.
The lack of AMD cards supporting RT is probably the biggest cause. Shame AMD doesn't even allow software RT like Pascal does
 

Deleted member 18161

user requested account closure
Banned
Oct 27, 2017
4,805
More games need Ray Tracing. This game is nearly a year old and we're still talking about its Ray Tracing because newer AAA games on PC aren't doing Ray Tracing. Such a bummer.

I think once we see more new games we'll start to see it. The new consoles having RT is the best thing that could have happened because it will give most studios the inevitable nudge towards the technology they might not have had if it was a PC only feature.

All AAA next-gen console games releasing from late 2021 will have RT in some form as standard imo, either shadows or reflections or GI. Then if you have a capable PC you'll be able to enable it all at the same time.
 
Nov 8, 2017
13,108
The lack of AMD cards supporting RT is probably the biggest cause. Shame AMD doesn't even allow software RT like Pascal does

The lack of consoles supporting it is a far larger obstacle, but both will be resolved at roughly the same time. Then it'll be... I dunno, 1.5-2 years before AAA console multiplats begin to ship with RT-only rendering options, coinciding with them starting to ship next-gen-only without PS4/XBO ports. It won't be all of a game's graphics features that make use of it (naturally), and not every game will do this at once, but you'll increasingly have old cards being forced to run the RT effects because the renderer won't have other options for lighting or reflections or something. LowSpecGamer will be tweaking ini files to completely ice these effects to get playable performance on his Athlon 5200g.
 

UltraMagnus

Banned
Oct 27, 2017
15,670
I would have loved Nintendo to release Mariko on 16nm day one (essentially a TX2 without the Denver cores, and double the RAM and bandwidth of the TX1), but that's 20/20 hindsight for Nvidia. Word has been that Nvidia wanted to get rid of their millions of 20nm TX1 chips anyway, and Nintendo came around and got a sweet discount. Truth be told, even the 20nm TX1 in March 2017 was near the top in mobile GPU performance versus flagship smartphones outside of Apple's.

Anyway, Nintendo has been consistently 2-3 years behind in tech since the 3DS/Wii U era at least, so expecting a 2023 Switch 2 to have a 7nm (or even 5nm) Ampere GPU with DLSS, even 2-2.5 TFLOPs, with A78 cores (or whatever the latest one that pairs with Ampere ends up being called) isn't far-fetched at all.

There's also likely zero chance Nintendo was actually aiming for March 2017 (which is basically February 2017) either.

The plan was almost assuredly to launch November 2016; no one launches in freaking Feb/March unless they're forced into it because of delays.

So the Tegra X2 or Mariko likely wouldn't even have been available for the date Nintendo was telling Nvidia when the Switch hardware was being finalized.

When they couldn't do a holiday release, they fell back to a Feb/March 2017 launch as a way to at least get some good news/launch sales into their fiscal year (which ends March 31 every year), but you can't change your chipset that late in the game.
 
OP

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
The Tegra X2 isn't Mariko. Mariko is still the X1, but on 12nm. The X2 is codenamed Parker and is on 16nm (same process but less refined).
 

Vash63

Member
Oct 28, 2017
1,681
There's also likely zero chance Nintendo was actually aiming for March 2017 (which is basically February 2017) either.

The plan was almost assuredly to launch November 2016; no one launches in freaking Feb/March unless they're forced into it because of delays.

So the Tegra X2 or Mariko likely wouldn't even have been available for the date Nintendo was telling Nvidia when the Switch hardware was being finalized.

When they couldn't do a holiday release, they fell back to a Feb/March 2017 launch as a way to at least get some good news/launch sales into their fiscal year (which ends March 31 every year), but you can't change your chipset that late in the game.

X2 was never a consideration for Switch. I don't know why people keep bringing it up. It's got double the memory bus width; that alone excludes it from Nintendo's consideration from both a cost and a power draw perspective. It would have made a terrible handheld.
 

UltraMagnus

Banned
Oct 27, 2017
15,670
X2 was never a consideration for Switch. I don't know why people keep bringing it up. It's got double the memory bus width; that alone excludes it from Nintendo's consideration from both a cost and a power draw perspective. It would have made a terrible handheld.

Yeah. Obviously they probably had Mariko in mind but that wouldn't be ready until this year and they knew that.

Switch had to be ready to go for fall 2016, the delay to early 2017 only happened late in the game due to software. It was always supposed to be a 2016 product.

Nintendo got kinda lucky in holiday 2016 because the NES Classic became a huge success and Pokemon Go from summer 2016 provided the 3DS with a final nice little Pokeboost.

For 2016, the Tegra X1 was debatably the best or second-best mobile chip available. I think only Apple might have had something better at that point.
 

ShadowFox08

Banned
Nov 25, 2017
3,524
The Tegra X2 isn't Mariko. Mariko is still the X1, but on 12nm. The X2 is codenamed Parker and is on 16nm (same process but less refined).
Mariko is essentially a TX2 without the extra RAM, bandwidth, and Denver cores.

X2 was never a consideration for Switch. I don't know why people keep bringing it up. It's got double the memory bus width; that alone excludes it from Nintendo's consideration from both a cost and a power draw perspective. It would have made a terrible handheld.
Well, we don't know that it wasn't considered. Of course Nintendo wants to save money, so a TX2 would have given them negligible profits, or even losses in the first year, if they sold it at $300. But it doesn't matter. Like I said, 20/20 hindsight. A TX2 with A57s and double the bandwidth would have been nice, but it would never have happened realistically for Nvidia or Nintendo; it is what it is. Not something I care to dwell on.
 

UltraMagnus

Banned
Oct 27, 2017
15,670
Mariko is essentially a TX2 without the extra RAM, bandwidth, and Denver cores.


Well, we don't know that it wasn't considered. Of course Nintendo wants to save money, so a TX2 would have given them negligible profits, or even losses in the first year, if they sold it at $300. But it doesn't matter. Like I said, 20/20 hindsight. A TX2 with A57s and double the bandwidth would have been nice, but it would never have happened realistically for Nvidia or Nintendo; it is what it is. Not something I care to dwell on.

I don't know if TX2 would've been ready for fall 2016 even if Nintendo wanted it.
 

Vash63

Member
Oct 28, 2017
1,681
Mariko is essentially a TX2 without the extra RAM, bandwidth, and Denver cores.


Well, we don't know that it wasn't considered. Of course Nintendo wants to save money, so a TX2 would have given them negligible profits, or even losses in the first year, if they sold it at $300. But it doesn't matter. Like I said, 20/20 hindsight. A TX2 with A57s and double the bandwidth would have been nice, but it would never have happened realistically for Nvidia or Nintendo; it is what it is. Not something I care to dwell on.

If it was, they're idiots. The battery life is already bad enough without having to power twice the memory lanes/dies and far more transistors on the main die. So yes, I have no evidence that Nintendo wasn't considering an X2, but I trust that their R&D department isn't filled with idiots and thus that it was not a consideration.
 

Lukas Taves

Banned
Oct 28, 2017
5,713
Brazil
There isn't really anything exclusive to DX12. Every major feature DX12 has offered to date has been done first in Vulkan (whether by KHR or vendor extensions). VRS was especially hilarious as it was in a shipping Vulkan game already (Wolf 2) when Microsoft claimed they invented and patented it for DX12 Ultimate.
From what I understand, some vendors test features first in Vulkan through extensions, but those are only test/incomplete implementations; the final versions are published first on DX.

Regarding VRS, MS didn't claim they invented it; they said they patented a specific implementation they have done on the SX. And VRS is not new to DX12 Ultimate, it was added early last year.
 

ShadowFox08

Banned
Nov 25, 2017
3,524
I don't know if TX2 would've been ready for fall 2016 even if Nintendo wanted it.
I don't think so either. Nvidia already decided on TX2 with Denver anyway.

If it was, they're idiots. The battery life is already bad enough without having to power twice the memory lanes/dies and far more transistors on the main die. So yes, I have no evidence that Nintendo wasn't considering an X2, but I trust that their R&D department isn't filled with idiots and thus that it was not a consideration.
Battery life on a 16nm TX2 should be the same as Mariko, without the added RAM. Heck, just having Mariko on day 1 with 50% more power/speed (600 GFLOPs) in docked mode would have been great. Perhaps the CPU could have been upclocked slightly too (A57s or A72s). I'd say the Switch's biggest bottleneck is the bandwidth; it's only twice as much as the Wii U's. If they got their 128-bit bus at 50+ GB/s with Mariko's 50% GPU boost, we would have seen a noticeable performance boost, more 1st party games at 1080p, and a solid 720-900p for more third party games.

It is what it is. Nintendo cares more about efficiency and cost savings, and Nvidia didn't think about using stock ARM CPUs for the TX2.
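
For what it's worth, the bandwidth figures here check out on the back of an envelope; peak bandwidth is just bus width (in bytes) times the transfer rate. A rough sketch (the transfer rates are the commonly reported figures, so treat them as approximate):

```python
# Peak memory bandwidth: bus width (bits) / 8 * transfer rate (GT/s) = GB/s.
# Transfer rates below are commonly reported figures, not official spec sheets.
def bandwidth_gbs(bus_bits: int, transfer_gts: float) -> float:
    return bus_bits / 8 * transfer_gts

print(bandwidth_gbs(64, 1.6))   # Wii U, 64-bit DDR3-1600:     ~12.8 GB/s
print(bandwidth_gbs(64, 3.2))   # Switch, 64-bit LPDDR4-3200:  ~25.6 GB/s (twice Wii U)
print(bandwidth_gbs(128, 3.2))  # hypothetical 128-bit LPDDR4: ~51.2 GB/s (the "50+ GB/s" case)
```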
 

Vash63

Member
Oct 28, 2017
1,681
I don't think so either. Nvidia already decided on TX2 with Denver anyway.


Battery life on a 16nm TX2 should be the same as Mariko, without the added RAM. Heck, just having Mariko on day 1 with 50% more power/speed (600 GFLOPs) in docked mode would have been great. Perhaps the CPU could have been upclocked slightly too (A57s or A72s). I'd say the Switch's biggest bottleneck is the bandwidth; it's only twice as much as the Wii U's. If they got their 128-bit bus at 50+ GB/s with Mariko's 50% GPU boost, we would have seen a noticeable performance boost, more 1st party games at 1080p, and a solid 720-900p for more third party games.

It is what it is. Nintendo cares more about efficiency and cost savings, and Nvidia didn't think about using stock ARM CPUs for the TX2.

You say without the added RAM but then you acknowledge that the memory bandwidth is its main limitation... you need twice as many dies to double the memory bus...
 

Galava

▲ Legend ▲
Member
Oct 27, 2017
5,080
Minecraft with Ray Tracing and DLSS 2.0 coming April 16
www.nvidia.com

The Minecraft with RTX Beta Is Out Now!

Learn how to download and install the Minecraft with RTX beta, and the 6 curated Creator Worlds that demonstrate the capabilities of path-traced Minecraft. Also, get our newest Game Ready Driver, see the latest ray-traced trailer and RTX ON-OFF comparisons, and discover how you can win prizes...
 
OP

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
minecraft-with-rtx-beta-nvidia-dlss-2-0-geforce-rtx-2080-ti-color-light-shadow-performance.png

minecraft-with-rtx-beta-nvidia-dlss-2-0-geforce-rtx-1920x1080-color-light-shadow-performance.png



this stage seems to go ham with the reflections and GI

minecraft-with-rtx-beta-color-light-and-shadow-003-rtx-off.jpg

minecraft-with-rtx-beta-color-light-and-shadow-003-rtx-on.jpg
 

Buenoblue

Banned
May 5, 2018
313
Got Control free with my 2070 Super but wasn't happy with the performance so only played a couple hours; struggled with 1080p low RTX. DLSS was blurry on my 75 inch TV. Tried the new update and WOW. Playing with all RTX on with DLSS 2.0 at 835p to 1440p and it runs super smooth and looks great. If every major game can incorporate DLSS 2.0, this is a game changer.
 

Deleted member 25834

User requested account closure
Banned
Oct 29, 2017
394
Goddamn. If this is what it looks like at normal viewing distance, I can't imagine why this wouldn't be utilized more.

Wow.
 

Vash63

Member
Oct 28, 2017
1,681
So you can't get 50GB/s on a 128-bit bus with 4GB of RAM, or even 6GB?

No, I'm saying you can't get a 128 bit memory bus on two 32-bit memory dice. You would need more dice. This increases physical size of the PCB, greatly increases power draw, and increases cost. There's a reason most devices in this form factor don't ship with 64-bit memory buses.
 
OP

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
MechWarrior 5 got new DXR features in beta (reflections and shadows). According to Overclock3D, they ain't all that great except in the hangar.

14185448557l.jpg


14185448919l.jpg


www.overclock3d.net

MechWarrior 5: Mercenaries - RTX Beta Analysis - OC3D

MechWarrior 5: Mercenaries – RTX Beta Analysis Late last month, Piranha Games’ MechWarrior 5: Mercenaries received support for Nvidia’s DLSS 2.0 technology, allowing the power of AI to be used to boost in-game framerates without any noticeable drops in image quality. Now, MechWarrior 5 has...
 

ShadowFox08

Banned
Nov 25, 2017
3,524
TX2 can get 59GB/s on a 128-bit bus with two 4GB modules
Yes I know.

No, I'm saying you can't get a 128 bit memory bus on two 32-bit memory dice. You would need more dice. This increases physical size of the PCB, greatly increases power draw, and increases cost. There's a reason most devices in this form factor don't ship with 64-bit memory buses.

Of course you can't get two 32s to make a 128-bit bus width. Anyway, I don't care about what could have been with how the Switch launched. Let's move forward.

I think the biggest question is whether a 128-bit bus width could fit in a handheld the size of the Switch for Switch 2. I'm not too familiar with that on the hardware side, and considering we don't know what the die size of Switch 2 will be... Even with a 5nm shrink, it will have more cores, so it could be around the same size as the 20 or 12nm TX1. And if that's the case, couldn't they stack the RAM vertically? Two 64-bit memory dice, if space was an issue. Switch 2 needs to have at least double the bus width or it will be severely bottlenecked.
 

Vash63

Member
Oct 28, 2017
1,681
Yes I know.

No, I'm saying you can't get a 128 bit memory bus on two 32-bit memory dice. You would need more dice. This increases physical size of the PCB, greatly increases power draw, and increases cost. There's a reason most devices in this form factor don't ship with 64-bit memory buses.
Of course you can't get two 32s to make a 128-bit bus width. Anyway, I don't care about what could have been with how the Switch launched. Let's move forward.

I think the biggest question is whether a 128-bit bus width could fit in a handheld the size of the Switch for Switch 2. I'm not too familiar with that on the hardware side, and considering we don't know what the die size of Switch 2 will be... Even with a 5nm shrink, it will have more cores, so it could be around the same size as the 20 or 12nm TX1. And if that's the case, couldn't they stack the RAM vertically? Two 64-bit memory dice, if space was an issue. Switch 2 needs to have at least double the bus width or it will be severely bottlenecked.

No, it doesn't need double the bus width. If you're hoping for that I think you'll be severely disappointed. Mobile phone hardware has been 32-64 bit for generations and never goes up. High-end €500+ GPUs have been stuck at 256-384 bit for about a decade now, and this also doesn't go up. Memory buses don't go up with time; wider buses cost space and power, which are things that do not become more available as time progresses.

What needs to go up is higher data rates through newer tech like LPDDR4X or LPDDR5. This will increase bandwidth without requiring 4 memory dice and all of their associated motherboard traces and power draw.
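
Putting numbers on that: with the same arithmetic as above, an unchanged 64-bit bus with faster memory delivers roughly what a 128-bit LPDDR4 setup would have, just from the higher per-pin rate. A rough sketch (rates are the headline LPDDR figures, so treat them as approximate):

```python
# Peak bandwidth = bus width (bits) / 8 * per-pin data rate (GT/s); rates are headline figures.
bus_bits = 64
for name, rate_gts in [("LPDDR4-3200", 3.2), ("LPDDR4X-4266", 4.266), ("LPDDR5-6400", 6.4)]:
    print(f"{name}: {bus_bits / 8 * rate_gts:.1f} GB/s")
# LPDDR5 on a 64-bit bus (~51 GB/s) lands about where a 128-bit LPDDR4 bus would,
# without the extra dice, motherboard traces, and power draw.
```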
 

padlock

Banned
Oct 27, 2017
867
Sorry if this has already been answered but I've got a couple of questions.

Does this currently support ultrawide resolutions (i.e. 3440x1440)?

Could this work for VR?
 

ShadowFox08

Banned
Nov 25, 2017
3,524
No, it doesn't need double the bus width. If you're hoping for that I think you'll be severely disappointed. Mobile phone hardware has been 32-64 bit for generations and never goes up. High-end €500+ GPUs have been stuck at 256-384 bit for about a decade now, and this also doesn't go up. Memory buses don't go up with time; wider buses cost space and power, which are things that do not become more available as time progresses.

What needs to go up is higher data rates through newer tech like LPDDR4X or LPDDR5. This will increase bandwidth without requiring 4 memory dice and all of their associated motherboard traces and power draw.
LPDDR4X only goes up to 33GB/s and LPDDR5 up to 50GB/s. I really doubt we'll get higher data rates. Bad enough we only got double the bandwidth from Wii U to Switch. 50GB/s could be enough to give us parity with the Xbone, but it's not looking good for a console that is nearing the PS4 Pro in power.
 

Vash63

Member
Oct 28, 2017
1,681
LPDDR4X only goes up to 33GB/s and LPDDR5 up to 50GB/s. I really doubt we'll get higher data rates. Bad enough we only got double the bandwidth from Wii U to Switch. 50GB/s could be enough to give us parity with the Xbone, but it's not looking good for a console that is nearing the PS4 Pro in power.

Don't rely too much on the raw specs. Nvidia's designs are vastly more efficient than AMD's, especially with regards to memory bandwidth. They were on their 3rd generation of delta color compression across the memory bus before AMD had their first generation out. PC GPUs from AMD regularly have higher memory bandwidth specs relative to similarly performing Nvidia cards.

As a mobile part Switch 2/pro is going to trail PS4 Pro in raw specs most likely (except in CPU, where Jaguar was already a mobile part in 2013), but more modern designs and a better overall architecture can bridge the gap.
 
OP

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
LPDDR4X only goes up to 33GB/s and LPDDR5 up to 50GB/s. I really doubt we'll get higher data rates. Bad enough we only got double the bandwidth from Wii U to Switch. 50GB/s could be enough to give us parity with the Xbone, but it's not looking good for a console that is nearing the PS4 Pro in power.
There are probably other options, but it would be a matter of what Nintendo's willing to pay.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Tried out 1080p Control with DSR enabled so 1080p/4k DLSS.

Yeah, it's legit. Way better than it was before. DLSS was terrible in this game at launch. Now it actually looks like 4k native. Crazy.

2070 super pretty much pulling off 60fps maxed too. A few things turned down here and there but nothing major.
 

ppn7

Member
May 4, 2019
740
Tried out 1080p Control with DSR enabled so 1080p/4k DLSS.

Yeah, it's legit. Way better than it was before. DLSS was terrible in this game at launch. Now it actually looks like 4k native. Crazy.

2070 super pretty much pulling off 60fps maxed too. A few things turned down here and there but nothing major.
I don't understand, you mean you can use DSR and DLSS at the same time?
 

BeI

Member
Dec 9, 2017
5,976
Tried out 1080p Control with DSR enabled so 1080p/4k DLSS.

Yeah, it's legit. Way better than it was before. DLSS was terrible in this game at launch. Now it actually looks like 4k native. Crazy.

2070 super pretty much pulling off 60fps maxed too. A few things turned down here and there but nothing major.

Good to know that seems to work pretty well. I thought I read someone else tried downsampling for a DLSS 2X effect, but it didn't work.
 

ppn7

Member
May 4, 2019
740
Yeah. You have to edit the game's display ini file to make it work, but it's an easy edit.

Thank you, does the DSR help with the artifacting/aliasing from DLSS?
How many FPS do you lose with DSR, and is it worth it?
To be clear, you have a 4K monitor? You set the DLSS render to 1080p and then apply DSR to go back up to 4K? I'm lost 😅
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Thank you, does the DSR help with the artifacting/aliasing from DLSS?
How many FPS do you lose with DSR, and is it worth it?
To be clear, you have a 4K monitor? You set the DLSS render to 1080p and then apply DSR to go back up to 4K? I'm lost 😅

I have a 1440p monitor. Using the 2.25x DSR option in the Nvidia control panel for the 4K option. For Control you have to go into the game folder and change the render.ini file to render at 1080p and display at 4K. You have to have already had the DLSS checkbox checked as well. Then it will do proper 1080p DLSS to 4K.

It is still using DLSS so I guess there is some hair artifacting, but otherwise it is extremely sharp. Looks awesome. I have some things set down to medium, no SSAO, and no diffuse RTX, but all other RTX effects are turned on. Ultra textures and filtering. Looks great and runs 60-65 pretty well locked (running G-Sync on a FreeSync display).

From other posts/guides I've seen, SSAO and the RTX diffuse lighting in this game cause weird effects on surfaces and on Jesse's and NPC faces. Nvidia's RTX guide also says to turn off SSAO if using RTX features.
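
For anyone trying to reproduce this: DSR factors multiply the pixel count, so 2.25x is 1.5x per axis, which is exactly how a 1440p monitor ends up presenting a 4K target for DLSS to fill. A quick sketch of the arithmetic (the render.ini key names are left out since I'd only be guessing at them):

```python
import math

# A DSR factor scales total pixels, so each axis scales by sqrt(factor).
def dsr_resolution(native_w: int, native_h: int, factor: float) -> tuple[int, int]:
    s = math.sqrt(factor)
    return round(native_w * s), round(native_h * s)

print(dsr_resolution(2560, 1440, 2.25))  # (3840, 2160): the 4K "display" the game sees
# The ini edit then tells the game to render at 1920x1080 and output at 3840x2160,
# i.e. the standard 1080p -> 4K Performance-mode DLSS upscale.
```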
 

ppn7

Member
May 4, 2019
740
I have a 1440p monitor. Using the 2.25x DSR option in the Nvidia control panel for the 4K option. For Control you have to go into the game folder and change the render.ini file to render at 1080p and display at 4K. You have to have already had the DLSS checkbox checked as well. Then it will do proper 1080p DLSS to 4K.

It is still using DLSS so I guess there is some hair artifacting, but otherwise it is extremely sharp. Looks awesome. I have some things set down to medium, no SSAO, and no diffuse RTX, but all other RTX effects are turned on. Ultra textures and filtering. Looks great and runs 60-65 pretty well locked (running G-Sync on a FreeSync display).

From other posts/guides I've seen, SSAO and the RTX diffuse lighting in this game cause weird effects on surfaces and on Jesse's and NPC faces. Nvidia's RTX guide also says to turn off SSAO if using RTX features.

Thank you! Great to know that it works well!