
taggen86

Member
Mar 3, 2018
464
Those performance tests were done without ray tracing, yes?

There is a 1440p performance comparison in the video with ray tracing on, and 1440p DLSS has the same performance hit as 1440p with a 0.8 shading rate. For some reason, they didn't include a 4K performance comparison with ray tracing on, which is a little suspicious... Maybe DLSS is more beneficial there, with a lower performance hit than 1800p, as you theorize? (And it didn't fit the narrative they wanted to convey?) I remember seeing a performance advantage for DLSS over 1800p with ray tracing on when I tested it myself last week.
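To put rough numbers on that comparison, here's a quick pixel-count calculation. It's illustrative only; whether the game's "shading rate" scales each axis or the total pixel count is my assumption, so both readings are shown:

```python
# Rough pixel-count math behind these comparisons. Whether a 0.8 "shading
# rate" applies per axis or to total pixels is an assumption here, so both
# interpretations are computed.

def pixels(w, h):
    return w * h

p1440 = pixels(2560, 1440)   # 3,686,400 px
p1800 = pixels(3200, 1800)   # 5,760,000 px
p2160 = pixels(3840, 2160)   # 8,294,400 px (native 4K)

per_axis  = p1440 * 0.8 * 0.8   # ~2.36 MP if 0.8 scales each axis
per_pixel = p1440 * 0.8         # ~2.95 MP if 0.8 scales total pixels

print(f"1800p shades {p1800 / p2160:.0%} of the pixels of native 4K")
print(f"1440p @ 0.8 shading rate: {per_axis / 1e6:.2f} or {per_pixel / 1e6:.2f} MP shaded")
```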
 

dgrdsv

Member
Oct 25, 2017
11,846
Doesn't TAA, which is forced in this game, produce artifacts as well?
All TAA implementations produce artifacts, but each produces a different set of them, and some also apply an overly strong post-sharpen.

Personally, I still think that DLSS 1x is a waste of time for everyone involved and hope that we'll see DLSS 2x at some point.
 

TheModestGun

Banned
Dec 5, 2017
3,781
How bad of a bottleneck is my i7 7700K with an RTX 2080 Ti?
Compared to my GTX 1070 everything runs much smoother, but then there are games like AC: Odyssey which constantly jump from 60 fps down to 45 at 1440p.
It won't be a bottleneck at all. Games like AC Odyssey are extremely GPU bound, especially anything over 1080p.

For reference, I get frame rates in the 90s at 1080p on my RTX 2080 with an AMD Ryzen 2600 OC'd to 4 GHz.

I get close to a locked 60 at 4K with an 80% resolution scale.
 

JumbiePrime

Member
Feb 16, 2019
1,878
Bklyn
Afterburner beta (the regular version doesn't support the RTX OC Scan yet).

Tips:
-Go into the advanced settings to unlock voltage editing/monitoring.
-Use RTSS/Afterburner to make sure the relevant stats (voltage/core clock/temp/etc.) are displayed in the OSD overlay so you can see exactly what they are in-game (because I promise they will be shifting slightly on the fly even though you set them to something else).
-Edit your fan curve in the advanced settings too. Find the max % you're comfortable with noise-wise and set the curve so it hits your max fan speed around 77-80°C. I suggest a max of around 50-54% for the noise-to-cooling ratio; that was enough to keep my GPU at or below 79°C with my highest OC in the most demanding games (and cooler in less demanding ones) without being obnoxiously loud. Around 44% is where the fans start to be audible, and past 54% it's just too loud for me, but that's still enough to keep temps controlled with my setup and OC. If you don't mind putting up with a jet engine on your desk, by all means go higher and enjoy cooler temps; it just isn't worth it to me at that point. (A rough sketch of such a curve as data points follows below.)
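Here's a minimal sketch of the kind of fan curve described above, written as (temperature, fan %) points with linear interpolation between them. The numbers loosely follow this post (audible around 44%, max ~54% by 77-80°C); the 40°C floor is my own filler value, and none of it is a recommendation for every card or case:

```python
# (temperature in °C, fan speed in %) points, interpolated linearly between
# them. Values loosely follow the post above; the 40 °C idle point is invented.
FAN_CURVE = [(40, 30), (60, 44), (77, 54), (80, 54)]

def fan_speed(temp_c: float) -> float:
    """Return the target fan speed (%) for a given GPU temperature."""
    if temp_c <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]
    for (t0, s0), (t1, s1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if temp_c <= t1:
            # linear interpolation between the two neighbouring points
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return FAN_CURVE[-1][1]  # flat above the last point

for t in (50, 65, 78, 85):
    print(f"{t} °C -> {fan_speed(t):.0f} %")
```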

As a comparison, at stock settings I was hitting the default 83°C temp limit within 5-10 minutes of playing a game, before doing any of this, with no OC and everything at stock, heh. For reference, the max thermal limit on these cards is 88°C according to Nvidia, but I don't recommend going past the default 83°C; even if it's perfectly safe up until 88-89°C, that's just so hot!
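If you'd rather log temps and clocks to a console or file than read them off the RTSS overlay while testing, a generic NVML sketch like this works on any recent GeForce card. It assumes the pynvml package is installed and isn't tied to Afterburner in any way:

```python
# Logs GPU temperature, core/memory clocks, and board power once per second
# via NVML. Requires the pynvml package (pip install pynvml).
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        temp  = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        core  = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        mem   = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_MEM)
        watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0  # NVML reports mW
        print(f"{temp:3d} C  core {core:4d} MHz  mem {mem:4d} MHz  {watts:6.1f} W")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```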

Now for the fun part...

-Run the OC Scanner to let the software find a baseline overclock; it will show you your curve with editable points on the graph. Click Apply to save the new voltage/core curve. Save it to a profile slot too, so you still have it if you need to close and reopen Afterburner.

-Pick a voltage point on the graph you'd like to undervolt to and move all the points AFTER it to match (so the curve forms a flat/straight line after that point). Click Apply to save your edits.

-Then it's all testing from there. Raise the core clock at that voltage point (and re-flatten the curve after it again), then test in a couple of different games or benchmarks. For me, Monster Hunter World was my go-to, lol. I had stable numbers all day in FFXV and The Division games, but Monster Hunter World would crash to desktop after a few minutes with a DX GPU error. If you get crashing, you know your core clock is too high, your voltage is too low, or both. Now I'm stable in everything I play for as long as I play.

-VRAM OC: this one is tricky. You can start by bumping it up by about 200 MHz at a time and testing, but with the VRAM on newer GPUs it's harder to tell when you've reached the limit. From what I understand, you might not see artifacts telling you you're pushing it too hard; instead the VRAM error-corrects, and even though you won't see artifacts, you'll actually start getting lower performance once it has to do that, and that doesn't show up in any way you can see on screen. So you want to test your performance in a benchmark, or by watching your framerate, as you increase the VRAM clock. I've seen most people getting to around +500-600, and others +1000 or a max of +1200. I was happy with +1000 and didn't feel like maxing it out at +1200.
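Since the limit only shows up as a silent framerate regression, the bookkeeping can be as simple as this: apply each offset manually in Afterburner, run the same benchmark, and stop at the last offset that still improved things. The fps numbers below are made-up placeholders, not measurements:

```python
# Average fps measured at each memory offset (MHz). Placeholder numbers only;
# the point is the pattern: fps keeps creeping up until error correction
# starts eating the gains, then it regresses.
results = {
    0:    96.0,   # stock
    200:  96.8,
    400:  97.5,
    600:  98.1,
    800:  98.4,
    1000: 98.6,
    1200: 97.9,   # lower than +1000 -> back off
}

best = max(results, key=results.get)
print(f"Best measured offset: +{best} MHz ({results[best]:.1f} fps)")
for offset, fps in results.items():
    note = "  <- regression, error correction likely kicking in" \
        if fps < results.get(offset - 200, fps) else ""
    print(f"+{offset:>4} MHz: {fps:.1f} fps{note}")
```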

-OC the core clock first; once you've found your stable point, then do the VRAM.

-Don't forget to hit Apply after each step to save/load the current changes. And don't forget to save to a profile preset so your edits don't get lost if you restart Afterburner or your PC.

-Finally, the most annoying part: even when you have a nice, perfectly stable curve and overclock saved, the GPU (or Afterburner, not sure which) will still lower (or raise) your curve very slightly on the fly depending on temperature (even though it's not hitting the set temp limit). It usually adjusts around 60°C and then again around 70°C. I think it's just how the boost clock works on these GPUs, and there's no way to turn it off. So what I do is, after I have everything the way I want it, I load up a game with my new curve, wait for the temps to reach the hottest they'll get in-game, then alt-tab out and see that the curve has been adjusted down by about 20-40 MHz, so I readjust the curve to how I had it and hit Apply. The software then kind of learns that this is what I want the curve to be at that temperature, so the curve actually starts slightly higher than what I set, then settles down to the curve I set within a minute or so and stays locked there.
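To illustrate what that temperature-dependent behaviour looks like in practice, here's a tiny worked example. The ~60/70°C thresholds and the step sizes are approximations taken from this post, not official GPU Boost figures:

```python
# Effective clock vs. temperature, using the rough thresholds described above.
CURVE_CLOCK = 2010  # MHz set at the chosen voltage point

def effective_clock(temp_c: int) -> int:
    clock = CURVE_CLOCK
    if temp_c >= 60:
        clock -= 15   # first temperature bin (approximate step)
    if temp_c >= 70:
        clock -= 15   # second temperature bin (approximate step)
    return clock

for t in (50, 65, 75):
    print(f"{t} °C -> ~{effective_clock(t)} MHz")   # 2010, 1995, 1980
```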


There's a tutorial I followed, but I can't remember which one exactly. It didn't mention all this other stuff I found out while going through the process myself. Lots of little headaches I wish the tutorial had covered, but now you know as much as I do if you want to give it a shot.
Holy hell, thanks for all the details! Is there any way I can save/bookmark a post somewhere?
 

Flappy Pannus

Member
Feb 14, 2019
2,340
Good report from Hardware Unboxed, however they somewhat glossed over how TAA can look in motion, albeit they did cover some instances of DLSS flickering pretty badly.

A much better showing, but my enthusiasm is again tempered when compared to 1800p, especially with sharpening. I like some elements of DLSS better than 1800p, some not, which is a problem for DLSS, considering regular scaling + sharpening is an option in virtually every game at every aspect ratio and doesn't require you to wait on a patch.

DLSS might still be far superior in games with very poor TAA implementations and it of course can still improve (and the 1440p results look much better), but if we're still having to pick-and-choose which 'parts' are superior vs. regular scaling in its best implementation yet, it's a little difficult to get excited about.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,930
Berlin, 'SCHLAND
A much better showing, but my enthusiasm is again tempered when compared to 1800p, especially with sharpening. I like some elements of DLSS better than 1800p, some not, which is a problem for DLSS, considering regular scaling + sharpening is an option in virtually every game at every aspect ratio and doesn't require you to wait on a patch.
If you take a base 1440p image and apply post sharpen to it, it will really not look like those DLSS results there ^^ I tried it, it looks, hah, a good deal different ^_^
 

Braag

Member
Nov 7, 2017
1,908
Looking at some benchmarks around the net, it doesn't seem like a 7700K at 4.5 GHz should limit AC:O to 45 fps. But with open-world games it's never clear what is actually comparable and what might happen at various points in the game.

What's your GPU usage when it drops?

I'll have to do some tests.
 

shadowhaxor

EIC of Theouterhaven
Verified
Oct 27, 2017
1,728
Claymont, Delaware
So close to pulling the trigger on the RTX 2080 Ti. I was originally against it, but seeing the performance of Anthem (I know, not the best), The Division 2, and a few other games at 3440x1440 ultrawide, I need the extra POWA. My GTX 1080 Ti will move to my HTPC/streaming PC, and my GTX 1080 from that PC is up for sale (it's listed for sale/trade for $400 if anyone's interested).

The improved NVENC encoding is also a selling point for me. I stream and record tons of gameplay, so I'd be taking advantage of the encoding.

Anyone have the EVGA GeForce RTX 2080 Ti XC Ultra Gaming? If you do, anything you like/dislike about it?
 

Lakeside

Member
Oct 25, 2017
9,216
So close to pulling the trigger on the RTX 2080 Ti. I was originally against it, but seeing the performance of Anthem (I know, not the best), The Division 2, and a few other games at 3440x1440 ultrawide, I need the extra POWA. My GTX 1080 Ti will move to my HTPC/streaming PC, and my GTX 1080 from that PC is up for sale (it's listed for sale/trade for $400 if anyone's interested).

The improved NVENC encoding is also a selling point for me. I stream and record tons of gameplay, so I'd be taking advantage of the encoding.

Anyone have the EVGA GeForce RTX 2080 Ti XC Ultra Gaming? If you do, anything you like/dislike about it?

Not the same card, but my kids have that model in the 2070. The only complaint I can think of is that the HSF is REALLY BIG, so make sure that'll work for you. I can promise you won't be doing the hand-me-down thing with it in a year or two, at least not with that SFF case.

Otherwise it's a reference PCB with an "A"-bin chip, as far as I know.
 

luoapp

Member
Oct 27, 2017
505
Has Nvidia published a DLSS spec yet (as in APIs, a programming guide, etc.)? Or is it still under very tight wraps?
 

low-G

Member
Oct 25, 2017
8,144
Has Nvidia published a DLSS spec yet (as in APIs, a programming guide, etc.)? Or is it still under very tight wraps?

What API? The "API" is asking Nvidia to take your game into their black-box-making black box.

At some point Nvidia should let devs make their own DLSS patterns, and separate that from the drivers, but I don't expect that to happen this video card generation.
 

luoapp

Member
Oct 27, 2017
505
What API? The "API" is asking Nvidia to take your game into their black-box-making black box.

At some point Nvidia should let devs make their own DLSS patterns, and separate that from the drivers, but I don't expect that to happen this video card generation.

So, what do we know about these black boxes, i.e. DLSS, exactly? Anything apart from "AI, deep learning..."? What does it do mathematically, or algorithmically?
 

low-G

Member
Oct 25, 2017
8,144
So, what do we know about these black boxes, i.e. DLSS, exactly? Anything apart from "AI, deep learning..."? What does it do mathematically, or algorithmically?

Do you have any experience with neural nets? They tend to be very opaque. I wouldn't be surprised if Nvidia themselves knows very little about a specific game's DLSS set.
 

AegonSnake

Banned
Oct 25, 2017
9,566
God i love rtx on metro. Every game needs it.

Rtx on
hhb6LqP.jpg


Rtx off
zuS7Z6Y.jpg


It's actually kinda funny how improperly lit the base game can be. Like what in the world is lighting the floor in the second pic lol
 

dgrdsv

Member
Oct 25, 2017
11,846
Has Nvidia published a DLSS spec yet (as in APIs, a programming guide, etc.)? Or is it still under very tight wraps?
What specs do you expect to be published? DLSS is currently implemented through NGX, NV's proprietary machine learning API. There's nothing really to program there beyond adding the DLSS DLL calls to your renderer, I guess.

DLSS is an implementation of ML AA technology, and it is and will remain NV proprietary (since they are the ones who made it). I can see them porting it from NGX to DirectML at some point, which would potentially allow it to run on any GPU supporting DirectML, but that's about as much as we can expect here. Anyone can try to implement their own ML AA over DirectML once it comes out of beta in Win10 1903.
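For anyone wondering what "ML AA / ML upscaling" means mathematically, here's a deliberately tiny, purely illustrative sketch of a learned upscaler (an ESPCN-style sub-pixel network in PyTorch). This is emphatically not DLSS's actual architecture, which Nvidia has not published; the "black box" is the learned convolution weights, which aren't human-interpretable:

```python
# Illustrative only: a minimal learned 2x upscaler, NOT DLSS (whose network
# and training data are not public). The "learning" lives in the conv weights.
import torch
import torch.nn as nn

class TinyUpscaler(nn.Module):
    def __init__(self, scale: int = 2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 3 * scale * scale, kernel_size=3, padding=1),
            nn.PixelShuffle(scale),  # rearranges channels into a higher-res image
        )

    def forward(self, low_res: torch.Tensor) -> torch.Tensor:
        return self.body(low_res)

frame = torch.rand(1, 3, 720, 1280)       # a 720p frame (N, C, H, W)
upscaled = TinyUpscaler(scale=2)(frame)   # -> (1, 3, 1440, 2560)
print(upscaled.shape)
```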
 

Flappy Pannus

Member
Feb 14, 2019
2,340
If you take a base 1440p image and apply post sharpen to it, it will really not look like those DLSS results there ^^ I tried it, it looks, hah, a good deal different ^_^

Are you referring to the 1440p DLSS? If so, I do mention that it apparently looks better than 1080p scaled:

DLSS might still be far superior in games with very poor TAA implementations and it of course can still improve (and the 1440p results look much better),

If you're referring to 4K DLSS, as I've said before, there's little point in comparing it to 1440p; it does not equate to 1440p performance, since DLSS of course has overhead. It only makes sense to compare it to the regularly scaled resolution whose performance profile it matches, which in Metro is 1800p + sharpening.
 

xMaximusx

Banned
Oct 28, 2017
235
I've got a 1080 Ti with a 6700K, for the tech guys. Would you prioritize a new CPU or GPU at this point?
 

Rhaya

Member
Oct 25, 2017
888
Welp, I bit the bullet on this beast.

20190301_195403.jpg


Was quite the expensive bullet to bite on but YOLO ! :)
 

tuxfool

Member
Oct 25, 2017
5,858
God i love rtx on metro. Every game needs it.

Rtx on
hhb6LqP.jpg


Rtx off
zuS7Z6Y.jpg


It's actually kinda funny how improperly lit the base game can be. Like what in the world is lighting the floor in the second pic lol
It is more detailed, but once again it is way too dark. All that light coloured metal would bounce light like crazy.
 

Serpens007

Well, Tosca isn't for everyone
Moderator
Oct 31, 2017
8,127
Chile
Reading this thread always makes me want an upgrade I probably don't need instead of one I actually would, lol.
 

Valkrai

Member
Oct 25, 2017
2,495
I got a new computer too. My last Intel/Nvidia build was an i5 3570K + 760/960 from 2013/2014.

Now it'll be an i7 8700 + 2070.
 
Oct 25, 2017
41,368
Miami, FL
So close to pulling the trigger on the RTX 2080 Ti. I was originally against it, but seeing the performance of Anthem (I know, not the best), The Division 2, and a few other games at 3440x1440 ultrawide, I need the extra POWA. My GTX 1080 Ti will move to my HTPC/streaming PC, and my GTX 1080 from that PC is up for sale (it's listed for sale/trade for $400 if anyone's interested).

The improved NVENC encoding is also a selling point for me. I stream and record tons of gameplay, so I'd be taking advantage of the encoding.

Anyone have the EVGA GeForce RTX 2080 Ti XC Ultra Gaming? If you do, anything you like/dislike about it?
I can't lie; being well over 100fps in Anthem and other games is a feelsgoodman every time I load up with my 2080Ti. I have no complaints.
 

Valkrai

Member
Oct 25, 2017
2,495
Go Ryzen and you're future-proof for quite a while.

I did, but I really didn't like the software on AMD's side. Plus, the Radeon card in that build kind of sucked.

Also, streaming isn't as great since it takes away some performance from my CPU; at least with NVENC encoding it's a smaller performance hit.
 
Oct 25, 2017
41,368
Miami, FL
Hey bros, any secrets to getting the card to run with a GPU clock at 2000 MHz or better? Mine occasionally hits 2020 MHz or so, but seems to hover around 1970 MHz most of the time. Everything I'm playing continues to have an extremely high framerate (games like Destiny 2 and Apex spend most of their time at my set max of 117 fps at 3440x1440 with everything maxed), but it's weird to see the clock not be particularly high.

I'm using Afterburner with the power limit at 130% and the core clock set to a curve after doing some of those tests. Memory clock at +700 MHz, with a fairly aggressive custom fan curve. No driver crashes or anything, just a lower-than-expected clock. Any suggestions, or should I not even bother worrying about it?

Ordered the EVGA RTX 2080 Ti XC after going back and forth. My GTX 1080 will sell, and that will be the end of it.
That's the one I have. grats! It's been good to me. No complaints.

What drivers are you running? I have a 2080 and I get dips down to the 50s, and my friend with a 2080 Ti goes down to 70.
Still on 417.71 because of Apex.

In Anthem I haven't noticed dips below maybe 70 or 80 in the city. Generally 100+ in the world.
 

pulsemyne

Member
Oct 30, 2017
2,635
I did, but I really didn't like the software on AMD's side. Plus, the Radeon card in that build kind of sucked.

Also, streaming isn't as great since it takes away some performance from my CPU; at least with NVENC encoding it's a smaller performance hit.
I was talking about the CPU, not the graphics card.
 

Rellyrell28

Avenger
Oct 25, 2017
28,896
Okay, so I just got and installed my RTX 2060. Question: it's supposed to come with a free game, either Anthem or Battlefield. Where do you redeem it? I ordered it from Newegg, btw.
 

Isee

Avenger
Oct 25, 2017
6,235
How is memory overclocking on 2080 Ti cards? I'm currently at +1000 MHz, which seems a bit high?
I'm always a bit cautious about VRAM overclocking because there is no way to check the temperature there (without modding).
Also, would it help cool the memory if I installed thermal pads on the back of the PCB where the memory chips are located, so there's some heat transfer to the metal backplate?