> Doesn't TAA, which is forced in this game, produce artifacts as well?
Yes, but it produced strange artifacts that were clearly visible.
Those performance tests were done without ray tracing, yes?
Lol, good luck convincing the people gushing over the idea of consoles with 12 TFLOP GPUs and 8-core Ryzens.
All TAA produces artifacts, just a different set of them, and some TAA implementations also add an overly strong post-sharpen on top.
> Doesn't TAA, which is forced in this game, produce artifacts as well?
It won't be a bottleneck at all. Games like AC Odyssey are extremely GPU bound, especially at anything over 1080p.
> How bad of a bottleneck is my i7 7700K with an RTX 2080 Ti? Compared to my GTX 1070 everything runs much smoother, but then there are games like AC: Odyssey which jump from 60fps to 45 constantly at 1440p.
Holy hell, thanks for the details! Is there any way I can save/bookmark a post somewhere?

Afterburner beta (the regular version doesn't support the RTX OC Scan yet).
Tips:
-Go into advanced settings to unlock voltage editing/monitoring.
-Use RTSS/Afterburner to make sure the relevant stats (voltage/core clock/temp/etc.) are displayed in the OSD overlay so you can see exactly what they are in-game (because I promise they will shift slightly on the fly even though you set them to something else).
-Edit your fan curve in the advanced settings too. Find the max % you're comfortable with noise-wise and set the curve so it hits that max speed at around 77-80c. I suggest a max of around 50-54% for the noise ratio; that was enough to keep my GPU at or below 79c with my highest OC in the most demanding games (and cooler in less demanding ones) without being obnoxiously loud. Around 44% is when the fans start to be audible, and past 54% it's just too loud for me, but that's still enough to keep temps controlled with my setup and OC. If you don't mind putting up with a jet engine on your desk, by all means go higher and enjoy cooler temps; it's not worth it for me personally at that point.
As a comparison, at stock settings for everything I was hitting the default 83c temp limit within 5-10 minutes of playing a game before doing this, with no OC at all. For reference, on these cards the max thermal limit is 88c according to Nvidia, but I don't recommend going past the default 83c even though it's perfectly safe up to 88-89c; that's just so hot!
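If it helps picture it, a custom fan curve like the one described is just a piecewise-linear map from temperature to fan speed. This is an illustrative sketch only; the breakpoint numbers are pulled from the post (fans audible around 44%, capped near 54% by roughly 77c), not anything Afterburner exposes programmatically:

```python
# Illustrative fan curve: (temperature_c, fan_percent) breakpoints, linearly
# interpolated between points, with the last point held as the max speed.
def fan_speed(temp_c, points=((40, 30), (60, 44), (77, 54))):
    """Return the fan % for a given GPU temperature on a piecewise-linear curve."""
    if temp_c <= points[0][0]:
        return points[0][1]
    if temp_c >= points[-1][0]:
        return points[-1][1]  # hold max speed past the last breakpoint
    for (t0, p0), (t1, p1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between neighbouring curve points
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)
```

So at 60c the fans sit at the "just audible" 44%, and anything at or past 77c pins them at the 54% cap.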
Now for the fun part...
-Run the OC Scanner to let the software find a baseline overclock; it will show your curve with editable points on the graph. Click Apply to save the new voltage/core curve. Save it to a profile slot too so you have access to it in case you need to close/reopen Afterburner.
-Pick a voltage point on the graph you'd like to undervolt to and move all the points AFTER it to match, so the curve forms a flat/straight line from that point on. Click Apply to save your edits.
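The flattening step above boils down to clamping every curve point at or above your chosen voltage to the clock you picked there. A minimal sketch, with made-up (millivolt, MHz) pairs standing in for the graph points:

```python
# Flatten the voltage/frequency curve after a chosen undervolt point:
# every point at or above target_mv gets clamped to target_mhz, so the
# GPU never boosts past that clock (or voltage) under load.
def flatten_curve(curve, target_mv, target_mhz):
    """curve is a list of (millivolts, MHz) pairs, ascending by voltage."""
    return [(mv, mhz if mv < target_mv else target_mhz) for mv, mhz in curve]

curve = [(700, 1500), (800, 1700), (900, 1850), (1000, 1950), (1050, 2000)]
flattened = flatten_curve(curve, 900, 1900)
# points at or above 900 mV are now all 1900 MHz; lower points are untouched
```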
-Then it's testing after that. Raise the core clock at that voltage point (and re-flatten the curve after it again), then test in a couple of different games or benchmarks. For me, Monster Hunter World was my go-to lol. I had stable numbers all day in FFXV and The Division games, but Monster Hunter World would crash to desktop after a few minutes with a DX GPU error. If you get crashing, you know your core clock is too high, your voltage is too low, or both. Now I'm stable in everything I play for as long as I play.
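The raise-then-retest loop above can be sketched like this. `stress_test` is a hypothetical stand-in for whatever you use to check stability (a Monster Hunter World session, a benchmark run) returning True when nothing crashed:

```python
# Step the clock up at the undervolt point until the stress test fails,
# then keep the last clock that survived.
def find_stable_clock(stress_test, start_mhz=1900, step=15):
    """stress_test(mhz) -> True if the run at that clock was stable."""
    clock = start_mhz
    while stress_test(clock + step):
        clock += step  # this clock survived, try the next bump
    return clock       # highest clock that passed the test
```

In practice each "iteration" is minutes of gameplay, so you only get a handful of steps per session; the point is just that you converge on the highest clock that never crashes.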
-VRAM OC: this one is tricky. Start by bumping it up by about 200MHz at a time and testing. With VRAM OC on newer GPUs it's harder to tell when you've reached the limit. From what I understand, you might not see artifacts telling you you're pushing too hard; instead the VRAM error-corrects, and once it has to do that you'll actually start getting lower performance even though nothing visibly breaks. So test your performance in a benchmark, or watch your framerate, as you increase the VRAM clock. I've seen most people get to around +500-600, others +1000 or a max of +1200. I was happy with +1000 and didn't feel like maxing it out at 1200.
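Since the failure mode is a silent performance drop rather than visible artifacts, VRAM testing is really "step up until the benchmark score regresses". A sketch of that loop, where `run_benchmark` is a hypothetical stand-in for any repeatable benchmark score at a given memory offset:

```python
# Walk the memory offset up in fixed steps and stop at the first offset
# where the benchmark score stops improving -- the point where GDDR6
# error correction presumably starts eating the gains.
def find_vram_offset(run_benchmark, step=200, max_offset=1200):
    """run_benchmark(offset_mhz) -> benchmark score (higher is better)."""
    best_offset, best_score = 0, run_benchmark(0)
    for offset in range(step, max_offset + step, step):
        score = run_benchmark(offset)
        if score <= best_score:  # regression: back off to the last good offset
            break
        best_offset, best_score = offset, score
    return best_offset
```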
-OC the core clock first; once you've found your stable point, do the VRAM.
-Don't forget to hit Apply after each step to save/load the current changes. Also save to a profile preset so your edits don't get lost if you restart Afterburner or your PC.
-Finally, the most annoying part: even when you have a nice, perfectly stable curve and overclock saved, the GPU (or Afterburner, not sure which) will still lower (or raise) your curve very slightly on the fly depending on temperature, even though it's not hitting the set temp limit. It usually adjusts around 60c and again around 70c. I think it's just how the boost clock works on these GPUs and there's no way to turn it off. So after I have everything the way I want it, I load up a game with my new curve, wait for temps to reach the hottest they'll get in-game, then alt-tab out and see that the curve got adjusted down about 20-40MHz. I readjust the curve to how I had it and hit Apply. The software then kind of learns that this is what I want the curve to be at that temperature; the curve actually starts slightly higher than what I set, but settles down to the curve I set within a minute or so and stays locked there.
There's a tutorial I followed, but I can't remember which one exactly, and it didn't mention all this other stuff I found out while going through the process myself. Lots of little headaches I wish the tutorial had covered, but now you know as much as I do if you want to give it a shot.
If you take a base 1440p image and apply a post-sharpen to it, it really will not look like those DLSS results there ^^ I tried it and it looks, hah, a good deal different ^_^
> A much better showing, but my enthusiasm is again tempered when compared to 1800p, especially with sharpening. I like some elements of DLSS better than 1800p, some not - which is a problem for DLSS, considering regular scaling + sharpening is an option in virtually every game at every aspect ratio and doesn't require you to wait on a patch.
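For reference, the "post sharpen" being compared against here is usually just an unsharp mask: add back a scaled copy of the difference between the image and a blurred version of it. A rough numpy sketch; the 3x3 box blur and 0.5 amount are arbitrary illustrative choices, not what any particular game's sharpen filter actually uses:

```python
import numpy as np

def unsharp_mask(img, amount=0.5):
    """Post-sharpen a grayscale image in [0, 1]: img + amount * (img - blurred)."""
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    # 3x3 box blur: average each pixel with its eight neighbours
    blur = sum(padded[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)
```

Flat regions come through unchanged while values on either side of an edge get pushed apart, which is why sharpened upscales look crisper but can also ring around high-contrast detail.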
Looking at some benchmarks around the net, it doesn't seem like a 7700K at 4.5GHz should limit AC:O to 45 FPS. But with open-world games it's never clear what is actually comparable and what might happen at various points in the game.
What's your GPU usage when it drops?
So close to pulling the trigger on the RTX 2080 Ti. I originally was against it, but seeing the performance of Anthem (I know, not the best), The Division 2, and a few other games at 3440x1440 ultrawide, I need the extra POWA. My GTX 1080 Ti will move to my HTPC/streaming PC, and my GTX 1080 from that PC is up for sale (it's in the for sale/trade thread for $400 if anyone's interested).
The improved NVENC encoding is also a selling point for me. I stream and record tons of gameplay, so I'd be taking advantage of the encoding.
Anyone have the EVGA GeForce RTX 2080 Ti XC Ultra Gaming? If you do, anything you like/dislike about it?
Has nVidia published a DLSS spec yet (as in APIs, programming guide, etc.)? Or is it still under very tight wraps?
What API? The "API" is asking Nvidia to take your game into their black-box-making black box.
At some point Nvidia should let devs make their own DLSS patterns and separate that from the drivers, but I don't expect it to happen this video card generation.
So, what do we know about these black boxes, i.e. DLSS, exactly? Anything apart from "AI, deep learning..."? What does it do mathematically, or algorithmically?
Not true at all for me.
> Like I said before, at 4k rtx I think 0.6 and 0.7 shading rate looks better than DLSS.
Agree 100%, I hope more games use it for lighting, not just shadows or reflections.
> God i love rtx on metro. Every game needs it.
> Rtx on
> Rtx off
> It's actually kinda funny how improperly lit the base game can be. Like what in the world is lighting the floor in the second pic lol
What specs do you expect to be published? DLSS is currently implemented through NGX - NV's proprietary machine learning API. There's nothing really to program there beyond adding DLSS DLL calls to your renderer, I guess.
> Has nVidia published DLSS spec yet (as in APIs, programming guide, etc.)? Or it's still under a very tight wrap?
If you take a base 1440p image and apply post sharpen to it, it will really not look like those DLSS results there ^^ I tried it, it looks, hah, a good deal different ^_^
DLSS might still be far superior in games with very poor TAA implementations, and it can of course still improve (and the 1440p results look much better).
It is more detailed, but once again it is way too dark. All that light-coloured metal would bounce light like crazy.
> God i love rtx on metro. Every game needs it.
GPU. You'll be fine on a 6700K for another couple of years at least.
> I've got a 1080Ti with a 6700K, for the tech guys. Would you prioritize a new CPU or GPU at this point?
I can't lie; being well over 100fps in Anthem and other games is a feelsgoodman every time I load up with my 2080 Ti. I have no complaints.
> So close to pulling the trigger on the RTX 2080 Ti. I originally was against it, but seeing the performance of Anthem (I know, not the best), The Division 2, and a few other games at 3440x1440 ultrawide, I need the extra POWA.
Go Ryzen and you're future-proof for quite a while.
> I got a new computer too. My last Intel/Nvidia build was an i5 3570K + 760/960 from 2013/2014. Now it'll be an i7 8700 + 2070.
That's the one I have, grats! It's been good to me. No complaints.
> Ordered the EVGA RTX 2080 Ti XC after going back and forth. My GTX 1080 will sell, and that will be the end of it.
Still on 417.71 because Apex.
> What drivers are you running? I have a 2080 and I get dips down to the 50s and my friend with a 2080 Ti goes down to 70.
Tough call. The only GPU upgrade out there for you is the 2080 Ti, and that's going to set you back $1400. I'd probably sit on what you've got for a while if I were you.
> I've got a 1080Ti with a 6700K, for the tech guys. Would you prioritize a new CPU or GPU at this point?
Welp, I bit the bullet on this beast.
Was quite the expensive bullet to bite, but YOLO! :)
What games are you running right now where you're not happy with the performance at the settings you want to play at?
> I've got a 1080Ti with a 6700K, for the tech guys. Would you prioritize a new CPU or GPU at this point?
I think it would be better to keep both for now and wait for the next generation of both CPU and GPU.
> I've got a 1080Ti with a 6700K, for the tech guys. Would you prioritize a new CPU or GPU at this point?
I've got a 1080Ti with a 6700K, for the tech guys. Would you prioritize a new CPU or GPU at this point?
I was talking about the CPU, not the graphics card.
> I did, but I really didn't like the software from AMD's side. Plus the Radeon card on there kind of sucked. Also streaming isn't as great since it takes away some performance on my CPU; at least with NVENC encoding it's a smaller performance hit.
Finally get yourself an avatar, icecold. You look like some schmoe who will get banned and forgotten with that weird default egg-human one. haha
They should email you a code.
> Okay, so I just got and installed my RTX 2060. Question is, when I ordered it, it was supposed to come with a free game, either Anthem or Battlefield. Where do you redeem them? I ordered it from Newegg btw.
I couldn't find it at first but now I did so I'm good.