headspawn

Member
Oct 27, 2017
14,672
So, the only important question... what does this break? After all, this is a new Nvidia driver.
 

killer7

Banned
Nov 22, 2018
609
This guy shows performance with an RTX 2080 (old driver) and an RTX 2070 S (old driver vs. new driver).

In the first video, "RTX 2080 vs RTX 2070 S", the frame rate is about 100 fps (old driver).

Settings Ultra+High: MSAA (8X) and FXAA (on)




In the second video, "RTX 2070 S (old driver) vs RTX 2070 S (new driver)", the frame rate is more than 119 fps.

Settings Ultra+High: MSAA (2X) and FXAA (off)



 
Last edited:

AldzDrakul

Member
Oct 28, 2017
48
I don't know if some people had these issues before, but I noticed some graphical bugs in two games specifically... like literally flashing colored lights. It happens in GTA V and Remnant: From the Ashes. All other games are good, and I even ran benchmarks like FurMark and other stress tests. No issues or graphical bugs in those or any other games.
I only see it in the sky in GTA V specifically, and in Remnant on the menu and when I look at puddles of water in the game... there's like a bright, flashy bug.

I have an EVGA RTX 2070.
 

BeI

Member
Dec 9, 2017
6,056
I'm actually really liking the sharpening filter! Makes Skyrim SE look less blurry (just need to figure out why it's locked at 57 fps now).

I wasn't expecting both the sharpening and low latency modes to work on the old 1060 I've got, but it was a pleasant surprise after switching from an RX 470. However, some of my games don't want to launch or hang on startup now, unfortunately.
 

SolidSnakeUS

Member
Oct 25, 2017
9,928
I'm curious, as someone with dual monitors (one a 1440p G-Sync monitor using DP, the other a 1080p HDMI monitor): why is it that whenever I update my drivers, the 1440p monitor is fully disabled until I reboot?
 

Kyle Cross

Member
Oct 25, 2017
8,518
I was going to finally install the new driver, but when I googled for the driver page I saw a news article saying the new driver is causing frame rate dips and stutters in games. Is that true?
 

gozu

Banned
Oct 27, 2017
10,442
America
As some food for thought regarding integer scaling being restricted to Turing cards (whether you like it or not is up to you, but I haven't seen anybody discuss why this is on a technical level yet, so I figured I'd throw this information out there): the most likely reason is that they are utilizing the Int32 cores of said Turing cards, something that Pascal and earlier cards simply do not have.
Traditionally, trying to do integer processing on a GPU produces fuzzy results, because integers in shaders had to be processed as floats regardless, so even values you'd think were integers wouldn't necessarily compare to exactly 0 or 1, for example.
Int32 cores solve this by outright doing what you'd expect with an integer value (no floating-point fuzziness; 1 will always equal 1, etc.). So even the shaders that claim to be doing this already are technically faking it, because your typical GPU can only calculate it as a float in the first place. This is a "fun" problem when dealing with colour palettes when restoring DOS-era games.

This is all an educated guess of course, Nvidia haven't explained the sauce behind it yet, but I'm betting it's that.

I concur. It's heartening to see skepticism of corporate motives, but Nvidia is innocent this time.
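To make the float "fuzziness" described above concrete, here's a quick Python sketch (CPU-side Python standing in for shader code, which isn't runnable here): repeated floating-point accumulation drifts away from the exact integer result, while true integer arithmetic never does.

```python
# Accumulate 0.1 ten times in floating point: the result is NOT
# exactly 1.0, because 0.1 has no exact binary representation.
total = 0.0
for _ in range(10):
    total += 0.1

print(total == 1.0)   # False: accumulated rounding error
print(total)          # 0.9999999999999999

# With real integer arithmetic there is no such drift:
count = sum(1 for _ in range(10))
print(count == 10)    # True: 1 always equals 1
```

This is the same class of problem that makes "integer" comparisons in float-only shader hardware unreliable without dedicated Int32 units.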
 

Skyfireblaze

Member
Oct 25, 2017
11,257
As some food for thought regarding integer scaling being restricted to Turing cards (whether you like it or not is up to you, but I haven't seen anybody discuss why this is on a technical level yet, so I figured I'd throw this information out there): the most likely reason is that they are utilizing the Int32 cores of said Turing cards, something that Pascal and earlier cards simply do not have.
Traditionally, trying to do integer processing on a GPU produces fuzzy results, because integers in shaders had to be processed as floats regardless, so even values you'd think were integers wouldn't necessarily compare to exactly 0 or 1, for example.
Int32 cores solve this by outright doing what you'd expect with an integer value (no floating-point fuzziness; 1 will always equal 1, etc.). So even the shaders that claim to be doing this already are technically faking it, because your typical GPU can only calculate it as a float in the first place. This is a "fun" problem when dealing with colour palettes when restoring DOS-era games.

This is all an educated guess of course, Nvidia haven't explained the sauce behind it yet, but I'm betting it's that.

I just read this, and while I don't understand it on a technical level, I get the gist of it. That said, I'm curious now: independent of variable integer scaling, is there anything that prevents a Pascal or earlier card from displaying a 1080p image on a 4K display with nearest neighbour and no filters, so each pixel ends up as a 2x2 block?
 

datamage

Member
Oct 25, 2017
921
Anyone else with a 4k TV unable to set the integer setting without it resetting to aspect ratio?

Finally got around to trying this (or wanting to), and I'm experiencing the same thing. The setting defaults back to Aspect Ratio when I hit Apply. I disconnected my primary monitor and the issue persists. (LG OLED C6)

Fucking ay.
 

TeenageFBI

One Winged Slayer
Member
Oct 25, 2017
10,367
Man, a bunch of games started crashing for me with the integer scaling driver. Witcher 3 and Nex Machina, for example. They were fine on the driver before that.

2080 Ti

In other news, I'd also love a nearest neighbour scaling option. It wouldn't be as flawless as integer scaling, but it would still be better than the blurry nonsense we have now when we force games into fullscreen.

(I have a 1440p monitor, so running old 1024x768 games with integer scaling results in massive black bars, basically identical to the no-scaling option. Nearest neighbour would be a nice compromise.)
 
Last edited:

Gestault

Member
Oct 26, 2017
13,522
Forza Horizon 4 on the Ultra preset on my Max-Q RTX 2070 jumped from achieving high 80s/low 90s to about 119 fps. That's no joke.
 

Dreamboum

Member
Oct 28, 2017
23,031
Can someone explain to me integer scaling? What does it even mean? Pixel art wasn't displayed through integer scaling before? What?
 

dgrdsv

Member
Oct 25, 2017
12,094
Can someone explain to me integer scaling? What does it even mean? Pixel art wasn't displayed through integer scaling before? What?
You get integer increments of resolution without any filtering with integer scaling. So, for example, a 320x240 game will be scaled to 640x480, 1280x960, 2560x1920, etc. with each pixel of the original resolution being shown as 2x2, 4x4, 8x8, etc. without any smoothing in place. The most modern application of such scaling is outputting a 1080p resolution onto a 4K display which is exactly 2x2 scaling.

Before, your resolution got scaled to the monitor resolution (with or without aspect ratio compensation), which pretty much always requires multiplying the original resolution by a number with a fractional component, like 4.27 or something. The stretched image then has to be filtered, as it would exhibit scaling artifacts otherwise, but this filtering always blurs the image.
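The 2x2 pixel duplication described above can be sketched in a few lines of Python (plain lists standing in for an image; this is an illustration of the idea, not Nvidia's implementation): each source pixel becomes an exact NxN block, so no filtering or blurring is ever needed.

```python
def integer_scale(pixels, factor):
    """Scale a 2D grid of pixel values by a whole-number factor
    using nearest-neighbour duplication (no filtering)."""
    out = []
    for row in pixels:
        # Repeat each pixel `factor` times horizontally...
        scaled_row = [p for p in row for _ in range(factor)]
        # ...then repeat the whole row `factor` times vertically.
        out.extend(list(scaled_row) for _ in range(factor))
    return out

# A 2x2 "image" scaled 2x becomes 4x4; each pixel is now a 2x2 block:
image = [[1, 2],
         [3, 4]]
print(integer_scale(image, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Because every output pixel is an exact copy of one input pixel, the result stays perfectly sharp, which is exactly why 1080p-on-4K (a clean 2x2) is the showcase case.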
 

Dreamboum

Member
Oct 28, 2017
23,031
You get integer increments of resolution without any filtering with integer scaling. So, for example, a 320x240 game will be scaled to 640x480, 1280x960, 2560x1920, etc. with each pixel of the original resolution being shown as 2x2, 4x4, 8x8, etc. without any smoothing in place. The most modern application of such scaling is outputting a 1080p resolution onto a 4K display which is exactly 2x2 scaling.

Before, your resolution got scaled to the monitor resolution (with or without aspect ratio compensation), which pretty much always requires multiplying the original resolution by a number with a fractional component, like 4.27 or something. The stretched image then has to be filtered, as it would exhibit scaling artifacts otherwise, but this filtering always blurs the image.

Thanks, I understand a bit about filtering and scaling, but how does this play into Nvidia's driver?

Imagine I'm playing Hotline Miami at 1080p right now, for example, and I didn't have integer scaling... does this mean I've always played it with linear filtering?

Or does that only apply to 4K? (But why would the filtering even be needed? Isn't the software/game engine deciding that?)
 
Oct 27, 2017
3,660
Thanks, I understand a bit about filtering and scaling, but how does this play into Nvidia's driver?

Imagine I'm playing Hotline Miami at 1080p right now, for example, and I didn't have integer scaling... does this mean I've always played it with linear filtering?

Or does that only apply to 4K? (But why would the filtering even be needed? Isn't the software/game engine deciding that?)

A lot of recent games (especially 2D ones) have an option in the game itself that outputs a nearest-neighbour-scaled image at whatever resolution you specify. The Nvidia setting is for games that only output at lower resolutions.
 

dgrdsv

Member
Oct 25, 2017
12,094
Thanks, I understand a bit about filtering and scaling, but how does this play into Nvidia's driver?
The description is in the driver:

[screenshot of the in-driver description of the integer scaling setting]

Imagine I'm playing Hotline Miami at 1080p right now, for example, and I didn't have integer scaling... does this mean I've always played it with linear filtering?
Modern pixel-art games can be tricky, as they tend not to actually render at the resolution their art is in. They also already have scaling routines in their engines, so they scale and stretch on their own when you specify the target output resolution. But if a game (I don't know about Hotline Miami specifically) does have a fixed low rendering resolution, you can use that in the game's settings together with the driver's integer scaler to output the game to your monitor without any filtering. In most cases with older games this will lead to black borders, though.

Or does that only apply to 4K? (But why would the filtering even be needed? Isn't the software/game engine deciding that?)
This applies to any resolution which is less than or equal to half of your display resolution in both dimensions. The game's engine can do its own thing, but if you're outputting at a resolution which the driver can scale up by a whole-number factor, it will be scaled up.
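The "half your display resolution in both dimensions" rule above amounts to asking how many whole times the render resolution fits into the display. A tiny Python sketch (the function name is illustrative, not Nvidia's API):

```python
def integer_scale_factor(render, display):
    """Return the largest whole-number scale factor that fits the
    display in both dimensions, or None if the render resolution is
    larger than the display. A factor of 1 means the image is shown
    at its native size (with black bars), effectively unscaled."""
    rw, rh = render
    dw, dh = display
    factor = min(dw // rw, dh // rh)
    return factor if factor >= 1 else None

print(integer_scale_factor((1920, 1080), (3840, 2160)))  # 2 -> 1080p on 4K is a clean 2x2
print(integer_scale_factor((1024, 768), (2560, 1440)))   # 1 -> fits only once: big black bars
```

The second case matches the complaint earlier in the thread: 1024x768 on a 1440p monitor only fits vertically once, so integer scaling ends up basically identical to no scaling.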
 

Kyle Cross

Member
Oct 25, 2017
8,518
I'm kinda sorta freaking out right now.

I just updated to the latest Nvidia driver: 436.30

It has broken everything. HDR doesn't work, and I'm getting disastrous performance in Shadow of the Tomb Raider, 14-15 FPS. I've restarted multiple times, I've checked settings inside Nvidia Control Panel. What do I do? I've never had this happen before. Goddammit, I hate driver updates...
 
OP
OP
Mecha Meister

Mecha Meister

Next-Gen Guru
Member
Oct 25, 2017
2,820
United Kingdom
I'm kinda sorta freaking out right now.

I just updated to the latest Nvidia driver: 436.30

It has broken everything. HDR doesn't work, and I'm getting disastrous performance in Shadow of the Tomb Raider, 14-15 FPS. I've restarted multiple times, I've checked settings inside Nvidia Control Panel. What do I do? I've never had this happen before. Goddammit, I hate driver updates...

Uninstall it and roll back to a previous version.
 

fivestarman

Member
Oct 28, 2017
377
Anyone else had issues with Apex Legends with the latest Driver? I have been dealing with some serious stutter when playing Apex, even though the frame rate is staying above 100.

So far I've tried:
- Uninstalling and re-downloading the Apex game files
- Moving the game files to an SSD
- Re-installing Origin
- Using DDU to clear the driver and GeForce Experience, then reinstalling the driver only

If I were to roll back to a previous driver, should I go the DDU route again?

Intel i5 8400k + EVGA 1070Ti
 

Kyle Cross

Member
Oct 25, 2017
8,518
I tried uninstalling the driver to install an older one, and in the process it changed my driver type from Standard to DCH. I'm really confused.
 

Kyle Cross

Member
Oct 25, 2017
8,518
The driver likely didn't install properly for whatever local reason.
You can always clean up with DDU (and without internet connection) and install it on a clean system.
Okay, did this. I'm back on the driver I was on prior to the update. Sadly, the HDR and framerate issues I was getting on the newest driver (which prompted this mess) persist, even though everything was perfect on this driver before the update...

EDIT: Fixed the HDR problem. It was my second display being plugged in. I've always had it plugged in and it has never caused a conflict, but now it does. So I just have to unplug it when playing HDR games that don't use Windows HDR.
 
Last edited: