Unfortunately I don't have resolution options to choose from on my actual TV, and setting it to PC mode didn't do any good. Weird and annoying.

Worked it out: it won't integer scale on my TV if I set it to 1080p, but it DOES integer scale if I set it to 1920x1080 (not HDTV mode). Noticeable difference.
Holy hell, definitely need to grab this. If only my controller wasn't out of commission...
As some food for thought regarding integer scaling being restricted to Turing cards (whether you like that or not is up to you, but I haven't seen anybody discuss why this is on a technical level yet, so I figured I'd throw this information out there): the most likely reason is that they are utilizing the Int32 cores of those Turing cards, something that Pascal and earlier cards simply do not have.
Traditionally, trying to do integer processing on a GPU produces fuzzy results, because integers in shaders had to be processed as floats regardless, so even values you'd think were integers wouldn't necessarily compare to exactly 0 or 1, for example.
Int32 cores solve this by doing exactly what you'd expect with an integer value (no floating-point fuzziness; 1 will always equal 1, etc.). So even the shaders that claim to be doing this already are technically faking it, because your typical GPU can only calculate it as a float in the first place. This is a "fun" problem when dealing with colour palettes in restoring DOS-era games.
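To make that fuzziness concrete, here's a toy CPU-side Python sketch (nothing to do with actual shader code or Nvidia's implementation, the variable names are made up) of why exact comparisons on float-emulated integers go wrong while real integer math doesn't:

```python
# "Integer" emulated with floats, the way older shader models effectively
# worked: ten steps that should land exactly on 1 drift off the mark.
idx_float = 0.0
for _ in range(10):
    idx_float += 0.1
print(idx_float == 1.0)   # False -- it's actually 0.9999999999999999

# Real integer arithmetic, which dedicated Int32 units give you natively:
idx_int = 0
for _ in range(10):
    idx_int += 1
print(idx_int == 10)      # True -- no fuzziness, 1 is always exactly 1
```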
This is all an educated guess of course, Nvidia haven't explained the sauce behind it yet, but I'm betting it's that.
Anyone else with a 4k TV unable to set the integer setting without it resetting to aspect ratio?
Can someone explain to me integer scaling? What does it even mean? Pixel art wasn't displayed through integer scaling before? What?
With integer scaling you get whole-number multiples of the resolution without any filtering. So, for example, a 320x240 game will be scaled to 640x480, 1280x960, 2560x1920, etc., with each pixel of the original resolution being shown as 2x2, 4x4, 8x8, etc. without any smoothing in place. The most modern application of such scaling is outputting a 1080p resolution onto a 4K display, which is exactly 2x2 scaling.
Before, your resolution was scaled to the monitor's resolution (with or without aspect-ratio compensation), which pretty much always requires multiplying the original resolution by some number with a fractional component, like 4.27 or something. The stretched image then needs to be filtered, since it would show scaling artifacts otherwise, but that filtering always blurs the image.
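If it helps, here's a rough Python/NumPy sketch of what integer (nearest-neighbour) scaling boils down to; the function and the CPU-side approach are purely illustrative, the driver obviously does this in hardware:

```python
import numpy as np

def integer_upscale(frame: np.ndarray, display_w: int, display_h: int) -> np.ndarray:
    """Upscale a frame by the largest whole-number factor that still fits the
    display, duplicating each source pixel into an NxN block (no filtering).
    frame is an (H, W, channels) array; centering/black borders would be
    handled separately."""
    src_h, src_w = frame.shape[:2]
    factor = min(display_w // src_w, display_h // src_h)
    if factor < 1:
        raise ValueError("source is larger than the display; integer scaling not possible")
    # np.repeat duplicates rows, then columns: each pixel becomes a factor x factor block
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# Example: a 320x240 frame on a 1920x1080 display scales 4x to 1280x960 (with borders);
# a 1920x1080 frame on a 3840x2160 display scales exactly 2x with no borders.
frame = np.zeros((240, 320, 3), dtype=np.uint8)
print(integer_upscale(frame, 1920, 1080).shape)   # (960, 1280, 3)
```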
Thanks, I understand a bit about filtering and scaling but how does this play in nvidia's driver?
Imagine I was playing Hotline Miami right now at 1080p, for example, and I didn't have integer scaling... does this mean I've always played it with linear filtering?
Or does that only apply to 4K? (but why would the filtering even be needed, isn't the software/game engine deciding that?)
The description is in the driver itself.
Modern pixel art games can be tricky, as they tend not to actually render at the resolution their art is in. They also have scaling routines in their engines already, so they scale and stretch on their own when you specify the target output resolution. But if some game (I don't know about Hotline Miami specifically) does have a fixed low rendering resolution, then you can use it in that game's settings together with the driver's integer scaler to output the game to your monitor without any filtering. In most cases with older games this will lead to black borders, though.
This applies to any resolution that is half or less of your display resolution in both dimensions. The game's engine can do its own thing, but if you're outputting at a resolution the driver can scale up by a whole number, it will be scaled up.
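A tiny hypothetical helper to show the "half or less in both dimensions" rule as I understand it (the numbers are just examples, none of this is from Nvidia's docs):

```python
def describe_integer_scaling(out_w, out_h, disp_w, disp_h):
    """Report what an integer scaler could do for a given output resolution:
    it only kicks in when a whole-number factor of at least 2 fits the display."""
    factor = min(disp_w // out_w, disp_h // out_h)
    if factor < 2:
        return "no integer upscale possible - falls back to normal scaling"
    scaled_w, scaled_h = out_w * factor, out_h * factor
    return (f"{factor}x -> {scaled_w}x{scaled_h}, "
            f"borders {disp_w - scaled_w}px wide / {disp_h - scaled_h}px tall in total")

print(describe_integer_scaling(1920, 1080, 3840, 2160))  # 2x -> 3840x2160, 0px / 0px of borders
print(describe_integer_scaling(640, 480, 1920, 1080))    # 2x -> 1280x960, 640px / 120px of borders
print(describe_integer_scaling(1600, 900, 1920, 1080))   # falls back to normal scaling
```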
I'm kinda sorta freaking out right now.
I just updated to the latest Nvidia driver: 436.30
It has broken everything. HDR doesn't work, and I'm getting disastrous performance in Shadow of the Tomb Raider, 14-15 FPS. I've restarted multiple times, I've checked settings inside Nvidia Control Panel. What do I do? I've never had this happen before. Goddammit, I hate driver updates...
Any idea why this might be happening?
The driver likely didn't install properly for whatever local reason.
Okay, did this. I'm back on the driver I was on prior to the update. Sadly, the HDR and framerate issues I was getting on the newest driver that prompted this mess persist, even though everything was perfect on this driver before the update...
You can always clean up with DDU (with your internet connection disabled so Windows doesn't auto-install a driver in the meantime) and then install the driver on a clean system.