
Deeke

Member
Oct 25, 2017
966
United States
As someone who doesn't know much about VRS (Variable Rate Shading), I did some digging...and this sounds pretty amazing for next-gen.

VRS is basically a nifty trick that lets developers lower the shading fidelity in areas we don't really notice, reducing GPU load and freeing up resources to push higher frame rates. The reduction in quality is slight and apparently not noticeable, and it could help tremendously in third-person open-world games with long draw distances, and maybe even in first-person games with motion blur.

It might be the gateway to the 120FPS (4K?) gaming that Microsoft touted for the Xbox Series X.

In the following VRS On vs VRS Off comparison, the frame rate jumped by 150% when Variable Rate Shading was turned on. As for ray-tracing, VRS may also help offset the FPS hit when RT is enabled in-game.



Right now VRS is only available on Turing-based Nvidia GPUs, but AMD's upcoming Navi cards will support it (along with ray-tracing).

From a 3DMark press release:

Shading rate refers to the number of pixel shader operations called for each pixel. Higher shading rates improve accuracy but are more demanding for the GPU. Lower shading rates improve performance at the cost of visual fidelity.

With Variable-Rate Shading, developers can vary the shading rate within a single frame. By using VRS to lower the shading rate for parts of the frame that are in deep shadow, far from the camera, or peripheral to the player's focus, for example, a game can run at a higher frame rate with little perceptible loss in visual quality.

From Tom's Hardware:

Developers can reduce the visual fidelity in appropriate areas of the frame, so it's less demanding on a PC's graphics card. That can boost framerates and also let lower-end GPUs run a game better than it would without VRS.

I'm not a game developer, so I can't talk at length about how this will affect game performance or development. I'd love to hear more in-depth explanations of the value of VRS, especially in a closed/synergized environment like a console.

But even with this layman's understanding of VRS, the feature seems incredibly valuable and could open up tons of new doors for next-gen experiences.

In tandem with the SSD, which can apparently be used as Virtual RAM to feed data/textures/assets directly to the GPU and CPU (via memory paging? this might be wrong), VRS could make consoles much more efficient and reduce the amount of compromise needed to run games at higher resolutions with higher frame rates.

Bonus: Here's a really long Nvidia VRS presentation from SIGGRAPH 2018:

 

bsigg

Member
Oct 25, 2017
22,536
What is Variable Rate Shading?

In a nutshell, it's a powerful new API that gives the developers the ability to use GPUs more intelligently.

Let's explain.

For each pixel in a screen, shaders are called to calculate the color this pixel should be. Shading rate refers to the resolution at which these shaders are called (which is different from the overall screen resolution). A higher shading rate means more visual fidelity, but more GPU cost; a lower shading rate means the opposite: lower visual fidelity that comes at a lower GPU cost.

Traditionally, when developers set a game's shading rate, this shading rate is applied to all pixels in a frame.

There's a problem with this: not all pixels are created equal.

VRS allows developers to selectively reduce the shading rate in areas of the frame where it won't affect visual quality, letting them gain extra performance in their games. This is really exciting, because extra perf means increased framerates and lower-spec'd hardware being able to run better games than ever before.

VRS also lets developers do the opposite: using an increased shading rate only in areas where it matters most, meaning even better visual quality in games.

On top of that, we designed VRS to be extremely straightforward for developers to integrate into their engines. Only a few days of dev work integrating VRS support can result in large increases in performance.

Our VRS API lets developers set the shading rate in 3 different ways:
  • Per draw
  • Within a draw by using a screenspace image
  • Or within a draw, per primitive
There are two flavors, or tiers, of hardware with VRS support. Hardware that can support per-draw VRS is Tier 1. There's also Tier 2: hardware that can support both per-draw and within-draw variable rate shading.
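As a rough sketch (not from the blog post) of how an engine might detect those tiers through the public D3D12 API, assuming an already-created ID3D12Device, something like this would be checked at startup:

```cpp
#include <windows.h>
#include <d3d12.h>

// Sketch: query which VRS tier the GPU/driver supports.
// Assumes `device` is a valid, already-created ID3D12Device*.
D3D12_VARIABLE_SHADING_RATE_TIER QueryVrsTier(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &options6, sizeof(options6))))
        return D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED;

    // TIER_1: per-draw shading rate only.
    // TIER_2: per-draw, plus screen-space image and per-primitive rates.
    // options6.ShadingRateImageTileSize reports how many pixels each texel
    // of a Tier 2 shading-rate image covers (typically 8 or 16).
    return options6.VariableShadingRateTier;
}
```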

Tier 1

By allowing developers to specify the per-draw shading rate, different draw calls can have different shading rates.

For example, a developer could draw a game's large environment assets, assets in a faraway plane, or assets obscured behind semitransparency at a lower shading rate, while keeping a high shading rate for more detailed assets in a scene.
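To make that concrete, here's a minimal C++ sketch (not from the blog post) of what per-draw VRS might look like with the D3D12 API; the Draw* helpers and the choice of a 2x2 rate for distant geometry are hypothetical, while RSSetShadingRate and the rate constants are the actual API:

```cpp
#include <d3d12.h>

// Hypothetical app-side draw helpers (placeholders, not a real API).
void DrawDistantEnvironment(ID3D12GraphicsCommandList5* cmdList);
void DrawForegroundAssets(ID3D12GraphicsCommandList5* cmdList);

// Per-draw VRS (Tier 1) sketch: different draw calls get different shading rates.
void RecordSceneWithPerDrawVrs(ID3D12GraphicsCommandList5* cmdList)
{
    // Distant / low-importance geometry: shade once per 2x2 block of pixels.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr); // nullptr = default combiners
    DrawDistantEnvironment(cmdList);

    // Detailed foreground assets: back to full-rate (1x1) shading.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
    DrawForegroundAssets(cmdList);
}
```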

Tier 2

As mentioned above, Tier 2 hardware offers the same functionality and more, by also allowing developers to specify the shading rate within a draw, with a screenspace image or per-primitive. Let's explain:

Screenspace image

Think of a screenspace image as a reference image for what shading rate is used for which portion of the screen.

By allowing developers to specify the shading rate using a screenspace image, we open up the ability for a variety of techniques.

For example, foveated rendering: rendering the most detail in the area where the user is paying attention, and gradually decreasing the shading rate outside this area to save on performance. In a first-person shooter, the user is likely paying most attention to their crosshairs, and not much attention to the far edges of the screen, making FPS games an ideal candidate for this technique.

Another use case for a screenspace image is using an edge detection filter to determine the areas that need a higher shading rate, since edges are where aliasing happens. Once the locations of the edges are known, a developer can set the screenspace image based on that, shading the areas where the edges are with high detail, and reducing the shading rate in other areas of the screen. See below for more on this technique…
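Here's a hedged C++ sketch of the screen-space image path (again not from the blog post). It assumes the app has already created a small R8_UINT texture, one texel per shading-rate tile, filled with D3D12_SHADING_RATE values (coarser toward the screen edges for foveation, or finer on detected edges), and already transitioned to the shading-rate-source state:

```cpp
#include <d3d12.h>

// Tier 2 sketch: drive the shading rate from a screen-space image.
// `rateImage` is assumed to be an R8_UINT texture with one texel per
// ShadingRateImageTileSize x ShadingRateImageTileSize block of the render
// target, already in D3D12_RESOURCE_STATE_SHADING_RATE_SOURCE.
void ApplyShadingRateImage(ID3D12GraphicsCommandList5* cmdList,
                           ID3D12Resource* rateImage)
{
    cmdList->RSSetShadingRateImage(rateImage);

    // combiners[0] merges the per-draw base rate with the per-primitive rate;
    // combiners[1] merges that result with the screen-space image.
    // OVERRIDE lets the image decide the final rate here.
    D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,
        D3D12_SHADING_RATE_COMBINER_OVERRIDE
    };
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, combiners);
}
```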

Per-primitive

Specifying the per-primitive shading rate means that developers can, within a draw, specify the shading rate per triangle.

One use case for this would be for developers who know they are applying a depth-of-field blur in their game to render all triangles beyond some distance at a lower shading rate. This won't lead to a degradation in visual quality, but will lead to an increase in performance, since these faraway triangles are going to be blurry anyway.

Developers won't have to choose between techniques

We're also introducing combiners, which allow developers to combine per-draw, screenspace image and per-primitive VRS at the same time. For example, a developer who's using a screenspace image for foveated rendering can, using the VRS combiners, also apply per-primitive VRS to render faraway objects at lower shading rate.
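To illustrate the combiners (a sketch, not from the blog post): the per-primitive rate is whatever the vertex or geometry shader exports through the SV_ShadingRate semantic, and the MAX combiner used below is one plausible policy that keeps the coarser (cheaper) of the two rates at each stage.

```cpp
#include <d3d12.h>

// Sketch: combine per-draw, per-primitive and screen-space-image VRS.
// The per-primitive rate is exported from the vertex/geometry shader via
// the SV_ShadingRate system-value semantic (Tier 2 hardware required).
void SetCombinedVrsState(ID3D12GraphicsCommandList5* cmdList,
                         ID3D12Resource* foveationImage)
{
    cmdList->RSSetShadingRateImage(foveationImage);

    D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_MAX, // base (per-draw) rate vs. per-primitive rate
        D3D12_SHADING_RATE_COMBINER_MAX  // result vs. screen-space image (e.g. foveation map)
    };
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, combiners);
}
```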

devblogs.microsoft.com

Variable Rate Shading: a scalpel in a world of sledgehammers - DirectX Developer Blog

One of the sides in the picture below is 14% faster when rendered on the same hardware, thanks to a new graphics feature available only on DirectX 12. Can you spot a difference in rendering quality? Neither can we. Which is why we’re very excited to announce that DirectX 12 is the first...
 

Dreamboum

Member
Oct 28, 2017
22,838
Can it be used to push for higher graphical fidelity? If so, I wouldn't expect a framerate or resolution increase
 

Alexious

Executive Editor for Games at Wccftech
Verified
Oct 26, 2017
909
Yes, it's going to be crucial going forward, particularly as 8K displays become more prevalent.
 

exodus

Member
Oct 25, 2017
9,936
Consoles are going to leapfrog PCs early next-gen for a bit, I think.

Console games are already in a generally better state due to the widespread use of dynamic resolution and good upscaling techniques. Nvidia/AMD have a lot of driver work to do to catch up. Thankfully things are getting better, with games like Gears 5 and Jedi: Fallen Order having fantastic dynamic resolution support. But, outside of a handful of titles, gaming on a 4K display is a sub-par experience on PC unless you have a 2080 Ti, and even more so if you're someone who likes to aim for ~90-120+ fps. Techniques like dynamic resolution and VRS are going to be absolutely mandatory to bridge the massive 1080p -> 4K gap.

The problem is that a lot of these new features aren't properly supported by the majority of PC hardware out there, so games will be slow to transition. I seriously hope this is handled mostly at a driver/API level so that we get near 100% support from modern games.

This is where we're going to see performance advancements going forward though. We're nearing the end of Moore's Law, and while GPUs will continue to get faster, progress is going to be very slow and incremental. Techniques like DR and VRS will become more crucial as time goes on. Rendering a 4K image at full fidelity is incredibly inefficient.
 
Last edited:

dgrdsv

Member
Oct 25, 2017
11,817
Can it be used to push for higher graphical fidelity? If so, I wouldn't expect a framerate or resolution increase
You can supersample a tile instead of shading it in lower-than-display resolution, so technically yes, it can be used for higher fidelity. But this isn't very interesting, as running shading with supersampling isn't new.
 

Raide

Banned
Oct 31, 2017
16,596
I am interested to see actual in-game usage of this feature, just to see what developers can do.
 

Mars People

Comics Council 2020
Member
Oct 25, 2017
18,177
And yet console games will still probably use this, pump a shit ton more bells and whistles in the game and still end up at only 30fps.

You know it will happen.
 

Nooblet

Member
Oct 25, 2017
13,621
Can it be used to push for higher graphical fidelity? If so, I wouldn't expect a framerate or resolution increase
The idea is that it reduces detail in areas, and in a manner, where it isn't noticeable, so doing the opposite - i.e. increasing detail in areas which are not as taxing - would basically mean adding detail where it would not get noticed. It's more or less pointless to do it in the other direction.
 

exodus

Member
Oct 25, 2017
9,936
And yet console games will still probably use this, pump a shit ton more bells and whistles in the game and still end up at only 30fps.

You know it will happen.

Of course. We're only looking at a GPU that's 2x as powerful as the 1X, so all things being equal, the XSX could theoretically run a 4K30 X1X game at 4K60...and not much more than that. GPUs simply haven't gotten that much faster in the last few years.
 

Wetalo

Member
Feb 9, 2018
724
Does it cause pop-in at all, I'm wondering? Like, does it have the same issues/worries as LODs, or does it somehow improve quality as you get close without any noticeable pop-in?
 

plagiarize

Eating crackers
Moderator
Oct 25, 2017
27,491
Cape Cod, MA
I think people are forgetting just how weak the CPU is in the base consoles when they talk about how Series X 'only' has twice the GPU power of the X.

CPU is definitely a large hurdle for 30 fps in those consoles. They were always underpowered CPU wise (compared to their GPUs or PCs of the time). Series X won't be. For open world style games, 60 fps should be much more readily achievable now.

That said, we don't know how much of a spanner in the works Lockhart might be for such thoughts.
 

Alexious

Executive Editor for Games at Wccftech
Verified
Oct 26, 2017
909
The idea is that it reduces detail in areas and in a manner where it isn't noticeable, so to do the opposite i.e. increase detail in areas which are not as taxing would basically mean you are adding details where it would not get noticed. So it's more or less pointless to do it in other direction.

Not really. You can add extra shading where it would get noticed.
 

exodus

Member
Oct 25, 2017
9,936
I think people are forgetting just how weak the CPU is in the base consoles when they talk about how Series X 'only' has twice the GPU power of the X.

CPU is definitely a large hurdle for 30 fps in those consoles. They were always underpowered CPU wise (compared to their GPUs or PCs of the time). Series X won't be. For open world style games, 60 fps should be much more readily achievable now.

That said, we don't know how much of a spanner in the works Lockhart might be for such thoughts.

Yes, the CPU is a hurdle, but we know from PC ports that the GPU is still primarily the limiting factor at 4K resolution. The X1X GPU is pretty much in line with what we would expect for 4K30. Consoles are currently at the breaking point for both CPU and GPU utilization.
 

exodus

Member
Oct 25, 2017
9,936
Areas that are noticed, and thus not affected by VRS, are already taxing... I feel you wouldn't want to tax them more.

The point is to have more dynamic control of your rendering to get the best visual fidelity you can given your allocated frametime budget. Dynamic Resolution has been absolutely crucial this gen on all consoles to get the best image quality possible at any given point in time. VRS simply allows for even greater granularity, which is a good thing.
 

Alexious

Executive Editor for Games at Wccftech
Verified
Oct 26, 2017
909
Areas that are noticed, and thus not affected by VRS, are already taxing... I feel you wouldn't want to tax them more.

No, it does not mean that at all. Anyway, you don't have to believe me - this is a quote from Microsoft when they announced their VRS API at GDC 2019.

VRS also lets developers do the opposite: using an increased shading rate only in areas where it matters most, meaning even better visual quality in games.
 
Last edited:

exodus

Member
Oct 25, 2017
9,936
My dream for PC gaming moving forward is that any modern GPU will run modern games at a given performance target. Say 60 fps, or 120fps (user customizable). These dynamic rendering techniques will make it so that upgrading your GPU will now simply impact your image quality rather than your performance. DR and VRS will allow for 100% GPU utilization at all times giving you the best image quality possible within your GPU's frametime budget given your performance target.

Foveated rendering support would be icing on the cake, and give PC hardware an even greater advantage, but I feel like support for that will never be terribly widespread.
 

plagiarize

Eating crackers
Moderator
Oct 25, 2017
27,491
Cape Cod, MA
Yes, the CPU is a hurdle, but we know from PC ports that the GPU is still primarily the limiting factor at 4K resolution. The X1X GPU is pretty much in line with what we would expect for 4K30. Consoles are currently at the breaking point for both CPU and GPU utilization.
Twice the GPU power won't get you to 60 in most open world games. Not close. It's worth remembering.

If they can't get the game logic to a stable 60, why limit what you're doing with the GPU? If my choice is unlocked framerate at 45 fps, or locking the game down to 30 and making the game look a lot better, I know which choice I'm going to make. I'd argue most open world games are GPU limited and 30 fps *because* the CPU is weak.
 

pswii60

Member
Oct 27, 2017
26,646
The Milky Way
I think people are forgetting just how weak the CPU is in the base consoles when they talk about how Series X 'only' has twice the GPU power of the X.

CPU is definitely a large hurdle for 30 fps in those consoles. They were always underpowered CPU wise (compared to their GPUs or PCs of the time). Series X won't be. For open world style games, 60 fps should be much more readily achievable now.

That said, we don't know how much of a spanner in the works Lockhart might be for such thoughts.
If the suggestion is that games would be somehow "crippled" by supporting Lockhart, then that would make 60fps on XSX even more likely, given the games would technically be doing less. But I've no doubt games will be developed for XSX/PS5 as the leads and then downscaled to Lockhart.
 

Alexious

Executive Editor for Games at Wccftech
Verified
Oct 26, 2017
909
Twice the GPU power won't get you to 60 in most open world games. Not close. It's worth remembering.

If they can't get the game to a stable 60, why limit what you're doing with the GPU? If my choice is unlocked framerate at 45 fps, or locking the game down to 30 and making the game look a lot better, I know which choice I'm going to make.

Something else that people need to realize is that we won't be forced to 30fps or 60fps any more thanks to Variable Refresh Rate being built into the HDMI 2.1 specification. This means TVs going forward will all feature VRR, so developers do not necessarily have to target 30fps or 60fps; they can just leave it unlocked at 45, 50 or whatever.
 

zombiejames

Member
Oct 25, 2017
11,912
I agree, I love VRS and the impact it can have.

I worry it's going to be treated a lot like temporal anti-aliasing, motion blur, and other features that improve the image/performance in motion but look "bad" in screenshots. You just know some people are going to compare screenshots, see the VRS ones are "blurry", "not detailed", or "smeared" and automatically vilify it because they love their razor-sharp pixels. I don't get it because people play games in motion and not one screenshot at a time, but you know it's going to happen.
 

exodus

Member
Oct 25, 2017
9,936
If the suggestion is that games would be somehow "crippled" by supporting Lockhart then that would make 60fps on XSX even more likely given the games would technically be doing less. But I've no doubt games will be developed for XSX/PS5 as the leads and then downscaled to Lockhart.

With widespread VRS support, I expect Lockhart games to simply run at a lower visual fidelity, with otherwise identical performance. Hopefully.
 

Flappy Pannus

Member
Feb 14, 2019
2,335
I think it will definitely be important, but the gains the 3DMark benchmark is showing somewhat oversell it from a performance-uplift perspective imo - it's designed to benchmark VRS after all, so you can't factor the gains shown here into what you'd see in a regular game.

VRS has been sold as a way to gain performance with no perceptible loss in image quality, but from having run the benchmark myself at 4K on my 1660 (where I got a 38% uptick), I can definitely notice the loss in detail with VRS on vs off. It's a best-case scenario in terms of performance uplift - it's not like rendering at 1080p or anything, but it's definitely blurrier, and it's a scene with a ton of small detail that VRS can touch.

It's one more tool for making 4K/60 viable, but it's not miraculous.
 

pswii60

Member
Oct 27, 2017
26,646
The Milky Way
I agree, I love VRS and the impact it can have.

I worry it's going to be treated a lot like temporal anti-aliasing, motion blur, and other features that improve the image/performance in motion but looks "bad" in screenshots. You just know some people are going to compare screenshots, see the VRS ones are "blurry", "not detailed", or "smeared" and automatically vilify it because they love their razor-sharp pixels. I don't get it because people play games in motion and not one screenshot at a time, but you know it's going to happen.
Motion blur is always the first thing I disable in every game where it is an option! Hate it.
 

Vash63

Member
Oct 28, 2017
1,681
No, it does not mean that at all. Anyway, you don't have to believe me - this is a quote from Microsoft when they announced the VRS API at GDC 2019.

VRS also lets developers do the opposite: using an increased shading rate only in areas where it matters most, meaning even better visual quality in games.

This is a bit pedantic, but I wouldn't call it just "the VRS API", as that makes it sound like they were somehow announcing the first or only VRS API, when one had already shipped and was in use in Vulkan in 2018 (and publicly shown in 2018 in id Tech).
 

Khrol

Member
Oct 28, 2017
4,179
Good call making this thread. This is just another piece in the performance-gain puzzle, but I'm very excited about it.
 

DarthBuzzard

Banned
Jul 17, 2018
5,122
Fixed VRS is just a small stepping stone to dynamic foveated rendering. That is the true revolutionary feature for next gen, though it will be a VR thing and won't exactly work well outside of VR. However, it's what will flip the rendering balance between VR and non-VR, making VR games easier to run than non-VR if they are optimized for this correctly and utilize the full potential of DFR.

Yes, this means VR graphics are generally going to be at the forefront, a complete reversal of today.

Maximum gains of 20x fewer pixels needing to be rendered, and 20x fewer rays needing to be sampled for ray/path tracing. This is what will enable VR GPU performance to jump a whole decade into the future just by having this enabled and utilized.
 
Last edited:

Karak

Banned
Oct 27, 2017
2,088
Something else that people need to realize is that we won't be forced to 30fps or 60fps any more thanks to Variable Refresh Rate being built into the HDMI 2.1 specification. This means TVs going forward will all feature VRR, so developers do not necessarily have to target 30fps or 60fps; they can just leave it unlocked at 45, 50 or whatever.
HDMI specs are not enforced on TVs, so "will all" is not accurate.
 

Alexious

Executive Editor for Games at Wccftech
Verified
Oct 26, 2017
909

They are not enforced, sure, but the vast majority of manufacturers will feature VRR in their televisions from 2020 going forward.
 

Alexious

Executive Editor for Games at Wccftech
Verified
Oct 26, 2017
909
This is a bit pedantic but I wouldn't call it just "the VRS API" as that makes it sound like they were somehow announcing the first or only VRS API, when it was already shipped and in use in Vulkan in 2018 (and publicly shown in 2018 in id Tech)

Right, their API is more accurate.
 

Deleted member 7948

User requested account closure
Banned
Oct 25, 2017
1,285
In tandem with the SSD, which can apparently be used as Virtual RAM to feed data/textures/assets directly to the GPU and CPU (via memory paging? this might be wrong)
Just because you can doesn't mean that you should. The difference in speeds between an SSD and RAM is so big that it doesn't make any sense to do so.
 

Karak

Banned
Oct 27, 2017
2,088
They are not enforced, sure, but the vast majority of manufacturers will feature VRR in their televisions from 2020 going forward.
That's just not the same as "will all", nor what I'm hearing on the testing side - many models below the premium tier, for instance. So that one huge benefit is actually not at all promised, and I think that's important to point out. But here's to hoping.

As for the feature itself, I am all for it. I'm also interested in seeing how it works in games with a huge range of gameplay speeds, and, if it's used super aggressively, whether folks notice it (and how much) during the slow moments in a game.
Fun tech stuff for sure.