> I'm super curious about how much of this update is on the Nvidia side and how much is on DICE's side. I'm a wee bit worried about a trend where ray tracing patches come out and the game has to be updated before it's at a playable level. Super cool update nonetheless :)

As I understand it, DICE aren't using any proprietary Nvidia stuff, just working with Nvidia on implementing DXR (which isn't vendor-specific). I'm not sure when we can expect other vendors to start offering DXR-compatible cards, but when they do, these effects should all be available.
> By the time second-generation RTX cards hit, the performance should be good if they can keep up the optimization work.

By the time second-generation RTX cards hit, there will be new RT-enabled games with RT usage a lot higher than what BFV is doing, and the performance will likely stay more or less the same, at least on the Ultra quality levels.
I still want to raise a problem that isn't directly about the tech, but about game design. I'm not sure this thread is the place, but I'll try planting the seed anyway.
Besides a general fascination with the tech (which I also share), you have to remember that games aren't simply tech. Something looking pretty doesn't mean it's also functional, and in the same way, "realistic" isn't always preferable.
When it comes to Battlefield, I always had a problem with the transition from BC2 to BF3/4, because the latter games had a lot of "environmental noise" that wasn't there in BC2. In BF4 (and BF5 even more) there's always some particle effect on screen: dust, fire sparks or whatever else. BC2, by contrast, mostly showed the static geometry of a map: the terrain, the buildings, the trenches. What was shown on screen was functional to the mechanics of movement and shooting. More importantly, when you saw something "moving" on screen, it was likely another player to shoot at.
The effect is that from BF3 onward it's a lot harder to separate the functional visual cues from all the "shimmering" and visual noise that crowds the screen everywhere. It simply takes a lot more time to parse and interpret the image.
Now, all this could be just the simulation of the confusion of war. But in BF5 these shiny, vibrant reflections are a setting you can turn on and off. My opinion is that they represent a disadvantage for those who decide to turn them on. They once again add even more noise to the image, more bright effects in a game that is already too bright to parse.
Try to play a shooter in a hall of mirrors. It's not ideal.
So, beyond all the fascination with the tech and the beauty of static screenshots, is this stuff really useful to the experience of playing this game?
That's one aspect. The other aspect is about computation budget. It's obvious that these reflections require a whole lot of computation power. If you were designing a game, would you really think it's optimal to spend so much of your computation budget on just reflections and maybe shadows, or maybe you'd prefer to improve those aspects that seem way behind like animations, large scale environments, number of entities and so on? The actual stuff that you directly engage with.
For example, are we sure we NEED physically accurate reflections when approximated/optimized ones can already offer fantastic image quality? Do we really need to simulate the whole environment rather than just screen-space effects? Imho, with all the dev time spent writing that DXR code, and with that computation budget, you could have improved those other aspects in a traditional way, and probably with a better practical outcome.
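To make the screen-space point above concrete, here is a tiny, purely illustrative sketch (a hypothetical 1-D scene with invented object names and positions, not engine code) of the trade-off being discussed: screen-space reflections can only reuse what already landed in the frame buffer, so a mirror aimed at off-screen geometry has nothing to sample, while a world-space ray trace tests the full scene.

```python
# World objects: name -> x position along one axis; the truck sits
# behind the camera, so it never appears in the rendered frame.
scene = {"crate": 7.0, "truck": -3.0}

def on_screen(x):
    # The "screen" only covers world x in [0, 10).
    return 0.0 <= x < 10.0

def ssr_lookup(reflected_x):
    """SSR: only objects that made it into the screen buffer can be found."""
    if not on_screen(reflected_x):
        return None  # the reflected ray left the screen: no data to sample
    for name, x in scene.items():
        if on_screen(x) and abs(x - reflected_x) < 0.5:
            return name
    return None

def world_trace(reflected_x):
    """Full-scene ray trace: tests every object, on screen or not."""
    for name, x in scene.items():
        if abs(x - reflected_x) < 0.5:
            return name
    return None

print(ssr_lookup(-3.0))   # None: SSR cannot show the truck in a mirror
print(world_trace(-3.0))  # truck: a full-scene trace can
```

That missing-data case is exactly what DXR reflections fix; whether fixing it is worth the budget is the question being argued here.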
You have to realize this is just the very beginning of a new technology that represents a paradigm shift in how games are rendered. Of course, viewed from the outside, it's a bit ridiculous to make such a fuss about fancy reflections, but the work being put in today will pay off in the future.
Even from that future-looking perspective, you still have to justify that this is an ideal way to spend the computation budget.
There are many different directions for the technology to improve in, and this just doesn't seem the optimal one. It really does make it all sound like advertising for a product that would otherwise be completely ignored; a way to create desire and sell something that wouldn't get any attention.
And, eventually, it will be forced on us, because everything else will be left behind. So we are now pushed to adopt this tech even if it's not worth it. And it will never prove its worth; it will just be the only thing left on the table.
Yeah, it's weird that they haven't. I mean, even if they can't train the game in real time, they can still train it with pre-rendered footage.
Agreed. Honestly, any game implementing RTX should also use DLSS to properly exploit the hardware of Turing GPUs.
> Agreed. Honestly, any game implementing RTX should also use DLSS to properly exploit the hardware of Turing GPUs.

Yeah, even games that are said to support it are nowhere to be found. Hellblade got a new HDR update but no DLSS, and Darksiders 3 also launched without it. If it was as easy to implement as Nvidia claimed, why aren't there any retail games shipping with it? Only a benchmark demo that keeps being delayed...
Specifically, GeForce RTX 2080 Ti users will now be able to play at over 60 FPS at 2560x1440 with DXR Raytraced Reflections set to Ultra.
Nice. That's decent performance.
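For scale, here's a quick back-of-envelope calculation (my own arithmetic, not a figure from DICE or Nvidia) of what that performance target implies, assuming one reflection ray per pixel:

```python
# Primary reflection-ray budget implied by 2560x1440 at 60 FPS,
# under the (assumed) simplification of exactly one ray per pixel.
width, height, fps, rays_per_pixel = 2560, 1440, 60, 1
rays_per_second = width * height * fps * rays_per_pixel
print(rays_per_second)  # 221184000, i.e. roughly 0.22 "Gigarays"/s
```

Nvidia's marketing figure for the 2080 Ti is around 10 Gigarays/s, so if that number is taken at face value, raw ray throughput isn't the bottleneck; denoising, shading and acceleration-structure updates would be where the frame time actually goes. That reading is speculation on my part, though.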
> DICE should really just appoint you the community manager, Theorry. You've done far more marketing and advertising for the game than EA has since the reveal.

No lies detected.
It's fine if the top-tier settings tank performance hard, but I'd say the biggest issue with this first-gen implementation is the lack of (downward) performance scalability. Dropping it to Low only yielded modest improvements; in an ideal world, the same game would have settings that ran great on a 2070 and settings that tanked a 2080 Ti, to cover low-end and future hardware.
> I'd love to know what that destruction/dynamic geometry related bug was.

Check your PMs :D
Anyway, all of that makes sense, and I think the community as a whole will get much better at effectively using this technology over the coming months and years. And not just raytracing, but also the new pipeline architecture and variable rate shading, which I'm also really excited about.
> That's one aspect. The other aspect is about computation budget. […]

Well, it is an optional effect, so it is not really a problem in that sense; you do not expect everyone to use it at first anyway.
> Agreed. Honestly, any game implementing RTX should also use DLSS to properly exploit the hardware of Turing GPUs.

DLSS requires tensor-core processing, which may be impossible to implement alongside raytracing if those tensor cores are used for RT denoising. In the case of BFV specifically it should be possible, though, as they have stated that they use their own compute-based denoiser.
> It's fine if the top tier settings tank perf hard, but I'd say the biggest issue with this first gen implementation is the lack of (downward) performance scalability. […]

This is more an issue with BFV's RT implementation than with Turing's h/w. People seem to completely miss the point that RT performance is mostly about s/w, not h/w. It is entirely possible to add RTX usage in a way that won't result in any performance loss compared to rendering without RTX, but then you might not be able to spot any gains in image quality either.
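That software-side point can be illustrated with a toy sketch. The roughness values and cutoff below are invented for illustration (DICE have described roughness-based ray culling in their own talks, but this is not their code): the engine, not the hardware, decides how many rays each frame pays for.

```python
# Hypothetical per-pixel materials; only mirror-like surfaces benefit
# from a traced reflection ray at all.
pixels = [
    {"roughness": 0.05},  # polished metal: worth a reflection ray
    {"roughness": 0.40},  # painted wood: a cubemap/SSR fallback looks fine
    {"roughness": 0.90},  # mud: no visible reflection to trace
]

def rays_to_cast(pixels, roughness_cutoff=0.3):
    """Cast a reflection ray only where it could visibly change the image."""
    return sum(1 for p in pixels if p["roughness"] < roughness_cutoff)

print(rays_to_cast(pixels))       # 1: two thirds of the ray cost avoided
print(rays_to_cast(pixels, 1.0))  # 3: brute force, every pixel pays
```

Tightening or loosening that cutoff is exactly the kind of software knob that moves RT cost up and down on identical hardware, which is why scalability is an implementation question.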