
EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
Not really a surprise given they have a marketing deal with Nvidia.
 

Bluelote

Member
Oct 27, 2017
2,024
PC is getting the D3D12 version of CP2077 at launch, and probably as the only version.

What confuses me is that Cyberpunk lists Windows 7 as compatible. I know there's a partial version of DX12 on Windows 7 for WoW and such, but that doesn't look ideal or complete. They list Windows 7 under minimum requirements, along with a GTX 780 (DX11.0 feature support level on DX12). I'm puzzled why they don't simply have DX11 support at this point... DX11 is a great option to have in terms of compatibility and preservation
 

Deleted member 54216

User requested account closure
Banned
Feb 26, 2019
927
Because RT won't be an option at consoles at launch at all?
Yeah, because they targeted current gen. Don't think consoles from 2013 can manage to pull off RT, so what's your point lol.

Pretty sure the next-gen patch/version will change that and we will get RT & 4K HDR, and I don't have to upgrade my graphics card or anything for that 😬
 
Dec 4, 2017
11,481
Brazil
I thought Ray Tracing was some kind of technology developed by Nvidia, like the DLSS
No. Ray tracing is a generic term for a type of 3D rendering; it has existed both practically and theoretically for decades, just not in games and not in real time.
You would be forgiven, seeing how heavily Nvidia pushed it for their RTX 20XX cards. But "ray tracing" is a generic term for methods of global illumination that actually shoot "rays" into the scene to compute what reflections and other shading information will look like. It allows you to very accurately take indirect lighting into account, light which can bounce around the scene before it reaches the object in question. The price you pay for that accuracy is performance: even with dedicated hardware you will see a huge hit when comparing against the same scene without ray tracing.
Thanks for the answers!
 

Jedi2016

Member
Oct 27, 2017
15,669
Yeah, because they targeted current gen. Don't think consoles from 2013 can manage to pull off RT, so what's your point lol.

Pretty sure the next-gen patch/version will change that and we will get RT & 4K HDR, and I don't have to upgrade my graphics card or anything for that 😬
Still gotta upgrade your console for that. Same difference.
 

Mivey

Member
Oct 25, 2017
17,826
I thought Ray Tracing was some kind of technology developed by Nvidia, like the DLSS
You would be forgiven, seeing how heavily Nvidia pushed it for their RTX 20XX cards. But "ray tracing" is a generic term for methods of global illumination that actually shoot "rays" into the scene to compute what reflections and other shading information will look like. It allows you to very accurately take indirect lighting into account, light which can bounce around the scene before it reaches the object in question. The price you pay for that accuracy is performance: even with dedicated hardware you will see a huge hit when comparing against the same scene without ray tracing.
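To make the "shoot rays into the scene" idea concrete, here is a toy sketch in Python: one sphere, one directional light, no bounces or indirect lighting. All the names are made up for the example; it's purely illustrative, not how any real engine or API does it:

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest sphere hit, or None."""
    # Solve |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is assumed normalized, so a == 1
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(origin, direction, center, radius, light_dir):
    """Trace one ray and return a simple diffuse intensity in [0, 1]."""
    t = intersect_sphere(origin, direction, center, radius)
    if t is None:
        return 0.0  # background: the ray hit nothing
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    # Lambertian term: how directly the surface faces the light.
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
```

A real renderer shoots one or more such rays per pixel and recurses for reflections and bounced light; dedicated RT hardware exists to accelerate exactly these intersection queries.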
 

dgrdsv

Member
Oct 25, 2017
11,880
What confuses me is that Cyberpunk lists Windows 7 as compatible. I know there's a partial version of DX12 on Windows 7 for WoW and such, but that doesn't look ideal or complete. They list Windows 7 under minimum requirements, along with a GTX 780 (DX11.0 feature support level on DX12). I'm puzzled why they don't simply have DX11 support at this point...
Because they are likely using D3D12 features that aren't available in D3D11, and porting those features back to D3D11 makes zero sense for a 2020 game.
We'll see how the 780 fares in CP2077, but I expect it to be borderline supported really, as Kepler isn't too good with D3D12.
And c'mon, you can't expect devs to optimize their engines in 2020 for GPUs from 2013.

DX11 is a great option to have in terms of compatibility and preservation
Not sure what you mean by that, but DX11 certainly limits what is possible on modern GPU h/w, and from a compatibility point of view D3D12 isn't that much worse than D3D11 nowadays. Sure, there are still some DX11-only GPUs out there, but they are unlikely to be able to run CP2077 at playable framerates anyway. We're talking about the GTX 500 and HD 6000 series, after all.
 
Oct 27, 2017
9,427
Yeah, because they targeted current gen. Don't think consoles from 2013 can manage to pull off RT, so what's your point lol.

Pretty sure the next-gen patch/version will change that and we will get RT & 4K HDR, and I don't have to upgrade my graphics card or anything for that 😬
You think the next-gen consoles will do full 4K with ray tracing? Closer to 1440p/30fps with reflections for RT, upscaled to 4K. Not really in the same ballpark.
 

TSM

Member
Oct 27, 2017
5,823
This would seem to imply that RT isn't something that will "just work" with AMD cards for existing RT games. If RT implementation ends up being hardware specific that seems like it'd be a huge hurdle for AMD.
 

VoidCommunications

Alt Account
Banned
Aug 2, 2020
199
Am I missing something, or isn't the conclusion that AMD cards can't run the game's RT effects on day 1 just assumption/speculation by WCCFTech?

Not something the developer said.
Yup. UGH.

UGH. So infuriating how little awareness the tech media have. AMD implements the same driver functions and API that NVIDIA does. This is likely purely motivated by the co-marketing deal that NVIDIA has with Cyberpunk, but the writer is too dense to be aware of that stuff.

It is genuinely possible AMD cards won't be able to keep up with ray-tracing in games like this, but from a technical standpoint there's nothing stopping the option from being turned on. That's just more marketing magic.

This would seem to imply that RT isn't just something that will "just work" with AMD cards for existing RT games. If RT implementation ends up being hardware specific that seems like it'd be a huge hurdle for AMD.

The features for RTX are determined by Microsoft's ray tracing API in DX12. Microsoft makes sure to work with both NVIDIA and AMD to create the API, so it's not a case of AMD missing functionality. In order to put out a card with ray tracing, a vendor has to implement exactly the same functionality. To go into more detail, both AMD and NVIDIA have to implement this spec:

DirectX Raytracing (DXR) Functional Spec (engineering specs for DirectX features)

If you implement that spec as a hardware and driver engineer, you can say you support DXR and DirectX ray tracing. Vulkan has a similar spec. Then, as a game developer, you follow that spec from the other end, and you know all the functions work exactly how they're supposed to because the companies implemented the specification. There are no additional features NV would be using. Originally, NVIDIA implemented ray tracing through their own driver extensions, so you had to have an NVIDIA card. But now developers want to transition to using the standard APIs for these tasks rather than special-purpose extension functions. I do not think additional ray tracing extension functions have been added with this hardware.

The NV card may be more performant, but that's no reason to disable the option in the menus. It's purely a marketing decision. The new 30-series does not include new features or additional ray tracing functionality; it's just faster and more performant. This decision appears to come from the co-marketing deal NVIDIA has with Cyberpunk.

I imagine the engineers working on ray tracing at AMD are furious about this decision from CD Projekt. Maybe it's a requirement of the cash NVIDIA hands them, I don't know. It could be that they don't want Cyberpunk to perform outside very specific parameters. I know I would be pissed if people thought all my hard work "wasn't enough" because of some fancy politicking.

This is entirely because NVIDIA has spent a fuck-ton of money developing Cyberpunk 2077. It has always been a title to showcase the RTX effects and especially to justify them. If suddenly you can play it on an AMD card, then NVIDIA will have wasted all their money. Probably wouldn't have been a problem if the game came out on time since AMD wouldn't have had hardware yet.
 

dgrdsv

Member
Oct 25, 2017
11,880
This would seem to imply that RT isn't just something that will "just work" with AMD cards for existing RT games. If RT implementation ends up being hardware specific that seems like it'd be a huge hurdle for AMD.
It will "just work" but that doesn't mean that it will work optimally or properly.

Still, I want to see the exact quote on this from the devs, not WCCFT's paraphrase.
 

Deleted member 54216

User requested account closure
Banned
Feb 26, 2019
927
You think the next-gen consoles will do full 4K with ray tracing? Closer to 1440p/30fps with reflections for RT, upscaled to 4K. Not really in the same ballpark.
Right, I guess you don't know what these 500-buck machines are capable of.
Not in the same ballpark, you're right: I don't have to spend twice that just to get the same result on PC lol.

I just buy a next-gen console, buy a game, and there you go: RT + 4K HDR for 500€.
My body is ready for this gen 👀
 

LCGeek

Member
Oct 28, 2017
5,857
This would seem to imply that RT isn't something that will "just work" with AMD cards for existing RT games. If RT implementation ends up being hardware specific that seems like it'd be a huge hurdle for AMD.

Just like drivers, it's AMD's job to get this done.

It's a huge hurdle, but if they want to compete with Nvidia they've got to start stepping up with dev tools.

RT was never going to be easy or "just work", because AMD is implementing DXR in some fashion. More details would be nice, but as deadlines get closer on RDNA 2 I'm not as hopeful on the software side as I am on hardware. A shame, because they haven't had a chance like this in a decade.
 
Oct 27, 2017
9,427
Right, I guess you don't know what these 500-buck machines are capable of.
Not in the same ballpark, you're right: I don't have to spend twice that just to get the same result on PC lol.

I just buy a next-gen console, buy a game, and there you go: RT + 4K HDR for 500€.
My body is ready for this gen 👀

I am perfectly aware of what they are capable of. That's why the idea of RT + 4K HDR is misguided, as I stated before. Full ray tracing and full 4K will not be a thing for this game on the next-gen consoles.
 

Trace

Member
Oct 25, 2017
4,690
Canada
I am perfectly aware of what they are capable of. That's why the idea of RT + 4K HDR is misguided, as I stated before. Full ray tracing and full 4K will not be a thing for this game on the next-gen consoles.

Unfortunately the words "ray tracing" mean almost nothing now, since it's a marketing term. "Actual" path-traced games, like the Marbles at Night demo Nvidia put out (which ran at 1440p/30fps with DLSS on, on an RTX 3090), are generally what I would consider ray tracing to be. But now RT can mean anything from that to a single RT effect used only for shadows, and gamers think they're the same thing.

Consoles obviously don't have the muscle to do anything close to path tracing this gen; maybe by the time next gen rolls around.
 

ASTROID2

One Winged Slayer
Member
Oct 25, 2017
1,019
I didn't even know there were games that had ray tracing without nvidia cards.
 

packy17

Banned
Oct 27, 2017
2,901
I didn't even know there were games that had ray tracing without nvidia cards.

Nvidia's marketing has done a good job of keeping it that way. To be fair, if ray tracing is a priority for anyone looking to buy a GPU right now, Nvidia is still by far the better choice, simply because of DLSS.
 

VoidCommunications

Alt Account
Banned
Aug 2, 2020
199
I thought Ray Tracing was some kind of technology developed by Nvidia, like the DLSS


Thanks for the answers!
Not trying to look down on you, but this is really impressive for NVIDIA, in my opinion. If they've actually convinced some people that they invented ray tracing, that's fucking wild. Not your fault at all, since that's clearly what their marketing is going for. But damn, that's really upsetting for me personally.

Ray tracing is ancient. It was used to generate some of the first computer-generated images back in the day. It's a really simple mathematical technique, generic much like the words "differentiation", "integration" or "calculus". NVIDIA's hardware includes a special chip that performs a single math operation used in ray tracing very fast. Normally, ray tracing is slow because that single operation takes a while. The operation is ray-triangle intersection: "does this light ray hit this triangle?" needs to be asked millions of times per second. NVIDIA implemented a super cool chip for doing that intersection very fast. AMD has also implemented a chip to do this, but it works differently internally. AMD was slower to make the chip and, by every account, theirs will likely perform slower. That is, they won't be able to perform as many ray-triangle intersections as quickly. But all the API functions will look the same.
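For the curious, that single operation is small enough to write out in full. A standard software formulation is the Möller-Trumbore test; the sketch below is an unoptimized, purely illustrative Python version (the hardware units do the equivalent math in fixed function, massively in parallel):

```python
def ray_triangle_intersect(orig, d, v0, v1, v2, eps=1e-9):
    """Möller-Trumbore ray/triangle test: distance t along the ray, or None."""
    sub = lambda a, b: [x - y for x, y in zip(a, b)]
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    cross = lambda a, b: [a[1] * b[2] - a[2] * b[1],
                          a[2] * b[0] - a[0] * b[2],
                          a[0] * b[1] - a[1] * b[0]]
    e1, e2 = sub(v1, v0), sub(v2, v0)   # two edges of the triangle
    pvec = cross(d, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:                  # ray is parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    tvec = sub(orig, v0)
    u = dot(tvec, pvec) * inv_det       # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, e1)
    v = dot(d, qvec) * inv_det          # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, qvec) * inv_det         # distance along the ray to the hit
    return t if t > eps else None
```

Real hardware runs these tests against an acceleration structure (a BVH) so that most of the scene's triangles never need to be tested at all, rather than brute-forcing them one by one like this.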

Microsoft and the Khronos consortium (the group that makes OpenGL and Vulkan) work with AMD and NVIDIA to determine what hardware they're adding and to present a common interface for that hardware. Both are bodies that both companies work with so that developers have a standardized set of functions to use; then you don't have to relearn everything to change which card you develop for. NVIDIA presented a proposal to both Khronos and Microsoft for their RTX hardware, with some functions to use it, back in 2017 or so. From those proposed functions, Microsoft and Khronos talked to AMD and asked what their hardware was going to look like, so they could choose functions that work for both. It's all set up this way due to a lot of history. With AMD consulted, Microsoft developed and announced DXR. Now both companies implement the same features, and differentiation occurs in performance.

It doesn't make sense to disable the feature on AMD, since the code should look practically identical. The only differences in code will be for performance rather than functionality. On a console game this would imply removing the feature, since you don't want any low-performance features in a console game. But this is PC, and I don't think they are resisting adding the option because it would tank frame rates on AMD or something. That doesn't make much sense to me, and as a developer myself I really don't see how there would be any distinction between the code for the different hardware, at least in terms of functionality. That's precisely because Microsoft has explicitly standardized the functions you need to call and how they work.

It all leads back to the pressure that NVIDIA is putting on this game. Even in my work, we hear about the stress from this game constantly. They put so much money into this game that it needs to pay off big time for NVIDIA. But hey, this will be trivial to test when the game comes out. You should be able to run it on AMD cards anyway with some creative binary patching/hooking and see how much bullshit they're spouting.
 

Spark

Member
Dec 6, 2017
2,539
I don't think i miss anything when I just pay 500 for a console that will most likely also get RT for cyberpunk with the next gen patch/Version.
By the time the consoles get AMD RT for this game (next year), AMD PC cards will also likely get it. And that RT solution will no doubt be worse than what Nvidia is getting this November. I'd rather just play on an Nvidia PC and avoid all that fluffing around. Why wait for a worse experience?
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,684
Unfortunately the words "ray tracing" mean almost nothing now, since it's a marketing term. "Actual" path-traced games, like the Marbles at Night demo Nvidia put out (which ran at 1440p/30fps with DLSS on, on an RTX 3090), are generally what I would consider ray tracing to be. But now RT can mean anything from that to a single RT effect used only for shadows, and gamers think they're the same thing.

Consoles obviously don't have the muscle to do anything close to path tracing this gen; maybe by the time next gen rolls around.
I suspect all it is going to mean is loads of mirror surfaces, even in regular SSR, for a bit until the novelty wears off.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,207
Dark Space
This means basically nothing, as it's safe to assume CDPR hasn't even touched an AMD RDNA2 desktop GPU up to this moment in time.
 

Deleted member 1476

User requested account closure
Banned
Oct 25, 2017
10,449
I know why I prefer consoles. I don't have to deal with this kind of nonsense.

Right, I guess you don't know what these 500-buck machines are capable of.
Not in the same ballpark, you're right: I don't have to spend twice that just to get the same result on PC lol.

I just buy a next-gen console, buy a game, and there you go: RT + 4K HDR for 500€.
My body is ready for this gen 👀

Lmao at the amount of trolling people do here.
 

Jedi2016

Member
Oct 27, 2017
15,669
I'm actually surprised that RT is not vendor-locked anyway.
This is a future I fear, actually. Especially for these games that are "built" around one vendor or the other through these marketing deals. Nvidia-centric game? Sorry, Big Navi won't have RT. AMD-centric game? Sorry, RTX won't work here.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,207
Dark Space
This is a future I fear, actually. Especially for these games that are "built" around one vendor or the other through these marketing deals. Nvidia-centric game? Sorry, Big Navi won't have RT. AMD-centric game? Sorry, RTX won't work here.
Good thing Microsoft's DXR is an open, vendor-agnostic API.

You people act like developers are forced to use some bespoke method to provide ray tracing, when they are not.

It'd be like having DX12 or 16xAF exclusivity.
 

Xando

Member
Oct 28, 2017
27,314
This means basically nothing, as it's safe to assume CDPR hasn't even touched an AMD RDNA2 desktop GPU up to this moment in time.
Shouldn't AMD have sent test units (kinda like dev kits) to big devs by now? (I know nothing about how those PC manufacturers handle the pre-release phase, so sorry if it's dumb.)
 
Dec 26, 2017
1,726
Firelink Shrine
Nvidia put in the effort, so of course.

You can't just tick the "ray tracing" box and have it work like magic, and AMD just don't have the developer relations capacity and capability for whatever reason.

Even if AMD come out with some masterwork GPU, they're still going to be tremendously behind Nvidia on actual implementation.
Until AMD can deliver an AI super-resolution solution on par with DLSS, RTX is irrelevant on their hardware. From my experience, RTX is too much of a performance hit unless paired with DLSS.
 

VoidCommunications

Alt Account
Banned
Aug 2, 2020
199
Good thing Microsoft's DXR is an open, vendor-agnostic API.

You people act like developers are forced to use some bespoke method to provide ray tracing, when they are not.

It'd be like having DX12 or 16xAF exclusivity.
What's that thing about everything old being new again? Wheel of Time or whatever? DirectX exists for a reason. It'd be interesting to see things move away from that for a bit just from big-figure marketing money, but yeah, folks really have odd takes here.

Shouldn't AMD have sent test units (kinda like dev kits) to big devs by now? (I know nothing about how those PC manufacturers handle the pre-release phase, so sorry if it's dumb.)
Yeah, they probably did, but CDPR is all hands on deck right now. I imagine the units are sitting in a box on some person's desk while everyone is busy with other stuff. You don't need to test that much, though; that's kind of one of the whole goals of DirectX in the first place, like what K.Jack is saying above. It's an open, vendor-agnostic API that both manufacturers implement for their own benefit.

Until AMD can deliver an AI super-resolution solution on par with DLSS, RTX is irrelevant on their hardware. From my experience, RTX is too much of a performance hit unless paired with DLSS.
Traditional upscaling techniques (i.e. not "AI") can be very effective, as are other neural-network-based approaches, and new approaches are an active field of research. A lot of good grad students are doing cool work on it. I imagine they'll do just fine. DLSS is cool and all, but so is this:

Neural Supersampling for Real-time Rendering - Meta Research (research.fb.com): "Following the recent advances in image and video superresolution in computer vision, we propose a machine learning approach that is specifically tailored for high-quality upsampling of rendered content in real-time applications."
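For a sense of what "traditional upscaling" means at its simplest, here is a toy bilinear upscaler in plain Python. It's purely illustrative, nothing like DLSS or the neural method in that paper: just the classic distance-weighted blend of the four nearest source pixels.

```python
def bilinear_upscale(img, scale):
    """Upscale a 2D grayscale image (list of rows) by an integer factor."""
    h, w = len(img), len(img[0])
    out_h, out_w = h * scale, w * scale
    out = [[0.0] * out_w for _ in range(out_h)]
    for y in range(out_h):
        for x in range(out_w):
            # Map the output pixel back into source coordinates.
            sy = min(y / scale, h - 1)
            sx = min(x / scale, w - 1)
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            # Blend the four nearest source pixels by distance.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            out[y][x] = top * (1 - fy) + bot * fy
    return out
```

Learned reconstruction methods replace this fixed blend with a model trained on rendered content (often using motion vectors and past frames), which is why they can recover detail that a plain bilinear filter cannot.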
 

DirtySprite3

Banned
Sep 13, 2019
810
Nvidia put in the effort, so of course.

You can't just tick the "ray tracing" box and have it work like magic, and AMD just don't have the developer relations capacity and capability for whatever reason.

Even if AMD come out with some masterwork GPU, they're still going to be tremendously behind Nvidia on actual implementation.
The point doesn't stand, considering a chunk of the user base will be playing on AMD GPUs due to the new consoles. This is more about RDNA 2 not being out when the game releases, plus their partnership with Nvidia.
 

Ra

Rap Genius
Moderator
Oct 27, 2017
12,207
Dark Space
Shouldn't AMD have sent test units (kinda like dev kits) to big devs by now? (I know nothing about how those PC manufacturers handle the pre-release phase, so sorry if it's dumb.)
Ask AMD. I have no idea.

I would just never expect a game to support features on a GPU line that hasn't even released.

The RTX 2080 came out in September of 2018 and the first game to support RT, Battlefield V, came out two months later in November, with huge support from Nvidia.

It's unrealistic to expect a GPU that is only being announced on October 28th to have feature support in a game that releases literally two weeks later.

Also keep in mind that AMD's RT solution is quite different from Nvidia's, so it isn't as simple as porting over the RTX work. Developers are going to need time with AMD's suite.
 

Nzyme32

Member
Oct 28, 2017
5,245
What is their source? Wccftech is not reliable, they post every baseless rumour out there for click baiting.

Indeed.
Though CDPR and Nvidia were talking up RT in Cyberpunk with Nvidia RTX over a year ago, while AMD had no solution in place. So it makes sense that AMD hasn't necessarily been any kind of priority.

No doubt AMD cards will either get RT a while after launch, or simultaneously with whenever an update hits the consoles for the same.