The 20XX series was early-adopter ray tracing technology, unfortunately for those who bought it (though that should have been obvious). Ray tracing, like any facet of a game's rendering, carries a performance hit relative to the scope of its implementation and optimisation, the quality at which it is rendered, and the capacity of the player's GPU. You can take two games with ray tracing enabled and the performance hit will not be uniform, because the extent to which ray tracing is implemented differs. Control, for example, seems to be the most comprehensive and demanding of all currently available ray tracing games. It also depends on how intensive the effect is; Battlefield V, for example, lets you scale the quality of the ray tracing no differently to any other rendering effect.
While it sucks to think of the 20XX series, especially something as high end as a 2080 Ti, not performing adequately in ray tracing, there are so many variables in Cyberpunk 2077's specific implementation and its scalability that it's impossible to make any judgement call yet. And, as noted, irrespective of the 20XX series' raw performance and cost, it is and always will be the worst-performing ray tracing series of cards. By virtue of its market entry point it was never going to set a lasting standard; it will be the one series of NVIDIA cards that struggles to keep up as ray tracing becomes more widespread and uniform, because it launched at a time when ray tracing games did not exist at all, the technology wasn't implemented anywhere, and there was no standard for what kind of ray tracing coverage games would adopt.
Ray tracing also isn't free, as we all know, and like any rendering effect it ties into everything else. If Cyberpunk is an inherently demanding game on the GPU irrespective of ray tracing, then enabling ray tracing will come at an even more noticeable cost, because the performance headroom is smaller. The impact is amplified when the GPU already struggles to keep pace with the game at its highest settings, and it's worsened further by rendering what looks to be fairly comprehensive ray tracing coverage.
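To put some rough numbers on that headroom argument: frame time is a fixed budget, so the same ray tracing cost hurts far more when the base render is already close to the target. The figures below are purely hypothetical for illustration, not benchmarks of any actual game or card:

```python
# Hypothetical frame-time arithmetic -- illustrative numbers only,
# not measurements from Cyberpunk 2077 or any real GPU.
TARGET_FPS = 60.0
BUDGET_MS = 1000.0 / TARGET_FPS  # ~16.7 ms per frame to hold 60fps

def fps_with_rt(base_frame_ms: float, rt_cost_ms: float) -> float:
    """Frame rate once a fixed ray tracing cost is added per frame."""
    return 1000.0 / (base_frame_ms + rt_cost_ms)

# A lighter game leaves headroom to absorb the ray tracing pass:
light = fps_with_rt(8.0, 5.0)   # 13 ms/frame -> still comfortably above 60fps
# An already-demanding game blows the 16.7 ms budget with the same cost:
heavy = fps_with_rt(14.0, 5.0)  # 19 ms/frame -> drops below 60fps
```

The assumed 5 ms ray tracing cost is arbitrary; the point is only that an identical per-frame cost takes an already-strained GPU past the budget while a lighter load stays inside it.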
And that "highest settings" factor is one to consider. The 20XX series is not going to last an entire generation running games maxed out with ray tracing at 4K, DLSS or otherwise. AFAIK the 2080 Ti can't even hit 60fps in Control maxed out with full ray tracing at 1440p. That shit's demanding.
At the end of the day, though, I wouldn't jump to conclusions about Cyberpunk 2077 just yet. There are too many factors to consider and too many scalable variables, all pertaining to a game that is still technically in development and built on what is still emergent technology receiving regular optimisation.