So forgive me as a layperson who just plays games, but I'm gonna ask the dumbest questions in the history of this forum. It won't be the last time.
It seems to me that we are in an age of video game graphics where developers are making all our wildest dreams come true. 120 fps games on consoles? Sure. Ray-traced reflections? Why not? Native 4K gaming? Yes, please. And on top of that, game streaming keeps improving over what we can get now, so more people than ever will be able to afford and experience this visual splendor.
Now, I've been gaming for a while, but I got my first gaming PC around 2014-2016, somewhere in that period. That's when I discovered something previously unknown to me: if you invest in a strong enough piece of hardware, it will play most games reasonably well for a while. It just depends on what you're willing to accept as "reasonably well". For most people on this forum, 30 frames per second just doesn't cut the mustard anymore. I'll be honest: I didn't really notice most of the time, but now that it's been pointed out to me, I do.
However, games keep improving, graphics cards keep getting more capable, and consoles can do more than we ever dreamed.
Here is my question. (finally, right?)
It seems like developers are just challenging themselves right now to jump through previously impossible hoops on the way to putting a better image on the screen at a higher framerate, or making things shine and reflect. The complexity of characters and scenes seems pretty well established and streamlined at this point, and cost-prohibitive to improve much further.
I don't know. It seems like every time I watch a Digital Foundry video, they're talking about "expensive" graphics, meaning how taxing certain effects, resolutions, or settings are on the GPU and CPU. Who decides how much graphics horsepower these things will cost? Sometimes I see it and think, "Yeah, that image really is clearer," and other times I'm left squinting.
Why can't game companies just reach a point where basically any level of visual fidelity can be achieved through an established software standard or technique? If visuals are headed in a direction where you literally have to pause the game and switch to "photo mode" to see everything the developers put in, detail you'd never have noticed otherwise, shouldn't the hardware cost of all that extra information inevitably come down to a minimum?
And I can't imagine our televisions opening up much more of a visual gap beyond marketing, especially at prohibitively expensive prices.
So what do you think?