Speevy

Member
Oct 26, 2017
19,320
So forgive me as a layperson who just plays games, but I'm gonna ask the dumbest questions in the history of this forum. It won't be the last time.

It seems to me that we are in an age of video game graphics where game developers are making all our wildest dreams come true. 120 fps games on consoles? Sure. Ray-traced reflections? Why not? 4K native gaming? Yes, please. And as if this weren't enough, game streaming is only going to improve on what we're able to get now, so more people than ever will be able to afford and experience this visual splendor.

Now, I've been gaming for a while, but I got my first gaming PC somewhere around 2014-2016. During that time I discovered something previously unknown to me: if you invest in a strong enough piece of hardware, it will play most games reasonably well for a while. It just depends on what you're willing to accept as "reasonably well." For most people on this forum, 30 frames per second just doesn't cut the mustard anymore. I'll be honest in saying that I didn't really notice most of the time, but now that you tell me, I do.

However, games keep improving, graphics cards keep getting better and more capable, and gaming consoles can do more than you ever dreamed.

Here is my question. (finally, right?)

It seems like developers are just challenging themselves right now to jump through previously difficult hoops on the way to putting a better image on the screen at a higher framerate, or making things shine and reflect. The complexity of characters and scenes is pretty well established and streamlined at this point, and cost-prohibitive to improve much further.

I don't know. It seems like every time I watch a Digital Foundry video, they're always talking about "expensive" graphics, like how taxing certain effects, resolutions, or settings are on the GPU and CPU. Who decides how much graphics horsepower these things will cost? Sometimes I see it and I'm like "Yeah, that image really is clearer." and other times I'm left squinting.

Why can't game companies just get to a point where basically any level of visual fidelity can be reached with an established software standard or technique? If visuals are headed in a direction where you literally need to pause your game and set it to "photo mode" to see everything the developers put in, things you wouldn't have noticed otherwise, shouldn't the hardware cost of all this extra detail inevitably come down to a minimum?

I can't imagine our televisions opening up much more of a gap in visuals beyond just marketing, and again, anything more would be prohibitively expensive.

So what do you think?
 

ToddBonzalez

The Pyramids? That's nothing compared to RDR2
Banned
Oct 27, 2017
15,530
Uh, there are advances in the consumer hardware space that allow consoles and PCs to become more powerful over time so devs make use of it to create more complex assets, effects, lighting, etc. that are more computationally expensive. I guess I don't really understand the question?
 
Oct 25, 2017
2,631
Honestly, it comes down to "gamers" and high expectations. Games at 1440/60 look fine, but "the consumer" always wants more.
 
Jan 15, 2019
4,393
I guess you're suggesting that certain improvements are hitting a point where you need side-by-side comparisons to easily notice the difference, and at that point why not stop expending effort into improving visual fidelity and focus elsewhere? Do I have that right?

Granted, I don't have an informed answer for you. Just trying to clarify what you're asking.
 
OP

Speevy

Member
Oct 26, 2017
19,320
Uh, there are advances in the consumer hardware space that allow consoles and PCs to become more powerful over time so devs make use of it to create more complex assets, effects, lighting, etc. that are more computationally expensive. I guess I don't really understand the question?

Right, but these more complex assets are increasingly less visually impactful to all but the most demanding consumers. My question is mainly about why, beyond reaching certain resolutions, these things have to "tax" hardware at all.
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,678
You are over budget on font size
 
OP

Speevy

Member
Oct 26, 2017
19,320
I guess you're suggesting that certain improvements are hitting a point where you need side-by-side comparisons to easily notice the difference, and at that point why not stop expending effort into improving visual fidelity and focus elsewhere? Do I have that right?

Granted, I don't have an informed answer for you. Just trying to clarify what you're asking.

It's more that some of the priorities for improvement are things a person like me wouldn't notice unless told about them explicitly.
 

ToddBonzalez

The Pyramids? That's nothing compared to RDR2
Banned
Oct 27, 2017
15,530
Right, but these more complex assets are increasingly less visually impactful to all but the most demanding consumers. My question is mainly about why, beyond reaching certain resolutions, these things have to "tax" hardware at all.
It's true that there have been diminishing returns in this area. The visual jump from PS1 to PS2 is more noticeable than the jump from PS4 to PS5, for example. That being said, watch a recent Pixar flick and it's clear that prerendered CGI is still far beyond what can be achieved in real time. There's a lot of progress waiting to be made with real-time visuals, so I don't think we should just say "graphics are good enough, let's stop where we are right now."
 
Dec 15, 2017
1,590
Well, you do have a point. I always wondered if developers could settle for something that looks adequate from a graphics standpoint and try to improve the rest (mission design, AI, physics, writing).

Say, high-end OG Xbox level graphics at high resolutions and 16x AF. Top that with a great art style and you are golden from a graphics point of view in 3D games.
 

HaremKing

Banned
Dec 20, 2018
2,416
I mean, graphics don't need to be expensive. Just take a look at the indie gaming scene.

However, if people buy a relatively expensive PC/console and don't see a clear visual improvement in a game, then I'm not sure they'll want to play that game.
 
Oct 27, 2017
4,916
The methods used today are different from those used 5, 10, or 15 years ago. Newer methods make for a more convincing image or allow you to do things in real time that previously had to be pre-rendered (e.g. ray tracing).

I think when people call an effect expensive, they mean the visual improvement it gives is minimal compared to the performance cost. That can mean that it's an effect that has a less intensive substitute or that there's not much difference between the medium and ultra settings when it comes to the final output.
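To make that concrete, here's a back-of-envelope version. The effect name and every number below are made up for illustration:

```python
# Cost vs. benefit of a single graphics setting, in hypothetical numbers.
FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms per frame to hold 60 fps

frame_times_ms = {
    "effect off":    12.1,  # made-up measured frame times
    "effect medium": 13.0,
    "effect ultra":  16.2,
}

baseline = frame_times_ms["effect off"]
for name, ms in frame_times_ms.items():
    cost = ms - baseline
    headroom = FRAME_BUDGET_MS - ms
    print(f"{name:14} {ms:5.1f} ms  (+{cost:.1f} ms, {headroom:.1f} ms headroom left)")
```

If "ultra" eats 3-4 extra milliseconds for a change you only see in screenshots, that's exactly the kind of setting that gets called expensive.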

Right, but these more complex assets are increasingly less visually impactful to all but the most demanding consumers. My question is mainly about why, beyond reaching certain resolutions, these things have to "tax" hardware at all.

Games in the past had to limit things like time of day and how dynamic the environments are in order for you to not see the puppet strings in the background. If you took a game like TLOU2 and changed the time of day, it would look weird as hell because most of the lighting and shadowing is pre-baked. As hardware power and software tools improve, developers can build new types of worlds that weren't really possible in the past.
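Here's a toy sketch of why baked lighting is so cheap at runtime. Nothing below is from a real engine; it's just the idea of paying the cost once, offline:

```python
# A handful of dynamic point lights: (x, y, intensity).
lights = [(2.0, 3.0, 5.0), (8.0, 1.0, 3.0), (5.0, 7.0, 4.0)]

def dynamic_light(px, py):
    """Per pixel, per frame: cost grows with the number of lights."""
    total = 0.0
    for lx, ly, intensity in lights:
        d2 = (px - lx) ** 2 + (py - ly) ** 2
        total += intensity / (1.0 + d2)  # crude distance falloff
    return total

# "Baking": run the expensive calculation once, offline, into a lightmap.
W, H = 64, 64
lightmap = [[dynamic_light(x, y) for x in range(W)] for y in range(H)]

def baked_light(px, py):
    """Per pixel, per frame: one table lookup, regardless of light count."""
    return lightmap[int(py)][int(px)]
```

And that's the catch: the lightmap is only correct for the moment it was baked. Move the sun and the whole table is wrong.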
 

ss_lemonade

Member
Oct 27, 2017
6,646
Say, high-end OG Xbox level graphics at high resolutions and 16x AF. Top that with a great art style and you are golden from a graphics point of view.
That's pretty much what you get playing OG Xbox games via enhanced BC on a One X or Series X, and even then some of those games can still look very dated (maybe even more so at higher resolutions, since the flaws become much more apparent).
 
Jan 15, 2019
4,393
It's more that some of the priorities for improvement are things a person like me wouldn't notice unless told about them explicitly.
I think what you're talking about is a big part of why the Switch has been so successful, for instance. Graphics actually have hit that "good enough" point for a ton of consumers, to the extent that they'd rather have a portable version of a game than a 4K version of it. As someone else mentioned, indie games are also evidence of this. Plenty of people are opting for stylized interpretations and retro aesthetics rather than the most cutting-edge visuals possible. It just so happens that there are enough people who do want to see visuals pushed that it's still worthwhile to put effort into them.
 

lexony

Member
Oct 25, 2017
2,518
More effects are always possible. I mean, look at Pixar movies. The only question is how long it takes your PC to render a single frame. And in gaming, this has to happen in real time.
 

TheMadTitan

Member
Oct 27, 2017
27,190
The problem is that people think that because they bought X card that could push 1000 fps with max settings the year they bought it, it should do the same year in and year out for five or so years with no compromises.

If someone sets a minimum standard for performance, say 60 fps, they can hit that for years if they tweak settings. Games don't look ugly as sin on low anymore, but people are still in the 2011 mindset that turning something down to low is going to ruin the visual experience, when that's absolutely not the case.
 

the-pi-guy

Member
Oct 29, 2017
6,269
Who decides how much graphics horsepower these things will cost? Sometimes I see it and I'm like "Yeah, that image really is clearer." and other times I'm left squinting.
The algorithm/technique that's used to calculate/apply those things.
The cost doesn't necessarily have anything to do with how good the scene looks. Some techniques are relatively cheap and give good results. Some aren't. Full ray tracing gives great results, but it's very costly.

Every scene of a game is a huge amount of math, (tens of) billions of operations being applied every second.
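Rough arithmetic, with an assumed per-pixel cost since real shader workloads vary wildly:

```python
# Back-of-envelope math behind "billions of operations per second".
width, height = 3840, 2160  # 4K resolution
fps = 60
ops_per_pixel = 500         # assumption: a few hundred shader ops per pixel

pixels_per_second = width * height * fps
total_ops = pixels_per_second * ops_per_pixel

print(f"{pixels_per_second:,} pixels shaded per second")       # ~498 million
print(f"~{total_ops / 1e9:.0f} billion shader ops per second")  # with these assumptions
```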

Why can't game companies just get to a point where basically any level of visual fidelity can be reached with an established software standard or technique?

If visuals are headed in a direction where you literally need to pause your game and set it to "photo mode" to see everything the developers put in, things you wouldn't have noticed otherwise, shouldn't the hardware cost of all this extra detail inevitably come down to a minimum?

That doesn't mean that the regular scene is rendering all the same stuff.
That might just mean the artists did a little more work beyond what the system can handle in game.

Depending on the photo mode, you can sometimes apply extra effects that would be too demanding during gameplay, because there's no animation running, among other things.
 

spad3

Member
Oct 30, 2017
7,122
California
Innovation breeds efficiency; efficiency allows more room for innovation.

I don't know. It seems like every time I watch a Digital Foundry video, they're always talking about "expensive" graphics, like how taxing certain effects, resolutions, or settings are on the GPU and CPU. Who decides how much graphics horsepower these things will cost? Sometimes I see it and I'm like "Yeah, that image really is clearer." and other times I'm left squinting.

Right, but these more complex assets are increasingly less visually impactful to all but the most demanding consumers. My question is mainly about why, beyond reaching certain resolutions, these things have to "tax" hardware at all.

As long as there's room for improvement, improvements will be made.

As for "who decides how much?" - the developers do. The developers have to look at baseline hardware of the install base (in layman's terms, what is the lowest average specs of the hardware that consumers own) and then they look at the topmost hardware available in the industry. That sets the scale for what they develop for. Then a game engine is developed for devs to create their workspace in. Then assets are imported in (assets = character models, environments, lighting nodes, skyboxes, etc) and the assets have to follow the hardware scaling rules set by the engine. This is why when you see games on PC, they have adjustable graphics settings because you're able to control what the engine is going to render for you: the lowest denominator setting, or fully maxed out. Certain assets and effects require more CPU and GPU power to render, like lighting and textures. What "power" means in that context is how many calculations per second does the CPU or GPU have to do to display a frame of animation. Lowering settings means you lower the level of detail that's being rendered, which means you're using less "power."

So when channels like Digital Foundry say "taxing," what they mean is how hard the CPU/GPU is being pushed to render a frame of the game. The more intensely detailed the assets are, the more CPU/GPU power they require to be rendered. And this all comes back to the way the game engine is built.
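If it helps, here's that settings-scaling idea as a sketch. The preset values and the cost model are hypothetical stand-ins for what a real engine exposes:

```python
presets = {
    "low":    {"texture_res": 512,  "shadow_res": 512,  "draw_distance": 100},
    "medium": {"texture_res": 1024, "shadow_res": 1024, "draw_distance": 200},
    "ultra":  {"texture_res": 4096, "shadow_res": 4096, "draw_distance": 400},
}

def relative_work(p):
    # Crude stand-in for "calculations per frame": texture and shadow cost
    # scale roughly with area, and longer draw distance means more objects.
    return (p["texture_res"] ** 2 + p["shadow_res"] ** 2) * p["draw_distance"]

base = relative_work(presets["low"])
for name, p in presets.items():
    print(f"{name:7} ~{relative_work(p) / base:6.1f}x the work of low")
```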
 

Ruffy666

Member
Oct 27, 2017
259
Certainly doesn't "need" to but I'd think even from a development perspective, folks enjoy the challenge of pushing the envelope in terms of what they're able to accomplish. Think of all the insane tiny details ND added to Uncharted 4, things that have zero impact on gameplay and you might not have even noticed except for one of the umpteen YouTube videos touting "THE INSANE TINY DETAILS IN UNCHARTED 4 THAT YOU NEVER NOTICED!"

I think the desire to be on the cutting edge is just human nature to some degree
 

Mullet2000

Member
Oct 25, 2017
5,894
Toronto
New, more powerful hardware means new rendering techniques, effects, etc. become possible, because the old hardware couldn't handle them.

Developers are excited by the new techniques so they use them. Users are excited to see the new techniques used too.

You're always operating within the hardware's limit, so stacking new techniques pushes the hardware harder and harder till you hit your limit.

It's really as simple as that.
 

Kingasta

Avenger
Jan 4, 2018
814
Why settle? Remember how every gen we said "graphics can't get better" and "this is so close to real life"? If someone had asked you to settle at any point in the past, knowing how much progress was still to be made, you wouldn't have accepted it.
 

Tygre

Member
Oct 25, 2017
11,095
Chesire, UK
At the end of the day, it's all math.

Some math is easy and quick, 1 + 1 = 2 stuff. Some math is so commonly used that it has been specifically catered for. Some math is just incredibly difficult to do efficiently. The harder the math, the longer it takes.

To render at 60 fps you need to have all your math done in about 0.0167 seconds (16.7 ms). Can't do all your math that fast? Too bad! Frame dropped!

As technology advances, previously hard math gets easier, either through the brute force of being able to do more math per second, or by introducing efficiencies into the pipeline. Instead of actually working out 1 + 1 = 2, you have a lookup table that whenever you go to do 1 + 1 you check and see the answer is 2 without having to calculate it.
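Here's that lookup-table trick in miniature, the classic precomputed sine table from old game engines:

```python
import math

STEPS = 4096
# Built once, up front: every answer fast_sin() will ever give.
SINE_TABLE = [math.sin(2 * math.pi * i / STEPS) for i in range(STEPS)]

def fast_sin(angle):
    """One array lookup instead of actually computing the sine."""
    i = int(angle / (2 * math.pi) * STEPS) % STEPS
    return SINE_TABLE[i]

# Anything called thousands of times inside a ~16.7 ms frame budget
# benefits from getting cheaper, even by a little.
print(fast_sin(math.pi / 6), math.sin(math.pi / 6))  # both ~0.5
```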

I am oversimplifying to a degree that disgusts me, but I don't want to get into framebuffers and render pipelines and precaching and temporal AA and all the other things that mess with basic frame pacing.

So what do you think?

I think you need to get better at asking questions if you want to receive helpful answers.

Be concise. Be specific. Think about what you actually want to know before you just vomit up a word salad in the hopes that smart people will be willing to dig through it for you.
 

Jroc

Banned
Jun 9, 2018
6,145
I'm not entirely sure if I understand the OP, but I'll toss some trivia out there.

Developers do optimize the visuals over the course of a generation and eventually spend their resources in more meaningful ways. A good example of this is Call of Duty 2 (2005) vs Call of Duty 4 (2007) on the Xbox 360:

[Screenshot: Call of Duty 2 (2005)]

[Screenshot: Call of Duty 4 (2007)]

Most people looking at those two shots will say that the bottom looks graphically superior, but the character models in Call of Duty 4 actually use about half the polygons of the Call of Duty 2 characters. Instead of using "expensive" high-poly NPCs, they freed those resources up so that they could add more detail to the environment and use more advanced shaders. Most people didn't notice the radically reduced polycount, but they DID notice all of the new graphical effects.
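In toy numbers (illustrative only, not actual polycounts from either game), the reshuffle looks like this:

```python
FRAME_POLY_BUDGET = 300_000  # assume a fixed per-frame polygon budget

# "CoD2-style" split: 10 visible characters at a high polycount.
chars_old = 10 * 12_000
world_old = FRAME_POLY_BUDGET - chars_old

# "CoD4-style" split: halve each character's polycount...
chars_new = 10 * 6_000
# ...and spend every saved polygon on the environment instead.
world_new = FRAME_POLY_BUDGET - chars_new

print(f"characters:  {chars_old:,} -> {chars_new:,}")
print(f"environment: {world_old:,} -> {world_new:,} (+{world_new - world_old:,})")
```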

Right now the new consoles have just launched, so we are seeing a lot of careless usage when it comes to "expensive" new effects. For instance, a lot of games have RT effects that cut the framerate in half while looking marginally better. By the end of this generation we will be seeing these effects used in smarter ways instead of being wasted on scenes that require a frame-by-frame analysis to see the result.
 
OP

Speevy

Member
Oct 26, 2017
19,320
I think you need to get better at asking questions if you want to receive helpful answers.

Be concise. Be specific. Think about what you actually want to know before you just vomit up a word salad in the hopes that smart people will be willing to dig through it for you.

I ask what's on my mind. It's sometimes not altogether clear what you want from a thread until you ask it, and frankly, sometimes the most tried and true topics are the least helpful. (Such as "Major acquisition coming," "Is Sony/Microsoft/Nintendo doomed?" or "Just played your favorite game. When does it get good?")

I wonder things that aren't likely to cause confrontation and which genuinely concern me. Whether smart people answer me is up to them.
 

jkk411

Member
Jul 22, 2018
1,021
As hardware steadily improves, the possibility space developers are working in is constantly expanding, so I think it's just natural that devs will try to push to the new limits whenever possible. Developers have to make hard cuts all the time, and then new hardware comes along and suddenly they don't have to make those cuts anymore and the wheels start turning about what they could do that wasn't possible previously.
 

DoradoWinston

Member
Apr 9, 2019
6,099
Right, but these more complex assets are increasingly less visually impactful to all but the most demanding consumers. My question is mainly about why, beyond reaching certain resolutions, these things have to "tax" hardware at all.
Because those things look good in trailers and gameplay, which sells to a large audience, and then gets the hardcore talking about it all the time, which for better or worse leads to more discussion online and spreads it even more. Imagine the absolute shitstorm Pixar would get if Soul came out and suddenly looked just okay; one of the main discussions I saw after the movie released was how incredible the environments and human characters looked.

On the other hand, as an artist, if you want to throw the entire kitchen sink at your game to make it the most visually advanced thing you can, then that's a completely separate and valid reason.
I mean, at the same time, the tools advance throughout an entire generation and things get more efficient, so you can do way more later on, precisely because people decided to push the hardware as far as possible. Not really sure what the question is, tbh... you can't get to a point where things are commonplace or efficient without people trying them in the first place or advancing the tech.