Now that you've decided which is the most wasteful, what would you rather have then?

  • High resolution native 4k, lower detail raytracing, lower framerate

    Votes: 223 13.9%
  • Raytraced everything, lower resolution and detail, lower framerate

    Votes: 173 10.7%
  • 120fps, lower resolution and detail, low quality raytracing or even nothing at all

    Votes: 276 17.1%
  • I'd rather have something in between, like sub-4K at 60fps with some form of raytracing

    Votes: 938 58.3%

  • Total voters
    1,610

Elliot Pudge

Member
Oct 25, 2017
1,499
For console gamers?

Either 120 fps (because console gamers won't own a screen that supports that refresh rate) or 4K (because 4K is a waste in general).
 

Zarshack

Member
May 15, 2018
541
Australia
I think that 1440-1800p 60 fps is what we should be hoping to have available as an option for every game.

Edit: I think the utilisation of a dynamic resolution for every game in order to maintain 60 fps would be excellent.
 

Transistor

Vodka martini, dirty, with Tito's please
Administrator
Oct 25, 2017
37,126
Washington, D.C.
Native 4K is the biggest waste to me. Reconstruction techniques have gotten so much better that they need to be embraced over native.

I'd prioritize them as:

Framerate > ray tracing > resolution
 

grady

Member
Oct 29, 2017
609
Bournemouth, UK
Depends on the type of game: multiplayer should always prioritise framerate, whereas single player games can have nicer visuals and higher resolution. A middle ground with most games running at 60fps would be ideal. I've still got a 1080p monitor, so resolution means very little to me.
 

Mars

Member
Oct 25, 2017
1,988
High refresh/FPS (requires a TV or monitor that supports 120+ hz) or Native 4K.
 

Garrison

Member
Oct 27, 2017
2,892
User Banned (1 Day): Thread Whining; History of Hostility
This thread will be the biggest waste of resources today.
 

bushmonkey

Member
Oct 29, 2017
5,599
I just wish the poll question matched the thread title; then I'd say native 4K is the biggest waste of resources, in my opinion.
 

Outrun

Member
Oct 30, 2017
5,782
None of them are a waste.

Resources can and will always be redeployed for maximum effectiveness.

If a dev working on an XSX wants to use reconstructed 4K in order to put more visual effects in their game, they will indeed do so.
 

Dogo Mojo

Member
Oct 27, 2017
2,157
The games I spend the most time with don't really benefit (as much) from super high frame rates, so that's less important to me. Not to mention I've played games on consoles for the majority of my life, so as long as the frame rate is stable, that's more important to me than it being high.

The importance of Performance or Visuals will differ from person to person.
 

RedHeat

Member
Oct 25, 2017
12,685
Higher than 60fps, most assuredly. I know people like to meme about not being able to tell the difference and such, but the average person literally cannot. And said average person isn't going to waste their time hunting down a monitor or TV that can take advantage of those framerates.
 

Batatina

Member
Oct 25, 2017
5,263
Edinburgh, UK
I would rather not have any fixed expectations and let the developers turn things on and off in order to create the best possible result for the specific game they are making. If they want to push the environments and ray-tracing and sacrifice resolution, that's fine; if they want resolution but simpler effects, that's fine too; and so on. If they want everything to be top of the line, and it pays off when you see it, then I'll be fine with 30 FPS as well. It's all a compromise, and every game has different strengths.

Having performance and graphics mode is also a good compromise.
 

Bede-x

Member
Oct 25, 2017
9,384
High framerates are what's important, but my (and many others') main gaming display can't handle 120fps, and so many games are still running at a choppy 30fps, so let's compromise here and try making 60fps the standard. It would be such an improvement in look and feel if we could get most games back to that standard again.

I don't care at all about resolution and raytracing until developers start nailing the framerate.
 

Shark

Member
Oct 28, 2017
8,126
Raleigh, NC
I'm pretty curious if I'd actually be able to discern the differences between 4K and the checkerboard solutions of this gen in my normal playing environment.
 

Instro

Member
Oct 25, 2017
15,002
I don't see any of these as being wasteful. Like people want to crap on native 4k all the time, yet it's an important piece of resolving the detail in higher quality assets and effects.
 

Fitts

You know what that means
Member
Oct 25, 2017
21,163
I didn't think there was any excuse for games not hitting a locked 60fps last gen, and that certainly applies to next gen. You lock your framerate to 60 and balance your visuals around that.

But I really couldn't care less about 120. Do I have the display/hardware to run at 120? Yes. Have I tried it? Yes. Did it do anything for me? Not in the least.

But 4K native output can also be a waste. Still, pixel counting has gotten easier since getting an 85" tv. So... eh? Getting locked to 60 is far more important than hitting 4K, but it's a nice thing to have.

RT can produce some nice results but is a resource hog. Again, it's nice to have but as long as the artistic direction is good I care much more about that than technical graphics.

So 60fps is a must. 120 isn't necessary and 4K/RT are a bonus.
 

iksenpets

Member
Oct 26, 2017
6,484
Dallas, TX
Native 4K seems most wasteful in a world where everyone has 120Hz TVs — it provides the least benefit — but designing around 120FPS seems wasteful in a world where probably 1% of players will have the hardware to display it.
 

Ronnie Poncho

Avenger
Oct 27, 2017
2,133
If you ask me, a 1080p game with raytracing is better than a 4K one without raytracing. RT really does add so much depth and visual fidelity.

Also, 120fps is nice but I'm happy with 60 tbh
 

StereoVSN

Member
Nov 1, 2017
13,620
Eastern US
120 fps will be a waste for the most part on these consoles. Without DLSS, hitting a high enough resolution with high frame rates won't be possible, given the consoles' specs, without significant trade-offs. Now throw raytracing into the mix.

Unless developers allow high levels of graphical customization à la PC, which I don't see many doing, so as not to confuse users.
 

Zaki2407

Member
May 6, 2018
1,567
To be honest, I can live with 1080p+60fps+HDR+Raytracing.
I hope the majority of games in the future have these options.
 

werezompire

Zeboyd Games
Verified
Oct 26, 2017
11,319
Options are best. On my laptop, I have a high-refresh 1080p screen so 4k doesn't do a lot for me there. Conversely, my TV is 4k but not high-refresh, so over 60 fps is worthless for me there.
 

Cipher Peon

One Winged Slayer
Member
Oct 25, 2017
7,799
This is a weirdly framed question.

Anyway, I'd prioritize 4K and raytracing way over 60/120FPS. For games that allow that option, I will be taking it.
 

Magnus

Member
Oct 25, 2017
8,358
All the results are null and void because the OP made the mistake many others seem to make as well, where the question in the thread title and the question in the poll aren't the same. Like, what? Lol
 

KayonXaikyre

Member
Oct 27, 2017
1,984
Sure, options are nice, but 4K to me is the least important thing lol. It's the biggest waste imo. 1440p is where it's at. None of those poll options worked for me either; I'd take something like 1440p with a high frame rate and as many effects as possible.
 

chandoog

Member
Oct 27, 2017
20,071
Attempting higher native resolutions, with the advancements and improvements in reconstruction, will be a waste of scarce GPU resources.
 

calibos

Member
Dec 13, 2017
1,992
Nothing in that list will be wasted. If devs focused only on raytracing, for example, performance would be the thing people complain about. Focus only on resolution and the complaints would be about effects quality.

Things should be balanced and I trust Devs to make the decisions that best suit them.
 

AppleBlade

Member
Nov 15, 2017
1,711
Connecticut
As a few others have said - 1440p, 60FPS, HDR with Raytracing is the sweet spot imo. Everything after that is a victim of diminishing returns to my eyes. Give me that and use the rest of the horsepower to make bigger, more detailed worlds with better physics and effects.
 

nanskee

Prophet of Truth
Member
Oct 31, 2017
5,069
Native 4K seems most wasteful in a world where everyone has 120Hz TVs — it provides the least benefit — but designing around 120FPS seems wasteful in a world where probably 1% of players will have the hardware to display it.
It appears that quite a few of the mid-high range 4k sets from maybe the last two years can at least output 1080/120hz. I'm salty as I got my 4k set 4 years ago and it doesn't support 1080/120hz
 

Rizific

Member
Oct 27, 2017
5,948
In a world where RTX 3000 cards exist, it seems like 4K is more than manageable now. It's raytracing that looks to be a waste of resources to me now; I can barely tell the difference between RTX on/off screenshots.
 

iksenpets

Member
Oct 26, 2017
6,484
Dallas, TX
It appears that quite a few of the mid-high range 4k sets from maybe the last two years can at least output 1080/120hz. I'm salty as I got my 4k set 4 years ago and it doesn't support 1080/120hz

True, but most people haven't upgraded their TV in that time frame, and most who have bought something in the sub-$1000 range that doesn't support those features. We're in a world where 4K penetration is low enough that MS still sees the value in releasing a 1080p box, let alone where 120Hz penetration is. It's a great feature for them to support, but one that only their very most engaged customers are going to be using for years to come.
 

nanskee

Prophet of Truth
Member
Oct 31, 2017
5,069
True, but most people haven't upgraded their TV in that time frame, and most who have bought something in the sub-$1000 range that doesn't support those features. We're in a world where 4K penetration is low enough that MS still sees the value in releasing a 1080p box, let alone where 120Hz penetration is. It's a great feature for them to support, but one that only their very most engaged customers are going to be using for years to come.
Yeah, you're definitely right. Sometimes I forget that the vast majority of people are still using 1080p sets. Also, yeah, 120hz is a feature likely seen only on sets that cost $1000+.
 

jotun?

Member
Oct 28, 2017
4,490
Has anyone pointed out yet that the poll question is the reverse of the thread title?
 

laxu

Member
Nov 26, 2017
2,782
As a PC player who owns a 4K 120 Hz OLED, a 2080 Ti GPU and a 3700X CPU, I can say I have experienced all of these.
  • Anything above 60 fps just improves the responsiveness of the game and reduces visible motion blur. It's the biggest improvement on this list to me.
  • Raytracing can make scenes look far more realistic but at the same time the performance of next gen consoles is limited in this area so I doubt we will see the real good stuff like raytraced lighting in most games.
  • 4K native resolves a lot of fine detail. That's its primary benefit.
To me, 4K native is the loser out of these because in movement, even on an OLED at 120 fps, it can sometimes be hard to see the difference compared to 1440p. Using, say, 1600-1800p with Radeon Contrast Adaptive Sharpening would make that difference even smaller. Native 4K is pretty when there is barely any movement, so dynamic resolution rendering would be preferable unless you can get something similar to DLSS 2.x on consoles.

People need to start thinking in terms of dynamic behaviour: dynamic framerates with variable refresh rate support, and dynamic resolution targeting a framerate range. That's how you get situational compromises that work. Let's take something like TLOU2 as an example of how it could work (see the sketch after this list).
  • When you are in interiors, the game could target 1800p-4K at 45-60 fps with raytracing. These sections are slower paced, so the framerate is less of an issue, and overall there are fewer things to render in a single scene, but they can have a high amount of detail like items in houses and so on. So put the emphasis on improving visuals.
  • When you step into the open world sections, it could instead target 1440p-1800p at 60-90 fps. This gives you more responsive control over your character and horse and makes jumping and moving around look and feel good. Raytracing gets pared back to a minimum of some low fidelity puddle and window reflections.
  • Action-packed scenes with infected hordes? Go for 1080p-1440p at 80-120 fps, no raytracing. Responsive control is key; you won't have time to admire the sights, so not being able to see the reflection in an enemy's eyes is not going to matter.
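
To make that concrete, here is a minimal, hypothetical sketch of such a per-scene controller: each scene profile carries a resolution window and a target framerate range, and the render height is nudged up or down based on measured frame times. The SceneProfile and DynamicResController names, step sizes, and thresholds are illustrative assumptions, not any real engine's API.

Code:
// Hypothetical sketch of the dynamic-resolution idea described above.
// Each scene profile carries a resolution window and a target framerate
// range; the controller nudges the render height based on measured
// frame times. All names and numbers here are illustrative.
#include <algorithm>
#include <cstdio>

struct SceneProfile {
    const char* name;
    int minHeight, maxHeight;   // vertical resolution window, e.g. 1440-1800p
    double minFps, maxFps;      // acceptable framerate range
    bool rayTracing;            // whether RT effects stay enabled in this profile
};

struct DynamicResController {
    double height;              // current vertical render resolution

    explicit DynamicResController(const SceneProfile& p) : height(p.maxHeight) {}

    // Called once per frame with the measured frame time (seconds).
    // Drops resolution when fps falls below the low end of the target range,
    // raises it again when there is headroom above the high end.
    void update(const SceneProfile& p, double frameTime) {
        const double fps = 1.0 / frameTime;
        if (fps < p.minFps)      height *= 0.95;  // shed pixels to recover
        else if (fps > p.maxFps) height *= 1.02;  // spend headroom on detail
        height = std::clamp(height, (double)p.minHeight, (double)p.maxHeight);
    }
};

int main() {
    // Profiles loosely mirroring the TLOU2-style example above.
    SceneProfile interior  {"interior",   1800, 2160, 45.0,  60.0, true};
    SceneProfile openWorld {"open world", 1440, 1800, 60.0,  90.0, false};
    SceneProfile horde     {"horde",      1080, 1440, 80.0, 120.0, false};
    (void)interior; (void)horde;  // only the open-world profile is simulated here

    DynamicResController ctrl(openWorld);
    // Pretend the GPU briefly gets overloaded (20 ms frames ~ 50 fps),
    // then recovers (10 ms frames ~ 100 fps).
    for (int i = 0; i < 30; ++i) ctrl.update(openWorld, 0.020);
    std::printf("after load spike: %.0fp\n", ctrl.height);
    for (int i = 0; i < 30; ++i) ctrl.update(openWorld, 0.010);
    std::printf("after recovering: %.0fp\n", ctrl.height);
}

In a real engine the feedback loop would typically track GPU frame time rather than whole-frame fps, and the rayTracing flag would gate which effects get enabled when the active profile changes.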
 

bobliefeld

Member
Jan 30, 2019
203
As a 2080 owner I have to say Raytracing.

Technically, it's an important and exciting step but no RTX game has really wowed me. It's more a chin stroking, "well that's interesting" kind of feature than a "wow" feature at the moment.

I'm sure it'll be great when developers can set raytracing support as a minimum requirement; at the moment they're limited to doing things that can be achieved by non-RTX cards.

Next would be native 4K. Certainly gaming on a TV, on a couch at a reasonable distance, native 4K is a bit of a waste. Checkerboarding on the PS4 Pro seems almost good enough. DLSS - I don't know why they even bother with an on/off toggle in games that support it. Just enable it.
 

vitormg

Member
Oct 26, 2017
1,928
Brazil
None of them; they are all amazing. Maybe native 4K, if you consider that DLSS is an option on PC. Unfortunately, consoles don't have such a solution.