Status
Not open for further replies.

p3n

Member
Oct 28, 2017
650
You're full of delusions.

The 8700K and RTX 2080 are not mid-range parts in any way, shape or form.

According to Intel, the 7-series is now (upper) mid-range, with the 9-series as the high-end consumer products. Was the 8700K mid-range in 2017 when it released? No, it wasn't. Is it mid-range today? Absolutely.

Are you familiar with nvidia's chip designations? The "GX-100" or "GX-110" revisions are the full-size chip. The "GX-104" or "GX-114" revisions are down-sized variants, usually with additional headroom to deactivate damaged parts of the chip to increase yield. The "GX-106/116" is sometimes not even fully based on the original chip design and is a heavily down-sized version used for budget or low-end cards.

The last 80-class model that was considered "high-end" was the 580 (GF110), as it was the full-size chip without any deactivated parts. The mid-range 560 used the GF114 chip, while the budget and low-end cards had the GF116.
The "680" that followed was originally supposed to be the 660 (GK104). It performed so well against the 500 series (and AMD's Radeon HD series) that they just slapped the 680 name AND PRICE on it and called it a day. They never sold a GK100; the first full-size chip, GK110, shipped as the first Titan.
Fast forward to the Turing chips and you can see that nvidia - again - isn't selling the full-size chip as the 80-class card. The high end is the TU102 (2080 Ti & Titan RTX), the mid-range is the TU104 (2080), and the supposedly low end is the TU106 (2070 and 2060).
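
(The die-to-tier pattern described above, condensed into a lookup for illustration. A minimal sketch: the card lists follow the argument above plus a few commonly known examples, not an official or exhaustive table.)

```python
DIE_TIERS = {
    # Fermi: the last generation where the 80-class card got the full-size die
    "GF110": ("full-size / high end", ["GTX 580"]),
    "GF114": ("down-sized / mid-range", ["GTX 560"]),
    "GF116": ("heavily down-sized / budget", ["GTX 550 Ti"]),
    # Kepler: the full-size die skipped the 80-class card entirely
    "GK104": ("down-sized / mid-range", ["GTX 680", "GTX 670"]),
    "GK110": ("full-size / high end", ["GTX Titan"]),
    # Turing: the same pattern again
    "TU102": ("full-size / high end", ["RTX 2080 Ti", "Titan RTX"]),
    "TU104": ("down-sized / mid-range", ["RTX 2080"]),
    "TU106": ("heavily down-sized / low end", ["RTX 2070", "RTX 2060"]),
}

def tier_of(card):
    """Return the die tier a given card's chip falls into."""
    return next(tier for tier, cards in DIE_TIERS.values() if card in cards)

print(tier_of("RTX 2080"))  # -> down-sized / mid-range
```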



TL;DR: Just because nvidia's pricing abuses a market without competition doesn't magically make a down-sized TU104 chip "high-end".
 

Md Ray

Member
Oct 29, 2017
750
Chennai, India
The multiplayer used an early form of checkerboarding and ran at less than 1,920 pixels horizontally. It didn't look great.
"In both SP and MP, Killzone Shadow Fall outputs a full, unscaled 1080p image at up to 60 fps. Native is often used to indicate images that are not scaled; it is native by that definition."


In Multiplayer mode, however, we use a technique called "temporal reprojection," which combines pixels and motion vectors from multiple lower-resolution frames to reconstruct a full 1080p image. If native means that every part of the pipeline is 1080p then this technique is not native.

Games often employ different resolutions in different parts of their rendering pipeline. Most games render particles and ambient occlusion at a lower resolution, while some games even do all lighting at a lower resolution. This is generally still called native 1080p. The technique used in KILLZONE SHADOW FALL goes further and reconstructs half of the pixels from past frames.

We recognize the community's degree of investment in this matter, and that the conventional terminology used before may be too vague to effectively convey what's going on under the hood. As such we will do our best to be more precise with our language in the future.


Up-scaling is a spatial interpolation filter. When up-scaling an image from one resolution to another, new pixels are added by stretching the image in the X/Y dimensions. The values of the new pixels are picked to lie between the values of the neighbouring pixels. This gives a bigger, but slightly blurrier picture.
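
(For illustration only, here is what that spatial interpolation looks like as code: a minimal single-channel bilinear up-scale in Python/NumPy. The function name and shapes are ours; this is the generic filter being described, not Guerrilla's implementation.)

```python
import numpy as np

def upscale_bilinear(img, out_h, out_w):
    """Bilinear up-scale of a single-channel image: every new pixel is a
    weighted average of the four nearest source pixels."""
    in_h, in_w = img.shape
    ys = np.linspace(0, in_h - 1, out_h)   # source row for each output row
    xs = np.linspace(0, in_w - 1, out_w)   # source column for each output column
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, in_h - 1)
    x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]                # vertical blend weights
    wx = (xs - x0)[None, :]                # horizontal blend weights
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# e.g. stretch a 1600x900 frame to 1920x1080
frame = np.random.rand(900, 1600)
bigger_but_blurrier = upscale_bilinear(frame, 1080, 1920)
```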



Temporal reprojection is a technique that tracks the position of pixels over time and predicts where they will be in the future. These "history pixels" are combined with freshly rendered pixels to form a new, higher-resolution frame. This is what KILLZONE SHADOW FALL uses in multiplayer.



So, in a bit more detail, this is what we need for this technique:


  • We keep track of three images of "history pixels" sized 960x1080
    • The current frame
    • The past frame
    • And the past-past frame
  • For each pixel we store its color and its motion vector - i.e. the direction the pixel is moving on-screen
  • We also store a full 1080p "previous frame", which we use to improve anti-aliasing
Then we have to reconstruct every odd pixel in the frame:


  • We track every pixel back to the previous frame and to two frames ago using its motion vectors
  • By looking at how this pixel moved in the past, we determine its "predictability"
  • Most pixels are very predictable, so we use the value reconstructed from a past frame as the odd pixel
  • If the pixel is not very predictable, we pick the best value from neighbors in the current frame
On occasion the prediction fails and pixels become locally blurry, or thin vertical lines appear. However, most of the time the prediction works well and the image is identical to a normal 1080p image. We then increase sub-pixel anti-aliasing using our 1080p "previous frame" and motion vectors, further improving the image quality.
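
(To make the steps above concrete, here is a toy sketch of the reconstruction loop in Python/NumPy. It is not Guerrilla's code: the history buffers are simplified to full-resolution frames, the predictability test and the neighbour fallback are illustrative stand-ins, and the final sub-pixel anti-aliasing pass is omitted.)

```python
import numpy as np

PREDICT_EPS = 0.05  # illustrative predictability threshold (assumption)

def reconstruct(cur_cols, prev_frame, prev_prev_frame, motion, parity):
    """Toy checkerboard-style temporal reprojection.

    cur_cols:        (1080, 960) freshly rendered columns of the current frame
    prev_frame:      (1080, 1920) previous reconstructed frame
    prev_prev_frame: (1080, 1920) the frame before that
    motion:          (1080, 1920, 2) per-pixel (dy, dx) screen motion
    parity:          0 or 1 - which column set was rendered this frame
    """
    H, W = prev_frame.shape
    out = np.empty_like(prev_frame)
    out[:, parity::2] = cur_cols                 # keep the fresh pixels
    for y in range(H):
        for x in range(1 - parity, W, 2):        # pixels to reconstruct
            # 1. Track the pixel back using its motion vector.
            dy, dx = motion[y, x]
            py, px = int(round(y - dy)), int(round(x - dx))
            if 0 <= py < H and 0 <= px < W:
                # 2. Call it "predictable" if its history is stable
                #    across the two stored frames (toy test).
                if abs(prev_frame[py, px] - prev_prev_frame[py, px]) < PREDICT_EPS:
                    out[y, x] = prev_frame[py, px]  # 3. reuse the history pixel
                    continue
            # 4. Not predictable: fall back to the rendered neighbours.
            xl = x - 1 if x > 0 else x + 1
            xr = x + 1 if x < W - 1 else x - 1
            out[y, x] = 0.5 * (out[y, xl] + out[y, xr])
    return out
```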
 

Md Ray

Member
Oct 29, 2017
750
Chennai, India
^ For those who thought it was TL;DR: don't bother trying to claim it's not native 1080p in Killzone Shadow Fall then. Because it was, and still is.
 

eonden

Member
Oct 25, 2017
17,078
The court case was a failure, with the judge dismissing it. The game runs at 1920x1080.

And ND chose 30fps for The Last of Us Part II to push the game's complexity. Every game can run at 60fps on any modern computer if you design the game for that framerate.
As you said: if you design it to run at that framerate. My entire point is that given the choice between increased complexity or framerate, most AAA devs choose complexity. And that's already the case at 60fps; imagine doubling the framerate requirement to 120.
 

Deleted member 18324

User requested account closure
Banned
Oct 27, 2017
678
"Let's see the enthusiast PC market get out of THIS situation"
*enthusiast PC market emerges out of yet another console launch period with continued growth*
"Well it's completely irrelevant anyway as most of their users are grandmothers who have no idea Steam is launching on startup"
 

JahIthBer

Member
Jan 27, 2018
10,376
The GeForce RTX 2080 is high-end. Just because it's TU104 doesn't make it mid-range. It is within 35% of the fastest graphics card.
It should be $300-350 by now, so upper mid-range. The RTX 2080 is a GTX 1080 Ti with hardware RT, and the 1080 Ti is an early-2017 GPU.
Remember, the GTX 980 Ti got outmotored by the GTX 1070 a year later at a much cheaper price, but somewhere along the line Nvidia decided to price gouge, and sadly a lot of PC gamers were happy to pay up. AMD has raised their prices too; they got caught with their pants down with their $500 5700 XT and pretended it was a joke.
I dunno if this will change by late 2020, but it will be consoles' biggest advantage price-wise if nothing gives.
 

Md Ray

Member
Oct 29, 2017
750
Chennai, India
It should be $300-350 by now, so upper mid-range. The RTX 2080 is a GTX 1080 Ti with hardware RT, and the 1080 Ti is an early-2017 GPU.
Remember, the GTX 980 Ti got outmotored by the GTX 1070 a year later at a much cheaper price, but somewhere along the line Nvidia decided to price gouge, and sadly a lot of PC gamers were happy to pay up. AMD has raised their prices too; they got caught with their pants down with their $500 5700 XT and pretended it was a joke.
I dunno if this will change by late 2020, but it will be consoles' biggest advantage price-wise if nothing gives.
RTX 2080 is not a 1080 Ti. It has fewer CUDA cores, fewer TMUs, fewer ROPs, lower memory bandwidth, and 3GB less video memory.
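
(The launch spec sheets behind that claim, for reference; these are the public specs as I recall them, so treat them as approximate.)

```python
# Spec-sheet comparison: RTX 2080 vs GTX 1080 Ti (approximate launch specs).
specs = {
    #                CUDA  TMUs  ROPs  mem BW (GB/s)  VRAM (GB)
    "RTX 2080":    (2944,  184,   64,          448,         8),
    "GTX 1080 Ti": (3584,  224,   88,          484,        11),
}
for name, (cuda, tmus, rops, bw, vram) in specs.items():
    print(f"{name:12} {cuda} CUDA, {tmus} TMUs, {rops} ROPs, {bw} GB/s, {vram} GB")
```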
 

Premium

Banned
Oct 27, 2017
836
NC
Like, the explanation literally says it's sub-native and then tries to say "but it's just as good!"

We don't need to defend the honor of a launch game from 6 years ago.

Exactly. I enjoyed the hell out of KZ2 multiplayer, but I'm not about to get sucked into this silly debate about a 7-year-old game again.
 
Mar 22, 2019
811
in the process of building a new pure-gaming pc and i literally have everything except the cpu and motherboard (went with an MSI 2080 Ti for 4k goodness).
only thing holding me back from pulling the trigger on a 3700x is the fact that the new 3850x is round the corner, and given the rumors that both ps5 and xbx2 will use amd i'd rather be 'future-proofed' for any console crossover games i decide to play on pc.

this is all only good for pc - it will always be the gaming industry's growth platform.
 

Sanctuary

Member
Oct 27, 2017
14,203
That said, nothing stops someone with a budget CPU from getting whatever GPU they please. How does one classify a system with an 8400 and a 2080 Ti?

An unbalanced one that will tear through most games right now, but will likely start falling behind rather quickly in future open-world-style games... which seem to be spreading like viruses.

But it's not. When 8GB of video memory is saturated, performance on the RTX 2080 plummets down the drain while the GTX 1080 Ti remains at a much higher framerate.

/hugs my 1080 Ti. But honestly, I can't wait for Q2 2020, when spending $599+ on a new GPU will actually get me an upgrade.
 
Oct 28, 2017
1,916
Remember when the current gen came out with 8 cores, but 2-core Pentiums without hyperthreading could match them?
I expect something similar but less extreme at the beginning of the next gen. Zen 2 is awesome, but it will be downclocked for better yields and has to fit into a reasonable TDP while being on the same chip as a GPU.
 

Md Ray

Member
Oct 29, 2017
750
Chennai, India
Remember when the current gen came out with 8 cores, but 2-core Pentiums without hyperthreading could match them?
I expect something similar but less extreme at the beginning of the next gen. Zen 2 is awesome, but it will be downclocked for better yields and has to fit into a reasonable TDP while being on the same chip as a GPU.
The difference is those dual-core Pentiums ran at over double the clock rate and had much higher IPC.

That's not the case for PS5 and Xbox Scarlett. They're using nearly the latest CPUs (Zen 3 is out next year, but it's only a refinement).
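
(A rough back-of-the-envelope makes the point; the clocks and the IPC factor below are ballpark assumptions, not measured figures.)

```python
# Why two fast cores could match eight slow ones last gen (illustrative only).
jaguar_cores, jaguar_ghz = 8, 1.6     # PS4/XB1 Jaguar CPU (approx.)
pentium_cores, pentium_ghz = 2, 3.5   # desktop Pentium of the era (approx.)
ipc_advantage = 1.8                   # assumed big-core IPC advantage

per_thread = (pentium_ghz / jaguar_ghz) * ipc_advantage
aggregate = per_thread * pentium_cores / jaguar_cores
print(f"per-thread: ~{per_thread:.1f}x, aggregate: ~{aggregate:.1f}x")
# -> per-thread ~3.9x, aggregate ~1.0x: parity in poorly threaded games.
# A Zen 2 console CPU won't hand desktop chips anywhere near that margin.
```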
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,680
User Banned (1 day): Platform wars, prior warning for platform warring
Nah, we'll just get a bunch of years of PC gamers complaining about poor optimisation until a darling CPU emerges with good price:performance, and then the PC gamers can resume their gloating.
 

dgrdsv

Member
Oct 25, 2017
11,846
Console CPUs will clock lower than their PC counterparts, which are already on the market; you can get an 8C/16T one for $300, which is less than what a next-gen console will cost you.

So while next-gen machines will definitely put some long-running 4C/4T and possibly even 8T platforms out of their gaming misery, I doubt that such PCs will hold back the consoles in any way imaginable. PC gamers will just upgrade, as they always do.

But it's not. When 8GB of video memory is saturated, performance on the RTX 2080 plummets down the drain while the GTX 1080 Ti remains at a much higher framerate.
With this logic, a GTX Titan X with its 12GB of VRAM is faster than a GTX 1080 Ti.
 

Sanctuary

Member
Oct 27, 2017
14,203
But... the Titan X Pascal is factually faster than a 1080 Ti by a minuscule percentage. That percentage is gone if you overclock a 1080 Ti, though. But if a game ever required 12GB of video memory, the Titan X would factually be faster.

I think that's kind of their point, though. Has a game ever required that much memory? And how many actually require more than 8GB right now, while we're at it?
 

Nooblet

Member
Oct 25, 2017
13,624
According to Intel, the 7-series is now (upper) mid-range, with the 9-series as the high-end consumer products. Was the 8700K mid-range in 2017 when it released? No, it wasn't. Is it mid-range today? Absolutely.

Are you familiar with nvidia's chip designations? The "GX-100" or "GX-110" revisions are the full-size chip. The "GX-104" or "GX-114" revisions are down-sized variants, usually with additional headroom to deactivate damaged parts of the chip to increase yield. The "GX-106/116" is sometimes not even fully based on the original chip design and is a heavily down-sized version used for budget or low-end cards.

The last 80-class model that was considered "high-end" was the 580 (GF110), as it was the full-size chip without any deactivated parts. The mid-range 560 used the GF114 chip, while the budget and low-end cards had the GF116.
The "680" that followed was originally supposed to be the 660 (GK104). It performed so well against the 500 series (and AMD's Radeon HD series) that they just slapped the 680 name AND PRICE on it and called it a day. They never sold a GK100; the first full-size chip, GK110, shipped as the first Titan.
Fast forward to the Turing chips and you can see that nvidia - again - isn't selling the full-size chip as the 80-class card. The high end is the TU102 (2080 Ti & Titan RTX), the mid-range is the TU104 (2080), and the supposedly low end is the TU106 (2070 and 2060).

TL;DR: Just because nvidia's pricing abuses a market without competition doesn't magically make a down-sized TU104 chip "high-end".
It's hysterical to say the 2080 is a "mid-range" card.
The 2080 Ti and Titan may be defined as enthusiast-level hardware, but the 2080 (Super, since the base model is unavailable) is absolutely high-end. It's the 3rd most powerful card on the market! It's nonsense to say it's mid-range and the 2070 is low-end.
 

JahIthBer

Member
Jan 27, 2018
10,376
But it's not. When 8GB of video memory is saturated, performance on the RTX 2080 plummets down the drain while the GTX 1080 Ti remains at a much higher framerate.
Really? Tbh I don't know many games that use over 8GB, but they will come November 2020, so the 1080 Ti might age a bit better. All the more reason to question the 2080's price.
 

NaDannMaGoGo

Member
Oct 25, 2017
5,963
^ For those who thought it was TL;DR: don't bother trying to claim it's not native 1080p in Killzone Shadow Fall then. Because it was, and still is.

lmao

That's not native 1080p. It not being native 1080p is precisely why you listed several paragraphs explaining the differences.

And no, extrapolating the remaining pixels is clearly not native, else they wouldn't need to be extrapolated.

Also, you've written over a quarter of your lifetime ERA comments on this topic within a day? Maybe cool down a tad.
 

leng jai

Member
Nov 2, 2017
15,117
Really? Tbh I don't know many games that use over 8GB, but they will come November 2020, so the 1080 Ti might age a bit better. All the more reason to question the 2080's price.

Most games barely even use 4GB. Maybe Gears 5 at 4K with insane textures?

Just goes to show how much Nvidia shit the bed with pricing this generation, when a 2080 Super costs $1,300 AUD here and has the same amount of VRAM as a 1070.
 

Buggy Loop

Member
Oct 27, 2017
1,232
I think you're overestimating the CPU in the PS5/XB4.

Just because the hardware is in the same family doesn't mean it's clocked the same. Consoles need to cool themselves after all.

Ya, OP probably forgot that they'll most likely get a laptop version.

You can't magically have the top-line CPU + the latest custom desktop-sized GPU with reasonable wattage requirements in a console box with no cooling problems, and still be $599. And to expect it to outperform $2,000+ PCs...


Nah
 

gofreak

Member
Oct 26, 2017
7,734
I've never really had the impression that devs - those who are not 'PC-first', at least - care what their PC minimum requirements wind up being. If 'console-first' devs require a certain amount of CPU power to hit 30Hz on console, then I don't think they're going to pare that back with an eye on PC minimum requirements. They'll just set their PC minimum requirements at that level.

The bigger potential for 'hold-back' is in cross-gen games - games targeted to work on everything from a base XB1 up. We'll see how the transition goes in that regard.
 
DonMigs85

OP

Banned
Oct 28, 2017
2,770
The vanilla 2080 can outperform the 1080 Ti except in a small handful of titles that need more than 8GB of VRAM. Definitely not a mid-range card at all. That would be the 2060 Super or vanilla 2070 at most.
 

thirtypercent

Member
Oct 18, 2018
680
Ah, the eternal "this time we'll get 'em" dream. Just like last gen (afair the big scare was GDDR5), it won't happen: requirements will rise, and PC gamers will adapt. Plus next-gen console games won't be allowed to utilize all cores/threads, and the CPUs will be clocked much lower. Current AAA games already have rising requirements and want a bunch of cores; they don't run great on quads or older hardware in general, and they offer power-hungry settings. So in a year's time it won't be a big issue, and even then only for a very short time. We're used to upgrading; this ain't no different. Also, turning down settings will still be a thing.

Also, you've written over a quarter of your lifetime ERA comments on this topic within a day? Maybe cool down a tad.

That's usually a good way to spot a troll everyone should stop responding to.
 

Md Ray

Member
Oct 29, 2017
750
Chennai, India
lmao

That's not native 1080p. It not being native 1080p is precisely why you listed several paragraphs explaining the differences.

And no, extrapolating the remaining pixels is clearly not native, else they wouldn't need to be extrapolated.

Also, you've written over a quarter of your lifetime ERA comments on this topic within a day? Maybe cool down a tad.
You not understanding what those paragraphs said doesn't change the facts. It uses a native 1080p image and then extrapolates, as explained.

I think you'll find that all of my comments have been full of constructive info, but if you want to be immature then that's fine.
 