
El Bombastico

Avenger
Oct 25, 2017
36,048
Officially denied by AMD
Yes it is coming this year, although I would not be surprised if it was very late in the year with not the greatest availability.
Yep, AMD denied the rumors. It'll still be a 2020 release. Just in time to know how the consoles fare vs PC hardware.

Well good, hope it comes out around the same time as the TI/3090/whatever-the-fuck-they're-gonna-call-it
 

BreakAtmo

Member
Nov 12, 2017
12,838
Australia
I went from a 4670k to an 8600k (both with a 1070) and I could really see a jump in framerate. Mostly in 1% lows. But yeah, it depends on what you are playing. Competitive games like OW? CPU-bound, eye-candy games like SotTR? GPU-bound (mostly). In your case, I think a cheap and easy way to get some extra performance without any hassle is to buy another 8 gigs of RAM, more and more games need a lot of RAM.
As for Zen3, I don't think there will be any change in the core count. Zen3 is made using N7P, an optimized N7 which is currently used in Zen2. So probably higher clocks all around. But the greatest change is in the architecture. I suspect they further improved the infinity fabric and latency is lower now.
Zen4 is probably going to be in either N7+, which while being still 7nm uses a completely different process, or N5, so maybe we'll see more cores then.

I decided not to bother with upgrades now since none of them will transfer over - I'll just have to wait until the end of the year. It sucks, but I'm even holding off on playing some games like DOOM Eternal and Control. I want to see them for the first time in their full glory (and at 120fps in DOOM's case - I've never had a display over 60Hz and will be getting an LG CX).

My current PC is exactly the same as yours and I'm also waiting for Ryzen 4000 and Nvidia's 3000. I'm so ready.

epic_handshake_watercolor_painting.jpg


Is the Ryzen 4000 series still coming this year? I heard rumors that it was being pushed back to 2021...

There was one site that claimed that. From what I recall, there was no evidence, no one else has backed up their claim, at least one other site actively disagreed, and Zen3 is still officially set for this year as of a few weeks ago.
 

Vuze

Member
Oct 25, 2017
4,186
PCIe 5.0 and DDR5 are 2021 and not Ryzen 4000/this year, right? I always get confused for some reason. I think I will just hold out till then as I feel like that should be good enough for the whole of next gen.
 

kami_sama

Member
Oct 26, 2017
7,004
I decided not to bother with upgrades now since none of them will transfer over - I'll just have to wait until the end of the year. It sucks, but I'm even holding off on playing some games like DOOM Eternal and Control. I want to see them for the first time in their full glory (and at 120fps in DOOM's case - I've never had a display over 60Hz and will be getting an LG CX).
Doom at 144Hz is godly.
If you're planning to buy a Nvidia GPU (you're on this thread, so most likely lol), get either a G-Sync one or a G-sync compatible (select freesync ones) and you will get no tearing.
 

BreakAtmo

Member
Nov 12, 2017
12,838
Australia
Officially denied by AMD

Oh, marvellous.

I'm very happy with my 3950X chip, so I don't know how they can improve upon its design.

I'm guessing it'll be a simple case of higher IPC and higher clockspeeds. Hopefully it'll match the 10900K's single-core performance while improving multi-core performance even further. That sounds like overkill now, but it won't seem like it once next-gen games built to run at 30fps on a 16-thread 3.5GHz Zen2 appear and you want to run them at 120fps or more.
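Rough way to picture the single-core side, since performance scales roughly with IPC times clock. The gains and clocks below are made-up placeholder numbers (not leaks or anything AMD has said), just to show how the two multiply together:

```c
/* Toy single-thread scaling math: perf ~ IPC x clock.
   All gains/clocks below are invented placeholders, not real Zen 3 figures. */
#include <stdio.h>

int main(void) {
    const double base_clock_ghz = 4.7;   /* pretend Zen 2 boost-clock baseline */
    struct { const char *label; double ipc_gain; double clock_ghz; } cases[] = {
        {"+10% IPC, same clocks", 0.10, 4.7},
        {"+15% IPC, +200 MHz   ", 0.15, 4.9},
        {"+20% IPC, +300 MHz   ", 0.20, 5.0},
    };
    for (int i = 0; i < 3; i++) {
        double rel = (1.0 + cases[i].ipc_gain) * cases[i].clock_ghz / base_clock_ghz;
        printf("%s -> ~%.0f%% faster single-thread\n",
               cases[i].label, (rel - 1.0) * 100.0);
    }
    return 0;
}
```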

Doom at 144Hz is godly.

If you're planning to buy a Nvidia GPU (you're on this thread, so most likely lol), get either a G-Sync one or a G-sync compatible (select freesync ones) and you will get no tearing.

Way ahead of you. LG's CX (and B9, C9 and BX) OLED TVs are G-Sync Compatible. They also have Freesync and HDMI-VRR - a great all-in-one display.
 

Zephy

Member
Oct 27, 2017
6,168
What's the performance uplift like? I've been wondering what a cutting-edge rig with an older, weaker graphics card would actually be like.

If I remember I will tell you after I've actually built the setup :)

I usually buy an entire new PC, but this time I'm keeping my old case and some other stuff like the SSD and PSU, so I'll need some time to build it, since during the process I will dismantle the old components and therefore won't be able to use the desktop PC.

I'm expecting huge boosts in CPU-heavy games like X4, which I had to stop playing due to my PC lagging too much during late-game big battles. But for most games I should not see a huge difference, since what I usually play runs just fine on my 1080p display.
 
Last edited:

BreakAtmo

Member
Nov 12, 2017
12,838
Australia
What size LG CX are you getting? 48?

55". My current display is 55" and I'd rather not go smaller.

I'm also not sure if the 48" is even coming out in Australia.

If I remember I will tell you after I've actually built the setup :)

I usually buy an entire new PC, but this time I'm keeping my old case and some other stuff like the SSD and PSU, so I'll need some time to build it, since during the process I will dismantle the old components and therefore won't be able to use the desktop PC.

I'm expecting huge boosts in CPU-heavy games like X4, which I had to stop playing due to my PC lagging too much during late-game big battles. But for most games I should not see a huge difference, since what I usually play runs just fine on my 1080p display.

Yeah, this is about what I expected. I think I made the right decision waiting. Thanks.

Now let's just hope Zen3 is good.
 

piratecap

Member
Oct 27, 2017
221
Going from a 4670k and 980 Ti to an AMD 4000-series CPU and an Nvidia 3080 here... Hoping it will be able to run Cyberpunk at 1440p ultrawide with decent settings.
 

dgrdsv

Member
Oct 25, 2017
11,885
Word from the rumour mill is 2022.
Zen4 has a chance of launching at the end of 2021.
Intel's 1700 socket should support PCIe5 as well and launch in 2021 with Alder Lake but we'll have to wait till 2022 and Meteor Lake to get CPUs with said support into this socket (similar to how it is with PCIe4 in socket 1200 and CML->RKL transition).
 

PHOENIXZERO

Member
Oct 29, 2017
12,089
Oh, marvellous.



I'm guessing it'll be a simple case of higher IPC and higher clockspeeds. Hopefully it'll match the 10900K's single-core performance while improving multi-core performance even further. That sounds like overkill now, but it won't seem like it once next-gen games built to run at 30fps on a 16-thread 3.5GHz Zen2 appear and you want to run them at 120fps or more.



Way ahead of you. LG's CX (and B9, C9 and BX) OLED TVs are G-Sync Compatible. They also have Freesync and HDMI-VRR - a great all-in-one display.
The major change we know about Zen 3 is that CPUs up to 8 cores will be a single monolithic die with a unified L3 cache, which should help significantly with reducing latency. Clock speeds should increase too, and hopefully Infinity Fabric speed will as well for the CPUs over 8C.
 

BreakAtmo

Member
Nov 12, 2017
12,838
Australia
The major change we know about Zen 3 is that CPUs up to 8 cores will be a single monolithic die with a unified L3 cache, which should help significantly with reducing latency. Clock speeds should increase too, and hopefully Infinity Fabric speed will as well for the CPUs over 8C.

Ahhh, I see. So the 16-core would probably be a pair of the 8-core CCXs, and the 12-core would be what - 2 6-core CCXs? And maybe the lower chip will be one of them?
 

laxu

Member
Nov 26, 2017
2,782
What's the performance uplift like? I've been wondering what a cutting-edge rig with an older, weaker graphics card would actually be like.

The other way around generally works better. With a 970 you are mainly GPU limited so having a higher end CPU will most likely do nothing more than maybe stabilize framerates a bit.
 

dgrdsv

Member
Oct 25, 2017
11,885
The major change we know about Zen 3 is that CPUs up to 8 cores will be a single monolithic die with a unified L3 cache, which should help significantly with reducing latency. Clock speeds should increase too, and hopefully Infinity Fabric speed will as well for the CPUs over 8C.
It will help with latency uniformity, but a bigger cache usually means higher latency. The fact that the cache itself will likely stay the same size, just as one chunk now instead of two (one per CCX), means it's hard to say whether the gains will be as high as they were with Zen2. I'm expecting lower average gains but better handling of edge cases - like those Far Cry games which still have a huge lead on Intel CPUs.
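If anyone wants to see the latency cliffs for themselves, a crude pointer-chasing loop is enough to show where L1/L2/L3/DRAM kick in on their own chip. Everything below (buffer sizes, iteration count, clock() as the timer) is a rough placeholder rather than a proper benchmark like AIDA64, so treat the output as ballpark only:

```c
/* Crude cache/memory latency sketch via pointer chasing. */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>
#include <time.h>

static volatile size_t sink;                 /* keeps the chase loop from being optimized away */

static uint64_t xorshift64(uint64_t *s) {    /* tiny RNG, good enough for shuffling */
    *s ^= *s << 13; *s ^= *s >> 7; *s ^= *s << 17;
    return *s;
}

static double chase_ns(size_t bytes, size_t iters) {
    size_t n = bytes / sizeof(size_t);
    size_t *buf = malloc(n * sizeof(size_t));
    uint64_t seed = 0x9e3779b97f4a7c15ULL;
    /* Sattolo's algorithm builds one big random cycle so the prefetcher can't predict it. */
    for (size_t i = 0; i < n; i++) buf[i] = i;
    for (size_t i = n - 1; i > 0; i--) {
        size_t j = (size_t)(xorshift64(&seed) % i);
        size_t t = buf[i]; buf[i] = buf[j]; buf[j] = t;
    }
    size_t idx = 0;
    clock_t t0 = clock();
    for (size_t i = 0; i < iters; i++) idx = buf[idx];   /* serially dependent loads */
    double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;
    sink = idx;
    free(buf);
    return secs * 1e9 / (double)iters;       /* average nanoseconds per load */
}

int main(void) {
    /* Working sets picked to land roughly in L1 / L2 / L3 / DRAM on a Zen 2 or Zen 3 part. */
    const size_t sizes_kb[] = {16, 256, 4096, 16384, 131072};
    for (int i = 0; i < 5; i++)
        printf("%7zu KB: %6.1f ns/load\n", sizes_kb[i],
               chase_ns(sizes_kb[i] * 1024, (size_t)20 * 1000 * 1000));
    return 0;
}
```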
 

Protoman200X

The Fallen
Oct 25, 2017
8,560
N. Vancouver, BC, Canada
I'm guessing it'll be a simple case of higher IPC and higher clockspeeds. Hopefully it'll match the 10900K's single-core performance while improving multi-core performance even further. That sounds like overkill now, but it won't seem like it once next-gen games built to run at 30fps on a 16-thread 3.5GHz Zen2 appear and you want to run them at 120fps or more.

Perhaps, but I'm quite content with the 3950X, and I feel it's going to be a bit before games take advantage of what we have on the market. I could be very wrong on this assumption, but oh well. I'm the sort who's still content with 1080p/240Hz for my gaming monitors.

by improving upon its design.

You know what I mean, smart arse.
 

Readler

Member
Oct 6, 2018
1,972
The major change we know about Zen 3 is that CPUs up to 8 cores will be a single monolithic die with a unified L3 cache, which should help significantly with reducing latency. Clock speeds should increase too, and hopefully Infinity Fabric speed will as well for the CPUs over 8C.
Wait isn't this the case already?
 

Zephy

Member
Oct 27, 2017
6,168
Not trying to one up (one down?) you, but I'm coming from 4590 and a 280X going for a Ryzen 4XXX and 3070 this year. Super excited as well!

Should we make a club or something? X'D

Going from 4790K with 970 to a 10700 with a 3xxx GPU (will wait on prices and versions before I decide what I get).
 

Mengy

Member
Oct 25, 2017
5,404
Same, I just received my new components (i7 10700, 32GB RAM and a 1TB NVMe SSD), but I'm waiting for the next GeForce to replace my 970. I was motivated to upgrade because I want to run DCS World on my Pimax 5K VR headset.

Same here, I'm going to order the parts for my new PC in the next few weeks, but I'll be using my 970 in the new PC until the 3000 cards are out.

The new CPU, SSDs (I still use spinning drives!), and RAM will greatly benefit my video editing for my YouTube channel while I wait for Nvidia to release the new cards!
 

BreakAtmo

Member
Nov 12, 2017
12,838
Australia
Perhaps, but I'm quite content with the 3950X, and I feel it's going to be a bit before games take advantage of what we have on the market. I could be very wrong on this assumption, but oh well. I'm the sort who's still content with 1080p/240Hz for my gaming monitors.

You might be right, but I do think the underpowered Jaguar chips in the PS4 and X1 have resulted in a generation where CPUs have a very easy time pushing framerates like 144-240fps. The CPU jump in the new consoles is going to be really phenomenal - but yes, I agree that it'll be a while before we see third party games really taking advantage of them. Even then, the 3950X should do well - I'm just concerned that even it won't be able to manage 240fps in a lot of later games, even if you use a 3080 and run at 1080p.
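Quick back-of-envelope on what those targets mean for the CPU side (pure arithmetic, nothing measured): every doubling of the framerate halves the per-frame budget the CPU gets for game logic, draw calls and so on.

```c
/* Frame-time budget arithmetic relative to a 30fps console target. */
#include <stdio.h>

int main(void) {
    const double targets[] = {30, 60, 120, 144, 240};
    for (int i = 0; i < 5; i++) {
        double budget_ms = 1000.0 / targets[i];
        printf("%3.0f fps -> %5.2f ms per frame (%4.1fx the per-frame CPU throughput of 30fps)\n",
               targets[i], budget_ms, targets[i] / 30.0);
    }
    return 0;
}
```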
 

ss_lemonade

Member
Oct 27, 2017
6,658
What's the performance uplift like? I've been wondering what a cutting-edge rig with an older, weaker graphics card would actually be like.
Interesting. My current rig (from 2013 outside of the graphics card) is a 970 with an i5-3570K, and 8GB of DDR3 and a SanDisk SATA SSD. Last Black Friday when there was an eBay Plus sale, I considered getting most of what I needed for a new rig (3900X, 32GB of 3600mhz DDR4, an NVMe drive, etc) and keeping the 970 until I could get a 3080 on sale. I decided the result wouldn't be worth it and I'd be better off waiting a year and going Zen3 instead.
My upgrade path was the other way. Basically started with:

3570k + 8GB + 780: Worked really well for 1080p in 2013 until I started running into games I could no longer max out. It was either games demanding heavy CPU utilization (low minimum framerates, microstuttering), more RAM (stuttering) or a better GPU (lower overall framerates; the limited 3GB of VRAM also kept some games like Gears from loading textures properly).

Gaming laptop with similar GPU performance but a better CPU (6700HQ + 16GB + 970m): Probably closer to your question. Games ran fairly similar but with slightly lower max, avg framerates (the 970m was just slightly below my 780). The better CPU and RAM did help a lot with stuttering and gave me a much more stable experience with newer games. I ran into a multitude of unfortunate issues though with the laptop and decided to sell it after a year and just get a new GPU instead.

3570k + 8GB + 1080 Ti: Improved max and average framerates a lot and could again max out games even at higher resolutions. This didn't help at all though with minimum framerates and stuttering. Games like Final Fantasy 15 were still a mess. The laptop gave a much more stable gaming experience despite the significantly weaker GPU.

3570k + 16GB + 1080 Ti: Didn't exactly help as much as I was hoping, though I was only really comparing with Final Fantasy 15 since that is what I was playing at the time that was extremely demanding.

3700x + 32GB + 1080 Ti: Demolished any game I threw at it.

Yeah, but I'm OK, I think it can handle between 30 and 60fps thanks to G-Sync.
It will struggle to get more than that, but I don't need more than 60fps in AAA solo games.
G-Sync barely helped when I was running into games that needed faster CPUs.

It had a 2080ti.

With RTX on, the bottleneck is 100% going to be the GPU, so as long as you have a decentish CPU/RAM they should be fine.
Wasn't it streamed through GeforceNOW servers, or was there another demo tested on local hardware?

Would be nice to have 2080 Ti-tier hardware with GeforceNOW (or 3000 series). I just know that the fastest configuration can still run into framerate issues with games like Metro Exodus at the highest RTX settings @ 1080p.
 
Last edited:

RedSwirl

Member
Oct 25, 2017
10,058
Interesting. My current rig (from 2013 outside of the graphics card) is a 970 with an i5-3570K, and 8GB of DDR3 and a SanDisk SATA SSD. Last Black Friday when there was an eBay Plus sale, I considered getting most of what I needed for a new rig (3900X, 32GB of 3600mhz DDR4, an NVMe drive, etc) and keeping the 970 until I could get a 3080 on sale. I decided the result wouldn't be worth it and I'd be better off waiting a year and going Zen3 instead.

Speaking of Zen3, I was wondering - do you think the larger core counts are going to trickle down to the cheaper CPUs this time? Like, say, a 12-core 4800X when the 3800X was only 8-core?
I went from a 4670k to an 8600k (both with a 1070) and I could really see a jump in framerate. Mostly in 1% lows. But yeah, it depends on what you are playing. Competitive games like OW? CPU-bound, eye-candy games like SotTR? GPU-bound (mostly). In your case, I think a cheap and easy way to get some extra performance without any hassle is to buy another 8 gigs of RAM, more and more games need a lot of RAM.
As for Zen3, I don't think there will be any change in the core count. Zen3 is made using N7P, an optimized N7 which is currently used in Zen2. So probably higher clocks all around. But the greatest change is in the architecture. I suspect they further improved the infinity fabric and latency is lower now.
Zen4 is probably going to be in either N7+, which while being still 7nm uses a completely different process, or N5, so maybe we'll see more cores then.
See I'm still on a 4670k with 8GB of RAM, with a 1070. Every once in a while I think about getting another 8GB but at this point I don't know if it's worth it if I might rebuild this year. My motherboard isn't compatible with DDR4 so I don't know if it's worth it to grab another 8GB of DDR3 and only use it for a few months.

I have this hooked up to a 1080p TV, so my ceiling is 1080p60fps, and I don't think I'll be moving up to 4K this year. I downscale when I can get away with it. There are already a few games like Modern Warfare, Kingdom Come, and AC Origins where I'm pretty badly CPU limited. I actually don't know if it's the CPU or RAM with Modern Warfare, I just know the game will repeatedly lock up for a few seconds. I heard Red Dead II had similar problems with older i5s. Other AAA games like Resident Evil 3 and Doom Eternal run just fine, but I'm scared of having to run Cyberpunk on this.

Depending on when the 3070 comes around I may even first rebuild but keep the 1070 for a little bit. The game I play most is actually Arma, and even though I've accepted it will always run like shit to some extent, I've heard it really benefits from faster RAM and CPU clock speeds. I just want RTX, DLSS, and integer scaling as soon as I can get it at a reasonable price. I haven't started looking up CPUs yet but I am considering going AMD for the first time.

Advice?
 
Last edited:

Sabin

Member
Oct 25, 2017
4,622
See I'm still on a 4670k with 8GB of RAM, with a 1070. Every once in a while I think about getting another 8GB but at this point I don't know if it's worth it if I might rebuild this year. My motherboard isn't compatible with DDR4 so I don't know if it's worth it to grab another 8GB of DDR3 and only use it for a few months.

I have this hooked up to a 1080p TV, so my ceiling is 1080p60fps, and I don't think I'll be moving up to 4K this year. I downscale when I can get away with it. There are already a few games like Modern Warfare, Kingdom Come, and AC Origins where I'm pretty badly CPU limited. I actually don't know if it's the CPU or RAM with Modern Warfare, I just know the game will repeatedly lock up for a few seconds. I heard Red Dead II had similar problems with older i5s. Other AAA games like Resident Evil 3 and Doom Eternal run just fine, but I'm scared of having to run Cyberpunk on this.

Depending on when the 3070 comes around I may even first rebuild but keep the 1070 for a little bit. The game I play most is actually Arma, and even though I've accepted it will always run like shit to some extent, I've heard it really benefits from faster RAM and CPU clock speeds. I just want RTX, DLSS, and integer scaling as soon as I can get it at a reasonable price. I haven't started looking up CPUs yet but I am considering going AMD for the first time.

Advice?

A 3070 won't make much sense tbh. It will be massively bottlenecked by the CPU and the slow DDR3 RAM.
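If you want to rule RAM in or out before spending anything on more DDR3 (re: those Modern Warfare lock-ups), just watch physical memory load while the game runs - Task Manager's Performance tab shows it, and on Windows the same number is exposed via GlobalMemoryStatusEx if you'd rather log it. A minimal sketch, purely illustrative:

```c
/* Polls Windows physical memory usage every 2 seconds (Ctrl+C to stop).
   If this sits near 100% when the game stutters, RAM is the likely culprit. */
#include <windows.h>
#include <stdio.h>

int main(void) {
    MEMORYSTATUSEX ms;
    ms.dwLength = sizeof(ms);
    for (;;) {
        if (!GlobalMemoryStatusEx(&ms)) return 1;
        printf("RAM load: %lu%%  (%.1f of %.1f GB in use)\n",
               ms.dwMemoryLoad,
               (ms.ullTotalPhys - ms.ullAvailPhys) / (1024.0 * 1024.0 * 1024.0),
               ms.ullTotalPhys / (1024.0 * 1024.0 * 1024.0));
        Sleep(2000);
    }
}
```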
 

RedSwirl

Member
Oct 25, 2017
10,058
A 3070 won't make much sense tbh. It will be massively bottlenecked by the CPU and the slow DDR3 RAM.
Yeah I wasn't thinking about putting a 3070 onto my current rig. I would definitely rebuild everything else first, but it's a matter of whether I actually rebuild everything else sooner and just wait a while before I get any new GPU. And I'm wondering if it's worth it for me to buy more DDR3 RAM between now and when I decide to start the rebuild.
 

BBboy20

One Winged Slayer
Member
Oct 25, 2017
22,010
So, I suppose my dilemma is that there is a new PC platform on the horizon, but it won't come around until after Cyberpunk 2077. On the other hand, I've had the same CPU and RAM since 2011, and it seems like these upcoming 4000s and 3000s would be the right time to upgrade right before (hopefully) 2077. Though, I suppose that will also depend on how expensive (or performance-worthy) those future Ryzens and RTXs will be.

I also read up about the early pitfalls of new architectures so maybe it would be better to invest in the very best this current generation of technology has to offer.
 

nitewulf

Member
Nov 29, 2017
7,204
Not too familiar with Intel architecture now, but current Ryzen CPUs are more than capable of what's coming down the pipe. I have a 3700X and in Control, Gears 5, Forza H, DMC5 etc I have not seen the CPU taxed beyond 10%, and usually at 3%. The GPUs are the main bottlenecks if you are using a Ryzen and DDR4 ram (even Ram amount/size is not THAT important).
 

Pall Mall

Member
Oct 25, 2017
2,424
I'm wondering if the 3080 Ti will be able to handle Cyberpunk with ray tracing at 4K or if I should scale that back to 1440p.
 

Nooblet

Member
Oct 25, 2017
13,632
I'm wondering if the 3080 Ti will be able to handle Cyberpunk with ray tracing at 4K or if I should scale that back to 1440p.
Probably not if you want 60FPS.
Apparently the main demo that streamers played was running at 1080P using DLSS from 720P. It had everything turned on except RT reflections. Though reports were that it was running in excess of 60FPS. And it's possible it was lower resolution to make streaming easier.

My guess is 3080Ti/3090 will be able to do 1440P with DLSS everything on and hit 60. Without RT it should be able to hit native 4K like the B roll footage that Digitalfoundry used in their analysis. But I'm purely speculating at this point.
 

RCSI

Avenger
Oct 27, 2017
1,839
Damn, Sandy Bridge is easily among the legendary tier of Intel CPUs.

I don't regret buying the 2500k; it nearly lasted through an entire generation of targeted console games, if not for later VR titles maxing it out. I don't expect the Zen 3 series to last as long and anticipate an upgrade in 6 years, though I hope to push 8 for a build after the PS6/XBSX2.
 

Pall Mall

Member
Oct 25, 2017
2,424
Probably not if you want 60FPS.
Apparently the main demo that streamers played was running at 1080P using DLSS from 720P. It had everything turned on except RT reflections. Though reports were that it was running in excess of 60FPS. And it's possible it was lower resolution to make streaming easier.

My guess is 3080Ti/3090 will be able to do 1440P with DLSS everything on and hit 60. Without RT it should be able to hit native 4K like the B roll footage that Digitalfoundry used in their analysis. But I'm purely speculating at this point.
Yeah I guess we'll see when we actually get the hard specs and benchmarks, but that seems like the reasonable assumption. I'm sure it'll look glorious though in either permutation with RT, man that lighting is clean. But I think I want that 60fps.
 

Trieu

Member
Feb 22, 2019
1,774
Usually new hardware is never quite good enough to play the most demanding and newest games completely maxed out at higher resolutions (especially 4K) at a playable framerate (what qualifies as playable might come down to individual taste).

A friend of mine bought an RTX 2080 Ti Lightning Z and overclocked it to 2100MHz, and it still wasn't quite good enough for Control and Metro Exodus completely maxed out (with RT on) at 1440p.
And if I am not mistaken, ray tracing is rendered/calculated per pixel, so higher resolutions are extremely demanding.

Completely speculating here, but I would guess that to play Cyberpunk 2077 in 4K with RTX and all other settings maxed at 60+ fps, you need 3-4x the performance of the 2080 Ti.
Which does sound mighty unreasonable when I say it like that, but only a very small part of the RTX 20 die is actually dedicated to ray tracing compared to the rest. So we can hope that ray tracing performance gets a gigantic boost with the newer generations that are yet to come.

(As an owner of an RTX 2080 and 1440p monitor I am pretty disappointed in how current RTX games run on that hardware)
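On the "per pixel" point, raw pixel counts already tell most of the story (simple arithmetic; real RT cost also depends on ray counts and the scene, so treat the ratios as a floor):

```c
/* Pixel-count ratios relative to 1440p - a rough floor for how per-pixel RT cost scales. */
#include <stdio.h>

int main(void) {
    struct { const char *name; int w, h; } res[] = {
        {"1080p",        1920, 1080},
        {"1440p",        2560, 1440},
        {"3440x1440 UW", 3440, 1440},
        {"4K",           3840, 2160},
    };
    const double base = 2560.0 * 1440.0;   /* the 1440p case mentioned above */
    for (int i = 0; i < 4; i++) {
        double px = (double)res[i].w * res[i].h;
        printf("%-13s %10.0f px  %.2fx the pixels of 1440p\n", res[i].name, px, px / base);
    }
    return 0;
}
```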
 

RedSwirl

Member
Oct 25, 2017
10,058
Usually new hardware is never quite good enough to play the most demanding and newest games completely maxed out at higher resolutions (especially 4K) at a playable framerate (what qualifies as playable might come down to individual taste).

A friend of mine bought an RTX 2080 Ti Lightning Z and overclocked it to 2100MHz, and it still wasn't quite good enough for Control and Metro Exodus completely maxed out (with RT on) at 1440p.
And if I am not mistaken, ray tracing is rendered/calculated per pixel, so higher resolutions are extremely demanding.

Completely speculating here, but I would guess that to play Cyberpunk 2077 in 4K with RTX and all other settings maxed at 60+ fps, you need 3-4x the performance of the 2080 Ti.
Which does sound mighty unreasonable when I say it like that, but only a very small part of the RTX 20 die is actually dedicated to ray tracing compared to the rest. So we can hope that ray tracing performance gets a gigantic boost with the newer generations that are yet to come.

(As an owner of an RTX 2080 and 1440p monitor I am pretty disappointed in how current RTX games run on that hardware)
From what I understand, both RT and 4K are things that are still so expensive that basically no hardware can max them out for modern games yet. We're still probably gonna be relying on stuff like DLSS, VRS, and other forms of upscaling for a while.
 

BreakAtmo

Member
Nov 12, 2017
12,838
Australia
From what I understand, both RT and 4K are things that are still so expensive that basically no hardware can max them out for modern games yet. We're still probably gonna be relying on stuff like DLSS, VRS, and other forms of upscaling for a while.

Yep, and current RT is just a hybrid solution. And then you get into questions of what counts as "maxing out" - 60fps? 120? 144? Whatever the case, native 4K with ray-tracing is brutal on any rig. Like, a 2080Ti can probably do it with Metro Exodus if you're ok with 30fps or medium settings since it apparently uses a rather light implementation, but for games with multiple RT effects it's just out of the question.

Thankfully, the DLSS 2.0 Quality Mode seems to effectively give you different-but-equal image quality plus a 60-70% performance boost, and the Performance Mode doesn't look much worse and ups that to 130-140%. I don't think you'd ever really want to stop using DLSS - if you can hit native 4K, then you could do 8K DLSS Performance Mode and downsample instead.
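To put rough numbers on that: the internal render resolutions below use Nvidia's published DLSS 2.0 scale factors (2/3 per axis for Quality, 1/2 for Performance), the uplift percentages are the ones above, and the 40fps native baseline is just an example figure, not a benchmark:

```c
/* Rough DLSS 2.0 math: internal resolution per mode plus the quoted uplift ranges. */
#include <stdio.h>

int main(void) {
    const int out_w = 3840, out_h = 2160;        /* 4K output */
    const double native_fps = 40.0;              /* made-up native 4K baseline */
    struct { const char *mode; double axis_scale, uplift; } modes[] = {
        {"Quality    ", 2.0 / 3.0, 0.65},        /* ~60-70% faster than native */
        {"Performance", 1.0 / 2.0, 1.35},        /* ~130-140% faster than native */
    };
    printf("4K native: %.0f fps (example)\n", native_fps);
    for (int i = 0; i < 2; i++) {
        int rw = (int)(out_w * modes[i].axis_scale + 0.5);
        int rh = (int)(out_h * modes[i].axis_scale + 0.5);
        printf("DLSS %s renders %dx%d -> ~%.0f fps\n",
               modes[i].mode, rw, rh, native_fps * (1.0 + modes[i].uplift));
    }
    /* The 8K trick: Performance mode at 8K output renders internally at 3840x2160,
       i.e. the same pixel count as native 4K. */
    printf("8K Performance mode internal render: %dx%d\n", 7680 / 2, 4320 / 2);
    return 0;
}
```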
 