
ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
For consumers, how big of a market are the high-end, top-tier Nvidia cards? So many people seemed to buy the RX 580s and 1060s, and even more the lower-end cards. Are they even trying to compete with Nvidia's xx80s? Was that the Vega 7 during the 1080 gen?
The best place to check for gaming cards is the Steam Hardware Survey: 20% of all GPUs polled are 1060s, and the first xx80 card is the 1080, at 2.6%.

 

Hesemonni

Banned
Oct 27, 2017
1,974
For consumers, how big of a market are the high-end, top-tier Nvidia cards? So many people seemed to buy the RX 580s and 1060s, and even more the lower-end cards. Are they even trying to compete with Nvidia's xx80s? Was that the Vega 7 during the 1080 gen?
Perceptions drive sales, so having the top dog out there helps your lower-end offerings via a trickle-down effect. Also, while the absolute sales might be smaller, margins are much larger in the enthusiast market.
 

eonden

Member
Oct 25, 2017
17,091


Best part of the entire CES. AMD is about as good at presentations that aren't about CPUs as it is at competing with Nvidia in high-end GPUs.
 

elenarie

Game Developer
Verified
Jun 10, 2018
9,823
He actually says "Intelligent devices at the edge." I'm confused as to what or where that is. lol

Edit: Damn, Intel got them tunes though.

Your fridge, thermostat, fitness tracker, augmented reality headset, toaster, car, whatever. Devices that live at the very edge of what we perceive to be the computing world, powered and backed up by cloud services.
 

Phil me in

Member
Nov 22, 2018
1,292
Sounds like the Ryzen 4000 series might be on par with the i9-9900K, or whatever it's called, in gaming.
Can't wait to upgrade my Ryzen 2600; it's good but not great.
 

Herne

Member
Dec 10, 2017
5,319
Prepare for the super competitive AMD 5800XT to rival the RTX 2070 Super at the low cost of $499.99 =p

The 5700 XT was meant to rival the 2070 and is a better performer at €100 less. AMD may no longer be interested in "being the value alternative," but they're still undercutting nVidia. Though granted, the 5500/XT pricing isn't great; some people on YouTube were saying AMD was forced into it, but who knows.
 

Phil me in

Member
Nov 22, 2018
1,292
What games are you finding yourself limited by a 2600?

I game at 1440p, and sometimes I've dropped the res to 1080p to see if there's an improvement in FPS; there usually isn't. Feels like it's CPU-bound.
Warhammer 2 and MechWarrior.
I'm tempted to get one, but also tempted to hold off until next year for the new platform.

True, but this year's chips won't require a new mobo. I'm happy with the rest of my system.
 

Deleted member 11276

Account closed at user request
Banned
Oct 27, 2017
3,223
I game at 1440p, and sometimes I've dropped the res to 1080p to see if there's an improvement in FPS; there usually isn't. Feels like it's CPU-bound.
Warhammer 2 and MechWarrior.


True, but this year's chips won't require a new mobo. I'm happy with the rest of my system.
I really doubt that. Check your CPU usage with MSI Afterburner; I doubt it's even near 100%. There could be many other bottlenecks. Also try dropping the resolution below 1080p.
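To make the logic of that test concrete, here's a minimal sketch (Python; the FPS numbers and the 5% threshold are hypothetical placeholders, not real benchmarks):

```python
# Minimal sketch of the resolution-drop bottleneck test described above.
# All numbers here are made-up illustrations, not measurements.

def likely_cpu_bound(fps_native: float, fps_lowres: float,
                     tolerance: float = 0.05) -> bool:
    """If dropping the resolution barely raises FPS, the GPU wasn't the
    limiting factor, so the CPU (or RAM, engine, etc.) probably is."""
    return (fps_lowres - fps_native) / fps_native < tolerance

# Example: same scene at 1440p vs. 1080p (hypothetical numbers).
print(likely_cpu_bound(fps_native=62.0, fps_lowres=64.0))  # True  -> likely CPU-bound
print(likely_cpu_bound(fps_native=62.0, fps_lowres=85.0))  # False -> GPU was the limit
```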
 

DrDeckard

Banned
Oct 25, 2017
8,109
UK
I game at 1440p, and sometimes I've dropped the res to 1080p to see if there's an improvement in FPS; there usually isn't. Feels like it's CPU-bound.
Warhammer 2 and MechWarrior.


True, but this year's chips won't require a new mobo. I'm happy with the rest of my system.

I suppose the big positive is that X670 mobos should be the pinnacle of the platform... hmm. Decisions.
 

Deleted member 56752

Attempted to circumvent ban with alt account
Banned
May 15, 2019
8,699
Damn. Just got a 9700K, and the new Intel chips are supposed to be nuts in terms of performance. Weak.
 

Phil me in

Member
Nov 22, 2018
1,292
I really doubt that. Check your CPU usage with MSI Afterburner; I doubt it's even near 100%. There could be many other bottlenecks. Also try dropping the resolution below 1080p.

I've tried to use MSI Afterburner for one of its plugins, and every time I use it, it auto-overclocks my GPU no matter what I do.

I'm using an RTX 2060.
 

Lkr

Member
Oct 28, 2017
9,528
I've tried to use MSI Afterburner for one of its plugins, and every time I use it, it auto-overclocks my GPU no matter what I do.

I'm using an RTX 2060.
In fairness, Warhammer 2 will be limited by the 2600 at 1080p for sure.
I wasn't even thinking about 1080p when replying, so I can see why our experiences differ. I moved to 1440p, so it hasn't bothered me... yet. However, I was previously bottlenecked by an old 3570K, so the 2600 still feels like a revelation a year later.

Also, by any chance do you have an MSI RTX 2060? I believe those have an auto-OC feature for Afterburner, unless that applies to all RTX cards regardless of partner.
 
Nov 2, 2017
2,275
The 5700 XT was meant to rival the 2070 and is a better performer at €100 less. AMD may no longer be interested in "being the value alternative," but they're still undercutting nVidia. Though granted, the 5500/XT pricing isn't great; some people on YouTube were saying AMD was forced into it, but who knows.
At least in Europe, the 5700 XT & 2070 are the same price. Both can be had for around €400 for the cheapest models, and I think that was more or less the case when the 5700 XT launched. The 2070 was meant to be EOL, though, with the 2070S replacing it at the same MSRP. However, for some reason the 2070 was selling well below MSRP in Europe even before Navi, which is why they're the same price. This doesn't appear to be the case for the 2070S, which does sell for €500+.

AMD kind of has to undercut the 2xxx Turing cards, as Navi is older technology. It has no ray-tracing hardware or VRS, so there's a real risk that it might age really badly in 1-3 years. I'd only advise buying Navi if you plan to upgrade again within a fairly reasonable time.

The 5500 series doesn't undercut Nvidia. In fact, it provides worse price/performance than some 16xx cards.
 

Herne

Member
Dec 10, 2017
5,319
At least in Europe, the 5700 XT & 2070 are the same price. Both can be had for around €400 for the cheapest models, and I think that was more or less the case when the 5700 XT launched. The 2070 was meant to be EOL, though, with the 2070S replacing it at the same MSRP. However, for some reason the 2070 was selling well below MSRP in Europe even before Navi, which is why they're the same price. This doesn't appear to be the case for the 2070S, which does sell for €500+.

AMD kind of has to undercut the 2xxx Turing cards, as Navi is older technology. It has no ray-tracing hardware or VRS, so there's a real risk that it might age really badly in 1-3 years. I'd only advise buying Navi if you plan to upgrade again within a fairly reasonable time.

The 5500 series doesn't undercut Nvidia. In fact, it provides worse price/performance than some 16xx cards.

I'm not sure the lack of ray tracing is going to be much of an Achilles heel even three years from now. Sure, it'll be a gap: AMD themselves will long since have had ray-tracing cards out by then, and more developers will be implementing it in their games. But take-up is going to be slow, and there will be extremely few games, if any, that absolutely require it to run, because doing so would cut a lot of potential customers out of buying the game. Given ray tracing's performance hit on GeForce 2xxx cards, to the point that even 2080 owners disable it, the comparable 5700/XT and the 2060/70/Super cards would be left seriously out in the cold.

And yeah, everyone agrees the 5500/XT prices are terrible. I'm sure I read somewhere that AMD had no choice, but there's really no excuse for it. The 5600 XT price of $279 that was confirmed today isn't spectacular either.
 
Nov 2, 2017
2,275
I'm not sure the lack of ray tracing is going to be much of an Achilles heel even three years from now. Sure, it'll be a gap: AMD themselves will long since have had ray-tracing cards out by then, and more developers will be implementing it in their games. But take-up is going to be slow, and there will be extremely few games, if any, that absolutely require it to run, because doing so would cut a lot of potential customers out of buying the game. Given ray tracing's performance hit on GeForce 2xxx cards, to the point that even 2080 owners disable it, the comparable 5700/XT and the 2060/70/Super cards would be left seriously out in the cold.

And yeah, everyone agrees the 5500/XT prices are terrible. I'm sure I read somewhere that AMD had no choice, but there's really no excuse for it. The 5600 XT price of $279 that was confirmed today isn't spectacular either.
I feel devs will start pushing ray tracing as soon as the next-gen consoles launch at the end of this year. They're going to want to showcase the tech, and the console makers are already hyping it up. Consoles having ray-tracing hardware is probably the best situation Nvidia could've hoped for. Initially you'll probably be able to disable it, as we're going through a cross-gen period. At the same time, though, it's going to mean that while 2xxx cards might match console settings through next gen, the 5700 series won't be able to. In two years' time I expect there to be no more cross-gen titles, so I expect ray-tracing hardware to be pretty much mandatory. Consoles & Turing supporting VRS is only going to give RDNA1 an even harder time in the future.

You might be right in the end, but I feel the signs are definitely pointing to RDNA1 not being very future-proof. I can't imagine devs aren't going to take advantage of what the new consoles have to offer: ray tracing & VRS, both features Turing has but RDNA1 doesn't.
 

Herne

Member
Dec 10, 2017
5,319
I feel devs will start pushing ray tracing as soon as the next-gen consoles launch at the end of this year. They're going to want to showcase the tech, and the console makers are already hyping it up. Consoles having ray-tracing hardware is probably the best situation Nvidia could've hoped for. Initially you'll probably be able to disable it, as we're going through a cross-gen period. At the same time, though, it's going to mean that while 2xxx cards might match console settings through next gen, the 5700 series won't be able to. In two years' time I expect there to be no more cross-gen titles, so I expect ray-tracing hardware to be pretty much mandatory. Consoles & Turing supporting VRS is only going to give RDNA1 an even harder time in the future.

You might be right in the end, but I feel the signs are definitely pointing to RDNA1 not being very future-proof. I can't imagine devs aren't going to take advantage of what the new consoles have to offer: ray tracing & VRS, both features Turing has but RDNA1 doesn't.

To be honest, I'm planning on getting a 5700 XT soon, and the lack of ray tracing is not an issue at all. It's going to take a few years before it becomes a must-have feature, and probably two years before it comes on cards that are capable of really pushing it at a decent price, consoles having it or not. Again, even 2080 owners disable it due to the performance hit, and that is not a cheap card. Unless AMD's and nVidia's future implementations of the technology are more efficient and take less of a hit to performance, it's going to take a while to become a universal feature enjoyed by everyone.
 
Nov 2, 2017
2,275
To be honest, I'm planning on getting a 5700 XT soon, and the lack of ray tracing is not an issue at all. It's going to take a few years before it becomes a must-have feature, and probably two years before it comes on cards that are capable of really pushing it at a decent price, consoles having it or not. Again, even 2080 owners disable it due to the performance hit, and that is not a cheap card. Unless AMD's and nVidia's future implementations of the technology are more efficient and take less of a hit to performance, it's going to take a while to become a universal feature enjoyed by everyone.
2080 owners disable it because they want more fps or want to run the game at 4K. That's indeed a choice you can make. Some people have a 2080 and run stuff at medium settings to push high fps; that's the same thing. Even a 2060 is capable of running ray tracing. You just won't be able to do it at 4K, or at 60+ fps unless you run at 1080p.

I really wouldn't buy a 5700 XT at this point in time. My general advice: don't buy any card right now and wait for Ampere & RDNA2, as Turing is overpriced. If you have no choice and need to buy a GPU anyway, buy a 2xxx card. Chances are that even a 2060 is going to significantly outperform a 5700 XT in next-gen games if you want to use settings on par with consoles. Also, you keep focusing on the ray tracing, but VRS is also a thing, so even without ray tracing, Turing might just be better at next-gen games than RDNA1, which makes current performance comparisons unreliable. Given that a 2070 is the same price as the 5700 XT and only about 10% worse in current games, it seems like an unnecessary risk.
 

Herne

Member
Dec 10, 2017
5,319
2080 owners disable it because they want more fps or want to run the game at 4K. That's indeed a choice you can make. Some people have a 2080 and run stuff at medium settings to push high fps; that's the same thing. Even a 2060 is capable of running ray tracing. You just won't be able to do it at 4K, or at 60+ fps unless you run at 1080p.

I really wouldn't buy a 5700 XT at this point in time. My general advice: don't buy any card right now and wait for Ampere & RDNA2, as Turing is overpriced. If you have no choice and need to buy a GPU anyway, buy a 2xxx card. Chances are that even a 2060 is going to significantly outperform a 5700 XT in next-gen games if you want to use settings on par with consoles. Also, you keep focusing on the ray tracing, but VRS is also a thing, so even without ray tracing, Turing might just be better at next-gen games than RDNA1, which makes current performance comparisons unreliable. Given that a 2070 is the same price as the 5700 XT and only about 10% worse in current games, it seems like an unnecessary risk.

I'm currently gaming on an R9 390X, which I bought at launch. I used to upgrade my graphics card every two years; I'm now coming up on five years, and it is absolutely time to upgrade. The 5700 XT/2070/Super are really the only cards available at a decent-ish price that would provide a decent upgrade. The reasons I'm going with the 5700 XT are that it offers almost the performance of the 2070 Super for significantly less, and I have a FreeSync monitor that nVidia hasn't seen fit to make compatible, so it makes sense to go with a Radeon card.

I expect to get at least three years out of it, at which point I'll have woken up and taken notice of ray tracing. Until then, I won't miss what I've never had.
 

Morgenstern

Member
Oct 28, 2017
256
Lisa Su of AMD was interviewed by AnandTech and shared a few pieces of information about higher-end GPUs and Zen 3. Nothing shocking, but nice to have some degree of confirmation.
 
Oct 25, 2017
2,937
ViewSonic Unveils New Gaming Monitors for the ELITE Line – The XG-Series Includes 55-inch 4K, G-SYNC Compatibility with Blur Busters Strobe Certification and ELITE Software
New "Blur Busters Approved" Monitor Certification Program Announced at CES 2020 for Display Motion Blur Reduction Modes


The Criteria
  • Improved color quality of motion blur reduction
  • Eliminate strobe crosstalk double-images
  • Adjustable persistence with variable MPRT
  • Additional refresh rates with motion blur reduction
  • Firmware upgradeable
  • Vastly improved motion blur reduction at lower Hz on a high Hz monitor

The first fully approved monitor is the ViewSonic XG270:
27-inch IPS
240Hz
1080p
FreeSync/G-Sync Compatible
Firmware upgradeable
0.6ms MPRT - "Pure XP+" branded blur-reduction mode

For comparison, the Asus 360Hz monitor, which probably does not have blur-reduction technology (unconfirmed so far), reaches 2.8ms MPRT using "brute force" industry-standard LCD panel methods.
A primer on MPRT (Moving Picture Response Time) vs. the typically used GtG response-time measurement.
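For the curious, the arithmetic behind those numbers is easy to sketch (Python; the sample-and-hold relation MPRT ≈ frame time is the standard Blur Busters math, while the strobe duty-cycle back-calculation is just my own illustration):

```python
# On a non-strobed ("sample-and-hold") LCD, each frame stays on screen for
# the whole refresh interval, so persistence (MPRT) ~= frame time.

def sample_and_hold_mprt_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

# The "brute force" 360Hz panel: 1000/360 ~= 2.78 ms, matching the ~2.8 ms above.
print(round(sample_and_hold_mprt_ms(360), 2))  # 2.78

# A strobed backlight only lights each frame for a fraction of the refresh.
# The XG270's 0.6 ms MPRT at 240Hz implies roughly a 14% duty cycle:
duty = 0.6 / sample_and_hold_mprt_ms(240)
print(round(duty * 100, 1))                    # 14.4 (% of each refresh lit)
```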


Time will tell if this certification means anything and will be 100% reputable. BB has been paving the trail for the LCD motion-blur-reduction/1000Hz journey since the LightBoost days, so this new development seems very interesting to me (and is honestly LONG overdue; I've been a convert for ages). Websites like TFTCentral and Rtings have adopted BB's testing methods over the past few years.


Whether it's a manufacturing push or an end-user drive based on BB's work and PC gaming's high-framerate discourse over the past 10 or so years, something seems to be going the right way in the display market for once. There are a ton of 165Hz-240Hz monitors now; no wonder the 300Hz-360Hz territory is opening up.
 