
PHOENIXZERO

Member
Oct 29, 2017
12,063
If you have another card to fall back on or are okay with getting a cheap one for the next few months, I would absolutely sell a 2070, 2080 and especially a 2080Ti right now while prices are still high and put the money towards an Ampere or RDNA2 card in the Fall.

Thinking of splurging on an LG 48CX in the near future, what are the odds that the 3070 will be able to do most stuff at 4K60, potentially with RT on? I don't fancy pushing my luck with wife approval on almost £2,500 worth of purchases!!

The 3070 is supposedly going to be within 5-10% of the 2080Ti while having better RT performance.
 

Winstano

Editor-in-chief at nextgenbase.com
Verified
Oct 28, 2017
1,828
The 3070 will probably be similar to the 2080 Ti in power. From what I understand, that should mean most current-gen games can run at native 4K60 without RT, maybe with some smart setting adjustments, and with G-Sync, 50-60 fps would probably still be fine. With RT you're going to need to make sacrifices, but if the game has DLSS 2.0 you could just activate Quality mode and likely still be fine - though if it has that, you should activate Quality mode either way: it matches or surpasses native 4K while giving roughly a 70% performance bump.
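As a rough sanity check on that Quality mode claim, here's some back-of-envelope math; the per-axis scale factors are the commonly cited DLSS 2.0 presets, not anything stated in this thread:

# Back-of-envelope: DLSS 2.0 internal render resolutions at a 4K output.
# The per-axis scale factors are the commonly cited presets (assumption, not official docs).
output = (3840, 2160)
presets = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

for name, scale in presets.items():
    w, h = round(output[0] * scale), round(output[1] * scale)
    pixel_ratio = (output[0] * output[1]) / (w * h)
    print(f"{name}: renders ~{w}x{h}, {pixel_ratio:.2f}x fewer pixels than native 4K")

# Quality mode renders ~2560x1440, about 2.25x fewer pixels than native 4K -
# roughly where a ~70% fps uplift can come from once the reconstruction
# overhead is paid.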
If you have another card to fall back on or are okay with getting a cheap one for the next few months, I would absolutely sell a 2070, 2080 and especially a 2080Ti right now while prices are still high and put the money towards an Ampere or RDNA2 card in the Fall.



The 3070 is supposedly going to be within 5-10% of the 2080Ti while having better RT performance.

Hmmm... Looks like I'm doing the dishes for a bit longer then...
 

dgrdsv

Member
Oct 25, 2017
11,843
So DLSS 8K? Bring it. This sort of thing is why DLSS would absolutely be worth adding even to games that are already easy to run at high framerates.
Well, if a game can render in 4K with DLSS reconstruction to 8K, then yeah. So far, though, most games with DLSS can't really hit 60 in 4K, especially with RT enabled, which is why DLSS 2X has become something of vaporware.
 

BreakAtmo

Member
Nov 12, 2017
12,824
Australia
Well, if a game can render in 4K with DLSS reconstruction to 8K, then yeah. So far, though, most games with DLSS can't really hit 60 in 4K, especially with RT enabled, which is why DLSS 2X has become something of vaporware.

Yeah, I was thinking it would be pretty cool to have games like CSGO or Overwatch get DLSS. Really though, everything should have it, since it benefits both the high and low ends. Apparently, even a 2060 can manage 4K60 in Death Stranding if it's overclocked and you use Performance Mode. I imagine a 3080Ti should be able to do the DLSS 8K trick in Death Stranding, though I would probably prefer 120fps.
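For a sense of scale on that 8K idea, just back-of-envelope pixel counting (nothing here is sourced from the thread):

# Rough pixel math for the "render at 4K, reconstruct to 8K" idea.
uhd_4k = 3840 * 2160    # ~8.3 million pixels
uhd_8k = 7680 * 4320    # ~33.2 million pixels
print(uhd_8k / uhd_4k)  # 4.0 - an 8K output has 4x the pixels of a 4K output

# So 4K -> 8K reconstruction is the same 4:1 output-to-input pixel ratio that
# Performance mode uses at a 4K output; plausible for a bigger Ampere card if
# the tensor-core upscaling cost stays manageable at 33 MP.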
 

SharpX68K

Member
Nov 10, 2017
10,514
Chicagoland
Introducing 2nd Generation IPU Systems for AI at Scale (www.graphcore.ai)
Graphcore's second-generation IPU platform has greater processing power, more memory and built-in scalability for handling extremely large Machine Intelligence workloads.

Graphcore Takes on Nvidia with Second-Gen AI Accelerator
www.eetimes.com/graphcore-takes-on-nvidia-with-second-gen-ai-accelerator/

British startup Graphcore has unveiled its second-generation IPU (intelligence processing unit), the Colossus Mark 2, an enormous 59.4 billion-transistor device designed to accelerate AI workloads in the data center. The company also launched a 1U server blade for data centers which incorporates four of the Colossus Mark 2 chips and allows scalability to supercomputer levels. This new offering is designed to put Graphcore in firm competition with market leader Nvidia for large-scale data center AI acceleration.
Graphcore's Mark 1 device was released in 2018. Mark 2, which has migrated from TSMC 16nm to TSMC 7nm, achieves 250 TFlops with 1472 independent processor cores. The new chip has three times the amount of RAM – 900MB on-chip, up from 300MB in the previous version. Graphcore's figures have chip performance up roughly 3-4X compared to the Mark 1 overall; versus eight of the C2 PCIe cards (each with two Mark 1 IPUs), eight IPU Machines (each with four Mark 2 IPUs) perform BERT training 9.3x faster, BERT-3Layer inference 8.5x faster and EfficientNet-B3 training 7.4x faster.
The IPU Machine (part number M2000) is a 1U server blade with four Colossus Mark 2 chips on it, offering a Petaflop of AI compute at FP16 precision.
"This is really the product that Graphcore has been working on since we started the company and that we have wanted to produce," said Graphcore CEO Nigel Toon. "The innovations are more than just going from TSMC 16nm to 7nm, the other innovations such as on chip RoCE and new AI number format plus more all add up. It keeps Graphcore ahead of Nvidia's latest Ampere [offering] so it's important timing for Graphcore," said Michael Azoff, Chief Analyst, Kiasco Research.
Toon showed a side-by-side comparison of what Graphcore offers at a similar price point versus Nvidia's DGX-A100 system; launched a couple of months ago, the DGX-A100 is powered by eight state-of-the-art 7nm Ampere A100 GPUs. A similar budget will buy you eight IPU Machines (32 IPU chips in total), occupying 8U compared to the DGX-A100's 6U. But Graphcore's figures have their system offering 12x the FP32 (AI training) compute and 3x the FP16 compute. It would also offer 10x the memory, allowing much bigger models to be supported. Overall, Graphcore believes such a system would offer a 16x performance advantage when training EfficientNet.
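A quick back-of-envelope check of the "3x the FP16 compute" claim; the A100's ~312 TFLOPS dense FP16 tensor figure is NVIDIA's published spec and is my own assumption here, not something stated in the article:

# Sanity check of Graphcore's "3x the FP16 compute" comparison.
ipu_mk2_fp16_tflops = 250                      # per Colossus Mark 2 chip (from the article)
m2000_fp16_tflops = 4 * ipu_mk2_fp16_tflops    # one 1U IPU Machine = 4 chips = ~1 PFLOP
graphcore_system = 8 * m2000_fp16_tflops       # eight IPU Machines, as in the comparison

a100_fp16_tflops = 312                         # NVIDIA's dense FP16 tensor spec (assumption)
dgx_a100 = 8 * a100_fp16_tflops                # a DGX-A100 carries eight A100s

print(graphcore_system / dgx_a100)             # ~3.2 - in line with the claimed ~3x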

"[This would translate to] either much lower cost, less power or faster training, whichever parameter is most important for customers," Toon said.

Graphcore's comparison of what they offer for around the same price as an Nvidia DGX-A100 (Source: Graphcore). Note that an Nvidia DGX-A100 is 6U compared to 8x Graphcore IPU Machines at 8U.

"The second generation Graphcore IPU is impressive from a performance standpoint with three times more memory, but I think easy scalability is perhaps its greatest feature," said Karl Freund, senior analyst for AI at Moor Insights & Strategy. "The new fabric extends processing to literally thousands of IPUs, while the new IPU Machine enables a plug-and-play scalable infrastructure. With this new product, Graphcore may now be first in line to challenge Nvidia for data center AI, at least for large-scale training."

More at the link https://www.eetimes.com/graphcore-takes-on-nvidia-with-second-gen-ai-accelerator/
 

dgrdsv

Member
Oct 25, 2017
11,843
not at all, the RTX Titan is only like 10% better than the 2080 Ti for games.

RTX Titan is for compute loads because it has a lot more tensor cores
Its tensor core advantage is the same as its general SM advantage - so about the same 10%. Its biggest advantage is the VRAM, of which it has 24 GB.
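For reference, a quick ratio check using the public TU102 unit counts (quoted from memory, not from this thread):

# Titan RTX vs RTX 2080 Ti - both TU102, so tensor cores scale with SM count.
titan_rtx   = {"sms": 72, "tensor_cores": 576, "vram_gb": 24}
rtx_2080_ti = {"sms": 68, "tensor_cores": 544, "vram_gb": 11}

print(titan_rtx["sms"] / rtx_2080_ti["sms"])                    # ~1.06
print(titan_rtx["tensor_cores"] / rtx_2080_ti["tensor_cores"])  # ~1.06
print(titan_rtx["vram_gb"] / rtx_2080_ti["vram_gb"])            # ~2.2

# The SM and tensor-core advantages are the same ~6% (a bit more in practice
# with the Titan's higher clocks), while the VRAM advantage is more than 2x.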
 

Tovarisc

Member
Oct 25, 2017
24,396
FIN
There is a rumor that new Nvidia high-end GPUs will require new power connectors, so if you want to keep your old PSU, you'd have to utilize some sort of an adapter (which is naturally suboptimal for power lines).

From that rumor
This design may also just be specific to NVIDIA's reference 'Founders Edition' models while AIBs will still ship their custom designs with traditional mini-fit power connectors.
It seems like users with good PSUs can still plug in two 6-pin power connectors given they meet the high efficiency required to sustain an enthusiast-grade graphics card. But then again, anyone running a $500 US+ graphics card would already be rocking a pretty decent quality PSU.

So it looks like the addition of this new 12-pin power connector won't require any big changes and you will indeed be able to run the NVIDIA GeForce RTX 30 series Ampere gaming graphics cards without having to worry about buying a new PSU that falls in the standard of your brand new graphics card.

So their TL;DR is "Have a decent PSU? You're fine", which is a fair assumption for now, as:

A) It's a rumor

B) There is no way that NV would make a sudden proprietary change to the connector(s) without taking into account the capabilities of current PSUs
 

Crazymoogle

Game Developer
Verified
Oct 25, 2017
2,878
Asia
Igor's Lab has a pretty interesting piece on this. Does Nvidia want to change the cable standard and possibly the AWG to ensure better max wattage? Yes. But an adapter is guaranteed in the box of products that use it. Really, all we need to know is:
  • Probably 3080/ti and maybe Founders Edition only
  • Will come with an adapter in the box
  • Requires 2x8pin, not 2x6pin = they need more than 10 non-ground pins anyway
 

dgrdsv

Member
Oct 25, 2017
11,843
So these cards are rumored to launch as early as next month?
Depends on what you mean by "these cards".
Some GA102 based cards will likely launch over the next couple of months - I'm thinking about 3070, 3080 and a new Titan possibly.
3090 and/or 3080Ti will probably come around Navi 21 launch time so closer to Oct-Nov. 3070Ti possibly as well depending on what AMD will do on a cut down Navi 21.
3060 and 3060Ti probably won't launch this year though.
 

lyr1c

Member
Aug 8, 2019
17
Delayed launches of different tiers are really annoying; I'm still trying to figure out what I should do. Obviously I want to maximize performance and the price/performance ratio while minimizing the time without a "good" GPU. Right now I have an MSI 2070 Super Gaming Trio X. Checking prices on eBay (Germany), I assume I could get something between 450-500€ if I sold the card. When the card released I bought it for 500€ instead of the normal retail price of 630€ because of a pricing error, so I would get most of my money back. I have an old AMD 7950 to hold me over until the Ampere launch.

I know it's almost impossible to accurately predict the prices of the new GPUs, especially because of the competition with AMD / RDNA 2, which could potentially force NVIDIA to lower prices. Still, are there any guesses about the launch price of the 3080 / 3080 Ti? I think the best option would be to sell the 2070 Super now and get the 3080 when it releases. On the other hand, assuming the new AMD GPUs will be good, I feel like the 3080 Ti could have a better price/performance ratio than the 2080 Ti did, to counter AMD's offering. I don't really want to buy a 3080 when there is a potentially much stronger 3080 Ti launching only a few months later while also not being that much more expensive.
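If it helps to frame that decision, here's a toy calculation; every number in it (resale value, launch prices, relative performance) is a placeholder guess, not a leak or a prediction:

# Toy upgrade calculator - all figures below are hypothetical placeholders.
resale_2070_super = 475   # assumed eBay resale value, in €

# name: (guessed launch price in €, guessed performance relative to a 2070 Super)
options = {
    "RTX 3080":    (800, 1.7),
    "RTX 3080 Ti": (1100, 2.0),
}

for name, (price, rel_perf) in options.items():
    net_cost = price - resale_2070_super
    print(f"{name}: net upgrade cost {net_cost}€, "
          f"relative perf per 100€ of net cost: {rel_perf / net_cost * 100:.2f}")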
 

low-G

Member
Oct 25, 2017
8,144
There is a rumor that new Nvidia high-end GPUs will require new power connectors, so if you want to keep your old PSU, you'd have to utilize some sort of an adapter (which is naturally suboptimal for power lines).

Do new PSU specs EVER come from hardware manufacturers? Seems weird to invent a connector with new specs that may not be viable for PSU manufacturers.

I guess they'd have to be working with PSU makers at this point. And if they're releasing Ampere with this power interface, the PSUs must already be in development as well...

The power spec is different, so while it sounds like 2x 6-pin would function, I wonder about people overclocking the higher-end cards.

Also, seriously, if they're expanding the wattage requirement this much AND moving down a node AND not introducing dramatic new features like RTX, this thing had better deliver >150% of the performance per price point and model tier. I'd expect 200%.
 

Tovarisc

Member
Oct 25, 2017
24,396
FIN
Do new PSU specs EVER come from hardware manufacturers? Seems weird to invent a connector with new specs that may not be viable for PSU manufacturers.

I guess they'd have to be working with PSU makers at this point. And if they're releasing Ampere with this power interface, the PSUs must already be in development as well...

Those are collaborative developments within the industry, between many companies and parties, to create a unified standard.

If NV is running solo on this, then this connector will never go anywhere and will just be proprietary fuckery.
 

low-G

Member
Oct 25, 2017
8,144
Those are collaborative developments within the industry, between many companies and parties, to create a unified standard.

If NV is running solo on this, then this connector will never go anywhere and will just be proprietary fuckery.

Yeah they'd probably have to be working with AMD at this point too...
 

seroun

Member
Oct 25, 2018
4,464
Huh. What was the selling point on the founders cards then?

All of the design of Founders Edition cards (PCB, chip layout, and cooler) is done by NVIDIA; the design of the rest of the cards (PCB, chip layout, and cooler) is done by the partners.

Why do they exist, and why are they more expensive than reference cards while being worse? I have no idea.
 

Otheradam

Member
Nov 1, 2017
1,224
I seem to recall founders cards being actually available 1-2 months earlier than partner cards. I thought that was when Nvidia made most of their money selling their cards.
 

Pyros Eien

Member
Oct 31, 2017
1,974
I seem to recall founders cards being actually available 1-2 months earlier than partner cards. I thought that was when Nvidia made most of their money selling their cards.
For the 2080 Ti at least the releases were simultaneous, but the early stocks were really low and it was hard to find for about 2 months. Not too sure about the other models since I was mostly tracking that one personally. Part of it was because the Founders Edition release was delayed a couple of weeks, but MSI and EVGA had versions out right away.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
This is what I remember too. It was frustrating...

I think for the 10 series that was the case. For the 20 series, yeah, I think some AIBs did launch right away, but stocks in general were pretty low and it was hard to find cards for a while. Sort of a paper launch, but that's pretty common with GPUs. Aftermarket 5700 XTs took a couple of months to come out as well.
 

tokkun

Member
Oct 27, 2017
5,399
Those are collaborative developments within the industry, between many companies and parties, to create a unified standard.

If NV is running solo on this, then this connector will never go anywhere and will just be proprietary fuckery.

Nvidia is 70-80% of the discrete GPU market. As long as it is not impossible to implement, PSU makers will support it.
 

dgrdsv

Member
Oct 25, 2017
11,843
Also seriously, if they're expanding the wattage need this much AND moving down a node AND not introducing dramatic new features like RTX, this thing had better be >150% performance per price and model. I'd expect 200%.
I doubt that this is about wattage, if it turns out to be true. NV hasn't even been hitting the upper limit of PCIe 3 (300W) during the last several generations. There are also some rumblings of PCI-SIG standardizing the dual 8-pin config (up to 375W) in the PCIe 4 spec. I don't see why they would need more than that on a consumer graphics card.
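For anyone wondering where those ceilings come from, the usual per-connector budgets under the PCIe CEM spec (standard figures, quoted from memory) add up like this:

# Standard PCIe board power budgets (approximate, per the CEM spec).
slot_w = 75          # PCIe x16 slot
six_pin_w = 75       # 6-pin auxiliary connector
eight_pin_w = 150    # 8-pin auxiliary connector

print(slot_w + eight_pin_w + six_pin_w)   # 300 W - the classic "300W" ceiling
print(slot_w + 2 * eight_pin_w)           # 375 W - the dual 8-pin config mentioned above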
 

low-G

Member
Oct 25, 2017
8,144
I doubt that this is about wattage, if it turns out to be true. NV hasn't even been hitting the upper limit of PCIe 3 (300W) during the last several generations. There are also some rumblings of PCI-SIG standardizing the dual 8-pin config (up to 375W) in the PCIe 4 spec. I don't see why they would need more than that on a consumer graphics card.

I agree. I expect they may have longer-term plans to increase the wattage, but I don't expect them to jump to such a high wattage in a single gen.
 

zerocalories

Member
Oct 28, 2017
3,231
California
I doubt that this is about wattage, if it turns out to be true. NV hasn't even been hitting the upper limit of PCIe 3 (300W) during the last several generations. There are also some rumblings of PCI-SIG standardizing the dual 8-pin config (up to 375W) in the PCIe 4 spec. I don't see why they would need more than that on a consumer graphics card.

Mo powah baybee
 

smocaine

Member
Oct 30, 2019
2,010
Anyone else watch the new 'Moore's Law is Dead' video?

Sounds like Sept. is gonna be a crazy good time. Sell sell sell!
 
Oct 25, 2017
2,932
Hard to feel any particular way about it right now. Seems like he hasn't walked back most of the things he talked about previously.

Watching "new" 2020 hardware launches has been weak: comet lake being better but hotter, wadding through amd b450 mobo drama, B550 taking forever to come out and then being out of stock constantly, XT CPUs missing the landing, Threadripper PRO dropping out of the sky for Big Business Buck-o's Only ™, AMD Desktop APUs teasing "best desktop gaming chips!1 ever, maybe?!?!?", Zen 3 being reconfirmed like 6 freaking times in the space of 3 months, etc.

NVIDIA's leaking ship is just another tire on the endless fire. In the grand scheme it's rare, but in 2020 that's just how things are.
 