
Tora

The Enlightened Wise Ones
Member
Jun 17, 2018
8,639
That's release day for the 3090. 3080 releases before that

www.nvidia.com

Introducing NVIDIA GeForce RTX 30 Series Graphics Cards

Experience games at stunning levels of detail, at high resolutions and framerates. And further accelerate and improve your experience with NVIDIA DLSS, NVIDIA Reflex, and other groundbreaking technologies that are coming to Fortnite, Call of Duty: Black Ops Cold War, Cyberpunk 2077, Valorant...
Sorry, I mean in terms of being able to order the product. For the 3080, you can order it (from overclockers anyway) from the 17th.
 

MrKlaw

Member
Oct 25, 2017
33,038
Why are there so many almost identical-looking cards from the same manufacturer at slightly different prices? I want to get a shortlist of good, quiet, reliable models before they go on sale so I have a hitlist for when some are inevitably not available. I think anything from £649-749 is an OK price range for the 3080 if the extra is likely to bring quieter fans or better performance

Overclockers isn't showing the Nvidia cards - are they only available direct from Nvidia?
 

kami_sama

Member
Oct 26, 2017
6,998
So if I understand you correctly, you can't preorder it, you just have to be fortunate enough to buy it when it goes live for sale? I imagine that's what it'll be like for online retailers, but what about physical brick and mortar stores? Is that a midnight release type of situation or what? (I'll wait outside my local Microcenter if I have to...)
We have no idea; every retailer is probably going to do its own thing.
 

Tora

The Enlightened Wise Ones
Member
Jun 17, 2018
8,639
Would these measurements be ok in an NZXT H510?

Length: 317.8mm (12.5in) • Height: 120.7mm (4.8in) • Width: 2.5 slot (58mm) (2.3in)

It's the Zotac 3080, for reference. Thanks!
 

Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
Still no date for the reviews embargo?
If it's launch day then it sucks for people wanting to make an informed decision because they'll all be gone by the time you read the reviews.
 

BoxScar

Member
Jul 21, 2020
799
Bought a 1080 mini 2 years ago for almost the same price as an RTX 3070. Breaks my heart.

I might try to hold off for the RTX 3060 as I only have a 2K monitor - when do these normally come out? 6 months after the XX70/80?
 

Deleted member 25042

User requested account closure
Banned
Oct 29, 2017
2,077
Bought a 1080 mini 2 years ago for almost the same price as a RTX 3070. Breaks my heart.

I might try to hold off for the RTX 3060 as I only have a 2K monitor - when do these normally come out? 6 months after the XX70/80?

You bought it two years ago for $500
There are people who just days before the presentation bought $1000+ used 2080 Tis....
 

Burai

Member
Oct 27, 2017
2,084
Bought a 1080 mini 2 years ago for almost the same price as a RTX 3070. Breaks my heart.

I might try to hold off for the RTX 3060 as I only have a 2K monitor - when do these normally come out? 6 months after the XX70/80?

That very much depends on how threatened they feel by AMD. Same with the Ti cards. Could be six months, could be three. Or less.
 

Anddo

Member
Oct 28, 2017
2,854
Would these measurements be ok in an NZXT H510?

Length: 317.8mm (12.5in) • Height: 120.7mm (4.8in) • Width: 2.5 slot (58mm) (2.3in)

It's the Zotac 3080, for reference. Thanks!

You're good. I fit a 2080-ftw3 in that case with an AIO with room to spare. The 2080-ftw3 is 11.90 inches in length.
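For anyone else eyeballing clearances, the quickest sanity check is just comparing the card's length against the case's published GPU clearance. A minimal sketch, using the Zotac dimensions from the post above and the H510's max GPU length as I remember NZXT listing it (an assumption, so verify on their spec page):

```python
# Quick clearance check: Zotac RTX 3080 length vs. NZXT H510 GPU clearance.
# The 381 mm clearance figure is assumed from NZXT's spec page -- verify it.
CARD_LENGTH_MM = 317.8     # Zotac RTX 3080 (from the post above)
CASE_CLEARANCE_MM = 381.0  # NZXT H510 max GPU length (assumed)

margin = CASE_CLEARANCE_MM - CARD_LENGTH_MM
print(f"Spare length: {margin:.1f} mm")
```

Worth also checking slot width separately if you plan a vertical mount.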
 

brain_stew

Member
Oct 30, 2017
4,727
Can you imagine paying £900 for a used 2080ti just one week ago?

Everyone who managed to sell theirs made off like bandits. I was fully expecting a revolution in price:performance and I'm pleased we got it. $500/£450 was my price prediction for the 3070 all along; I just didn't think we'd actually get better than 2080 Ti performance, more like 90-95% at best.
 

Idoru

Member
Oct 28, 2017
175
Do we know yet how release day works in regard to timezones? Seems kinda weird that every language's site just says "17th".
 
Oct 28, 2017
511
Yes, some of the custom AIBs are using 8-pin, and specifically from your picture, the EVGA FTW3 is using 3x 8-pin.

Only the 3090 supports NVLink this gen. Jensen doesn't want you combining that crazy value of two 3080s. (Tbf most games don't work with SLI nowadays)


Just doing the math shows that the 3090 (outside of 8K, and VRAM capacity and bandwidth constraints) is 20% faster. So it can only be at best 20%, likely a little less. But hey, you get to pay $800 more than the 3080! It's not good value for gamers.

I will have nightmares now thank you


I don't see how when they are offering a 2080 Ti for $500, you see this as a bump in prices. You gotta stop focusing on the names and focus on the performance per dollar.

Your gpu dumps hot air into your case. This isn't really different.

No, RTX Titan is slightly faster than a 2080 Ti, and the 3090 is far faster than a 2080 Ti.

It is confirmed on Nvidia's website that Founders Editions are priced at MSRP.

Why not get the 3080? Still a really large upgrade from 2080 Ti, and it isn't kicking you in the nuts with the price.

Only if you don't mind fan blades.

Uh what? How do you see that as exponential? They didn't even start from higher numbers this time, these are the most honest Nvidia's graphs have EVER been.


No, all benchmarks and slides were using a PCIe 3 computer. PCIe 4 may provide some benefit, but we assume that's in the range of 1-5%. Not something to lose sleep over, or to upgrade your whole computer for before you are ready.

You will set your game resolution to 1440p 120Hz or 4K 60Hz, and then touch DLSS settings separately. And yes, you can always use DSR to supersample, same with DLSS, but if you want 120Hz, it's going to be supersampled down from 4K to 1440p. Your monitor doesn't give a shit what the internal resolution is, only the frame presented to it.

It is a card for prosumers upgrading from RTX Titans. It's not a good value as a gaming card. It's at BEST 20% more performance, starting at $800 more. AIB 3090s are gonna be even more than $1500.

Reference board is only 1x HDMI 2.1, some custom boards are adding an extra.

Jensen literally called you out in the video and said it was ok for you to now upgrade.

Nice!

Of course this is true, unless we are all just a simulation, or a hallucination of an alien.

Thank you! I feel so good that I sold it when I did ^_^

Ye of little faith.

I was able to turn off the RGB shit on 2080Ti FTW3, I have no reason to expect you can't this time.

Only the reference board. The FTW3 is a 2.75-slot beast. And the reason is, one, they want you to pay more for the better FTW3, and two, some people can't fit more than a 2-slot card in their case but still want a 3090, without having to go the watercooling route.

100%. Custom AIBs have already shown us that some of them added an extra HDMI 2.1 port.

Going off steam, only 0.88% of people had a 2080 Ti.
So yeah $500 buys you more gaming power than 99% of computers currently have. What a time to be alive :)

This guy is fucking insane and full of shit. Watch Digital Foundry if you want some real information.

Digital Foundry has a limited benchmark right now for the 3080. Watch the video, it's good!

Don't waste your money on the 3090, it is at BEST 20% faster than a 3080 at 4k. Not worth paying an extra $800+, you could be getting a PS5 or a new monitor, or half of an LG OLED with that kinda money. Or a whole lotta games!

This is not possible without completely fucking up the bandwidth of the card. It can only be 10, 11, 12, 20, 22, 24... The 3080 Ti will be a 12GB card, mark my words.

Unless you have an overclocked 10900k, 650w is enough.

The price is more than double for 20% (at best) of the performance at 4k.
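To put numbers on that, here's the back-of-envelope perf-per-dollar math, using the launch MSRPs and this thread's "at best 20% faster" estimate (an assumption, not a measured benchmark):

```python
# Relative performance per dollar, 3090 vs 3080, using launch MSRPs and
# the ~20% uplift estimated in this thread (assumed, not benchmarked).
MSRP_3080 = 699
MSRP_3090 = 1499      # $800 more than the 3080
REL_PERF_3090 = 1.20  # 3080 = 1.00 baseline

ratio = (REL_PERF_3090 / MSRP_3090) / (1.0 / MSRP_3080)
print(f"3090 gives ~{ratio:.0%} of the 3080's performance per dollar")
```

So even granting the full 20%, the 3090 delivers roughly half the performance per dollar.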

3x 8 pin means you can provide more power to the card, this is useful for overclocking.

Since 8K is 4x the pixels of 4K, some games will completely shit the bed at that resolution. And then DLSS makes up for it by being fucking incredible and rendering from 1440p up to 8K.

Yeah, I do feel that 2080 Ti performance on the 3070 for $500 is a bargain. The 3080 has better price/performance than even the 3070, so yeah, I would consider it an even better bargain too. I guess if some people really think RDNA2 is about to drop cards with even better prices, you could think otherwise, but I think they should prepare to be disappointed.



Watch the Digital Foundry Video


www.nvidia.com

Introducing NVIDIA RTX IO: GPU-Accelerated Storage Technology For The Next Generation of Games

Load instantaneously, experience vast worlds with endless views and rich detail, and further improve gameplay by leveraging the power of GeForce RTX 30 Series graphics cards and NVIDIA RTX IO.
Yup, even the one thing consoles had going for them, Nvidia has them beat.
I'm looking forward to someday having PCIE 4.0 SSD in my system.

8K is 4x the pixels of 4k, and 16x of 1080p. NO game is "easy to run" at 8K. This requires tremendous, tremendous amounts of power and bandwidth.
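The pixel math behind those 4x/16x figures checks out:

```python
# Pixel counts behind the "8K is 4x 4K and 16x 1080p" claim.
uhd_8k = 7680 * 4320
uhd_4k = 3840 * 2160
fhd    = 1920 * 1080

assert uhd_8k == 4 * uhd_4k   # 8K is 4x the pixels of 4K
assert uhd_8k == 16 * fhd     # ...and 16x 1080p
print(f"{uhd_8k:,} pixels per 8K frame")  # 33,177,600
```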






This is a common misconception about how much VRAM a game actually uses versus the number you see in MSI Afterburner/RivaTuner, which includes caching.

Take Flight Simulator 2020 for example: if you use the developer FPS overlay, it will tell you exactly how much the game is using.
At 4K, with everything set to Ultra and all sliders to the right except supersampling, FS2020 only uses 8GB of VRAM!

Everything above that is just the game caching assets it may or may not need, and it will fill all that room up with potentially useless data. Which is good, because unused VRAM is wasted VRAM, but you will not have a performance disadvantage with a 10GB card in 4K gaming, even including next-gen titles.

TL;DR: 10GB IS PLENTY FOR 4K next-gen
Shhhhhhhhhh I want less competition for a 3080!
 

MrKlaw

Member
Oct 25, 2017
33,038
is there a good way to measure my current system's power draw with my GTX 1080, so I can properly estimate what a 3080 might do? I'd like to avoid replacing the PSU if possible; it's not modular, and honestly the thought of all those cables just makes me think I'd rather do an entire rebuild with upgraded parts if I have to replace the PSU
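A cheap wall wattmeter (Kill A Watt type) is the most direct way to measure total system draw. For a rough software-side estimate of the change, you can also just compare board-power specs; a minimal sketch using Nvidia's reference numbers (assumed from their spec pages, so treat as ballpark):

```python
# Rough estimate of the extra wall draw from swapping a GTX 1080 for an
# RTX 3080, using Nvidia's reference board-power specs (assumed values).
# For the GPU's live draw you can also run:
#   nvidia-smi --query-gpu=power.draw --format=csv
TDP_GTX_1080 = 180  # W, reference spec
TGP_RTX_3080 = 320  # W, reference spec

extra = TGP_RTX_3080 - TDP_GTX_1080
print(f"Expect roughly +{extra} W at full GPU load")  # +140 W
```

AIB cards with raised power limits will draw more than the reference figure.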
 

Winstano

Editor-in-chief at nextgenbase.com
Verified
Oct 28, 2017
1,828
Oh man, clearance is something I'd not even thought about! I've got a Meshify C with a front mounted 280 radiator... Just looked at the FE and it will *JUST* fit. Are there any dimensions out there for partner cards yet?
 

Kromeo

Member
Oct 27, 2017
17,831
I'm going to stick with 2080ti for at least another year, haven't had it long enough to justify the ridiculous amount it cost (if it ever will)

Hope you don't have stock issues as bad as when I was buying that; it took months after the original release date for it to finally turn up
 

1-D_FE

Member
Oct 27, 2017
8,252
I didn't really follow Turing, but if that had pre-orders, it was the only GPU launch I remember having pre-orders. It's been quite common for orders to go live on the day review NDAs lifted and sales began.
 

mordecaii83

Avenger
Oct 28, 2017
6,858
I didn't really follow Turing, but if that had pre-orders, it was the only GPU launch I remember having pre-orders. It's been quite common for orders to go live on the day review NDAs lifted and sales began.
Yeah, that's what I remember as well. I think Turing had some major stock issues which is why they did pre-orders.
 

Seiniyta

Member
Oct 25, 2017
521
Oof, I hope the 750 watt power supply won't be necessary for the 3080 (it's recommended in the spec sheet). With my new build I got myself a 650 watt >;<
 

Serene

Community Resettler
Member
Oct 25, 2017
52,522
I didn't really follow Turing, but if that had pre-orders, it was the only GPU launch I remember having pre-orders. It's been quite common for orders to go live on the day review NDAs lifted and sales began.

I definitely pre-ordered the 1080ti I have now.
 

laoni

Member
Oct 25, 2017
4,712
Oh man, clearance is something I'd not even thought about! I've got a Meshify C with a front mounted 280 radiator... Just looked at the FE and it will *JUST* fit. Are there any dimensions out there for partner cards yet?

EVGA has given some of their specs, they're all bigger than the FE though
 

Crazymoogle

Game Developer
Verified
Oct 25, 2017
2,879
Asia
Apparently one of the Gigabyte Master cards might exceed 3 slots, but unfortunately I haven't seen anyone willing to give up WxLxH yet; I guess EVGA is the only one so far
 

Jazzem

Member
Feb 2, 2018
2,680
Calculated my current build with RTX 3080 and i7 9700k overclocked to 4800 MHz... (via https://outervision.com/power-supply-calculator)

[attached screenshot of the OuterVision result]

...phew, may just be making it out okay with m'Gold rated Corsair 650M 😅

Won't lie, I'm super anxious about a lot of this, from the PSU concerns to possible stock issues to my motherboard I bought last year not having PCI-E 4.0 to possibly increased heat to argh argh argh argh
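FWIW, a very rough headroom check for a build like that; the CPU and rest-of-system numbers below are my assumptions, not measurements, so take it as a sketch rather than a verdict on your exact setup:

```python
# Crude PSU headroom estimate: 650 W unit, RTX 3080, overclocked i7-9700K.
# CPU and rest-of-system figures are assumptions, not measurements.
PSU_W  = 650
GPU_W  = 320   # RTX 3080 board power (Nvidia reference spec)
CPU_W  = 150   # OC'd 9700K under gaming load (assumed)
REST_W = 75    # board, RAM, drives, fans (assumed)

load = GPU_W + CPU_W + REST_W
print(f"Estimated load {load} W, headroom {PSU_W - load} W")
```

Tight but workable on paper; transient spikes are the real question mark.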
 

laoni

Member
Oct 25, 2017
4,712
Apparently one of the Gigabyte Master cards might exceed 3 slots, but unfortunately I haven't seen anyone willing to give up WxLxH yet; I guess EVGA is the only one so far

videocardz.com

MSI announces GeForce RTX 3090, RTX 3080 and RTX 3070 graphics cards - VideoCardz.com

MSI unveils first custom NVIDIA® GeForce RTX™ 30 Series [Taipei, Taiwan] As a leading brand in True Gaming hardware, MSI is proud to share its take on NVIDIA®'s exciting new GeForce RTX™ 30 series GPUs, with graphics cards that unite the latest in graphics technology, high-performance circuit...

A few of these MSI cards have dimensions but... Not many
 

snausages

Member
Feb 12, 2018
10,337
Increased heat is my main concern now. Pretty sure I can fit one of these things in and my 750W should be fine, I ain't overclocking CPU or anything like that either.

I just don't know how I'm gonna feel about these things when summer comes back around. Maybe I'll limit fps or something lol
 

Serious Sam

Banned
Oct 27, 2017
4,354
Any reason why one would go for an FE card over third party?
FE has looks going for it, if you're into that. Third parties tend to have slightly better performance (sometimes up to 10% over FE) and more efficient, quieter coolers (although this time the FE cooler looks rather sophisticated). Third-party cards also have a better selection of display ports (for example, many have 2x HDMI 2.1; the FE has only 1 such port). Also, third parties include interesting features like dual-BIOS modes (performance cooling, quiet 0rpm cooling), etc.
 

kami_sama

Member
Oct 26, 2017
6,998
Was funny when Jensen basically called out 1080Ti people saying "it's safe to upgrade now guys".
I felt it in my soul lol
My 1070 has served me well, but it's been four years.
What is the advantage of getting a FE vs ASUS or MSI? Is there a performance or cooling hit?

Looking at that 3090 to pair with my 9900k...
Before, yeah, they were the weakest cards in cooling and, due to that, in performance too. But they were also the smallest in a lot of cases.
Now we don't know; it's a completely different cooling solution. At least they're the cheapest, but very likely they'll be hard to come by.
 

Hero_Select

One Winged Slayer
Banned
Oct 27, 2017
2,008
FE has looks going for it, if you are into that. Third parties tend to have slightly better performance (sometimes up to 10% compared to FE), more efficient and quieter coolers (although this time FE cooler looks rather sophisticated). Third party cards also have better selection of display ports (for example many 3rd party cards have 2x HDMI2.1, FE only 1 such port). Also, 3rd parties include interesting features like dual bios mode (performance cooling, quiet 0rpm cooling), etc.
It definitely is a looker. I wonder just how effective that cooling is. I want to wait for benchmarks vs AIBs, but by the time those come around they'll all be sold out. I'll try for an EVGA one on release date, but if I can't get hold of one then I'll just wait a bit.
 
Jun 19, 2020
1,133
Didn't know they were capable of that, but the RTX cards can now do cinematic-quality ray tracing in 3D programs (Blender):