
Elliot Pudge

Member
Oct 25, 2017
1,498
I just found out I voided my EVGA 2080 warranty because I removed a sticker from it. FUCK. I saw a sticker and was like "I don't want that on something that's going to get hot inside my pc." and only just now read it. It broke into pieces (by design, as I now realize) so I can't just put it back.

This was my first GPU upgrade, I had no idea about this...

that's kinda hilarious not gonna lie
 

SapientWolf

Member
Nov 6, 2017
6,565
I just found out I voided my EVGA 2080 warranty because I removed a sticker from it. FUCK. I saw a sticker and was like "I don't want that on something that's going to get hot inside my pc." and only just now read it. It broke into pieces (by design, as I now realize) so I can't just put it back.

This was my first GPU upgrade, I had no idea about this...
The FTC thinks that warranty void stickers are illegal:

https://www.engadget.com/2018/04/11/ftc-warranty-warning/
 

BriGuy

Banned
Oct 27, 2017
4,275
I just found out I voided my EVGA 2080 warranty because I removed a sticker from it. FUCK. I saw a sticker and was like "I don't want that on something that's going to get hot inside my pc." and only just now read it. It broke into pieces (by design, as I now realize) so I can't just put it back.

This was my first GPU upgrade, I had no idea about this...
A 2080 is still a pretty boss card. Just hold onto it and that extra $400 you saved, and the forthcoming 3080 Ti won't sting quite so badly.
 

LQX

Banned
Oct 25, 2017
1,871
I just found out I voided my EVGA 2080 warranty because I removed a sticker from it. FUCK. I saw a sticker and was like "I don't want that on something that's going to get hot inside my pc." and only just now read it. It broke into pieces (by design, as I now realize) so I can't just put it back.

This was my first GPU upgrade, I had no idea about this...
It didn't say "warranty void if removed"?
 
Oct 25, 2017
41,368
Miami, FL
Update: Nothing can make Nier Automata's open world area smooth when panning. Unless it's CPU-bound beyond the capabilities of my OC'd 6700K, or the 2080 Ti is simply not enough, I guess.

gg development on that one.

I admit being disappointed about this, but it's hard to brute-force performance past a poor port.
 

MuckyBarnes

Member
Oct 27, 2017
98
Vancouver, B.C.
I just found out I voided my EVGA 2080 warranty because I removed a sticker from it. FUCK. I saw a sticker and was like "I don't want that on something that's going to get hot inside my pc." and only just now read it. It broke into pieces (by design, as I now realize) so I can't just put it back.

This almost happened to me last night. I was pulling the plastic wrap off and it started to catch the warranty decal.
 

Crooked Rain

Member
Oct 25, 2017
184
It did, but I didn't read it. I thought it was just a random UPC sticker.

I thought warranty void stickers were outlawed?

This almost happened to me last night. I was pulling the plastic wrap off and it started to catch the warranty decal.
Mine came off with the plastic film. I was able to reattach it, but I've heard EVGA is pretty lenient about this kinda stuff. I would probably contact them and ask about it. They might tell you to keep the bits of the sticker you still have, to send in if you need an RMA.
 

Oticon

Member
Oct 30, 2017
1,446
EVGA doesn't give a shit about the sticker unless you return it with actual physical damage. That's why I always buy EVGA: I swap the cooler out for a waterblock.
 

I Don't Like

Member
Dec 11, 2017
14,896
I just found out I voided my EVGA 2080 warranty because I removed a sticker from it. FUCK. I saw a sticker and was like "I don't want that on something that's going to get hot inside my pc." and only just now read it. It broke into pieces (by design, as I now realize) so I can't just put it back.

This was my first GPU upgrade, I had no idea about this...

Pretty sure EVGA doesn't actually do that.
 

Kyle Cross

Member
Oct 25, 2017
8,408
Pretty sure EVGA doesn't actually do that.
I did a quick Google search and saw some people saying it only applies to Asian territories. Regardless, I still have all of the sticker, every single piece, and I put it back together on the card's box along with the serial number sticker I had to take off the card cause they actually put it sideways across the PCIe connector...

I'm basically wanting to take part in the Step-Up program cause I instantly regretted buying a 2080 instead of spending another $300-$400 for a Ti, so I was freaking out that the sticker might've screwed my chances of that.

That said tho, it seems the Step-Up program currently only offers the standard XC, and not the XC Ultra. Do you guys think they'll offer the XC Ultra in the Step-Up program within the next 79 days? I guess I could just do the regular XC, but that'd result in higher temps, wouldn't it?
 

I Don't Like

Member
Dec 11, 2017
14,896
Can't do it tonight but tomorrow it's all coming together

[image: 8ab0132e-ed18-4f8b-au9dq9.jpeg]
 

dreamfall

Member
Oct 25, 2017
5,948
Can anyone try GTA IV with the new cards? Haha. I'm actually curious if performance can be improved in any regard...
 

I Don't Like

Member
Dec 11, 2017
14,896
Can anyone try GTA IV with the new cards? Haha. I'm actually curious if performance can be improved in any regard...

I reinstalled V and will be installing all the high res texture packs for all assets.

Why do you care about IV? I don't recall any issues with it.

I did a quick Google search and saw some people saying it only applies to Asian territories. Regardless, I still have all of the sticker, every single piece, and I put it back together on the card's box along with the serial number sticker I had to take off the card cause they actually put it sideways across the PCIe connector...

I'm basically wanting to take part in the Step-Up program cause I instantly regretted buying a 2080 instead of spending another $300-$400 for a Ti, so I was freaking out that the sticker might've screwed my chances of that.

That said tho, it seems the Step-Up program currently only offers the standard XC, and not the XC Ultra. Do you guys think they'll offer the XC Ultra in the Step-Up program within the next 79 days? I guess I could just do the regular XC, but that'd result in higher temps, wouldn't it?

What? No. They're the same card, but the Ultra has a slightly higher factory overclock because it bins higher. In other words, EVGA runs some tests on the cards and whichever has the highest stable OC gets the Ultra label - a difference that will be negligible in actual real-world performance. By negligible I mean literally like 1-3 fps, if that. I have no idea whether the Ultra will be offered in Step-Up but it doesn't matter - get the XC and be done with it.
 
Last edited:

Vash63

Member
Oct 28, 2017
1,681
What? No. They're the same card, but the Ultra has a slightly higher factory overclock because it bins higher. In other words, EVGA runs some tests on the cards and whichever has the highest stable OC gets the Ultra label - a difference that will be negligible in actual real-world performance. By negligible I mean literally like 1-3 fps, if that. I have no idea whether the Ultra will be offered in Step-Up but it doesn't matter - get the XC and be done with it.

This is not correct at all. The XC Ultra has a completely different 3-slot cooler.
 

Lakeside

Member
Oct 25, 2017
9,214
I reinstalled V and will be installing all the high res texture packs for all assets.

Why do you care about IV? I don't recall any issues with it.



What? No. They're the same card, but the Ultra has a slightly higher factory overclock because it bins higher. In other words, EVGA runs some tests on the cards and whichever has the highest stable OC gets the Ultra label - a difference that will be negligible in actual real-world performance. By negligible I mean literally like 1-3 fps, if that. I have no idea whether the Ultra will be offered in Step-Up but it doesn't matter - get the XC and be done with it.

Pretty sure Ultra gets a considerably larger cooler. A negative in my book.
 

I Don't Like

Member
Dec 11, 2017
14,896
This is not correct at all. The XC Ultra has a completely different 3-slot cooler.

You're right man, good call - the Ultra is a 2.75-slot. I thought the XC and XC Ultra were the same and the Black Edition and blower were the only 2-slot cards.

Still, both cards have 2 fans and, as I mentioned, the performance will be essentially the same.
 

Kyle Cross

Member
Oct 25, 2017
8,408
You're right man, good call - the Ultra is a 2.75-slot. I thought the XC and XC Ultra were the same and the Black Edition and blower were the only 2-slot cards.

Still, both cards have 2 fans and, as I mentioned, the performance will be essentially the same.
Again, you've misunderstood. At no point was I talking about game performance, I was talking about temperature. The XC Ultra has a significantly larger heatsink than the XC, so it runs cooler. I've heard 10-15°C cooler from some.
 

I Don't Like

Member
Dec 11, 2017
14,896
Again, you've misunderstood. At no point was I talking about game performance, I was talking about temperature. The XC Ultra has a significantly larger heatsink than the XC, so it runs cooler. I've heard 10-15°C cooler from some.

No, I understand you weren't talking about game performance. I was the one tying the cooler difference to how it translates to performance.

I personally would never believe that disparity in temps, though. When you're talking about differences in the teens, you're comparing air to either a hybrid or, more likely, a custom loop.
 

Darktalon

Member
Oct 27, 2017
3,265
Kansas
Update: Nothing can make Nier Automata's open world area smooth when panning. Unless it's CPU-bound beyond the capabilities of my OC'd 6700K, or the 2080 Ti is simply not enough, I guess.

gg development on that one.

I admit being disappointed about this, but it's hard to brute-force performance past a poor port.
Are you using the Nier FAR mod?
 

Smokey

Member
Oct 25, 2017
4,175
Played some AC Odyssey full blown. Maxed at 4K with HDR. The combination of it all does produce a great-looking picture, but my CPU is getting eaten up. Luckily I have 6c/12t, but I'm curious how 4c/8t CPUs will handle this.

Also, if you plan to play in HDR, be sure to disable RTSS. It looked really off until someone told me in the Perf thread, but now it looks great.
 

BAW

Member
Oct 27, 2017
1,938
EVGA just increased the price of their 1080 Ti by 20 euros in their EU store. That's hilarious. This is the gen that's increasing the prices of old graphics cards instead of decreasing them. Glad I managed to place my order before the price hike.
 

Darktalon

Member
Oct 27, 2017
3,265
Kansas
Got my MSI Trio 2080 Ti in tonight. I haven't pushed the card to its very limit yet, but I've got +115 core and +900 memory, and it seems plenty stable. That produces about 2010 MHz on the core.

This is essentially a 17.5 TF card.
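
(Back-of-the-envelope, if anyone wants to check that figure - a quick sketch using the published 4,352-core spec for the 2080 Ti and the usual 2 FLOPS/clock FMA assumption:)

```python
# Rough single-precision TFLOPS: shader cores x 2 ops per clock (FMA) x clock.
cuda_cores = 4352        # published RTX 2080 Ti shader core count
core_clock_ghz = 2.010   # the ~2010 MHz overclock above

tflops = cuda_cores * 2 * core_clock_ghz / 1000
print(f"{tflops:.1f} TFLOPS")  # ~17.5
```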

Hits about 70°C using my fan curve, 77°C if set to auto.
Replacing SLI 980 Tis, it's great to be able to use TAA again in everything. It's pretty damn fast in a random assortment of games I've tried. I play at 1440p/144 Hz; very interesting to see many of my games completely CPU-bound now (even with a 6700K at 4.4 GHz).
 

Observable

Member
Oct 27, 2017
946
Maybe a dumb question, but does being used as an eGPU have an effect on the OC capabilities of a card?

I got an EVGA 2080 Ti Ultra, but I can't overclock the memory more than +500 or benchmarks and games start to hang, and the core no more than +98. So did I just lose the silicon lottery, or will it likely overclock better when used in a full system?
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,930
Berlin, 'SCHLAND
You lost the silicon lottery
Maybe a dumb question, but does being used as an eGPU have an effect on the OC capabilities of a card?

I got an EVGA 2080 Ti Ultra, but I can't overclock the memory more than +500 or benchmarks and games start to hang, and the core no more than +98. So did I just lose the silicon lottery, or will it likely overclock better when used in a full system?
He could be having power problems, it may not just be the lottery. What is the PSU there?
 

Observable

Member
Oct 27, 2017
946
He could be having power problems, it may not just be the lottery. What is the PSU there?
I'm using the Razer Core X, which has a 650 W PSU, so I don't think it's a power issue, as the PSU only powers the GPU plus 100 W for the laptop.

Temps on the GPU are around 65-70°C under maximum load, and the speed doesn't go above 1950 MHz on the core. This is with the fans maxed out. The automated 'Scan' tuning feature in Precision X1 recommended +98 MHz; power is at 130%, and voltage doesn't seem to really matter whatever value I set, but I set it to 100.

Don't know if I'd want to go that far, but would it be a valid reason to return the card? It's the first GPU I've bought in 10 years, so I don't know what I should expect.
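
(If you want to sanity-check what the card actually sustains under load, independent of the Precision X1 sliders, you can poll it via NVML. A minimal sketch using the pynvml bindings - assumes `pip install nvidia-ml-py` and a working NVIDIA driver:)

```python
import time

import pynvml  # NVIDIA's NVML bindings: pip install nvidia-ml-py

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first (here: only) GPU

try:
    for _ in range(10):  # sample once per second while a benchmark runs
        clock = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        power = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000  # NVML reports mW
        print(f"core {clock} MHz | {temp} C | {power:.0f} W")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```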
 
Last edited:

Lashley

<<Tag Here>>
Member
Oct 25, 2017
59,918
That was weird. Installed my RTX 2080, replacing my 970.

Windows boots up and acts like it can't detect the GPU, so I uninstall the drivers using DDU after checking GeForce Experience, etc.

Reboot, still no luck. I retry GeForce Experience and get this:

[image: 6Y1fqiR.png]

Anyway, I download the drivers manually, but before they even finish downloading the monitor flickers and the problem has seemingly sorted itself? Haha
 

Deleted member 1067

User Requested Account Closure
Banned
Oct 25, 2017
4,860
That was weird. Installed my RTX 2080, replacing my 970.

Windows boots up and acts like it can't detect the GPU, so I uninstall the drivers using DDU after checking GeForce Experience, etc.

Reboot, still no luck. I retry GeForce Experience and get this:

[image: 6Y1fqiR.png]

Anyway, I download the drivers manually, but before they even finish downloading the monitor flickers and the problem has seemingly sorted itself? Haha
Your display driver likely crashed. That's pretty normal with a new GPU until you get the driver situation sorted out.
 

Vash63

Member
Oct 28, 2017
1,681
I'm curious: Nvidia has talked about how they can use RTAA for 'perfect' AA that will auto-detect things like fine wires that MSAA/FXAA/TAA are bad at. Apparently it can smartly limit the RT to only a small percentage of the screen using edge detection, and use simple FXAA or TAA for the rest of the image.

Has there been any talk of a solution like this for reflections? It would be neat to see one that used SSR as a baseline but 'fell back' to RT whenever the reflection data was invalid due to occlusions and such. I could see this still being problematic / less accurate for specular highlights within the reflections, but I wonder if it would be a lot faster, since far fewer pixels would need to be ray traced.
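
(Nothing announced that I know of, but the decision logic is easy to sketch. A toy Python mock-up of the per-pixel fallback - every function and field here is a made-up stand-in for illustration, not any real engine API:)

```python
def trace_screen_space_ray(pixel):
    """Stub SSR: a real version marches the depth buffer along the reflected
    ray and fails when the ray leaves the screen or the hit is occluded."""
    u, v = pixel["reflect_uv"]
    on_screen = 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0
    return (on_screen and not pixel["occluded"]), (0.2, 0.4, 0.8)

def trace_rt_ray(pixel):
    """Stub hardware ray trace: only invoked for the pixels SSR rejected."""
    return (0.3, 0.3, 0.3)

def reflection_color(pixel):
    valid, color = trace_screen_space_ray(pixel)
    if valid:
        return color             # cheap path: reuse on-screen color data
    return trace_rt_ray(pixel)   # expensive path: true ray for this pixel

# Only the second pixel pays for a real ray (its reflection left the screen):
pixels = [
    {"reflect_uv": (0.5, 0.5), "occluded": False},
    {"reflect_uv": (1.3, 0.5), "occluded": False},
]
print([reflection_color(p) for p in pixels])
```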
 

Steel

The Fallen
Oct 25, 2017
18,220
So, I get back from my trip and the only 2080 Tis I can find are $2k+. And the 1080 Tis I was looking at got more expensive. Thank you, tariffs, for murdering supply and cost. So I said screw it and ordered a 2080. Unsurprisingly, it's not gonna arrive until Wednesday. Hoping drivers make it better over time.
 

icecold1983

Banned
Nov 3, 2017
4,243
I'm using the Razer Core X, which has a 650 W PSU, so I don't think it's a power issue, as the PSU only powers the GPU plus 100 W for the laptop.

Temps on the GPU are around 65-70°C under maximum load, and the speed doesn't go above 1950 MHz on the core. This is with the fans maxed out. The automated 'Scan' tuning feature in Precision X1 recommended +98 MHz; power is at 130%, and voltage doesn't seem to really matter whatever value I set, but I set it to 100.

Don't know if I'd want to go that far, but would it be a valid reason to return the card? It's the first GPU I've bought in 10 years, so I don't know what I should expect.
1950 is fine, honestly. People who win the lottery only get like 2050, which is only going to improve actual game performance by a couple %.
 

Lakeside

Member
Oct 25, 2017
9,214
I'm using the Razer Core X, which has a 650 W PSU, so I don't think it's a power issue, as the PSU only powers the GPU plus 100 W for the laptop.

Temps on the GPU are around 65-70°C under maximum load, and the speed doesn't go above 1950 MHz on the core. This is with the fans maxed out. The automated 'Scan' tuning feature in Precision X1 recommended +98 MHz; power is at 130%, and voltage doesn't seem to really matter whatever value I set, but I set it to 100.

Don't know if I'd want to go that far, but would it be a valid reason to return the card? It's the first GPU I've bought in 10 years, so I don't know what I should expect.

It sounds fine to me; I certainly wouldn't want to return something because it didn't exceed the manufacturer's specs by enough.

I'm intrigued by the eGPU. Do you have a way to compare it to a similar enough system with an internal GPU, so you could estimate the Thunderbolt performance penalty?
 

Observable

Member
Oct 27, 2017
946
1950 is fine, honestly. People who win the lottery only get like 2050, which is only going to improve actual game performance by a couple %.
OK, I guess that's alright then; it's just the memory clock. I see a lot of people with +800/+1000 where I can't set it above +500 MHz. Don't know how much of a performance difference that makes, though. I guess I hoped that all cards in the Ultra version would be relatively similar.

It sounds fine to me; I certainly wouldn't want to return something because it didn't exceed the manufacturer's specs by enough.

I'm intrigued by the eGPU. Do you have a way to compare it to a similar enough system with an internal GPU, so you could estimate the Thunderbolt performance penalty?
Not to a similar system, but I tested it in the regular version of 3DMark Time Spy and got around 12,800 combined with my MacBook Pro i9. I saw some people online report 13,800-14,200, so I guess that leaves around a 10% performance cost. The i9 has some problems with thermal throttling under Windows with Boot Camp, so I had to limit its power usage and Turbo Boost, but after that it's pretty much perfect for gaming. It's really surprisingly plug-and-play. Tested Forza Horizon and it's playable at 3440x1440 with a mix of High, Ultra and Extreme at 60 FPS.

Then when I need to go, I just shut it down, take out the single cable, and it's a normal laptop again.
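
(For what it's worth, the penalty implied by those scores works out like this - just arithmetic on the numbers quoted above:)

```python
# Thunderbolt penalty implied by the Time Spy scores quoted above.
egpu_score = 12800
internal_scores = (13800, 14200)  # range reported online for internal setups

for internal in internal_scores:
    penalty = (internal - egpu_score) / internal
    print(f"vs {internal}: {penalty:.0%} slower")
# vs 13800: 7% slower
# vs 14200: 10% slower
```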
 
Last edited:

icecold1983

Banned
Nov 3, 2017
4,243
OK, I guess that's alright then; it's just the memory clock. I see a lot of people with +800/+1000 where I can't set it above +500 MHz. Don't know how much of a performance difference that makes, though. I guess I hoped that all cards in the Ultra version would be relatively similar.

Combined, it's probably 5-6% faster than yours in best-case scenarios. I wouldn't worry about it. If you're getting 60 fps, they may get 63.
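
(Rough math behind that guess - assumes the 2080 Ti's stock 14 Gbps GDDR6, i.e. a 7000 MHz base clock in the offset tools, and optimistically linear scaling:)

```python
# Best-case deltas implied by the overclock numbers in this exchange.
core_delta = 2050 / 1950 - 1                  # lucky core vs. this card
mem_delta = (7000 + 1000) / (7000 + 500) - 1  # +1000 offset vs. +500

print(f"core +{core_delta:.1%}, memory bandwidth +{mem_delta:.1%}")
# core +5.1%, memory bandwidth +6.7%; games are rarely bound purely by
# either, hence the ~5-6% (60 -> 63 fps) combined estimate.
```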
 

Sqrt

Member
Oct 26, 2017
5,880
OK, I guess that's alright then; it's just the memory clock. I see a lot of people with +800/+1000 where I can't set it above +500 MHz. Don't know how much of a performance difference that makes, though. I guess I hoped that all cards in the Ultra version would be relatively similar.


Not to a similar system, but I tested it in the regular version of 3DMark Time Spy and got around 12,800 combined with my MacBook Pro i9. I saw some people online report 13,800-14,200, so I guess that leaves around a 10% performance cost. The i9 has some problems with thermal throttling under Windows with Boot Camp, so I had to limit its power usage and Turbo Boost, but after that it's pretty much perfect for gaming. It's really surprisingly plug-and-play.
eGPU bandwidth (TB3, I guess) is considerably lower than x16 PCIe 3.0; that will also affect the peak GPU performance of your system.
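
(The gap is big on paper. Nominal interface numbers below - real enclosures see even less once protocol overhead and any DisplayPort traffic share the link:)

```python
# Nominal host-link bandwidth: desktop PCIe 3.0 x16 vs. what TB3 tunnels.
per_lane = 984.6 / 1000        # GB/s per PCIe 3.0 lane (after 128b/130b)

pcie3_x16 = 16 * per_lane      # ~15.8 GB/s: a normal desktop slot
tb3_pcie3_x4 = 4 * per_lane    # ~3.9 GB/s: the x4 link TB3 carries

print(f"PCIe 3.0 x16: {pcie3_x16:.1f} GB/s, TB3 (x4): {tb3_pcie3_x4:.1f} GB/s")
# Roughly a 4x difference before any Thunderbolt protocol overhead.
```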
 

Observable

Member
Oct 27, 2017
946
eGPU bandwidth (TB3, I guess) is considerably lower than x16 PCIe 3.0; that will also affect the peak GPU performance of your system.
Yes, but that's OK; it's a stopgap until I build a new system sometime next year. And if it's really just the 10% difference, it's perfectly fine; it still outperformed every internal 1080 Ti when I checked the Time Spy rankings.