
pswii60

Member
Oct 27, 2017
26,649
The Milky Way
How are you guys getting 120Hz on Xbox One X with the C9?

When I turn on VRR it's choppy, and when I switch from 60Hz to 120Hz and disable VRR it seems like 60fps in RDR2.

Can anyone share some feedback?
Sounds like it is switching to a different picture mode on your TV - check your TV settings when it changes and see if motion interpolation is switched back on.
 

kc44135

Member
Oct 25, 2017
4,720
Ohio
I have the same reservations that kc44135 mentioned, and while the video you linked was a good watch (I saw this last week), I thought then as I think now, "That's great - 6 months of heavy use isn't a problem!" followed by, "But for $2,000 I hope to have this TV for a very long time. How will burn-in look on moderate usage over... 3 years? 5 years? 10 years?" and that's where I get a little hesitant about pulling the trigger on OLED.
Yeah, this is my feeling on it too. When I buy a TV, I want it to last me at least 5-10 years. A lot of these tests are 6 months to a year. If I'm gaming 95% of the time on the TV, and playing games like GTA V, Skyrim, etc. for hundreds upon hundreds of hours potentially, is an OLED gonna last me that long? That's where I get iffy on it, as much as I love the picture these displays output.
 

TacoSavage

Banned
Nov 6, 2017
89
I did some extended testing with the 65" C9 and Xbox One X, with 120Hz on and VRR & game mode OFF, and it makes a difference on SOME games. RDR2 is the game where I noticed a really big difference, to the point where it feels like 60fps.
 

Scott Lufkin

Member
Dec 7, 2017
1,447
TV could last way longer.

The point of my comment was that yeah, it could last way longer. It could also start showing signs of burn-in at the 9-month mark during his test, which for most folks, at let's say 8-10 hours of TV a day (let alone gaming), could still mean burn-in at 3 or 4 years. That's not nearly long enough. No way I can talk my wife into dropping $2,000 on another TV just 3 or 4 years from now. And yeah, I'm speculating - that's all we are all doing, even the guy performing the test. The truth is OLEDs haven't been around long enough to know what an OLED TV will do after 5 years of moderate use. It's just too much to invest in for me right now, much as I'd love to have an OLED TV.
 

Ferrs

Avenger
Oct 26, 2017
18,829
The point of my comment was that yeah, it could last way longer. It could also start showing signs of burn-in at the 9-month mark during his test, which for most folks, at let's say 8-10 hours of TV a day (let alone gaming), could still mean burn-in at 3 or 4 years. That's not nearly long enough. No way I can talk my wife into dropping $2,000 on another TV just 3 or 4 years from now. And yeah, I'm speculating - that's all we are all doing, even the guy performing the test. The truth is OLEDs haven't been around long enough to know what an OLED TV will do after 5 years of moderate use. It's just too much to invest in for me right now, much as I'd love to have an OLED TV.

That's fair. Of course there's no point in buying an OLED if you're not going to be able to enjoy it with the fear of possible burn-in.
 

Pargon

Member
Oct 27, 2017
11,971
Just a heads up, I was fiddling around with the upscaling feature (under sharpening) in the newer NV drivers. This creates new resolutions that are upscaled to 4K. Once this is done, you will no longer have 120Hz @ 1440p. Even after disabling the upscaling option, those resolutions remained and I couldn't get 120Hz back @ 1440p. (I had to reinstall the drivers and choose the clean-install option.)
Enabling that option also sets it to scale on the GPU in the "adjust desktop size and position" section of the Control Panel:
nv-display-scaling-7dk9o.png


Disabling it in the "manage 3D settings" option doesn't seem to change that back to display scaling.
 

datamage

Member
Oct 25, 2017
913
Enabling that option also sets it to scale on the GPU in the "adjust desktop size and position" section of the Control Panel:
nv-display-scaling-7dk9o.png


Disabling it in the "manage 3D settings" option doesn't seem to change that back to display scaling.
Is there a general consensus on letting the display handle the scaling vs the GPU? Always been curious.
 

Pargon

Member
Oct 27, 2017
11,971
Is there a general consensus on letting the display handle the scaling vs the GPU? Always been curious.
GPU scaling is fast but low quality (bilinear) though they just added a new scaling method to Turing GPUs (16 and 20 series). It tends to make things a lot more seamless when switching resolutions since the signal going to the display never changes.
But if the GPU upscales, that means the output is always going to be 4K - which locks you to 60Hz for now, rather than allowing 1440p120. With HDMI 2.1 GPUs it won't matter since they can output 4K120.
So you want the display to handle scaling for now.
 

zerocalories

Member
Oct 28, 2017
3,231
California
Hmm, the Samsung QLEDs seem to be just as good as OLED, or am I missing something?
I had a FALD monitor before (PG27UQ) but returned it because of the haloing effect; since then I've developed the feeling that HDR without some type of local dimming is not worth it. An OLED can obviously turn pixels all the way off, but how good are QLEDs for PC gaming? How good are the blacks? How is the response time? Will the Samsung Q series have HDMI 2.1? I'm basically asking OLED vs QLED, because it seems OLED would be the best TV if it weren't for burn-in.
 

zerocalories

Member
Oct 28, 2017
3,231
California
Fuck it. Taking home a 55" C9 open box from Best Buy. Seems like it was barely used; the remote is still in its factory packaging. $150 off. If I really don't like it I'll return it for a newer one.

Anything I need to be aware of to make sure it's a TV worth keeping? Like any dead pixel tests?

Also, anything I need to be aware of in the Nvidia control panel?

(edit: I got them to take another $100 off; the total was $1,333 USD for a pretty pristine 55" C9)
 
Last edited:

datamage

Member
Oct 25, 2017
913
Fuck it. Taking home a 55" C9 open box from Best Buy. Seems like it was barely used; the remote is still in its factory packaging. $150 off. If I really don't like it I'll return it for a newer one.

Anything I need to be aware of to make sure it's a TV worth keeping? Like any dead pixel tests?

Also, anything I need to be aware of in the Nvidia control panel?
For checking dead pixels, I usually use one of the following sites:
http://www.deadpixeltest.com/

As for other tests, if you wanted to go down that rabbit hole, you can look up 5-20% gray test videos on YouTube to check for banding.
In the NV CP I conceded and just stuck with 4:2:2/12-bit/limited, as I got tired of switching depending on what I was doing. (I couldn't see a difference between 10bpc and 12bpc.) If someone can chime in further on this, that would be great.
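If you'd rather not depend on test websites or YouTube compression for those checks, a short script can generate the patterns locally. Here's a minimal sketch in plain stdlib Python, writing PPM/PGM files that most image viewers can show fullscreen; the 3840x2160 default and the 13-51 gray range (roughly 5-20% of 255) are my own arbitrary choices:

```python
# Generate full-screen test patterns for dead-pixel and banding checks.
# PPM (P6, color) and PGM (P5, grayscale) are trivial uncompressed formats
# that most image viewers and media players can display.

def solid_ppm(path, color, size=(3840, 2160)):
    """Write a full-frame solid color; cycle red/green/blue/white/black
    fullscreen and look for stuck or dead pixels."""
    w, h = size
    with open(path, "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (w, h))
        f.write(bytes(color) * (w * h))

def gray_ramp_pgm(path, lo=13, hi=51, size=(3840, 2160)):
    """Write a horizontal ramp over the ~5-20% gray range (13-51 of 255),
    the near-black region where banding tends to show up."""
    w, h = size
    row = bytes(lo + (hi - lo) * x // (w - 1) for x in range(w))
    with open(path, "wb") as f:
        f.write(b"P5\n%d %d\n255\n" % (w, h))
        f.write(row * h)

if __name__ == "__main__":
    for name, color in [("red", (255, 0, 0)), ("green", (0, 255, 0)),
                        ("blue", (0, 0, 255)), ("white", (255, 255, 255)),
                        ("black", (0, 0, 0))]:
        solid_ppm(name + ".ppm", color)
    gray_ramp_pgm("gray_ramp.pgm")
```

One advantage over YouTube test videos is that a locally generated ramp has no compression, so any banding you see is the TV (or the signal chain), not the encode.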
 

Deleted member 17207

user requested account closure
Banned
Oct 27, 2017
7,208
Right now I'm considering the Samsung QN60 and the Sony X900F (both the same price). Checked Rtings and it seems the Sony has local dimming and the Samsung has VRR, but both have other ups and downs... so I guess it's a debate of picture vs refresh rate? (The Sony gets better reviews on HDR as well.)

This is too hard, man! I've also seen people say VRR might not even be a thing next gen because devs are barely using it on PC right now...

(Previously I was eyeing the QN70, but after tax etc. it's just out of my price range.)
 
Last edited:

TaySan

SayTan
Member
Dec 10, 2018
31,374
Tulsa, Oklahoma
Hmm, the Samsung QLEDs seem to be just as good as OLED, or am I missing something?
I had a FALD monitor before (PG27UQ) but returned it because of the haloing effect; since then I've developed the feeling that HDR without some type of local dimming is not worth it. An OLED can obviously turn pixels all the way off, but how good are QLEDs for PC gaming? How good are the blacks? How is the response time? Will the Samsung Q series have HDMI 2.1? I'm basically asking OLED vs QLED, because it seems OLED would be the best TV if it weren't for burn-in.
No LED can produce perfect blacks like an OLED. However, it's still a premium technology with a premium price. The picture quality of an OLED was well worth the price of admission for me. The only TVs I would look into right now are the LG OLEDs, as they are the only TVs with HDMI 2.1 support. Otherwise I would wait until next year.
 

ZSJ

Alt-Account
Banned
Jul 21, 2019
607
Right now I'm considering the Samsung QN60 and the Sony X900F (both the same price). Checked Rtings and it seems the Sony has local dimming and the Samsung has VRR, but both have other ups and downs... so I guess it's a debate of picture vs refresh rate? (The Sony gets better reviews on HDR as well.)

This is too hard, man! I've also seen people say VRR might not even be a thing next gen because devs are barely using it on PC right now...

(Previously I was eyeing the QN70, but after tax etc. it's just out of my price range.)
Man, don't get mediocre picture quality just for VRR. That's all I'm saying.
 

TitanicFall

Member
Nov 12, 2017
8,256
GPU scaling is fast but low quality (bilinear) though they just added a new scaling method to Turing GPUs (16 and 20 series). It tends to make things a lot more seamless when switching resolutions since the signal going to the display never changes.
But if the GPU upscales, that means the output is always going to be 4K - which locks you to 60Hz for now, rather than allowing 1440p120. With HDMI 2.1 GPUs it won't matter since they can output 4K120.
So you want the display to handle scaling for now.

What makes you say that? GPU scaling scales to whatever you set your display resolution to, so you can still get 1440p.
 

Jogi

Prophet of Regret
Member
Jul 4, 2018
5,444
Was between a Samsung QLED and the C9 and went with the C9... gat damn! Was rocking a UNF7100 and enjoyed it enough, but wow does 4K + HDR + going from 55" -> 65" change things. Moved the old TV into the bedroom and it looks so muddy in comparison haha.
 

Deleted member 17207

user requested account closure
Banned
Oct 27, 2017
7,208
You'd be missing out, getting an HDR TV that can't display HDR right. And not only in HDR: the Sony is winning every picture quality category there.

I believe the QN80/Q8FN is the good one with VRR.
Aaaaand out of my price range!

so yeah, looks like the Sony will be the way to go. And maybe in a few years when the tech is cheaper I'll upgrade further.
 

zerocalories

Member
Oct 28, 2017
3,231
California
When I try to set the C9 as a computer monitor in the Nvidia control panel, it goes blank. Any ideas?

edit: okay, so 4K @ 100Hz doesn't work, but 2560x1440 @ 120Hz does. What's the highest I can run 4K at?
 
Last edited:
OP
ussjtrunks

ussjtrunks

Member
Oct 25, 2017
1,687
Are there any anime / broadcast TV shows in HDR, or is Netflix the only one doing HDR programming atm?
 

Pargon

Member
Oct 27, 2017
11,971
I've also seen people say VRR might not even be a thing next gen because devs are barely using it on PC right now...
VRR is not a feature that developers have to enable. With some rare exceptions, it just works with basically all games on PC automatically.
Consoles may handle things differently but I would expect it to be enabled at the system level and work in every game automatically, just like PC.

What makes you say that? GPU scaling, scales to what you set your display resolution to, so you can still get 1440p.
That's not been my experience at all. GPU scaling locks the output to whatever NVIDIA decides the "default" resolution is for the display, and any other resolution you select is scaled to it.

When I try to set the C9 as a computer monitor in the Nvidia control panel, it goes blank. Any ideas?

edit: okay, so 4K @ 100Hz doesn't work, but 2560x1440 @ 120Hz does. What's the highest I can run 4K at?
Your options are 1440p120 or 4K60 for now - though some are reporting that it can be extended to 66Hz at 4K, which would be beneficial for VRR (keeps 60 FPS games within the VRR window rather than using V-Sync).
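For anyone wondering why those are the limits, it's HDMI 2.0 bandwidth arithmetic. Here's a rough back-of-the-envelope check in Python; I'm treating HDMI 2.0 as an 18 Gbit/s line rate with 8b/10b encoding (so ~14.4 Gbit/s of usable video data), using the standard CTA-861 4K total timing and an approximate reduced-blanking 1440p timing, so the exact figures are estimates:

```python
# Why HDMI 2.0 tops out at 4K60 or 1440p120 for 8-bit RGB:
# 18 Gbit/s line rate minus 8b/10b encoding overhead leaves ~14.4 Gbit/s.
HDMI20_DATA_GBPS = 18.0 * 8 / 10  # 14.4

def gbps(h_total, v_total, hz, bits_per_pixel=24):
    """Bandwidth needed for a full video timing (active pixels + blanking)."""
    return h_total * v_total * hz * bits_per_pixel / 1e9

print(gbps(4400, 2250, 60))    # 4K60, CTA-861 timing: ~14.26, just fits
print(gbps(4400, 2250, 120))   # 4K120: ~28.5, needs HDMI 2.1
print(gbps(2720, 1481, 120))   # 1440p120, reduced blanking: ~11.6, fits
```

The 4K60 figure is also why people drop chroma to 4:2:2 for 10/12-bit HDR (4:2:2 at 12-bit averages the same 24 bits per pixel as 8-bit RGB), and why 4K120 needs HDMI 2.1 on both the GPU and the TV.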
 
Last edited:

BloodshotX

Member
Jan 25, 2018
1,593
Right now I'm considering the Samsung QN60 and the Sony X900F (both the same price). Checked Rtings and it seems the Sony has local dimming and the Samsung has VRR, but both have other ups and downs... so I guess it's a debate of picture vs refresh rate? (The Sony gets better reviews on HDR as well.)

This is too hard, man! I've also seen people say VRR might not even be a thing next gen because devs are barely using it on PC right now...

(Previously I was eyeing the QN70, but after tax etc. it's just out of my price range.)
I personally wouldn't go with the Q60R since it's edge-lit and has a plastic border. I got a Q70R in July, and I enjoy it immensely. Sure, there is some clouding (not bad in my book), but Luigi's Mansion 3 on Switch and the Resident Evil 2 remake sure look beautiful on it.

I went with the 55-inch since it accepts a 120Hz input and has a native 120Hz panel. Got it for 1,100 euros back then.
 
Sep 21, 2019
2,594
Without question. Just upgraded from a C7, and the superior color rendering, G-SYNC capability, HDMI 2.1 ports for next-gen GPUs, better HDR, and 120Hz refresh at 1440p make it the best gaming TV.
 

Fawz

Member
Oct 28, 2017
3,654
Montreal
I really like the TV for games and movies, but man, it is horrible for PC usage. I've tried every kind of set-up and setting, and nothing I do makes standard Windows navigation any good. Every time I open a Settings or Explorer page (ie: a lot of white) the whole screen dims very noticeably, and even worse, most whites turn to gray. I appreciate the attempt to reduce burn-in, but I wish I could manually turn off all these automatic adjustments (ie: with the Costco guarantee I don't mind if it does burn in within the next year).

Looking online, the one thing left for me to try is the Service Menu to disable the auto-dimming feature LG seems to be such a fan of on their OLEDs. I'm worried that might void my warranty though, and I don't have the set-up for it now. Has anyone given it a try yet?
 

Deleted member 17207

user requested account closure
Banned
Oct 27, 2017
7,208
I personally wouldn't go with the Q60R since it's edge-lit and has a plastic border. I got a Q70R in July, and I enjoy it immensely. Sure, there is some clouding (not bad in my book), but Luigi's Mansion 3 on Switch and the Resident Evil 2 remake sure look beautiful on it.

I went with the 55-inch since it accepts a 120Hz input and has a native 120Hz panel. Got it for 1,100 euros back then.
Right on. Yeah the Q70 looks great, but the 65 inch (which is what I want) is just a tad out of my price range :( bummer!
 

ShapeGSX

Member
Nov 13, 2017
5,207
I have VRR turned on with my Samsung Q90R, and I just don't see it as that big a deal with my Xbox One X. I never really noticed console games tearing before I got this TV. It's probably a bigger deal with PC games.
 

NutterB

Member
Oct 27, 2017
388
So according to Rtings.com, a B9 is as good as a C9 yet slightly cheaper. Should I save those few hundred dollars, or should I get the C9?
 

Ferrs

Avenger
Oct 26, 2017
18,829
So according to Rtings.com, a B9 is as good as a C9 yet slightly cheaper. Should I save those few hundred dollars, or should I get the C9?

The C9 has a better chip that helps with upscaling low-res content and with effects like noise reduction and the like.

Some people say it has better color gradation to reduce banding, but I haven't seen any review saying that.
 

predprey

Banned
Sep 20, 2019
165
Well this is quite an experience

Anyone else use an OLED for a pc monitor daily driver?

Better set a screen saver and don't use it for long hours. There is a real risk of the center divider (when tiling windows left and right) burning into the display. Based on real experience, it's permanent burn-in and not some temporary image retention. Though I later got the panel replaced, because the issue worsened into the entire left side being brighter than the right, which was apparently an issue with the electronics board, as told by the technician. There's also a YouTube video explaining the risk of this:

Not sure if this is still an issue with OLEDs after 2017, but just know that the risk is real and not a panel luck-of-the-draw thing.
 

Deleted member 8752

User requested account closure
Banned
Oct 26, 2017
10,122
I really like the TV for games and movies, but man, it is horrible for PC usage. I've tried every kind of set-up and setting, and nothing I do makes standard Windows navigation any good. Every time I open a Settings or Explorer page (ie: a lot of white) the whole screen dims very noticeably, and even worse, most whites turn to gray. I appreciate the attempt to reduce burn-in, but I wish I could manually turn off all these automatic adjustments (ie: with the Costco guarantee I don't mind if it does burn in within the next year).

Looking online, the one thing left for me to try is the Service Menu to disable the auto-dimming feature LG seems to be such a fan of on their OLEDs. I'm worried that might void my warranty though, and I don't have the set-up for it now. Has anyone given it a try yet?
It's not just to reduce burn-in, it's to prevent the panel from overheating.
Turn down your contrast and brightness settings.
 

Plum

Member
May 31, 2018
17,266
Just bought one! I seriously cannot wait for it. I'll be going from a middle-of-the-range Sony 1080p to this, and I really want to see how big of a difference it makes.

How's the Xbox One X's 4K Blu-ray player capability? Thinking of buying one or two 4K Blu-rays (plus Planet Earth) to see how it looks.
 

MazeHaze

Member
Nov 1, 2017
8,570
Better set a screen saver and don't use it for long hours. There is a real risk of the center divider (when tiling windows left and right) burning into the display. Based on real experience, it's permanent burn-in and not some temporary image retention. Though I later got the panel replaced, because the issue worsened into the entire left side being brighter than the right, which was apparently an issue with the electronics board, as told by the technician. There's also a YouTube video explaining the risk of this:

Not sure if this is still an issue with OLEDs after 2017, but just know that the risk is real and not a panel luck-of-the-draw thing.
It does seem to be a panel luck-of-the-draw thing to me. I've had a B7 for over two years, with like 6,000 hours on it. It's my ONLY PC monitor and I play all the consoles on it too. I use it as a PC 4-5 hours a day; I do a ton of music production in Ableton on it, word processing, web browsing, etc. Clean as a whistle. Meanwhile, I've seen people with two-month-old TVs get the Netflix app logo burnt in.
 

Deleted member 17207

user requested account closure
Banned
Oct 27, 2017
7,208
Well, the Q60 is not terrible. Keep in mind that aside from the picture quality, which is the same, it's less effective in HDR due to the edge LED, that's all.
Right, but the Sony X900F (same price as the Q60) gets better reviews on all the picture fronts. The only thing the Q60 seems to have over it is HDR.

But the Sony is on sale at Costco until December 4th or something, so I'm going to wait and see if the Q70 goes down further on actual Black Friday (despite being on sale now already).
 

Fawz

Member
Oct 28, 2017
3,654
Montreal
It's not just to reduce burn in, its to prevent the panel from overheating.
Turn down your contrast and brightness settings.

They're not maxed out right now, since I followed the tweaking guides and adjusted further for my lighting conditions. But if I do reduce them, does that reduce heat and thus the need for that automatic limiter? Also, considering how instant the limiter is (ie: if I minimize/maximize windows it instantly dims the image), I'm doubtful it's for temperature, since it would make more sense for it to be gradual.
 

Deleted member 8752

User requested account closure
Banned
Oct 26, 2017
10,122
They're not maxed out right now, since I followed the tweaking guides and adjusted further for my lighting conditions. But if I do reduce them, does that reduce heat and thus the need for that automatic limiter? Also, considering how instant the limiter is (ie: if I minimize/maximize windows it instantly dims the image), I'm doubtful it's for temperature, since it would make more sense for it to be gradual.
It should reduce the need for the limiter, yes. It's to prevent exceeding a certain threshold of nits, if I'm not mistaken.

I agree that it's the worst thing about these sets, even more so than the risk of burn-in. Not being able to disable ABL is terrible.
 

BloodshotX

Member
Jan 25, 2018
1,593
Right, but the Sony X900F (same price as the Q60) gets better reviews on all the picture fronts. The only thing the Q60 seems to have over it is HDR.

But the Sony is on sale at Costco until December 4th or something, so I'm going to wait and see if the Q70 goes down further on actual Black Friday (despite being on sale now already).
Okay. If you have any more questions, feel free to ask them : )