Oct 25, 2017
27,762
I can't believe my pretty basic 1080p Samsung from like 10 years ago cost me 1000 Canadian on sale and now you can get a 55" C9 for 1600 Canadian :p

If the price hits 1k I may get one
 
Dec 13, 2017
577
There were multiple improvements made in the panel tech between the C6 and C9 that should help mitigate it. (Bigger red subpixel, etc.) Good luck!

Does the bigger red subpixel thing help mitigate burn-in? My C7 has burn-in from yellow and red bars, but everything else is fine. It's gotten super noticeable, so I'm trying to get a new TV. I was planning on waiting until Black Friday sales to get the new CX or whatever, but if the C9 is good enough I'd get that.
 

Pargon

Member
Oct 27, 2017
11,997
OLED is still a downgrade from plasma in terms of motion resolution/handling. To me, that's a major concession that will probably keep my plasmas as my primary displays until they die or possibly until I can't source parts for them. Their natural motion/PQ is just so much easier on the eyes than any modern display.
The LG CX OLEDs should hopefully exceed Plasma TVs for motion resolution with the new 120Hz BFI feature when it's set to high. They won't have the response time issues of Plasma TVs either.
It's one of the reasons I'm waiting for CX prices to drop rather than pick up a cheaper B9/C9 now, as tempting as it may be.

No such thing as a next-gen-proof TV when MicroLED is right around the corner
People mean supporting all the features of next-gen consoles, not "next gen" TV tech.
While I hope that µLED is here soon, I still remember the early 2000s on sites like AVS Forum where OLED was "just around the corner" for a decade.
 

Darknight

"I'd buy that for a dollar!"
Member
Oct 25, 2017
22,806
Will they though initially? I think I'll wait until they are out before upgrading or pick up a tv next year instead.

Oh your question seemed to imply it wasn't worth trying to get a set with HDMI 2.1. There's no harm in waiting until the consoles are out and there are more TV options to choose from. That's a pretty wise move to make, but the next gen consoles will absolutely utilize HDMI 2.1 features at launch.
 
Oct 28, 2017
1,539
Oh your question seemed to imply it wasn't worth trying to get a set with HDMI 2.1. There's no harm in waiting until the consoles are out and there are more TV options to choose from. That's a pretty wise move to make, but the next gen consoles will absolutely utilize HDMI 2.1 features at launch.

Aside from 120Hz, what are the other benefits of 2.1?
 

Fitts

You know what that means
Member
Oct 25, 2017
21,163
The LG CX OLEDs should hopefully exceed Plasma TVs for motion resolution with the new 120Hz BFI feature when it's set to high. They won't have the response time issues of Plasma TVs either.
It's one of the reasons I'm waiting for CX prices to drop rather than pick up a cheaper B9/C9 now, as tempting as it may be.


People mean supporting all the features of next-gen consoles, not "next gen" TV tech.
While I hope that µLED is here soon, I still remember the early 2000s on sites like AVS Forum where OLED was "just around the corner" for a decade.

There was just a Value Electronics comparison done between the CX and A8H. Calibrator DeWayne Davis (D-Nice on AVS) chimed in with his findings. It's only one source and, since mistakes can be made by anyone, multiple sources are always needed for verification, but BFI max on LG is 60Hz. The other settings are at 120Hz. BFI at max produced the most "plasma-like" image and he estimated it reached 1080 lines of motion resolution, but it also introduced severe darkening (100 nits goes down to 36) and affected calibration (contrast, gamma, and maybe something else I'm not remembering). The only BFI setting that did not impact calibration in a meaningful way was auto, and that was at 120Hz with an estimated 850 lines of motion resolution.
 

Pargon

Member
Oct 27, 2017
11,997
BFI max on LG is 60hz. The other settings are at 120. BFI at max produced the most "plasma-like" image and he estimated it reached 1080 lines of motion resolution, but it also introduced severe darkening (100 nits goes down to 36) and affected calibration.
That is exactly what should be expected from BFI - though the panel should be capable of more than 36 nits with it enabled.
It's disappointing that the other settings are not still 60Hz, but with increased persistence for people that would prefer to trade off motion resolution for higher brightness. BFI at 120Hz with a 60 FPS source greatly hurts motion quality and I'd say it almost renders the feature useless.
 

Citizencope

Member
Oct 28, 2017
6,201
As someone who put 300+ hours into the last Orcs Must Die game on my projector, I'm scared shitless of burn-in on OLEDs. I may have to wait a year before I jump in.
 

Gorion's Ward

Member
Apr 6, 2019
495
Israel <3
I can't wait for the Rtings review of the X900H. It's reasonably priced, and supports 4K120, VRR, eARC, and Dolby Vision - pretty much my entire checklist for a next gen friendly TV.
Yeah, reviews will probably be up sometime in June. Will make sure to wait for Vincent and RTINGS. The specs don't seem promising, and it looks like the gap from the 950 is bigger this time than it was with the X900F, but zone count isn't the be-all and end-all, and this TV is my only hope for this year. If Sony had just promised to upgrade the 950H with 2.1 features through firmware, I'd be all over the cheap 950G by now...
 

kc44135

Member
Oct 25, 2017
4,721
Ohio
As someone who just bought a C9 and did TONS of research before doing so.. burn-in concerns are completely overblown, particularly with video games. It's a non-issue for the VAST majority of owners. Even then, LG apparently is willing to do a one-time replacement for burned in panels, even though it's not covered by warranty. You can ask their customer support to confirm that if you'd like.

I think of it like this: I grew up with CRT TVs, which also are susceptible to burn-in, yet how many kids of the 80s grew up with the UI elements of games like Super Mario Bros burned into their screens (the answer: not many.)
I actually did ask LG; they just told me burn-in isn't anything to worry about, but they wouldn't tell me they'd do the panel replacement. :(
 
Last edited:

kc44135

Member
Oct 25, 2017
4,721
Ohio
For someone who put in 300+ hours into the last Orcs Must Die game on my projector I'm scared shitless of burn in on OLEDs. I may have to wait a year before I jump in.
300 hours probably isn't anything to worry about. RTings' test suggests it would take something like 9,000 hours to get burn-in (COD and FIFA on high brightness, 24/7 for a year, was the test IIRC). Burn-in can totally happen, but it will take much longer than 300 hours. Here is the vid in question: https://youtu.be/nOcLasaRCzY
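Quick back-of-the-envelope sketch of what that figure implies (my own rough numbers, assuming the test really did run 24/7 for about a year):

Code:
# rough arithmetic on the RTings burn-in figure (my numbers, not theirs)
test_hours = 9000                         # the "like 9000 hours" from the test
print(24 * 365)                           # 8,760 h - about a year of 24/7 use

for hours_per_day in (4, 8):              # more typical daily gaming/viewing
    years = test_hours / (hours_per_day * 365)
    print(f"{hours_per_day} h/day -> ~{years:.1f} years to reach {test_hours} h")

So even at heavy daily use it takes years to rack up that kind of on-time.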
 

Fitts

You know what that means
Member
Oct 25, 2017
21,163
300 hours probably isn't anything to worry about. RTings test suggest it would take like 9000 hours to achieve burn in (COD and FIFA on high brightness 24/7 for a year was the test IIrc). Burn in can totally happen, but it will take much longer than 300 hours. Here is the vid in question: https://youtu.be/nOcLasaRCzY

I can assure you that the burned-in panels I've replaced on the C9/B9 did not have 9000 hours on them.
 

Gorion's Ward

Member
Apr 6, 2019
495
Israel <3
Problem with burn-in tests, from what I can gather, is that the panels vary quite a lot, so it's kind of a lottery. I've definitely seen many posts on the hardcore TV enthusiast forums here where people who babysit their OLEDs (even B8 and up) got burn-in.

Where I live, there's a 3-year warranty which includes burn-in protection, with an option to extend to 6 years for an extra $300 or so. It usually gives you the option to either trade for the equivalent current-year model or upgrade (from B to C, or from 55 to 65 to 75) at a low cost. But even with that, LED will probably be my choice. I don't care for babysitting my TV or changing my habits, and with my usage, burn-in will happen fast. I don't want to have to think about it or ever contact support regarding my TV.
 

Fitts

You know what that means
Member
Oct 25, 2017
21,163
How many hours did they have?

Far less. Store demo units rarely even hit 4000 in a year.

Burn in is a gamble with all self emissive panels and some are more prone to retention than others. A few hundred hours of a static HUD element without changing up content could absolutely cause it
 

kc44135

Member
Oct 25, 2017
4,721
Ohio
Far less. Store demo units rarely even hit 4000 in a year.

Burn in is a gamble with all self emissive panels and some are more prone to retention than others. A few hundred hours of a static HUD element without changing up content could absolutely cause it
Well, I would think a store demo unit would be a bad example, tbh. They are left in Torch mode basically the whole time the store is open, often running the same content over and over in some cases too. The TVs you're talking about would be C9s and B9s that people owned, correct? That would mean they had burn-in within less than a year, then? With all the measures LG has in place, that seems pretty crazy to me. I guess it would also have to do with how the TV was used, though. But it is certainly disconcerting regardless.
 

macindc

Member
Oct 27, 2017
200
Far less. Store demo units rarely even hit 4000 in a year.

Burn in is a gamble with all self emissive panels and some are more prone to retention than others. A few hundred hours of a static HUD element without changing up content could absolutely cause it

"could" is doing a lot of work in that sentence. Like the RTINGS test showed, video game content wasn't likely to cause burn-in with their test panels well into the thousands of hours (FIFA and Call of Duty fared way better than, say, the one showing CNN all the time).

It is possible, it COULD happen. Is it likely to? I don't think so. I think real-world usage where someone plays one video game for several hundred hours, with no other content like Netflix or whatever breaking that up, is not one of the more likely scenarios.

Does the bigger red pixel thing help mitigate burn-in? My C7 has burn-in from yellow and red bars, but everything else is fine. It's gotten super noticeable so I'm trying to get a new TV, was planning on waiting until Balck Friday sales to get the new CX or whatever, but if the C9 is good enough I'd get that.

There aren't any substantial panel changes between the C9 and the CX, particularly any that would make burn-in more or less likely, so you should be fine getting a C9 while they're still around. I actually opted for a C9 over the CX recently since the C9 has the full HDMI 2.1 bandwidth available to it (the CX doesn't, they reallocated some of it toward AI picture processing instead), and still has a DTS sound decoder (which was removed from the CX.)
 

Fitts

You know what that means
Member
Oct 25, 2017
21,163
Well, a store demo unit I would think would be a bad example tbh. They are left in Torch mode basically the whole time the store is open, often running the same content over and over in some cases too. The TV's your talking about would be C9's and B9's that people owned, correct? That would mean they had burn in within less than a year, then? With all the measures LG has in place, that seems pretty crazy to me. I guess maybe it would have to do with how the TV was used as well, tho. But it is certainly disconcerting regardless.

The store demo was an extreme example to prove a point about panel on-time accumulation. Nobody puts that many hours on their television in a year. And yes, televisions people owned from new.

Again, Rtings is not an authority on anything. They're just the new David Katzmaier. Like my experience, theirs is anecdotal.
 
Dec 13, 2017
577
There aren't any substantial panel changes between the C9 and the CX, particularly any that would make burn-in more or less likely, so you should be fine getting a C9 while they're still around. I actually opted for a C9 over the CX recently since the C9 has the full HDMI 2.1 bandwidth available to it (the CX doesn't, they reallocated some of it toward AI picture processing instead), and still has a DTS sound decoder (which was removed from the CX.)

fuck, this makes me want to get a new TV. The only thing holding me back is the burn-in PTSD. I know it's down to each unit being random when it comes to it, but I don't think I played enough to warrant the burn-in I got. It got really bad after about 2 years of use playing a variety of games. I raised the brightness in one game because the HDR was terrible, and that's when I noticed it start to happen. Could've been an issue with my settings, but I'm still scared.
 

Ada

Member
Nov 28, 2017
3,731
Well my TV just decided to die today, high pitched whine plus black stripes across the whole screen. Lasted 5 years. What TV should I replace it with? I only game on it and don't watch anything else.
 

Japanmanx3

One Winged Slayer
Avenger
Oct 25, 2017
5,908
Atlanta, GA
Kinda burned out on all my research. Will probably just get the Sony 900H when the 55" creeps beneath $1k and call it a day. Unless better-specced but reasonably priced stuff comes out by this time next year.
 

nillapuddin

Member
Oct 25, 2017
3,240
The CX RTINGS review is out and yeah, I'm not sure there's a reason to buy a CX over a C9 at all, tbh.

I understand there is a better processor in the CX over the C9, but I'm not entirely sure what that's even for?
Doing all the post processing stuff? (that we all turn off for game mode?)


here that is btw
www.rtings.com

LG CX OLED Review (OLED48CXPUB, OLED55CXPUA, OLED65CXPUA, OLED77CXPUA)

 
Last edited:
OP
manustany

Unshakable Resolve
Member
Oct 27, 2017
3,529
The Space
Right now I'm thinking of getting the Q95T 55" because of the One Connect Box (I can easily switch the HDMI 2.1 input from one console to another), the brightness, the HDR10+, the clean UI, the minimal remote, and because I already have a small Samsung ecosystem.

Plus, Samsung usually drops prices often in Italy, especially in September, when they offer a good cashback. It's sold for €2.200 here, but for example last week Samsung already dropped the price to €1.799 just for the weekend. I assume I will easily find this TV for less than €1.500 before November.

I'm still considering the CX 48, but honestly I don't think an OLED panel is WAY better than a QLED, plus LG's UI is a bit messy compared to Tizen (and I don't wanna talk about the remote).
 

Fitts

You know what that means
Member
Oct 25, 2017
21,163
CX RTINGS review is out and yeah, I'm not sure theres a reason to buy a CX over a C9 at all tbh.

I understand there is a better processor in the CX over C9, but Im not entirely sure what thats even for?
Doing all the post processing stuff? (that we all turn off for game mode?)


here that is btw
www.rtings.com

LG CX OLED Review (OLED48CXPUB, OLED55CXPUA, OLED65CXPUA, OLED77CXPUA)


There was a quickie Value Electronics deep dive a few days ago. Better processing mostly went toward upscaling and cleaning up poor quality content. Brought it closer to Sony level, but not quite.
 

Pargon

Member
Oct 27, 2017
11,997
CX RTINGS review is out and yeah, I'm not sure theres a reason to buy a CX over a C9 at all tbh.
BFI for 60Hz sources is the main reason, as it should reduce image persistence to ~4ms.
That should result in motion handling which is similar to, or better than, most Plasma TVs. Previous models only halved the persistence, to ~8ms.
It should also have been able to work with 120Hz sources, but according to RTINGS' review it's not a good implementation and should not be used.

Keep in mind that there will be visible flicker when you enable this option, and it will cut the brightness by more than 50%.
They don't seem to have posted brightness measurements, but I expect it to be 100 nits at most.
It's also going to be incompatible with VRR.

The main reason someone would want it is for retro gaming if they don't have a CRT, or modern games that are limited to a maximum of 60 FPS.
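For anyone wondering where those ~4ms / ~8ms numbers come from, here's a minimal sketch (my simplification: the panel is treated as either fully lit or fully dark within each 60 FPS frame):

Code:
# where the ~8 ms / ~4 ms persistence figures come from (my simplification:
# the panel is treated as fully lit or fully dark within each 60 FPS frame)
frame_ms = 1000 / 60                                   # ~16.7 ms per frame

settings = [
    ("no BFI (sample-and-hold)", 1.0),                 # lit for the whole frame
    ("previous OLED BFI (half the frame dark)", 0.5),
    ("CX BFI on high (3/4 of the frame dark)", 0.25),
]
for label, lit_fraction in settings:
    print(f"{label}: ~{frame_ms * lit_fraction:.1f} ms persistence, "
          f"{lit_fraction:.0%} of the light before any drive compensation")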

I understand there is a better processor in the CX over C9, but Im not entirely sure what thats even for?
There's potential for better image quality with improved processing, and I believe the display is supposed to adjust the picture to better suit bright rooms. I expect these are things which HDTVtest's review will cover.

Does anyone know if the 48" CX lacks any features? I know the smaller models can be missing stuff sometimes.
I believe it's supposed to be fully featured, but we don't know for sure right now.
One of the main reasons that smaller LCDs are not is that it's more difficult/expensive to produce local dimming backlights at smaller sizes - especially if you want a lot of zones.
 

Fitts

You know what that means
Member
Oct 25, 2017
21,163
60Hz BFI is a measured 64% brightness reduction according to DeWayne Davis (highly regarded calibrator, goes by D-Nice on AVS). In addition to the brightness reduction, it also adversely affects calibration (gamma and contrast), and the only BFI setting that doesn't in a meaningful way is Auto (120Hz; motion was less clean in practice than Sony A8H's BFI implementation).

edit: corrected my math and clarified
 
Last edited:

Pargon

Member
Oct 27, 2017
11,997
60hz BFI is a measured 64% brightness reduction according to DeWayne Davis.
That was a measured drop from 100 nits, which is likely not the same as you would see when brightness is maxed-out.

In addition to the brightness reduction, it also adversely affects calibration (gamma and contrast) and the only BFI setting that doesn't in a meaningful way is Auto.
That doesn't matter - you would calibrate with BFI enabled for those sources anyway.
Auto doesn't affect the picture much because it doesn't do much.

People still don't seem to get it. For BFI to have a significant impact on motion handling, it also has to have a significant impact on brightness.
You can't switch the panel off for 3/4 of every frame and think it won't do that.

It's the same thing for scanline filters. You can't turn half of the screen black and not expect brightness to drop.
 

Fitts

You know what that means
Member
Oct 25, 2017
21,163
That was a measured drop from 100 nits, which is likely not the same as you would see when brightness is maxed-out.


That doesn't matter - you would calibrate with BFI enabled for those sources anyway.
Auto doesn't affect the picture much because it doesn't do much.

People still don't seem to get it. For BFI to have a significant impact on motion handling, it also has to have a significant impact on brightness.
You can't switch the panel off for 3/4 of every frame and think it won't do that.

It's the same thing for scanline filters. You can't turn half of the screen black and not expect brightness to drop.

Does light output not scale linearly in LGs? And the calibration impact is absolutely worth mentioning because some choose to enable/disable on the same profile dependent on content.

And I don't think anyone is arguing that BFI doesn't result in a brightness reduction.

Edit: an aside, but when it was explained that the high BFI setting on the A8H was 48Hz I died inside. An absolutely worthless inclusion.
 
Last edited:

Pargon

Member
Oct 27, 2017
11,997
Does light output not scale linearly in LGs?
They're self-emissive displays with an aggressive ABL. It doesn't scale linearly at all.
If it were completely linear, the brightness drop would be fixed at 75% when BFI was set to high.
But since the panel is black for 3/4 of the frame duration, you can drive it brighter when it's switched on to make up for some of that loss.
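Rough numbers to illustrate the point (the 100-to-36-nit figure is the measurement quoted earlier in the thread; the implied boost while lit is just my inference, not anything LG publishes):

Code:
# duty cycle vs. measured brightness on BFI high (100 -> 36 nits is the quoted
# measurement; the implied drive boost is my inference, not a spec)
duty_cycle = 0.25                                # panel lit for 1/4 of each frame
baseline_nits = 100
naive_nits = baseline_nits * duty_cycle          # 25 nits if drive level were unchanged
measured_nits = 36

print(f"purely linear: {naive_nits:.0f} nits, measured: {measured_nits} nits")
print(f"implied drive boost while lit: ~{measured_nits / naive_nits:.2f}x")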

And the calibration impact is absolutely worth mentioning because some choose to enable/disable on the same profile dependent on content.
Unless there are measurements posted, saying that it "affects calibration" is meaningless.
And since they did not increase the cell light setting to restore brightness, it's entirely possible that those differences were simply due to the meter not giving the same readings in low light. Or the meter may not handle the refresh rate well. You'd have to switch many meters to CRT mode to get proper readings.
Since OLED response times are what they are, I would not have expected it to do much at all to image quality except make it darker - though that does change your perception of an image even if nothing else changed.

And I don't think anyone is arguing that BFI doesn't result in a brightness reduction.
You would be surprised at how many people seem to think that BFI is pointless until it can be done without affecting image quality at all.
They want CRT/Plasma-like motion without the drawbacks of a CRT/Plasma, without understanding that it's because those displays flicker that the motion is so much better, and also the reason they're dimmer.
 

Lump

One Winged Slayer
Member
Oct 25, 2017
15,980
CX RTINGS review is out and yeah, I'm not sure theres a reason to buy a CX over a C9 at all tbh.

I understand there is a better processor in the CX over C9, but Im not entirely sure what thats even for?
Doing all the post processing stuff? (that we all turn off for game mode?)


here that is btw
www.rtings.com

LG CX OLED Review (OLED48CXPUB, OLED55CXPUA, OLED65CXPUA, OLED77CXPUA)


If I weren't waiting for the 48, I'd 1000% get a C9 right now. Alas, I'm ride or die for that 48 inch as a monitor replacement so it's still going to be the CX for me.
 

Darknight

"I'd buy that for a dollar!"
Member
Oct 25, 2017
22,806
So for the average gamer who isn't needing the best of the best, it's not necessary right? Like I can still play next gen without HDMI 2.1

Well of course HDMI 2.1 isn't required. You could play next gen on a 480p TV if you wanted to. Just at this point, you may as well get a set with HDMI 2.1. At a bare minimum you should want variable refresh rate.
 

Pargon

Member
Oct 27, 2017
11,997
So for the average gamer who isn't needing the best of the best, it's not necessary right? Like I can still play next gen without HDMI 2.1
It's not necessary, but without VRR it's possible that games will run at lower frame rates (locked to 30/60 rather than unlocked) and won't be as smooth.
Without HDMI 2.1 you will be limited to 1440p with VRR enabled. Microsoft will support that, Sony may not.
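Rough sketch of why the bandwidth runs out (active pixels only, 10-bit 4:4:4; blanking and link-encoding overhead are ignored, so real requirements are somewhat higher, and the 18/48 Gbit/s caps are just the nominal HDMI 2.0/2.1 figures):

Code:
# rough uncompressed video data rates, active pixels only (blanking and
# link-encoding overhead are ignored, so real requirements are somewhat higher)
def gbit_per_s(width, height, fps, bits_per_pixel=30):   # 30 = 10-bit RGB / 4:4:4
    return width * height * fps * bits_per_pixel / 1e9

for name, w, h, fps in [("1440p120", 2560, 1440, 120), ("4K120", 3840, 2160, 120)]:
    print(f"{name}: ~{gbit_per_s(w, h, fps):.0f} Gbit/s "
          f"(HDMI 2.0 tops out around 18, HDMI 2.1 around 48)")

Which is roughly why 1440p120 can squeeze into the older link (at least at lower bit depths) while 4K120 can't.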
 

Deleted member 35509

Account closed at user request
Banned
Dec 6, 2017
6,335
Do all the C9s have HDMI 2.1? QVC has all their TVs listed with 2.0 but I can't tell if that's just a generic listing for each one.
 

ss_lemonade

Member
Oct 27, 2017
6,651
That is exactly what should be expected from BFI - though the panel should be capable of more than 36 nits with it enabled.
It's disappointing that the other settings are not still 60Hz, but with increased persistence for people that would prefer to trade off motion resolution for higher brightness. BFI at 120Hz with a 60 FPS source greatly hurts motion quality and I'd say it almost renders the feature useless.
Is this the reason why my TV's (KS9000) BFI option "Clear LED" never worked like I expected and just looks like this?

[image: ks9000-bfi-small.jpg]


For some odd reason, it seems like the lower-end KS8000 does BFI better at 60Hz, according to reviews I read back then.
 

Pargon

Member
Oct 27, 2017
11,997
Is this the reason why my TV's (KS9000) BFI option "Clear LED" never worked like I expected and just looks like this?

[image: ks9000-bfi-small.jpg]


For some odd reason, it seems like the lower end KS8000 does BFI better at 60 hz, according to reviews I read back then.
That's right. 60 FPS sources need 60Hz BFI.
Using 120Hz BFI with a 60 FPS source reduces flicker, but results in awful double-images like that.
I forget the specific model now, but these images are from some Sony LCD that offered both 60Hz and 120Hz BFI options.

60 FPS at 120Hz BFI:
[image: 120hz-low34kq2.jpg]


60 FPS at 60Hz BFI:
[image: 60hz-lowh3kk3.jpg]


It's not perfect, since it's an LCD - so you get some trailing on the left edge due to the slower response times, but it's dramatically clearer than a display with faster response times but no BFI.

The reason that "120Hz BFI" was a big deal for the CX OLED is that it meant two things:
1. The panel could use BFI with 120 FPS sources to reduce persistence from 8ms to 4ms.
Nice to have, but not essential - most people using the panel at 120Hz for gaming would probably have VRR enabled, which prevents the use of BFI.

2. But to do BFI at 120Hz requires the panel to update at 240Hz - at least with the way that LG have implemented it so far.
A panel that is updating at 240Hz can have variable persistence with a 60Hz source, for 1/4, 1/2, or 3/4 BFI with older consoles/games that can only do 60 FPS - where the motion improvements are most needed.
That would let you control the amount of flicker/brightness loss against how much improvement there is to motion.
Unfortunately though, it sounds like LG implemented the auto/low/medium modes to operate at 120Hz instead, and only high runs at 60Hz with 3/4 of each frame blanked. It will be very effective at that setting, but dimmer than many people will like.
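Here's roughly what that variable persistence could look like on a panel updating at 240Hz (my own illustration of the idea, not how LG's low/medium/high settings actually behave):

Code:
# what variable persistence could offer on a 240Hz-updating panel with a 60 FPS
# source (my illustration of the idea, not LG's actual low/medium/high behaviour)
subframe_ms = 1000 / 240                      # ~4.2 ms per 240Hz sub-frame
subframes_per_frame = 4                       # 240Hz update / 60Hz source

for dark in range(1, 4):                      # blank 1, 2, or 3 of the 4 sub-frames
    lit = subframes_per_frame - dark
    print(f"{dark}/4 of the frame blanked: ~{lit * subframe_ms:.1f} ms persistence, "
          f"{lit / subframes_per_frame:.0%} light output before drive compensation")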
 

ShutterMunster

Art Manager
Verified
Oct 27, 2017
2,450
I really wish the 4K HDR monitor space was more competitive than it is. If you want a quality 4K HDR monitor you have to drop so much money, it's absurd.

There are a ton of affordable TV options with great picture quality but it's 50" and up or bust and my gaming space at home isn't so spacious. I think I may prefer gaming at a desk in a comfy chair to couch gaming too.
 

Darknight

"I'd buy that for a dollar!"
Member
Oct 25, 2017
22,806
So for all these TVs, the least they will cost you is $800+ at the absolute best then, ya?

Right now, yes, but there's so few TVs on the market right now that fit the criteria. As more come out and as time goes on, that price will drop and the selection will have more options. I have no doubts TCL will create a low cost model that will fit the bill for people on a stricter budget.
 

nillapuddin

Member
Oct 25, 2017
3,240
There was a quickie Value Electronics deep dive a few days ago. Better processing mostly went toward upscaling and cleaning up poor quality content. Brought it closer to Sony level, but not quite.
I will definitely look at that then. I'm interested in this upscaling tech, but I've yet to see actual examples of it, or have it explained in a way where I could determine the value.

BFI for 60Hz sources is the main reason, as it should reduce image persistence to ~4ms.
That should result in motion handling which is similar to, or better than, most Plasma TVs. Previous models only halved the persistence, to ~8ms.
It should also have been able to work with 120Hz sources, but according to RTINGS' review it's not a good implementation and should not be used.

Keep in mind that there will be visible flicker when you enable this option, and it will cut the brightness by more than 50%.
They don't seem to have posted brightness measurements, but I expect it to be 100 nits at most.
It's also going to be incompatible with VRR.

The main reason someone would want it is for retro gaming if they don't have a CRT, or modern games that are limited to a maximum of 60 FPS.

See, you've explained it in a way that I can parse, but it seems to constantly undercut itself:

BFI @ 60Hz good
BFI @ 120Hz bad
but also
BFI = less bright, doesn't work with VRR

It's like there isn't a truly slam-dunk answer here
 

Rndom Grenadez

Prophet of Truth
Member
Dec 7, 2017
5,633
Does anyone know how the performance is on a GX? I'm looking to grab one later on in the year and I figure I should be pretty future proof with that purchase.