So that would be:
- The C9 with BFI at 60Hz has 8.33ms persistence.
([Rendered_Frame](8.33ms) & [Black_Frame](8.33ms))*60 per second?
- 120 FPS on the C9 without BFI has 8.33ms persistence.
([Rendered_Frame](8.33ms))*120 per second?
They have the same amount of persistence, but wouldn't you still want more frames in that case, since the persistence is the same either way?
I can only see the CX with BFI being an advantage vs G-Sync if you can hit a locked 120Hz on the CX, which would show a better result. Is that correct?
Persistence affects how much motion blur we see. It also has an impact on how smooth the image appears.
Lower image persistence does make motion smoother, but it's not directly equivalent to a higher frame rate - so 120 FPS should still appear smoother than 60 FPS + BFI. That being said: I have not done a side-by-side test of that exact scenario.
Whether you achieve the 8.33ms image persistence via 60 FPS + BFI, or 120 FPS without BFI, you should see a similar amount of motion blur.
The CX at 60 FPS + BFI (High), with its 4.17ms persistence, would have less motion blur than the C9 at 120 FPS without BFI (8.33ms), but would probably not be as smooth.
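To make that arithmetic concrete, here's a minimal sketch. The helper is hypothetical, and it assumes BFI simply blanks the panel for a fixed fraction of each refresh period (the 0.25 duty cycle for the CX's BFI High mode is my inference from the 4.17ms figure, not a published spec):

```python
def persistence_ms(fps: float, visible_fraction: float = 1.0) -> float:
    """Time each frame stays lit, in milliseconds.

    visible_fraction: portion of the refresh period the frame is shown
    (1.0 = no BFI; 0.5 = half the period replaced by a black frame;
    0.25 = a rough stand-in for the CX's BFI High mode).
    """
    return 1000.0 / fps * visible_fraction

print(round(persistence_ms(60, 0.5), 2))    # 8.33 - C9, 60 FPS + BFI
print(round(persistence_ms(120, 1.0), 2))   # 8.33 - 120 FPS, no BFI
print(round(persistence_ms(60, 0.25), 2))   # 4.17 - CX, 60 FPS + BFI High
```

The first two lines show why 60 FPS + BFI and 120 FPS without BFI land on the same amount of motion blur: the frame is on screen for the same length of time either way.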
The advantage of BFI is that it decouples image persistence from frame rate.
A 60 FPS source could have a persistence of 12.5ms, 8.33ms, 4.17ms, 2.08ms, 1.04ms etc. rather than being locked to 16.67ms.
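Those figures all fall out of scaling the 16.67ms frame period by the fraction of time the panel is lit. A quick check (the duty cycles here are my inference from the numbers above, not display specs):

```python
frame_period_ms = 1000.0 / 60  # 16.67ms per frame at 60 FPS
duty_cycles = [0.75, 0.5, 0.25, 0.125, 0.0625]  # fraction of each period lit
persistences = [round(frame_period_ms * d, 2) for d in duty_cycles]
print(persistences)  # [12.5, 8.33, 4.17, 2.08, 1.04]
```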
The downside is that as you reduce persistence with BFI, the image gets dimmer, and it flickers more.
It also requires that you never drop frames, as stuttering becomes more noticeable when it's in use.
So 120 FPS without BFI, particularly with VRR, is still going to be preferable to 60 FPS with it, even if they have the same persistence.
But that only scales so far. 120 FPS is achievable today, though typically in older games or at lower resolutions.
Pushing persistence lower through frame rate alone quickly becomes unrealistic, especially if you were to try to match a CRT, which has less than 1ms of persistence. 1000 FPS at 1000Hz is not going to happen any time soon.
There's no ideal option right now, and no getting back to what we had with CRTs any time soon.
Though people complained that CRTs were dim once they compared them to LCDs, for what they were actually achieving (very low persistence), they were very bright.
An LCD or OLED using BFI to achieve the same image persistence is significantly dimmer than the CRT.
You can't output less light without reducing brightness.
You choose brightness or you choose motion clarity: those are your choices.
The one advantage of emissive displays is that they could potentially be driven harder at lower persistence.
It depends on many different factors, but you could potentially drive a display at twice the brightness for half the time to achieve the same perceived brightness overall. Of course that may not be possible, as there will be a limit to how bright they can be driven before damage occurs.
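That trade works out as a simple time average: perceived brightness is roughly the peak luminance times the fraction of time the pixel is lit. A sketch with made-up luminance figures (the 400/800 nit numbers are illustrative, not measurements of any real panel):

```python
def average_nits(peak_nits: float, duty_cycle: float) -> float:
    """Time-averaged luminance: peak drive level times fraction of time lit."""
    return peak_nits * duty_cycle

print(average_nits(400, 1.0))  # 400.0 - full persistence, no BFI
print(average_nits(400, 0.5))  # 200.0 - BFI halves the average
print(average_nits(800, 0.5))  # 400.0 - doubling the drive restores it
```

Whether a panel can actually sustain that doubled drive level without damage is exactly the limit described above.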