Low persistence "flashing" is terrific in VR (which is a really good way to play old games IMO).
The whole low persistence thing is interesting, because it tends to suck on TVs, even with less aggressive BFI (way too dim). But it seems like it could be fixed with a combination of higher refresh rates (more frequent updates) and/or upping the voltage in low persistence mode. Some high refresh LCD monitors already do this: they can safely drive more voltage in very short bursts (as opposed to the steady, lower voltage required for sample-and-hold). I was browsing the AVS forums looking for LG CES rumors and saw some posts about LG patents relating to motion resolution, and they seemed to focus on voltage. I think we're probably closer to solving the problem than we think.
What needs to happen is separating drawing the image from black frame insertion.
With current BFI implementations, they're literally drawing black frames in-between every other frame. So if you have a 120Hz panel, that means the highest frame rate you can use BFI with is 60 FPS. And it limits you to only having 50% image persistence.
With a 240Hz panel you could enable 50% BFI at 120 FPS, or have options for 25/50/75% at 60 FPS - which I believe is what LG were intending to offer with the 2019 OLEDs, but cancelled at the last minute.
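The arithmetic above can be sketched out quickly. This is just a hypothetical helper (not any real display API) showing why drawn-in black frames quantize your persistence options to multiples of the panel's refresh period:

```python
# Sketch of the frame-drawing BFI limitation described above.
# When black frames are drawn as whole panel refreshes, persistence
# can only come in multiples of the refresh period, so the available
# options depend on the ratio of panel Hz to content FPS.

def bfi_persistence_options(panel_hz: int, content_fps: int) -> list[float]:
    """Fractions of each content frame that can be lit when black
    frames must themselves occupy whole panel refreshes."""
    slots = panel_hz // content_fps  # panel refreshes per content frame
    # at least one slot must show the image; the rest can be black
    return [lit / slots for lit in range(1, slots)]

print(bfi_persistence_options(120, 60))   # [0.5] -> only 50% BFI
print(bfi_persistence_options(240, 60))   # [0.25, 0.5, 0.75]
print(bfi_persistence_options(240, 120))  # [0.5]
```

Which matches the 25/50/75% options LG reportedly planned for the 2019 OLEDs at 60 FPS.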
If they controlled this with the panel driving instead (e.g. switching the panel "off" once the frame has been drawn, rather than drawing black in-between every frame), it would not affect the frame rate at all. A 120Hz panel could do BFI at 120 FPS.
And by using the panel driving to control this, you would be able to set the persistence to any value you like. You could set it to 10% or 90% depending on how much brightness loss and flicker you can tolerate.
As you say, this also lets you drive the panel brighter. While there are going to be limits to what you can use before it starts to damage the panel, you could theoretically implement BFI with no brightness loss at all. If you scanned the image line-by-line instead of illuminating the entire screen at once, then rather than displaying 100 nits for 17ms (60Hz), maybe you could drive the panel at 850 nits for 2ms.
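A quick sanity check on that back-of-the-envelope figure (hypothetical function name, just illustrating the arithmetic): keeping perceived brightness constant means keeping nits x lit-time per frame constant.

```python
# To match a sample-and-hold baseline, nits * lit-time per frame
# should stay roughly equal (total light output per frame is constant).

def required_nits(baseline_nits: float, frame_ms: float, lit_ms: float) -> float:
    """Peak brightness needed when the image is only lit for lit_ms
    out of every frame_ms, to match a sample-and-hold baseline."""
    return baseline_nits * frame_ms / lit_ms

# 100-nit sample-and-hold at 60Hz (~16.7ms frames), strobed for 2ms:
print(round(required_nits(100, 1000 / 60, 2.0)))  # ~833 nits
```

So ~833 nits for 2ms, in line with the 850 figure above.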
I believe this is similar to how the displays used for VR operate.
The other problem with BFI is refresh rate.
With CRTs, the low persistence was inherent to the way they operated. Feed a CRT a 60Hz input, and it would flicker at 60Hz. 50Hz input, 50Hz refresh etc.
With LCDs that use backlight strobing, this is often not the case. Many televisions strobe at 120Hz for a 60Hz input, which results in double-images like the example I posted above. NVIDIA do not allow ULMB to be enabled below 85Hz (though it can be hacked in).
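The double-image effect falls out of the strobe-to-content ratio. A small sketch (hypothetical function names, assumed figures for the example speed):

```python
# Each unique content frame is flashed strobe_hz / content_hz times.
# An eye tracking a moving object sees one offset copy per flash,
# so strobing at a multiple of the content rate multiplies the image.

def ghost_images(strobe_hz: int, content_hz: int) -> int:
    """Number of copies of each frame a tracking eye perceives."""
    return strobe_hz // content_hz

def ghost_separation_px(speed_px_per_s: float, strobe_hz: int) -> float:
    """Pixel spacing between those copies for a tracked moving object."""
    return speed_px_per_s / strobe_hz

print(ghost_images(120, 60))                 # 2 -> double image
print(ghost_images(60, 60))                  # 1 -> single, CRT-like image
print(ghost_separation_px(960, 120))         # copies 8px apart at 960 px/s
```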
I fear that OLED may be the same, and either double the refresh, or have limits on the minimum refresh rate that it allows low-persistence operation to be used at.
If you're playing a game at 30 FPS, the display should really be updating at 30Hz. That will flicker terribly, but you can get used to it. I've watched a few movies on CRT at 24Hz and you wouldn't believe how clear and smooth the motion is on it - though the flicker is severe and I had to watch at a very low brightness for that to be tolerable.