Watching the space scenes in The Martian on an OLED TV should be an ideal viewing experience with the true black backgrounds, but in every shot that panned, I couldn’t stop fixating on the incorrect twinkling of all the stars.
This is almost always caused by something linearly interpolating pixels that are in a gamma-encoded color space. The 0-255 pixel values that you find everywhere aren’t linear — 64 is not twice the photons of 32. It is close enough that you usually won’t notice, but high contrast star fields and line drawings are the worst case.
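A quick back-of-the-envelope sketch shows how big the error gets in the worst case. The values are illustrative (real video pipelines work in YUV, not 8-bit sRGB), but the math is the standard sRGB transfer function:

```python
def srgb_to_linear(c):
    # Piecewise sRGB decode; c is a normalized 0..1 encoded value
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # Inverse: linear light back to the encoded value
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

black, white = 0, 255

# Naive: average the encoded bytes directly
naive = (black + white) / 2  # 127.5 -- far too dark

# Correct: decode to linear light, average the photons, re-encode
lin = (srgb_to_linear(black / 255) + srgb_to_linear(white / 255)) / 2
correct = linear_to_srgb(lin) * 255  # ~188
```

A 50/50 blend of black and white should encode to about 188, not 127 — the naive result is roughly a fifth of the intended light output.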
There are so many places this can go wrong.
I assume the movie master is rendered correctly, but any resampling to a different resolution will probably do it wrong. I was watching in 4K, but 3840x2160 is not the native theatrical resolution. There is an @FFmpeg option to do gamma-correct resizing, but it isn’t the default because it is quite a bit slower.
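Here is a minimal sketch of what gamma-correct downscaling has to do, using a single-pixel star in a 2x2 block (pure-Python stand-in for what a real scaler does per tap):

```python
def srgb_to_linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

block = [255, 0, 0, 0]  # one white star pixel in an otherwise black 2x2 block

# Naive 2x downscale: box-filter the encoded bytes
naive = sum(block) / 4  # 63.75

# Gamma-correct: decode, average the light, re-encode
lin = sum(srgb_to_linear(v / 255) for v in block) / 4  # 0.25 of full light
correct = linear_to_srgb(lin) * 255  # ~137
```

The naive scaler outputs a pixel of about 64 where a gamma-correct one outputs about 137 — the star loses well over half its apparent brightness, and as it drifts between pixel positions during a pan, that brightness error varies, which is exactly the twinkle.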
Lossy compression can also mess up the subtle fractional pixel values, but star fields should not be challenging to compress. A fixed bit rate should be nearly lossless in “easy” scenes like that, but an adaptive one can drop it to a trickle and make the errors perceptible.
A 4K video decoded to a 4K buffer sent to a 4K TV should not involve any resampling, but it is easy for a half pixel offset to slip in somewhere, and GPUs don’t offer a gamma-correct YUV texture sampling mode. If you need resampling (as on a VR screen), the best practice is to copy the YUV video to an sRGB texture, then draw that.
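A sketch of what the sRGB texture buys you: the hardware decodes texels to linear light before the bilinear blend, so a half-texel offset conserves the star’s light instead of dimming it. Modeled in Python (the GPU does the same math in its texture units):

```python
def srgb_to_linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

# One-pixel star: a white texel surrounded by black. A half-texel offset
# makes bilinear filtering split it evenly across two output pixels.
star = 1.0  # normalized encoded value of the white texel

# Non-sRGB texture: the filter averages encoded values, then the display
# decodes them -- total light emitted by the two output pixels:
naive_light = 2 * srgb_to_linear((0.0 + star) / 2)  # ~0.43

# sRGB texture: the filter averages decoded (linear) values, so the
# total light is conserved:
correct_light = 2 * ((srgb_to_linear(0.0) + srgb_to_linear(star)) / 2)  # 1.0
```

With the plain texture, a star sliding across a half-pixel boundary drops to less than half its light and then recovers, frame after frame.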
Any color space transform for the various “vivid” / “cinematic” / whatever modes that gets done in the media device or TV also has to be done in linear space to avoid issues. I tend to rail against these abuses of the pixels, but if you must do it, at least do it right. The same goes for any motion interpolation or other features on the TV.
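To illustrate, here is a hypothetical “vivid” saturation boost (the `saturate` function and the 1.2 factor are made up for the example) applied both ways — the results diverge, and only the linear-space version manipulates actual light:

```python
import numpy as np

def srgb_to_linear(c):
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(c):
    c = np.clip(c, 0.0, 1.0)
    return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1 / 2.4) - 0.055)

def saturate(rgb, amount=1.2):
    # Push chroma away from BT.709 luma -- a stand-in for a TV's "vivid" mode
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])
    return np.clip(luma + (rgb - luma) * amount, 0.0, 1.0)

px = np.array([200, 80, 40]) / 255.0  # an arbitrary orange-ish pixel

# Right: decode, transform in linear light, re-encode
right = linear_to_srgb(saturate(srgb_to_linear(px)))

# Wrong: the same math run directly on the gamma-encoded values
wrong = saturate(px)
```

The two outputs differ per channel, which shows up as hue and brightness shifts, not just a saturation change.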
At the very end, the display panel must actually reproduce the gamma curve that everything else in the pipeline expected. There are legitimate subtleties here, like 2.2 vs sRGB, but sometimes the curve is deliberately distorted to “punch up” content without the viewer knowing.
So many problems vanish if you just work in 12+ bit linear space, but so far, it has generally been better on net to spend memory/GPU/display bandwidth on higher resolution or frame rates. It is creeping in, but it will be many years yet before end to end linear is common.