I personally don't think we should set a relatively low bar now in 2024 and stick to it: 108 nits is pretty low when consumer HDR displays can easily reach 1,000 nits nowadays. The HDR PQ standard goes up to 10,000 nits, and each display then does what it can.
What happens when content is mastered at, say, 4,000 nits (there are a few Blu-rays like that) and your TV can't handle it? Well, it's a bit of a disaster: you have a "short blanket" and can only try to squeeze the signal into what you have with various "tone mapping" algorithms, which are not standardized, so each manufacturer does it differently. Dolby Vision is an attempt to standardize this: the algorithm is decided by Dolby rather than by the TV manufacturer.
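To make the "short blanket" concrete, here is a minimal tone-mapping sketch in Python. It is purely illustrative, not any manufacturer's actual algorithm, and the function name and parameters are my own invention: luminance below a soft knee passes through unchanged, and highlights above it are rolled off so the mastering peak lands exactly at the display's peak.

```python
def tone_map(nits: float,
             display_peak: float = 1000.0,   # what the TV can actually show
             master_peak: float = 4000.0,    # what the content was graded for
             knee_fraction: float = 0.75) -> float:
    """Illustrative soft-knee tone mapping (hypothetical, not any standard)."""
    knee = knee_fraction * display_peak
    if nits <= knee:
        return nits  # below the knee: shown as-is, since PQ values are absolute
    # Above the knee: compress [knee, master_peak] into [knee, display_peak]
    # with a simple rational roll-off that reaches display_peak exactly when
    # the input reaches master_peak.
    x = (nits - knee) / (master_peak - knee)
    return knee + (display_peak - knee) * (2.0 * x) / (x + 1.0)
```

Every vendor ships something in this spirit, but the knee position and the shape of the roll-off differ, which is exactly why two TVs can render the same 4,000-nit disc differently.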
Sticking to 108 nits in cinema now doesn't feel future-proof. Barco's 300-nit projector could run at 108 nits as well: the PQ format is absolute, unlike the old gamma curve, which was always relative to 100% brightness. HDR content calling for, say, 80 nits will be displayed at 80 nits on screen, not at a certain percentage of maximum brightness.
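That absolute mapping is visible in the ST 2084 (PQ) EOTF itself, sketched below with the constants from the spec: a normalized code value decodes to a fixed luminance in nits, with no reference to the display's peak.

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384            # 0.1593017578125
M2 = 2523 / 4096 * 128       # 78.84375
C1 = 3424 / 4096             # 0.8359375
C2 = 2413 / 4096 * 32       # 18.8515625
C3 = 2392 / 4096 * 32       # 18.6875

def pq_eotf(code_value: float) -> float:
    """Decode a PQ code value (0..1) to an absolute luminance in nits."""
    e = code_value ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1 / M1)

# e.g. pq_eotf(0.5) is roughly 92 nits, regardless of the display's peak
```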
So I feel we need a cinema standard with a higher figure to future-proof it, and then we need to agree on a way to make content look good at 300, 100, or 50 nits, possibly automatically. Yes, creators will not be happy to hear that.
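Reusing the hypothetical tone_map sketch from above, here is roughly what that automatic adaptation could look like for a single 1,000-nit mastered highlight:

```python
# How one 1000-nit mastered highlight would land on hypothetical
# 300, 100 and 50 nit cinema screens using the sketch above.
for peak in (300.0, 100.0, 50.0):
    mapped = tone_map(1000.0, display_peak=peak)
    print(f"{peak:>5.0f} nit screen -> {mapped:6.1f} nits")
# -> roughly 250, 85 and 42 nits respectively: lows pass through
#    untouched, highlights compress toward each screen's own peak.
```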
Still, in the consumer world, some content is mastered for 4,000 nits, even though there are NO consumer displays capable of showing that, whereas other content is mastered at 500 nits to make sure that even older OLED TVs can display it "as intended", without any weird tone mapping applied to the signal.