World-first acoustic transparent LED cinema screen debuts

  • #16
    I personally don't think we should set a relatively low bar now in 2024 and stick to it - 108 nits is pretty low; consumer HDR displays can easily reach 1,000 nits nowadays. The HDR standard goes up to 10,000 nits, and then each display does what it can.

    What happens when content is mastered at, say, 4000 nits (there are a few Blu-rays like that) and your TV can't handle it? Well, yes, it's a bit of a disaster: you have a "short blanket" and can only try to squeeze the signal into what you have with various "tone mapping" algorithms, which are not standardized, so each manufacturer does it differently. Dolby Vision is an attempt to standardize that, as the algorithm is decided by Dolby and not by the TV manufacturer.
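    To picture the "short blanket" problem, here is a minimal tone-mapping sketch - not any manufacturer's (or Dolby's) actual algorithm, just the general idea of rolling highlights off instead of hard-clipping them:

    ```python
    # Minimal illustrative tone-mapping curve: pass luminance through unchanged
    # up to a knee point, then smoothly compress everything above it toward the
    # display's peak. Hypothetical curve, for illustration only.

    def tone_map(nits_in, display_peak=1000.0, knee=0.75):
        """Map mastered luminance (nits) into a display's capability."""
        knee_nits = knee * display_peak            # below this, pass through as-is
        if nits_in <= knee_nits:
            return nits_in
        excess = nits_in - knee_nits
        headroom = display_peak - knee_nits
        # Asymptotically approach display_peak instead of clipping at it
        return knee_nits + headroom * (excess / (excess + headroom))

    print(round(tone_map(4000.0), 1))   # a 4000-nit highlight lands at ~982 nits
    print(round(tone_map(500.0), 1))    # 500 nits is below the knee: unchanged
    ```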

    Sticking to 108 in cinema now doesn't feel future-proof. Barco's 300 nit system could run 108 nits as well - the PQ format is absolute, not like the old gamma curve, which was always relative to 100% brightness. HDR content calling for, say, 80 nits will be displayed at 80 nits on screen, not at a certain percentage of maximum brightness.
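    For reference, the PQ curve (SMPTE ST 2084) really does encode absolute luminance; a quick sketch of the published EOTF shows how a code value decodes to nits regardless of the display:

    ```python
    # SMPTE ST 2084 (PQ) EOTF: a normalized code value in [0, 1] decodes to an
    # absolute luminance in cd/m2 (nits), independent of the display's peak.
    m1 = 2610 / 16384          # 0.1593017578125
    m2 = 2523 / 4096 * 128     # 78.84375
    c1 = 3424 / 4096           # 0.8359375
    c2 = 2413 / 4096 * 32      # 18.8515625
    c3 = 2392 / 4096 * 32      # 18.6875

    def pq_to_nits(code):
        """Decode a normalized PQ signal value (0..1) to absolute nits."""
        ep = code ** (1 / m2)
        y = max(ep - c1, 0.0) / (c2 - c3 * ep)
        return 10000.0 * y ** (1 / m1)

    print(round(pq_to_nits(0.508), 1))   # ~100 nits on any conforming display
    print(round(pq_to_nits(1.0), 1))     # 10000 nits, the top of the PQ range
    ```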

    So I feel we need a cinema standard with a higher figure to future-proof it, and then agree on a way to make content look good at 300, 100, or 50 nits, possibly automatically. Yes, creators will not be happy to hear that.

    Still, in the consumer world, there is content mastered for 4000 nits - even though there are NO displays capable of displaying that - whereas other content is mastered at 500nits to make sure that even older OLED TVs can display it "as intended" without any weird tone mapping applied to the signal.

    • #17
      108 nits sounds like the old standard for 70 mm presentation at the time of its introduction. It helped stay away from the 40 to 60 nit (14/16 fL) standards, which border on night vision for most people. A standard that should have been abandoned long ago.
      But anything above that is even nicer, in theory. We should also keep in mind that human vision has an extraordinary overall dynamic range, from grey-scale night vision to the brightest daylight, but the eye accommodates by stopping down. The band actually resolved within any one adaptation state is only around 5 f-stops. The dynamic range of a computer monitor is higher than that.
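      A rough sketch of the arithmetic behind that comparison (taking the ~5-stop figure above as given):

      ```python
      # Each f-stop doubles the light, so n stops of usable range within one
      # adaptation state corresponds to a simultaneous contrast of 2**n : 1.
      def stops_to_contrast(stops):
          return 2 ** stops

      print(stops_to_contrast(5))    # 32:1 resolved within a single adaptation state
      print(stops_to_contrast(10))   # ~1000:1, in the ballpark of a desktop monitor
      ```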
      There is a benefit in higher luminance, in the sense that as long as we stay away from the border of night vision, images look better. I do not see a true benefit in 1,000 nits or more. Yes, you can dazzle the viewer as if they were looking into the sun, but the real advantage is minimal. Unless you are displaying your image in a bright room; then it might make sense.
      Direct radiators can aim for that goal.
      Projection will be difficult; as mentioned, it requires the largest machines for a comparatively small image. With projection, the more light you send into the room, the more gets reflected back - at least when stadium seating is installed, the greasy faces quickly ruin your limiting contrast figure.

      300 nits should be more than ample in a dark room without stray light sources. For standard theatrical I would so much like to have 100 nits, like with Todd-AO, combined with medium-sized theatres. You don't need a super-large screen; you need to fill the field of vision with a well-sized, flat screen canvas. The result is a superior image, and it would only require moderate increases in projector size - increases which are manageable today.

      • #18
        We need to find that sweet spot where the patron becomes euphoric during the presentation from visual, audible and environmental perspectives. You have to get it so everyone is willing to pay yet again to experience it all over and over again. Yeah, push them into an addiction. No chance? Or maybe? Kind of like the fireworks display in the movie Coneheads. WTF is he talking about?

        It amounts to a weakest-link-in-the-chain phenomenon. Charge them painfully through the nose for everything and nothing else will overcome that. Put them in uncomfortable seats in a musty theater where the A/C has been off to save $3 and nothing will overcome that.

        Produce the film with too much dynamic range to where they cannot understand the dialog and have to recoil in pain during the action (or trailers) and nothing else will overcome that.

        I wonder how much screen size affects things. Ever notice how you can watch a movie on your phone and it seems to come to life and fill your whole consciousness? Have you been to an IMAX where the image is so large that you can see pixels, and the frame rate so low that you cannot possibly read or comfortably follow the credit roll?

        Then, how reliably can you repeat something? I can play Dark Side of the Moon on my surround system and one day I am whisked away. Another day I need that and go back to it, only to not have the same experience. What is that? Do I need to go back 4 hours and lead myself up to that moment ingesting all of the same chemicals?

        All of the new tech is cool, but is it necessary? Is the investment you made, thinking you had something out of this world, suddenly worthless now that you need this new thing? Certainly the inventors see the profit potential. But did they even think about the audience? Let alone you.

        That's me ranting again and, yeah, audio transparency was the argument against the LED flat screen. And it is absolutely just a big TV. I don't think it is the Holy Grail, or that if you pay for it, they will come.

        By the way, The Great Gig in the Sky is one awesome piece of music. Every rendition of it you can find on YouTube is worth the watch. So much emotion and so much is communicated without a single word. The band Atom does a good job with it.

        Maybe I need to take the temperature of the room and recalculate the speed of sound, re-calibrate right before every play to be sure to get to that moment where there is nothing else in the world but the performance?

        • #19
          Originally posted by Bruce Cloutier View Post
          Do I need to go back 4 hours and lead myself up to that moment ingesting all of the same chemicals?
          LOL, you want people to be wowed by the spectacle again? THAT is how you do it. Micro-dose the butter or the fountain drinks? ;-)

          I pretty much agree. Other than for specialized venues that trade in things such as premieres and the like... the cost-to-benefit math seems completely out of whack in the current declining cinema market. I don't feel any amount of new technology, short of something like a holodeck, is going to woo back the big audiences from the golden eras of film exhibition. These are just marketing feathers in cinemas' caps, to try to place themselves above their local competition... but in so many markets we've seen a dwindling down to one or no theatres... and the technology-feather approach falls apart in those markets.

          If you could snap your fingers and convert every existing cinema to 4K ATMOS and the latest in acoustically transparent LED wall tech, at no cost to the operators, MAYBE you'd have a fighting chance for technology to play a role in an imagined recovery... but the expensive, delayed, and piecemeal approach is unlikely to have any real impact on changing the market trends, and in fact might accelerate the demise of many who take the gamble but then the financials don't work out. The early adopters who can afford it might see a short term improvement, while they are the only players hosting such technology, but it seems unlikely to move the overall needle in the big picture scheme of things.

          In short, as much as some might want technology to be the solution, I don't think it plays the role they think it does anymore, relative to the see-it-in-a-theatre-or-not calculus.
          Last edited by Ryan Gallagher; 07-06-2024, 02:19 PM.

          • #20
            Speaking of technology... has anyone tried the big-screen experiences in the latest VR headsets? Apparently that is one thing they are actually good at. I've never tried it with a contemporary headset such as the Vision Pro.
            We think we have problems with consumer competition from TVs and home cinemas now; if the VR space becomes ubiquitous enough, that may be the real competition for the cinema experience at the end of the day.

            The one thing, for now, that cinema has in its pocket is that it is easier to do with friends. But untethered VR systems are getting there; you just need all your friends to have one, and to figure out sync so the open-style headphones don't pick up distracting sync issues from your buddy's headset.

            • #21
              Originally posted by Stefan Scholz View Post
              I do not see a true benefit in 1,000 nits or more. Yes, you can dazzle the viewer as if they were looking into the sun, but the real advantage is minimal. Unless you are displaying your image in a bright room; then it might make sense.
              In my opinion, image dynamics are like sound dynamics.

              Just because digital sound allows you to reach 105 dB on each channel doesn't mean you want to use it all at once, on every channel simultaneously!

              A great example of HDR that springs to mind is the first DUNE: there's a scene where weapons are being fired at night. The overall scene is dark but the weapons leave very bright (small) trails. The dynamic range is amazing; it does not blind you and still lets you see all the detail in the dark areas.
              Having a 1,000 nit full-screen white (which most displays can't do yet anyway, but just for conversation's sake) is pointless; as you say, your eyes will adapt and you won't see much difference from a 300 nit full-screen white.

              Same with sound: we all know those beautiful and powerful soundtracks which use their dynamics with care and which can be very loud but enjoyable - and those soundtracks which just sound constantly loud and annoying.

              So it's a matter of art here. Give the proper artist a wider palette (including brightness) and they'll come up with a masterpiece.
              Give it to the wrong person and you'll be blinded (and deafened!).

              I am confident a similar conversation was being had back in the '90s: "we don't need 105 dB, do you want to deafen us?"

              • #22
                This site is full of techs, but we need to apply a business person's perspective on this topic.

                From a cinema owner's point of view, the details of the upgraded PLF format are not important. Perception of quality and marketability are what they think about. That's a challenge left to the vendors who have released all these new better-than-DCI type projectors and sound systems in recent times.

                Also be mindful that DCI, or the studios, are totally against letting any one "proprietary" solution take over. Look at the lengths they went to in moving away from Atmos to IAB, which any vendor can now sell.
                I assure you they will make moves to ensure no specific EDR/HDR technology becomes the gatekeeper to cinema presentation, just as they did for immersive audio.

                In terms of 108 nits: the complexity here is large. Just because a system can do 108 nits does not mean its average luminance is anywhere near that. For example, from memory, the Barco light steering still targets a roughly 100 nit base image luminance; the steering is used to fill the less than 5% of the screen that carries specular highlights. How the other systems handle this I am not sure, but a similar approach may be used - they just don't have the specular highlights that give images a far more realistic appearance.
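                A quick back-of-the-envelope on that point (illustrative numbers only, not Barco's actual specifications):

                ```python
                # With light steering the total light budget is roughly fixed, so small
                # specular highlights are pushed well above the base level by "borrowing"
                # light from the rest of the frame. The average luminance barely moves.

                def average_luminance(base_nits, highlight_nits, highlight_fraction):
                    """Area-weighted average luminance of a frame with a bright highlight."""
                    return (highlight_fraction * highlight_nits
                            + (1.0 - highlight_fraction) * base_nits)

                # A 100-nit base image with 5% of the frame driven to 300 nits:
                print(average_luminance(100.0, 300.0, 0.05))   # 110.0 - nowhere near the peak
                ```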

                Cinema is not like domestic HDR. Yes, at home they can master a film on a 2,000 nit type display, but it is designed to "MAKE A HUGE COMPROMISE" in that it is tone mapped into the capabilities of whatever display is being used. Cinema was never expected to take such a compromised path. Instead, a common set of values that a display can meet is targeted, and the projectors are expected to reproduce those values, to guarantee the most accurate rendering of the producer's intent possible.

                At the end of the day, this is mostly about cinema owners competing in a compressed market. The technology takes second place to what they can afford and what gains traction with consumers, who, at the end of the day, decide for us by what they put their dollars behind.

                For example, VHS over Betamax. Or, more recently, Atmos speaker placement over Auro. Auro has much better speaker placement for reproducing the sound field, but Atmos was designed more as a consumer product that could be easily implemented in a home environment (up-firing speakers bouncing sound off the ceiling). That's why it won in consumer land: it was cheaper to implement in (crappy) bounce-sound-off-the-ceiling type setups. (It all comes back to money/costs/perception.)

                Realistically, I don’t mind, as an improvement is better than no improvement.

                As for the 300 nit target SMPTE has proposed: if Barco can meet it using its light-steering projection system, I don't see a 108 nit solution being approved by SMPTE. The industry will simply wait until the others catch up. Plus, emissive displays are already very much capable of 300 nits.

                • #23
                  Cinema is not like domestic HDR. Yes, at home they can master a film on a 2,000 nit type display, but it is designed to "MAKE A HUGE COMPROMISE" in that it is tone mapped into the capabilities of whatever display is being used. Cinema was never expected to take such a compromised path. Instead, a common set of values that a display can meet is targeted, and the projectors are expected to reproduce those values, to guarantee the most accurate rendering of the producer's intent possible.
                  I'm with you that Cinema should not be designed to make compromises at the design stage.

                  But how many cinemas are running their 2D at 14 fL? And how many are running their 3D at its intended brightness?
                  So maybe - open for debate - it would be time to realise that the world doesn't give a BLEEP about the standards, and to try to make the most of what that particular projector can offer? If the projector senses 5 fL, let it tweak the picture to make it look "better" with those 5 fL.

                  I know, it's never going to happen. But then, 14 fL is never going to happen either. I am also aware that if projectors were suddenly capable of doing that, someone would think, "then I don't need 14 fL, the projector will make do with 5."

                  Interesting about the 100 nits full screen and 300 nits on 5% of it - which is basically what's needed. OLED TVs have a similar limitation (which is getting better and better), as peak white is only available over about 10% of the screen. Full screen, an OLED TV can do 150-250 nits depending on the generation.

                  • #24
                    Originally posted by Ryan Gallagher View Post
                    Speaking of technology... has anyone tried the big-screen experiences in the latest VR headsets? Apparently that is one thing they are actually good at. I've never tried it with a contemporary headset such as the Vision Pro.
                    We think we have problems with consumer competition from TVs and home cinemas now; if the VR space becomes ubiquitous enough, that may be the real competition for the cinema experience at the end of the day.

                    The one thing, for now, that cinema has in its pocket is that it is easier to do with friends. But untethered VR systems are getting there; you just need all your friends to have one, and to figure out sync so the open-style headphones don't pick up distracting sync issues from your buddy's headset.
                    Tried it with Creature from the Black Lagoon in 3D a couple of months ago.

                    As expected, the 3D effect far exceeds anything I have seen elsewhere, with great depth and prominence - no TV, let alone a cinema screen, could achieve that.

                    Besides that, image quality in terms of detail was a bit lacking, for an IMAX-sized screen at least, as any other Blu-ray quality source would be. I guess you need something like UHD Blu-ray to obtain image quality comparable to a proper 2K DCP. Or a newer movie.

                    Personally, I wouldn't watch any movie longer than 90 minutes this way. It gets uncomfortable. And weird. It certainly doesn't feel like watching a movie in the movie theatre. I think VR should play to its strengths rather than artificially limit itself by imitating other experiences.
                    Last edited by Agnius Acus; 07-07-2024, 07:26 AM.

                    • #25
                      I've been on SMPTE standards committees for most of my 44-ish years in the industry. I've taken a break in the last few years due to time constraints, and I can tell you that there are misconceptions here. SMPTE standardizes existing practices much more than it sets standards to achieve. It doesn't pick winners and losers in technology; it sees what is being adopted and sets the standards around it.

                      Taking it back further: when SMPE was formed and you had 35mm film as a widely adopted medium, the idea was that if you standardized the perforations, image location, etc., then everyone who built their projectors and film to work with that could play the format with confidence that it would work. Now, there can be times where, as in the case of digital cinema, other industry agencies (e.g. DCI) push for standards to be adopted, but if you check, the standards started there. There is a fair amount of overlap between the people who work with the various organizations within the industry. If you look at earlier DCI documents, you'll see more definitions as to how things were to be, with respect to all aspects of digital cinema: how to encode, security, presentation. Then, one by one, the DCI document started to refer the reader to the relevant SMPTE Standards/Recommended Practices/Engineering Guidelines as those documents came into being, after the practices were already in place.

                      It is not the normal order of things for SMPTE to lead in creating technology, but to document the technology. That said, SMPTE does engage in improved methods of achieving the established standards (e.g. how to measure the light, which is one of the last ones I personally worked on). There are times where SMPTE will take input from various factions within the industry and propose a change to a standard that will negatively impact one or more parts of the industry, and there is time to give voice to the concerns (it is a slow-moving process). An example would be the "Scope" aperture for projection. It evolved, and one of the last changes was to reduce its width from .839" to .825" (again, the projection aperture). This allowed for a uniform width on all 35mm apertures and gave a little more breathing room for the various digital sound formats (particularly DTS), as well as keeping the optical sound doping from showing up at the edge of the image. That small change cost Panavision (and other camera manufacturers) quite a bit, because they had to change all of the ground glasses in their viewfinders for the reduced width. This sort of change is more the exception than the rule, and it was practice-driven (people would otherwise need to undercut their plates to avoid the garbage at the edge of the image).

                      I am confident a similar conversation was being had back in the '90s: "we don't need 105 dB, do you want to deafen us?"
                      I was there... you'd be incorrect. In fact, Dolby Digital was only 103 dB too. I don't believe SMPTE actually set an upper SPL limit. What had been established was the 85 dBC level, and regardless of whether the format was optical, magnetic or digital, dialog shouldn't play at a different level depending on the medium. The 85 dBC level, for optical, was referenced to 50% modulation. With optical, the limits are pretty well established, as you can't go over 100% modulation... there is no space there. Magnetic is a bit more of a wildcard. You still have a "reference level" that corresponds to 85 dBC, as well as the point where the Dolby NR circuits cease to affect the sound, but the medium and the recording level determine the potential maximum SPL. In 1983, the formulation of the mag tracks changed and the track width got a little wider (the head cores themselves, not the film or the space available on the film). With the improved magnetics, the reference level could correspond to 185 nWb/m and you could get about 20 dB of headroom from there. So the potential 105 dB(ish) was already established, but it was dependent on the oxides used and on your cinema processor's ability to carry the audio without clipping. Dolby had established (not SMPTE) a 300 mV bus level in their processors, so a processor had to handle 3 V in its signal path in the worst case... except on the outputs, where amplifiers expecting higher nominal voltages might tax some processors.

                      In any event, the digital sound formats really ran with the established magnetic sound format that they were replacing. We already had the 10dB (inband) boost for subwoofer. Furthermore, I don't believe SMPTE set a top end standard, just the reference level and a methodology for tuning the room (ST202).

                      As for IAB versus Atmos: that was really a Dolby decision, to make their system work in a generic fashion. SMPTE can't force a manufacturer to do something it doesn't want to. When DCinema was started, the "digital wars" of the '90s were an example of what everyone wanted to avoid, as exhibitors, for a time, had to own multiple systems or not play some titles in the optimal sound format (e.g. Universal was tied to DTS, Disney to Dolby, Columbia/TriStar to SDDS). And here we are again, with competing formats in DTS:X, Dolby Atmos, and Auro. The IAB compromise was a good one. I'm disappointed to see that companies like Barco want server companies to license acceptance of 3rd-party servers into their S4 projectors. That flies in the face of allowing the exhibitor to choose what brands/products they want to work with and still have interoperability. Somewhere, someone is paying something to someone just to make that happen, and that ultimately costs the exhibitor so a manufacturer can do "the right thing." Personally, I think equipment that has such a scheme should not be DCI certified. It wouldn't have been allowed in 2010... which is why all of the DLP projectors shared the same server slot, even when it was used for HDSDI or even just a blank.

                      In any event, 108 nits is an existing practice that has gained somewhat wide use. I see no reason it cannot be standardized so that other manufacturers can make equipment that works with movies mastered for it. I think 300 nits is a decent goal, and probably where emissive should start, but it should not be what projectors are aiming for at this stage. It sacrifices way too much money and screen size in the name of trying to meet an unproven customer demand.

                      • #26
                        Interesting discussion! I've always thought the 105 dBC max level is inaccurate. The 85 dBC level at about -20 dBFS (the SMPTE standard has a slightly different level due to the various methods of defining dBFS and also the change in level depending on bandwidth) is measured with pink noise that has an RMS level of about -20 dBFS and a crest factor of 12 dB. You can increase the level by 8 dB before the peaks start getting clipped. So the maximum undistorted level in an auditorium would be 93 dBC. You can get louder with a signal that has a lower crest factor, but 12 dB is fairly representative of real content.
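                        The same arithmetic, spelled out:

                        ```python
                        # Pink noise at -20 dBFS RMS is calibrated to 85 dBC in the room. With a
                        # 12 dB crest factor the peaks sit at -8 dBFS, so the RMS level can only
                        # rise 8 dB before those peaks reach full scale.
                        reference_spl   = 85.0    # dBC for -20 dBFS RMS pink noise
                        pink_rms_dbfs   = -20.0
                        crest_factor_db = 12.0

                        peak_dbfs         = pink_rms_dbfs + crest_factor_db    # -8 dBFS
                        headroom_db       = 0.0 - peak_dbfs                    # 8 dB to clipping
                        max_unclipped_spl = reference_spl + headroom_db        # 93 dBC

                        print(peak_dbfs, headroom_db, max_unclipped_spl)       # -8.0 8.0 93.0
                        ```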

                        Getting agreement on establishing or revising SMPTE standards can be a challenge. I was chair of the subtitle group for a while. We were trying to get accurate display of Japanese vertical subtitles. A font file contains vertical metrics that would allow for accurate display. However, font compressors delete the vertical metrics, so subtitle rendering systems use horizontal metrics, such as the horizontal baseline, to align vertical text. Another version of the subtitle standard uses the bounding box of the glyph for vertical alignment. This results in a Japanese 111, set with the ichi character (a single horizontal line), being displayed like the character for three (three horizontal lines stacked together) instead of being spaced properly as 111. I TRIED to get the group to use the vertical metrics in the font file, but, instead, the group decided to leave things as they were and to write an authoring recommended practice to get around these issues.

                        On IAB, the committee spent time trying to take the best features of each of the competing systems. However, the committee was directed to base the standard on Dolby Atmos. The standard is based on a submission by Dolby with a few minor additions (there are some additions from Auro, but these may not be implemented in the field). I suggested a lot of editorial changes to improve clarity so it could be properly implemented by someone without inside knowledge. I also corrected a few errors and inconsistencies. The document from Dolby was "forward looking" and includes a lot of features not implemented by anyone, including Dolby. ISDCF ran tests to see what features were actually implemented and established Profile 1, which became RDD 57, SMPTE ST 2098-2 Immersive Audio Bitstream and Packaging Constraints: IAB. Note that an RDD is not a standard but a "Registered Disclosure Document." Anyone can submit an RDD. It is a "this is how we do things" document... if you want your equipment to work with ours, use this document.
                        Last edited by Harold Hallikainen; 07-07-2024, 01:21 PM.

                        • #27
                          In any event, 108 nits is an existing practice that has gained somewhat wide use. I see no reason it cannot be standardized so that other manufacturers can make equipment that works with movies mastered for it. I think 300 nits is a decent goal, and probably where emissive should start, but it should not be what projectors are aiming for at this stage. It sacrifices way too much money and screen size in the name of trying to meet an unproven customer demand.
                          I think the HDR standard is already there and it's scalable as PQ uses absolute values and not relative.

                          So even if company XYZ comes up with a 2000nits projector, a 108nit DCP would still be displayed at 108nits.

                          I just don't think we should end up with a special standard for the cinema which caps everything at 108 just because right now the only available HDR content uses 108.

                          • #28
                            As screen luminance increases, are we going to have a problem with flicker fusion frequency? Can it be dealt with by increasing the number of flashes per frame, or do we need to go to a higher frame rate (let's make Cinema like TV!)?
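                            For what it's worth, the flash-rate side of that question is simple arithmetic (the actual flicker thresholds depend on luminance and on the viewer, so the comments below are only rough guides):

                            ```python
                            # DLP cinema projectors typically show each frame several times per frame
                            # period ("double/triple flash"). Since the flicker fusion threshold rises
                            # with luminance, a brighter screen needs a higher effective flash rate.
                            def flash_rate_hz(fps, flashes_per_frame):
                                return fps * flashes_per_frame

                            print(flash_rate_hz(24, 2))   # 48 Hz - likely marginal at high luminance
                            print(flash_rate_hz(24, 3))   # 72 Hz - the usual triple flash for 24 fps
                            print(flash_rate_hz(48, 2))   # 96 Hz - higher frame rates push it further
                            ```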

                            • #29
                              I think all HDR studies demand a higher frame rate to overcome that issue. Most modern high contrast TVs have motion interpolation activated by default.

                              • #30
                                Marco, it isn't about "capping" the standard at 108 nits... it is about establishing a standard at 108 nits. We have a standard for 24 fps, but that certainly is not a cap, as we have been experimenting with higher frame rates since the 1950s. How many titles have been mastered in Dolby Vision? It's more than a passing fad, and if it were standardized such that others could make competing equipment, it would drive costs down as well as increase the perceived quality of the industry. As I mentioned before, 300 nits is a fine goal, but it is one that I think is unachievable in 2024 with a large screen in a for-profit theatre, unless you count what would likely be a fraction of 1% of theatres.

                                Honestly, 108 nits... which would more than double the light of existing systems (presuming they are at standard now)... will significantly increase the cost of presentation, even without the HDR equipment to get beyond 2000-3000:1 contrast. If you are running 15K lumens, you'd have to jump to over 30K lumens... check out the price difference on those two projectors. It is significantly more than double. Add in the higher contrast and you are likely at 4X or more the cost. This isn't a low-cost proposition, even at 108 nits, but I could see people wanting a deluxe "X" theatre in their complex where they might be willing to put in a 108 nit system.
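                                Rough screen arithmetic behind that "more than double" figure, assuming a hypothetical screen size, a 1.0-gain matte screen, and no port-glass or lens losses:

                                ```python
                                import math

                                # Luminance from a projector on a matte screen:
                                #   nits = lumens * gain / (pi * area_m2)
                                # Solving for lumens shows how the light requirement scales with the target.
                                def required_lumens(target_nits, width_m, height_m, gain=1.0):
                                    area = width_m * height_m
                                    return target_nits * math.pi * area / gain

                                w, h = 12.0, 5.0                      # a hypothetical ~12 m wide scope screen
                                for nits in (48, 108, 300):           # 48 nits is roughly the 14 fL SDR spec
                                    print(nits, round(required_lumens(nits, w, h)))
                                # 48  -> ~9,000 lumens
                                # 108 -> ~20,400 lumens (about 2.25x the light)
                                # 300 -> ~56,500 lumens
                                ```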
