Topic: Newsweek article on Digital Projection
Paul Linfesty
Phenomenal Film Handler
Posts: 1383
From: Bakersfield, CA, USA
Registered: Nov 1999
posted 03-27-2001 08:33 AM
At last, a mainstream news magazine that doesn't think digital projection (in its current form) is ready for prime time! This link is from the current issue of Newsweek: http://www.msnbc.com/news/549585.asp
Scott Norwood
Film God
Posts: 8146
From: Boston, MA. USA (1774.21 miles northeast of Dallas)
Registered: Jun 99
posted 03-27-2001 10:20 AM
But they still got the facts wrong. First, they didn't distinguish between the capture format (film or electronic) and the presentation format (film or electronic), which don't need to be the same thing. Second, they fell into the trap of "digital is cheaper," which, at film resolution, it isn't. For about $30k, one can shoot a feature on 16mm, which has roughly the same resolution as HDTV when shot with slower film and good lenses. That compares _very_ favorably with renting HDTV cameras and editing equipment; plus, you end up with a 16mm print that can be projected, or a negative that can be blown up to 35mm for $10-20k, rather than an HDTV tape that would cost about $40k+ to transfer to film at 4MC or similar.
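A quick sketch of that cost comparison, using only the ballpark figures above (all dollar amounts are the rough estimates in the post, not independent data; the variable names are illustrative):

```python
# Rough cost comparison using the ballpark figures from the post above.
# All dollar amounts are the poster's estimates, not independent data.
film_16mm_shoot = 30_000          # shoot a feature on 16mm
film_blowup_to_35mm = 20_000      # high end of the $10-20k blow-up estimate
hdtv_transfer_to_film = 40_000    # tape-to-film transfer alone (before camera/edit rental)

print("16mm route (shoot + 35mm blow-up): $", film_16mm_shoot + film_blowup_to_35mm)
print("HDTV route, transfer alone:        $", hdtv_transfer_to_film)
```

Even before adding HDTV camera and editing rental, the transfer cost alone is in the same ballpark as the entire 16mm shoot-plus-blow-up budget.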
Aaron Haney
Master Film Handler
Posts: 265
From: Cupertino, CA, USA
Registered: Jan 2001
posted 03-28-2001 12:17 AM
I think the notion of the brain having to do extra work to construct an image based on the pixels from a digital projector will only be true if the resolution is too low. At a high enough resolution, that problem will go away. Here is the relevant quote:

quote: Then there’s the matter of our eyes, which are just not used to gazing at pixellated data on a large screen. “Your brain has to work a little harder to construct the image,” asserts cinematographer Allen Daviau, whose credits include “Empire of the Sun” and “E.T.”
That problem can be solved: if the projector has high enough resolution, the images won't look pixelated. Of course the question is, will we actually get to see that kind of resolution in theaters, or will we have to put up with "good enough"? At the moment it sounds like the answer is, sadly, the latter.

That was a strange article. You're right, Scott: there was absolutely no distinction made between shooting and projecting, even though the two are completely independent of each other. The article also talked about digital images looking "flat" and "two-dimensional" -- which is just a complete non sequitur. Both film images and digital images are going to look that way ... because they're both 2D media! Duh. I don't know what the author was thinking when he wrote that.

On the other hand, I was extremely pleased to see this statement in print:

quote: The technology has also not yet overcome large gaps in resolution—the amount of picture information carried by digital pixels is still a fraction of what is contained on a frame of 35mm film.
It's about time somebody got around to putting that into a news article. I'm sick of all the hype about how if something's "digital," it just has to be better. If it doesn't have enough pixels, it's not better! I'm very glad this article was willing to point that out.

(And for anyone reading who thinks the extra resolution of film is somehow negated by the limitations of projectors and lenses, I would direct your attention to another thread in which John Pytlak explained how, in Kodak's screening rooms, they are able to measure a resolution of 80 line pairs per millimeter even when the film is being run through a projector as normal. For a standard 35mm frame area, that works out to nearly 4K resolution. Put that in your digital projector and smoke it.)

And let's not forget that Lucas shot Episode II in 1920x1080 HDTV, while the DLP-based projectors Technicolor is talking about installing in 1000 theaters for free (as part of their roll-out plan) have a resolution of just 1280x1024. So those 1000 theaters won't even be able to show the full resolution of Lucas's movie, but theaters showing 35mm prints will, because film can hold every last pixel of an HDTV image (and more). After the digital data has been output to film with a laser recorder, the movie will actually look better that way than through an electronic projector that doesn't have enough resolution. Unless theaters start installing digital projectors capable of showing full HDTV resolution, the best way to watch Episode II will be on film, not electronically.

------------------
Aaron Haney
Professional Complainer
Apple Computer, Inc.
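As a rough sanity check on the resolution figures in the post above (the 80 lp/mm number is the Kodak screening-room measurement it cites; the ~24 mm frame width is an assumed nominal full-aperture value, and the snippet is only a sketch):

```python
# Back-of-the-envelope: convert a line-pairs-per-millimeter measurement into
# an equivalent horizontal pixel count, then compare with the projector specs
# mentioned in the post. The 24 mm frame width is an assumed nominal value.
line_pairs_per_mm = 80            # measured through a projector, per the post
frame_width_mm = 24               # approximate full-aperture 35mm frame width (assumption)
pixels_per_line_pair = 2          # one light + one dark pixel per resolvable pair

film_horizontal_pixels = line_pairs_per_mm * pixels_per_line_pair * frame_width_mm
print(film_horizontal_pixels)     # 3840 -- i.e. roughly "4K"

hdtv = (1920, 1080)               # Episode II capture resolution
dlp = (1280, 1024)                # resolution cited for the planned DLP installs
print(hdtv[0] / dlp[0])           # 1.5 -- the capture has 1.5x the projector's horizontal pixels
```

Actual 35mm apertures vary (the projectable Academy area is narrower than full aperture), so 3840 should be read as an order-of-magnitude figure rather than a spec.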
Evans A Criswell
Phenomenal Film Handler
Posts: 1579
From: Huntsville, AL, USA
Registered: Mar 2000
posted 03-30-2001 10:29 AM
Paul, it is not the frame rate that will make the brain work harder. It is the spatial quantization, not the temporal quantization, that will make the brain work harder in this case.

When detail in an image approaches the resolution of film, degradation of the detail takes place in a way that the brain handles easily. Since the grain is randomly distributed on film, and the grain is different in every frame, the brain is good at integrating (averaging) out the grain over several frames, which can allow details captured poorly by the film to still be easily discernible. If something in the image is not moving, the frame-to-frame differences in the way the object gets "rendered" on film can actually allow more detail to be seen than any single frame captures. Interestingly, studies have determined that the human eye and brain can more easily extract detail from images that have a slight bit of noise present! Film grain may actually aid the human visual system in seeing detail more easily in an image.

With a digital system, the pixels are in the same locations in every frame, so still objects whose details approach the sampling limitations of the device get rendered the same way, with the same artifacts (lack of detail or aliasing artifacts) in every frame. With no variance in the rendering, there is nothing for the brain to average to get more detail out of the picture or to reduce the artifacts. This is compounded by the digital device having a much lower resolution than film.

The brain is quite good at extracting detail from low-resolution images, but it takes work, and having to do that sort of work for the length of a movie may cause headaches in many people. Distractions during a feature cause the brain to have to do more work to watch a movie, and being distracted by some attribute of the presentation, such as noticing rows and/or columns of pixels, or aliasing artifacts or beat patterns produced by the use of uniform spatial quantization, will make the brain work harder.

Evans A Criswell
Huntsville-Decatur Movie Theatre Info Site
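A minimal numerical sketch of that averaging argument (not from the thread; the noise level and frame count are arbitrary illustrative values):

```python
# Illustrates why frame-to-frame random grain can average out over time while a
# fixed-pattern sampling artifact (same pixel grid every frame) cannot.
import numpy as np

rng = np.random.default_rng(0)
true_detail = np.sin(np.linspace(0, 20 * np.pi, 1000))   # fine detail in the scene

# "Film": the grain is different in every frame.
film_frames = [true_detail + rng.normal(0, 0.5, true_detail.size) for _ in range(24)]

# "Digital": the same rendering error repeats in every frame.
fixed_error = rng.normal(0, 0.5, true_detail.size)
digital_frames = [true_detail + fixed_error for _ in range(24)]

film_avg = np.mean(film_frames, axis=0)        # grain integrates out over time
digital_avg = np.mean(digital_frames, axis=0)  # the artifact survives averaging

print("film residual error:   ", float(np.abs(film_avg - true_detail).mean()))
print("digital residual error:", float(np.abs(digital_avg - true_detail).mean()))
```

Averaging 24 frames knocks the random grain down by roughly a factor of five (√24 ≈ 4.9), while the fixed-pattern error is unchanged, which is the point about the brain having nothing left to integrate.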
John Pytlak
Film God
Posts: 9987
From: Rochester, NY 14650-1922
Registered: Jan 2000
posted 04-02-2001 07:49 AM
"Film Done Right" will stay clean and scratch free for hundreds or thousands of runs. The Disney theme parks run 35mm and 70mm KODAK ESTAR prints tens of thousands of passes. I have a sample of a 70mm print that was shown at EPCOT 60,310 times! Yes, if you "do film wrong" you can scratch a print on the first pass. But you can also trash a DVD by handling it improperly, scratch or stretch a D5 tape on poorly maintained equipment, or suffer a head crash on a hard drive. Whether it's film or digital, it still takes skilled and caring people and well-maintained equipment to avoid problems. ------------------ John P. Pytlak, Senior Technical Specialist Worldwide Technical Services, Entertainment Imaging Eastman Kodak Company Research Labs, Building 69, Room 7419 Rochester, New York, 14650-1922 USA Tel: 716-477-5325 Cell: 716-781-4036 Fax: 716-722-7243 E-Mail: john.pytlak@kodak.com Web site: http://www.kodak.com/go/motion
Evans A Criswell
Phenomenal Film Handler
Posts: 1579
From: Huntsville, AL, USA
Registered: Mar 2000
posted 04-04-2001 09:43 AM
John, you've made a very good point. Most people think that digital will last forever because accessing the data does not degrade it (like playing a CD or DVD over and over, as opposed to playing a vinyl record or running film through a projector repeatedly). What people don't think about are the physical devices that store and retrieve that string of numbers that makes up the song or movie. Physical devices, regardless of whether they're storing analog signals or digital information, can fail, and it is very important to have backups. If the data is preserved and copied to new media as old media start to go bad, then digital seems to be the way to go for archival.

What many people don't realize is how fast the standards for encoding and decoding digital information change. What is really scary about the copy-protection and encryption craze is that many years from now, people may be able to read the string of numbers from some digital medium, but not be able to make sense of what the numbers represent, because the data was encrypted and the key was lost, the algorithm has been lost, or some piece of proprietary hardware necessary for the decryption is no longer made and none are available or in working condition.

People generally cannot physically observe digital media and tell what's on them. With a vinyl record, it doesn't take too much mechanical ability to figure out that the vibrations of the groove are a fairly direct encoding of the recorded sound. With a piece of movie film, it doesn't take too much insight to just look through the film to see the recorded image and figure out what's going on, or shine light through it to make a bigger version. Suppose someone finds a CD or DVD many years from now but no players are available. They might have a chance at figuring out the CD if they rig up a device to extract the bit string, but the encrypted information on the DVD would probably be hopeless to figure out.

If all of our analog media go away and everything becomes digital, I believe a lot of our history and knowledge (including movies and songs) will be lost if we're not careful.

Evans
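A tiny illustration of the "readable bits, lost key" worry (a sketch only; the XOR keystream below stands in for whatever real scheme a disc or drive might use):

```python
# Even a trivially simple cipher makes intact, perfectly readable bytes
# meaningless once the key is gone. Real disc/tape encryption is far stronger,
# so recovery without the key or the original hardware is even less plausible.
import os

key = os.urandom(16)                               # imagine this is lost a century from now
frame_note = b"reel 1, frame 00001: opening shot"
ciphertext = bytes(b ^ key[i % len(key)] for i, b in enumerate(frame_note))

# An archivist can still read every bit off the medium...
print(ciphertext.hex())
# ...but without the key (or knowledge of the scheme) the bytes carry no
# recoverable picture, unlike holding a film frame up to the light.
```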
Peter Berrett
Jedi Master Film Handler
Posts: 602
From: Victoria, Australia
Registered: Nov 2000
posted 04-16-2001 08:11 AM
Your analysis was impressive, Evans. May I ask that you extrapolate your analysis a little further, please? How does the brain perceive TV, by comparison with film and digital projection at theatres? Your analysis concludes that digital projection is harder to watch than film at 24 fps, but how does TV compare? Here in Australia we have PAL colour, which I understand uses 625 scan lines. Your NTSC system is similar but uses fewer lines. Is TV harder to watch than film, or vice versa? Are you more likely to get headaches watching TV?

Thanks
Peter