This topic comprises 2 pages: 1 2
Topic: Film resolution vs. HDTV
John Pytlak
Film God
Posts: 9987
From: Rochester, NY 14650-1922
Registered: Jan 2000
posted 12-06-2000 06:59 AM
Yes, 4K is shorthand for 4096 lines of horizontal resolution (2048 line pairs).

Note that the Kodak Lightning digital film scanner and recorder digitizes a 4-perf "Academy" frame in two seconds, at a resolution of 3656 x 2664 pixels per color. For comparison, the current DLP-Cinema digital projectors use 1280 x 1024 pixels, less than many laptop computer displays.

------------------
John P. Pytlak, Senior Technical Specialist
Worldwide Technical Services, Entertainment Imaging
Eastman Kodak Company
Research Labs, Building 69, Room 7419
Rochester, New York, 14650-1922 USA
Tel: 716-477-5325 Fax: 716-722-7243
E-Mail: john.pytlak@kodak.com
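The pixel counts quoted above can be compared directly. A quick sketch (Python, using only the figures from the post; this ignores color depth, optics, and everything else that matters):

```python
# Per-color sample counts from the post: Kodak Lightning scanner on a
# 4-perf Academy frame vs. a DLP-Cinema projector of the time.
scanner = 3656 * 2664   # ~9.7 million pixels per color
dlp = 1280 * 1024       # ~1.3 million pixels

print(f"Scanner: {scanner / 1e6:.1f} Mpixels/color")
print(f"DLP:     {dlp / 1e6:.1f} Mpixels")
print(f"Ratio:   {scanner / dlp:.1f}x")
```

So the scanner captures roughly seven times as many samples per color as the projector can display.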
Jerry Chase
Phenomenal Film Handler
Posts: 1068
From: Margate, FL, USA
Registered: Nov 2000
posted 12-06-2000 08:04 AM
John, refresh my memory? Was that comparison of the actual image when viewed as a still frame? I ask because I've seen a similar figure used for still-camera comparisons.

The moving image in motion picture film can suffer from the film moving back and forth along the lens axis during projection from the heat of the lamp, which degrades the quality below the figures given for still images.

I think there is a lot going on at a subjective level in viewing motion picture film. Among other things, the flicker rate of various presentation systems has long been suggested to have an "entraining" or trance-inducing effect. Those subjective issues and past experiences affect people's perception, sometimes in an irrational way.

To give an idea of the actual overall image quality of projected film, I offer this easily verifiable observation: there is no way anyone can convince me that matte lines invisible in the projected film image can suddenly appear and be obvious on a television dub, unless the projected image is somehow more forgiving (and hence in some ways a less accurate representation) than the TV image. We all know how poor the quality of TV is. That doesn't mean I enjoy the projected film image any less; it just means I take the religious wars and frame-by-frame comparisons of image quality a lot less seriously.

To my own eyes, which have to wear glasses, the current crop of digital projection appears comparable to the average 35mm presentation of a clean print. Yes, there are differences, but the digital is clearly superior to any 16mm I have ever seen, even though by the stated figures that should be the fairer comparison. Give me an 11 x 14 print that I can view close up, though, and I'll spot the differences every time, even at 4 megapixels.
John Pytlak
Film God
Posts: 9987
From: Rochester, NY 14650-1922
Registered: Jan 2000
posted 12-06-2000 08:51 AM
Jerry: The figures are for the image on the film. Obviously, if you have poor printing or projection, some sharpness will be lost.

With care, most theatres can achieve 68 line pairs per millimetre of resolution, as measured by the SMPTE 35-PA (RP 40) test film. For the normal 0.825 x 0.690 inch (20.96 x 17.53 mm) projectable area of the scope format, this translates to 1425 x 1192 line pairs, or 2850 x 2384 lines -- still not too shabby. Even for 1.85:1 flat (0.825 x 0.446 inches, 20.96 x 11.33 mm), 68 lp/mm translates to 1425 x 770 line pairs, or 2850 x 1540 lines.

My personal experience in viewing digitally projected images is that the lack of sharpness and the pixel structure become objectionable at viewing distances closer than about 2 screen heights. IMHO, I can sit much closer to a well-projected 35mm film image. And most modern theatre designs feature "close up" seating with large screens.

One reason matte lines are sometimes more visible in video is the large amount of "edge enhancement" sometimes used to give the impression of sharpness in video. This sometimes causes objectionable "ringing" of edges, as seen in the recent DVD of "The Sound of Music". No way does the DVD compare to a film presentation for true sharpness, but the video has "sharpened" edges, which make it seem sharp from a distance, but unsharp and "edgy" when viewed closer.
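The lp/mm-to-lines arithmetic above is easy to reproduce. A small sketch using only the apertures and the 68 lp/mm figure from the post (1 line pair = 2 lines):

```python
# Line counts implied by 68 lp/mm over the SMPTE projectable areas
# quoted in the post.
LP_PER_MM = 68

def lines(width_mm, height_mm, lp_per_mm=LP_PER_MM):
    """Return (horizontal, vertical) line counts for a projectable area."""
    return round(width_mm * lp_per_mm) * 2, round(height_mm * lp_per_mm) * 2

scope = lines(20.96, 17.53)  # "scope" aperture, 0.825 x 0.690 in
flat = lines(20.96, 11.33)   # 1.85:1 "flat" aperture, 0.825 x 0.446 in
print(scope)  # (2850, 2384)
print(flat)   # (2850, 1540)
```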
John Pytlak
Film God
Posts: 9987
From: Rochester, NY 14650-1922
Registered: Jan 2000
posted 12-06-2000 10:51 AM
Rory: If you read the information on the Kodak website, Kodak is part of the "digital revolution" too. But IMHO, today's Digital Cinema is "not ready for prime time" because it doesn't match the quality of "Film Done Right" and is not yet cost-effective. I agree that Digital Cinema can be better than film, but it's not there yet.

My real worry is that the same "cut cost to the bone" mentality that hurts film presentation will lead to mediocre digital presentation in many theatres.
John Schulien
Expert Film Handler
Posts: 206
From: Chicago, IL, USA
Registered: Nov 1999
posted 12-06-2000 05:10 PM
Here's an interesting article about a new digital projection technology called Grating Light Valve that appears to address many of the shortcomings of DLP: http://www.e-town.com/news/article.jhtml?articleID=3772

There's better information at the company web site: www.siliconlight.com -- if you click on "GLV Technology" and go to the bottom of the page, there are some good white papers which describe the technology.

The device uses a 1D linear array of micromechanical ribbon mirrors. Ordinarily, the ribbons reflect, but each ribbon can be electrostatically deflected. When alternate ribbons are deflected, that portion of the device becomes a diffraction grating. The device produces a single line, which is mechanically scanned horizontally, creating a 2D picture.

Some of their claims:

o Up to 30,000 lumens. (It's made out of mirrors, so it can handle a lot of energy.)
o 4000:1 contrast range (compared to 1000:1 for film).
o Elimination of pixelation.
o The system can compensate for a burned-out pixel element.
o Because the image is scanned horizontally, only a single row of pixels is needed, so the HDTV prototype has only 1080 pixels, instead of the 2 million plus needed in a 2D array. This means that this technology could easily be scaled up to the 4K or so that would be needed to surpass the grain resolution of film.
o And just in case Hollywood needed a damn good reason to adopt this technology, this article points out: http://www.e-town.com/columns/features.jhtml?articleID=1199 that because the technology allows such fast switching times, each frame can be overscanned up to six times, and apparently this makes it impossible to videotape (you get vertical bars running through your picture), so the image is camcorder-proof!

Interesting reading.

- John
Evans A Criswell
Phenomenal Film Handler
Posts: 1579
From: Huntsville, AL, USA
Registered: Mar 2000
posted 12-07-2000 08:43 AM
About John's comment on videotaping movies: I don't see why anyone would go to the trouble to videotape a movie from a projection screen. I have heard of people trying to take camcorders into theatres to do it, but to me, that's as ridiculous as trying to record music by putting a microphone in front of a speaker. There is a tremendous amount of degradation involved, and I don't see how anyone could come up with a sellable product that way.

1. The camera would have to be absolutely still.
2. Anyone getting up between the camcorder and the screen would create a silhouette in the image.
3. Perspective distortion would be present in the image.
4. The image would be of extremely low quality anyway.
5. In most regular theatres using film, wouldn't the 48 Hz shutter rate interfere with the 30 frames per second (60 fields per second) operation of the camcorder?

Back to the topic of HDTV vs. film: it is difficult to compare resolution figures between digital images and film images (or even analog videotape). One medium may have somewhat lower resolution but lower noise than another medium with higher resolution and greater noise. It would be interesting to take HDTV and film, compare the noise present in an image, and then compare a film image taken with a pixel pattern at that same resolution and see how much "noise" there is in the film version.
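On question 5 in the list above: the simplest model is a beat frequency between the shutter rate and the camcorder's field rate. This sketch is only that simplified model (it ignores shutter duty cycle and sensor integration, which affect what the artifact actually looks like):

```python
# Beat-frequency intuition for a 48 Hz projector shutter filmed by a
# 60 fields/s NTSC camcorder: the recording flickers at the difference
# of the two rates.
shutter_hz = 48   # two-blade shutter at 24 fps
fields_hz = 60    # NTSC camcorder field rate

beat_hz = abs(fields_hz - shutter_hz)
print(f"{beat_hz} Hz flicker in the recording")  # 12 Hz
```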
As I understand it, with film there are grains of a certain density, and the grains can be considered to be randomly distributed on the medium (correct me if I'm wrong, John). Comparing film with digital imaging, to me, seems very similar to comparing non-uniform sampling to uniform sampling.

Suppose I sample a scene with 1000000 pixels. In the first case, I sample in such a manner that my samples are arranged in a uniform grid of 1000 by 1000. In the second case, I sample with 1000000 pixels, but those 1000000 sample locations are randomly distributed throughout the scene. Theoretically, both cases have the same "resolution" and contain the same amount of information, but in the second case I cannot really say that the resolution of my sampling is 1000 by 1000 like I can for the first case, although I can say that the resolution is 1000000 pixels. It would take more resolution than 1000 by 1000 to produce a good reproduction of the second case, and in fact, no uniform x by x sampling will ever exactly reproduce the second case. (No digital scanning device will ever be able to produce a non-loss scanning of film.)

Look at it the other way: say I start with a uniform 1000 by 1000 sampling and I want to capture it on a non-uniform medium. It would take more than 1000000 randomly distributed "grains" or samples to capture that information.

There are also the color-depth issues in the film vs. video debate. In short, I think comparing film and digital resolutions is like comparing apples and oranges.

Evans
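The uniform-vs-random sampling point can be illustrated numerically in one dimension: N uniform samples cover an interval with a maximum gap of exactly 1/N, while N randomly placed samples almost always leave some gap several times larger, so a uniform grid needs extra resolution to guarantee it lands on every randomly placed "grain". This is only a toy illustration of that coverage argument, not a model of film:

```python
import random

# Compare coverage of [0, 1) by N uniform vs. N random sample positions.
random.seed(1)
N = 100_000

uniform = [i / N for i in range(N)]
randomly = sorted(random.random() for _ in range(N))

def max_gap(points):
    """Largest distance between consecutive sorted sample positions."""
    return max(b - a for a, b in zip(points, points[1:]))

print(f"uniform max gap: {max_gap(uniform):.2e}")   # exactly 1/N
print(f"random  max gap: {max_gap(randomly):.2e}")  # several times 1/N
```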
John Pytlak
Film God
Posts: 9987
From: Rochester, NY 14650-1922
Registered: Jan 2000
posted 12-07-2000 09:54 AM
Paul Mayer said:

> The SMPTE Hollywood Section had a meeting at the lab about a month ago and we got to see
> the current DLP imager in action on a 50' screen (clips from Braveheart and American
> Beauty). At least they got rid of the scalloped fixed pattern noise I saw during Phantom
> Menace a year ago.

I was at that October 10 meeting too, sitting in about the fifth row, screen center. From that viewing distance, the pixel structure was objectionably noticeable to me. Sorry we didn't get to say "hello" in person.
John Schulien
Expert Film Handler
Posts: 206
From: Chicago, IL, USA
Registered: Nov 1999
posted 12-07-2000 11:59 AM
Evans A Criswell makes some excellent points, and asks:

> I don't see why anyone would go to the trouble to videotape a movie from a
> projection screen. I have heard of people trying to take camcorders into theatres to do
> it, but to me, that's as ridiculous as trying to record music by putting a
> microphone in front of a speaker in order to record music. There is a tremendous amount of
> degradation involved and I don't see that anyone could come up with a sellable product
> that way.

I completely agree. It seems insane, yet people do it. Yes, the video quality is terrible; it's not done for the quality. Basically it's organized crime. These are professional operations that pay to have slick, four-color packaging made to closely resemble legitimate studio product. Then they mass-duplicate the inferior video and sell it through black-market channels. I've seen copies turn up at flea markets, in the hands of street vendors, etc. A coworker of mine regularly travels to Taiwan, and apparently everything in American theatres is also available there in cheap knock-off VCDs. Illegal, reprehensible, indefensible, and exactly the sort of thing that copyright law was designed to protect against.

Back to HDTV vs. film ... I mostly agree with your discussion of uniform vs. non-uniform sampling, but I have to question the statement that "(No digital scanning device will ever be able to produce a non-loss scanning of film)." If you scan a frame of film at sufficient resolution, your scan will begin to expose the grain structure itself. According to this Kodak page: http://www.kodak.com/country/US/en/motion/support/h1/exposure.shtml#structure the silver particles in developed film range in size from about 0.0002 to 0.002 mm.
Given a frame height of 11.33 mm for a 1.85:1 aspect ratio, what sort of resolution do we need so that a pixel is approximately the same size as a large grain particle?

11.33 mm x 1 grain / 0.002 mm = 5665 grains

Meaning that at a vertical resolution of 5665 lines, each pixel would be about the size of a large film grain particle. This is roughly in line with claims that 4096-pixel resolution is required to fully capture the information in a 35mm film frame. Anything more and you're not increasing the image resolution -- instead you're sharpening and defining the appearance of the individual grain particles. In effect, you're oversampling the image: using a higher-resolution uniform sampling to record a lower-resolution non-uniform sampling. At some point, you're going to lose the ability to detect the difference between film and a digital scan of that film at high enough resolution to capture the grain structure.

According to this page: http://www.cs.nyu.edu/visual/home/faq.html

> The resolution at the center of the retina is about 1 arcminute under ideal conditions,
> and this falls off rapidly outside the central 2 degree region. At 10 degrees
> eccentricity, the resolution is about 10 arcminutes. A high resolution display with
> 1280 pixels across a 60 degree field of view achieves about 3 arcminutes per pixel.

which implies that when you start exceeding 3-4K resolution, there is a serious question as to whether the human eye can even perceive the difference.

I'm really at a loss as to the color-depth issue. I've never seen any comparisons. It's a serious question, and I have no answer.

But here's the basic point I'm trying to make. Whether or not this particular GLV technology can be effectively brought to market as a digital projector:

o Because the imaging element is one-dimensional, it is probably scalable to resolutions that not only duplicate the image, but the film grain structure as well.
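Both back-of-envelope numbers in the last few paragraphs check out; here they are recomputed (figures are exactly those quoted above):

```python
# Grain-size arithmetic: how many large grains fit down a 1.85:1 frame?
frame_height_mm = 11.33   # projectable height quoted in the post
large_grain_mm = 0.002    # upper end of the silver-particle size range

lines_at_grain_size = frame_height_mm / large_grain_mm
print(f"{lines_at_grain_size:.0f} lines")  # 5665: one pixel per large grain

# Eye-resolution claim: 1280 pixels across a 60-degree field of view.
arcmin_per_pixel = 60 * 60 / 1280   # 60 arcminutes per degree
print(f"{arcmin_per_pixel:.2f} arcmin/pixel")  # ~2.81, i.e. "about 3"
```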
o It is claimed to be capable of operating at brightness levels in excess of what film can handle without damage. Hence, it does something that film can't do: it can be brighter.

o It is also claimed to have a contrast ratio four times that of film, so in theory it can do something else that film can't do: it can have better contrast.

o Something I missed the first time around: these devices work so fast that each frame can be projected six times. But what if, instead of projecting each frame six times, you projected six different frames, or performed six-step interpolation between frames? You now have a potential frame rate of 144 frames per second. Film can't do that either.

o This is not going to be the last word in projection technology. If anything, digital projection is a highly underdeveloped field. I think digital projection has the same potential for improvement as hard drives and CPUs a decade ago. A "big" hard drive was 100 megs, and a "fast" CPU was 20 MHz. Now hard drives hold 400 times as much data, and CPUs are 50 times as fast.

My question is: given that there is the potential for 1-2 orders of magnitude improvement across the board in every aspect of video projection technology -- i.e., 10 years from now, 1280x1024 will probably appear as quaint and old-fashioned as 16-color CGA appears today -- over the next 10 years, will "film done right" be able to maintain an edge over "digital projection done right"?

- John
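The 144 frames/second figure in the third bullet is just the 24 fps film rate times the six-flash overscan claimed for the GLV device:

```python
# Frame-rate arithmetic from the bullet above: a device fast enough to
# flash each 24 fps frame six times could instead show six distinct frames.
base_fps = 24
overscan_flashes = 6
print(base_fps * overscan_flashes)  # 144
```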