This topic comprises 3 pages: 1 2 3
Topic: Resolution of scanned prints? > OT: Frame rate discussion
Bobby Henderson
"Ask me about Trajan."
Posts: 10973
From: Lawton, OK, USA
Registered: Apr 2001
posted 12-17-2006 07:32 PM
Hmm. Sounds like stuff for a FAQ.
Imagery on a 4-perf 35mm original negative on a major film production will have a lot more native resolution than what a typical 2K digital projector can show.
We're talking about a film strip that goes through a very expensive 35mm camera system outfitted with very expensive lens systems. When the cinematography is done very well, a 4-perf 35mm original negative may hold as many as 24 million pixels of real image detail.
Not all of that detail is going to get to the movie theater screen due to generational loss that happens in the film print making process. But when the job is done right, you're going to get a good bit more than the 2 million pixels boasted by 1080p HDTV.
The general consensus is that 4K digital projection would come a lot closer to equaling the detail of a 35mm print shown under optimum conditions. A 'scope image in 4K would be 4096 x 1716, a little over 7 million pixels.
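The pixel counts quoted above are easy to verify. A quick sanity check in Python, using only the resolutions cited in this thread:

```python
# Pixel counts for the formats cited in this thread.
formats = {
    "1080p HDTV": (1920, 1080),
    "2K 'scope": (2048, 858),
    "4K 'scope": (4096, 1716),
}

for name, (w, h) in formats.items():
    print(f"{name}: {w} x {h} = {w * h / 1e6:.1f} megapixels")
```

As the loop shows, 4K 'scope carries roughly four times the pixels of a 2K 'scope frame, and about 3.4 times the pixels of 1080p.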
A growing number of film productions are having the original camera negatives, or an inter-negative or inter-positive, scanned into computer systems to do all the post processing work. The computer files are referred to as a digital intermediate, or "DI" for short.
Digital intermediates have a number of advantages. It's easier to adjust color and contrast and to alter specific areas of the image. The "analog" methods of color timing a film print, or processes like bleach bypass, affected the entire image globally. 3D CGI special effects sequences can be composited into a digital intermediate far more easily than by outputting elements to send to an optical printer.
All that convenience and speed comes at a price: resolution bottleneck.
When a movie is put through a digital intermediate step, its native resolution is dumbed down to the format of that DI file. Right now 2K seems to be the de facto standard. That's due in part to most CGI effects being produced at 2K quality. Lots of movie studios think that standard is good enough.
When a 'scope movie is put through a 2K DI process, the file ends up with 2048 x 858 pixels. That's it. It won't matter if you see the movie in 35mm or 2K digital projection. There's only so much native detail encoded into the DI.
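A minimal sketch of that bottleneck idea: whatever stage in the chain has the fewest pixels caps what reaches the screen. The negative figure is the rough estimate quoted earlier in this thread; the release-print figure is a made-up illustrative number, not a measurement.

```python
# Sketch of the "resolution bottleneck": detail reaching the screen
# can't exceed the lowest-pixel-count step in the chain.
def effective_pixels(*stages):
    """Pixel count of the weakest link in the imaging chain."""
    return min(stages)

negative_35mm = 24_000_000  # rough estimate for a well-shot 4-perf negative
di_2k_scope = 2048 * 858    # 2K digital intermediate, 'scope framing
release_print = 6_000_000   # hypothetical figure for a 35mm release print

capped = effective_pixels(negative_35mm, di_2k_scope, release_print)
print(capped)  # the 2K DI is the bottleneck: 1,757,184 pixels
```

Swap in a 4K DI (4096 x 1716) and the bottleneck moves elsewhere in the chain, which is the argument for 4K intermediates.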
Some movies are processed with 4K quality digital intermediates. But that's pretty rare.
Right now we're seeing a pretty aggressive trend of advancement in computer CPUs, graphics GPUs, memory technology and file storage. Perhaps at some point all that extra brute-force computing power will make 4K economical enough for Hollywood to embrace it across the board and finally abandon the only-slightly-better-than-1080p 2K standard.
Scott Norwood
Film God
Posts: 8146
From: Boston, MA. USA (1774.21 miles northeast of Dallas)
Registered: Jun 99
posted 12-18-2006 07:37 AM
What is usually missing from this type of discussion is that a) there isn't one, single thing called "film" (8mm != 70mm, Kodak != Fuji, modern cameras/lenses != old cameras/lenses etc.) and b) there is more to image quality than resolution (consider greyscale/color depth, lossy compression, jump and weave, etc.).
DI seems to be popular for big-budget productions, but is very rarely used for smaller, low-budget films. Also, pretty much anything made more than a few years ago would predate the use of DI.
Hey, Joe, why do you hate film so much?
Cameron Glendinning
Jedi Master Film Handler
Posts: 845
From: West Ryde, Sydney, NSW Australia
Registered: Dec 2005
posted 12-19-2006 02:59 AM
The only time I heard a resolution figure for 35mm film was over 10 years ago; it was 3300 lines of resolution. Kodak has spent over $1,000,000,000 improving it since then. In reality it's the quality of the lens that matters: a prime is sharper than a zoom, and low-ASA film stocks have better definition.
Unfortunately, Kodak's improvements have been squandered on set: higher-ASA stocks allow shooting with less light, for speed on set and therefore quicker production.
The digital interneg is scanned off the original neg. In the traditional approach, image quality was lost in the lab as copies were made (internegs, the final print neg, etc.), with generation loss occurring at each stage.
16mm looks much better today transferred digitally to 35mm, rather than going through the optical blow-up stage as in the olden days.
In Australia, Ten Canoes was our first 4K interneg; it looked pretty good to me on 35mm.
At the end of the day, the image from the computer is transferred back to film.
The Film-Tech Forums are designed for various members related to the cinema industry to express their opinions, viewpoints and testimonials on various products, services and events based upon speculation, personal knowledge and factual information through use, therefore all views represented here allow no liability upon the publishers of this web site and the owners of said views assume no liability for any ill will resulting from these postings. The posts made here are for educational as well as entertainment purposes and as such anyone viewing this portion of the website must accept these views as statements of the author of that opinion
and agrees to release the authors from any and all liability.