Topic: HDTV / DVD / SDTV Question
Joe Redifer
You need a beating today
Posts: 12859
From: Denver, Colorado
Registered: May 99
posted 03-15-2005 07:59 PM
Well, since I refuse to pay for television programming that has commercials in it, I obviously don't have something as wallet-sucking as cable, so I can only speculate on the answers:
#1 - Cable TV quality always sucks ass. If you get a good picture, then something is definitely wrong. Yet another reason not to actually pay for TV. Even digital cable is all compressed to hell. Oh well, it's your money, not mine. I am laughing at you, though... paying for TV. HA! Also, SD video tends to look like ass when viewed on an HD set, as HDTVs just can't handle it the way it was meant to be handled (they are nowhere near as versatile as computer monitors).
#2 - DVDs are higher resolution than your standard broadcast signal. DVDs are typically 720x480, which is waaay above what SD broadcasts usually deliver in terms of overall discernible pixels. Also, DVDs don't have to go through a crappy coaxial wire for hundreds of feet; instead, the player is hooked up right there with component cables. Next, DVDs play back in 480p, which helps them look a bit nicer as well. You do have your DVD player hooked up for progressive scan, right?
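Rough numbers to show the gap (a quick Python sketch; the 440-wide broadcast figure is just a ballpark for what composite NTSC actually resolves, not an exact spec):

    dvd_pixels = 720 * 480                 # a DVD frame
    sd_broadcast_pixels = 440 * 480        # ballpark usable resolution of composite NTSC
    print(dvd_pixels / float(sd_broadcast_pixels))   # roughly 1.6x the usable pixels

So even before you count the cleaner signal path, the disc simply carries more picture.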
#3 - HDTV is either 1280x720 or 1920x1080 (interlaced... for now) for the resolution. This signal gets squashed down to a crappy 19 Mbps (or less) MPEG-2 stream and sent to your home. MPEG-2 is also used on DVD at a little less than half that bit rate (usually). MPEG-2 is an awful codec, one of the worst ever created, with TONS of visible artifacts (all of which are completely unacceptable). That's why whenever anything moves fast on HDTV programs, the entire screen turns into a pixelated mess. Whoever invented MPEG-2 should be shot along with their parents and children. Much more efficient and higher quality codecs will be used for the new DVD formats coming up, and there is talk of changing HDTV broadcasts over to the better codecs as well. Of course, that will render all current HDTV sets obsolete, but you don't mind buying a new "set top box", right?
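To put those bit rates in perspective, here's a crude bits-per-pixel sketch in Python (the 19 Mbps figure is from above; the ~8 Mbps DVD figure is just a ballpark for "a little less than half", and real encoders vary):

    hd_pixels_per_sec = 1920 * 1080 * 30     # 1080i delivers about 30 full frames/sec
    dvd_pixels_per_sec = 720 * 480 * 30
    print(19e6 / hd_pixels_per_sec)          # ~0.31 bits per pixel for the HD broadcast
    print(8e6 / dvd_pixels_per_sec)          # ~0.77 bits per pixel for a typical DVD

The broadcast actually gets far fewer bits per pixel than a DVD, which is a big part of why fast motion falls apart.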
Evans A Criswell
Phenomenal Film Handler
Posts: 1579
From: Huntsville, AL, USA
Registered: Mar 2000
posted 03-22-2005 04:05 PM
Technical answer: because DVD has better resolution than cable TV, but not nearly as much as HDTV.
Take a look at the NTSC broadcast parameters:
Luma bandwidth: 4.2 MHz
Color subcarrier: 3.57954545 MHz (315/88)
Frame rate: 29.97 frames/sec
Field rate: 59.94 fields/sec
Scan lines per frame: 525
Visible scan lines: 483 (now considered to be 480 in the digital age)
Portion of scan line used for picture: 83.22 percent (52 8/9 µs over 63 5/9 µs)
Luma cycles per active scan line (at 4.2 MHz): 222.1
Lines of resolution (luma, broadcast) per picture width: 444.3
Lines of resolution (luma, broadcast) per 4:3 picture height: 333.2
Lines of resolution (luma, broadcast) per picture width per MHz of bandwidth: 105.8
"TV lines" per MHz (above figure multiplied by 3/4, the reciprocal of the 4:3 aspect ratio): 79.34
Lines of resolution (chroma, broadcast) per picture width: 63.5
Pixel resolution needed to capture broadcast luma: 635 wide by 480 high
Color bandwidth: 1.3 MHz equiband in the studio, 600 kHz equiband broadcast for I and Q (1.3 MHz wideband-I was abandoned long ago)
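Here's a quick Python sketch of where some of those figures come from (one cycle of video is one light/dark pair, i.e. two "lines" of resolution):

    luma_bw_hz = 4.2e6
    active_line_sec = (52 + 8.0/9.0) * 1e-6          # active picture time per scan line
    cycles_per_line = luma_bw_hz * active_line_sec   # ~222.1 cycles
    lines_per_width = 2 * cycles_per_line            # ~444.3 lines of resolution
    lines_per_mhz = lines_per_width / 4.2            # ~105.8 lines per MHz
    lines_per_height = lines_per_width * 3.0 / 4.0   # ~333.2 for a 4:3 picture
    print(cycles_per_line, lines_per_width, lines_per_mhz, lines_per_height)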
So, the luminance bandwidth is 4.2 MHz, although when color was introduced, some of the high frequencies were sacrificed to make room for the color information. How much of the high-frequency luminance remains after the color information is separated depends on how sophisticated the comb filter is.
Many cheap devices simply extract the luminance below 3 MHz and don't bother trying to un-interleave the luma and color frequencies in the upper region. Multiply the bandwidth in MHz by 105.8 to get the number of lines of resolution horizontally, so 3 MHz limits you to 317.4. Luminance resolution above that is at least partially corrupted by the insertion of the color, centered at 3.58 MHz. A lot of it is recoverable if the interleaving is done properly in the composite signal, though. The limiting resolution for luminance per screen width is 444, of course with some corruption between 317.4 and 444 because of the color information.
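Same conversion applied to that case (Python, using the 105.8 lines-per-MHz figure):

    print(3.0 * 105.8)   # ~317 lines when the luma is simply cut off at 3 MHz
    print(4.2 * 105.8)   # ~444 lines with the full 4.2 MHz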
A DVD, if you hook it up with component cables, gets around the frequency interleaving of the luma and chroma. Using the commonly accepted Kell factor of 0.7, the 720 pixels of DVD are roughly equivalent to 4.76 MHz of luminance (504 lines of resolution across the picture). That's about 13.4 percent better than a full-bandwidth 4.2 MHz broadcast signal as far as luminance goes, but there's no corruption of the upper luminance frequencies, other than compression artifacts.
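The DVD luma numbers worked out in Python (the 0.7 Kell factor is the assumption stated above):

    kell = 0.7
    dvd_luma_lines = 720 * kell      # ~504 lines per picture width
    print(dvd_luma_lines / 105.8)    # ~4.76 MHz equivalent bandwidth
    print(dvd_luma_lines / 444.3)    # ~1.134, about 13.4 percent over full-bandwidth broadcast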
Where DVD shines is the color bandwidth. Broadcast TV uses equiband 0.6 MHz color components (the I component was originally supposed to be 1.3 or 1.5 MHz, but that was mostly abandoned), which limits the color resolution to 63.5 lines. The eye is not as sensitive to this limitation as it is to the luminance, which gets 4.2 MHz; the horizontal color resolution is one seventh of the luminance. DVD has a color resolution of 360 by 240, which is equivalent to 2.38 MHz compared to the 0.6 MHz of broadcast TV. That's 252.0 color lines of resolution horizontally compared to 63.5 for broadcast TV.
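And the chroma comparison in Python (same Kell factor and lines-per-MHz conversion):

    broadcast_chroma_lines = 0.6 * 105.8   # ~63.5 color lines for broadcast
    dvd_chroma_lines = 360 * 0.7           # ~252 color lines from DVD's 360 chroma samples
    print(dvd_chroma_lines / 105.8)        # ~2.38 MHz equivalent
    print(dvd_chroma_lines / broadcast_chroma_lines)   # roughly 4x the broadcast color resolution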
The only bad thing about DVD is that it has only half the vertical color resolution of broadcast TV. I can really notice that now when I watch some DVDs. But even taking this into account, DVD still beats broadcast by a long way, plus the color and luminance don't interfere with each other on DVD. If you use a composite cable, you go back to some of the broadcast limitations from mixing the luminance and color.
HDTV has 1280 by 720 or 1920 by 1080 digital resolution, which is way higher than the 720 by 480 of DVD. That's why it looks better, even though I notice compression artifacts when there is a lot of movement.
Evans A Criswell
Phenomenal Film Handler
Posts: 1579
From: Huntsville, AL, USA
Registered: Mar 2000
posted 03-24-2005 10:26 AM
Larry, another reason you may see differences between 480i and 480p is that some sets may use a totally different processing method for 480p. I have a 17-inch Sony HD set in my bedroom, and when I used color bars, I could tell a big difference in color bandwidth between 480i and 480p just by switching the DVD player. This could be either because the DVD player has separate signal-construction paths for the two, one of which has more color bandwidth, or because the set itself samples the color at a higher rate when the signal is progressive.
On that little Sony HDTV, transitions between color bars are "perfect" with 480p input. With 480i, using the same DVD player and the same component cables (just switching the player to interlaced), the transitions are no longer "perfect", making it clear that using 480i causes a loss in color bandwidth (although not in luminance bandwidth).