So, I finally got my hands on an HDTV decoder from my local television provider (Get) today. The box handles the digital signals from the cable or television provider and supports HDTV, enabling higher picture quality and more services.
The decoder supports the following signals: 576p, 576i, 720p and 1080i.
The manual tells me to set the output signal from the decoder to 720p if I have a television that can handle HD signals but not 1080i. If I have a television that is HD-ready and supports 1080i, I should set it to that format. Is it really that simple?
Let’s first look at some of the basics. HDTV is the broadcast of television signals at higher resolutions than the traditional signals we have received until now (HDTV also gives us much better sound). There are three important aspects that define the format HDTV is encoded in:
- Number of lines (720, 1080)
- Display method (progressive, interlaced)
- Display rate (60, 50, 30, 24)
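To keep those three parameters straight in my own head, here is a small sketch (my own notation, nothing official from the decoder manual) of the formats the decoder lists:

```python
from collections import namedtuple

# The three aspects above are enough to name the common broadcast formats.
HDFormat = namedtuple("HDFormat", ["lines", "scan", "rate_hz"])

FORMATS = {
    "576i50":  HDFormat(lines=576,  scan="interlaced",  rate_hz=50),
    "576p50":  HDFormat(lines=576,  scan="progressive", rate_hz=50),
    "720p50":  HDFormat(lines=720,  scan="progressive", rate_hz=50),
    "1080i50": HDFormat(lines=1080, scan="interlaced",  rate_hz=50),
}

for name, fmt in FORMATS.items():
    print(f"{name}: {fmt.lines} lines, {fmt.scan}, {fmt.rate_hz} Hz")
```

(The 50 Hz rates are my assumption for a European/PAL provider; more on the display rate below.)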
Reading about HDTV on Wikipedia (follow the link) gives me a headache. How is a person with only a regular insight into video formats, native resolution, pixels, frame rates and scanning systems (progressive, interlaced) supposed to understand any of this? It is a nightmare.
So the next thing I looked into was the technical specification of the HD decoder from my television provider. The box has been thoroughly tested by several of the leading media and technology publications in Norway, but none of them considers the display rate. It is not possible to find anything about the display rate on my provider's web pages or in the tests that have been done. It seems that even people who should know this stuff pretty well focus only on the number of lines (720, 1080) and the display method (progressive vs. interlaced).
The European Broadcasting Union recommends that its members use 720p50 with the possibility of 1080i50 on a programme-by-programme basis and 1080p50 as a future option. (In the USA, 720p is used by ABC, Fox Broadcasting Company and ESPN because the smoother image is desirable for fast-action sports telecasts, whereas 1080i is used by CBS, NBC, HBO, Showtime and Discovery HD due to the crisper picture particularly in non-moving shots. The BBC is one of the EBU members transmitting in HDTV. It has not yet made a final decision on picture scanning format. Sveriges television in Sweden and Cyfra+ in Poland broadcast in 720p50. All other commercial European HDTV services so far use 1080i50.)
So, the more I dig into this topic, the deeper the jungle gets. Following the EBU recommendation, I must assume that the signals from my decoder are 720p50 and 1080i50. From the EBU statement it is clear that the format can change on a programme-by-programme basis, depending on what format the programme was shot in and what kind of programme it is (sport, nature and so on). This also means that all signals that are not HDTV must be "transformed", or upscaled, to one of these formats (720p or 1080i).
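Just to make that upscaling step concrete, here is a rough illustration (it assumes the usual 720×576 PAL frame for SD sources, which is my assumption; the decoder's real scaler filters and interpolates rather than just multiplying):

```python
# Rough scale factors from an assumed 720x576 PAL SD frame up to the HD formats.
sd_w, sd_h = 720, 576

hd_targets = {"720p": (1280, 720), "1080i": (1920, 1080)}

for name, (w, h) in hd_targets.items():
    print(f"SD -> {name}: width x{w / sd_w:.2f}, height x{h / sd_h:.2f}")
```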
Assuming 720p50 and 1080i50, let's look at the numbers:
720p50 = 921,600 pixels multiplied by 50 (frames per second) = 46,080,000 pixels per second
1080i50 = 2,073,600 pixels multiplied by 50 and divided by 2 = 51,840,000 pixels per second
This gives me 12.5% more pixels per second with 1080i than with 720p. Not very much, and is that difference even possible to see at my viewing distance from the television? Probably not. The interlaced vs. progressive question is well documented and discussed, and the progressive method is better when it comes to moving pictures.
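For anyone who wants to re-check the arithmetic, here is the same back-of-the-envelope calculation as a small Python sketch (my own numbers, nothing from the decoder or the TV specifications):

```python
# Back-of-the-envelope pixel throughput for the two HD broadcast formats.
def pixels_per_second(width, height, rate_hz, interlaced):
    frame = width * height
    # An interlaced signal delivers only half the lines per field,
    # so the effective pixel throughput is halved.
    return frame * rate_hz // (2 if interlaced else 1)

p720 = pixels_per_second(1280, 720, 50, interlaced=False)    # 46,080,000
i1080 = pixels_per_second(1920, 1080, 50, interlaced=True)   # 51,840,000

print(f"720p50 : {p720:,} pixels/s")
print(f"1080i50: {i1080:,} pixels/s")
print(f"1080i50 delivers {100 * (i1080 - p720) / p720:.1f}% more pixels/s")  # 12.5%
```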
So far I have got an overview of the different sources of HDTV, which scanning formats exist, the difference between interlaced and progressive scanning, the basics of upscaling, native resolution and so on.
Now let me take a closer look at my plasma television (I have a Panasonic Viera TH-42PV70). With my plasma and my decoder, which signal should I choose? I almost always read the same advice: choose 720p, not 1080i, and the argument is the progressive vs. interlaced issue. As I calculated above, I get 12.5% more pixels per second with 1080i, but at a normal viewing distance it is hard to see that difference, while the artifacts of the interlaced display method I will notice (on some occasions).

From Panasonic's presentation of my plasma: "The processor converts signals of present TV broadcasts and DVD software to 1080p high-definition video signals. It up-converts the video signals without causing any loss in the signal or image details. It also applies progressive conversion to upgrade 1080i high-definition video signals to 1080p signals. This allows VIERA to render pictures that are more beautiful than the original."
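Panasonic does not explain how its "progressive conversion" actually works (it is surely motion-adaptive and far cleverer than anything I can write down), but the basic idea of turning two interlaced 540-line fields into one 1080-line progressive frame can be sketched with the simplest "weave" approach. This is only my own illustration, not Panasonic's algorithm:

```python
import numpy as np

# Minimal "weave" deinterlacing sketch: interleave two 540-line fields into
# one 1080-line progressive frame. Real deinterlacers are motion-adaptive;
# this only illustrates the principle.
def weave(top_field, bottom_field):
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field      # lines 0, 2, 4, ... come from the top field
    frame[1::2] = bottom_field   # lines 1, 3, 5, ... come from the bottom field
    return frame

top = np.zeros((540, 1920), dtype=np.uint8)
bottom = np.ones((540, 1920), dtype=np.uint8)
print(weave(top, bottom).shape)  # (1080, 1920) -- a full 1080p frame
```

Weaving works perfectly for static images; it is on motion, where the two fields disagree, that the combing artifacts mentioned above appear, and that is exactly what the more advanced processors try to hide.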
So it seems that I should use 1080i from my decoder, and my plasma will then convert the interlaced signal to progressive. I then looked at the native resolution... it is 1024 × 768. That means the television has to scale the 1080i source, which has now been converted to 1080p, down to fit the native resolution. In doing so, the processor has to throw away a lot of information (pixels). The display's native resolution, relative to the HDTV signal formats, makes this topic even more complex.
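A quick pixel count shows how much has to be thrown away (this just counts pixels; the real scaler resamples and filters rather than simply dropping them):

```python
# How much of a 1080p frame can actually be shown on a 1024x768 panel?
src_w, src_h = 1920, 1080      # the 1080i/1080p source
panel_w, panel_h = 1024, 768   # native resolution of the TH-42PV70 panel

src_pixels = src_w * src_h        # 2,073,600
panel_pixels = panel_w * panel_h  #   786,432

print(f"source: {src_pixels:,} pixels, panel: {panel_pixels:,} pixels")
print(f"only about {100 * panel_pixels / src_pixels:.0f}% of the source pixels "
      f"survive at native resolution")
```

Roughly 38% of the pixels in the 1080-line source can be represented on the panel, so the processor does a lot of work only to throw most of the extra detail away again.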
If this HDTV mess was not complex enough before taking the native resolution into account, it certainly is now.
I am happy that I am not a seller of HDTV displays... this is certainly not a simple task!