The HDTV, 720p, 1080i and 1080p jungle

So, I finally got my hands on an HDTV decoder from my local television provider (Get) today. The box handles the digital signals from the cable/television provider and supports HDTV, enabling higher picture quality and new services.
The decoder supports the following signals: 576p, 576i, 720p and 1080i.

The manual tells me to set the output signal from the decoder to 720p if I have a television that can handle HD signals but is not able to handle 1080i. If I have a television that is HD ready and supports 1080i, I should set it to that format. Is it really that simple?

Let’s first look at some of the basics. HDTV is the broadcast of television signals at a higher resolution than the signals we have traditionally received (HDTV also gives us much better sound). Three important aspects define the format an HDTV signal is encoded in (see the small sketch after the list below):

  • Number of lines (720, 1080)
  • Display method (progressive, interlaced)
  • Display rate (60, 50, 30, 24)
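
As a small illustration of how these three aspects combine into labels such as 720p50 or 1080i50, here is a minimal Python sketch of my own (nothing official, just a way to keep the pieces apart):

    import re

    def parse_hdtv_format(label: str) -> dict:
        """Split a label like '1080i50' into lines, scan method and display rate."""
        match = re.fullmatch(r"(\d+)([pi])(\d+)?", label)
        if match is None:
            raise ValueError(f"Not a recognised format label: {label}")
        lines, scan, rate = match.groups()
        return {
            "lines": int(lines),                                    # 720 or 1080
            "scan": "progressive" if scan == "p" else "interlaced",
            "rate": int(rate) if rate else None,                    # 50, 60, 30, 24; often left out
        }

    print(parse_hdtv_format("720p50"))  # {'lines': 720, 'scan': 'progressive', 'rate': 50}
    print(parse_hdtv_format("1080i"))   # {'lines': 1080, 'scan': 'interlaced', 'rate': None}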

Reading about HDTV on Wikipedia gives me a headache. How is a person with only a general grasp of video formats, native resolution, pixels, frame rates and scanning systems (progressive, interlaced) supposed to understand any of this? It is a nightmare.

So the next thing I look into is the technical specification of the HD decoder from my television provider. The box has been reviewed by several of the leading media and technology outlets in Norway, but none of them consider the display rate. It is not possible to find anything about the display rate on my provider’s web pages or in the reviews that have been done. It seems that even people who should know this stuff pretty well focus only on the number of lines (720, 1080) and the display method (progressive vs. interlaced).

The European Broadcasting Union recommends that its members use 720p50 with the possibility of 1080i50 on a programme-by-programme basis and 1080p50 as a future option. (In the USA, 720p is used by ABC, Fox Broadcasting Company and ESPN because the smoother image is desirable for fast-action sports telecasts, whereas 1080i is used by CBS, NBC, HBO, Showtime and Discovery HD due to the crisper picture particularly in non-moving shots. The BBC is one of the EBU members transmitting in HDTV. It has not yet made a final decision on picture scanning format. Sveriges television in Sweden and Cyfra+ in Poland broadcast in 720p50. All other commercial European HDTV services so far use 1080i50.)

So, the more I dig into this topic, the more of a jungle it becomes. Following the EBU, I must assume that the signals from my decoder are 720p50 and 1080i50. From the EBU statement it is obvious that the signal should be changeable on a programme-by-programme basis, depending on what format the programme was shot in and what kind of programme it is (sport, nature and so on). This means that all signals that are not HDTV must be “transformed” or “upscaled” to one of these formats (720p or 1080i).

Assuming 720p50 and 1080i50, let’s look at the numbers:

720p50 = 921,600 pixels per frame multiplied by 50 frames per second = 46,080,000 pixels per second

1080i50 = 2,073,600 pixels per frame multiplied by 50 fields per second, divided by 2 = 51,840,000 pixels per second

This gives me 12.5% more pixels per second with 1080i than with 720p. Not very much, and is that even possible to see at my viewing distance from the television? Probably not. The interlaced vs. progressive question is well documented and discussed, and the progressive method is better for moving pictures.
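
For anyone who wants to play with these numbers, here is a small Python sketch of the same arithmetic. It assumes full 1280 × 720 and 1920 × 1080 rasters at 50 Hz (an assumption on my part; broadcasters sometimes use a narrower horizontal resolution):

    def pixels_per_second(width: int, height: int, rate: int, interlaced: bool) -> int:
        """Pixels delivered per second; an interlaced signal carries half a frame per field."""
        frame = width * height
        return frame * rate // 2 if interlaced else frame * rate

    p720 = pixels_per_second(1280, 720, 50, interlaced=False)    # 46,080,000
    i1080 = pixels_per_second(1920, 1080, 50, interlaced=True)   # 51,840,000
    print(f"1080i50 delivers {i1080 / p720 - 1:.1%} more pixels per second")  # 12.5%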

So far I have got an overview of the different sources of HDTV, what scanning formats exist, the difference between interlaced and progressive, upscaling basics, native resolution and so on.

Now let me take a closer look at my plasma television (I have a Panasonic Viera TH-42PV70). With my plasma and my decoder, which signal should I choose? I almost always read the same advice: choose 720p, not 1080i, and the argument is the progressive vs. interlaced issue. As I calculated above, I get 12.5% more pixels per second with 1080i, but at a normal distance from the television it is hard to see the difference, whereas the artifacts of the interlaced display method I will notice (on some occasions). From Panasonic’s presentation of my plasma: “The processor converts signals of present TV broadcasts and DVD software to 1080p high-definition video signals. It up-converts the video signals without causing any loss in the signal or image details. It also applies progressive conversion to upgrade 1080i high-definition video signals to 1080p signals. This allows VIERA to render pictures that are more beautiful than the original.”
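
To make the “progressive conversion” wording a bit more concrete, here is a toy Python sketch of the simplest possible interlaced-to-progressive step, weaving two fields into one frame. The real VIERA processor is motion-adaptive and far more sophisticated; this is only my own illustration of the principle:

    def weave(top_field: list, bottom_field: list) -> list:
        """Interleave the lines of two interlaced fields into one progressive frame."""
        frame = []
        for top_line, bottom_line in zip(top_field, bottom_field):
            frame.append(top_line)     # lines 0, 2, 4, ... come from the top field
            frame.append(bottom_line)  # lines 1, 3, 5, ... come from the bottom field
        return frame

    # Two tiny two-line fields standing in for the 540-line fields of a 1080i signal:
    print(weave([[1, 1], [3, 3]], [[2, 2], [4, 4]]))  # [[1, 1], [2, 2], [3, 3], [4, 4]]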

So it seems that I should use 1080i from my decoder. My plasma will then convert the interlaced signal to progressive. I then looked at the native resolution… it is 1024 × 768… which means the television has to scale the 1080i source, now converted to 1080p, down to fit the native resolution. In doing so, the processor has to discard a lot of information (pixels). The display’s native resolution in relation to the HDTV signals makes this topic even more complex.
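
To put a rough number on how much the panel has to throw away (assuming a full 1920 × 1080 frame being scaled down to the 1024 × 768 native resolution):

    source_pixels = 1920 * 1080   # 2,073,600 pixels in the deinterlaced 1080p frame
    panel_pixels = 1024 * 768     # 786,432 pixels the TH-42PV70 panel can actually show
    print(f"Kept per frame: {panel_pixels / source_pixels:.0%}")           # about 38%
    print(f"Discarded per frame: {1 - panel_pixels / source_pixels:.0%}")  # about 62%

Of course a good scaler interpolates rather than simply dropping pixels, but the amount of detail the panel can physically show is still limited by its native resolution.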

If this HDTV mess was not complex before taking the native resolution into account, it sure is now.

I am happy that I am not a seller of HDTV displays… this is definitely not a simple task!

  6 comments for “The HDTV, 720p, 1080i and 1080p jungle”

  1. April 10, 2008 at 9:19 am

    I think, Eyvind, that you would do yourself a huge favour by switching off the geek mode in your brain before starting to deal with this kind of stuff. 😉 The rule of thumb is to choose up-scaling rather than down-scaling. The logic behind this is simply that downscaling by nature discards information. In reality a lot of people would even have difficulty telling a properly upscaled, progressively-scanned SD picture from an HD source.

    More importantly: from your normal viewing distance you wouldn’t be able to tell the difference anyway, so why should you care? Choose 720p, sit back and enjoy the show.

  2. April 10, 2008 at 10:57 pm

    He he. I always read that… use 720p. But my brain is not made that way. I need to know why! In my case… I think 1080i from my decoder gives the best result. By the way, I have Discovery HD, and it is really a huge step forward from SD television. The picture is crystal clear, and you really get the feeling of standing out in nature, not watching it on TV. But I must say that U2 3D gave more of a wow effect than Discovery HD did 😀

  3. June 27, 2009 at 10:24 pm

    Thanks for this post, it helps to understand the jungle of HD resolutions, and yes, it really is a jungle for most people. You used to go and buy a TV and that was it! Now it’s all about features, definitions, etc.

  4. Conrad Preen
    August 19, 2009 at 3:52 pm

    Your argument:

    > 720p50 = 921,600 pixels per frame multiplied by 50 frames per second
    > = 46,080,000 pixels per second

    > 1080i50 = 2,073,600 pixels per frame multiplied by 50 fields per second,
    > divided by 2 = 51,840,000 pixels per second

    > This gives me 12.5% more pixels with the 1080i versus 720p.

    – simply does not stand. You cannot put spatial and temporal resolution into the same bag and come out with a difference of 12.5% – that’s like adding apples and oranges.

    1080i has 1080/720 times the vertical resolution of 720p; however, on content with a lot of movement, the fact that each field is sampled at a different time leads to a loss of resolution on moving objects. This is why sports channels have in general chosen 720p, whereas movie / general programming channels prefer 1080i.

    The compatibility of these signals with displays on the market is, as you rightly say, a real issue. The inevitably cheap scaling engine in the display can destroy much of the broadcaster’s good work.

    Last of all – compression. The biggest factor by far in reducing the quality of TV is the data compression used to fit ever more channels into the available bandwidth. If you showed most viewers uncompressed SD they would probably think it was HD.

  5. September 3, 2009 at 9:56 pm

    Conrad, thanks for your comment. I am not sure I understand this. If I cannot compare the pixels directly, this is getting difficult. The interlace artifacts are easy to understand, but 1080i50 gives me more pixels per second than 720p50. If this is like apples and oranges, then I think I will just give up trying to understand 🙂

  6. November 12, 2010 at 9:55 am

    As for 1080p, there is no doubt that the 1080p resolution is the best on the market. However, there is little difference in picture quality between 1080p and 720p at screen sizes of 32″ and under.
