For people without advanced technical knowledge of HDTV it is hard to believe that the 720p format (720 progressive lines per frame at 50/60 frames per second) is actually not worse, but generally a little better than 1080i (which consists of 50/60 interlaced 1920x540 fields per second that theoretically combine into a full 1920x1080 frame). I'm not going to spend the whole day convincing you (I just feel too lazy for that :), but I have a couple of smart links for you to read (both documents come from some smart people at the European Broadcasting Union):
Conclusion: 720p delivers the same image quality as 1080i on uncompressed footage, and even better quality on compressed video (720p is a better fit for modern compression algorithms). That's why 720p is Europe's recommended broadcasting standard.
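To see why "1080" isn't automatically twice as good as "720", here is a back-of-the-envelope pixel-rate comparison (a rough sketch using the standard raster sizes and the 50 Hz variants; it ignores chroma subsampling and compression, which is where 720p pulls further ahead):

```python
# Raw luma samples delivered per second for 720p50 vs 1080i50.
# An interlaced field carries only half the frame's lines.

def pixels_per_second(width, height, rate, interlaced=False):
    lines = height // 2 if interlaced else height
    return width * lines * rate

p720  = pixels_per_second(1280, 720,  50)                   # 50 full frames/s
i1080 = pixels_per_second(1920, 1080, 50, interlaced=True)  # 50 half-height fields/s

print(f"720p50 : {p720:,} pixels/s")    # 46,080,000
print(f"1080i50: {i1080:,} pixels/s")   # 51,840,000
```

The raw pixel rates are in the same ballpark, but the 1080i pixels arrive as half-height fields that the display still has to stitch back together.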
Now I want to answer two questions that you probably want to ask after reading those articles:
Why was 1080i initially added to the HDTV inventory?
Because it happened a long time ago, when CRT displays were popular. Interlaced video was a good match for CRTs, but not for modern display types, which are all inherently progressive (LCD, plasma, 100 Hz TVs, DLP, etc.).
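A progressive panel can't show a field directly; it has to deinterlace first. As a toy illustration, here is the crudest possible "bob" deinterlacer, which just stretches a 540-line field back to a 1080-line frame (a minimal sketch using NumPy; real TVs use motion-adaptive interpolation, and that processing is exactly where interlaced material loses quality):

```python
import numpy as np

def bob_deinterlace(field: np.ndarray) -> np.ndarray:
    """Naive 'bob': double every line of a field to fill a full frame.
    Real deinterlacers interpolate and use motion detection instead."""
    return np.repeat(field, 2, axis=0)

# One 1080i field is only 1920x540; the progressive panel needs 1920x1080.
field = np.random.randint(0, 256, size=(540, 1920), dtype=np.uint8)
frame = bob_deinterlace(field)
print(field.shape, "->", frame.shape)   # (540, 1920) -> (1080, 1920)
```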
Why is 1080i now more widely used by actual HDTV channels than 720p?
It's all about marketing. Bigger numbers. Marketing people aren't techies, so they have a hard time seeing the actual difference between the two formats. And if the formats look the same to them, they simply promote the one with the bigger numbers. Because if they chose the lower numbers, their competitors would choose the bigger ones and tell the masses that they are cooler, and the masses would believe them (judging from the numbers, of course).
But to be completely fair, I must say that there is one case where 1080i is better: low-framerate footage, like cinema. But only if your HDTV performs proper inverse telecine. So 1080i can be the format of choice, say, for a channel that mostly broadcasts 24p movies.
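For a rough idea of what inverse telecine does: 24p film is carried in 60i using a 2:3 pulldown cadence, and a TV that recognizes the cadence can reassemble the original film frames instead of deinterlacing blindly. The sketch below is a toy model of that idea (it tracks whole frames as letters and ignores top/bottom field ordering, which a real implementation must handle):

```python
# 2:3 pulldown: four 24p film frames become ten 60i fields.

def telecine_23(frames):
    """Spread 24p frames across 60i fields in a 2:3 cadence."""
    fields, pattern = [], [2, 3]  # 1st frame -> 2 fields, 2nd -> 3 fields, ...
    for i, frame in enumerate(frames):
        fields += [frame] * pattern[i % 2]
    return fields

def inverse_telecine(fields):
    """Recover the original frames by collapsing the repeated fields."""
    frames = []
    for f in fields:
        if not frames or frames[-1] != f:
            frames.append(f)
    return frames

fields = telecine_23(["A", "B", "C", "D"])
print(fields)                    # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
print(inverse_telecine(fields))  # ['A', 'B', 'C', 'D']
```

If the TV does this correctly, the full 1920x1080 film frames come back losslessly, which is the one scenario where 1080i genuinely beats 720p.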