
Conclusion: 720p delivers the same image quality as 1080i on uncompressed footage, and even better quality on compressed video (720p is more efficient for modern compression algorithms). That's why 720p is Europe's recommended broadcasting standard.
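A quick back-of-the-envelope check of the raw pixel rates helps explain why the two formats end up so close on uncompressed footage. This is my own illustration, using the common 50 Hz European variants (720p50 and 1080i with 50 fields per second), not figures from the articles above:

```python
# Rough pixel-rate comparison of 720p50 and 1080i25 (50 fields/s).
# Figures are my own back-of-the-envelope numbers, not from the article.

def pixel_rate(width, height_per_picture, pictures_per_second):
    """Raw pixels transmitted per second, before any compression."""
    return width * height_per_picture * pictures_per_second

# 720p50: full 1280x720 frames, 50 times per second
p720 = pixel_rate(1280, 720, 50)

# 1080i25: each field carries only 540 of the 1080 lines,
# so a transmitted "picture" is 1920x540, sent 50 times per second
i1080 = pixel_rate(1920, 540, 50)

print(f"720p50 : {p720 / 1e6:.2f} Mpixel/s")   # ~46.08
print(f"1080i25: {i1080 / 1e6:.2f} Mpixel/s")  # ~51.84
```

The raw rates are within about 12% of each other, and in practice deinterlacing and the vertical filtering needed to suppress interline flicker eat up much of 1080i's nominal advantage.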
Now I want to answer two questions that you probably want to ask after reading those articles:
Why was 1080i initially added to the HDTV inventory?
Because that decision was made a long time ago, when CRT displays were dominant. Interlaced video was a good fit for CRTs, but it isn't a good fit for modern display types, which are all inherently progressive (LCD, plasma, 100 Hz TVs, DLP, etc.).
Why is 1080i now more widely used by actual HDTV channels than 720p?
It's all about marketing. Bigger numbers. Marketing people aren't techies; they have a hard time seeing any actual difference between the two formats. And if the formats look the same to them, they simply promote the one with the bigger numbers. Because if they chose the lower numbers, their competitors would choose the bigger ones and tell the masses that they are cooler, and the masses would believe them (judging by the numbers, of course).
But to be completely fair, I must say that there is one case where 1080i is better: low-framerate footage, like cinema. But only if your HDTV performs proper inverse telecine. So 1080i can be the format of choice, say, for a channel that mostly broadcasts 24p movies.
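To make the inverse-telecine point concrete, here is a minimal sketch of my own (assuming the 60 Hz case, where 24p film is carried in interlaced video via 3:2 pulldown; in 50 Hz countries the film is usually just sped up to 25p instead). The telecine step spreads each film frame over 2 or 3 fields; a TV that performs proper inverse telecine detects that cadence and reassembles the original progressive frames:

```python
# Minimal sketch of 3:2 pulldown and inverse telecine (illustration only).
# Each 24p film frame is split into a top and a bottom field; frames
# alternately contribute 3 fields and 2 fields, so 4 film frames become
# 10 interlaced fields (i.e. 24 frames/s -> 60 fields/s).

def telecine_32(frames):
    """Spread progressive film frames across interlaced fields (3:2 cadence)."""
    fields = []
    parity = 0  # 0 = top field, 1 = bottom field; strict alternation
    for i, frame in enumerate(frames):
        count = 3 if i % 2 == 0 else 2
        for _ in range(count):
            fields.append((frame, "top" if parity == 0 else "bottom"))
            parity ^= 1
    return fields

def inverse_telecine(fields):
    """Recover the original film frames by detecting and dropping repeats.
    A real TV also weaves the matching top/bottom field pairs back into
    full frames; here the frame label stands in for that step."""
    recovered = []
    for frame, _ in fields:
        if not recovered or recovered[-1] != frame:
            recovered.append(frame)
    return recovered

film = ["A", "B", "C", "D"]        # four consecutive 24p film frames
fields = telecine_32(film)         # ten fields = 1/6 of a second of 60i video
print(fields)                      # A,A,A, B,B, C,C,C, D,D (alternating parity)
print(inverse_telecine(fields))    # ['A', 'B', 'C', 'D'] -- film frames recovered
```

Without inverse telecine, a deinterlacer treats those fields as genuine interlaced motion and throws away vertical resolution; with it, the full 1080-line film frames come through intact, which is where 1080i's extra lines actually pay off.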