6th January 2010, 12:35   #37
2Bdecided
Quote:
Originally Posted by knutinh
For your statements to be generally right, I think one would expect that compressing any original 1080p50 sequence at:
1) 1080p50, H.264, X Mbps
2) 1080i50, H.264, X Mbps
3) 720p50, H.264, X Mbps
Like this...

http://www.ebu.ch/CMSimages/en/tec_e...tcm6-46693.pdf

In 2006, the EBU (including the IRT, SVT, etc.) tried very hard to convince everyone that 1080i wasn't worth it. A year earlier, the IRT were doing demonstrations of this, intentionally using the worst MPEG-4 codec they could find with respect to interlacing capability!

Yet when it came to launching HD across Europe, broadcasters chose 1080i. The reason they give is that with newer encoders and full HD displays, 1080i is the current sweet spot.

Part of the problem is probably that they don't have 1080p easily available as a delivery format yet.


It's easier to do the test at SD resolution, and it's just as valid: if interlacing is useless, then 720x576p50 at a given bitrate should always look better than 720x576i50 at the same bitrate.

With x264, I think that might be true. But broadcasters are saying that the hardware encoders they have available don't give this result.
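
If anyone wants to try it, a comparison like that is easy to script around the x264 CLI (and the same idea scales straight up to the 1080/720 comparison in the quote). This is only a rough sketch - the filenames, the bitrate, and the assumption of Y4M inputs derived from the same 50p master are mine, not from any real test protocol:

Code:
# Rough sketch of the SD comparison: encode a progressive and an
# interlaced version of the same 720x576/50 source at the same average
# bitrate with the x264 CLI, then compare the results by eye.
# Assumptions: an "x264" binary on the PATH, and two Y4M files derived
# from the same 50p master -- the filenames are placeholders.
import subprocess

BITRATE_KBPS = 2000  # example figure only


def encode(infile: str, outfile: str, interlaced: bool) -> None:
    cmd = ["x264", "--bitrate", str(BITRATE_KBPS)]
    if interlaced:
        cmd.append("--tff")  # encode as interlaced, top field first
    cmd += ["-o", outfile, infile]
    subprocess.run(cmd, check=True)


encode("source_576p50.y4m", "sd_progressive.264", interlaced=False)
encode("source_576i25.y4m", "sd_interlaced.264", interlaced=True)

(Single-pass ABR is enough for a quick look; a proper test would use two-pass to hit the target bitrate more exactly.)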

I haven't seen any published tests that match what the broadcasters are claiming - quite the opposite...
http://ip.hhi.de/imagecom_G1/assets/..._hdtv_2008.pdf
...so either the broadcasters are lying, or encoders have changed since that paper was written. Note that, in that paper, they create the 1080i50 signal using about the worst possible method.
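
To show why that matters (this is just an illustration of the general issue, not a claim about what that particular paper did): the crude way to get 50i from a 50p master is to throw away alternate lines of each frame with no vertical filtering, which produces aliasing and line twitter that the interlaced encoder then has to spend bits on, whereas a proper downconversion low-pass filters vertically first. Something like this, assuming the 50p luma is already loaded as a numpy array:

Code:
# Illustration only -- not the method from the HHI paper.
# Contrasts a naive p50 -> i50 conversion (discard alternate lines,
# no prefiltering, so it aliases and twitters) with one that applies
# a crude [1 2 1]/4 vertical low-pass before taking each field.
# "frames" is assumed to be a (num_frames, height, width) luma array.
import numpy as np


def naive_fields(frames: np.ndarray) -> list:
    """Top field from even frames, bottom field from odd frames,
    with no prefiltering at all."""
    return [frame[(n % 2)::2, :] for n, frame in enumerate(frames)]


def filtered_fields(frames: np.ndarray) -> list:
    """Same field structure, but each frame is vertically low-pass
    filtered first to reduce aliasing / interline twitter."""
    fields = []
    for n, frame in enumerate(frames):
        padded = np.pad(frame.astype(np.float32), ((1, 1), (0, 0)), mode="edge")
        lp = (padded[:-2] + 2.0 * padded[1:-1] + padded[2:]) / 4.0
        fields.append(lp[(n % 2)::2, :].astype(frame.dtype))
    return fields

A real broadcast downconversion would use a better vertical filter than [1 2 1]/4, but even that crude one makes the point: the "cheap" conversion hands the interlaced encoder a much harder signal.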

Cheers,
David.