5th January 2010, 16:16   #31
2Bdecided
Registered User
Join Date: Dec 2002
Location: UK
Posts: 1,673
Quote:
Originally Posted by knutinh
Anyways, using interlacing as an extra layer of lossy compression makes little sense. If interlacing is/was a good way of removing bits while keeping quality, then MPEG/ITU codecs would do interlacing internally on progressive signals.
Of course they don't, even though interlacing does (at least partly) achieve the gains it's supposed to. That's why it's used. It's not a conspiracy, and it's not a mistake - it actually works (i.e. gives better quality at a given bitrate, or lower bitrates at a given quality), even with H.264, provided the encoder handles interlacing well enough.
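
To make the "extra layer of lossy compression" point concrete, here is a minimal sketch (my own illustration, assuming numpy arrays holding luma planes) of what interlacing does to the raw signal before any codec sees it: each frame contributes only one field, so half the lines are discarded up front.

Code:
import numpy as np

def to_fields(frames):
    """Split a progressive sequence into alternating fields (top field first).

    frames: list of 2-D numpy arrays, e.g. 1080p50 luma planes.
    Each frame contributes one field (half its lines), so the raw
    sample rate is halved - the "lossy" step interlacing performs
    before the codec ever sees the signal.
    """
    fields = []
    for i, frame in enumerate(frames):
        if i % 2 == 0:
            fields.append(frame[0::2, :])   # top field: even lines
        else:
            fields.append(frame[1::2, :])   # bottom field: odd lines
    return fields

# Fifty 1080-line frames (1 second of 1080p50) become fifty 540-line
# fields (1 second of 1080i25), i.e. half the samples.
frames = [np.zeros((1080, 1920), dtype=np.uint8) for _ in range(50)]
fields = to_fields(frames)
print(len(fields), fields[0].shape)   # 50 (540, 1920)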

The other reason is that 1080 is a bigger number than 720 (and does look sharper on most TVs - even the previously common 768-line ones) - but the technology isn't out there to do 1080p50 yet, so you're stuck with interlacing.
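
Rough arithmetic behind that (a quick sketch of raw luma sample rates only, ignoring chroma and blanking): 1080i25 and 720p50 land in the same ballpark, while true 1080p50 roughly doubles the raw data rate.

Code:
# Raw luma sample rates for the formats being compared.
# 1080i25 carries 50 fields/s of 1920x540 - half the samples of 1080p50.
formats = {
    "1080p50": 1920 * 1080 * 50,
    "1080i25": 1920 * 540 * 50,   # 50 fields per second
    "720p50":  1280 * 720 * 50,
}
for name, rate in formats.items():
    print(f"{name}: {rate / 1e6:.1f} Msamples/s")
# 1080p50: 103.7 / 1080i25: 51.8 / 720p50: 46.1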


It does make logical sense that packaging the (adaptive) interlacing and (adaptive) deinterlacing into the encoder should make it work better than doing it externally - but it adds complexity: more tuning in the encoder, and more work in the decoder. Has anyone ever done it?
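
(H.264 does already have picture- and macroblock-adaptive frame/field coding - PAFF and MBAFF - for interlaced sources, which is related but not quite the same thing as interlacing a progressive signal inside the encoder. Below is a toy sketch of that kind of frame/field decision, assuming numpy arrays and a made-up combing threshold rather than a real rate-distortion decision.)

Code:
import numpy as np

def code_as_fields(frame, threshold=8.0):
    """Toy frame/field decision of the sort an interlace-aware encoder
    makes (per picture in PAFF, per macroblock pair in MBAFF).

    With little motion the two fields line up and frame coding keeps
    full vertical correlation; with motion they "comb" apart and field
    coding does better.  Real encoders decide by rate-distortion cost;
    this just thresholds the mean absolute difference between fields.
    """
    top = frame[0::2, :].astype(np.float64)
    bottom = frame[1::2, :].astype(np.float64)
    combing = float(np.mean(np.abs(top - bottom)))
    return combing > threshold

# Usage: pick a coding mode for each picture in a sequence.
frame = np.random.randint(0, 256, (1080, 1920))
mode = "field" if code_as_fields(frame) else "frame"
print(mode)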

Cheers,
David.