6th January 2010, 13:00   #40
knutinh
Quote:
Originally Posted by 2Bdecided
I only found a single page describing a setup. Was there supposed to be any results?

A good one. Thank you.

Quote:
Yet when it came to launching HD across Europe, broadcasters chose 1080i. The reason they give is that with newer encoders and full HD displays, 1080i is the current sweet spot.
But they are not academics. If the market responds more positively to "1080" than to "720", they will offer it whether or not it is technically "better", won't they?
Quote:
Part of the problem is probably that they don't have 1080p easily available as a delivery format yet.
Why is it a problem to use 720p?
Quote:
It's easier to do the test at SD resolutions, and just as valid. If interlacing is useless, then 720x576p50 at a given bitrate should always look better than 720x576i50 at the same bitrate.
I think that you are right. By using a display with far higher resolution than the content, we effectively "factor the display out" of the comparison.
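
Just to put rough numbers on that comparison (my own back-of-the-envelope arithmetic; the 4 Mbit/s figure is only an assumed example, not taken from any test), here is a small Python sketch of the raw pixel rate each SD format presents to the encoder and the bits per pixel it gets at a shared bitrate:

Code:
# Back-of-the-envelope only: raw (luma) pixel rates of the two SD test
# formats, and the bits per pixel each gets at one shared bitrate.
# The 4 Mbit/s value is an assumed example, not taken from any test.

def pixel_rate(width, lines_per_picture, pictures_per_second):
    """Raw samples per second delivered to the encoder."""
    return width * lines_per_picture * pictures_per_second

bitrate = 4_000_000                    # bits per second (assumed)

p50 = pixel_rate(720, 576, 50)         # 576p50: 50 full frames per second
i50 = pixel_rate(720, 288, 50)         # 576i50: 50 fields of 288 lines

print(f"576p50: {p50 / 1e6:.1f} Mpx/s -> {bitrate / p50:.2f} bits/px")
print(f"576i50: {i50 / 1e6:.1f} Mpx/s -> {bitrate / i50:.2f} bits/px")

So the progressive stream has to code twice the raw pixel rate with the same bits; the test then shows whether interlacing's halving of the data ahead of the encoder still buys anything with a modern codec.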

It might be that tests at 576i/576p/384p should be carried out at larger viewing distances, or on smaller displays, to be representative of 1080i/1080p/720p.
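
A minimal sketch of that scaling, assuming the goal is simply to keep the visual angle per scanline the same as in the HD setup (again my own arithmetic, no figures from the test):

Code:
# Rough sketch: to make SD tests stand in for HD formats, keep the visual
# angle per scanline constant by scaling the viewing distance up (or the
# display height down) by the ratio of line counts. Pairings follow the
# list above; the numbers are illustrative only.

pairs = [
    ("576i", "1080i", 576, 1080),
    ("576p", "1080p", 576, 1080),
    ("384p", "720p",  384,  720),
]

for sd_name, hd_name, sd_lines, hd_lines in pairs:
    factor = hd_lines / sd_lines
    print(f"{sd_name} standing in for {hd_name}: view from {factor:.3f}x the "
          f"distance, or use a display {factor:.3f}x smaller")

Conveniently the factor comes out the same (1.875) for all three pairs, so a single change of viewing distance or screen size covers the whole test.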


I believe that the TV industry is quite conservative. Where IT changes equipment and mindset every 3 years, these guys tend to have 20-year cycles. They have invested heavily in editing equipment and interfaces that are limited to 1080i. The big manufacturers have an interest in differentiating themselves through superior deinterlacing. For cameras, there seems to be a potential advantage in doing native interlaced capture. And for 24p content, they have a working (sort of) delivery channel using 60i/50i.

-k
