6th January 2010, 12:52, #39
knutinh
Quote:
Originally Posted by 2Bdecided
Well, if you have a 1366 x 768 display, then obviously a 1920 width source will look sharper than a 1280 width source in the horizontal direction. There can be no argument there.
Agreed. Smart displays could even benefit from a little subpixel scaling.

Quote:
Whether the 1080 line interlaced version or 720 line progressive version looks sharper depends on the factors you mention. At best, 1080 can look sharper (by as much as the 1080:720 ratio suggests), with dumb deinterlacing they're quite similar, but the 1080i version will visibly bob. It's rare for interlaced signals to be filtered to half the vertical resolution, so suggesting you'll get 540 vs 720 isn't realistic.

Cheers,
David.
The camera info that I found suggests that native 1080i capture works in one of two ways:
1) Non-filtered (at least electronically), meaning that all aliasing allowed by the transfer function of the optics and the optical low-pass filter is let through.
2) Sensor lines #1 and #2 are averaged to produce line #1 of field #1; lines #2 and #3 are averaged to produce line #1 of field #2. I believe this amounts to a vertical 2-tap boxcar pre-filter: it has a null at fs/2, lets through significant aliasing between fs/4 and fs/2, and attenuates some passband detail below fs/4 (see the sketch after this list).
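To make those numbers concrete, here is a minimal Python sketch (my own, not from any camera spec) evaluating the magnitude response of that 2-tap line average:

```python
import numpy as np

# Magnitude response of the 2-tap boxcar h = [0.5, 0.5] from option 2.
# H(f) = 0.5 + 0.5*exp(-j*2*pi*f/fs), so |H(f)| = cos(pi*f/fs):
# unity at DC, roughly -3 dB at fs/4, and a null at fs/2.
fs = 1.0  # sampling frequency normalized to 1
for f in np.linspace(0.0, fs / 2, 5):
    H = abs(0.5 + 0.5 * np.exp(-2j * np.pi * f / fs))
    print(f"f = {f:.3f} fs  ->  |H| = {H:.3f}")
```

Running it shows |H| of about 0.71 at fs/4 and exactly 0 at fs/2, which is why this filter both softens the passband and still passes aliasing between fs/4 and fs/2.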

For cases where interlacing is applied digitally to a progressive source, there should be many more options: either tailor-made static filtering or a scene-adaptive filter cutoff. Do you know anything about what they actually do?
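As an illustration of the static-filtering option, here is a hedged sketch; the 3-tap [1 2 1]/4 vertical kernel and the function name are my own assumptions, not anything a real broadcast chain is known to use:

```python
import numpy as np

# Assumed example of "tailor-made static filtering": vertically low-pass
# a progressive frame with a [1 2 1]/4 kernel, then split it into fields
# by line parity.
def progressive_to_fields(frame: np.ndarray):
    padded = np.pad(frame.astype(float), ((1, 1), (0, 0)), mode="edge")
    filtered = 0.25 * padded[:-2] + 0.5 * padded[1:-1] + 0.25 * padded[2:]
    return filtered[0::2], filtered[1::2]  # top field, bottom field
```

A scene-adaptive variant would presumably swap the fixed kernel for one whose cutoff depends on measured vertical detail or motion, but I have no data on what encoders actually implement.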

For embedding 1080@24p inside 1080@60i (or 1080@25p inside 1080@50i), I think that they should employ no vertical filtering, since both fields of each frame come from the same progressive picture and can be woven back together losslessly.
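For reference, the no-filter case (progressive segmented frame) is trivially lossless; a minimal sketch, with names of my own choosing:

```python
import numpy as np

# No-filtering case: split a progressive frame into two fields by line
# parity, then weave them back; the round trip is exact.
def segment(frame: np.ndarray):
    return frame[0::2], frame[1::2]               # top field, bottom field

def weave(top: np.ndarray, bottom: np.ndarray) -> np.ndarray:
    out = np.empty((top.shape[0] + bottom.shape[0], top.shape[1]), top.dtype)
    out[0::2], out[1::2] = top, bottom            # interleave the lines
    return out
```

Here weave(*segment(frame)) reproduces the original frame exactly, which is why any vertical filtering in this case would only throw away detail.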

BTW, nice to see that hydrogenaudio members are into video as well.

-k