1st January 2010, 17:50 | #21 | Link
Registered User
Join Date: Sep 2006
Posts: 42
Quote:
Especially for 24fps movie content, the case could be made for a 60i container. But it introduces an endless list of possible screw-ups for engineers and content producers...
Quote:
I would have guessed that a high-end interlacer could be content-adaptive, filtering only the moving parts of the scene? Anyways, using interlacing as an extra layer of lossy compression makes little sense. If interlacing were a good way of removing bits while keeping quality, then the MPEG/ITU codecs would do interlacing internally on progressive signals. And then there would be complete end-to-end control of what had been done and how it should be converted. The same can be said about colorspace conversion and decimation, though.
Quote:
I remember reading a Philips paper in which they compared 25p, 50i and 50p encoded as bitrate-constrained MPEG2. The conclusion was that MPEG2 + 50i + HQ deinterlacing had the best rate-distortion characteristics. Perhaps because MPEG2 lacks a deblocking filter?
Quote:
There is some physical stuff in sensors that I do not understand very well: integration time and bandwidth, for instance.

Last edited by knutinh; 1st January 2010 at 17:57.
1st January 2010, 18:10 | #22 | Link
Registered User
Join Date: Jan 2002
Location: France
Posts: 2,856
Quote:
Quote:
1st January 2010, 18:42 | #23 | Link
Registered User
Join Date: Sep 2006
Posts: 42
Quote:
Quote:
Quote:
Quote:
There will typically be an optical lowpass filter in front of the sensor that smears out detail to some degree (one has to show some respect to Nyquist), and optics seldom have a perfect spatial frequency response either.
Quote:
Do you use PSNR, SSIM, or real people? Do you average across all codecs and deinterlacer implementations (optimizing the mean viewer experience), or only over some idealized reference implementation? I think that 1080p60 with good lossy compression will be best on a quality-vs-bandwidth benchmark. But is that all? How do you factor in quality vs price? The BBC concluded that 720p was enough for the UK public as long as screen sizes did not go much beyond 50".

Last edited by knutinh; 1st January 2010 at 18:56.
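As an aside: of those metrics, PSNR is the one that is trivial to compute yourself, which is part of why it gets used so much despite correlating poorly with real viewers. A minimal numpy sketch (the function name and peak default are my own choices):

Code:
import numpy as np

def psnr(ref, test, peak=255.0):
    # Mean squared error between the two frames, in double precision.
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical frames
    return 10.0 * np.log10(peak ** 2 / mse)

SSIM is considerably more involved (local luminance/contrast/structure comparisons over a sliding window) and is usually taken from a library rather than reimplemented.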
2nd January 2010, 08:35 | #25 | Link
Registered User
Join Date: Jun 2009
Location: London, United Kingdom
Posts: 707
Quote:
I always thought SD and HD were treated the same, like Manao did.

Last edited by kieranrk; 2nd January 2010 at 08:39.
2nd January 2010, 14:47 | #26 | Link |
Registered User
Join Date: Mar 2002
Posts: 1,075
HD displays are all progressive, so the old type of flicker on static thin lines (fine text, for instance) doesn't exist any more when the deinterlacer works well (of course, deinterlacers don't always work well). Motion-dependent aliasing is actually more likely for HD, though: whenever something moves vertically at close to an odd multiple of 1 pel per field, the vertical bandwidth is halved.
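A few lines of numpy make the 1-pel-per-field effect concrete: build a pattern of one-pixel-high stripes (vertical detail at Nyquist), move it 1 pel per field, weave the two fields, and the detail vanishes outright. A sketch under idealized assumptions; all the names are mine:

Code:
import numpy as np

H, W = 64, 8

def frame(shift):
    # One-pixel-high horizontal stripes, shifted vertically by `shift` pels.
    stripes = np.tile((np.arange(H) % 2)[:, None], (1, W)).astype(float)
    return np.roll(stripes, shift, axis=0)

woven = np.empty((H, W))
woven[0::2] = frame(0)[0::2]  # top field, sampled at t = 0
woven[1::2] = frame(1)[1::2]  # bottom field, one field (and 1 pel) later

# Both fields sampled the same phase of the moving pattern, so the weave
# contains no extra vertical information: the stripes disappear completely.
print(woven.min(), woven.max())  # -> 0.0 0.0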
A sports broadcast with interlacing and without a flicker filter would be interesting ... the grass would look lovely, I bet.
4th January 2010, 14:45 | #27 | Link
Registered User
Join Date: Sep 2006
Posts: 42
Quote:
http://downloads.bbc.co.uk/rd/pubs/w...les/WHP092.pdf

-k

Last edited by knutinh; 4th January 2010 at 14:48.
5th January 2010, 15:15 | #28 | Link |
Registered User
Join Date: Dec 2002
Location: UK
Posts: 1,673
All UK HD broadcasts are 1080i - there's no 720p - not "even" from the BBC.
And the official line is that while some old tests showed 720p looked better at lower bitrates than 1080i, encoders have improved and 1080i now looks better.

Cheers,
David.
5th January 2010, 15:38 | #29 | Link
Registered User
Join Date: Sep 2006
Posts: 42
Quote:
For 24p-originating content, I think it is feasible to prove that 1080i is "best" for most display devices. I think it is difficult to prove the same for general content (e.g. sport), given that there are many bad deinterlacers out there. My VideoSeven LCD contains one of them :-) The question involves "interlacers", lossy encoders, bitrate, and deinterlacers/scalers. Many variables...

-k
5th January 2010, 16:00 | #30 | Link |
brainless
Join Date: Mar 2003
Location: Germany
Posts: 3,653
But most broadcasters just take a 1080i feed and bob-deinterlace it to 720p.
Their deinterlacers aren't the best either (stairstepping etc.). Then your display will upscale this again to 1080p. I think the content should be left in its original format.
__________________
Don't forget the 'c'! Don't PM me for technical support, please.
5th January 2010, 16:16 | #31 | Link
Registered User
Join Date: Dec 2002
Location: UK
Posts: 1,673
|
Quote:
The other reason is that 1080 is a bigger number than 720 (and does look sharper on most TVs - even on the previously common 768-line ones) - but the technology isn't out there to do 1080p50 yet, so you're stuck with interlacing.

It does make logical sense that packaging the (adaptive) interlacing and (adaptive) deinterlacing into the encoder should make it work better than doing it externally - but it's more complexity: more tuning in the encoder, more work in the decoder. Has anyone ever done it?

Cheers,
David.
5th January 2010, 16:27 | #32 | Link
Registered User
Join Date: Jan 2002
Location: France
Posts: 2,856
Quote:
5th January 2010, 18:23 | #33 | Link
Registered User
Join Date: Sep 2006
Posts: 42
Quote:
For your statements to be generally right, I think one would expect that compressing any original 1080p50 sequence at:

1) 1080@50p, H.264, X Mbps
2) 1080@50i, H.264, X Mbps
3) 720@50p, H.264, X Mbps

would, on average, look best for 2) at any bitrate X. I highly doubt that to be true, but I have read Philips white papers suggesting that they could make 2) come out on top if they used:

A) Philips' advanced deinterlacing
B) MPEG2, which has no deblocking filter
C) constrained bitrates

I think that B) was suggested as an important part of the explanation. The standardization bodies are competitive about compression gain. If integrating interlacing/deinterlacing in the codec resulted in improved picture quality for a given bitrate and a given implementation cost, surely someone would have suggested it and had it implemented in a standard?
Quote:
I think common sense indicates that if the source is progressive (not always true), then doing the interlacing within the codec would give major benefits in image quality, and possibly in total complexity, compared to doing it externally. Advanced deinterlacers do all kinds of "artificial intelligence" that they should not have to do given precise signalling of how the content was actually produced. Motion vectors could be jointly optimized for tracking motion and for describing candidates for filling in missing lines, saving a lot of cycles and giving the luxury of optimizing against the ground truth in the encoder (a bare-bones sketch of what such a deinterlacer does is below).

It might be that I/we are setting the wrong background for the discussion. 1080p50 is not generally the source, and if one made 1080p50 cameras, they would have worse noise performance. If that is the case, then interlacing could be a reasonable technology in the camera to overcome sensor limitations. And it may then be that deinterlacing in the camera to 1080p50 does not increase quality/bitrate sufficiently, but does increase complexity considerably. I don't know.

-k

Last edited by knutinh; 5th January 2010 at 18:53.
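To make the "artificial intelligence" concrete: the core of a motion-adaptive deinterlacer is a per-pixel choice between weaving (where the picture is static) and vertical interpolation (where it moves). A bare-bones numpy sketch of that decision, assuming a top field and a hand-picked threshold; real implementations add edge-directed interpolation and motion compensation, and all names here are mine:

Code:
import numpy as np

def deinterlace_top(prev_f, cur_f, next_f, thresh=10.0):
    # Turn one top field into a full frame: weave where static, bob where moving.
    # cur_f holds the lines we keep; prev_f and next_f are the temporally
    # adjacent bottom fields that could fill in the missing lines.
    h, w = cur_f.shape
    out = np.empty((2 * h, w))
    out[0::2] = cur_f  # kept lines of the current (top) field

    # Crude motion detector: if the surrounding bottom fields disagree,
    # something moved there and weaving would leave combing artifacts.
    moving = np.abs(next_f.astype(float) - prev_f.astype(float)) > thresh

    # Bob candidate: average of the kept lines above and below
    # (np.roll wraps at the bottom edge; a real filter would clamp).
    bob = 0.5 * (cur_f + np.roll(cur_f, -1, axis=0))

    # Weave candidate: the co-sited lines from the neighbouring bottom fields.
    weave = 0.5 * (prev_f.astype(float) + next_f.astype(float))

    out[1::2] = np.where(moving, bob, weave)
    return out

With in-codec signalling, or ground truth available in the encoder, the `moving` mask would not have to be guessed at all.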
5th January 2010, 22:18 | #34 | Link |
Registered User
Join Date: Mar 2002
Posts: 1,075
|
It can look sharper in theory, but only if you allow aliasing ... if you remove all the aliasing, you are essentially halving the vertical resolution (which is why some aliasing is generally left in, and the vertical bandwidth is only reduced to ~70% instead).
6th January 2010, 00:44 | #35 | Link
Registered User
Join Date: Sep 2006
Posts: 42
Quote:
...reliably exchanging metadata) would be a system capable of 1920x540@50p <-> 1920x1080@25p and anything in between, on a spatial/temporal as-needed basis. When doing something like a seemingly perfect 1920x1080@50p linear pan, that would be based on (possibly sensible) assumptions about how scenes are captured, but it would still make bad errors at times.

Superresolution systems depend on aliased input. I think that they overlap a lot with interlacing (at least in theory). Do you think that it is possible to use correlation to estimate whether the interlacer used "field integration mode" or "frame integration mode" (or possibly some 70% vertical filtering), and use that information to select between different levels of deinterlacer aggressiveness?

-k
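For what it is worth, here is one crude way such an estimate could work; this is entirely my own construction, and the decision threshold would have to be calibrated on known material. On a static woven frame, field integration makes adjacent lines share a source line, so the ratio of distance-1 to distance-2 line differences drops compared to unfiltered frame integration:

Code:
import numpy as np

def interfield_smoothness(woven_static):
    # Mean absolute difference between lines 1 apart (opposite fields)
    # versus lines 2 apart (same field). Field integration (2-tap boxcar)
    # pulls this ratio down; frame integration leaves it higher.
    x = woven_static.astype(float)
    d1 = np.mean(np.abs(x[1:] - x[:-1]))
    d2 = np.mean(np.abs(x[2:] - x[:-2]))
    return d1 / max(d2, 1e-9)

A deinterlacer could then map a low ratio to gentle, trusting reconstruction, and a high ratio (lots of alternating-line energy) to more aggressive anti-alias filtering.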
6th January 2010, 12:20 | #36 | Link
Registered User
Join Date: Dec 2002
Location: UK
Posts: 1,673
|
Quote:
Whether the 1080-line interlaced version or the 720-line progressive version looks sharper depends on the factors you mention. At best, 1080 can look sharper (by as much as the 1080:720 ratio suggests); with dumb deinterlacing they're quite similar, but the 1080i version will visibly bob. It's rare for interlaced signals to be filtered to half the vertical resolution, so suggesting you'll get 540 vs 720 isn't realistic (with the ~70% filtering mentioned above, 1080 x 0.7 is roughly 756 effective lines, still more than 720).

Cheers,
David.
6th January 2010, 12:35 | #37 | Link
Registered User
Join Date: Dec 2002
Location: UK
Posts: 1,673
|
Quote:
http://www.ebu.ch/CMSimages/en/tec_e...tcm6-46693.pdf

In 2006, the EBU (including the IRT, SVT, etc.) tried very hard to convince everyone that 1080i wasn't worth it. A year earlier, the IRT were doing demonstrations of this, intentionally using the worst MPEG-4 codec they could find with respect to interlacing capability! Yet when it came to launching HD across Europe, broadcasters chose 1080i. The reason they give is that with newer encoders and full-HD displays, 1080i is the current sweet spot. Part of the problem is probably that they don't have 1080p easily available as a delivery format yet.

It's easier to do the test at SD resolutions, and just as valid. If interlacing is useless, then 720x576p50 at a given bitrate should always look better than 720x576i50 at the same bitrate. With x264, I think that might be true. But broadcasters are saying that the hardware encoders they have available don't give this result. I haven't seen any published tests that match this - quite the opposite...

http://ip.hhi.de/imagecom_G1/assets/..._hdtv_2008.pdf

...either the broadcasters are lying - or encoders have changed since that paper was written. Note: they create the 1080i50 signal using about the worst possible method in that paper.

Cheers,
David.
6th January 2010, 12:50 | #38 | Link
Registered User
Join Date: Jan 2002
Location: France
Posts: 2,856
|
Quote:
6th January 2010, 12:52 | #39 | Link
Registered User
Join Date: Sep 2006
Posts: 42
Quote:
Quote:
1) Non-filtered (at least electronically), meaning that you let through all the aliasing allowed by the transfer function of the optics and the optical lowpass filter.

2) Sensor lines #1 and #2 are averaged to produce line #1 of field #1; lines #2 and #3 are averaged to produce line #1 of field #2. I believe this amounts to a vertical 2-tap boxcar pre-filter: it has a null at fs/2, lets through significant aliasing between fs/4 and fs/2, and attenuates some passband detail below fs/4 (see the quick check below).

For cases where interlacing is applied digitally to a progressive source, there should be many more options: either tailor-made static filtering, or a scene-adaptive filter cutoff. Do you know anything about what they actually do? For embedding 1080@24p inside 1080@60i (or 1080@25p inside 1080@50i), I think that they should employ no vertical filtering at all.

BTW, nice to see that hydrogenaudio members are into video as well.

-k
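The numbers in 2) fall straight out of the boxcar's transfer function: for y[n] = (x[n] + x[n+1]) / 2, |H(f)| = |cos(pi*f/fs)|. A quick numpy check:

Code:
import numpy as np

# Magnitude response of the 2-tap boxcar y[n] = (x[n] + x[n+1]) / 2,
# at fractions of the progressive vertical sampling rate fs.
f = np.array([0.0, 0.25, 0.5])   # DC, fs/4, fs/2
H = np.abs(np.cos(np.pi * f))    # |H(f)| = |cos(pi * f / fs)|
for fi, h in zip(f, H):
    print(f"f = {fi:.2f}*fs  |H| = {h:.3f}")

# f = 0.00*fs  |H| = 1.000   (DC passes untouched)
# f = 0.25*fs  |H| = 0.707   (already -3 dB at fs/4)
# f = 0.50*fs  |H| = 0.000   (the null at fs/2)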
6th January 2010, 13:00 | #40 | Link
Registered User
Join Date: Sep 2006
Posts: 42
Quote:
A good one. Thank you.
Quote:
Quote:
Quote:
It might be that tests at 576i/576p/384p should be carried out at larger distances/smaller displays to be representative of 1080i/1080p/720p.

I believe that the TV industry is quite conservative. Where IT changes equipment and mindset every 3 years, these guys tend to have 20-year cycles. They have invested heavily in editing equipment and interfaces that are limited to 1080i. The big manufacturers have an interest in differentiating themselves through superior deinterlacing. For cameras, there seems to be a potential advantage in doing native interlaced capture. For 24p content, they have a working (sort of) channel using 60i/50i.

-k

Last edited by knutinh; 6th January 2010 at 13:15.
Tags |
content, deinterlace, interlaced, progressive, quality |