Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
|
|
Thread Tools | Search this Thread | Display Modes |
14th March 2011, 12:03 | #1 | Link |
Registered User
Join Date: Mar 2011
Posts: 2
|
Testing DVD progressive scan (MPEG decoder + deinterlacer)
Hello everybody,
I am trying to set up a simple test bed for comparing the performance of several DVD players, all equipped with HDMI output and declared Full HD (1080p) compliant by the supplier. I bought the DVD Demystified test disc and I am playing with it. One video is repeated at different bitrates, from 1.0 Mbps up to 9.8 Mbps. Surprisingly, when I play the video on a player connected via HDMI to a big 1080p LCD screen, I cannot see any difference in quality for rates higher than 5.0 Mbps. Do you think the chip (SUNPLUS SPHE8203R) mounted in the player cannot handle high bitrates and simply fails to decode the image completely? (I am not an expert on MPEG, sorry, but I am trying to improve...) I can say that connecting to the TV via YUV or via HDMI gives the same result. How is that possible?
About the deinterlacing function: I ran a few tests with black-and-white patterns containing thin horizontal lines, and I see those lines flickering. How can I check whether the deinterlacer is working properly? My idea was to capture the digital video using a device like a Blackmagic Intensity Pro, but then how can I check that: 1) the MPEG elementary stream is not properly decoded, and 2) the deinterlacer is or is not able to detect a stream in film mode?
Thank you in advance for your help. Andrea |
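One way to put a number on the flickering described above, once frames have been captured with something like the Intensity Pro: in a frame whose two fields come from different moments in time (a failed deinterlace), adjacent scanlines disagree far more than scanlines two apart. This is only a rough sketch assuming numpy; the synthetic frames below stand in for real captures, and the function name and threshold are illustrative, not from any capture API.

```python
import numpy as np

def combing_score(frame: np.ndarray) -> float:
    """Ratio of cross-field line difference to same-field line difference.
    Close to 1.0 for clean progressive frames, noticeably larger for combed ones."""
    f = frame.astype(np.float64)
    adjacent = np.abs(f[1:] - f[:-1]).mean()        # line n vs n+1 (crosses fields)
    same_field = np.abs(f[2:] - f[:-2]).mean() / 2  # line n vs n+2, normalised
    return adjacent / max(same_field, 1e-9)

# Synthetic demo: a smooth vertical gradient (progressive) vs the same
# gradient with its odd lines displaced, mimicking two misaligned fields.
progressive = np.tile(np.linspace(0, 255, 240)[:, None], (1, 320))
combed = progressive.copy()
combed[1::2] = np.roll(combed[1::2], 40, axis=0)    # displace the odd field

print(combing_score(progressive))   # close to 1
print(combing_score(combed))        # much larger
```

Run over the captured frames of a moving test pattern, a score that stays near 1 suggests the deinterlacer is weaving fields correctly; spikes on motion suggest it is not.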
14th March 2011, 13:46 | #2 | Link | |
Registered User
Join Date: Mar 2009
Location: Germany
Posts: 5,769
|
Quote:
A good DVD encoder (and professional HW encoders are extremely good) can yield an almost perfect image by 6-6.5 Mbps and extremely few artifacts at 9.8. The latter bitrate was chosen after a panel of specialists found that increasing the bitrate beyond 10 Mbps brings no further advantage.
__________________
Born in the USB (not USA) |
|
15th March 2011, 16:53 | #4 | Link | |
Registered User
Join Date: Mar 2009
Location: Germany
Posts: 5,769
|
Quote:
You are probably thinking of VCD, where the bitrate was chosen so that 80 min of film would fit, and since it was still low, they dropped the third error-correction layer, too.
__________________
Born in the USB (not USA) |
|
15th March 2011, 16:58 | #5 | Link |
Banned
Join Date: Mar 2004
Location: PA, US
Posts: 683
|
No, it was chosen because the DVD mechanism reads at 10.5 Mbit/s. You also have container overhead and need a safety margin in case of scratches, to keep the buffer full. The DVD specification limits the video bitrate to 9.8 Mbit/s for a physical reason. Surely there are cases where the picture CAN and WILL benefit from much more than 9.8 Mbit/s, but the DVD mechanism cannot read faster than that, so they imposed a limit.
Oh, correction, it's 10.08 Mbit/s. Last edited by ramicio; 15th March 2011 at 17:01. |
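The arithmetic behind the numbers traded in this exchange can be sketched quickly: the 9.8 Mbit/s video cap sits just under the multiplexed stream rate, leaving room for audio and overhead, and the same figures determine how many minutes fit on a single-layer disc. A back-of-the-envelope sketch (the constants are the approximate figures from the discussion, not authoritative spec values):

```python
# Rough DVD bitrate math; treat the exact constants as approximate.
DISC_BYTES = 4.7e9          # single-layer DVD-5 capacity (decimal gigabytes)
MUX_RATE_MBPS = 10.08       # maximum multiplexed program-stream rate
VIDEO_MAX_MBPS = 9.8        # DVD-Video limit for the video elementary stream

def minutes_on_disc(video_mbps, audio_mbps=0.448):
    """Playing time of a DVD-5 at a given average video + audio bitrate."""
    total_bps = (video_mbps + audio_mbps) * 1e6
    return DISC_BYTES * 8 / total_bps / 60

# At ~5 Mbps average video (a typical movie), a 90-minute title fits easily;
# at the 9.8 Mbps cap the same disc holds only about an hour.
print(f"{minutes_on_disc(5.0):.0f} min at 5.0 Mbps")
print(f"{minutes_on_disc(9.8):.0f} min at the 9.8 Mbps cap")
```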
15th March 2011, 17:10 | #6 | Link |
Registered User
Join Date: Mar 2009
Location: Germany
Posts: 5,769
|
Of course the reading speed played an important part in the equation, but if the drive had allowed only 5 Mbps (for the sake of argument), the maximum would not have been set at 6 Mbps; the drive would have been improved instead. And nowadays drives read at up to 24x (at least near the outer edge). The deciding factor was the quality of the image.
__________________
Born in the USB (not USA) |
15th March 2011, 17:17 | #7 | Link |
Banned
Join Date: Mar 2004
Location: PA, US
Posts: 683
|
No, it is limited to 9.8 Mbit/s because the drive is physically incapable of reading faster than that. What part of 1x do you not understand? Just because there are drives that read at 24x doesn't mean anything. Bitrate caps on video are ALWAYS based on the media's 1x speed. Video averages ~4-5 Mbit/s anyway... If the drive had been limited to 5 Mbit/s and they felt the quality was bad, they would not have redesigned the physical media around that; they would have adopted a better compression algorithm. Why do you think Blu-ray started out with MPEG-2 on ALL titles and is now mostly h.264, with a few VC-1 titles here and there?
|
15th March 2011, 17:24 | #8 | Link |
Registered User
Join Date: Mar 2009
Location: Germany
Posts: 5,769
|
Because that HD material was already in MPEG-2 format (MPEG-2 was used mainly for documentaries, commercials/demos and the like). It was sent over the air as DVB-S (the old norm, not DVB-S2), it was displayed on proprietary devices at various showrooms and fairs, and so on. The films, on the other hand, were scanned, processed and easily re-encoded, as they are stored uncompressed.
__________________
Born in the USB (not USA) |
15th March 2011, 17:30 | #9 | Link |
Banned
Join Date: Mar 2004
Location: PA, US
Posts: 683
|
MPEG-2 for HD broadcast was used for just that, not for archival. It's encoded on the fly, too, with a sacrifice in quality. Other countries even used h.264 for HD broadcast before Blu-ray came out... so your point makes no sense.
|
15th March 2011, 17:55 | #10 | Link | |
Registered User
Join Date: Mar 2009
Location: Germany
Posts: 5,769
|
Quote:
@atessadri: what device does the upscaling and how?
__________________
Born in the USB (not USA) |
|
15th March 2011, 18:12 | #13 | Link |
Banned
Join Date: Mar 2004
Location: PA, US
Posts: 683
|
Yes, because they came out when Blu-ray came out... It was all the Blu-ray standard allowed then. They also don't need to dazzle people with Blu-ray so much anymore, so they aren't going to release new demo discs in a new video format for free. I really don't understand your point here... or most of your points, for that matter. If some new standard for cinema came out, like a higher frame rate, then you could be damn sure the Blu-ray demo discs would be h.264.
|
15th March 2011, 18:24 | #14 | Link |
Registered User
Join Date: Mar 2009
Location: Germany
Posts: 5,769
|
It's highly off-topic, but I'll try to explain it to you once and for all:
Long before BD was launched, there were HD TVs and HD transmissions. I live in Europe; I don't know how things were in the States or in Japan. To sell those TVs, which had an awful image with SD aerial/cable channels, most manufacturers offered a proprietary device that fed the TV with HD images (Philips had one, Sharp had another, Samsung too, and Sony had the prototype cartridged Blu-ray, and so on). They were all MPEG-2. Some shops ran HD shows/loops from satellite, all of which were MPEG-2 at that time and sent using the old DVB-S protocol. Several years later, the same demos were put onto Blu-ray discs, as a unified standard now existed. That's the story with MPEG-2 and HD. After movies began to represent the bulk of the HD content, the other two codecs gained importance.
__________________
Born in the USB (not USA) |
15th March 2011, 18:35 | #15 | Link |
Banned
Join Date: Mar 2004
Location: PA, US
Posts: 683
|
The videos are not stored as MPEG-2; they are archived losslessly. What is broadcast is not the same as what's on the disc. Anything broadcast is encoded on the fly, quite inefficiently too. They used MPEG-2 on the Blu-ray demos because that's the format that was available to Blu-ray at the time. They used MPEG-2 on this magical device and for broadcasts before that because that's what was predominantly broadcast. The US still uses MPEG-2 for broadcast, for darn's sake. There were a few channels back then that did use h.264, but it wasn't widely adopted yet.
My point is: the media comes first. From there they decide a read speed from the disc size versus a common title length (90 minutes), and then they choose a video codec. MPEG-2 was considered decent and was supported more widely than anything else in the world. AVC took over once more computational power and hardware support came along. The post was supposed to be in all caps because of how pissed I am. They don't base whole standards around a few demos in some retail stores. Last edited by Guest; 9th June 2012 at 01:14. Reason: 4 |
16th March 2011, 17:26 | #16 | Link |
Registered User
Join Date: Mar 2011
Posts: 2
|
Why does my DVD player connected to an LCD screen look so bad?
Hello guys,
thanks a lot for starting a nice discussion on the origins of the DVD. Since you mentioned that MPEG-2 is fine, and that in every shop the demos played on big LCD screens are all MPEG-2, I am really wondering why my DVD player connected to a big Panasonic LCD screen looks so bad. How can I test whether the deinterlacer is working properly, or whether the MPEG-2 at 9.8 Mbps is completely decoded? I mean, whether I connect the player via composite video or via the HDMI connector, I don't see any difference. Could you please help me set up a proper test bed to compare several DVD players? Thanks again for your interest. Sincerely, Andrea |
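On the film-mode half of the question: telecined (2:3 pulldown) material repeats one field out of every five, so in a captured field sequence the difference between fields N and N-2 drops to near zero in a regular 5-field rhythm; a deinterlacer in film mode should detect this and weave the original frames back. A minimal sketch with synthetic data, assuming numpy; the 2:3 pattern here is deliberately simplified (it ignores the field-parity bookkeeping of real telecine), and all names are illustrative:

```python
import numpy as np

def make_telecined_fields(n_film_frames=12, h=64, w=64):
    """Build a simplified 2:3 pulldown field list from moving synthetic frames."""
    fields = []
    for i in range(n_film_frames):
        frame = np.zeros((h, w))
        frame[:, (i * 4) % w] = 255          # a bar moving 4 px per film frame
        top, bottom = frame[0::2], frame[1::2]
        # 2:3 cadence: frames alternately contribute 2 and 3 fields
        if i % 2 == 0:
            fields += [top, bottom]
        else:
            fields += [top, bottom, top]      # third field repeats the top
    return fields

fields = make_telecined_fields()
# Compare each field with the one two earlier; repeats give a zero difference.
diffs = [np.abs(a - b).mean() for a, b in zip(fields[2:], fields[:-2])]
zeros = [j for j, d in enumerate(diffs) if d < 1e-6]
gaps = {b - a for a, b in zip(zeros, zeros[1:])}
print(gaps)  # → {5}: a repeat every 5 fields indicates 2:3 pulldown
```

Applied to fields captured from the player's output, a clean 5-field rhythm in the near-zero differences would show the source is film-mode material; whether the player's deinterlacer exploits it is then visible as combing (or its absence) in the deinterlaced frames.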
16th March 2011, 17:29 | #17 | Link |
Banned
Join Date: Mar 2004
Location: PA, US
Posts: 683
|
Maybe it's just the DVD source that is bad. There is no such thing as "not fully decoding"... well, there is: it's called no picture whatsoever, or a totally corrupt one (like the corruption you see when a set-top box loses part of the signal).
|
16th March 2011, 17:38 | #18 | Link |
Registered User
Join Date: Mar 2009
Location: Germany
Posts: 5,769
|
Bad is a dog that bites, bad is a dog that doesn't eat, bad is a dog that s**ts in the neighbour's garden ....
I asked you once already: what is "bad", and what counts as "no improvement"? And remember that Pannies always have to be set up for progressive material.
__________________
Born in the USB (not USA) |
Tags |
chip, deinterlacer, hdmi, mpeg, test |