Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.
23rd September 2017, 16:12 | #1
Registered User
Join Date: Jul 2006
Posts: 13
Linux Command Line utility for comparing quality between media
Hi, I have a lot of video media I am archiving and often get duplicates. I'm looking to automate the process of determining which copy to keep by comparing the two copies, ensuring they are identical (ie. runtime, frames) and seeing which one has the better encoding (i.e. resolution, drf, interlacing vs progressive, cropping, audio sync, etc.).
I looked around for an appropriate tool, but they are mostly GUI based and don't make a decision. The outcomes I would like is Archived is better than New, New is better than Archived, Media is different (i.e. different media) and unable to determine (i.e. requires manual validation). I know there's a lot of complexity in this but I'm wondering if I can reduce most manual cases down. Is there an application / script that can perform such analysis on two files? |
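No such tool comes up in the thread, but the "are these even the same media" pre-check plus a crude tie-break could be sketched roughly like this. Everything here is my own assumption (field names, tolerances, the `classify` helper); in practice the metadata dicts would be filled from ffprobe or MediaInfo output:

```python
# Hypothetical sketch: classify two copies of a video from container metadata.
# Field names and tolerances are assumptions, not from any real tool; the
# dicts would normally be populated from ffprobe/MediaInfo output.

def classify(archived, new, duration_tol=0.5):
    """Return one of: 'different', 'archived_better', 'new_better', 'manual'."""
    # Different runtime or frame count -> almost certainly different media.
    if abs(archived["duration"] - new["duration"]) > duration_tol:
        return "different"
    if archived["frames"] != new["frames"]:
        return "different"
    # Same media: prefer the higher-resolution copy as a crude heuristic
    # (see the replies below for why blind metrics like this can mislead).
    a_px = archived["width"] * archived["height"]
    n_px = new["width"] * new["height"]
    if a_px > n_px:
        return "archived_better"
    if n_px > a_px:
        return "new_better"
    # Identical on every blind metric we checked: needs eyeballing.
    return "manual"

a = {"duration": 5400.0, "frames": 129600, "width": 1920, "height": 1080}
b = {"duration": 5400.1, "frames": 129600, "width": 1280, "height": 720}
print(classify(a, b))  # archived_better
```

This only automates the easy rejections; the "which encode looks better" half is exactly the part the replies below argue cannot be decided reliably from blind metrics.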
23rd September 2017, 21:10 | #2
Registered User
Join Date: Sep 2007
Posts: 5,377
Quote:
Probably not, because there is no way to determine accurately or consistently which of two files is "better". E.g. something might have a higher resolution, higher DRF, higher bitrate, higher anything, and yet be "worse" in quality. There is no good way to automate that type of detection with any accuracy. If you had a third file, the "original", then yes, you could probably run SSIM/PSNR metrics with ffmpeg as part of the script. You're probably better off doing it yourself, because there is way too much room for error.
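For context on what those full-reference metrics measure: PSNR is just a mean-squared-error formula. A toy version (my own illustration for two tiny grayscale "frames", not what ffmpeg's `psnr` filter does internally, though that filter is what you'd actually run) looks like:

```python
import math

# Toy PSNR between two equally-sized 8-bit grayscale "frames" (flat lists of
# pixel values). Illustration only -- in practice you would compare an encode
# against the original with ffmpeg's psnr/ssim filters instead.

def psnr(frame_a, frame_b, max_val=255):
    mse = sum((a - b) ** 2 for a, b in zip(frame_a, frame_b)) / len(frame_a)
    if mse == 0:
        return float("inf")  # identical frames
    return 10 * math.log10(max_val ** 2 / mse)

original = [10, 50, 200, 255]
encode   = [12, 48, 199, 250]
print(round(psnr(original, encode), 1))  # ~38.8 dB
```

Note this only works because a reference exists; with just two unrelated encodes of unknown provenance, as in the OP's situation, there is nothing trustworthy to measure against.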
23rd September 2017, 22:58 | #3
Software Developer
Join Date: Jun 2005
Location: Last House on Slunk Street
Posts: 13,248
Quote:
__________________
Go to https://standforukraine.com/ to find legitimate Ukrainian Charities 🇺🇦✊

Last edited by LoRd_MuldeR; 23rd September 2017 at 23:01.
24th September 2017, 20:02 | #4
Registered User
Join Date: Jul 2006
Posts: 13
I think a CLI that can provide blind metrics would be enough as a starting point; with that I can filter out most of the gross defects and then rely on eyeballing the rest for validation.
I saw a very old tool called AVInaptic, but it doesn't seem to compile/run on a modern Linux distro, and it looks like it's GUI only?
24th September 2017, 20:30 | #5
Software Developer
Join Date: Jun 2005
Location: Last House on Slunk Street
Posts: 13,248
Quote:
Please don't make the mistake of judging quality by bitrate or by quantizer values! A higher bitrate (or lower quantizer) does not imply better quality: for example, the higher-bitrate (or lower-quantizer) encode may actually have been created from a worse source, so a lower-bitrate (or higher-quantizer) encode made from a higher-quality source can actually look way better!

Also, the resulting quality depends a lot on the encoder used and on the encoder's settings. A "bad" encoder can produce very poor quality even if you crank up the bitrate! At the same time, a "good" encoder may produce decent quality at very low bitrates - from the exact same source.

Last but not least: modern video formats, such as H.264, support a feature called "adaptive quantization", which means that every block in the frame can have its own separate quantizer. Usually, each block in a frame will store its quantizer as an offset from the frame's "base" quantizer. Consequently, tools that show you the frame quantizers - and that's what most tools do - will not tell you anything about the actual per-block quantizers. Not that the per-block quantizers would tell you much about the video's actual quality, for the reasons explained before...
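To make the adaptive-quantization point concrete (the numbers below are invented for illustration; real H.264 streams signal this per macroblock as a QP delta):

```python
# Invented numbers illustrating H.264-style adaptive quantization: each
# block's effective quantizer is the frame's base QP plus a per-block delta.

base_qp = 23  # what a simple "show me the frame quantizer" tool would report
block_deltas = [-6, -4, 0, 3, 8, 10]  # per-block offsets (detailed vs. flat areas)

effective_qps = [base_qp + d for d in block_deltas]
print(effective_qps)            # [17, 19, 23, 26, 31, 33]
print(min(effective_qps), max(effective_qps))  # 17 33
```

The single frame-level value (23) hides a wide spread of actual block quantizers, which is why frame-QP readouts say so little.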
__________________
Go to https://standforukraine.com/ to find legitimate Ukrainian Charities 🇺🇦✊

Last edited by LoRd_MuldeR; 24th September 2017 at 20:44.