24th September 2018, 00:03 | #1
Registered User
Join Date: Feb 2018
Posts: 8
x264 unnecessarily(?) converts to 16-bit when reencoding 10-bit file
When opening a 10-bit H.264 MKV file directly with x264, using either the lavf or ffms demuxer and the following command line:
Code:
x264.exe "F:\10bitfile.mkv" --demuxer lavf -o "output.mkv" - --profile high10 --input-depth 10 --output-depth 10 --cabac --ref 5 --deblock 0:0 --partitions all --me umh --subme 11 --psy-rd 1.00:0.10 --merange 32 --trellis 2 --8x8dct --cqm flat --deadzone-inter 21 --deadzone-intra 11 --no-fast-pskip --chroma-qp-offset -3 --bframes 16 --b-pyramid normal --b-adapt 2 --b-bias 0 --direct auto --weightp 2 --keyint 240 --min-keyint 24 --scenecut 40 --rc-lookahead 60 --no-mbtree --crf 18 --qcomp 0.70 --qpmin 0 --qpmax 81 --qpstep 4 --nal-hrd none --ipratio 1.40 --pbratio 1.30 --aq-mode 3 --aq-strength 0.70 Code:
resize [warning]: converting from yuv420p10le to yuv420p16le Now, I've read somewhere that x264 works internally in either 8 or 16 bits, so when encoding to 10 bits, the 16-bit mode is used and the input needs to be converted to that bit depth. But since this was written by some random user, rather than a developer, I'm not sure I can trust it. If that is actually the case, why isn't the alert displayed when input is piped from VS? And should I even care? Theoretically, if x264 simply maps 10-bit values into 16-bit space, then quantizing it back to 10-bit could be simply done by truncating and should not require any dither, and therefore output should be identical to input. The question is: is this what x264 is doing? Last edited by rekweom; 24th September 2018 at 00:06. |
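To make the lossless-round-trip reasoning concrete, here is a minimal sketch. It only illustrates the argument in the paragraph above, under the assumption that 10-bit samples are promoted to 16-bit storage by a plain left shift; it is not x264's actual internal code, and the function names are made up for the example.

```python
# Hypothetical sketch (not x264 internals): if promotion to 16-bit
# storage is a plain left shift, truncating back is lossless.

def promote_10_to_16(v10: int) -> int:
    # Place the 10-bit value in the top bits of a 16-bit word.
    return v10 << 6

def truncate_16_to_10(v16: int) -> int:
    # Drop the low 6 bits -- no rounding or dither needed.
    return v16 >> 6

# Every possible 10-bit value survives the round trip unchanged.
assert all(truncate_16_to_10(promote_10_to_16(v)) == v
           for v in range(1024))
print("round trip is bit-exact for all 1024 values")
```

If this assumption holds, the intermediate 16-bit conversion would indeed be harmless; it would only matter if some filter actually processed the data at 16 bits in a way that writes into the low bits before the final truncation.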