Quote:
Originally Posted by Boulder
This is probably a silly question, but here goes anyway: if I use --hdr-opt, do I need to feed the encoder with 10-bit data or is 16-bit data as good if the source is a standard UHD with HDR? I always process things in 16-bit domain and let the encoder dither down to 10 bits.
The actual x265 encoder instance will start encoding with 10-bit 4:2:0 pixels either way. The conversion happens somewhere upstream, possibly even inside the x265 executable itself, but that's a filter stage that runs before the actual codec.
If you're reducing bit depth in x265, remember to always use --dither.
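As a concrete sketch of what that looks like, here is one possible x265 CLI invocation feeding 16-bit raw YUV and letting the encoder dither down to 10 bits. The file name and resolution are made up for illustration; the flags (--input-depth, --output-depth, --dither) are standard x265 CLI options.

```shell
# Hypothetical example: source.yuv is raw 16-bit 4:2:0 UHD HDR video.
# x265 reads the 16-bit pixels, dithers down to 10 bits internally,
# then encodes at 10-bit depth.
x265 --input source.yuv --input-res 3840x2160 --fps 23.976 \
     --input-depth 16 --output-depth 10 --dither \
     --hdr-opt --output out.hevc
```

Without --dither, the depth reduction is a plain truncation, which can introduce banding in smooth gradients; dithering trades that for fine noise.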