28th October 2018, 01:35   #6470
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by Boulder
This is probably a silly question, but here goes anyway: if I use --hdr-opt, do I need to feed the encoder 10-bit data, or is 16-bit data just as good if the source is a standard UHD with HDR? I always process things in the 16-bit domain and let the encoder dither down to 10 bits.
The actual x265 encoder instance is going to start encoding from 10-bit 4:2:0 pixels one way or another. Your 16-bit data gets converted somewhere upstream, perhaps even inside the x265 exe itself, but that conversion is a filter that runs before the actual codec.

If you’re changing bit depth in x265, remember to always use --dither.
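
For what it’s worth, a minimal sketch of that kind of pipeline, assuming a VapourSynth script that outputs 16-bit 4:2:0 and an x265 build with 10-bit support (the script name and the rest of the settings are placeholders, not recommendations from this thread):

Code:
# 16-bit 4:2:0 is piped in via Y4M; x265 dithers it down to its 10-bit internal depth
vspipe --y4m hdr_script_16bit.vpy - | x265 --y4m - \
    --output-depth 10 --dither --hdr-opt \
    --colorprim bt2020 --transfer smpte2084 --colormatrix bt2020nc \
    --output out.hevc

For actual HDR10 content you’d normally also carry over the mastering display and content light level metadata from the source with --master-display and --max-cll.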
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book