Quote:
Originally Posted by LigH
Just chatted a bit in IRC ... gnafu believes that CONFIG_LOWBITDEPTH=1 is a) still necessary to be set at compile time, b) in general useful, because it enables an optimized code path for 8 bit precision only, but does not alter the behaviour of higher bit depths.
I hope he is right.
I tested with CONFIG_LOWBITDEPTH=1 and it is indeed faster. But it was disabled by default for a reason:
Code:
Turn off CONFIG_LOWBITDEPTH by default
CodecWG agreed to have this off for default "C" model.
Commit
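For reference, the flag is passed at configure time. A minimal sketch of enabling it in a libaom CMake build, assuming a source checkout in ./aom and an out-of-tree build directory (both paths are assumptions, not from the commit):

```shell
# Hypothetical layout: libaom source in ./aom, building in ./aom_build.
mkdir -p aom_build && cd aom_build
# CONFIG_LOWBITDEPTH=1 enables the optimized 8-bit-only code path at compile time;
# per the discussion above, it should not alter higher-bit-depth behaviour.
cmake ../aom -DCONFIG_LOWBITDEPTH=1
make
```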