20th November 2019, 00:50   #4
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
In LAV filters, you can disable 10/16bit and let it export as NV12 (8bit). In MadVR OSD stats, you'll see that you can shave off about 10-15ms of rendering time.
And you could lower the render times even further by selecting RGB output, because then the GPU doesn't have to do that work at all. Is this a good idea? No, because there is no reason to touch the image, and if you have performance issues just use D3D11 decoding; you should not be losing 10 ms of render time.
I still wonder how my 1060 can do everything in less than 10 ms, but...
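For a sense of the scale involved, here is a rough back-of-the-envelope sketch of the per-frame data cost of each LAV output format (the 1080p resolution and 24 fps are just example assumptions):

Code:
# Rough data cost per decoded frame for each LAV output format.
W, H = 1920, 1080          # assumed 1080p frame
pixels = W * H

bits_per_pixel = {
    "NV12": 12,    # 8-bit 4:2:0: full-res luma + half-res interleaved chroma
    "P010": 24,    # 10-bit 4:2:0, every sample padded to 16 bits in memory
    "RGB48": 48,   # full-resolution 16-bit R, G and B
}

for name, bpp in bits_per_pixel.items():
    mib = pixels * bpp / 8 / 2**20
    print(f"{name:6s} {mib:5.2f} MiB/frame  {mib * 24:6.1f} MiB/s at 24 fps")

RGB48 moves four times the data of NV12 for the same source, and the CPU has to produce every one of those extra bytes before the GPU even sees the frame.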
Quote:
Placebo people will use 10/16bit. Insane people will use RGB48.
A sane user will not change these settings at all, because they just cost CPU cycles for nothing. Using RGB output of any kind lowers picture quality, because it moves the chroma scaling to the CPU instead of the GPU, which is much faster and more efficient at doing that.
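To make concrete what gets moved to the CPU: before any YCbCr-to-RGB matrix can be applied, the 4:2:0 chroma has to be upsampled to full resolution, and a CPU converter typically uses a cheap filter for that step, while madVR can run a far better scaler on the GPU. A minimal numpy sketch (nearest-neighbour upsampling is assumed here as the cheap case; the matrix coefficients are standard BT.709 limited range):

Code:
import numpy as np

def yuv420_to_rgb(y, u, v):
    """y: (H, W); u, v: (H//2, W//2). Float arrays, BT.709 limited range."""
    # The step the converter must add for RGB output: upsample chroma to 4:4:4.
    # Nearest-neighbour repeat stands in for the cheap CPU-side filter here.
    u_full = u.repeat(2, axis=0).repeat(2, axis=1)
    v_full = v.repeat(2, axis=0).repeat(2, axis=1)
    # Standard BT.709 limited-range YCbCr -> RGB matrix.
    yf = (y - 16.0) / 219.0
    cb = (u_full - 128.0) / 224.0
    cr = (v_full - 128.0) / 224.0
    r = yf + 1.5748 * cr
    g = yf - 0.1873 * cb - 0.4681 * cr
    b = yf + 1.8556 * cb
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)

# 2x2 mid-grey frame: Y=126 with neutral chroma comes out as equal R=G=B.
rgb = yuv420_to_rgb(np.full((2, 2), 126.0), np.full((1, 1), 128.0), np.full((1, 1), 128.0))
print(rgb[0, 0])   # ~[0.502 0.502 0.502]

Whatever filter the CPU converter picks is baked into the image for good; keep the video as 4:2:0 and madVR can apply its own, better chroma scaler instead.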

Quote:
On my really outdated system (old mobo and old cpu), I use NV12 on my 32" 1080p/SDR/TN monitor (not IPS or O/LED; bought it for gaming and HFR interpolating via SVP).
Between NV12 and 10bit, only some scenes show an actual difference to me.
No, you are not using NV12, because your screen doesn't support it as an input and the GPU can't output NV12. The closest would be YCbCr 4:2:0, which is possible with HDMI 2.0, and there is absolutely no reason to do that.
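NV12 is purely an in-memory surface layout that decoders hand to the renderer: one full-resolution Y plane followed by one half-resolution plane of interleaved U/V bytes. A small sketch of that layout (split_nv12 is a hypothetical helper for illustration, not any real API):

Code:
import numpy as np

def split_nv12(buf: bytes, width: int, height: int):
    """Split a raw NV12 frame into Y, U, V planes (hypothetical helper)."""
    y_size = width * height
    # Full-resolution 8-bit luma plane comes first.
    y = np.frombuffer(buf, np.uint8, count=y_size).reshape(height, width)
    # Then one plane of interleaved U,V bytes at half resolution.
    uv = np.frombuffer(buf, np.uint8, count=y_size // 2, offset=y_size)
    uv = uv.reshape(height // 2, width // 2, 2)
    return y, uv[..., 0], uv[..., 1]

# Example: a blank 4x4 frame is 4*4 + 4*4//2 = 24 bytes.
y, u, v = split_nv12(bytes(24), 4, 4)
print(y.shape, u.shape, v.shape)   # (4, 4) (2, 2) (2, 2)

Over the cable the GPU then sends RGB or packed YCbCr; "NV12 to the monitor" simply doesn't exist as an output mode.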