Quote:
Originally Posted by Klaus1189
A high end CPU is not needed when you have a 1050ti which has hardware decoders.
This pretty much sums it up.
I'm using an old, used Xeon with 12 cores; it doesn't even support the AVX extension.
But with a GTX 1060 and 12GB of RAM, I can rip my UHD movies and watch them without any frame drops.
I watch on my PC.
SDR monitor, MPC-HC (development continues on GitHub, where new updates/patches land as commits, so it's best to track the latest ones there), and the latest madVR test builds. There's no static page for those test builds; download links turn up in random posts on the AVSForum threads. If you'd rather stick to a stable release from a static page, use VideoHelp.
In LAV Filters, you can disable 10/16-bit output and let it deliver NV12 (8-bit). In the madVR OSD stats, you'll see that this shaves off about 10-15 ms of rendering time.
Test this, so you can see if you notice a difference in accuracy or not.
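To put those savings in perspective, here's a rough back-of-the-envelope sketch (my own illustration, not anything from madVR itself) comparing per-frame sizes of the pixel formats mentioned here at UHD resolution. Less data per frame means less bandwidth the renderer has to push every frame:

```python
# Rough per-frame memory footprint at UHD (3840x2160) for the
# pixel formats discussed above. Figures are approximate and
# ignore padding/stride, which real drivers add.

WIDTH, HEIGHT = 3840, 2160
PIXELS = WIDTH * HEIGHT

# NV12: 8-bit 4:2:0 -> 1 byte luma per pixel + 0.5 byte chroma
nv12_bytes = PIXELS * 3 // 2
# P010: 10-bit 4:2:0 stored in 16-bit words -> double NV12
p010_bytes = PIXELS * 3
# RGB48: 16 bits per channel, 3 channels -> 6 bytes per pixel
rgb48_bytes = PIXELS * 6

for name, size in [("NV12", nv12_bytes), ("P010", p010_bytes), ("RGB48", rgb48_bytes)]:
    print(f"{name:>6}: {size / 1e6:6.1f} MB/frame, "
          f"{size * 24 / 1e9:5.2f} GB/s at 24 fps")
```

So a 10-bit pipeline roughly doubles the raw frame data over NV12, and RGB48 quadruples it, which lines up with why dropping to NV12 visibly cuts rendering time on older hardware.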
Placebo people will use 10/16-bit. Insane people will use RGB48.
On my really outdated system (old mobo and old CPU), I use NV12 on my 32" 1080p SDR TN monitor (not IPS or OLED; I bought it for gaming and HFR interpolation via SVP).
Between NV12 and 10-bit, only a few scenes show any actual difference to me.