Quote:
Originally Posted by Atak_Snajpera
I'm not interested in hardware encoding at all because hardware encoders suck at fine-detail retention at low bitrates. Another problem: hardware encoding will also be less useful in Distributed Encoding mode, where most laptops/PCs use non-NVIDIA GPUs.
Summary:
- I do not have an NVIDIA GPU, and my next GPU will 100% be an AMD Navi
- Hardware encoders produce a noticeably blurrier image at lower bitrates
- Many machines in DE mode could not be used for encoding due to an incompatible GPU.
To back Atak's point, I have tried GPU encoding for several years. The quality does improve each year, but the overall image is still nowhere near what traditional CPU encoding can do: the result is often softer, with more compression artifacts and a larger file size at comparable settings. The only win is speed. I can encode 1080p at over 200 fps on my 2080 Ti, which in my opinion makes distributed encoding unnecessary, but the output is so underwhelming that I can't justify it. Also, as soon as you throw any additional filters into the mix, such as denoising or tonemapping, you lose the speed advantage anyway, since the pipeline then bottlenecks on CPU-side filtering rather than the encoder. I'll stick with distributed encoding for now.
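For anyone who wants to run this kind of side-by-side test themselves, here is a minimal sketch with ffmpeg. The filenames, bitrate, and presets are placeholders (not Atak's or my actual settings), and the VMAF comparison step assumes an ffmpeg build compiled with libvmaf and a recent enough version for the NVENC p1–p7 presets:

```shell
# Placeholder source clip and target bitrate -- substitute your own
SRC=input.mkv
BITRATE=3000k

# CPU encode: x264 slow preset, the usual software-quality reference
ffmpeg -i "$SRC" -c:v libx264 -preset slow -b:v "$BITRATE" -an cpu_x264.mkv

# GPU encode: NVENC on the same clip at the same average bitrate
# (p7 is the slowest/highest-quality NVENC preset on newer ffmpeg builds)
ffmpeg -i "$SRC" -c:v h264_nvenc -preset p7 -b:v "$BITRATE" -an gpu_nvenc.mkv

# Score each encode against the source with VMAF (requires libvmaf support);
# the VMAF number printed at the end lets you compare the two objectively
ffmpeg -i cpu_x264.mkv -i "$SRC" -lavfi libvmaf -f null -
ffmpeg -i gpu_nvenc.mkv -i "$SRC" -lavfi libvmaf -f null -
```

Matching the bitrate (rather than CRF/CQ modes, which aren't comparable across encoders) is what makes the blur and artifact difference visible at low rates.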