15th October 2017, 04:26   #1
SZGY
Registered User
Join Date: Mar 2003
Posts: 8
x265 8 vs 10 bit quality issues

Hello,

I've recently started experimenting with x265. Based on advice that used to apply to x264, I thought it would be better to go for 10-bit in order to avoid banding and get better compression efficiency.

However, the results were not as expected. While the output file sizes don't differ much, the 10-bit encode has visible detail loss and weird artifacts. See the attached image and the screenshot comparison.

Am I doing something wrong in the encoding process? Would I need to feed the 10-bit encoder 10-bit video data in the first place?

http://screenshotcomparison.com/comparison/120725

x265 builds, 2017-10-14:
64Bit-8bit-(3883b374b58d93e707971d93b37eb931)
64Bit-10bit-(411b3fca92fc7e922d1aea8522c17088)

ffmpeg-20170921-183fd30-win64-static

Code:
loadplugin("DGDecodeNV.dll")     # DGDecNV source filter
dgsource("P1080573.dgi")         # open the indexed source
trim(583,819)                    # keep frames 583-819 as the test clip
BicubicResize(1920,1080,-1,0)    # resize to 1920x1080 (bicubic, b=-1, c=0)
Code:
ffmpeg -i test3.avs -f yuv4mpegpipe - | x265-10 --y4m - -o test10.265
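If the answer to that last question is yes, I assume the pipe would change to something like this (untested; -strict -1 is needed because 10-bit y4m is a non-standard ffmpeg extension):
Code:
ffmpeg -i test3.avs -pix_fmt yuv420p10le -strict -1 -f yuv4mpegpipe - | x265-10 --y4m - -o test10.265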
Source file:
https://mega.nz/#!Ik50gKDZ!5MwTMw8vR...xa0Ao4svIOCj4E
Attached Images
20th October 2017, 18:27   #2
benwaggoner
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,738
Are you playing the video back on a 10-bit display through a >8-bit display pipeline? If the output is 8-bit, then the dithering of the display device can cause issues.

Is the source 10-bit? In general HEVC has less of a quality gap between 8-bit and 10-bit encodes with 8-bit sources than H.264 had. I recommend using 10-bit encoding only when you have 10-bit sources. 8-bit is quite a bit faster to encode and quite a bit more compatible, particularly on mobile devices.
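
For the pipeline in the first post, that presumably just means swapping in the 8-bit binary (assuming the 8-bit build is simply called x265):
Code:
ffmpeg -i test3.avs -f yuv4mpegpipe - | x265 --y4m - -o test8.265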
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
20th October 2017, 19:36   #3
SZGY
Registered User
Join Date: Mar 2003
Posts: 8
Thank you for your reply.

The source is 8-bit H.264, with an 8-bit pipeline to the encoder. While my graphics card and display can do 10 bpc, I'm not sure if MPC-HC handles this. I was just concerned about these strange ringing artifacts that are introduced when switching to 10-bit; even raising the bitrate leaves them there. 8-bit definitely performs better here.

Nevertheless, I am going to take your advice and encode 8-bit, just to be on the safe side.
21st October 2017, 14:44   #4
Sagittaire
Codec tester
Join Date: May 2003
Location: France
Posts: 2,484
As always, a screenshot comparison on its own doesn't mean much: the default rate control of every codec uses a really aggressive strategy. That means the quantizer (local and overall) can be really different within the same picture for the same codec (8-bit vs 10-bit) or between different codecs.

For example, for the same picture you can end up comparing an I-frame from codec A with a B-frame from codec B. Default rate control generally uses a really aggressive quantizer for B-frames (a default ratio of 1.40 for x264 and x265) and a really conservative quantizer for I-frames.

If you want to make screenshot comparisons between codecs, you must always use constant-quantizer encoding mode and do it at the same bitrate for all codecs.
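
For example, a hypothetical constant-QP pair of commands for the clip in the first post (the QP value here is arbitrary):
Code:
ffmpeg -i test3.avs -f yuv4mpegpipe - | x265 --y4m --qp 20 - -o test8-qp20.265
ffmpeg -i test3.avs -f yuv4mpegpipe - | x265-10 --y4m --qp 20 - -o test10-qp20.265
Then compare the same frame number in each encode, so an I-frame is matched against an I-frame.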
__________________
Le Sagittaire ... ;-)

1- Ateme AVC or x264
2- VP7 or RV10 only for anime
3- XviD, DivX or WMV9
22nd October 2017, 16:43   #5
benwaggoner
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,738
Quote: Originally Posted by SZGY
The source is 8-bit H.264, with an 8-bit pipeline to the encoder. While my graphics card and display can do 10 bpc, I'm not sure if MPC-HC handles this. I was just concerned about these strange ringing artifacts that are introduced when switching to 10-bit; even raising the bitrate leaves them there. 8-bit definitely performs better here.
MPC-HC absolutely can do 10-bit output, although it may only be in full-screen mode. It took some settings tweaking the last time I tried it, although that was probably a version or two ago. IIRC, with an NVIDIA GPU, you could only get 10-bit windowed video with Quadro drivers.

The new Redstone 3 version of Windows 10 even has support for windowed HDR, so things may change there once drivers and apps are updated.

Quote: Originally Posted by SZGY
Nevertheless, I am going to take your advice and encode 8-bit, just to be on the safe side.
Let us know how it goes.
22nd October 2017, 17:40   #6
Boulder
Pig on the wing
Join Date: Mar 2002
Location: Finland
Posts: 5,717
Quote: Originally Posted by benwaggoner
I recommend using 10-bit encoding only when you have 10-bit sources.
I guess this also depends on your processing chain. I increase the bit depth to 16 bits for processing (resizing and denoising) and then encode as 10-bit HEVC. From what I've seen, HW decoding of 10-bit HEVC is getting more and more common in new devices.
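
A minimal AviSynth+ sketch of that kind of chain (the filter steps are placeholders, and dither support in ConvertBits depends on the AviSynth+ version):
Code:
dgsource("source.dgi")        # 8-bit source
ConvertBits(16)               # promote to 16-bit so filtering doesn't round at 8-bit precision
# ... denoising and resizing go here, still in 16-bit ...
ConvertBits(10, dither=1)     # dither down (Floyd-Steinberg) to 10-bit for the encoder
The script is then piped to the 10-bit x265 binary as in the first post, with -strict -1 added to the ffmpeg command so it will emit 10-bit y4m.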
__________________
And if the band you're in starts playing different tunes
I'll see you on the dark side of the Moon...
22nd October 2017, 18:13   #7
benwaggoner
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,738
Quote: Originally Posted by Boulder
I guess this also depends on your processing chain. I increase the bit depth to 16 bits for processing (resizing and denoising) and then encode as 10-bit HEVC. From what I've seen, HW decoding of 10-bit HEVC is getting more and more common in new devices.
Yes, 10-bit decode is getting a lot more common; I don't know of any new GPUs or SoCs that support 8-bit but not 10-bit. But there are many, many millions of devices out there that are 8-bit only. Smart TVs are almost all 10-bit, but lots of phones, tablets, and computers only have 8-bit. Lots of phones today are still shipping with older 8-bit-only chipsets. Of course, phones have a much faster replacement cycle. But it'll be 2025 before we can safely assume a generic connected device isn't limited to 8-bit decode.

Also, using 10-bit puts us at the mercy of how good the conversion to the native display space is. Lots of SoCs that have 10-bit decode still have only an 8-bit display pipeline, and if there isn't decent dithering, a 10-bit decode can actually look worse than an 8-bit one in some cases.
30th October 2017, 18:35   #8
SZGY
Registered User
Join Date: Mar 2003
Posts: 8
Thanks go out to everyone who answered.