Go Back   Doom9's Forum > Video Encoding > High Efficiency Video Coding (HEVC)

Old 30th December 2016, 01:30   #1  |  Link
KahnDigifer
Digital Devil
 
Join Date: Oct 2016
Posts: 4
How much lower of a bit rate on h.265 will give similar quality to h.264?

I've been taking lossless AVI files and encoding them in H.264 at 25 Mbps for 4K videos. If I make the switch to HEVC, can I really get the same quality as H.264 at 25 Mbps with half the bitrate?

Should I also do H.265 at 25 Mbps to get the same quality as the H.264 output, or would I get similar/equal visual quality with H.265 at a bitrate of 12.5-13 Mbps? I've been reading articles saying I should get the same quality as H.264 at about half the bitrate, but is that really true in your observations?
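One way to answer this for your own footage, rather than trusting the articles, is to decode each encode back to raw frames (e.g. with ffmpeg) and score it against the lossless source with an objective metric such as PSNR. A minimal pure-Python PSNR over flat pixel lists, just to show the arithmetic (real comparisons would run per frame over full YUV planes; the sample values here are made up):

```python
import math

def psnr(ref, enc, peak=255):
    """Peak signal-to-noise ratio between two equal-length pixel arrays."""
    assert len(ref) == len(enc)
    mse = sum((a - b) ** 2 for a, b in zip(ref, enc)) / len(ref)
    if mse == 0:
        return float("inf")  # identical frames
    return 10 * math.log10(peak * peak / mse)

# A barely-damaged "frame" scores high, a badly-damaged one scores low:
clean = [100, 120, 140, 160]
slightly_off = [101, 120, 139, 160]
way_off = [0, 255, 0, 255]
```

If the HEVC encode at half the bitrate scores roughly the same against the source as the H.264 encode does, the claim holds for your content; if not, raise the bitrate until it does.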
Old 30th December 2016, 02:07   #2  |  Link
mariush
Registered User
 
Join Date: Dec 2008
Posts: 590
Depends on the type of content in the lossless AVI files. Anime (big areas of flat color, sharp edges, etc.) is different from pixel-art game capture (side-scrollers, Mario-like games, or games like Stardew Valley, Starbound, Terraria), which is different again from multiplayer games with lots of action and motion, and different again from movies with grain and slight blurring, which helps with compression.
I would say that 20-30% less bitrate for the same quality is realistic if you use software encoding (like x265, finely configured for the type of content you wish to encode). Hardware encoders (the new RX series and GTX 10** cards have hardware HEVC encoders) won't compare in quality with software encoders.
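Taking the 20-30% figure above at face value, the arithmetic for the 25 Mbps case is easy to sketch (the percentages are estimates from this thread, not guarantees, and the size formula ignores container and audio overhead):

```python
def hevc_target_kbps(avc_kbps, saving):
    """Bitrate expected to match the AVC encode's quality at a fractional saving."""
    return avc_kbps * (1 - saving)

def size_gb(kbps, seconds):
    """Approximate video-only file size in GB at a constant bitrate."""
    return kbps * 1000 / 8 * seconds / 1e9

avc = 25_000  # the 25 Mbps H.264 baseline from the original post
low = hevc_target_kbps(avc, 0.30)   # 17500 kbps at a 30% saving
high = hevc_target_kbps(avc, 0.20)  # 20000 kbps at a 20% saving
# i.e. 17.5-20 Mbps, noticeably above the 12.5 Mbps that "half the bitrate" promises

ten_min_avc = size_gb(avc, 600)   # ~1.88 GB for a 10-minute clip
ten_min_hevc = size_gb(low, 600)  # ~1.31 GB
```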
Old 30th December 2016, 02:40   #3  |  Link
KahnDigifer
Digital Devil
 
Join Date: Oct 2016
Posts: 4
Thanks for the response. I'm away from the PC I've been using so won't be able to test for a few days, but I'll try out your suggestion.

I was thinking of doing the HEVC encoding with my GTX 1070 GPU. Is the quality difference between a software and a hardware encoder a fixed rule, or can the GPU-encoded file produce similar quality if the bitrate is set higher, like 30 Mbps as opposed to 25 Mbps for the CPU-encoded file?

These are videos I've shot and will make downloadable online. The reason I'm considering the GPU rather than the CPU for HEVC encoding is that there's a large batch of them and they're in 4K. Although my desktop has premium thermal paste and a 240 mm Corsair H100i cooler, I'm worried that much CPU encoding will fry my processor.
Old 30th December 2016, 03:15   #4  |  Link
JohnLai
Registered User
 
Join Date: Mar 2008
Posts: 448
Nvidia NVENC for HEVC encoding?
NVENC's lack of B-frames means a higher bitrate is required, around 50% more than the software encoder x265.
Read more in my post here https://forum.doom9.org/showthread.p...93#post1780493
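If that ~50% figure holds, NVENC can still come out slightly ahead of the 25 Mbps AVC baseline, but only just. A hedged sketch that scales an x265 target up for NVENC and builds the matching ffmpeg command line (the file names are placeholders, and `hevc_nvenc` assumes an ffmpeg build with NVENC support):

```python
def nvenc_kbps(x265_kbps, premium=0.5):
    """Bitrate NVENC needs to roughly match an x265 encode, per the ~50% estimate."""
    return int(x265_kbps * (1 + premium))

def nvenc_cmd(src, dst, kbps):
    """One-pass NVENC HEVC encode at a fixed bitrate."""
    return ["ffmpeg", "-i", src, "-c:v", "hevc_nvenc", "-b:v", f"{kbps}k", dst]

target = nvenc_kbps(17_500)  # x265 target assuming a 30% saving over 25 Mbps AVC
# target == 26250: nearly back at the original H.264 bitrate, which is why
# software x265 is still preferred when quality per bit matters
```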
Old 30th December 2016, 12:13   #5  |  Link
birdie
.
 
birdie's Avatar
 
Join Date: Dec 2006
Posts: 135
It depends solely on the source.

With a grainy/noisy source there might be zero or even negative gains; with an anime-like source you might save up to 40% of the bitrate.

Quote:
Originally Posted by KahnDigifer View Post
I was thinking of doing the HEVC encoding with my GTX 1070 GPU. Is the quality difference between a software and a hardware encoder a fixed rule, or can the GPU-encoded file produce similar quality if the bitrate is set higher, like 30 Mbps as opposed to 25 Mbps for the CPU-encoded file?
There are no HEVC GPU encoders with quality comparable to x265's. But if we compare AVC and HEVC GPU encoders, then you will most likely always see some gains.

Last edited by birdie; 30th December 2016 at 12:19.
Old 3rd January 2017, 22:22   #6  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
Blue_MiSfit's Avatar
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,511
The real use cases for these hardware encoders are:

A) You simply cannot spend or do not have the CPU resources necessary to do proper software encoding (e.g. game streaming at high resolutions and 60p)
B) You care much more for speed than quality (e.g. quick transcodes for compatibility)

When quality is the primary consideration, x265 is still the bees knees
Old 4th January 2017, 03:41   #7  |  Link
KahnDigifer
Digital Devil
 
Join Date: Oct 2016
Posts: 4
Thanks for the responses, guys. Something I just thought of, though: do you think most (or any) web browsers will support H.265 playback in the near future? If I want to make these videos streamable through an online video player like JWPlayer, etc., then I'm stuck having to encode in H.264, right?
Old 4th January 2017, 19:45   #8  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
Blue_MiSfit's Avatar
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,511
That is a huge problem with HEVC right now.

The only browser that supports it AFAIK is Edge, and even then only if you have HEVC hardware acceleration in your system.

From what I understand the lack of browser support is mostly due to licensing issues. FWIW, VP9 has decent software decode support on all current desktop browsers except Safari

Netflix did just announce that they will stream 4K HEVC to Edge and their Windows 10 app - but curiously chose to allow this only for users of Intel's new Kaby Lake CPUs - even though users like myself with an old Sandy Bridge CPU and a new GTX 1080 GPU could play this content perfectly fine... Hmm...

Basically HEVC distribution today is limited to smart TVs, set top boxes, and a few edge cases like Netflix on Kaby Lake on Windows 10.

There may also be some linear TV contribution or distribution being done via satellite using HEVC, but that's also kind of an edge case.

HEVC is fabulous, but until we see broad support on the desktop I think VP9 will be the "current gen" codec of choice for in-browser streaming, with AVC still hugely prevalent for non-YouTube sites.
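In practice that landscape is handled by publishing several renditions and letting the player fall back to the best codec the client can decode. A toy sketch of that preference ladder (the codec names and the example support sets are illustrative, not a real capability database):

```python
PREFERENCE = ["hevc", "vp9", "avc"]  # best compression first

def pick_codec(supported):
    """Return the most preferred codec the client reports it can decode."""
    for codec in PREFERENCE:
        if codec in supported:
            return codec
    return None  # no playable rendition

# Rough support sets as described in this thread (early 2017):
edge_with_hw = {"hevc", "vp9", "avc"}  # Edge with HEVC hardware decode
chrome = {"vp9", "avc"}
safari = {"avc"}
```

A real player would probe this with HTML5 `canPlayType` / `MediaSource.isTypeSupported` rather than a hard-coded table.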

Last edited by Blue_MiSfit; 4th January 2017 at 19:53.
Old 4th January 2017, 19:49   #9  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,410
And Edge requires Windows 10 which has a limited market share...

Chrome and Firefox only support H.264 on PC via system codecs so you can't expect widespread HEVC browser support anytime soon. (And of course Google wants to push AV1)

Last edited by sneaker_ger; 4th January 2017 at 20:04.
Old 4th January 2017, 19:58   #10  |  Link
birdie
.
 
birdie's Avatar
 
Join Date: Dec 2006
Posts: 135
Quote:
Originally Posted by sneaker_ger View Post
And Edge requires Windows 10 which has a limited market share...

Chrome and Firefox only support H.264 on PC via system codecs so you can't expect widespread HEVC browser support anytime soon. (And of course Google wants to push AV1)
You're not correct about Google Chrome: it supports AVC everywhere because it has its own decoder built in.

Chromium-based browsers indeed lack AVC support and rely on the system codec.

But Chrome and Chromium are two different, though related, projects.
Old 4th January 2017, 20:03   #11  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,410
I wasn't aware of that. Does it support more than WebRTC and baseline profile or does it have the same limitations as the codec in Firefox?
Old 5th January 2017, 08:53   #12  |  Link
birdie
.
 
birdie's Avatar
 
Join Date: Dec 2006
Posts: 135
Quote:
Originally Posted by sneaker_ger View Post
I wasn't aware of that. Does it support more than WebRTC and baseline profile or does it have the same limitations as the codec in Firefox?
Since Google Chrome plays 4K AVC YouTube videos, I guess it's a full-fledged decoder (for the purposes of online video; I've never had a chance to verify whether it plays 10/12-bit AVC streams).

For fun, I've just downloaded a file and verified that Google Chrome plays this video stream (after changing the container to MP4):
Code:
    Stream #0:0(eng): Video: h264 (High 10) (avc1 / 0x31637661), yuv420p10le(tv, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 9478 kb/s, 24.01 fps, 23.98 tbr, 16k tbn, 47.95 tbc (default)
just fine.
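For anyone who wants to script checks like this, the stream line above is easy to pick apart with a regex. A small sketch that pulls out the codec, profile, and pixel format (tailored to this one line's layout, so treat it as illustrative rather than a general ffmpeg/ffprobe output parser):

```python
import re

STREAM_RE = re.compile(
    r"Video: (?P<codec>\w+) \((?P<profile>[^)]*)\).*?,\s*(?P<pixfmt>\w+)"
)

line = ("Stream #0:0(eng): Video: h264 (High 10) (avc1 / 0x31637661), "
        "yuv420p10le(tv, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 9478 kb/s")

m = STREAM_RE.search(line)
# m["codec"] -> "h264", m["profile"] -> "High 10", m["pixfmt"] -> "yuv420p10le"
```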

Last edited by birdie; 5th January 2017 at 09:15.
Old 5th January 2017, 14:37   #13  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,410
Proof of playback isn't proof that it's using the internal decoder, unless perhaps you are on Windows N and/or without a suitable GPU.

chrome://media-internals shows the decoder being used.
Old 5th January 2017, 17:13   #14  |  Link
birdie
.
 
birdie's Avatar
 
Join Date: Dec 2006
Posts: 135
God, I'm on Linux right now and I don't have any system-wide AVC decoder/encoder. You could stop arguing with me if you knew anything about Google Chrome. This is now the third time you're misinforming people here.
Old 5th January 2017, 17:39   #15  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,410
I didn't say you were wrong; I was just being skeptical in the absence of any source.
Old 5th January 2017, 21:30   #16  |  Link
birdie
.
 
birdie's Avatar
 
Join Date: Dec 2006
Posts: 135
ffmpeg libraries are not installed locally:

Code:
render_id: 69
player_id: 0
pipeline_state: kPlaying
event: PLAY
url: file://******redacted******
total_bytes: 1558194162
streaming: false
single_origin: true
passed_cors_access_check: false
range_header_supported: true
info: FFmpegDemuxer: created audio stream, config codec: aac bytes_per_channel: 4 channel_layout: 12 samples_per_second: 48000 sample_format: 6 bytes_per_frame: 24 seek_preroll: 0ms codec_delay: 0 has extra data? true encrypted? false
duration: 6028.142307
audio_channels_count: 6
audio_codec_name: aac
audio_sample_format: Float 32-bit planar
audio_samples_per_second: 48000
bitrate: 2067893
coded_height: 720
coded_width: 960
found_audio_stream: true
found_video_stream: true
height: 720
max_duration: 6028.142307
start_time: 0
time_base: 417083/20000000
video_codec_name: h264
video_format: PIXEL_FORMAT_YV12
video_is_encrypted: false
width: 960
audio_dds: false
audio_decoder: FFmpegAudioDecoder
video_dds: false
video_decoder: FFmpegVideoDecoder
Also, you could grep the Google Chrome binaries and see for yourself. But I guess you were too lazy to do that in the first place.

Also google for CVE-2014-3157:

Quote:
Heap-based buffer overflow in the FFmpegVideoDecoder::GetVideoBuffer function in media/filters/ffmpeg_video_decoder.cc in Google Chrome before 35.0.1916.153 allows remote attackers to cause a denial of service or possibly have unspecified other impact by leveraging VideoFrame data structures that are too small for proper interaction with an underlying FFmpeg library.
Magic. You didn't even try.
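For what it's worth, the chrome://media-internals dump above is just "key: value" lines, so checking the decoder from a script takes only a few lines of Python. A minimal sketch (the sample is trimmed from the dump above):

```python
def parse_media_internals(text):
    """Parse 'key: value' lines into a dict, keeping values as strings."""
    info = {}
    for raw in text.strip().splitlines():
        key, _, value = raw.partition(": ")
        if key and value:
            info[key.strip()] = value.strip()
    return info

sample = """\
video_codec_name: h264
audio_decoder: FFmpegAudioDecoder
video_decoder: FFmpegVideoDecoder
"""

info = parse_media_internals(sample)
# info["video_decoder"] == "FFmpegVideoDecoder" means Chrome's bundled
# ffmpeg decoder is in use, not a system codec
```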

Last edited by birdie; 5th January 2017 at 21:33.
Old 5th January 2017, 22:25   #17  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,410
It's not that I didn't test, but it is difficult for me to determine the exact decoder being used on Windows 7, with both MS decoders and DXVA decoders available. If I turn off hardware acceleration in Chrome's settings it can still play H.264 High profile HTML5 <video>, using "FFmpegVideoDecoder", which seems to confirm what you are saying, though ffmpeg also supports DXVA2. Unlike you, I don't have a system without any support for H.264 decoding.
Old 14th January 2017, 15:03   #18  |  Link
CruNcher
Registered User
 
CruNcher's Avatar
 
Join Date: Apr 2002
Location: Germany
Posts: 4,949
Quote:
Originally Posted by Blue_MiSfit View Post
The real use cases for these hardware encoders are:

A) You simply cannot spend or do not have the CPU resources necessary to do proper software encoding (e.g. game streaming at high resolutions and 60p)
B) You care much more for speed than quality (e.g. quick transcodes for compatibility)

When quality is the primary consideration, x265 is still the bees knees
This is also interesting in light of AMD's new Ryzen strategy: they actually market it as if (N)CU-assisted encoding (Quick Sync, APP, NVENC) were bad, and as if their 8 cores could do it far more efficiently on their own, without impacting anything else at the same time, thanks to their async processing architecture, and at very low latency as well.

But amid all that greatness, they of course say nothing about power efficiency.

I guess that isn't really needed with a crowd of gamers and YouTube/Twitch streamers that need to be impressed.

Quote:
Originally Posted by Blue_MiSfit View Post
That is a huge problem with HEVC right now.

The only browser that supports it AFAIK is Edge, and even then only if you have HEVC hardware acceleration in your system.

From what I understand the lack of browser support is mostly due to licensing issues. FWIW, VP9 has decent software decode support on all current desktop browsers except Safari

Netflix did just announce that they will stream 4K HEVC to Edge and their Windows 10 app - but curiously chose to allow this only for users of Intel's new Kaby Lake CPUs - even though users like myself with an old Sandy Bridge CPU and a new GTX 1080 GPU could play this content perfectly fine... Hmm...

Basically HEVC distribution today is limited to smart TVs, set top boxes, and a few edge cases like Netflix on Kaby Lake on Windows 10.

There may also be some linear TV contribution or distribution being done via satellite using HEVC, but that's also kind of an edge case.

HEVC is fabulous, but until we see broad support on the desktop I think VP9 will be the "current gen" codec of choice for in-browser streaming, with AVC still hugely prevalent for non-YouTube sites.
Hollywood's R&D doesn't accept Sandy Bridge and Windows 7/8 as a secure enough platform for high-quality content distribution: despite the working AES core, the UEFI structure, including the ME, has been compromised too often by now.

Windows 10 was hardened internally on many levels, which also caused many problems and required heavy driver and hardware work.

Quote:
Originally Posted by sneaker_ger View Post
It's not that I didn't test, but it is difficult for me to determine the exact decoder being used on Windows 7, with both MS decoders and DXVA decoders available. If I turn off hardware acceleration in Chrome's settings it can still play H.264 High profile HTML5 <video>, using "FFmpegVideoDecoder", which seems to confirm what you are saying, though ffmpeg also supports DXVA2. Unlike you, I don't have a system without any support for H.264 decoding.
It could also access VDPAU on Linux via ffmpeg directly; it depends heavily on the underlying system. Since we're talking about HTML5 here, it's heavily system-dependent overall, though Google prefers an internally controlled ffmpeg for several reasons, and they have invested a lot of security bug-fixing in it.
__________________
all my compares are riddles so please try to decipher them yourselves :)

It is about Time

Join the Revolution NOW before it is to Late !

http://forum.doom9.org/showthread.php?t=168004

Last edited by CruNcher; 14th January 2017 at 17:52.
Tags
bitrate, h.265, hevc
