Doom9's Forum > Video Encoding > VP9 and AV1
Old 6th January 2021, 22:47   #2381  |  Link
Greenhorn
Registered User
 
Join Date: Apr 2018
Posts: 49
Quote:
Originally Posted by utack View Post
Not sure if ffmpeg currently works inefficiently, but with an 8K AV1 video mpv allocates just over 4000 MB of VRAM for me, so that would not work.
Could someone cross-check with the native Windows Video Player?
I don't have mpv installed to test, but testing with an 8K60 HDR clip downloaded from YouTube shows the native media player allocating ~800 MB. (MPC-BE with madVR allocates ~2.1 GB.) This is on a 1660 Ti with a 6 GB frame buffer.

8K is going to have huge memory requirements regardless of the codec, though; I'd expect it to scale mostly with the number of frames pre-rendered by the player, with the additional codec-specific overhead being negligible. I almost wonder if the suspiciously low numbers are because decoding is too slow to fill some internal buffer, with the MS (libaom) decoder being slower than dav1d in MPC.

Last edited by Greenhorn; 7th January 2021 at 03:16.
Greenhorn is offline   Reply With Quote
Old 7th January 2021, 02:23   #2382  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 3,461
Quote:
Originally Posted by utack View Post
Not sure if ffmpeg currently works inefficiently, but with an 8K AV1 video mpv allocates just over 4000 MB of VRAM for me, so that would not work.
Could someone cross-check with the native Windows Video Player?
There are about 33M pixels in an 8K video frame. And there's no point in non-HDR 8K, and HDR will be 10-bit minimum. With 10-bit 4:2:0 that's 1.5 samples per pixel, or just under 2 bytes per pixel assuming no bit alignment overhead, roughly 62 MB per frame. Of course, there is always alignment overhead (10-bit samples usually live in 16-bit words, which pushes it to ~100 MB), and internal high-precision frequency transforms can easily run to 48 bits per pixel, around 200 MB per frame. Frame-parallel decoding could involve several of those. And a few decoded RGB frames buffered for display could be bigger still: 16 bits per channel at RGBA 4:4:4 works out to about 265 MB per frame.
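
To make that arithmetic explicit, here's a quick back-of-the-envelope sketch; the buffer count at the end is an assumed number for illustration, not a measurement of any particular decoder:

Code:
# Rough per-frame memory for 8K (7680x4320) decode; figures are illustrative.
W, H = 7680, 4320
pixels = W * H                                   # ~33.2 million

MB = 1024 * 1024
layouts = {
    "4:2:0 10-bit, tightly packed":   pixels * 1.5 * 10 / 8,
    "4:2:0 10-bit in 16-bit words":   pixels * 1.5 * 2,
    "4:4:4 16-bit internal precision": pixels * 3 * 2,
    "RGBA, 16 bits per channel":      pixels * 4 * 2,
}
for name, nbytes in layouts.items():
    print(f"{name:32s} {nbytes / MB:7.1f} MB/frame")

# Multiply by the frames held across frame-parallel decode, reorder buffers
# and the renderer's queue; e.g. 16 aligned 4:2:0 frames (assumed count):
print(f"16 buffered frames: {16 * pixels * 3 / (MB * 1024):.1f} GB")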

HW decoders have an easier time of it because they don't need frame-level parallel decoding or RGB buffers, since they can write 4:2:0 straight to the GPU. But yeah, 4 GB for an 8K SW decoder seems quite plausible to me if a decent number of frames need to be buffered at different stages.

None of that is specific to AV1, but AV1 is the only thing people are talking about doing 8K SW decode with. My own research hasn't found any content that actually looks better at 8K than at 4K, so there's a whole lot of solution-looking-for-a-problem going on in that scenario. 8K YouTube only looks better than 4K YouTube because YouTube is bit-starved at every resolution and bitrate.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
benwaggoner is offline   Reply With Quote
Old 7th January 2021, 02:29   #2383  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 3,461
Come to think of it, SW AV1 decoding is actually going to have an impact on global CO2 emissions. A CPU can easily draw 20 watts more doing SW decode versus HW decode. 500K simultaneous YouTube viewers watching AV1 could mean another 10 MW of power draw compared to YouTube serving HEVC to those devices' HW decoders. Even assuming relatively low-emission natural gas plants, that works out to several extra tonnes of CO2 every hour, around the clock.

Yowza.
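
For anyone who wants to poke at the numbers, here's the same estimate as a tiny script; the watts-per-viewer and grid-intensity figures are assumptions, not measurements:

Code:
# Incremental power/CO2 from SW vs HW decode; all inputs are assumptions.
extra_watts_per_viewer = 20        # extra CPU draw, SW decode vs HW decode
concurrent_viewers = 500_000
gas_tonnes_co2_per_mwh = 0.4       # rough figure for natural gas generation

extra_mw = extra_watts_per_viewer * concurrent_viewers / 1e6
tonnes_per_hour = extra_mw * gas_tonnes_co2_per_mwh

print(f"Extra draw : {extra_mw:.0f} MW")
print(f"Extra CO2  : ~{tonnes_per_hour:.0f} t/hour, "
      f"~{tonnes_per_hour * 8760 / 1000:.0f} kt/year if sustained")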
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
benwaggoner is offline   Reply With Quote
Old 9th January 2021, 04:01   #2384  |  Link
soresu
Registered User
 
Join Date: May 2005
Location: Swansea, Wales, UK
Posts: 181
Quote:
Originally Posted by benwaggoner View Post
And there's no point in non-HDR 8K, and HDR will be 10-bit minimum.
Arguably there's no point in >10bpc content.

I've seen plenty of 8 bpc content without banding, so banding clearly isn't inherent to 8-bit, and I doubt the average human could tell the difference between 10 and 12 bpc content at all.
soresu is offline   Reply With Quote
Old 10th January 2021, 15:08   #2385  |  Link
takla
Registered User
 
Join Date: May 2018
Posts: 12
Quote:
Originally Posted by benwaggoner View Post
Come to think of it, SW AV1 decoding is actually going to have an impact on global CO2 emissions. A CPU can easily draw 20 watts more doing SW decode versus HW decode. 500K simultaneous YouTube viewers watching AV1 could mean another 10 MW of power draw compared to YouTube serving HEVC to those devices' HW decoders. Even assuming relatively low-emission natural gas plants, that works out to several extra tonnes of CO2 every hour, around the clock.

Yowza.
Yikes. I can already see the headlines for laws being passed (in EU countries, anyway).
takla is offline   Reply With Quote
Old 11th January 2021, 03:34   #2386  |  Link
soresu
Registered User
 
Join Date: May 2005
Location: Swansea, Wales, UK
Posts: 181
Quote:
Originally Posted by takla View Post
Yikes. I can already see the headlines for laws being passed (in EU countries anyway)
With products capable of 8K AV1 decode already on the market, by the time any law was passed there would be far more in consumer hands, making the law redundant. I'd take it as a given that someone willing to waste money on an 8K TV would probably also be willing to shell out for the latest and greatest PC and graphics card.

That being said, the Samsung 8K TV models already have terrible power efficiency even before other issues come into play. I'm not sure whether it's down to them having more FALD zones or just higher peak nits (or a combination of both), but the lowest efficiency rating of their 4K QLED TVs is B, whereas their 8K TVs can go as low as D (A being the best rating).

There's also the hybrid decoder recently committed for the Xbox One and later consoles using DX shaders and UWP; it would be interesting to see what the power consumption looks like on the XSX doing 8K AV1 decode with that.

Last edited by soresu; 11th January 2021 at 03:38.
soresu is offline   Reply With Quote
Old 11th January 2021, 04:53   #2387  |  Link
takla
Registered User
 
Join Date: May 2018
Posts: 12
Quote:
Originally Posted by soresu View Post
With products capable of 8K AV1 decode already on the market, by the time any law was passed there would be far more in consumer hands, making the law redundant.
Yeah true. I thought about it some more and came to the same conclusion.
takla is offline   Reply With Quote
Old 12th January 2021, 15:43   #2388  |  Link
soresu
Registered User
 
Join Date: May 2005
Location: Swansea, Wales, UK
Posts: 181
It seems that at least Samsung's mobile division is pushing AV1 support, going by their latest CES reveal of the new Exynos 2100 SoC destined for the Galaxy S21.

Given that reports also put AV1 support in the 2020 QLED models, I'll wait until actual hardware is in reviewers' hands before I dance for joy.
soresu is offline   Reply With Quote
Old 13th January 2021, 00:08   #2389  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 3,461
Quote:
Originally Posted by soresu View Post
Arguably there's no point in >10bpc content.

I've seen plenty of 8 bpc content without banding, so banding clearly isn't inherent to 8-bit, and I doubt the average human could tell the difference between 10 and 12 bpc content at all.
The problem is that dithering doesn't encode very well. A smooth gradient from Y'=64 to Y'=72 across a 1920x1080 frame is going to band in 8-bit without really good dithering, and that dither is exactly the kind of detail frequency-transform compression tends to throw away.
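
Here's a minimal sketch of why that particular ramp bands, assuming a plain linear gradient and no dithering at all:

Code:
import numpy as np

# A luma ramp from Y'=64 to Y'=72 across a 1920-pixel-wide frame,
# quantized with no dithering.
width = 1920
ramp = np.linspace(64.0, 72.0, width)

q8  = np.round(ramp).astype(np.uint8)        # 8-bit: only 9 code values available
q10 = np.round(ramp * 4).astype(np.uint16)   # same ramp with 10-bit code values

print("8-bit distinct levels :", len(np.unique(q8)))                 # 9
print("10-bit distinct levels:", len(np.unique(q10)))                # 33
print("8-bit band width      :", width // len(np.unique(q8)), "px")  # ~213 px bands

# Good dithering hides those 8-bit steps, but the dither itself is
# low-amplitude high-frequency noise -- exactly what a frequency-transform
# codec discards first at normal bitrates, so the bands reappear.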

And HDR with 8-bit is much harder. Just encoding Rec 2100 content in 8-bit yields a horrible mess.

And it's challenging to detect full 4K detail in SDR for natural images, and in many cases impossible even for expert viewers. HDR is what makes 4K generally worthwhile for natural images. Seeing the difference between carefully selected 4K and 8K HDR moving images is only possible for expert viewers with 20/10 vision, and only on a minority of "stress test" clips.

Higher resolutions pay off a lot more for computer games, but that's more about the limitations of anti-aliasing technology and the much greater local contrast of synthetic graphics. Rendering games at 4K and downscaling to 1080p still looks a lot better than native 1080p gaming.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
benwaggoner is offline   Reply With Quote
Old 13th January 2021, 00:15   #2390  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 3,461
Quote:
Originally Posted by soresu View Post
With products capable of 8K AV1 decode already on the market, by the time any law was passed there would be far more in consumer hands, making the law redundant. I'd take it as a given that someone willing to waste money on an 8K TV would probably also be willing to shell out for the latest and greatest PC and graphics card.

That being said, the Samsung 8K TV models already have terrible power efficiency even before other issues come into play. I'm not sure whether it's down to them having more FALD zones or just higher peak nits (or a combination of both), but the lowest efficiency rating of their 4K QLED TVs is B, whereas their 8K TVs can go as low as D (A being the best rating).

There's also the hybrid decoder recently committed for the Xbox One and later consoles using DX shaders and UWP; it would be interesting to see what the power consumption looks like on the XSX doing 8K AV1 decode with that.
TVs don't have the horsepower for SW decode in any case. The big power differential is with computers, which can provide lots of peak compute in exchange for much higher power draw. And we're talking 2022 before even half of new PCs have AV1 HW decode, and 2025+ before the installed base could be even 50%. YouTube serving a codec with no HW decoder to systems that do have HW decoders for other codecs must add up hugely. Plus the encoding power needed is also a lot higher.

The XSX decoder is probably better, but consoles are power beasts in general. Xbox and PS consoles generally draw >100 watts just to have something on the screen. Compare that to things like Roku or Fire TV, which draw <10 watts running full blast.

Of course, when doing streaming over 4/5G, higher bandwidths also mean more antenna power, so there's some tradeoff there somewhere.

Environmental organizations should really come out with a browser plugin that forces YouTube et al. to only stream the best codec that has a HW decoder.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
benwaggoner is offline   Reply With Quote
Old 13th January 2021, 09:03   #2391  |  Link
soresu
Registered User
 
Join Date: May 2005
Location: Swansea, Wales, UK
Posts: 181
Quote:
Originally Posted by benwaggoner View Post
Environmental organizations should really come out with a browser plugin that forces YouTube et al. to only stream the best codec that has a HW decoder.
Most new PCs have VP9 HW decoding, and obviously all have H.264 HW decode too. If you lack AV1-capable HW, all you have to do in Firefox to stick to HW-decodable codecs is disable AV1 playback in about:config.

I'm not sure whether Chrome has an equivalently easy-to-find switch to control AV1 playback.
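
For anyone wanting the exact switch: as far as I know the relevant Firefox pref is media.av1.enabled, which you can flip in about:config or via user.js in your profile directory:

Code:
// user.js in the Firefox profile directory (pref name assumed current)
user_pref("media.av1.enabled", false);  // YouTube then falls back to VP9/H.264, which have HW decode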
soresu is offline   Reply With Quote
Old 13th January 2021, 09:40   #2392  |  Link
soresu
Registered User
 
Join Date: May 2005
Location: Swansea, Wales, UK
Posts: 181
Quote:
Originally Posted by benwaggoner View Post
And HDR with 8-bit is much harder. Just encoding Rec 2100 content in 8-bit yields a horrible mess.

And it's challenging to detect full 4K detail in SDR for natural images, and in many cases impossible even for expert viewers. HDR is what makes 4K generally worthwhile for natural images. Seeing the difference between carefully selected 4K and 8K HDR moving images is only possible for expert viewers with 20/10 vision, and only on a minority of "stress test" clips.
Ah sorry, I didn't mean using HDR or Rec 2100 for 8 bit.

When I wrote ">10 bpc" I only meant above 10 bpc, not 10 bpc and above.

Some people write > to mean 'more than or equal to'; for me it just means 'more than', and >= means 'more than or equal to'. A linguistic consequence of dabbling in Python, I think.

As to the difference between 2K and 4K being visible without HDR, I would say that depends upon display size and the viewing distance.

IMHO many people buy screens too small to even appreciate the resolution uptick from SD to 1080p, and then often sit too far away from the screen, which only makes the issue worse.

I have a 40 inch 1080p TV which I use as a PC monitor (50 cm away at most), and I can just about see the screen door effect of the pixel separation.

Obviously this gets much worse for a 4K screen, and 8K is never going to be anything but a placebo to the consumer, unless viewed through VR with an insane per-eye pixel resolution and the right optics to capitalise on it.
soresu is offline   Reply With Quote
Old 14th January 2021, 13:52   #2393  |  Link
hajj_3
Registered User
 
Join Date: Mar 2004
Posts: 999
rav1e v0.4.0 is out: https://github.com/xiph/rav1e/releases/tag/v0.4.0
hajj_3 is offline   Reply With Quote
Old 14th January 2021, 14:17   #2394  |  Link
GTPVHD
Registered User
 
Join Date: Mar 2008
Posts: 207
https://www.anandtech.com/show/16390...z590-coming-q1
https://images.anandtech.com/doci/16390/11900K.png

Per the slide, Intel Rocket Lake supports AV1 fixed-function hardware decoding.
GTPVHD is offline   Reply With Quote
Old 14th January 2021, 18:55   #2395  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 3,461
Quote:
Originally Posted by soresu View Post
Ah sorry, I didn't mean using HDR or Rec 2100 for 8 bit.

When I wrote ">10 bpc" I only meant above 10 bpc, not 10 bpc and above.
Gotcha. And yes, I've not seen many cases where >10-bit is needed for final consumer delivery, presuming good dithering was done. Things are simpler with more precision, because various dithering stages can be skipped (dithering-on-encode is just one; the playback device can often have at least two rounds of dithering post-decode). In content creation, at least 2 bits more than the delivery format should be used so that dithering isn't required at every intermediate step.

Quote:
As to the difference between 2K and 4K being visible without HDR, I would say that depends upon display size and the viewing distance.
Those can also be limiting factors. But the fundamental limitation is the human visual system and the content. For >2K to look better, you need content with spatial frequencies above the Nyquist limit of 2K sampling, i.e. detail that 2K literally cannot represent. Lots of sources won't really have that, and a lot more will only have it in grain (source or synthetic). Most 4K studio content we see is full of shots that are effectively 2K + grain.
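
A toy 1-D illustration of what "frequencies above 2K Nyquist" means in practice (pure numpy, with a made-up test frequency):

Code:
import numpy as np

# Detail that exists at "4K" sampling but is above the Nyquist limit of "2K".
n4k, n2k = 3840, 1920
x = np.arange(n4k)

cycles = 1800                                   # 1800 cycles across the frame width
fine = np.sin(2 * np.pi * cycles * x / n4k)     # representable at 4K (~0.47 cyc/sample)
# At 2K this would be 1800/1920 = 0.94 cycles/sample, far above the 0.5 limit.

down = fine.reshape(n2k, 2).mean(axis=1)        # naive 2:1 box downsample
back = np.repeat(down, 2)                       # naive 2x upscale

print("detail energy at 4K        :", round(np.mean(fine ** 2), 4))   # ~0.5
print("detail energy after 2K trip:", round(np.mean(back ** 2), 4))   # ~0.005, aliased residue
# Content without such frequencies gains nothing from >2K delivery.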

Quote:
IMHO many people get screens too small to even appreciate the resolution uptick from SD to 1080p, and often sit too far away from the screen which only makes the issue worse.
Very true. For years I've been telling people that often the best upgrade to their TV experience would be pushing their couch forward.

Quote:
I have a 40 inch 1080p TV which I use as a PC monitor (50 cm away at most), and I can just about see the screen door effect of the pixel separation.
Yeah, that's WAY too close! If you're looking at the center of the screen, the viewing angle to the edges of the screen is going to be terrible. You actually need to move your head around to see different parts, or push your chair back to see the whole image at once.

Quote:
Obviously this gets much worse for a 4K screen, and 8K is never going to be anything but a placebo to the consumer, unless viewing through VR with insane pixel res per eye and the right optics to capitalise on it.
And with VR, only because the optics reduce the actual worst-case detail delivered. It's really more about the fundamental limits of how small an arc we can resolve visually.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
benwaggoner is offline   Reply With Quote
Old 14th January 2021, 21:14   #2396  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
Blue_MiSfit's Avatar
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,783
Another aspect of VR is that for pre-produced content you're effectively rendering a 360-degree scene and then using an equirectangular projection to fit it into a standard video frame. During playback the player wraps that video around the inside of a sphere and drops your viewport inside it, which means you're actually looking at only a small piece of the video at any moment. With a ~4K video and a ~2.5K head-mounted display with a typical FOV, you're going to be looking at maybe 1/4 of the encoded resolution, and the player of course has to upscale that to the native HMD panel. All of this means you're basically watching sub-HD video on a very dense screen, as close as your eyes can focus.
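
Rough numbers, with the headset resolution and FOV below being generic assumptions rather than any specific product:

Code:
# How much of an equirectangular 360 video actually fills the viewport.
# Headset numbers below are generic assumptions.
video_w, video_h = 3840, 1920        # "4K" equirect covers 360 x 180 degrees
panel_w, panel_h = 1280, 1440        # rough per-eye panel resolution
fov_h, fov_v = 90, 90                # typical field of view, degrees

src_w = video_w * fov_h / 360        # source pixels across the viewport: 960
src_h = video_h * fov_v / 180        # source pixels down the viewport: 960

print(f"source pixels in view: {src_w:.0f} x {src_h:.0f}")
print(f"per-eye panel        : {panel_w} x {panel_h}")
print(f"upscale needed       : ~{panel_w / src_w:.2f}x horizontally")
# i.e. a "4K" 360 video delivers sub-HD detail per eye, hence the talk of
# 8K-16K sources or viewport-dependent tiling.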

360 video is generally pretty boring, but to really maximize the potential you'd basically want a 16K video. Some companies (Pixvana, for one) tried to get around this by cutting the video into slices and only streaming one or two at a time. This theoretically gets you higher resolution during playback at lower bandwidth (since you're not streaming / decoding / processing everything you can't see). Ultimately 360 video just does not scale, though, and the lack of parallax is disturbing and uncomfortable for many. Here's hoping for lots of neat developments in light field capture, compression, and delivery. The guys at Lytro were doing wild and crazy stuff a few years ago before they ran out of money. I wonder what Google is doing with all that IP...
Blue_MiSfit is offline   Reply With Quote
Old 15th January 2021, 19:24   #2397  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 3,461
Yeah, I've got ~7 patents on VR encoding and playback, and I still don't see a way to make it work for customers for scripted content. Video games are obviously a good fit for some genres, and some experiences, more like museum curation, can be great. But VR is mainly a new thing, not a new way to deliver old things.

And video quality is at least 15 years behind what we can do with a 2D flat screen.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
benwaggoner is offline   Reply With Quote
Old Yesterday, 14:57   #2398  |  Link
hajj_3
Registered User
 
Join Date: Mar 2004
Posts: 999
Google will require that all new Android TV device models released after March 31st, 2021 include AV1 decode support: https://www.xda-developers.com/googl...ideo-decoding/

We will therefore see all new Android TV box models supporting it, and since some TVs also run Android TV, they will support AV1 too.
hajj_3 is offline   Reply With Quote