3rd December 2020, 12:40 | #2341 | Link |
Angel of Night
Join Date: Nov 2004
Location: Tangled in the silks
Posts: 9,559
|
I've moved dav1d-specific posts to dav1d accelerated AV1 decoder, beginning from a bit over a year ago. There's still plenty of room for a general decoders comparison thread, and of course an encoders face-off thread.
|
3rd December 2020, 23:52 | #2342 | Link | |||
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,770
|
Quote:
SW DRM is simply not allowed for lots of premium content, however. AV1 is a lot more practical for user-generated and other non-commercial content than for professionally licensed content. Also, the reduced battery life of using a SW decoder matters a lot more when watching a two-hour movie than for short-form content.
Quote:
As a content creator, if one is choosing a single codec beyond H.264, HEVC certainly offers a much bigger audience for 2021, with the exception of Firefox and Chrome. |
|||
4th December 2020, 15:10 | #2343 | Link |
Registered User
Join Date: Oct 2016
Posts: 896
|
Oh wow, I had no idea about that. So is that the actual reason why UHD isn't available on Chrome & Firefox for those streaming services that offer it? I always assumed it was because of stronger DRM in Edge.
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40 |
4th December 2020, 20:09 | #2344 | Link | |
Registered User
Join Date: Mar 2020
Posts: 117
|
Quote:
All current hardware decoders, including those in laptops, have a much higher power budget; i.e. you could have a hardware decoder working in the 1+ W range without problems. Compare that to a mobile phone, where it is expected to operate in the few-hundred-mW range. This time around it isn't so simple, because VVC has barely been finished and on the surface doesn't seem to share that much with AV1. How this translates into hardware decoding block differences remains to be seen, especially when the power requirements are much more stringent. When I previously wrote that this will change with 5nm SoCs, as both transistor budget and power usage improve, I was referring to TSMC's 5nm; the Snapdragon 888 is based on Samsung's 5nm, which has a lower transistor density, so it isn't quite there yet. Finally, MediaTek has only one chip with an AV1 decoder, and that is their high-end flagship. 90% of MediaTek's volume is low- to mid-range SoCs, and transistor budgets are even tighter in those segments. I just wish people were more mindful of the different interests in video codecs, from hardware to software and from users to producers.
__________________
Previously iwod
Last edited by ksec; 4th December 2020 at 20:22. |
|
5th December 2020, 01:15 | #2345 | Link | |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,770
|
Quote:
This is partly a reflection of the strong focus on user-generated content at Google (YouTube) and Facebook. Among other things, this is why there's no browser-based HDR premium content. While AV1 technically supports HDR, no one has released an encoder with mature HDR tuning. x265 needed quite a lot of feature development to get optimal HDR encoding, since PQ and Rec 709 have some pretty foundational differences and different optimization requirements. The net effect is that we'll probably see premium content playback on Windows/Mac continue to shift away from browsers towards apps. The large majority of PC and Mac systems can decode 10-bit HEVC in HW. |
|
18th December 2020, 12:06 | #2347 | Link |
Registered User
Join Date: Mar 2004
Posts: 1,125
|
|
18th December 2020, 18:13 | #2348 | Link | |
Registered User
Join Date: Jul 2020
Posts: 5
|
Quote:
|
|
6th January 2021, 16:33 | #2350 | Link | |
Registered User
Join Date: Apr 2018
Posts: 63
|
Quote:
Could someone cross-check with the native Windows Video Player? |
|
6th January 2021, 22:47 | #2351 | Link | |
Registered User
Join Date: Apr 2018
Posts: 61
|
Quote:
8K is going to have huge memory requirements regardless of the codec, though; I'd expect it to scale basically with the number of frames prerendered by the player, with additional codec overhead being negligible. I almost wonder if the suspiciously low numbers are due to decoding being too slow to fill some internal buffer, with the MS (libaom) decoder being slower than dav1d in MPC. Last edited by Greenhorn; 7th January 2021 at 03:16. |
|
7th January 2021, 02:23 | #2352 | Link | |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,770
|
Quote:
HW decoders have an easier time of it because they don't need frame-level parallel decoding or RGB buffers, since they can write 4:2:0 straight to the GPU. But yeah, 4 GB for an 8K SW decoder seems quite plausible to me if a decent number of frames need to be buffered at different stages. None of that is specific to AV1, but AV1 is the only thing people are talking about doing 8K SW decode with. My own research hasn't found any content that actually looks better at 8K than 4K, so there's a whole lot of solution-looking-for-a-problem going on in that scenario. 8K YouTube looks better than 4K YouTube only because YouTube is bit-starved at every resolution.
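For what it's worth, the buffering argument can be sanity-checked with back-of-envelope arithmetic. A minimal sketch, where the buffer counts are assumptions for illustration rather than measurements from any actual decoder:

```python
# 8K is 7680x4320; assume 10-bit 4:2:0 stored as 2 bytes per sample.
WIDTH, HEIGHT = 7680, 4320
SAMPLES_PER_PIXEL = 1.5    # luma plus quarter-resolution Cb and Cr planes
BYTES_PER_SAMPLE = 2       # 10-bit samples padded to 16-bit words

frame_bytes = WIDTH * HEIGHT * SAMPLES_PER_PIXEL * BYTES_PER_SAMPLE
frame_mb = frame_bytes / 2**20      # roughly 95 MiB per decoded frame

# Hypothetical buffer split: AV1's 8 reference frames, plus assumed
# frames in flight for frame-parallel decode and the player's queue.
buffered_frames = 8 + 8 + 16
total_gb = frame_bytes * buffered_frames / 2**30
print(f"{frame_mb:.0f} MiB/frame, {total_gb:.1f} GiB for {buffered_frames} frames")
```

At roughly 95 MiB per frame, a few dozen buffered frames already lands around 3 GiB, so a 4 GB peak for an 8K SW decode pipeline is not far-fetched.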
|
7th January 2021, 02:29 | #2353 | Link |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,770
|
Come to think of it, SW AV1 decoding is actually going to have an impact on global CO2 emissions. A CPU can easily draw 20 more watts doing SW decode versus HW decode. 500K simultaneous YouTube viewers watching AV1 could mean another 10 MW of power consumption, and the matching emissions, compared to YouTube using HEVC. Even assuming relatively clean natural-gas plants, that would be a few extra tonnes of CO2 every hour, around the clock.
Yowza. |
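Redoing that arithmetic with explicit numbers (the per-machine wattage and the gas-plant CO2 intensity are both assumptions, not measurements):

```python
viewers = 500_000          # simultaneous AV1 SW-decode viewers (assumed)
extra_watts = 20           # extra CPU draw, SW vs HW decode (assumed)
extra_mw = viewers * extra_watts / 1e6            # total extra load in MW

gas_tonnes_per_mwh = 0.4   # rough CO2 intensity of a gas plant (assumed)
tonnes_per_hour = extra_mw * gas_tonnes_per_mwh   # each MW runs 1 MWh/hour
print(f"{extra_mw:.0f} MW extra, ~{tonnes_per_hour:.1f} t CO2 per hour")
```

Small per hour, but sustained continuously it adds up to tens of thousands of tonnes a year, and that's for just one assumed audience size.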
9th January 2021, 04:01 | #2354 | Link | |
Registered User
Join Date: May 2005
Location: Swansea, Wales, UK
Posts: 196
|
Quote:
I've seen plenty of 8bpc content without banding, so it clearly isn't inherent, and I doubt that the average human could tell the difference between 10 and 12 bpc content at all.
|
10th January 2021, 15:08 | #2355 | Link | |
Registered User
Join Date: May 2018
Posts: 184
|
Quote:
|
|
11th January 2021, 03:34 | #2356 | Link | |
Registered User
Join Date: May 2005
Location: Swansea, Wales, UK
Posts: 196
|
Quote:
That being said, the Samsung 8K TV models already have terrible power efficiency even without other issues coming into play. I'm not sure whether it's down to them having more FALD zones or just higher peak nits (or a combination of both), but the lowest efficiency rating their 4K QLED TVs get is B, whereas their 8K TVs can go as low as D (A being the best rating). There's also the hybrid decoder recently committed for XB1 and later consoles using DX shaders and UWP; it would be interesting to see the power consumption of the XSX doing 8K AV1 decode when using that. Last edited by soresu; 11th January 2021 at 03:38. |
|
12th January 2021, 15:43 | #2358 | Link |
Registered User
Join Date: May 2005
Location: Swansea, Wales, UK
Posts: 196
|
It seems that at least Samsung's mobile division is pushing AV1 support, going by their latest reveal at CES of the new Exynos 2100 SoC destined for the Galaxy S21.
Given that reports also put AV1 support in the 2020 QLED models, I will wait until actual hardware is in reviewers' hands before I dance for joy. |
13th January 2021, 00:08 | #2359 | Link | |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,770
|
Quote:
And HDR with 8-bit is much harder; just encoding Rec 2100 content in 8-bit yields a horrible mess. It's also challenging to detect full 4K detail in SDR for natural images, and in many cases impossible even for expert viewers; HDR is what makes 4K generally worthwhile for natural images. Seeing the difference between carefully selected 4K and 8K HDR moving images is only possible for expert viewers with 20/10 vision, and only on a minority of "stress test" clips. Higher resolutions pay off a lot more for computer games, but that's more about the limitations of anti-aliasing technology and the much greater local contrast of synthetic graphics. Rendering games at 4K and downscaling to 1080p still looks a lot better than native 1080p gaming. |
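The 8-bit PQ problem is easy to quantify with the ST 2084 EOTF. A sketch (the bit-depth comparison is mine; the visibility threshold cited in the comment is a rough figure, not from this thread):

```python
# SMPTE ST 2084 (PQ) EOTF constants.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code: float) -> float:
    """Map a normalized PQ signal value in [0, 1] to luminance in cd/m^2."""
    p = code ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def step_percent(bits: int) -> float:
    """Luminance jump, in percent, between adjacent mid-range code values."""
    levels = 2 ** bits - 1
    lo = pq_to_nits((levels // 2) / levels)
    hi = pq_to_nits((levels // 2 + 1) / levels)
    return 100 * (hi - lo) / lo

print(f"8-bit step: {step_percent(8):.1f}%, 10-bit step: {step_percent(10):.1f}%")
# Around ~90 nits, an 8-bit PQ step is a ~4% luminance jump, while the eye
# can resolve steps of very roughly 0.5% there; 10-bit cuts it to ~1%.
```

That factor-of-four gap between adjacent code values is why 8-bit Rec 2100 gradients band so visibly where 10-bit mostly survives.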
|
13th January 2021, 00:15 | #2360 | Link | |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,770
|
Quote:
The XSX decoder is probably better, but consoles are power beasts in general; Xbox and PS consoles generally draw >100 watts just to have something on the screen. Compare that to things like Roku or Fire TV, which draw <10 watts running full blast. Of course, when streaming over 4G/5G, higher bandwidths also mean more antenna power, so there's a tradeoff there somewhere. Environmental organizations should really come out with a browser plugin to force YouTube et al. to stream only the best codec that has a HW decoder. |
|