Quote:
Originally Posted by NikosD
Interesting!
I wish there were some hint of how many transistors this takes, so we could estimate the silicon cost of adding it to a chip. It could be bigger than normal, as this is JUST an AV1 decoder, without sharing anything with H.264/HEVC/VP9/etc. decoders. In a more mature implementation, one would expect an integrated decoder that supports multiple bitstreams. That takes a lot fewer transistors in total than having all of those as independent decoders.
400/500 MHz is pretty reasonable, as it lets the processor run in a relatively low-power state for better battery life on long-form content.
I am not a deep SoC guy, so take all of the above with an appropriately scaled grain of salt.
I'm looking forward to seeing an announcement for the first device with HW AV1 decode. AV1 isn't relevant for premium content until a material portion of customers have devices with HW decoders and integrated HW DRM.
So much hinges on whether the additive cost of AV1 decode will be low enough to be a default in lower cost SoCs in the next year or two. I'm kinda startled how murky that still is as we approach 2020.