7th May 2015, 19:31  #3
LoRd_MuldeR
Software Developer
Join Date: Jun 2005
Location: Last House on Slunk Street
Posts: 13,248
The so-called "GPU decoding" almost never actually runs on the GPU's programmable cores. Instead, it uses a dedicated hardware H.264 decoder (i.e. a separate piece of silicon) that just happens to be integrated on the same chip as the GPU. This has the important consequence that you do not need to write any GPU shader/kernel code for H.264 decoding (e.g. via CUDA or OpenCL), because the "programmable" part of the GPU is not even used. You simply invoke the "hardwired" H.264 decoding routines that are already burnt into the silicon. The hardware decoder is exposed through standard programming interfaces, such as DXVA (Windows), CUVID (NVIDIA) or VDPAU (Linux). So it's the DXVA, CUVID or VDPAU SDK that you need to look into for code samples, I suppose...
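
Just to illustrate the shape of it: below is a rough sketch of the CUVID path, assuming NVIDIA's nvcuvid headers from the Video Codec SDK. The surface counts are arbitrary, and error handling, the display stage and the actual bitstream reading are left out, so treat it as a skeleton rather than working player code:

Code:
/* Rough sketch of the CUVID path (nvcuvid). Error handling, the display
 * stage and real bitstream input are omitted. Link with -lnvcuvid -lcuda. */
#include <cuda.h>
#include <nvcuvid.h>
#include <string.h>

static CUvideodecoder g_decoder = NULL;

/* Parser callback, fired once the H.264 sequence header (SPS) has been
 * seen: this is where the fixed-function decoder is actually created. */
static int CUDAAPI on_sequence(void *user, CUVIDEOFORMAT *fmt)
{
    CUVIDDECODECREATEINFO info;
    memset(&info, 0, sizeof(info));
    info.CodecType           = fmt->codec;          /* cudaVideoCodec_H264 */
    info.ChromaFormat        = fmt->chroma_format;
    info.OutputFormat        = cudaVideoSurfaceFormat_NV12;
    info.ulWidth             = fmt->coded_width;
    info.ulHeight            = fmt->coded_height;
    info.ulTargetWidth       = fmt->coded_width;
    info.ulTargetHeight      = fmt->coded_height;
    info.ulNumDecodeSurfaces = 8;   /* arbitrary for this sketch */
    info.ulNumOutputSurfaces = 2;
    info.ulCreationFlags     = cudaVideoCreate_PreferCUVID; /* hardware unit */
    cuvidCreateDecoder(&g_decoder, &info);
    return 1;
}

/* Parser callback, fired for every complete picture: hands the compressed
 * data to the dedicated decoder silicon. No shader/kernel code involved. */
static int CUDAAPI on_decode(void *user, CUVIDPICPARAMS *pic)
{
    cuvidDecodePicture(g_decoder, pic);
    return 1;
}

int main(void)
{
    CUcontext ctx;
    CUvideoparser parser;
    CUVIDPARSERPARAMS pp;
    CUVIDSOURCEDATAPACKET pkt;
    unsigned char buf[4096];
    size_t n = 0; /* <- read Annex-B H.264 data from your demuxer here */

    cuInit(0);
    cuCtxCreate(&ctx, 0, 0 /* device ordinal */);

    memset(&pp, 0, sizeof(pp));
    pp.CodecType              = cudaVideoCodec_H264;
    pp.ulMaxNumDecodeSurfaces = 8;
    pp.pfnSequenceCallback    = on_sequence;
    pp.pfnDecodePicture       = on_decode;
    cuvidCreateVideoParser(&parser, &pp);

    /* Feed raw bitstream into the parser; it invokes the callbacks above
     * as soon as it has something decodable. Loop this over your input;
     * send a packet with CUVID_PKT_ENDOFSTREAM set when you are done. */
    memset(&pkt, 0, sizeof(pkt));
    pkt.payload      = buf;
    pkt.payload_size = (unsigned long)n;
    cuvidParseVideoData(parser, &pkt);

    cuvidDestroyVideoParser(parser);
    if (g_decoder)
        cuvidDestroyDecoder(g_decoder);
    cuCtxDestroy(ctx);
    return 0;
}

DXVA2 and VDPAU follow the same basic pattern under different names: you create a decoder object for the codec and resolution at hand, submit the parsed bitstream, and read back decoded surfaces, while the fixed-function silicon does all the heavy lifting.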