For my application, I want to decode live H.264 streams and display them in OpenGL. I'm testing on an R9 380 with Media SDK 1.1, driver 16.4.1, and Win10 Pro x64.
I've looked at both pipelinePlayback and simpleDecoder samples.
Decoding via the OpenGL path in pipelinePlayback produces the wrong colors: specifically, a green cast on the left edge, a color gradient across the frame, and a purple cast on the right edge. I stopped working with this path at that point.
Requesting either the Dx9 or Dx11 decode path in pipelinePlayback produces correct output, and likewise the output written by simpleDecoder appears fine. I added a VideoConverter to the Dx9 path of simpleDecoder to convert NV12 to BGRA, and its output also looks correct. So far so good.
Adding an OpenGL VideoConverter stage (input and output both AMF_SURFACE_BGRA) after the Dx9 VideoConverter produces the same gradient as the pure OpenGL path. Calling Convert(amf::AMF_MEMORY_OPENGL) on the output of the Dx9 VideoConverter instead works correctly, however.
In my own app, I pass the Dx9 device and the OpenGL handles in when initializing the AMF context. I can use the Dx9 decoder and VideoConverter, then call Convert(amf::AMF_MEMORY_OPENGL) on the VideoConverter's output surface. This works (I get a usable OpenGL texture id and a texture with correct colors), but it cuts my framerate in half compared to drawing without that call using a temporary texture: where I would expect to hit the 60 Hz video refresh rate, I get 30 fps instead.
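For reference, the working-but-slow path boils down to roughly the following. This is a simplified sketch, not my actual code: it assumes `converter` is an already-initialized amf::AMFComponentPtr (the Dx9 VideoConverter outputting AMF_SURFACE_BGRA) on a context initialized with both InitDX9 and InitOpenGL; error handling is omitted.

```cpp
// Per-frame sketch (assumptions noted above; not a complete program).
amf::AMFDataPtr data;
if (converter->QueryOutput(&data) == AMF_OK && data != nullptr)
{
    // Migrate the converted BGRA surface to OpenGL memory.
    // This call is what appears to halve the framerate.
    data->Convert(amf::AMF_MEMORY_OPENGL);

    amf::AMFSurfacePtr surface(data);
    // For OpenGL surfaces, the plane's native handle is the GL texture name.
    GLuint tex = (GLuint)(uintptr_t)surface->GetPlaneAt(0)->GetNative();

    glBindTexture(GL_TEXTURE_2D, tex);
    // ... draw a quad with the texture ...
}
```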
I've also tried using WGL_NV_DX_interop instead of the Convert call, but I only get a blank texture, even though everything looks set up correctly: glGetError and GetLastError return 0 after the respective OpenGL and interop calls, the texture id appears valid, and so on.
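The interop attempt follows the usual register/lock pattern, roughly as below. Again a simplified sketch: `d3d9Device` is the IDirect3DDevice9Ex passed to AMF, `d3d9Surface` is the converter output's native D3D9 surface, and the wglDX* function pointers are loaded via wglGetProcAddress; error checks are omitted. Note the share-handle step: per the extension spec, D3D9 resources created with a shared handle must be registered with wglDXSetResourceShareHandleNV before wglDXRegisterObjectNV, and I'm not certain AMF exposes such a handle for its surfaces, so `sharedHandle` here is a placeholder and may be exactly where my setup falls down.

```cpp
// One-time setup (sketch).
HANDLE interopDevice = wglDXOpenDeviceNV(d3d9Device);

GLuint tex = 0;
glGenTextures(1, &tex);

// Required for shared D3D9 resources; sharedHandle is hypothetical here.
wglDXSetResourceShareHandleNV(d3d9Surface, sharedHandle);

HANDLE interopObject = wglDXRegisterObjectNV(
    interopDevice, d3d9Surface, tex, GL_TEXTURE_2D, WGL_ACCESS_READ_ONLY_NV);

// Per frame: the texture contents are only defined while the object is locked.
wglDXLockObjectsNV(interopDevice, 1, &interopObject);
glBindTexture(GL_TEXTURE_2D, tex);
// ... draw ...
wglDXUnlockObjectsNV(interopDevice, 1, &interopObject);
```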
If needed I can come back with sample code for investigating the Dx9-to-OpenGL conversion, but perhaps there's a simple fix for the OpenGL issue in pipelinePlayback that would make it unnecessary.
Any advice?
Thanks