
Trying to DMA from capture card to Direct3D10 texture, driver crashes

Question asked by on Jan 7, 2013
Latest reply on Jan 13, 2013 by



I'm writing a capture plugin for Open Broadcaster Software. The card I'm capturing from provides an API to DMA a captured bitmap to a user-supplied buffer. The SDK documentation mentions this makes it possible to DMA straight to GPU memory, and there is a Direct3D9 SDK sample that (supposedly) does so by passing the pointer from a LockedRect to the capture driver and then calling UnlockRect. I have no idea what goes on under the hood with Direct3D9: whether it is really doing a DMA straight into GPU memory, or whether the LockedRect is just a regular system memory buffer that gets uploaded to the GPU afterwards.
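For reference, my understanding of the D3D9 sample's flow is roughly the sketch below. `CaptureDMAToBuffer` is a placeholder I made up for the Datapath SDK's DMA call, not its real name:

```cpp
// Sketch of the D3D9 SDK sample's approach (Windows-only, not verified).
#include <d3d9.h>

// Hypothetical stand-in for the Datapath SDK function that DMAs a frame
// into a caller-supplied buffer with the given row pitch.
extern void CaptureDMAToBuffer(void *dest, INT pitch);

HRESULT CaptureToSurface(IDirect3DSurface9 *surface)
{
    D3DLOCKED_RECT locked;
    HRESULT hr = surface->LockRect(&locked, nullptr, 0);
    if (FAILED(hr))
        return hr;

    // The capture driver writes the frame to locked.pBits,
    // honouring the surface pitch in locked.Pitch.
    CaptureDMAToBuffer(locked.pBits, locked.Pitch);

    return surface->UnlockRect();
}
```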


I've been trying to replicate that behaviour within OBS' Direct3D10.1 context, using the same method but with a pointer to a Map()ed texture (with D3D10_USAGE_DYNAMIC and D3D10_CPU_ACCESS_WRITE), and by calling Unmap(). However, as soon as I do that, my AMD display driver crashes (balloon with "Your display driver has stopped responding and has recovered"). Now I'm almost completely certain that I shouldn't be able to cause that just by doing what I'm doing!
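Concretely, the failing D3D10.1 path looks like this (again a sketch, with the same hypothetical `CaptureDMAToBuffer` standing in for the SDK call):

```cpp
// Sketch of the crashing D3D10.1 path (Windows-only, not verified).
#include <d3d10_1.h>

// Hypothetical stand-in for the Datapath SDK DMA call.
extern void CaptureDMAToBuffer(void *dest, UINT pitch);

// tex was created with D3D10_USAGE_DYNAMIC and D3D10_CPU_ACCESS_WRITE.
HRESULT CaptureToTexture(ID3D10Texture2D *tex)
{
    D3D10_MAPPED_TEXTURE2D mapped;
    HRESULT hr = tex->Map(0, D3D10_MAP_WRITE_DISCARD, 0, &mapped);
    if (FAILED(hr))
        return hr;

    // Handing mapped.pData to the capture driver for DMA is what
    // triggers the display driver crash for me.
    CaptureDMAToBuffer(mapped.pData, mapped.RowPitch);

    tex->Unmap(0);
    return S_OK;
}
```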


If I just allocate a plain old bitmap using CreateDIBSection() and memcpy that onto my mapped texture, capture works just fine (obviously). What puzzles me, though, is that the crash happens on DX10.1 but not on DX9. Is this a bug in the way the display driver unmaps the texture? With my limited knowledge, I can imagine that the mapped pointer might in fact be a hardware address mapped into my application's address space, and that DMAing to that address would need some extra magic from the display driver and/or memory manager to make sure the transfer actually lands at the right physical address. Or is what I'm trying to do simply impossible? If so, what about the working DX9 sample?
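The working fallback is just a row-by-row copy that respects the mapped texture's RowPitch (which is often wider than width * 4). A minimal, API-free sketch of that copy:

```cpp
// Row-by-row copy from a tightly packed 32bpp bitmap into a destination
// whose rows may be padded (dstPitch >= width * 4), as a mapped texture's are.
#include <cstddef>
#include <cstdint>
#include <cstring>

void CopyBitmapToMapped(uint8_t *dst, size_t dstPitch,
                        const uint8_t *src, size_t width, size_t height)
{
    const size_t rowBytes = width * 4; // 4 bytes per pixel, tightly packed
    for (size_t y = 0; y < height; ++y)
        std::memcpy(dst + y * dstPitch, src + y * rowBytes, rowBytes);
}
```

A single memcpy of the whole frame would corrupt the image (or write past row ends) whenever RowPitch differs from width * 4, which is why the per-row copy is needed.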


The SDK documentation mentions this about trying to DMA straight onto a GPU surface:

Note, using the back buffer with some integrated Intel graphics devices can BSOD or show striped capture data.

This to me implies that there is some sort of inherent danger in this method, but that it should work.


I'd love to hear any expert thoughts on this. The capture card in question is a Datapath card from their Vision product line (they all use the same drivers and SDK), and my graphics card is an AMD Radeon 5870.


My source code: (offending lines currently commented out)

OBS source code: