OpenGL: Packed Depth Stencil Buffer problem

Discussion created by Spooky_ on Apr 22, 2010
possible bug?

There seems to be a bug on ATi cards with the packed depth-stencil buffer of a framebuffer object, if that buffer is defined as a texture.

Short description: if you bind the depth-stencil buffer texture, the stencil buffer either gets "destroyed", or the stencil test somehow ends up reading bits of the depth buffer values instead of the stencil buffer.

Someone else also posted this problem on gamedev.net (I posted there too, as Spikx). That user included a bit of example code there to illustrate when this occurs.

I also posted a picture of what the stencil buffer looks like when the depth-stencil buffer texture is bound: http://dl.dropbox.com/u/2309215/tree_stencil.gif
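To make the report a bit more self-contained, here is a minimal sketch of the kind of setup that triggers this for me. This is a reconstruction, not the exact code from the gamedev.net thread; it assumes a current GL context with EXT_framebuffer_object and EXT_packed_depth_stencil available, and the variable names (`fbo`, `depthStencilTex`, `width`, `height`) are my own. Error checking and the color attachment are omitted.

```c
/* Sketch only: requires a live GL context; not runnable standalone. */
GLuint fbo, depthStencilTex;
GLsizei width = 512, height = 512; /* example dimensions */

/* Create the packed depth-stencil buffer as a texture. */
glGenTextures(1, &depthStencilTex);
glBindTexture(GL_TEXTURE_2D, depthStencilTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH24_STENCIL8_EXT, width, height, 0,
             GL_DEPTH_STENCIL_EXT, GL_UNSIGNED_INT_24_8_EXT, NULL);

/* Attach it to both the depth and the stencil attachment points. */
glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                          GL_TEXTURE_2D, depthStencilTex, 0);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_STENCIL_ATTACHMENT_EXT,
                          GL_TEXTURE_2D, depthStencilTex, 0);

/* ... attach a color buffer and render a pass that writes the stencil mask ... */

/* Later: bind the depth-stencil texture for sampling while the stencil
   test is active. On the affected cards this is where the stencil
   contents appear to break, as in the screenshot above. */
glBindTexture(GL_TEXTURE_2D, depthStencilTex);
glEnable(GL_STENCIL_TEST);
glStencilFunc(GL_EQUAL, 1, 0xFF);
/* ... draw using the stencil mask -> wrong results ... */
```

Rendering works as expected as long as the depth-stencil texture is not bound for sampling; the problem only shows up once it is bound while the stencil test is in use.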