
Archives Discussions

sechrest
Journeyman III

Shadow Map errors

There are holes in my depth shadows that do not appear on graphics cards from other vendors

I've got a Windows/OpenGL app that uses depth textures to implement shadow maps.  All is well on other graphics cards, but on ATI cards I get holes in the shadow maps at resolutions greater than 128 x 128.

It seems to be worse on 64-bit systems, but I can't say that reliably.  All ATI cards tested exhibit the behavior to some extent.  I've tried several driver versions and am currently running driver version 8.712.3.0 (it shipped with Catalyst 10.3).
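For context, the setup is the usual depth-texture shadow map: a depth-only texture attached to an FBO, rendered from the light's point of view. A minimal sketch of that setup (names like shadow_tex and SHADOW_RES are illustrative, error checking omitted):

```c
/* Minimal sketch of a depth-texture shadow-map setup (GL 2.x era).
   Names (shadow_tex, shadow_fbo, SHADOW_RES) are illustrative. */
GLuint shadow_tex, shadow_fbo;
const GLsizei SHADOW_RES = 1024;

glGenTextures(1, &shadow_tex);
glBindTexture(GL_TEXTURE_2D, shadow_tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, SHADOW_RES, SHADOW_RES,
             0, GL_DEPTH_COMPONENT, GL_FLOAT, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_COMPARE_R_TO_TEXTURE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_FUNC, GL_LEQUAL);

glGenFramebuffers(1, &shadow_fbo);
glBindFramebuffer(GL_FRAMEBUFFER, shadow_fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                       GL_TEXTURE_2D, shadow_tex, 0);
glDrawBuffer(GL_NONE);   /* depth-only pass: no color attachments */
glReadBuffer(GL_NONE);
```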

Anyone else seeing something similar?

5 Replies
mrmacete
Journeyman III

Are you using GLSL for this?

Can you post a sample project or some code to explain your algorithm?

I'm using depth-based shadow maps on ATI cards (tested X1300 and HD 4850) without problems in a GLSL-based rendering pipeline.


I'm using GLSL compile targets and Cg/CgFX files for all the shader management.  I can't post a project (it's a commercial app), but I can add some details.  I'm using the exact same algorithm we're using in several apps.  This error only shows up in our editing application -- it is not present in our stand-alone OpenGL or DirectX apps.  It also works fine in the editor on NVIDIA cards.

After some more investigation I found that the compare stage of the shadow system is correct, but what gets rendered into the shadow buffer is corrupted. There are garbage depth values in regular patterns which increase in number as the shadow buffer resolution increases.  What's really troubling is that these squares of garbage exist wholly within otherwise correct regions.

We're still cranking away at it so I might have more to post later...

 


Originally posted by sechrest: "There are garbage depth values in regular patterns which increase in number as the shadow buffer resolution increases."


I've seen such squared-pattern garbage on ATI when using GL's automatic mipmap generation on the X1300 at some texture resolutions, especially with non-square or non-power-of-two textures. If you're not interested in mipmapping, make sure automatic generation is not triggered.
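If that's the culprit, the old fixed-function auto-generation can be switched off explicitly and the filtering kept away from the mip chain. A sketch under that assumption (shadow_tex is an illustrative name):

```c
/* Make sure fixed-function mipmap auto-generation is off for the shadow
   texture, and that filtering never samples a mip level. */
glBindTexture(GL_TEXTURE_2D, shadow_tex);
glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_FALSE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); /* not *_MIPMAP_* */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, 0);          /* clamp to base level */
```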

F.


It would be interesting to know how you solved the problem. I have exactly the same problem. I use OpenGL + Cg/CgFX... HD 4670. But the issue happens with newer AMD cards too.

The problem occurs on Windows and Linux, with both ARB profiles and GLSL. It doesn't happen with NVIDIA cards, and (!) it also doesn't happen with an old Catalyst driver (from the CD that came with my 4670, year 2009). I can provide more information and a test case. Playing with parameters hasn't helped so far, and filtering doesn't change anything. I looked for auto-generated mipmaps (see post above) in gDEBugger and don't see any; I also tried to disable generation explicitly, but it didn't help.

Any ideas? Thank you.

 


Solved... there was one glBindTexture() call in the wrong place. It was useless and doesn't produce an error, but on some systems it somehow causes this effect:

... ... ...

glBindFramebuffer(GL_FRAMEBUFFER, shadow_fbo);

//glBindTexture(GL_TEXTURE_2D, shadow_texture); <-----

// draw to shadow buffer

... ... ...
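A defensive variant of the fix above: make sure the depth texture is never bound for sampling while it is the framebuffer's attachment. This is a sketch, reusing the thread's names (shadow_fbo, shadow_texture):

```c
/* Unbind the shadow texture from the active unit before rendering into
   the FBO it is attached to; sampling a texture that is also the current
   render target is an undefined feedback loop and can show up as
   corruption on some drivers. */
glBindTexture(GL_TEXTURE_2D, 0);
glBindFramebuffer(GL_FRAMEBUFFER, shadow_fbo);
glClear(GL_DEPTH_BUFFER_BIT);
/* ... draw occluders into the shadow buffer ... */
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glBindTexture(GL_TEXTURE_2D, shadow_texture); /* now safe to sample */
```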

 
