
Invalid depth buffer on ATI X1300/X1550

Question asked by allforum on Mar 22, 2013



I can't find what is wrong on ATI cards. On NVIDIA I have no problem.

Maybe you will have an idea...


I have 3 views:

- a 2D (ortho) view on the left

- two 3D (perspective) views on the right, one above the other.

The model is drawn in the 3 views, and the 2D view shares the display lists.


When I close one 3D view, the remaining view takes all the space on the right of the 2D view.


Problem: glReadPixels returns 0 where the closed 3D view was. So on half of the view it returns 0, and on the other half it is correct.

I get the same problem by resizing the views: with the two 3D views one above the other, I reduce the size of one view so that the other takes its place, and the problem appears.

What is incredible is that the views are drawn correctly!


I saw with gDEBugger that the depth buffer is wrong on half of the view.

Here are the depth buffers:


I also note the following:

- if I disable antialiasing and use the classical PIXELFORMATDESCRIPTOR, then I no longer have the problem. (On a FireGL V7200 I think it never works.)

- if I enable GL_*_SMOOTH and use the classical PIXELFORMATDESCRIPTOR, then I have the problem.

- if I enable or disable GL_MULTISAMPLE_ARB and use wglChoosePixelFormatARB, then I have the problem.


I no longer have the problem on an HD 5450 or a FirePro V3750 after updating the drivers.


Is this a known problem on old cards?

Or do you have an idea of what could cause it?