Hello,
We are a company building 3D web games for low-end machines. To make our games look good without shaders (those freakin' Intel integrated video cards have no, or very limited, shader support), we use OpenGL lighting for our game world.
We implemented a system that dynamically switches lights on and off, so we can use more than 8 lights in a game world, provided we enforce a maximum number of lights per object (default 3). We use vertex arrays and display lists for our geometry for maximum compatibility.
Using ATI Catalyst driver version 9.4 and above, we encounter a major memory leak, mainly when using lots and lots of objects. In one of our games, Mahjong Mountain, the game leaks about 2 to 3 megabytes per second during gameplay! On older driver versions, or on other video cards, no problems occur. We also tried a different (smaller) codebase that randomly switches lights on and off, which also leaked (on a 4870x2 using Catalyst 9.8).
Is there a driver developer here who can help us with this problem? We would like a workaround that doesn't require all of our clients to reinstall their systems.
Yours,
Michiel Roza
Engine developer Kalydo BV.
I'm suffering from exactly the same problem as you.
It has been reported to the ATI OpenGL driver team (Ticket 531).
I hope it is fixed in the next release.
We also experience the same problem using OpenGL lighting and a huge number of objects (rendering with VBOs and vertex arrays). It leaks up to 5 MB per second. I hope this is fixed ASAP, or we will have to recommend that all our customers not use AMD/ATI graphics.
Can you provide a code sample and a link where I can download it?
The leak seems to have been fixed in Catalyst 9.10.
Great work! (but about time too)
Thanks for the update.
Hi,
we are developing a video application (to decode and display multiple video streams) that uses OpenGL for rendering/display, and I have to report that we still have a significant memory leak with ATI graphics cards (an HD 4800, for example) and the latest 9.10 drivers. Note that there was no memory leak with driver versions 9.3 and older. The same memory leak occurs on Windows XP/Vista and Linux. There is no memory leak with NVIDIA graphics cards.
Memory usage grows by up to 500-700 KB per second, and the rate depends on the number of frames rendered per second (SwapBuffers). We make heavy use of texture updates via glTexSubImage2D(), GL_LIGHTING, shaders, ...
Here is a piece of code:
**********
glMatrixMode(GL_MODELVIEW);
glPushMatrix();
glEnable(GL_DEPTH_TEST);
glEnable(GL_LIGHTING);
glDisable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glColor4ub((GLubyte)15,(GLubyte)15,(GLubyte)20, (GLubyte)150);
glTranslated (center.x(), center.y(), center.z());
glBegin(GL_QUADS);
glNormal3f( 0.0f, 0.0f, 1.f);
glVertex3d(-dimensions.width()/2, -dimensions.height()/2, 0);
glVertex3d(dimensions.width()/2, -dimensions.height()/2, 0);
glVertex3d(dimensions.width()/2, dimensions.height()/2, 0);
glVertex3d(-dimensions.width()/2, dimensions.height()/2, 0);
glEnd();
glDisable(GL_BLEND);
glDisable(GL_LIGHTING);
glPopMatrix();
*************
If I comment out the line glNormal3f( 0.0f, 0.0f, 1.f); the application stops eating RAM (observed in Task Manager on Windows XP).
Has anybody else reported this issue? Thanks