We are the developer of a stage lighting application. We use OpenGL to visualize stage lighting in our application.
Our customers are complaining about a memory leak with the Vega cards. When our application simulates a lighting show, the Windows Task Manager shows our application's memory usage climbing quickly and continuously, and the application eventually crashes as a result. We tested with a range of AMD and NVIDIA cards, and the memory leak only occurs with the Vega cards.
We found that the memory leak can be avoided if we comment out some of the glClear(GL_COLOR_BUFFER_BIT) calls on FBOs in our code. However, not clearing the color texture makes the rendering incorrect. Clearing the depth texture seems to be fine and does not cause any leak. Note that NOT ALL glClear(GL_COLOR_BUFFER_BIT) calls trigger the leak; we suspect it only happens under certain conditions, but we have not been able to determine what those conditions are. In any case, the leak is not uncommon and is easy to reproduce.
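For reference, the per-frame pattern that triggers the leak looks roughly like the sketch below. The identifiers (fbo, colorTex, depthTex) are illustrative placeholders, not our actual code, and the setup/teardown of the GL context is omitted:

```cpp
// Setup (once): an FBO with a color texture and a depth texture attached.
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colorTex, 0);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                       GL_TEXTURE_2D, depthTex, 0);

// Per frame:
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
// Removing GL_COLOR_BUFFER_BIT here stops the leak (but breaks rendering);
// clearing only GL_DEPTH_BUFFER_BIT does not leak.
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
// ... draw the lighting geometry into the FBO ...
glBindFramebuffer(GL_FRAMEBUFFER, 0);
```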
I tested with the OpenGL samples from here: https://github.com/NVIDIAGameWorks/GraphicsSamples
My Vega card can run six of those samples, and two of them (WeightedBlendedOIT and NormalBlendedDecal) exhibit the same memory leak. Just as with our application, I can eliminate the leaks by removing the glClear and glClearBufferfv calls in those samples.
If there are any questions or further explanations needed, please let us know. We look forward to your reply.
Thanks in advance,
William Law
wlaw@cast-soft.com
Here are my Radeon settings:
Radeon Software Version - 17.7
Radeon Software Edition - Crimson ReLive
Graphics Chipset - Radeon RX Vega
High Bandwidth Cache Size - 8176 MB
High Bandwidth Cache Type - HBM2
Core Clock - 1630 MHz
Windows Version - Windows 10 (64 bit)
System Memory - 16 GB
CPU Type - Intel(R) Core(TM) i7-7700K CPU @ 4.20GHz