
AMD Radeon 5700 Series OpenGL Driver Issues

Question asked by cryru on Mar 21, 2020
Latest reply on Mar 22, 2020 by cryru

Hey guys, I'm working on some OpenGL code and I noticed some peculiarities on one of my machines that don't happen on any of the others. After messing around with my code I managed to get it working, but I have no idea why the changes I made fixed it, or how to detect this case other than hardcoding the renderer's name, so I decided to ask here.

 

GL_VERSION: 4.5.13417
GL_RENDERER: AMD Radeon HD 5700 Series
GLSL: 440
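
These are the strings reported by glGetString; for anyone wanting to compare, a minimal sketch (assuming a current GL context and a loader's GL headers):

    #include <stdio.h>
    /* GL headers via a loader (e.g. glad or GLEW) assumed */

    printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GLSL:        %s\n", (const char *)glGetString(GL_SHADING_LANGUAGE_VERSION));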


I've also tested with core, core + forward-compatible, compatibility, and default OpenGL profiles.
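
For clarity, those correspond to the usual context-creation hints; in GLFW terms (shown purely as an illustration, not necessarily my actual windowing setup):

    /* core profile */
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 4);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 5);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

    /* core + forward compat: additionally */
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GLFW_TRUE);

    /* compatibility: GLFW_OPENGL_COMPAT_PROFILE; default: GLFW_OPENGL_ANY_PROFILE */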

 

Here's what I'm doing each frame:

  • I upload vertex data to a VertexBuffer using glMapBufferRange with GL_MAP_WRITE_BIT | GL_MAP_INVALIDATE_BUFFER_BIT | GL_MAP_FLUSH_EXPLICIT_BIT (the whole flow is sketched after this list)
  • I then flush the range of data which was mapped using glFlushMappedBufferRange
  • I unmap the buffer
  • One of the attributes of the vertex data is an int specifying the texture index within the shader.
  • I bind and activate a series of textures, up to the number specified by GL_MAX_TEXTURE_IMAGE_UNITS, which on this driver is 18.
  • In the fragment shader, I index a uniform array of samplers to determine which texture the vertex uses.
    • uniform sampler2D textures[TEXTURE_COUNT]
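
Put together, the per-frame flow is roughly this (a sketch only; the handle and variable names are placeholders, error checking omitted):

    #include <string.h> /* memcpy; GL loader headers assumed */

    /* sketch of the per-frame upload + texture binding described above */
    static void UploadFrame(GLuint vbo, const void *vertexData, GLsizeiptr dataSize,
                            const GLuint *textures, GLint textureCount)
    {
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        void *ptr = glMapBufferRange(GL_ARRAY_BUFFER, 0, dataSize,
                                     GL_MAP_WRITE_BIT |
                                     GL_MAP_INVALIDATE_BUFFER_BIT |
                                     GL_MAP_FLUSH_EXPLICIT_BIT);
        memcpy(ptr, vertexData, (size_t)dataSize);
        /* explicitly flush exactly the range that was written */
        glFlushMappedBufferRange(GL_ARRAY_BUFFER, 0, dataSize);
        glUnmapBuffer(GL_ARRAY_BUFFER);

        /* one texture per unit, up to GL_MAX_TEXTURE_IMAGE_UNITS (18 here) */
        for (GLint i = 0; i < textureCount; i++)
        {
            glActiveTexture(GL_TEXTURE0 + (GLenum)i);
            glBindTexture(GL_TEXTURE_2D, textures[i]);
        }
    }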

 

When performing this I get the result in the attached video: random flickering, and sometimes the last texture drawn not being displayed. I'd like to stress that I have checked the parameters that go into the functions listed above, and have tested them on a variety of machines and GPUs (including Mesa3D's llvmpipe), so I don't believe my ranges are wrong, but rather that I'm making a wrong assumption somewhere.

 

I got the scene to render properly by using only GL_MAP_WRITE_BIT and by limiting my activated textures (and the uniform array size) to 1. The driver reports support for the GL_ARB_gpu_shader5 extension (which I've tried to request), and I'm using GLSL 440, so uniform indexing of opaque types should work. Additionally, I don't see why the invalidate-buffer and flush-explicit bits would cause problems either, given the stated driver support.
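
For reference, the shader side reduces to something like this (simplified; the in/out names are placeholders):

    #version 440
    #extension GL_ARB_gpu_shader5 : require // reported as supported, explicitly requested

    #define TEXTURE_COUNT 18

    uniform sampler2D textures[TEXTURE_COUNT];

    flat in int Tid;  // texture index attribute, passed through from the vertex stage
    in vec2 UV;       // placeholder texcoord
    out vec4 fragColor;

    void main()
    {
        // index the sampler array with the per-vertex integer
        fragColor = texture(textures[Tid], UV);
    }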

 

I would greatly appreciate it if someone could shed some light on this.
