
Bug when transferring integer uniforms?

Question asked by s1s0 on Dec 6, 2012
Latest reply on Dec 11, 2012 by gsellers


Here is some simple code (implementing a stipple pattern) that works "most of the time" but fails with certain data when running on the AMD graphics driver.



The fragment shader (it reads gl_FragCoord and writes out_Color):


#version 330

uniform vec3  in_Color;
uniform float in_Alpha;
uniform uint  in_Stipple[32];
uniform bool  in_StippleEn = false;

out vec4 out_Color;

void main(void)
{
   bool dropThePixel = false;
   if (in_StippleEn)
   {
      uvec2 ufCoord = uvec2(gl_FragCoord.x, gl_FragCoord.y);
      uint index = 31u - (ufCoord.y % 32u);
      uint mask  = uint(0x80000000) >> (ufCoord.x % 32u);
      dropThePixel = !bool(in_Stipple[index] & mask);
   }
   if (dropThePixel)
      discard;
   out_Color = vec4(in_Color, in_Alpha);
}



The code that updates the in_Stipple uniform:



   const byte* tellStipple;

   //... tellStipple points to 128 unsigned char values

   GLuint shdrStipple[32];
   for (unsigned i = 0; i < 32; i++)
      shdrStipple[i] = ((GLuint)(tellStipple[4*i + 0]) << 8*3)
                     | ((GLuint)(tellStipple[4*i + 1]) << 8*2)
                     | ((GLuint)(tellStipple[4*i + 2]) << 8*1)
                     | ((GLuint)(tellStipple[4*i + 3]) << 8*0);
   glUniform1uiv(glslUniVarLoc[glslu_in_Stipple], 32, shdrStipple);




OK - I should add that it all compiles without trouble... and works in most cases.

The trouble appears with a quite particular combination of bits in a particular location of the stipple array: shdrStipple[0], bits 31 and 30. So:

- if bits 31 and 30 of shdrStipple[0] are both NOT set - no problem - it all works with any values in the rest of the array

- if only bits 31 and 30 of shdrStipple[0] are set and the rest of the array is 0 - no problem

The trouble comes when those two bits are set simultaneously and the rest of the array holds actual pattern data. In that case the rest of the array arrives corrupted, and of course the picture on the screen is quite different from the expected one.


Of course it took me some nights to figure this out. The simplest working workaround is to expand the array to 33 uints and assign 0 to shdrStipple[0]. With the corresponding correction in the shader, it works fine.

The other workaround (very ugly!) is to convert the pattern to floats on both sides (the array has to be doubled in this case, because a float can't hold an arbitrary uint without precision loss).

The actual code for both workarounds is not posted.

The question is: what's wrong with this code? The impression is that the data is getting corrupted somewhere in the pipe between the user software on the host and the shader. This is not my first problem with the way AMD handles integers, so any comments would be highly appreciated.





P.S. Shall I add that the same code works fine on other platforms?