1 Reply Latest reply on Aug 13, 2013 9:04 AM by dutta

    OpenGL Compute Shader BSOD

    dutta

      Hi again!

      I've been experimenting with compute shaders and explicit image format bindings and ran into a problem. I know my usage is incorrect, but it still shouldn't cause a BSOD. First I create the texture like so:

      glGenTextures(1, &this->computeTexture);
      glBindTexture(GL_TEXTURE_2D, this->computeTexture);
      // Single-channel 32-bit float texture, 512x512
      glTexImage2D(GL_TEXTURE_2D, 0, GL_R32F, 512, 512, 0, GL_RED, GL_FLOAT, 0);
      glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
      glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
      glGenerateMipmap(GL_TEXTURE_2D);
      glBindTexture(GL_TEXTURE_2D, 0);

      And the compute shader code is:


      #version 430
      layout(local_size_x = 16, local_size_y = 16, local_size_z = 1) in;
      layout(rgba32f) restrict uniform image2D img;
      uniform float roll;
      
      void main()
      {
        ivec2 storePos = ivec2(gl_GlobalInvocationID.xy);
        // Distance from the centre of the 16x16 work group (currently unused)
        float localCoef = length(vec2(ivec2(gl_LocalInvocationID.xy) - 8) / 8.0f);
        float globalCoef = sin(float(gl_WorkGroupID.x + gl_WorkGroupID.y) * 0.1f + roll) * 0.5f;
        // Only the red channel carries data, but the image is declared rgba32f
        imageStore(img, storePos, vec4(1.0f - globalCoef, 0.0f, 0.0f, 0.0f));
      }

      It's executed like this:

      // 512/16 = 32 work groups in x and y; each group covers a 16x16 tile
      glDispatchCompute(512 / 16, 512 / 16, 1);


      Whenever I run this compute shader I get a BSOD (or sometimes a non-recovering driver crash). I know I'm binding an R32F texture as an rgba32f image, but I still only write to the red channel, and in any case a format mismatch shouldn't take down the machine. This is on a Radeon HD 7970 on Windows 7 with the Catalyst 13.4 driver.


      Correction: I forgot to post how I bind the texture to the image unit:


      glBindImageTexture(this->textureUnit, (GLint)this->currentValue[0], 0, GL_FALSE, 0, GL_READ_WRITE, GL_R32F);


      This caused the BSOD; however, when I changed it to:


      glBindImageTexture(this->textureUnit, (GLint)this->currentValue[0], 0, GL_FALSE, 0, GL_READ_WRITE, GL_RGBA32F);


      It worked fine. So it seems that whenever glBindImageTexture is supplied a format that doesn't match the format qualifier declared in the shader, the result is a BSOD. The OpenGL specification doesn't mention whether this combination is supposed to be allowed, so I guess the behaviour is simply undefined; still, undefined behaviour in the API shouldn't crash the whole machine.
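      For reference, the fully matched single-channel setup (r32f in the shader, GL_R32F at bind time) should avoid the mismatch entirely. This is an untested sketch, not code from my application:

```glsl
// Sketch: shader-side declaration matching the GL_R32F texture storage.
// imageStore still takes a vec4; the unused channels are ignored for r32f.
layout(local_size_x = 16, local_size_y = 16, local_size_z = 1) in;
layout(r32f) restrict uniform image2D img;
```

      The corresponding bind call would then keep GL_R32F: glBindImageTexture(this->textureUnit, this->currentValue[0], 0, GL_FALSE, 0, GL_READ_WRITE, GL_R32F);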