1 Reply Latest reply on Nov 21, 2010 7:13 PM by vautieri

    Issues with OpenGL 3.x on ATI hardware

    vautieri
      OpenGL 3.x Issues not yet addressed (or thought to not be addressed)

      ----------- UPDATE-------------------

      No ATI response on the GL_R11F_G11F_B10F issue?  In any case, I've narrowed down the other issue listed below ("A more significant issue") about the draw buffers.  Note, this works on OpenGL 2.1 (removing the fancy OpenGL 3.x stuff to test), but not on 3.2 and above (not sure about 3.0).  I have determined that the buffers are indeed getting written, since if I blit them to FBO 0 I can see the data.  The ATI issue deals with rendering to an MRT and then reading those textures in a shader later (I'm still in the same frame).  I even tried to unbind my textures from the original FBO, thinking they were locked... but no success, nothing but black.  The issues in this post so far appear to be true ATI bugs in their current released driver.  I'm still sifting through the spec looking for a magic phrase that says I'm breaking a rule, but it looks like I'm in line so far...

       

      Note, I've tried sampler2DMS and non-multisampled samplers to ensure it's not a multisample bug...

       

      Again, this works on NVIDIA but not on ATI.  The textures bound to the FBO are getting written to, since if I blit the color buffer to FBO 0 I can see the content.  It appears to be the act of binding another FBO and trying to read, within the shader, the textures that the first FBO filled in.
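      To summarize the failing sequence, here is a condensed outline of what the frame does (names are placeholders, not my actual code):

```
pass 1:
    bind fboA as GL_DRAW_FRAMEBUFFER      // MRT: several color attachments + depth_stencil
    glDrawBuffers(N, bufs)
    draw the scene                        // attachments ARE written (blitting to FBO 0 shows them)

pass 2 (same frame):
    bind fboB as GL_DRAW_FRAMEBUFFER
    bind fboA's color textures as samplers   // tried sampler2DMS and plain sampler2D
    draw the read-back pass                  // NVIDIA: correct; ATI: samples come back black
```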

      ---------------------------------- END UPDATE---------------------

      I'll start adding a list of issues I'm having with an OpenGL 3.x render context over the next few days/weeks.

      GL_R11F_G11F_B10F as a multisampled texture generates an OpenGL error (using glTexImage2DMultisample).  My other multisampled textures work OK (and are bound to an FBO).  In the meantime, I'm using an RGBA16F texture to get around this ATI issue.  Our app runs on NVIDIA hardware without a flaw, and I've been looking over the OpenGL specification to make sure I'm not in violation of anything... maybe I am, but I would like to support ATI GPUs, so hopefully posting here makes that happen.

      glTexImage2DMultisample is available only if the GL version is 3.2 or greater.
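      For reference, the appeal of GL_R11F_G11F_B10F over RGBA16F is that it packs three unsigned floats into a single 32-bit word: R and G are 11-bit floats (5 exponent bits, 6 mantissa bits), B is a 10-bit float (5 exponent, 5 mantissa).  A minimal sketch of the 11-bit encoding, assuming normal, non-negative, in-range inputs (helper name is mine, not a GL call):

```c
#include <stdint.h>
#include <string.h>

/* Pack a non-negative float into an 11-bit unsigned float:
   5 exponent bits (bias 15), 6 mantissa bits, no sign bit.
   Denormals, NaN and infinity are not handled in this sketch. */
static uint32_t float_to_uf11(float f)
{
    uint32_t bits;
    memcpy(&bits, &f, sizeof bits);                    /* raw IEEE-754 single bits   */
    int e = (int)((bits >> 23) & 0xFFu) - 127 + 15;    /* re-bias exponent to 15     */
    uint32_t mant = (bits >> 17) & 0x3Fu;              /* keep top 6 mantissa bits   */
    if (e <= 0)  return 0;                             /* underflow flushes to zero  */
    if (e >= 31) return (30u << 6) | 0x3Fu;            /* clamp to largest finite    */
    return ((uint32_t)e << 6) | mant;
}
```

      1.0f encodes as 0x3C0 (exponent field 15, mantissa 0); the 10-bit blue channel works the same way with one fewer mantissa bit, and the three fields are packed B:G:R from the top of the 32-bit word.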

      A more significant issue I'm having is that my color buffers do not appear to get drawn into when using glDrawElements (multiple render targets, with a depth_stencil attachment bound to the FBO).  I did read that ATI had a bug where the vertex attribute had to be at location 0, so in GLSL I force that with:

       layout(location = 0) in vec3 g_Vertex;

      Still no good... all I get is a black screen, with no OpenGL errors.
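      For completeness (and to rule out draw-buffer mapping as the cause), in a forward-compatible context the fragment outputs are user-declared too, since gl_FragColor/gl_FragData are gone.  A sketch of pinning the MRT outputs explicitly — variable names are placeholders; layout qualifiers on fragment outputs need GLSL 3.30 or ARB_explicit_attrib_location, and on 3.2 the same mapping can be set with glBindFragDataLocation before linking:

```glsl
// vertex shader input pinned to attribute 0, as above
layout(location = 0) in vec3 g_Vertex;

// fragment shader: one explicitly-located output per draw buffer
layout(location = 0) out vec4 out_Color0;   // -> draw buffer 0 (GL_COLOR_ATTACHMENT0)
layout(location = 1) out vec4 out_Color1;   // -> draw buffer 1 (GL_COLOR_ATTACHMENT1)
```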

      Hmm, I just saw this... I do use a forward-compatible context:

      http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=286280#Post286280

      Installed: Display Driver 10.11, 11/17/2010

      ATI Radeon HD 3450



       



        • Issues with OpenGL 3.x on ATI hardware
          vautieri

          ** EDIT **

          This seems messed up: I see in the 3.3 spec that GL_READ_FRAMEBUFFER must be bound to 0!  Not sure why; maybe NVIDIA didn't believe it either and let it work...  So I did this (bound the read framebuffer to 0), blit my depth to buffer 0, and read it back from there instead of from my resolve FBO... and all is good.

          Per the spec, the following is needed for the code below to work... otherwise, error!

          glBindFramebuffer( GL_READ_FRAMEBUFFER, 0 );

          ----------End of Edit------------

           Here is another issue on ATI hardware vs. NVIDIA hardware; I'm pretty sure everything is according to the OpenGL specification.  Since the depth buffer is multisampled, glReadPixels needs the depth resolved into a non-multisampled buffer first.  I have a gut feeling this would still be an issue (an OpenGL error) without the resolve buffer, as I get PBO/compressed-texture issues as well // TBD.  I'll post that code later, but I want to re-read the OpenGL spec to ensure I'm not in violation regarding PBOs and compressed textures.

          Just an FYI about the glReadPixels example below:

          const char* err = (const char*)gluErrorString(glError);

          returns a null string!... probably because it's error number 1286, which gluErrorString doesn't know about.
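          The null makes sense once you decode it: 1286 is 0x0506, i.e. GL_INVALID_FRAMEBUFFER_OPERATION, which postdates GLU, so gluErrorString has no string for it.  A trivial lookup of my own avoids the null (helper name is mine; the case values are the standard GL error enums):

```c
/* Map a glGetError() value to its name. gluErrorString returns NULL
   for codes newer than GLU, e.g. GL_INVALID_FRAMEBUFFER_OPERATION. */
static const char* gl_error_name(unsigned int err)
{
    switch (err)
    {
        case 0x0000: return "GL_NO_ERROR";
        case 0x0500: return "GL_INVALID_ENUM";
        case 0x0501: return "GL_INVALID_VALUE";
        case 0x0502: return "GL_INVALID_OPERATION";
        case 0x0505: return "GL_OUT_OF_MEMORY";
        case 0x0506: return "GL_INVALID_FRAMEBUFFER_OPERATION"; /* decimal 1286 */
        default:     return "unknown GL error";
    }
}
```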

           The error happens just after:

           

          glReadPixels( x, y, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, 0 );



          ...and here is the code:

          glBindFramebuffer( GL_READ_FRAMEBUFFER, resolve_fbos_[ current_pass_ ] ); // non-multisampled resolve target

          if( !key_to_pbo[key] )
          {
              // lazily create a 4-byte PBO for this key's depth readback
              glGenBuffers( 1, &key_to_pbo[key] );
              glNamedBufferDataEXT( key_to_pbo[key], sizeof(float), 0, GL_STREAM_READ );
          }

          glBindBuffer( GL_PIXEL_PACK_BUFFER, key_to_pbo[key] );
          glReadPixels( x, y, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, 0 ); // packs into the bound PBO, not client memory
          glBindBuffer( GL_PIXEL_PACK_BUFFER, 0 );
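          The value then gets fetched later (that's the point of GL_STREAM_READ: the readback is asynchronous), roughly:

```
later in the frame (or the next frame), once the transfer has completed:
    bind key_to_pbo[key] to GL_PIXEL_PACK_BUFFER
    ptr = glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY)   // 4-byte depth value
    read the float at ptr, then glUnmapBuffer(GL_PIXEL_PACK_BUFFER)
    unbind the PBO
```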