2 Replies Latest reply on Nov 20, 2014 11:56 PM by jzhou

    Bugs on Linux with GL state queries

    baldurk

      There are a couple of GL state queries that fail on Linux, namely GL_VERTEX_BINDING_BUFFER and GL_POLYGON_MODE. The former is rejected as an unrecognised enum by glGetIntegeri_v. GL_POLYGON_MODE is recognised, but the driver claims it was removed from the Core profile (it works fine on Compatibility).

       

      This happens for me on the latest drivers available at the time of writing (GL_VERSION is 4.4.13084 Core Profile/Debug Context 14.301.1001), and on various core profile versions; I tested 3.2, 4.2, 4.3 and 4.4.

       

      Some code to show the problem. I just initialised a 4.4 (for example) context with SDL, but I don't think it's specific to that, as I can also reproduce it in a codebase that calls glXCreateContextAttribs directly.

       

        printf("GL_VENDOR: %s\n", glGetString(GL_VENDOR));
        printf("GL_RENDERER: %s\n", glGetString(GL_RENDERER));
        printf("GL_VERSION: %s\n", glGetString(GL_VERSION));
      
        GLint maj = 0, minr = 0;
        glGetIntegerv(GL_MAJOR_VERSION, &maj);
        glGetIntegerv(GL_MINOR_VERSION, &minr);
      
        printf("GL_MAJOR/MINOR: %d.%d\n", maj, minr);
      
        GLuint vertex_array = 0;
        GLuint vertex_buffer = 0;
      
        glGenVertexArrays(1, &vertex_array);
        glBindVertexArray(vertex_array);
      
        glGenBuffers(1, &vertex_buffer);
        glBindBuffer(GL_ARRAY_BUFFER, vertex_buffer);
        glBufferData(GL_ARRAY_BUFFER, 128, NULL, GL_STATIC_DRAW);
        glBindBuffer(GL_ARRAY_BUFFER, 0);
      
        glVertexAttribBinding(0, 0);
        glBindVertexBuffer(0, vertex_buffer, 32, 16);
      
        printf("vertex_array: %d\n", vertex_array);
        printf("vertex_buffer: %d\n", vertex_buffer);
      
        // GLint throughout, since glGetIntegeri_v writes 32-bit values;
        // casting a GLuint64* to GLint* would leave half the variable unwritten.
        GLint vao_binding = 0;
        GLint vb_binding = 0;
        GLint vb_offset = 0;
        GLint vb_stride = 0;
      
        glGetIntegerv(GL_VERTEX_ARRAY_BINDING, &vao_binding);
        glGetIntegeri_v(GL_VERTEX_BINDING_OFFSET, 0, &vb_offset);
        glGetIntegeri_v(GL_VERTEX_BINDING_STRIDE, 0, &vb_stride);
        glGetIntegeri_v(GL_VERTEX_BINDING_BUFFER, 0, &vb_binding);
      
        printf("A: %d %d %d %d\n", vao_binding, vb_binding, vb_offset, vb_stride);
      
        // this works, fetching the buffer binding 'through' the VAO.
        glGetVertexAttribiv(0, GL_VERTEX_ATTRIB_ARRAY_BUFFER_BINDING, &vb_binding);
      
        printf("B: %d %d %d %d\n", vao_binding, vb_binding, vb_offset, vb_stride);
      
        GLint data[2] = { 0, 0 };
        glGetIntegerv(GL_POLYGON_MODE, data);
      
        printf("%d %d\n", data[0], data[1]);
      
      

       

      The output from the above code, including output from DebugMessageCallback:

       

      GL_VENDOR: ATI Technologies Inc.
      GL_RENDERER: AMD Radeon R9 200 Series
      GL_VERSION: 4.4.13084 Core Profile/Debug Context 14.301.1001
      GL_MAJOR/MINOR: 4.4
      vertex_array: 1
      vertex_buffer: 1
      Got a Debug message from 33350, type 33356, ID 1001, severity 37190:
      'glGetIntegerIndexedv parameter <pname> has an invalid enum '0x8f4f' (GL_INVALID_ENUM)'
      A: 1 0 32 16
      B: 1 1 32 16
      Got a Debug message from 33350, type 33356, ID 3200, severity 37190:
      'Using glGetIntegerv in a Core context with parameter <pname> and enum '0xb40' which was removed from Core OpenGL (GL_INVALID_ENUM)'
      1337 1337 1337 1337
      
      

       

      I would expect the A: and B: lines to both print "1 1 32 16", and the last line should print "6914 6914" (GL_FILL). Please let me know if you need any more information.

       

      Baldur

        • Re: Bugs on Linux with GL state queries
          baldurk

          Upon further investigation, while trying to use GL_VERTEX_ATTRIB_ARRAY_BUFFER_BINDING as a workaround, I think this query is also subtly wrong: it behaves exactly as GL_VERTEX_BINDING_BUFFER does, in that the index passed to glGetVertexAttribiv returns the vertex buffer bound at that binding slot, regardless of which binding the attribute actually uses. It's not clear from the original code, but modifying the above code like so (with appropriate buffer creation) shows the problem:

           

          glBindVertexBuffer(0, vertex_buffer[0], 32, 16);
          glBindVertexBuffer(1, vertex_buffer[1], 32, 16);
          glBindVertexBuffer(2, vertex_buffer[2], 32, 16);
          glBindVertexBuffer(3, vertex_buffer[3], 32, 16);
          glVertexAttribBinding(0, 1);
          glVertexAttribBinding(1, 1);
          glVertexAttribBinding(2, 1);
          glVertexAttribBinding(3, 1);
          
          for(int i=0; i < 4; i++)
          {
            glGetVertexAttribiv(i, GL_VERTEX_ATTRIB_ARRAY_BUFFER_BINDING, (GLint *)&attrib_buf_binding);
            glGetVertexAttribiv(i, GL_VERTEX_ATTRIB_BINDING, (GLint *)&attrib_binding);
            printf("C%d: %d %d\n", i, attrib_buf_binding, attrib_binding);
          }
          

           

          which prints:

          C0: 1 1
          C1: 2 1
          C2: 3 1
          C3: 4 1
          

           

          when it should print

           

          C0: 2 1
          C1: 2 1
          C2: 2 1
          C3: 2 1
          

           

          Since all four vertex attribs (0, 1, 2, 3) point at the same vertex buffer: the one in binding slot 1, which is buffer 2.