
OpenGL 3.3/GLSL 330 Vertex Attribute and Built-In issues

Discussion created by Shrinker on Sep 9, 2010
Latest reply on Feb 8, 2011 by Shrinker

Hello! I guess this is the right place to post about problems I have been experiencing lately.

I am working with OpenGL (3.3) on my Nvidia desktop and my Ati laptop, both of which are updated to the latest versions of their respective graphics drivers. I have identified some problems that I could confirm with friends who also own properly maintained Ati systems.

In GLSL nowadays, the proper way of inputting vertex coordinate data is to set up a vertex attribute on the application side and declare a matching "in" variable in the vertex shader. That vertex data is then transformed with the modelview and projection matrices, both passed in as uniforms. That works well so far.
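For reference, a minimal vertex shader of this kind looks roughly like the following sketch (the attribute and uniform names are just placeholders for whatever the application binds):

    #version 330

    // Vertex position supplied by the application (placeholder name).
    in vec3 inPosition;

    // Transformation matrices supplied by the application as uniforms.
    uniform mat4 modelViewMatrix;
    uniform mat4 projectionMatrix;

    void main()
    {
        gl_Position = projectionMatrix * modelViewMatrix * vec4(inPosition, 1.0);
    }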

Now though, I have a specific case where there is no need to specify vertex coordinates, because they can easily be inferred from gl_VertexID. I have studied the OpenGL specifications for days and am quite sure that what I am doing is nothing out of the ordinary. So I have a vertex shader that has no inputs at all, only outputs, and it infers what to output solely from gl_VertexID (see the stripped-down sketch below). This works fine on my Nvidia system, but my Ati system just doesn't display anything. It starts displaying something once I declare a dummy input and feed it dummy data that is also used and overwritten in the shader itself (i.e. unnecessary assignments to make sure the input is not optimized away). It looks to me like the driver doesn't display anything when there are no inputs, which is behavior that is not backed by the OpenGL specification. Triangles or quads can easily be produced from the sequence information in gl_VertexID alone.
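A stripped-down version of what I am doing looks roughly like this (the exact positions and the texCoord output are just illustrative; the point is that there are no "in" variables, and the shader is meant to be drawn as a 4-vertex GL_TRIANGLE_STRIP):

    #version 330

    // No "in" variables at all; the position is derived purely from gl_VertexID.
    out vec2 texCoord;

    void main()
    {
        // gl_VertexID 0..3 -> (0,0), (1,0), (0,1), (1,1), covering the viewport.
        vec2 corner = vec2(float(gl_VertexID & 1), float(gl_VertexID >> 1));
        texCoord    = corner;
        gl_Position = vec4(corner * 2.0 - 1.0, 0.0, 1.0);
    }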

Another problem that I've had is that as soon as I used any of the still-valid built-ins, like gl_VertexID or gl_InstanceID, I could only use vertex attributes IF, sorted alphanumerically, their names came BEFORE gl_*. I have triple-checked all of this on the application side too, to make sure I am sending the right data and that I retrieve valid attribute locations.

So, for instance, if I input vertex positions for a quad through a named input like "foo" and color the quad based on gl_VertexID, it works fine (see the sketch below). If I instead give the input a name that sorts after gl_*, nothing is displayed at all. My guess is that the built-ins somehow count against the attribute table when used, making it impossible to actually input attribute data through names that come after them.
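Roughly, the working variant looks like this (placeholder names; the fragment shader just passes the color through):

    #version 330

    // Works when the attribute name sorts before "gl_" (e.g. "foo");
    // a name sorting after "gl_" makes the quad disappear on Ati.
    in vec2 foo;          // quad corner position from the application

    out vec3 color;

    void main()
    {
        // Pick a color purely from the vertex index.
        color = vec3(float(gl_VertexID & 1), float((gl_VertexID >> 1) & 1), 1.0);
        gl_Position = vec4(foo, 0.0, 1.0);
    }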

As a workaround for my specific case, I'm now using an integer input named "_vertexID", which is just that, the vertex ID, filled by the application. I have to use the leading underscore because of the naming issue described above: whenever I use any gl_ built-in, nothing is displayed otherwise.
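The shader then looks roughly like this (the application fills "_vertexID" with 0, 1, 2, ... through an integer attribute, e.g. via glVertexAttribIPointer; that setup is not shown):

    #version 330

    // Workaround: the vertex index comes in as an integer attribute instead of
    // gl_VertexID. The leading underscore makes the name sort before "gl_".
    in int _vertexID;

    out vec2 texCoord;

    void main()
    {
        vec2 corner = vec2(float(_vertexID & 1), float(_vertexID >> 1));
        texCoord    = corner;
        gl_Position = vec4(corner * 2.0 - 1.0, 0.0, 1.0);
    }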

Somewhere else I have seen an ongoing debate about what gl_VertexID should yield when the call to glDrawArrays does not start at the 0th vertex but at a later one. In my case, where all vertex coordinates are inferred directly from gl_VertexID, it just doesn't make sense to let gl_VertexID start at zero when the draw call starts at a nonzero vertex; with a first vertex of 4 and a count of 3, I would expect gl_VertexID to yield 4, 5 and 6. The OpenGL specification states that the value is implied by the draw call. If the Ati driver really starts at zero, no matter the starting vertex specified in the draw call, then that is probably erroneous behavior which is not in accordance with the specification.

 

Best regards,

Shrinker
