[10:25:29]: Detected Linux x86_64
[10:25:29]: Detected Kernel 2.6.38-8-generic
[10:25:29]: Detected OpenGL vendor ATI Technologies Inc.
[10:25:29]: Detected OpenGL version 3.2.10665 Core Profile Context
[10:25:29]: Detected OpenGL renderer ATI Radeon HD 4800 Series
[10:25:29]: Detected GLSL version 3.30
My project is completely shader-centric; at present I'm using only the following shaders, non-indexed geometry, and the 3.2 compatibility/core profiles:
#version 150
// predefined inputs from application
uniform mat4 p_ModelViewMatrix;
uniform mat4 p_ProjectionMatrix;
in vec3 p_Vertex;
// outs
out vec3 p_Vertex_Out;
void main()
{
    p_Vertex_Out = p_Vertex;
    gl_Position = p_ProjectionMatrix * p_ModelViewMatrix * vec4(p_Vertex, 1.0);
}
NOTE: The predefined uniforms are updated correctly, as confirmed by gDEBugger. The attribute p_Vertex is bound to index 0. p_Vertex_Out simply passes the object-space position of the vertex through to the geometry stage.
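For reference, binding an attribute to a fixed index has to happen before the program is linked. A minimal sketch, assuming a hypothetical program object `prog` with the shaders already attached:

```c
/* Sketch: bind p_Vertex to attribute index 0 before linking.
   `prog` is a placeholder for a valid program object. */
glBindAttribLocation(prog, 0, "p_Vertex");
glLinkProgram(prog);
```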
#version 150
layout(triangles) in;
layout(triangle_strip, max_vertices = 3) out;
in vec3 p_Vertex_Out[];
out vec3 color;
void main()
{
    for (int i = 0; i < gl_in.length(); i++)
    {
        // assign each vertex its object-space position as color and interpolate
        color = p_Vertex_Out[i];
        gl_Position = gl_in[i].gl_Position;
        EmitVertex();
    }
    EndPrimitive();
}
#version 150
in vec3 color;
layout(location = 0) out vec4 p_FragColor;
void main()
{
    p_FragColor = vec4(color, 1.0);
}
The program compiles and validates successfully, and gDEBugger reports that no deprecated functionality is used. The compatibility profile gives me the expected output, but with the core profile no fragments seem to be generated. The application performs flawlessly on the latest NVIDIA Linux and Windows drivers; on the current Linux (11.4 pre in 11.04 Natty) and Windows (11.3) Catalyst drivers it does not. If this part of AMD's GL implementation is platform-independent, then I suspect a driver bug.
Anyone got something on this?
Thomas
Maybe you are using some deprecated functions in your application, even though the shaders themselves are clean. If it can't be resolved, could you please send your program to me at frank.li@amd.com? Thanks.
Do you enable the vertex attribute with glEnableVertexAttribArray(0)/(1) in your application code?
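For context, a typical attribute setup on the application side looks roughly like the sketch below. `vbo` and `vertices` are placeholders, not names from the original poster's code:

```c
/* Sketch: upload vertex data and enable attribute 0 (p_Vertex).
   `vbo` is a hypothetical buffer name obtained from glGenBuffers,
   `vertices` a hypothetical array of packed vec3 positions. */
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (const void *)0);
```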
There is also a possible typo in the geometry shader as originally posted: the array index was missing.
color = p_Vertex_Out; should be color = p_Vertex_Out[i];
It's hard to tell what's wrong from a code fragment alone. Could you please send your program to me at frank.li@amd.com? Thanks.
Hello everyone.
Do you need to have a Vertex Array Object with core contexts? I was having the same kind of issue; then I looked at the GL errors with GL_AMD_debug_output and saw:
glVertexAttribPointer in a Core context called without a bound Vertex Array Object [which is now required for Core Contexts]. (GL_INVALID_OPERATION)
glDrawArrays in a Core context called without a bound Vertex Array Object [which is now required for Core Contexts]. (GL_INVALID_OPERATION)
In my simple example I just added
GLuint vao;
glGenVertexArrays(1, &vao);
glBindVertexArray(vao);
and it started to work.
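Putting the pieces together, the ordering that satisfies a core context looks roughly like this. This is a sketch under the assumption of a single non-indexed triangle in a placeholder `vertices` array, not the original poster's actual code:

```c
/* Sketch: in a core profile, a VAO must be bound before any
   glVertexAttribPointer or draw call, otherwise both raise
   GL_INVALID_OPERATION and nothing is drawn. */
GLuint vao, vbo;
glGenVertexArrays(1, &vao);
glBindVertexArray(vao);                 /* must come first in core profile */
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
glEnableVertexAttribArray(0);           /* attribute 0 = p_Vertex */
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (const void *)0);
glDrawArrays(GL_TRIANGLES, 0, 3);
```

This also explains the symptom in the opening post: a compatibility context supplies a default VAO, so the same code draws there but silently fails under core.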