I tried to render the depth information of multiple cameras into a layered framebuffer that has a texture array attached to it. Each texture of the array should receive depth information of a different camera (in a single pass). But it seems like only the first texture of the array receives the depth information of all cameras. Is this the right behavior? Is it possible to render into each layer of a layered depth buffer? If yes: Is there a sample that demonstrates how to render into a layered depth buffer?
Here is a simple sample program that should fill each layer of the layered depth buffer with a depth value of 0.0. But instead it only fills the first depth texture with depth information:
Code to create the depth texture array:
glGenTextures(1, &texture);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D_ARRAY, texture);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage3D(GL_TEXTURE_2D_ARRAY, 0, GL_DEPTH_COMPONENT32, 512, 512, 4, 0, GL_DEPTH_COMPONENT, GL_FLOAT, 0);
Code to create the framebuffer:
glGenFramebuffers(1, &framebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
glFramebufferTexture(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, texture, 0);
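If the attachment were not layered, only layer 0 would ever be written, so it is worth querying the layered state of the depth attachment as a sanity check. A small diagnostic sketch (assumes the framebuffer created above is still bound):

```c
/* Check that the depth attachment is treated as layered.
   glFramebufferTexture with a GL_TEXTURE_2D_ARRAY texture (no explicit
   layer) should yield GL_TRUE here (available since GL 3.2). */
GLint layered = GL_FALSE;
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                                      GL_FRAMEBUFFER_ATTACHMENT_LAYERED,
                                      &layered);
```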
Code for drawing:
glEnable(GL_DEPTH_TEST);
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
glDrawBuffer(GL_NONE);
glViewport(0, 0, 512, 512);
glClear(GL_DEPTH_BUFFER_BIT);
glUseProgram(program);
// draw rectangle
glDrawElementsBaseVertex(GL_TRIANGLES, 6, GL_UNSIGNED_SHORT, 0, 0);
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glDrawBuffer(GL_BACK_LEFT);
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glDisable(GL_DEPTH_TEST);
Vertex Shader:
#version 330
in vec4 position;
void main(void){
gl_Position = position;
}
Geometry Shader:
#version 330
layout(triangles) in;
layout(triangle_strip, max_vertices = 12) out;
void main(void){
    int layer, i;
    for(layer = 0; layer < 4; layer++){
        gl_Layer = layer;
        for(i = 0; i < gl_in.length(); i++){
            gl_Position = gl_in[i].gl_Position;
            EmitVertex();
        }
        EndPrimitive();
    }
}
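As a cross-check, since the HD4600 also exposes OpenGL 4.0+, the same per-layer replication can be written with geometry-shader instancing instead of an explicit loop. This is only an alternative sketch (it assumes 4 layers and a GL 4.0 context), not the code above:

```glsl
#version 400
layout(triangles, invocations = 4) in;        // one GS invocation per layer
layout(triangle_strip, max_vertices = 3) out;
void main(void){
    for(int i = 0; i < gl_in.length(); i++){
        gl_Layer = gl_InvocationID;           // set before each EmitVertex()
        gl_Position = gl_in[i].gl_Position;
        EmitVertex();
    }
    EndPrimitive();
}
```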
No fragment shader is used (depth-only pass).
Vertex and index data:
const float verticesRect[4][2][4] = {
// position, texture coordinates
{{-1.f,-1.f, 0.f, 1.f}, {0.f, 0.f, 0.f, 0.f}},
{{ 1.f,-1.f, 0.f, 1.f}, {1.f, 0.f, 0.f, 0.f}},
{{ 1.f, 1.f, 0.f, 1.f}, {1.f, 1.f, 0.f, 0.f}},
{{-1.f, 1.f, 0.f, 1.f}, {0.f, 1.f, 0.f, 0.f}}
};
const GLushort indicesRect[] = {
0,1,2,
2,3,0
};
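To confirm which layers actually received depth values, all layers can be read back after the draw. A diagnostic sketch (assumes the 512x512x4 depth texture created above):

```c
/* Read back every layer of the depth array and print one texel per
   layer; an unwritten layer keeps its cleared depth value of 1.0. */
float *depth = malloc(512 * 512 * 4 * sizeof(float));
glBindTexture(GL_TEXTURE_2D_ARRAY, texture);
glGetTexImage(GL_TEXTURE_2D_ARRAY, 0, GL_DEPTH_COMPONENT, GL_FLOAT, depth);
for (int layer = 0; layer < 4; layer++)
    printf("layer %d: depth[0] = %f\n", layer, depth[layer * 512 * 512]);
free(depth);
```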
glCheckFramebufferStatus returns GL_FRAMEBUFFER_COMPLETE.
There are no errors when linking the shaders.
OpenGL 3.3
display driver version 10.9
HD4600
I hope someone can help me. Thanks.
The code seems correct.
It seems that layered depth rendering has an issue in our latest driver, while layered color rendering works correctly. We will fix it as soon as possible.
Pierre B.
Thanks for the reply.
I hope it can be fixed soon.
Is there any news on this bug? I have tried to get this to work on recent drivers, though to no avail.
The code is correct, but it still does not work; I tested it with driver 11.6. The bug makes it impossible to render depth into a GL_TEXTURE_2D_ARRAY.
Is there or will there be any news on this subject?