** EDIT **
This seems messed up: I see in the 3.3 spec that GL_READ_FRAMEBUFFER must be bound to 0 for this to work! Not sure why; maybe NVIDIA didn't believe it either and let it work anyway. So I did this (bound the read framebuffer to 0), blitted my depth into buffer 0, and read it back from there instead of from my resolve FBO... and all is good.
Per the spec, the binding below is required for this to work... otherwise you get an error:
glBindFramebuffer( GL_READ_FRAMEBUFFER, 0 );
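A minimal sketch of that workaround, assuming a multisampled FBO named msaa_fbo_ plus width/height/x/y variables from my setup (and that the default framebuffer actually has a depth buffer):
glBindFramebuffer( GL_READ_FRAMEBUFFER, msaa_fbo_ );          // multisampled source
glBindFramebuffer( GL_DRAW_FRAMEBUFFER, 0 );                  // resolve straight into the default framebuffer
glBlitFramebuffer( 0, 0, width, height, 0, 0, width, height,
                   GL_DEPTH_BUFFER_BIT, GL_NEAREST );         // depth blits must use GL_NEAREST
glBindFramebuffer( GL_READ_FRAMEBUFFER, 0 );                  // now read back from buffer 0
glReadPixels( x, y, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, 0 );  // packs into the currently bound GL_PIXEL_PACK_BUFFER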
----------End of Edit------------
Here is another issue on ATI hardware vs. NVIDIA hardware, and I'm pretty sure everything here is according to the OpenGL specification. Since the depth buffer is multisampled, glReadPixels needs the depth resolved into a non-multisampled buffer first. I have a gut feeling this would still be an issue (an OpenGL error) even without the resolve buffer, as I get PBO/compressed-texture issues as well (TBD). I'll post that code later, but first I want to re-read the OpenGL spec to make sure I'm not in violation regarding PBOs and compressed textures.
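For reference, the resolve step I'm describing is just a depth blit into the non-multisampled FBO used further below; roughly (a sketch, with msaa_fbo_, width and height standing in for my real names):
glBindFramebuffer( GL_READ_FRAMEBUFFER, msaa_fbo_ );
glBindFramebuffer( GL_DRAW_FRAMEBUFFER, resolve_fbos_[ current_pass_ ] );
glBlitFramebuffer( 0, 0, width, height, 0, 0, width, height,
                   GL_DEPTH_BUFFER_BIT, GL_NEAREST );         // GL_NEAREST is required for depth blits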
Just an FYI about the readpixels example below:
const char* err = (const char*)gluErrorString(glError);
returns a null pointer!.. probably because it's error number 1286...
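Error 1286 is 0x0506, i.e. GL_INVALID_FRAMEBUFFER_OPERATION, which is newer than GLU, so gluErrorString has no string for it. A safer check looks something like this (just a sketch, needs <stdio.h>):
GLenum glError = glGetError();
if( glError != GL_NO_ERROR )
{
    const char* err = (const char*)gluErrorString( glError );
    printf( "GL error 0x%04X: %s\n", glError, err ? err : "(unknown to GLU)" );
}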
The error happens just after:
glReadPixels( x, y, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, 0 );
.....and here is the code
glBindFramebuffer( GL_READ_FRAMEBUFFER, resolve_fbos_[ current_pass_ ] );
// lazily create a one-float PBO for this key
if( !key_to_pbo[key] )
{
    glGenBuffers( 1, &key_to_pbo[key] );
    glNamedBufferDataEXT( key_to_pbo[key], sizeof(float), 0, GL_STREAM_READ );
}
// pack the depth sample into the PBO instead of client memory
glBindBuffer( GL_PIXEL_PACK_BUFFER, key_to_pbo[key] );
glReadPixels( x, y, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, 0 );   // <-- errors with 1286 here on ATI
glBindBuffer( GL_PIXEL_PACK_BUFFER, 0 );
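For completeness, pulling the value back out of the PBO later looks roughly like this (a sketch; I wait until a later point in the frame, or the next frame, so the read doesn't stall the pipeline):
float depth = 0.0f;
glBindBuffer( GL_PIXEL_PACK_BUFFER, key_to_pbo[key] );
glGetBufferSubData( GL_PIXEL_PACK_BUFFER, 0, sizeof(float), &depth );
glBindBuffer( GL_PIXEL_PACK_BUFFER, 0 );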