write_imageui() / write_imagei() Bug (??)

Discussion created by antzrhere on Oct 2, 2011
Latest reply on Oct 5, 2011 by antzrhere

I've been doing some GL/CL interop using a GL renderbuffer made into an OpenCL image2d_t object. The format is GL_RGBA8 (matching the OpenGL pixel format), but changing the format seems to make no difference.
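For context, the sharing setup is the usual clCreateFromGLRenderbuffer / acquire / release dance. Roughly like this (a minimal sketch with made-up names and no error checking, not my actual code; the cl_mem would normally be created once, not per frame):

    #include <CL/cl.h>
    #include <CL/cl_gl.h>

    /* Sketch: wrap a GL renderbuffer as a CL image and run one frame.
       ctx must have been created with GL-sharing properties, and GL must
       be finished with the renderbuffer (glFinish) before this is called. */
    static void run_frame(cl_context ctx, cl_command_queue queue,
                          cl_kernel kernel, cl_GLuint rbo,
                          size_t width, size_t height)
    {
        cl_int err;
        cl_mem ScreenBuffer =
            clCreateFromGLRenderbuffer(ctx, CL_MEM_WRITE_ONLY, rbo, &err);

        /* CL takes ownership of the shared object for the duration. */
        clEnqueueAcquireGLObjects(queue, 1, &ScreenBuffer, 0, NULL, NULL);

        size_t global[2] = { width, height };
        clSetKernelArg(kernel, 0, sizeof(cl_mem), &ScreenBuffer);
        clEnqueueNDRangeKernel(queue, kernel, 2, NULL, global, NULL, 0, NULL, NULL);

        /* Hand the buffer back to GL for display. */
        clEnqueueReleaseGLObjects(queue, 1, &ScreenBuffer, 0, NULL, NULL);
        clFinish(queue);
        clReleaseMemObject(ScreenBuffer);
    }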

Basically, if I use write_imageui() to write pixel values in the 0-255 range, I get black wherever I write. Here is the line of code (Voxel.Colour is a float3 with all components between 0 and 255):

 write_imageui( ScreenBuffer, PixelPos, (uint4)(convert_uint(Voxel.Colour.s0), convert_uint(Voxel.Colour.s1), convert_uint(Voxel.Colour.s2), 0) );
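(Just to rule out the per-component scalar conversions, the same write can be expressed with the built-in vector conversion; a sketch only, but I'd expect identical behaviour:)

    // Same write via the vector convert_ built-in; the _sat variant
    // clamps out-of-range values to 0-255 instead of wrapping.
    uint3 rgb = convert_uint3_sat(Voxel.Colour);
    write_imageui(ScreenBuffer, PixelPos, (uint4)(rgb, 0u));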

Previously I was writing to an OpenCL global memory array (without GL sharing) and blitting that to the screen, so I know the code itself works fine.

Initially I thought it was a bug in my code. However, if I use write_imagef() instead (and scale all colour components to the 0.0f-1.0f range), it works perfectly! Code:

 write_imagef( ScreenBuffer, PixelPos, ((float4)(Voxel.Colour.s0, Voxel.Colour.s1, Voxel.Colour.s2, 0.0f) * (1.0f / 255.0f)) );  // scale 0-255 down to 0.0-1.0

I'm happy to use this function (it's more readable), but I can't understand why the integer versions don't work.
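My best guess so far: the spec says write_imageui() is only defined when the image's channel data type is CL_UNSIGNED_INT8/16/32, and a shared GL_RGBA8 buffer presumably comes through as CL_UNORM_INT8, which only write_imagef() may write. Something like this host-side check (a sketch; "image" is whatever clCreateFromGLRenderbuffer returned) would confirm which format was actually assigned:

    #include <CL/cl.h>
    #include <stdio.h>

    /* Sketch: print the channel order / data type OpenCL assigned to the
       shared image. If the data type is CL_UNORM_INT8, the spec only
       defines write_imagef() for it, not write_imageui()/write_imagei(). */
    static void print_format(cl_mem image)
    {
        cl_image_format fmt;
        clGetImageInfo(image, CL_IMAGE_FORMAT, sizeof(fmt), &fmt, NULL);
        printf("order=0x%X type=0x%X (CL_UNORM_INT8=0x%X)\n",
               fmt.image_channel_order, fmt.image_channel_data_type,
               CL_UNORM_INT8);
    }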

Windows 7 64-bit, ATI Radeon HD 5870, Catalyst 11.8, AMD APP SDK 2.5.
