
Texture reads return garbage for CLImage2D [with minimal example]

Question asked by thegregg on Nov 12, 2012
Latest reply on Apr 7, 2013 by hxmingzzz

Hello everyone,


First, I should say that this is a cross-post from

Since I didn't get an answer there after two weeks, I figured I could post here as well.


Basically, I'm having trouble with texture reads (read_imagef) when the texture size is 960x540 pixels, the channel order is CL_RGBA, and the channel data type is CL_UNORM_INT8 or CL_UNSIGNED_INT8. I get correct results for any other texture size. Also note that the problem only appears on ATi cards on Windows; any other configuration works (e.g. ATi on Linux, nVidia on Windows/Linux).


I made a small example that reproduces the problem on all ATi cards I could test with: FirePro V7900, HD 6900, HD 5870. You can get the source here:
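For readers who can't fetch the linked source, a read of the kind described would look roughly like this (a hypothetical sketch, not the poster's actual repro; the kernel and argument names are made up):

```c
// OpenCL C kernel: sample a CL_RGBA / CL_UNORM_INT8 image2d_t and
// copy it into a float4 buffer. With that channel type, read_imagef
// returns each channel normalized to [0, 1].
__constant sampler_t smp = CLK_NORMALIZED_COORDS_FALSE |
                           CLK_ADDRESS_CLAMP_TO_EDGE |
                           CLK_FILTER_NEAREST;

__kernel void copy_image(__read_only image2d_t src,
                         __global float4 *dst,
                         int width)
{
    int x = get_global_id(0);
    int y = get_global_id(1);
    float4 px = read_imagef(src, smp, (int2)(x, y));
    dst[y * width + x] = px;
}
```

Launched over a 960x540 global range, a kernel of this shape should simply round-trip the input pixels; the bug described above would show up as garbage in dst for that one size.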


Can anyone reproduce this problem?

Could this be a driver bug, and if so, whom should I contact?


Let me know if you need more specific information.


Kind regards,