I have a fairly simple kernel that misbehaves when casting uint to ulong inside a printf, using the AMD APP SDK 2.8.1 on an x86_64 CPU:
__kernel void opencl_hello_world(uint n)
{
    size_t i = get_global_id(0);
    size_t j = get_global_id(1);

    if (i < n)
        printf("OpenCL Hello World: (%lu, %lu) of (%lu, %lu)\n",
               (ulong) (i + 1), (ulong) (j + 1), (ulong) n, (ulong) 1);
}
If I launch the kernel with a one-dimensional NDRange with offset = 0, global size = 16, and local size = 16, I get the following output:
OpenCL Hello World: (1, 0) of (1, 0)
OpenCL Hello World: (2, 0) of (1, 0)
OpenCL Hello World: (3, 0) of (1, 0)
...
OpenCL Hello World: (16, 0) of (1, 0)
It seems that %lu (64-bit) within the printf is behaving as %u (32-bit): each 64-bit argument appears to be consumed as two 32-bit words, so the first two conversions print the low and high halves of (ulong)(i + 1), producing "(1, 0) of (1, 0)" instead of the expected "(1, 1) of (16, 1)".