According to the OpenCL documentation, cl_amd_printf is supported on Radeon 5000-series and newer cards. When I run my application on my 5870, nothing prints; when I run it on my CPU, it does. That's roughly what I would expect, since the GPU has no file descriptor it can write to the terminal with, so I assume printf() is a no-op on the GPU.
However, the programming guide says printf isn't supported on the Radeon 4000-series cards. Why would that be, if printf doesn't do anything? In theory you could implement printf on a GPU by having the kernel write the string into a host-visible buffer and letting the CPU print it afterwards, but I believe ATI hardware has supported writes to host memory since the R600 family, which wouldn't explain why it can't be done on R700.
If it helps: I'm using a Radeon 5870 Eyefinity with SDK 2.4, and I can't get even the simplest format strings to print, much less more interesting ones.
The following doesn't print anything (though it does increase the runtime) on the Radeon 5870:
printf("%.*s\n", peptideLength, &curPeptideString[myPeptide * 64 + 1]);
The same call works correctly on a Nehalem quad-core.
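For reference, this is roughly the smallest test case I would expect to work, assuming the extension has to be enabled explicitly with a pragma in the kernel source (the kernel name and format string here are just placeholders of mine):

```c
// Enable the AMD printf extension in OpenCL C before using printf.
#pragma OPENCL EXTENSION cl_amd_printf : enable

__kernel void hello(void)
{
    printf("gid %u\n", (uint)get_global_id(0));
}
```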
If printf worked on the GPU, debuggability would improve by orders of magnitude, since you could actually see what's going on inside the damn things.