
Which platform id to pass for a context spanning multiple devices

Discussion created by alx on Apr 29, 2011
Latest reply on Apr 29, 2011 by alx
+ some compilation error/warning questions

The context creation interface has always felt a bit odd to me: the spec text on it is terse, and all the examples I've seen seem to trace back to a single source. I can get OpenCL programs working, but I have tried to create a context spanning devices from different platforms, and it failed no matter which platform id I passed (or whether I left the choice to the runtime). Is this possible at all, or is this why you can only pass one platform id? (I know I can create more than one context.) More generally, why can one override the platform here at all; in what case can't the runtime simply pick the platform from the first device passed?
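For concreteness, here is a stripped-down version of what I tried (error handling elided; platform0/platform1, devA/devB are placeholder ids, with devA and devB coming from the two different platforms):

```c
/* CL_CONTEXT_PLATFORM takes exactly one platform id in the
 * properties list, which is what makes me suspect a context
 * cannot span platforms at all. */
cl_context_properties props[] = {
    CL_CONTEXT_PLATFORM, (cl_context_properties)platform0,
    0
};
cl_device_id devs[] = { devA, devB };  /* devB belongs to platform1 */
cl_int err;
cl_context ctx = clCreateContext(props, 2, devs, NULL, NULL, &err);
/* err != CL_SUCCESS here, whichever platform id I put in props,
 * and also with props == NULL */
```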

I found that the AMD OpenCL compiler (APP 2.4) errors out on an unrecognized pragma (e.g. one enabling the Intel printf extension). Isn't it supposed to warn about unknown pragmas and ignore them?

Now I need to #ifdef on a compiler-identity macro. Which macro can I use to identify AMD's compiler? I couldn't find it in the FAQs, and failed to find it in the APP Programming Guide, Sec 2.4 Predefined Macros.
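This is the pattern I'm after; __AMD_COMPILER__ is a made-up name standing in for whatever the real predefined macro is:

```c
/* Guard the vendor pragma so other compilers never see it.
 * __AMD_COMPILER__ is hypothetical -- the real macro name is
 * exactly what I'm asking about. */
#ifdef __AMD_COMPILER__
  #pragma OPENCL EXTENSION cl_amd_printf : enable
#else
  /* e.g. the Intel variant, or no printf at all */
#endif
```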

Related: I tried to enable the AMD printf via the pragma in an included header instead of in the .cl file itself (which works). The header is clearly found (it #defines something I use in the .cl), but 'printf' is still unrecognized in the .cl. Is this intended?
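A minimal reproduction of what I mean (file names are just for illustration):

```c
/* util.h -- included from the kernel source */
#pragma OPENCL EXTENSION cl_amd_printf : enable  /* seems to be ignored here */
#define GREETING "hi\n"

/* main.cl */
#include "util.h"
__kernel void k(void)
{
    printf(GREETING);  /* GREETING expands fine, so the header is found,
                          but printf is rejected unless the pragma is
                          moved into this .cl itself */
}
```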

The __read_write and read_write access qualifiers (OpenCL 1.1 spec, Sec 6.1.9 Keywords) are rejected in kernels. Of course I can simply leave them out.
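In case it matters, this is the sort of declaration that gets rejected (sketch; the body is irrelevant):

```c
/* Rejected by APP 2.4 for me; dropping the qualifier,
 * or using read_only / write_only, compiles fine: */
__kernel void touch(read_write image2d_t img)
{
    /* ... */
}
```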

I suspect a missing 'const' qualifier in cl.hpp. In enqueueWriteImage(..., void* ptr, ...), the 'ptr' parameter is only read from, so I think it should be const, as it is in the Buffer equivalent and in the underlying plain C functions. There may be other missing consts too (or I may be wrong). Correcting it seems risky, though, because code written against the corrected cl.hpp may come to rely on it.

Finally, AMD's APP compiler (still 2.4) rejects a uint2-typed coordinate passed to write_imagef(). The expected type is indeed int2, but why can't uint2 be converted to int2 implicitly?
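If I read the spec right (OpenCL 1.1, Sec 6.2.1), implicit conversions between vector types are disallowed across the board, which would explain the rejection; the explicit convert_int2() form compiles for me:

```c
__kernel void fill(__write_only image2d_t img)
{
    uint2 ucoord = (uint2)(get_global_id(0), get_global_id(1));
    /* implicit uint2 -> int2 is disallowed for vector types;
       convert_int2() makes the conversion explicit */
    write_imagef(img, convert_int2(ucoord), (float4)(1.0f));
}
```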

Thanks for any replies.
