Reduction is a difficult problem to solve with this approach, because reduction is really a multi-pass algorithm implemented by the runtime. The constants that you see are the ones the runtime needs to pass in order to run that kernel.
For example, if you scroll down the generated header file, you see the following line -
It means a constant is passed to the kernel. To really understand what values you need to pass, you need to understand the runtime.
I think if you want to experiment with this path, you should first try with non-reduction, non-scatter kernels, where the mapping of constants is straightforward and no extra constants appear in the generated kernel. Let me know if you have more questions.
Thanks for the reply.
I "think" I have no trouble with non-reduction kernels - I mean I've never tried, but I see what's going on.
What I need is a reduction kernel though...
Yes, I've seen this brook::CONST_REDUCTIONFACTOR, and it only left me confused - it's simply an integer-sized enum value, and as you say, the actual Brook runtime must be doing something with it.
I'll install Brook source and see if I can deduce something...
Is there a magic way for me to use Brook after all? The difficulty I have is D3D interoperability: the source data is in a D3D9 surface. Can I somehow mix CAL and Brook to work around this? CAL can give me the source stream, and Brook can handle the reduction.
I think the right approach for you would be to modify the Brook+ interface to accept a CALresource. Changing the interface to accept these CAL handles should be easier than understanding and reimplementing the various internal Brook+ virtualization algorithms.
As a side note, we have an RFE to support D3D interoperability at the Brook+ level, so you can be sure these features will be available at the Brook+ level in a future release.