I've been racking my brain all day trying to figure out why I'm not getting the values I expect in a DirectX Effect I'm building in RenderMonkey.
I've tracked it down to an int2 parameter I'm passing to my pixel shader, CellFrequency.
Currently it's set to (3, 3) and clamped to the range (3, 128).
The value I'm receiving in the shader is something on the order of 1077936064, as tested by this line:
return (CellFrequency.xyxy == 0x403FFFC0); // returns all-white (0x403FFFC0 == 1077936064)
The same thing happens with an int3, but the value comes through correctly as int, int4, or any float type.
Any insight into what's going on would be appreciated. I can work around it with another data type in the meantime, but since this variable is going to be artist-exposed, I'd like its type to match its use as closely as possible to avoid confusion.
For reference, I'm running RenderMonkey Version 1.82 (build 322) on Windows Vista 32-bit. My card is an NVIDIA GeForce GTX 260, and my CPU is an Intel Core2 Quad Q9550. The error occurs whether I compile the effect for shader model 2 or 3, including software versions.
Thank you for reading!