I'm trying to write a BC1/DXT1 software decoder.
When comparing my results against what my Radeon graphics card produces, I'm getting minor off-by-one differences in the pixels that use the two interpolated colours.
I seem to be expanding the two 565 reference colours fine, but generating the two interpolants is error-prone, and I think it's something to do with rounding.
Sometimes my graphics card rounds the colour up, sometimes it rounds down. The result is always the same when I run it multiple times, but I can't figure out how the card decides whether to round up or down.
Can anyone tell me what my Radeon graphics card is doing for rounding on the two interpolant colours?
Any help would be appreciated!
Additionally, I'm testing on DX11, and I'm ignoring the DXT1a (1-bit alpha) case for the moment, until I can get this working correctly.