I'm trying to write a BC1/DXT1 software decoder.
When I compare my results to what my Radeon graphics card produces, I'm getting very minor off-by-one errors in the pixels that use the two interpolated colours.
I seem to be expanding the two 565 reference colours correctly, but generating the two interpolants is error-prone, and I suspect it's a rounding issue.
Sometimes my graphics card rounds the colour up, sometimes it rounds down. The result is the same every time I run it, but I can't figure out how the card decides which way to round.
Can anyone tell me what my radeon graphics card is doing for rounding on the two interpolant colours?
Any help would be appreciated!
Additionally, I'm testing on DX11, and I'm ignoring the DXT1a case for the moment, until I can get this to work correctly.
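For reference, here's roughly what I'm doing. The 565 expansion replicates the top bits into the low bits, and the interpolant uses an integer 2/3 : 1/3 blend; the `+ 1` rounding bias shown here is just one guess at what the hardware might do, which is exactly the part I'm unsure about:

```c
#include <stdint.h>

/* Expand a 5- or 6-bit channel to 8 bits by replicating the top bits
   into the vacated low bits - the usual way BC1 endpoints are widened. */
static uint8_t expand5(uint8_t c) { return (uint8_t)((c << 3) | (c >> 2)); }
static uint8_t expand6(uint8_t c) { return (uint8_t)((c << 2) | (c >> 4)); }

/* Integer approximation of (2*a + b) / 3.  Adding 1 before dividing
   rounds to nearest for divisor 3; whether the GPU does this, truncates,
   or uses some other approximation is the open question. */
static uint8_t lerp_third(uint8_t a, uint8_t b)
{
    return (uint8_t)((2u * a + b + 1u) / 3u);
}
```

Swapping the `a` and `b` arguments gives the second interpolant (1/3 : 2/3).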
The algorithm for decoding BCn textures is required to be bit-accurate to the spec. I don't have the exact rounding methods at hand right now, but I'm very confident that Radeon GPUs get this right; otherwise we'd fail all kinds of conformance and compatibility tests. The exact rounding rules are spelled out quite clearly in the specification. It's hard to say exactly where you're dropping a bit without looking closely at the code, so I'm afraid you have some debugging to do.
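One practical way to narrow this down is to decode the colour palette with a pluggable interpolator and diff each variant against a GPU readback. This sketch covers only the opaque (c0 > c1) four-colour case, matching the question's decision to ignore DXT1a for now; the two interpolators are illustrative guesses, not a statement of what any particular GPU implements:

```c
#include <stdint.h>

typedef uint8_t (*lerp_fn)(uint8_t, uint8_t);

/* Widen 565 channels by bit replication, as in the usual BC1 scheme. */
static uint8_t expand5(uint8_t c) { return (uint8_t)((c << 3) | (c >> 2)); }
static uint8_t expand6(uint8_t c) { return (uint8_t)((c << 2) | (c >> 4)); }

/* Two plausible rounding conventions for the 2/3 : 1/3 blend:
   truncation vs. round-to-nearest.  Candidates to test, nothing more. */
static uint8_t lerp_floor(uint8_t a, uint8_t b)   { return (uint8_t)((2u*a + b) / 3u); }
static uint8_t lerp_nearest(uint8_t a, uint8_t b) { return (uint8_t)((2u*a + b + 1u) / 3u); }

/* Build the 4-entry RGB palette for the opaque (c0 > c1) case. */
static void bc1_palette(uint16_t c0, uint16_t c1, lerp_fn lerp, uint8_t pal[4][3])
{
    const uint16_t c[2] = { c0, c1 };
    uint8_t e[2][3];
    for (int i = 0; i < 2; i++) {
        e[i][0] = expand5((uint8_t)(c[i] >> 11));          /* red   */
        e[i][1] = expand6((uint8_t)((c[i] >> 5) & 0x3F));  /* green */
        e[i][2] = expand5((uint8_t)(c[i] & 0x1F));         /* blue  */
    }
    for (int ch = 0; ch < 3; ch++) {
        pal[0][ch] = e[0][ch];
        pal[1][ch] = e[1][ch];
        pal[2][ch] = lerp(e[0][ch], e[1][ch]);  /* 2/3 c0 + 1/3 c1 */
        pal[3][ch] = lerp(e[1][ch], e[0][ch]);  /* 1/3 c0 + 2/3 c1 */
    }
}
```

Running both variants over the failing blocks and comparing against the GPU output should show immediately which convention (if either) the hardware follows, and on which endpoint pairs they diverge.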