Unfortunately, I do not have access to an ATI GPU, but many users of my OpenGL application have reported a bug that appears to have been introduced around the time of the 11.10 driver update a few months ago. The symptom is semi-randomly discarded fragments. My fragment shader is reasonably complex and attempts to emulate the behavior of a mid-1990s Real3D GPU. It distinguishes between three types of texture formats (a rough decode sketch follows this list):
1. T1RGB5 "contour" textures. The T bit indicates transparency: when it is set, the fragment must be discarded and must NOT modify the Z buffer. I encode T in the alpha channel: 0.0 is transparent, 1.0 is opaque. Contour processing can be enabled/disabled on a per-polygon basis by the Real3D hardware.
2. RGBA4 textures. This is what you would expect, except that when A is near 0 (fully transparent), the entire fragment is discarded and the depth buffer is not modified.
3. Fully opaque textures. No contour/alpha processing.
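For reference, the decode itself happens on the CPU when the texture sheet is built, but this is roughly what the T1RGB5 unpacking amounts to. The bit layout below (T in bit 15, then 5 bits each of R, G, B) and the GLSL 1.30-style integer code are assumptions purely for illustration; the only point that matters is that the T bit becomes an alpha of exactly 0.0 or 1.0:

// Hypothetical T1RGB5 texel decode, for illustration only (assumed bit
// layout: T in bit 15, then 5 bits each of R, G, B). T set = transparent.
vec4 decodeT1RGB5(uint texel)
{
    float t = float((texel >> 15u) & 1u);
    float r = float((texel >> 10u) & 0x1Fu) / 31.0;
    float g = float((texel >>  5u) & 0x1Fu) / 31.0;
    float b = float( texel         & 0x1Fu) / 31.0;
    return vec4(r, g, b, 1.0 - t);    // alpha ends up 0.0 (transparent) or 1.0 (opaque)
}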
What I do is pass two vertex attributes down to the fragment shader (a small pass-through sketch follows this list):
1. fsTexFormat: if > 0, indicates a T1RGB5 texture.
2. fsTexParams.y: if > 0, indicates that fragments with an alpha near 0 should be discarded entirely (applies to both T1RGB5 and RGBA4).
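These values are simply forwarded from the vertex shader. The attribute names and types below are assumptions, not my actual vertex shader:

// Vertex shader pass-through sketch (attribute names/types assumed)
attribute float texFormat;    // per-vertex texture format flag
attribute vec4  texParams;    // per-vertex params; .y enables contour/discard

varying float fsTexFormat;
varying vec4  fsTexParams;

void main(void)
{
    fsTexFormat = texFormat;
    fsTexParams = texParams;
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}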
I encode the T (transparency) bit as an alpha value (either 1.0 or 0.0). If fsTexParams.y indicates that transparent fragments must be discarded, I do so. If the texture is a T1RGB5 type and the fragment has not been discarded, I force its alpha value to 1.0 (opaque), because when processing of this bit is disabled, the T bit is ignored and is often set randomly in the texture data.
I've attached the fragment shader.
Here are examples of the problem (wide images -- click them to view them in their entirety):
The left image demonstrates the problem. Look at the clouds and the castle ramparts. On the right, I forced alpha = 1.0 for all fragments, which breaks RGBA4 decals on the vehicle but eliminates the other problem.
The same user also submitted this image:
Notice that the corruption in the alpha channel on the right image takes the shape of another texture in the game. Please note that I am using only a single texture map: all textures are decoded into a giant 2048x2048 texture sheet, so this is not a CPU-side problem with texture decoding.
Here is a snippet from the fragment shader:
/*
 * The transparency bit determines whether to discard pixels (if set).
 * What is unknown is how this bit behaves when interpolated. OpenGL
 * processes it as an alpha value, so it might conceivably be blended
 * with neighbors. Here, an arbitrary threshold is chosen.
 *
 * To-do: blending could probably be enabled and this would work even
 * better with a hard threshold.
 *
 * Contour processing also seems to be enabled for RGBA4 textures.
 * When the alpha value is 0.0 (or close), pixels are discarded entirely.
 */
if (fsTexParams.y > 0.5)        // contour processing enabled
{
    if (fragColor.a < 0.01)     // discard anything with alpha == 0
        discard;
}

// If contour texture and not discarded, force alpha to 1.0 because it will
// later be modified by polygon translucency
if (fsTexFormat < 0.5)          // contour (T1RGB5) texture map
    fragColor.a = 1.0;
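For context, here is roughly how that snippet sits in the shader; apart from fsTexFormat, fsTexParams, and the logic above, the names and structure are simplified placeholders (the attached shader is the real thing):

// Simplified surrounding fragment shader (sampler and texcoord names are
// placeholders, not my actual code)
uniform sampler2D textureMap;
varying vec2  fsTexCoord;
varying float fsTexFormat;
varying vec4  fsTexParams;

void main(void)
{
    vec4 fragColor = texture2D(textureMap, fsTexCoord);

    // ... contour/alpha handling from the snippet above goes here ...

    gl_FragColor = fragColor;
}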
The top right image ('Scud Race') is from a version of this fragment shader with all of these lines removed except for: fragColor.a = 1.0;
This disables all transparencies, of course.
I have no idea where the problem might be. As I said, this does not happen with older drivers and it does not occur on Nvidia hardware. At least one user was able to fix it by completely wiping the drivers and doing a fresh install; other users have not been able to reproduce that fix, but that may be because they are not completely removing the old drivers (they are somewhat less computer-savvy and I cannot verify that they are doing it correctly).