[OpenGL] Artifacts when using an attribute to index into a uniform array

I am working on setting up a batch renderer for a new project, with the goal of batching N objects with up to 32 textures into one draw call. Currently it only uses two textures, both of which are 8x8 32-bit PNG files consisting of a single solid color (either black or white). The hardware I have personally tested on is an R9 280X; several friends have also tested on other AMD hardware with similar results.

In the frag shader I am trying to use a flat int attribute to index into a uniform sampler2D array so that I can draw the correct texture on each object. I have tested the values of both the texture ID and the UV coordinates and am confident they are correct; however, calling texture() with them results in severe artifacting, which can be seen here. The same code on Intel/Nvidia hardware produces the intended output, which can be seen here. I have been trying to figure out the issue for quite some time and have asked several friends who know OpenGL much better than I do. After a couple of days without progress, one of them suggested I post on the AMD forums. If anyone is interested in looking at the source code to try to track down the issue, it is available here.
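For reference, the indexing pattern described above looks roughly like the fragment shader below. This is a minimal sketch, not the actual code from the linked repository; the identifiers (u_Textures, v_TexIndex, v_TexCoord) are assumptions for illustration.

```glsl
#version 330 core

in vec2 v_TexCoord;      // interpolated UV coordinates
flat in int v_TexIndex;  // per-vertex texture slot, not interpolated

uniform sampler2D u_Textures[32];  // one sampler per batched texture unit

out vec4 o_Color;

void main()
{
    // Note: GLSL 3.30 requires sampler array indices to be constant
    // expressions, and GLSL 4.00+ relaxes this only to "dynamically
    // uniform" expressions. A flat attribute can vary between primitives
    // within a single draw call, so this access is not guaranteed to be
    // well-defined on all hardware.
    o_Color = texture(u_Textures[v_TexIndex], v_TexCoord);
}
```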

Any help figuring out the cause of the issue or a solution would be greatly appreciated!

Edit: the source code URL now points to a specific commit that should reproduce the bug.

2 Replies

I've added you to the developer forums white list and moved this message to the OpenGL forum.

Thank you!