Overview: I modified one of my example programs to use a compute shader instead of a fragment shader and found a visual error which (according to the OpenGL standard) is likely a driver bug. The card is a Radeon RX Vega 64; there is no issue on an NVIDIA 1050 Ti (I asked for a better card, but the manager was "naah...").
System info (incl. dxdiag):
https://www.dropbox.com/s/fdx4s59kgkzqvya/sysinfo_23_06.zip?dl=0
(sidenote: Windows keeps replacing my graphics driver, so I can't really verify whether this was fixed in a later version)
Repro code:
https://www.dropbox.com/s/5rdh3rlf8xrhjpm/FOR_AMD_27.zip?dl=0
Instructions:
- upgrade solution if needed, build the sample
- when it's running (run around, it's fun), press key "4" (to switch to the compute shader) - the screen will flash
- go to main.cpp line 253 and comment it out
- now if you do the same thing as before, the artifact is persistent
Findings:
So of course my first thought was "oh well, I forgot a memory barrier". Well... no. Because as I read through the standard, reads are always coherent. Nevertheless, I tried putting glMemoryBarrier in every place possible; it didn't fix the problem.
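For reference, this is roughly where I tried placing the barriers (a minimal sketch only; computeProgram, normalTex, groupsX/groupsY are placeholder names, not the identifiers from the sample):

    glUseProgram(computeProgram);                      // placeholder compute program handle
    glBindImageTexture(0, normalTex, 0, GL_FALSE, 0,   // placeholder RGBA16F normal target
                       GL_WRITE_ONLY, GL_RGBA16F);
    glDispatchCompute(groupsX, groupsY, 1);
    // tried the broad hammer as well as the individual bits - made no difference:
    glMemoryBarrier(GL_SHADER_IMAGE_ACCESS_BARRIER_BIT |
                    GL_TEXTURE_FETCH_BARRIER_BIT |
                    GL_FRAMEBUFFER_BARRIER_BIT);
    // ...the later pass that samples normalTex runs after this point...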
Screenshot:
caused by RGBA16F normal vectors
Notes:
As funny as it sounds, it seems to happen with the RGBA16F and RGBA32F (less noticeable, but present) formats. Calling glClearBuffer apparently solves the issue retroactively (that is, it flushes the tile/whatever cache).
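For completeness, the workaround looks something like this (again just a sketch with placeholder names; gbufferFBO and normalAttachmentIndex are not the actual identifiers in the sample):

    const GLfloat zero[4] = { 0.0f, 0.0f, 0.0f, 0.0f };
    glBindFramebuffer(GL_FRAMEBUFFER, gbufferFBO);           // placeholder FBO holding the RGBA16F normal attachment
    glClearBufferfv(GL_COLOR, normalAttachmentIndex, zero);  // clearing the attachment makes the artifact disappear,
                                                             // presumably by flushing the stale cached tiles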