I bought a 6800 XT and have been benchmarking it. Under Windows it performs much worse than my previous cards; under Linux, however, it runs perfectly fine. I DDU'd my Nvidia drivers before installing it, installed the latest AMD drivers, chose the 'performance' preset in Adrenalin, and got a Superposition score of 9,321 (a 6600 XT scored 17,000 in the same system, for reference). After disabling all the Radeon/Adrenalin tools (overlay, clipping, etc.) it gets around 14,000. I've tried installing some older drivers and disabling Resizable BAR and Above 4G Decoding, all to no avail. The card is running with all 16 lanes at PCIe gen 3. It's not a power draw issue, CPU bottleneck, or thermal throttling, as none of those make sense given it works fine under Linux. It just doesn't like Windows. My previous GPU was a 3070 laptop chip on a Chinese frankenstein PCIe adapter, which performed a bit better than the 1080 Ti I used to have, and both are performing better than the 6800 XT does under Windows. The 6600 XT was a friend's card that I benchmarked in my system.
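For anyone who wants to double-check the link numbers themselves, this is roughly how I confirmed the x16 / gen 3 figure under Linux. The PCI address below is just an example; find yours with `lspci | grep -i vga`. The `LnkSta` sample line is illustrative, copied from typical `lspci -vv` output:

```shell
# Query the live link status via sysfs (replace 0000:0b:00.0 with your GPU's address):
#   cat /sys/bus/pci/devices/0000:0b:00.0/current_link_speed   # e.g. "8.0 GT/s PCIe" = gen 3
#   cat /sys/bus/pci/devices/0000:0b:00.0/current_link_width   # e.g. "16"
# Or parse it out of `sudo lspci -vv`; a sample LnkSta line looks like this:
sample='LnkSta: Speed 8GT/s (ok), Width x16 (ok)'
echo "$sample" | grep -o 'Speed [^,]*'    # 8 GT/s per lane = PCIe gen 3
echo "$sample" | grep -o 'Width x[0-9]*'  # all 16 lanes active
```

If the width drops to x8 or the speed to 2.5/5 GT/s under load, that would point at a slot or riser problem rather than a driver one.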
Here are some more benchmarks, all were on the same CPU and RAM configuration:
RDR2, 3440x1440, XB1X settings: Linux 112 fps avg | Windows 62 fps avg
Furmark (1080p / 1440p): Linux 256 / 163 | Windows 231 / 121
Cinebench GPU benchmark (Windows): 1286. A 2070 Super is benchmarked at 6306 for reference. Unfortunately it would not run under Wine.
Superposition 1080p Medium: 6800 XT Windows 14029 | 6800 XT Linux (native) 22110 | 6600 XT 17044 | 1080 Ti 19115 | 3070 mobile 19870
Every game I've played feels much worse. In Holdfast I'm getting 40-50 fps versus 140-160 in Linux. Deep Rock Galactic: 80-90 compared to 130-160 in Linux. Cyberpunk stops responding in Windows, though I'm not sure that's the card, as I haven't tested it on another GPU; it works fine under Proton. I obviously expected my i5-10400F to be a bottleneck when I bought the card, but the performance difference between OSes, and the card performing worse than objectively weaker cards, just doesn't make sense.