Kinda click-baity, but hear me out. I bought a PowerColor Red Devil 6800 XT a year ago. It arrived DOA. After I returned it, I got the ASUS TUF model. I tested the card and it worked fine. Some time later I noticed that ray tracing was crashing. Only ray tracing, nothing else. I researched the topic and found out it wasn't unheard of; some PS5 systems, which also use the RDNA 2 architecture, seemed to have the same issue.
So, after some tinkering with my system RAM, which in the end didn't do much, I got a better power supply (from a BeQuiet 700 W Bronze to a Corsair 850 W Gold) and the crashes went away. So, of course, I thought that was the solution. But... a few months later ray tracing started to crash again. I tested in Metro Exodus Enhanced, Control, Cyberpunk and Shadow of the Tomb Raider. SOTTR was the outlier here: it wouldn't even run without ray tracing, crashing with a DX DEVICE_REMOVED/HUNG error. Every other game worked fine without RT.
This February I had enough of this uncertainty and decided to troubleshoot those crashes. I used the AMD cleanup tool to remove my drivers and reinstalled them. After some testing... the card just died on a reboot, after it had started to crash even when I was not using ray tracing.
So right now I'm waiting for ASUS to send me a replacement card. Funny thing: up until the very end the card always worked fine without RT, except in Frostbite games (Dragon Age Inquisition and Mass Effect Andromeda) and Shadow of the Tomb Raider. A lot of people have issues with those games too, so I wasn't attributing that to the card; maybe it wasn't the card at all and is driver related. On the GTX 1060 that I'm using at the moment, those games work without any problem, btw.
TLDR: My 6800 XT apparently got more unstable over time when I tested with RT, became stable again on a better power supply, and eventually died. Until the end I could play most games without RT no problem. Could it actually be the ray tracing causing this continual degradation?
Yes, the harder you work your system over time, the more likely it is to fail. Ray tracing requires a lot more computation than normal rendering, so your GPU is working much harder when you play games with it on. Like any other intensive processing, it taxes your system harder, which means more heat.
All GPUs suck at ray tracing right now, but AMD cards are decidedly worse than Nvidia's.
If you're big on Ray Tracing, you might want to trade up when you get your replacement card. Otherwise, you can lower your other settings a bit or use FSR to tax your system less.
I strongly recommend doing some good thermal checking when you're enabling RT. Make sure your thermals are staying in an acceptable range and adjust settings or the cooling in your case to compensate.
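If you want to go beyond eyeballing the overlay, you can log temps during an RT session and check them afterwards. Here's a minimal sketch of that idea; the log format, function name, and the 95 °C edge ceiling are my own illustrative choices, though 110 °C is AMD's stated junction (hotspot) maximum for RDNA 2 cards:

```python
# Minimal sketch: flag GPU temperature samples that exceed chosen limits.
# EDGE_LIMIT_C is an illustrative ceiling; JUNCTION_LIMIT_C matches the
# 110 C junction (hotspot) maximum AMD specifies for RDNA 2.

EDGE_LIMIT_C = 95
JUNCTION_LIMIT_C = 110

def flag_overheating(samples):
    """samples: list of (seconds, edge_c, junction_c) tuples from a temp log.
    Returns the samples where either reading exceeds its limit."""
    return [(t, edge, junction)
            for t, edge, junction in samples
            if edge > EDGE_LIMIT_C or junction > JUNCTION_LIMIT_C]

# Example log from a hypothetical RT benchmark run (made-up numbers):
log = [(0, 62, 78), (60, 81, 101), (120, 84, 112), (180, 83, 108)]
print(flag_overheating(log))  # only the 112 C hotspot sample is flagged
```

If RT sessions keep showing up in the flagged list while non-RT sessions don't, that's a sign your cooling can't keep up with the heavier workload.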
There were never any thermal issues with the card, I always monitored temps and hotspot temps.
As for the replacement card, if they want to give me a 6900 XT instead I won't say no, but I'm not going to pay for that. The performance of the card has always been excellent, no issues there. I was testing with RT because I wanted to know whether the card was OK... and it wasn't. I can't have unstable hardware in my system.
They absolutely need to replace it if it was under warranty and failed. You're completely correct: You shouldn't have to live with that.
I was only responding to your question. The thing that usually degrades cards is heat. Ray tracing itself doesn't directly cause degradation, but because of the more intensive workload, you're going to get more heat.
In your case, it was likely just a defective card, not some ray-tracing-specific degradation.
I think we're probably a generation or two away from the right interaction between hardware and software for efficient ray tracing, without having to compromise on other settings or enable DLSS or FSR.
So, I got my card back today. Seems to work fine. Tested a bit with ray tracing, no problems there.
But now I am actually afraid to turn ray tracing on for longer periods because I fear it might damage the card. Stupid, I know.
Just leave RT off. It's just a gimmick that sends us back to the dark ages in terms of resolution and FPS. If you want 60-120 FPS, leave it off. All RT does is make practically every surface in a game look wet and shiny. Even areas of Cyberpunk where it hasn't rained and is not currently raining look like a glazed doughnut all the time. It's so stupidly unrealistic. I have a strong hatred for RT and wish I could buy cards without the silicon for it on there. That silicon space could be used for more practical things, like more streaming cores.

RT was always on the horizon anyway. Crytek talked about it a lot in the Crysis 2 days. The last Nvidia Titan that came out was able to run software RT just fine. It can be done in software, using the same shader cores, on every GPU. We'd have eventually gotten there anyway... if everyone wants wet and shiny 100% of the time that badly.

It's like when bump mapping was first a thing and every developer had to make every surface look like it had chicken pox. Thankfully, bump mapping dissipated into nothingness... and I can only hope the same for RT in the future. We'll arrive at a point where RT has gone back to being just a basic lighting mechanic, like real-time lighting back when it was first a thing... now that's just a standard.