So the 7900 XTX is at a pretty unbeatable price for 24 GB of VRAM, and since I'm also a gamer and game developer, a single good GPU suits me better than, say, two 3090s. But I was wondering how it does for AI applications like oobabooga (text-generation-webui), SadTalker, or Bark. These use CUDA, and CUDA is Nvidia technology, so will I be forced to run everything through a translation layer like ROCm/HIP and settle for lower performance?
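For what it's worth, a quick way to see how this usually plays out is to check what a ROCm build of PyTorch reports: the ROCm wheels expose the same `torch.cuda` API (backed by HIP), so many CUDA-targeting apps run without code changes. A minimal probe sketch, assuming a ROCm build of PyTorch is installed on the 7900 XTX:

```python
# Minimal probe, assuming a ROCm build of PyTorch is installed
# (e.g. from the ROCm wheel index: pip install torch --index-url https://download.pytorch.org/whl/rocm6.0).
# On ROCm wheels the torch.cuda API is backed by HIP, so apps that ask for
# "cuda" often work unchanged even though the hardware is AMD.
import torch

print("torch version:", torch.__version__)
print("'CUDA' device available:", torch.cuda.is_available())  # True on ROCm builds too
print("HIP version:", torch.version.hip)                      # None on Nvidia builds, a string on ROCm

if torch.cuda.is_available():
    print("device name:", torch.cuda.get_device_name(0))      # e.g. the RX 7900 XTX
    x = torch.rand(1024, 1024, device="cuda")                  # "cuda" maps to the AMD GPU here
    print("matmul ok:", (x @ x).shape)
```

Whether the performance matches an Nvidia card is a separate question from whether the tools run at all; the snippet above only tells you the backend is picked up.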