
AI Discussions

Atalaura
Journeyman III

Is it possible to use CUDA technology with AMD?

Greetings. With AI programs on the rise, I think AMD is falling short. A lot of AI tools prefer CUDA over ROCm. For example, even AMD-supported builds of Stable Diffusion may not detect the graphics card, and voice cloning/training tools that claim AMD support often fail to detect it as well. When it comes to machine learning and AI, as an RX 6600 user I feel AMD is lagging behind. We cannot use many tools that we should be able to use. AMD really has to do something; Nvidia users are very lucky in this regard.

8 Replies
wanginator
Adept I

Not trying to speak for the company, but I kind of doubt it, as CUDA is closed source.

That said, you can get Stable Diffusion running on our hardware. I got it running on my Legion with a 6700M just last night. It's a bit more involved, but I can imagine that Nvidia users ran into similar hurdles when they were getting started.

Here is the tutorial I used. It helped me get SD going on Nobara, PikaOS and Ubuntu.

How to use Stable Diffusion XL locally with AMD ROCm. With AUTOMATIC1111 WebUI and ComfyUI on Linux....
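If anyone wants a quick sanity check that the ROCm stack is actually being picked up, the ROCm builds of PyTorch expose the GPU through the familiar torch.cuda API (via HIP), so a few lines of Python will tell you whether the card is visible. This is only a rough sketch; the pip index URL and ROCm version in the comments are examples, not a prescription:

```python
# Sanity check: does the ROCm build of PyTorch see the GPU?
# Assumes PyTorch was installed from a ROCm wheel index, for example:
#   pip install torch --index-url https://download.pytorch.org/whl/rocm5.7
# (match the rocmX.Y part to whatever the guide you follow recommends)
import torch

print("PyTorch:", torch.__version__)              # ROCm builds carry a "+rocm" suffix
print("HIP runtime:", torch.version.hip)          # None on CUDA-only or CPU-only builds
print("GPU visible:", torch.cuda.is_available())  # ROCm reuses the torch.cuda namespace

if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
```

If the card still isn't detected on RDNA2 parts like the 6700M or the RX 6600, guides such as the one above usually suggest exporting HSA_OVERRIDE_GFX_VERSION=10.3.0 before launching the WebUI.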


payyourcell1
Adept I

As of my last knowledge update in January 2022, CUDA technology is developed by NVIDIA, and it is primarily designed to work with NVIDIA GPUs. CUDA is a parallel computing platform and application programming interface model that allows developers to use NVIDIA GPUs for general-purpose processing.

AMD's closest equivalent is ROCm, with HIP as its CUDA-like programming model. AMD GPUs also support OpenCL (Open Computing Language), an open standard for parallel programming of heterogeneous systems; OpenCL is not specific to AMD and is supported by various vendors, including Intel and others.

If you have an AMD GPU and want to leverage its parallel processing capabilities, you would typically use ROCm/HIP or OpenCL rather than CUDA. However, it's important to note that not all software applications support these alternatives; the level of support depends on how the software was developed.

It's a good idea to check the documentation of the specific software you're using to determine whether it supports CUDA, OpenCL, or both. Additionally, developments in the field of GPU computing may have occurred since my last update, so checking for the latest information from AMD and NVIDIA is recommended.
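To see the OpenCL side of this concretely, a few lines of Python with the pyopencl package (just one possible binding, assumed here for illustration) will list whichever platforms and devices the installed runtimes expose, which is a quick way to check whether a given AMD card is visible to OpenCL applications at all:

```python
# List every OpenCL platform and device the installed runtimes expose.
# Requires the pyopencl package plus a vendor OpenCL runtime (e.g. AMD's).
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name} ({platform.vendor})")
    for device in platform.get_devices():
        print(f"  Device: {device.name} [{cl.device_type.to_string(device.type)}]")
```

If an AMD GPU does not show up in that list, that usually points to a missing or misconfigured OpenCL runtime rather than a problem with the application itself.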

Hurry
Adept I


CUDA technology is exclusive to NVIDIA, and it's not directly compatible with AMD GPUs. If you're facing issues with AI tools preferring CUDA over AMD's ROCm, consider checking for software updates, exploring alternative tools that support AMD, and engaging with community forums or developers for potential solutions. Keep your AMD GPU drivers up to date, and monitor for any updates from AMD or the specific AI tool developers regarding compatibility improvements.


I have already installed this program on my PC. But when I want to train a voice model for AI, for example, I can't get it to work at all; I end up having to use Google Colab. AMD needs to come up with a solution for this kind of problem. NVIDIA has made so much progress with AI while AMD users can only watch. Whether training or using a voice model in AI, there is always a problem for AMD users at some point.


Hello! Could you please help me? I would like to propose a project to AMD regarding the integration of neural networks into GPUs, which would optimize data access patterns, cache video memory, predict video memory usage, and provide other functions that can lead to a significant performance breakthrough in GPUs. How can I contact AMD for this purpose? Thank you in advance for your response.

ricopicouk
Journeyman III

Yes, this is being actively worked on. 
https://github.com/vosen/ZLUDA

farshadg
Adept I

Sadly, as soon as ZLUDA was announced to support running CUDA code on AMD GPUs, Nvidia moved to shut it down, pointing out that running CUDA software through translation layers on competitor hardware is against its terms of service. Nothing is stopping you from using it for personal projects, but any large company looking to do so will surely not want to risk it now.

https://www.tomshardware.com/pc-components/gpus/nvidia-bans-using-translation-layers-for-cuda-softwa...

Don't lose hope, though: ROCm is improving at a fast pace, and support for things like LLMs and Stable Diffusion is getting better month by month.
