AMD brings Multi-GPU support to ROCm 6.1 Software for AMD Radeon desktop systems
"With ROCm™ 6.1 open compute software, we are making AI development and deployment with AMD Radeon™ desktop GPUs more compatible, accessible and scalable with the addition of key feature enhancements - now enabling local and private AI workstation configurations for up to four users." - Machine Learning Development with AMD Radeon™ Graphics Cards
- Labels: ROCm AI
I can vouch for multi-GPU support working beautifully on Radeon Pro GPUs. I've personally tested model inference and fine-tuning with up to 4x Radeon Pro W7900 GPUs, and they work great after applying the fix for a known issue documented here: https://rocm.docs.amd.com/projects/install-on-linux/en/latest/how-to/native-install/install-faq.html.... For those wondering about ROCm support for AI workloads, including multi-GPU support, I gave a presentation on this in Paris, France last week that may be of interest: https://youtu.be/k2g_lC0fI-k?si=a3uyAj81PcLUK77y
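
For anyone setting up a similar 4-GPU workstation, here is a minimal sketch of two building blocks such a script might use: ROCm's real `HIP_VISIBLE_DEVICES` environment variable to restrict which GPUs the process sees, and a hypothetical round-robin helper (`assign_layers` is my own illustrative name, not part of any ROCm or framework API) for naive layer-to-GPU model parallelism:

```python
import os

# Expose only the four Radeon Pro cards (device indices 0-3) to this process.
# HIP_VISIBLE_DEVICES is ROCm's counterpart to CUDA_VISIBLE_DEVICES.
os.environ["HIP_VISIBLE_DEVICES"] = "0,1,2,3"

def assign_layers(num_layers: int, num_gpus: int = 4) -> dict[int, int]:
    """Illustrative round-robin mapping of model layers to GPU indices.

    Real frameworks (e.g. PyTorch with device_map or FSDP) handle this for
    you; this just shows the idea behind splitting a model across 4 GPUs.
    """
    return {layer: layer % num_gpus for layer in range(num_layers)}

# An 8-layer model spread over 4 GPUs: layers 0 and 4 land on GPU 0, etc.
mapping = assign_layers(8)
```

In an actual fine-tuning script you would then move each layer to its assigned device (e.g. `layer.to(f"cuda:{gpu}")` in PyTorch, which maps onto ROCm/HIP devices transparently).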
