
AI Discussions

chew1976
Journeyman III

Please take the lead over the N-team and come out with higher-VRAM consumer GPUs for AI

Dear AMD,

 

Please take the lead over the N-team by coming out with higher-VRAM consumer GPUs for AI purposes (e.g., 48GB or 96GB).

Many of us are desperate for more VRAM.
We want to be able to load, run, and play with 70B/130B LLMs, not be limited to merely toying with 7B/13B models.
We want to run long, complicated, exhaustive ComfyUI workflows without running out of VRAM halfway through.

We don't need the latest, most power-hungry GPU chip, since we aren't running high-FPS games at ultra-high resolution; we just need far more VRAM on any of your existing GPUs.
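To illustrate why 48GB or more matters, here is a rough back-of-envelope estimate of the VRAM needed just to hold a model's weights at different precisions. This is a sketch only: it ignores KV cache, activations, and framework overhead, which add further memory on top.

```python
# Back-of-envelope VRAM estimate for holding LLM weights in memory.
# Ignores KV cache, activations, and runtime overhead (assumption: weights
# dominate, which is roughly true for single-user inference).

def weights_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate GB needed for weights alone.

    params_billions * 1e9 params * bytes_per_param bytes / 1e9 bytes-per-GB
    simplifies to params_billions * bytes_per_param.
    """
    return params_billions * bytes_per_param

for params in (7, 13, 70, 130):
    fp16 = weights_vram_gb(params, 2.0)   # 16-bit weights: 2 bytes/param
    q4 = weights_vram_gb(params, 0.5)     # 4-bit quantized: 0.5 bytes/param
    print(f"{params}B model: ~{fp16:.0f} GB at fp16, ~{q4:.0f} GB at 4-bit")
```

Even 4-bit quantized, a 70B model needs roughly 35GB for weights alone, which already overflows a 24GB consumer card but would fit comfortably in 48GB.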

 

Thank you.

Best regards,

Chew

1 Reply
farshadg
Adept I

AMD already has Radeon Pro GPUs available with 48GB of VRAM, although I wouldn't exactly call them consumer cards, since their cost puts them out of reach for many people. If you do have the money for one, though, I highly recommend the Radeon Pro W7900DS Dual Slot for AI workloads, although it isn't exactly cheap at $3,499 USD. Its 48GB of VRAM and dual-slot form factor make it ideal for stacking several of these GPUs into a workstation. If you have any questions about using them for AI, I personally own the bigger Radeon Pro W7900 (3-slot variant) and can help answer them.