Dear AMD,
Please take the lead over the N-team by coming out with higher-VRAM consumer GPUs for AI purposes (e.g., 48 GB/96 GB).
Many of us are desperate for more VRAM.
We want to be able to load, run, and play with 70B/130B LLMs, not be limited to merely toying with 7B/13B models.
We want to run long, complicated, exhaustive ComfyUI workflows without running out of VRAM halfway through.
We don't need the latest, most power-hungry GPU chip, since we aren't running high-FPS games at ultra-high resolution; we just need far more VRAM on any of your existing GPUs.
Thank you.
Best regards,
Chew