
AI Discussions

rajvholla
Journeyman III

Stable diffusion on Ryzen AI

I am trying to run Stable Diffusion on the AMD IPU of a Razer Blade 14 laptop with a Ryzen 9. I was able to convert Stable Diffusion models 1.4 / 1.5 / 2.1 from Hugging Face to ONNX models via the script below. Conversion is successful, but inferencing always defaults to the CPU even when VitisAI is the only execution provider specified; onnx.get_device shows CPU.
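For reference, a minimal sketch of how I would pin a session to the VitisAI execution provider and verify which provider actually loaded (the model path and the `config_file` provider option are assumptions based on the Ryzen AI packaging; adjust to your setup):

```python
def pick_providers(requested, available):
    """Filter the requested execution providers down to those this
    onnxruntime build actually exposes, preserving request order."""
    chosen = [p for p in requested if p in available]
    if not chosen:
        raise RuntimeError(
            "None of %r are available; this build exposes %r"
            % (requested, available)
        )
    return chosen


def make_vitisai_session(model_path, config_file="vaip_config.json"):
    """Create an InferenceSession pinned to the VitisAI EP and report
    which providers the session actually ended up with."""
    # The onnxruntime_vitisai wheel from the Ryzen AI package
    import onnxruntime as ort

    providers = pick_providers(
        ["VitisAIExecutionProvider"], ort.get_available_providers()
    )
    sess = ort.InferenceSession(
        model_path,  # hypothetical path, e.g. the converted UNet ONNX file
        providers=providers,
        provider_options=[{"config_file": config_file}],
    )
    # If this prints only CPUExecutionProvider, the VitisAI EP was
    # requested but silently fell back to CPU at session creation.
    print("session providers:", sess.get_providers())
    return sess
```

If `get_available_providers()` does not even list `VitisAIExecutionProvider`, the problem is the installed wheel rather than the model or the session arguments.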

Has anyone tried Stable Diffusion on VitisAI, or have any pointers to get it running on the IPU?

Are there any additional parameters that need to be provided while converting the model?

System config:

  • Razer Blade 14 with AMD Ryzen 9 7940HS, IPU driver version 10.105.6.45.
  • ONNX Runtime packages for Ryzen AI: onnxruntime_vitisai-1.16.0 (the WHL package present in the client-to-cloud demo) and onnxruntime_vitisai-1.15.1 (in the Vitis AI VOE package).
  • XLNX_VART_FIRMWARE is set to the xclbin file, and the config file path is set to “vaip_config.json” from the respective packages.
  • Disabled both GPUs on the system to force inferencing onto the IPU.

 

The Vitis AI runtime setup instructions followed to set up the system are from:

kylee009
Adept I

Hello rajvholla,

Running Stable Diffusion on the AMD IPU with VitisAI sounds promising. Since your conversion is successful but inferencing defaults to CPU, it might be worth double-checking your setup. Make sure you're specifying the execution provider correctly in the script. Also, ensure your system's configuration and setup are in line with the documentation you mentioned. If inferencing is still defaulting to CPU, reaching out to the AMD or VitisAI community for specific troubleshooting advice could help resolve this.

 

Uday_Das
Staff

Hi, we don't officially support Stable Diffusion on the Ryzen AI platform yet. Please keep an eye on upcoming releases for its support.

Thanks
