I am trying to run Stable Diffusion on the AMD IPU of a Razer Blade 14 laptop with a Ryzen 9. I was able to convert the Stable Diffusion 1.4 / 1.5 / 2.1 models from Hugging Face to ONNX models via the script below. The conversion succeeds, but inference always falls back to the CPU, even when VitisAI is the only execution provider specified; onnxruntime.get_device() reports CPU.
Has anyone tried Stable Diffusion with VitisAI, or does anyone have pointers for getting it to run on the IPU?
Are there any additional parameters that need to be provided when converting the model?
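For reference, this is roughly how I am checking which execution provider the session ends up with (the model path is a placeholder, and the config_file provider option follows the Ryzen AI VOE examples rather than anything specific to my script):

```python
import onnxruntime as ort

# get_device() reflects how the wheel was built, not which EP a session uses.
print("Build device:", ort.get_device())
print("Available providers:", ort.get_available_providers())

# Placeholder model path; the config_file option key follows the Ryzen AI
# VOE examples for the VitisAI execution provider.
session = ort.InferenceSession(
    "sd_onnx/unet/model.onnx",
    providers=["VitisAIExecutionProvider"],
    provider_options=[{"config_file": "vaip_config.json"}],
)

# Providers the session actually registered; falling back to only
# CPUExecutionProvider would match the behaviour I am seeing.
print("Session providers:", session.get_providers())
```

As far as I can tell, onnxruntime.get_device() only reports the build target, so session.get_providers() seems like the more telling check for whether the VitisAI EP actually loaded.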
System config:
- Razer Blade 14 with an AMD Ryzen 9 7940HS, IPU driver version 10.105.6.45.
- ONNX Runtime packages for Ryzen AI: the onnxruntime_vitisai-1.16.0 wheel shipped with the Client to cloud demo and onnxruntime_vitisai-1.15.1 from the Vitis AI VOE package.
- XLNX_VART_FIRMWARE is set to the xclbin file path, and the config file path is set to "vaip_config.json" from the respective packages (see the sketch after this list).
- Both GPUs on the system are disabled to force inference onto the IPU.
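The environment wiring from the list above looks roughly like this (the paths are placeholders for the files shipped in the respective packages, not my actual locations):

```python
import os

# Placeholder paths; the real values come from the installed Ryzen AI /
# Vitis AI VOE package on this machine.
xclbin_path = r"C:\path\to\ipu.xclbin"
vaip_config = r"C:\path\to\vaip_config.json"

# Per the Ryzen AI setup instructions, XLNX_VART_FIRMWARE is set to the
# xclbin before the ONNX Runtime session is created.
os.environ["XLNX_VART_FIRMWARE"] = xclbin_path

# Sanity-check that both artifacts exist at the paths being used.
for path in (xclbin_path, vaip_config):
    if not os.path.isfile(path):
        raise FileNotFoundError(f"Missing Ryzen AI artifact: {path}")
```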
The Vitis AI runtime setup instructions I followed to set up the system are from: