AI Discussions

yunhao2
Journeyman III

Cannot run my CNN model on AMD NPU with ONNX Runtime Vitis AI EP

Hello,

I was recently trying to run a CNN model on the NPU. I followed the official procedure and successfully ran the getting_started_resnet sample, but I got an error when using my own model. The error is reported when loading the model after I quantized it. My model is a simple CNN without any special custom operators.

Can someone help me solve this problem?

Thanks a lot.

(attached screenshot of the error: yunhao2_0-1743407458131.png)

3 Replies
yunhao2
Journeyman III

Does anyone know about this issue?
Thanks.

carls5555
Journeyman III

Your issue likely stems from model incompatibility, a quantization error, or unsupported operations in the Vitis AI EP. Try these steps:

  1. Check ONNX Model – Run onnx.checker.check_model("your_model.onnx") to verify validity.

  2. Ensure Proper Quantization – Use supported quantization methods like QDQ format.

  3. Confirm ONNX Runtime & Vitis AI Versions – Run python -c "import onnxruntime; print(onnxruntime.__version__, onnxruntime.get_available_providers())" to confirm the build and the registered execution providers.

  4. Enable Verbose Logging – Create a SessionOptions with log_severity_level = 0 and pass it to InferenceSession("your_model.onnx", sess_options=opts, providers=["VitisAIExecutionProvider"]) to get detailed error output.

  5. Test Without Quantization – Run your model on CPU to isolate issues.

  6. Check Supported Operators – Ensure all layers are compatible with Vitis AI EP.
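The checklist above can be sketched as one small diagnostic script. This is a hedged sketch: diagnose is a hypothetical helper name, "your_model.onnx" is a placeholder for your quantized model, and the script falls back gracefully if the onnx or onnxruntime packages are missing.

```python
# Diagnostic sketch for the troubleshooting steps above.
import os

def diagnose(model_path):
    """Run the checklist; return a list of (step, ok, detail) tuples."""
    results = []

    # Step 0: the model file must exist before anything else.
    if not os.path.exists(model_path):
        results.append(("file", False, "model file not found"))
        return results
    results.append(("file", True, "model file found"))

    # Step 1: structural validity of the ONNX graph.
    try:
        import onnx
        onnx.checker.check_model(model_path)
        results.append(("checker", True, "model is valid ONNX"))
    except ImportError:
        results.append(("checker", False, "onnx package not installed"))
    except Exception as exc:  # onnx.checker raises on invalid models
        results.append(("checker", False, str(exc)))

    # Steps 3-5: can ONNX Runtime load it, and with which provider?
    try:
        import onnxruntime as ort
        available = ort.get_available_providers()
        results.append(("providers", True, ", ".join(available)))
        # Step 5: fall back to CPU if the Vitis AI EP is not registered.
        provider = ("VitisAIExecutionProvider"
                    if "VitisAIExecutionProvider" in available
                    else "CPUExecutionProvider")
        opts = ort.SessionOptions()
        opts.log_severity_level = 0  # step 4: verbose logging
        ort.InferenceSession(model_path, sess_options=opts,
                             providers=[provider])
        results.append(("session", True, "loaded with " + provider))
    except ImportError:
        results.append(("session", False, "onnxruntime not installed"))
    except Exception as exc:
        results.append(("session", False, str(exc)))

    return results

if __name__ == "__main__":
    for step, ok, detail in diagnose("your_model.onnx"):
        print(("OK   " if ok else "FAIL ") + step + ": " + detail)
```

If the session loads on CPU but fails with the Vitis AI EP, the problem is most likely an unsupported operator or quantization format rather than the model itself.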

Muhammad Talha
Uday_Das
Staff

It seems you are running the C++ flow. I suggest you try the Python flow first; that can confirm your installation and environment are set up correctly. If you want to use the C++ flow, try the Getting Started ResNet C++ tutorial, which shows all the DLL files that are required.

 

Your error suggests you do not have the correct setup or all the required files, so try the Getting Started tutorial's C++ flow first.
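Before switching flows, it is worth a quick check in Python that the Vitis AI EP is actually registered. A minimal sketch (vitis_ep_available is a hypothetical helper name; it assumes nothing beyond a standard onnxruntime install):

```python
# Check whether onnxruntime is importable and whether the Vitis AI
# execution provider is registered in this environment.

def vitis_ep_available():
    """Return (onnxruntime_installed, vitis_ep_registered)."""
    try:
        import onnxruntime as ort
    except ImportError:
        return (False, False)
    providers = ort.get_available_providers()
    return (True, "VitisAIExecutionProvider" in providers)

if __name__ == "__main__":
    installed, has_ep = vitis_ep_available()
    if not installed:
        print("onnxruntime is not installed")
    elif not has_ep:
        print("Vitis AI EP missing -- check the Ryzen AI installation")
    else:
        print("Vitis AI EP is available; the Python flow should work")
```

If this reports the EP as missing, the C++ flow will fail for the same reason, so fixing the installation comes first.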

 

Thanks 
