AI Discussions

Giganotauroghz
Journeyman III

Running LLMs on LM Studio

How to run a Large Language Model (LLM) on your AM... - AMD Community

Do LLMs in LM Studio work with the 7900 XTX only on Linux? I'm on Windows and followed all the instructions in the blog I'm sharing here, but I got an error that I apparently am not allowed to post. It basically stated that there was a problem with either the configuration or the model itself, ending with "Try a different model and/or config," even though I let LM Studio load the recommended configuration for each model. I tried several different 7B models and made sure they were all Q4_K_M GGUF quantizations, but none of them would load: I would select a model, it would start loading, and then fail with the same error every time. I'll note that this is my first experience using LM Studio for LLMs.

0 Replies