Based on your screenshot, the output from ollama serve looks normal. Could you share the exact error message you're encountering? Alternatively, you can try running the following commands to see if everything works correctly:
./ollama pull <model_name>
./ollama run <model_name>
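For example (the model name here is only an illustration; any model from the Ollama library should work the same way):
./ollama pull llama3.2
./ollama run llama3.2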
Debugged the issue on the user's machine; the root cause was that oneAPI 2024.0 was installed. After switching to oneAPI 2024.2, Ollama ran normally on the GPU.
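For anyone hitting the same problem, a quick way to confirm which oneAPI version is active (a minimal sketch, assuming oneAPI is installed in the default /opt/intel/oneapi location):
source /opt/intel/oneapi/setvars.sh
sycl-ls           # the Intel B580 should show up as a Level Zero GPU device
icpx --version    # reports the oneAPI DPC++ compiler version (should be 2024.2.x)
If sycl-ls does not list the GPU, or the compiler reports 2024.0, the environment is most likely still pointing at the older oneAPI installation.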
Intel B580 -> not able to run Ollama serve on GPU after following guide
https://github.com/intel/ipex-llm/blob/main/docs/mddocs/Quickstart/bmg_quickstart.md#32-ollama
https://github.com/intel/ipex-llm/blob/main/docs/mddocs/Quickstart/ollama_quickstart.md