I have the ollama service running in the background and it works well for running any model in the terminal. However, when it comes to Python, things go wrong.
import ollama
ollama.list()
     site-packages/ollama/_client.py:71  response.raise_for_status()
     site-packages/ollama/_client.py:72  except httpx.HTTPStatusError as e:
---> site-packages/ollama/_client.py:73  raise ResponseError(e.response.text, e.response.status_code) from None
     site-packages/ollama/_client.py:75  return response
Are you using the standard Ollama host/port, i.e. 127.0.0.1:11434? If not, you will need to set OLLAMA_HOST or use ollama.Client(host='...').
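For reference, here is a minimal sketch of both options, assuming the server listens on the default 127.0.0.1:11434 (swap in your own host/port if it doesn't):

```python
import os

# Option 1: environment variable, read by the ollama library when it
# creates its default client.
os.environ["OLLAMA_HOST"] = "http://127.0.0.1:11434"

# Option 2: pass the host explicitly (requires `pip install ollama`):
# import ollama
# client = ollama.Client(host="http://127.0.0.1:11434")
# print(client.list())
```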
Try turning off your VPN app.
Thanks, buddy. You made my day!
Wondering why? A REST call to http://127.0.0.1:11434/api/embeddings worked fine on the same machine.
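To separate client problems from server problems, it can help to hit the REST API directly from Python without the ollama package. This sketch uses only the standard library and the /api/tags endpoint (which lists installed models); if it returns None, the server itself is unreachable and the Python client was never the issue:

```python
import json
import urllib.request


def check_ollama(host="http://127.0.0.1:11434"):
    """Return the server's model list, or None if it is unreachable."""
    try:
        with urllib.request.urlopen(host + "/api/tags", timeout=5) as resp:
            return json.loads(resp.read())
    except OSError:
        return None


models = check_ollama()
print("reachable" if models is not None else "unreachable")
```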
I've met the same issue, and I turned off my VPN but it's still not working. Could you give me any clue?
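One thing worth checking even after the VPN is off: the ollama Python client is built on httpx, which honors proxy environment variables by default, so a leftover proxy setting can still intercept localhost traffic. A quick sketch to inspect them and exclude localhost (variable names are the conventional ones; adjust to your setup):

```python
import os

# Print any proxy settings httpx would pick up from the environment.
for var in ("HTTP_PROXY", "HTTPS_PROXY", "ALL_PROXY", "NO_PROXY",
            "http_proxy", "https_proxy", "all_proxy", "no_proxy"):
    print(var, "=", os.environ.get(var))

# If a proxy is set, excluding the local addresses may fix the error.
os.environ["NO_PROXY"] = "127.0.0.1,localhost"
```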