Running without network error: ollama._types.ResponseError #85
I've run into the same problem: this error is raised whenever the network is unreachable. Have you solved it on your end? `from openai import OpenAI`
There are still some problems.
Then I made a few small modifications:
At this point it runs normally while connected to the network, but as soon as the network connection is dropped, more problems appear:
It seems the OpenAI interface requires network access. An internal server error (Error code: 502) occurred while creating the chat completion, which means something went wrong when trying to communicate with the OpenAI server.
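For reference, here is a minimal sketch of the usual offline fix: point the OpenAI client at the local Ollama server instead of api.openai.com. It assumes `ollama serve` is running on the default port 11434 and that the model name (`llama3` here) is one you have already pulled; both are assumptions, not part of the original report.

```python
from openai import OpenAI

# Point the client at Ollama's OpenAI-compatible endpoint so chat
# completions never leave the machine (assumes `ollama serve` is running).
client = OpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama",  # required by the client library, but ignored by Ollama
)

response = client.chat.completions.create(
    model="llama3",  # hypothetical model name; substitute one you have pulled
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

With `base_url` set this way, a 502 from the OpenAI servers cannot occur, because no request is ever sent to them.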
Met a similar issue that directed me here; fixed it by turning off the global proxy.
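If turning the proxy off system-wide is not an option, one alternative sketch is to exclude localhost via the standard `NO_PROXY` environment variable before the HTTP client is created. Whether it is honored depends on the HTTP stack in use (httpx, which the ollama package uses, does read it), so treat this as an assumption to verify:

```python
import os

# Exclude the local Ollama server from any global proxy. Set these before
# importing the client library so the proxy settings are picked up.
os.environ["NO_PROXY"] = "localhost,127.0.0.1"
os.environ["no_proxy"] = "localhost,127.0.0.1"

import ollama  # imported after setting the variables on purpose

print(ollama.list())  # should now reach http://localhost:11434 directly
```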
If you're running offline, be sure to use `ollama serve`.
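A quick way to confirm that the server started by `ollama serve` is actually reachable, before blaming the network, is a small probe (a sketch; 11434 is Ollama's default port and `/api/tags` lists local models):

```python
import urllib.request

try:
    # /api/tags lists locally available models; any 200 response means
    # the Ollama server is up and reachable without an internet connection.
    with urllib.request.urlopen("http://localhost:11434/api/tags", timeout=2) as r:
        print("Ollama server is up, HTTP status:", r.status)
except OSError as err:
    print("Ollama server not reachable - did you run `ollama serve`?", err)
```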
It works for me. Thanks! |
It works for me, thanks! Until now I had been reaching the server through an intranet penetration tunnel (NAT traversal).
Really helpful project! However, I ran into a problem when I turn off the Wi-Fi connection.
My whole program works well with an internet connection, but when I turn off the Wi-Fi switch it fails entirely.
You can see my project at: https://github.com/Gloridust/LocalChatLLM
I really need it to run completely offline. Any solutions?