OpenAI does not support changing the base URL #137087
Comments
Hey there @balloob, mind taking a look at this issue as it has been labeled with an integration (openai_conversation) you are listed as a code owner for? Thanks! (message by CodeOwnersMention)
There is an Extended OpenAI Conversation fork in HACS that seems to work for OpenAI-compatible APIs. I've not used it long enough to see if an 8B model is good enough or if I need a larger one, but it responds pretty much instantly (I picked an 8B-param model to keep costs down).
@dragon2611 do you have a link to it, as a stop-gap until @pepijndevos gets the go-ahead from @balloob to look into this resolution? :) I really want to play with AI responses, but as I already pay for one secure AI that works, I don't want to pay OpenAI again and give them all my data. :)
The problem
As indicated by the repeated PRs and repeated rejections in e.g. #122423, #94051, #94341, #99517, #101356, and #121105, there is clear interest from users in supporting OpenAI-compatible APIs, but also a clear desire from Home Assistant member @balloob not to confuse the experience of actual OpenAI users.
So, let's be clear about the problems, and let's find a solution.
I think the first issue does require some type of separate integration with its own config flow, but that could internally share code with the OpenAI integration to avoid duplication. There would be a fair bit of boilerplate, but minimal maintenance burden. It would also serve as a clear point for bug reports and handling of divergent behavior, without retroactively generalizing the actual OpenAI integration.
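To illustrate why sharing code with the OpenAI integration should be cheap, here is a minimal sketch (not the integration's actual code) using the official `openai` Python client, which already accepts a custom `base_url`. The endpoint URL, API key, and model name below are assumptions standing in for a local llama.cpp-style server:

```python
# Sketch only: an OpenAI-compatible backend differs from OpenAI itself
# almost entirely in configuration, so the underlying client code can be
# shared between the two integrations.
from openai import AsyncOpenAI

client = AsyncOpenAI(
    base_url="http://localhost:8080/v1",  # hypothetical llama.cpp server endpoint
    api_key="unused",  # many local servers accept any key
)

async def ask(prompt: str) -> str:
    """Send a single chat message and return the reply text."""
    response = await client.chat.completions.create(
        model="llama-3.1-8b-instruct",  # hypothetical local model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```

The divergent behavior mentioned above (slightly different parameter support, error shapes, and so on) would then live in the new integration rather than leaking into the OpenAI one.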
To avoid confusion, I would suggest calling this new integration "LiteLLM Conversation", and then using virtual integrations to add other user-facing names such as "llama.cpp", "OpenRouter", and whatever other services that use an OpenAI-compatible API under the hood are deemed user-relevant.
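For context, a virtual integration in Home Assistant is just a small `manifest.json` that points at the integration actually providing the functionality. A sketch of what an "OpenRouter" entry could look like, assuming the hypothetical `litellm_conversation` domain from the proposal above:

```json
{
  "domain": "openrouter",
  "name": "OpenRouter",
  "integration_type": "virtual",
  "supported_by": "litellm_conversation"
}
```

This way each service gets a discoverable, user-facing name in the integration picker without any additional code to maintain.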
I think these two things combined would strike a good balance for both users and developers.
This issue would normally be a PR, but I would like approval of an acceptable approach before I start building.
I really hope we can find a good solution to add support for OpenAI-compatible services that are clearly in demand without making the experience for OpenAI users worse or confusing. I'm in my own "year of the voice", developing a product around llama.cpp, so having an official integration that supports it matters to me.
What version of Home Assistant Core has the issue?
core-2025.1.2
What was the last working version of Home Assistant Core?
No response
What type of installation are you running?
Home Assistant OS
Integration causing the issue
OpenAI Conversation
Link to integration documentation on our website
https://www.home-assistant.io/integrations/openai_conversation/
Diagnostics information
No response
Example YAML snippet
No response
Anything in the logs that might be useful for us?
No response
Additional information
No response