For Azure OpenAI's API, you can use AzureGPTLLM; the config file looks like this:
name: AzureGPTLLM
model_id: gpt-4o # the model id should be the same as the deployment id in the Azure dashboard
api_key: ${env| custom_openai_key, openai_api_key}
endpoint: ${env| custom_openai_endpoint, https://api.openai.com/v1} # should be the same as the deployment endpoint in the Azure dashboard
api_version: ${env| custom_openai_api_version, 2024-02-15-preview} # this field is mandatory for Azure
temperature: 0
vision: true
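The `${env| name, default}` placeholders in the config are resolved from environment variables, so the actual secrets and endpoint can be supplied outside the file. Below is a minimal, purely illustrative Python sketch of setting those variables before the config is loaded; the key and resource name are placeholders you must replace with your own values.

```python
import os

# Illustrative only: export the variables referenced by the ${env| ...}
# placeholders above. The values below are placeholders, not real credentials.
os.environ["custom_openai_key"] = "<your-azure-openai-key>"
os.environ["custom_openai_endpoint"] = "https://<your-resource>.openai.azure.com/"  # deployment endpoint
os.environ["custom_openai_api_version"] = "2024-02-15-preview"
```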
Because of the differences between Azure OpenAI's API and OpenAI's API, you may run into issues such as an incorrect model id or an incorrect endpoint. If you encounter these issues, the explanations below may help.
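For context, here is a sketch (using the official `openai` Python SDK rather than this project's code) of how an Azure OpenAI call differs from a plain OpenAI call: the model must be the deployment id, the endpoint is your resource URL, and an api_version is required. The key, resource name, and deployment id below are placeholders.

```python
from openai import AzureOpenAI

# Sketch only: illustrates why a wrong model id or endpoint fails on Azure.
client = AzureOpenAI(
    api_key="<your-azure-openai-key>",
    azure_endpoint="https://<your-resource>.openai.azure.com",  # deployment endpoint from the Azure dashboard
    api_version="2024-02-15-preview",                           # required for Azure
)

resp = client.chat.completions.create(
    model="gpt-4o",  # must match the deployment id, not necessarily the base model name
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```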
OpenaiGPTLLM is designed to be compatible with API providers whose requests and responses match OpenAI's format. If your custom API provider's request and response format is OpenAI-compatible, you can use OpenaiGPTLLM directly. If you use a mixture of Azure and other API providers, you can also convert all of those APIs into a single OpenAI-compatible interface with gateway projects such as new-api or one-api.
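To make the compatibility requirement concrete, here is a hedged sketch of what "OpenAI-compatible" means in practice: the standard `openai` SDK works unchanged once `base_url` points at the provider or at a gateway such as new-api/one-api. The base URL and key below are placeholders for your own setup.

```python
from openai import OpenAI

# Sketch: an OpenAI-compatible provider (or a new-api/one-api gateway) accepts
# the same request/response shapes as OpenAI, so only base_url and api_key change.
client = OpenAI(
    api_key="<your-gateway-or-provider-key>",
    base_url="http://localhost:3000/v1",  # assumed gateway address; replace with yours
)

resp = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```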
I am asking because I am not using OpenAI's API, but rather Azure OpenAI's API or other APIs. Are these compatible? How should I configure them?