openAI: model selection for o1 family falling back to gpt-4o #5357
What you might be seeing is the title generation completion, which uses gpt-4o-mini after the first message of a new chat. If you are not seeing the icon for the LLM turn from black to purple, then a fallback is likely not happening on LibreChat's end.

To be absolutely sure what the completion was generated with, you should check your debug logs. They are found in `./logs` at the project root if using Docker, or in `./api/logs` if using npm.

There is only a switch/fallback mechanism when you have images attached, because o1-mini and o1-preview are not vision models, and the default behavior of LibreChat is to fall back to a vision model when images are attached.

Debug log:

```
2025-01-18T15:09:38.406Z debug: [OpenAIClient] chatCompletion response
{
object: "chat.completion",
service_tier: "default",
id: "chatcmpl-Ar4k51M6wg6fiC4qkgQirS1brJziU",
// 1 choice(s)
choices: [{"message":{"role":"assistant","content":"Hi there! 👋 How can I help you today?","refusal":null,"pa... [truncated]],
created: 1737212977,
model: "o1-mini-2024-09-12",
system_fingerprint: "fp_f56e40de61",
}
```

You would see gpt-4o-mini for the title generation if this is the first reply in the chat.
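In other words, two separate completions are made: the user-facing reply with the model you selected (o1-mini above), and a short, cheaper title-generation call. Here is a minimal sketch of that pattern using the official `openai` Node SDK; the function names and prompt wording are illustrative assumptions, not LibreChat's actual code:

```typescript
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// 1) The user-facing reply: uses the model selected in the UI (o1-mini here).
async function generateReply(userText: string) {
  return openai.chat.completions.create({
    model: "o1-mini",
    messages: [{ role: "user", content: userText }],
  });
}

// 2) A separate, cheaper call names the conversation; this is the
//    gpt-4o-mini completion that shows up in the debug logs right
//    after the first message of a new chat.
async function generateTitle(userText: string) {
  return openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [
      {
        role: "user",
        content: `Write a concise title (5 words or fewer) for this chat: ${userText}`,
      },
    ],
  });
}
```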
Attaching an image:

Debug log (model switched):

```
{
baseURL: "https://api.openai.com/v1",
modelOptions.model: "gpt-4o",
modelOptions.user: "6640b7d73f29bdd33b5b8882",
modelOptions.stream: true,
// 3 message(s)
modelOptions.messages: [{"role":"user","content":"hi"},{"role":"assistant","content":"Hi there! 👋 How can I help you today?"},{"role":"user","content":[{"type":"text","text":"what is this"},{"type":"image_url","image_url":{"ur... [truncated]],
}
```

It will keep switching unless you toggle off "Resend files" in Parameters, found in the side panel.
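The switch boils down to a guard like the following: if the outgoing payload contains image parts and the selected model cannot accept images, a vision-capable model is substituted. This is only an illustrative sketch of that behavior; the function names, model list, and gpt-4o default are assumptions, not LibreChat's actual source:

```typescript
type ContentPart =
  | { type: "text"; text: string }
  | { type: "image_url"; image_url: { url: string } };

type ChatMessage = { role: "user" | "assistant"; content: string | ContentPart[] };

// Models that cannot accept image input (illustrative list).
const NON_VISION_MODELS = new Set(["o1-mini", "o1-preview"]);

function hasImageAttachment(messages: ChatMessage[]): boolean {
  return messages.some(
    (m) =>
      Array.isArray(m.content) &&
      m.content.some((part) => part.type === "image_url"),
  );
}

// If images are present and the chosen model is not vision-capable,
// fall back to a vision model (gpt-4o here); otherwise keep the selection.
function resolveModel(selected: string, messages: ChatMessage[]): string {
  if (NON_VISION_MODELS.has(selected) && hasImageAttachment(messages)) {
    return "gpt-4o";
  }
  return selected;
}
```

With "Resend files" enabled, the earlier image stays in the payload on every turn, so a guard like this keeps firing; once it is toggled off, the image parts are no longer sent and the selected o1 model is used again, as the next log shows.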
Debug log (o1 confirmed):

```
2025-01-18T15:19:53.852Z debug: [OpenAIClient] chatCompletion response
{
object: "chat.completion",
service_tier: "default",
id: "chatcmpl-Ar4tzFSeNZOzYB8P10VPlL18tzjLz",
// 1 choice(s)
choices: [{"message":{"role":"assistant","content":"You're welcome! If you have any more questions or need fur... [truncated]],
created: 1737213591,
model: "o1-mini-2024-09-12",
system_fingerprint: "fp_f56e40de61",
}
```

**Conclusion**

To be absolutely sure, check your debug logs. They are found in `./logs` at the project root if using Docker, or in `./api/logs` if using npm.
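If you want a quick way to do that check, a small script along these lines can pull every reported model out of the debug logs; the file-name filter and the `model: "..."` pattern are assumptions based on the log excerpts above, so adjust them to match your files:

```typescript
import { readdirSync, readFileSync } from "node:fs";
import { join } from "node:path";

// Point this at ./logs (Docker) or ./api/logs (npm).
const logDir = process.argv[2] ?? "./logs";

for (const file of readdirSync(logDir)) {
  if (!file.includes("debug")) continue; // assumed: debug logs have "debug" in the name
  const text = readFileSync(join(logDir, file), "utf8");
  // Print every model reported in a chatCompletion response entry.
  for (const match of text.matchAll(/model:\s*"([^"]+)"/g)) {
    console.log(`${file}: ${match[1]}`);
  }
}
```

If the chat completions consistently report o1-mini, with gpt-4o-mini appearing only for title generation and gpt-4o only when images are attached, then no unexpected fallback is happening.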