
Streaming with OpenAIChatCompletionClient raises an `empty_chunk` exception when usage is requested, fixed using max_consecutive_empty_chunk_tolerance #5078

Open
auto-d opened this issue Jan 16, 2025 · 1 comment
Labels
documentation Improvements or additions to documentation proj-extensions


auto-d commented Jan 16, 2025

The model client documentation suggests this fix for missing tokens in an OpenAIChatCompletionClient streaming response:

set extra_create_args={"stream_options": {"include_usage": True}},

However, the final message from the server, which carries the requested usage information, raises an exception because it trips the 'empty chunk' check in the streaming processor implementation (openai/_openai_client.py).

Code to reproduce in autogen 0.41:

from autogen_core.models import UserMessage
from autogen_ext.models.openai import OpenAIChatCompletionClient

model_client = OpenAIChatCompletionClient(model="gpt-4o-mini", api_key=api_key)
stream = model_client.create_stream(
    messages=[UserMessage(content="Tell me a story about pirates", source="user")],
    extra_create_args={"stream_options": {"include_usage": True}})

async for response in stream:
    print(response)
    # ValueError raised by logic in _openai_client.py's `create_stream`.

The workaround for me was setting max_consecutive_empty_chunk_tolerance to 2, which, per the code comments, was added to work around a problem with the Azure endpoint.

Collaborator

ekzhu commented Jan 16, 2025

Could you submit a PR updating the API docs of the create_stream methods of both OpenAIChatCompletionClient and AzureOpenAIChatCompletionClient with your error resolution? cc @MohMaz

@ekzhu ekzhu changed the title Streaming with OpenAIChatCompletionClient raises an exception when usage is requested Streaming with OpenAIChatCompletionClient raises an `empty_chunk` exception when usage is requested, fixed using max_consecutive_empty_chunk_tolerance Jan 16, 2025
@ekzhu ekzhu added the documentation Improvements or additions to documentation label Jan 16, 2025