LLM Monitoring not working for Async OpenAI requests #3494
Comments
Assigning to @getsentry/support for routing ⏲️
Routing to @getsentry/product-owners-insights for triage ⏲️
@AltafHussain4748 thank you for the bug report. I just added it to our backlog to triage next week. We'll update soon.
Hey @AltafHussain4748! Your code looks good; I think the only thing you are missing is that you need to enable tracing. You can check whether data is actually sent to Sentry by turning on the SDK's debug output. (The LLM data is in the transaction envelope.) Hope this helps!
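For reference, a minimal sketch of enabling tracing in the Python SDK: passing `traces_sample_rate` to `sentry_sdk.init` turns tracing on, and `debug=True` logs what the SDK sends so you can verify the transaction envelopes locally (the DSN below is a placeholder, not a real project):

```python
import sentry_sdk

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    traces_sample_rate=1.0,  # enables tracing; LLM spans are attached to transactions
    debug=True,              # prints SDK activity so you can see envelopes being sent
)
```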
Hey, sorry I didn't notice this issue before and created a new one, but it doesn't work because of this: #3496
Cool @vetyy, thanks for linking!
This is the configuration I am using
Don't forget to specify
and also make sure you have
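The inline code in the comment above was lost in this copy of the thread. As a hedged sketch of what such a configuration could look like, here is an init that adds the OpenAI integration explicitly; `send_default_pii=True` and `include_prompts=True` are assumptions about the intended setup (they control whether prompts and responses are recorded on the spans):

```python
import sentry_sdk
from sentry_sdk.integrations.openai import OpenAIIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    traces_sample_rate=1.0,   # tracing must be on for LLM monitoring
    send_default_pii=True,    # assumed: needed to capture prompt/response content
    integrations=[OpenAIIntegration(include_prompts=True)],  # assumed option
)
```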
Thanks for your reply. Even with this parameter I was not able to make it work.
Problem Statement
I just experimented with LLM monitoring and could not make it work with AsyncOpenAI.
Below is my code
My sentry init is
The function I am using:
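The original snippet was lost in this copy of the thread. As a stand-in, here is a hedged sketch of the kind of async call being discussed; the function name, model, and prompt are hypothetical, and running it requires a valid `OPENAI_API_KEY`:

```python
import asyncio

from openai import AsyncOpenAI

client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment


async def ask(prompt: str) -> str:
    # With LLM monitoring working, each call like this should show up
    # as an LLM span inside the currently active transaction.
    response = await client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical model choice
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


# asyncio.run(ask("Hello"))  # requires a valid API key, so not run here
```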
Versions
Solution Brainstorm
No response
Product Area
Insights