
Expose tokens / prompt / http response etc through the Collector interface #1512

Open · wants to merge 32 commits into base: canary
Conversation

@aaronvg (Contributor) commented Feb 24, 2025

Implement the LogCollector

  • python (still missing some tests for retries, fallbacks, LLM HTTP failures, and parsing failures, as well as Bedrock support for capturing the raw HTTP request/response)
  • typescript
  • ruby
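As a rough illustration of what a Collector-style interface exposes, the sketch below is a minimal, self-contained Python mock. All names here (`Collector`, `FunctionLog`, `Usage`, `record`, `last`) are hypothetical stand-ins, not the actual BAML API surface implemented in this PR.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical sketch of a Collector-style log interface.
# Names and field shapes are illustrative only.

@dataclass
class Usage:
    input_tokens: int = 0
    output_tokens: int = 0

@dataclass
class FunctionLog:
    prompt: str = ""                        # rendered prompt sent to the LLM
    raw_http_response: Optional[str] = None # raw provider HTTP response body
    usage: Usage = field(default_factory=Usage)

class Collector:
    """Accumulates one FunctionLog per LLM call made by a client."""

    def __init__(self, name: str):
        self.name = name
        self.logs: list[FunctionLog] = []

    def record(self, log: FunctionLog) -> None:
        self.logs.append(log)

    @property
    def last(self) -> Optional[FunctionLog]:
        return self.logs[-1] if self.logs else None

# Usage: inspect tokens / prompt / raw response after a call is recorded.
collector = Collector(name="demo")
collector.record(FunctionLog(
    prompt="Extract the resume fields from ...",
    raw_http_response='{"choices": []}',
    usage=Usage(input_tokens=120, output_tokens=45),
))
print(collector.last.usage.output_tokens)  # → 45
```

The point of the sketch is the access pattern: callers hand a collector to the client, and after the call read tokens, the prompt, and the raw HTTP response off the recorded log rather than parsing provider responses themselves.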

Other bugfixes

  • Fix a memory leak in Python streams
  • Don't add deserializer conditions on the final response of a stream, due to high memory usage

vercel bot commented Feb 24, 2025

Name | Status | Updated (UTC)
baml | ❌ Failed | Feb 24, 2025 6:40pm

ellipsis-dev bot commented Feb 24, 2025

⚠️ This PR is too big for Ellipsis, but support for larger PRs is coming soon. If you want us to prioritize this feature, let us know at [email protected]

@aaronvg aaronvg changed the title tracingv3.1 Expose tokens / prompt / raw response etc through the Collector interface Feb 24, 2025
@aaronvg aaronvg changed the title Expose tokens / prompt / raw response etc through the Collector interface Expose tokens / prompt / http response etc through the Collector interface Feb 24, 2025