
Real time feedback on remaining context window length #1208

Open
heathlesjak opened this issue Feb 12, 2025 · 0 comments
Labels
cli · enhancement (New feature or request) · ui

Comments


heathlesjak commented Feb 12, 2025

Right now users blindly approach the context window limit and then run into it, breaking their current Goose session. It would be helpful to expose the remaining context window length in some fashion. This varies by LLM provider, but at least with the default provider (currently Claude) we likely have a good understanding of the limit.

With some kind of feedback about approaching the context window limit, or about how much of the window has been consumed so far (or on each message), users could better pace the session. Users could learn that certain commands consume large amounts of the context window. They could also learn how much of the context window is already consumed before they ask the first question, by extensions (like Memory) and goosehints files. If a session approaches the limit, users would know to wrap things up and/or prepare to start a fresh session.

For the GUI, I'm envisioning some kind of bar showing the number of tokens remaining (or % remaining) that drops appropriately after each turn. For the CLI, perhaps a small text note at the end of each turn with the number of tokens remaining (or % remaining). If necessary, we could also add a setting to hide/show this information. Alternatively, a "you've consumed 75% of the context window" warning could help, with a potentially configurable threshold.
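
To make the threshold idea concrete, here is a minimal sketch (in Python, purely illustrative) of what a per-turn check could look like, assuming the session already tracks a running token count and knows the provider's context limit. The names `tokens_used`, `context_limit`, and `WARN_THRESHOLD` are hypothetical and not part of Goose today.

```python
# Hypothetical sketch: emit remaining-context feedback after each turn.
# Assumes the session tracks total tokens used and knows the provider's
# context window size; none of these names are an existing Goose API.

WARN_THRESHOLD = 0.75  # the configurable "you've consumed 75%" warning level


def context_status(tokens_used: int, context_limit: int) -> str:
    """Return a short status line with tokens and percent remaining."""
    remaining = max(context_limit - tokens_used, 0)
    pct_used = tokens_used / context_limit
    status = f"context: {remaining} tokens remaining ({1 - pct_used:.0%})"
    if pct_used >= WARN_THRESHOLD:
        status += "  -- consider wrapping up or starting a fresh session"
    return status


# Example: a 200k-token window with 150k tokens already consumed.
print(context_status(tokens_used=150_000, context_limit=200_000))
# -> context: 50000 tokens remaining (25%)  -- consider wrapping up or starting a fresh session
```

The same status string could back either UI: the CLI prints it after each turn, while the GUI maps the percentage onto a progress bar and only surfaces the warning text past the threshold.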
