Issues: meta-llama/llama-stack
Support multiple tool call functions in remote vLLM inference provider [enhancement] (#1120, opened Feb 15, 2025 by terrytangyuan)
default templates upgrade: default embedding model from inference providers [enhancement]
AWS Bedrock provider requires repetition_penalty to be set to 0 [bug] (#1102, opened Feb 14, 2025 by raghotham)
segfault running vector_io test [bug] (#1092, opened Feb 14, 2025 by yanxi0830)
server defaults to port 8321 ignoring configured server port from yaml [bug] (#1076, opened Feb 13, 2025 by thoraxe)
EleutherAI/lm-evaluation-harness /eval provider [enhancement] (#1069, opened Feb 13, 2025 by yanxi0830)
docs: openapi generator should generate meaningful names for options for dropdowns in html [bug]
Reranking API [enhancement, RAG] (#1066, opened Feb 12, 2025 by yanxi0830)
Provide better control over the RAG ingestion stages (conversion, chunking, embedding, storing) [enhancement, RAG] (#1061, opened Feb 12, 2025 by ilya-kolchinsky)
process_chat_completion_response() does not work well with tool calls for vLLM remote provider [bug] (#1058, opened Feb 12, 2025 by terrytangyuan)
Support post-training / pre-deployment security testing models and systems [enhancement] (#1054, opened Feb 11, 2025 by neha-nupoor)
Consider switching to alternatives for client SDKs generation [enhancement] (#1045, opened Feb 11, 2025 by terrytangyuan)
llama client sdk can not reach server using ollama [bug] (#1029, opened Feb 10, 2025 by sancelot)
Fix the Llama stack README image after the Refactor [bug] (#1027, opened Feb 10, 2025 by zanetworker)
Propogate provider config into ProviderInfo [enhancement] (#1011, opened Feb 7, 2025 by noelo)
Sometimes getting "Error exporting span to SQLite: Cannot operate on a closed database." when running RAG agent example from getting started document [bug] (#999, opened Feb 7, 2025 by booxter)