Full features of rest api #190
can you expand on what you mean by "embedded context"? this python client library aims to implement the complete ollama api. if this isn't the case, it would be helpful to know what's missing so it can be added.
I was trying to send embeddings as additional context with my prompt for generation. The REST API has a `context` entry in the JSON, but I didn't see a corresponding argument in the Python function signature. I also couldn't find a `keep_alive` argument in the Python docs. If you've implemented every feature of the REST API, perhaps it's just your documentation that's missing features. You might take a look at what's in the REST API docs and add anything that's missing. BTW, thank you for all your hard work. I love Ollama. You have made a really important contribution to all the people of the world, whether they realize it or not. Cheers!
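For reference, the REST API's `/api/generate` endpoint does accept both `context` and `keep_alive` fields in the request body, so one way to use them today is to build the JSON payload directly. The sketch below is my own (the helper name and the placeholder `context` token list are not from ollama-python); `context` is the opaque token list returned by a previous `/api/generate` response:

```python
def build_generate_payload(model, prompt, context=None, keep_alive=None):
    """Assemble a request body for Ollama's /api/generate endpoint.

    `context` is the token list returned in a previous /api/generate
    response; `keep_alive` controls how long the model stays loaded
    in memory (e.g. "5m"). Helper name is hypothetical.
    """
    payload = {"model": model, "prompt": prompt, "stream": False}
    if context is not None:
        payload["context"] = context
    if keep_alive is not None:
        payload["keep_alive"] = keep_alive
    return payload

# Placeholder context value, standing in for a real response's token list:
payload = build_generate_payload(
    "llama3", "Continue the story.", context=[1, 2, 3], keep_alive="5m"
)
# To actually send it (requires the `requests` package and a running server):
# import requests
# resp = requests.post("http://localhost:11434/api/generate", json=payload)
```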
One thing that I am missing (apologies if I just didn't find it elsewhere) is the ability to pass a list of functions (tools) to the client. This is what works perfectly when calling the API directly:

```python
import requests

model = "llama3.1:70b"
tools = [
    {
        "type": "function",
        "function": {
            "name": "retrieve_payment_status",
            "description": "Get payment status of a transaction",
            "parameters": {
                "type": "object",
                "properties": {
                    "transaction_id": {
                        "type": "string",
                        "description": "The transaction id."
                    }
                },
                "required": ["transaction_id"]
            }
        }
    },
    {
        "type": "function",
        "function": {
            "name": "retrieve_payment_date",
            "description": "Get payment date of a transaction",
            "parameters": {
                "type": "object",
                "properties": {
                    "transaction_id": {
                        "type": "string",
                        "description": "The transaction id."
                    }
                },
                "required": ["transaction_id"]
            }
        }
    }
]
messages = [
    {
        "content": "What is the status and payment date of my transaction T1001?",
        "role": "user"
    }
]
data = {
    "messages": messages,
    "tools": tools,
    "model": model,
    "temperature": 0.3
}
response = requests.post('http://localhost:11434/v1/chat/completions',
                         headers={"Content-Type": "application/json"},
                         json=data)
```

With this, I get a response with the different tool choices and their parameters. However, I am missing the ability to pass the list of tools to the client. I was thinking of something like this:

```python
ollama_client = Client(host='http://localhost:11434')
response = ollama_client.chat(model=model, messages=messages, tools=tools)
```

Is there a way to do this without passing the tools into the system prompt?
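In the meantime, the working `requests` call against the OpenAI-compatible `/v1/chat/completions` endpoint can be wrapped in a small helper so the call site looks like the desired client API. This is a sketch of my own; `build_chat_payload` is a hypothetical name, and only the endpoint path and payload shape come from the request shown above:

```python
def build_chat_payload(model, messages, tools=None, temperature=None):
    """Request body for Ollama's OpenAI-compatible /v1/chat/completions
    endpoint, mirroring the working requests call above.
    Helper name is hypothetical, not part of ollama-python."""
    payload = {"model": model, "messages": messages}
    if tools is not None:
        payload["tools"] = tools
    if temperature is not None:
        payload["temperature"] = temperature
    return payload

messages = [{"role": "user",
             "content": "What is the status and payment date of my "
                        "transaction T1001?"}]
tools = [{"type": "function",
          "function": {
              "name": "retrieve_payment_status",
              "description": "Get payment status of a transaction",
              "parameters": {
                  "type": "object",
                  "properties": {"transaction_id": {"type": "string"}},
                  "required": ["transaction_id"]}}}]

payload = build_chat_payload("llama3.1:70b", messages,
                             tools=tools, temperature=0.3)
# To actually send it (requires the `requests` package and a running server):
# import requests
# resp = requests.post("http://localhost:11434/v1/chat/completions",
#                      json=payload)
```

Note that more recent releases of ollama-python reportedly accept a `tools=` argument on `chat()` using this same schema, which, if accurate, would make a wrapper like this unnecessary.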
Does ollama-python have less functionality than the REST API? If so, you should say so at the top. I want to send embedded context, but it isn't working and the documentation isn't helping. I can find the answer in the REST API docs, but not here, so I guess I'll switch, though I've wasted a lot of time here. I program in Python, so I'd prefer this library, but WTH? If I need to use the REST API directly, I will.