
Full features of rest api #190

Open
MikeyBeez opened this issue Jun 14, 2024 · 3 comments

Comments

@MikeyBeez

Does ollama-python have less functionality than the REST API? If so, you should say so at the top of the documentation. I want to send embedded context with my prompt, but it isn't working, and the documentation isn't helping. I can find the answer in the REST API docs, but not here, so I guess I'll switch; I just wasted a lot of time here first. I program in Python, so I'd prefer this library, but if I need to use the REST API directly, I will.

@mxyng
Collaborator

mxyng commented Jun 14, 2024

Can you expand on what you mean by "embedded context"? This Python client library aims to implement the complete Ollama API. If that isn't the case, it would be helpful to know what's missing so it can be added.

@MikeyBeez
Author

MikeyBeez commented Jun 15, 2024

I was trying to send embeddings as additional context with my prompt for generation. The REST API has a context entry in the JSON, but I didn't see a corresponding argument in the Python function signature. I also couldn't find a keep_alive argument in the Python docs. If you've implemented every feature of the REST API, perhaps it's just your documentation that is missing features; you might compare it against the REST API docs and add anything that's missing. BTW, thank you for all your hard work. I love Ollama. You have made a really important contribution to all the people of the world, whether they realize it or not. Cheers!
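For reference, the two REST fields mentioned above can be sketched as a request payload for /api/generate. The build_generate_payload helper below is hypothetical and purely illustrative, not part of ollama-python; the field names mirror the REST API docs.

```python
import json

# `context` carries the token state returned by a previous /api/generate
# response, so the next generation sees that history; `keep_alive` controls
# how long the model stays loaded after the call.
def build_generate_payload(model, prompt, context=None, keep_alive=None):
    payload = {"model": model, "prompt": prompt, "stream": False}
    if context is not None:
        payload["context"] = context  # token IDs from a prior response
    if keep_alive is not None:
        payload["keep_alive"] = keep_alive  # e.g. "5m"
    return payload

payload = build_generate_payload(
    "llama3",
    "And why is the sky red at sunset?",
    context=[1, 2, 3],  # placeholder: use the `context` field of the previous reply
    keep_alive="5m",
)
print(json.dumps(payload))
# With a running server this would be posted as:
# requests.post("http://localhost:11434/api/generate", json=payload)
```

The same payload shape works through the raw REST endpoint even when the client wrapper's signature is unclear.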

@jmorzeck

jmorzeck commented Jul 24, 2024

One thing that I am missing (apologies if I just didn't find it elsewhere) is the ability to pass a list of functions (tools) to the client. This is what works perfectly when calling the API:

import requests

model = "llama3.1:70b"

tools = [
    {
        "type": "function",
        "function": {
            "name": "retrieve_payment_status",
            "description": "Get payment status of a transaction",
            "parameters": {
                "type": "object",
                "properties": {
                    "transaction_id": {
                        "type": "string",
                        "description": "The transaction id."
                    }
                },
                "required": ["transaction_id"]
            }
        }
    },
    {
        "type": "function",
        "function": {
            "name": "retrieve_payment_date",
            "description": "Get payment date of a transaction",
            "parameters": {
                "type": "object",
                "properties": {
                    "transaction_id": {
                        "type": "string",
                        "description": "The transaction id."
                    }
                },
                "required": ["transaction_id"]
            }
        }
    }
]

messages = [
    {
        "content": "What is the status and payment date of my transaction T1001?",
        "role": "user"
    }
]

data = {
    "messages": messages,
    "tools": tools,
    "model": model,
    "temperature": 0.3
}

response = requests.post('http://localhost:11434/v1/chat/completions',
                         headers={"Content-Type": "application/json"},
                         json=data)

With this, I get a response with the different tool choices and their parameters.
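To show what "tool choices and their parameters" looks like in practice, here is a sketch of pulling tool calls out of the response. The structure follows the OpenAI chat-completions schema that the /v1/chat/completions endpoint mirrors; the sample payload is illustrative, not captured server output.

```python
import json

# Illustrative response shape: the assistant replies with tool_calls
# instead of text content when it decides a function should be invoked.
sample = {
    "choices": [{
        "message": {
            "role": "assistant",
            "content": "",
            "tool_calls": [{
                "function": {
                    "name": "retrieve_payment_status",
                    # arguments arrive as a JSON-encoded string
                    "arguments": json.dumps({"transaction_id": "T1001"}),
                }
            }],
        }
    }]
}

def extract_tool_calls(response):
    """Return (name, parsed-arguments) pairs for each requested tool call."""
    calls = response["choices"][0]["message"].get("tool_calls", [])
    return [(c["function"]["name"], json.loads(c["function"]["arguments"]))
            for c in calls]

print(extract_tool_calls(sample))
```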

However, the client seems to lack a way to pass that list of tools. I was thinking of something like this:

ollama_client = Client(host='http://localhost:11434')
response = ollama_client.chat(model=model, messages=messages, tools=tools)

Is there a way to do this without passing the tools into the system prompt?
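Assuming a client version whose chat() does forward a tools list as sketched above, the remaining work is executing whichever tool the model picks. The dispatch pattern below is a hypothetical local helper, not part of the ollama library; the two stand-in functions echo the tool definitions from the earlier payload.

```python
# Stand-ins for real lookups, matching the tool schemas defined above.
def retrieve_payment_status(transaction_id):
    return {"status": "Paid"}

def retrieve_payment_date(transaction_id):
    return {"date": "2024-07-01"}

AVAILABLE_TOOLS = {
    "retrieve_payment_status": retrieve_payment_status,
    "retrieve_payment_date": retrieve_payment_date,
}

def dispatch(tool_calls):
    """Run each tool call the model requested and collect the results."""
    results = []
    for call in tool_calls:
        fn = AVAILABLE_TOOLS[call["name"]]
        results.append(fn(**call["arguments"]))
    return results

# With a running server, the calls would come from the chat response:
#   response = ollama_client.chat(model=model, messages=messages, tools=tools)
print(dispatch([{"name": "retrieve_payment_status",
                 "arguments": {"transaction_id": "T1001"}}]))
```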
