
Pass a list of functions/tools to client like in the API

jmorzeck opened this issue

Hi there, with more and more models supporting function calling now, I am missing a way to pass a list of functions (tools) to the client. This is what works perfectly when calling the API directly:

import requests

model = "llama3.1:70b"

tools = [
    {
        "type": "function",
        "function": {
            "name": "retrieve_payment_status",
            "description": "Get payment status of a transaction",
            "parameters": {
                "type": "object",
                "properties": {
                    "transaction_id": {
                        "type": "string",
                        "description": "The transaction id."
                    }
                },
                "required": ["transaction_id"]
            }
        }
    },
    {
        "type": "function",
        "function": {
            "name": "retrieve_payment_date",
            "description": "Get payment date of a transaction",
            "parameters": {
                "type": "object",
                "properties": {
                    "transaction_id": {
                        "type": "string",
                        "description": "The transaction id."
                    }
                },
                "required": ["transaction_id"]
            }
        }
    }
]

messages = [
    {
        "content": "What is the status and payment date of my transaction T1001?",
        "role": "user"
    }
]

data = {
    "messages": messages,
    "tools": tools,
    "model": model,
    "temperature": 0.3
}

response = requests.post('http://localhost:11434/v1/chat/completions',
                         headers={"Content-Type": "application/json"},
                         json=data)

With this, I get a response containing the suggested tool calls and their parameters.
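
For reference, this is roughly how the tool calls can be read out of that response. Purely illustrative: the field names follow the OpenAI-compatible schema served by the /v1 endpoint, and arguments arrive as a JSON string.

import json

result = response.json()
for tool_call in result["choices"][0]["message"].get("tool_calls", []):
    name = tool_call["function"]["name"]                   # e.g. "retrieve_payment_status"
    args = json.loads(tool_call["function"]["arguments"])  # e.g. {"transaction_id": "T1001"}
    print(name, args)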

However, I can't find a way to pass this list of tools through the Python client. I was thinking of something like this:

from ollama import Client

ollama_client = Client(host='http://localhost:11434')
response = ollama_client.chat(model=model, messages=messages, tools=tools)

Is there a way to do this without passing the tools into the system prompt? Apologies if this already exists and I missed it.

jmorzeck · Jul 24 '24

I think this was implemented in chat(...): #213 https://github.com/ollama/ollama-python/pull/213/files#diff-0f246a9c084dd2ef9d4a58c02fb818ac4a114c34877e558cac88ab351daae9eeR183

help(ollama.Client.chat) on the latest client main branch shows:

 |  chat(self, ..., tools: Optional[Sequence[ollama._types.Tool]] = None, ...
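
A minimal sketch of what that looks like with the tools and messages from the original post. The dict-style response access assumes the client version from around this time; newer releases return a typed response object, so adjust accordingly.

from ollama import Client

ollama_client = Client(host='http://localhost:11434')
response = ollama_client.chat(model=model, messages=messages, tools=tools)

# Each suggested call carries the function name and its arguments.
for tool_call in response['message'].get('tool_calls', []):
    print(tool_call['function']['name'], tool_call['function']['arguments'])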

anthonywu · Jul 31 '24

Closing this out with the recent tool-passing updates: https://ollama.com/blog/functions-as-tools
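
The linked post describes the newer style where plain Python functions can be passed as tools and the schema is generated from the signature and docstring. A rough, version-dependent sketch (the payment function here is a placeholder for illustration):

import ollama

def retrieve_payment_status(transaction_id: str) -> str:
    """Get payment status of a transaction."""
    # Placeholder implementation for the sketch.
    return "PAID"

response = ollama.chat(
    model='llama3.1',
    messages=[{'role': 'user', 'content': 'What is the status of my transaction T1001?'}],
    tools=[retrieve_payment_status],  # schema derived from the function itself
)

# Dispatch each suggested call back to the matching Python function.
available = {'retrieve_payment_status': retrieve_payment_status}
for call in response.message.tool_calls or []:
    fn = available[call.function.name]
    print(fn(**call.function.arguments))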

ParthSareen · Dec 19 '24