Python: Parameters are not recognized when connecting to OpenAI/OpenAPI Plugin
I have prepared an API server that is compliant with the ChatGPT plugin specification, as shown below.
http://0.0.0.0:5003/.well-known/ai-plugin.json
{
  "schema_version": "v1",
  "name_for_human": "TODO List (no auth)",
  "name_for_model": "todo",
  "description_for_human": "Manage your TODO list. You can add, remove and view your TODOs.",
  "description_for_model": "Plugin for managing a TODO list, you can add, remove and view your TODOs.",
  "auth": {
    "type": "none"
  },
  "api": {
    "type": "openapi",
    "url": "http://0.0.0.0:5003/openapi.yaml"
  },
  "logo_url": "http://0.0.0.0:5003/logo.png",
  "contact_email": "[email protected]",
  "legal_info_url": "http://example.com/legal"
}
http://0.0.0.0:5003/openapi.yaml
components:
  schemas:
    addTodoRequest:
      properties:
        todo:
          description: The todo to add to the list.
          type: string
      required:
      - todo
      type: object
    getTodosResponse:
      properties:
        todos:
          description: The list of todos.
          items:
            type: string
          type: array
      type: object
info:
  description: A plugin that allows the user to create and manage a TODO list using
    ChatGPT. If you do not know the user's username, ask them first before making
    queries to the plugin. Otherwise, use the username "global".
  title: TODO Plugin
  version: v1
openapi: 3.0.1
paths:
  /todos/{username}:
    delete:
      operationId: deleteTodo
      parameters:
      - description: The name of the user.
        in: path
        name: username
        required: true
        schema:
          type: string
      responses:
        '200':
          description: OK
      summary: Delete a todo from the list
    get:
      operationId: getTodos
      parameters:
      - description: The name of the user.
        in: path
        name: username
        required: true
        schema:
          type: string
      responses:
        '200':
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/getTodosResponse'
          description: OK
      summary: Get the list of todos
    post:
      operationId: addTodo
      parameters:
      - description: The name of the user.
        in: path
        name: username
        required: true
        schema:
          type: string
      requestBody:
        content:
          application/json:
            schema:
              $ref: '#/components/schemas/addTodoRequest'
        required: true
      responses:
        '200':
          description: OK
      summary: Add a todo to the list
servers:
- url: http://0.0.0.0:5003
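For completeness, the server behind these two files is minimal; a sketch along these lines reproduces my setup (FastAPI and uvicorn are illustrative choices here, not something the question depends on):

# Minimal TODO server sketch matching the manifest and spec above.
# FastAPI/uvicorn are illustrative, not part of the original setup.
import uvicorn
from fastapi import FastAPI
from fastapi.responses import FileResponse

app = FastAPI()
_TODOS: dict[str, list[str]] = {}

@app.get("/.well-known/ai-plugin.json")
async def plugin_manifest():
    return FileResponse("ai-plugin.json", media_type="application/json")

@app.get("/openapi.yaml")
async def openapi_spec():
    return FileResponse("openapi.yaml", media_type="text/yaml")

@app.get("/todos/{username}")
async def get_todos(username: str):
    return {"todos": _TODOS.get(username, [])}

@app.post("/todos/{username}")
async def add_todo(username: str, body: dict):
    _TODOS.setdefault(username, []).append(body["todo"])
    return "OK"

@app.delete("/todos/{username}")
async def delete_todo(username: str):
    # The spec defines no request body for delete, so this sketch clears the list.
    _TODOS.pop(username, None)
    return "OK"

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=5003)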
I registered this plugin with Semantic Kernel using the code below. The printed result is as follows:
from semantic_kernel.connectors.ai.open_ai import utils

await kernel.add_plugin_from_openai(
    plugin_name="example",
    plugin_url="http://0.0.0.0:5003/.well-known/ai-plugin.json",
    description="desc",
    execution_parameters=None,
)
r = utils.get_tool_call_object(kernel, filter={"exclude_plugin": []})
print(r)
↓
[{'type': 'function', 'function': {'name': 'example-deleteTodo', 'description': 'Delete a todo from the list', 'parameters': {'type': 'object', 'properties': {'path_params': {'description': '', 'type': 'string'}, 'query_params': {'description': '', 'type': 'string'}, 'headers': {'description': '', 'type': 'string'}, 'request_body': {'description': '', 'type': 'string'}}, 'required': ['path_params', 'query_params', 'headers', 'request_body']}}}, {'type': 'function', 'function': {'name': 'example-getTodos', 'description': 'Get the list of todos', 'parameters': {'type': 'object', 'properties': {'path_params': {'description': '', 'type': 'string'}, 'query_params': {'description': '', 'type': 'string'}, 'headers': {'description': '', 'type': 'string'}, 'request_body': {'description': '', 'type': 'string'}}, 'required': ['path_params', 'query_params', 'headers', 'request_body']}}}, {'type': 'function', 'function': {'name': 'example-addTodo', 'description': 'Add a todo to the list', 'parameters': {'type': 'object', 'properties': {'path_params': {'description': '', 'type': 'string'}, 'query_params': {'description': '', 'type': 'string'}, 'headers': {'description': '', 'type': 'string'}, 'request_body': {'description': '', 'type': 'string'}}, 'required': ['path_params', 'query_params', 'headers', 'request_body']}}}]
The problem is the block below: the parameters are not recognized, even though a path parameter (username) is actually defined in openapi.yaml.
"parameters": {
"type": "object",
"properties": {
"path_params": {"description": "", "type": "string"},
"query_params": {"description": "", "type": "string"},
"headers": {"description": "", "type": "string"},
"request_body": {"description": "", "type": "string"},
},
"required": ["path_params", "query_params", "headers", "request_body"],
},
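For comparison, given the spec above I would have expected the operation's own parameters to surface, e.g. something like this for addTodo (a hand-written expectation, not actual output):

"parameters": {
    "type": "object",
    "properties": {
        "username": {"description": "The name of the user.", "type": "string"},
        "todo": {"description": "The todo to add to the list.", "type": "string"},
    },
    "required": ["username", "todo"],
},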
Am I missing some configuration to get openapi.yaml to be interpreted correctly?
Platform
- OS: macOS Sonoma 14.4.1
- IDE: VS Code
- Language: Python
- Source: pip 23.2.1, semantic-kernel==0.9.6b1
Additional context: None
@moonbox3 knows most about this one, he can comment next week!
thank you!
@SergeyMenshykh, do you know if this is supported today in the .NET version of Semantic Kernel?
Hi @yuichiromukaiyama, thanks for reaching out about this. When I run the Klarna kernel syntax example and get the tool call object JSON, I see the same:
'parameters': {
    'type': 'object',
    'properties': {
        'path_params': {'description': '', 'type': 'string'},
        'query_params': {'description': '', 'type': 'string'},
        'headers': {'description': '', 'type': 'string'},
        'request_body': {'description': '', 'type': 'string'}
    },
    'required': ['path_params', 'query_params', 'headers', 'request_body']
}
And when I continue to run the example, the correct arguments are sent, based on the supplied function arguments as well as the signature of the kernel function that was registered.
Are you experiencing missing function arguments when running the tool call? If you aren't explicitly passing the function arguments to the function call, and instead are having the model form the required arguments dynamically, it's doable but a bit trickier. You'll want to view this other Q&A thread, which may shed some light on how to handle it.
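For reference, explicitly passing the arguments yourself looks roughly like this (a sketch against the 0.9.x API, using the plugin/function names from this issue; each generic parameter is a JSON string matching the schema in the tool call object above):

from semantic_kernel.functions.kernel_arguments import KernelArguments

# "example" / "getTodos" are the names from this issue; each generic
# parameter is passed as a JSON string, per the tool call object above.
arguments = KernelArguments(
    path_params='{"username": "global"}',
    query_params="{}",
    headers="{}",
    request_body="{}",
)
result = await kernel.invoke(
    plugin_name="example",
    function_name="getTodos",
    arguments=arguments,
)
print(result)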
Please do let us know what other specific issues you are facing, aside from the tool call object looking off. I've looked at the tool call objects formed for other plugins and they look similar, and they ultimately work fine given that the required parameters are there.
Hi, @moonbox3, Thank you for answering.
I'm a bit confused, so let me clarify what I mean. In the following C# example, if I specify /.well-known/ai-plugin.json, then /openapi.yaml is automatically discovered. Even though I don't explicitly specify any arguments, the appropriate arguments are automatically built and passed to the plugin when it is executed.
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;
using Microsoft.SemanticKernel.Plugins.OpenApi;

// Load environments
var DEPLOYMENT_NAME = Environment.GetEnvironmentVariable("DEPLOYMENT_NAME");
var ENDPOINT = Environment.GetEnvironmentVariable("ENDPOINT");
var API_KEY = Environment.GetEnvironmentVariable("API_KEY");
var PLUGIN_MANIFEST_URL = Environment.GetEnvironmentVariable("PLUGIN_MANIFEST_URL"); // example: http://domain:port/.well-known/ai-plugin.json

// Create kernel
var builder = Kernel.CreateBuilder();
builder.Services.AddAzureOpenAIChatCompletion(
    deploymentName: DEPLOYMENT_NAME,
    endpoint: ENDPOINT,
    apiKey: API_KEY
);
var kernel = builder.Build();

// Add the TODO plugin using the plugin manifest URL
#pragma warning disable SKEXP0042
var handler = new HttpClientHandler { CheckCertificateRevocationList = false };
var client = new HttpClient(handler: handler);
await kernel.ImportPluginFromOpenAIAsync(
    "TodoPlugin",
    new Uri(PLUGIN_MANIFEST_URL),
    new OpenAIFunctionExecutionParameters(httpClient: client)
);
#pragma warning restore SKEXP0042

// Get chat completion service
var chatCompletionService = kernel.GetRequiredService<IChatCompletionService>();

// Start the conversation
ChatHistory history = [];
while (true)
{
    // Get user input
    Console.Write("User > ");
    history.AddUserMessage(Console.ReadLine()!);

    // Enable auto function calling
    OpenAIPromptExecutionSettings openAIPromptExecutionSettings = new()
    {
        ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
    };
    var result = chatCompletionService.GetStreamingChatMessageContentsAsync(
        history,
        executionSettings: openAIPromptExecutionSettings,
        kernel: kernel);

    // Result
    string fullMessage = "";
    var first = true;
    await foreach (var content in result)
    {
        if (content.Role.HasValue && first)
        {
            Console.Write("Assistant > ");
            first = false;
        }
        Console.Write(content.Content);
        fullMessage += content.Content;
    }
    Console.WriteLine();
    history.AddAssistantMessage(fullMessage);
}
If I want to achieve the same thing in Python, I think it would look something like the code below. Am I misunderstanding this usage? The issue with this approach is as follows.
- I need to execute the function call beforehand, but because the result of utils.get_tool_call_object doesn't include the parameters or request body from openapi.yaml, I can't build the arguments before making the call.
await kernel.add_plugin_from_openai(
    plugin_name="example",
    plugin_url="https://chatgpt-plugin-example.sb-presales.com/.well-known/ai-plugin.json",
    description="desc",
    execution_parameters=None,
)

chat_history = ChatHistory()

tools = utils.get_tool_call_object(kernel, filter={"exclude_plugin": []})

settings = kernel.get_service("default").get_prompt_execution_settings_class()(
    service_id="default"
)
settings.tool_choice = "auto"
settings.tools = tools  # <-- Issue: at this point, parameters and request body are not defined.

chat_history.add_user_message(content="add task 'cleaning'")

res = await service.complete_chat(chat_history=chat_history, settings=settings)
chat_history.add_message(res[0])

await service._process_tool_calls(
    result=res[0],
    kernel=kernel,
    chat_history=chat_history,
    arguments=KernelArguments(),
)

return chat_history
I just found the following example. Do you need to define the arguments yourself in the Python version?
https://github.com/moonbox3/semantic-kernel/blob/2e315c5df0a90b8ff7960ab07d4d58e1d7e8d425/python/samples/kernel-syntax-examples/openapi_example/openapi_client.py
Hello. Since a new version of Semantic Kernel has been released, I tried it out. However, I encountered the following error. (I think it's great that function calling can be implemented with such simple syntax.)
await kernel.add_plugin_from_openai(
    plugin_name="Example",
    plugin_url="https://www.klarna.com/.well-known/ai-plugin.json",
)
settings.tool_choice = "auto"
settings.function_call_behavior = FunctionCallBehavior.AutoInvokeKernelFunctions()

history = ChatHistory()
history.add_user_message("Please call example")

response = await service.complete_chat(
    chat_history=history,
    settings=settings,
    kernel=kernel,
    arguments=KernelArguments(settings=settings),
)
content = response[0]
print(content)
↓
Error occurred while invoking function productsUsingGET: Expecting value: line 1 column 1 (char 0)
Error occurred while invoking function Example-productsUsingGET
Traceback (most recent call last):
  File "/.venv/lib/python3.11/site-packages/semantic_kernel/functions/kernel_function.py", line 191, in invoke
    return await self._invoke_internal(kernel, arguments)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/.venv/lib/python3.11/site-packages/semantic_kernel/functions/kernel_function_from_method.py", line 103, in _invoke_internal
    result = await result
             ^^^^^^^^^^^^
  File "/.venv/lib/python3.11/site-packages/semantic_kernel/connectors/openapi_plugin/openapi_manager.py", line 353, in run_openapi_operation
    json.loads(path_params) if isinstance(path_params, str) else path_params if path_params else None
    ^^^^^^^^^^^^^^^^^^^^^^^
  File "/.pyenv/versions/3.11.7/lib/python3.11/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/.pyenv/versions/3.11.7/lib/python3.11/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/.pyenv/versions/3.11.7/lib/python3.11/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/.venv/lib/python3.11/site-packages/semantic_kernel/connectors/ai/open_ai/services/open_ai_chat_completion_base.py", line 427, in _process_tool_call
    func_result = await kernel.invoke(**func.split_name_dict(), arguments=args_cloned)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/.venv/lib/python3.11/site-packages/semantic_kernel/kernel.py", line 280, in invoke
    raise KernelInvokeException(
semantic_kernel.exceptions.kernel_exceptions.KernelInvokeException: Error occurred while invoking function: 'Example-productsUsingGET'
I traced the cause of the error and found that it occurs because the path parameter is empty. As I understand it, this is because, by default, path parameters and the request body are not read from the OpenAPI spec, as you mentioned.
However, I believe it is difficult to inject arguments midway through the process when using FunctionCallBehavior.AutoInvokeKernelFunctions(). Is it possible that FunctionCallBehavior.AutoInvokeKernelFunctions does not support OpenAI-style plugins?
Hi @yuichiromukaiyama, we have a fix for this, and will release a new package today.
@moonbox3
That's great news! (I've been struggling with this implementation for over a week.) I'll try it out once it's released.
It worked perfectly as expected! Thank you for this wonderful release!
# sk-0.9.9b1
async def newFunctionCall():
    await kernel.add_plugin_from_openai(
        plugin_name="Example",
        plugin_url="https://chatgpt-plugin-example.sb-presales.com/.well-known/ai-plugin.json",
    )
    settings.tool_choice = "auto"
    settings.function_call_behavior = FunctionCallBehavior.AutoInvokeKernelFunctions()

    history = ChatHistory()
    history.add_user_message("please get my todos")
    # history.add_user_message("please set todo -> cleaning")

    response = await service.get_chat_message_contents(
        chat_history=history,
        settings=settings,
        kernel=kernel,
        arguments=KernelArguments(settings=settings),
    )
    content = response[0]
    print(content)
Hi @yuichiromukaiyama, thanks so much for testing it out and for the feedback. As you can see, we did a major overhaul of the openapi_manager to make these improvements. Please let us know if anything else comes up in the future and we will have a look.
Understood! I will start testing this feature next week, and if I notice anything, I will file an issue.