Feat/responses api
What
Adds the ability to work with models through the Responses API.
OpenAI's most advanced interface for generating model responses. Supports text and image inputs, and text outputs. Create stateful interactions with the model, using the output of previous responses as input. Extend the model's capabilities with built-in tools for file search, web search, computer use, and more. Allow the model access to external systems and data using function calling.
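A minimal usage sketch of what this could look like from the caller's side. Note: the type and member names below (`CreateModelResponseQuery`, `responses.createResponse`, `outputText`) are assumptions for illustration and may differ from the actual API on this branch:

```swift
import OpenAI

// Hypothetical sketch: names are assumptions, not the confirmed API surface.
let openAI = OpenAI(apiToken: "YOUR_TOKEN")

// Build a simple text-input query (assumed query type).
let query = CreateModelResponseQuery(
    input: .textInput("Write a haiku about Swift."),
    model: .gpt4_o
)

// The Responses endpoint is exposed as its own "public client".
let response = try await openAI.responses.createResponse(query: query)
print(response.outputText) // assumed convenience accessor
```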
Why
There have already been requests for it: https://github.com/MacPaw/OpenAI/issues/298. And it seems that OpenAI is making it the default way to interact with models:
> If you're a new user, we recommend using the Responses API. (https://platform.openai.com/docs/guides/responses-vs-chat-completions)
Affected Areas
None of the existing public APIs have changed, but a minimum deployment target of iOS 13 (macOS 10.15) is introduced.
Internally, the OpenAI class is refactored by extracting "internal clients" so that they can be reused by other "public clients", such as the newly introduced ResponsesEndpoint.
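The extraction can be pictured roughly like this (a simplified sketch of the idea only; the actual internal types in the PR are likely named and shaped differently):

```swift
import Foundation

// Sketch of the refactoring idea, not the actual source.
// One internal client owns transport concerns (request execution,
// decoding, error mapping) so public clients don't duplicate them.
final class InternalClient {
    let configuration: OpenAI.Configuration
    init(configuration: OpenAI.Configuration) { self.configuration = configuration }

    func perform<Result: Decodable>(_ request: URLRequest) async throws -> Result {
        // URLSession execution, JSON decoding, error mapping would live here.
        fatalError("sketch only")
    }
}

// A public client reuses the internal client instead of owning networking itself.
public final class ResponsesEndpoint {
    private let client: InternalClient
    init(client: InternalClient) { self.client = client }
}

public final class OpenAI {
    public struct Configuration { /* token, host, timeouts, ... */ }
    public let responses: ResponsesEndpoint

    public init(configuration: Configuration) {
        let client = InternalClient(configuration: configuration)
        self.responses = ResponsesEndpoint(client: client)
    }
}
```

This keeps the existing public API intact while letting new endpoints share one transport layer.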
The generated code of the struct "ParametersPayload" has only an "additionalProperties" field, caused by missing info in the OpenAPI document.
@diegoeddy yes, great catch! Actually, there is already a solution to it implemented in https://github.com/MacPaw/OpenAI/pull/322, but I need to integrate it. Will let you know when done.
Hi @diegoeddy, it would be great if you could pull this branch to get the latest changes and see if it works for you.
The usage would look something like this:
```swift
let tools: [Tool] = [
    .functionTool(
        .init(
            type: "asd",
            name: "asd",
            parameters: .init(fields: [.const(1)]),
            strict: true
        )
    )
]
```
Hi @nezhyborets , I found that ResponseObject fails to decode with the errors below:

```
valueNotFound(OpenAI.Tool, Swift.DecodingError.Context(codingPath: [CodingKeys(stringValue: "tools", intValue: nil), _JSONKey(stringValue: "Index 0", intValue: 0)], debugDescription: "The oneOf structure did not decode into any child schema.", underlyingError: Optional(MultiError (contains 4 errors):
Error 1: [DecodingError: dataCorrupted - at CodingKeys(stringValue: "tools", intValue: nil)/_JSONKey(stringValue: "Index 0", intValue: 0)/CodingKeys(stringValue: "type", intValue: nil): Cannot initialize _TypePayload from invalid String value function (underlying error:
```
The OpenAI network response payload is here: response.json
@diegoeddy yes, sorry for that, fixing it right now
Hi @nezhyborets , I found another problem when debugging response streaming:

```
unknownEventType("response.function_call_arguments.delta")
unknownEventType("response.function_call_arguments.done")
```
How to fix: in `ModelResponseStreamEventType`,

```swift
case responseFunctionCallArgumentsDelta = "response.function_call.arguments.delta"
case responseFunctionCallArgumentsDone = "response.function_call.arguments.done"
```

should be changed to

```swift
case responseFunctionCallArgumentsDelta = "response.function_call_arguments.delta"
case responseFunctionCallArgumentsDone = "response.function_call_arguments.done"
```
@diegoeddy fixed both the Tool decoding in ResponseObject and the function_call_arguments event names, please try now. And thanks very much for trying it out and letting me know about these issues!
Have started implementing Function Tool usage in the Responses part of the Demo to see for myself if it works and is usable.
Added very basic Function Calling to the Demo. Seems to be working fine. Pushed to this branch. It would be much better to have some kind of helper to parse the arguments returned by the model, but that would be another task.
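Such a helper could look roughly like this: a small `Decodable`-based sketch that turns the JSON arguments string returned by the model into a caller-supplied type. All names here are hypothetical, not part of the library:

```swift
import Foundation

// Hypothetical helper: decode the model's JSON arguments string
// into any Decodable type matching the function tool's parameter schema.
enum FunctionArgumentsParser {
    static func parse<Arguments: Decodable>(
        _ json: String,
        as type: Arguments.Type = Arguments.self
    ) throws -> Arguments {
        let data = Data(json.utf8)
        return try JSONDecoder().decode(Arguments.self, from: data)
    }
}

// Usage: a struct mirroring an example function tool's parameters.
struct WeatherArguments: Decodable {
    let location: String
    let unit: String?
}

let args: WeatherArguments = try FunctionArgumentsParser.parse(
    "{\"location\": \"Kyiv\", \"unit\": \"celsius\"}"
)
// args.location is "Kyiv", args.unit is "celsius"
```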
I also haven't tried the multi-step flow, where the model is called again with the results of a function call. Will do later.
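That multi-step flow would presumably look something like the loop below. This is only a sketch: the query and output item names (`output`, `.functionCall`, `previousResponseId`, `.functionCallOutput`) are assumptions, and `runMyFunction` is a placeholder for the caller's own function dispatch:

```swift
// Hypothetical multi-step sketch: names are assumptions, not the confirmed API.
// 1. Send the user input together with the tool definitions.
var response = try await openAI.responses.createResponse(query: query)

// 2. If the model emitted a function call, execute the function locally.
for item in response.output {
    if case .functionCall(let call) = item {
        let result = try runMyFunction(named: call.name, argumentsJSON: call.arguments)

        // 3. Send the function output back, chaining via the previous
        //    response's id so the interaction stays stateful.
        let followUp = CreateModelResponseQuery(
            input: .functionCallOutput(callId: call.callId, output: result),
            model: query.model,
            previousResponseId: response.id
        )
        response = try await openAI.responses.createResponse(query: followUp)
    }
}
```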
Quality Gate passed

Issues
- 116 New issues
- 0 Accepted issues

Measures
- 0 Security Hotspots
- 0.0% Coverage on New Code
- 2.5% Duplication on New Code