OpenAIChatTarget is not compatible with new Azure OpenAI endpoints
The new GA version of the Azure OpenAI API does not use deployments in the URL (it's just `/openai/v1`), the latest version of the official `OpenAI()` constructor accepts a callable for `api_key`, and Azure OpenAI no longer has an API version.
Could OpenAIChatTarget be adjusted to work with the new Azure OpenAI endpoints?
I was hoping I could do this:
```python
credential = azure.identity.DefaultAzureCredential()
token_provider = azure.identity.get_bearer_token_provider(
    credential, "https://cognitiveservices.azure.com/.default"
)
attack_llm = OpenAIChatTarget(
    model_name=os.environ["AZURE_AI_CHAT_DEPLOYMENT"],
    endpoint=os.environ["AZURE_AI_ENDPOINT"] + "openai/v1/",
    api_key=token_provider,
)
```
I don't want to use the Entra auth parameter because I ideally want more control over the credential (sometimes I use other credential classes).
We'll have someone look into it, but I see two different problems here.
- Shorter URL: Instead of `https://<resource>.openai.azure.com/openai/deployments/<deployment_name>/chat/completions`, it's the shorter form that doesn't include the deployment name; presumably the deployment needs to be sent in the request body instead. Note: OpenAI has been doing that forever and it works there, so we may already be supporting it, but we need to confirm.
- Auth: We currently support API key and Entra auth. You're trying to provide a custom token provider, which is definitely different. That said, we could potentially make it more generic and let you set your own token provider. CC @jsong468, who has recently looked into Entra.
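To make the URL difference concrete, here is a minimal sketch of the two endpoint shapes (the helper names are hypothetical and illustrative, not PyRIT or OpenAI SDK API):

```python
def legacy_azure_chat_url(resource: str, deployment: str, api_version: str) -> str:
    # Legacy Azure OpenAI shape: the deployment name is embedded in the
    # path and an api-version query parameter is required.
    return (
        f"https://{resource}.openai.azure.com"
        f"/openai/deployments/{deployment}/chat/completions"
        f"?api-version={api_version}"
    )


def v1_azure_chat_url(resource: str) -> str:
    # New GA shape: no deployment in the path and no api-version;
    # the deployment/model goes in the request body instead,
    # just like the standard OpenAI API's "model" field.
    return f"https://{resource}.openai.azure.com/openai/v1/chat/completions"
```

With the new shape, a request body would carry the deployment name the same way the standard OpenAI API carries `"model"`, which is why existing OpenAI support may already cover it.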
On the auth point: right now we use DefaultAzureCredential by default and retrieve a token from it. To clarify, do you want to be able to explicitly provide a credential like AzureCliCredential or EnvironmentCredential (as opposed to just the scope, i.e. https://cognitiveservices.azure.com/.default)?
Yes. Best practice for SDKs is to allow developers to bring their own credential. The official Python OpenAI SDK now accepts a token provider function as the api_key, so as long as OpenAIChatTarget passes that token provider through, we should be able to use it with Azure OpenAI deployments. I think OpenAIChatTarget may only accept a string currently? I don't recall the exact error message, but that's the most common issue with SDKs that wrap the OpenAI Python SDK: they don't allow the new flexibility of the api_key parameter.
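The flexibility being asked for can be sketched in a few lines (hypothetical names; this is not the actual OpenAI SDK or PyRIT internals): accept either a static string or a zero-argument callable, and resolve it per request so a token provider can refresh short-lived tokens.

```python
from typing import Callable, Union

# An api_key may be a static secret or a token provider such as the one
# returned by azure.identity.get_bearer_token_provider.
ApiKey = Union[str, Callable[[], str]]


def resolve_api_key(api_key: ApiKey) -> str:
    # Call the provider on every request so expiring bearer tokens stay
    # fresh; treat anything else as a static key.
    return api_key() if callable(api_key) else api_key
```

A wrapper that resolves `api_key` this way just before each request works unchanged with both plain keys and Entra token providers, which is the behavior the official SDK's callable `api_key` enables.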
Gotcha, we will look into this! Having flexible credential options is a work item in our queue :)
I like the auth provider as an issue and direction, but we're likely punting for this release. If any OSS contributor volunteers to take it, great - let us know! If not, the team will likely take it on in the next couple of months.