Example adding Entra Auth to OpenAI provider
Description
Adds a new attribute to the base LLM to allow handling non-API-key authentication models, and moves the OpenAI header logic to async so it can make calls to an IdP (Entra in this case); a rough sketch of the shape of the change is below.
Related to thread https://discord.com/channels/1108621136150929458/1131313996750917835/1251283613362683994
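For reference, a minimal sketch of the shape of the change, assuming a hypothetical `authType` field on the base LLM options and an async header path in the OpenAI provider. The names and the `DefaultAzureCredential` choice are illustrative, not the literal diff in this PR:

```typescript
// Illustrative only: assumes a hypothetical `authType` attribute and an async
// header method; not the exact code in this PR.
import { DefaultAzureCredential } from "@azure/identity";

type AuthType = "apikey" | "entra";

class OpenAIProvider /* extends BaseLLM */ {
  apiKey?: string;
  authType: AuthType = "apikey";

  // Header construction is async so it can round-trip to the IdP.
  async getHeaders(): Promise<Record<string, string>> {
    if (this.authType === "entra") {
      // DefaultAzureCredential resolves env vars, managed identity, az login, etc.
      const credential = new DefaultAzureCredential();
      const accessToken = await credential.getToken(
        "https://cognitiveservices.azure.com/.default",
      );
      return {
        "Content-Type": "application/json",
        Authorization: `Bearer ${accessToken.token}`,
      };
    }
    // Default API-key path (Azure OpenAI expects the `api-key` header).
    return {
      "Content-Type": "application/json",
      "api-key": this.apiKey ?? "",
    };
  }
}
```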
Note: I know supporting SSO is a hot topic, especially between the open release and a possible value-added release, so this implementation may not be the best one to reuse across all the products.
I know we kicked around doing this IdP call in config.ts as well, but the config load is synchronous for the moment, so that would need to become an async model. I also had issues filtering which model this code would be invoked on. If there were a per-LLM/model load hook (vs. just a global load), it might allow these environmental nuances to be staged in ~/.continue rather than in the extension itself, sort of a pre-commit-hook type of world.
So this code may not be the right thing to merge, but I figured it could be an example of one approach to this problem: rather than bespoke providers, a base LLM method for auth that could be driven from userspace and ideally just config.json (the sketch below illustrates the ~/.continue idea). One challenge in doing that might be module loading, since in my example @azure/identity is needed.
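To make the "staged in ~/.continue" idea concrete, a per-model load hook might look something like the following. Everything here is hypothetical, including the `ModelLoadHook` signature and the assumption that the core would await it before the first request for a given model; it also shows why module resolution matters, since the hook pulls in @azure/identity.

```typescript
// Hypothetical userspace hook dropped in ~/.continue; none of these names
// exist in the codebase today.
import { AccessToken, DefaultAzureCredential } from "@azure/identity";

// Imagined signature: the core would await this per model (not just globally)
// and merge the returned headers into the request.
export type ModelLoadHook = (
  modelTitle: string,
) => Promise<Record<string, string>>;

export const entraHook: ModelLoadHook = async (modelTitle) => {
  // Only touch the IdP for the models that actually need it.
  if (!modelTitle.startsWith("azure-")) {
    return {};
  }
  const token: AccessToken = await new DefaultAzureCredential().getToken(
    "https://cognitiveservices.azure.com/.default",
  );
  return { Authorization: `Bearer ${token.token}` };
};
```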
Checklist
- [x] The base branch of this PR is `dev`, rather than `main`
- [ ] The relevant docs, if any, have been updated or created
If this path is valuable, I have no problem amending the PR to include docs on using `authType` as control logic in a provider.
Testing
This assumes the OpenAI endpoint is Azure OpenAI and uses Entra for authentication. Ideally, though, it should allow end-to-end authentication using the browser path to get the access_token and refresh it if it expires within the session. This is similar to the AWS Bedrock case, except that instead of requiring the user to make sure the STS credentials are in their .aws path before calling it, the token can be grabbed dynamically. I did poke at mimicking this with azcli, but they obfuscate the access token now and more or less force you to use azcli to expose it.
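For the browser-based, end-to-end path described above, a minimal sketch with @azure/identity's `InteractiveBrowserCredential` and an in-session cache might look like this; the client ID, tenant ID, and redirect URI are placeholders, and the one-minute refresh window is an arbitrary choice:

```typescript
// Sketch of the browser flow with in-session refresh; placeholders, not real values.
import { AccessToken, InteractiveBrowserCredential } from "@azure/identity";

const SCOPE = "https://cognitiveservices.azure.com/.default";

const credential = new InteractiveBrowserCredential({
  clientId: "<app-registration-client-id>",
  tenantId: "<tenant-id>",
  redirectUri: "http://localhost:8400",
});

let cached: AccessToken | undefined;

// Pops the browser on first use, then reuses the token until it is within
// a minute of expiry, at which point it is fetched again.
export async function getEntraToken(): Promise<string> {
  if (!cached || cached.expiresOnTimestamp - Date.now() < 60_000) {
    cached = await credential.getToken(SCOPE);
  }
  return cached.token;
}
```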
@byjrack I'm impressed by how clean this change is. What you point out about the synchronous nature of config.ts and the need to include external dependencies are major reasons we've been looking for a long-term solution to SSO, rather than the situation we'd end up in by patching dozens of providers onto code that must be shipped to every client. The direction we're taking is to do it through our teams product, and we have something working today, though we've been testing it quietly in beta. I won't definitively rule out this PR quite yet, but can I email you? It would help to know precisely what your auth + LLM setup looks like. I'm confident that either way we can get this working for you.
I tried to make the change reusable across the base, but I don't love the if/then logic. You know it's a Pandora's box, because every IdP could need a custom package and possibly a framework (e.g., azcli requires Python and then all the requests stack). An OAuth client could be pretty reusable, but the "it depends" is in the back of my mind. A hooks model I think could work, but again, lots of corner cases; a rough idea of what that could look like is below.
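If it helps the discussion, the hooks idea could be as simple as a small registry keyed by `authType`, so each IdP lives in its own module instead of if/then branches in the base LLM. Again, every name here is made up:

```typescript
// Hypothetical auth-hook registry; nothing here exists in the codebase.
export interface AuthHook {
  id: string; // e.g. "entra", "okta", "aws-sts"
  // Implementations own their own caching/refresh and dependencies.
  getHeaders(): Promise<Record<string, string>>;
}

const registry = new Map<string, AuthHook>();

export function registerAuthHook(hook: AuthHook): void {
  registry.set(hook.id, hook);
}

export async function headersFor(
  authType?: string,
): Promise<Record<string, string>> {
  const hook = authType ? registry.get(authType) : undefined;
  return hook ? hook.getHeaders() : {};
}
```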
My email is on my profile. Nothing really unique in our space, but I can try to give as much info as I can.
Going to close this for the reasons mentioned above and in other conversations. If we return to this, we'll definitely look into something more extensible, like hooks.