semantic-kernel
Python: #6499 Mistral AI Function Calling
Motivation and Context
- Why is this change required? To enable Mistral models with Semantic Kernel; issue #6499 in the backlog asks for a MistralAI connector.
- What problem does it solve? It solves the problem that Semantic Kernel is not yet integrated with MistralAI.
- What scenario does it contribute to? Using a connector other than HuggingFace, OpenAI, or AzureOpenAI. When users want to use Mistral, they can now integrate it easily.
- If it fixes an open issue, please link to the issue here. #6499
Description
The changes are modeled on the open_ai connector; I tried to stay as close as possible to its structure. For the integration I added the Mistral Python package to the repository.
After integrating ChatCompletion and Embeddings, I would now like to bring in FunctionCalling.
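As a rough illustration, function calling with the new connector might be used like the sketch below. This assumes the connector mirrors the OpenAI connector's surface; the module path, `MistralAIChatCompletion`, `MistralAIChatPromptExecutionSettings`, and the `FunctionChoiceBehavior` wiring are assumptions, not the final API.

```python
# Rough usage sketch; class/module names are assumptions modeled on the OpenAI connector.
import asyncio

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
from semantic_kernel.connectors.ai.mistral_ai import (
    MistralAIChatCompletion,
    MistralAIChatPromptExecutionSettings,
)
from semantic_kernel.contents import ChatHistory


async def main() -> None:
    kernel = Kernel()
    # The API key is assumed to come from the MISTRALAI_API_KEY environment variable.
    chat = MistralAIChatCompletion(ai_model_id="mistral-large-latest")
    kernel.add_service(chat)

    # Allow the model to invoke any function registered on the kernel.
    settings = MistralAIChatPromptExecutionSettings(
        function_choice_behavior=FunctionChoiceBehavior.Auto()
    )

    history = ChatHistory()
    history.add_user_message("What is the weather in Paris right now?")

    results = await chat.get_chat_message_contents(
        chat_history=history, settings=settings, kernel=kernel
    )
    print(results[0])


if __name__ == "__main__":
    asyncio.run(main())
```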
What is integrated so far:
- [X] MistralAI FunctionCalling
- [X] MistralAI FunctionCalling Streaming (streaming sketch after this list)
- [x] Extended Testing including Unit Testing
- [x] Integration Tests with FunctionCalling
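The streaming path could look like the following under the same assumptions as the sketch above (names are illustrative, not final):

```python
# Streaming sketch; same assumed names as the non-streaming example above.
import asyncio

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
from semantic_kernel.connectors.ai.mistral_ai import (
    MistralAIChatCompletion,
    MistralAIChatPromptExecutionSettings,
)
from semantic_kernel.contents import ChatHistory


async def stream_reply() -> None:
    kernel = Kernel()
    chat = MistralAIChatCompletion(ai_model_id="mistral-large-latest")
    kernel.add_service(chat)

    settings = MistralAIChatPromptExecutionSettings(
        function_choice_behavior=FunctionChoiceBehavior.Auto()
    )
    history = ChatHistory()
    history.add_user_message("Summarize today's weather in Paris.")

    # Chunks are yielded as lists of streaming message content while any
    # tool calls are resolved by the function-calling loop.
    async for chunks in chat.get_streaming_chat_message_contents(
        chat_history=history, settings=settings, kernel=kernel
    ):
        for chunk in chunks:
            print(str(chunk), end="")
    print()


if __name__ == "__main__":
    asyncio.run(stream_reply())
```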
Some Notes
- Integration tests are skipped when the required environment variables are not set, so they do not break the current integration tests.
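For reference, the skip guard follows the usual pytest pattern; the exact environment-variable names below are assumptions:

```python
# Sketch of the skip guard; the exact variable names are assumptions.
import os

import pytest

# Only run MistralAI integration tests when the connector is configured.
mistral_ai_setup = bool(
    os.environ.get("MISTRALAI_API_KEY") and os.environ.get("MISTRALAI_CHAT_MODEL_ID")
)


@pytest.mark.skipif(not mistral_ai_setup, reason="MistralAI environment variables not set")
@pytest.mark.asyncio
async def test_mistral_ai_function_calling() -> None:
    ...  # call the connector with function calling enabled and assert on the tool result
```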
Contribution Checklist
- [X] The code builds clean without any errors or warnings
- [X] The PR follows the SK Contribution Guidelines and the pre-submission formatting script raises no violations
- [X] All unit tests pass, and I have added new tests where possible
- [X] I didn't break anyone :smile: