.Net: Ollama LocalModels FunctionCalling Support
Update the Ollama Connector to allow function calling using Ollama's raw mode.
Reference:
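For context, a minimal sketch of the function-calling flow being requested, assuming the `Microsoft.SemanticKernel.Connectors.Ollama` package, a local Ollama server, and a tool-capable model; the model name and `TimePlugin` here are illustrative, not taken from this thread:

```csharp
using System.ComponentModel;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.Ollama;

var builder = Kernel.CreateBuilder();
builder.AddOllamaChatCompletion(
    modelId: "llama3.1",                       // assumed tool-capable local model
    endpoint: new Uri("http://localhost:11434"));
builder.Plugins.AddFromType<TimePlugin>();     // hypothetical plugin defined below
var kernel = builder.Build();

// Advertise kernel functions to the model and auto-invoke the ones it selects.
OllamaPromptExecutionSettings settings = new()
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

var chat = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory();
history.AddUserMessage("What time is it?");

var reply = await chat.GetChatMessageContentAsync(history, settings, kernel);
Console.WriteLine(reply.Content);

public sealed class TimePlugin
{
    [KernelFunction, Description("Returns the current date and time.")]
    public string CurrentDateTime() => DateTime.Now.ToString("R");
}
```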
As LLamaSharp supports tooling, will this bring changes to the overall Ollama Connector, or is it specific to Mistral?
Hello @RogerBarreto, is there any update on this issue?
@taha-ghadirian this feature is coming in the following week.
Thanks for the interest!
Thanks for your answer. I'm eagerly waiting for the Ollama connector. Does this implementation also support function calling as available in Azure AI and OpenAI?
Hey,
Any update on this functionality?
Thanks!
@RogerBarreto is there an ETA for #9488 to be included in a published package? I believe the current 1.28.0-alpha does not include it.
@muqeet-khan As this is already part of main, it will be available as soon as we publish our next release, which normally happens on Tuesdays or Wednesdays.
@RogerBarreto Hi Roger,
I have updated all the packages to version 1.29.0. Following the example code in Samples->Demo->OllamaFunctionCalling, chatCompletionService.GetChatMessageContentAsync works as expected and successfully calls the plugin. However, chatCompletionService.GetStreamingChatMessageContentsAsync behaves the same as before, returning a response like the one below without invoking function calling.
{"name": "CurrentDateTimePlugin", "parameters": {}}
I'm wondering if this is a current limitation?
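For reference, a minimal sketch of the streaming call described above, assuming the same kernel, settings, and history as in the earlier non-streaming sample:

```csharp
// With GetChatMessageContentAsync the plugin is invoked; with the streaming
// variant below, the raw tool-call JSON is reportedly streamed back instead
// of the function being called.
await foreach (var chunk in chat.GetStreamingChatMessageContentsAsync(
    history, settings, kernel))
{
    Console.Write(chunk.Content);
}
```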