
.Net: Bug: Ollama with multiple tool calls concurrently

Open · dotnetKyle opened this issue 6 months ago · 2 comments

Bug

An LLM is calling multiple tools in an array, and semantic-kernel is printing the JSON as text instead of invoking the functions.

The LLM I am using is Devstral running on Ollama, and I am using the OpenAI chat completion connector with it.

I believe this could be an Ollama Devstral problem and not a semantic-kernel problem, but I want to ask to be sure. Here is the format of the assistant's actual "tool call":

Example 1

 [
    {
        "name": "plugin-Get_File_List",
        "arguments": {
            "path": "C:\\Users\\<username>\\source\\repos\\<FolderName>",
            "skip": 0,
            "take": 25
        }
    },
    {
        "name": "plugin-Directory_Search",
        "arguments": {
            "path": "C:\\Users\\<username>\\source\\repos\\<FolderName>",
            "searchPattern": "README.md",
            "skip": 0,
            "take": 1
        }
    },
    {
        "name": "plugin-ReadFileText",
        "arguments": {
            "path": "C:\\Users\\<username>\\source\\repos\\<FolderName>\\README.md"
        }
    }
]

Open AI Docs

I believe the above format might be incorrect; see the 'Output' section in the OpenAI Platform Docs - Function Calling. The docs say it should look more like this:

[{
    "type": "function_call",
    "id": "fc_12345xyz",
    "call_id": "call_4567xyz",
    "name": "search_knowledge_base",
    "arguments": "{\"query\":\"What is ChatGPT?\",\"options\":{\"num_results\":3,\"domain_filter\":null,\"sort_by\":\"relevance\"}}"
}]

So is my Devstral model sending bad JSON, or should the semantic-kernel handle this?
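Note that the docs snippet above is the output shape of the newer Responses API (`"type": "function_call"`, `call_id`). Since I am going through the OpenAI chat completion connector, the relevant wire format is the chat completions one, where a correct call is carried in the message's `tool_calls` array (with `arguments` as a JSON-encoded string) and `content` stays empty. A hedged sketch of that shape, with placeholder ids and a made-up path:

```json
{
  "finish_reason": "tool_calls",
  "message": {
    "content": null,
    "tool_calls": [
      {
        "id": "call_abc123",
        "type": "function",
        "function": {
          "name": "plugin-Get_File_List",
          "arguments": "{\"path\":\"C:\\\\temp\",\"skip\":0,\"take\":25}"
        }
      }
    ]
  }
}
```

This matches the working OTel log shown later in this thread; the broken responses instead put the whole array into `message.content` with `"finish_reason":"stop"`.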

Example 2

Here is another example that happens often:

[{"name":"FileSystemTools_Get_File_List","arguments":{"path":"C:\\Users\\<username>\\source\\repos\\<FolderName>","skip":0,"take":30}}]

Here is the OTel log of the response from Ollama:

info: Microsoft.Extensions.AI.OpenTelemetryChatClient[1]
      {"finish_reason":"stop","index":0,"message":{"content":"[{\u0022name\u0022:\u0022FileSystemTools_Get_File_List\u0022,\u0022arguments\u0022:{\u0022path\u0022:\u0022C:\\\\Users\\\\*******\\\\source\\\\repos\\\\*******\u0022,\u0022skip\u0022:0,\u0022take\u0022:30}}]"}}

To Reproduce

Steps to reproduce the behavior:

  1. Download Ollama Devstral LLM
  2. In the system prompt, tell the LLM it has access to tools and encourage it to run as many tools as it needs.

Expected behavior

// setup:
// local, so no api key
this.Services.AddOpenAIChatClient("devstral", new Uri(...));
...
// later after .Build()
this.kernel = this.Services.GetRequiredService<Kernel>();

// agent
this.agent = new ChatCompletionAgent
{
    Name = "AgentName",
    Kernel = this.kernel,
    Arguments = new KernelArguments(new PromptExecutionSettings
    {
        FunctionChoiceBehavior = FunctionChoiceBehavior.None(
            null, 
            new FunctionChoiceBehaviorOptions
            {
                // I have tried changing these around, including keeping the defaults; unsure what these should be set to
                AllowConcurrentInvocation = true,
                AllowParallelCalls = false,
                AllowStrictSchemaAdherence = true
            })
    })
};
...

async Task InvokeChat(ChatHistory chat, ChatCompletionAgent agent)
{
  IAsyncEnumerable<AgentResponseItem<StreamingChatMessageContent>> response = 
      agent.InvokeStreamingAsync(chat, cancellationToken: cancellationToken);
  
  await foreach (StreamingChatMessageContent content in response)
  {
      foreach (StreamingKernelContent item in content.Items)
      {
          if (item is StreamingTextContent text)
          {
              // Section 1
              // if it's Text Content, simply write to console
              Console.Write(text.Text);
          }
          else if (item is StreamingFunctionCallUpdateContent funcCall)
          {
              // Section 2
              // if it's a tool call execute the tool call
              this.InvokeTool(funcCall);
          }
          else { ...

When I encounter this behavior, I would expect semantic-kernel to detect the tool call and route it to 'Section 2' above, where this.InvokeTool(...) is, but instead it is sent to the Console.Write(...) in 'Section 1'.
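Until the model emits proper `tool_calls`, one possible client-side fallback (my own sketch, not a semantic-kernel feature) is to buffer the streamed text for the turn and, if the complete message parses as a JSON array of name/arguments objects, route it to the tool-invocation path instead of printing it. `ToolCallTextFallback` and the tuple shape are hypothetical names I made up for illustration; under streaming you would need the fully accumulated text before parsing, since individual chunks are not valid JSON on their own.

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

public static class ToolCallTextFallback
{
    // Hypothetical helper (not an SK API): try to interpret assistant text
    // that is really a JSON array of {"name": ..., "arguments": {...}} objects.
    public static bool TryParseToolCallsFromText(
        string text, out List<(string Name, string ArgumentsJson)> calls)
    {
        calls = new List<(string, string)>();
        string trimmed = text.TrimStart();

        // Cheap pre-check so ordinary prose is not fed to the JSON parser.
        if (!trimmed.StartsWith("[")) return false;

        try
        {
            using JsonDocument doc = JsonDocument.Parse(trimmed);
            if (doc.RootElement.ValueKind != JsonValueKind.Array) return false;

            foreach (JsonElement el in doc.RootElement.EnumerateArray())
            {
                if (el.ValueKind != JsonValueKind.Object ||
                    !el.TryGetProperty("name", out JsonElement name) ||
                    !el.TryGetProperty("arguments", out JsonElement args) ||
                    name.ValueKind != JsonValueKind.String)
                {
                    return false;
                }
                // Keep arguments as raw JSON so the caller can bind them
                // to KernelArguments (or whatever its tool runner expects).
                calls.Add((name.GetString()!, args.GetRawText()));
            }
            return calls.Count > 0;
        }
        catch (JsonException)
        {
            return false;
        }
    }
}
```

This is a workaround, not a fix: it only recovers the degenerate format Devstral is emitting here, and it would misfire if the model legitimately answered with a JSON array containing `name`/`arguments` keys.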

dotnetKyle avatar Jul 27 '25 21:07 dotnetKyle

When I look at correct tool calls, the OTel logs look very different from what I showed above:

info: Microsoft.Extensions.AI.OpenTelemetryChatClient[1]
      {"finish_reason":"tool_calls","index":0,"message":{"tool_calls":[{"id":"call_rvreisdm","type":"function","function":{"name":"GitRepoTools_Repository","arguments":{"path":"C:\\Users\\******\\source\\repos\\******","skip":0,"take":30}}}]}}
info: Microsoft.SemanticKernel.KernelFunction[0]
      Function GitRepoTools-Repository invoking.
info: DevMcpClient.Plugins.GitRepoTools[0]
      Running Repository tool, "C:\Users\******\source\repos\******", (0, 30)
info: Microsoft.SemanticKernel.KernelFunction[0]
      Function GitRepoTools-Repository succeeded.
info: Microsoft.SemanticKernel.KernelFunction[0]
      Function GitRepoTools-Repository completed. Duration: 0.8164957s

Note the "finish_reason":"tool_calls".

dotnetKyle avatar Jul 27 '25 21:07 dotnetKyle

This issue is stale because it has been open for 90 days with no activity.

github-actions[bot] avatar Oct 28 '25 02:10 github-actions[bot]

This issue was closed because it has been inactive for 14 days since being marked as stale.

github-actions[bot] avatar Nov 17 '25 02:11 github-actions[bot]