Couldn't use Ollama along with CrewAI agents [Reason: Stop Sequence]

Nanthagopal-Eswaran opened this issue 1 year ago · 3 comments

Hi,

I recently went through the Multi AI Agent Systems with crewAI short course on the DeepLearning.AI platform. I was trying to run the "Content Writer Crew" example locally using Ollama/llama3.

I noticed that the crew was executing indefinitely and I had to interrupt the kernel to stop it.

I set up a LiteLLM proxy and captured the outgoing request. I then sent the same request directly to the LLM and found the issue. Request: [screenshot] Response: [screenshot]

If we don't include the stop sequence "\nObservation", generation stops at the model's default stop sequences. But when we add the crewAI stop sequence, it simply continues to generate.
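For anyone who wants to reproduce the test without the screenshots, here is a minimal sketch, assuming LiteLLM and a local Ollama serving llama3 (the prompt is made up; only the stop word mirrors what crewAI sends):

```python
# Minimal sketch, assuming LiteLLM is installed and Ollama is serving llama3
# locally; the prompt is a placeholder, the stop word is the one crewAI adds.
import litellm

response = litellm.completion(
    model="ollama/llama3",
    messages=[{"role": "user", "content": "Write one sentence about AI agents."}],
    stop=["\nObservation"],  # the extra stop sequence crewAI appends to requests
)
print(response.choices[0].message.content)
```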

I don't think this stop sequence is handled the same way across all models/providers, and there is also no mention of it in the prompt.

Could you look into this and enable support for Ollama as well?

Nanthagopal-Eswaran avatar May 19 '24 17:05 Nanthagopal-Eswaran

As a stopgap, can this be fixed by altering the stop sequence expected by the ollama MODELFILE?

jcoombes avatar May 19 '24 17:05 jcoombes

Thanks @jcoombes for the quick response.

As a temporary solution, I made the modification below and was able to execute the crew. But this only works for the ollama/llama3 model for now.

[screenshot of the modification]
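The screenshot is not preserved here; a rough sketch of one way to get the same effect (passing crewAI's stop word straight to the Ollama LLM via LangChain; the agent fields are placeholders, not the exact change from the screenshot):

```python
# Rough sketch: hand crewAI's "\nObservation" stop word to the Ollama LLM
# so generation stops where crewAI expects it to.
from langchain_community.llms import Ollama
from crewai import Agent

llm = Ollama(
    model="llama3",
    stop=["\nObservation"],  # crewAI's ReAct-style stop word
)

writer = Agent(
    role="Content Writer",            # placeholder values from the course example
    goal="Write engaging articles",
    backstory="An experienced technical writer.",
    llm=llm,
)
```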

Nanthagopal-Eswaran avatar May 19 '24 19:05 Nanthagopal-Eswaran

I get that this is an unexpected issue on the Ollama side, which should have handled both the requested stop sequence and the default stop sequences. I have also raised an issue there: https://github.com/ollama/ollama/issues/4524

In our case here, though, I couldn't work out why we need the stop sequence "\nObservation" at all, when there is no mention of it in the prompt.

Nanthagopal-Eswaran avatar May 19 '24 19:05 Nanthagopal-Eswaran

As a stopgap, can this be fixed by altering the stop sequence expected by the ollama MODELFILE?

I did not create a custom crewai Modelfile initially, and the crew seemed to run for a long time. Then I followed https://docs.crewai.com/how-to/LLM-Connections/#setting-up-ollama to create a custom model crewai-llama3, and it ran fine.
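For reference, the Modelfile from that guide looks roughly like this (the exact parameter values may differ from the current docs), registered with `ollama create crewai-llama3 -f ./Modelfile`:

```
# Sketch of the Modelfile from the crewAI Ollama guide; values are approximate.
FROM llama3

PARAMETER temperature 0.8
PARAMETER stop Result
```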

Does PARAMETER stop Result mean that crewAI will stop once it gets the result? Thanks

wgong avatar Jun 29 '24 23:06 wgong

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

github-actions[bot] avatar Aug 17 '24 12:08 github-actions[bot]

This issue was closed because it has been stalled for 5 days with no activity.

github-actions[bot] avatar Aug 23 '24 12:08 github-actions[bot]