
Two new wiki addition suggestions: configuring LLM providers, and local model quality/experiences

Open · JRBobDobbs opened this issue 2 years ago · 2 comments

A feature request for the wiki

  1. A section on configuring CrewAI with different LLM providers, as well as local model providers such as LM Studio, llama.cpp, etc. So far, I have been able to get local models working with Ollama, mistral-medium on the Mistral AI platform, and mistral-small with Anyscale. Nevertheless, I had to search through the Discord community to get hints on configuring the latter.

  2. A section on local models: which ones work, how well they work, which models to avoid, and so forth. In my limited testing so far, it's obvious some open source models punch way above their weight. A section like this could shorten the time to develop a useful implementation and potentially hasten adoption.
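Regarding the first suggestion, the hint that usually circulates on Discord is that many hosted providers and local servers (LM Studio, Anyscale, and recent llama.cpp builds among them) expose an OpenAI-compatible API, so pointing the standard OpenAI environment variables at them is often all the configuration CrewAI needs. A minimal sketch; the variable names follow the OpenAI/LangChain convention CrewAI relied on at the time, and the URLs, key, and model name below are placeholders, not verified settings:

```python
import os

# Placeholder values; swap in your provider's real endpoint, key, and model id.
# Some OpenAI-compatible base URLs (illustrative):
#   Anyscale:   https://api.endpoints.anyscale.com/v1
#   Mistral AI: https://api.mistral.ai/v1
#   LM Studio:  http://localhost:1234/v1  (its bundled local server)
os.environ["OPENAI_API_BASE"] = "http://localhost:1234/v1"
os.environ["OPENAI_API_KEY"] = "sk-placeholder"    # local servers usually accept any string
os.environ["OPENAI_MODEL_NAME"] = "mistral-small"  # provider-specific model id

# With these set before CrewAI is imported, its default OpenAI-style client
# should route requests to the configured endpoint instead of api.openai.com.
```

Providers that do not speak the OpenAI wire protocol (e.g. Ollama's native API) typically need an explicit LLM object passed to the agent instead, which is exactly the kind of per-provider detail the requested wiki section would cover.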

JRBobDobbs avatar Jan 07 '24 00:01 JRBobDobbs

@JRBobDobbs do you want to leave some hints on how to include llama.cpp? I am using the Python version.
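One way this is commonly done (not confirmed by the thread, so treat it as a sketch): the llama-cpp-python package ships an OpenAI-compatible server, so you can serve a local GGUF model and then talk to it like any OpenAI endpoint. The model path, port, and model name below are illustrative assumptions:

```python
import json
import urllib.request

# Start the server separately, e.g.:
#   pip install "llama-cpp-python[server]"
#   python -m llama_cpp.server --model ./models/mistral-7b.gguf
# It listens on http://localhost:8000 by default.
BASE_URL = "http://localhost:8000/v1"  # assumed default port

payload = {
    "model": "local-model",  # local servers typically ignore or echo this field
    "messages": [{"role": "user", "content": "Hello"}],
}
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer not-needed-locally",  # placeholder key
    },
)
# urllib.request.urlopen(req) would send the request once the server is up.
```

Because the endpoint is OpenAI-compatible, the same base URL can be handed to CrewAI via the usual OpenAI client configuration rather than raw HTTP; the raw request above just shows what is on the wire.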

ludar avatar Jan 09 '24 16:01 ludar

100%! We have a PR coming soon to add this to the new docs; we will get rid of the wiki in favor of a proper documentation website.

joaomdmoura avatar Jan 21 '24 03:01 joaomdmoura