ireicht
I get the same error when using LMStudio for local inference:

```python
@CrewBase
class Crew1Test():
    llm = LLM(model="lm_studio/meta-llama-3-8b-instruct",
              api_key="fsdf",
              base_url="http://localhost:1234/v1",
              temperature=0.7)

    @agent
    def researcher(self) -> Agent:
        return Agent(
            config=self.agents_config['researcher'],
            ...
```
Thank you for your suggestion. Changing the argument to `llm = LLM(model="ollama/lm_studio/meta-llama-3-8b-instruct", ...)` results in this output:

```
Running the Crew
LLM value is already an LLM object
LLM...
```
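One thing that may be going wrong above: the model string now stacks two provider prefixes (`ollama/` in front of `lm_studio/`), and LiteLLM (which CrewAI uses under the hood) routes on a single provider prefix. Since LM Studio serves an OpenAI-compatible endpoint, a hedged alternative sketch is to route through LiteLLM's generic `openai/` provider with `base_url` pointing at the local server. The model name, port, and `api_key` value here are placeholders (LM Studio ignores the key); adjust them to whatever your LM Studio instance shows:

```python
from crewai import LLM, Agent
from crewai.project import CrewBase, agent


@CrewBase
class Crew1Test():
    # Single provider prefix only: "openai/<model>" tells LiteLLM to use
    # its generic OpenAI-compatible client against base_url. Stacking a
    # second prefix (e.g. "ollama/lm_studio/...") makes it try to resolve
    # the remainder as a model name under the wrong provider.
    llm = LLM(
        model="openai/meta-llama-3-8b-instruct",  # placeholder model id
        base_url="http://localhost:1234/v1",      # LM Studio default port
        api_key="not-needed",                     # dummy; LM Studio ignores it
        temperature=0.7,
    )

    @agent
    def researcher(self) -> Agent:
        return Agent(
            config=self.agents_config['researcher'],
            llm=self.llm,  # pass the LLM object, not a bare model string
        )
```

This is only a sketch and needs a running LM Studio server with the model loaded; whether it resolves the "LLM value is already an LLM object" message depends on how the crew is wired up elsewhere.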