Tanishq Tembhurne

Results 2 comments of Tanishq Tembhurne

Try this:

```python
llm = LLM(
    model="meta-llama/llama-3.2-1b-instruct:free",
    temperature=0.7,
    base_url="https://openrouter.ai/api/v1",
    api_key="your API key",
)
```

After this, pass `llm=llm` as a parameter to the Agent.
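For context, wiring the configured LLM into an Agent looks roughly like this. This is a minimal config sketch, not from the original comment: the `role`, `goal`, and `backstory` values are placeholders, and you would substitute your own OpenRouter key.

```python
from crewai import Agent, LLM

# OpenRouter-hosted model; CrewAI routes this through LiteLLM internally
llm = LLM(
    model="meta-llama/llama-3.2-1b-instruct:free",
    temperature=0.7,
    base_url="https://openrouter.ai/api/v1",
    api_key="your API key",  # replace with your actual OpenRouter key
)

# Pass the LLM instance directly to the Agent
agent = Agent(
    role="Assistant",                 # placeholder role
    goal="Answer user questions",     # placeholder goal
    backstory="A helpful assistant",  # placeholder backstory
    llm=llm,
)
```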

You don’t need the LangChain constructor, as CrewAI uses LiteLLM internally. Use the LLM class like this: `llm = LLM(model="deepseek-r1:1.5b", base_url="http://localhost:11434/")`