
Local LLMs without Ollama: using vLLM or Hugging Face

rajeshkochi444 opened this issue 1 year ago · 1 comment

Hi,

Can we use local LLMs through vLLM or Hugging Face without using Ollama?

Thanks, Rajesh

rajeshkochi444 · Mar 30 '24
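One common route (a sketch, not an official crewAI answer): vLLM can expose an OpenAI-compatible HTTP API, so any OpenAI-style client, including crewAI configured with a custom base URL, can talk to it without Ollama. The model name, port, and endpoint below are assumptions for illustration; the server would be started with something like `vllm serve mistralai/Mistral-7B-Instruct-v0.2`, which by default listens on `http://localhost:8000/v1`.

```python
import json
import urllib.request

# Assumed local endpoint of vLLM's OpenAI-compatible server
# (default when launched via `vllm serve <model>`).
BASE_URL = "http://localhost:8000/v1"


def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style /chat/completions request for a local vLLM server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # vLLM accepts any token unless the server was started with --api-key
            "Authorization": "Bearer not-needed",
        },
        method="POST",
    )


req = build_chat_request("mistralai/Mistral-7B-Instruct-v0.2", "Hello!")
# Actually sending the request requires the vLLM server to be running:
# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

Because the API surface matches OpenAI's, pointing crewAI's LLM configuration (or any OpenAI SDK client) at the same base URL should work the same way; Hugging Face's text-generation-inference server similarly offers an OpenAI-compatible chat endpoint in recent versions, though the exact flags differ.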

I'm also interested in vLLM connections.

FBR65 · Jun 18 '24

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

github-actions[bot] · Sep 04 '24

This issue was closed because it has been stale for 5 days with no activity.

github-actions[bot] · Sep 10 '24