Flowise
Loading a private model from the Hugging Face Hub
I have created a basic flow that connects an LLMChain to the Hugging Face Hub with a simple prompt. The flow works fine with public open-source models such as falcon-7b. However, when I load my fine-tuned private model from the Hugging Face Hub, asking a question returns the error: Task not found for this model
What would be the reason for this output?
If you try to use the Hugging Face Inference API outside of Flowise with your model, does it work?
Yes, the model works fine when loaded with the Hugging Face library in Python.
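Note that loading the weights locally in Python is a different code path from the hosted Inference API that Flowise calls. To test the same path Flowise uses, you can hit the Inference API endpoint directly; here is a minimal stdlib-only sketch, where MODEL_ID and HF_TOKEN are placeholders for your own private repo and access token:

```python
# Minimal sketch: call the hosted Hugging Face Inference API directly,
# the same endpoint Flowise's HuggingFace node talks to.
import json
import urllib.request

MODEL_ID = "your-username/your-private-model"  # placeholder repo id
HF_TOKEN = "hf_xxx"                            # placeholder access token

API_URL = f"https://api-inference.huggingface.co/models/{MODEL_ID}"

def query(prompt: str) -> dict:
    """POST a request to the hosted Inference API and return the JSON reply."""
    payload = json.dumps({"inputs": prompt}).encode("utf-8")
    req = urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {HF_TOKEN}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# If this returns "Task not found for this model", a likely cause is that
# the private repo's model card is missing a pipeline_tag (for example,
# text-generation), so the Inference API cannot tell which task pipeline
# to serve. Loading the weights locally with transformers bypasses that
# check, which would explain why the model still works in plain Python.
```

If the direct call reproduces the same "Task not found" error, the problem is on the Hub side (likely the model card metadata) rather than in Flowise; adding a pipeline_tag to the repo's README front matter is worth trying.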