
support functionary models

Open mudler opened this issue 1 year ago • 3 comments

Functionary models are trained to output function calls directly.

We should just provide a flag to turn off the BNF grammar generation we are currently doing and leave it to the LLM to answer appropriately.
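A minimal sketch of how such a switch could look in a model YAML recipe; the `no_grammar` flag and the model file name below are hypothetical illustrations of the proposal, not existing LocalAI options:

```yaml
# Hypothetical sketch: `no_grammar` is a proposed flag, not a current option.
# The idea is to skip constrained BNF grammar generation and let the model
# emit the function-call text on its own, as functionary models are trained to do.
name: functionary
parameters:
  model: functionary-small.gguf   # assumed local model file name
function:
  no_grammar: true                # proposed: do not constrain output with a generated grammar
```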

mudler avatar Feb 15 '24 19:02 mudler

It may also be necessary to provide a way to inject the tool definitions as a JSON string into the prompt template.

bdqfork avatar Feb 20 '24 03:02 bdqfork

> It may also be necessary to provide a way to inject the tool definitions as a JSON string into the prompt template.

That's already present: you can render the function definitions in the template by ranging over {{.Functions}}. See for example: https://github.com/mudler/LocalAI/blob/f9c75d487851749d3b382f64bb3d8a9bf52d94dd/embedded/models/hermes-2-pro-mistral.yaml#L23
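For reference, a trimmed sketch of what such a template section can look like in a model YAML, loosely modeled on the linked hermes-2-pro-mistral.yaml; the exact prompt text, the `toJson` helper, and the field names (`.Name`, `.Description`, `.Parameters`) are assumptions and should be checked against the linked file:

```yaml
# Sketch of a template that injects the tool definitions into the system prompt
# by ranging over {{.Functions}} (field names and helpers are assumptions,
# verify against the linked hermes-2-pro-mistral.yaml).
template:
  function: |
    <|im_start|>system
    You have access to the following tools:
    {{ range .Functions }}
    {"name": "{{ .Name }}", "description": "{{ .Description }}", "parameters": {{ toJson .Parameters }}}
    {{ end }}
    Respond with a JSON function call when appropriate.
    <|im_end|>
```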

mudler avatar Apr 18 '24 16:04 mudler

It would be nice to use functionary models. Do you already have a YAML recipe for them?

olariuromeo avatar Aug 26 '24 00:08 olariuromeo