
Add support for PaLM models, such as chat-bison and text-bison

Open ivano-donadi-ennova opened this issue 2 years ago • 2 comments

Hi,

I added PaLM models to the available LLMs in guidance. A quick overview can be found in the dedicated notebook in the llm folder.

New features:

  • a "context" alias for the system role
  • an "example" role containing input and output blocks for providing demonstration examples, including notebook formatting of examples
  • support for both library calls (through the Vertex AI SDK) and REST calls to Google's text generation and chat models, with or without streaming
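For reference, the new roles would appear in a guidance template roughly like this. This is an illustrative sketch only: the exact block names ("context", "example", "input", "output") are assumed from the role aliases described above, and the handlebars-style syntax follows guidance's existing role blocks.

```
{{#context~}}
You are a helpful assistant.
{{~/context}}

{{#example~}}
{{#input~}}What is the capital of France?{{~/input}}
{{#output~}}Paris{{~/output}}
{{~/example}}

{{#user~}}{{query}}{{~/user}}
{{#assistant~}}{{gen 'answer'}}{{~/assistant}}
```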

Issues:

  • the select tool requires tokenization of the prompt. However, to the best of my knowledge, there is no equivalent of tiktoken providing tokenizers for Google's models. Tokenization could still be done through a billable REST call to their tokenization API, but I would like community feedback on whether to implement this feature or whether there are other workarounds.

Tests:

  • I tried to replicate all tests done on OpenAI models on PaLM, with the exception of the select tool.

Please let me know what you think!

ivano-donadi-ennova avatar Sep 07 '23 14:09 ivano-donadi-ennova

Hi,

Google doesn't provide an open-source tokenizer library, so I think you should replace self._tokenizer with a RESTful API wrapper. The current code was originally copied from _openai.py:

        # tiktoken-based tokenizer setup copied from _openai.py;
        # there is no tiktoken equivalent for Google's models
        import tiktoken
        if encoding_name is None:
            encoding_name = tiktoken.encoding_for_model(model).name
        self._tokenizer = tiktoken.get_encoding(encoding_name)
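One way to keep the rest of guidance unchanged would be to hide the REST call behind the same encode/decode surface that tiktoken exposes. The sketch below is a minimal illustration, not Google's actual API: the transport is injected as a plain callable so the billable endpoint (whatever its real URL and response shape) can be swapped in or stubbed out, and responses are cached to avoid re-billing identical prompts.

```python
from typing import Callable, Dict, List

class RESTTokenizer:
    """tiktoken-like wrapper around a remote tokenization API (sketch).

    `call_api` is injected so tests can stub the network; a real
    implementation would POST the text to the billable tokenization
    endpoint and parse the token ids out of the response.
    """

    def __init__(self, call_api: Callable[[str], List[int]]):
        self._call_api = call_api
        self._cache: Dict[str, List[int]] = {}  # avoid re-billing identical prompts

    def encode(self, text: str) -> List[int]:
        if text not in self._cache:
            self._cache[text] = self._call_api(text)
        return self._cache[text]

    def decode(self, token_ids: List[int]) -> str:
        # This is the missing piece discussed below: the API returns
        # token counts/ids but offers no way to map ids back to text.
        raise NotImplementedError("no decoding endpoint is available")

# usage with a stubbed transport (bytes stand in for real token ids)
tok = RESTTokenizer(call_api=lambda s: list(s.encode("utf-8")))
ids = tok.encode("hi")  # [104, 105]
```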

xnohat avatar Sep 18 '23 03:09 xnohat

Hi, unfortunately Google's text embedding APIs do not seem to include a decoding option, which is required in the 'select' pipeline.
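To make the requirement concrete, here is a toy illustration (not guidance's actual implementation) of why select needs both directions of a tokenizer: options are encoded to ids, the model's scores live on ids, and the winning ids must be decoded back into text for the program's output.

```python
# toy vocabulary: id -> text piece
vocab = {0: "yes", 1: "no", 2: "maybe"}
encode = {piece: i for i, piece in vocab.items()}

def toy_select(options, logprob_of_id):
    ids = [encode[o] for o in options]      # encode: text -> ids
    best_id = max(ids, key=logprob_of_id)   # scoring happens on ids
    return vocab[best_id]                   # decode: ids -> text (the missing API)

print(toy_select(["yes", "no"], {0: -0.1, 1: -2.3}.get))  # prints "yes"
```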

ivano-donadi-ennova avatar Sep 18 '23 09:09 ivano-donadi-ennova