llama-cpp-python
phi3 chat format
Hello,
Could you add the chat format for the Phi-3 models?
It is described here:
`<|user|>\nQuestion<|end|>\n<|assistant|>\nAnswer<|end|>\n`
Maybe something like this in llama_chat_format.py?
```python
@register_chat_format("phi3")
def format_phi3(
    messages: List[llama_types.ChatCompletionRequestMessage],
    **kwargs: Any,
) -> ChatFormatterResponse:
    _roles = dict(
        user="<|user|>\n",
        assistant="<|assistant|>\n",
    )
    _sep = "<|end|>\n"
    _messages = _map_roles(messages, _roles)
    _messages.append((_roles["assistant"], None))
    _prompt = _format_no_colon_single("", _messages, _sep)
    return ChatFormatterResponse(prompt=_prompt, stop=_sep)
```
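For reference, here is a standalone sketch of the prompt string the formatter above should produce, assuming the private helpers (`_map_roles`, `_format_no_colon_single`) behave as they do for the other single-separator formats in llama_chat_format.py (`build_phi3_prompt` is just an illustrative name, not part of the library):

```python
# Standalone sketch: reproduces the phi3 prompt layout without llama_cpp,
# so the expected output can be checked by hand.
def build_phi3_prompt(messages):
    roles = {"user": "<|user|>\n", "assistant": "<|assistant|>\n"}
    sep = "<|end|>\n"
    prompt = ""
    for m in messages:
        prefix = roles.get(m["role"])
        if prefix is None:
            continue  # other roles (e.g. system) ignored in this sketch
        prompt += prefix + m["content"] + sep
    # trailing assistant header so generation continues from there
    prompt += roles["assistant"]
    return prompt

print(repr(build_phi3_prompt([{"role": "user", "content": "Question"}])))
# '<|user|>\nQuestion<|end|>\n<|assistant|>\n'
```

With `stop="<|end|>\n"`, generation would then halt once the model emits the end-of-turn token, matching the template above.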