llama-cpp-python

How to display a chat prompt after create_chat_completion

Open · DmitryDiTy opened this issue on Sep 10, 2024 · 0 comments

How can I see the prompt that was actually sent to the model, together with the generated text in the message content, after calling:

chat = llm.create_chat_completion(
    messages=messages,
    tools=tools,
)

print(chat)
{'id': 'chatcmpl-e2235484-1ccb-4eb7-a93b-e9e171fbb2ee',
 'object': 'chat.completion',
 'created': 1725972811,
 'model': 'model.gguf',
 'choices': [{'index': 0,
   'message': {'role': 'assistant',
    'content': 'Hello! ....'},
   'logprobs': None, 
   'finish_reason': 'stop'}],
 'usage': {'prompt_tokens': 40, 'completion_tokens': 173, 'total_tokens': 213}}
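
As the output above shows, create_chat_completion returns only the generated assistant message plus token counts; the rendered prompt text itself is not part of the response dict. One possible workaround (a sketch, not a built-in feature) is to re-render the same messages through the model's own chat template. This assumes the GGUF metadata exposes a tokenizer.chat_template entry via llm.metadata, and the extra template variables (bos_token, eos_token, etc.) vary by model, so adjust as needed:

from jinja2 import Template

from llama_cpp import Llama

llm = Llama(model_path="model.gguf", verbose=False)  # illustrative path

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]

chat = llm.create_chat_completion(messages=messages)

# Re-render the same messages through the chat template stored in the
# model's metadata to approximate the text that was fed to the model.
template_str = llm.metadata.get("tokenizer.chat_template")
if template_str is not None:
    prompt_text = Template(template_str).render(
        messages=messages,
        add_generation_prompt=True,
        bos_token="",  # some templates expect these; fill in per model
        eos_token="",
    )
    print(prompt_text)

# The generated text itself is in the response dict:
print(chat["choices"][0]["message"]["content"])

This only reproduces the prompt as faithfully as the template and the variables you pass to it (and it ignores any tool definitions injected by the chat handler); for an exact view you would need to capture the prompt inside the chat format handler itself.
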

DmitryDiTy · Sep 10, 2024