Add -u usage flag to `llm chat`
I really enjoy tossing -u into my normal llm prompts to get a good gut sense of cost. But `llm chat` doesn't support the -u flag. I'd expect total conversation cost to grow roughly quadratically with the number of turns, since each turn resends the entire conversation history as input tokens, but it would be cool to have a -u flag to actually confirm that.
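A quick back-of-the-envelope sketch of why the total would be quadratic (this is just illustrative arithmetic, not how llm actually accounts for tokens — the per-turn token count and the assumption that every request resends the full history are simplifications):

```python
def total_input_tokens(turns: int, tokens_per_turn: int) -> int:
    """If each turn adds ~tokens_per_turn tokens and every request
    resends the whole history, the k-th request sends k * tokens_per_turn
    input tokens, so the total is t + 2t + ... + nt = t*n*(n+1)/2, i.e.
    O(n^2) in the number of turns."""
    return sum(tokens_per_turn * k for k in range(1, turns + 1))

# Doubling the turn count roughly quadruples cumulative input tokens:
for n in (5, 10, 20):
    print(n, total_input_tokens(n, tokens_per_turn=100))
```

A per-turn -u readout would let you watch that cumulative growth live instead of estimating it.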
Implementation could be either a turn-by-turn usage readout or one final tally at the end of the chat.