
Making the cost of interaction available as part of the core component.

Open guiramos opened this issue 2 years ago • 6 comments

Is your feature request related to a problem? Please describe.

No response

Describe the solution you'd like

When using open-interpreter as part of my Python project, I'd like to have access to the cost of each LLM interaction.

So, after each call to interpreter.chat there should be a way to extract the cost. The cost values are already visible in the logs, as shown in the 'Additional context' field below.
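Roughly, this is the kind of usage I have in mind (the `last_interaction_cost` attribute is purely illustrative, it does not exist today):

```python
from interpreter import interpreter

interpreter.chat("Summarize this CSV file")

# Hypothetical: expose the cost that currently only appears in the verbose logs
print(interpreter.last_interaction_cost)  # e.g. 0.03657 (USD)
```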

Describe alternatives you've considered

No response

Additional context

When verbose = True we can see in the logs:

final cost: 0.036570000000000005; prompt_tokens_cost_usd_dollar: 0.03579; completion_tokens_cost_usd_dollar: 0.00078

I think this is coming from litellm.

guiramos avatar Feb 02 '24 13:02 guiramos

I think this is a good idea. Cost should be an attribute of Core rather than the TUI.

The changes would be straightforward:

  • migrate interpreter/terminal_interface/utils/count_tokens.py to interpreter/core/utils/
  • update interpreter/terminal_interface/magic_commands.py imports

As @Notnaton pointed out, we can remove the dependency on tiktoken as well.

LiteLLM supports encoding text to allow cost_per_token calculations: https://docs.litellm.ai/docs/completion/token_usage
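For reference, a rough sketch of the LiteLLM helpers involved (based on the docs linked above; exact signatures and return shapes may vary between versions):

```python
import litellm

# Count tokens without depending on tiktoken directly
tokens = litellm.token_counter(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello, world"}],
)

# cost_per_token returns (prompt_cost_usd, completion_cost_usd) for the model
prompt_cost, completion_cost = litellm.cost_per_token(
    model="gpt-4",
    prompt_tokens=tokens,
    completion_tokens=0,
)
print(prompt_cost + completion_cost)
```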

@KillianLucas do you support this change?

MikeBirdTech avatar Feb 02 '24 16:02 MikeBirdTech

Also, if possible: Open Interpreter is defaulting to "gpt-4", which is the most expensive model. Can we provide the model as an argument?

@Arrendy @KillianLucas

(screenshot attached)
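For context, this is roughly what I'd like to be able to do (assuming `interpreter.llm.model` is the right attribute in the current version):

```python
from interpreter import interpreter

# Override the default model before chatting; gpt-4 is the current default
interpreter.llm.model = "gpt-3.5-turbo"
interpreter.chat("Hello")
```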

guiramos avatar Feb 03 '24 21:02 guiramos

We already have a cost limit in llm.py, don't we?

self.max_budget = None
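i.e. something like this should already work (assuming the attribute is exposed as `interpreter.llm.max_budget` and forwarded to LiteLLM):

```python
from interpreter import interpreter

# Cap total spend; LiteLLM's budget tracking should stop calls past the limit
interpreter.llm.max_budget = 0.05  # USD
interpreter.chat("Hello")
```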

Notnaton avatar Feb 12 '24 11:02 Notnaton

@Notnaton, a cost limit is different from how much the last interaction with the chat actually cost. Programmatically, that figure is basically unreachable after you call interpreter.chat().

guiramos avatar Feb 12 '24 14:02 guiramos

interpreter.llm.max_budget should be reachable...

I will have a look at it when I come home in an hour. If I'm misunderstanding something please clarify.

Notnaton avatar Feb 12 '24 15:02 Notnaton

interpreter.llm.max_budget is reachable; what is not reachable is the cost of the last interaction with the chat.

We can see in the logs a line like this:

final cost: 0.036570000000000005; prompt_tokens_cost_usd_dollar: 0.03579; completion_tokens_cost_usd_dollar: 0.00078

But you can't access that data from outside the 'interpreter' instance.
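As a sketch of what I'm after, Core could compute the figure via LiteLLM and keep it on the instance (the class and attribute names here are illustrative, not current behaviour):

```python
import litellm


class CostTrackingLLM:
    """Illustrative sketch: wrap the completion call and record its cost."""

    def __init__(self, model="gpt-4"):
        self.model = model
        self.last_interaction_cost = None  # hypothetical attribute

    def complete(self, messages):
        response = litellm.completion(model=self.model, messages=messages)
        # completion_cost returns the USD cost of this single call
        self.last_interaction_cost = litellm.completion_cost(
            completion_response=response
        )
        return response
```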

guiramos avatar Feb 12 '24 15:02 guiramos