This model's maximum context length is 8192 tokens.
I'm searching for a theme using an agent, and the search engine returns a lot of results. How do I limit the number of tokens? I'm getting this error: BadRequestError: Error code: 400 - {'error': {'message': "This model's maximum context length is 8192 tokens. However, your messages resulted in 8217 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
I'm using GPT-3.5 via the OpenAI API. Is there a way to work around this problem or to increase the number of tokens?
Try this:
from crewai import Agent
from langchain_openai import ChatOpenAI

def configure_language_model(model_name):
    """Configure the language model."""
    # Note: max_tokens caps the completion (output) tokens, not the total context.
    return ChatOpenAI(temperature=0.7, model_name=model_name, max_tokens=8000)

researcher = Agent(
    # role, goal, backstory and other required Agent fields omitted here
    llm=configure_language_model("gpt-3.5-turbo-0125"),
)
So it's possible to pass langchain_openai configs into the llm parameter? I thought only langchain and langchain-community could be used (on Python 3.8 it complains when I install langchain and tells me to install langchain-community instead, because of deprecation). Is it also possible to limit the max tokens for local LLMs? What kind of interface is the 'llm' param — does it accept anything related to LLM init params from any API and any company? 😅 kinda cool
That won't work in all cases, because the 8192-token limit is split between input tokens and output tokens: with max_tokens=8000 reserved for the completion, even a modest prompt can push the total over the limit, which leads to the context length error again.
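One way to handle that split is to budget the prompt explicitly: reserve room for the completion and trim old messages until the input fits. A minimal sketch, assuming a rough 4-characters-per-token estimate (use tiktoken for exact counts); the function names are mine, not crewAI's:

```python
CONTEXT_LIMIT = 8192      # gpt-3.5-turbo context window
COMPLETION_BUDGET = 1024  # tokens reserved for the model's reply

def estimate_tokens(text):
    """Rough heuristic: ~4 characters per token for English text."""
    return max(1, len(text) // 4)

def trim_messages(messages, limit=CONTEXT_LIMIT - COMPLETION_BUDGET):
    """Drop the oldest non-system messages until the estimate fits the budget."""
    trimmed = list(messages)
    while len(trimmed) > 1 and sum(
        estimate_tokens(m["content"]) for m in trimmed
    ) > limit:
        # keep the first (system) message, drop the oldest turn after it
        del trimmed[1]
    return trimmed

messages = [
    {"role": "system", "content": "You are a research agent."},
    {"role": "user", "content": "old search results " * 2000},  # oversized tool output
    {"role": "user", "content": "latest query"},
]
fits = trim_messages(messages)
```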
Hello, I'm having the same issue. Do you have a fix?
Thanks a lot!
Hey John, I was able to solve it using the RAG method, but it started to hallucinate.
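In case it helps others, the idea behind the RAG route is to stop stuffing every search result into the prompt and instead retrieve only the top-k chunks relevant to the query. A toy sketch with word-overlap scoring (a real setup would use embeddings and a vector store; hallucination tends to creep in when the retrieved chunks are off-topic):

```python
def score(query, chunk):
    """Word-overlap relevance score between query and chunk."""
    q = set(query.lower().split())
    c = set(chunk.lower().split())
    return len(q & c)

def retrieve(query, chunks, k=2):
    """Return only the k most relevant chunks, keeping the prompt small."""
    return sorted(chunks, key=lambda ch: score(query, ch), reverse=True)[:k]

chunks = [
    "Dark theme with high contrast colors for night use.",
    "Installation instructions for the plugin manager.",
    "Light pastel theme inspired by spring colors.",
]
top = retrieve("dark night theme", chunks, k=1)
```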
This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.
This issue was closed because it has been stalled for 5 days with no activity.