Tony Wood

Results: 12 comments by Tony Wood

Thanks for testing; weird that it did not affect you. Let me turn on logs, and next time it happens I will post them.

I notice that in CrewAI version 0.117.0 there is support for OpenAI GPT-4.1... I think, then, that this error is related to GPT-4.1 being unsupported. I will keep an eye out for...

Damn, it happened again. Full error log: LiteLLM.Info: If you need to debug this error, use `litellm._turn_on_debug()'. Error during LLM call: litellm.ContextWindowExceededError: litellm.BadRequestError: ContextWindowExceededError: OpenAIException - Error code: 400 -...
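For what it's worth, the suggestion in the log itself can be wired up before the crew runs, so the full request (messages, model, token counts) is printed when the 400 comes back. A minimal sketch; the completion call here is just a stand-in for whatever CrewAI does internally:

```python
import litellm

# Turn on litellm's verbose debug logging, as the error message suggests,
# so the full request payload is printed when a call fails.
litellm._turn_on_debug()

try:
    # Hypothetical stand-in call; in the real project the request is issued by CrewAI.
    response = litellm.completion(
        model="gpt-4o",
        messages=[{"role": "user", "content": "ping"}],
    )
    print(response.choices[0].message.content)
except litellm.ContextWindowExceededError as exc:
    # The debug output above shows exactly which messages blew past the window.
    print(f"Context window exceeded: {exc}")
```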

I apologise for not being clear about the error. As I do not have control over the agent or the website it scrapes, I cannot avoid the error. My...

LiteLLM.Info: If you need to debug this error, use `litellm._turn_on_debug()'. Error during LLM call: litellm.ContextWindowExceededError: litellm.BadRequestError: ContextWindowExceededError: OpenAIException - Error code: 400 - {'error': {'message': "This model's maximum context length...

These are all from the same routine, and I do not have a 7M-token context window. Something weird is going on here.

From the issues I am seeing, I think the web scrape is, in some circumstances, reading binary content rather than text. That would maybe explain the token size.
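A quick way to test that theory would be to inspect the scraped payload before it ever reaches the LLM: look for a PDF magic number or a high share of non-printable bytes, and estimate the token count. A rough sketch, assuming the raw scrape result is available as bytes (the file name and helper names here are just placeholders, not anything from CrewAI):

```python
import string

import tiktoken  # assumed available, used only for a rough token estimate


def looks_like_binary(payload: bytes) -> bool:
    """Heuristic check: PDF magic number or a high share of non-printable bytes."""
    if payload.startswith(b"%PDF"):
        return True
    printable = set(string.printable.encode())
    sample = payload[:4096]
    junk = sum(1 for b in sample if b not in printable)
    return junk / max(len(sample), 1) > 0.30


def rough_token_count(text: str) -> int:
    """Estimate how many tokens the scraped text would cost the model."""
    enc = tiktoken.get_encoding("cl100k_base")
    return len(enc.encode(text, disallowed_special=()))


# Placeholder for whatever the scraper returned:
raw = open("scraped_page.bin", "rb").read()
if looks_like_binary(raw):
    print("Scrape looks like binary/PDF content - skip or truncate before the LLM call")
else:
    print("Approx tokens:", rough_token_count(raw.decode("utf-8", errors="replace")))
```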

Sadly this still occurs, though not as often. Here is an extract, as I think it is PDF content that is causing the problem. I asked GPT what it was. (...

I do not know where the PDF is... It is coming from an online search via ScrapeWebsiteTool(). How would I track that down? This is an example of the task...
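One idea for tracking it down (a sketch only, assuming crewai_tools' ScrapeWebsiteTool keeps its website_url field and _run() method) is to wrap the tool so every URL and the size of whatever comes back is logged before the agent sees it:

```python
import logging

from crewai_tools import ScrapeWebsiteTool  # assumed import path

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("scrape-audit")


class LoggingScrapeTool(ScrapeWebsiteTool):
    """Same tool, but records each URL and the size of the scraped payload."""

    def _run(self, *args, **kwargs):
        result = super()._run(*args, **kwargs)
        url = kwargs.get("website_url", getattr(self, "website_url", None))
        log.info("Scraped %s -> %d characters", url, len(result or ""))
        if result and result.lstrip().startswith("%PDF"):
            log.warning("Payload from %s looks like a raw PDF", url)
        return result


# Hand the wrapped tool to the agent in place of ScrapeWebsiteTool().
tool = LoggingScrapeTool()
```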

I cannot share the entire project as it is for a client. However, I am building another, and if I get the same issue I will give you access. And...