Season
Hi, does anyone have ideas on how to solve this error? Thanks.
Same exact issue as you @evrenyal, step 99 in 1 second X_X

```
docker run \
  -e LLM_API_KEY="ollama" \
  -e LLM_MODEL="ollama/llama3:8b" \
  -e LLM_EMBEDDING_MODEL="local" \
  -e LLM_BASE_URL="http://localhost:11434" \
  -e WORKSPACE_DIR="./shared_workspace"...
```
I am pretty sure inference works fine with only Ollama, @evrenyal.
@rbren

```
docker run \
  --add-host host.docker.internal=host-gateway \
  -e LLM_API_KEY="ollama" \
  -e LLM_MODEL="ollama/llama3:8b" \
  -e LLM_EMBEDDING_MODEL="local" \
  -e LLM_BASE_URL="http://host.docker.internal:11434" \
  -e WORKSPACE_DIR="./shared_workspace" \
  -v /home/seasonbobo/Desktop/simsreal/shared_workspace:/opt/workspace_base \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -p...
```
Same issue.
@enyst

```
File "/app/.venv/lib/python3.12/site-packages/openai/_exceptions.py", line 81, in __init__
    super().__init__(message, response.request, body=body)
                              ^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'request'
17:25:38 - opendevin:ERROR: agent_controller.py:178 - Error condensing thoughts: 'NoneType' object has...
```
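For anyone puzzled by the `AttributeError` itself: it happens when an OpenAI-style exception class is constructed with `response=None`, so the real failure (the upstream request never got a response) is masked by the attribute access. Here is a minimal sketch of that failure mode; the class below is a hypothetical stand-in for illustration, not the actual `openai._exceptions` code.

```python
# Hypothetical stand-in mimicking the shape of an OpenAI-style
# status error whose __init__ dereferences response.request.
class FakeAPIStatusError(Exception):
    def __init__(self, message, response, body=None):
        # When response is None, this line raises:
        # AttributeError: 'NoneType' object has no attribute 'request'
        # which shadows the original error message entirely.
        self.request = response.request
        super().__init__(message)

try:
    raise FakeAPIStatusError("upstream call failed", response=None)
except AttributeError as exc:
    print(exc)  # 'NoneType' object has no attribute 'request'
```

So the traceback above most likely means the Ollama endpoint was unreachable (no HTTP response at all), and the exception wrapper choked while reporting it; fixing the `LLM_BASE_URL` connectivity should make this go away.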
> @spoonbobo If that's the only place it appears in the error, then it's just showing up because LiteLLM uses some class definitions for its OpenAI compatibility. It doesn't mean...
> The BoT-SORT indeed is real-time. > > It is the CMC that takes up most of the computation time. For reference, you can check [this](https://github.com/viplix3/BoTSORT-cpp/blob/main/docs/PerformanceReport.md). > > When CMC...
I have read the Contributor License Agreement and I hereby accept the Terms.
> Should this be merged for the 0.8 release?

Thanks for getting back, I personally find it quite a funny and inspiring example :D