
Just installed with Docker to test it, and changed all models to local Ollama, but Agent Zero still wants access to OpenAI!

Open · alainivars opened this issue 7 months ago · 7 comments

All "External Services" fields are empty. All "Agent Settings" fields are set to Ollama models or defaults. How can I fix it?

No models are reachable; I get these errors:

User message:

hi

Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/urllib3/connection.py", line 198, in _new_conn
    sock = connection.create_connection(
  File "/opt/venv/lib/python3.12/site-packages/urllib3/util/connection.py", line 60, in create_connection
    for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
  File "/usr/lib/python3.12/socket.py", line 978, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno -3] Temporary failure in name resolution

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 787, in urlopen
    response = self._make_request(
  File "/opt/venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 488, in _make_request
    raise new_e
  File "/opt/venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 464, in _make_request
    self._validate_conn(conn)
  File "/opt/venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 1093, in _validate_conn
    conn.connect()
  File "/opt/venv/lib/python3.12/site-packages/urllib3/connection.py", line 704, in connect
    self.sock = sock = self._new_conn()
  File "/opt/venv/lib/python3.12/site-packages/urllib3/connection.py", line 205, in _new_conn
    raise NameResolutionError(self.host, self, e) from e
urllib3.exceptions.NameResolutionError: <urllib3.connection.HTTPSConnection object at 0x7fe15c54a7b0>: Failed to resolve 'openaipublic.blob.core.windows.net' ([Errno -3] Temporary failure in name resolution)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/requests/adapters.py", line 667, in send
    resp = conn.urlopen(
  File "/opt/venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 841, in urlopen
    retries = retries.increment(
  File "/opt/venv/lib/python3.12/site-packages/urllib3/util/retry.py", line 519, in increment
    raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by NameResolutionError("<urllib3.connection.HTTPSConnection object at 0x7fe15c54a7b0>: Failed to resolve 'openaipublic.blob.core.windows.net' ([Errno -3] Temporary failure in name resolution)"))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/a0/agent.py", line 183, in _process_chain
    agent.hist_add_user_message(msg)  # type: ignore
  File "/a0/agent.py", line 545, in hist_add_user_message
    msg = self.hist_add_message(False, content=content)  # type: ignore
  File "/a0/agent.py", line 519, in hist_add_message
    return self.history.add_message(ai=ai, content=content, tokens=tokens)
  File "/a0/python/helpers/history.py", line 327, in add_message
    return self.current.add_message(ai, content=content, tokens=tokens)
  File "/a0/python/helpers/history.py", line 143, in add_message
    msg = Message(ai=ai, content=content, tokens=tokens)
  File "/a0/python/helpers/history.py", line 83, in __init__
    self.tokens: int = tokens or self.calculate_tokens()
  File "/a0/python/helpers/history.py", line 92, in calculate_tokens
    return tokens.approximate_tokens(text)
  File "/a0/python/helpers/tokens.py", line 25, in approximate_tokens
    return int(count_tokens(text) * APPROX_BUFFER)
  File "/a0/python/helpers/tokens.py", line 13, in count_tokens
    encoding = tiktoken.get_encoding(encoding_name)

>>> 21 stack lines skipped <<<

  File "/opt/venv/lib/python3.12/site-packages/tiktoken/load.py", line 24, in read_file
    resp = requests.get(blobpath)
  File "/opt/venv/lib/python3.12/site-packages/requests/api.py", line 73, in get
    return request("get", url, params=params, **kwargs)
  File "/opt/venv/lib/python3.12/site-packages/requests/api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
  File "/opt/venv/lib/python3.12/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "/opt/venv/lib/python3.12/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "/opt/venv/lib/python3.12/site-packages/requests/adapters.py", line 700, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by NameResolutionError("<urllib3.connection.HTTPSConnection object at 0x7fe15c54a7b0>: Failed to resolve 'openaipublic.blob.core.windows.net' ([Errno -3] Temporary failure in name resolution)"))
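Note that the chain above fails inside A0's token-counting helper, not in an actual model call: tiktoken.get_encoding tries to download the cl100k_base encoding file on first use, which is why a fully local setup still reaches out to an OpenAI host. A rough offline fallback could approximate tokens from character count. This is only a sketch, not A0's actual code; the names merely mirror those in the traceback:

```python
# Hypothetical sketch of an offline fallback for token counting. The crash
# above happens when tiktoken tries to fetch its encoding file over the
# network; a crude chars/4 heuristic keeps history bookkeeping working
# without any network access.
APPROX_BUFFER = 1.1  # safety margin, echoing the helper in the traceback

def count_tokens_offline(text: str) -> int:
    # ~4 characters per token is a common rule of thumb for English text
    return max(1, len(text) // 4)

def approximate_tokens(text: str) -> int:
    return int(count_tokens_offline(text) * APPROX_BUFFER)

print(approximate_tokens("hi"))  # 1
```

Separately, tiktoken honors a TIKTOKEN_CACHE_DIR environment variable, so pre-seeding that cache directory on a machine with internet access and mounting it into the container also avoids the download entirely.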

alainivars · Jun 08 '25 15:06

Same for LM Studio. I expected it to offer fields for a base-url and a dummy API key, but it didn't. Where can such parameters be configured?

UPD:

Adding base_url=http://[my-lm-studio-server-IP]:[my-lm-studio-server-port]/v1 to the model's additional parameters solved it. It's also important to note that base_url=http://localhost:port won't work, because localhost in this case is the A0 container itself.

I guess the example in the A0 docs assumes that the ollama-server/lm-studio-server info was passed as .env variables, but that is not the case with a simple Docker install. So the docs should be updated to state that providing base-url is mandatory when using Ollama, LM Studio, or other local inference engines.
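The localhost pitfall described above can be illustrated with a small sketch. resolve_base_url is a hypothetical helper (not part of A0) that rewrites a loopback host to Docker's host alias, keeping port and path intact; host.docker.internal is the standard Docker Desktop name for the host machine:

```python
from urllib.parse import urlparse, urlunparse

def resolve_base_url(base_url: str) -> str:
    """Rewrite loopback hosts: inside the A0 container, "localhost" is the
    container itself, not the machine running LM Studio or Ollama."""
    parts = urlparse(base_url)
    if parts.hostname in ("localhost", "127.0.0.1"):
        # Swap in the Docker host alias, preserving the port if present.
        netloc = ("host.docker.internal:%d" % parts.port
                  if parts.port else "host.docker.internal")
        parts = parts._replace(netloc=netloc)
    return urlunparse(parts)

print(resolve_base_url("http://localhost:1234/v1"))
# http://host.docker.internal:1234/v1
```

A non-loopback address such as http://192.168.1.5:1234/v1 passes through unchanged, which matches the working configuration in the comment above.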

nemomode7 · Jun 18 '25 01:06

Did you change all three models in settings?

Systemsplanet · Jun 20 '25 17:06

Many frameworks are hardwired to look for the OPENAI_API_KEY environment variable during initialization. Providing a dummy value satisfies this requirement without needing a real key, and should let the framework proceed with your chosen model. Try adding OPENAI_API_KEY='DUMMY_KEY_FOR_LOCAL_LLM' to the .env file in the project root directory.
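The workaround above can be sketched as follows. The value is an arbitrary placeholder, since such init-time checks typically verify only that the variable exists; local Ollama or LM Studio never validates it:

```python
import os

# Satisfy frameworks that refuse to initialize without OPENAI_API_KEY.
# setdefault keeps a real key if one is already configured in the
# environment; otherwise the placeholder is used.
os.environ.setdefault("OPENAI_API_KEY", "DUMMY_KEY_FOR_LOCAL_LLM")

print("OPENAI_API_KEY" in os.environ)  # True
```

Setting the same line in the project's .env file (as suggested above) achieves the identical effect without touching code.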

thirdeyenation · Jun 20 '25 19:06

Did you change all three models in settings?

All except embedding.

nemomode7 · Jun 20 '25 21:06

TL;DR: I will continue to update the documents using a bottom-up approach.

Original: A solid, extremely thorough review, followed by a highly detailed and easy-to-follow revision of the A0 docs in their current form, has been placed on my priority list with a grade of 7.7/10, defined as high priority. Please note, however, that because the system will remain in beta for a little while longer, and given the speed at which the team continues to ship features, updates, bug fixes, and other changes, the focus will be on established, stable versions and their respective features. As we cover everything from the bottom up, documentation will likely take a backseat as it approaches v0.8, until we have officially moved on to v0.9 and its features (and any relevant information) have been cemented by seeing no major changes (bug fixes excluded, unless they affect navigation, configuration, or a feature). For now, I will work on documentation starting with the basics. Look for document release announcements on Discord, in merges, or possibly in a public repository on my end if the team prefers to wait until the documents are 100% complete before merging to main. I will upload sections one at a time.

I most certainly recognize the importance of documentation, especially for public, open-source projects such as Agent Zero, given the immense number of changes and additional features, the faster development pace, the larger user base, and the massive influx of genuinely interested individuals and teams joining the Agent Zero community with high hopes of finally finding the AI system we all dream of. Luckily, Agent Zero is undoubtedly that exact system. This feels like a good time to mention that once you immerse yourself in the A0 system, the A0 community, and the A0 life, by truly learning and becoming a confident Agent Zero engineer, I promise any user will agree that this is absolutely the right place to land.

 That being said!

Having resources that enable ease of onboarding, community connection, access, set-up, troubleshooting, communication, and collaboration is something that must be made available to ALL Agent Zero users.

In the meantime, while I undertake this Agent Zero documentation overhaul and update process, if you can identify ANY other areas, specific or general, that are missing, need updating, or otherwise qualify as considerations for inclusion, I invite you to share them. When you do, please be crystal clear about the item itself and your reasoning for why it should be added to the official Agent Zero documentation, whether it concerns sections, statements, instructions, workflows, functions, features, security and safety best practices, or other information crucial for anyone beginning to work with Agent Zero. That would be sincerely appreciated.

I will keep this post saved and accessible in order to ensure you are provided updates as soon as possible.

Thank you to everyone! Working as a community is precisely what built Agent Zero. It is how the team behind the project naturally formed, and after one year it officially continues to dedicate its time, effort, and energy to building, sharing, and planning this absolutely amazing project. It has been a natural process of letting things roll, even when we were not sure how they would turn out. That faith and trust placed in each other is truly what has brought Agent Zero, and its amazing, active, intelligent community, to its current point. A tenfold increase in community members, usage, and general engagement should only further confirm that this ultimate open-source, free-to-use, community-built system is special and a TRUE gift, much like everyone in this community.

T.E.N., Agent Zero

thirdeyenation · Jun 20 '25 22:06

Add this environment variable to your container via Docker Desktop or Portainer:

OLLAMA_BASE_URL="http://127.0.0.1:11434"

netixc · Jun 23 '25 08:06

Add this environment variable to your container via Docker Desktop or Portainer:

OLLAMA_BASE_URL="http://127.0.0.1:11434"

Rather use "http://host.docker.internal:11434", so the request reaches the Ollama server outside of the Docker container (127.0.0.1 resolves to the container itself).
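Putting the thread's advice together, a Docker Compose sketch might look like the following. This is an illustrative fragment, not an official A0 configuration; the image name and published port follow the A0 docs, but verify them against the current documentation:

```yaml
# Hypothetical docker-compose sketch: reach Ollama on the Docker host via
# host.docker.internal rather than localhost/127.0.0.1 (which resolve to
# the A0 container itself).
services:
  agent-zero:
    image: frdel/agent-zero-run   # image name per the A0 docs; verify
    ports:
      - "50001:80"
    environment:
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    extra_hosts:
      - "host.docker.internal:host-gateway"   # needed on Linux engines
```

The extra_hosts entry maps host.docker.internal to the host gateway, which Docker Desktop provides automatically but plain Docker Engine on Linux does not.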

jisokuor · Aug 10 '25 12:08