
Ollama Cloud isn't compatible with CAI.

Open DrDark1999 opened this issue 2 months ago • 5 comments

Hello Team,

Recently I found that Ollama also provides a cloud instance to its users. To access it, we can use Ollama API keys, as documented at https://docs.ollama.com/cloud. I request that you integrate Ollama Cloud with the service.
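For reference, a minimal sketch of calling the cloud API per those docs (an assumption that the cloud endpoint at https://ollama.com mirrors the local /api/chat REST API and authenticates with a Bearer key; the model name is a placeholder):

import os
import requests

# Hedged sketch: assumes Ollama Cloud exposes the same /api/chat REST API as a
# local daemon, authenticated with a Bearer API key per https://docs.ollama.com/cloud.
resp = requests.post(
    "https://ollama.com/api/chat",
    headers={"Authorization": f"Bearer {os.environ['OLLAMA_API_KEY']}"},
    json={
        "model": "deepseek-v3.1:671b-cloud",  # placeholder cloud model name
        "messages": [{"role": "user", "content": "Hello"}],
        "stream": False,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])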

DrDark1999 avatar Nov 27 '25 09:11 DrDark1999

I got the same issue; I think maybe the cloud filters it or something. Here's what I tried.

My .env:

OPENAI_API_KEY="sk-1234"          # placeholder; not used when routing to Ollama
ANTHROPIC_API_KEY=""
OLLAMA="http://127.0.0.1:11434"   # local Ollama daemon, not the cloud endpoint
OLLAMA_PROVIDER="ollama"          # explicitly tell CAI the provider
OLLAMA_API_KEY="NA"               # dummy value; a real key would come from ollama.com
PROMPT_TOOLKIT_NO_CPR=1
CAI_STREAM=false

/model deepseek-v3.1:671b-cloud

scan my router

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/jay/Documents/Scripts/AI/CAI-Autonomous-Ethical-Hacking/cai_env/lib/python3.12/site-packages/cai/cli.py", line 1627, in run_cai_cli
    response = asyncio.run(Runner.run(agent, conversation_input))
  File "/usr/lib/python3.12/asyncio/runners.py", line 194, in run
    return runner.run(main)
  File "/usr/lib/python3.12/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
  File "/usr/lib/python3.12/asyncio/base_events.py", line 687, in run_until_complete
    return future.result()
  File "/home/jay/Documents/Scripts/AI/CAI-Autonomous-Ethical-Hacking/cai_env/lib/python3.12/site-packages/cai/sdk/agents/run.py", line 239, in run
    input_guardrail_results, turn_result = await asyncio.gather(
  File "/home/jay/Documents/Scripts/AI/CAI-Autonomous-Ethical-Hacking/cai_env/lib/python3.12/site-packages/cai/sdk/agents/run.py", line 847, in _run_single_turn
    new_response = await cls._get_new_response(
  File "/home/jay/Documents/Scripts/AI/CAI-Autonomous-Ethical-Hacking/cai_env/lib/python3.12/site-packages/cai/sdk/agents/run.py", line 1046, in _get_new_response
    new_response = await model.get_response(
  File "/home/jay/Documents/Scripts/AI/CAI-Autonomous-Ethical-Hacking/cai_env/lib/python3.12/site-packages/cai/sdk/agents/models/openai_chatcompletions.py", line 698, in get_response
    response = await self._fetch_response(
  File "/home/jay/Documents/Scripts/AI/CAI-Autonomous-Ethical-Hacking/cai_env/lib/python3.12/site-packages/cai/sdk/agents/models/openai_chatcompletions.py", line 2979, in _fetch_response
    raise e
  File "/home/jay/Documents/Scripts/AI/CAI-Autonomous-Ethical-Hacking/cai_env/lib/python3.12/site-packages/cai/sdk/agents/models/openai_chatcompletions.py", line 2835, in _fetch_response
    return await self._fetch_response_litellm_openai(
  File "/home/jay/Documents/Scripts/AI/CAI-Autonomous-Ethical-Hacking/cai_env/lib/python3.12/site-packages/cai/sdk/agents/models/openai_chatcompletions.py", line 3246, in _fetch_response_litellm_openai
    ret = await litellm.acompletion(**kwargs)
  File "/home/jay/Documents/Scripts/AI/CAI-Autonomous-Ethical-Hacking/cai_env/lib/python3.12/site-packages/litellm/utils.py", line 1461, in wrapper_async
    raise e
  File "/home/jay/Documents/Scripts/AI/CAI-Autonomous-Ethical-Hacking/cai_env/lib/python3.12/site-packages/litellm/utils.py", line 1322, in wrapper_async
    result = await original_function(*args, **kwargs)
  File "/home/jay/Documents/Scripts/AI/CAI-Autonomous-Ethical-Hacking/cai_env/lib/python3.12/site-packages/litellm/main.py", line 476, in acompletion
    _, custom_llm_provider, _, _ = get_llm_provider(
  File "/home/jay/Documents/Scripts/AI/CAI-Autonomous-Ethical-Hacking/cai_env/lib/python3.12/site-packages/litellm/litellm_core_utils/get_llm_provider_logic.py", line 360, in get_llm_provider
    raise e
  File "/home/jay/Documents/Scripts/AI/CAI-Autonomous-Ethical-Hacking/cai_env/lib/python3.12/site-packages/litellm/litellm_core_utils/get_llm_provider_logic.py", line 337, in get_llm_provider
    raise litellm.exceptions.BadRequestError(  # type: ignore
litellm.exceptions.BadRequestError: litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call.
You passed model=deepseek-v3.1:671b-cloud
Pass model as E.g. For 'Huggingface' inference endpoints pass in completion(model='huggingface/starcoder',..) Learn more: https://docs.litellm.ai/docs/providers

CAI>
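The traceback is litellm refusing to route a bare model name: it needs a provider prefix such as ollama/ to pick a backend. A minimal sketch of the underlying call that avoids the error (assuming litellm's documented ollama/ prefix and a reachable Ollama server; whether CAI forwards the prefixed name to litellm unchanged is an assumption):

import litellm

# litellm routes by provider prefix: a bare "deepseek-v3.1:671b-cloud" raises
# BadRequestError, while "ollama/<model>" selects the Ollama provider.
response = litellm.completion(
    model="ollama/deepseek-v3.1:671b-cloud",
    api_base="http://127.0.0.1:11434",  # local daemon; a cloud base URL would go here if supported
    messages=[{"role": "user", "content": "scan my router"}],
)
print(response.choices[0].message.content)

If that holds, /model ollama/deepseek-v3.1:671b-cloud inside CAI may route correctly.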

jamieduk avatar Nov 27 '25 15:11 jamieduk

Thank you for sharing your suggestion with us. We truly appreciate your input and interest in improving CAI.

While our current priority is addressing requests from our CAI PRO customers, we remain committed to making CAI the best possible tool for all users. Your feedback has been logged and will definitely be reviewed by our team for consideration in future releases.

If you'd like to help accelerate this process, we warmly welcome community contributions! Feel free to submit an MR with your proposed changes. Our team would be happy to review it collaboratively.

Closing this.

aliasrobotics-support avatar Dec 01 '25 09:12 aliasrobotics-support

For the time being, check this out: I made it work by building this from scratch with AI and Python, using local Ollama and an Ollama Cloud model, all for free: https://github.com/jamieduk/cai-ollama-python-ai/
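A minimal chat loop in that spirit (a sketch only, not the repo's actual code; it assumes the official ollama Python package, which accepts a custom host and extra headers, and the same model name as above):

import os
from ollama import Client

# Point the official ollama client at either a local daemon or, with a Bearer
# key, the assumed cloud host.
client = Client(
    host=os.environ.get("OLLAMA_HOST", "http://127.0.0.1:11434"),
    headers={"Authorization": f"Bearer {os.environ.get('OLLAMA_API_KEY', '')}"},
)

history = []
while True:
    user_msg = input("you> ")
    history.append({"role": "user", "content": user_msg})
    reply = client.chat(model="deepseek-v3.1:671b-cloud", messages=history)
    content = reply["message"]["content"]
    history.append({"role": "assistant", "content": content})
    print(content)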

jamieduk avatar Dec 01 '25 17:12 jamieduk

> For the time being, check this out: I made it work by building this from scratch with AI and Python, using local Ollama and an Ollama Cloud model, all for free: https://github.com/jamieduk/cai-ollama-python-ai/

Love this contribution. Many thanks!

vmayoral avatar Dec 02 '25 08:12 vmayoral

Just re-opened this ticket. @jamieduk, I'm not very familiar with Ollama Cloud, but the value proposition seems to be centered around self-served models that are quantized for scaling, right?

Based on experience, I'd argue this won't add much value to real-world/professional pentests, but I don't see why we shouldn't integrate this at the core of CAI. @jamieduk, would you be willing to contribute some of your capabilities to CAI's upstream?

If so, I'd be happy to review such a PR.

vmayoral avatar Dec 02 '25 09:12 vmayoral

Thank you again, @jamieduk, for your contribution. Please check the latest PR #372, which adds support for Ollama Cloud in CAI.
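The PR diff itself isn't quoted in this thread, so the exact configuration isn't shown here; a plausible smoke test, assuming litellm's standard OLLAMA_API_BASE variable (which may differ from what #372 actually uses), would be:

import os
import litellm

# Hypothetical post-#372 check: the env var names are litellm's standard ones
# for Ollama and are an assumption, not confirmed against the PR.
os.environ.setdefault("OLLAMA_API_BASE", "https://ollama.com")
response = litellm.completion(
    model="ollama/deepseek-v3.1:671b-cloud",
    api_key=os.environ.get("OLLAMA_API_KEY"),
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)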

pzabalegui avatar Dec 10 '25 13:12 pzabalegui

Closing this.

aliasrobotics-support avatar Dec 10 '25 13:12 aliasrobotics-support