
[Bug]: MCP SSE is failing with "Unexpected message"

Open betterthanever2 opened this issue 6 months ago • 19 comments

crawl4ai version

0.7.0

Expected Behavior

When called upon, MCP SSE server responds properly

Current Behavior

The MCP server is added to Claude Code and shows as "connected"; however, trying to use it results in Error: All connection attempts failed

Snippet of the docker compose logs is below.

I'm trying to use the MCP server to get the markdown of a webpage, with no additional params.

Is this reproducible?

Yes

Inputs Causing the Bug

I was using "https://www.numberanalytics.com/blog/advanced-propaganda-detection", but I'm pretty sure the URL has nothing to do with this.

Steps to Reproduce


Code snippets


OS

Linux/Docker

Python version

3.11

Browser

No response

Browser version

No response

Error logs & Screenshots (if applicable)

crawl4ai  | [2025-07-17 17:57:40 +0000] [13] [ERROR] Exception in ASGI application
crawl4ai  |   + Exception Group Traceback (most recent call last):
crawl4ai  |   |   File "/usr/local/lib/python3.12/site-packages/starlette/_utils.py", line 77, in collapse_excgroups
crawl4ai  |   |     yield
crawl4ai  |   |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/base.py", line 183, in __call__
crawl4ai  |   |     async with anyio.create_task_group() as task_group:
crawl4ai  |   |                ^^^^^^^^^^^^^^^^^^^^^^^^^
crawl4ai  |   |   File "/usr/local/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 772, in __aexit__
crawl4ai  |   |     raise BaseExceptionGroup(
crawl4ai  |   | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
crawl4ai  |   +-+---------------- 1 ----------------
crawl4ai  |     | Traceback (most recent call last):
crawl4ai  |     |   File "/usr/local/lib/python3.12/site-packages/uvicorn/protocols/http/h11_impl.py", line 403, in run_asgi
crawl4ai  |     |     result = await app(  # type: ignore[func-returns-value]
crawl4ai  |     |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
crawl4ai  |     |   File "/usr/local/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
crawl4ai  |     |     return await self.app(scope, receive, send)
crawl4ai  |     |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
crawl4ai  |     |   File "/usr/local/lib/python3.12/site-packages/fastapi/applications.py", line 1054, in __call__
crawl4ai  |     |     await super().__call__(scope, receive, send)
crawl4ai  |     |   File "/usr/local/lib/python3.12/site-packages/starlette/applications.py", line 113, in __call__
crawl4ai  |     |     await self.middleware_stack(scope, receive, send)
crawl4ai  |     |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 186, in __call__
crawl4ai  |     |     raise exc
crawl4ai  |     |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 164, in __call__
crawl4ai  |     |     await self.app(scope, receive, _send)
crawl4ai  |     |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/base.py", line 182, in __call__
crawl4ai  |     |     with recv_stream, send_stream, collapse_excgroups():
crawl4ai  |     |                                    ^^^^^^^^^^^^^^^^^^^^
crawl4ai  |     |   File "/usr/local/lib/python3.12/contextlib.py", line 158, in __exit__
crawl4ai  |     |     self.gen.throw(value)
crawl4ai  |     |   File "/usr/local/lib/python3.12/site-packages/starlette/_utils.py", line 83, in collapse_excgroups
crawl4ai  |     |     raise exc
crawl4ai  |     |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/base.py", line 185, in __call__
crawl4ai  |     |     await response(scope, wrapped_receive, send)
crawl4ai  |     |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/base.py", line 223, in __call__
crawl4ai  |     |     async for chunk in self.body_iterator:
crawl4ai  |     |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/base.py", line 169, in body_stream
crawl4ai  |     |     assert message["type"] == "http.response.body", f"Unexpected message: {message}"
crawl4ai  |     |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
crawl4ai  |     | AssertionError: Unexpected message: {'type': 'http.response.start', 'status': 200, 'headers': [(b'content-length', b'4'), (b'content-type', b'application/json')]}
crawl4ai  |     +------------------------------------
crawl4ai  |
crawl4ai  | During handling of the above exception, another exception occurred:
crawl4ai  |
crawl4ai  | Traceback (most recent call last):
crawl4ai  |   File "/usr/local/lib/python3.12/site-packages/uvicorn/protocols/http/h11_impl.py", line 403, in run_asgi
crawl4ai  |     result = await app(  # type: ignore[func-returns-value]
crawl4ai  |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
crawl4ai  |   File "/usr/local/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
crawl4ai  |     return await self.app(scope, receive, send)
crawl4ai  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
crawl4ai  |   File "/usr/local/lib/python3.12/site-packages/fastapi/applications.py", line 1054, in __call__
crawl4ai  |     await super().__call__(scope, receive, send)
crawl4ai  |   File "/usr/local/lib/python3.12/site-packages/starlette/applications.py", line 113, in __call__
crawl4ai  |     await self.middleware_stack(scope, receive, send)
crawl4ai  |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 186, in __call__
crawl4ai  |     raise exc
crawl4ai  |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 164, in __call__
crawl4ai  |     await self.app(scope, receive, _send)
crawl4ai  |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/base.py", line 182, in __call__
crawl4ai  |     with recv_stream, send_stream, collapse_excgroups():
crawl4ai  |                                    ^^^^^^^^^^^^^^^^^^^^
crawl4ai  |   File "/usr/local/lib/python3.12/contextlib.py", line 158, in __exit__
crawl4ai  |     self.gen.throw(value)
crawl4ai  |   File "/usr/local/lib/python3.12/site-packages/starlette/_utils.py", line 83, in collapse_excgroups
crawl4ai  |     raise exc
crawl4ai  |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/base.py", line 185, in __call__
crawl4ai  |     await response(scope, wrapped_receive, send)
crawl4ai  |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/base.py", line 223, in __call__
crawl4ai  |     async for chunk in self.body_iterator:
crawl4ai  |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/base.py", line 169, in body_stream
crawl4ai  |     assert message["type"] == "http.response.body", f"Unexpected message: {message}"
crawl4ai  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
crawl4ai  | AssertionError: Unexpected message: {'type': 'http.response.start', 'status': 200, 'headers': [(b'content-length', b'4'), (b'content-type', b'application/json')]}
crawl4ai  | 2025-07-17 17:57:43,378 - mcp.server.lowlevel.server - INFO - Processing request of type ListToolsRequest
crawl4ai  | 2025-07-17 17:57:43,389 - mcp.server.lowlevel.server - INFO - Processing request of type ListResourcesRequest
crawl4ai  | 2025-07-17 17:58:35,778 - mcp.server.lowlevel.server - INFO - Processing request of type CallToolRequest
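For what it's worth, the assertion in these logs lives in Starlette's BaseHTTPMiddleware: once a response has started streaming, body_stream accepts only http.response.body messages, and the SSE endpoint's extra http.response.start trips it. The check can be reproduced in isolation (a sketch mirroring the assertion, not crawl4ai code):

```python
def body_stream_check(message: dict) -> bytes:
    # Mirrors the assertion in starlette/middleware/base.py's body_stream:
    # after the response has started, only body messages are expected.
    assert message["type"] == "http.response.body", f"Unexpected message: {message}"
    return message.get("body", b"")

# A normal body chunk passes through:
body_stream_check({"type": "http.response.body", "body": b"data: ok\n\n"})

# A second response-start, as in the log above, raises AssertionError:
try:
    body_stream_check({"type": "http.response.start", "status": 200})
except AssertionError as exc:
    print(exc)
# → Unexpected message: {'type': 'http.response.start', 'status': 200}
```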

betterthanever2 avatar Jul 17 '25 18:07 betterthanever2

i get the same

romeoman avatar Jul 19 '25 00:07 romeoman

I got the same error and found that the code uses config.yml, specifically the "app" section. There is a glitch: the port is set to 11234. If I correct this to 11235, the error disappears.

# Application Configuration
app:
  title: "Crawl4AI API"
  version: "1.0.0"
  host: "0.0.0.0"
  port: 11234   # <=== Change this to 11235
  reload: True
  workers: 1
  timeout_keep_alive: 300
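To guard against this regressing, a few lines of validation over the parsed config can fail fast when the port drifts from the one the container exposes (a hypothetical helper, not part of crawl4ai; 11235 is the exposed port per this thread):

```python
def check_app_port(config: dict, expected_port: int = 11235) -> None:
    """Raise if the parsed config.yml's app.port does not match the
    port the container exposes, which would leave the MCP endpoint
    unreachable even though the server starts cleanly."""
    actual = config.get("app", {}).get("port")
    if actual != expected_port:
        raise ValueError(
            f"config.yml sets app.port={actual}, but the container "
            f"exposes {expected_port}; the MCP SSE endpoint will be unreachable."
        )

# Example: the misconfigured value from this thread
broken = {"app": {"title": "Crawl4AI API", "port": 11234}}
try:
    check_app_port(broken)
except ValueError as exc:
    print(exc)
```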

Shiny71 avatar Jul 19 '25 17:07 Shiny71

@Shiny71 this is interesting, thank you. However, now I'm getting Error: MCP error -32602: Invalid request parameters, and I see the following in the docker logs:

crawl4ai  | MCP server running on 0.0.0.0:11235
crawl4ai  | [2025-07-19 17:56:41 +0000] [14] [INFO] Started server process [14]
crawl4ai  | [2025-07-19 17:56:41 +0000] [14] [INFO] Waiting for application startup.
crawl4ai  | [2025-07-19 17:56:45 +0000] [14] [INFO] Application startup complete.
crawl4ai  | [INIT].... → Crawl4AI 0.7.1
crawl4ai  | 2025-07-19 18:08:51,317 - root - WARNING - Failed to validate request: Received request before initialization was complete
crawl4ai  | 2025-07-19 18:08:59,745 - root - WARNING - Failed to validate request: Received request before initialization was complete
crawl4ai  | 2025-07-19 18:09:03,552 - root - WARNING - Failed to validate request: Received request before initialization was complete
crawl4ai  | 2025-07-19 18:09:13,693 - root - WARNING - Failed to validate request: Received request before initialization was complete
crawl4ai  | 2025-07-19 18:09:27,757 - root - WARNING - Failed to validate request: Received request before initialization was complete
crawl4ai  | 2025-07-19 18:09:35,179 - root - WARNING - Failed to validate request: Received request before initialization was complete
crawl4ai  | 2025-07-19 18:09:42,713 - root - WARNING - Failed to validate request: Received request before initialization was complete
crawl4ai  | 2025-07-19 18:09:48,381 - root - WARNING - Failed to validate request: Received request before initialization was complete
crawl4ai  | 2025-07-19 18:09:58,508 - root - WARNING - Failed to validate notification: RequestResponder must be used as a context manager. Message was: method='notifications/cancelled' params={'requestId': 7, 'reason': 'AbortError: This operation was aborted'} jsonrpc='2.0'
crawl4ai  | 2025-07-19 18:09:58,508 - root - WARNING - Failed to validate notification: RequestResponder must be used as a context manager. Message was: method='notifications/cancelled' params={'requestId': 4, 'reason': 'AbortError: This operation was aborted'} jsonrpc='2.0'
crawl4ai  | 2025-07-19 18:09:58,509 - root - WARNING - Failed to validate notification: RequestResponder must be used as a context manager. Message was: method='notifications/cancelled' params={'requestId': 3, 'reason': 'AbortError: This operation was aborted'} jsonrpc='2.0'
crawl4ai  | 2025-07-19 18:09:58,510 - root - WARNING - Failed to validate notification: RequestResponder must be used as a context manager. Message was: method='notifications/cancelled' params={'requestId': 9, 'reason': 'AbortError: This operation was aborted'} jsonrpc='2.0'
crawl4ai  | 2025-07-19 18:09:58,510 - root - WARNING - Failed to validate notification: RequestResponder must be used as a context manager. Message was: method='notifications/cancelled' params={'requestId': 8, 'reason': 'AbortError: This operation was aborted'} jsonrpc='2.0'
crawl4ai  | 2025-07-19 18:09:58,511 - root - WARNING - Failed to validate notification: RequestResponder must be used as a context manager. Message was: method='notifications/cancelled' params={'requestId': 6, 'reason': 'AbortError: This operation was aborted'} jsonrpc='2.0'
crawl4ai  | 2025-07-19 18:09:58,511 - root - WARNING - Failed to validate notification: RequestResponder must be used as a context manager. Message was: method='notifications/cancelled' params={'requestId': 10, 'reason': 'AbortError: This operation was aborted'} jsonrpc='2.0'
crawl4ai  | 2025-07-19 18:10:26,838 - root - WARNING - Failed to validate request: Received request before initialization was complete
crawl4ai  | 2025-07-19 18:10:36,198 - root - WARNING - Failed to validate request: Received request before initialization was complete
crawl4ai  | 2025-07-19 18:11:01,264 - root - WARNING - Failed to validate request: Received request before initialization was complete
crawl4ai  | 2025-07-19 18:11:30,466 - root - WARNING - Failed to validate request: Received request before initialization was complete
crawl4ai  | 2025-07-19 18:12:01,584 - root - WARNING - Failed to validate notification: RequestResponder must be used as a context manager. Message was: method='notifications/cancelled' params={'requestId': 13, 'reason': 'AbortError: This operation was aborted'} jsonrpc='2.0'
crawl4ai  | 2025-07-19 18:12:01,585 - root - WARNING - Failed to validate notification: RequestResponder must be used as a context manager. Message was: method='notifications/cancelled' params={'requestId': 14, 'reason': 'AbortError: This operation was aborted'} jsonrpc='2.0'
crawl4ai  | 2025-07-19 18:12:01,585 - root - WARNING - Failed to validate notification: RequestResponder must be used as a context manager. Message was: method='notifications/cancelled' params={'requestId': 11, 'reason': 'AbortError: This operation was aborted'} jsonrpc='2.0'
crawl4ai  | 2025-07-19 18:12:01,586 - root - WARNING - Failed to validate notification: RequestResponder must be used as a context manager. Message was: method='notifications/cancelled' params={'requestId': 12, 'reason': 'AbortError: This operation was aborted'} jsonrpc='2.0'

What is your experience?

betterthanever2 avatar Jul 19 '25 18:07 betterthanever2

Hello Everyone,

I made changes to the config.yml file and I am still having the same issue... for the life of me I can't get this mcp server to work at all... any help would be appreciated!

2025-07-27 03:06:27,597 INFO success: redis entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
2025-07-27 03:06:27,597 INFO success: gunicorn entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
[2025-07-27 03:06:27 +0000] [8] [INFO] Started server process [8]
[2025-07-27 03:06:27 +0000] [8] [INFO] Waiting for application startup.
[INIT].... → Crawl4AI 0.7.2
[2025-07-27 03:06:27 +0000] [8] [INFO] Application startup complete.

[2025-07-27 03:10:06 +0000] [8] [ERROR] Exception in ASGI application

  + Exception Group Traceback (most recent call last):
  |   File "/usr/local/lib/python3.12/site-packages/starlette/_utils.py", line 77, in collapse_excgroups
  |     yield
  |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/base.py", line 183, in __call__
  |     async with anyio.create_task_group() as task_group:
  |                ^^^^^^^^^^^^^^^^^^^^^^^^^
  |   File "/usr/local/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 772, in __aexit__
  |     raise BaseExceptionGroup(
  | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File "/usr/local/lib/python3.12/site-packages/uvicorn/protocols/http/h11_impl.py", line 403, in run_asgi
    |     result = await app(  # type: ignore[func-returns-value]
    |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/local/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
    |     return await self.app(scope, receive, send)
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/local/lib/python3.12/site-packages/fastapi/applications.py", line 1054, in __call__
    |     await super().__call__(scope, receive, send)
    |   File "/usr/local/lib/python3.12/site-packages/starlette/applications.py", line 113, in __call__
    |     await self.middleware_stack(scope, receive, send)
    |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 186, in __call__
    |     raise exc
    |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 164, in __call__
    |     await self.app(scope, receive, _send)
    |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/base.py", line 182, in __call__
    |     with recv_stream, send_stream, collapse_excgroups():
    |                                    ^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/local/lib/python3.12/contextlib.py", line 158, in __exit__
    |     self.gen.throw(value)
    |   File "/usr/local/lib/python3.12/site-packages/starlette/_utils.py", line 83, in collapse_excgroups
    |     raise exc
    |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/base.py", line 185, in __call__
    |     await response(scope, wrapped_receive, send)
    |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/base.py", line 223, in __call__
    |     async for chunk in self.body_iterator:
    |   File "/usr/local/lib/python3.12/site-packages/starlette/middleware/base.py", line 169, in body_stream
    |     assert message["type"] == "http.response.body", f"Unexpected message: {message}"
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    | AssertionError: Unexpected message: {'type': 'http.response.start', 'status': 200, 'headers': [(b'content-length', b'4'), (b'content-type', b'application/json')]}
    +------------------------------------

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/uvicorn/protocols/http/h11_impl.py", line 403, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/local/lib/python3.12/site-packages/starlette/applications.py", line 113, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/usr/local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/usr/local/lib/python3.12/site-packages/starlette/middleware/base.py", line 182, in __call__
    with recv_stream, send_stream, collapse_excgroups():
         ^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/contextlib.py", line 158, in __exit__

greg-hydrogen avatar Jul 27 '25 03:07 greg-hydrogen

I don't know if anybody would find this useful, but the errors I mentioned above occurred when using Claude Code. When I tried using it via AugmentCode, the MCP miraculously worked fine. I don't know what the difference between the two is in this particular regard, but there you have it. @greg-hydrogen, which client are you using it with?

betterthanever2 avatar Jul 27 '25 05:07 betterthanever2

@betterthanever2 - I am using the Langflow MCP client, and nothing... I can't get it to work with MCP Inspector either.

greg-hydrogen avatar Jul 27 '25 23:07 greg-hydrogen

It appears to be caused by an incompatibility between the MCP server implementation and Starlette. https://github.com/encode/starlette/pull/2953
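The linked PR concerns how Starlette's BaseHTTPMiddleware handles streaming responses. On the application side, the usual way to avoid this class of failure is to write middleware as pure ASGI rather than subclassing BaseHTTPMiddleware, so SSE messages pass through without being buffered or re-validated. A minimal sketch of the pattern (not crawl4ai's actual middleware):

```python
import asyncio

class PassthroughASGIMiddleware:
    """Pure ASGI middleware: forwards every message verbatim, so a streaming
    response (like SSE) is never re-validated the way BaseHTTPMiddleware's
    body_stream does."""

    def __init__(self, app):
        self.app = app

    async def __call__(self, scope, receive, send):
        # Inspect or modify `scope` here if needed, then delegate unchanged.
        await self.app(scope, receive, send)

async def sse_app(scope, receive, send):
    # Stand-in for an SSE endpoint: one response-start, then body chunks.
    await send({"type": "http.response.start", "status": 200,
                "headers": [(b"content-type", b"text/event-stream")]})
    await send({"type": "http.response.body", "body": b"data: hello\n\n",
                "more_body": True})
    await send({"type": "http.response.body", "body": b"", "more_body": False})

async def main():
    sent = []

    async def collect(message):
        sent.append(message)

    await PassthroughASGIMiddleware(sse_app)({"type": "http"}, None, collect)
    return [m["type"] for m in sent]

print(asyncio.run(main()))
# → ['http.response.start', 'http.response.body', 'http.response.body']
```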

luoxiaohei avatar Jul 29 '25 08:07 luoxiaohei

I have the same issue with Claude Code running on WSL2. Gemini suggested the following two lines; I will try them once my Claude usage limit resets.

claude mcp remove c4ai-sse
claude mcp add --transport sse c4ai-sse http://127.0.0.1:11235/mcp/sse

gemini's-suggestions.txt

ywatanabe1989 avatar Jul 31 '25 19:07 ywatanabe1989

Sorry, the above two lines did not work.

ywatanabe1989 avatar Aug 01 '25 00:08 ywatanabe1989

Not sure if related, but something in v0.7 broke the SSE endpoints. Rolling back to image: unclecode/crawl4ai:0.6.0rc1-r2 allowed Claude Code to still use it as an MCP server with the following configuration:

"crawl4ai": {
      "type": "sse",
      "url": "http://localhost:11235/mcp/sse"
    }
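For anyone editing the Claude config file directly rather than via the CLI, a small helper can merge this entry in. The `mcpServers` key and the file path follow the common Claude MCP config layout and are assumptions; check against your own config file:

```python
import json
from pathlib import Path

def add_crawl4ai_sse(config_path: str) -> dict:
    """Merge the crawl4ai SSE server entry into an MCP config JSON file."""
    path = Path(config_path)
    config = json.loads(path.read_text()) if path.exists() else {}
    config.setdefault("mcpServers", {})["crawl4ai"] = {
        "type": "sse",
        "url": "http://localhost:11235/mcp/sse",
    }
    path.write_text(json.dumps(config, indent=2))
    return config

# Example against a scratch file (e.g. ~/.claude/mcp-config.json in practice):
import os, tempfile
scratch = os.path.join(tempfile.mkdtemp(), "mcp-config.json")
print(add_crawl4ai_sse(scratch)["mcpServers"]["crawl4ai"]["url"])
# → http://localhost:11235/mcp/sse
```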

darth-veitcher avatar Aug 01 '25 10:08 darth-veitcher

Thank you very much. Yes, image: unclecode/crawl4ai:0.6.0rc1-r2 works on Claude Code.

(.env-3.11) (wsl) SciTeX-Code $ docker run -d -p 11235:11235 --name crawl4ai --shm-size=1g unclecode/crawl4ai:0.6.0rc1-r2
3ba6ae...
(.env-3.11) (wsl) SciTeX-Code $ cld
MCP config found: /home/ywatanabe/.claude/mcp-config.json
╭───────────────────────────────────────────────────╮
│ ✻ Welcome to Claude Code!                         │
│                                                   │
│   /help for help, /status for your current setup  │
│                                                   │
│   cwd: /home/ywatanabe/proj/SciTeX-Code           │
╰───────────────────────────────────────────────────╯

could you try to use crawl4ai?

● I'll help you try using Crawl4ai. Let me first check the current MCP configuration and test the Crawl4ai server connection.

● Read(.claude/mcp-config.json) ⎿  Read 47 lines

● I can see that Crawl4ai is configured as the "c4ai-sse" MCP server. Let me test if it's working by trying to use one of its functions:

● c4ai-sse - md (MCP)(url: "https://example.com")
  ⎿  { "url": "https://example.com", "filter": "fit", … +7 lines

c4ai-sse

/mcp

● I can see the Crawl4ai MCP server (c4ai-sse) is working! Let me show you the available MCP resources and functions.

● listMcpResources(List all MCP resources)
  ⎿  [ { "name": "Browser console logs", … +5 lines

ywatanabe1989 avatar Aug 01 '25 10:08 ywatanabe1989

I have created a PR to fix this problem. https://github.com/unclecode/crawl4ai/pull/1373/files

Please see the log in which Claude Code uses the crawl4ai MCP server: https://github.com/unclecode/crawl4ai/pull/1373/files#diff-dda4f1db7d9e544bf6c82b82ccf0aab7604c56a9caf1c22725a153ea78c33af3

ywatanabe1989 avatar Aug 06 '25 20:08 ywatanabe1989

Can confirm that the latest version, v0.7.4, still has this issue. As mentioned above, unclecode/crawl4ai:0.6.0rc1-r2 works.

Thanks

vandolphreyes94 avatar Aug 18 '25 13:08 vandolphreyes94

Can confirm. Facing the same issue with v0.7.4.

thesabbir avatar Sep 02 '25 05:09 thesabbir

Still not working.

Expro avatar Sep 08 '25 19:09 Expro

I can confirm it only works with crawl4ai:0.6.0rc1-r2:

docker rm -f crawl4ai >/dev/null 2>&1 || true
docker run -d --restart=unless-stopped -p 11235:11235 --name crawl4ai --shm-size=1g \
  -e OPENAI_API_KEY='sk-xxxxxxxxxxxx' -e LLM_PROVIDER='openai/gpt-5' \
  unclecode/crawl4ai:0.6.0rc1-r2

"crawl4ai": {
  "type": "sse",
  "url": "http://localhost:11235/mcp/sse"
},

CyberT33N avatar Sep 21 '25 16:09 CyberT33N

🎉 Solution Delivered!

The MCP SSE transport issues have been resolved with a complete migration to modern HTTP transport.

New PR: #1525 - feat(mcp): migrate to modern HTTP transport with clean architecture refactor

Key Achievements:

  • ✅ Migrated from deprecated SSE to standard FastMCP HTTP transport
  • ✅ All 7 MCP tools fully functional and tested
  • ✅ Clean service-oriented architecture with comprehensive error handling
  • ✅ 35 new tests with 1,507 lines of coverage ensuring reliability
  • ✅ Zero breaking changes - full backward compatibility maintained

This represents a complete modernization that aligns crawl4ai with current MCP ecosystem best practices while eliminating the transport-layer stability issues.

Ready for testing and review!

leoric-crown avatar Sep 28 '25 10:09 leoric-crown

@leoric-crown Any idea when this gonna be merged?

betterthanever2 avatar Oct 13 '25 07:10 betterthanever2

Still broken in the latest release. It would be nice to have a note on the main page: if you need MCP (a key feature), use 0.6.0rc1-r2.

Anyway, anyone who has already integrated v0.7.6 into their workflow and just discovered it isn't working is welcome to use my backport of the fix @ywatanabe1989 created, applied to v0.7.6, here: https://github.com/lukolszewski/crawl4ai/tree/v0.7.6-fixed. Simply build from that branch. I also fixed another issue where NVIDIA packages were being installed without the non-free repo being added to the apt sources.

Thanks @ywatanabe1989 for providing your fix.

@leoric-crown I'd love to test your rearchitect, but I need to get up and running quickly so I've fixed v0.7.6 for myself. Anyone is welcome to use it. I'm looking forward to your code being merged.

Edit: There was another bug that killed the connection after 20s; this was especially bad for LLM calls.

lukolszewski avatar Nov 13 '25 15:11 lukolszewski