AI Assistant Not Working in Supabase Studio (GPT-5, GPT-5-mini)
Describe the bug
The AI Assistant in Supabase Studio is failing with an error message "Something went wrong with the AI Assistant" when attempting to use it. The network request to the generate-v4 endpoint completes with a 200 status code, but the UI displays an error instead of processing the AI response.
To Reproduce
Steps to reproduce the behavior:
- Start the Supabase local development stack (CLI installed via either Node/npm or Homebrew)
- Navigate to Supabase Studio at http://127.0.0.1:54323
- Go to the Project Overview page
- Type a message in the AI Assistant input field (e.g., "hello world")
- Select either of the two available model options: GPT-5 or GPT-5-mini
- Click the send button or press Enter
- See error: "Something went wrong with the AI Assistant"
Expected behavior
The AI Assistant should process the user's query and return a helpful response, similar to how it works in the hosted Supabase Studio.
Screenshots
Screenshots are attached showing:
- Network tab displaying the `generate-v4` POST request with a 200 status
- Error message in the UI: "Something went wrong with the AI Assistant"
- Response data showing streaming JSON format with reasoning and text content
- Console error showing a JavaScript error in the response-parsing logic
System information
Ticket ID: #4436
Version of OS: macOS 24.5.0 (Darwin Kernel Version 24.5.0, ARM64)
Version of CLI: v2.58.8, v2.58.5
Version of Docker: v24.0.6 (build ed223bc)
Versions of services:
| SERVICE IMAGE | LOCAL | LINKED |
|------------------------|------------------------|--------|
| supabase/postgres | 17.6.1.042 | - |
| supabase/gotrue | v2.182.1 | - |
| postgrest/postgrest | v13.0.5 | - |
| supabase/realtime | v2.63.0 | - |
| supabase/storage-api | v1.29.1 | - |
| supabase/edge-runtime | v1.69.23 | - |
| supabase/studio | 2025.11.10-sha-5291fe3 | - |
| supabase/postgres-meta | v0.93.1 | - |
| supabase/logflare | 1.25.3 | - |
| supabase/supavisor | 2.7.4 | - |
Browser: Chrome (based on developer tools interface)
Version of supabase-js: Not applicable (using Supabase CLI, not supabase-js client library)
Version of Node.js: v22.18.0
Additional context
- Configuration: the `supabase/config.toml` file has `openai_api_key = "env(OPENAI_API_KEY)"` configured in the `[studio]` section (line 53)
- Environment: the `.env` file exists and contains `OPENAI_API_KEY` (verified)
- Network Request Details (a replay sketch follows this list):
  - Endpoint: `generate-v4` (POST)
  - Status: 200 OK
  - Response Time: ~4.25 seconds
  - Response Type: streaming fetch response
  - Response appears to contain valid JSON with streaming data chunks, including:
    - `{"type": "start"}`
    - `{"type": "reasoning-start", ...}`
    - `{"type": "text-start", ...}`
    - `{"type": "text-delta", ...}`
- Error Location: the error appears in the frontend UI right after the first part of the response stream arrives, suggesting a client-side parsing or rendering issue rather than an API failure
- Project: local Supabase development environment
- Studio Port: 54323
- API Port: 54321
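
To confirm that the failure is client-side, the streamed body can be replayed and dumped outside of Studio. Below is a minimal TypeScript sketch; the endpoint path and request payload are assumptions for illustration only and should be copied from the actual failing request in the Network tab.

```ts
// Minimal sketch: replay the generate-v4 request and dump the raw stream.
// NOTE: the URL path and body shape below are assumptions; copy the real
// values from the failing request in the browser's Network tab.
async function dumpGenerateV4Stream(): Promise<void> {
  const res = await fetch("http://127.0.0.1:54323/api/ai/generate-v4", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ messages: [{ role: "user", content: "hello world" }] }),
  });
  console.log("status:", res.status); // 200 in this report, so the server side looks fine

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    // Log each raw chunk exactly as the UI would receive it.
    console.log(decoder.decode(value, { stream: true }));
  }
}

dumpGenerateV4Stream().catch(console.error);
```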
Potential Issues
- The frontend appears unable to parse the response, possibly because of the reasoning ("thinking") chunks emitted by these models (see the sketch below)
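
A minimal sketch of that hypothesis (not Studio's actual code): a stream consumer written before reasoning chunks existed might only handle text events and fall through to an error branch on the `reasoning-start` (and related reasoning) chunk types, which would explain a generic UI error despite the 200 response.

```ts
// Hypothetical stream consumer; chunk types taken from the observed response.
type StreamChunk =
  | { type: "start" }
  | { type: "text-start"; id?: string }
  | { type: "text-delta"; delta: string }
  | { type: "reasoning-start"; id?: string } // emitted for GPT-5 / GPT-5-mini
  | { type: string };

function applyChunk(chunk: StreamChunk, output: { text: string }): void {
  switch (chunk.type) {
    case "start":
    case "text-start":
      break; // nothing to render yet
    case "text-delta":
      output.text += (chunk as { type: "text-delta"; delta: string }).delta;
      break;
    default:
      // A consumer that lands here for reasoning chunks turns a successful
      // 200 response into a client-side error.
      throw new Error(`Unhandled stream chunk type: ${chunk.type}`);
  }
}

// The chunk sequence from the screenshots breaks at the reasoning event.
const output = { text: "" };
try {
  for (const raw of ['{"type":"start"}', '{"type":"reasoning-start","id":"r1"}']) {
    applyChunk(JSON.parse(raw) as StreamChunk, output);
  }
} catch (err) {
  console.error("Something went wrong with the AI Assistant:", err);
}
```

If a change of this kind landed between Studio 2025.11.10 and 2025.11.17, it would also explain why upgrading resolves the issue, as reported below.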
Updating studio to version 2025.11.17-sha-6a18e49 solved the issue for me
Yes, I confirm. There was some change in between that is apparently fixed in the latest Studio. Cc @sweatybridge