[bug]: Waiting for API Service to Start
Is there an existing issue for this?
- [x] I have searched the existing issues
Current behavior
A fresh install works, but all my data was gone, so I ran a restore (using restore.sh). After that it keeps waiting for the API service to start for hours, even though my application, Docker, and Docker Compose are all up to date.
I'm using Docker on Ubuntu 24.04 on a cloud server with 8GB RAM and 4 vCores. My Docker Compose file is the same as: https://raw.githubusercontent.com/makeplane/plane/refs/heads/preview/deploy/selfhost/docker-compose.yml
`docker-compose up` reports success on all tasks.
Starting from ./setup.sh, I get the output below.
--------------------------------------------
____ _ /////////
| _ \| | __ _ _ __ ___ /////////
| |_) | |/ _` | '_ \ / _ \ ///// /////
| __/| | (_| | | | | __/ ///// /////
|_| |_|\__,_|_| |_|\___| ////
////
--------------------------------------------
Project management tool from the future
--------------------------------------------
Select a Action you want to perform:
1) Install
2) Start
3) Stop
4) Restart
5) Upgrade
6) View Logs
7) Backup Data
8) Exit
Action [2]: 2
Builds, (re)creates, starts, and attaches to containers for a service.
Unless they are already running, this command also starts any linked services.
The `docker-compose up` command aggregates the output of each container. When
the command exits, all containers are stopped. Running `docker-compose up -d`
starts the containers in the background and leaves them running.
If there are existing containers for a service, and the service's configuration
or image was changed after the container's creation, `docker-compose up` picks
up the changes by stopping and recreating the containers (preserving mounted
volumes). To prevent Compose from picking up changes, use the `--no-recreate`
flag.
If you want to force Compose to stop and recreate all containers, use the
`--force-recreate` flag.
Usage: up [options] [--scale SERVICE=NUM...] [--] [SERVICE...]
Options:
-d, --detach Detached mode: Run containers in the background,
print new container names. Incompatible with
--abort-on-container-exit.
--no-color Produce monochrome output.
--quiet-pull Pull without printing progress information
--no-deps Don't start linked services.
--force-recreate Recreate containers even if their configuration
and image haven't changed.
--always-recreate-deps Recreate dependent containers.
Incompatible with --no-recreate.
--no-recreate If containers already exist, don't recreate
them. Incompatible with --force-recreate and -V.
--no-build Don't build an image, even if it's missing.
--no-start Don't start the services after creating them.
--build Build images before starting containers.
--abort-on-container-exit Stops all containers if any container was
stopped. Incompatible with -d.
--attach-dependencies Attach to dependent containers.
-t, --timeout TIMEOUT Use this timeout in seconds for container
shutdown when attached or when containers are
already running. (default: 10)
-V, --renew-anon-volumes Recreate anonymous volumes instead of retrieving
data from the previous containers.
--remove-orphans Remove containers for services not defined
in the Compose file.
--exit-code-from SERVICE Return the exit code of the selected service
container. Implies --abort-on-container-exit.
--scale SERVICE=NUM Scale SERVICE to NUM instances. Overrides the
`scale` setting in the Compose file if present.
--no-log-prefix Don't print prefix in logs.
Data Migration completed successfully ✅
>> Waiting for API Service to Start..........................
Steps to reproduce
- Run `apt-get update`
- Run `./setup.sh`
- Choose Upgrade or Restart
Environment
Production
Browser
None
Variant
Self-hosted
Version
v0.26.0
The logs look as shown below when using docker-compose up, but after that, when I use ./setup.sh, it keeps showing "Waiting for API Service to Start..." even after I updated to 0.26.1.
Any update on this?
Service: 3
Service 'api' is running.
Attaching to plane-app_api_1
api_1 | Waiting for database...
api_1 | Database available!
api_1 | No migrations Pending. Starting processes ...
api_1 | Instance already registered
api_1 | ENABLE_SIGNUP configuration already exists
api_1 | DISABLE_WORKSPACE_CREATION configuration already exists
api_1 | ENABLE_EMAIL_PASSWORD configuration already exists
api_1 | ENABLE_MAGIC_LINK_LOGIN configuration already exists
api_1 | GOOGLE_CLIENT_ID configuration already exists
api_1 | GOOGLE_CLIENT_SECRET configuration already exists
api_1 | GITHUB_CLIENT_ID configuration already exists
api_1 | GITHUB_CLIENT_SECRET configuration already exists
api_1 | GITHUB_ORGANIZATION_ID configuration already exists
api_1 | GITLAB_HOST configuration already exists
api_1 | GITLAB_CLIENT_ID configuration already exists
api_1 | GITLAB_CLIENT_SECRET configuration already exists
api_1 | EMAIL_HOST configuration already exists
api_1 | EMAIL_HOST_USER configuration already exists
api_1 | EMAIL_HOST_PASSWORD configuration already exists
api_1 | EMAIL_PORT configuration already exists
api_1 | EMAIL_FROM configuration already exists
api_1 | EMAIL_USE_TLS configuration already exists
api_1 | EMAIL_USE_SSL configuration already exists
api_1 | LLM_API_KEY configuration already exists
api_1 | LLM_PROVIDER configuration already exists
api_1 | LLM_MODEL configuration already exists
api_1 | GPT_ENGINE configuration already exists
api_1 | UNSPLASH_ACCESS_KEY configuration already exists
api_1 | IS_INTERCOM_ENABLED configuration already exists
api_1 | INTERCOM_APP_ID configuration already exists
api_1 | IS_GOOGLE_ENABLED configuration already exists
api_1 | IS_GITHUB_ENABLED configuration already exists
api_1 | IS_GITLAB_ENABLED configuration already exists
api_1 | Checking bucket...
api_1 | Bucket 'uploads' exists.
api_1 | Cache Cleared
api_1 | [2025-06-08 16:44:08 +0000] [1] [INFO] Starting gunicorn 23.0.0
api_1 | [2025-06-08 16:44:08 +0000] [1] [INFO] Listening at: http://0.0.0.0:8000 (1)
api_1 | [2025-06-08 16:44:08 +0000] [1] [INFO] Using worker: uvicorn.workers.UvicornWorker
api_1 | [2025-06-08 16:44:08 +0000] [25] [INFO] Booting worker with pid: 25
Hi @hgaquan, Please update the setup.sh script by running the following curl command:
curl -fsSL -o setup.sh https://github.com/makeplane/plane/releases/latest/download/setup.sh
I tried that, and also tried installing a fresh instance: same error, stuck at the API start step. Can we start it manually, or trace the process to see what happens?
This issue was related to the setup script and has been fixed in our latest release. I tried to reproduce it but couldn't replicate the problem.
Same problem here, working from Ubuntu LTS. :)
@hgaquan Were you able to solve the problem?
Not yet. This error happened suddenly, even though it was working before. I tried upgrading to the latest version and the issue appeared. I tried installing a new instance and doing a restore, and the issue appeared again.
For those who are thinking about upgrading, please consider carefully before doing so.
Any update on this issue? I have the same problem: stuck on "Waiting for API service to start".
To fix this, update your setup script to the latest version: Download latest setup script
I have already done this and still have the same issue with waiting for the API. Ubuntu 24.04 LTS.
LE: I see that sometimes the API service response is not shown, but the app still works. Anyone facing this issue, check the logs.
Sorry for
I downloaded the latest script version, but starting from ./setup.sh gives the same issue.
Running docker-compose up -d gets all services started, but then when I use ./restore.sh it shows the output below, and the application still shows onboarding mode instead of using my backup data, even after I do docker-compose down -v and restart. It seems the latest version does not work with old data.
--------------------------------------------
____ _ /////////
| _ \| | __ _ _ __ ___ /////////
| |_) | |/ _` | '_ \ / _ \ ///// /////
| __/| | (_| | | | | __/ ///// /////
|_| |_|\__,_|_| |_|\___| ////
////
--------------------------------------------
Project management tool from the future
--------------------------------------------
No such command: ls
Commands:
build Build or rebuild services
config Validate and view the Compose file
create Create services
down Stop and remove resources
events Receive real time events from containers
exec Execute a command in a running container
help Get help on a command
images List images
kill Kill containers
logs View output from containers
pause Pause services
port Print the public port for a port binding
ps List containers
pull Pull service images
push Push service images
restart Restart services
rm Remove stopped containers
run Run a one-off command
scale Set number of containers for a service
start Start services
stop Stop services
top Display the running processes
unpause Unpause services
up Create and start containers
version Show version information and quit
Found plane-app/backup/20250606-1346/redisdata.tar.gz
.....Restoring plane-app_redisdata
Error: Failed to remove volume plane-app_redisdata
Found plane-app/backup/20250606-1346/uploads.tar.gz
.....Restoring plane-app_uploads
Error: Failed to remove volume plane-app_uploads
Restore completed successfully.
_Below are the API logs when using docker-compose:_
--------------------------------------------
____ _ /////////
| _ \| | __ _ _ __ ___ /////////
| |_) | |/ _` | '_ \ / _ \ ///// /////
| __/| | (_| | | | | __/ ///// /////
|_| |_|\__,_|_| |_|\___| ////
////
--------------------------------------------
Project management tool from the future
--------------------------------------------
Select a Action you want to perform:
1) Install
2) Start
3) Stop
4) Restart
5) Upgrade
6) View Logs
7) Backup Data
8) Exit
Action [2]: 6
Select a Service you want to view the logs for:
1) Web
2) Space
3) API
4) Worker
5) Beat-Worker
6) Migrator
7) Proxy
8) Redis
9) Postgres
10) Minio
11) RabbitMQ
0) Back to Main Menu
Service: 3
Service 'api' is running.
Attaching to plane-app_api_1
api_1 | Waiting for database...
api_1 | Database available!
api_1 | Waiting for database migrations to complete...
api_1 | Waiting for database migrations to complete...
api_1 | Waiting for database migrations to complete...
api_1 | Waiting for database migrations to complete...
api_1 | Waiting for database migrations to complete...
api_1 | Waiting for database migrations to complete...
api_1 | Waiting for database migrations to complete...
api_1 | Waiting for database migrations to complete...
api_1 | Waiting for database migrations to complete...
api_1 | Waiting for database migrations to complete...
api_1 | Waiting for database migrations to complete...
api_1 | Waiting for database migrations to complete...
api_1 | No migrations Pending. Starting processes ...
api_1 | Instance registered
api_1 | ENABLE_SIGNUP loaded with value from environment variable.
api_1 | DISABLE_WORKSPACE_CREATION loaded with value from environment variable.
api_1 | ENABLE_EMAIL_PASSWORD loaded with value from environment variable.
api_1 | ENABLE_MAGIC_LINK_LOGIN loaded with value from environment variable.
api_1 | GOOGLE_CLIENT_ID loaded with value from environment variable.
api_1 | GOOGLE_CLIENT_SECRET loaded with value from environment variable.
api_1 | GITHUB_CLIENT_ID loaded with value from environment variable.
api_1 | GITHUB_CLIENT_SECRET loaded with value from environment variable.
api_1 | GITHUB_ORGANIZATION_ID loaded with value from environment variable.
api_1 | GITLAB_HOST loaded with value from environment variable.
api_1 | GITLAB_CLIENT_ID loaded with value from environment variable.
api_1 | GITLAB_CLIENT_SECRET loaded with value from environment variable.
api_1 | EMAIL_HOST loaded with value from environment variable.
api_1 | EMAIL_HOST_USER loaded with value from environment variable.
api_1 | EMAIL_HOST_PASSWORD loaded with value from environment variable.
api_1 | EMAIL_PORT loaded with value from environment variable.
api_1 | EMAIL_FROM loaded with value from environment variable.
api_1 | EMAIL_USE_TLS loaded with value from environment variable.
api_1 | EMAIL_USE_SSL loaded with value from environment variable.
api_1 | LLM_API_KEY loaded with value from environment variable.
api_1 | LLM_PROVIDER loaded with value from environment variable.
api_1 | LLM_MODEL loaded with value from environment variable.
api_1 | GPT_ENGINE loaded with value from environment variable.
api_1 | UNSPLASH_ACCESS_KEY loaded with value from environment variable.
api_1 | IS_INTERCOM_ENABLED loaded with value from environment variable.
api_1 | INTERCOM_APP_ID loaded with value from environment variable.
api_1 | IS_GOOGLE_ENABLED loaded with value from environment variable.
api_1 | IS_GITHUB_ENABLED loaded with value from environment variable.
api_1 | IS_GITLAB_ENABLED loaded with value from environment variable.
api_1 | Checking bucket...
api_1 | Bucket 'uploads' does not exist. Creating bucket...
api_1 | Bucket 'uploads' created successfully.
api_1 | Cache Cleared
api_1 | [2025-06-21 08:22:11 +0000] [1] [INFO] Starting gunicorn 23.0.0
api_1 | [2025-06-21 08:22:11 +0000] [1] [INFO] Listening at: http://0.0.0.0:8000 (1)
api_1 | [2025-06-21 08:22:11 +0000] [1] [INFO] Using worker: uvicorn.workers.UvicornWorker
api_1 | [2025-06-21 08:22:11 +0000] [25] [INFO] Booting worker with pid: 25
^CERROR: Aborting.
_ END Logs _
And these are the migration logs:
Select a Service you want to view the logs for:
1) Web
2) Space
3) API
4) Worker
5) Beat-Worker
6) Migrator
7) Proxy
8) Redis
9) Postgres
10) Minio
11) RabbitMQ
0) Back to Main Menu
Service: 6
Service 'migrator' is running.
Attaching to plane-app_migrator_1
migrator_1 | Waiting for database...
migrator_1 | Database available!
migrator_1 | Operations to perform:
migrator_1 | Apply all migrations: auth, contenttypes, db, django_celery_beat, license, sessions
migrator_1 | Running migrations:
migrator_1 | Applying contenttypes.0001_initial... OK
migrator_1 | Applying contenttypes.0002_remove_content_type_name... OK
migrator_1 | Applying auth.0001_initial... OK
migrator_1 | Applying auth.0002_alter_permission_name_max_length... OK
migrator_1 | Applying auth.0003_alter_user_email_max_length... OK
migrator_1 | Applying auth.0004_alter_user_username_opts... OK
migrator_1 | Applying auth.0005_alter_user_last_login_null... OK
migrator_1 | Applying auth.0006_require_contenttypes_0002... OK
migrator_1 | Applying auth.0007_alter_validators_add_error_messages... OK
migrator_1 | Applying auth.0008_alter_user_username_max_length... OK
migrator_1 | Applying auth.0009_alter_user_last_name_max_length... OK
migrator_1 | Applying auth.0010_alter_group_name_max_length... OK
migrator_1 | Applying auth.0011_update_proxy_permissions... OK
migrator_1 | Applying auth.0012_alter_user_first_name_max_length... OK
migrator_1 | Applying db.0001_initial... OK
migrator_1 | Applying db.0002_auto_20221104_2239... OK
migrator_1 | Applying db.0003_auto_20221109_2320... OK
migrator_1 | Applying db.0004_alter_state_sequence... OK
migrator_1 | Applying db.0005_auto_20221114_2127... OK
migrator_1 | Applying db.0006_alter_cycle_status... OK
migrator_1 | Applying db.0007_label_parent... OK
migrator_1 | Applying db.0008_label_colour... OK
migrator_1 | Applying db.0009_auto_20221208_0310... OK
migrator_1 | Applying db.0010_auto_20221213_0037... OK
migrator_1 | Applying db.0011_auto_20221222_2357... OK
migrator_1 | Applying db.0012_auto_20230104_0117... OK
migrator_1 | Applying db.0013_auto_20230107_0041... OK
migrator_1 | Applying db.0014_alter_workspacememberinvite_unique_together... OK
migrator_1 | Applying db.0015_auto_20230107_1636... OK
migrator_1 | Applying db.0016_auto_20230107_1735... OK
migrator_1 | Applying db.0017_alter_workspace_unique_together... OK
migrator_1 | Applying db.0018_auto_20230130_0119... OK
migrator_1 | Applying db.0019_auto_20230131_0049... OK
migrator_1 | Applying db.0020_auto_20230214_0118... OK
migrator_1 | Applying db.0021_auto_20230223_0104... OK
migrator_1 | Applying db.0022_auto_20230307_0304... OK
migrator_1 | Applying db.0023_auto_20230316_0040... OK
migrator_1 | Applying db.0024_auto_20230322_0138... OK
migrator_1 | Applying db.0025_auto_20230331_0203... OK
migrator_1 | Applying db.0026_alter_projectmember_view_props... OK
migrator_1 | Applying db.0027_auto_20230409_0312... OK
migrator_1 | Applying db.0028_auto_20230414_1703... OK
migrator_1 | Applying db.0029_auto_20230502_0126... OK
migrator_1 | Applying db.0030_alter_estimatepoint_unique_together... OK
migrator_1 | Applying db.0031_analyticview... OK
migrator_1 | Applying db.0032_auto_20230520_2015... OK
migrator_1 | Applying db.0033_auto_20230618_2125... OK
migrator_1 | Applying db.0034_auto_20230628_1046... OK
migrator_1 | Applying db.0035_auto_20230704_2225... OK
migrator_1 | Applying db.0036_alter_workspace_organization_size... OK
migrator_1 | Applying db.0037_issue_archived_at_project_archive_in_and_more... OK
migrator_1 | Applying db.0038_auto_20230720_1505... OK
migrator_1 | Applying db.0039_auto_20230723_2203... OK
migrator_1 | Applying db.0040_projectmember_preferences_user_cover_image_and_more... OK
migrator_1 | Applying db.0041_cycle_sort_order_issuecomment_access_and_more... OK
migrator_1 | Applying db.0042_alter_analyticview_created_by_and_more... OK
migrator_1 | Applying db.0043_alter_analyticview_created_by_and_more... OK
migrator_1 | Applying db.0044_auto_20230913_0709... OK
migrator_1 | Applying db.0045_issueactivity_epoch_workspacemember_issue_props_and_more... OK
migrator_1 | Applying db.0046_label_sort_order_alter_analyticview_created_by_and_more... OK
migrator_1 | Applying db.0047_webhook_apitoken_description_apitoken_expired_at_and_more... OK
migrator_1 | Applying db.0048_auto_20231116_0713... OK
migrator_1 | Applying db.0049_auto_20231116_0713... OK
migrator_1 | Applying db.0050_user_use_case_alter_workspace_organization_size... OK
migrator_1 | Applying db.0051_cycle_external_id_cycle_external_source_and_more... OK
migrator_1 | Applying db.0052_auto_20231220_1141... OK
migrator_1 | Applying db.0053_auto_20240102_1315... OK
migrator_1 | Applying db.0054_dashboard_widget_dashboardwidget... OK
migrator_1 | Applying db.0055_auto_20240108_0648... OK
migrator_1 | Applying db.0056_usernotificationpreference_emailnotificationlog... OK
migrator_1 | Applying db.0057_auto_20240122_0901... OK
migrator_1 | Applying db.0058_alter_moduleissue_issue_and_more... OK
migrator_1 | Applying db.0059_auto_20240208_0957... OK
migrator_1 | Applying db.0060_cycle_progress_snapshot... OK
migrator_1 | Applying db.0061_project_logo_props... OK
migrator_1 | Applying db.0062_cycle_archived_at_module_archived_at_and_more... OK
migrator_1 | Applying db.0063_state_is_triage_alter_state_group... OK
migrator_1 | Applying db.0064_auto_20240409_1134... OK
migrator_1 | Applying db.0065_auto_20240415_0937... OK
migrator_1 | Applying db.0066_account_id_token_cycle_logo_props_module_logo_props... OK
migrator_1 | Applying db.0067_issue_estimate... OK
migrator_1 | Applying db.0068_remove_pagelabel_project_remove_pagelog_project_and_more... OK
migrator_1 | Applying db.0069_alter_account_provider_and_more... OK
migrator_1 | Applying db.0070_apitoken_is_service_exporterhistory_filters_and_more... OK
migrator_1 | Applying db.0071_rename_issueproperty_issueuserproperty_and_more... OK
migrator_1 | Applying db.0072_issueattachment_external_id_and_more... OK
migrator_1 | Applying db.0073_alter_commentreaction_unique_together_and_more... OK
migrator_1 | Applying db.0074_deploy_board_and_project_issues... OK
migrator_1 | Applying db.0075_alter_fileasset_asset... OK
migrator_1 | Applying db.0076_alter_projectmember_role_and_more... OK
migrator_1 | Applying db.0077_draftissue_cycle_user_timezone_project_user_timezone_and_more... OK
migrator_1 | Applying db.0078_fileasset_comment_fileasset_entity_type_and_more... OK
migrator_1 | Applying db.0079_auto_20241009_0619... OK
migrator_1 | Applying db.0080_fileasset_draft_issue_alter_fileasset_entity_type... OK
migrator_1 | Applying db.0081_remove_globalview_created_by_and_more... OK
migrator_1 | Applying db.0082_alter_issue_managers_alter_cycleissue_issue_and_more... OK
migrator_1 | Applying db.0083_device_workspace_timezone_and_more... OK
migrator_1 | Applying db.0084_remove_label_label_unique_name_project_when_deleted_at_null_and_more... OK
migrator_1 | Applying db.0085_intake_intakeissue_remove_inboxissue_created_by_and_more... OK
migrator_1 | Applying db.0086_issueversion_alter_teampage_unique_together_and_more... OK
migrator_1 | Applying db.0087_remove_issueversion_description_and_more... OK
migrator_1 | Applying db.0088_sticky_sort_order_workspaceuserlink... OK
migrator_1 | Applying db.0089_workspacehomepreference_and_more... OK
migrator_1 | Applying db.0090_rename_dashboard_deprecateddashboard_and_more... OK
migrator_1 | Applying db.0091_issuecomment_edited_at_and_more... OK
migrator_1 | Applying db.0092_alter_deprecateddashboardwidget_unique_together_and_more... OK
migrator_1 | Applying db.0093_page_moved_to_page_page_moved_to_project_and_more... OK
migrator_1 | Applying django_celery_beat.0001_initial... OK
migrator_1 | Applying django_celery_beat.0002_auto_20161118_0346... OK
migrator_1 | Applying django_celery_beat.0003_auto_20161209_0049... OK
migrator_1 | Applying django_celery_beat.0004_auto_20170221_0000... OK
migrator_1 | Applying django_celery_beat.0005_add_solarschedule_events_choices... OK
migrator_1 | Applying django_celery_beat.0006_auto_20180322_0932... OK
migrator_1 | Applying django_celery_beat.0007_auto_20180521_0826... OK
migrator_1 | Applying django_celery_beat.0008_auto_20180914_1922... OK
migrator_1 | Applying django_celery_beat.0006_auto_20180210_1226... OK
migrator_1 | Applying django_celery_beat.0006_periodictask_priority... OK
migrator_1 | Applying django_celery_beat.0009_periodictask_headers... OK
migrator_1 | Applying django_celery_beat.0010_auto_20190429_0326... OK
migrator_1 | Applying django_celery_beat.0011_auto_20190508_0153... OK
migrator_1 | Applying django_celery_beat.0012_periodictask_expire_seconds... OK
migrator_1 | Applying django_celery_beat.0013_auto_20200609_0727... OK
migrator_1 | Applying django_celery_beat.0014_remove_clockedschedule_enabled... OK
migrator_1 | Applying django_celery_beat.0015_edit_solarschedule_events_choices... OK
migrator_1 | Applying django_celery_beat.0016_alter_crontabschedule_timezone... OK
migrator_1 | Applying django_celery_beat.0017_alter_crontabschedule_month_of_year... OK
migrator_1 | Applying django_celery_beat.0018_improve_crontab_helptext... OK
migrator_1 | Applying license.0001_initial... OK
migrator_1 | Applying license.0002_rename_version_instance_current_version_and_more... OK
migrator_1 | Applying license.0003_alter_changelog_title_alter_changelog_version_and_more... OK
migrator_1 | Applying license.0004_changelog_deleted_at_instance_deleted_at_and_more... OK
migrator_1 | Applying license.0005_rename_product_instance_edition_and_more... OK
migrator_1 | Applying sessions.0001_initial... OK
plane-app_migrator_1 exited with code 0
@akshat5302 Sorry for bothering you, please take a look. Our application has been down for weeks.
To fix this, update your setup script to the latest version: Download latest setup script
- Small but helpful information: "Data Migration completed successfully ✅" returns very fast when I run `./setup.sh start`, but when I check the API logs, the migration actually took ~1 minute to complete.
- I purged all Docker data and installed a fresh app, then got this:
--------------------------------------------
____ _ /////////
| _ \| | __ _ _ __ ___ /////////
| |_) | |/ _` | '_ \ / _ \ ///// /////
| __/| | (_| | | | | __/ ///// /////
|_| |_|\__,_|_| |_|\___| ////
////
--------------------------------------------
Project management tool from the future
--------------------------------------------
Select a Action you want to perform:
1) Install
2) Start
3) Stop
4) Restart
5) Upgrade
6) View Logs
7) Backup Data
8) Exit
Action [2]: 1
Begin Installing Plane
Checking for the latest release...
Please wait while we check the availability of Docker images for the selected release (v0.26.1) with X86_64 support.
Plane supports amd64
Syncing environment variables...
Environment variables synced successfully
Updating custom variables...
Custom variables updated successfully
Pulls images for services defined in a Compose file, but does not start the containers.
Usage: pull [options] [--] [SERVICE...]
Options:
--ignore-pull-failures Pull what it can and ignores images with pull failures.
--parallel Deprecated, pull multiple images in parallel (enabled by default).
--no-parallel Disable parallel pulling.
-q, --quiet Pull without printing progress information
--include-deps Also pull services declared as dependencies
Most recent version of Plane is now available for you to use
In case of 'Upgrade', please check the 'plane.env 'file for any new variables and update them accordingly
It did not pull any containers. I'm following the Install Community Edition instructions.
Hey @hgaquan is your api container running properly? The above shared logs looks fine to me
If I use docker-compose, it seems to start. But I still cannot start it using the ./setup.sh command. I want to start with setup.sh and then use restore.sh; please see the restore problem above.
I was able to run on plane with the ./setup, for some reason it was a problem of how the .env was configured.
@sentryf Yes, I think so. If I change something in .env it can run, but after doing ./restore.sh it seems to conflict with the current configuration.
It seems this issue will happen if you put custom passwords in the env file. It is fixable by filling in the _URL variables for postgres, redis, and rabbitmq. Apparently the Plane containers do not use the individual variables to construct the relevant connection URLs, and setup.sh will hang on the API phase if auth fails for any of the other services. The Plane docs have hints for the URL formats. I'm relatively sure this would work (if Redis takes a user and password, maybe the postgres-style vars can be used):
DATABASE_URL=postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@${PGHOST}:${POSTGRES_PORT}/${POSTGRES_DB}
REDIS_URL=redis://${REDIS_HOST}:${REDIS_PORT}
AMQP_URL=amqp://${RABBITMQ_USER}:${RABBITMQ_PASSWORD}@${RABBITMQ_HOST}:${RABBITMQ_PORT}/${RABBITMQ_VHOST}
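As a sanity check of those templates, here is how the expansion looks with placeholder values. The variable names mirror the comment above and the service hostnames match the Compose file linked earlier, but the values are examples only; check your own plane.env and the Plane docs for the exact names.

```shell
# Example values only; substitute the ones from your plane.env.
POSTGRES_USER=plane POSTGRES_PASSWORD=s3cret PGHOST=plane-db POSTGRES_PORT=5432 POSTGRES_DB=plane
REDIS_HOST=plane-redis REDIS_PORT=6379
RABBITMQ_USER=plane RABBITMQ_PASSWORD=s3cret RABBITMQ_HOST=plane-mq RABBITMQ_PORT=5672 RABBITMQ_VHOST=plane

# Expand the URL templates exactly as the env file would:
DATABASE_URL="postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@${PGHOST}:${POSTGRES_PORT}/${POSTGRES_DB}"
REDIS_URL="redis://${REDIS_HOST}:${REDIS_PORT}"
AMQP_URL="amqp://${RABBITMQ_USER}:${RABBITMQ_PASSWORD}@${RABBITMQ_HOST}:${RABBITMQ_PORT}/${RABBITMQ_VHOST}"

echo "$DATABASE_URL"   # -> postgresql://plane:s3cret@plane-db:5432/plane
echo "$AMQP_URL"       # -> amqp://plane:s3cret@plane-mq:5672/plane
```

Note that a password containing URL-reserved characters (`@`, `/`, `:`) would itself need percent-encoding inside these URLs.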
I upgraded to version 0.27.0 but am still stuck on this problem. I deleted all Docker containers, images, and volumes, then downloaded the latest script to install a fresh instance, but something went wrong when I chose 1 to install: everything ran very fast without pulling any Docker images.
This problem only happens when you run these actions on an old server where Plane was installed before; a fresh server doesn't have it.
These are the logs after I removed everything and tried to install Plane again.
--------------------------------------------
____ _ /////////
| _ \| | __ _ _ __ ___ /////////
| |_) | |/ _` | '_ \ / _ \ ///// /////
| __/| | (_| | | | | __/ ///// /////
|_| |_|\__,_|_| |_|\___| ////
////
--------------------------------------------
Project management tool from the future
--------------------------------------------
Select a Action you want to perform:
1) Install
2) Start
3) Stop
4) Restart
5) Upgrade
6) View Logs
7) Backup Data
8) Exit
Action [2]: 1
Begin Installing Plane
Please wait while we check the availability of Docker images for the selected release (v0.27.0) with X86_64 support.
Plane supports amd64
Syncing environment variables...
Updating custom variables...
Custom variables updated successfully
sed: -e expression #1, char 29: unknown option to `s'
sed: -e expression #1, char 55: unknown option to `s'
sed: -e expression #1, char 39: unknown option to `s'
sed: -e expression #1, char 22: unknown option to `s'
sed: -e expression #1, char 53: unknown option to `s'
sed: -e expression #1, char 50: unknown option to `s'
sed: -e expression #1, char 57: unknown option to `s'
Environment variables synced successfully
Updating custom variables...
sed: -e expression #1, char 57: unknown option to `s'
Custom variables updated successfully
Pulls images for services defined in a Compose file, but does not start the containers.
Usage: pull [options] [--] [SERVICE...]
Options:
--ignore-pull-failures Pull what it can and ignores images with pull failures.
--parallel Deprecated, pull multiple images in parallel (enabled by default).
--no-parallel Disable parallel pulling.
-q, --quiet Pull without printing progress information
--include-deps Also pull services declared as dependencies
Most recent version of Plane is now available for you to use
In case of 'Upgrade', please check the 'plane.env 'file for any new variables and update them accordingly
Hope this is helpful!
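For what it's worth, the `` sed: unknown option to `s' `` lines in the install log above are the classic symptom of a replacement value (for example a password being written into plane.env) containing the `/` character that the `s///` expression uses as its delimiter. A minimal reproduction with a made-up value:

```shell
value='p/assw0rd'   # made-up password containing the sed delimiter

# Broken: the `/` inside $value terminates the s/// expression early,
# which sed reports as "unknown option to `s'". (Silenced here with || true.)
echo 'KEY=old' | sed "s/old/${value}/" 2>/dev/null || true

# Works: pick a delimiter that cannot occur in the value, e.g. `|`.
echo 'KEY=old' | sed "s|old|${value}|"   # -> KEY=p/assw0rd
```

If that is the cause here, values containing `/`, `&`, or the chosen delimiter would need escaping (or a different delimiter) before setup.sh feeds them to sed.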
Hi @hgaquan, could you please share your Docker Compose version and OS name?
Having the same problem here on Ubuntu 24.04 LTS. I followed the instructions for the Community Edition at https://developers.plane.so/self-hosting/methods/docker-compose#install-community-edition exactly as written. It works fine until I get to the part where I start the containers; it just hangs at the "Waiting for API to start" line. This happens on a brand new, clean, fresh install.
Hi @akshat5302 , this is my information:
docker-compose -v
docker-compose version 1.29.2, build unknown
lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 24.04.2 LTS
Release: 24.04
Codename: noble
You are currently using an outdated version of Docker Compose. The v1 format has been deprecated for a long time. Please upgrade to Docker Compose v2.
After upgrading, also ensure that the v1 binary is completely removed from your system to avoid conflicts.
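One hedged way to confirm from the version string alone that the pasted binary is Compose v1 (the `needs_upgrade` helper below is illustrative, not part of setup.sh):

```shell
# Illustrative helper: classify a Compose version string.
# Compose v1 reports "docker-compose version 1.x.y"; the v2 plugin
# reports "Docker Compose version v2.x.y".
needs_upgrade() {
  case "$1" in
    *"version 1."*) echo "yes" ;;   # standalone v1 binary: upgrade
    *)              echo "no"  ;;   # v2 plugin (or unknown): leave as-is
  esac
}

needs_upgrade "docker-compose version 1.29.2, build unknown"   # -> yes
needs_upgrade "Docker Compose version v2.27.0"                 # -> no
```

On Ubuntu, Compose v2 is typically installed as the `docker-compose-plugin` apt package and invoked as `docker compose` (no hyphen); see the Docker documentation for the exact steps for your setup.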
@akshat5302
After upgrading Docker Compose to the latest version, the install script works, but the start script gives the error below and still gets stuck at the API start stage. Let me know if you need any additional information.
--------------------------------------------
____ _ /////////
| _ \| | __ _ _ __ ___ /////////
| |_) | |/ _` | '_ \ / _ \ ///// /////
| __/| | (_| | | | | __/ ///// /////
|_| |_|\__,_|_| |_|\___| ////
////
--------------------------------------------
Project management tool from the future
--------------------------------------------
[+] Running 9/13
! proxy Interrupted 1.3s
! space Interrupted 1.3s
! admin Interrupted 1.3s
! migrator Interrupted 1.3s
⠸ plane-redis Pulling 1.3s
⠸ plane-minio Pulling 1.3s
⠸ plane-mq Pulling 1.3s
⠹ plane-db Pulling 1.3s
! live Interrupted 1.3s
✘ beat-worker Error manifest for artifacts.plane.... 1.3s
! web Interrupted 1.3s
! api Interrupted 1.3s
! worker Interrupted 1.3s
Error response from daemon: manifest for artifacts.plane.so/makeplane/plane-backend:local not found: manifest unknown: manifest unknown
Data Migration completed successfully ✅
>> Waiting for API Service to Start............
It looks like something might be misconfigured in your current setup. Could you try doing a fresh installation to ensure everything is set up correctly?
Hi @akshat5302, I confirm that upgrading docker-compose to the latest version fixed my problem. One small thing that confused me for hours is the comment "# Comment this if you already have a reverse proxy running": the proxy block in docker-compose.yml should actually remain, and then we use a reverse proxy like nginx to proxy_pass to it, shouldn't we? Correct me if I'm wrong.
@dojoca Try upgrading everything involved to the latest version and try again; I think it will be OK. I tested a few cases on different OSes and environments.
Has anyone solved this issue? I found that the migration is erroring out and looping:
migrator-1 | Operations to perform:
migrator-1 | Apply all migrations: auth, contenttypes, db, django_celery_beat, license, sessions
migrator-1 | Running migrations:
beat-worker-1 | Waiting for database migrations to complete...
api-1 | Waiting for database migrations to complete...
worker-1 | Waiting for database migrations to complete...
migrator-1 | Traceback (most recent call last):
migrator-1 | File "/code/manage.py", line 15, in <module>
migrator-1 | execute_from_command_line(sys.argv)
migrator-1 | File "/usr/local/lib/python3.12/site-packages/django/core/management/__init__.py", line 442, in execute_from_command_line
migrator-1 | utility.execute()
migrator-1 | File "/usr/local/lib/python3.12/site-packages/django/core/management/__init__.py", line 436, in execute
migrator-1 | self.fetch_command(subcommand).run_from_argv(self.argv)
migrator-1 | File "/usr/local/lib/python3.12/site-packages/django/core/management/base.py", line 412, in run_from_argv
migrator-1 | self.execute(*args, **cmd_options)
migrator-1 | File "/usr/local/lib/python3.12/site-packages/django/core/management/base.py", line 458, in execute
migrator-1 | output = self.handle(*args, **options)
migrator-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
migrator-1 | File "/usr/local/lib/python3.12/site-packages/django/core/management/base.py", line 106, in wrapper
migrator-1 | res = handle_func(*args, **kwargs)
migrator-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
migrator-1 | File "/usr/local/lib/python3.12/site-packages/django/core/management/commands/migrate.py", line 356, in handle
migrator-1 | post_migrate_state = executor.migrate(
migrator-1 | ^^^^^^^^^^^^^^^^^
migrator-1 | File "/usr/local/lib/python3.12/site-packages/django/db/migrations/executor.py", line 135, in migrate
migrator-1 | state = self._migrate_all_forwards(
migrator-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^
migrator-1 | File "/usr/local/lib/python3.12/site-packages/django/db/migrations/executor.py", line 167, in _migrate_all_forwards
migrator-1 | state = self.apply_migration(
migrator-1 | ^^^^^^^^^^^^^^^^^^^^^
migrator-1 | File "/usr/local/lib/python3.12/site-packages/django/db/migrations/executor.py", line 252, in apply_migration
migrator-1 | state = migration.apply(state, schema_editor)
migrator-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
migrator-1 | File "/usr/local/lib/python3.12/site-packages/django/db/migrations/migration.py", line 132, in apply
migrator-1 | operation.database_forwards(
migrator-1 | File "/usr/local/lib/python3.12/site-packages/django/db/migrations/operations/models.py", line 659, in database_forwards
migrator-1 | alter_together(
migrator-1 | File "/usr/local/lib/python3.12/site-packages/django/db/backends/base/schema.py", line 554, in alter_unique_together
migrator-1 | self._delete_composed_index(
migrator-1 | File "/usr/local/lib/python3.12/site-packages/django/db/backends/base/schema.py", line 610, in _delete_composed_index
migrator-1 | raise ValueError(
migrator-1 | ValueError: Found wrong number (0) of constraints for workspace_user_properties(workspace_id, user_id)
migrator-1 exited with code 0
migrator-1 | Waiting for database...
migrator-1 | Database available!
beat-worker-1 | Waiting for database migrations to complete...
api-1 | Waiting for database migrations to complete...
worker-1 | Waiting for database migrations to complete...
migrator-1 | Operations to perform:
migrator-1 | Apply all migrations: auth, contenttypes, db, django_celery_beat, license, sessions
migrator-1 | Running migrations: