The Iceberg docker-compose environment won't start again after you press Ctrl+C twice to force-stop it and then bring it back up.
^CGracefully stopping... (press Ctrl+C again to force)
[+] Stopping 0/3
⠹ Container spark-iceberg Stopping 4.3s
⠹ Container starrocks-be Stopping 4.3s
⠹ Container mc Stopping 4.3s
[+] Killing 6/6
 ✔ Container iceberg-rest Killed 0.3s
 ✔ Container spark-iceberg Killed 0.4s
✔ Container spark-iceberg Stopped 4.7s
✔ Container starrocks-be Stopped 4.7s
✔ Container mc Stopped 4.5s
✔ Container starrocks-fe Stopped 0.0s
✔ Container iceberg-rest Stopped 0.0s
✔ Container minio Stopped 0.0s
canceled
atwong@Albert-CelerData iceberg % docker-compose up
[+] Running 6/6
✔ Container iceberg-rest Created 0.0s
✔ Container minio Created 0.0s
✔ Container starrocks-fe Created 0.0s
✔ Container mc Created 0.0s
✔ Container spark-iceberg Created 0.0s
✔ Container starrocks-be Created 0.0s
Attaching to iceberg-rest, mc, minio, spark-iceberg, starrocks-be, starrocks-fe
starrocks-fe | /opt/starrocks/fe/bin/start_fe.sh: 56: source: not found
starrocks-fe | /opt/starrocks/fe/bin/start_fe.sh: 67: export_env_from_conf: not found
starrocks-fe | /opt/starrocks/fe/bin/start_fe.sh: 70: source: not found
starrocks-fe | /opt/starrocks/fe/bin/start_fe.sh: 74: [[: not found
starrocks-fe | /opt/starrocks/fe/bin/start_fe.sh: 108: jdk_version: not found
starrocks-fe | /opt/starrocks/fe/bin/start_fe.sh: 110: [[: not found
starrocks-fe | /opt/starrocks/fe/bin/start_fe.sh: 133: detect_jvm_xmx: not found
starrocks-fe | /opt/starrocks/fe/bin/start_fe.sh: 136: [[: not found
starrocks-fe | Frontend running as process 1. Stop it first.
minio | MinIO Object Storage Server
minio | Copyright: 2015-2023 MinIO, Inc.
minio | License: GNU AGPLv3 <https://www.gnu.org/licenses/agpl-3.0.html>
minio | Version: RELEASE.2023-11-20T22-40-07Z (go1.21.4 linux/arm64)
minio |
minio | Status: 1 Online, 0 Offline.
minio | S3-API: http://192.168.0.4:9000 http://127.0.0.1:9000
minio | Console: http://192.168.0.4:9001 http://127.0.0.1:9001
minio |
minio | Documentation: https://min.io/docs/minio/linux/index.html
minio | Warning: The standard parity is set to 0. This can lead to data loss.
iceberg-rest | 2024-02-02T19:44:34.911 INFO [org.apache.iceberg.rest.RESTCatalogServer] - Creating catalog with properties: {jdbc.password=password, s3.endpoint=http://minio:9000, jdbc.user=user, io-impl=org.apache.iceberg.aws.s3.S3FileIO, catalog-impl=org.apache.iceberg.jdbc.JdbcCatalog, warehouse=s3://warehouse/, uri=jdbc:sqlite:file:/tmp/iceberg_rest_mode=memory}
iceberg-rest | 2024-02-02T19:44:34.925 INFO [org.apache.iceberg.CatalogUtil] - Loading custom FileIO implementation: org.apache.iceberg.aws.s3.S3FileIO
minio |
minio | You are running an older version of MinIO released 2 months before the latest release
minio | Update: Run `mc admin update`
minio |
minio |
starrocks-fe exited with code 1
iceberg-rest | 2024-02-02T19:44:35.055 INFO [org.eclipse.jetty.util.log] - Logging initialized @256ms to org.eclipse.jetty.util.log.Slf4jLog
iceberg-rest | 2024-02-02T19:44:35.085 INFO [org.eclipse.jetty.server.Server] - jetty-9.4.51.v20230217; built: 2023-02-17T08:19:37.309Z; git: b45c405e4544384de066f814ed42ae3dceacdd49; jvm 17.0.9+8-LTS
iceberg-rest | 2024-02-02T19:44:35.093 INFO [org.eclipse.jetty.server.handler.ContextHandler] - Started o.e.j.s.ServletContextHandler@574b560f{/,null,AVAILABLE}
iceberg-rest | 2024-02-02T19:44:35.100 INFO [org.eclipse.jetty.server.AbstractConnector] - Started ServerConnector@b2c9a9c{HTTP/1.1, (http/1.1)}{0.0.0.0:8181}
iceberg-rest | 2024-02-02T19:44:35.100 INFO [org.eclipse.jetty.server.Server] - Started @301ms
spark-iceberg | starting org.apache.spark.deploy.master.Master, logging to /opt/spark/logs/spark--org.apache.spark.deploy.master.Master-1-57b8cb2c0cbd.out
mc | Added `minio` successfully.
mc | mc: <ERROR> Unable to make bucket `minio/warehouse`. Your previous request to create the named bucket succeeded and you already own it.
mc | mc: Please use 'mc anonymous'
spark-iceberg | starting org.apache.spark.deploy.worker.Worker, logging to /opt/spark/logs/spark--org.apache.spark.deploy.worker.Worker-1-57b8cb2c0cbd.out
spark-iceberg | starting org.apache.spark.deploy.history.HistoryServer, logging to /opt/spark/logs/spark--org.apache.spark.deploy.history.HistoryServer-1-57b8cb2c0cbd.out
spark-iceberg | starting org.apache.spark.sql.hive.thriftserver.HiveThriftServer2, logging to /opt/spark/logs/spark--org.apache.spark.sql.hive.thriftserver.HiveThriftServer2-1-57b8cb2c0cbd.out
spark-iceberg | [I 2024-02-02 19:44:45.609 ServerApp] Package notebook took 0.0000s to import
spark-iceberg | [I 2024-02-02 19:44:45.616 ServerApp] Package jupysql_plugin took 0.0072s to import
spark-iceberg | [I 2024-02-02 19:44:45.621 ServerApp] Package jupyter_lsp took 0.0041s to import
spark-iceberg | [W 2024-02-02 19:44:45.621 ServerApp] A `_jupyter_server_extension_points` function was not found in jupyter_lsp. Instead, a `_jupyter_server_extension_paths` function was found and will be used for now. This function name will be deprecated in future releases of Jupyter Server.
spark-iceberg | [I 2024-02-02 19:44:45.623 ServerApp] Package jupyter_server_terminals took 0.0018s to import
spark-iceberg | [I 2024-02-02 19:44:45.623 ServerApp] Package jupyterlab took 0.0000s to import
spark-iceberg | [I 2024-02-02 19:44:45.636 ServerApp] Package notebook_shim took 0.0000s to import
spark-iceberg | [W 2024-02-02 19:44:45.636 ServerApp] A `_jupyter_server_extension_points` function was not found in notebook_shim. Instead, a `_jupyter_server_extension_paths` function was found and will be used for now. This function name will be deprecated in future releases of Jupyter Server.
spark-iceberg | [I 2024-02-02 19:44:45.636 ServerApp] jupysql_plugin | extension was successfully linked.
spark-iceberg | [I 2024-02-02 19:44:45.636 ServerApp] jupyter_lsp | extension was successfully linked.
spark-iceberg | [I 2024-02-02 19:44:45.638 ServerApp] jupyter_server_terminals | extension was successfully linked.
spark-iceberg | [W 2024-02-02 19:44:45.639 LabApp] 'token' has moved from NotebookApp to ServerApp. This config will be passed to ServerApp. Be sure to update your config before our next release.
spark-iceberg | [W 2024-02-02 19:44:45.639 LabApp] 'password' has moved from NotebookApp to ServerApp. This config will be passed to ServerApp. Be sure to update your config before our next release.
spark-iceberg | [W 2024-02-02 19:44:45.640 ServerApp] ServerApp.token config is deprecated in 2.0. Use IdentityProvider.token.
spark-iceberg | [I 2024-02-02 19:44:45.640 ServerApp] jupyterlab | extension was successfully linked.
spark-iceberg | [I 2024-02-02 19:44:45.643 ServerApp] notebook | extension was successfully linked.
spark-iceberg | [I 2024-02-02 19:44:45.739 ServerApp] notebook_shim | extension was successfully linked.
spark-iceberg | [W 2024-02-02 19:44:45.747 ServerApp] WARNING: The Jupyter server is listening on all IP addresses and not using encryption. This is not recommended.
spark-iceberg | [W 2024-02-02 19:44:45.747 ServerApp] WARNING: The Jupyter server is listening on all IP addresses and not using authentication. This is highly insecure and not recommended.
spark-iceberg | [I 2024-02-02 19:44:45.748 ServerApp] notebook_shim | extension was successfully loaded.
spark-iceberg | [I 2024-02-02 19:44:45.748 ServerApp] Registered jupysql-plugin server extension
spark-iceberg | [I 2024-02-02 19:44:45.748 ServerApp] jupysql_plugin | extension was successfully loaded.
spark-iceberg | [I 2024-02-02 19:44:45.749 ServerApp] jupyter_lsp | extension was successfully loaded.
spark-iceberg | [I 2024-02-02 19:44:45.750 ServerApp] jupyter_server_terminals | extension was successfully loaded.
spark-iceberg | [I 2024-02-02 19:44:45.750 LabApp] JupyterLab extension loaded from /usr/local/lib/python3.9/site-packages/jupyterlab
spark-iceberg | [I 2024-02-02 19:44:45.750 LabApp] JupyterLab application directory is /usr/local/share/jupyter/lab
spark-iceberg | [I 2024-02-02 19:44:45.751 LabApp] Extension Manager is 'pypi'.
spark-iceberg | [I 2024-02-02 19:44:45.752 ServerApp] jupyterlab | extension was successfully loaded.
spark-iceberg | [I 2024-02-02 19:44:45.753 ServerApp] notebook | extension was successfully loaded.
spark-iceberg | [I 2024-02-02 19:44:45.754 ServerApp] Serving notebooks from local directory: /home/iceberg/notebooks
spark-iceberg | [I 2024-02-02 19:44:45.754 ServerApp] Jupyter Server 2.10.0 is running at:
spark-iceberg | [I 2024-02-02 19:44:45.754 ServerApp] http://localhost:8888/tree
spark-iceberg | [I 2024-02-02 19:44:45.754 ServerApp] http://127.0.0.1:8888/tree
spark-iceberg | [I 2024-02-02 19:44:45.754 ServerApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
spark-iceberg | [I 2024-02-02 19:44:45.763 ServerApp] Skipped non-installed server(s): bash-language-server, dockerfile-language-server-nodejs, javascript-typescript-langserver, jedi-language-server, julia-language-server, pyright, python-language-server, python-lsp-server, r-languageserver, sql-language-server, texlab, typescript-language-server, unified-language-server, vscode-css-languageserver-bin, vscode-html-languageserver-bin, vscode-json-languageserver-bin, yaml-language-server
starrocks-be | ERROR 2005 (HY000): Unknown MySQL server host 'starrocks-fe' (-2)
starrocks-be | Backend running as process 1. Stop it first.
starrocks-be exited with code 1
It only works again when you delete the containers and start the environment fresh.
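The "running as process 1. Stop it first." messages in the log above hint at why fresh containers work while restarted ones don't: a PID file written during the first run survives inside the stopped container, and after a restart the entrypoint itself is PID 1, so a naive "is the recorded PID still alive?" check always concludes a frontend is already running. A minimal sketch of that failure mode (the PID-file path is hypothetical, not StarRocks' actual layout):

```shell
#!/bin/bash
# Hypothetical illustration only: simulate a PID file left behind by the
# previous container run. In a restarted container the recorded PID is 1.
PID_FILE=/tmp/fe.pid   # assumed path, for illustration
echo 1 > "$PID_FILE"

pid=$(cat "$PID_FILE")
# /proc/1 always exists inside a running Linux container (it is the
# entrypoint itself), so this liveness check misfires on every restart.
if [ -d "/proc/$pid" ]; then
    echo "Frontend running as process $pid. Stop it first."
else
    echo "starting frontend"
fi
```

Deleting the containers deletes the stale PID file along with them, which matches the observation that only a fresh environment starts cleanly.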
I am also getting the same issue with the containers. We sometimes see it even with fresh containers, but mostly when restarting stopped containers.
You can reproduce it easily with the docker-compose file from the shared-data quickstart tutorial:
~/temp/test_starrocks$ curl -O -s https://raw.githubusercontent.com/StarRocks/demo/master/documentation-samples/quickstart/docker-compose.yml
~/temp/test_starrocks$ docker-compose up -d
Creating network "test_starrocks_default" with the default driver
Creating minio ... done
Creating test_starrocks_minio_mc_1 ... done
Creating starrocks-fe ... done
Creating starrocks-cn ... done
~/temp/test_starrocks$ docker compose logs starrocks-fe
~/temp/test_starrocks$ docker compose logs starrocks-cn
~/temp/test_starrocks$ docker compose stop
[+] Running 4/4
✔ Container starrocks-cn Stopped 10.5s
✔ Container test_starrocks_minio_mc_1 Stopped 0.0s
✔ Container starrocks-fe Stopped 1.0s
✔ Container minio Stopped 0.5s
~/temp/test_starrocks$ docker-compose up -d
Starting minio ... done
Starting test_starrocks_minio_mc_1 ... done
Starting starrocks-fe ... done
Starting starrocks-cn ... done
~/temp/test_starrocks$ docker compose logs starrocks-fe
~/temp/test_starrocks$ docker compose logs starrocks-cn
~/temp/test_starrocks$ docker compose stop
[+] Running 4/4
✔ Container test_starrocks_minio_mc_1 Stopped 0.0s
✔ Container starrocks-cn Stopped 1.5s
✔ Container starrocks-fe Stopped 0.9s
✔ Container minio Stopped 0.4s
~/temp/test_starrocks$ docker-compose up -d
Starting minio ... done
Starting test_starrocks_minio_mc_1 ... done
Starting starrocks-fe ... done
Starting starrocks-cn ... done
~/temp/test_starrocks$ docker compose logs starrocks-fe
~/temp/test_starrocks$ docker compose logs starrocks-cn
starrocks-cn | ERROR 2003 (HY000): Can't connect to MySQL server on 'starrocks-fe:9030' (111)
starrocks-cn | Backend running as process 1. Stop it first.
I went through these steps several times, doing a `docker compose down` between runs. I never saw the error on the first restart, but it almost always happened on the second.
This will be fixed in https://github.com/StarRocks/starrocks/pull/49013.
Done. Wait for the next official release with the fix.
starrocks-fe | /opt/starrocks/fe/bin/start_fe.sh: 56: source: not found
Just passing by to say that this still occurs in the latest versions.
Manually replacing `sh` with `bash` in docker-compose.yml resolved this error for me.
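That fix makes sense: `source` and `[[` are bash builtins, not POSIX `sh` features, which is exactly why `start_fe.sh` prints `source: not found` and `[[: not found` when it is launched via a minimal `sh` (dash on Debian-based images). A small sketch of the symptom (the demo script path is arbitrary; the exact `command:` line in the compose file is whatever was replaced above):

```shell
# Write a tiny script that uses two bashisms, like start_fe.sh does.
cat > /tmp/bashism_demo.sh <<'EOF'
source /dev/null                  # 'source' is a bash builtin; POSIX sh only has '.'
[[ -n "ok" ]] && echo "bash ok"   # '[[' is also bash-only
EOF

# Under a strict POSIX sh (e.g. dash) this reproduces the "not found" errors;
# on systems where sh is bash it will happen to succeed, hence the guard.
sh /tmp/bashism_demo.sh 2>&1 || true

# Under bash it runs cleanly and prints "bash ok".
bash /tmp/bashism_demo.sh
```

Hence the fix: invoke the script with `bash` instead of `sh` in docker-compose.yml (or give the script an explicit `#!/bin/bash` shebang and execute it directly).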