Caching images for use in docker compose between jobs and runs.
Behaviour
This tool looks like exactly what I need, and it almost works, but I'm stuck on the following: only some of my images are cached between jobs and runs, while others are constantly rebuilt, and I cannot determine the reason for the difference. I must be doing something fundamentally wrong, but I cannot find it.
Steps to reproduce this issue
Here is how I am currently attempting to build and save images:
jobs:
  build_neurosynth_compose:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: compose
    steps:
      -
        name: Checkout
        uses: actions/checkout@v3
        with:
          submodules: recursive
      -
        name: Configuration
        run: |
          cp .env.example .env
      -
        name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      -
        name: Login to GitHub Container Registry
        uses: docker/login-action@v2
        with:
          registry: ghcr.io
          username: ${{ github.repository_owner }}
          password: ${{ secrets.GITHUB_TOKEN }}
      -
        name: Build and push
        uses: docker/bake-action@master
        with:
          files: docker-compose.yml,docker-compose.dev.yml
          push: true
          load: false
          workdir: compose
          set: |
            neurosynth.tags=ghcr.io/${{ github.repository_owner }}/neurosynth_compose:${{ hashFiles('**/compose/neurosynth_compose/**') }}
            neurosynth.cache-from=type=registry,ref=ghcr.io/${{ github.repository_owner }}/neurosynth_compose:${{ hashFiles('**/compose/neurosynth_compose/**') }}
            neurosynth.cache-from=type=gha,scope=cached-stage
            neurosynth.cache-to=type=gha,scope=cached-stage,mode=max
            nginx.tags=ghcr.io/${{ github.repository_owner }}/synth_nginx:${{ hashFiles('**/compose/nginx/**') }}
            nginx.cache-from=type=registry,ref=ghcr.io/${{ github.repository_owner }}/synth_nginx:${{ hashFiles('**/compose/nginx/**') }}
            nginx.cache-from=type=gha,scope=cached-stage
            nginx.cache-to=type=gha,scope=cached-stage,mode=max
            synth_pgsql.tags=ghcr.io/${{ github.repository_owner }}/synth_pgsql:${{ hashFiles('**/compose/postgres/**') }}
            synth_pgsql.cache-from=type=registry,ref=ghcr.io/${{ github.repository_owner }}/synth_pgsql:${{ hashFiles('**/compose/postgres/**') }}
            synth_pgsql.cache-from=type=gha,scope=cached-stage
            synth_pgsql.cache-to=type=gha,scope=cached-stage,mode=max
And here is where I try to consume the cache produced by the job above:
  neurosynth_compose_backend_tests:
    runs-on: ubuntu-latest
    needs: build_neurosynth_compose
    defaults:
      run:
        working-directory: compose
    steps:
      -
        name: Checkout
        uses: actions/checkout@v3
        with:
          submodules: recursive
      -
        name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      -
        name: Configuration
        run: |
          cp .env.example .env
      -
        name: load images
        uses: docker/bake-action@master
        with:
          files: docker-compose.yml,docker-compose.dev.yml
          push: false
          load: true
          workdir: compose
          set: |
            neurosynth.cache-from=type=gha,scope=cached-stage
            nginx.cache-from=type=gha,scope=cached-stage
            synth_pgsql.cache-from=type=gha,scope=cached-stage
      -
        name: Spin up backend
        run: |
          docker network create nginx-proxy
          docker-compose pull
          docker-compose \
            -f docker-compose.yml \
            -f docker-compose.dev.yml \
            up -d --no-build
      -
        name: Create Test Database
        run: |
          until docker-compose exec -T \
            synth_pgsql pg_isready -U postgres; do sleep 1; done
          docker-compose exec -T \
            synth_pgsql \
            psql -U postgres -c "create database test_db"
      -
        name: Backend Tests
        env:
          AUTH0_CLIENT_ID: ${{ secrets.AUTH0_CLIENT_ID }}
          AUTH0_CLIENT_SECRET: ${{ secrets.AUTH0_CLIENT_SECRET }}
          AUTH0_BASE_URL: ${{ secrets.AUTH0_BASE_URL }}
          AUTH0_ACCESS_TOKEN_URL: ${{ secrets.AUTH0_ACCESS_TOKEN_URL }}
          AUTH0_AUTH_URL: ${{ secrets.AUTH0_AUTH_URL }}
        run: |
          docker-compose run \
            -e "APP_SETTINGS=neurosynth_compose.config.DockerTestConfig" \
            -e "AUTH0_CLIENT_ID=${AUTH0_CLIENT_ID}" \
            -e "AUTH0_CLIENT_SECRET=${AUTH0_CLIENT_SECRET}" \
            -e "AUTH0_BASE_URL=${AUTH0_BASE_URL}" \
            -e "AUTH0_ACCESS_TOKEN_URL=${AUTH0_ACCESS_TOKEN_URL}" \
            -e "AUTH0_AUTH_URL=${AUTH0_AUTH_URL}" \
            --rm -w /neurosynth \
            neurosynth \
            python -m pytest neurosynth_compose/tests
Here are the compose files I'm using. First, docker-compose.yml:
version: "2"
services:
  neurosynth:
    image: neurosynth_compose
    restart: always
    build: ./neurosynth_compose
    expose:
      - "8000"
    volumes:
      - ./postgres/migrations:/migrations
      - ./:/neurosynth
    command: /usr/local/bin/gunicorn -w 2 -b :8000 neurosynth_compose.core:app --log-level debug --timeout 120
    env_file:
      - .env
    container_name: neurosynth_compose
  nginx:
    image: synth_nginx
    restart: always
    build: ./nginx
    expose:
      - "80"
    volumes:
      - ./nginx/nginx.conf:/etc/nginx/nginx.conf:ro
    volumes_from:
      - neurosynth
    environment:
      - VIRTUAL_HOST=${V_HOST}
      - LETSENCRYPT_HOST=${V_HOST}
  synth_pgsql:
    image: synth_pgsql
    restart: always
    build: ./postgres
    volumes:
      - postgres_data:/var/lib/postgresql/data
    expose:
      - '5432'
    env_file:
      - .env
volumes:
  postgres_data:
networks:
  default:
    external:
      name: nginx-proxy
And the docker-compose.dev.yml override:
version: "2"
services:
  nginx:
    ports:
      - "81:80"
  neurosynth:
    expose:
      - "8000"
    command: /usr/local/bin/gunicorn -w 2 -b :8000 neurosynth_compose.core:app --log-level debug --timeout 300 --reload
    restart: "no"
Expected behaviour
I expected the gha cache written in the build job to be reused in the test job, and the images pushed to ghcr.io to serve as a registry cache for the build step in subsequent runs.
Actual behaviour
Sometimes neurosynth is cached, other times nginx is, but synth_pgsql never is.
Configuration
- Repository URL (if public): https://github.com/neurostuff/neurostore
- Build URL (if public): https://github.com/neurostuff/neurostore/actions/runs/3094355644/jobs/5007643849
name: Testing Workflow
on: [workflow_dispatch,push]
jobs:
  build_neurosynth_compose:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: compose
    steps:
      -
        name: Checkout
        uses: actions/checkout@v3
        with:
          submodules: recursive
      -
        name: Configuration
        run: |
          cp .env.example .env
      -
        name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      -
        name: Login to GitHub Container Registry
        uses: docker/login-action@v2
        with:
          registry: ghcr.io
          username: ${{ github.repository_owner }}
          password: ${{ secrets.GITHUB_TOKEN }}
      -
        name: Build and push
        uses: docker/bake-action@master
        with:
          files: docker-compose.yml,docker-compose.dev.yml
          push: true
          load: false
          workdir: compose
          set: |
            neurosynth.tags=ghcr.io/${{ github.repository_owner }}/neurosynth_compose:${{ hashFiles('**/compose/neurosynth_compose/**') }}
            neurosynth.cache-from=type=registry,ref=ghcr.io/${{ github.repository_owner }}/neurosynth_compose:${{ hashFiles('**/compose/neurosynth_compose/**') }}
            neurosynth.cache-from=type=gha,scope=cached-stage
            neurosynth.cache-to=type=gha,scope=cached-stage,mode=max
            nginx.tags=ghcr.io/${{ github.repository_owner }}/synth_nginx:${{ hashFiles('**/compose/nginx/**') }}
            nginx.cache-from=type=registry,ref=ghcr.io/${{ github.repository_owner }}/synth_nginx:${{ hashFiles('**/compose/nginx/**') }}
            nginx.cache-from=type=gha,scope=cached-stage
            nginx.cache-to=type=gha,scope=cached-stage,mode=max
            synth_pgsql.tags=ghcr.io/${{ github.repository_owner }}/synth_pgsql:${{ hashFiles('**/compose/postgres/**') }}
            synth_pgsql.cache-from=type=registry,ref=ghcr.io/${{ github.repository_owner }}/synth_pgsql:${{ hashFiles('**/compose/postgres/**') }}
            synth_pgsql.cache-from=type=gha,scope=cached-stage
            synth_pgsql.cache-to=type=gha,scope=cached-stage,mode=max
  build_neurostore:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: store
    steps:
      -
        name: Checkout
        uses: actions/checkout@v3
        with:
          submodules: recursive
      -
        name: Configuration
        run: |
          cp .env.example .env
      -
        name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      -
        name: Login to GitHub Container Registry
        uses: docker/login-action@v2
        with:
          registry: ghcr.io
          username: ${{ github.repository_owner }}
          password: ${{ secrets.GITHUB_TOKEN }}
      -
        name: Build and push
        uses: docker/bake-action@master
        with:
          files: docker-compose.yml,docker-compose.dev.yml
          push: true
          load: false
          workdir: store
          set: |
            neurostore.tags=ghcr.io/${{ github.repository_owner }}/neurostore:${{ hashFiles('**/store/neurostore/**') }}
            neurostore.cache-from=type=registry,ref=ghcr.io/${{ github.repository_owner }}/neurostore:${{ hashFiles('**/store/neurostore/**') }}
            neurostore.cache-from=type=gha,scope=cached-stage
            neurostore.cache-to=type=gha,scope=cached-stage,mode=max
            nginx.tags=ghcr.io/${{ github.repository_owner }}/store_nginx:${{ hashFiles('**/store/nginx/**') }}
            nginx.cache-from=type=registry,ref=ghcr.io/${{ github.repository_owner }}/store_nginx:${{ hashFiles('**/store/nginx/**') }}
            nginx.cache-from=type=gha,scope=cached-stage
            nginx.cache-to=type=gha,scope=cached-stage,mode=max
            store_pgsql.tags=ghcr.io/${{ github.repository_owner }}/store_pgsql:${{ hashFiles('**/store/postgres/**') }}
            store_pgsql.cache-from=type=registry,ref=ghcr.io/${{ github.repository_owner }}/store_pgsql:${{ hashFiles('**/store/postgres/**') }}
            store_pgsql.cache-from=type=gha,scope=cached-stage
            store_pgsql.cache-to=type=gha,scope=cached-stage,mode=max
  neurostore_backend_tests:
    runs-on: ubuntu-latest
    needs: build_neurostore
    defaults:
      run:
        working-directory: store
    steps:
      -
        name: Checkout
        uses: actions/checkout@v3
        with:
          submodules: recursive
      -
        name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      -
        name: Configuration
        run: |
          cp .env.example .env
      -
        name: load images
        uses: docker/bake-action@master
        with:
          files: docker-compose.yml,docker-compose.dev.yml
          push: false
          load: true
          workdir: store
          set: |
            neurostore.cache-from=type=gha,scope=cached-stage
            nginx.cache-from=type=gha,scope=cached-stage
            store_pgsql.cache-from=type=gha,scope=cached-stage
      -
        name: spin up backend
        run: |
          docker network create nginx-proxy
          docker-compose pull
          docker-compose \
            -f docker-compose.yml \
            -f docker-compose.dev.yml \
            up -d --no-build
      -
        name: Create Test Database
        run: |
          until docker-compose exec -T \
            store_pgsql pg_isready -U postgres; do sleep 1; done
          docker-compose exec -T \
            store_pgsql \
            psql -U postgres -c "create database test_db"
      -
        name: Backend Tests
        env:
          AUTH0_CLIENT_ID: ${{ secrets.AUTH0_CLIENT_ID }}
          AUTH0_CLIENT_SECRET: ${{ secrets.AUTH0_CLIENT_SECRET }}
          AUTH0_BASE_URL: ${{ secrets.AUTH0_BASE_URL }}
          AUTH0_ACCESS_TOKEN_URL: ${{ secrets.AUTH0_ACCESS_TOKEN_URL }}
          AUTH0_AUTH_URL: ${{ secrets.AUTH0_AUTH_URL }}
        run: |
          docker-compose run \
            -e "APP_SETTINGS=neurostore.config.DockerTestConfig" \
            -e "AUTH0_CLIENT_ID=${AUTH0_CLIENT_ID}" \
            -e "AUTH0_CLIENT_SECRET=${AUTH0_CLIENT_SECRET}" \
            -e "AUTH0_BASE_URL=${AUTH0_BASE_URL}" \
            -e "AUTH0_ACCESS_TOKEN_URL=${AUTH0_ACCESS_TOKEN_URL}" \
            -e "AUTH0_AUTH_URL=${AUTH0_AUTH_URL}" \
            --rm -w /neurostore \
            neurostore \
            python -m pytest neurostore/tests
  neurosynth_compose_backend_tests:
    runs-on: ubuntu-latest
    needs: build_neurosynth_compose
    defaults:
      run:
        working-directory: compose
    steps:
      -
        name: Checkout
        uses: actions/checkout@v3
        with:
          submodules: recursive
      -
        name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      -
        name: Configuration
        run: |
          cp .env.example .env
      -
        name: load images
        uses: docker/bake-action@master
        with:
          files: docker-compose.yml,docker-compose.dev.yml
          push: false
          load: true
          workdir: compose
          set: |
            neurosynth.cache-from=type=gha,scope=cached-stage
            nginx.cache-from=type=gha,scope=cached-stage
            synth_pgsql.cache-from=type=gha,scope=cached-stage
      -
        name: Spin up backend
        run: |
          docker network create nginx-proxy
          docker-compose pull
          docker-compose \
            -f docker-compose.yml \
            -f docker-compose.dev.yml \
            up -d --no-build
      -
        name: Create Test Database
        run: |
          until docker-compose exec -T \
            synth_pgsql pg_isready -U postgres; do sleep 1; done
          docker-compose exec -T \
            synth_pgsql \
            psql -U postgres -c "create database test_db"
      -
        name: Backend Tests
        env:
          AUTH0_CLIENT_ID: ${{ secrets.AUTH0_CLIENT_ID }}
          AUTH0_CLIENT_SECRET: ${{ secrets.AUTH0_CLIENT_SECRET }}
          AUTH0_BASE_URL: ${{ secrets.AUTH0_BASE_URL }}
          AUTH0_ACCESS_TOKEN_URL: ${{ secrets.AUTH0_ACCESS_TOKEN_URL }}
          AUTH0_AUTH_URL: ${{ secrets.AUTH0_AUTH_URL }}
        run: |
          docker-compose run \
            -e "APP_SETTINGS=neurosynth_compose.config.DockerTestConfig" \
            -e "AUTH0_CLIENT_ID=${AUTH0_CLIENT_ID}" \
            -e "AUTH0_CLIENT_SECRET=${AUTH0_CLIENT_SECRET}" \
            -e "AUTH0_BASE_URL=${AUTH0_BASE_URL}" \
            -e "AUTH0_ACCESS_TOKEN_URL=${AUTH0_ACCESS_TOKEN_URL}" \
            -e "AUTH0_AUTH_URL=${AUTH0_AUTH_URL}" \
            --rm -w /neurosynth \
            neurosynth \
            python -m pytest neurosynth_compose/tests
      -
        name: Frontend Jest Unit Tests
        env:
          AUTH0_CLIENT_ID: ${{ secrets.AUTH0_CLIENT_ID }}
          AUTH0_CLIENT_SECRET: ${{ secrets.AUTH0_CLIENT_SECRET }}
          AUTH0_BASE_URL: ${{ secrets.AUTH0_BASE_URL }}
          AUTH0_ACCESS_TOKEN_URL: ${{ secrets.AUTH0_ACCESS_TOKEN_URL }}
          AUTH0_AUTH_URL: ${{ secrets.AUTH0_AUTH_URL }}
          REACT_APP_AUTH0_CLIENT_ID: ${{ secrets.REACT_APP_AUTH0_CLIENT_ID }}
          REACT_APP_AUTH0_DOMAIN: ${{ secrets.REACT_APP_AUTH0_DOMAIN }}
          REACT_APP_AUTH0_CLIENT_SECRET: ${{ secrets.REACT_APP_AUTH0_CLIENT_SECRET }}
        run: |
          cd neurosynth-frontend/ && \
          cp .env.example .env.dev && \
          docker-compose run \
            -e "APP_SETTINGS=neurosynth_compose.config.DockerTestConfig" \
            -e "AUTH0_CLIENT_ID=${AUTH0_CLIENT_ID}" \
            -e "AUTH0_CLIENT_SECRET=${AUTH0_CLIENT_SECRET}" \
            -e "AUTH0_BASE_URL=${AUTH0_BASE_URL}" \
            -e "AUTH0_ACCESS_TOKEN_URL=${AUTH0_ACCESS_TOKEN_URL}" \
            -e "AUTH0_AUTH_URL=${AUTH0_AUTH_URL}" \
            -e "REACT_APP_AUTH0_DOMAIN=${REACT_APP_AUTH0_DOMAIN}" \
            -e "REACT_APP_AUTH0_CLIENT_ID=${REACT_APP_AUTH0_CLIENT_ID}" \
            -e "REACT_APP_AUTH0_AUDIENCE=localhost" \
            -e "REACT_APP_AUTH0_CLIENT_SECRET=${REACT_APP_AUTH0_CLIENT_SECRET}" \
            -e "REACT_APP_ENV=DEV" \
            -e "REACT_APP_NEUROSTORE_API_DOMAIN=http://localhost/api" \
            -e "CI=true" \
            -e "REACT_APP_NEUROSYNTH_API_DOMAIN=http://localhost:81/api" \
            --rm -w /neurosynth/neurosynth-frontend \
            neurosynth \
            bash -c "cd /neurosynth/neurosynth-frontend && \
              npm install && npm run test"
      -
        name: Frontend Cypress E2E Tests
        uses: cypress-io/github-action@v4
        env:
          CYPRESS_auth0ClientId: ${{ secrets.REACT_APP_AUTH0_CLIENT_ID }}
          CYPRESS_auth0ClientSecret: ${{ secrets.REACT_APP_AUTH0_CLIENT_SECRET }}
          CYPRESS_auth0Domain: ${{ secrets.REACT_APP_AUTH0_DOMAIN }}
          CYPRESS_auth0Audience: localhost
          REACT_APP_AUTH0_AUDIENCE: localhost
          REACT_APP_AUTH0_CLIENT_ID: ${{ secrets.REACT_APP_AUTH0_CLIENT_ID }}
          REACT_APP_AUTH0_DOMAIN: ${{ secrets.REACT_APP_AUTH0_DOMAIN }}
          REACT_APP_AUTH0_CLIENT_SECRET: ${{ secrets.REACT_APP_AUTH0_CLIENT_SECRET }}
          REACT_APP_ENV: DEV
        with:
          build: npm run build:dev
          start: npm run start-ci:dev
          browser: chrome
          wait-on: http://localhost:3000
          working-directory: /home/runner/work/neurostore/neurostore/compose/neurosynth-frontend
  style_check:
    runs-on: ubuntu-latest
    steps:
      -
        name: Checkout
        uses: actions/checkout@v3
        with:
          submodules: recursive
      -
        name: run flake8
        run: |
          pip install flake8
          cd ./store
          flake8 ./neurostore
          cd ../compose
          flake8 ./neurosynth_compose
Pinging this issue in case anyone has time to provide guidance. Thanks!
@jdkent Looks like you're using the same cache scope, type=gha,scope=cached-stage, to write every image's cache to:
set: |
  neurostore.tags=ghcr.io/${{ github.repository_owner }}/neurostore:${{ hashFiles('**/store/neurostore/**') }}
  neurostore.cache-from=type=registry,ref=ghcr.io/${{ github.repository_owner }}/neurostore:${{ hashFiles('**/store/neurostore/**') }}
  neurostore.cache-from=type=gha,scope=cached-stage
  neurostore.cache-to=type=gha,scope=cached-stage,mode=max
  nginx.tags=ghcr.io/${{ github.repository_owner }}/store_nginx:${{ hashFiles('**/store/nginx/**') }}
  nginx.cache-from=type=registry,ref=ghcr.io/${{ github.repository_owner }}/store_nginx:${{ hashFiles('**/store/nginx/**') }}
  nginx.cache-from=type=gha,scope=cached-stage
  nginx.cache-to=type=gha,scope=cached-stage,mode=max
  store_pgsql.tags=ghcr.io/${{ github.repository_owner }}/store_pgsql:${{ hashFiles('**/store/postgres/**') }}
  store_pgsql.cache-from=type=registry,ref=ghcr.io/${{ github.repository_owner }}/store_pgsql:${{ hashFiles('**/store/postgres/**') }}
  store_pgsql.cache-from=type=gha,scope=cached-stage
  store_pgsql.cache-to=type=gha,scope=cached-stage,mode=max
By doing this, each build overwrites the cache written by the others, so you need to assign a distinct scope to each image you're building. For example:
set: |
  neurostore.tags=ghcr.io/${{ github.repository_owner }}/neurostore:${{ hashFiles('**/store/neurostore/**') }}
  neurostore.cache-from=type=registry,ref=ghcr.io/${{ github.repository_owner }}/neurostore:${{ hashFiles('**/store/neurostore/**') }}
  neurostore.cache-from=type=gha,scope=cached-stage-neurostore
  neurostore.cache-to=type=gha,scope=cached-stage-neurostore,mode=max
  nginx.tags=ghcr.io/${{ github.repository_owner }}/store_nginx:${{ hashFiles('**/store/nginx/**') }}
  nginx.cache-from=type=registry,ref=ghcr.io/${{ github.repository_owner }}/store_nginx:${{ hashFiles('**/store/nginx/**') }}
  nginx.cache-from=type=gha,scope=cached-stage-nginx
  nginx.cache-to=type=gha,scope=cached-stage-nginx,mode=max
  store_pgsql.tags=ghcr.io/${{ github.repository_owner }}/store_pgsql:${{ hashFiles('**/store/postgres/**') }}
  store_pgsql.cache-from=type=registry,ref=ghcr.io/${{ github.repository_owner }}/store_pgsql:${{ hashFiles('**/store/postgres/**') }}
  store_pgsql.cache-from=type=gha,scope=cached-stage-pgsql
  store_pgsql.cache-to=type=gha,scope=cached-stage-pgsql,mode=max
The build_neurosynth_compose and build_neurostore jobs look similar; only the workdir in the bake step seems to differ. I would suggest using a matrix as explained in https://github.com/docker/bake-action/issues/87#issuecomment-1184659151
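Adapted to this workflow, the matrix approach could look something like the sketch below. This is untested and illustrative only: the job name, the wildcard target, and the scope naming are assumptions, not the repository's actual configuration. Per-image tags and registry cache-from lines would still need to be spelled out per target (as above), since they differ per service; the `*.` wildcard here only illustrates injecting the matrix variable into the scope name so the two directories don't stomp on each other's cache.

```yaml
# Hypothetical sketch: one build job parameterized over the two project
# directories, instead of two near-identical jobs.
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        workdir: [compose, store]
    steps:
      -
        name: Checkout
        uses: actions/checkout@v3
        with:
          submodules: recursive
      -
        name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      -
        name: Login to GitHub Container Registry
        uses: docker/login-action@v2
        with:
          registry: ghcr.io
          username: ${{ github.repository_owner }}
          password: ${{ secrets.GITHUB_TOKEN }}
      -
        name: Build and push
        uses: docker/bake-action@master
        with:
          workdir: ${{ matrix.workdir }}
          files: docker-compose.yml,docker-compose.dev.yml
          push: true
          load: false
          set: |
            *.cache-from=type=gha,scope=cached-stage-${{ matrix.workdir }}
            *.cache-to=type=gha,scope=cached-stage-${{ matrix.workdir }},mode=max
```

For full isolation you would combine both fixes: suffix each scope with the workdir and the target name (e.g. cached-stage-compose-nginx), one cache-from/cache-to pair per service.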
For tag creation in the set input, it might be better to use our metadata-action.
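A minimal sketch of what that could look like (hypothetical and untested; the image name is one of this repository's, the rest is assumed): docker/metadata-action computes tags and labels from the Git context, and bake-action can consume the bake file it generates alongside the compose files. Note that wiring the generated docker-metadata-action target to a compose service typically requires a small docker-bake.hcl with an inherits entry, as described in the bake-action README.

```yaml
      # Hypothetical sketch: let metadata-action generate the tags instead of
      # hand-building them from hashFiles().
      -
        name: Docker meta
        id: meta
        uses: docker/metadata-action@v4
        with:
          images: ghcr.io/${{ github.repository_owner }}/neurosynth_compose
      -
        name: Build and push
        uses: docker/bake-action@master
        with:
          files: |
            docker-compose.yml
            docker-compose.dev.yml
            ${{ steps.meta.outputs.bake-file }}
          targets: neurosynth
          push: true
```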
That was the issue, thank you so much! And thanks for the pointers to metadata-action and matrix builds.