gha-runner-scale-set - Getting ErrImagePull when using private container image for runners
Checks
- [X] I've already read https://docs.github.com/en/actions/hosting-your-own-runners/managing-self-hosted-runners-with-actions-runner-controller/troubleshooting-actions-runner-controller-errors and I'm sure my issue is not covered in the troubleshooting guide.
- [X] I am using charts that are officially provided
Controller Version
0.9.3
Deployment Method
Helm
Checks
- [X] This isn't a question or user support case (For Q&A and community support, go to Discussions).
- [X] I've read the Changelog before submitting this issue and I'm sure it's not due to any recently-introduced backward-incompatible changes
To Reproduce
When using a private image in the gha-runner-scale-set runner template, I get an ErrImagePull error. The PAT I am using has the correct permissions, but it does not appear to be used for the pull.
I have changed the values as follows:
template:
  spec:
    containers:
      - name: runner
        # image: ghcr.io/actions/actions-runner:latest
        image: ghcr.io/<my-org>/gha-runners:latest
        command: ["/home/runner/run.sh"]
Describe the bug
The runner pod fails with an image pull error. I have tested making the image public and it works, but it obviously needs to stay private. I have also tested pulling the same image in other deployments using the same secret and do not see the issue there.
Describe the expected behavior
The runner pod should pull the private image successfully. I believe the issue is down to how the secret for pulling packages is assigned to the runner pods.
Additional Context
I have also created a Kubernetes secret for the token and assigned it in the controller values as follows:
imagePullSecrets:
- name: github-pat
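For reference, a docker-registry secret like the one referenced above can be created with something along these lines (the username/PAT placeholders are mine, and the namespace is assumed from the logs below):

    kubectl create secret docker-registry github-pat \
      --namespace actions-runner-worker \
      --docker-server=ghcr.io \
      --docker-username=<github-username> \
      --docker-password=<PAT>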
Controller Logs
The controller logs show nothing relevant, as the runner worker pods never launch:
2024-07-11T09:06:58Z INFO listener-app.worker.kubernetesworker Compare {"original": "{\"metadata\":{\"creationTimestamp\":null},\"spec\":{\"replicas\":-1,\"patchID\":-1,\"ephemeralRunnerSpec\":{\"metadata\":{\"creationTimestamp\":null},\"spec\":{\"containers\":null}}},\"status\":{\"currentReplicas\":0,\"pendingEphemeralRunners\":0,\"runningEphemeralRunners\":0,\"failedEphemeralRunners\":0}}", "patch": "{\"metadata\":{\"creationTimestamp\":null},\"spec\":{\"replicas\":1,\"patchID\":0,\"ephemeralRunnerSpec\":{\"metadata\":{\"creationTimestamp\":null},\"spec\":{\"containers\":null}}},\"status\":{\"currentReplicas\":0,\"pendingEphemeralRunners\":0,\"runningEphemeralRunners\":0,\"failedEphemeralRunners\":0}}"}
2024-07-11T09:06:58Z INFO listener-app.worker.kubernetesworker Preparing EphemeralRunnerSet update {"json": "{\"spec\":{\"patchID\":0,\"replicas\":1}}"}
2024-07-11T09:06:58Z INFO listener-app.worker.kubernetesworker Ephemeral runner set scaled. {"namespace": "actions-runner-worker", "name": "k8s-runners-qkk6q", "replicas": 1}
2024-07-11T09:06:58Z INFO listener-app.listener Getting next message {"lastMessageID": 0}
2024-07-11T09:07:48Z INFO listener-app.worker.kubernetesworker Calculated target runner count {"assigned job": 0, "decision": 1, "min": 1, "max": 5, "currentRunnerCount": 1, "jobsCompleted": 0}
2024-07-11T09:07:48Z INFO listener-app.worker.kubernetesworker Compare {"original": "{\"metadata\":{\"creationTimestamp\":null},\"spec\":{\"replicas\":-1,\"patchID\":-1,\"ephemeralRunnerSpec\":{\"metadata\":{\"creationTimestamp\":null},\"spec\":{\"containers\":null}}},\"status\":{\"currentReplicas\":0,\"pendingEphemeralRunners\":0,\"runningEphemeralRunners\":0,\"failedEphemeralRunners\":0}}", "patch": "{\"metadata\":{\"creationTimestamp\":null},\"spec\":{\"replicas\":1,\"patchID\":0,\"ephemeralRunnerSpec\":{\"metadata\":{\"creationTimestamp\":null},\"spec\":{\"containers\":null}}},\"status\":{\"currentReplicas\":0,\"pendingEphemeralRunners\":0,\"runningEphemeralRunners\":0,\"failedEphemeralRunners\":0}}"}
2024-07-11T09:07:48Z INFO listener-app.worker.kubernetesworker Preparing EphemeralRunnerSet update {"json": "{\"spec\":{\"patchID\":0,\"replicas\":1}}"}
2024-07-11T09:07:48Z INFO listener-app.worker.kubernetesworker Ephemeral runner set scaled. {"namespace": "actions-runner-worker", "name": "k8s-runners-qkk6q", "replicas": 1}
2024-07-11T09:07:48Z INFO listener-app.listener Getting next message {"lastMessageID": 0}
Runner Pod Logs
Error from server (BadRequest): container "runner" in pod "k8s-runners-qkk6q-runner-9h7wh" is waiting to start: trying and failing to pull image
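The full pull-failure reason shows up in the pod events; something like the following can be used to inspect it (pod name taken from the error above):

    kubectl describe pod k8s-runners-qkk6q-runner-9h7wh -n actions-runner-worker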
Hello! Thank you for filing an issue.
The maintainers will triage your issue shortly.
In the meantime, please take a look at the troubleshooting guide for bug reports.
If this is a feature request, please review our contribution guidelines.
@thinkbiggerltd I think you need to add imagePullSecrets to your runner template spec:
template:
  spec:
    imagePullSecrets:
      - name: your-secret-name
    containers:
      - name: runner
        image: ghcr.io/<my-org>/gha-runners:latest
        command: ["/home/runner/run.sh"]
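The updated values can then be rolled out by upgrading the scale-set release, roughly like this (the release name and namespace below simply mirror the logs above; adjust to your setup):

    helm upgrade --install k8s-runners \
      --namespace actions-runner-worker \
      -f values.yaml \
      oci://ghcr.io/actions/actions-runner-controller-charts/gha-runner-scale-set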
Thanks, this worked for me 👍
An update: even though the runner pods themselves now come up with imagePullSecrets applied, when I set container.image in a workflow, the resulting workflow pod does not have imagePullSecrets set.
Hey @madAndroid,
Are you using container hooks? If so, did you provide credentials to pull the image? The hook should create a secret for you and apply the imagePullSecrets field to the workflow pod.
Container hooks do not inherit the runner pod spec.
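For example, credentials for a workflow job container can be supplied directly in the workflow file, and the hook will create the pull secret and attach it to the workflow pod (the image and secret names below are just placeholders; runs-on mirrors the scale-set name from your logs):

    jobs:
      build:
        runs-on: k8s-runners
        container:
          image: ghcr.io/<my-org>/my-private-image:latest
          credentials:
            username: ${{ github.actor }}
            password: ${{ secrets.GHCR_PAT }}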
I am closing this issue since it is not related to ARC, but I'm happy to answer any questions ☺