How does this solution work with jobs that require huge docker images? #803
johanhelsing-attensi asked this question in Q&A (unanswered)
I have jobs that require Docker images of several GB to run. In my current setup, I use long-lived self-hosted runners, so each image is only downloaded once per runner and subsequent runs are pretty fast.
How would this work when runners are created and destroyed all the time?
I see there is a submodule that syncs a distribution image to an S3 bucket. Does this mean it's also possible to make specific Docker images part of that image so they don't have to be re-downloaded from the package registry every time?
Replies: 1 comment
-
You can build an AMI that pre-downloads the images. Check out https://github.com/actions/virtual-environments for how GitHub officially builds its runner images; there is a spot for pre-fetching specific images.
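Not from the thread itself, but as a rough illustration of the pre-fetch idea: a minimal sketch of a script you could run as a build step while baking the AMI (for example from a Packer shell provisioner), so the large images are already in the Docker cache when an ephemeral runner boots. The image names and script name are placeholders, not anything defined by this project.

```python
#!/usr/bin/env python3
"""Hypothetical AMI build step: pre-pull large Docker images.

Run during image baking so ephemeral runners start with a warm
Docker cache instead of downloading multi-GB images per job.
"""
import subprocess
import sys

# Placeholder list; replace with the multi-GB images your jobs actually need.
IMAGES = [
    "ghcr.io/example-org/build-env:latest",
    "postgres:14",
]


def prefetch(images):
    """Pull each image with the Docker CLI; return the ones that failed."""
    failures = []
    for image in images:
        print(f"Pre-pulling {image} ...", flush=True)
        result = subprocess.run(["docker", "pull", image])
        if result.returncode != 0:
            failures.append(image)
    return failures


if __name__ == "__main__":
    failed = prefetch(IMAGES)
    if failed:
        print("Failed to pull: " + ", ".join(failed), file=sys.stderr)
        sys.exit(1)
```

The trade-off with baking images into the AMI is freshness: anything pre-fetched this way only updates when the AMI itself is rebuilt, so runners still pull newer tags at job time unless the image is rebuilt regularly.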