There are ways to check an image's compressed download size on Docker Hub before pulling, but how can I do the same for an insecure/private (domain) registry?
During a pull, I can see the size of each layer being downloaded, which I could sum up to find the total size:
7595c8c21622: Already exists
d13af8ca898f: Already exists
70799171ddba: Already exists
b6c12202c5ef: Already exists
ef50ae158fa8: Downloading [===============> ] 74.18MB/239.3MB
e5665f29b73a: Downloading [=> ] 95.26MB/2.741GB
780c121f1da3: Downloading [========> ] 75.27MB/462.5MB
5fc49bebd483: Waiting
592ff9f385a7: Waiting
80a3934684b4: Waiting
b523362dbfaa: Waiting
3bf96686ba27: Waiting
dd64f3b98c5f: Waiting
937238fb3569: Waiting
c39efc826c40: Waiting
But I would like to know the total download size beforehand.
For every image pushed to a registry, Docker stores a manifest file containing the checksum, size, and media type of each layer; download clients use it to pull the layers.
Given your registry and image name, you can inspect
the manifest file and sum the sizes of the layers to get the total download size.
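As a sketch of that summing step, assuming `jq` is available, the snippet below runs it against a minimal stand-in for a v2 manifest (real manifests also carry fields such as `mediaType` and `digest`, omitted here for brevity):

```shell
# Minimal stand-in for an image manifest (schema v2); the layer sizes
# here are made-up values for illustration.
cat <<'EOF' > /tmp/manifest.json
{
  "schemaVersion": 2,
  "config": { "size": 7023 },
  "layers": [
    { "size": 100000000 },
    { "size": 250000000 }
  ]
}
EOF

# Sum only the layer blobs (the config blob is not part of the layer
# download) and print a human-readable total.
jq '[.layers[].size] | add' /tmp/manifest.json | numfmt --to=iec
# → 334M
```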
First, enable Docker's experimental CLI features.
To enable them temporarily for the current shell:
export DOCKER_CLI_EXPERIMENTAL=enabled
To make it permanent, add "experimental": "enabled"
to ~/.docker/config.json.
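After that edit, ~/.docker/config.json would contain something like this (any existing keys, such as `auths`, stay as they were):

```json
{
  "experimental": "enabled"
}
```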
Check size:
docker manifest inspect --insecure -v <registry_or_domain>/<image_name> | grep size | awk -F ':' '{sum+=$NF} END {print sum}' | xargs printf "%f\n" | numfmt --to=iec
printf converts the sum, which awk may emit in scientific notation (like 3e+09), into a plain float that numfmt can parse.
Note that the grep matches every size field in the verbose output, including the small config and descriptor entries, so the total is a slight overestimate (typically by a few kilobytes).
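To see why the printf step matters, here is the tail of the pipeline run on a sum that awk has emitted in scientific notation (3.5e+08 is a made-up value):

```shell
# numfmt cannot parse scientific notation directly, so printf first
# expands 3.5e+08 into a plain decimal...
echo "3.5e+08" | xargs printf "%f\n"
# → 350000000.000000

# ...which numfmt then renders in human-readable IEC units.
echo "3.5e+08" | xargs printf "%f\n" | numfmt --to=iec
# → 334M
```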