Failure sending layers to docker daemon #2829
Comments
Hi @porqueoutai, thanks for the report. To get some more data, which one is the case? …

Understanding the setup will help us narrow down where the contention point is and which components are involved.

And although I'm skeptical, is there anything special about the …?
Hey @chanseokoh, thank you for the quick response, and sorry for the unclear description. Of the 3 choices you presented, it is number two. Some more context regarding …
Are you sharing the same base image in a Docker daemon (…)?

I'd like to know if the path …

And can you also run with …?
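One quick way to answer the "shared base image" question is to compare the layer digest lists of the two images. The sketch below is not Jib code; the digest values are placeholders, to be replaced with the real output of `docker inspect --format '{{json .RootFS.Layers}}' <image>` for each image:

```python
# Hypothetical sketch: find layers shared between two images by intersecting
# their RootFS.Layers diff-ID lists (placeholder digests, not real ones).
image_a_layers = [
    "sha256:aaa...",  # placeholder diff IDs for image A
    "sha256:bbb...",
]
image_b_layers = [
    "sha256:ccc...",  # placeholder diff IDs for image B
    "sha256:bbb...",
]

# Any digest appearing in both lists is a layer the images share.
shared = set(image_a_layers) & set(image_b_layers)
print("shared layers:", sorted(shared))
```

An empty intersection would confirm the reporter's later claim that there are no overlapping layers between the images.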
This sub-module uses a different base image than all other sub-modules. There are no overlapping layers between the images.

"sha256:ebb9ae013834b54e76c8d7dfde0ca9018f6bb3495740356a8f1dc655a8552130",
"sha256:0ca7f54856c0baa7f6beecab94a76531965d5d9e079f2fe1761c5173f2f0d9f6",
"sha256:1f59a4b2e20603f508265d81a77daeafcb7686ed15a1bc07ba5af4d0caeb7993",
"sha256:4955570ef83077f6ce792248d9dfdf3120861cf196e39c38a37c0e9e6c6b777b",
"sha256:e66de62ea7d9e1bc8d44ea0a31710a337571101cd88245c53ec568f55c1701c1",
"sha256:c75c47122f0fb3ae9b1bef195d724c0e2a7ab672e5b0d74eda872fe5bc084ccc",
"sha256:ecf65f7446b63c22281a4cffbaa989e3e29a3e796813635571edd6ddf508711f",
"sha256:1fc09f17168ed525b4989ce4aaa7864b64bd86515fc1fa32d818a22de20ea347",
"sha256:01bd257a2e8f4b64f7cc3cf055f42751aed68aad1c3ec0d60455967f50ed919e",
"sha256:b29c404b85bc3b097a2a673f49ca1e061a18141f657d2eee98cb8318add30fa8",
"sha256:8ec1e6e385638003696fd867d91b22c5a03265bafae6c66c37e2ff7bebd4b0f9",
"sha256:8ec1e6e385638003696fd867d91b22c5a03265bafae6c66c37e2ff7bebd4b0f9",
"sha256:86fddc0e22613929d0ae58798b370df44e06908952174391814e18ffa2b05943",
"sha256:8374f443324cde275735b7747f3f40d8e0fc79507e1b2b247073b734f868313b",
"sha256:9a72de032d88f2be9c5fbd8299cc6668277eccafebb6d2cebc74c8b1f8840ada"

Filesystem where Jib is running:

[root@runner-4pp6rupi-project-208-concurrent-0 ~]# df -h
Filesystem Size Used Avail Use% Mounted on
overlay 99G 20G 74G 21% /
tmpfs 64M 0 64M 0% /dev
tmpfs 12G 0 12G 0% /sys/fs/cgroup
/dev/sdb1 99G 20G 74G 21% /cache
shm 64M 0 64M 0% /dev/shm
tmpfs 12G 1.1G 11G 9% /run/docker.sock
tmpfs 12G 0 12G 0% /proc/acpi
tmpfs 12G 0 12G 0% /proc/scsi
tmpfs 12G 0 12G 0% /sys/firmware

Added debug log:
Somewhat further in the log (debug log 2):

2020-10-16T11:46:23.880+0200 [LIFECYCLE] [org.gradle.api.Task] Building dependencies layer...
2020-10-16T11:46:23.880+0200 [DEBUG] [org.gradle.api.Task] TIMING Building dependencies layer
2020-10-16T11:46:27.169+0200 [DEBUG] [org.gradle.api.Task] Building dependencies layer built sha256:8d7ffb89d993be6e77c20c9c2b0587c629053798f147290061bf2120a94f4a9a
2020-10-16T11:46:27.170+0200 [DEBUG] [org.gradle.api.Task] TIMED Building dependencies layer : 3288.664 ms
2020-10-16T11:46:27.170+0200 [LIFECYCLE] [org.gradle.api.Task] Building project dependencies layer...
2020-10-16T11:46:27.170+0200 [DEBUG] [org.gradle.api.Task] TIMING Building project dependencies layer
2020-10-16T11:46:27.260+0200 [DEBUG] [org.gradle.api.Task] Building project dependencies layer built sha256:24bd28947421aa039cf66b3fe52acb855a526a140dafded53c25cb33fb366726
2020-10-16T11:46:27.260+0200 [DEBUG] [org.gradle.api.Task] TIMED Building project dependencies layer : 89.744 ms
2020-10-16T11:46:27.260+0200 [LIFECYCLE] [org.gradle.api.Task] Building resources layer...
2020-10-16T11:46:27.260+0200 [DEBUG] [org.gradle.api.Task] TIMING Building resources layer
2020-10-16T11:46:27.549+0200 [DEBUG] [org.gradle.api.Task] Building resources layer built sha256:8ad0238c274b64e5b5da5972853cb332f14e67db4f0b785ec6d9999d65ef0117
2020-10-16T11:46:27.549+0200 [DEBUG] [org.gradle.api.Task] TIMED Building resources layer : 288.862 ms
2020-10-16T11:46:27.550+0200 [LIFECYCLE] [org.gradle.api.Task] Building classes layer...
2020-10-16T11:46:27.550+0200 [DEBUG] [org.gradle.api.Task] TIMING Building classes layer
2020-10-16T11:46:29.267+0200 [DEBUG] [org.gradle.api.Task] Building classes layer built sha256:08ddc769522ac797d410160d775c51223f34faad9a5b6f1de98041753a900ec7
2020-10-16T11:46:29.267+0200 [DEBUG] [org.gradle.api.Task] TIMED Building classes layer : 1715.805 ms
2020-10-16T11:46:29.267+0200 [LIFECYCLE] [org.gradle.api.Task] Building extra files layer...
2020-10-16T11:46:29.267+0200 [DEBUG] [org.gradle.api.Task] TIMING Building extra files layer
2020-10-16T11:46:29.277+0200 [DEBUG] [org.gradle.api.Task] Building extra files layer built sha256:82006fc95e0453425cadc5157f08759749ff2404cf41d9e723fc93b5b0e9f4e1
2020-10-16T11:46:29.277+0200 [DEBUG] [org.gradle.api.Task] TIMED Building extra files layer : 11.098 ms
2020-10-16T11:46:29.281+0200 [DEBUG] [org.gradle.api.Task] TIMING Building container configuration
2020-10-16T11:46:29.282+0200 [LIFECYCLE] [org.gradle.api.Task] Container program arguments set to [/bin/sh, -c, ./start.sh] (inherited from base image)
2020-10-16T11:46:29.282+0200 [DEBUG] [org.gradle.api.Task] TIMED Building container configuration : 1.26 ms
2020-10-16T11:46:29.283+0200 [DEBUG] [org.gradle.api.Task] TIMING Loading to Docker daemon
2020-10-16T11:46:29.284+0200 [LIFECYCLE] [org.gradle.api.Task] Loading to Docker daemon...
2020-10-16T11:46:31.058+0200 [DEBUG] [org.gradle.api.Task] TIMED Loading to Docker daemon : 1774.868 ms
2020-10-16T11:46:31.232+0200 [DEBUG] [org.gradle.api.Task] TIMED Building image to Docker daemon : 61123.682 ms
So I changed … and executed the job five times; all were successful. Now the sub-module is using the previous minor version as its base image, and the job succeeded three consecutive times. Neither of these images contains any layer twice. I guess this must be the root cause of the problem.
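The duplicated digest is visible in the layer list quoted earlier in this thread (the same `sha256:8ec1e6e3…` appears twice). A check like the following sketch, with placeholder digests standing in for a real `RootFS.Layers` array from `docker inspect`, flags any digest listed more than once in a single image:

```python
from collections import Counter

# Hypothetical sketch: flag layer digests that appear more than once in one
# image's layer list (placeholder digests, not the real ones from the report).
layers = [
    "sha256:1111...",
    "sha256:2222...",
    "sha256:2222...",  # duplicate entry, the suspected trigger
    "sha256:3333...",
]

# Counter tallies each digest; anything with count > 1 is a duplicate.
duplicates = [digest for digest, n in Counter(layers).items() if n > 1]
print("duplicated layers:", duplicates)
```

Running this against the failing base image's layer list would surface the duplicate, while the working images would produce an empty list.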
Thanks for digging into it! This looks like the root cause. A layer is actually a tarball, so it means the image unnecessarily …
I can reproduce this with a Docker daemon base image that has duplicate layer entries. When I untar the image tarball that I got from …
However, the temporary directory where Jib unpacked the output of …
It's not a symlink, and the size is 0. Therefore, if Jib caches the 0-byte …
If Jib picked up the non-zero …
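The situation described above, a saved image tarball containing the same layer entry twice, one with real content and one with size 0, can be reproduced in miniature with Python's `tarfile` module. This is a standalone sketch, not Jib's actual unpacking code:

```python
import io
import tarfile

# Sketch of the diagnosis above: build a tiny tar in memory that contains the
# same member name twice (one real entry, one 0-byte entry), then list every
# entry with its size. A reader that keeps only one entry per name may end up
# with the empty copy, depending on which it picks.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tar:
    data = b"layer contents"
    info = tarfile.TarInfo(name="layer.tar")
    info.size = len(data)
    tar.addfile(info, io.BytesIO(data))        # real layer entry
    empty = tarfile.TarInfo(name="layer.tar")  # duplicate name, size 0
    tar.addfile(empty)

buf.seek(0)
with tarfile.open(fileobj=buf, mode="r") as tar:
    sizes = [(m.name, m.size) for m in tar.getmembers()]
print(sizes)  # both entries are listed; only the first has real content
```

Listing the members of the real `docker save` output the same way would show which copy of the duplicated layer is empty.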
@chanseokoh I'm totally with you.
@porqueoutai Jib 2.7.0 has been released with this fix!
Environment:
Description of the issue:
We are building multiple images inside our pipeline job (image: CentOS 8). For that we are using the following command:
src/java/gradlew --gradle-user-home .gradle --info --stacktrace --build-cache --no-daemon --console=plain --parallel -p src/java jibDockerBuild -x test -Djib.console=plain
Today we transferred our last service to Jib to build its image.
The successfully built layers cannot be uploaded to the Docker daemon. For all other services inside the job it works.
Out of 10 retries, one succeeded in uploading to the Docker daemon.
Expected behavior:
Layers can be uploaded to the docker daemon
jib-gradle-plugin
Configuration:
Log output:
Additional Information:
Building the image locally without --parallel works fine. Layers can be uploaded to the Docker daemon.
The size of the image is 630 MB
Enough disk space inside the pipeline job was available