Excessive otel-related logging in Docker system logs
Description
I was running a test on my system (nothing large; I've done much bigger ones before) and noticed that all of my containers stopped working during the test. I couldn't get a response from the Docker CLI and had to restart Docker Desktop. The host system's CPU and memory usage looked fine, but docker stats returned nothing. After restarting, I checked the usual suspect, VM disk space, but that was fine too: about 50% free.
This was nowhere near the stress tests I've run before, so I went looking for what could be going on and noticed a lot of the following in the Docker system logs:
~/Library/Containers/com.docker.docker/Data/log/vm/dockerd.log
[...]
2736 [2024-04-05T08:52:51.495540551Z][dockerd][I] 2024/04/05 08:52:51 http: superfluous response.WriteHeader call from go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp.(*respWriterWrapper).WriteHeader (wrap.go:98)
2737 [2024-04-05T08:52:51.495618843Z][dockerd][I] 2024/04/05 08:52:51 http: superfluous response.WriteHeader call from go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp.(*respWriterWrapper).WriteHeader (wrap.go:98)
2738 [2024-04-05T08:53:21.555807051Z][dockerd][I] 2024/04/05 08:53:21 http: superfluous response.WriteHeader call from go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp.(*respWriterWrapper).WriteHeader (wrap.go:98)
2739 [2024-04-05T08:53:21.555844135Z][dockerd][I] 2024/04/05 08:53:21 http: superfluous response.WriteHeader call from go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp.(*respWriterWrapper).WriteHeader (wrap.go:98)
2740 [2024-04-05T08:53:51.613091510Z][dockerd][I] 2024/04/05 08:53:51 http: superfluous response.WriteHeader call from go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp.(*respWriterWrapper).WriteHeader (wrap.go:98)
2741 [2024-04-05T08:53:51.613139093Z][dockerd][I] 2024/04/05 08:53:51 http: superfluous response.WriteHeader call from go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp.(*respWriterWrapper).WriteHeader (wrap.go:98)
2742 [2024-04-05T08:54:21.658324510Z][dockerd][I] 2024/04/05 08:54:21 http: superfluous response.WriteHeader call from go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp.(*respWriterWrapper).WriteHeader (wrap.go:98)
2743 [2024-04-05T08:54:21.658557843Z][dockerd][I] 2024/04/05 08:54:21 http: superfluous response.WriteHeader call from go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp.(*respWriterWrapper).WriteHeader (wrap.go:98)
2744 [2024-04-05T08:54:51.712903593Z][dockerd][I] 2024/04/05 08:54:51 http: superfluous response.WriteHeader call from go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp.(*respWriterWrapper).WriteHeader (wrap.go:98)
2745 [2024-04-05T08:54:51.712927218Z][dockerd][I] 2024/04/05 08:54:51 http: superfluous response.WriteHeader call from go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp.(*respWriterWrapper).WriteHeader (wrap.go:98)
2746 [2024-04-05T08:55:21.762819052Z][dockerd][I] 2024/04/05 08:55:21 http: superfluous response.WriteHeader call from go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp.(*respWriterWrapper).WriteHeader (wrap.go:98)
2747 [2024-04-05T08:55:21.762839343Z][dockerd][I] 2024/04/05 08:55:21 http: superfluous response.WriteHeader call from go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp.(*respWriterWrapper).WriteHeader (wrap.go:98)
2748 [2024-04-05T08:55:51.810495260Z][dockerd][I] 2024/04/05 08:55:51 http: superfluous response.WriteHeader call from go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp.(*respWriterWrapper).WriteHeader (wrap.go:98)
[...]
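To get a rough sense of the volume, a quick count of the offending lines (using the same log path as above) can help:

# Count occurrences of the superfluous WriteHeader message in the dockerd log
grep -c 'superfluous response.WriteHeader' \
  ~/Library/Containers/com.docker.docker/Data/log/vm/dockerd.log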
I know this isn't strictly a Docker Desktop problem, but it's something that should be resolved (hopefully it's already fixed upstream and only needs a version bump). If I find out what actually went wrong and led me to discover this, I'll file a separate report.
Reproduce
You can reproduce this by running docker stats, which triggers a line to be written to this file; a sketch of the steps is below. I don't know what caused such a high volume during my testing, but I assume it happened while the system was under load.
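A minimal way to watch it happen, assuming the default Docker Desktop for Mac log location shown above:

# Terminal 1: follow the dockerd log inside the Docker Desktop VM data directory
tail -f ~/Library/Containers/com.docker.docker/Data/log/vm/dockerd.log

# Terminal 2: each invocation writes one of the superfluous WriteHeader lines
docker stats --no-stream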
Expected behavior
These log lines should not be here.
docker version
Client:
Cloud integration: v1.0.35+desktop.11
Version: 25.0.3
API version: 1.44
Go version: go1.21.6
Git commit: 4debf41
Built: Tue Feb 6 21:13:26 2024
OS/Arch: darwin/arm64
Context: desktop-linux
Server: Docker Desktop 4.28.0 (139021)
Engine:
Version: 25.0.3
API version: 1.44 (minimum version 1.24)
Go version: go1.21.6
Git commit: f417435
Built: Tue Feb 6 21:14:22 2024
OS/Arch: linux/arm64
Experimental: false
containerd:
Version: 1.6.28
GitCommit: ae07eda36dd25f8a1b98dfbf587313b99c0190bb
runc:
Version: 1.1.12
GitCommit: v1.1.12-0-g51d5e94
docker-init:
Version: 0.19.0
GitCommit: de40ad0
docker info
Client:
Version: 25.0.3
Context: desktop-linux
Debug Mode: false
Plugins:
buildx: Docker Buildx (Docker Inc.)
Version: v0.12.1-desktop.4
Path: /Users/davidb/.docker/cli-plugins/docker-buildx
compose: Docker Compose (Docker Inc.)
Version: v2.24.6-desktop.1
Path: /Users/davidb/.docker/cli-plugins/docker-compose
debug: Get a shell into any image or container. (Docker Inc.)
Version: 0.0.24
Path: /Users/davidb/.docker/cli-plugins/docker-debug
dev: Docker Dev Environments (Docker Inc.)
Version: v0.1.0
Path: /Users/davidb/.docker/cli-plugins/docker-dev
extension: Manages Docker extensions (Docker Inc.)
Version: v0.2.22
Path: /Users/davidb/.docker/cli-plugins/docker-extension
feedback: Provide feedback, right in your terminal! (Docker Inc.)
Version: v1.0.4
Path: /Users/davidb/.docker/cli-plugins/docker-feedback
init: Creates Docker-related starter files for your project (Docker Inc.)
Version: v1.0.1
Path: /Users/davidb/.docker/cli-plugins/docker-init
sbom: View the packaged-based Software Bill Of Materials (SBOM) for an image (Anchore Inc.)
Version: 0.6.0
Path: /Users/davidb/.docker/cli-plugins/docker-sbom
scout: Docker Scout (Docker Inc.)
Version: v1.5.0
Path: /Users/davidb/.docker/cli-plugins/docker-scout
WARNING: Plugin "/Users/davidb/.docker/cli-plugins/docker-scan" is not valid: failed to fetch metadata: fork/exec /Users/davidb/.docker/cli-plugins/docker-scan: no such file or directory
Server:
Containers: 33
Running: 1
Paused: 0
Stopped: 32
Images: 117
Server Version: 25.0.3
Storage Driver: overlay2
Backing Filesystem: extfs
Supports d_type: true
Using metacopy: false
Native Overlay Diff: true
userxattr: false
Logging Driver: json-file
Cgroup Driver: cgroupfs
Cgroup Version: 2
Plugins:
Volume: local
Network: bridge host ipvlan macvlan null overlay
Log: awslogs fluentd gcplogs gelf journald json-file local splunk syslog
Swarm: inactive
Runtimes: io.containerd.runc.v2 runc
Default Runtime: runc
Init Binary: docker-init
containerd version: ae07eda36dd25f8a1b98dfbf587313b99c0190bb
runc version: v1.1.12-0-g51d5e94
init version: de40ad0
Security Options:
seccomp
Profile: unconfined
cgroupns
Kernel Version: 6.6.16-linuxkit
Operating System: Docker Desktop
OSType: linux
Architecture: aarch64
CPUs: 10
Total Memory: 13.64GiB
Name: docker-desktop
ID: 37e4883a-e101-475c-b205-20214cbefa89
Docker Root Dir: /var/lib/docker
Debug Mode: false
HTTP Proxy: http.docker.internal:3128
HTTPS Proxy: http.docker.internal:3128
No Proxy: hubproxy.docker.internal
Experimental: false
Insecure Registries:
hubproxy.docker.internal:5555
tower:5000
127.0.0.0/8
Live Restore Enabled: false
WARNING: daemon is not using the default seccomp profile
Diagnostics ID
BCB89CB8-F1A7-4BBF-8B17-F3DC1B805EA3/20240405113718
Additional Info
No response
@milas any idea?
This is a known issue with the Engine:
- moby/moby#47448
I'll chat with the runtime team and see what the latest is there; we were talking about it recently, so it's on everyone's radar.
A fix for this is included in Docker Engine 26.1.3 and up:
- https://github.com/moby/moby/pull/47843
That version will be included in the upcoming Docker Desktop 4.31 release.
edit: 26.1.3, not 24.1.3
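Once you've updated, the standard docker CLI can confirm which Engine version you're running:

# Should print 26.1.3 or later once the fixed Engine is in place
docker version --format '{{.Server.Version}}'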
Running 26.1.4 now; tailing the file while running docker stats no longer logs anything. LGTM!
Good to hear! Thanks for confirming it's fixed.