SocketError: other side closed #583
Bug Description
Reproducible By
Expected Behavior
Logs & Screenshots
Environment
Additional context
I can still reproduce this error, please don't close it.
SocketError: other side closed
    at Socket.onSocketEnd (/home/../node_modules/undici/lib/client.js:1116:22)
    at Socket.emit (node:events:525:35)
    at Socket.emit (node:domain:489:12)
    at endReadableNT (node:internal/streams/readable:1359:12)
    at processTicksAndRejections (node:internal/process/task_queues:82:21) {
  code: 'UND_ERR_SOCKET',
  socket: {
    localAddress: '127.0.0.1',
    localPort: 45318,
    remoteAddress: undefined,
    remotePort: undefined,
    remoteFamily: undefined,
    timeout: undefined,
    bytesWritten: 53114,
    bytesRead: 12742
  }
}
Can you provide a Minimum Reproducible Example so we can reproduce it?
@metcoder95 hi, what I've observed so far is:
If I spawn a long-running process and wait for it to finish, the next HTTP call may throw this error. When I say long-running, it's actually < 1.5 minutes in my case.
I'd like to provide a minimum reproducible example, but it's very hard for now.
I'm using [email protected]; I just tried v5.27.0 and I still have the same issue. FYI, this happened while using the hardhat framework.
https://github.com/NomicFoundation/hardhat/issues/3136
If I execute cargo build while holding an HTTP connection, it fails with this error. But if I only execute cargo check, which takes less time to finish, it doesn't seem to have any influence and I won't get the same error.
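To make the pattern above concrete, here is a rough, untested sketch. The RPC URL, the JSON-RPC method, and the 90-second sleep are assumptions for illustration (standing in for a local hardhat node and a long build step), not details from the original report:

// Rough sketch of the failure pattern described above; not a confirmed repro.
import { execSync } from "node:child_process";
import { request } from "undici";

const url = "http://127.0.0.1:8545"; // hypothetical local hardhat node

async function rpcCall(id) {
  // One JSON-RPC call over undici's default keep-alive connection pool.
  const { body } = await request(url, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ jsonrpc: "2.0", id, method: "eth_blockNumber", params: [] }),
  });
  return body.json();
}

await rpcCall(1); // opens a keep-alive connection to the node

// Block for ~1.5 minutes, standing in for a long-running child process.
execSync("sleep 90");

// If the server closed the idle connection in the meantime, this call can
// fail with SocketError: other side closed (code UND_ERR_SOCKET).
await rpcCall(2);

The sleep just stands in for whatever long-running child process is being awaited; the point is that the keep-alive connection sits idle while the event loop is blocked.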
The error you are referring to is for the node process taking longer to close due to network constraints. Here it seems that the remote server closed the connection. Where are you trying to connect to, and how?
I'm trying to connect to a local hardhat node using the ethers library.
Can you provide a Minimum Reproducible Example?
This error is still around in Node 20.13.1. From what I've observed, it's hard to reproduce and happens when the system is under heavy load.
It doesn't, however, occur at all using the Bun runtime.
Without a repro there is nothing we can do here.
@ronag here is the reproducible example: https://github.com/nordluf/fetch-socket-closed-example
I've been able to eliminate this issue by limiting the number of connections the Node process can make to the server. By default, Node.js opens too many connections to the server (using too many ports on the host), and at some point it either cannot handle them or the OS interferes and the connections break. In my case, it used over 16,000 connections.
The number of connections can be limited using HTTP agents. The following snippet from my code fixed this issue.
import { setGlobalDispatcher, Agent } from "undici";

setGlobalDispatcher(
  new Agent({
    connections: 50,
  })
);
You can set the number of connections based on your needs.
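If you prefer not to change the global dispatcher, undici also accepts a dispatcher per request. A minimal sketch, assuming a placeholder URL:

import { Agent, request } from "undici";

// Reuse one capped Agent instead of replacing the global dispatcher.
const agent = new Agent({ connections: 50 });

const { statusCode, body } = await request("http://127.0.0.1:8545", {
  dispatcher: agent, // route just this request through the capped pool
});
console.log(statusCode, await body.text());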
You can monitor the number of ports used by the Node process via the following command on UNIX-based systems:
lsof -i -n -P | grep node | awk '{print $9}' | awk -F '->' '{print $1}' | awk -F ':' '{print $2}' | sort -u | wc -l
I didn't face this issue at all when using the Bun runtime for JS.
I identified this by analyzing the packets between the client and server via Wireshark. There's also Charles Proxy, which can help you analyze the traffic flow.
@WilfredAlmeida in the example I provided above, it doesn't help.