
"This is caused by either a bug in Node.js or incorrect usage of Node.js internals. Please open an issue with this stack trace..."

Open Navigatron opened this issue 5 years ago • 13 comments

  • Version: v14.12.0
  • Platform: Linux jillian 5.8.10-arch1-1 #1 SMP PREEMPT Thu, 17 Sep 2020 18:01:06 +0000 x86_64 GNU/Linux

What steps will reproduce the bug?

This occurred while sitting on 2000 concurrent https requests (using a custom agent) across 4 worker threads. ~100 of the requests hit a 2-minute timeout shortly before this happened. In short, this may be difficult to reproduce.
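
For reference, a rough sketch of the shape of that load (the target URL, agent options, and request helper are simplified placeholders, not the actual code):

// Sketch only: ~2000 concurrent HTTPS requests through a shared custom agent.
// In the real setup this runs across 4 worker threads.
const https = require('https');

const agent = new https.Agent({ keepAlive: true, maxSockets: Infinity });
const TARGET_URL = 'https://example.com/'; // placeholder

function request(url) {
  return new Promise((resolve, reject) => {
    const req = https.get(url, { agent, timeout: 120000 }, (res) => {
      res.resume(); // drain the body
      res.on('end', resolve);
    });
    req.on('timeout', () => req.destroy(new Error('request timed out')));
    req.on('error', reject);
  });
}

Promise.allSettled(Array.from({ length: 2000 }, () => request(TARGET_URL)))
  .then((results) => console.log('settled', results.length));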

How often does it reproduce? Is there a required condition?

So far, it's only happened once.

Edit: I can reproduce this. 2000 concurrent requests does it, 400 does not.

What is the expected behavior?

No error messages.

What do you see instead?

In text form:

events.js:291
      throw er; // Unhandled 'error' event
      ^
Error [ERR_INTERNAL_ASSERTION]: This is caused by either a bug in Node.js or incorrect usage of Node.js internals.
Please open an issue with this stack trace at https://github.com/nodejs/node/issues

    at assert (internal/assert.js:14:11)
    at JSStreamSocket.doWrite (internal/js_stream_socket.js:165:5)
    at JSStream.onwrite (internal/js_stream_socket.js:33:57)
    at Socket.ondata (internal/js_stream_socket.js:77:22)
    at Socket.emit (events.js:314:20)
    at addChunk (_stream_readable.js:307:12)
    at readableAddChunk (_stream_readable.js:282:9)
    at Socket.Readable.push (_stream_readable.js:221:10)
    at TCP.onStreamRead (internal/stream_base_commons.js:188:23)
Emitted 'error' event on process instance at:
    at emitUnhandledRejectionOrErr (internal/event_target.js:542:11)
    at MessagePort.[nodejs.internal.kHybridDispatch] (internal/event_target.js:357:9)
    at MessagePort.exports.emitMessage (internal/per_context/messageport.js:18:26) {
  code: 'ERR_INTERNAL_ASSERTION'
}


Navigatron avatar Sep 28 '20 02:09 Navigatron

@nodejs/streams @joyeecheung

Trott avatar Sep 28 '20 04:09 Trott

Unfortunately, it's close to impossible to help you with this without a script that reproduces it reliably.

mcollina avatar Sep 28 '20 06:09 mcollina

I am also facing this issue in version 12.x (we are using AWS Lambda). We are calling multiple AJAX requests in a loop and sometimes this error is thrown. I have also added a comment on the similar issue below, which is for 15.x:

https://github.com/nodejs/help/issues/2708

hnviradiya avatar Aug 30 '21 12:08 hnviradiya

I am also facing this issue in version 12.x (we are using AWS Lambda). We are calling multiple AJAX requests in a loop and sometimes this error is thrown.

@hnviradiya Are you able to share your Lambda code?

Trott avatar Aug 30 '21 16:08 Trott

I'm also experiencing this error while downloading large files using https. I tried using node-fetch instead but I'm seeing the same error. It happens randomly, but I've only seen it happen when downloading files from a slow and somewhat unstable server connection. (Node v14.16.0)

terreng avatar Sep 26 '21 20:09 terreng

@terreng can you share your script?

mcollina avatar Sep 26 '21 20:09 mcollina

@mcollina

const https = require('https');
const fs = require('fs');
const path = require('path');

// URL below is the address of the remote file being downloaded (omitted here)
const file = fs.createWriteStream(path.resolve(__dirname, "test"));
https.get(URL, function(response) {
    response.pipe(file);
    file.on("finish", function() {
        file.close();
        console.log("Download complete");
    });
}).on("error", function(err) {
    console.error(err);
});
Error: Parse Error: Expected HTTP/
    at HTTPParser.execute (<anonymous>)
    at TLSSocket.socketOnData (_http_client.js:509:22)
    at TLSSocket.emit (events.js:315:20)
    at TLSSocket.Readable.read (internal/streams/readable.js:519:10)
    at TLSSocket.Socket.read (net.js:625:39)
    at flow (internal/streams/readable.js:992:34)
    at resume_ (internal/streams/readable.js:973:3)
    at processTicksAndRejections (internal/process/task_queues.js:80:21) {
  bytesParsed: 5381,
  code: 'HPE_INVALID_CONSTANT',
  reason: 'Expected HTTP/',
  rawPacket: <Buffer 76 5e f0 68 d0 0c 11 d7 50 8f 67 2e 18 c8 b3 a6 ac c8 75 70 2b e3 65 26 5d e9 b5 1f 6e 62 2b 88 87 15 0e 2f 14 4b 7d 90 33 6a 5a 18 06 bf 74 86 cf 6c ... 16334 more bytes>
}
internal/assert.js:14
    throw new ERR_INTERNAL_ASSERTION(message);
    ^

Error [ERR_INTERNAL_ASSERTION]: This is caused by either a bug in Node.js or incorrect usage of Node.js internals.
Please open an issue with this stack trace at https://github.com/nodejs/node/issues

    at assert (internal/assert.js:14:11)
    at TLSSocket.socketOnData (_http_client.js:507:3)
    at TLSSocket.emit (events.js:315:20)
    at TLSSocket.Readable.read (internal/streams/readable.js:519:10)
    at TLSSocket.Socket.read (net.js:625:39)
    at flow (internal/streams/readable.js:992:34)
    at resume_ (internal/streams/readable.js:973:3)
    at processTicksAndRejections (internal/process/task_queues.js:80:21) {
  code: 'ERR_INTERNAL_ASSERTION'
}

v14.16.0 linux arm

terreng avatar Sep 27 '21 21:09 terreng

Which URL are you downloading from?

mcollina avatar Sep 27 '21 21:09 mcollina

@mcollina I've sent you an email with an example URL.

terreng avatar Sep 27 '21 22:09 terreng

@terreng that's a different issue from the one at the top. In your case the HTTP server is responding with a response that does not conform to the HTTP spec. I still think there is a bug there somewhere, because the second error should not be there. I've opened a new issue: https://github.com/nodejs/node/issues/40242.

mcollina avatar Sep 28 '21 14:09 mcollina

@mcollina Got it. I definitely got just an ERR_INTERNAL_ASSERTION with a different stack trace at one point, but I haven't been able to recreate it so far.

terreng avatar Sep 28 '21 15:09 terreng

I ran into an assertion with the exact same stack trace the other day:

node:internal/assert:14
    throw new ERR_INTERNAL_ASSERTION(message);
    ^

Error [ERR_INTERNAL_ASSERTION]: This is caused by either a bug in Node.js or incorrect usage of Node.js internals.
Please open an issue with this stack trace at https://github.com/nodejs/node/issues

    at new NodeError (node:internal/errors:371:5)
    at assert (node:internal/assert:14:11)
    at JSStreamSocket.doWrite (node:internal/js_stream_socket:164:5)
    at JSStream.onwrite (node:internal/js_stream_socket:33:57)
    at MqttStream.ondata (node:internal/js_stream_socket:77:22)
    at MqttStream.emit (node:events:526:28)
    at MqttStream.emit (node:domain:475:12)
    at addChunk (node:internal/streams/readable:315:12)
    at readableAddChunk (node:internal/streams/readable:289:9)
    at MqttStream.Readable.push (node:internal/streams/readable:228:10) {
  code: 'ERR_INTERNAL_ASSERTION'
}

In my case I have a custom Duplex stream wrapped with a TLSSocket, but the symptoms are the same: under moderate to high load, every once in a while this assertion is hit and my app crashes.
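
For context, the setup looks roughly like this (a simplified stand-in for the real MQTT-backed stream, not the actual code):

const tls = require('tls');
const { Duplex } = require('stream');

// Simplified stand-in for the custom transport (the real one is MQTT-backed).
class CustomTransport extends Duplex {
  _read(size) {
    // Encrypted data arriving from the real transport gets push()ed here.
  }
  _write(chunk, encoding, callback) {
    // Hand the encrypted chunk to the real transport, then acknowledge it.
    process.nextTick(callback);
  }
}

const transport = new CustomTransport();
const secureSocket = tls.connect({
  socket: transport,          // wrap the custom Duplex instead of a TCP socket
  servername: 'example.com',  // placeholder
});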

I did some debugging and it seems to me that this is caused by a missing "current empty write" check in the native TLSWrap implementation. Debug output for the TLSSocket in question:

TLSWrap client (6653) DoWrite()
TLSWrap client (6653) Empty write
TLSWrap client (6653) Trying to read cleartext output
TLSWrap client (6653) Read -1 bytes of cleartext output
TLSWrap client (6653) No pending encrypted output, writing to underlying stream
TLSWrap client (6653) Read 33 bytes from underlying stream
TLSWrap client (6653) Trying to write cleartext input
TLSWrap client (6653) Returning from ClearIn(), no pending data
TLSWrap client (6653) Trying to read cleartext output
TLSWrap client (6653) SSLInfoCallback(SSL_CB_HANDSHAKE_START);
TLSWrap client (6653) Read -1 bytes of cleartext output
TLSWrap client (6653) Trying to write encrypted output
TLSWrap client (6653) Writing 1 buffers to the underlying stream
node:internal/assert:14
    throw new ERR_INTERNAL_ASSERTION(message);
    ^

Error [ERR_INTERNAL_ASSERTION]: This is caused by either a bug in Node.js or incorrect usage of Node.js internals.
Please open an issue with this stack trace at https://github.com/nodejs/node/issues

The failed assertion is produced as follows:

  • Something somewhere in the TLS implementation produces an "empty write" which is handled in a special way in TLSWrap: https://github.com/nodejs/node/blob/v16.14.2/src/crypto/crypto_tls.cc#L944-L961
  • For some reason pending empty writes are tracked in a separate current_empty_write_ variable, not the regular current_write_: https://github.com/nodejs/node/blob/v16.14.2/src/crypto/crypto_tls.cc#L950
  • The write is forwarded to the underlying JSStream, JSStreamSocket and then finally to the custom Duplex stream
  • In my case the _write logic of my custom Duplex stream takes its sweet time writing the data and calls the callback with a considerable delay
  • While this empty write is still pending, new data arrives and is pushed into the read buffer of the underlying Duplex stream. This is forwarded into TLSWrap where it triggers a call to OnStreamRead: https://github.com/nodejs/node/blob/v16.14.2/src/crypto/crypto_tls.cc#L1050
  • OnStreamRead processes the read and then calls Cycle at the end: https://github.com/nodejs/node/blob/v16.14.2/src/crypto/crypto_tls.cc#L1093
  • Cycle in turn calls EncOut which fails to check if current_empty_write_ is set – it only checks current_write_: https://github.com/nodejs/node/blob/v16.14.2/src/crypto/crypto_tls.cc#L574
  • As a result EncOut writes to the underlying stream even though the previous empty write is still pending: https://github.com/nodejs/node/blob/v16.14.2/src/crypto/crypto_tls.cc#L623
  • Since a write is already pending, JSStreamSocket blows up on this assertion: https://github.com/nodejs/node/blob/v16.14.2/lib/internal/js_stream_socket.js#L164

To reproduce the bug the following conditions need to be fulfilled (a rough sketch of the shape follows the list):

  • The TLS stack needs to trigger an "empty write" somehow. This seems to be random as far as I can tell but it happens more often around SSL handshakes or renegotiations.
  • The _write logic in the underlying stream needs a large enough delay before calling the callback.
  • At the same time additional data needs to be already present in the enc_out_ buffer so it can be erroneously written out to the underlying stream during the next EncOut pass.
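
In code, the second and third conditions amount to roughly the following shape (a sketch of the shape only, not a working reproduction):

const { Duplex } = require('stream');

// The transport is wrapped via tls.connect({ socket: transport }) as in the
// sketch above; only the timing-relevant parts are shown here.
class DelayedTransport extends Duplex {
  _read(size) {}
  _write(chunk, encoding, callback) {
    // Condition 2: acknowledge the write only after a noticeable delay, so the
    // (possibly empty) TLS write stays pending in current_empty_write_.
    setTimeout(callback, 500);
  }
}

const transport = new DelayedTransport();
// Condition 3: while that write is still pending, more encrypted data arrives
// and is pushed into the transport's read side, which makes TLSWrap run
// OnStreamRead and then Cycle()/EncOut() before the earlier callback fires.
// (encryptedRecord is a placeholder for real TLS record bytes from the peer.)
const encryptedRecord = Buffer.alloc(0);
transport.push(encryptedRecord);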

I tried but haven't been able to come up with a minimal implementation that reliably reproduces the issue.

For now I am using the following workaround in JS-land, in the constructor of my TLSSocket subclass:

// Patch `JSStreamSocket` implementation to work around a bug in the native `TLSWrap` class. `TLSWrap` treats
// empty writes in a special way, tracking pending ones in a separate state variable, `current_empty_write_`:
// - https://github.com/nodejs/node/blob/v16.14.2/src/crypto/crypto_tls.cc#L934-L961
// Sadly `TLSWrap` forgets to check this variable upon subsequent write requests, only checking `current_write_`:
// - https://github.com/nodejs/node/blob/v16.14.2/src/crypto/crypto_tls.cc#L574
// As a result, if another write is received before the empty write's callback is called, `TLSWrap` forwards it to
// the underlying `JSStreamSocket` straight away where it then blows up with an `ERR_INTERNAL_ASSERTION`:
// - https://github.com/nodejs/node/blob/v16.14.2/lib/internal/js_stream_socket.js#L164
// We work around this problem by patching `doWrite` to buffer any write requests that arrive while another write
// is still in progress, as well as patching `finishWrite` to execute any buffered writes after the pending empty
// write has finished. This *should* prevent the assertion from being hit.
// TODO: re-test with newer Node versions to see if the bug has been fixed
let kCurrentWriteRequest: symbol;
const jsStreamSocket = (this as any).ssl._parentWrap;
// Save original `doWrite` and `finishWrite` methods
const doWrite = jsStreamSocket.doWrite;
const finishWrite = jsStreamSocket.finishWrite;
// List of pending write requests
const pending: [any, Buffer[]][] = [];
// Patch `doWrite` to buffer write requests if another one is already pending
jsStreamSocket.doWrite = (req: any, bufs: Buffer[]) => {
  // Fish the private `kCurrentWriteRequest` symbol out of the `JSStreamSocket` and cache it
  kCurrentWriteRequest =
    kCurrentWriteRequest ||
    Object.getOwnPropertySymbols(jsStreamSocket).find(s => s.description === "kCurrentWriteRequest");
  // Do we have a pending write request already?
  if (kCurrentWriteRequest && jsStreamSocket[kCurrentWriteRequest]) {
    // If so, do not do the write now, instead buffer it for later
    pending.push([req, bufs]);
    return 0;
  }
  // If not, we can go ahead with the write as usual
  return doWrite.call(jsStreamSocket, req, bufs);
};
// Patch `finishWrite` to dispatch any buffered write requests now that the current one has finished
jsStreamSocket.finishWrite = (handle: any, errCode: number) => {
  if (handle) {
    // Finish the current write
    finishWrite.call(jsStreamSocket, handle, errCode);
  }
  // Take the next request from the buffer, if any
  const queued = pending.shift();
  if (queued) {
    // Call `doWrite` now that the previous write has finished
    jsStreamSocket.doWrite(...queued);
  }
};

With this in place I haven't been able to hit the assertion even on a ridiculously overloaded server.

I don't feel comfortable enough around C++ code to attempt to fix this myself (nor do I understand the internals of TLSWrap enough for that), but hopefully this input will help someone more knowledgeable develop a fix.

solymosi avatar Mar 23 '22 13:03 solymosi

MongoDB Compass has encountered an unexpected error:

Error [ERR_INTERNAL_ASSERTION]: This is caused by either a bug in Node.js or incorrect usage of Node.js internals.
Please open an issue with this stack trace at https://github.com/nodejs/node/issues
    at new NodeError (node:internal/errors:406:5)
    at assert (node:internal/assert:14:11)
    at internalConnectMultiple (node:net:1118:3)
    at Timeout.internalConnectMultipleTimeout (node:net:1687:3)
    at listOnTimeout (node:internal/timers:575:11)
    at process.processTimers (node:internal/timers:514:7)

My Node version is v20.13.1.

naveenjpr avatar Jun 27 '24 09:06 naveenjpr

Happy fourth anniversary to this issue! The latest relevant update is more than two years old, so I'm going to optimistically close this; let's reopen if someone is able to reproduce it with a non-EOL version of Node.js.

aduh95 avatar Sep 28 '24 09:09 aduh95