Conversation

@mikicho mikicho commented Jul 6, 2024

@mikicho mikicho mentioned this pull request Jul 6, 2024
@HonzaMac

In my use case, this error occurs when using localstack and calling S3 upload. Uploading the first byte takes so long (it times out) that localstack sends back a 100. A 100 status is not handled correctly by undici/native fetch in Node v20, and undici throws an unhandled status code error.

@kettanaito
Member

kettanaito commented Sep 8, 2024

I've pushed a fix: we are now able to construct a 100 Continue response without it throwing (statuses below 200 are not user-configurable).

Updated the tests. No idea why they fail only when (1) the interceptor is on, and (2) the request.end() call is nested in the continue event callback. Sharing the log outputs for the two scenarios (bypass and mocked):

Bypass

SOCKET WRITE POST /resource HTTP/1.1
expect: 100-continue
Host: 127.0.0.1:62344
Connection: close
Transfer-Encoding: chunked


SOCKET EMIT [ 'resume' ]

stdout | modules/http/compliance/http-request-continue.test.ts > emits "continue" event for a request with "100-continue" expect header
SOCKET EMIT [ 'connect' ]
SOCKET CONNECT
SOCKET EMIT [ 'ready' ]
!!![server] added req.on(data)
SOCKET PUSH HTTP/1.1 100 Continue


SOCKET EMIT [ 'data', 'HTTP/1.1 100 Continue\r\n\r\n' ]
REQ CONTINUE
REQ END
!!!! writing request...
SOCKET WRITE 5
SOCKET WRITE 

SOCKET WRITE hello
SOCKET WRITE 

SOCKET WRITE 0


REQ FINISH

[server] req data: hello

SOCKET PUSH HTTP/1.1 200 OK
X-Powered-By: Express
Date: Sun, 08 Sep 2024 14:59:48 GMT
Connection: close
Transfer-Encoding: chunked

5
hello
0


SOCKET EMIT [
  'data',
  'HTTP/1.1 200 OK\r\n' +
    'X-Powered-By: Express\r\n' +
    'Date: Sun, 08 Sep 2024 14:59:48 GMT\r\n' +
    'Connection: close\r\n' +
    'Transfer-Encoding: chunked\r\n' +
    '\r\n' +
    '5\r\n' +
    'hello\r\n' +
    '0\r\n' +
    '\r\n'
]
REQ RESPONSE

SOCKET FINISH

SOCKET EMIT [ 'prefinish' ]
SOCKET EMIT [ 'finish' ]
SOCKET EMIT [ 'close', false ]

SOCKET CLOSE

Mocked

SOCKET WRITE POST /resource HTTP/1.1
expect: 100-continue
Host: 127.0.0.1:62777
Connection: close
Transfer-Encoding: chunked


[*] request POST http://127.0.0.1:62777/resource
SOCKET EMIT [ 'resume' ]
SOCKET EMIT [ 'resume' ]
SOCKET EMIT [ 'connect' ]
SOCKET CONNECT
SOCKET EMIT [ 'ready' ]
!!![server] added req.on(data)

SOCKET PUSH HTTP/1.1 100 Continue


SOCKET EMIT [ 'data', 'HTTP/1.1 100 Continue\r\n\r\n' ]
REQ CONTINUE
REQ END
!!!! writing request...
SOCKET WRITE 5
SOCKET WRITE 

SOCKET WRITE hello
REQ BODY! hello true
SOCKET WRITE 

SOCKET WRITE 0


REQ FINISH

// HANGS HERE FOR A WHILE UNTIL THE TEST TIMES OUT
// BECAUSE "req.on(data)" NEVER EMITS, AND THUS
// THE SERVER NEVER SENDS A RESPONSE.

SOCKET END

SOCKET FINISH

SOCKET EMIT [ 'end' ]
SOCKET EMIT [ 'close' ]
SOCKET EMIT [ 'prefinish' ]
SOCKET EMIT [ 'finish' ]
SOCKET EMIT [ 'close', false ]

SOCKET CLOSE
SOCKET CLOSE

@kettanaito added the "help wanted" (Extra attention is needed) label on Sep 8, 2024
@JoaquinFernandez

JoaquinFernandez commented Apr 23, 2025

I'll post here since I see this is already under discussion for Nock. This issue is also happening for me: when uploading a big file, the client first sends Expect: 100-continue before sending the body. This is expected behavior from the http lib.

I'd be happy to help implement a solution, since I don't see how I could work around this elsewhere, and upgrading to nock v14 (with fetch support, thanks to mswjs/interceptors) is necessary for me.

@JoaquinFernandez

Bump

@mikicho
Contributor Author

mikicho commented Jun 25, 2025

@kettanaito Maybe things have changed since you worked on this, but this works for me:

const { FetchResponse } = require('./lib/node')
const { ClientRequestInterceptor } = require('./lib/node/interceptors/ClientRequest')
const http = require('http');

const interceptor = new ClientRequestInterceptor();
interceptor.apply()

// Simple HTTP server
const server = http.createServer((req, res) => {
  if (req.method === 'POST') {
    let body = '';
    req.on('data', chunk => (body += chunk));
    req.on('end', () => {
      res.writeHead(200, { 'Content-Type': 'text/plain' });
      res.end('Received: ' + body);
    });
  } else {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('Hello World');
  }
});

server.listen(3001, () => {
  console.log('Server listening on http://localhost:3001');
});

// Example client making a request with 'Expect: 100-continue'
const options = {
  port: 3001,
  method: 'POST',
  headers: {
    'Content-Length': Buffer.byteLength('test body'),
    'Expect': '100-continue'
  }
};

const req = http.request(options, res => {
  let data = '';
  res.on('data', chunk => (data += chunk));
  res.on('end', () => {
    console.log('Response:', data);
    server.close();
  });
});

req.on('continue', () => {
  req.write('test body');
  req.end();
});

The problem arises if the user wants to return a 100 response as the mocked response:

interceptor.on('request', ({controller}) => {
  controller.respondWith(new FetchResponse(null, { status: 100 }))
});

error:

Server listening on http://localhost:3001
node:_http_client:542
emitErrorEvent(req, new ConnResetException('socket hang up'));
^

Error: socket hang up
at MockHttpSocket.socketOnEnd (node:_http_client:542:25)
at MockHttpSocket.emit (node:events:530:35)
at
at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
Emitted 'error' event on ClientRequest instance at:
at emitErrorEvent (node:_http_client:104:11)
at MockHttpSocket.socketOnEnd (node:_http_client:542:5)
at MockHttpSocket.emit (node:events:530:35)
at
at process.processTicksAndRejections (node:internal/process/task_queues:105:5) {
code: 'ECONNRESET'
}

I'll continue (🥁) investigating.

@mikicho
Contributor Author

mikicho commented Jun 25, 2025

I think the problem is that the request "continues" to send the body, but on the interceptor's side we have already passed the body-reading phase, so nothing happens.

The socket hang up error occurs because we end the stream (this.push(null)).
Another weird thing I see, which may be related: this.shouldKeepAlive is always false, even when I pass new Agent({ keepAlive: true }).


Update: OK, I understand the issue now: we get the body (after the continue) in the interceptor, but we can't react to it because we have already responded to the request:

interceptor.on('request', ({controller, request}) => {
  request.text().then(body => controller.respondWith(new FetchResponse('Hello World', { status: 200 }))) // body = "test body" as expected
  controller.respondWith(new FetchResponse(null, { status: 100 }))
});

InterceptorError: Failed to respond to the "POST http://localhost:3001/" request: the "request" event has already been handled.

@kettanaito WDYT?
