Conversation

@usualoma (Member) commented Jul 20, 2025

fixes #260

What is this?

The method of extracting the body from the response used to be determined by a fixed (and somewhat environment-specific) rule based on the response headers. With this change, it is determined by also taking into account whether the body is actually sent asynchronously.

This change may (rarely) cause behavior changes, so it is recommended to release a minor version (or major version).

Reading the current code

The code to be changed (below) has detailed comments explaining the “How,” but there are no comments explaining the “Why.”

node-server/src/listener.ts

Lines 125 to 161 in 29a19ae

  /**
   * If content-encoding is set, we assume that the response should be not decoded.
   * Else if transfer-encoding is set, we assume that the response should be streamed.
   * Else if content-length is set, we assume that the response content has been taken care of.
   * Else if x-accel-buffering is set to no, we assume that the response should be streamed.
   * Else if content-type is not application/json nor text/* but can be text/event-stream,
   * we assume that the response should be streamed.
   */
  const {
    'transfer-encoding': transferEncoding,
    'content-encoding': contentEncoding,
    'content-length': contentLength,
    'x-accel-buffering': accelBuffering,
    'content-type': contentType,
  } = resHeaderRecord
  if (
    transferEncoding ||
    contentEncoding ||
    contentLength ||
    // nginx buffering variant
    (accelBuffering && regBuffer.test(accelBuffering as string)) ||
    !regContentType.test(contentType as string)
  ) {
    outgoing.writeHead(res.status, resHeaderRecord)
    flushHeaders(outgoing)
    await writeFromReadableStream(res.body, outgoing)
  } else {
    const buffer = await res.arrayBuffer()
    resHeaderRecord['content-length'] = buffer.byteLength
    outgoing.writeHead(res.status, resHeaderRecord)
    outgoing.end(new Uint8Array(buffer))
  }
} else if (resHeaderRecord[X_ALREADY_SENT]) {

This code is a bit complex because it was written with great care to account for x-accel-buffering and other factors specific to certain (nginx) environments, but the “Why” is simple: we want to decide, based on the headers, whether to automatically add Content-Length to the response.
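
For example, under that header-based rule, different responses end up in different branches. The classification below is only an illustration inferred from the comment block quoted above:

// Buffered: no special headers, content-type becomes text/plain;charset=UTF-8,
// so the body is read with arrayBuffer() and Content-Length is added automatically.
const plain = new Response('ok')

// Streamed as-is: transfer-encoding has already been set by the application.
const chunked = new Response('ok', {
  headers: { 'transfer-encoding': 'chunked' },
})

// Streamed: text/event-stream is excluded from the "buffer text/*" rule.
const sse = new Response(new ReadableStream(), {
  headers: { 'content-type': 'text/event-stream' },
})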

Synchronous reading of data from res.body (a ReadableStream)

When reading from res.body of a response created like new Response(text), the reads usually look as follows (in Node.js). If Content-Length can be added based on this result, the decision is made from the actual behavior of the stream rather than from environment-specific rules, as is currently the case. This also solves the problem in #260.

const reader = res.body.getReader()
const firstChunk = await reader.read() // returns all of the text, `done` is false
const secondChunk = await reader.read() // `done` is true
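
For illustration, here is one way that signal could be turned into a Content-Length decision: buffer the body and set Content-Length if every read settles before the next macrotask, otherwise fall back to streaming. This is a minimal sketch of the idea, not the code in this PR; sendBody and writeHead are hypothetical names, and backpressure and error handling are omitted.

import type { Writable } from 'node:stream'

const sendBody = async (
  body: ReadableStream<Uint8Array>,
  // Assumed to write the status line and headers (with or without Content-Length)
  // and return the outgoing Node.js stream.
  writeHead: (contentLength?: number) => Writable
): Promise<void> => {
  const reader = body.getReader()
  const chunks: Uint8Array[] = []
  // Any read that has not settled by the next macrotask is treated as asynchronous.
  const later = new Promise<'later'>((resolve) => setImmediate(() => resolve('later')))

  let pending = reader.read()
  while (true) {
    const result = await Promise.race([pending, later])
    if (result === 'later') {
      // Asynchronous body: stream it and let Node.js use chunked encoding.
      const outgoing = writeHead()
      for (const chunk of chunks) outgoing.write(chunk)
      let next = await pending
      while (!next.done) {
        outgoing.write(next.value)
        next = await reader.read()
      }
      outgoing.end()
      return
    }
    if (result.done) {
      // The whole body was produced synchronously, so Content-Length can be set safely.
      const buffer = Buffer.concat(chunks)
      writeHead(buffer.byteLength).end(buffer)
      return
    }
    chunks.push(result.value)
    pending = reader.read()
  }
}

With a body backed by a plain string, both reads settle within microtasks, so the buffering branch is taken; with the delayed ReadableStream example later in this thread, the setImmediate boundary fires first and the response is streamed.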

@usualoma force-pushed the return-from-readable-stream branch 3 times, most recently from f3ce91a to fe5fa5c on July 21, 2025 00:01
@usualoma force-pushed the return-from-readable-stream branch from fe5fa5c to 4d70f93 on July 21, 2025 01:41
@usualoma changed the title from "[WIP] feat: always respond via readableStream" to "feat: always respond via readableStream" on Jul 21, 2025
@usualoma marked this pull request as ready for review on July 21, 2025 01:46
@usualoma (Member, Author)

@yusukebe
Would you please review this?

@usualoma changed the title from "feat: always respond via readableStream" to "feat: always respond res.body" on Jul 21, 2025
) {
  outgoing.writeHead(res.status, resHeaderRecord)
  flushHeaders(outgoing)
  // In the case of synchronous responses, usually a maximum of two readings is done
A Member left a review comment on the diff lines above:

This is a simple and good idea!

@yusukebe (Member)

Hi @usualoma

This PR's idea makes sense, and the implementation is well done!

> This change may (rarely) cause behavior changes, so it is recommended to release a minor version (or major version).

Okay! By the way, I've considered those cases, but I couldn't find a good example. Can you think of any real-world cases?

@rphlmr commented Jul 22, 2025

Works great, solving #260.

Tested on:

  • Node production mode
  • Node development mode with Vite (@hono/vite-dev-server)
  • Bun (@hono/vite-dev-server)

@yusukebe (Member)

I also tested some patterns that include returning streaming responses. They work great.

@usualoma (Member, Author)

An app whose behavior changes

Typical examples of apps whose behavior changes are as follows. When returning (Content-Length-less) text/html with a slight delay, the main branch automatically added Content-Length, but this branch no longer does so. I think this situation is possible with certain types of proxies.

However, this does not normally occur when using Hono's proxy helper with Node.js. Are there any real-world applications that return text/html without a Content-Length header, with a slight delay? If so, is it expected that these applications automatically add a Content-Length header? I am not sure about this point. (If such applications exist, they are likely very rare.)

import { Hono } from 'hono'
import { serve } from '@hono/node-server'

const app = new Hono()

app.get(
  '/',
  () => new Response(new ReadableStream({
    async start(controller) {
      await new Promise((resolve) => setTimeout(resolve, 100))
      controller.enqueue(new TextEncoder().encode('Hello, world!'))
      controller.close()
    },
  }), {
    headers: {
      'Content-Type': 'text/html',
    },
  })
)

serve(
  {
    fetch: app.fetch,
    port: 3000,
    overrideGlobalObjects: false,
  },
  (info) => {
    console.log(`Server is running on http://localhost:${info.port}`)
  },
)

main branch

% curl -i http://localhost:3000
HTTP/1.1 200 OK
content-type: text/html
content-length: 13
Date: Tue, 22 Jul 2025 21:05:59 GMT
Connection: keep-alive
Keep-Alive: timeout=5

Hello, world!

this branch

% curl -i http://localhost:3000
HTTP/1.1 200 OK
content-type: text/html
Date: Tue, 22 Jul 2025 21:04:53 GMT
Connection: keep-alive
Keep-Alive: timeout=5
Transfer-Encoding: chunked

Hello, world!

@usualoma (Member, Author)

There may be use cases where the current main branch behavior is fine, but I think this pull request branch is simpler, makes more sense, and is easier for users to control. (If Content-Length is necessary, users can just add it themselves.)
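
For instance, continuing the example app above, a handler that needs a fixed Content-Length can set the header itself (a small sketch; the route path is arbitrary):

app.get('/with-length', () => {
  const body = 'Hello, world!'
  return new Response(body, {
    headers: {
      'Content-Type': 'text/html',
      // byteLength is used so that multi-byte characters are counted correctly.
      'Content-Length': String(new TextEncoder().encode(body).byteLength),
    },
  })
})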

@rafaell-lycan

@usualoma could this affect the performance?

@usualoma (Member, Author)

Hi @rafaell-lycan

When using node-server, typical responses from c.html() and c.text() (i.e., the fastest-case responses) do not pass through the changed code path, so there is no impact.

For requests that do pass through the changed path, the bottleneck is likely to be elsewhere, and the amount of computation does not increase with this change, so there should be no impact there either.

I will measure it when I have time.

@yusukebe (Member)

Hey @usualoma

Thank you for the reply!

> There may be use cases where the current main branch behavior is fine, but I think this pull request branch is simpler, makes more sense, and is easier for users to control. (If Content-Length is necessary, users can just add it themselves.)

I totally agree with what you mentioned. This PR's implementation is simple. I also think that, in the previous implementation, determining whether or not the response was a stream based on the value of resHeaderRecord was not the best approach.

> Are there any real-world applications that return text/html without a Content-Length header, with a slight delay? If so, is it expected that these applications automatically add a Content-Length header?

I also think that this is a rare case, and we don't need to worry about it. We can include it in the next minor version.

@yusukebe (Member) left a comment:

LGTM!

@yusukebe (Member)

Looks good to me! If it doesn't have a performance issue, I can merge this.

@usualoma (Member, Author)

benchmarks

I ran benchmarks on the following items and found no performance degradation due to this change.

app

import { Hono } from 'hono'
import { serve } from '@hono/node-server'

const app = new Hono()

app.get('/', () => {
  const res = new Response('foo')
  res.body // accessing .body creates the original globalThis.Response
  return res
})

// Serve on port 3003 to match the benchmark commands below (setup assumed).
serve({ fetch: app.fetch, port: 3003 })

result with usualoma:return-from-readable-stream

% bombardier -d 10s --fasthttp http://localhost:3003/
Bombarding http://localhost:3003/ for 10s using 125 connection(s)
[======================================================================================================================] 10s
Done!
Statistics        Avg      Stdev        Max
  Reqs/sec     47520.41    5089.03   55326.86
  Latency        2.63ms     2.66ms   304.48ms
  HTTP codes:
    1xx - 0, 2xx - 475145, 3xx - 0, 4xx - 0, 5xx - 0
    others - 0
  Throughput:    10.29MB/s

result with main

% bombardier -d 10s --fasthttp http://localhost:3003/
Bombarding http://localhost:3003/ for 10s using 125 connection(s)
[======================================================================================================================] 10s
Done!
Statistics        Avg      Stdev        Max
  Reqs/sec     47593.05    5084.84   52283.44
  Latency        2.63ms     2.49ms   290.43ms
  HTTP codes:
    1xx - 0, 2xx - 475793, 3xx - 0, 4xx - 0, 5xx - 0
    others - 0
  Throughput:    10.30MB/s

@yusukebe (Member)

@usualoma

Great that this resolves the issue without performance degradation! Let's go with this. Thank you so much!

@yusukebe merged commit 3c1a647 into honojs:main on Jul 27, 2025
4 checks passed
@yusukebe (Member)

Hi @usualoma

Unfortunately, the test fails on Node.js v24.x (I noticed it on my machine, where Node.js 24 is installed, and I have now added v24 to the CI):

https://github.com/honojs/node-server/actions/runs/16546886371/job/46795876085

The error occurs in the GlobalResponse section, so the behavior of the global Response may have changed in v24. The code itself is good, and this is probably not a major issue; we may just need to fix a test. Sorry for bothering you, but can you take a look at it?

@usualoma deleted the return-from-readable-stream branch on July 29, 2025 21:02
Linked issue: ReadableStream body issue since v1.14.2 (#260)