HTTP2 download of parallel requests is slower than HTTP1.1 #54874
Comments
Are you sure it's not the server that is being requested? Have you tried to reproduce with another http2 client?
How different were the speeds compared? CC @nodejs/http2
Yeah, I'm sure it's not the server that is being requested; I have tried reproducing with another http2 client as well.
Here are the results of the runs I just completed against our internal APIs (25 parallel requests at the same time, using the script I provided, for node and for curl).
Also, I was able to get different results even when fetching the URL from the script.
I just ran the node profiler for the script I provided, fetching our internal API, and the results are weird... Here's the summary for http2
and for http1.1
http2 has fewer total ticks, but more GC ticks. I wonder if it could somehow be related to GC
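Tick summaries like the ones quoted above come from V8's built-in sampling profiler. A generic workflow (a sketch with a stand-in workload, not the exact commands from this thread):

```shell
# Run any workload with the V8 sampling profiler enabled; this writes an
# isolate-*.log file into the current directory. The inline loop is only a
# stand-in for the actual test script.
node --prof -e 'let s = 0; for (let i = 0; i < 1e7; i++) s += i; console.log(s);'
# Post-process the isolate log into the human-readable tick summary.
node --prof-process isolate-*.log > profile.txt
head -n 5 profile.txt
```

The summary breaks ticks down by JavaScript, C++, GC, and shared-library time, which is how the GC share above was compared between the two protocols.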
I guess something very odd is happening within http/2. I don't think the problem here is GC: it's 100% true that HTTP/2 consumes more resources than HTTP/1.1, but the difference should be smaller. It seems that curl can progress all HTTP/2 streams in parallel, while we can't, and after a bit it becomes sequential. I wonder if @jasnell has some ideas.
@mcollina I did some testing today and found some new insights. This does not happen for our internal endpoints with little data; those are processed faster with http2. When fetching a heavy endpoint with lots of data (JSON with lots of text, most of it contained in one particular field), http1.1 seems to be much faster. Besides, when fetching the heavy endpoint, http1.1 receives smaller chunks per 'data' event. What's interesting: in the cases when http1.1 is slower than http2, it receives larger chunks per 'data' event.
@stanislav-halyn I played with your script a bit and I wasn't able to reproduce the http2 slowness (compared to http1) even once. Do you have any more updates on the issue? Did you find anything else?
@puskin94 I just re-ran the tests on our internal APIs, and to my surprise the results are the same now. The results for 25 parallel requests to a heavy API endpoint used to be ~1.5s for http1 and ~4-5s for http2. I also re-ran the test for the nodejs.org URL. Here's the script I used: test.js
/* eslint-disable */
import http2 from 'node:http2';
import https from 'node:https';

const isHttp2 = process.argv.includes('http2');
const isSilent = process.argv.includes('silent');

const MB = 1024 * 1024;
const WINDOW_SIZE = 32 * MB;

// let url = new URL('https://nodejs.org/dist/v17.1.0/node-v17.1.0-x64.msi');
let url = new URL('https://nodejs.org/dist/index.json');

const client = http2.connect(url.origin, {
  settings: {
    // Advertise a large per-stream flow-control window to the server.
    initialWindowSize: WINDOW_SIZE,
  },
  rejectUnauthorized: false,
});

client.on('connect', () => {
  // Also widen the connection-level (session) window on our side.
  client.setLocalWindowSize(WINDOW_SIZE);
});

const logger = isSilent
  ? { log: () => {}, info: console.info }
  : { log: console.log, info: console.info };

const headers = {};

function fetchHttp1(id) {
  return new Promise((resolve) => {
    const req = https.request(
      {
        rejectUnauthorized: false,
        host: url.hostname,
        path: url.pathname + url.search,
        port: url.port,
        headers: {
          connection: 'keep-alive',
          ...headers,
        },
      },
      (res) => {
        let counter = 0;
        let bytes = 0;
        res.on('data', (chunk) => {
          counter += 1;
          bytes += Buffer.byteLength(chunk);
        });
        res.on('end', () => {
          logger.log(
            `Complete request with id: ${id}, bytes download: ${Math.floor(bytes / counter)} bytes/chunk`,
          );
          resolve();
        });
      },
    );
    req.end();
  });
}

function fetchHttp2(id) {
  return new Promise((resolve) => {
    const req = client.request({
      ':path': url.pathname + url.search,
      ...headers,
    });
    let counter = 0;
    let bytes = 0;
    req.on('data', (chunk) => {
      counter += 1;
      bytes += Buffer.byteLength(chunk);
    });
    req.on('end', () => {
      logger.log(
        `Complete request with id: ${id}, bytes download: ${Math.floor(bytes / counter)} bytes/chunk`,
      );
      resolve();
    });
    req.end();
  });
}

async function main() {
  console.log(`process id: ${process.pid}`);
  console.log(`Starting requests using HTTP${isHttp2 ? '2' : '1.1'} protocol. URL: ${url}`);
  const startTime = Date.now();
  await Promise.all(
    Array.from({ length: 25 }, (_, index) =>
      isHttp2 ? fetchHttp2(index) : fetchHttp1(index),
    ),
  );
  console.log(
    `Requests complete. Completion time: ${(Date.now() - startTime) / 1000}s.`,
  );
  process.exit(0);
}

main();
@stanislav-halyn for me http2 is consistently faster (by a tiny bit, of course, given the test file). I believe your network was playing a big role when you submitted the issue the first time 😄
@puskin94 yeah, you might be right 😄 I do wonder if there can be some kind of TCP packet queuing with http2 on a fast network? Meaning, the packets can come through only one TCP connection, and they arrive so fast that they are not processed quickly enough and get sent to some kind of queue. Maybe that would be the explanation 🤔
@puskin94 It's really weird, because I can still reproduce the issue on one of our internal API endpoints, which is quite JSON-heavy with lots of text (384803 bytes). Here are the results:
@puskin94 I also ran the same test using an HTTP client in Rust, and the results were consistent.
Version
v20.17.0
Platform
Subsystem
No response
What steps will reproduce the bug?
Define an http2 and an http1.1 client and fetch some data in parallel.
You can use the script below. If you run it with node client.js, it will send 25 parallel requests using HTTP1.1,
or you can run node client.js http2 and it will send 25 parallel requests using HTTP2:
client.js
How often does it reproduce? Is there a required condition?
It's not quite stable, because sometimes http2 has the same or even better performance.
What is the expected behavior? Why is that the expected behavior?
I expect http2 to be faster or at least have the same performance as http1.1.
What do you see instead?
Instead, I often see that http1.1 requests are faster. Most of the time, http2 took 25s to complete while http1.1 took 20s.
There's a problem, though, because those results are not consistent when using the URL I provided in the example.
But the results are consistent with some internal APIs I'm working with. At first I thought there might be a problem with our internal APIs, but making 25 parallel requests to our internal APIs using curl consistently gives the same performance on both http1.1 and http2. I ran curl with the following command:
where config.txt looked like this:
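The exact command and config.txt did not survive in the report. A reconstruction of roughly that shape (the public URL from the script stands in for the internal API; curl's --parallel, --http2, and --config options are real, but the specific flags used here are my guess) would drive 25 parallel requests from a config file of repeated url entries:

```shell
# Reconstruction -- the original command and config were elided from the issue.
# Build a config.txt with 25 identical "url" entries...
: > config.txt
i=0
while [ "$i" -lt 25 ]; do
  echo 'url = "https://nodejs.org/dist/index.json"' >> config.txt
  i=$((i + 1))
done
wc -l config.txt
# ...then let curl fetch them all in parallel over HTTP/2
# (not executed here, since it needs network access):
#   curl --http2 --parallel --parallel-max 25 --config config.txt -sS -o /dev/null
```

With --http2 and --parallel, curl multiplexes the requests over a single connection where the server allows it, which matches the Wireshark observation below.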
Additional information
I was thinking the reason for such slowness could be a TCP stall, because http2 uses only one TCP connection, so I checked in Wireshark how many TCP connections node creates vs curl when using http2, and both created only one TCP connection.
Also, I noticed that node receives the first ~10 http2 responses in random order, but the rest arrive sequentially, while the http1 responses all arrive in random order.
http2:
http1: