Conversation

@wesleytodd
Member

Took a quick look at converting @RafaelGSS's local server/client into a runner. I didn't have time this morning to look at the wrk2/autocannon stuff, but I am assuming that #62 was not trying to say "ditch that abstraction entirely"; if it was, I would like to discuss why we should do that.

@wesleytodd
Member Author

There are still quite a few gaps:

  • Sorting out local installs for the CLIs (global installs are a no-go imo)
  • Finding the right format for passing requests to wrk2 like we do to autocannon
  • Applying overrides
  • Deciding how we would want to support the --node option (maybe this one doesn't need it, since it would use whatever Node you have locally)

But @RafaelGSS are you alright with this direction so that we align with the other runner styles and features?

Contributor

@RafaelGSS RafaelGSS left a comment


Sorry for the delay, @wesleytodd, a lot of things have been going on recently, as you might know.

To be honest, I was trying to say "ditch that abstraction entirely" - it's a personal preference, so I'm fine leaving it as internal packages. My feeling is that this increases the contribution barrier and makes things overly complex.

Also, I do believe wrk2 (at least) should be a global binary - making it available through this CLI would make things very complex, as the binary needs different dependencies for each supported environment. Leaving that open for the "sysadmin" seems a better choice, to avoid making things more complex than they are nowadays.

(That's what we do for Node.js too)

@RafaelGSS
Contributor

Ping me when this is ready to review @wesleytodd

@wesleytodd
Member Author

wesleytodd commented Dec 1, 2025

Will do, just cleaning things up and will move it from draft when it is ready.

@wesleytodd wesleytodd force-pushed the add-local-server-runner branch from 5a5c185 to 3499db7 Compare December 1, 2025 13:35
cli: add local-bench to exbf

Output example

> @expressjs/perf-wg@1.0.0 local-load
> expf local-load

Running autocannon load...
Result Autocannon: 121610.67
Running wrk2 load...
Result Wrk2: [
  { percentile: 50, ms: 0.829 },
  { percentile: 75, ms: 1.1 },
  { percentile: 90, ms: 1.41 },
  { percentile: 99, ms: 1.75 },
  { percentile: 99.9, ms: 1.88 },
  { percentile: 99.99, ms: 2 },
  { percentile: 99.999, ms: 2.2 },
  { percentile: 100, ms: 2.2 }
]
@wesleytodd wesleytodd force-pushed the add-local-server-runner branch from 3499db7 to e764ef6 Compare December 1, 2025 14:07
@wesleytodd wesleytodd force-pushed the add-local-server-runner branch 2 times, most recently from 2870e11 to ce4622a Compare December 2, 2025 15:33
@wesleytodd wesleytodd force-pushed the add-local-server-runner branch from ce4622a to a61605e Compare December 2, 2025 15:51

// Start here, await in .results()
const toAwait = requesters.flatMap((requester) => {
return requests.map((request) => {
Member Author

@wesleytodd wesleytodd Dec 2, 2025


By iterating the requests array we start one CLI process per request, since the CLIs don't support making multiple request shapes well, but also so that we can compare to see if one specific request was the cause of problems. We may want a setup that separates each out and composes them back together in the future, but for now this means we can support patterns like #77 for more "real world" tests.
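The snippet above is truncated by the diff view; filled in, the shape is roughly the following. This is an illustrative sketch only, not the PR's actual code - the `start()`/`results()` names follow the comment in the snippet, everything else is made up for illustration:

```javascript
// One CLI process per (requester, request) pair, started up front;
// results are awaited later so a regression stays attributable to a
// specific request shape.
// NOTE: illustrative sketch, not the PR's actual implementation.
function startAll (requesters, requests) {
  return requesters.flatMap((requester) => {
    return requests.map((request) => requester.start(request))
  })
}

// Await everything that was started, preserving order.
async function results (toAwait) {
  return Promise.all(toAwait)
}
```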

Contributor


Honestly, I'd make it opt-in.

if (parallelRequests) {
  return Promise.all(requests.map((req) => requester.start(req)))
} else {
  const results = []
  for (const req of requests) results.push(await requester.start(req))
  return results
}

We definitely don't want this to be the default for the simple process.

"name": "@expressjs/perf-requests",
"version": "1.0.0",
"exports": {
".": "./index.mjs",
Member Author


I think after this we may be able to get rid of the exports with a small refactor, to make adding request sets easier in the future. I am not going to do that now, since this needs to land so we can get the CI integration over the finish line, but I just wanted to call out why I added this file here.


// TODO: I don't see docs on how to do this with wrk2
if (opts.method || this.method) {
throw new Error('not yet supported');
Member Author


method and body did not show up in the readme, and since I didn't get wrk2 building on my Mac I just left these here for now. We should fix this before merging.

Contributor


I can test it.

});
}

processResults (output) {
Member Author


Since I didn't get wrk2 built locally, I also didn't test this after the refactor. I think I faithfully copied it, but this needs more testing before we land this.
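For reference while testing: wrk2's `--latency` output prints a latency-distribution block with lines like `50.000%  0.83ms`, so the parsing presumably boils down to a regex over those lines, producing the `{ percentile, ms }` objects shown in the output example above. A rough sketch, untested against a real wrk2 run (the exact line format and unit handling are assumptions):

```javascript
// Parse wrk2 `--latency` percentile lines into { percentile, ms } objects.
// NOTE: sketch only - the line format is assumed, not verified against wrk2.
function parseWrk2Latency (output) {
  const results = []
  // e.g. " 50.000%    0.83ms"
  const re = /^\s*([\d.]+)%\s+([\d.]+)(us|ms|s)\s*$/gm
  let match
  while ((match = re.exec(output)) !== null) {
    const [, percentile, value, unit] = match
    // Normalize everything to milliseconds.
    const scale = unit === 'us' ? 0.001 : unit === 's' ? 1000 : 1
    results.push({ percentile: Number(percentile), ms: Number(value) * scale })
  }
  return results
}
```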

app.post(expressVersion.startsWith('4.') ? '*' : '*path', (req, res) => {
res.status(200).json({
hello: 'body!',
method: req.method,
Member Author


This was just to test that the new requests were passing method through correctly. Not necessary for this PR.

@wesleytodd wesleytodd marked this pull request as ready for review December 2, 2025 15:58
@wesleytodd wesleytodd requested a review from a team December 2, 2025 15:58
}

// -H/--headers K=V
for (const [header, value] of Object.entries(opts.headers || this.headers)) {
Member Author


opts.headers and this.headers should probably be merged?

Contributor

@RafaelGSS RafaelGSS left a comment


I think it's also missing a doc update.

// -c/--connections NUM
'-c', this.connections,
// -w/--workers NUM
'-w', Math.min(this.connections, availableParallelism() || 8),
Contributor


Suggested change
'-w', Math.min(this.connections, availableParallelism() || 8),
'-w', Math.min(this.connections, availableParallelism() || 4),

8 sounds like too much for the dedicated machine (availableParallelism is not available on older Node versions)

