
Node.js Core Benchmarks

This folder contains code and data used to measure performance of different Node.js implementations and different ways of writing JavaScript run by the built-in JavaScript engine.

For a detailed guide on how to write and run benchmarks in this directory, see the guide on benchmarks.

Table of Contents

  • File tree structure
    • Directories
    • Other Top-level files
  • Common API
    • createBenchmark(fn, configs[, options])
    • default_http_benchmarker
    • PORT
    • sendResult(data)

File tree structure

Directories

Benchmarks testing the performance of a single node submodule are placed into a directory with the corresponding name, so that they can be executed by submodule or individually. Benchmarks that span multiple submodules may either be placed into the misc directory or into a directory named after the feature they benchmark. For example, benchmarks for various new ECMAScript features and their pre-ES2015 counterparts are placed in a directory named es. Fixtures that are not specific to a certain benchmark but can be reused throughout the benchmark suite should be placed in the fixtures directory.

Other Top-level files

The top-level files include common dependencies of the benchmarks and the tools for launching benchmarks and visualizing their output. The actual benchmark scripts should be placed in their corresponding directories.

  • _benchmark_progress.js: implements the progress bar displayed when running compare.js.
  • _cli.js: parses the command line arguments passed to compare.js, run.js and scatter.js.
  • _cli.R: parses the command line arguments passed to compare.R.
  • _http-benchmarkers.js: selects and runs external tools for benchmarking the http subsystem.
  • common.js: see Common API.
  • compare.js: command line tool for comparing performance between different Node.js binaries.
  • compare.R: R script for statistically analyzing the output of compare.js.
  • run.js: command line tool for running individual benchmark suite(s).
  • scatter.js: command line tool for comparing the performance between different parameters in benchmark configurations, for example to analyze the time complexity.
  • scatter.R: R script for visualizing the output of scatter.js with scatter plots.

Common API

The common.js module is used by benchmarks for consistency across repeated tasks. It has a number of helpful functions and properties to help with writing benchmarks.

createBenchmark(fn, configs[, options])

See the guide on writing benchmarks.
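
As a minimal sketch of the pattern described in that guide (the configuration values and the loop body below are illustrative, not taken from any existing benchmark):

```js
'use strict';
const common = require('../common.js');

// Each combination of configuration values is run in its own child process;
// `main` receives one combination at a time. The values here are placeholders.
const bench = common.createBenchmark(main, {
  n: [1e5],
  type: ['utf8', 'latin1'],
});

function main({ n, type }) {
  bench.start();                              // start the timer
  for (let i = 0; i < n; i++) {
    Buffer.byteLength('hello world', type);   // the operation being measured
  }
  bench.end(n);                               // stop the timer, report n operations
}
```

Note that code outside main runs in the parent process as well as in each forked child, so expensive setup generally belongs inside main.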

default_http_benchmarker

The default benchmarker used to run HTTP benchmarks. See the guide on writing HTTP benchmarks.

PORT

The default port used to run HTTP benchmarks. See the guide on writing HTTP benchmarks.
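
A rough sketch of how an HTTP benchmark typically ties PORT and the HTTP benchmarker together, closely following the shape of the example in that guide (the response body and configuration values are placeholders):

```js
'use strict';
const common = require('../common.js');
const http = require('node:http');

const bench = common.createBenchmark(main, {
  connections: [100],   // concurrent connections opened by the external benchmarker
  duration: 5,          // seconds of load per run
});

function main(conf) {
  const server = http.createServer((req, res) => {
    res.end('hello world');
  });

  // Listen on the shared benchmark port, then hand control to the configured
  // HTTP benchmarker; close the server once it has finished and reported.
  server.listen(common.PORT, () => {
    bench.http({ connections: conf.connections }, () => {
      server.close();
    });
  });
}
```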

sendResult(data)

Used in special benchmarks that can't use createBenchmark and the object it returns to accomplish what they need. This function reports timing data to the parent process (usually created by running compare.js, run.js or scatter.js).
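
A minimal sketch of such a benchmark, assuming the result object uses the same fields (name, conf, rate, time, type) that createBenchmark's own reporting sends; check common.js before relying on the exact shape:

```js
'use strict';
const { sendResult } = require('../common.js');

const n = 1e6;
const start = process.hrtime();

let sum = 0;
for (let i = 0; i < n; i++) sum += i;   // the work being timed
if (sum < 0) throw new Error('unreachable');  // keep the loop from being optimized away

const [sec, nsec] = process.hrtime(start);
const elapsed = sec + nsec / 1e9;

// The benchmark name below is hypothetical; the field names are an assumption
// mirroring what createBenchmark reports internally.
sendResult({
  name: 'misc/manual-timing-example',
  conf: { n },
  rate: n / elapsed,   // operations per second
  time: elapsed,       // elapsed wall-clock time in seconds
  type: 'report',
});
```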