
Node.js Core Benchmarks

This folder contains code and data used to measure performance of different Node.js implementations and different ways of writing JavaScript run by the built-in JavaScript engine.

For a detailed guide on how to write and run benchmarks in this directory, see the guide on benchmarks.

Table of Contents

  • File tree structure
    • Directories
    • Other top-level files
  • Common API

File tree structure

Directories

Benchmarks testing the performance of a single Node.js submodule are placed in a directory with the corresponding name, so that they can be executed per submodule or individually. Benchmarks that span multiple submodules may either be placed in the misc directory or in a directory named after the feature they benchmark. For example, benchmarks for various new ECMAScript features and their pre-ES2015 counterparts are placed in a directory named es. Fixtures that are not specific to a certain benchmark but can be reused throughout the benchmark suite should be placed in the fixtures directory.

Other top-level files

The top-level files include common dependencies of the benchmarks and the tools for launching benchmarks and visualizing their output. The actual benchmark scripts should be placed in their corresponding directories.

  • _benchmark_progress.js: implements the progress bar displayed when running compare.js.
  • _cli.js: parses the command-line arguments passed to compare.js, run.js, and scatter.js.
  • _cli.R: parses the command-line arguments passed to compare.R.
  • _http-benchmarkers.js: selects and runs external tools for benchmarking the http subsystem.
  • common.js: see Common API.
  • compare.js: command-line tool for comparing performance between different Node.js binaries.
  • compare.R: R script for statistically analyzing the output of compare.js.
  • run.js: command-line tool for running individual benchmark suite(s).
  • scatter.js: command-line tool for comparing performance between different parameters in benchmark configurations, for example, to analyze time complexity.
  • scatter.R: R script for visualizing the output of scatter.js with scatter plots.

Common API

The common.js module is used by benchmarks for consistency across repeated tasks. It provides a number of functions and properties that help with writing benchmarks.

createBenchmark(fn, configs[, options])

See the guide on writing benchmarks.
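To illustrate the shape of a benchmark file, here is a self-contained sketch. It uses a minimal stand-in for createBenchmark rather than the real benchmark/common.js (which forks a child process per configuration combination and reports through sendResult); the workload and config names (n, kind) are hypothetical.

```javascript
'use strict';

// Minimal stand-in for createBenchmark() from benchmark/common.js,
// for illustration only. The real harness forks a child process per
// configuration combination; this version just times main() in-process.
// In an actual benchmark you would write: const common = require('../common.js');
function createBenchmark(main, configs) {
  const bench = {
    start() { this._t0 = process.hrtime.bigint(); },
    end(operations) {
      const ns = Number(process.hrtime.bigint() - this._t0);
      console.log(`${JSON.stringify(this._conf)}: ` +
                  `${(operations / (ns / 1e9)).toFixed(0)} ops/sec`);
    },
  };
  // Expand { n: [...], kind: [...] } into every combination of values.
  const combos = Object.keys(configs).reduce(
    (acc, key) => acc.flatMap((c) => configs[key].map((v) => ({ ...c, [key]: v }))),
    [{}],
  );
  process.nextTick(() => {
    for (const conf of combos) {
      bench._conf = conf;
      main(conf);
    }
  });
  return bench;
}

// A benchmark file then follows this pattern (hypothetical workload):
const bench = createBenchmark(main, {
  n: [1e5],
  kind: ['concat', 'template'],
});

function main({ n, kind }) {
  bench.start();
  let s = '';
  for (let i = 0; i < n; i++)
    s = kind === 'concat' ? 'a' + i : `a${i}`;
  bench.end(n);
}
```

The key convention shown here matches the real API: main receives one configuration combination, brackets the measured region with bench.start() and bench.end(operations), and the rate is derived from the operation count and elapsed time.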

default_http_benchmarker

The default benchmarker used to run HTTP benchmarks. See the guide on writing HTTP benchmarks.

PORT

The default port used to run HTTP benchmarks. See the guide on writing HTTP benchmarks.

sendResult(data)

Used in special benchmarks that can't use createBenchmark and the object it returns to accomplish what they need. This function reports timing data to the parent process (usually created by running compare.js, run.js, or scatter.js).
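A sketch of the reporting behavior described above, assuming the runners fork the benchmark as a child process so an IPC channel is available. This is an illustrative stand-in, not the exact implementation in common.js, and the field names in the reported object are hypothetical.

```javascript
'use strict';

// Illustrative stand-in for sendResult(): report a result object to the
// parent process over the IPC channel when the benchmark was forked by
// a runner, and print it when run standalone.
function sendResult(data) {
  if (typeof process.send === 'function') {
    process.send(data);                 // forked by a runner: report over IPC
  } else {
    console.log(JSON.stringify(data));  // run standalone: just print
  }
}

// A "special" benchmark measures something createBenchmark cannot easily
// express, then reports the timing directly:
sendResult({
  name: 'misc/startup-sketch',  // hypothetical benchmark name
  time: process.uptime(),       // seconds since the process started
});
```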