# Node.js Core Benchmarks
This folder contains code and data used to measure performance of different Node.js implementations and different ways of writing JavaScript run by the built-in JavaScript engine.
For a detailed guide on how to write and run benchmarks in this directory, see the guide on benchmarks.
## Table of Contents

* [File tree structure](#file-tree-structure)
  * [Directories](#directories)
  * [Other top-level files](#other-top-level-files)
* [Common API](#common-api)

## File tree structure

### Directories
Benchmarks testing the performance of a single node submodule are placed into a
directory with the corresponding name, so that they can be executed by submodule
or individually.
Benchmarks that span multiple submodules may either be placed into the `misc`
directory or into a directory named after the feature they benchmark.
E.g. benchmarks for various new ECMAScript features and their pre-ES2015
counterparts are placed in a directory named `es`.
Fixtures that are not specific to a certain benchmark but can be reused
throughout the benchmark suite should be placed in the `fixtures` directory.
### Other top-level files
The top-level files include common dependencies of the benchmarks and the tools for launching benchmarks and visualizing their output. The actual benchmark scripts should be placed in their corresponding directories.
* `_benchmark_progress.js`: implements the progress bar displayed when running
  `compare.js`.
* `_cli.js`: parses the command line arguments passed to `compare.js`, `run.js`
  and `scatter.js`.
* `_cli.R`: parses the command line arguments passed to `compare.R`.
* `_http-benchmarkers.js`: selects and runs external tools for benchmarking the
  `http` subsystem.
* `common.js`: see [Common API](#common-api).
* `compare.js`: command line tool for comparing performance between different
  Node.js binaries.
* `compare.R`: R script for statistically analyzing the output of `compare.js`.
* `run.js`: command line tool for running individual benchmark suite(s).
* `scatter.js`: command line tool for comparing the performance between
  different parameters in benchmark configurations, for example to analyze the
  time complexity.
* `scatter.R`: R script for visualizing the output of `scatter.js` with
  scatter plots.
## Common API

The `common.js` module is used by benchmarks for consistency across repeated
tasks. It has a number of helpful functions and properties to help with
writing benchmarks.
### `createBenchmark(fn, configs[, options])`
See the guide on writing benchmarks.
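As a quick illustration, the sketch below shows the shape such a benchmark file
typically takes. The file name, category, and parameter values are hypothetical;
see the guide for the authoritative details.

```js
// benchmark/buffers/buffer-concat-example.js (hypothetical file name)
'use strict';
const common = require('../common.js');

// Each property in the configuration object is a benchmark parameter;
// the launcher tools execute the benchmark once per combination of values.
const bench = common.createBenchmark(main, {
  n: [1e5],          // iterations per run
  size: [16, 1024],  // buffer size in bytes
});

// main() receives the configuration selected for the current run.
function main({ n, size }) {
  const pieces = [Buffer.alloc(size), Buffer.alloc(size)];
  bench.start();
  for (let i = 0; i < n; i++)
    Buffer.concat(pieces);
  bench.end(n);  // report n operations for the timed section
}
```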
### `default_http_benchmarker`
The default benchmarker used to run HTTP benchmarks. See the guide on writing HTTP benchmarks.
### `PORT`
The default port used to run HTTP benchmarks. See the guide on writing HTTP benchmarks.
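As a rough sketch, an HTTP benchmark usually starts a server on `common.PORT`
and lets the selected benchmarker drive it through `bench.http()`. The file
name, parameter values, and the exact `bench.http()` options shown here are
assumptions modeled on existing HTTP benchmarks; consult the guide before
copying them.

```js
// benchmark/http/hello-world-example.js (hypothetical file name)
'use strict';
const common = require('../common.js');
const http = require('http');

const bench = common.createBenchmark(main, {
  connections: [100],  // concurrent connections driven by the benchmarker
  duration: 5,         // seconds
});

function main({ connections, duration }) {
  const server = http.createServer((req, res) => {
    res.end('Hello, world!');
  });
  server.listen(common.PORT, () => {
    // bench.http() runs the configured HTTP benchmarker against the server
    // and reports the measured request rate when it finishes.
    bench.http({
      path: '/',
      connections,
      duration,
    }, () => {
      server.close();
    });
  });
}
```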
### `sendResult(data)`
Used in special benchmarks that can't use `createBenchmark` and the object it
returns to accomplish what they need. This function reports timing data to the
parent process (usually created by running `compare.js`, `run.js` or
`scatter.js`).
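As a hedged sketch, a benchmark that does its own timing might report a result
like this. The field names on the result object (`name`, `conf`, `rate`,
`time`, `type`) are assumptions modeled on what `createBenchmark`-based
benchmarks send; verify them against `common.js` before relying on them.

```js
// Hypothetical special benchmark that times an operation itself.
'use strict';
const common = require('../common.js');

const n = 1e6;
const start = process.hrtime.bigint();
for (let i = 0; i < n; i++) {
  Math.sqrt(i);  // stand-in for the operation being measured
}
const seconds = Number(process.hrtime.bigint() - start) / 1e9;

common.sendResult({
  name: 'misc/example',  // hypothetical benchmark name
  conf: { n },           // configuration used for this run
  rate: n / seconds,     // operations per second
  time: seconds,         // elapsed time in seconds
  type: 'report',        // assumed marker expected by the parent tools
});
```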