node/benchmark
Latest commit 04d16646a0 by Trevor Norris: worker: add eventLoopUtilization()
Allow calling eventLoopUtilization() directly on a worker thread:

    const worker = new Worker('./foo.js');
    const elu = worker.performance.eventLoopUtilization();
    setTimeout(() => {
      worker.performance.eventLoopUtilization(elu);
    }, 10);

Add a new performance object on the Worker instance that will hopefully
one day hold all the other performance metrics, such as nodeTiming.

Include benchmarks and tests.

PR-URL: https://github.com/nodejs/node/pull/35664
Reviewed-By: Juan José Arboleda <soyjuanarbol@gmail.com>
Reviewed-By: Anna Henningsen <anna@addaleax.net>
Reviewed-By: Gerhard Stöbich <deb2001-github@yahoo.de>
Reviewed-By: James M Snell <jasnell@gmail.com>
2020-10-27 09:43:20 +01:00
assert benchmark: remove special test entries 2020-03-09 22:35:54 +01:00
async_hooks benchmark: always throw the same Error instance 2020-07-29 10:59:26 +03:00
buffers benchmark: remove special test entries 2020-03-09 22:35:54 +01:00
child_process
cluster
crypto crypto: refactoring internals, add WebCrypto 2020-10-07 17:27:05 -07:00
dgram
dns benchmark: use let instead of var in dns 2020-03-30 10:16:14 +02:00
domain benchmark: use let and const instead of var 2020-01-28 19:59:41 -08:00
es tools: update ESLint to 7.2.0 2020-06-13 16:44:03 -04:00
esm module: use Wasm CJS lexer when available 2020-10-12 04:24:41 -07:00
events benchmark: fix EventTarget benchmark 2020-06-23 17:08:13 -07:00
fixtures benchmark: use let and const instead of var 2020-01-28 19:59:41 -08:00
fs benchmark: add test and all options and improve errors 2020-03-09 22:35:53 +01:00
http benchmark: fixing http_server_for_chunky_client.js 2020-05-23 19:34:53 +02:00
http2 benchmark: add test and all options and improve errors 2020-03-09 22:35:53 +01:00
misc benchmark: remove special test entries 2020-03-09 22:35:54 +01:00
module benchmark: use let instead of var 2020-02-13 21:41:33 +01:00
napi benchmark: update function_args addon code 2020-08-11 13:29:14 -07:00
net benchmark: add test and all options and improve errors 2020-03-09 22:35:53 +01:00
os benchmark: swap var for let in benchmarks 2020-02-13 21:38:00 +01:00
path benchmark: use let instead of var 2020-02-13 21:41:33 +01:00
perf_hooks perf_hooks: add idleTime and event loop util 2020-08-29 07:02:31 -07:00
policy policy: add startup benchmark and make SRI lazier 2020-07-14 13:03:04 -05:00
process benchmark: add benchmark script for resourceUsage 2020-08-13 22:33:07 +08:00
querystring querystring: improve stringify() performance 2020-06-12 18:21:44 -04:00
streams benchmark,test: remove output from readable-async-iterator benchmark 2020-07-17 08:41:15 -07:00
string_decoder benchmark: use let instead of var 2020-02-13 21:41:33 +01:00
timers benchmark: use let instead of var in timers 2020-03-30 10:16:16 +02:00
tls benchmark: use let instead of var in tls 2020-03-30 10:16:18 +02:00
url benchmark: use let instead of var in url 2020-03-30 10:16:20 +02:00
util benchmark: use let instead of var in util 2020-03-30 10:16:21 +02:00
v8 benchmark: swap var for let in benchmarks 2020-02-13 21:38:00 +01:00
vm benchmark: swap var for let in benchmarks 2020-02-13 21:38:00 +01:00
worker worker: add eventLoopUtilization() 2020-10-27 09:43:20 +01:00
zlib benchmark: use let instead of var in zlib 2020-03-30 10:16:24 +02:00
_benchmark_progress.js
_cli.js benchmark: add test and all options and improve errors 2020-03-09 22:35:53 +01:00
_cli.R
_http-benchmarkers.js benchmark: add test and all options and improve errors 2020-03-09 22:35:53 +01:00
_test-double-benchmarker.js benchmark: add test and all options and improve errors 2020-03-09 22:35:53 +01:00
.eslintrc.yaml benchmark: add no-var rule in .eslintrc.yaml 2020-03-30 10:16:10 +02:00
common.js benchmark: use let instead of var in common.js 2020-03-30 10:16:12 +02:00
compare.js benchmark: add test and all options and improve errors 2020-03-09 22:35:53 +01:00
compare.R
README.md perf_hooks: add idleTime and event loop util 2020-08-29 07:02:31 -07:00
run.js benchmark: use let instead of var in run.js 2020-03-30 10:16:15 +02:00
scatter.js tools: enable no-else-return lint rule 2020-05-16 06:42:16 +02:00
scatter.R

Node.js Core Benchmarks

This folder contains code and data used to measure performance of different Node.js implementations and different ways of writing JavaScript run by the built-in JavaScript engine.

For a detailed guide on how to write and run benchmarks in this directory, see the guide on benchmarks.

Table of Contents

  • Benchmark Directories
  • Other Top-level files
  • Common API

Benchmark Directories

Directory        Purpose
assert           Benchmarks for the assert subsystem.
buffers          Benchmarks for the buffer subsystem.
child_process    Benchmarks for the child_process subsystem.
crypto           Benchmarks for the crypto subsystem.
dgram            Benchmarks for the dgram subsystem.
domain           Benchmarks for the domain subsystem.
es               Benchmarks for various new ECMAScript features and their pre-ES2015 counterparts.
events           Benchmarks for the events subsystem.
fixtures         Benchmark fixtures used in various benchmarks throughout the benchmark suite.
fs               Benchmarks for the fs subsystem.
http             Benchmarks for the http subsystem.
http2            Benchmarks for the http2 subsystem.
misc             Miscellaneous benchmarks and benchmarks for shared internal modules.
module           Benchmarks for the module subsystem.
net              Benchmarks for the net subsystem.
path             Benchmarks for the path subsystem.
perf_hooks       Benchmarks for the perf_hooks subsystem.
process          Benchmarks for the process subsystem.
querystring      Benchmarks for the querystring subsystem.
streams          Benchmarks for the streams subsystem.
string_decoder   Benchmarks for the string_decoder subsystem.
timers           Benchmarks for the timers subsystem, including setTimeout, setInterval, etc.
tls              Benchmarks for the tls subsystem.
url              Benchmarks for the url subsystem, including the legacy url implementation and the WHATWG URL implementation.
util             Benchmarks for the util subsystem.
vm               Benchmarks for the vm subsystem.

Other Top-level files

The top-level files include common dependencies of the benchmarks and the tools for launching benchmarks and visualizing their output. The actual benchmark scripts should be placed in their corresponding directories.

  • _benchmark_progress.js: implements the progress bar displayed when running compare.js.
  • _cli.js: parses the command line arguments passed to compare.js, run.js and scatter.js.
  • _cli.R: parses the command line arguments passed to compare.R.
  • _http-benchmarkers.js: selects and runs external tools for benchmarking the http subsystem.
  • common.js: see Common API.
  • compare.js: command line tool for comparing performance between different Node.js binaries.
  • compare.R: R script for statistically analyzing the output of compare.js.
  • run.js: command line tool for running individual benchmark suite(s).
  • scatter.js: command line tool for comparing the performance between different parameters in benchmark configurations, for example to analyze time complexity.
  • scatter.R: R script for visualizing the output of scatter.js with scatter plots.

Common API

The common.js module is used by benchmarks for consistency across repeated tasks. It has a number of helpful functions and properties to help with writing benchmarks.

createBenchmark(fn, configs[, options])

See the guide on writing benchmarks.
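
As a rough illustration of the pattern described in that guide, a minimal benchmark built on createBenchmark might look like the sketch below. The configuration name n and the loop body are placeholders, not a benchmark from this suite:

    'use strict';
    const common = require('../common.js');

    // One benchmark run is executed per combination of configuration values;
    // here there is a single parameter, n, with a single value.
    const bench = common.createBenchmark(main, {
      n: [1e6],
    });

    function main({ n }) {
      bench.start();
      for (let i = 0; i < n; i++) {
        // The operation being measured would go here.
      }
      bench.end(n); // Reports n operations over the elapsed time.
    }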

default_http_benchmarker

The default benchmarker used to run HTTP benchmarks. See the guide on writing HTTP benchmarks.

PORT

The default port used to run HTTP benchmarks. See the guide on writing HTTP benchmarks.
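
A typical HTTP benchmark starts a server on common.PORT and then hands control to the configured HTTP benchmarker via bench.http. The sketch below shows that shape; the connections and duration values are illustrative only:

    'use strict';
    const common = require('../common.js');
    const http = require('http');

    const bench = common.createBenchmark(main, {
      connections: [100, 500],
      duration: 5,
    });

    function main({ connections, duration }) {
      const server = http.createServer((req, res) => {
        res.end('hello world');
      });

      // Listen on the shared benchmark port, then let the configured HTTP
      // benchmarker (see default_http_benchmarker) drive the load.
      server.listen(common.PORT, () => {
        bench.http({
          path: '/',
          connections,
          duration,
        }, () => {
          server.close();
        });
      });
    }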

sendResult(data)

Used in special benchmarks that can't use createBenchmark and the object it returns to accomplish what they need. This function reports timing data to the parent process (usually created by running compare.js, run.js or scatter.js).
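
A hedged sketch of how such a benchmark might report a result follows. The shape of the data object is an assumption here, mirroring what createBenchmark reports (a name, the configuration, a rate, and an elapsed time in seconds); check common.js for the authoritative fields. The benchmark name and iteration count are hypothetical:

    'use strict';
    const common = require('../common.js');

    // Time the work manually because this benchmark cannot be structured
    // around createBenchmark's start()/end() calls.
    const hrstart = process.hrtime();
    // ... perform the work being measured ...
    const elapsed = process.hrtime(hrstart);
    const time = elapsed[0] + elapsed[1] / 1e9;
    const iterations = 1000; // however many operations the work performed

    common.sendResult({
      name: 'misc/my-special-benchmark', // hypothetical benchmark name
      conf: {},                          // configuration used for this run
      rate: iterations / time,           // operations per second
      time,                              // elapsed time in seconds
    });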