# Node.js Core Benchmarks

This folder contains code and data used to measure performance
of different Node.js implementations and different ways of
writing JavaScript run by the built-in JavaScript engine.

For a detailed guide on how to write and run benchmarks in this
directory, see [the guide on benchmarks](../doc/guides/writing-and-running-benchmarks.md).

## Table of Contents

* [Benchmark directories](#benchmark-directories)
* [Common API](#common-api)

## Benchmark Directories

| Directory       | Purpose                                                                                                          |
| --------------- | ---------------------------------------------------------------------------------------------------------------- |
| assert          | Benchmarks for the `assert` subsystem.                                                                           |
| buffers         | Benchmarks for the `buffer` subsystem.                                                                           |
| child\_process  | Benchmarks for the `child_process` subsystem.                                                                    |
| crypto          | Benchmarks for the `crypto` subsystem.                                                                           |
| dgram           | Benchmarks for the `dgram` subsystem.                                                                            |
| domain          | Benchmarks for the `domain` subsystem.                                                                           |
| es              | Benchmarks for various new ECMAScript features and their pre-ES2015 counterparts.                                |
| events          | Benchmarks for the `events` subsystem.                                                                           |
| fixtures        | Benchmark fixtures used in various benchmarks throughout the benchmark suite.                                    |
| fs              | Benchmarks for the `fs` subsystem.                                                                               |
| http            | Benchmarks for the `http` subsystem.                                                                             |
| http2           | Benchmarks for the `http2` subsystem.                                                                            |
| misc            | Miscellaneous benchmarks and benchmarks for shared internal modules.                                             |
| module          | Benchmarks for the `module` subsystem.                                                                           |
| net             | Benchmarks for the `net` subsystem.                                                                              |
| path            | Benchmarks for the `path` subsystem.                                                                             |
| perf\_hooks     | Benchmarks for the `perf_hooks` subsystem.                                                                       |
| process         | Benchmarks for the `process` subsystem.                                                                          |
| querystring     | Benchmarks for the `querystring` subsystem.                                                                      |
| streams         | Benchmarks for the `streams` subsystem.                                                                          |
| string\_decoder | Benchmarks for the `string_decoder` subsystem.                                                                   |
| timers          | Benchmarks for the `timers` subsystem, including `setTimeout`, `setInterval`, etc.                               |
| tls             | Benchmarks for the `tls` subsystem.                                                                              |
| url             | Benchmarks for the `url` subsystem, including the legacy `url` implementation and the WHATWG URL implementation. |
| util            | Benchmarks for the `util` subsystem.                                                                             |
| vm              | Benchmarks for the `vm` subsystem.                                                                               |

### Other Top-level files

The top-level files include common dependencies of the benchmarks
and the tools for launching benchmarks and visualizing their output.
The actual benchmark scripts should be placed in their corresponding
directories.

* `_benchmark_progress.js`: implements the progress bar displayed
  when running `compare.js`.
* `_cli.js`: parses the command line arguments passed to `compare.js`,
  `run.js` and `scatter.js`.
* `_cli.R`: parses the command line arguments passed to `compare.R`.
* `_http-benchmarkers.js`: selects and runs external tools for benchmarking
  the `http` subsystem.
* `common.js`: see [Common API](#common-api).
* `compare.js`: command line tool for comparing performance between different
  Node.js binaries.
* `compare.R`: R script for statistically analyzing the output of
  `compare.js`.
* `run.js`: command line tool for running individual benchmark suite(s).
* `scatter.js`: command line tool for comparing the performance
  between different parameters in benchmark configurations,
  for example to analyze the time complexity.
* `scatter.R`: R script for visualizing the output of `scatter.js` with
  scatter plots.

## Common API

The `common.js` module is used by benchmarks for consistency across repeated
tasks. It has a number of helpful functions and properties to help with
writing benchmarks.

### `createBenchmark(fn, configs[, options])`
See [the guide on writing benchmarks](../doc/guides/writing-and-running-benchmarks.md#basics-of-a-benchmark).
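
As a quick orientation, here is a minimal sketch of the shape most benchmarks take when built on `createBenchmark`; the configuration key `n` and the measured loop are illustrative placeholders, not taken from any existing benchmark file:

```js
'use strict';

// common.js sits at the root of this directory; the relative path
// depends on where the benchmark file lives (e.g. one level down).
const common = require('../common.js');

// Each key in the configuration object is a benchmark parameter;
// the runner invokes main() once per combination of values.
const bench = common.createBenchmark(main, {
  n: [1e6],
});

function main({ n }) {
  bench.start();
  // The operation under measurement (purely illustrative).
  let accumulator = 0;
  for (let i = 0; i < n; i++) {
    accumulator += i;
  }
  bench.end(n); // Reports n operations over the elapsed time.
}
```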

### `default_http_benchmarker`

The default benchmarker used to run HTTP benchmarks.
See [the guide on writing HTTP benchmarks](../doc/guides/writing-and-running-benchmarks.md#creating-an-http-benchmark).

### `PORT`

The default port used to run HTTP benchmarks.
See [the guide on writing HTTP benchmarks](../doc/guides/writing-and-running-benchmarks.md#creating-an-http-benchmark).
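
As a rough illustration of how `PORT` and the HTTP benchmarker fit together, the sketch below starts a server on `common.PORT` and drives it through `bench.http()`; the configuration values and request handler are placeholders, and the option names (`path`, `connections`, `duration`) follow the pattern of existing `http` benchmarks rather than serving as a complete reference:

```js
'use strict';

const http = require('http');
const common = require('../common.js');

const bench = common.createBenchmark(main, {
  connections: [50, 500], // Concurrent connections opened by the benchmarker.
  duration: 5,            // Seconds the external load tool should run.
});

function main({ connections, duration }) {
  const server = http.createServer((req, res) => {
    res.end('hello world');
  });

  server.listen(common.PORT, () => {
    // bench.http() runs the selected HTTP benchmarker
    // (default_http_benchmarker unless overridden) and reports the result.
    bench.http({ path: '/', connections, duration }, () => {
      server.close();
    });
  });
}
```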

### `sendResult(data)`

Used in special benchmarks that can't use `createBenchmark` and the object
it returns to accomplish what they need. This function reports timing
data to the parent process (usually created by running `compare.js`, `run.js` or
`scatter.js`).
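
As a loose sketch only: a hand-rolled benchmark might time its own work and pass the result to `sendResult()`. The benchmark name and field names used below (`name`, `conf`, `rate`, `time`, `type`) mirror what `createBenchmark` reports internally, but treat them as an assumption and check `common.js` for the exact shape the tooling expects:

```js
'use strict';

const { sendResult } = require('../common.js');

// A benchmark that measures something createBenchmark cannot express,
// timing the work by hand with a high-resolution timer.
const n = 1e6;
const start = process.hrtime.bigint();

let accumulator = 0;
for (let i = 0; i < n; i++) {
  accumulator += i;
}

const elapsedSeconds = Number(process.hrtime.bigint() - start) / 1e9;

// Hand the timing data to the parent process (compare.js, run.js or
// scatter.js). The name and field names here are illustrative assumptions.
sendResult({
  name: 'misc/hand-rolled-example',
  conf: { n },
  rate: n / elapsedSeconds, // operations per second
  time: elapsedSeconds,     // elapsed wall-clock time in seconds
  type: 'report',
});
```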