{ "type": "module", "source": "doc/api/perf_hooks.md", "modules": [ { "textRaw": "Performance Timing API", "name": "performance_timing_api", "introduced_in": "v8.5.0", "stability": 2, "stabilityText": "Stable", "desc": "

The Performance Timing API provides an implementation of the\nW3C Performance Timeline specification. The purpose of the API\nis to support collection of high resolution performance metrics.\nThis is the same Performance API as implemented in modern Web browsers.

\n
const { PerformanceObserver, performance } = require('perf_hooks');\n\nconst obs = new PerformanceObserver((items) => {\n  console.log(items.getEntries()[0].duration);\n  performance.clearMarks();\n});\nobs.observe({ entryTypes: ['measure'] });\n\nperformance.mark('A');\ndoSomeLongRunningProcess(() => {\n  performance.mark('B');\n  performance.measure('A to B', 'A', 'B');\n});\n
", "modules": [ { "textRaw": "Class: `Performance`", "name": "class:_`performance`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "modules": [ { "textRaw": "`performance.clearMarks([name])`", "name": "`performance.clearmarks([name])`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

If name is not provided, removes all PerformanceMark objects from the\nPerformance Timeline. If name is provided, removes only the named mark.
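
\n

For example, an illustrative snippet (the mark names 'A' and 'B' are arbitrary) showing the difference between clearing a single named mark and clearing all marks:

\n
const { performance } = require('perf_hooks');\n\nperformance.mark('A');\nperformance.mark('B');\n\n// Removes only the mark named 'A'.\nperformance.clearMarks('A');\n\n// Removes all remaining marks.\nperformance.clearMarks();\n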

", "type": "module", "displayName": "`performance.clearMarks([name])`" }, { "textRaw": "`performance.mark([name])`", "name": "`performance.mark([name])`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

Creates a new PerformanceMark entry in the Performance Timeline. A\nPerformanceMark is a subclass of PerformanceEntry whose\nperformanceEntry.entryType is always 'mark', and whose\nperformanceEntry.duration is always 0. Performance marks are used\nto mark specific significant moments in the Performance Timeline.
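
\n

For example, an illustrative snippet (the mark name 'app-started' is arbitrary) observing a newly created PerformanceMark:

\n
const { performance, PerformanceObserver } = require('perf_hooks');\n\nconst obs = new PerformanceObserver((list, observer) => {\n  const [entry] = list.getEntries();\n  // entryType is always 'mark' and duration is always 0.\n  console.log(entry.name, entry.entryType, entry.duration);\n  observer.disconnect();\n});\nobs.observe({ entryTypes: ['mark'] });\n\nperformance.mark('app-started');\n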

", "type": "module", "displayName": "`performance.mark([name])`" }, { "textRaw": "`performance.measure(name, startMark, endMark)`", "name": "`performance.measure(name,_startmark,_endmark)`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

Creates a new PerformanceMeasure entry in the Performance Timeline. A\nPerformanceMeasure is a subclass of PerformanceEntry whose\nperformanceEntry.entryType is always 'measure', and whose\nperformanceEntry.duration measures the number of milliseconds elapsed between\nstartMark and endMark.

\n

The startMark argument may identify any existing PerformanceMark in the\nPerformance Timeline, or may identify any of the timestamp properties\nprovided by the PerformanceNodeTiming class. If the named startMark does\nnot exist, then startMark is set to timeOrigin by default.

\n

The endMark argument must identify any existing PerformanceMark in the\nPerformance Timeline or any of the timestamp properties provided by the\nPerformanceNodeTiming class. If the named endMark does not exist, an\nerror will be thrown.
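
\n

For example, an illustrative snippet (the mark names and the timeout duration are arbitrary) that measures between two marks, and from the bootstrapComplete timestamp of PerformanceNodeTiming to a mark:

\n
const { performance, PerformanceObserver } = require('perf_hooks');\n\nconst obs = new PerformanceObserver((list, observer) => {\n  for (const entry of list.getEntries())\n    console.log(entry.name, entry.duration);\n  observer.disconnect();\n});\nobs.observe({ entryTypes: ['measure'], buffered: true });\n\nperformance.mark('start');\nsetTimeout(() => {\n  performance.mark('end');\n  // Milliseconds elapsed between the two marks.\n  performance.measure('start to end', 'start', 'end');\n  // A PerformanceNodeTiming timestamp name may be used as the start mark.\n  performance.measure('bootstrap to end', 'bootstrapComplete', 'end');\n}, 100);\n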

", "type": "module", "displayName": "`performance.measure(name, startMark, endMark)`" }, { "textRaw": "`performance.nodeTiming`", "name": "`performance.nodetiming`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

An instance of the PerformanceNodeTiming class that provides performance\nmetrics for specific Node.js operational milestones.
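
\n

For example, an illustrative snippet logging the milestone timestamps described below; milestones that have not yet occurred report -1:

\n
const { performance } = require('perf_hooks');\n\nconst {\n  nodeStart,\n  v8Start,\n  environment,\n  bootstrapComplete,\n  loopStart,\n  loopExit\n} = performance.nodeTiming;\n\n// Milestones that have not yet been reached (e.g. loopExit) report -1.\nconsole.log({ nodeStart, v8Start, environment, bootstrapComplete, loopStart, loopExit });\n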

", "type": "module", "displayName": "`performance.nodeTiming`" }, { "textRaw": "`performance.now()`", "name": "`performance.now()`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

Returns the current high resolution millisecond timestamp, where 0 represents\nthe start of the current Node.js process.
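
\n

For example, an illustrative snippet timing a block of synchronous work (the loop is only a stand-in for real work):

\n
const { performance } = require('perf_hooks');\n\nconst start = performance.now();\nfor (let i = 0; i < 1e6; i++);  // Stand-in for some synchronous work.\nconst elapsed = performance.now() - start;\nconsole.log(`Took ${elapsed.toFixed(3)} ms`);\n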

", "type": "module", "displayName": "`performance.now()`" }, { "textRaw": "`performance.timeOrigin`", "name": "`performance.timeorigin`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

The timeOrigin specifies the high resolution millisecond timestamp at\nwhich the current Node.js process began, measured in Unix time.
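
\n

For example, an illustrative snippet: timeOrigin plus performance.now() approximates the wall-clock time reported by Date.now():

\n
const { performance } = require('perf_hooks');\n\nconsole.log(performance.timeOrigin);\n// timeOrigin + now() approximates the current Unix time in milliseconds.\nconsole.log(performance.timeOrigin + performance.now());\nconsole.log(Date.now());\n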

", "type": "module", "displayName": "`performance.timeOrigin`" }, { "textRaw": "`performance.timerify(fn)`", "name": "`performance.timerify(fn)`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

Wraps a function within a new function that measures the running time of the\nwrapped function. A PerformanceObserver must be subscribed to the 'function'\nentry type in order for the timing details to be accessed.

\n
const {\n  performance,\n  PerformanceObserver\n} = require('perf_hooks');\n\nfunction someFunction() {\n  console.log('hello world');\n}\n\nconst wrapped = performance.timerify(someFunction);\n\nconst obs = new PerformanceObserver((list) => {\n  console.log(list.getEntries()[0].duration);\n  obs.disconnect();\n});\nobs.observe({ entryTypes: ['function'] });\n\n// A performance timeline entry will be created\nwrapped();\n
", "type": "module", "displayName": "`performance.timerify(fn)`" } ], "type": "module", "displayName": "Class: `Performance`" }, { "textRaw": "Class: `PerformanceEntry`", "name": "class:_`performanceentry`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "modules": [ { "textRaw": "`performanceEntry.duration`", "name": "`performanceentry.duration`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

The total number of milliseconds elapsed for this entry. This value will not\nbe meaningful for all Performance Entry types.

", "type": "module", "displayName": "`performanceEntry.duration`" }, { "textRaw": "`performanceEntry.name`", "name": "`performanceentry.name`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

The name of the performance entry.

", "type": "module", "displayName": "`performanceEntry.name`" }, { "textRaw": "`performanceEntry.startTime`", "name": "`performanceentry.starttime`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

The high resolution millisecond timestamp marking the starting time of the\nPerformance Entry.

", "type": "module", "displayName": "`performanceEntry.startTime`" }, { "textRaw": "`performanceEntry.entryType`", "name": "`performanceentry.entrytype`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

The type of the performance entry. Currently it may be one of: 'node',\n'mark', 'measure', 'gc', 'function', 'http2' or 'http'.
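
\n

For example, an illustrative snippet (the mark name 'example' is arbitrary) inspecting the common properties of a PerformanceEntry:

\n
const { performance, PerformanceObserver } = require('perf_hooks');\n\nconst obs = new PerformanceObserver((list, observer) => {\n  const [entry] = list.getEntries();\n  console.log(entry.name);       // 'example'\n  console.log(entry.entryType);  // 'mark'\n  console.log(entry.startTime);  // High resolution millisecond timestamp.\n  console.log(entry.duration);   // Always 0 for marks.\n  observer.disconnect();\n});\nobs.observe({ entryTypes: ['mark'] });\n\nperformance.mark('example');\n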

", "type": "module", "displayName": "`performanceEntry.entryType`" }, { "textRaw": "`performanceEntry.kind`", "name": "`performanceentry.kind`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

When performanceEntry.entryType is equal to 'gc', the performanceEntry.kind\nproperty identifies the type of garbage collection operation that occurred.\nThe value may be one of:

* perf_hooks.constants.NODE_PERFORMANCE_GC_MAJOR
* perf_hooks.constants.NODE_PERFORMANCE_GC_MINOR
* perf_hooks.constants.NODE_PERFORMANCE_GC_INCREMENTAL
* perf_hooks.constants.NODE_PERFORMANCE_GC_WEAKCB
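
\n

For example, an illustrative snippet that maps entry.kind to a readable label using the NODE_PERFORMANCE_GC_* values exposed on perf_hooks.constants:

\n
const { PerformanceObserver, constants } = require('perf_hooks');\n\nconst kinds = {\n  [constants.NODE_PERFORMANCE_GC_MAJOR]: 'major',\n  [constants.NODE_PERFORMANCE_GC_MINOR]: 'minor',\n  [constants.NODE_PERFORMANCE_GC_INCREMENTAL]: 'incremental',\n  [constants.NODE_PERFORMANCE_GC_WEAKCB]: 'weakcb'\n};\n\nconst obs = new PerformanceObserver((list) => {\n  for (const entry of list.getEntries())\n    console.log(`${kinds[entry.kind]} gc took ${entry.duration} ms`);\n});\nobs.observe({ entryTypes: ['gc'] });\n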

\n", "type": "module", "displayName": "`performanceEntry.kind`" } ], "type": "module", "displayName": "Class: `PerformanceEntry`" }, { "textRaw": "Class: `PerformanceNodeTiming extends PerformanceEntry`", "name": "class:_`performancenodetiming_extends_performanceentry`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "

Provides timing details for Node.js itself.

", "modules": [ { "textRaw": "`performanceNodeTiming.bootstrapComplete`", "name": "`performancenodetiming.bootstrapcomplete`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

The high resolution millisecond timestamp at which the Node.js process\ncompleted bootstrapping. If bootstrapping has not yet finished, the property\nhas the value of -1.

", "type": "module", "displayName": "`performanceNodeTiming.bootstrapComplete`" }, { "textRaw": "`performanceNodeTiming.environment`", "name": "`performancenodetiming.environment`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

The high resolution millisecond timestamp at which the Node.js environment was\ninitialized.

", "type": "module", "displayName": "`performanceNodeTiming.environment`" }, { "textRaw": "`performanceNodeTiming.loopExit`", "name": "`performancenodetiming.loopexit`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

The high resolution millisecond timestamp at which the Node.js event loop\nexited. If the event loop has not yet exited, the property has the value of -1.\nIt can only have a value other than -1 in a handler of the process 'exit' event.
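
\n

For example, an illustrative snippet reading loopExit both while the event loop is still running and from a process 'exit' handler:

\n
const { performance } = require('perf_hooks');\n\nconsole.log(performance.nodeTiming.loopExit);  // -1 while the event loop is running.\n\nprocess.on('exit', () => {\n  // The event loop has exited by the time 'exit' handlers run,\n  // so loopExit now holds a real timestamp.\n  console.log(performance.nodeTiming.loopExit);\n});\n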

", "type": "module", "displayName": "`performanceNodeTiming.loopExit`" }, { "textRaw": "`performanceNodeTiming.loopStart`", "name": "`performancenodetiming.loopstart`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

The high resolution millisecond timestamp at which the Node.js event loop\nstarted. If the event loop has not yet started (e.g., in the first tick of the\nmain script), the property has the value of -1.

", "type": "module", "displayName": "`performanceNodeTiming.loopStart`" }, { "textRaw": "`performanceNodeTiming.nodeStart`", "name": "`performancenodetiming.nodestart`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

The high resolution millisecond timestamp at which the Node.js process was\ninitialized.

", "type": "module", "displayName": "`performanceNodeTiming.nodeStart`" }, { "textRaw": "`performanceNodeTiming.v8Start`", "name": "`performancenodetiming.v8start`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

The high resolution millisecond timestamp at which the V8 platform was\ninitialized.

", "type": "module", "displayName": "`performanceNodeTiming.v8Start`" } ], "type": "module", "displayName": "Class: `PerformanceNodeTiming extends PerformanceEntry`" }, { "textRaw": "Class: `PerformanceObserver`", "name": "class:_`performanceobserver`", "modules": [ { "textRaw": "`new PerformanceObserver(callback)`", "name": "`new_performanceobserver(callback)`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

PerformanceObserver objects provide notifications when new\nPerformanceEntry instances have been added to the Performance Timeline.

\n
const {\n  performance,\n  PerformanceObserver\n} = require('perf_hooks');\n\nconst obs = new PerformanceObserver((list, observer) => {\n  console.log(list.getEntries());\n  observer.disconnect();\n});\nobs.observe({ entryTypes: ['mark'], buffered: true });\n\nperformance.mark('test');\n
\n

Because PerformanceObserver instances introduce their own additional\nperformance overhead, instances should not be left subscribed to notifications\nindefinitely. Users should disconnect observers as soon as they are no\nlonger needed.

\n

The callback is invoked when a PerformanceObserver is\nnotified about new PerformanceEntry instances. The callback receives a\nPerformanceObserverEntryList instance and a reference to the\nPerformanceObserver.

", "type": "module", "displayName": "`new PerformanceObserver(callback)`" }, { "textRaw": "`performanceObserver.disconnect()`", "name": "`performanceobserver.disconnect()`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "

Disconnects the PerformanceObserver instance from all notifications.

", "type": "module", "displayName": "`performanceObserver.disconnect()`" }, { "textRaw": "`performanceObserver.observe(options)`", "name": "`performanceobserver.observe(options)`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

Subscribes the PerformanceObserver instance to notifications of new\nPerformanceEntry instances identified by options.entryTypes.

\n

When options.buffered is false, the callback will be invoked once for\nevery PerformanceEntry instance:

\n
const {\n  performance,\n  PerformanceObserver\n} = require('perf_hooks');\n\nconst obs = new PerformanceObserver((list, observer) => {\n  // Called three times synchronously. `list` contains one item.\n});\nobs.observe({ entryTypes: ['mark'] });\n\nfor (let n = 0; n < 3; n++)\n  performance.mark(`test${n}`);\n
\n

When options.buffered is true, the callback will be invoked once asynchronously, with the list containing all of the buffered PerformanceEntry instances:

\n
const {\n  performance,\n  PerformanceObserver\n} = require('perf_hooks');\n\nconst obs = new PerformanceObserver((list, observer) => {\n  // Called once. `list` contains three items.\n});\nobs.observe({ entryTypes: ['mark'], buffered: true });\n\nfor (let n = 0; n < 3; n++)\n  performance.mark(`test${n}`);\n
", "type": "module", "displayName": "`performanceObserver.observe(options)`" } ], "type": "module", "displayName": "Class: `PerformanceObserver`" }, { "textRaw": "Class: `PerformanceObserverEntryList`", "name": "class:_`performanceobserverentrylist`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "

The PerformanceObserverEntryList class is used to provide access to the\nPerformanceEntry instances passed to a PerformanceObserver.

", "modules": [ { "textRaw": "`performanceObserverEntryList.getEntries()`", "name": "`performanceobserverentrylist.getentries()`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

Returns a list of PerformanceEntry objects in chronological order\nwith respect to performanceEntry.startTime.

", "type": "module", "displayName": "`performanceObserverEntryList.getEntries()`" }, { "textRaw": "`performanceObserverEntryList.getEntriesByName(name[, type])`", "name": "`performanceobserverentrylist.getentriesbyname(name[,_type])`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

Returns a list of PerformanceEntry objects in chronological order\nwith respect to performanceEntry.startTime whose performanceEntry.name is\nequal to name, and optionally, whose performanceEntry.entryType is equal to\ntype.

", "type": "module", "displayName": "`performanceObserverEntryList.getEntriesByName(name[, type])`" }, { "textRaw": "`performanceObserverEntryList.getEntriesByType(type)`", "name": "`performanceobserverentrylist.getentriesbytype(type)`", "meta": { "added": [ "v8.5.0" ], "changes": [] }, "desc": "\n

Returns a list of PerformanceEntry objects in chronological order\nwith respect to performanceEntry.startTime whose performanceEntry.entryType\nis equal to type.
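
\n

For example, an illustrative snippet (the mark and measure names are arbitrary) combining getEntriesByName() and getEntriesByType() in a single observer:

\n
const { performance, PerformanceObserver } = require('perf_hooks');\n\nconst obs = new PerformanceObserver((list, observer) => {\n  // All entries named 'A', regardless of type.\n  console.log(list.getEntriesByName('A'));\n  // Only 'mark' entries named 'A'.\n  console.log(list.getEntriesByName('A', 'mark'));\n  // All 'measure' entries.\n  console.log(list.getEntriesByType('measure'));\n  observer.disconnect();\n});\nobs.observe({ entryTypes: ['mark', 'measure'], buffered: true });\n\nperformance.mark('A');\nperformance.mark('B');\nperformance.measure('A to B', 'A', 'B');\n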

", "type": "module", "displayName": "`performanceObserverEntryList.getEntriesByType(type)`" } ], "type": "module", "displayName": "Class: `PerformanceObserverEntryList`" }, { "textRaw": "`perf_hooks.monitorEventLoopDelay([options])`", "name": "`perf_hooks.monitoreventloopdelay([options])`", "meta": { "added": [ "v11.10.0" ], "changes": [] }, "desc": "\n

Creates a Histogram object that samples and reports the event loop delay\nover time. The delays will be reported in nanoseconds.

\n

Using a timer to detect approximate event loop delay works because the\nexecution of timers is tied specifically to the lifecycle of the libuv\nevent loop. That is, a delay in the loop will cause a delay in the execution\nof the timer, and those delays are specifically what this API is intended to\ndetect.

\n
const { monitorEventLoopDelay } = require('perf_hooks');\nconst h = monitorEventLoopDelay({ resolution: 20 });\nh.enable();\n// Do something.\nh.disable();\nconsole.log(h.min);\nconsole.log(h.max);\nconsole.log(h.mean);\nconsole.log(h.stddev);\nconsole.log(h.percentiles);\nconsole.log(h.percentile(50));\nconsole.log(h.percentile(99));\n
", "modules": [ { "textRaw": "Class: `Histogram`", "name": "class:_`histogram`", "meta": { "added": [ "v11.10.0" ], "changes": [] }, "desc": "

Tracks the event loop delay at a given sampling rate.

", "modules": [ { "textRaw": "`histogram.disable()`", "name": "`histogram.disable()`", "meta": { "added": [ "v11.10.0" ], "changes": [] }, "desc": "\n

Disables the event loop delay sample timer. Returns true if the timer was\nstopped, false if it was already stopped.

", "type": "module", "displayName": "`histogram.disable()`" }, { "textRaw": "`histogram.enable()`", "name": "`histogram.enable()`", "meta": { "added": [ "v11.10.0" ], "changes": [] }, "desc": "\n

Enables the event loop delay sample timer. Returns true if the timer was\nstarted, false if it was already started.

", "type": "module", "displayName": "`histogram.enable()`" }, { "textRaw": "`histogram.exceeds`", "name": "`histogram.exceeds`", "meta": { "added": [ "v11.10.0" ], "changes": [] }, "desc": "\n

The number of times the event loop delay exceeded the maximum 1 hour event\nloop delay threshold.

", "type": "module", "displayName": "`histogram.exceeds`" }, { "textRaw": "`histogram.max`", "name": "`histogram.max`", "meta": { "added": [ "v11.10.0" ], "changes": [] }, "desc": "\n

The maximum recorded event loop delay.

", "type": "module", "displayName": "`histogram.max`" }, { "textRaw": "`histogram.mean`", "name": "`histogram.mean`", "meta": { "added": [ "v11.10.0" ], "changes": [] }, "desc": "\n

The mean of the recorded event loop delays.

", "type": "module", "displayName": "`histogram.mean`" }, { "textRaw": "`histogram.min`", "name": "`histogram.min`", "meta": { "added": [ "v11.10.0" ], "changes": [] }, "desc": "\n

The minimum recorded event loop delay.

", "type": "module", "displayName": "`histogram.min`" }, { "textRaw": "`histogram.percentile(percentile)`", "name": "`histogram.percentile(percentile)`", "meta": { "added": [ "v11.10.0" ], "changes": [] }, "desc": "\n

Returns the value at the given percentile.

", "type": "module", "displayName": "`histogram.percentile(percentile)`" }, { "textRaw": "`histogram.percentiles`", "name": "`histogram.percentiles`", "meta": { "added": [ "v11.10.0" ], "changes": [] }, "desc": "\n

Returns a Map object detailing the accumulated percentile distribution.

", "type": "module", "displayName": "`histogram.percentiles`" }, { "textRaw": "`histogram.reset()`", "name": "`histogram.reset()`", "meta": { "added": [ "v11.10.0" ], "changes": [] }, "desc": "

Resets the collected histogram data.

", "type": "module", "displayName": "`histogram.reset()`" }, { "textRaw": "`histogram.stddev`", "name": "`histogram.stddev`", "meta": { "added": [ "v11.10.0" ], "changes": [] }, "desc": "\n

The standard deviation of the recorded event loop delays.

\n


", "type": "module", "displayName": "`histogram.stddev`" } ], "type": "module", "displayName": "Class: `Histogram`" }, { "textRaw": "Measuring the duration of async operations", "name": "measuring_the_duration_of_async_operations", "desc": "

The following example uses the Async Hooks and Performance APIs to measure\nthe actual duration of a Timeout operation (including the amount of time it took\nto execute the callback).

\n
'use strict';\nconst async_hooks = require('async_hooks');\nconst {\n  performance,\n  PerformanceObserver\n} = require('perf_hooks');\n\nconst set = new Set();\nconst hook = async_hooks.createHook({\n  init(id, type) {\n    if (type === 'Timeout') {\n      performance.mark(`Timeout-${id}-Init`);\n      set.add(id);\n    }\n  },\n  destroy(id) {\n    if (set.has(id)) {\n      set.delete(id);\n      performance.mark(`Timeout-${id}-Destroy`);\n      performance.measure(`Timeout-${id}`,\n                          `Timeout-${id}-Init`,\n                          `Timeout-${id}-Destroy`);\n    }\n  }\n});\nhook.enable();\n\nconst obs = new PerformanceObserver((list, observer) => {\n  console.log(list.getEntries()[0]);\n  performance.clearMarks();\n  observer.disconnect();\n});\nobs.observe({ entryTypes: ['measure'], buffered: true });\n\nsetTimeout(() => {}, 1000);\n
", "type": "module", "displayName": "Measuring the duration of async operations" }, { "textRaw": "Measuring how long it takes to load dependencies", "name": "measuring_how_long_it_takes_to_load_dependencies", "desc": "

The following example measures the duration of require() operations to load\ndependencies:

\n\n
'use strict';\nconst {\n  performance,\n  PerformanceObserver\n} = require('perf_hooks');\nconst mod = require('module');\n\n// Monkey patch the require function\nmod.Module.prototype.require =\n  performance.timerify(mod.Module.prototype.require);\nrequire = performance.timerify(require);\n\n// Activate the observer\nconst obs = new PerformanceObserver((list) => {\n  const entries = list.getEntries();\n  entries.forEach((entry) => {\n    console.log(`require('${entry[0]}')`, entry.duration);\n  });\n  obs.disconnect();\n});\nobs.observe({ entryTypes: ['function'], buffered: true });\n\nrequire('some-module');\n
", "type": "module", "displayName": "Measuring how long it takes to load dependencies" } ], "type": "module", "displayName": "`perf_hooks.monitorEventLoopDelay([options])`" } ], "type": "module", "displayName": "Performance Timing API" } ] }