Node.js Reference
About Node.js Reference
The Node.js Reference covers the standard library of built-in modules that every backend JavaScript developer works with daily. The Modules section documents both CommonJS (require/module.exports) and ES Modules (import/export) syntax side by side, along with the path module for cross-platform file path handling, the os module for reading CPU count, memory, and platform info, and URL parsing with the WHATWG URL API. These are the building blocks used in every Node.js application regardless of which framework is on top.
The File System and HTTP sections cover the two most common I/O categories. The fs/promises API is documented with readFile, writeFile, appendFile, readdir, stat, mkdir, rm, rename, and the async iterator-based fs.watch for real-time file change detection. The HTTP section covers http.createServer for building raw servers, the native fetch API available in Node 18+, HTTPS server setup with TLS certificates, and WebSocket server creation with the ws library. Streams are documented with Readable, Writable, Transform, pipeline from stream/promises for error-safe piping, and Readable.from for creating streams from async iterables.
The Events and Process sections complete the runtime picture. EventEmitter with on, once, off, emit, and the async events.on iterator for consuming event streams are all covered. The Process section documents process.env for environment variables, child_process with exec and spawn for running subprocesses, worker_threads for CPU-intensive parallelism, graceful shutdown with process.on("SIGINT"), uncaughtException handling, and the cluster module for multi-core HTTP server scaling. This reference serves Node.js backend developers, API engineers building REST and WebSocket services, and DevOps engineers automating infrastructure with Node.js scripts.
Key Features
- CommonJS require/module.exports and ESM import/export syntax with side-by-side examples
- path.join, path.resolve, path.basename, path.extname for cross-platform path handling
- fs/promises — readFile, writeFile, mkdir (recursive), rm, stat, and async fs.watch
- http.createServer, Node 18+ native fetch, and HTTPS server with TLS cert setup
- WebSocket server with ws library — connection and message event handling
- Readable, Writable, Transform streams with stream/promises pipeline for safe piping
- EventEmitter on/once/off/emit and async for-await events.on iterator
- worker_threads for CPU parallelism and cluster module for multi-core scaling
Frequently Asked Questions
What is the difference between CommonJS and ES Modules in Node.js?
CommonJS uses require() to import and module.exports to export, and modules load synchronously. ES Modules use import/export syntax and support top-level await. In Node.js, files with the .mjs extension, or .js files in a package with "type": "module" in package.json, are treated as ESM; files with the .cjs extension, or .js files under the default "type": "commonjs", are CommonJS. The two do not mix freely: require() historically could not load an ES module (newer Node releases are adding synchronous require() of ESM), while ESM can import CommonJS directly or load it via createRequire().
How do I read and write files asynchronously in Node.js?
Use the fs/promises module for async file operations. Import it with const fs = require("fs/promises") or import fs from "fs/promises". Then use await fs.readFile("path", "utf-8") to read a file as a string. Use await fs.writeFile("path", content) to write (overwriting the file) or await fs.appendFile("path", content) to add to the end. Always wrap in try/catch to handle ENOENT or permission errors.
How does Node.js handle concurrency if it is single-threaded?
Node.js uses a single JavaScript thread with a non-blocking event loop backed by libuv, which offloads I/O operations (file reads, network requests, DNS lookups) to the OS or a thread pool. This allows thousands of concurrent connections without creating OS threads for each. CPU-intensive work blocks the event loop, so for heavy computation use worker_threads to run JavaScript in parallel worker threads, or cluster to fork multiple processes each running their own event loop.
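The blocking behavior described above can be seen directly: a zero-delay timer cannot fire until the synchronous CPU work finishes, because the single JavaScript thread must unwind its stack first. A tiny illustration:

```javascript
// A 0 ms timer is scheduled first, but the event loop cannot run its
// callback until the synchronous loop below completes.
const order = [];

setTimeout(() => {
  order.push("timer");
  console.log(order); // [ 'loop', 'timer' ]
}, 0);

let sum = 0;
for (let i = 0; i < 1e7; i += 1) sum += i; // CPU-bound work blocks the loop
order.push("loop");
```

This is exactly why heavy computation belongs in worker_threads: the main thread stays free to service timers and I/O callbacks.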
What are Node.js streams and when should I use them?
Streams are objects that emit data in chunks rather than loading everything into memory at once. Use Readable streams for data sources (fs.createReadStream), Writable streams for data sinks (fs.createWriteStream), and Transform streams to process data as it flows (compression, encryption, parsing). Use the pipeline function from stream/promises to connect streams with proper error handling and automatic cleanup: it rejects if any stream errors and destroys every stream in the chain, unlike the older pipe() method, which does not forward errors and can leave streams open.
How do I run shell commands from Node.js?
Use exec() from child_process for short commands where you want the full output as a string — it buffers stdout and stderr and calls your callback when done. Use spawn() for long-running processes or when you need to stream stdout/stderr in real time — it returns a ChildProcess with .stdout and .stderr streams. Use execSync() or spawnSync() for synchronous execution that blocks the event loop, which is acceptable in CLI scripts but not in servers.
How do worker_threads differ from the cluster module?
worker_threads creates additional JavaScript threads within the same process, sharing memory through SharedArrayBuffer and communicating via postMessage. Use them for CPU-intensive tasks like image processing, cryptography, or parsing large files — offloading computation without the overhead of a new process. cluster forks the entire Node.js process multiple times (one per CPU core) and uses IPC for communication. Use cluster for scaling HTTP servers to utilize all CPU cores.
How does the EventEmitter pattern work in Node.js?
EventEmitter is the base class for Node.js objects that emit named events. You subscribe with emitter.on("event", handler) for persistent listeners or emitter.once("event", handler) for one-time listeners. You remove listeners with emitter.off("event", handler) or removeAllListeners(). You trigger an event with emitter.emit("event", ...args), which synchronously calls all registered listeners in registration order. events.on(emitter, "event") returns an async iterable, so you can consume a stream of events one at a time with for await...of.
What is the best way to handle environment variables in Node.js?
Access environment variables through process.env.VARIABLE_NAME. For local development, use a .env file and a library like dotenv (require("dotenv").config()) to load variables before your app code runs. In Node 20.6+, you can pass --env-file=.env on the command line without dotenv. Never hardcode secrets in source code and never commit .env files. In production, set variables through your hosting platform, Docker environment flags, or a secrets manager like AWS Secrets Manager or Vault.
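A small sketch of reading configuration with explicit fallbacks. Note that process.env values are always strings, so numbers must be parsed; the PORT and LOG_LEVEL names are illustrative conventions, not anything Node.js requires:

```javascript
// Read config once at startup, with defaults for local development.
const port = Number(process.env.PORT ?? 3000);
const logLevel = process.env.LOG_LEVEL ?? "info";

// Fail fast on malformed values instead of limping along.
if (Number.isNaN(port)) {
  throw new Error(`PORT must be a number, got: ${process.env.PORT}`);
}

console.log({ port, logLevel });
```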