
Node.js Guide: Complete Tutorial for Backend Development

14 min read · by DevToolBox

Master Node.js backend development with this comprehensive guide. Covers event loop, Express.js, REST APIs, JWT auth, file system, database integration, testing, deployment with PM2 and Docker, and a Node.js vs Deno vs Bun comparison.

TL;DR — Node.js Quick Reference
  • Node.js runs JavaScript on the server using Chrome's V8 engine with a non-blocking, event-driven I/O model.
  • Use Express.js for routing and middleware; structure apps with controllers, services, and routes folders.
  • Authenticate with JWT + bcrypt; store tokens in httpOnly cookies; always hash passwords before saving.
  • Use PM2 for production process management; Docker for containerized deployments.
  • Never block the event loop — use async/await, streams, and worker threads for CPU-heavy tasks.

What Is Node.js?

Node.js is an open-source, cross-platform JavaScript runtime environment that executes JavaScript code outside a web browser. Built on Chrome's V8 JavaScript engine and released by Ryan Dahl in 2009, Node.js enables developers to use JavaScript for server-side programming — unifying web development around a single language for both frontend and backend.

Unlike traditional server-side languages that create a new thread per request (Apache/PHP model), Node.js uses a single-threaded, event-driven, non-blocking I/O model. This architecture makes it exceptionally efficient for I/O-bound workloads: web servers, REST APIs, real-time chat, streaming services, and microservices.

As of 2026, Node.js powers some of the most visited websites and APIs on the internet, with over 1 billion npm package downloads daily. The Node.js ecosystem — centered around npm — is the largest software registry in the world.

Key Takeaways
  • Node.js uses the V8 engine and libuv for non-blocking, event-driven I/O.
  • The event loop is single-threaded but offloads I/O to the OS kernel and thread pool.
  • npm has over 2 million packages — the largest ecosystem in software development.
  • Express.js is the de-facto standard for building HTTP servers and APIs in Node.js.
  • JWT + bcrypt is the most common authentication pattern for stateless REST APIs.
  • PM2 enables zero-downtime deployment, clustering, and process monitoring.
  • Node.js, Deno, and Bun are all viable in 2026; Node.js wins on ecosystem maturity.

Node.js Architecture: Event Loop and Non-Blocking I/O

Understanding Node.js's architecture is essential before writing production code. The most important concept is the event loop — the mechanism that enables Node.js to handle thousands of concurrent connections without creating new OS threads.

The Single-Threaded Model

Node.js JavaScript code runs on a single thread. This sounds like a limitation, but in practice it is a strength for I/O-bound applications. When Node.js needs to read a file or make a network request, it delegates that work to the operating system (via libuv), then continues processing other tasks. When the OS finishes, it queues a callback in the event loop.

const fs = require('fs');

// WRONG: Blocking the event loop (never do this)
const data = fs.readFileSync('/large-file.csv'); // blocks all other requests!
processData(data); // processData is a placeholder for your own handler
                   // (don't name it `process` — that shadows the Node global)

// CORRECT: Non-blocking with callback
fs.readFile('/large-file.csv', 'utf8', (err, data) => {
  if (err) throw err;
  processData(data);
});

// BETTER: Non-blocking with async/await
async function loadData() {
  const data = await fs.promises.readFile('/large-file.csv', 'utf8');
  processData(data);
}

Event Loop Phases

The event loop processes callbacks in six phases per tick. Understanding these phases helps debug timing issues and write predictable async code:

// Event loop phase demonstration
setTimeout(() => console.log('1: setTimeout'), 0);       // timers phase
setImmediate(() => console.log('2: setImmediate'));       // check phase
process.nextTick(() => console.log('3: nextTick'));       // microtask (before next phase)
Promise.resolve().then(() => console.log('4: Promise')); // microtask (before next phase)

console.log('5: synchronous');

// Output order:
// 5: synchronous     (sync code always runs first)
// 3: nextTick        (the nextTick queue drains before other microtasks)
// 4: Promise         (microtasks run before the next event loop phase)
// 1: setTimeout      (timers phase)
// 2: setImmediate    (check phase)
// Caveat: outside an I/O callback, the setTimeout(…, 0) vs setImmediate order
// is not guaranteed; inside an I/O callback, setImmediate always fires first.

libuv Thread Pool

Although JavaScript runs on one thread, Node.js uses libuv's thread pool (default: 4 threads) for blocking operations that the OS cannot handle asynchronously — such as file system operations, DNS lookups, and cryptographic operations.

// Increase the thread pool for heavy crypto or file-system workloads.
// It must be set before the pool is first used — safest via the environment:
//   UV_THREADPOOL_SIZE=8 node server.js
process.env.UV_THREADPOOL_SIZE = '8'; // also works if set early enough; max 1024

// Worker threads for CPU-intensive JavaScript (not I/O)
const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');

if (isMainThread) {
  // Main thread: spawn a worker for CPU work
  const worker = new Worker(__filename, { workerData: { n: 42 } });
  worker.on('message', result => console.log('Result:', result));
  worker.on('error', err => console.error(err));
} else {
  // Worker thread: perform CPU-intensive computation
  function fibonacci(n) {
    if (n <= 1) return n;
    return fibonacci(n - 1) + fibonacci(n - 2);
  }
  parentPort.postMessage(fibonacci(workerData.n));
}

Installing Node.js: nvm and Version Management

Always install Node.js via nvm (Node Version Manager) rather than the system package manager. This lets you switch between Node.js versions per project and avoids permission issues with global npm packages.

# Install nvm
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.7/install.sh | bash
source ~/.bashrc  # or ~/.zshrc

# Install and use specific Node.js versions
nvm install 20    # LTS (recommended for production)
nvm install 22    # Current
nvm use 20

# Set default version
nvm alias default 20

# Per-project version via .nvmrc
echo "20" > .nvmrc
nvm use           # reads .nvmrc automatically

# Check installed versions
nvm ls
node --version    # v20.x.x
npm --version     # 10.x.x

Core Modules: Built-In Node.js APIs

Node.js ships with a rich set of built-in modules. No npm install needed — just require() or import them directly.

fs — File System

const fs = require('fs');
const path = require('path');
// Top-level await is used below for brevity — it works in ES modules
// ("type": "module" or .mjs); in CommonJS, wrap the calls in an async function.

// Read file (async — recommended)
async function readConfig() {
  const filePath = path.join(__dirname, 'config.json');
  const raw = await fs.promises.readFile(filePath, 'utf8');
  return JSON.parse(raw);
}

// Write file (create or overwrite)
await fs.promises.writeFile('output.txt', 'Hello World\n', 'utf8');

// Append to file
await fs.promises.appendFile('log.txt', new Date().toISOString() + '\n');

// Check if file exists
try {
  await fs.promises.access('config.json', fs.constants.F_OK);
  console.log('File exists');
} catch {
  console.log('File not found');
}

// List directory contents
const entries = await fs.promises.readdir('./src', { withFileTypes: true });
entries.forEach(entry => {
  if (entry.isDirectory()) console.log('DIR:', entry.name);
  else console.log('FILE:', entry.name);
});

// Create directory (recursive)
await fs.promises.mkdir('./logs/2026', { recursive: true });

// Delete file
await fs.promises.unlink('temp.txt');

// Rename/move file
await fs.promises.rename('old.txt', 'new.txt');

path — File Path Utilities

const path = require('path');

path.join('/users', 'alice', 'docs');       // '/users/alice/docs'
path.resolve('src', 'index.js');            // absolute path from cwd
path.dirname('/home/alice/file.txt');       // '/home/alice'
path.basename('/home/alice/file.txt');      // 'file.txt'
path.extname('style.min.css');              // '.css'
path.parse('/home/user/notes.txt');
// { root: '/', dir: '/home/user', base: 'notes.txt', ext: '.txt', name: 'notes' }

// Always use path.join for cross-platform paths (avoids Windows / Linux issues)
const configPath = path.join(__dirname, '..', 'config', 'app.json');

http / https — HTTP Server

const http = require('http');

const server = http.createServer((req, res) => {
  const url = new URL(req.url, `http://${req.headers.host}`);

  if (req.method === 'GET' && url.pathname === '/api/health') {
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ status: 'ok', uptime: process.uptime() }));
    return;
  }

  res.writeHead(404, { 'Content-Type': 'application/json' });
  res.end(JSON.stringify({ error: 'Not Found' }));
});

server.listen(3000, '0.0.0.0', () => {
  console.log('Server listening on port 3000');
});

// Graceful shutdown
process.on('SIGTERM', () => {
  server.close(() => {
    console.log('Server closed gracefully');
    process.exit(0);
  });
});

events — EventEmitter

const { EventEmitter } = require('events');

class OrderService extends EventEmitter {
  async placeOrder(order) {
    // process order...
    const result = await processPayment(order);
    this.emit('order:placed', { orderId: result.id, order });
    return result;
  }
}

const orderService = new OrderService();

orderService.on('order:placed', ({ orderId, order }) => {
  console.log(`Order ${orderId} placed, sending confirmation email...`);
  sendEmail(order.email, 'Order Confirmation', orderTemplate(orderId));
});

// Once (fires only once)
orderService.once('order:placed', () => {
  console.log('First order ever placed!');
});

stream — Streaming Data

const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream/promises');

// Pipe a large file through gzip compression to an output file
// This processes the file in chunks — never loads the whole file into memory
async function compressFile(input, output) {
  await pipeline(
    fs.createReadStream(input),
    zlib.createGzip(),
    fs.createWriteStream(output)
  );
  console.log(`Compressed ${input} -> ${output}`);
}

// (top-level await for brevity — wrap in an async function in CommonJS)
await compressFile('large-dataset.csv', 'large-dataset.csv.gz');

// Transform stream: count lines while passing the data through unchanged
const { Transform } = require('stream');

const lineCounter = new Transform({
  transform(chunk, encoding, callback) {
    const lines = chunk.toString().split('\n').length - 1;
    this.lineCount = (this.lineCount || 0) + lines;
    callback(null, chunk); // pass through unchanged
  }
});

// Or process a file line by line with readline (memory-safe for huge files)
const { createInterface } = require('readline');

async function forEachLine(file, fn) {
  const rl = createInterface({ input: fs.createReadStream(file), crlfDelay: Infinity });
  for await (const line of rl) fn(line);
}

crypto — Cryptography

const crypto = require('crypto');

// Generate a random token (for API keys, password reset tokens)
const token = crypto.randomBytes(32).toString('hex');
console.log(token); // 64-char hex string

// HMAC signature (for webhook verification)
function verifyWebhook(payload, signature, secret) {
  const expected = crypto
    .createHmac('sha256', secret)
    .update(payload)
    .digest('hex');
  return crypto.timingSafeEqual(
    Buffer.from(signature, 'hex'),
    Buffer.from(expected, 'hex')
  );
}

// Hash a value (for non-password data like cache keys)
const hash = crypto.createHash('sha256').update('some-data').digest('hex');

// AES-256-GCM encryption (for PII at rest)
function encrypt(text, key) {
  const iv = crypto.randomBytes(12);
  const cipher = crypto.createCipheriv('aes-256-gcm', key, iv);
  const encrypted = Buffer.concat([cipher.update(text, 'utf8'), cipher.final()]);
  const tag = cipher.getAuthTag();
  return { iv: iv.toString('hex'), tag: tag.toString('hex'), data: encrypted.toString('hex') };
}

npm and package.json: Dependency Management

npm (Node Package Manager) is the default package manager for Node.js. The package.json file is the manifest that describes your project, its dependencies, scripts, and metadata.

package.json Essentials

{
  "name": "my-api-server",
  "version": "1.2.3",
  "description": "A production-ready REST API",
  "main": "src/index.js",
  "type": "commonjs",
  "engines": {
    "node": ">=20.0.0",
    "npm": ">=10.0.0"
  },
  "scripts": {
    "start": "node src/index.js",
    "dev": "nodemon src/index.js",
    "test": "jest --coverage",
    "test:watch": "jest --watch",
    "lint": "eslint src/",
    "build": "tsc",
    "db:migrate": "node scripts/migrate.js"
  },
  "dependencies": {
    "express": "^4.18.2",
    "jsonwebtoken": "^9.0.2",
    "bcrypt": "^5.1.1",
    "dotenv": "^16.3.1",
    "pg": "^8.11.3"
  },
  "devDependencies": {
    "jest": "^29.7.0",
    "nodemon": "^3.0.2",
    "supertest": "^6.3.4",
    "eslint": "^8.55.0"
  }
}

Semantic Versioning and Lock Files

npm uses semantic versioning (semver): MAJOR.MINOR.PATCH. Version range prefixes control which updates are allowed:

# Semver ranges in package.json
"express": "4.18.2"   # exact — only this version
"express": "~4.18.2"  # patch updates only — 4.18.x
"express": "^4.18.2"  # minor + patch — 4.x.x (most common)
"express": "*"        # any version (dangerous in production!)

# package-lock.json / yarn.lock
# Lock files record the exact resolved version of every dependency
# ALWAYS commit lock files to version control for reproducible builds

npm install              # install using package-lock.json (may update the lock file)
npm ci                   # clean install strictly from the lock file — fails on any
                         # mismatch with package.json (use this in CI)

# Managing dependencies
npm install express              # save to dependencies
npm install -D jest              # save to devDependencies
npm uninstall lodash             # remove package
npm update                       # update to allowed versions
npm outdated                     # show outdated packages
npm audit                        # check for security vulnerabilities
npm audit fix                    # auto-fix vulnerabilities

Express.js: Building HTTP Servers and APIs

Express.js is the most popular Node.js web framework. It is minimal and unopinionated, providing routing, middleware support, and request/response handling without imposing a rigid project structure.

Basic Express Server

const express = require('express');
const app = express();

// Built-in middleware
app.use(express.json());                    // parse JSON bodies
app.use(express.urlencoded({ extended: true })); // parse form data

// Route definition
app.get('/', (req, res) => {
  res.json({ message: 'API is running', version: '1.0.0' });
});

// Route with URL parameter
app.get('/users/:id', async (req, res, next) => {
  try {
    const user = await User.findById(req.params.id);
    if (!user) return res.status(404).json({ error: 'User not found' });
    res.json(user);
  } catch (err) {
    next(err); // pass to error handler
  }
});

// Route with query parameters
app.get('/products', (req, res) => {
  const { page = 1, limit = 20, category } = req.query;
  // fetch products with pagination...
  res.json({ products: [], page: Number(page), limit: Number(limit) });
});

// POST route
app.post('/users', async (req, res, next) => {
  try {
    const { name, email, password } = req.body;
    // validation, hashing, saving...
    const user = await User.create({ name, email, password });
    res.status(201).json({ id: user.id, name: user.name, email: user.email });
  } catch (err) {
    next(err);
  }
});

// Error-handling middleware (must have 4 params)
app.use((err, req, res, next) => {
  console.error(err.stack);
  const status = err.status || err.statusCode || 500;
  res.status(status).json({ error: err.message || 'Internal Server Error' });
});

app.listen(3000, () => console.log('Express running on port 3000'));

Express Router for Modular Routes

// routes/users.js
const router = require('express').Router();
const { authenticate } = require('../middleware/auth');
const userController = require('../controllers/userController');

router.get('/',       authenticate, userController.list);
router.post('/',                    userController.create);
router.get('/:id',    authenticate, userController.getById);
router.put('/:id',    authenticate, userController.update);
router.delete('/:id', authenticate, userController.remove);

module.exports = router;

// app.js
const usersRouter = require('./routes/users');
app.use('/api/v1/users', usersRouter);

// controllers/userController.js — keeps routes thin
exports.list = async (req, res, next) => {
  try {
    const users = await UserService.list(req.query);
    res.json(users);
  } catch (err) {
    next(err);
  }
};
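
The try/catch boilerplate in every async controller can be factored out with a small wrapper. This is a common community pattern, not part of Express itself, and the name asyncHandler is our own:

```javascript
// Forward any rejected promise from an async handler to next(), so Express's
// error-handling middleware sees it without per-route try/catch.
const asyncHandler = (fn) => (req, res, next) =>
  Promise.resolve(fn(req, res, next)).catch(next);

// Usage — replaces the try/catch version of exports.list above:
// exports.list = asyncHandler(async (req, res) => {
//   res.json(await UserService.list(req.query));
// });

module.exports = { asyncHandler };
```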

Custom Middleware

// Request logger middleware
function requestLogger(req, res, next) {
  const start = Date.now();
  res.on('finish', () => {
    const duration = Date.now() - start;
    console.log(`${req.method} ${req.originalUrl} ${res.statusCode} ${duration}ms`);
  });
  next(); // MUST call next() or the request hangs
}

// Rate limiting middleware
const requestCounts = new Map();
function rateLimiter(maxRequests = 100, windowMs = 60000) {
  return (req, res, next) => {
    const key = req.ip;
    const now = Date.now();
    const windowStart = now - windowMs;
    const requests = (requestCounts.get(key) || []).filter(t => t > windowStart);
    if (requests.length >= maxRequests) {
      return res.status(429).json({ error: 'Too Many Requests' });
    }
    requests.push(now);
    requestCounts.set(key, requests);
    next();
  };
}

// CORS middleware
function cors(allowedOrigins) {
  return (req, res, next) => {
    const origin = req.headers.origin;
    if (allowedOrigins.includes(origin)) {
      res.setHeader('Access-Control-Allow-Origin', origin);
      res.setHeader('Access-Control-Allow-Methods', 'GET,POST,PUT,DELETE,OPTIONS');
      res.setHeader('Access-Control-Allow-Headers', 'Content-Type,Authorization');
    }
    if (req.method === 'OPTIONS') return res.sendStatus(204);
    next();
  };
}

app.use(requestLogger);
app.use(rateLimiter(60, 60000)); // 60 req/min
app.use(cors(['https://myapp.com', 'http://localhost:3000']));

Environment Variables and Configuration

Hardcoding secrets in source code is the most common security mistake in Node.js applications. Use environment variables for all secrets, connection strings, and environment-specific configuration.

# .env (never commit this file!)
NODE_ENV=development
PORT=3000
DATABASE_URL=postgresql://user:password@localhost:5432/mydb
JWT_SECRET=your-super-secret-jwt-key-change-this-in-production
JWT_EXPIRES_IN=15m
REDIS_URL=redis://localhost:6379
SMTP_HOST=smtp.sendgrid.net
SMTP_PORT=587
SMTP_USER=apikey
SMTP_PASS=SG.your-sendgrid-api-key
BCRYPT_ROUNDS=12

# .env.example (commit this file — documents required variables)
NODE_ENV=
PORT=
DATABASE_URL=
JWT_SECRET=
JWT_EXPIRES_IN=
REDIS_URL=
SMTP_HOST=
SMTP_PORT=
SMTP_USER=
SMTP_PASS=
BCRYPT_ROUNDS=

// config/env.js — validate all required vars at startup
require('dotenv').config();

function required(name) {
  const value = process.env[name];
  if (!value) throw new Error(`Missing required environment variable: ${name}`);
  return value;
}

module.exports = {
  NODE_ENV:       process.env.NODE_ENV || 'development',
  PORT:           parseInt(process.env.PORT || '3000', 10),
  DATABASE_URL:   required('DATABASE_URL'),
  JWT_SECRET:     required('JWT_SECRET'),
  JWT_EXPIRES_IN: process.env.JWT_EXPIRES_IN || '15m',
  REDIS_URL:      process.env.REDIS_URL || 'redis://localhost:6379',
  BCRYPT_ROUNDS:  parseInt(process.env.BCRYPT_ROUNDS || '12', 10),
  isProd:         process.env.NODE_ENV === 'production',
  isDev:          process.env.NODE_ENV === 'development',
};

// Usage in any file
const config = require('./config/env');
console.log(`Starting on port ${config.PORT} in ${config.NODE_ENV} mode`);

Building REST APIs: CRUD with Express.js

A well-designed REST API follows consistent conventions for URLs, HTTP methods, and status codes. Here is a complete example of a CRUD API for a posts resource:

// routes/posts.js — Complete CRUD REST API
const router = require('express').Router();
const { authenticate } = require('../middleware/auth');
const { validate } = require('../middleware/validate');
const { createPostSchema, updatePostSchema } = require('../schemas/post');

// GET /api/posts — list with pagination and filtering
router.get('/', async (req, res, next) => {
  try {
    const { page = 1, limit = 20, tag, authorId } = req.query;
    const offset = (Number(page) - 1) * Number(limit);
    const where = {};
    if (tag) where.tag = tag;
    if (authorId) where.authorId = authorId;

    const [posts, total] = await Promise.all([
      Post.findMany({ where, limit: Number(limit), offset, orderBy: 'createdAt DESC' }),
      Post.count({ where }),
    ]);

    res.json({
      data: posts,
      meta: {
        page: Number(page),
        limit: Number(limit),
        total,
        totalPages: Math.ceil(total / Number(limit)),
      },
    });
  } catch (err) {
    next(err);
  }
});

// GET /api/posts/:id — single resource
router.get('/:id', async (req, res, next) => {
  try {
    const post = await Post.findById(req.params.id);
    if (!post) return res.status(404).json({ error: 'Post not found' });
    res.json(post);
  } catch (err) {
    next(err);
  }
});

// POST /api/posts — create
router.post('/', authenticate, validate(createPostSchema), async (req, res, next) => {
  try {
    const post = await Post.create({ ...req.body, authorId: req.user.id });
    res.status(201).json(post);  // 201 Created
  } catch (err) {
    next(err);
  }
});

// PUT /api/posts/:id — full update
router.put('/:id', authenticate, validate(updatePostSchema), async (req, res, next) => {
  try {
    const post = await Post.findById(req.params.id);
    if (!post) return res.status(404).json({ error: 'Post not found' });
    if (post.authorId !== req.user.id) return res.status(403).json({ error: 'Forbidden' });
    const updated = await Post.update(req.params.id, req.body);
    res.json(updated);
  } catch (err) {
    next(err);
  }
});

// PATCH /api/posts/:id — partial update
router.patch('/:id', authenticate, async (req, res, next) => {
  try {
    const post = await Post.findById(req.params.id);
    if (!post) return res.status(404).json({ error: 'Post not found' });
    if (post.authorId !== req.user.id) return res.status(403).json({ error: 'Forbidden' });
    const updated = await Post.partialUpdate(req.params.id, req.body);
    res.json(updated);
  } catch (err) {
    next(err);
  }
});

// DELETE /api/posts/:id
router.delete('/:id', authenticate, async (req, res, next) => {
  try {
    const post = await Post.findById(req.params.id);
    if (!post) return res.status(404).json({ error: 'Post not found' });
    if (post.authorId !== req.user.id) return res.status(403).json({ error: 'Forbidden' });
    await Post.delete(req.params.id);
    res.status(204).send(); // 204 No Content
  } catch (err) {
    next(err);
  }
});

module.exports = router;
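
The routes above import a validate() middleware that this guide does not define. Here is a minimal sketch of what it might look like, using plain predicate functions — the schema shape is an assumption for illustration; in practice you would likely reach for a validation library such as Joi or Zod:

```javascript
// middleware/validate.js — reject requests whose body fails the schema.
// A schema here is an object mapping field names to predicate functions.
function validate(schema) {
  return (req, res, next) => {
    const invalid = Object.entries(schema)
      .filter(([field, check]) => !check(req.body?.[field]))
      .map(([field]) => field);
    if (invalid.length) {
      return res.status(400).json({ error: 'Validation failed', fields: invalid });
    }
    next();
  };
}

// Example schema for POST /api/posts
const createPostSchema = {
  title:   (v) => typeof v === 'string' && v.length > 0,
  content: (v) => typeof v === 'string',
};

module.exports = { validate, createPostSchema };
```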

Authentication: JWT and bcrypt

The most common authentication pattern for Node.js REST APIs is JWT (JSON Web Tokens) combined with bcrypt for password hashing. Never store plain-text passwords.

const bcrypt = require('bcrypt');
const jwt = require('jsonwebtoken');
const config = require('./config/env');

// --- Password Hashing ---
// Hash password before saving to database
async function hashPassword(plainText) {
  return bcrypt.hash(plainText, config.BCRYPT_ROUNDS); // rounds=12 takes ~300ms
}

// Verify password during login
async function verifyPassword(plainText, hash) {
  return bcrypt.compare(plainText, hash); // timing-safe comparison built in
}

// --- JWT Token Generation ---
function generateTokens(userId) {
  const accessToken = jwt.sign(
    { sub: userId, type: 'access' },
    config.JWT_SECRET,
    { expiresIn: '15m', issuer: 'myapp' }
  );
  const refreshToken = jwt.sign(
    { sub: userId, type: 'refresh' },
    config.JWT_SECRET,
    { expiresIn: '7d', issuer: 'myapp' }
  );
  return { accessToken, refreshToken };
}

// --- Auth Middleware ---
function authenticate(req, res, next) {
  const authHeader = req.headers.authorization;
  if (!authHeader?.startsWith('Bearer ')) {
    return res.status(401).json({ error: 'Missing or invalid Authorization header' });
  }
  const token = authHeader.slice(7);
  try {
    const payload = jwt.verify(token, config.JWT_SECRET, { issuer: 'myapp' });
    if (payload.type !== 'access') throw new Error('Wrong token type');
    req.user = { id: payload.sub };
    next();
  } catch (err) {
    if (err.name === 'TokenExpiredError') {
      return res.status(401).json({ error: 'Token expired', code: 'TOKEN_EXPIRED' });
    }
    return res.status(401).json({ error: 'Invalid token' });
  }
}

// --- Auth Routes ---
// POST /auth/register
router.post('/register', async (req, res, next) => {
  try {
    const { name, email, password } = req.body;
    const existing = await User.findByEmail(email);
    if (existing) return res.status(409).json({ error: 'Email already registered' });

    const hashedPassword = await hashPassword(password);
    const user = await User.create({ name, email, password: hashedPassword });
    const tokens = generateTokens(user.id);

    res.cookie('refreshToken', tokens.refreshToken, {
      httpOnly: true,
      secure: config.isProd,
      sameSite: 'strict',
      maxAge: 7 * 24 * 60 * 60 * 1000, // 7 days
    });
    res.status(201).json({ user: { id: user.id, name, email }, accessToken: tokens.accessToken });
  } catch (err) {
    next(err);
  }
});

// POST /auth/login
router.post('/login', async (req, res, next) => {
  try {
    const { email, password } = req.body;
    const user = await User.findByEmail(email);
    if (!user) return res.status(401).json({ error: 'Invalid credentials' });

    const valid = await verifyPassword(password, user.password);
    if (!valid) return res.status(401).json({ error: 'Invalid credentials' });

    const tokens = generateTokens(user.id);
    res.cookie('refreshToken', tokens.refreshToken, { httpOnly: true, secure: config.isProd, sameSite: 'strict', maxAge: 7 * 24 * 60 * 60 * 1000 });
    res.json({ user: { id: user.id, name: user.name, email }, accessToken: tokens.accessToken });
  } catch (err) {
    next(err);
  }
});

Database Integration

Node.js works with any database. Here are connection patterns for the two most common choices: PostgreSQL (relational) and MongoDB (document).

PostgreSQL with node-postgres (pg)

const { Pool } = require('pg');
const config = require('./config/env');

// Connection pool (reuse connections — critical for performance)
const pool = new Pool({
  connectionString: config.DATABASE_URL,
  max: 20,               // max connections in pool
  idleTimeoutMillis: 30000,
  connectionTimeoutMillis: 2000,
  // Note: rejectUnauthorized: false skips certificate verification — acceptable
  // only for managed hosts that require it; prefer supplying the CA certificate.
  ssl: config.isProd ? { rejectUnauthorized: false } : false,
});

// Graceful shutdown
process.on('SIGTERM', async () => {
  await pool.end();
  process.exit(0);
});

// Query helper with automatic connection management.
// (For one-off queries, pool.query(sql, params) does this for you; checking out
// a client explicitly matters when several queries must share one connection,
// as in the transaction example below.)
async function query(sql, params = []) {
  const client = await pool.connect();
  try {
    return await client.query(sql, params);
  } finally {
    client.release(); // always return the connection to the pool
  }
}

// Transactions
async function transferFunds(fromId, toId, amount) {
  const client = await pool.connect();
  try {
    await client.query('BEGIN');
    await client.query(
      'UPDATE accounts SET balance = balance - $1 WHERE id = $2',
      [amount, fromId]
    );
    await client.query(
      'UPDATE accounts SET balance = balance + $1 WHERE id = $2',
      [amount, toId]
    );
    await client.query('COMMIT');
  } catch (err) {
    await client.query('ROLLBACK');
    throw err;
  } finally {
    client.release();
  }
}

// Example usage
const { rows } = await query(
  'SELECT id, name, email FROM users WHERE id = $1',
  [userId]
);

MongoDB with the Official Driver

const { MongoClient, ObjectId } = require('mongodb');
const config = require('./config/env');

let db;

async function connectDB() {
  const client = new MongoClient(config.MONGODB_URI, {
    maxPoolSize: 20,
    serverSelectionTimeoutMS: 5000,
  });
  await client.connect();
  db = client.db('myapp');
  console.log('MongoDB connected');

  // Create indexes for performance
  await db.collection('users').createIndex({ email: 1 }, { unique: true });
  await db.collection('posts').createIndex({ authorId: 1, createdAt: -1 });
  await db.collection('sessions').createIndex({ expiresAt: 1 }, { expireAfterSeconds: 0 });

  return db;
}

// CRUD operations
async function createUser(userData) {
  const result = await db.collection('users').insertOne({
    ...userData,
    createdAt: new Date(),
    updatedAt: new Date(),
  });
  return { _id: result.insertedId, ...userData };
}

async function findUserById(id) {
  return db.collection('users').findOne({ _id: new ObjectId(id) });
}

async function listPosts(authorId, { page = 1, limit = 20 } = {}) {
  return db.collection('posts')
    .find({ authorId: new ObjectId(authorId) })
    .sort({ createdAt: -1 })
    .skip((page - 1) * limit)
    .limit(limit)
    .toArray();
}

Testing with Jest

Automated testing is non-negotiable for production Node.js applications. Jest is the most popular testing framework, and Supertest is the standard for testing Express HTTP endpoints.

// __tests__/userService.test.js
const UserService = require('../services/userService');
const { query } = require('../db');

// Mock the database module
jest.mock('../db');

describe('UserService', () => {
  beforeEach(() => {
    jest.clearAllMocks();
  });

  describe('createUser', () => {
    it('should hash the password before saving', async () => {
      const mockUser = { id: 1, name: 'Alice', email: 'alice@example.com' };
      query.mockResolvedValueOnce({ rows: [mockUser] });

      const result = await UserService.create({
        name: 'Alice',
        email: 'alice@example.com',
        password: 'plaintext123',
      });

      expect(query).toHaveBeenCalledTimes(1);
      const [sql, params] = query.mock.calls[0];
      expect(sql).toContain('INSERT INTO users');
      expect(params[2]).not.toBe('plaintext123'); // password should be hashed
      expect(params[2]).toMatch(/^\$2b\$/);     // bcrypt hash prefix
      expect(result).toEqual(mockUser);
    });

    it('should throw on duplicate email', async () => {
      const error = new Error('duplicate key value');
      error.code = '23505';
      query.mockRejectedValueOnce(error);

      await expect(
        UserService.create({ name: 'Bob', email: 'existing@test.com', password: 'pass' })
      ).rejects.toThrow('Email already exists');
    });
  });
});

// __tests__/posts.route.test.js — Integration test
const request = require('supertest');
const app = require('../app');
const { generateTokens } = require('../auth');

describe('POST /api/posts', () => {
  const tokens = generateTokens('test-user-id');

  it('should create a post when authenticated', async () => {
    const res = await request(app)
      .post('/api/posts')
      .set('Authorization', `Bearer ${tokens.accessToken}`)
      .send({ title: 'Test Post', content: 'Hello world', tag: 'nodejs' })
      .expect(201);

    expect(res.body).toMatchObject({
      title: 'Test Post',
      content: 'Hello world',
      authorId: 'test-user-id',
    });
    expect(res.body.id).toBeDefined();
  });

  it('should return 401 when not authenticated', async () => {
    await request(app)
      .post('/api/posts')
      .send({ title: 'Test' })
      .expect(401);
  });
});

Deployment: PM2 and Docker

PM2 Process Manager

// ecosystem.config.js — PM2 configuration
module.exports = {
  apps: [{
    name: 'api-server',
    script: 'src/index.js',
    instances: 'max',       // use all CPU cores
    exec_mode: 'cluster',   // cluster mode for load balancing
    watch: false,           // don't watch files in production
    max_memory_restart: '500M',
    env: {
      NODE_ENV: 'development',
      PORT: 3000,
    },
    env_production: {
      NODE_ENV: 'production',
      PORT: 3000,
    },
    error_file: './logs/err.log',
    out_file: './logs/out.log',
    log_date_format: 'YYYY-MM-DD HH:mm:ss Z',
  }],
};
# PM2 commands
npm install -g pm2

pm2 start ecosystem.config.js --env production   # start in production mode
pm2 status                                        # show all processes
pm2 logs api-server                               # tail logs
pm2 monit                                         # live monitoring dashboard
pm2 reload api-server                             # zero-downtime reload
pm2 restart api-server                            # full restart
pm2 delete api-server                             # stop and delete

# Auto-start on system reboot
pm2 startup                                       # generate startup command
pm2 save                                          # save current process list

Dockerfile for Node.js

# Multi-stage Dockerfile for minimal production image
# Stage 1: Build
FROM node:20-alpine AS builder
WORKDIR /app

# Copy package files first (layer caching)
COPY package*.json ./
RUN npm ci

# Copy source and build (TypeScript or similar)
COPY . .
RUN npm run build --if-present

# Stage 2: Production image
FROM node:20-alpine AS production
WORKDIR /app

# Security: run as non-root user
RUN addgroup -S appgroup && adduser -S appuser -G appgroup

# Copy only production dependencies
COPY package*.json ./
RUN npm ci --omit=dev && npm cache clean --force

# Copy built app from builder stage
COPY --from=builder /app/dist ./dist
COPY --from=builder /app/src ./src

# Use non-root user
USER appuser

EXPOSE 3000

# Healthcheck
HEALTHCHECK --interval=30s --timeout=10s --start-period=5s --retries=3 \
  CMD node -e "require('http').get('http://localhost:3000/api/health', (r) => process.exit(r.statusCode === 200 ? 0 : 1))"

CMD ["node", "src/index.js"]

Node.js vs Deno vs Bun: Runtime Comparison 2026

| Feature | Node.js 22 | Deno 2.x | Bun 1.x |
| --- | --- | --- | --- |
| Engine | V8 (Google) | V8 (Google) | JavaScriptCore (Apple) |
| Implementation language | C++ | Rust | Zig |
| TypeScript | Native (strip-types flag) | Native, first-class | Native, first-class |
| Package manager | npm / pnpm / yarn | deno add (npm compat) | bun install (built-in) |
| Test runner | node --test / Jest | deno test (built-in) | bun test (built-in) |
| Bundler | webpack / vite / esbuild | deno compile / bundle | bun build (built-in) |
| npm compatibility | 100% | ~95% | ~98% |
| Security model | Unrestricted | Permissions required | Unrestricted |
| HTTP server throughput | Baseline | ~2x Node | Fastest (~4x Node) |
| Startup time | ~50ms | ~70ms | ~5ms |
| Package registry | npm (2M+ packages) | npm + deno.land | npm + JSR |
| LTS support | Yes (Active LTS) | No formal LTS | No formal LTS |
| Best for | Production, enterprise | TypeScript-first teams | Speed, new projects |

Performance Tips: Clustering, Caching, and More

Clustering — Use All CPU Cores

const cluster = require('cluster');
const os = require('os');
const app = require('./app');

if (cluster.isPrimary) {
  const numCPUs = os.cpus().length;
  console.log(`Primary ${process.pid} starting ${numCPUs} workers`);

  for (let i = 0; i < numCPUs; i++) cluster.fork();

  cluster.on('exit', (worker, code, signal) => {
    console.log(`Worker ${worker.process.pid} died. Restarting...`);
    cluster.fork(); // Auto-restart crashed workers
  });
} else {
  app.listen(3000);
  console.log(`Worker ${process.pid} started`);
}

In-Memory Caching with TTL

// Simple LRU cache with TTL (or use the 'lru-cache' npm package)
class TTLCache {
  constructor(maxSize = 1000, ttlMs = 60000) {
    this.cache = new Map();
    this.maxSize = maxSize;
    this.ttlMs = ttlMs;
  }

  get(key) {
    const entry = this.cache.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.cache.delete(key);
      return undefined;
    }
    // LRU: move to end
    this.cache.delete(key);
    this.cache.set(key, entry);
    return entry.value;
  }

  set(key, value, ttlMs = this.ttlMs) {
    if (this.cache.size >= this.maxSize) {
      // Delete oldest entry (first in Map iteration order)
      this.cache.delete(this.cache.keys().next().value);
    }
    this.cache.set(key, { value, expiresAt: Date.now() + ttlMs });
  }
}

const cache = new TTLCache(500, 5 * 60 * 1000); // 500 entries, 5-min default TTL

// Cache middleware for Express routes
function cacheMiddleware(ttlSeconds = 60) {
  return (req, res, next) => {
    if (req.method !== 'GET') return next();
    const key = req.originalUrl;
    const cached = cache.get(key);
    if (cached) {
      res.setHeader('X-Cache', 'HIT');
      return res.json(cached);
    }
    const originalJson = res.json.bind(res);
    res.json = (data) => {
      cache.set(key, data, ttlSeconds * 1000); // per-route TTL override
      res.setHeader('X-Cache', 'MISS');
      return originalJson(data);
    };
    next();
  };
}

app.get('/api/products', cacheMiddleware(300), productController.list);

Security Best Practices

const helmet = require('helmet');
const rateLimit = require('express-rate-limit');

// Helmet sets security headers: Content-Security-Policy, HSTS, X-Frame-Options, etc.
app.use(helmet({
  contentSecurityPolicy: {
    directives: {
      defaultSrc: ["'self'"],
      scriptSrc: ["'self'"],
      styleSrc: ["'self'", "'unsafe-inline'"],
      imgSrc: ["'self'", "data:", "https:"],
    },
  },
}));

// Rate limiting to prevent brute force and DoS
const authLimiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 5,                    // max 5 login attempts
  message: { error: 'Too many login attempts. Try again in 15 minutes.' },
  standardHeaders: true,
  legacyHeaders: false,
});
app.use('/auth/login', authLimiter);

// Input validation with Zod
const { z } = require('zod');
const createUserSchema = z.object({
  name: z.string().min(2).max(100).trim(),
  email: z.string().email().toLowerCase(),
  password: z.string().min(8).max(100).regex(/[A-Z]/, 'Must contain uppercase'),
});

function validate(schema) {
  return (req, res, next) => {
    const result = schema.safeParse(req.body);
    if (!result.success) {
      return res.status(400).json({ error: 'Validation failed', details: result.error.issues });
    }
    req.body = result.data;
    next();
  };
}

// SQL injection prevention — always use parameterized queries
// NEVER do this:
// db.query(`SELECT * FROM users WHERE email = '${email}'`);  // VULNERABLE!

// Always do this:
// db.query('SELECT * FROM users WHERE email = $1', [email]);   // SAFE

Frequently Asked Questions

What is Node.js and what is it used for?

Node.js is a cross-platform JavaScript runtime built on Chrome's V8 engine. It is used for building backend web servers, REST APIs, real-time applications, CLI tools, and microservices. Its non-blocking I/O model makes it ideal for high-concurrency I/O-intensive workloads.

How does the Node.js event loop work?

The event loop is what allows Node.js to be non-blocking despite running on a single thread. It offloads I/O operations to the OS kernel, then processes their callbacks when complete. The loop has six phases: timers, pending callbacks, idle/prepare, poll, check (setImmediate), and close callbacks. Microtasks (promises, process.nextTick) run between each phase.

What is the difference between require() and import in Node.js?

require() is CommonJS (CJS) — synchronous and the traditional Node.js module system. import/export is ES Modules (ESM) — asynchronous and the modern standard. Use ESM for new projects by setting "type": "module" in package.json or using .mjs file extensions.

How do I handle errors in async/await Node.js code?

Use try/catch blocks around async operations. In Express 4, catch errors in async route handlers and pass them to next(err) so they reach the error-handling middleware (the 4-argument (err, req, res, next) function); Express 5 forwards rejected promises automatically. Use process.on('unhandledRejection') only as a last-resort safety net for missed rejections.
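A common pattern in Express 4 is a small wrapper that forwards any rejection to next(), so each handler doesn't need its own try/catch (the route and UserService below are hypothetical):

```javascript
// Wrap an async route handler so rejected promises reach Express's
// error-handling middleware via next(err). Express 5 does this automatically.
const asyncHandler = (fn) => (req, res, next) =>
  Promise.resolve(fn(req, res, next)).catch(next);

// Hypothetical usage:
// app.get('/api/users/:id', asyncHandler(async (req, res) => {
//   const user = await UserService.findById(req.params.id); // may reject
//   res.json(user); // any rejection above reaches the error middleware
// }));
```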

What is the best way to manage environment variables in Node.js?

Use dotenv to load a .env file during development. Never commit .env to version control. Commit a .env.example with empty values. Validate required variables at startup so the app fails fast with a clear error if any are missing. In production, set variables through your hosting platform.
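A minimal fail-fast sketch (the dotenv call is commented out so it runs without the package installed; the variable names are illustrative):

```javascript
// Validate required environment variables at startup so the app crashes at
// boot with a clear message, not at the first request.
// require('dotenv').config(); // loads .env into process.env in development

const REQUIRED = ['DATABASE_URL', 'JWT_SECRET'];

function loadConfig(env = process.env) {
  const missing = REQUIRED.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing required env vars: ${missing.join(', ')}`);
  }
  return {
    databaseUrl: env.DATABASE_URL,
    jwtSecret: env.JWT_SECRET,
    port: Number(env.PORT) || 3000,
  };
}

// At startup: const config = loadConfig();
```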

How do I implement JWT authentication in Node.js?

Install jsonwebtoken and bcrypt. Hash passwords with bcrypt.hash() before storing. On login, verify with bcrypt.compare(), then sign a JWT with jwt.sign(). Protect routes with middleware that calls jwt.verify(). Store access tokens short-lived (15m) and use refresh tokens in httpOnly cookies for security.

Should I use Node.js, Deno, or Bun for a new project?

Node.js is the safest for production due to its ecosystem and LTS support. Bun is excellent for new projects needing speed and a great developer experience. Deno suits TypeScript-first teams who value security-by-default. All are production-ready in 2026.

How do I deploy a Node.js application to production?

Use PM2 on a VPS for process management with clustering and auto-restart. Use Docker for containerized deployments with a multi-stage Dockerfile. Set NODE_ENV=production, use Nginx as a reverse proxy, enable HTTPS via Let's Encrypt, and monitor with PM2's built-in dashboard or a dedicated APM tool.

