In the world of Node.js development, logging often begins and ends with console.log. It’s simple, it’s built-in, and it gets the job done, until it doesn’t. As applications grow in complexity and scale, console.log reveals its weaknesses: it’s unstructured, it’s slow, and it offers no way to route messages, filter by severity, or gain meaningful insights in a production environment.
This is where structured logging comes in, and at the forefront of the Node.js ecosystem is a logger built for one primary purpose: speed. Meet Pino.js.
Pino is a “very low overhead” logger that is fundamentally changing how developers approach logging. It’s not just another console.log wrapper; it’s a tool built on a philosophy that can make your applications faster, more observable, and easier to debug.
The Philosophy: Why is Pino So Fast?
Pino’s performance isn’t magic; it’s a result of deliberate design choices.
- JSON First: Pino’s core responsibility is to write structured JSON logs to standard output (stdout) as quickly as possible. Unlike loggers that build pretty, colorful strings in your main application thread, Pino defers the “prettifying” to a separate process. Your application’s event loop isn’t blocked by the expensive task of string formatting.
- Minimalist Core: The core pino library is lean. It focuses exclusively on writing JSON. Features like log rotation, sending logs to cloud services, or pretty-printing are handled by a rich ecosystem of standalone modules called transports.
- Asynchronous by Design: With its modern pino.transport API, Pino runs transports in dedicated worker threads. This means that writing logs to a file or sending them over the network to a service like Sentry won’t block your main thread, ensuring your application remains responsive under heavy load (see the sketch just below).
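To make that last point concrete, here is a minimal sketch of a transport running off the main thread, using the built-in pino/file target (the destination path is just an example):
JavaScript
import pino from 'pino';

// The transport runs in a dedicated worker thread; the main event loop
// only hands log lines over to it and never waits on file I/O.
const transport = pino.transport({
  target: 'pino/file', // built-in transport that writes to a file or file descriptor
  options: { destination: './app.log' } // example path, adjust to your setup
});

const logger = pino(transport);

logger.info('Written without blocking the event loop.');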
Getting Started: From Zero to Structured Logs
Getting started with Pino is refreshingly simple. After installing it (npm install pino), you can immediately begin logging.
JavaScript
import pino from 'pino';
const logger = pino();
logger.info('Hello, Pino!');
logger.error({ err_code: 500, user: 'Radek' }, 'A simulated error occurred.');
This code produces the following output:
JSON
{"level":30,"time":1749174330000,"pid":12345,"hostname":"my-machine","msg":"Hello, Pino!"}
{"level":50,"time":1749174330100,"pid":12345,"hostname":"my-machine","err_code":500,"user":"Radek","msg":"A simulated error occurred."}
Immediately, you have machine-readable logs enriched with a level (30 for info, 50 for error), a precise time, the process ID (pid), and any context you provide as a JSON object.
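If you prefer label strings and ISO timestamps over the default numeric values, Pino lets you reshape these fields; the following is a small sketch using its formatters and timestamp options:
JavaScript
import pino from 'pino';

const logger = pino({
  formatters: {
    // Emit "level":"info" instead of the numeric "level":30
    level(label) {
      return { level: label };
    }
  },
  // Emit an ISO-8601 timestamp instead of epoch milliseconds
  timestamp: pino.stdTimeFunctions.isoTime
});

logger.info('Readable level and timestamp.');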
The Power of the Ecosystem: Transports
The true power of Pino lies in its transports. You can pipe your logs to multiple destinations simultaneously, with different log levels for each. Let’s configure a logger for a real-world scenario:
- Development: Show easy-to-read, colorful logs in the console.
- Production File: Write detailed debug logs to a rotating file.
- Production Monitoring: Send warn and error logs to a monitoring service like Sentry.
First, install the necessary packages: npm install pino pino-pretty pino-rotating-file pino-sentry-transport
Then, configure your logger using pino.transport:
JavaScript
import pino from 'pino';
const logger = pino({
  level: 'debug', // The lowest level to capture for all transports
  transport: {
    targets: [
      // Target 1: Pretty print to the console for development
      {
        target: 'pino-pretty',
        level: 'info', // Only show info and above in the console
        options: { colorize: true }
      },
      // Target 2: Write to a rotating file for production records
      {
        target: 'pino-rotating-file',
        level: 'debug',
        options: {
          path: 'logs/app.log',
          size: '10M', // Rotate at 10 megabytes
          maxFiles: 5
        }
      },
      // Target 3: Send critical errors to Sentry
      {
        target: 'pino-sentry-transport',
        level: 'warn', // Only send warnings and errors to Sentry
        options: {
          sentry: { dsn: process.env.SENTRY_DSN }
        }
      }
    ]
  }
});
logger.info('User logged in successfully.'); // Goes to console and file
logger.warn('API rate limit approaching.'); // Goes to all three
logger.debug('Cache key generated.'); // Goes only to the file
This single configuration gives you a robust logging setup tailored for different environments and needs, all running efficiently in background threads.
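In practice you would usually gate these targets by environment rather than run all three everywhere. The sketch below shows one way to do that, saved as a module the later example can import; the NODE_ENV check and the logger.js file name are assumptions for illustration, not something Pino requires:
JavaScript
// logger.js (hypothetical module name, matching the import used later)
import pino from 'pino';

const isProduction = process.env.NODE_ENV === 'production';

// Pretty console output locally; rotating file plus Sentry in production.
const targets = isProduction
  ? [
      {
        target: 'pino-rotating-file',
        level: 'debug',
        options: { path: 'logs/app.log', size: '10M', maxFiles: 5 }
      },
      {
        target: 'pino-sentry-transport',
        level: 'warn',
        options: { sentry: { dsn: process.env.SENTRY_DSN } }
      }
    ]
  : [
      { target: 'pino-pretty', level: 'info', options: { colorize: true } }
    ];

const logger = pino({
  level: 'debug',
  transport: { targets }
});

export default logger;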
Automating Web Server Logging with pino-http
For web applications, manually logging every request is tedious. The pino-http middleware solves this elegantly. It automatically creates a detailed log for every single HTTP request, including the status code, URL, and, most importantly, the response time.
Here’s how to use it with Express.js:
JavaScript
import express from 'express';
import pinohttp from 'pino-http';
import logger from './logger.js'; // Import our multi-transport logger
const app = express();
const httpLogger = pinohttp({ logger });
// Add the middleware
app.use(httpLogger);
app.get('/', (req, res) => {
  res.send('Your request has been logged!');
});

app.listen(3000, () => {
  logger.info('Server is running.');
});
With this, every request to your server will generate a single, rich log entry, giving you instant visibility into your application’s traffic and performance without cluttering your route handlers.
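pino-http also attaches the request-scoped logger to req.log, so anything you log inside a handler is tied to that request’s metadata. A brief sketch (the /orders/:id route is just an illustrative example):
JavaScript
app.get('/orders/:id', (req, res) => {
  // req.log is the per-request logger that pino-http attaches,
  // so this entry carries the same request context as the access log.
  req.log.info({ orderId: req.params.id }, 'Fetching order.');
  res.send('Order lookup logged.');
});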
Conclusion: Why Pino Deserves Your Attention
Moving from console.log to a structured logger is a critical step in professional software development. Pino offers a compelling case for being the default choice in the Node.js world. It is:
- Blazing Fast: Its core design prioritizes low overhead, ensuring your logger doesn’t become a bottleneck.
- Structured: JSON-first output makes logs easy for machines to parse, search, and analyze.
- Extensible: A powerful ecosystem of transports allows you to send your logs anywhere you need them.
- Modern: With features like asynchronous transports in worker threads, it’s built for modern, high-concurrency applications.
By embracing the discipline of structured logging with a tool like Pino, you empower yourself to build more robust, observable, and ultimately more reliable applications. It’s a small change in development that pays massive dividends in production.