Managing Categorized Logs in a Node.js App with the ELK Stack and Docker

In today’s fast-paced development environment, having a reliable and efficient logging system is crucial for monitoring, debugging, and analyzing applications. In this blog, we’ll walk you through setting up a categorized logging system in a Node.js app using Winston and the ELK Stack (Elasticsearch, Logstash, Kibana) with Docker. This approach will provide you with a production-level log management system to monitor your app’s behavior effectively.

1. Setting Up the ELK Stack Using Docker

Before diving into the Node.js app, we need to set up the ELK Stack using Docker. The ELK stack is a powerful set of tools for centralized logging and visualization. Here’s how you can do it:

Clone the Repository and Run Docker

You can use an existing GitHub repository that contains a Dockerized ELK stack. The repository typically has a README.md file with instructions to set up the ELK stack using Docker.

• Clone the repository:

git clone https://github.com/deviantony/docker-elk.git
cd docker-elk

• Follow the commands in the README.md to start the ELK stack. Typically, this will involve running:

docker compose up setup
docker compose up -d

Once the ELK stack is up, you can verify that it’s working by opening Kibana in your browser at http://localhost:5601 (with docker-elk, the default login is elastic / changeme unless you’ve changed it). Kibana will allow you to visualize and explore the logs stored in Elasticsearch.
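You can also confirm that Elasticsearch itself is reachable on port 9200. A quick check from the terminal, assuming docker-elk’s default elastic/changeme credentials:

curl -u elastic:changeme "http://localhost:9200/_cluster/health?pretty"

A cluster status of green or yellow means Elasticsearch is ready to receive logs (yellow is normal for a single-node setup).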

2. Create a Basic Node.js App with Winston

Now that you have your ELK stack up and running, let’s move on to setting up the Node.js application with Winston, a popular logging library.

Install Dependencies

First, you need to install the necessary dependencies for logging:

npm install winston winston-elasticsearch

Logger Configuration (logger.ts)

We’ll create a file called logger.ts to set up the logging system. The logger will categorize logs based on the different components of your system (e.g., server, oracle, postgres).

Here’s how to set up a logger with Winston and the Elasticsearch transport:

import winston from "winston";
import { ElasticsearchTransport } from "winston-elasticsearch";
import { constants } from "./utils/constants";

// This function returns a logger for a specific category
const createLogger = (category: string, clientId: string = "app"): winston.Logger => {
  const esTransportOpts = {
    level: "info",
    clientOpts: {
      node: constants.elkLogsUrl ?? "http://localhost:9200",
      auth: {
        username: constants.elkLogsUsername ?? "elastic",
        password: constants.elkLogsPassword ?? "changeme"
      },
    },
    indexPrefix: constants.elkLogsIndex ?? "your-index-name" // daily indices: <prefix>-YYYY.MM.DD by default
  };

  return winston.createLogger({
    level: "info",
    defaultMeta: { category, clientId },
    transports: [
      new winston.transports.Console({
        format: winston.format.combine(
          winston.format.colorize(),
          winston.format.timestamp(),
          winston.format.printf(({ timestamp, level, message, category, clientId }) => {
            return `${timestamp} [${clientId}] [${category}] ${level}: ${message}`;
          })
        ),
      }),
      // Ship logs to Elasticsearch only in production
      ...(process.env.NODE_ENV === "production" ? [new ElasticsearchTransport(esTransportOpts)] : []),
    ],
    exitOnError: false,
  });
};

// Centralized loggers, one per category
export const serverLogger = createLogger("server");
export const oracleLogger = createLogger("oracle");
export const userLogger = createLogger("user"); // used by the API example below
// Add other loggers as needed
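
The logger above imports its Elasticsearch settings from a small constants module that isn’t shown. Here’s a minimal sketch of what utils/constants.ts could look like, assuming the values come from environment variables (the variable names are illustrative):

// utils/constants.ts - a minimal sketch; the env var names are assumptions
export const constants = {
  elkLogsUrl: process.env.ELK_LOGS_URL,
  elkLogsUsername: process.env.ELK_LOGS_USERNAME,
  elkLogsPassword: process.env.ELK_LOGS_PASSWORD,
  elkLogsIndex: process.env.ELK_LOGS_INDEX,
};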

3. Using the Logger in Your Express API

With the logger set up, you can now use it in your Express API to log different events in your application. By categorizing logs, you can easily identify and track specific issues.

Example API Setup

Here’s an example of how to integrate the logger into your Express API:

import express from "express";
import { serverLogger, userLogger } from "./logger"; // Import loggers

const app = express();

app.get("/", (req, res) => {
  // Log a basic message using the serverLogger
  serverLogger.info("Server is up and running!");
  res.send("Welcome to the API!");
});

app.get("/user/:id", (req, res) => {
  const userId = req.params.id;
  // Log an action related to the user category
  userLogger.info(`Fetching details for user: ${userId}`);
  res.send(`User details for ${userId}`);
});

app.listen(3000, () => {
  serverLogger.info("Server started on port 3000");
});
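
Beyond simple informational messages, you can attach extra metadata when logging; any additional fields are stored alongside category and clientId and become searchable in Elasticsearch. A short sketch (the route and failure scenario are made up for illustration):

import express from "express";
import { oracleLogger } from "./logger";

const app = express();

app.get("/orders/:id", async (req, res) => {
  try {
    // ... fetch the order from your database here ...
    res.send(`Order details for ${req.params.id}`);
  } catch (err) {
    // Extra fields beyond the message are indexed as metadata in Elasticsearch
    oracleLogger.error(`Failed to fetch order ${req.params.id}`, {
      error: err instanceof Error ? err.message : String(err),
    });
    res.status(500).send("Internal server error");
  }
});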

4. Why Use Categorized Logs in Production?

Using categorized logs in production offers several advantages:

Production-Level Logging

This logging system is designed to work in production environments. It allows you to categorize logs based on the component or service generating them (e.g., server, database, user interactions). This makes it easier to trace issues, monitor system health, and ensure that the application is running smoothly.

Centralized Logging with ELK Stack

The ELK stack is a powerful tool for centralized logging. With logs flowing into Elasticsearch, you can use Kibana to visualize and search through logs. Kibana provides an interactive interface that lets you drill down into specific logs, making it easy to troubleshoot issues and track system behavior.

Scalable and Flexible

This approach scales well as your application grows. Whether you’re building a microservices architecture or a monolithic app, you can easily extend the logging system by adding new categories or components. You can also adjust the logging level based on the environment (e.g., more detailed logs in development, minimal logs in production).
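
For example, the hard-coded "info" level in logger.ts could be derived from the environment instead. A small sketch of that tweak (an assumption, not part of the original setup):

import winston from "winston";

// Verbose logs in development, leaner logs in production
const logLevel = process.env.NODE_ENV === "production" ? "info" : "debug";

const logger = winston.createLogger({
  level: logLevel,
  transports: [new winston.transports.Console()],
});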

5. Additional Notes

Log Levels: You can adjust the log level (e.g., info, warn, error) depending on the severity of the event. For example, errors can be logged with the error level, while informational messages use the info level; see the short sketch after these notes.

Conditional Logging to Elasticsearch: The logger is configured to send logs to Elasticsearch only when the application is running in a production environment. This helps reduce unnecessary load on your Elasticsearch cluster during development or staging.

Easy Filtering and Analysis: Since logs are categorized and stored in Elasticsearch, you can use Kibana to filter logs by category, client ID, log level, or any other metadata. This makes it easier to find specific logs and analyze application behavior.
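
As a quick illustration of levels and metadata together (the messages and fields here are invented for the example):

import { serverLogger, oracleLogger } from "./logger";

serverLogger.info("Scheduled cleanup finished");  // routine information
serverLogger.warn("Disk usage above 80%");        // needs attention soon
oracleLogger.error("Connection pool exhausted", { poolSize: 10 });  // failure, with searchable metadata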

6. Testing Logs in Kibana

  1. Log in to Kibana with your credentials and open Stack Management → Index Management to confirm that your log index has been created (with the config above, daily indices named like your-index-name-YYYY.MM.DD).

  2. Create a data view matching that index pattern, then open Discover to browse and filter the incoming logs.

There are many other ways to visualize data in Kibana, such as dashboards and Lens.
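
If your index doesn’t appear, you can check directly against Elasticsearch whether any log indices exist (again assuming the default credentials and the your-index-name prefix from the logger config):

curl -u elastic:changeme "http://localhost:9200/_cat/indices/your-index-name-*?v"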

Conclusion

By using Winston for logging and the ELK stack for centralized log management, you can implement a robust and scalable logging system in your Node.js applications. This production-level logging setup will help you monitor your application, troubleshoot issues efficiently, and maintain visibility into system behavior, making it an essential tool for any modern application.