Linux - Day 10


Log Analyzer and Report Generator Using Bash Script

In this task, we are creating a Bash script that automates log file analysis and generates a daily summary report. The script counts error messages, extracts critical events, and lists the most frequent errors in a single report. Here's how you can implement it.

Key Features:

  1. Input Log File: The script accepts a log file as a command-line argument.

  2. Error Count: Counts the number of error messages based on specific keywords.

  3. Critical Events: Extracts lines containing the keyword "CRITICAL" along with their line numbers.

  4. Top Error Messages: Identifies the top 5 most frequent error messages.

  5. Summary Report: Generates a report with date, file name, error counts, and critical events.

  6. Optional Enhancement: Moves the processed log file to an archive directory (a sketch of this step is shown at the end of the post).


Simplified Log Analyzer and Report Generator Script:

#!/bin/bash

# Simple Log Analyzer and Report Generator

# Check if the user provided a log file as an argument
if [ $# -ne 1 ]; then
    echo "Usage: $0 /path/to/logfile"
    exit 1
fi

LOG_FILE="$1"

# Check if the file exists
if [ ! -f "$LOG_FILE" ]; then
    echo "Log file does not exist: $LOG_FILE"
    exit 1
fi

# Report file name
REPORT_FILE="log_report_$(date +%Y-%m-%d).txt"

# Count total lines in the log file
TOTAL_LINES=$(wc -l < "$LOG_FILE")

# Count error messages (based on "ERROR" or "Failed")
ERROR_COUNT=$(grep -cE "ERROR|Failed" "$LOG_FILE")

# Find critical events and their line numbers
CRITICAL_EVENTS=$(grep -n "CRITICAL" "$LOG_FILE")

# Top 5 most frequent error messages
TOP_ERRORS=$(grep -E "ERROR|Failed" "$LOG_FILE" | sort | uniq -c | sort -nr | head -5)

# Generate the report
{
    echo "Log Report - $(date)"
    echo "===================="
    echo "Log File: $LOG_FILE"
    echo "Total Lines: $TOTAL_LINES"
    echo "Error Count: $ERROR_COUNT"
    echo
    echo "Top 5 Error Messages:"
    echo "$TOP_ERRORS"
    echo
    echo "Critical Events (line numbers):"
    if [ -z "$CRITICAL_EVENTS" ]; then
        echo "No critical events found."
    else
        echo "$CRITICAL_EVENTS"
    fi
    echo "===================="
} > "$REPORT_FILE"

# Notify user of report location
echo "Report generated: $REPORT_FILE"

Simplified Features:

  1. Basic Input Validation: The script checks if the user provides a log file and ensures the file exists.

  2. Core Analysis:

    • Total Lines: Counts the total number of lines using wc -l.

    • Error Count: Counts error messages containing "ERROR" or "Failed".

    • Critical Events: Finds lines with "CRITICAL" and outputs the line number using grep -n.

    • Top 5 Errors: Uses grep, sort, and uniq to list the top 5 error messages (this pipeline can also be run on its own, as shown after this list).

  3. Summary Report: The report contains the log file name, total lines, error counts, critical events, and top errors.
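If you only want to inspect the most frequent errors without generating a full report, the same pipeline from the script can be run directly on the command line (the log path below is just a placeholder):

# Show the 5 most frequent error lines in a log file
grep -E "ERROR|Failed" /path/to/logfile.log | sort | uniq -c | sort -nr | head -5

The first sort groups identical lines together so that uniq -c can count them; the second sort -nr orders the counts from highest to lowest.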

How to Use:

Run the script with a log file as an argument:

./simple_log_analyzer.sh /path/to/logfile.log
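If the script is not yet executable, make it executable first:

chmod +x simple_log_analyzer.sh

You can also do a quick smoke test with a small sample log before pointing the script at a real one (sample.log here is just a throwaway file):

# Create a tiny sample log and run the analyzer against it
printf 'INFO: start\nERROR: Failed to connect to database\nCRITICAL: System failure detected\n' > sample.log
./simple_log_analyzer.sh sample.log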

Example Output of Report:

Log Report - 2024-10-14
====================
Log File: /var/log/system.log
Total Lines: 15234
Error Count: 23

Top 5 Error Messages:
      10 ERROR: Failed to connect to database
       7 ERROR: File not found
       3 ERROR: Disk is full
       2 Failed to start service
       1 ERROR: User authentication failed

Critical Events (line numbers):
134: CRITICAL: System failure detected
678: CRITICAL: Service crashed
====================

Simplified Workflow:

  • Input the log file path.

  • Count errors and critical events.

  • Output a simple report summarizing the analysis.

This script automates the task of analyzing log files and generates a useful summary report for system administrators. It saves time, reduces the chance of missing important errors, and can be enhanced further as needed, for example with the optional archiving step from the feature list, which is sketched below.
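Here is a minimal sketch of that optional archiving enhancement. It assumes an archive directory named ./archive (adjust the path to suit your setup) and would be appended to the end of the script, after the report has been generated:

# Optional: move the processed log file into an archive directory
# ARCHIVE_DIR is an assumed location; change it to match your environment
ARCHIVE_DIR="./archive"
mkdir -p "$ARCHIVE_DIR"
mv "$LOG_FILE" "$ARCHIVE_DIR/$(basename "$LOG_FILE").$(date +%Y-%m-%d)"
echo "Archived log to: $ARCHIVE_DIR/"

Appending the date to the archived file name keeps multiple days of logs from overwriting each other.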
