Day 10: Log Analyzer and Report Generator 🚀
Hello DevOps enthusiasts! 👋 Welcome to Day 10 of the #90DaysOfDevOps challenge. Today, we're creating a log analysis tool.
Log Analyzer Script 💻
#!/bin/bash
# Check arguments
if [ $# -ne 1 ]; then
    echo "Usage: $0 <log_file>" >&2
    exit 1
fi
log_file="$1"
report_file="report_$(date +%Y%m%d_%H%M%S).txt"
# Check if log file exists
if [ ! -f "$log_file" ]; then
    echo "Error: Log file not found: $log_file" >&2
    exit 1
fi
# Generate report
{
    echo "Log Analysis Report"
    echo "==================="
    echo "Date: $(date)"
    echo "Log File: $log_file"
    echo "Total Lines: $(wc -l < "$log_file")"
    echo "Error Count: $(grep -c "ERROR" "$log_file")"
    echo -e "\nTop 5 Errors:"
    grep "ERROR" "$log_file" | sort | uniq -c | sort -nr | head -5
    echo -e "\nCritical Events:"
    grep -n "CRITICAL" "$log_file"
} > "$report_file"
# Archive log
mkdir -p processed_logs
cp "$log_file" "processed_logs/$(basename "$log_file").$(date +%Y%m%d)"
echo "Report generated: $report_file"
Script Features 🔧
- Analysis Capabilities
  - Error counting
  - Critical event detection
  - Top error identification (see the grouping sketch after this list)
  - Line counting
- Report Generation
  - Timestamped reports
  - Organized sections
  - Statistical analysis
- Log Management
  - Automatic archiving
  - Date-based organization
  - Original file preservation
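One caveat about the top-error count: uniq -c only groups identical lines, so the same error message logged at different times is counted separately. A minimal sketch of one way around that, assuming each log line starts with a timestamp followed by the ERROR keyword (the sed pattern is an assumption about your log format, not something the script requires):
# Drop everything up to and including the ERROR keyword so that only the
# message text is compared when counting duplicates -- adjust the sed
# expression to match your own log format.
grep "ERROR" "$log_file" | sed 's/^.*ERROR[: ]*/ERROR: /' | sort | uniq -c | sort -nr | head -5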
Key Takeaways 💡
- Log analysis is crucial for troubleshooting
- Pattern matching with grep makes it easy to surface errors and critical events
- Reports should be timestamped and well-organized
- Archive management matters as processed logs accumulate (see the cleanup sketch below)
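Because the script copies every processed log into processed_logs/, that directory will keep growing. Here is a small cleanup sketch, assuming a 7-day compression window and a 30-day retention policy (both thresholds are arbitrary example values you would choose yourself):
# Compress archived logs older than 7 days, then delete compressed
# archives older than 30 days -- thresholds are example values only.
find processed_logs -type f ! -name "*.gz" -mtime +7 -exec gzip {} \;
find processed_logs -type f -name "*.gz" -mtime +30 -delete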
#Bash #DevOps #Monitoring #Linux #90DaysOfDevOps
This is Day 10 of my #90DaysOfDevOps journey. Keep analyzing and monitoring!