
Perl: Iterating Through Log Files in a Directory


Today I'm sharing a Perl snippet that iterates through the log files in a specified directory. In this particular case we needed to parse the log files while ignoring files that had already been compressed. The file names were in this format:

file.log
file.log.1
file.log.2
file.log.3.gz
file.log.4.gz
     ...
file.log.10.gz

Though the file names could go up to *.99, we didn’t know which, if any, would be compressed. Obviously, using “*.log” wouldn’t get the *.1 or *.2 files, and using “*.log*” picked up the compressed files as well. The solution was to use “glob” with a few wildcard patterns.

Using “glob,” you can specify multiple space-separated patterns in a single string. In this case we used three (the full script follows the list):

  • *.$ext  to get *.log
  • *.$ext.[0-9] to get *.log.#, like file.log.2
  • *.$ext.[0-9][0-9] to get *.log.##, like file.log.10

#!/usr/bin/perl
use strict;
use warnings;

my $logpath = "/var/log";   # default log directory
my $ext     = "log";        # extension of the uncompressed logs

# Accept at most one argument: an optional path overriding the default
if (@ARGV > 1) {
    print "usage: loopfiles [path]\n";
    exit 1;
}

# Override the default log directory via the command line
if ($ARGV[0]) {
    $logpath = $ARGV[0];
}

chdir($logpath) or die "Cannot chdir to $logpath: $!";

# Three space-separated patterns: *.log, *.log.0 through *.log.9,
# and *.log.10 through *.log.99
my @files = glob "*.$ext *.$ext.[0-9] *.$ext.[0-9][0-9]";

foreach my $logfile (@files) {
    print "$logfile\n";
}
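
Run against the sample files above, this prints only the uncompressed names (file.log, file.log.1, file.log.2, and so on); the .gz files never match because every pattern requires the name to end in the extension or a digit. The print statement is just a stand-in for the real work. As a minimal sketch of the parsing step, here is a drop-in replacement for the foreach loop that counts lines containing the string "ERROR" (the string and the counter are purely illustrative assumptions, not part of the original script):

my $error_count = 0;
foreach my $logfile (@files) {
    # Open each uncompressed log and scan it line by line
    open(my $fh, '<', $logfile) or die "Cannot open $logfile: $!";
    while (my $line = <$fh>) {
        $error_count++ if $line =~ /ERROR/;   # placeholder match; substitute your own parsing
    }
    close($fh);
}
print "Found $error_count matching lines in ", scalar(@files), " files\n";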