NOTE: use Perl; is on undef hiatus. You can read content, but you can't post it. More info will be forthcoming forthcomingly.

All the Perl that's Practical to Extract and Report


ajt (2546)

ajt
  (email not shown publicly)
http://www.iredale.net/

UK based. Perl, XML/HTTP, SAP, Debian hacker.

  • CPAN: ATRICKETT [cpan.org]
  • PerlMonks: ajt [perlmonks.org]
  • Local LUG: AdamTrickett [lug.org.uk]
  • Debian Administration: ajt [debian-adm...ration.org]
  • LinkedIn: drajt [linkedin.com]

Journal of ajt (2546)

Monday January 07, 2008
09:21 AM

Fixing Analog dns cache file

[ #35319 ]

The text file used as a DNS cache by the Analog web server log analyser gets rather long and out of date if you don't manage it. At the weekend I took my 20MiB file and ran my usual shell scripts to clean up the log. I'm not a shell guru, and after a day's wait I had a pristine 2.6MiB cache file with no duplicates in it. I then made some changes to how Analog is configured to keep things trim.
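The de-duplication itself is a natural fit for a Perl hash: read the cache line by line, key on the address, and let later entries overwrite earlier ones so the newest lookup wins. A minimal sketch, assuming each line starts with an IP address as the first whitespace-separated field (the real Analog DNSFILE layout may put the key elsewhere, so adjust the `split` accordingly):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Collapse duplicate cache lines, keeping the last entry seen for each
# key and preserving the order of first appearance.
sub dedup_cache {
    my @lines = @_;
    my ( %seen, @order );
    for my $line (@lines) {
        my ($ip) = split ' ', $line;            # first field as the key
        push @order, $ip unless exists $seen{$ip};
        $seen{$ip} = $line;                     # newest entry wins
    }
    return map { $seen{$_} } @order;
}

my @clean = dedup_cache(
    '192.0.2.1 old.example.net',
    '198.51.100.2 other.example.net',
    '192.0.2.1 new.example.net',
);
print "$_\n" for @clean;
```

Because the hash lookup is O(1), this is a single pass over the file, which is why it beats a shell pipeline that re-scans the data for every duplicate check.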

All these DNS look-ups in series are very slow, especially as once they fail, they usually fail on every subsequent attempt, which is what most of the log consists of. Using a bit of Perl (hashes) and the parallel DNS look-up from Net::DNS, I got a script that is marginally longer than the shell script but can process a 20MiB file in under 5 minutes...
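The parallel look-ups use Net::DNS's background interface: `bgsend` fires a query off without blocking, and you poll with `bgisready`/`bgread` to collect answers as they arrive, so slow and failing lookups overlap instead of queueing. A sketch under those assumptions (the IPs are illustrative, not from the original script, which batched a whole log file):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Net::DNS;

my $res = Net::DNS::Resolver->new;
my @ips = qw(192.0.2.1 198.51.100.2 203.0.113.3);

# Fire off all the reverse lookups without waiting for answers;
# bgsend("IP") builds the in-addr.arpa PTR query for us.
my %pending = map { $_ => $res->bgsend($_) } @ips;

# Poll until each query is ready, reading answers as they arrive.
while (%pending) {
    for my $ip ( keys %pending ) {
        next unless $res->bgisready( $pending{$ip} );
        my $packet = $res->bgread( delete $pending{$ip} );
        my ($ptr) = $packet ? grep { $_->type eq 'PTR' } $packet->answer : ();
        printf "%s => %s\n", $ip, $ptr ? $ptr->ptrdname : '(no name)';
    }
}
```

In practice you would cap the number of in-flight queries so you don't flood the resolver, but even a modest batch size turns a day of serial waiting into minutes.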

I'd be the first to admit that my Perl is better than my shell scripting, but the Perl version is both MUCH faster and, according to top, lighter on the CPU... very cool.
