
All the Perl that's Practical to Extract and Report


Matts (1087)
  (email not shown publicly)

I work for MessageLabs [messagelabs.com] in Toronto, ON, Canada. I write spam filters, MTA software, high performance network software, string matching algorithms, and other cool stuff mostly in Perl and C.

Journal of Matts (1087)

Friday October 24, 2003
01:54 AM

Load back down

[ #15364 ]

As suspected, something fishy was going on.

I had a cron job that didn't exit. So I had about 300 copies of it running. And of course it was running perl, so it used quite a bit of RAM.

Strange that cron doesn't set an alarm.
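Cron won't impose a timeout on a job itself, but the script can set one with Perl's alarm. A minimal sketch of that guard (the timeout value and do_the_work are placeholders, not the actual cron job):

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $timeout = 600;    # seconds; pick something generous for the job

eval {
    # If the job wedges, die instead of piling up copies under cron.
    local $SIG{ALRM} = sub { die "timeout\n" };
    alarm $timeout;
    do_the_work();    # stand-in for the real job body
    alarm 0;          # finished in time: cancel the timer
};
if ($@) {
    die $@ unless $@ eq "timeout\n";
    warn "job exceeded ${timeout}s, giving up\n";
    exit 1;
}

sub do_the_work { sleep 1 }    # placeholder
```

With this in place, a stuck run kills itself after ten minutes rather than accumulating 300 copies.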

My load is back to a healthy 2.0 ;-) Unfortunately I know why it's that high (a super sekrit work project - mail me privately if you want to know exactly what) and don't see any way to get it down.

The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • I've been bitten by this so many times in the past that I now never ever write a cronjob without inserting a few lines of code that ensure no previous instance of the cronjob is running, and send me mail if one is.
    • FreeBSD has a nice program called lockf(1) for dealing with similar situations. It just grabs a lock, runs a command, and releases the lock. Very handy.

      -Dom

      • There are CPAN modules like Proc::Pidfile, Proc::PID::File, and probably others which help ensure you don't run more than one instance of a process. I used one of them in the past but forgot which one, so I can't recommend any of them in particular.
        --

        Ilya Martynov (http://martynov.org/ [martynov.org])
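The "ensure no previous instance is running" pattern the commenters describe can also be done by hand with flock on a lockfile, which is roughly what those CPAN modules wrap. A minimal sketch, assuming a hypothetical lockfile path and job body:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(LOCK_EX LOCK_NB);

# Hypothetical lockfile path; use one per cron job.
my $lockfile = '/var/tmp/myjob.lock';

open my $lock, '>', $lockfile
    or die "can't open $lockfile: $!";

# Non-blocking exclusive lock: if a previous run still holds it, bail out.
unless (flock $lock, LOCK_EX | LOCK_NB) {
    warn "previous instance still running, exiting\n";
    exit 0;
}

do_the_work();    # stand-in for the real job

# The lock is released automatically when $lock closes at exit,
# even if the job dies, so there is no stale-lock cleanup to do.
sub do_the_work { }
```

Because flock locks evaporate when the process exits, this avoids the classic stale-pidfile problem that plain "check for a pidfile" schemes have.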