  • Yep.

    Sometimes multiple processes can get more done - for example, if a single process spends a large portion of its time waiting for external events, then several processes can run at the same time without slowing each other down (see the sketch after this thread).

    However, if the processes all compete for the same resource (e.g. they are all CPU-bound, or they all spend their time reading from different parts of the same disk), then they interfere with each other proportionally: n processes take roughly n times as long as one.

    Finally, if the contention is bad enough (e.g. there are too many processes to fit into memory, or they turn sequential I/O into random I/O with lots of seeks), they can interfere more than proportionally and take longer to run together than it would take to run them all one at a time.

    • Another factor can be caching. If the processes are I/O-bound but are accessing the same data, then they can benefit from caching: the first process to access the data loads it into memory, and the others don't have to read it from disk.

      This often happens with databases, where the same data is read and the same queries are run over and over.
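
    A minimal sketch of the "waiting for external events" case above, using Parallel::ForkManager and LWP::Simple from CPAN. The URLs and the limit of four children are made-up values for illustration, not anything from the thread:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use LWP::Simple qw(get);
        use Parallel::ForkManager;

        # Each child spends most of its time waiting on the network, so
        # running several at once gets more done without them slowing
        # each other down much.
        my @urls = map { "http://example.com/report/$_" } 1 .. 8;

        my $pm = Parallel::ForkManager->new(4);    # at most 4 children at a time

        for my $url (@urls) {
            $pm->start and next;        # parent: fork a child, move to next URL
            my $content = get($url);    # child: blocks on network I/O
            $pm->finish;                # child exits when its fetch is done
        }
        $pm->wait_all_children;         # parent waits for every child

    If the work inside the loop were CPU-bound instead, the same four children would compete for the processor and you would see the roughly proportional slowdown described above.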