My Web tests were working well, so I've been asked to run them and really pound our server. The tests normally take about 20 minutes, but I discovered that when I run them in a forked-off process, they take only about two and a half minutes. I suspect that's because suppressing the output to STDOUT is a performance boost, though I'm surprised it's that much of one. I worry that I did something wrong.
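For reference, the fork-and-suppress-STDOUT idea looks roughly like this (a stripped-down Perl sketch; run_web_tests() is just a made-up stand-in for the real suite):

    use strict;
    use warnings;

    my $pid = fork();
    die "Can't fork: $!" unless defined $pid;

    if ( $pid == 0 ) {
        # child: throw away the test output, then run the suite
        open STDOUT, '>', '/dev/null' or die "Can't redirect STDOUT: $!";
        run_web_tests();
        exit 0;
    }

    waitpid( $pid, 0 );    # parent just waits for the child to finish

    sub run_web_tests {
        # stand-in for the real 20-minute pile of Web tests
        print "imagine thousands of lines of test output here\n";
    }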
Still, last night I ran my program and forked off 40 processes, each one logging in as a different user and pounding our site. With one run taking two and a half minutes, 40 processes run sequentially would take about 100 minutes; since I'm forking them off to run in parallel, I expected the whole thing to take considerably less time than that, but in reality it took 340 minutes. Since I've mostly done Web and database programming, I don't know much about forked code and I'm not comfortable with these results. It looks like I'm misunderstanding how things work. Time to do more research.
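In case it helps anyone spot my misunderstanding, last night's run looked roughly like this (again a simplified Perl sketch; the user list and run_tests_as() are hypothetical stand-ins for the real login-and-pound code):

    use strict;
    use warnings;

    my @users = map { "user$_" } 1 .. 40;    # stand-ins for the real accounts
    my $start = time;
    my @kids;

    for my $user (@users) {
        my $pid = fork();
        die "Can't fork: $!" unless defined $pid;
        if ( $pid == 0 ) {
            open STDOUT, '>', '/dev/null' or die "Can't redirect STDOUT: $!";
            run_tests_as($user);    # stand-in: log in as $user and pound the site
            exit 0;
        }
        push @kids, $pid;
    }

    waitpid( $_, 0 ) for @kids;    # wait for all 40 children to finish
    printf "wall-clock time: %.1f minutes\n", ( time - $start ) / 60;

    sub run_tests_as {
        my ($user) = @_;
        sleep 2;    # the real run is about two and a half minutes per user
    }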