All the Perl that's Practical to Extract and Report


perrin (4270)


Perrin is a contributor to various Perl-related projects like mod_perl, Template Toolkit, and Class::DBI. He is a frequent speaker at OSCON, YAPC, and ApacheCon, and a contributor to several perl-related books.

Journal of perrin (4270)

Wednesday March 16, 2005
04:23 PM

DBD::SQLite tuning help?

If anyone out there knows any good performance tweaks for DBD::SQLite, please take a look at my post on PerlMonks about it.

Monday February 28, 2005
01:54 PM

cover story in eWeek

We got a cover story in eWeek about the mod_perl-based system we built for the United Federation of Teachers. It's a pretty good story, and mentions Perl, Apache, Linux, etc.

eWeek story

Wednesday January 05, 2005
01:57 PM

still looking for the perfect Linux distro

After a total hard drive failure in my IBM ThinkPad (don't let anyone tell you IBM makes reliable laptops), I installed Ubuntu. Now I have Ubuntu at work and Fedora Core 3 at home. Ubuntu did a great job of recognizing hardware: it found the bizarre wi-fi chip in the ThinkPad and mounted an iPod automatically (both failed with FC3). However, it has no working mplayer package, even with all of the crazy "totally unsupported illegal software" repositories enabled. Getting mp3 encoding to work in Sound Juicer was fairly tricky too: there was one specific package that would do it, and no documentation about it except on some wiki page. My main problem with Ubuntu, though, is the lack of KDE. I run Gnome at work and KDE at home, and KDE is just better in many small ways. The extra integration and the handy support for using remote files via scp in any application are great; Gnome's attempts at these things just don't work as well.

I tried loading KDE packages from Debian, but they look awful and I don't know enough about the font issues to fix them without considerable effort. I also ran into problems with the X config. The auto-detection stuff at install time worked, but I couldn't find a way to run it again when I got a new monitor. FC3 has more obvious config tools.

On my FC3 system, KDE and Gnome both look great. The mp3 stuff had to be chased down, but it was a bit more obvious than with Ubuntu. Working mplayer (and kplayer) packages are available. However, hardware detection has problems. Besides the iPod and wi-fi issues, it nags me with hardware-detection screens if I configure a USB printer and then sometimes boot with the printer turned off, or configure a laptop to use an external keyboard and then use it without one. Ubuntu doesn't have those issues.

So, I guess I just have to keep updating these things until one or the other gets it all right. I'm not inclined to do something crazy like go back to Slackware at this point, and Suse seems determined to make free versions of their distro impossible to find on their site. Any other suggestions?

Monday November 29, 2004
01:25 PM

missing the boat on performance

I'm frequently surprised by the way best practices for good performance do not get picked up by people. A couple of weeks back at ApacheCon 2004, I listened to one of the SpamAssassin developers say that their SQL RDBMS storage was faster than their Berkeley DB storage. How could that be? My tests have always shown Berkeley DB to be significantly faster than the fastest query on MySQL (which, incidentally, is much faster than the same query on SQLite). I checked their code and there's the answer: it uses the slowest possible interface to Berkeley DB, the DB_File module with an external locking system. Using the BerkeleyDB module with direct method calls and letting BerkeleyDB manage locking would be many times faster.
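The difference between the two interfaces can be sketched roughly as follows. This is illustrative, not SpamAssassin's actual code; it assumes the BerkeleyDB CPAN module is installed and a writable ./bdb-env directory exists:

```perl
# Sketch: the faster of the two Berkeley DB interfaces from Perl.
# Direct method calls, with locking handled by Berkeley DB's own
# Concurrent Data Store mode instead of an external flock() scheme.
use strict;
use warnings;
use BerkeleyDB;

my $env = BerkeleyDB::Env->new(
    -Home  => './bdb-env',
    -Flags => DB_CREATE | DB_INIT_MPOOL | DB_INIT_CDB,
) or die "can't create env: $BerkeleyDB::Error";

my $db = BerkeleyDB::Hash->new(
    -Filename => 'data.db',
    -Env      => $env,
    -Flags    => DB_CREATE,
) or die "can't open db: $BerkeleyDB::Error";

$db->db_put( 'key', 'value' );    # no tie overhead, no external lock
my $val;
$db->db_get( 'key', $val );
print "$val\n";
```

The DB_File approach ties a hash to the file and routes every access through Perl's tie magic, and then needs its own lock file on top; the direct interface above skips both costs.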

In a similar vein, people are still recommending Cache::Cache or things based on IPC::ShareLite, when BerkeleyDB or Cache::FastMmap would be about ten times as fast. Hopefully my upcoming article based on my talk at ApacheCon will help point people in the right direction on that.
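For the caching case, a minimal Cache::FastMmap sketch looks like this (assuming the module is installed; the key and sizes here are just illustrative):

```perl
# Sketch: shared inter-process caching with Cache::FastMmap.
# Values are serialized with Storable by default, and the cache
# lives in an mmap'ed file shared by all Apache children.
use strict;
use warnings;
use Cache::FastMmap;

my $cache = Cache::FastMmap->new(
    cache_size  => '1m',     # total size of the mmap'ed cache
    expire_time => '10m',    # entries expire after ten minutes
);

$cache->set( user_42 => { name => 'perrin' } );
my $user = $cache->get('user_42');
print $user->{name}, "\n";
```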

The most recent and most surprising was a big performance bug in Maypole that I discovered while helping Jesse Sheidlower with some performance tuning. People who have used Template Toolkit with mod_perl in a high-performance environment should know that you have to keep the TT object around between requests so that you don't blow the cache and recompile the templates on every hit. Maypole was throwing away the TT object. I gave Jesse a very small patch to fix this and he reported speedups of 250-500% on his application.
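The fix amounts to building the Template object once per process and reusing it. A rough mod_perl 1.x-style sketch (paths and the package name are illustrative, not the actual Maypole patch):

```perl
# Sketch: keep the Template Toolkit object alive between requests
# so compiled templates stay in its in-memory cache.
package My::Handler;
use strict;
use warnings;
use Template;
use Apache::Constants qw(OK SERVER_ERROR);

my $tt;    # persists for the life of the Apache child process

sub handler {
    my $r = shift;

    # Created on the first request only. COMPILE_EXT additionally
    # writes compiled templates to disk, so even a fresh process
    # doesn't have to re-parse them.
    $tt ||= Template->new({
        INCLUDE_PATH => '/path/to/templates',
        COMPILE_EXT  => '.ttc',
    });

    $tt->process( 'page.tt', { title => 'hello' }, $r )
        or return SERVER_ERROR;
    return OK;
}

1;
```

Constructing a fresh Template object per request throws that cache away, which is exactly the bug described above.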

What's the lesson in all of this? Probably that being engaged on mailing lists like mod_perl and TT and sites like has a tangible payoff in terms of knowing what the best practices are. Maybe also that we need to repeat them more often.

Monday November 22, 2004
01:13 PM

latest LAMP/J2EE flamewar

On Slashdot today, there was an article about something called "Transaction Grids" being sold by a company called ActiveGrid. The gist of the article was that they want to replace J2EE clusters with LAMP clusters that use SOAP (a.k.a. the slowest thing anyone could come up with) to communicate.

I'm all for busting on bloated J2EE app servers, but it's a pretty silly article. Scaling a web app across multiple machines is basically a solved problem, at least at the load-balancing and failover level. The hard part is dealing with shared data, which this "grid" thing seems to ignore completely. You need to scale your app across a hundred cheap servers? Buy a LocalDirector off eBay for $100 and call it a day.

J2EE should be an easy target. It solves a problem almost no one has: the need to coordinate transactions across multiple databases or other data sources. This is why the ultimate defense that J2EE boosters usually cling to is "Oh yeah? Can your thing do two-phase commit?" In reality, the number of applications that truly need two-phase commit is so small that BEA would likely be out of business if only those people were buying their server.

It's too bad, because I actually like Java when it isn't mixed up in this stuff. I saw some really nice stuff at ApacheCon last week, where people were talking about Java web frameworks that actually contain useful ideas. None of them, however, are J2EE.

Wednesday October 27, 2004
03:49 PM

Open Source Content Management Conference (OSCOM4)

It looks like my submission is not being put up for some reason, so I'm posting it here. This is from October 6th.

Last week, I attended the Fourth Open Source Content Management Conference in Zurich, Switzerland. Perl was well-represented there.

The keynote was given by Roy Fielding, REST champion and author of the original libwww-perl (in Perl 4!).

I gave my talk about Krang, a new Perl CMS developed by a large magazine publisher.

Michael Kröll presented XIMS, a CMS developed by the University of Innsbruck. It is very XML-oriented and built on top of AxKit.

Harry Fuecks gave a talk about using PHP to publish a variety of document formats to the web, including POD, by post-processing the output of pod2html to get a standard appearance.

JSR-170, presented by David Nüscheler, is an attempt to create a standard API for CMS repositories. It's a Java project, but part of the project is defining a standard XML interchange format, which could be an interesting thing for Perl CMS projects to support if it catches on.

Shimon Rura and Josh Ain talked about Frassle, a system they developed (in Perl) for sharing and cross-referencing blog content.

Christian Jäger showed a project called LifeCMS. It includes a virtual filesystem written in Perl, which enables it to do automatic versioning and present the CMS repository as a standard filesystem.

There is more information available on the wiki.

Tuesday October 12, 2004
05:31 PM

old templating systems never die

I see from this story that someone is still using HTML::EP and wants to convince others to use it. I intentionally left HTML::EP out of my templating roundup because it was not actively maintained and didn't have the level of caching needed to compete with the others on performance. (Yes, I know, perfect code doesn't need new releases, but things change over time and no release for multiple years is not a good sign.) It used to be one of the most interesting ones, with a nice design based around HTML::Parser, but eventually got eclipsed by new features in other systems which are now much more widely used.

There are still people out there using ePerl too. We get a mail now and then on the mod_perl list asking for help because it won't even compile on newer Perl releases. I wonder if some of the others that seem like leaders today will fall by the wayside in a few years as their maintainers lose interest and their user-communities look elsewhere.

Thursday September 09, 2004
01:38 PM

how NOT to benchmark

So, Tim Bray wrote this hairy regex which he ran in Perl and in Java, and it ran faster in Java. He then posted this result to his blog. There is no sample data to try it yourself, it's a single test involving unusual unicode stuff, and he says that "they don’t produce quite the same results, with occasional variation around international characters", but this has not stopped Java fans from declaring that Java's regex engine is faster than Perl's.

Then he goes on to say that perl 5.8.3 gives a different result from perl 5.6.1, and that this is somehow a strike against Perl's suitability for "enterprise" work. To me, this sounds like a result of the many unicode and regex bugfixes that happened between those versions, but he appears to be saying that the inability to fix bugs in Java is a positive thing, since it means you can get the wrong answer consistently. He also doesn't say whether he was using the same version of Java in each test.
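A fairer micro-benchmark would at least use shared sample data and run each case long enough for the numbers to stabilize. Perl's core Benchmark module makes that easy; a sketch (the patterns here are just placeholders, not Bray's regex):

```perl
# Sketch: comparing two regexes over the same data with the core
# Benchmark module, running each for at least 3 CPU-seconds.
use strict;
use warnings;
use Benchmark qw(cmpthese);

my @lines = map { "line $_: some sample text\n" } 1 .. 1000;

cmpthese( -3, {
    anchored   => sub { my $n = grep { /^line \d+:/ } @lines },
    unanchored => sub { my $n = grep { /sample/ }     @lines },
});
```

Even this only measures one engine on one machine; a single run on undisclosed data, as in the blog post, measures essentially nothing.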

The problem with blog postings like this isn't so much that they exist as that many people who read them will have no context for evaluating the validity of the benchmark, and will just start spouting nonsense about Java's regex speed. Very frustrating.

Saturday July 03, 2004
06:26 PM

PHP and Java in "scalability" catfight

There is a bit of a flap which you may have seen on Slashdot recently over the moving of Friendster (aka "The Devil") from Java to PHP. The whole thing is summarized nicely by Chris Shiflett here.

The depressing thing about discussions like this is how much misinformation gets spread around. Here are a few examples:

  • The guy on Slashdot claiming that Apache 1 scales badly (that must be why nearly all of the largest websites in the world use it) and that a piddly little site like proves the necessity of Apache 2 and a threaded model.
  • The guy who misread the abstract of an ACM article and is now posting all over the place that Java outperformed Perl and PHP by 8 times on a benchmark in this article. (Totally wrong. The article says that static pages were 8 times as fast as dynamic pages, and that Java sometimes outperformed the others and sometimes was slower.)
  • The PHP proponent who thinks that George Schlossnagle's article about PHP and Oracle where he says to use the equivalent of the mod_perl reverse proxy setup shows some kind of application-specific thinking on the part of PHP developers. (It's just a standard workaround that anyone who uses PHP or mod_perl would use to keep from wasting database connections in apache processes that are serving images. Java servlet architectures do not normally have this problem, and thus don't need this workaround.)
  • The people claiming that Yahoo does everything in PHP. PHP just replaced some in-house proprietary templating systems at Yahoo. Yahoo still uses C/C++ and Perl for the bulk of their applications, with PHP just serving as the final delivery engine, like a fancy SSI.
  • Even Rasmus Lerdorf with his mantra about how PHP is a "shared nothing" architecture sounds a bit disingenuous. Any interesting dynamic application is going to share something, probably through a database, with possibly some additional sharing in files or shared memory or a database-like daemon such as memcached. The mechanisms for sharing this data on a cluster of PHP servers are essentially the same as the ones for doing it with Java servers.
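The reverse-proxy workaround mentioned above can be sketched as a minimal Apache config (ports and paths are illustrative). A lightweight frontend serves static files itself and only forwards dynamic requests to the heavy mod_perl (or PHP) backend, so image requests never tie up a process holding a database connection:

```apache
# Frontend httpd: cheap processes, no application code loaded.
Listen 80
DocumentRoot /var/www/static

# Serve images locally; proxy everything else to the backend
# application server listening on port 8080.
ProxyPass        /images/ !
ProxyPass        /        http://localhost:8080/
ProxyPassReverse /        http://localhost:8080/
```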

There is one thing about Java which can be a problem for scalability, and that is the proprietary nature of many of the server environments people run it on. If you buy a commercial J2EE server, the vendor will promise you incredibly scalable session data storage and database caching, but they will typically not tell you how they do it. When you run into trouble, you have no knobs to twiddle and no alternate options because the whole thing is closed to you. This is why I would recommend that people wanting to use a Java solution should look at the open source tools instead, especially the new lightweight ones.

The most unfortunate thing about this PHP/Java discussion is that people still aren't talking about the things that actually do make a system scalable, like ways of limiting the resources that each active user on the site takes up, and ways of handling overload gracefully.

Okay, I guess I had a little more to say about this than I thought. I'm going to stop here for now. Maybe some of this will filter into my OSCON talk.

Thursday April 08, 2004
10:47 AM

we're hiring

Most of you probably saw our posting on, but in case anyone missed it, here it is again. My company, Plus Three, is hiring in Washington, DC. At some point in the future we may be looking for someone in New York City as well.

We're a consulting company, doing a lot of work based on mod_perl. Our clients include the Democratic National Committee, Democratic GAIN, the United Federation of Teachers, and John Kerry for President. Here's some more info from our listing:

We need a senior Perl programmer to lead development in our Washington, D.C. office. We are building complex, high-traffic web sites using open source software, including GNU/Linux, Apache, MySQL, PostgreSQL and XML::Comma.

Required skills:

  • Demonstrable programming ability and experience with Perl, including OO design, effective use of the CPAN, and familiarity with modern web development tools like mod_perl and templating systems.
  • A broad knowledge of Internet technology, and a desire to find things out empirically rather than rely on conjecture (i.e. someone who is not afraid to telnet to port 80).
  • A desire to plan for the future with suitable programming abstractions, automated test suites, and version control, and in general to feel in control of your own destiny.
  • Expressive verbal and written communications skills. Direct interaction with clients will be involved.

You can send your resume to Thanks!