
All the Perl that's Practical to Extract and Report



I work for MessageLabs [messagelabs.com] in Toronto, ON, Canada. I write spam filters, MTA software, high performance network software, string matching algorithms, and other cool stuff mostly in Perl and C.

Journal of Matts (1087)

Thursday February 21, 2002
09:43 AM

DBD::SQLite benchmarking

[ #3012 ]

Benchmarking is fun.

I've been seeing how fast DBD::SQLite is (or can be). At the moment I've only done inserts. Initially I was finding it horribly slow, trying to do 100_000 inserts of random data into a table: I was managing about 1000 rows every 20 seconds, roughly 50 rows per second. Then I tried it with transactions, committing only every 1000 rows. Suddenly it was fast. Very fast. About 1100 rows per second. This seems to be because the transaction journal is used even in autocommit mode; SQLite just does an implied commit after every statement, so every single INSERT pays the full cost of a commit.
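My actual benchmark was Perl against DBD::SQLite, but the effect is a property of the SQLite engine itself, so here's a runnable sketch of the batching trick using Python's stdlib sqlite3 instead (the row count and batch size here are just illustrative, and an in-memory database is used so the sketch is self-contained; on a disk file the gap between the two paths is far more dramatic, because every implicit commit forces a journal sync):

```python
import random
import sqlite3
import string
import time

def timed_inserts(n, batch_size=None):
    """Insert n random rows; return (row count, rows per second).

    batch_size=None  -> implicit commit after every INSERT (the slow path)
    batch_size=1000  -> one explicit transaction per 1000 rows (the fast path)
    """
    # isolation_level=None puts the connection in SQLite's autocommit
    # mode, so transactions are controlled with explicit BEGIN/COMMIT.
    conn = sqlite3.connect(":memory:", isolation_level=None)
    conn.execute("CREATE TABLE t (id INTEGER, payload TEXT)")
    start = time.perf_counter()
    for i in range(n):
        if batch_size and i % batch_size == 0:
            conn.execute("BEGIN")
        payload = "".join(random.choices(string.ascii_letters, k=20))
        conn.execute("INSERT INTO t VALUES (?, ?)", (i, payload))
        if batch_size and (i + 1) % batch_size == 0:
            conn.execute("COMMIT")
    if conn.in_transaction:
        conn.execute("COMMIT")  # flush any partial final batch
    elapsed = time.perf_counter() - start
    count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
    conn.close()
    return count, n / elapsed

rows_slow, slow_rate = timed_inserts(5_000)                   # commit per statement
rows_fast, fast_rate = timed_inserts(5_000, batch_size=1000)  # batched commits
```

In Perl the equivalent move is setting AutoCommit off on the DBI handle and calling $dbh->commit every 1000 rows.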

I haven't benchmarked selects yet, but I'm not sure it's worth it: from within dbish, every time I try to do something against that table with 100_000 rows in it, the results come back instantaneously. Awesome.
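For the curious, here's a minimal sketch of that kind of check, again in stdlib Python rather than dbish (the table layout and key lookup are my own illustration, not the original schema): load 100_000 rows in one transaction, then do a point lookup.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, payload TEXT)")

# One transaction for the whole load, per the batching lesson above.
with conn:
    conn.executemany("INSERT INTO t VALUES (?, ?)",
                     ((i, f"row-{i}") for i in range(100_000)))

# A lookup on an INTEGER PRIMARY KEY goes straight to the rowid b-tree,
# which is why results feel instantaneous even at 100_000 rows.
row = conn.execute("SELECT payload FROM t WHERE id = ?", (99_999,)).fetchone()
```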
