Journal of Ovid (2709)

Wednesday June 18, 2008
04:55 AM

Aggressive Test Performance Ideas

[ #36711 ]

Our test suite now takes half an hour to run, and that's when no one else is here. If I'm fighting colleagues for resources, it's forty minutes. We need to get this back under control.

There are several interesting ideas on how to do this. Most of our time is spent in XML::XPath and in DBIx::Class, particularly in how the latter interacts with the database. We need to fix the namespaces in our XML before we can switch to XML::LibXML, but right now we have two interesting ideas for the database problems.
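(As an aside, most of the XML::LibXML switch comes down to that namespace fix, since XML::LibXML wants namespaces registered explicitly before namespaced XPath queries will match anything. A minimal sketch of what that looks like; the prefix, URI, and element names here are made up:

use XML::LibXML;
use XML::LibXML::XPathContext;

# Parse a (made-up) namespaced document and register a prefix for its
# namespace so XPath queries can find the elements.
my $xml = '<schedule xmlns="http://example.com/pips/ns"><episode pid="ep001"/></schedule>';
my $doc = XML::LibXML->new->parse_string($xml);

my $xpc = XML::LibXML::XPathContext->new($doc);
$xpc->registerNs( pips => 'http://example.com/pips/ns' );

my @episodes = $xpc->findnodes('//pips:episode');

But that's for later; back to the database.)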

The first is creating a pool of test databases. We already have these on a per-user, per-branch basis: if I'm working on our 'segments' branch, a pips3_test_poec01_segments database is created just for me. What I want to do, as a first pass, is have pips3_test_poec01_segments_01 and pips3_test_poec01_segments_02. While tests run against one, the other is rebuilt in the background, so the tests themselves never have to wait for a rebuild. Some tests might be fast enough to finish before the rebuild is done, but because they're database tests, they usually won't be.
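Roughly, the rotation could look like the sketch below. This is only a sketch under assumptions: the "ready" marker files and the rebuild_test_db script are made up, and the real version would live in our test bootstrap code.

use strict;
use warnings;

my @pool = map { "pips3_test_poec01_segments_0$_" } 1 .. 2;

sub next_test_database {
    # Use whichever copy has finished rebuilding.
    my ($ready) = grep { -e "/tmp/$_.ready" } @pool;
    die "No rebuilt test database available yet\n" unless $ready;
    unlink "/tmp/$ready.ready";

    # Rebuild the other copy in the background while the tests run.
    my ($other) = grep { $_ ne $ready } @pool;
    my $pid = fork();
    die "Can't fork: $!" unless defined $pid;
    unless ($pid) {
        if ( system( 'rebuild_test_db', $other ) == 0 ) {    # hypothetical script
            open my $fh, '>', "/tmp/$other.ready" or die $!;
        }
        exit 0;
    }
    return $ready;
}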

The second idea is interesting. We have test fixtures where the code can look something like this:

my $ce = $class->change_event_builder($schema);

my $service = $schema->resultset('Service')->find( {
    api_public_name => 'bbc_one_london',
});

my $ondemand_service = $schema->resultset('Service')->find( {
    api_public_name => 'iplayer_streaming',
});

my $pip_rs = $schema->resultset('Pip');

# Brand:           Waking the Dead
my $wtd = $pip_rs->create_brand( {
    title => 'Waking the Dead',
    pid   => 'brwtd',
    crid  => 'crid://bbc.co.uk/b/10366',
});

$ce->add_change_event($wtd);

# Series:       Series 5
my $s5 = $pip_rs->create_series(
    {
        title => 'Series 5',
        pid   => 'seri5',
        crid  => 'crid://bbc.co.uk/b/10360',
    }
);

# lots more stuff adding episodes, versions, credits, and so on ...

A test can load a fixture with something like (fudging here):

$fixture->load($fixture_name);

And then the test can proceed on its merry way.

What if we cache the SQL for that? We could compute an MD5 hash of each fixture file: if the hash changes, we rerun the fixture code and cache the SQL it generates; otherwise, we just run the cached SQL directly to add the fixture data. This raises the obvious question of "how do we capture this SQL?"
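One possible answer, sketched under assumptions: DBIx::Class can already write every statement it issues to a filehandle via its storage-level tracing (the same mechanism the DBIC_TRACE environment variable uses), so we could key a cache file on the MD5 of the fixture file and capture the trace on a miss. The run_fixture() and replay_sql() helpers below are made up, and the trace output interleaves bind values, so replaying it would need a little parsing first.

use Digest::MD5 qw(md5_hex);

sub load_fixture_cached {
    my ( $schema, $fixture_file ) = @_;

    # Key the cache on the fixture file's contents.
    open my $in, '<', $fixture_file or die "Can't read $fixture_file: $!";
    my $digest = md5_hex( do { local $/; <$in> } );
    my $cache  = "$fixture_file.$digest.sql";

    if ( -e $cache ) {
        # Cache hit: replay the SQL captured on a previous run.
        replay_sql( $schema->storage->dbh, $cache );    # hypothetical helper
    }
    else {
        # Cache miss: run the fixture with tracing on, capturing every
        # statement DBIx::Class issues into the cache file.
        open my $log, '>', $cache or die "Can't write $cache: $!";
        $schema->storage->debugfh($log);
        $schema->storage->debug(1);
        run_fixture( $schema, $fixture_file );          # hypothetical helper
        $schema->storage->debug(0);
    }
    return;
}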

The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • ...cache a binary snapshot of the MySQL data and bring up a local instance of MySQL configured to use the snapshot? It shouldn't be too hard to create a suitable MySQL config on the fly and MySQL should start up in just a couple of seconds. When you're finished with it just throw away the data.
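
A rough sketch of what that suggestion could look like, with made-up paths for the snapshot and scratch directory:

use File::Copy::Recursive qw(dircopy);

# Copy a pristine snapshot of the MySQL data directory to a scratch area.
my $datadir = "/tmp/test_mysql_$$";
dircopy( '/var/snapshots/pips3_test_data', $datadir )
    or die "Can't copy snapshot: $!";

# Start a private mysqld on its own socket, with networking disabled.
my $pid = fork();
die "Can't fork: $!" unless defined $pid;
unless ($pid) {
    exec 'mysqld',
        "--datadir=$datadir",
        "--socket=$datadir/mysql.sock",
        '--skip-networking';
}

# ... wait for the socket to appear, point DBD::mysql at it via
# mysql_socket, and when the tests finish, kill $pid and delete $datadir.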