
All the Perl that's Practical to Extract and Report



Leader of [] and a CPAN author. Co-organised YAPC::Europe in 2006 and the 2009 QA Hackathon; responsible for the YAPC Conference Surveys and QA Hackathon websites. Also the current caretaker for the CPAN Testers websites and data stores.

If you really want to find out more, buy me a Guinness ;)

Memoirs of a Roadie
CPAN Testers Reports
YAPC Conference Surveys
QA Hackathon

Journal of barbie (2653)

Tuesday November 01, 2005
09:46 AM

CPAN Testers Statistics Update

[ #27405 ]

My CPAN Testers Statistics site got a facelift and the latest monthly update.

Following suggestions to break the original page up, and wanting a better-looking set of pages, I nabbed the CSS file from Thomas's CPANTS site and set about redesigning the content. Any further suggestions for improvements, or mail address fixes, will be gratefully received.

Enjoy :)

  • never seen those pages before. :-)

    I was in a discussion with Xantus a couple of nights ago about (currently has the google-suggested version of the search). He has a vision of bringing together all of the various subsites into one big cohesive unit; the rankings, the forums, the annotations, the test reports, etc., of which your stuff would be one of the units. :-)

    I think that's a tall, tall order and my mind boggles at where to start. I know CPANTS has the data in a downloadable SQLite form.
    • The data is in an SQLite database, which is now over 25MB. I was originally going to use the database that WWW::CPAN::Testers::Generator creates, but it doesn't contain all the information I needed. In addition, I have to manually verify the results, as some reports are bogus. I have been fine-tuning the script to ignore the bogus reports, and for the latest run there weren't any entries I had to delete from the database, but I'm a little way off running this completely automated. I'll add a link to the code a
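
    The clean-up step described above — dropping bogus reports from an SQLite store before generating statistics — might be sketched along these lines. This is only an illustration: the `reports` table, its columns, and the set of accepted grades are hypothetical, not the real cpanstats schema.

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # In-memory SQLite database standing in for the real report store.
    my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                           { RaiseError => 1 });

    # Hypothetical schema: one row per test report.
    $dbh->do('CREATE TABLE reports (id INTEGER PRIMARY KEY, state TEXT, dist TEXT)');

    my $ins = $dbh->prepare('INSERT INTO reports (state, dist) VALUES (?, ?)');
    $ins->execute(@$_) for
        ['pass',    'Foo-Bar'],
        ['fail',    'Foo-Bar'],
        ['garbled', 'Foo-Bar'];   # a bogus state that should be rejected

    # Keep only reports whose state is one of the recognised grades;
    # anything else is treated as bogus and removed.
    $dbh->do(q{DELETE FROM reports
               WHERE state NOT IN ('pass','fail','na','unknown')});

    my ($count) = $dbh->selectrow_array('SELECT COUNT(*) FROM reports');
    print "$count valid reports\n";
    ```

    A real run would of course read the downloaded database file rather than an in-memory one, and the rules for spotting bogus reports would be richer than a single state check.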