NOTE: use Perl; is on undef hiatus. You can read content, but you can't post it. More info will be forthcoming forthcomingly.

All the Perl that's Practical to Extract and Report


Journal of gnat (29)

Saturday August 17, 2002
12:09 AM

Gotta love that LWP

[ #7141 ]
I spent a few hours tonight knocking up code to milk the Chinese mp3 server that Slashdot pointed to. So while I sleep tonight, my machines will be fetching me hot mp3 action. Not that I'm particularly fond of Brandy's latest album, but it was fun to write the code ...


The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • by darobin (1316) on 2002.08.17 11:05 (#11890) Homepage Journal

    So, it's sure nice to know you had fun with LWP, but are you going to keep that script to yourself, or is there any chance you'll share it with the other nice kids so they don't have to rewrite it? :-)


    -- Robin Berjon []

  • Blimey, glad I grabbed that Oasis album when I did (grin). I just visited the site and was greeted by a blank page with the following line:

    Lmp3 will be closed and will never come back!

    • The site with the MP3s is still at:
    • Lmp3 will be closed and will never come back!
      Oh, bloody hell. <> has been taken offline within 24 hours as well. It looks like the terrorists are winning... yes, I'm talking about the RIAA and the whole DMCA mob...

      See Lawrence Lessig's talk [], link currently at's home page. Maybe a 5 percent loss! Meanwhile, this study here [] seems to indicate that people who regularly download and burn MP3s buy as many CDs as everybody else. Read more here [] and here [].
    • The good news is that their FTP site is still live. The bad news is that it doesn't give directory listings. The good news is that I snarfed some paths and filenames from the site while it was live.

      I'll give out the code once I figure out how to detect an aborted get(). It seems like every second to third get is bungholed, and I'm not sure why.
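      One way to spot a dud transfer after the fact, sketched below. This is a hypothetical check, not gnat's actual script: the function name and placeholder URL are made up. It relies on two real LWP behaviours: LWP::UserAgent sets a "Client-Aborted" header on the response when the transfer dies midway, and a truncated body will come up short of the server's advertised Content-Length.

      ```perl
      # Hedged sketch (not gnat's code): decide whether an LWP download
      # was cut short, given the HTTP::Response that $ua->get() returns.
      use strict;
      use warnings;
      use HTTP::Response;

      sub looks_truncated {
          my ($res) = @_;
          # LWP sets the Client-Aborted header when content collection dies.
          return 1 if $res->header('Client-Aborted');
          my $expected = $res->content_length;   # server's Content-Length, if any
          return 0 unless defined $expected;     # no header: can't tell, assume ok
          return length( $res->content ) != $expected;
      }

      # Typical use (placeholder URL):
      #   my $res = LWP::UserAgent->new->get('http://example.com/file.mp3');
      #   retry_download() if looks_truncated($res);
      ```

      A retry loop around this check would cover the every-second-or-third-get failures, whatever their cause.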


      • Re:Gone (Score:2, Informative)

        Perhaps you could use the $ua->request($req, $callback) variant to detect aborted downloads. I use something like this:

        my ($bytes_read, $total_size);
        my $res = $ua->request(
            HTTP::Request->new( GET => $url ),
            sub {
                my ( $chunk, $response ) = @_;
                $bytes_read  += length($chunk);
                $total_size ||= $response->content_length;
                print $chunk;    # or write it to a filehandle instead
            },
        );

        if ( $bytes_read != $total_size ) {
            warn "aborted: got $bytes_read of $total_size bytes\n";
        }

        -- briac