It took me far longer than I thought it would to come up with this code that grabs a web page and stuffs all the page's hyperlinks into a text file.
#!/usr/bin/perl
# usage: perl linkextractor.pl http://www.example.com/ > output.txt
use strict;
use warnings;
use WWW::Mechanize;

my $url = shift or die "usage: perl linkextractor.pl <url>\n";
my $mech = WWW::Mechanize->new();
$mech->get($url);  # autocheck is on by default, so this dies if the request fails
# report status on STDERR so it doesn't end up in the redirected output file
print STDERR $mech->status . " OK - URL request succeeded\n";
print STDOUT ($_->url, "\n") foreach $mech->links;
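One caveat: `$_->url` returns each href exactly as it appears in the markup, so relative links come out relative. `WWW::Mechanize`'s link objects also offer `url_abs`, which resolves each link against the page's base URL using the `URI` module under the hood. Here's a minimal sketch of that same resolution done with `URI` directly; the example path `/about.html` is just an illustration:

```perl
use strict;
use warnings;
use URI;

# Resolve a relative href against the page it was found on --
# the same resolution that url_abs performs for you.
my $abs = URI->new_abs('/about.html', 'http://www.example.com/');
print $abs, "\n";  # http://www.example.com/about.html
```

If you want the output file to contain only absolute URLs, swapping `$_->url` for `$_->url_abs` in the script above does the trick.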