
All the Perl that's Practical to Extract and Report


scot (6695)
http://redcloudresearch.com/

Perl hacker
"apprentice" sysadmin for an ASP running a Sparc/Solaris/Java/Oracle stack

Journal of scot (6695)

Wednesday June 13, 2007
04:47 PM

Automated downloading of NOAA radar images

Inspired by Develop your own weather maps and alerts with Perl and GD, I have put together a script which downloads the base radar image from a specific site at a specified interval.

use strict;
use warnings;
use LWP::Simple;
use POSIX qw(strftime);

my $refresh = 1800;    # seconds between downloads
my $base    = "http://radar.weather.gov/ridge/RadarImg/N0R/MTX_N0R_0.gif";

while (1) {
    my $timestring   = strftime( "%Y%m%d_%H%M%S", gmtime() );
    my $savefilename = "SLCRadar_" . $timestring . ".gif";
    my $status       = getstore( $base, $savefilename );
    print "$status\n";
    sleep $refresh;
}

See my Standard Code Disclaimer

Tuesday May 29, 2007
05:45 PM

Automated extraction of NOAA weather satellite images

NOAA updates its weather satellite images on the GOES website every 30 minutes, with images in infrared, visible light, and water vapor wavelengths for both the eastern and western halves of the US. The script below grabs those images and saves them to local files with timestamps, which could prove useful for doing your own weather forecasting. I have updated the script to use the strftime function from the POSIX module, so the filenames are now very precise (for example, "eastconusir20070530_074348.jpg"), and to timestamp the images in Greenwich Mean Time. Thanks to graff and jasonk over at Perlmonks for their feedback.

use strict;
use warnings;
use LWP::Simple;
use POSIX qw(strftime);

my %images = (
    "eastconusir"  => "http://www.goes.noaa.gov/GIFS/ECIR.JPG",
    "eastconusvis" => "http://www.goes.noaa.gov/GIFS/ECVS.JPG",
    "eastconuswv"  => "http://www.goes.noaa.gov/GIFS/ECWV.JPG",
    "westconusir"  => "http://www.goes.noaa.gov/GIFS/WCIR.JPG",
    "westconusvis" => "http://www.goes.noaa.gov/GIFS/WCVS.JPG",
    "westconuswv"  => "http://www.goes.noaa.gov/GIFS/WCWV.JPG",
);

foreach my $key ( keys %images ) {
    print "$key\n";

    # head() returns (content type, length, modified time, expires, server);
    # element [2] is the Last-Modified time as an epoch value.
    my $epoch = ( head( $images{$key} ) )[2] || next;
    my $timestring = strftime( "%Y%m%d_%H%M%S", gmtime($epoch) );
    print "$timestring\n";
    print "$images{$key}\n";
    my $status = getstore( $images{$key}, $key . $timestring . ".jpg" );
    print "$status\n";
}

Monday April 23, 2007
01:42 PM

Sun's Java Web Server superior to Apache?

Serverwatch has posted a very positive review of Sun's Java Web Server; my summary of the review is here.
Wednesday April 04, 2007
05:51 PM

Method for extracting urls for filetypes + auto-retrieval

This post reminded me of a problem I have been trying to solve: extracting the URLs that point to a specific filetype (say, a gz archive) from a web page. It turns out that CPAN has a page containing an alphabetical list of all modules, with a hyperlink to the tar.gz file of each module.

The following code (given appropriate substitution of the command-line input, e.g. gz instead of pdf) will create a text file with all of the URLs for the tar.gz files:

use strict;
use warnings;
use LWP::Simple;
use HTML::SimpleLinkExtor;

# usage: perl getfileoftype.pl http://www.example.com pdf > urllist.txt
my $url         = shift;
my $filetype    = shift;
my $filetypelen = length($filetype);
my $offset      = -$filetypelen;

my $fileget = getstore( $url, "tempfile.html" );
my $extor   = HTML::SimpleLinkExtor->new();
$extor->parse_file("tempfile.html");

my @a_hrefs = $extor->a;
for my $element (@a_hrefs) {
    # compare the end of each href against the requested filetype
    my $suffix = substr( $element, $offset, $filetypelen );
    if ( $suffix =~ m/$filetype/ ) {
        print "$element\n";
    }
}

Once you have that, you can then use the following code to automatically download all of the modules if you so choose, or whatever subset of the modules you wish to extract from the text file created by the above code:

use strict;
use warnings;
use LWP::Simple;
use File::Basename;

open( my $urls, '<', 'urllist.txt' ) or die "File open failure: $!";
while ( my $downloadurl = <$urls> ) {
    chomp $downloadurl;    # strip the trailing newline before parsing
    my ( $name, $path, $suffix ) = fileparse($downloadurl);
    print "$downloadurl\n";
    my $savefilename = $name . $suffix;
    print "$savefilename\n";
    my $status = getstore( $downloadurl, $savefilename );
    print "$status\n";
}
close $urls;

Both pieces of code work nicely on my WinXP box. Yes, I know that "tempfile.html" gets clobbered on each run, but I was just glad to get this code working, and WinXP doesn't seem to care. In any case, one can now generate a local repository of modules. Suggestions for improving my code are welcome!
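On the clobbering point: one standard way to avoid reusing a fixed scratch filename is File::Temp, which is not part of the scripts above but would slot into the first one easily. A minimal sketch:

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Create a throwaway scratch file that is removed automatically at exit,
# instead of reusing a fixed "tempfile.html" that gets clobbered.
my ( $fh, $tempname ) = tempfile( SUFFIX => '.html', UNLINK => 1 );
print {$fh} "<html><body>placeholder</body></html>\n";
close $fh;

print "scratch file: $tempname\n";
```

The name returned by tempfile() could then be passed to getstore() and parse_file() in place of the hard-coded "tempfile.html".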

Tuesday April 03, 2007
02:11 PM

use.perl posts hit google quickly

use.perl posts frequently make it into the top hits of a google search.
Tuesday February 13, 2007
11:44 AM

Remote computing with a Linux app server farm

IBM Developerworks writes up a slick setup at the University of California, where a couple of computer labs replaced Windows XP boxes with ITX mini-boxes and got positive feedback from the end users, plus significant cost savings.
Wednesday January 24, 2007
04:31 PM

Script to calculate digits of pi

Introducing my blog on Perl, Red Cloud Research, with this entry.
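The entry only links out to the blog, so the script itself isn't inlined here. As a sketch of one standard approach (not the code from the linked post), here is Machin's formula, pi = 16*arctan(1/5) - 4*arctan(1/239), computed with scaled integer arithmetic via Math::BigInt; the sub names pi_digits and arctan_inv are my own:

```perl
use strict;
use warnings;
use Math::BigInt;

# arctan(1/$x), scaled by 10**$scale, via the Gregory series
# arctan(1/x) = 1/x - 1/(3x^3) + 1/(5x^5) - ...
sub arctan_inv {
    my ( $x, $scale ) = @_;
    my $term = Math::BigInt->new(10)->bpow($scale)->bdiv($x);
    my $sum  = $term->copy;
    my $xsq  = $x * $x;
    my ( $n, $sign ) = ( 1, -1 );
    until ( $term->is_zero ) {
        $term->bdiv($xsq);                # next power of 1/x^2
        $n += 2;
        my $t = $term->copy->bdiv($n);
        $sign > 0 ? $sum->badd($t) : $sum->bsub($t);
        $sign = -$sign;
    }
    return $sum;
}

# pi = 16*arctan(1/5) - 4*arctan(1/239)   (Machin's formula)
sub pi_digits {
    my $digits = shift;
    my $guard  = 10;                      # extra working digits
    my $scale  = $digits + $guard;
    my $pi     = arctan_inv( 5, $scale )->bmul(16)
        ->bsub( arctan_inv( 239, $scale )->bmul(4) );
    $pi->bdiv( Math::BigInt->new(10)->bpow($guard) );    # drop guard digits
    my $s = $pi->bstr;
    return substr( $s, 0, 1 ) . '.' . substr( $s, 1 );
}

print pi_digits( shift || 30 ), "\n";
```

Run as `perl pi.pl 50` for 50 digits; the digits are truncated, not rounded.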
Thursday January 11, 2007
04:11 PM

Dev language poll

From here: Developers Embrace Java, Drop Visual Basic

By Gregg Keizer

Use of Visual Basic has dropped 35% since the spring, says a poll of more than 430 North American developers done by research company Evans Data Corp. Developers have abandoned Microsoft's Visual Basic in droves during the last six months, and they're using Java more than any other development language, according to a recently published survey.
"Microsoft has dominated languages since the early 90s, but we are seeing much more parity now," said John Andrews, president of Evans Data, in a statement. "The use of scripting languages, as well as Java, appears to have limited VB's future market potential."
Developers aren't only leaving Visual Basic, they're also less likely to work in the Visual Basic.Net environment; VB.Net use is down 26%, the survey shows.
Java now holds the top spot, with 45% of the polled developers saying they used Java during some part of the last six months. C/C++, meanwhile, was used by 40% of the coders, and C# was used by 32%.
The survey also indicates that use of Ajax, short for Asynchronous JavaScript and XML, is up 10% since the spring 2006 poll, with 28% of the developers involved in Ajax-style Web interface development at some point during the last six months.
Tuesday January 09, 2007
01:48 PM

"A Java Front-End To CGI/Perl "

Found this JavaCGI Bridge presentation which, based on a cursory read, looks like an alternative to AJAX. The gist is this:
JavaCGIBridge Solution

        CGI/Perl advantages

                * CGI/Perl can provide database connectivity more easily
                * Application related "business-rules" are centrally maintained and executed in CGI/Perl scripts
                * Browsers that don't support Java can still use the application
                * Leverage existing "legacy" CGI/Perl code
                * Many Internet Service Providers only allow CGI/Perl access (no daemons)

        Java advantages

                * Java applet becomes a truly thin client
                    - only requires GUI code and GUI logic
                    - JavaCGIBridge class adds ~5k overhead
                    - No need to learn the entire Java class library.
                    - You can get by with AWT (Forms) and some utility classes
                * Java applet can maintain state between all CGI script calls
                * Java applet can cache retrieved data
                    - eliminates the need to constantly get redundant data from the server

Check the date of the presentation: 1997 :) !!
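To make the pattern concrete: in this model the applet fetches a CGI/Perl URL and parses a delimited payload, so the server side can be an ordinary script that prints rows. A hypothetical sketch of such an endpoint (not code from the presentation; the format_rows sub and the inventory data are mine):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Format result rows as the tab/newline-delimited text a thin client
# (here, the JavaCGIBridge-style applet) would fetch and parse.
sub format_rows {
    my @rows = @_;
    return join '', map { join( "\t", @$_ ) . "\n" } @rows;
}

# The "business rules" stay server-side; the client only renders rows.
my @inventory = ( [ 1, 'widget', '9.99' ], [ 2, 'gadget', '14.50' ] );

print "Content-type: text/plain\n\n";
print format_rows(@inventory);
```

The applet keeps only GUI logic, matching the "truly thin client" bullet above: every data request is just another HTTP hit on a script like this.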

Tuesday January 02, 2007
04:35 PM

Simple extraction of links from web page

It took me far longer than I thought it would to come up with this code that grabs a web page and stuffs all the page's hyperlinks into a text file.
 
Updated...


use strict;
use warnings;
use WWW::Mechanize;

# usage: perl linkextractor.pl http://www.example.com/ > output.txt
my $url  = shift;
my $mech = WWW::Mechanize->new();
$mech->get($url);

# report status on STDERR so it doesn't end up in the redirected link list
print STDERR $mech->status(), " - URL request succeeded\n" if $mech->success();
print $_->url, "\n" foreach $mech->links;