NOTE: use Perl; is on undef hiatus. You can read content, but you can't post it. More info will be forthcoming forthcomingly.

All the Perl that's Practical to Extract and Report


Perl: It's Not Just for Breakfast

posted by chip on 2000.07.05 14:01
eann writes "I've been given the opportunity to put on my Advocacy hat and write a short article about how useful Perl is for things other than web-related programs (CGI, mod_perl, etc.) and automating system administration tasks. I doubt it'll actually convince any of the PHBs around here, but if they keep seeing the word 'Perl' enough we might be able to make them believe it's a buzzword. My trouble is, other than some data conversions, I've never really used Perl for anything else. So, I'm asking for examples of Perl success stories that don't involve web or sysadmin stuff."
  • I use Perl to control the Windows TCP/IP stack (some changes in INIs and the Registry, plus file management). And this is only a small sample...
    --
    Alexander S. Tereschenko - tereschenko@bigfoot.com http://perl.org.ru
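    As a hedged illustration of the INI side of this: Registry changes would go through the real Win32::TieRegistry module, but the helper below (`set_ini_value`, with made-up section and key names) sketches the file-based half in portable Perl.

```perl
#!/usr/bin/perl
# Hypothetical sketch of editing a Windows-style INI file in place.
# set_ini_value and the TCPIP section/keys are invented for the example;
# real Registry work would use Win32::TieRegistry instead.
use strict;
use warnings;

sub set_ini_value {
    my ($text, $section, $key, $value) = @_;
    my ($out, $in_sect) = ('', 0);
    for my $line (split /\n/, $text) {
        # track whether we are inside the target [section]
        $in_sect = (lc($1) eq lc($section)) if $line =~ /^\[([^\]]+)\]/;
        # rewrite "key=..." only inside that section
        $line =~ s/^(\Q$key\E\s*=\s*).*/$1$value/i if $in_sect;
        $out .= "$line\n";
    }
    return $out;
}

my $ini = "[TCPIP]\nEnableDHCP=0\nHostname=old-box\n";
print set_ini_value($ini, 'TCPIP', 'EnableDHCP', 1);
```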

  • > I've been given the opportunity to put on my Advocacy hat and write a
    > short article about how useful Perl is for things other than web-related
    > programs (CGI, mod_perl, etc.) and automating system administration tasks.

    Well, don't forget data munging. :-)

    A lot of stuff that looks cute and simple in Perl is much harder
    to do in C/C++/Java. I've written data mungers for big data in both
    Perl and C, and I don't recommend C for this kind of work except in
    the most extreme circumstances.
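    As a sketch of the kind of munging meant here (the pipe-delimited record format and field positions are invented for the example), summing a numeric field per key is a few lines of Perl:

```perl
#!/usr/bin/perl
# Hypothetical munger: total an amount field per key from
# pipe-delimited records shaped like "key|date|amount".
use strict;
use warnings;

sub sum_by_key {
    my @lines = @_;
    my %total;
    for (@lines) {
        my ($key, $amount) = (split /\|/)[0, 2];   # pick fields 0 and 2
        $total{$key} += $amount;
    }
    return \%total;
}

my $t = sum_by_key(
    "acme|2000-07-01|10",
    "acme|2000-07-02|5",
    "beta|2000-07-01|7",
);
print "$_ => $t->{$_}\n" for sort keys %$t;   # acme => 15, beta => 7
```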
  • I've mostly used Perl to automate tedious tasks.

    I'm an astronomer and, in my last job, I had to process numerous large datacubes as part of a survey. This required that the processing of the cubes be as similar as possible, and there needed to be a log of the processing in case we needed to trace back problems. The (UNIX-based) software was prompt-driven, and the data had to pass through several programs before reaching its final state. I used perl to drive the individual programs and tie several programs…
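    Driving prompt-driven programs like that is often done with the CPAN Expect module; a minimal core-only sketch with IPC::Open2 (using a stand-in child process rather than real astronomy software) might look like this:

```perl
#!/usr/bin/perl
# Sketch of scripting a prompt-driven program with core IPC::Open2.
# The "instrument" here is a stand-in child that answers every
# command with "OK: <command>"; real interactive tools are often
# driven with the CPAN Expect module instead.
use strict;
use warnings;
use IPC::Open2;

sub drive {
    my @cmds = @_;
    my $pid  = open2(my $out, my $in, $^X, '-e',
                     '$| = 1; while (<STDIN>) { print "OK: $_" }');
    my @log;
    for my $cmd (@cmds) {
        print $in "$cmd\n";          # issue the command at the "prompt"
        chomp(my $reply = <$out>);   # wait for the program's response
        push @log, $reply;           # keep a processing log
    }
    close $in;
    waitpid $pid, 0;
    return @log;
}

print "$_\n" for drive("load cube1", "reduce cube1");
```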
  • I've been programming in Perl for about five years, and I've used it for quick hacks, writing distributed publishing tools, generating reports, controlling jobs on remote hosts, scraping information from web pages, and sending emails, among other things like complex CGI and FastCGI applications.

    I think Perl is sufficiently general-purpose to satisfy almost any application need. The real question is whether it's the best tool for your specific need.

  • A couple of years ago IEEE Software magazine carried an article on Perl. There was a sidebar with a list of commercial Perl uses:

    Perl was invented in 1987, when Larry Wall, then at Unisys, needed to generate reports from their configuration management system. Perl continues to be used in CM settings; US West uses a software configuration management system written entirely in Perl.

    The Human Genome Project, whose goal is to sequence the entire human genome, uses Perl programs to manage an…

  • The previous post already mentioned PDL in passing. PDL (the Perl Data Language) is a full-fledged array-processing language module (a la IDL or Matlab) that opens up the world of scientific data analysis to the Perl hacker. Find more details at http://pdl.perl.org.
  • Perl makes it easy to write graphical clients with Perl/Tk or Gtk... It is also awfully handy for the server that sits behind them. A recent example from my own experience is a system that receives data from a large number of users (non-web-based) and queues up jobs for various sorts of processing. It computes summary statistics and reports them to querying Perl/Tk network-monitoring applications.
  • We only use Perl for testing our software. We have built huge perl libraries for connecting to various devices (routers, unix machines, etc). Our tests can be developed quickly (since we look for "string" output!) and adapted to any new "devices" that come along.
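    A toy version of that kind of string-matching test (the helper name and the stand-in "device" below are invented for illustration) could look like this:

```perl
#!/usr/bin/perl
# Hypothetical test-harness sketch in the same spirit: run a command,
# capture its text output, and pass/fail on expected patterns.
use strict;
use warnings;

sub check_output {
    my ($cmd, @expect) = @_;
    my $output  = `$cmd`;                          # capture "device" text
    my @missing = grep { $output !~ $_ } @expect;  # patterns not seen
    return (@missing == 0, @missing);
}

# Stand-in "device": a child perl that prints a status banner.
my $device = $^X . q{ -e 'print "link up\nerrors: 0\n"'};
my ($ok, @missing) = check_output($device, qr/link up/, qr/errors: 0/);
print $ok ? "PASS\n" : "FAIL: @missing\n";
```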
  • I've adopted the policy that if I do something twice, then it's a candidate for scripting. Consequently I end up building a huge armoury of tools using Perl.

    These include simple scripts for performing basic tasks consisting of a few commands, plus some elementary error checking. At the other end of the scale, at my last position I built the complete development/test/release mechanism in Perl, tying together all the development tools and providing generation of web logs, auditing, etc. Once the system was r…

  • I work for an ISP, and most of my work revolves around the billing system, which is largely written in Informix 4GL. However, all the bits where it needs to interact with other systems, and where it needs to do fancy output, are written in Perl; all reports (since I have been here) are written in Perl, and of course all of the stuff where the system has to interact with the web.

    We also use Radiator [open.com.au], which is an authentication server written in Perl.

    /J\

  • We use a set of perl scripts to 'skim-read' students' 'English 101' essays and point up possible areas for teacher attention. They don't actually replace the teachers' reading and commenting on the texts (honest!), but they give them some useful information, such as sentence length, use of key grammatical structures, and type-token ratio (a measure of language variety).

    Perl's ability to slurp whole files through a sequence of regexes and spit out useful comments has proved very useful for harried teachers…
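    One of those measures, the type-token ratio, is easy to sketch. The helper below is hypothetical, not the poster's actual code: a naive word tokenizer feeding a distinct-words-over-total-words ratio and an average sentence length.

```perl
#!/usr/bin/perl
# Hypothetical essay_stats helper: naive tokenizer, type-token ratio
# (distinct words / total words), and average sentence length.
use strict;
use warnings;

sub essay_stats {
    my ($text) = @_;
    my @tokens = map { lc } $text =~ /\b[a-z']+\b/gi;  # naive tokenizer
    my %types;
    $types{$_}++ for @tokens;
    my @sentences = grep { /\S/ } split /[.!?]+/, $text;
    return {
        tokens           => scalar @tokens,
        ttr              => @tokens    ? keys(%types) / @tokens    : 0,
        avg_sentence_len => @sentences ? @tokens / @sentences : 0,
    };
}

my $s = essay_stats("The cat sat. The cat ran away.");
printf "tokens=%d ttr=%.2f avg=%.1f\n",
    $s->{tokens}, $s->{ttr}, $s->{avg_sentence_len};
```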

  • Our product is sort of an ATM router with various interfaces, including T1, E1, T3, and OC3. Our entire automated testing system is written in Perl. With this system, we can control the product via SNMP, as well as controlling various physical devices... test sets, data switches, power systems, even an oven!

    Of course, the interface to the system is a Perl CGI program, which is integrated with our Perl CGI test case tracking system and our Perl CGI requirements system.

  • Two major uses we have are:
    1. Maintaining cross-references in a pile of requirements documents, specifications, etc. We check cross-references to other documents and automatically create back-references. And since the whole lot is guaranteed to be accurate, designers tend to keep the documents up to date.
    2. Code generation from an XML repository. Perl is used to translate program templates into Perl scripts which use the repository data to write programs of arbitrary complexity. This means that system layers s…
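    The cross-reference check in point 1 can be sketched like this (the [id:...]/[ref:...] markup is invented for the example; the real documents presumably use their own conventions):

```perl
#!/usr/bin/perl
# Minimal cross-reference checker sketch: collect every anchor
# like [id:FOO] across the documents, then flag [ref:FOO] uses
# that point nowhere.
use strict;
use warnings;

sub dangling_refs {
    my @docs = @_;
    my (%anchor, @refs);
    for my $doc (@docs) {
        $anchor{$_} = 1 for $doc =~ /\[id:(\w+)\]/g;   # definitions
        push @refs, $doc =~ /\[ref:(\w+)\]/g;          # uses
    }
    return grep { !$anchor{$_} } @refs;                # uses with no home
}

my @bad = dangling_refs(
    "[id:REQ1] The system shall log in. See [ref:SPEC2].",
    "[id:SPEC1] Login spec, derived from [ref:REQ1].",
);
print "dangling: @bad\n";   # dangling: SPEC2
```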
  • The company I work for can't afford to pay for a commercial MS Exchange virus-scanning package (plus we don't want to lose MS's support for Exchange). So I wrote a Perl script that detaches attachments and waits for a virus-scanning program to scan them. If they have been cleaned, I reattach them. It was pretty easy to do using the CDO library.
  • This is interesting. Can you make the code available? Also, are students at that early educational level submitting essays digitally now?
  • Perl is used very extensively throughout Wall Street, particularly on the trading desks. It's used for all kinds of processing: grabbing and organizing market data (prices), product data (financial instruments: stocks, bonds), and company data, and then doing something with that data (like coming back with a list of stock trades). Not a lot of CGI stuff is used; generally it's easier just to dump results to the screen, into a database, a mail message, or a file. A lot of the stuff is on Unix (Solaris).
  • The field of computer performance analysis is data-rich, but the tools for analysis are rare. Perl's flexible data structures make the task of data analysis more structured and easier than in C, and it's much faster than awk.
    --
    --- perl -e 'print $i=pack(c5,(41*2),sqrt(7056),(unpack(c,H)-2),oct(115),10);'
  • The bank got a lot of data from their ATM/debit card provider (Visa), but it was all in pre-chewed report form. The reports changed semi-randomly, had variable numbers of lines per item, all sorts of mean, nasty, ugly things. After the senior programmer spent a couple of days trying to get RPG to account for every possible exception and so on, I tossed out a Perl script in about an hour that thrashed the nastiest of the reports into a nice flat data file.

    Now they use Perl to pre-process data for RPG programs…
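    A toy version of that kind of report-thrashing (the record layout below is invented; real card-provider reports are far messier) might look like this:

```perl
#!/usr/bin/perl
# Hypothetical report flattener: records start at an ACCT line,
# carry a variable number of AMOUNT continuation lines, and come
# out as one pipe-delimited line per record.
use strict;
use warnings;

sub flatten_report {
    my ($report) = @_;
    my (@records, $current);
    for my $line (split /\n/, $report) {
        if ($line =~ /^ACCT\s+(\d+)/) {            # a new record starts
            push @records, $current if $current;
            $current = { acct => $1, amounts => [] };
        }
        elsif ($current && $line =~ /AMOUNT\s+([\d.]+)/) {
            push @{ $current->{amounts} }, $1;     # variable line count
        }
    }
    push @records, $current if $current;
    return map { join '|', $_->{acct}, @{ $_->{amounts} } } @records;
}

my $report = "ACCT 1001\n  AMOUNT 5.00\n  AMOUNT 2.50\nACCT 1002\n  AMOUNT 9.99\n";
print "$_\n" for flatten_report($report);   # 1001|5.00|2.50 then 1002|9.99
```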

  • Going off at a tangent: should we have an _actual_ advocacy hat? What Perl clothing is available?
  • I've used Perl for >7 years and never written a single web-related program (no CGI, no mod_perl, etc.)

    The largest program in production is about 3000 lines. It does automated dimension table management in a star schema data warehouse. (If you don't know what that means read Ralph Kimball's book, The Data Warehouse Toolkit, and look up slowly changing dimensions.)
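    For readers without Kimball's book, here is a toy in-memory sketch of a type-2 slowly changing dimension: rather than overwriting a changed attribute, the old row is closed out and a new current row is added, preserving history. Field names and the helper are invented; the real tool manages database tables, not Perl hashes.

```perl
#!/usr/bin/perl
# Hypothetical type-2 slowly-changing-dimension sketch on an
# in-memory dimension table of hash rows.
use strict;
use warnings;

sub apply_change {
    my ($rows, $key, $attr, $value, $date) = @_;
    my ($old) = grep { $_->{key} eq $key && $_->{current} } @$rows;
    return if !$old || $old->{$attr} eq $value;   # absent or unchanged
    $old->{current} = 0;                          # expire the old version
    $old->{end}     = $date;
    push @$rows, { %$old, $attr => $value,        # new current version
                   start => $date, end => undef, current => 1 };
}

my @dim = ({ key => 'C42', city => 'Perth',
             start => '1998-01-01', end => undef, current => 1 });
apply_change(\@dim, 'C42', 'city', 'Dundee', '2000-07-05');
printf "%s %s current=%d\n", $_->{city}, $_->{start}, $_->{current} for @dim;
```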

    Another fairly big project did text summarization of a collection of documents.

    Another fairly big program computed the "connected components…
    --
    There is nothing so practical as a good theory. Comments are by me, not my employer.
  • I use perl to provide a content-management system which delivers MPEG-2 files to one of 8 Plan 9-based video servers based on a database schedule. It handles load balancing, content management, disk-volume management, and comprehensive error reporting (so we know well ahead of time when a channel's going to drop out). The entire set of 6 applications communicates via SNMP, and mail for the more remote systems. In theory, it could all be written in a shell script or two, but I'd lose a lot of the error…
  • I worked doing support in the manufacturing division of a once-great workstation maker whose initials are sgi. One piece of the manufacturing puzzle we had to contend with was a beast known as an ASRS, an automated storage and retrieval system. Think of two tall and long bookshelves facing each other with a crane on a track between them. The crane would pick up systems from an inbound conveyor, put them in a place on the shelves, and inform the database (through an application known as LAWS (Litton Automat…

  • An Enterprise product that I'm involved with has most of its middle tier, a set of single-threaded application servers, coded in Perl. The servers are big: the Perl codebase is about 100 KLOC. The folks who built the system agree that it couldn't have been done on time (before funding dried up) if they'd had to use C++ or Java, and performance is good enough. We use a half dozen or so CPAN packages, which saved a few months of development.
  • I am a physicist working in the field of x-ray absorption spectroscopy. I use Perl for a wide variety of professional chores, ranging from throw-away data processing, to geometrical calculations for programming goniometer motion, to large, complex object-oriented crystallography calculations.

    One of my professional projects is a data analysis tool which deals with crystallographic data. Once upon a time my little program was written in Fortran, but about two years ago I rewrote the whole thing in Perl. Now…


  • >In that vein, Siemens did a project for the Land Resource System of
    >Scotland where they used Perl as their n-tier client/server
    >development environment. Client-side using Perl/TK, intermediate layers
    >in Perl, and I believe the servers were either legacy systems or
    >Perl-based.

    It's a system we (Siemens) implemented for Registers of Scotland, and it is 100% Perl on both the client and server side.