Journal of polettix (7564)

Wednesday August 26, 2009
10:54 AM

playing with rakudo

These days I've played a bit with rakudo (should I say Rakudo? I actually played with the program...) and I really enjoyed it.

I focused on implementing short examples in the perl6-examples repository, in particular ones related to Project Euler. This lets me use "everyday" Perl 6 while waiting to exploit more advanced features, which is good. Getting access to the Github repository is quite easy: as a matter of fact it boils down to joining #perl6 on freenode and asking.

Performance is really poor so far: some scripts that run in about 3 seconds in Perl 5 take ages in Perl 6. Anyway, at the moment that's not really the point - not for me, at least - so the level of fun is high. I must admit that it's easy to get used to the syntax changes, like the one for "for".
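
As a quick illustration (a minimal sketch of my own, not taken from the perl6-examples repository), the "for" loop now uses a pointy block instead of declaring a loop variable:

# Perl 5:  for my $n (1 .. 5) { print "$n\n" }
# Perl 6 (Rakudo-era) spelling of the same loop:
for 1 .. 5 -> $n {
   say $n;
}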

I was a bit scared to be hit by a couple of missing features/bugs while doing quite basic stuff. One resulted in a bug report, the other was due to a known bug. Was I particularly lucky? I hope so.

Sunday August 31, 2008
12:49 PM

Accessing Amazon S3 from the command line

I started playing with Amazon Simple Storage Service and acme's Net::Amazon::S3, finding both quite useful and easy to use.

The obvious next step was re-inventing the wheel. Well, sort of, because the type of wheel I had in mind wasn't to be found anywhere I looked. In particular, I was looking for an "operating-system command" a-la git, able to support subcommands for common operations.

So, I concentrated on two sets of commands: the "pure" ones, which focus on Amazon's way of seeing data, and the "filesystem" ones, which support thinking of S3 items as files with Unix-like paths.

The first group contains the basic S3 functions like add, remove, etc. The second group has more "convenient" commands like ls, cp, rm and mv, which try to do the right thing in the spirit of the corresponding Unix commands (e.g. cp can either send or get data, depending on the order of the parameters). I find this useful because the metaphor is somewhat "hardcoded" in me after years of using those commands.
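
To give an idea of what the "pure" commands map to, here is a minimal sketch using Net::Amazon::S3 directly (credentials, bucket and key names are made up, and this is not the actual code in the repository):

use Net::Amazon::S3;

# made-up credentials, just to show the shape of the calls
my $s3 = Net::Amazon::S3->new({
   aws_access_key_id     => 'YOUR-KEY-ID',
   aws_secret_access_key => 'YOUR-SECRET',
   retry                 => 1,
});

my $bucket = $s3->bucket('my-example-bucket');
$bucket->add_key('hello.txt', "Hello, S3!\n");    # upload ("cp" towards S3)
my $object = $bucket->get_key('hello.txt');       # download ("cp" from S3)
print $object->{value};
$bucket->delete_key('hello.txt');                 # "rm"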

If you're interested or just want to find something to start a flamewar, you can find it at http://repo.or.cz/w/s3.git. Enjoy!

Update: thanks to grink's tip below, I found Term::ShellUI, which allowed me to add an interactive shell that can call all of the commands, with on-line help! I love CPAN.

Saturday August 09, 2008
08:54 PM

My take on system administration: deployable

Some time ago I had to take care of the configuration of a bunch of Linux servers (more or less fifty of them so far), most of which have an identical configuration apart from some bits. These bits are typically the hostname, the IP address, etc., but also more specific and application-related info that I can't talk about.

I started using a lot of ssh and quickly realised that I had to make stuff as automatic as possible. This is probably what led me to wheel-reinventer land once again (this was also the reason why I developed Template::Perlish, by the way). My idea of an ideal workflow for this type of activity is simple:

* figure out what has to be done in some test environment

* collect all the needed bits that need to be deployed and possibly executed in each server

* put these bits in place and execute what's needed.

While there's little to automate for the first bullet, the second and the third definitely have room for letting me do more interesting things.

The second bullet boiled down to this: most of the time, I either had to deploy some new configuration files that were pretty much the same for all the servers (or groups of them), or I had to execute some script on the target machine, where these scripts also needed some supporting data files.

What I wanted, in order to ease the third bullet, was a way to produce a single, self-contained script able to carry all the needed bits and do all the magic once executed. In other words, I needed some executable archive, which had to be able to hold all the files, figure out where they had to be installed depending on their "role", and execute all the "deployment" scripts. This led to a script - deployable - a specialised archiver that takes lists of files/directories and produces a single script capable of all the magic on the remote server.

The produced script has minimal requirements, because I wanted to change the target environment as little as possible. Thus, the underlying archiver of choice is tar, and the remotely distributed script can carry Archive::Tar inside if it's not installed on the target machine; alternatively, it can rely upon the system tar, if present.
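
The general idea of an "executable archive" can be sketched like this (a toy example of mine, assuming the payload is a raw tarball appended after __DATA__ and that it contains a deploy.pl script; the real deployable is more elaborate):

#!/usr/bin/env perl
use strict;
use warnings;
use Archive::Tar;
use File::Temp qw( tempfile );

# copy the embedded tarball to a temporary file...
binmode DATA;
my ($fh, $filename) = tempfile(UNLINK => 1);
binmode $fh;
print {$fh} do { local $/; <DATA> };
close $fh;

# ...extract it in the current directory and run the bundled script
# (deploy.pl is a hypothetical name for the carried deployment script)
Archive::Tar->new($filename)->extract();
system($^X, 'deploy.pl') == 0 or die "deploy.pl failed\n";

__DATA__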

The last part, i.e. the actual deployment, can be handled by another script - deploy. It basically does two things with a "deploy-script" produced by deployable: it sftp-s it to each of the targets, then ssh-s in and executes it. By default it works in a temporary directory created, each time, inside a base camp in /tmp/our-deploy, but it's all configurable.

If you want to take a look, or just comment harshly on how this is reinventing multiple wheels, feel free to do so! You can find a git repository with the needed tools at http://repo.or.cz/w/deployable.git (you can grab a plain old tarball by going to http://repo.or.cz/w/deployable.git?a=shortlog and clicking on the first "snapshot" link you find in the list). Enjoy!

Saturday July 05, 2008
10:12 AM

Variable holding a reference to itself

Today I did something that's totally insane:

   $image = \$image unless ref $image;

The intent was normalising the input to a sub, which can be either the image or a reference to the image's data. My goal was to always have a reference.

After some debugging, it became clear that the "hold a reference to yourself" approach is plain wrong. I was relying on some obscure reasoning inside my brain, along the lines that taking the reference would magically "detach" the container from $image, so that I'd get a reference to the old container while a new container took its place in $image. What actually happens is that the assignment overwrites the data in the one and only container, so $image ends up holding a reference to itself and the original image data is lost.
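
The fix is simply to store the reference in a different variable, as in this minimal sketch (sub and variable names are made up):

sub do_stuff {
   my ($image) = @_;                              # image data or a reference to it
   my $img_ref = ref($image) ? $image : \$image;  # no self-reference: $image keeps the data
   # ... work with $$img_ref from here on ...
   return length $$img_ref;
}

print do_stuff('raw image bytes'), "\n";    # 15
print do_stuff(\'raw image bytes'), "\n";   # 15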

Luckily, sometimes Perl just refuses to DWIM, and with good reasons.

Friday July 04, 2008
05:41 PM

Sometimes the solution is under your nose

I think it's hard not to admit that Image::Magick's documentation simply sucks. IMHO it's a wonderful module; I simply hate the docs.

One thing that I've been eager to do for a long time is getting image data in memory instead of writing it to a file. I tried in-memory filehandles, but with no luck. There's always been an interesting ImageToBlob method, but it seemed a bit short-ranged... or at least so I believed. Reading the docs, it seemed able to spit data out only in the original format... whatever that means for a dynamically generated image.

Today I had the pleasure of discovering that this isn't actually the case. You can use that method to get image data in memory in any format you want... just pass the magick parameter indicating the format:

   my $png_data = $image->ImageToBlob(magick => 'png');
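
For instance, a complete in-memory round trip might look like this (a minimal sketch of mine, with illustrative parameters):

use Image::Magick;

my $image = Image::Magick->new(size => '100x100');
$image->Read('gradient:white-black');      # dynamically generated image, never touches disk
# ImageToBlob returns a list of blobs, one per image in the sequence
my ($png_data) = $image->ImageToBlob(magick => 'png');
print length($png_data), " bytes of PNG data\n";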

I wish I knew this a long time ago.

Thursday June 26, 2008
10:24 AM

When tests unveil bugs - more or less...

This has been a nightmare... more or less.

It's all dakkar's fault. He wrote an interesting article on perl.it (in Italian, sorry!) about using the Gtk2 module and how easy it is to build up an application, and so I was heading to my shell in no time.

I use debian etch, which comes with a lot of packages for Perl modules, but I like to separate system stuff from what I use, so I stick to a custom version of perl. There I was, in front of my cpan shell, but the installation process failed miserably: Glib had issues!

Ok, on to the build directory, then, to see the offending test. There it is! t/64bit.t chokes! There seemed to be a problem invoking Glib::ParamSpec::int64, but the problem immediately looked like something in the underlying library, because there were complaints from the library itself on stderr. In particular, when calling the function, it seemed that this assertion failed:

   MIN_INT64 < 0 < MAX_INT64
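
For reference, the Perl-level call that the test exercises looks roughly like this (my own sketch with illustrative values, not the actual test code):

use Glib;

# name, nick, blurb, then minimum, maximum, default and flags
my $pspec = Glib::ParamSpec->int64(
   'my-int64',                   # name
   'Int64',                      # nick
   'A 64-bit integer property',  # blurb
   -(2 ** 62), 2 ** 62, 0,       # minimum, maximum, default
   [qw( readable writable )],    # flags
);
print $pspec->get_name(), "\n";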

I have to admit that it took me a while to understand that the problem was in the glib library. How could this possibly be? So I wasted some time navigating through XS code (very little time), doing some hacks with fprintf/stderr in the generated C code (some more time), and finally it seemed that I could narrow the error down to the underlying call to g_param_spec_int64().

It was time for my first false route, and I happily took it. I concocted what I thought was a minimal example to compile, just to see it crash immediately. In the beginning, I was looking for the actual culprit, so I tried changing compiler (using gcc-3.3 and gcc-3.4) and a couple of voodoo rites found on the Internet, but no luck. I finally came across another example that was sufficiently minimal for my eyes to eventually catch that g_type_init() was needed *before* calling the other function. At least I was back on the right track!

In the meantime, I also downloaded the latest version of the glib library, compiled it and installed it somewhere useful. I then forced a compilation/link against that library, and the test went fine! Ok, then... the library seemed to be the one to blame.

So I finally had something useful to at least prove that the function wasn't working, but I had to quickly change my mind, because the following test program worked like a charm:

shell$ cat prova.c
#include <stdio.h>
#include <glib.h>
#include <glib-object.h>
 
int main (int argc, char *argv[]) {
   GParamSpec *pspec;
 
   g_type_init();
 
   printf("starting, first goes well\n");
   pspec = g_param_spec_int64("int64", "Int", "Bah!",
         G_MININT64, G_MAXINT64, 0,
         G_PARAM_READWRITE);
 
   printf("putting min equal to max, and default outside\n");
   pspec = g_param_spec_int64("int64", "Int", "Bah!",
         G_MAXINT64, G_MAXINT64, 0, /* min set to G_MAXINT64 */
         G_PARAM_READWRITE);
 
   return 0;
}
shell$ gcc $(pkg-config --cflags --libs gobject-2.0) prova.c -o prova
shell$ ./prova
starting, first goes well
putting min equal to max, and default outside
 
(process:9229): GLib-GObject-CRITICAL **: g_param_spec_int64: assertion `default_value >= minimum && default_value <= maximum' failed

I was starting to get nervous! The function seemed to work fine... so it was time to go back to the XS/C code and try to figure out what was going on. After some munging, I discovered that parameter grabbing was not working as expected; the following call in GParamSpec.c:

gint64   minimum = SvGInt64 (ST(5));

was generating a *positive* value even when provided a *negative* one. On to SvGInt64 in GType.c, then:

gint64
SvGInt64 (SV *sv)
{
#ifdef USE_64_BIT_ALL
   return SvIV (sv);
#else
   return PORTABLE_STRTOLL (SvPV_nolen (sv), NULL, 10);
#endif
}

OMG, PORTABLE_STRTOLL, a macro! After a bit of discussion with the Makefile I grabbed the (huge) command line used to compile the GType.c file, and adding the -E switch let me see the code *after* macro expansion. The bottom line is that PORTABLE_STRTOLL simply resolves to a call to g_ascii_strtoll(). Ok, another step in the (hopefully) right direction.

First, a bit of googling to find out if there was a known issue. This turned out to be a waste of time; it was better to compare the two implementations, the one in the version I had on my system and the one in the latest version. In fact, it turned out that someone had already discovered a bug in that function, and the fix was there in the new code.

What to do, then? The first thing that came to mind was sending a bug report to debian, because the library in etch is broken! I used git to produce the patch (and this led to another thread of discussion about the diff format generated by git-format-patch, but that's another story!) and sent it along to the debian maintainers, hoping I hadn't made any mistake in the process. But the answer didn't come within a handful of hours, so I had to think of something else. (Yes, I'm that impatient; maybe this is why I stick to Perl.)

I had two alternatives: either force the install (who will be using that 64-bit stuff anyway?!?), or upgrade the library myself. I found an interesting article about recompiling a debian package, so I immediately got to work. No, things didn't go smoothly, because the debuild program (or whatever sub-program it called) insisted on complaining that the tarball modified with my patch wasn't the same as the original (guess why?!?). After a bit of tweaking I managed to make it generate a new package anyway.

Prior to installing it I made a checkpoint of the virtual machine I'm running Linux in, just to be on the safe side. Having such a facility is sooo great: you can go back at once if things go wrong. Then I installed the package and voilà! The Glib module compiled, at last!

Anyway, I now ask myself: how come the debian package for Perl's Glib didn't have the same problem?!?

Monday April 07, 2008
07:23 PM

Fell into the trap: Template::Perlish

So I finally fell into the trap - I wrote a templating system: Template::Perlish. A very basic one, just giving you the means to put some variables here and there, and use Perl for all the control structures.

The thing I love about it is that I needed it. It vaguely resembles TT2 - at least in the way you put in the variables. This let me take a bunch of TT2 templates and more or less keep them as-is when switching to the new templating system. The other thing I love is that anything more complicated than a simple variable is just plain Perl - and I love Perl.
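
Basic usage goes roughly like this (a minimal sketch from memory, so the exact interface details might differ from the released module):

use Template::Perlish;

# variables are put in TT2-style, everything else is plain Perl
my $tp = Template::Perlish->new();
print $tp->process(
   'Hello [% name %], welcome to [% site.title %]!',
   { name => 'Flavio', site => { title => 'use Perl;' } },
);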

Why, then? I'm working on an automatic deployment system, to build up a server image that can be deployed and that auto-configures all those tiny bits like network settings and such, based on some master configuration file. At first I developed it with TT2 and executed it on my machine, but then each new server needed my intervention after the server image deploy. This should let me avoid that.

So, I needed something that I could carry very easily anywhere Perl 5.8 was present.

One thing that makes me proud is that the new 1.1 release has a few features added and a few removed - like the accessors for the three member variables, which aren't actually needed given the extreme simplicity of the module. I'm still wondering whether an "include" feature is needed, but I probably won't be adding it until it's evident that it's missing. It's like a gym where I can practise refraining from feeping creaturism - ehr, creeping featurism.

Friday January 11, 2008
05:59 AM

sort SUBNAME LIST is sick

I was bitten by this:

#!/usr/bin/env perl
use strict;
use warnings;
 
my @stuff = sort returns_list('whatever');
print "stuff: [@stuff]\n";
 
@stuff = returns_list('again');
@stuff = sort @stuff;
print "stuff: [@stuff]\n";
 
sub returns_list {
   return qw( howdy all of you );
}
 
__END__
 
poletti@PolettiX:tmp$ /opt/perl-5.8.8/bin/perl bug.pl
stuff: [whatever]
stuff: [all howdy of you]
poletti@PolettiX:tmp$ /opt/perl-5.10.0/bin/perl bug.pl
stuff: [whatever]
stuff: [all howdy of you]

which is parsed as follows - returns_list is taken as the comparison SUBNAME and 'whatever' as the one-element list to sort (thanks dakkar):

$ perl -MO=Deparse sort_bug.pl
use warnings;
use strict 'refs';
my(@stuff) = (sort returns_list 'whatever');
print "stuff: [@stuff]\n";
@stuff = returns_list('again');
@stuff = sort  @stuff;
print "stuff: [@stuff]\n";
sub returns_list {
    use warnings;
    use strict 'refs';
    return 'howdy', 'all', 'of', 'you';
}

dakkar also reminded me that this was the only way to support what then became sort BLOCK LIST.
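
For the record, the usual workarounds are to make the call unambiguous, either by supplying an explicit comparison block or by breaking the bareword with a unary plus (a quick sketch applied to the example above):

# explicit block: returns_list(...) is now clearly part of the LIST
my @sorted = sort { $a cmp $b } returns_list('whatever');

# unary plus: the word after sort can no longer be taken as a SUBNAME
my @also_sorted = sort +returns_list('whatever');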

This is just sick.

Sunday January 06, 2008
08:48 PM

OO and "private" methods

I was writing a couple of OO modules and I was struck by a thought about "private" methods. In that quite private context that is my mind, I basically think of "private" methods as those methods that you don't explicitly support for external usage, but that come in handy in your own implementation of a class, mostly for refactoring. So I don't mean it in the mainstream OO sense (like enforcing usability only within the class), but more in the "intended audience" sense -- which seems quite in line with Perl's approach.

What scared me all at once was realising that some derived class could actually override that private method without even knowing about it! For example, if I have factored out some logic into some "private" method:

sub _get {
   # do stuff
}

with such a simple sub name, it could well be that someone subclassing my class implements their own "private" sub _get, and blows it all up!

Thinking a bit about this, the solution is obvious. If I want methods that can't be overridden, I should just stop calling them "methods" and call them "subs" instead. Which means that instead of:

my $stuff = $self->_get(@whatever);

I have to use:

my $stuff = _get($self, @whatever);

turning OO magic off. Something to remember in the future.
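
A minimal sketch showing the difference (hypothetical package and sub names): the method call dispatches to the subclass's _get, while the plain sub call always uses the one defined in the parent:

package Parent;
sub new { return bless {}, shift }
sub _get        { return 'parent data' }
sub as_method   { my $self = shift; return $self->_get() }   # OO dispatch
sub as_function { my $self = shift; return _get($self) }     # plain sub call

package Child;
our @ISA = ('Parent');
sub _get { return 'child data' }   # accidental override

package main;
my $obj = Child->new();
print $obj->as_method(),   "\n";   # child data  (overridden)
print $obj->as_function(), "\n";   # parent data (OO magic off)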

Saturday January 05, 2008
11:06 AM

Roma.pm wishes best luck to dada

Last night Roma.pm met to wish the best of luck to dada and his wife for their imminent Austrian adventure. For such a big happening, we also had the honour of hosting larsen and dakkar from Firenze.pm and Pisa.pm - they're always a pleasure to meet in person. While Austria is not that distant, I'll surely miss my lunches with dada.

Roma.pm has never been very active, at least in organising technical meetings. OTOH I've really enjoyed many mailing list threads up to now. Luckily, the emigrants (can) continue to participate in the mailing list. The sad part being that plural.

In the last year, dada is the second person to leave Roma for another country, following malattia (who went to Japan). And others went away in the past, like oha, whom I never met in person. While it's great that they're going to such wonderful places to make their lives better, it's sad for me to think that they haven't found what they deserve here, and that I can't enjoy their company. They're really smart people and they taught me a lot.

But this is probably a general trend here in Italy, particularly in centre-south Italy. Whatever the governments (*any* government) say, the opinions held abroad are quite right IMHO.

Anyway, don't think we completely disappeared! We'll continue to have dinners, talk like ner*COUGH*geeks and enjoy life!