NOTE: use Perl; is on undef hiatus. You can read content, but you can't post it. More info will be forthcoming forthcomingly.

All the Perl that's Practical to Extract and Report



jozef
http://jozef.kutej.net/
Jabber: jozef@kutej.net

Journal of jozef (8299)

Friday December 04, 2009
02:49 PM

when virtualization is too expensive

Yes, the thing that turns your single computer into multiple computers. Why should it be expensive? Simply because the virtual machines do not share one disk space and one memory. To make a virtual system run smoothly, it needs a decent amount of dedicated memory and a disk volume with some reserve so it doesn't have to be resized too often.

First, find the reasons for doing virtualization: why would anyone want to run multiple machines on a single piece of hardware? Most likely to clearly separate the programs and the whole operating systems, to give them strictly defined virtual hardware resources, and to limit access for security reasons. And also to add one level of abstraction, which then allows systems to live in a cloud. But that is a different topic.

Let's search for solutions that avoid virtualization yet fulfil some (!) of its requirements - mainly the clear file and whole-system separation for Perl development.

If just Perl is in play, then compiling and installing the user's own Perl is an option: simply have the Perl binary and all the installed CPAN modules in the $HOME directory of the user.
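A sketch of how that build usually goes, assuming a freshly unpacked Perl source tree from CPAN (the version and the $HOME/local prefix are just example choices):

```shell
# inside the unpacked perl source directory, e.g. perl-5.10.1/
./Configure -des -Dprefix=$HOME/local   # accept defaults, install under $HOME/local
make
make test
make install

# make the private perl the first one found
export PATH=$HOME/local/bin:$PATH
```

CPAN modules installed with this perl then land under $HOME/local as well, so nothing touches the system directories.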

If Apache or some extra binary libraries are needed, it is still possible to compile and install into the user home, but that is quite a lot of "hand work", and not everyone has the time and passion to do it. A much simpler way is to use chroot. What chroot does is set the root of the filesystem for the child processes to a folder. And as we are in UNIX, where (nearly) everything is a file, this amounts to a different machine. Both systems, the parent and the chroot-ed, still share the same /proc, /dev, network devices etc., but the separation is enough to install programs with the standard distribution commands and run them. Fair enough to use a chroot-ed machine as a development machine, benefiting from shared memory and disk space, easy file sharing (one filesystem) and not having to maintain virtualization software.

Here is how to create a chrooted system on Debian and switch to it:

debootstrap lenny /usr/chroot/$MACHINENAME
echo ${MACHINENAME}_chroot > /usr/chroot/$MACHINENAME/etc/debian_chroot
chroot /usr/chroot/$MACHINENAME su -
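One thing the snippet above leaves out: programs inside the chroot usually expect /proc and /dev to be there. A minimal sketch, assuming a machine name of `devbox`, is to bind-mount them in the host's /etc/fstab:

```
/proc  /usr/chroot/devbox/proc  proc  defaults  0  0
/dev   /usr/chroot/devbox/dev   none  bind      0  0
```

After which `mount -a` (or a reboot) makes them available inside the chroot.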

Monday November 30, 2009
03:04 PM

CHI -- Unified cache interface

CHI looks interesting - one Perl module interfacing to any kind of cache. The "L1 cache" + "mirror cache" technique is worth looking at, and it adds more value to the module beyond just a unified interface.
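A minimal sketch of what that looks like, assuming CHI from CPAN is installed (the driver, path, key and expiry are made-up example values):

```perl
use strict;
use warnings;
use CHI;

# a slow-but-big file cache fronted by a small in-process "L1" memory cache
my $cache = CHI->new(
    driver   => 'File',
    root_dir => '/tmp/chi-demo',
    l1_cache => { driver => 'Memory', global => 1 },
);

$cache->set( 'answer', 42, '10 minutes' );
print $cache->get('answer'), "\n";    # hot keys get served from the L1 cache
```

Swapping the driver (Memcached, FastMmap, ...) should not require touching the rest of the code - which is exactly the point of the unified interface.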

Thursday November 26, 2009
12:00 PM

So I've thrown away my feedback

I wrote recently about feedback.cgi on the Bratislava Perl Mongers page. It turned out to be useful only for delivering more spam. Today I replaced it with a JavaScript bookmarklet for the new evil thing from Google - Sidewiki. So everyone still has a chance to annotate the pages, and I have one less thing to maintain. Win-win.

Btw, this is how one can get the Sidewiki RSS feed URL for the http://bratislava.pm.org/ page:

perl -MURI::Escape -le 'print "http://www.google.com/sidewiki/feeds/entries/webpage/", uri_escape($ARGV[0]), "/default?includeLessUseful=true"' http://bratislava.pm.org/

Tuesday November 24, 2009
03:37 AM

simple (stupid?) other XML way around

There are 2625 XML modules on CPAN atm. (`zcat 02packages.details.txt.gz | grep -E '\bXML\b' | wc -l`) So why not create another one? There are a number of modules that try to map XML structures to Perl data structures. This can never be perfect and sometimes needs custom hooks to adapt. What about the other, easy (easier) way around - serializing Perl data structures to XML? I've made an experiment. More about it here or in the POD of Data::asXML.
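As I read the POD, the round trip looks roughly like this (a sketch, assuming Data::asXML and its XML::LibXML dependency are installed; the data structure is made up):

```perl
use strict;
use warnings;
use Data::asXML;

my $dxml = Data::asXML->new( pretty => 1 );

# any nested Perl data structure
my $dom = $dxml->encode({
    name  => 'jozef',
    langs => [ 'perl', 'sk' ],
});
print $dom->toString;

# ... and back again
my $data = $dxml->decode( $dom->toString );
```

The encoding is mechanical (hashes, arrays and scalars map to fixed elements), so unlike the XML-to-Perl direction there is nothing ambiguous to guess about.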

Monday November 16, 2009
05:50 AM

Anyone working on event-based MVC?

Does anyone know of a Perl MVC implementation that is based on some event loop? Having non-blocking IO (filesystem/database) in web request handling would be nice to have. That would make it possible to run just one process per CPU. IO is the only reason why "one process is NOT enough for every CPU".

Wednesday November 11, 2009
03:08 AM

finished the series on friday

Monday November 02, 2009
11:30 AM

5. dôveruj ale preveruj

"5. trust but verify" - some time ago I've promised to do blog series and a lot of water went down the river since then so it's really time to full fill the promise.

Why start with part 5? As there was just one comment on the schedule announcement - from andy.sh, saying that he likes the $title - I assume it is the most interesting one. :-)

All (most?) CPAN authors write tests. Tests are cool: they let us sleep well and let us discover the "oh, I forgot" things.

While code is written once and stays that way for a while, most pages these days are generated on the fly to add personalized information, banners or whatever fancy shiny stuff. This can make the testing a bit more difficult. Having the pages generated statically allows us to easily test and, most importantly, validate the pages while still on the dev system.

Let's have a look at the vxml script. It will validate any XML file, or a folder with .html, .xml and .rdf files, using XML::LibXML for the validation. In addition, all internal links (a href, img src) in the HTML files are extracted and their target files checked for existence. The output of the testing is TAP. What could be better, when TAP can test anything? ;-)
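The well-formedness part of the idea can be sketched with standard TAP tooling - this is a toy version under my own assumptions, not the actual vxml script:

```perl
use strict;
use warnings;
use Test::More;
use XML::LibXML;

my @files = glob('*.html *.xml *.rdf');
plan tests => scalar @files;

my $parser = XML::LibXML->new;
for my $file (@files) {
    # parse_file() dies on malformed XML, so eval turns that into a failed test
    ok( eval { $parser->parse_file($file); 1 }, "$file is well-formed" )
        or diag $@;
}
```

Because the output is TAP, it plugs straight into prove and any other TAP consumer.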

While ba.pm.org can be generated statically, some pages cannot - especially the ones requiring users to log in. A lot of them have a high load, and testing every request would kill the (already loaded) machines. One technique is to randomly take only every Xth request and test that one as part of the response to the client. When put into a PerlCleanupHandler, it has no effect on the user experience.
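A sketch of such sampling as a mod_perl 2 cleanup handler - the package name and the 1-in-100 rate are made up, and the actual validation is left out:

```perl
package My::SampledValidation;
use strict;
use warnings;
use Apache2::RequestRec ();
use Apache2::Const -compile => qw(OK);

# the cleanup phase runs after the response was already sent,
# so the client never waits for any of this
sub handler {
    my ($r) = @_;

    # pick roughly every 100th request
    return Apache2::Const::OK if int( rand(100) );

    # ... validate the page that was just served ($r->uri) here ...

    return Apache2::Const::OK;
}

1;
```

Wired up in httpd.conf with `PerlCleanupHandler My::SampledValidation`.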

Friday October 23, 2009
10:08 AM

"the Perl is dying" vs "you ain't seen nothing yet"

Benjamin Zander on music and passion

In case of Perl, you ain't seen nothing yet!

Monday October 19, 2009
01:10 AM

about a future of Perl

Last Thursday we had a Bratislava.pm meeting with two topics.

The 1st one, thanks to Andrew Shitov, was about code generation - or, if you want, compiling for compilers. He demonstrated it on a finding-primes-with-Perl 6 program. It was nice, and you can still enjoy this talk at many of the places on Andrew's roadmap for this year. It is an example of reusability, of the beauty of code generation, and a showcase of a direction to try when doing optimizations. It's also a really good example of how we, Perl hackers, always want to do things our own way ;-) The speed is nice, but not always, and the Rakudo project should not get tied up with premature optimization at this stage.

The 2nd talk, thanks to Emmanuel Rodriguez, was about a game made with CPAN authors - Pexeso. Most Perl developers don't like doing graphics, JavaScript, web design and graphical effects. Seeing 3D OpenGL effects animating, rotating, zooming in and out - done in a couple of lines of Perl code - looked impressive. And showing this to someone who is asking "why Perl?" might do the trick of convincing them.

effect++ # is selling

Thursday October 15, 2009
01:09 AM

now back to work (again?)

It has been a long time since I've posted here. Until YAPC::EU (for 3+ months) I tried to keep up with one post per week, even though it was not easy. The Iron Man competition (challenge?) has the good intention of making a lot of noise, but this can also slip into becoming low-quality noise. On the other hand, not every post has to be crowded with information. Does it? After all it's just a blog. Or? Still, it takes time, and it is supposed to take time, every week. Hopefully it's a good investment in (payback to?) the Perl community.

Another, or a different kind of, Iron Man issue is that it aims for English blog posts. Yes, I know that there are also Russian and Japanese blogs, but those are a minority and look strange in the middle of the English flood. Still, blogging (or making noise) in a local language can be much more useful for the purpose of bringing (enlightening) new people to Perl than writing in English, where the information flow is plentiful. The same reasoning as for translating perlpod.

Yet another reason was that I was busy doing things other than blogging. Actually working :-) Besides the day job, I did an experiment to bring the FHS to Perl, which is not ready to blog about. Or should we also blog about things that are a total drift into the unknown? Then, when it turns out to be "not so great" (read: stupid), the internet will never forget... I've also managed to install MT4 on my server (which was not without a, still unresolved, issue) and created not one but two blogs! One for things that are not sane and one for more serious stuff. The main reason, among others, for not using use.perl.org is that there is no way to post entries with images. Still, I'll post my Perl blog entries here, and I'll post different, not-just-Perl-related things there. After all, this blog is registered in the Iron Man competition, not those. :-)