I've been blogging here for several years. I've tried to keep my Perl stuff here and my non-Perl stuff elsewhere. Recently I've found the interface very unstable; at first I thought it was just a manky Firefox set-up, but I see lots of other people are reporting the same.
I don't think this site is particularly pretty, but at the same time I don't dislike it or have some active hatred of it. I'm not paying anything for the site, so I can't really complain about the visuals anyway.
Quite a few people I do follow seem to be moving to pastures new, so I think for the moment this will be my last blog on this site. My other blogs are listed below.
The past few weeks have been busy at work. We were asked to interface one of our pumps with SAP so we could download its event log into the SAP Quality System at the end of the assembly and testing of the pump.
After a certain amount of fiddling about, I got the OBEX interface working and parsed the data we wanted out of the XML log. It then turned out that we had two other interfaces in the same physical pump to deal with at the same time, which we didn't initially know about: one device and three software systems. Both the new ones are HTTP rather than OBEX transfers; one uses the same XML log files as the OBEX interface, the other a less structured free-text solution.
It's all been done in a single semi-modular Perl CGI application on a Windows PC, and then the SAP system interrogates the PC via an HTTP call. It's a bit of a string and sticky tape solution, but it does work. For good measure I also merged in an existing and different interface to a different medical pump, so that the SAP side could be streamlined down to just one set of configuration values.
It's been really challenging and fun to do, and when we tried it on a test PC everything worked first time - even the deployment documentation is pretty good. Another good win for Perl.
Yesterday I spent a few hours banging my head against a brick wall. I'm using PerlIO::gzip on ActiveState Perl 5.8.8 to decompress a large XML file. The file decompresses perfectly with Cygwin's gzip or xmllint, but PerlIO::gzip only works some of the time: sometimes the output starts as XML and then degenerates into soup, and sometimes it decompresses perfectly.
I wasted quite a few hours thinking it was something else, but it's clearly an intermittent problem with PerlIO::gzip: sometimes it works and sometimes it doesn't. I can't make any sense of it. How can it work some of the time and generate gibberish the rest?
I'm going to give IO::Uncompress::Gunzip a go now to see if it's consistent and reliable. It's a shame, as the PerlIO::gzip interface was very handy.
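For the record, here's roughly what the IO::Uncompress::Gunzip version looks like. This is only a sketch: it round-trips an invented snippet of XML in memory so it stands alone, where the real code would read the gzipped log file (e.g. gunzip 'log.xml.gz' => 'log.xml').

```perl
use strict;
use warnings;
use IO::Compress::Gzip     qw(gzip   $GzipError);
use IO::Uncompress::Gunzip qw(gunzip $GunzipError);

# Fake a gzipped log in memory so the example is self-contained.
my $xml = "<log><event>started</event></log>\n";
gzip \$xml => \my $gz or die "gzip failed: $GzipError\n";

# One-shot decompression into a scalar; with file names instead of
# scalar refs this is gunzip 'log.xml.gz' => 'log.xml'.
gunzip \$gz => \my $out or die "gunzip failed: $GunzipError\n";
print $out;

# Or line by line, much like reading a PerlIO::gzip filehandle:
my $z = IO::Uncompress::Gunzip->new(\$gz)
    or die "open failed: $GunzipError\n";
while (my $line = $z->getline) {
    print $line;
}
$z->close;
```

The nice part is that $GunzipError gives you a reason when it fails, rather than silently handing back soup.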
This week I've been doing too many things at once and, surprise, surprise, things didn't quite go the way one would hope. I've been working on SAP PI middleware maps (a bit like poking your eyes out with an overly long hot poker), starting on an OBEX interface between one of our pumps and SAP, and trying to get a working backup scenario for the R&D server.
It took us more than two days to clean up our first PI inbound interface and get it working. It hasn't been helped by the fact that the BOMI warehouse software lies - in reality it doesn't send fixed-length files - or that we needed one IDoc in SAP per line of the BOMI file, something SAP's PI middleware doesn't do without "magic". In the end we got it all working, but it's been a real palaver and we're not happy with the solution.
OBEX was more challenging and didn't go well until I got the PumpConnector tool from R&D, which turns our pumps into an OBEX server that can be reached over a selected IP address and port. It's not ideal, but I can now get a gzipped stream of the log file onto the client PC. The next challenge will be to decompress it, find the entries I want in the vast XML file, and send them back over the original CGI request to SAP.
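Picking the interesting entries out of a vast XML file without slurping the whole thing into memory is the kind of job XML::Twig was built for. A rough sketch of the idea - the <event> element name is my invention here, not the pump's real schema:

```perl
use strict;
use warnings;
use XML::Twig;

# Stream-parse the log and collect only the records we care about.
my @events;
my $twig = XML::Twig->new(
    twig_handlers => {
        event => sub {
            my ($t, $elt) = @_;
            push @events, $elt->text;
            $t->purge;    # discard the parsed tree as we go, keeping memory flat
        },
    },
);

# In the real code this would be parsefile() on the decompressed log.
$twig->parse('<log><event>started</event><event>stopped</event></log>');

print scalar(@events), " events found\n";
```

The purge() call is what makes this workable on a big file: each handled element is thrown away once collected, so memory use stays roughly constant regardless of log size.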
Finally, just before the holiday, I got HP DataProtector installed and working on the R&D server. The previous backup solution had been rather more Heath Robinson...
For the past few days I've started to really use SAP's NetWeaver PI middleware system. The GUIs are Java applications; they are quite sluggish and get noticeably slower during the day, plus they don't look native to the OS, so they look ugly and feel odd.
The server component is all written in Java, which thankfully I don't have to work with, but as with the GUI it's resource hungry and quite sluggish. At least they don't seem very buggy - I gather that the earlier versions were very wobbly!
On the whole I don't have to work with much Java code myself - just the results of it. However, in the XML mapping core it's often not possible to use the point and drool interface to achieve what you need, so you have to resort to Java functions to get the job done. I don't like Java - probably because it's not something I've used often, though I do keep trying to learn it - but I constantly feel that it's the wrong language for this job; Perl would be so much better a solution...
This week a consultant showed off a super-modern feature of Java that he's not really used to: looping over a list of strings without using a counter to access the items by index. He thought it so merit-worthy he had to talk about it! I know Java and Perl are designed to do different things; it's just a shame that people in the SAP world don't know it!
At the end of last year, after a lot of effort, we went live with a new quality initiative at work. Most of it was done in SAP, but the custom interfaces to test equipment and instruments were done with Perl - it was the glue that held it all together.
I sent an email to Proud To Use Perl. After an initially positive reply, in the new year I received a more depressed email, so instead I posted a brief summary of the story on Perl Is Alive here: Perl Helps Medical Company. It's not great, but it's a start.
Meanwhile I'm constantly fighting the mantra of use "SAP Standard", or more recently use "SAP XI/PI and Java". If you have bought SAP and are using its PI mapping tool then it's best to use it, but it's just not an appropriate tool for everything, and it's nice to have a success story that isn't some monstrous Java framework that takes an eight-core AIX box just to run and a team of developers to write and maintain - even if it is "standard".
Companies are scared of bespoke solutions because they fear being trapped with a unique solution that they cannot maintain. They run to Microsoft, or in our case SAP/Java, without any real understanding of the shoddy quality of the solution they are getting. In the end it's more important to have a "standard" solution, no matter how inferior it is. Perl isn't considered standard, so it always comes off worst.
On Monday an Axon consultant was horrified that I'd consider writing, in a few hours, a Perl daemon to receive instructions by SOAP, SAP RFC or HTTP. Creating a process like that requires weeks of development in Java; you can't possibly say writing something like that is trivial... Okay, I may have overplayed the advantage CPAN gives Perl, but some things - in my case, most things - are a lot easier in Perl than Java!
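For what it's worth, the skeleton of such a daemon really is short in Perl. Here's a sketch using HTTP::Daemon from CPAN; the /status path and its reply are invented for illustration, and the accept loop is guarded behind an environment variable so the snippet can be loaded without binding a port. SOAP or RFC handling would slot in behind the same dispatch table.

```perl
use strict;
use warnings;
use HTTP::Daemon;
use HTTP::Response;
use HTTP::Status qw(RC_OK);

# Dispatch table: URL path => handler sub. Add entries here for
# each instruction the daemon should accept.
my %handler = (
    '/status' => sub { "pump interface ready\n" },
);

sub serve {
    my $d = HTTP::Daemon->new(LocalAddr => '127.0.0.1')
        or die "can't listen: $!";
    print "Listening on ", $d->url, "\n";
    while (my $c = $d->accept) {
        while (my $req = $c->get_request) {
            if (my $h = $handler{ $req->uri->path }) {
                $c->send_response(
                    HTTP::Response->new(RC_OK, 'OK', undef, $h->($req)));
            }
            else {
                $c->send_error(404);
            }
        }
        $c->close;
    }
}

# Guard: only bind a socket and serve when explicitly asked to.
serve() if $ENV{RUN_PUMP_DAEMON};
```

Hardly weeks of development - the hard part is always the business logic behind the handlers, and that's the same in any language.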
I came across this site today: Perl Is Alive - not from a Perl link but from a Debian site. It seems the web site shares the same Yawns engine as the Debian Administration web site that I frequent. I'm not sure what niche this new Perl web site will fill, but it's better to have one more Perl web site than one fewer.
Just before the end of last year we went live with a new quality initiative at our Hampshire based manufacturing site. It's mostly SAP standard and barcoding but my bit was some custom ABAP code and some funky Perl stuff to glue some of the hard bits together.
I'm proud of the solution; it brings lots of quality and regulatory benefits to the factory, which is good as we make medical devices. We also got a 15% increase in productivity for free, which the bean counters will like. I mentioned this to the Proud To Use Perl people and Dave said it was a good story.
After we went live, the factory manager wanted a nice big plasma screen in the office showing live figures from the SAP system as they are entered into it by the factory staff. The easiest way to do that was to reuse a Perl/SAP web application framework I wrote for another project (zero code changes required) and create two new SAP functions to do the new SAP side of the work. All told it took two days to have it up on the "big screen" for a site visit by the top brass. Yet another win for Perl.
All I have to do is finish off my Proud to Use Perl statement - something I'm not so good at.
I don't do that much Perl these days, though I've probably done more Perl development work in the last three months than in the previous two years. At the moment I'm doing some more fun Perl/SAP work, and SAP is more mentally stimulating with Perl than without it!
Yesterday I was thinking, "Is now the time to start taking advantage of things in Perl 5.8?". For a long time I tried to avoid using anything from Perl 5.8 that couldn't be used on older Perls. I'm now thinking it's time to move on and start actually using stuff from 5.8 that can't be used in 5.6 - and dropping anything that is on its way out in 5.10 and beyond.
I know that stuff was introduced into the 5.8 core, but I didn't really bother with it as we still use 5.6 at work and initially a lot of people were still running 5.6. Time moves on, and I think it's time to re-read my perldelta docs and start doing new things. I know there are lots of cool things in 5.10, but I think they're a bit too new to use given the number of 5.8 and older systems out there.
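A small example of the kind of thing I mean: 5.8's PerlIO gives you in-memory filehandles by opening a handle onto a scalar reference, something that needed a module like IO::Scalar back on 5.6. A quick sketch:

```perl
use strict;
use warnings;

# New in 5.8: open a filehandle onto a scalar in memory.
my $buffer = '';
open my $out, '>', \$buffer or die "open: $!";
print {$out} "line one\n";
print {$out} "line two\n";
close $out;

# Read it back like any other file.
open my $in, '<', \$buffer or die "open: $!";
my @lines = <$in>;
close $in;

print scalar(@lines), " lines captured\n";
```

Handy for testing code that expects a filehandle, or for capturing output without touching the disk - and it's the same PerlIO machinery that gives you :encoding() layers, another 5.8-only goodie.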
Yesterday someone commented that I hadn't blogged for a while. I realised that if you don't have my combined "planet me" feed you may not know that I blog in three different places and you may believe I've dropped off the net because I've not written in one of the blogs for a while.
You could say it's sad that I even thought about this, but when you're crammed into a train in conditions worse than a sardine's, you think weird thoughts...