scrottie's Journal http://use.perl.org/~scrottie/journal/ use Perl; is Copyright 1998-2006, Chris Nandor. Stories, comments, journals, and other submissions posted on use Perl; are Copyright their respective owners. Amiga Ethernet http://use.perl.org/~scrottie/journal/40515?from=rss <p>Yesterday, I got the X-Surf 3cc Ethernet card I broke down and ordered for my Amiga 3000. There's some backstory about serial consoles, Sparcs, and the cluster, but it's not important. The 3000 was also packaged as a Unix machine, running a pretty standard port of SysV. It was the first Amiga with an MMU and SCSI as standard. It'll also kick out 1280x resolution graphics at 2bpp. Commodore sold an Ethernet board for it along with Unix on tape.</p><p>The X-Surf is really an ISA card, probably an NE2000, mounted in a little carrier. There are confusingly few pins attached, and the logic on the carrier amounts to a few small 7400 series chips and one slightly larger chip that also couldn't possibly have enough logic on it to do what it does. And then, just to convince you that you're nuts, it adds an IDE port that alone has more lines than the one little adapter chip does. The Amiga really is a machine for psychopaths, by psychopaths. Everyone sits around all of the time trying to out-psycho everyone else. Just take a look at the demo scene for the thing. The Amiga virtually defined the demo scene.</p><p>I have/had Amiga OS 3.9 on the thing. 3.9 is from after Commodore's death. 
Someone bought the rights and sold them, and someone bought them and sold them, and so on, until a sue-happy band of self-righteous ruffians managed to convince the remaining user base that buying the rights at garage-sale prices entitled them to be king of the squalid kingdom, so that they could go around lynching anyone else trying to do anything for the Amiga. Anyway, OS 3.9 is pretty recent as far as Amiga stuff goes, even though it's ten years old. Most people stopped at 3.1. 3.9 only came out on CD-ROM. The 3000 doesn't have a bay, but it does have SCSI, so the CD-ROM, when needed, gets hung off the side with the case open. I could also set up an enclosure and plug it into the back. I could also probably buy one of those.</p><p>X-Surf's stuff did not want to install.</p><p>X-Surf actually had an installer, which is impressive. AmigaOS 3.x has a scripting language for installers and an interpreter for it. This installer gave you the choice of two TCP stacks. AmigaOS 3.9 comes with a TCP stack, but you can still swap it out. It's a bit Windows 3.1-like in that regard. The options are GENESiS/AmiTCP and Miami. GENESiS, the AmiTCP configurator and dialer that comes with AmiTCP, was shipped in a version requiring libraries not included in AmigaOS 3.9, so it wouldn't run. AmiTCP would, and AmiTCP was on the HD, though buried a bit. Miami is shareware/crippleware. It required the same library, MUI (Magic User Interface), that I didn't have.</p><p>I spent hours sorting out what required what, what I did and didn't have, and how these various packages worked and fit together. That's ignoring the device driver for the Ethernet card, which is straightforward. The Amiga has a directory for libraries (which end in .library; the Unix terseness is missing from AmigaOS even though a lot of the feel is there). AmigaOS 3.9 also won't read ISO 9660 filesystem CDs. Perhaps some BoingBag update fixes that, but the BoingBag updates themselves are large .lha archives. 
I'm avoiding plugging the serial line into a Unix machine and speaking Kermit or ZMODEM or something to transfer stuff. I've been down that road. Eventually I burned AmigaSYS4, a version of AmigaOS 3.9 with lots of add-ons and the various BoingBag updates on it, stuck it in the Amiga, and was able to steal MUI off of it and get both TCP stacks running.</p><p>Amiga programmers love to do ports of Unix software and add GUIs. They've been doing this for ages. They've had gcc since the early ages of gcc, and I ran the Amylaar MUD driver on AmigaOS 1.3 to do development locally, also in the dark ages. Kicking around on aminet.net from the Amiga, I see PHP, MySQL, Apache, BitTorrent, Python, bind9, Samba, VNC, and all sorts of stuff. No one ports just the client. If they port the client, they port the server, too. In the case of AmiTCP, the suite of utilities you'd expect is there, such as host, finger, traceroute, and so on, but to configure TCP/IP, you run a little GUI program and it asks you questions. It took Linux ages to get to this point, and the Amiga was doing it long before. One of the extras on the Extras disc, even as far back as 1.3, was a version of emacs with drop-down menus.</p><p>Completely unsurprisingly, the 16 MHz 68030 running AWeb (which does some JavaScript) is vastly faster than firefox on my 1.2 GHz Centrino Linux machine. Amiga programmers do not write slow software. It's entirely against their nature. The threading is fantastic. It'll do downloads, render several JPEGs in the page, update the page layout as HTML comes across, and never lose snappy UI responsiveness. 
On firefox, I yank on the scrollbar only to have it ignore me and snap back, or else the scrollbar doesn't move at all, or the whole thing just goes away for a few heart-sinking seconds, making me wonder if it just crashed.</p><p>My ambition is to get a desk in a shared office space going and stick this baby there with an updated video card that does high-res, high-bit-depth graphics. If I'm willing to start replacing and upgrading chips on the motherboard, I can take the thing up to a gig of RAM, too, and NetBSD supports it if I ever decide I want to see how firefox runs on a 16 MHz processor. What I'm really hoping for is for someone to take the latest ColdFire chips from Motorola's spin-off, Freescale, and do an 800 MHz accelerator card for the Amiga 2000/3000/4000. That would RULE.</p><p>-scott</p> scrottie 2010-08-25T20:09:48+00:00 journal How I spent my day today (or, slowass.net pops a hole) http://use.perl.org/~scrottie/journal/40501?from=rss <p>1. Ran backups<br>2. Verified integrity of ssh on my local system versus the last backup; changed local passwords<br>3. Verified integrity of my linode's chpass with md5sum versus the previous backup<br>4. Locked accounts; fixed changes to shells for system accounts; removed additional accounts; changed passwords<br>5. Killed root processes and shells; accounted for all of the shells and processes in ps<br>6. Compared md5sums of everything in ps, login shells, rsync, inetd, su, vmlinuz, ps itself, and various other things between the previous backup and current<br>7. Compared nmap output to netstat -lnp; accounted for the netstat -lnp entries<br>8. Ran find to find setuid/setgid programs; verified no additional ones exist; ran md5sum against the existing ones<br>9. Replaced sshd, ssh, and their config files and host keys; restarted sshd; relogged and changed passwords<br>10. Upgraded sshd<br>11. Killed .ssh directories<br>12. Temporarily took some services down until I can decide if I trust/replace them (squid, cron, sendmail)<br>13. 
diff -r'd between the two backups; read through the output to account for all changes to the system (new files and changed files) (several were notable)<br>14. Ran find to find world-writable files; ran find to find device files in the wilds of the filesystem</p> scrottie 2010-08-17T05:30:44+00:00 journal My reply to Removing database abstraction http://use.perl.org/~scrottie/journal/40482?from=rss <p>My long-winded response to http://blogs.perl.org/users/carey_tilden/2010/08/removing-database-abstraction.html :</p><p>Don't get trapped in the mindset of "I have to use X because if I don't, I'm doing it wrong". Quite often, if you don't use X, it's entirely too easy to do it wrong if you don't know what you're doing. You probably don't want to re-implement CGI parameter parsing, for example. But that's not the same thing as saying that you should always use CGI because it's a solved problem, so never do anything else. Nothing is a solved problem. mod_perl is a valid contender to CGI, and now Plack is a valid contender to mod_perl. FastCGI was and is a valid contender to mod_perl, and servers like nginx do awesome things. Yet, tirelessly, the fans of one explain that the competing ideas are somehow not valid.</p><p>Sorry, I'm trying to do proof by analogy here. It isn't valid except to the degree that it is. I'll get to databases in a minute.</p><p>Quick recap: there are lots of valid approaches; using an alternative is not the same as re-inventing the wheel.</p><p>Furthermore, the heaviest technology is seldom the long-term winner. Witness the return to lighter HTTP pipelines. For ages, Apache boasted being a bit faster than IIS, in response to which I could only wonder why Apache was so slow.</p><p>Okay, back to databases. DBIx::Class to a relational database is a valid option. It's also very heavy. 
It also doesn't really let you scale your web app out unless the database in question is DB2, Oracle, or one of the few that run on a cluster with a lot of processors rather than just one computer. Otherwise you've just added a new bottleneck. DBIx::Class makes it harder to do real relational work -- subqueries, HAVING, or anything hairy. At the very least, you have to create a class file with a lot of boilerplate, reference that file from other files that you made or generated, and stuff the same SQL into there. Abstracting the querying away in simple cases makes it easier to query the database without thinking about it. This leads you to query the database without thinking about it. That's a double-edged sword. In some cases, that's fantastic.</p><p>Lego blocks make it easy to build things, but you seldom buy home appliances built out of Legos. Even more so for Duplo blocks. Sometimes easy tools are in order; sometimes, low-level engineering with an RPN HP calculator is absolutely in order.</p><p>Okay, I'll get back to databases in a minute here, but I want to talk about something outrageous for a moment -- not using a relational database at all.</p><p>I wrote and use Acme::State for simple command-line and daemonized Web apps. It works well with Continuity and should work with the various Coro-based Plack servers, for the reason that the application stays entirely in one process. All it does is restore the state of variables on startup and save them on exit or when explicitly requested. It kind of goes overboard on restoring state and does a good enough job that it breaks lots of modules if not confined to your own namespace, hence the Acme:: designation.</p><p>Similarly, people have used Data::Dumper or Storable directly (Acme::State uses Storable under the hood) to serialize data structures on startup and exit. In AnyEvent frameworks, it's easy to set a timer that, on expiration, saves a snapshot. 
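<p>That timer-plus-snapshot pattern is only a few lines. Here's a minimal sketch, assuming AnyEvent and Storable from CPAN; the state.storable filename and the counter are made up for illustration:</p>

```perl
#!/usr/bin/perl
use strict;
use warnings;
use AnyEvent;
use Storable qw(nstore retrieve);

# All application state lives in one structure in one process;
# reload the last snapshot on startup if there is one.
my $state = -e 'state.storable' ? retrieve('state.storable') : { counter => 0 };

# Snapshot the whole thing every 60 seconds.  The timer guard has to
# stay in scope or AnyEvent cancels it.
my $snapshot = AnyEvent->timer(
    after    => 60,
    interval => 60,
    cb       => sub { nstore($state, 'state.storable') },
);

# ... the rest of the app mutates $state and runs the event loop:
$state->{counter}++;
AnyEvent->condvar->recv;
```

<p>The catch, as the next paragraph notes, is that a plain nstore() blocks the event loop while it writes.</p>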
Marc Lehmann, the man who created the excellent Coro module, has patches to Storable to make it re-entrant and incremental, so that the process (which might also be servicing network requests for some protocol) doesn't get starved for CPU while a large dump is made. His Perl/Coro-based multiplayer RPG is based on this idea. With hundreds of users issuing commands a few times a second, this is the only realistic option. If you tried to get this level of performance out of a database, you'd find that you had to have the entire working set in RAM, not once but several times over, in copies in parallel database slaves. That's silly.</p><p>You can be very high-tech and not use a database. If you're not actually using the relational capabilities (normalized tables, joining tables, filtering and aggregating, etc.), then a relational database is a slow and verbose (even with DBIx::Class) replacement for dbmopen() (perldoc -f dbmopen, but use the module instead). You're not gaining performance, elegance, or scalability, in most cases. People use databases automatically and mindlessly nowadays, to the point where they feel they have to, and by virtue of having to use a database, they have to ease the pain with an ORM.</p><p>Anytime someone says "always", be skeptical. You're probably talking to someone who doesn't understand or doesn't care about the tradeoffs involved.</p><p>Okay, back to databases. Right now, it's trendy to create domain-specific languages. The Ruby guys are doing it and the Perl guys have been doing it for ages. Forth was created around the idea -- the Forth parser is written in Forth and is extensible. Perl 5.12 lets you hijack syntax parsing from within Perl in a very Forth-ish style. Devel::Declare was used to create proper argument lists for functions inside of Perl. There's a MooseX for it and a standalone one, Method::Signatures. That idea got moved into core. XS::APItest::KeywordRPN is a demo. 
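<p>A hedged illustration of what Method::Signatures buys you; the greet() function below is made up, and the syntax is as the CPAN module documents it:</p>

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Method::Signatures;   # CPAN module built on Devel::Declare

# func() declares a plain function with a real argument list,
# including defaults; method() does the same but shifts $self for you.
func greet ($name, $greeting = 'Hello') {
    return "$greeting, $name";
}

print greet('world'), "\n";
```

<p>Under the hood it's Devel::Declare hijacking the parse -- exactly the trick the core keyword API later made official.</p>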
Besides that, regular expressions and POD are two other entire syntaxes that exist in .pl and .pm files. It's hypocritical to say that it's somehow "bad" to mix languages. It's true that you don't want your front-end graphic designer editing your .pl files if he/she doesn't know Perl. If your designer does know Perl, and the code is small and doesn't need to be factored apart yet, what's the harm? It's possible to write extremely expressive code mixing SQL and Perl. Lots of people have written a lot of little wrappers. Here's one I sometimes use:</p><p>http://slowass.net/~scott/tmp/query.pm.txt</p><p>Finally, part of Ruby's appeal -- or any new language's appeal -- is lightness. It's easy to keep adding cruft, indirection, and abstraction and not realize that you're slowly boiling yourself to death in it until you go to a new language and get away from it for a while. The Ruby guys, like the Python guys before them, have done a good job of building up simple but versatile APIs that combine well with each other and keep the language in charge rather than any monolithic "framework". Well, except for Rails, but maybe that was a counter-example that motivated better behavior. Look at scrAPI (and Web::Scraper in Perl) for an example.</p><p>Too much abstraction not only makes your code slow but makes it hard to change development direction in the future when something cooler, faster, lighter, and more flexible comes out. Just as the whole Perl universe spent ten years mired down in and entrenched in mod_perl, so is there a danger that DBIx::Class and Moose will outlive their welcome. POE, which was once a golden child, has already outlived its welcome as Coro and AnyEvent stuff has taken off. Now there are a lot of Perl programs broken up into tiny non-blocking chunks that are five times as long as they could be, and the effort to move them away from POE is just too great. 
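<p>To ground the "little wrappers" remark from earlier in this post: the linked query.pm isn't reproduced here, but a hypothetical wrapper in the same spirit is only a dozen lines of plain DBI (the function name and the SQLite DSN are made up):</p>

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hand it SQL and bind values; SELECTs come back as a list of
# hashrefs, everything else comes back as an affected-row count.
sub query {
    my ($dbh, $sql, @bind) = @_;
    my $sth = $dbh->prepare_cached($sql);
    $sth->execute(@bind);
    return $sth->{NUM_OF_FIELDS}
        ? @{ $sth->fetchall_arrayref({}) }   # SELECT: rows as hashrefs
        : $sth->rows;                        # INSERT/UPDATE/DELETE
}

my $dbh = DBI->connect('dbi:SQLite:dbname=test.db', '', '', { RaiseError => 1 });
my @rows = query($dbh, 'SELECT name, age FROM people WHERE age > ?', 21);
```

<p>The SQL stays SQL, the Perl stays Perl, and there's no class file to maintain.</p>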
The utility of a package should be weighed against the commitment you have to make to it. The commitment you have to make to it is simply how much you have to alter your programming style. With Moose, as with POE, this degree is huge. DBIx::Class is more reasonable. Still, it's a cost, and things have costs.</p><p>Thank you for your article.</p><p>Regards,<br>-scott</p> scrottie 2010-08-06T21:53:04+00:00 journal DEFCON, day 1 http://use.perl.org/~scrottie/journal/40473?from=rss <p>Basically no mischief or craziness. Having DEFCON at a casino did to it exactly what I would have expected. No money pots to eat the cockroach, no naked fire jugglers, no getting thrown in the pool, no parties by the pool.</p><p>Bros outnumber the blackshirts now. They're talking loudly and proudly about how little they know and care.</p><p>Kilts are representing, too. The freaks are here. There's a Japanese gang dressed in kimonos. Some other Japanese guys walked by talking vigorously amongst themselves, laughing and pointing. Punks with combs, razors, and hairspray are in the contest area/lounge dispensing mohawks. They have their own official area. Strange hats abound. One kid has a fez. There are BDUs and lab coats. Lots of colored hair.</p><p>Aha! Finally spotted someone I knew -- Kevin, a friend of Ernie's, who also worked in the gaming industry, but on a different side of it.</p><p>People are sitting next to me reading long hex strings from the "background" of the talks description book.</p><p>They ran out of badges as usual. My flight got delayed two hours, which seriously cut into my time here.</p><p>"My only crime is that of outsmarting you"... shirts have slogans.</p><p>There's a lot less interest in WiFi and a lot more in smartphones. The common area is woefully inadequate.</p><p>UnixSurplus is here again and people are packing in to see old Unix hardware. 
Some people are going to be coming home with O2s.</p><p>More later, perhaps.</p><p>-scott</p> scrottie 2010-07-30T19:43:53+00:00 journal OpenSSI cannot find NIC with static node configuration http://use.perl.org/~scrottie/journal/40446?from=rss <p>Pasting an email reply since everyone seems to be stumbling over this one:</p><p>----</p><p>I've had this problem too.</p><p>This happens when the driver for the NIC on that machine isn't included in /etc/mkinitrd/modules. When the node boots, it can't find the NIC<br>because it doesn't have a driver for it, and since it can't find the NIC,<br>it doesn't know the MAC address, and without the MAC address, it<br>can't configure the node. Without the NIC, it can't connect to the<br>initnode.</p><p>You can add the names of the NIC devices for all of the machines during<br>install, or you can add them before you boot the other machines into the<br>cluster and then run 'mkinitrd -o /boot/initrd.img-2.6.14-ssi-686-smp 2.6.14-ssi-686-smp'<br>(adjusted to match your kernel version, see 'uname -a') and run 'ssi-ksync'.</p><p>It's possible to add the correct device names and rebuild the initrd but still<br>have the node fail to find the NIC or MAC address. For some reason, some drivers<br>just don't work correctly with how OpenSSI configures nodes. The e100 and<br>e1000 don't work correctly with OpenSSI for configuring nodes. I got a<br>bunch of 3Com 3c509 cards and stuck those in the nodes and use only those<br>for the cluster interconnect, for now. Later I hope to get InfiniBand going.</p><p>So, make sure /etc/mkinitrd/modules has the correct driver in it,<br>rebuild the initrd, run ssi-ksync, and if all else fails, use some old 3Com<br>cards.</p><p>Good luck!<br>-scott</p> scrottie 2010-07-14T18:10:34+00:00 journal PerlMongers group organization lessons http://use.perl.org/~scrottie/journal/40418?from=rss <p>My pasted reply to a YAPC mailing list message:</p><p>Missed this BoF too. 
I've written a lot about my experiences with<br>Phoenix.PM. As far as I can tell, the key ingredients are:</p><p>* an organizer with the right personality<br>* an audience happy with social meetings, or else a few people willing to present over and over, year after year<br>* a central location, to minimize excuses from people who have families to go home to<br>* core people who attend so the thing doesn't bottom out while it's ramping up</p><p>We had a mix of people hoping the meetings would have tangible,<br>immediate benefits for their careers (more immediate than teaching<br>them awesome Perl hackery, such as recruiters standing by),<br>maintenance programmers whose life with Perl just sucks and who won't<br>be made happy by anything that doesn't take a Roto-Rooter to their<br>codebase, weirdos like Marla from Fight Club who just go to every<br>meeting in town but never actually do anything or have any personal<br>interest in any of it, and hip kids looking for the cool thing.<br>Most of these groups saw we had nothing to offer (only awesome<br>technical presentations) and never came back.</p><p>-scott</p> scrottie 2010-06-25T19:58:45+00:00 journal Perl is Dead - the supply lines have been cut http://use.perl.org/~scrottie/journal/40417?from=rss <p>I wrote this once and it didn't post (that often happens here, in various scenarios) or else it got deleted. Assuming the former. Here's a super-short version of the same thing. The first go was much better. Dammit.</p><p>* College CS departments are owned by Microsoft and Sun. They use C# and Java out of consideration for strings-attached grant money.</p><p>* High school and grade school kids learn PHP, jQuery/JS, Squeak, Processing, Flash, or Python/PyGame. Perl has a foot in the web world (though not the appeal to the ultra-low-brow user base) but virtually no foot in the playing-with-graphics department. SDL is okay, but it isn't easy enough or flashy enough to compete.</p><p>* Perl teams are small. 
Fewer people are hired to do a project compared to Java -- radically fewer. And companies hire almost exclusively senior-level Perl programmers. It's hard to get a toehold in the industry. You virtually have to publish a lot of great stuff on CPAN to get a job. Strong typing in Java, I think, makes it easier to integrate weaker programmers into a team and keep them from doing damage. They aren't kept out of your living room with a shotgun but with locked doors. With the code compartmentalized, it's harder for someone new or intermediate to do damage; they just do or don't succeed. That's a big improvement.</p><p>* There is no perception that there's a lot of money to be made writing Perl. Java jobs with far lower expectations have paid me personally far better than Perl jobs with high expectations. Really good Perl programmers don't get actively recruited away. Countries with developing economies aren't tooling up on Perl, and Perl companies aren't sponsoring H-1B visas. Nor are Perl companies making long-term investments in employees and letting them spend years only marginally productive with the idea that they'll be there for 10 or 20 years. Perl jobs just tend not to be "milk runs". We all have just one battle story after another, complete with scars. The lack of the promise of money plus comfortable employment does not draw adults into Perl's job market.</p><p>* Relatedly, no one is out there just making cool stuff in Perl and saying, "Hey, look what Perl can do". Yahoo! Pipes is Perl and it's way awesome, but they aren't playing up the Perl bit, and there are precious few examples of this. People's perception would be that Perl is only used for big, cranky, serious old web apps. And they'd mostly be right.</p><p>* Perl was and perhaps still is the language of choice for system administration on Unix, but other things are competitive in this field now. This is probably the largest avenue by which people discover and learn Perl -- Unix admins. 
Shops that do a lot of Unix administration probably still take non-programmers and tell them to learn Perl.</p><p>* Microsoft, Sun, and IBM (SAP really wins here) are buying their customers lunch and being buddy-buddy with them. They're listening to them, agreeing with them, sympathizing with them, and drinking with them. In this regard, Perl doesn't even exist. Microsoft and Sun get a lot of money from companies, but they don't just walk over with a hand out -- they woo them. This is a bonus point that's unrelated to the main point.</p><p>Want to kill something? Cut the supply lines. There are precious few new projects using Perl (neither large-business nor garage-style maybe-the-next-Twitter types). The avenues by which people have discovered Perl in the past have almost entirely been closed off. Nothing dies quickly. COBOL is still around. COBOL is even in demand -- the dearth of new people learning it, relative to the amount of legacy code out there, itself creates demand. Except for system administration (I don't know, what else can you think of? Where do people come from?), our lines have been cut.</p><p>As I wrote in the last post, Perl is still assimilating, and the community adapts very quickly after assimilating something. In about a year, half of the web infrastructure made for Perl switched to Plack. That's almost overnight. Coro has been making waves. We ran with ORMs like tweaked-out squirrels with scissors. It's disgusting, really. Collectively, we're having a lovefest with git.</p><p>Yeah, there's CPAN. Trying to figure out what happened to sunsite.unc.edu (go find out! You'll like the answer), I happened into CTAN -- the Comprehensive TeX Archive Network. No one gives a flying fuck how much stuff is in there, because they have no plans to use TeX. CPAN isn't going to sell people on Perl. It did sell them on the idea of creating repo archives, though.</p><p>I hope you've enjoyed my little rant. 
My goal isn't to kick Perl while it's down, nor is it to pointlessly piss people off. The subject is subjective and debatable, and I don't mean to try to "win" the debate, only to add usefully to it. Past a certain point, saying "Perl isn't dead!" just is not constructive. We have a lot of work to do.</p><p>-scott</p> scrottie 2010-06-25T05:55:21+00:00 journal YAPC::NA2010 wrap-up http://use.perl.org/~scrottie/journal/40416?from=rss <p>I flew USAir. This was the airline that promised me a $200 flight voucher and gave me nothing. They announced on the way back that the stop in Phoenix was actually a plane change for the people going through to CA. The airlines started doing "stops" rather than layovers because people hated switching planes -- too often they wouldn't make their connection, and they were twice as likely to have a canceled flight.</p><p>As people boarded the plane, USAir was confiscating carry-ons, telling people that the plane was full. One guy was almost in tears -- he had a case that was full of glass. He boarded without it. The plane was not full. Not nearly. Nor were the overhead luggage racks. I've heard of airlines using this to hurry people onto planes before. I wonder what the next tactic will be when this one wears out. This "plane is full" stuff came after they called all of the stand-by passengers up to the podium. The lie was bare-faced.</p><p>I don't mean to hate on the airlines; businesses are machines. Things devoid of soul aren't worthy of hate. But I'm trying to piece together thoughts on what happens when humans don't, or aren't able to, stand up for themselves.</p><p>From what I'm hearing on IRC, other people's experiences were actually legitimately bad, not just annoying. It rained somewhere, so flights got canceled left and right. People barely made it home in time to eat at The Cracker Barrel.</p><p>The travel days are interesting. Between the bus, taxi, plane, and airport, it's nearly an all-day affair.</p><p>I'd live in Columbus. 
People were laid-back and earnest. Bikes were well represented. The college town was bursting with interesting establishments. Awesome old buildings, of brick, or wooden things with elaborate roofs with gables, were all over, far outnumbering the new structures. It felt like a place that people cared about.</p><p>The whole time, people complained about the heat. I think it was mid-70s but muggy. It felt extremely comfortable to me -- I could have rolled naked in the grass. Okay, the chiggers might not be so comfortable later. Shirtless or scantily clad students were jogging, playing basketball, or walking around.</p><p>The Perl community... oh boy. Recent years have brought a push to organize. The Perl Foundation has been doing more and different things and recruiting people into roles doing specialized things that programmers generally can't do. I was steeped in organization and volunteerism for a different cause. I went around YAPC wearing my TBAG (Tempe Bike Action Group) shirt for two days. Smelly shirts go well with eye bags. It was interesting to see how poorly I function with an extended lack of sleep. I had the stupid, bad.</p><p>The Perl community is full of misfits, freaks, man-boys, server-room-dwelling shut-ins, gimps, maladapts, rejects from other cultures, and the curiously alternative, plus the suspiciously normal-looking, and I love them. It took me a few years, but I learned that I can talk to nearly anyone there and have my mind blown. It's exciting to see and hear about the things everyone is working on. Every now and then, we make someone doing something especially nifty feel like a rockstar. It would be like going into a Hollywood studio and seeing a movie being made, if I actually cared about Hollywood. Some of the talks are riotous.</p><p>My "hey, look at me, I'm weird!" instinct is dying down. People know all about it, for one, and secondly, I should be paying more attention to the ways other people are weird. 
Also, I've pretty much failed at making people hate me, which is always an easy way out of social situations. Perhaps I should have taken lessons from buu. It just isn't as rewarding as it was.</p><p>It looks like the cluster going down was due to a power outage. The machines that survived on battery (laptops with longer battery lives) shat themselves when the initnode went down, and laptops with shorter battery lives died before that happened, leaving notices on the screens of the other laptops that they had gone down and left the cluster. The last-minute errand of putting new batteries in the old UPS saved my ass. While I was giving the talk, one of the machines went out. This old APC600 has a serial port on it. I guess I should see about actually plugging it in to something so I have some idea of what's going on, at least when I'm remote. Clearly the service level needs to be bumped up. I might have to break down and do the Cox Internet thing, as much as I hate those fuckers.</p><p>Perl is still assimilating, just as it always has. Perl 5 and Perl 6 are doing that concurrently. When Perl assimilates something, the community usually aggressively embraces it, even to the point of silliness. I kind of wish mixing in some strong typing had been embraced, but what are ya going to do. There's a lot Perl does not have. PyGame, for one. Because so much has been coming from outside of our own camp, it concerns me that we might be forgetting that we can start memes too. I want to resurrect the old clustering meme, and in a lot of ways, Perl and Perl programmers are perfect for it. Perl has infrastructure for working with Unix primitives and processes beyond what most languages offer. forks, the drop-in implementation of the threads API on top of fork(), comes to mind. And a lot of us are cranky old sysadmin types.</p><p>Just sitting around coding socially is something I really don't get to do in Phoenix, working from home. 
The coffee shops don't offer much in that way, either.</p><p>I showed YAPC a photo of myself from Phoenix, the day before I left for Ohio.</p><p>It was good hanging out with beppu again, a man I admire and draw inspiration from.</p><p>I have code I started for my presentation that I still need to finish.</p><p>I got to put faces to coworkers.</p><p>-scott</p> scrottie 2010-06-25T05:06:12+00:00 journal perl and AtariBASIC, part III http://use.perl.org/~scrottie/journal/40413?from=rss <p>In a Perl 6 talk, pmichaud felt it necessary to justify to the audience why something was a bit cryptic: he was running out of space on the line. dconway, from the back, commented "If I had a dollar for every time someone wrote unmaintainable code with that excuse...", prompting me to point out on IRC that putting lots of stuff on one line is an optimization technique in AtariBASIC. The moment I said that, I realized (or remembered) that Perl has exactly the same problem: goto does a linear traversal through statements looking for the target label (or line, in AtariBASIC). The more statements it has to traverse to find the target, regardless of their size, the longer it takes.</p><p>Therefore, it follows that longer lines in Perl make for faster programs.</p><p>http://slowass.net/~scott/tmp/gotot.pl.txt</p><p>-scott</p> scrottie 2010-06-23T18:46:01+00:00 journal MUD cannot be written in Perl and here's why http://use.perl.org/~scrottie/journal/40404?from=rss <p>Awwaiid urls http://github.com/jasonmay/abermud# perl-based mud with mooseyness (early stages)</p><p>You ack loudly.</p><p>You say in common: yuck</p><p>You say in common: every MUD knock-off in the last 20 years has sucked ass and completely missed the point.</p><p>You say in common: Aber, Tiny, Circle's take on Diku, UberMUD, Muck... people *knew* what they were doing for a while, but now everyone completely misses the point.</p><p>You say in common: it's all cargo-culting... taking the superficial stuff while missing the point. 
makes me sad.</p><p>You say in common: towards the end of MUD, people were far more interested in throwing tech at it randomly... TMI, EoTL II, etc</p><p>You say in common: that's what killed MUD... mindless wielding of tech</p><p>You say in common: I think I'm going to put the "TECHNOLOGY WANTS YOUR SOUL" sticker on this laptop.</p><p>Awwaiid says in common: haha</p><p>Awwaiid says in common: so bitter! Maybe jasonmay is having fun and you should join him... bring "the point" into his code :)</p><p>Awwaiid says in common: or maybe that's not possible, since this is apparently a mud-framework as opposed to a mud itself?</p><p>You say in common: the most successful MUD software of all time, LPMud 2.4.5, was distributed as a copy of a running game. you could fire it up and run it.</p><p>You say in common: it had a couple of security flaws in the "mudlib" code (the interpreted C-like stuff that's easy to change).</p><p>You say in common: it lacked a lot of features... races and classes most significantly</p><p>You say in common: but it had monsters, areas, spells and lots of stuff.</p><p>You say in common: you could fire one up, "wiz" the people who made it to level 20 and let them start adding to it.</p><p>You say in common: one thing that's well established is under no circumstances should a live game "wiz" people who have not played through and beat that particular game.</p><p>You say in common: that goes for the admins too. no one should ever start a MUD without wiz'ing on one MUD and having that MUD go down or them being forcibly ejected from it.</p><p>You say in common: for some profound human nature reason, doing otherwise is always the same...
you have to profoundly care about a game to contribute to it and the only way to do that is to go through the long process of making friends, partying, helping novices, learning your way around it, etc, etc over months or years.</p><p>You say in common: I guess more fundamentally, MUDs need to be targeted towards *players*, not towards coders. players become coders.</p><p>You say in common: a Perl MUD is a good idea. the zombie game was a stab at that. but there are a pile of lessons like that... more than I could remember... that people haven't learned. either they haven't had the experience or else they second-guessed what they saw in the name of tech devotion. ....</p><p>By the time someone makes it to level 20 and is then (a willing sponsor aside) generally permitted to code, seeing the code has to make the game even more magical. This is critical: seeing the code has to be a wondrous event. They should relive all of their adventures again from the point of view of how things actually worked rather than what their imagination thought was going on. Their imagination will have inflated the realities of the world. But they'll see the opportunity to create more of this illusion. This is analogous to working through a math problem or riddle before seeing the answer rather than just being told the answer. Seeing the code should be a glorious "aha!" moment.</p><p>That MUDs are text is superficial to why they were successful and why so many people loved them and still do. What makes a MUD is being up to your eyeballs for months or years, making friends, fighting monsters together, exploring, the politics of the gods, the policies of the games, the dramas, the loss and gain, the interaction between people of different philosophies including personas of people's experimental alter egos.
A profound love of the game, strong feelings (period) towards the admin, and a sense of serving people like your many friends drive you to create more adventures, places, creatures, oddities, and interactions. You're adding magic to the game. This is the only time and the only point that magic is added to the game -- when a player beats it and adds to it.</p><p>I lied: Diku was probably the most popular game. It too came ready to play. Rather than be coded on (Circle would later change this), creation was done with level editors. Creativity was mostly limited to economics and prose, and this was fine.</p><p>Diku was a clone of Aber, stealing and changing it slightly. Aber was a knock-off of the original MUD, which was a commercial enterprise; the creation of MUD was a blessed event. LP was a knock-off of Aber. Lars was an Aber player and beat Aber.</p><p>No one is going to clone Aber, Diku or LP in Perl because they want to do a "better" framework. How good the framework is doesn't matter. In fact, TinyMUD was huge and it ran a very simple BASIC. Diku didn't run user code at all. You had to hack on the C source code to add features. Same for Aber, except Diku let you extend the map at run time.</p><p>To create a MUD -- in the spirit of MUD -- in Perl, you'd have to make a game. That's a much harder problem than creating a "framework" for making games. For the reasons outlined above, no one is going to take your framework and run with it. Anyone who cares is going to just make a game and worry about the "framework" later. Even I had a hard time having energy left over for the game. It's profoundly hard to do the tech and the game at the same time. Stealing egregiously is the only way to do this. Previous efforts have used mechanical translation from one game to another, and un-tech-savvy friends cutting and pasting. People cared about the game that much. They knew.
Even when a new game started, enough had to be added to it to make it interesting, even though it came playable. Most people didn't have the energy for that. There were countless failed attempts at starting games where all they had to do was take a running game -- often 2.4.5 -- and add classes, races, and/or a mix of other unusual, new, unique and interesting things to define that game. Taking an existing, ready-to-play game with a pile of features as long as your arm and dozens of areas and thousands of monsters and adding a few more features takes more energy and creativity than the vast majority of people have to spare. And that's why you can't create a game framework in Perl and expect it to somehow turn into something. Your effort is insignificant and misguided. You make me sad. Go code on an existing MUD that has already had hundreds of wizards pour their lives and souls into it. Bolt Perl onto the side. But, if you actually played and beat the thing and lived it, bolting on Perl would be the last thing you cared about.</p><p>-scott</p> scrottie 2010-06-17T19:25:44+00:00 journal Moose, DBIx::Class, and the _new_ OO http://use.perl.org/~scrottie/journal/40380?from=rss <p>Simula introduced OO in the 60s. Smalltalk took it to its logical and pure extreme in the 80s. C++ brought it to systems programming and gave it the performance that only static optimized code can enjoy.</p><p>Really smart people wrote really smart code using really powerful, really futuristic features in C++... and created great big steaming piles of crap.</p><p>Efforts to create any reasonable operating system or database system using C++ failed. Telephone switches, previously infallible, failed spectacularly in cascade. Countless clunky, slow, buggy, bloated, unmaintainable Windows apps were written. Government agencies swore off it in favor of other systems, even avoiding systems with OO at all, favoring APL or C.
Windows programmers eschewed the complexity of C++ in favor of VisualBasic.</p><p>This created a bit of a paradox.</p><p>Was the language responsible for all of these flawed designs and executions? Did adding objects, destructors, multiple inheritance, and so on, create a language that it just isn't possible to write clean code in?</p><p>For a long time, people thought so. "C++ is a messy language", they said, conflating the learning curve of the syntax of the language with learning to design objects and APIs that made sense.</p><p>Gosling and those behind Java seemed to think so. They threw away multiple inheritance, operator overloading, and a pile of other things. For a time, Java code was clean. So clean that they started teaching it in schools. Projects started demanding it and all across the world, people with no prior programming knowledge quickly ramped up on the language. They joined the workforce and wrote buggy, overly complex, unmaintainable code.</p><p>Simula's idea of OO was to provide a new abstraction to programmers with which to better model the interactions of the parts of a complex system. If a programmer could conceptualize the program in a way that drew parallels to real objects acting on each other, the objects and their interactions could all be better understood. Insofar as that's true, there's nothing wrong with OO and no reason it should lead to unmaintainable code.</p><p>Much earlier, Donald Knuth wrote _Literate Programming_. Local variables and functions were the celebrated abstraction with which people were making huge messes. Knuth sat down and asked what was going wrong and speculated about how those problems might be avoided.
It proved far easier to offer people a new abstraction that they hadn't yet ruined than to get them to examine the psychological, social, and technical forces driving them to write crap.</p><p>When the C++ guys realized that not only were they writing terrible code but that they were predictably and consistently writing terrible code, they too sat down and put some of their profound intelligence into asking why this was. This was the birth of the field of "Object Oriented Design and Analysis". There were a lot of things that people tried to do with objects that just didn't make sense and didn't work in the long run. There are a lot of early warning signs of failures to conceptualize things well and map them to objects.</p><p>The Java guys, determined not to repeat the mistakes of the C++ guys, adopted less analytical, more rule-of-thumb versions of this and called them "Design Patterns", "Code Smells", and so on. They fairly successfully marketed to their own ranks the ideas of studying design for design's sake rather than merely learning a language.</p><p>The Perl camp briefly and very superficially flirted with these ideas too but the glamour wore off and sitting down and just winging it and seeing where it goes is just so gosh darn much fun. History is boring.
Of course, if you've read history, repeating it is boring.</p><p>Even this sort of backfired for the Java camp; understanding so well how to build certain types of abstractions, they went nuts building them all over the place and then managed to construct great big steaming piles of crap on a much higher level -- composed out of large-scale constructs devised to avoid problems at lower levels.</p><p>While Java failed to convince everyone to actually study the inherent follies and blunders novices make when designing software in hopes of avoiding them, it did introduce the world to a different idea: rather than using objects to model programs in terms of parts that interact, use objects to create massive APIs.</p><p>It's a big step backwards but it kind of wins in the "worse is better" department in that it's a lot easier for people to wrap their heads around than OODA. The language vendor created a whole lot of objects for you to use that represent different parts of the system they built, and if you base your application largely around using these pre-created objects, you're less likely to fuck things up. The win of objects became easily traversing large sets of documentation to figure out how things are done. If you get a Foo back from calling Bar.bar, you can instantly look up the docs for it and see if maybe you can weasel a Baz out of that to pass to Quux. This began the umpteenth documentation push which started with flowcharts, spent years on massive shelves of spiral-bound printed pages from vendors, and, at some point, culminated in programmers having bookshelves full of O'Reilly books before branching out into CPAN module docs.</p><p>Everyone got tired of Java and gave up hope that any sort of cure for the world's programming ills would emerge from that camp. Yo dawg, I heard you like MVCs so I put an MVC in your MVC!
Even reading histories of how Java fucked things up and repeatedly missed the point is boring, and history is interesting. Java just smears boring all over stuff.</p><p>Meanwhile, the Python folks were writing much better software without nearly so large of sticks up their butts. No one knows how the Python people did it so I guess they just supposed that Python is magical and code written in it is clean. Likewise, no one understands why Ruby is so magically delicious, so like amateurs banging on a piano keyboard after the maestro gets up, they're trying their hand at it.</p><p>Back to present day and Perl. OO and OO abstractions mean something entirely different from what the Simula guys had in mind.</p><p>Now, when we sit down and create a system, we don't conceptualize the parts of the system and their interactions. We don't model the problem space using analogues to real-world ideas and real-world interactions. We don't search for the correct idiom.</p><p>Instead, we use APIs, and the more, the better. There is no User; instead, there are DBIx::Class ResultSet objects for a user or login table; there are admin screens that check things in that; there are Apache Response objects for talking to the user through; there are piles of Moose meta things that have nothing to do with hypothetical objects modeling a hypothetical universe but do a neat job of setting up neat things. Everything is abstracted -- except for the concepts the program is trying to actually model in its own logic. If there are objects definitively, uniquely, and authoritatively representing things and nothing but that thing in a normalized SQL sense, then those objects are all thrown together in a big pile. A lot of the C++ OODA textbook's pages are concerned with finding idioms for how things interact to model those interactions. In Perl, we just pass stuff. And we're proud of our redneck selves.</p><p>In C++, and again in Java, and most certainly in Perl, we've shat on the grave of Simula.
Smalltalk did not; Ruby had a blessed birth by virtue of drawing so heavily from Smalltalk. Python programmers tried hard to keep the faith even though OO is borderline useless -- nearly as bad as Perl's -- in that language.</p><p>If we were to sit down and try to represent our actual problem space as objects -- what the program is trying to do rather than the APIs it's trying to use -- we'd find that we're knee deep in shit.</p><p>This isn't one man's opinion. C++ programmers trying to claw their language's way out of its grave named parallel inheritance hierarchies as a thing to avoid; Java redubbed them as "code smells". If you have multiple objects each representing a user in some capacity, but not representing some limited part or attribute of the user, you have this disease. If you're using a bunch of abstraction layers to represent the User, for example, you have this disease.</p><p>Yet it has been escaped from. There are cures. You can have your objects representing things which your program deals with and have good abstractions to program in too. MVC frameworks aren't the cure but some people benefit from any restraint they can get.</p><p>And here it is, 2010. Everyone wants to learn the language syntax and fuck around with it, which is fine and great. But not everyone is here, reading this, being told that you can and will paint yourself into a corner in any language -- even assembly language -- and that it isn't the language's fault but especially: the language won't help you.</p><p>Perl programmers and Perl programs suck because Perl programmers think that rather than Perl fostering bad code, it'll help you dig your way out, with all of the magical things it can do. This is what C++ programmers thought. 
Perl programmers would be far better off if they actually thought that Perl fostered bad code and worked against this imagined doom.</p><p>So, let me say it: every programmer in every language, if he lets himself tackle large or ill-defined enough tasks, will code himself into a corner. Neither he nor perhaps anyone who follows him will be able to dig him out. The house of cards will implode. Trusting in abstractions of the language to save you will accelerate this process unless, just perhaps, you're privy to the lore. Books talk about how to design inheritance hierarchies that make sense. They talk about how to handle multiple inheritance and how to conceptualize things as objects. There's lots of benefit to modeling your problem space not as "objects" in the sense of the API you're using but in the sense of actors and props in a drama.</p><p>Like C++ programmers of yore, Perl programmers reliably, consistently build houses of cards.</p><p>As Ruby programmers start to build larger systems and have the time to grow things to that point, they'll discover that merely representing things as objects isn't enough, and that the interactions between objects are out of hand.</p><p>This isn't to say that I'm not susceptible to these same forces. I most certainly am. And I fear them.</p> scrottie 2010-06-04T07:19:57+00:00 journal Walking into a new codebase: tips http://use.perl.org/~scrottie/journal/40373?from=rss <p>Find out where the cache is. Rig tests or a test script to delete it before each run.</p><p>Make a file with the commands necessary to reinitialize the database to a known-good state. Make that an optional step in running the tests.</p><p>Use the test suite as docs to find out which APIs are important and how they're used.</p><p>Use the debugger to figure out which cross-section of the code runs for a given task. Step through with a mixture of 'n' (next statement) and 's' (single step including stepping into function/method calls).
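A session with those stepping commands might look like this (the test file and breakpoint target here are made up; `b`, `c`, `x`, and `T` are other standard perl5db commands worth knowing):

```
$ perl -d t/checkout.t
  DB<1> b Some::Module::process_order   # breakpoint on a sub (hypothetical name)
  DB<2> c                               # continue until it fires
  DB<3> n                               # step over the next statement
  DB<4> s                               # step into the next function/method call
  DB<5> x $order                        # dump a data structure
  DB<6> T                               # stack trace: how did execution get here?
```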
As you trace through execution, you'll learn which utility subroutines can safely be taken for granted and skipped over. Note which tables data gets stuffed into and pulled from.</p><p>Strategically pop over to another window and inspect database tables while in the middle of single-stepping.</p><p>Write SQL queries to inspect database state. Even if you like DBIx::Class, it's handy to be able to simply write "select state, count(*) from foo group by state order by state desc" and other things.</p><p>If tests don't already inspect the tables for the correct resulting state, add tests that do. The utility queries will start life in a notebook or scratch file, get refined, then maybe wind up in a stub .pl, but don't stop there. Add them to the tests. Yes, tests should only test function, not implementation, but, in one sense, the API is probably just a database diddler with side effects, and its correct operation could be specified as not mucking up the database.</p><p>Get the code running on your local machine -- that should go without saying. Mock services, APIs, commands, and whatever is necessary to get the core code to run. Mock stuff until you get the code passing tests again and then start modifying the code. From one project, I have a mock implementation of a Vegas slot machine. My cow-orker and I referred to it affectionately as "ASCII Slots". It handshook, spoke XML over SSL, had a credit meter, tilted on protocol violations, and the whole nine yards. Furthermore, it could be told to abuse the client with a string of simulated network errors including every possible scenario for the connection having gone away after a packet was sent but before it was received, including for packet acknowledgments.</p><p>Before you start work, run the test harness piped into a file.
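For instance (a toy sketch with fake TAP output standing in for a real `prove` run; filenames are arbitrary):

```shell
# Simulate capturing harness output before starting work...
printf 'ok 1 - login\nok 2 - cart\nnot ok 3 - checkout\n' > test-baseline.txt
# ...and again after a change. In real life both captures would be
# something like: prove -v t/ > test-baseline.txt 2>&1
printf 'ok 1 - login\nok 2 - cart\nok 3 - checkout\n' > test-now.txt
# The diff shows exactly which tests flipped between passing and failing.
diff test-baseline.txt test-now.txt || true
```

Committing the baseline file means a plain `git diff` later answers "what did I break (or fix)?" without re-reading the whole harness output.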
After work, pipe it into a different file and diff it, or add the first one to git and let git show you what tests have started passing/failing when you do 'git diff'.</p><p>Comment the source code you're working on with questions, speculation, and so on. This will help you find stuff you were looking at by way of 'git diff'. You can always check out HEAD on that file to get rid of it or else just delete the comment, but you may find that the comments you write to yourself as notes while you're exploring the code have lasting value.</p><p>Similarly to saving test harness output, save full program traces created with perl -d:Trace t/Whatever.t. Trace again later and diff it if you find that an innocent-seeming change causes later tests to break. This can dig up the turning point where one datum value causes a while/if to take a different route.</p><p>If execution takes a route that it shouldn't have and meanders for a while before actually blowing up, add a sanity check earlier on.</p><p>-scott</p> scrottie 2010-06-02T07:14:31+00:00 journal Bane of the existence of ISPs... http://use.perl.org/~scrottie/journal/40356?from=rss <p>In Minnesota around '93, an ISP started offering unlimited dialup. Through an arrangement with the state to make the Internet more widely available beyond government offices and college campuses, they resold access to the UMN.edu modem pool and the one T1 coming into the state to the general public. After my brother and I tag-teamed 24/7 for about two months, they changed their unlimited usage policy "due to the actions of one user".</p><p>That's after the 24-hour labs I was sleeping in, in one case ruining a keyboard with drool. Oops.</p><p>For years, I paid for dedicated SLIP connections, first in Minnesota then in Arizona, then ran a PPP emulator that takes a dial-up shell account and gives you full PPP with NAT. I helped kill Global Crossing by staying dialed in to the "shell account" 24x7.
Downloads ran at night while I was sleeping.</p><p>When Ricochet hit Phoenix, I couldn't resist. $70 was not too much for total freedom of movement. Getting a real IP address and having the option of a dedicated IP was just awesome. Ricochet went bust, sadly. The modems were too expensive to make, and they had Alps Electric of Japan making them, and they had too many made, riding the dot-com optimism, and uptake was too low. God bless Ricochet. I seriously doubt wireless will be half that awesome again.</p><p>Then there was Cox in the days before you could get a home NAT appliance. I had a FreeBSD machine doing NAT -- strictly illegal. One modem (yes, cable modem) per computer on the 'net was the policy. And no serving. I carefully recorded scans against my network for a month and a half and firewalled to refuse, not drop, any future probes from those that probed me, then started hosting crap. Cox never caught me. When I moved, their system completely hosed my account. They could ping me, even as I held the modem in my hand completely disconnected from cable, power, Ethernet, or any other connection. Clearly it wasn't me they were pinging but I could never establish that with them. This was the first time I tried to convince techs that the system was *un*plugged. Cox started off on the wrong foot and just stayed there.</p><p>After months of paying for service that didn't work, I had to cancel and went DSL for the first time. Not many people had DSL at that time. Qwest was giving out Cisco 675s that you configured by typing commands at them over a serial cable from a terminal program -- not xterm but minicom or HyperTerminal or whatever. Having good enough lines was a really big deal. Most people didn't. Most people still don't but they sell you the service anyway, refuse to fix the lines, and, for many users, DSL sucks in comparison to cable.</p><p>When wireless data again became available, I hopped back on the bandwagon, this time with T-Mobile.
For $30/month, you got unlimited dial-up-speed data using GPRS. They ran you through a "transparent" HTTP proxy that recompressed images and cached shit, or made it look like it hadn't changed when it had. This was extremely disruptive to a would-be web developer. It also ran you through two levels of NAT. It is and was ghetto. Outages were frequent. It was like Atlas holding the globe up, but instead of a big strong guy who cares, it was a fat stoner dude who could just barely reach the fridge.</p><p>The next major trend I just had to jump on was aiming high-gain antennas down the road. Shitty MMCX connectors that have a service life of 0.5 insertions, and increased WiFi noise as more and more APs go in and move more and more data, seem to be making this unusable.</p><p>So, back to DSL. Except that it disconnects constantly and has less throughput than 56k dialup. It keeps training down to 64kbps when I try to use it and then probably drops; if not, it has an error rate that's through the roof. I'm pretty sure that dial-up would handle the noise better. This was after it didn't work at all until I took it into the ISP (little local ISPs rule -- I sat in the FastQ office for 2.5 hours while they messed with the thing). It worked with DHCP but not with static IPs, and no one could figure out why, but everyone was interested. I didn't press for details, but I guess it turned out to be "technical reasons". I had a great time chatting about tech and old school shit while this went on though. I feel like I need to go hang out at the ISP more, and bring pizza next time.</p><p>I'm seriously tempted to turn that cell data back on.
I might also give CDMA data a whirl, care of Cricket, who wants $40/month, no contract, with a 5GB soft limit.</p><p>I could easily imagine having the gateway machine with the dipole antenna soldered onto the mini-PCI card and a 56k modem plugged in, this Kyocera CDMA data-card-sharing appliance box, and the DSL modem all running concurrently and me continuing to fumble around for a good connection all day long.</p> scrottie 2010-05-19T21:59:27+00:00 journal Nerd communication http://use.perl.org/~scrottie/journal/40308?from=rss <p>I've written before about how being at the computer makes you look non-busy. Old people especially assume that if you're sitting there silently staring, you must be *desperate* for conversation. That you could be goofing off or concentrating hard on work makes this ambiguous to peers too.</p><p>I've also written about how it's impossible to communicate to clients what constitutes an emergency. Giving out my cell number to clients has never worked. I've been drunk-dialed by chatty clients. Not being able to get to ESPN.com is an emergency related to the shopping cart somehow. If you yell at them, then they don't call you when orders get wedged or the site goes down. The amount of emphasis required to convince a client to only call you in case of an emergency exceeds the amount of emphasis needed to make them never call you. Same thing goes for the people sitting next to you. Add in that coding sessions can easily be 16 hours long and you're making completely unreasonable demands of people, socially speaking.</p><p>If you're going to write code in a non-trivial sense, you have to jealously guard your concentration.</p><p>That's nerd-non-nerd communication.
Nerd conversation is something else entirely.</p><p>I had a hissy fit on Twitter recently when I realized -- or rather, when it was pointed out to me -- that an RT (Request Tracker) queue was automatically, silently created for my various modules and people had been filing reports in them for *years* without any notification being sent to me. I, Marc Lehmann, and apparently no one else think this is a huge problem. I stewed for a while pondering how anyone else could possibly think that this design is okay, until I realized that it fits this model: opt-in communication. The fact that bug reports get bit-bucketed until the programmer goes looking for them is exactly what programmers want.</p><p>Let me tell you a story of Life in Programmerville. The highway department schedules road construction a month in advance using shared calendaring. If not everyone is able to make time for the road construction event, they'll postpone and attempt to reschedule for next month. Your large house with a second story (with an attic) has a basement. The doorbell has auto-away and won't ring unless you've been seen unidle in the past 10 minutes checking your doorbell status. If someone presses the doorbell while you're (auto-)away, it'll log that someone pushed the doorbell and when you next go to check your doorbell status, you'll see that someone came and went and probably take a digital photo of them because that seems like a spiffy feature. Houses all have fiber so that there's no phone number that telemarketers can call that causes ssh sessions over DSL to time out (and god knows you don't have a phone connected to the land-line anyway). The cell is set to silent before bed so no one wakes you up too early and then set to ring again before you go out for dinner in the evening so friends can tell you if they're going to be late.
During the day, when the inclination strikes, you check Twitter, Facebook, LiveJournal, use.perl.org, perlbuzz, CPAN's new module feeds, SMS messages, email, RSS feeds, github, work email, various bug trackers including RT for your Perl modules, your httpd_access logs, dmesg, top, reddit, digg, and Slashdot. If one unexpected pop-up appears on your screen, you flip the hell out and don't stop modifying things until you've discovered five more holes and plugged those too. ... which is a ranty way to say that programmers, in the name of controlling when they're distracted, strongly favor polling over an interrupt-based model. And that's generalizing, of course. All generalizations are false. I know.</p><p>Programmers have a lot of sympathy for other programmers but sometimes this model makes it very difficult to reach another programmer. It might take hopping between different communication mediums for a while before you catch them where they're polling, and the polling cycle might be very slow indeed. Events that are too old get discarded, either by being pushed past the point where they're seen or by being recognized as probably expired and intentionally ignored. Sometimes you have to try to get a message to a programmer weeks or months later. Though they follow vast numbers of channels -- or more likely because they follow vast numbers of channels -- each message is treated as far less significant.</p><p>Broadcast-subscriber-poll mechanisms are especially popular -- Foursquare, Twitter, commit logs, IRC, etc.</p><p>I critically failed to generate chatter on git... specifically, the email gateway for commit messages. I accidentally opted out of a very important communication mechanism, for lack of being able to see other people's communications due to a snafu.
Everyone probably took it as intentional concentration-defense tactics that they didn't like but were extremely disinclined to mention to my face out of this sort of programmer sympathy.</p><p>A recent commenter (hello!) spoke of Asperger's and perceived arrogance. I don't have good data to speak from, but I have to wonder if years and years of trying to jealously guard concentration and tending to prefer quiet over spurious interrupts would create this social veil. Practicing ignoring verbal communication can't be good for learning non-verbal communication. And isn't arrogance essentially being more interested in what's going on in your own head than what other people have to say? Isn't everyone interested in their own thoughts essentially arrogant?</p><p>-scott</p> scrottie 2010-04-09T23:14:42+00:00 journal "Do you think Nintendo would allow Flash-cross-compiled..." http://use.perl.org/~scrottie/journal/40307?from=rss <p>Apple fanboys never cease to amaze me.</p><p>This gem is circulating on Twitter: "Do you think Nintendo would allow Flash-cross-compiled games?"</p><p>Um... _yes_. Quite a lot of companies are in exactly that market -- game engines. Any game made in the last ten years is a mishmash of licensed code, runtimes, toolkits, translators, portability wedges, cross-platform libraries, and so on.</p><p>Apple is not acting like Nintendo. It's acting like Nokia did with the N-Gage. Except Apple actually put 3D-accelerated hardware in their device. But Nokia at least had a D-pad and buttons.</p><p>20 years ago, Nintendo tried to lock developers into exclusives. In those days, writing a game for one system was so radically different from writing it for another (a 6502 variant in the NES vs a Z80 in the Sega Master System, with the games written in assembly with custom bank selectors, for example) that porting code was out of the question. SNES/Genesis continued roughly along these lines except C became a possibility.
By the time the PlayStation 1 and N64 came around, development was mostly C (with some custom assembly for the DSP). Then you could actually write the bulk of the game for both systems at the same time. Third parties have kept attempting to abstract away differences in GPUs ever since.</p><p>There's a lot of great stuff being written for the iPhone, but there's also a lot of great stuff being written for the Wii and Xbox 360 online marketplaces. To tell developers that they have to write entirely separate versions for each (requiring code to be originally written in Objective-C and not translated) is to add significant amounts of work for anyone who would want to write games for multiple platforms. They've created an artificial us-vs-them scenario, apparently hoping the other guys lose out.</p><p>When you advocate without having your facts straight, you've gone from being a tech enthusiast to a marketing tool.</p><p>-scott</p> scrottie 2010-04-09T22:15:56+00:00 journal On the bright side... http://use.perl.org/~scrottie/journal/40272?from=rss <p>My six-month contract is up. It's been a busy six months.</p><p>I tell people that consulting is feast or famine. I've enjoyed the feast. I've been eating good food from the farmer's market. Before that, I killed a 25 pound sack of rice and a 50 pound one along with crazy amounts of dried black beans. I've put on weight. I was down to 155 pounds for a while.</p><p>The firewall/gateway machine was apparently suffering some severe hardware problem that caused it to crash more and more frequently. I got it booting memtest only to see that memtest was reporting the same unhandled page faults that Linux was reporting. Daaamn. That Toughbook was replaced with another. I have to find somewhere to recycle a 20 pound blob of magnesium now.
That's just one misadventure of mine these guys had to endure that's now taken care of and that the next guys won't have to.</p><p>Speaking of Toughbooks, poor old fluffy, the CF-R1, got replaced and retired, first with a CF-51 then with a CF-73. The 51 proved too bulky to haul around under my own power all over creation. It actually worked the backpack zipper loose once and leapt out, hitting the sidewalk at a good speed (and survived, minus a chink in its armour). I could have pulled the 32GB Compact Flash card out of fluffy and stuck it directly into the 51 and then the 73, but fluffy was still on a 2.4 kernel and upgrading to 2.6 was one of the goals. 2.6 wasn't setting up AGP correctly on the thing, so I had to keep it on 2.4; glibc had long since passed 2.4 by, and that was making it difficult to use things like alien and klik to install software on it. That and the 256 megs of RAM made upgrading critical. I decided to try to upgrade the existing OS and keep the software. That proved to be a huge time sink and a disaster. If I had it to do over again, I'd have just done a clean install. I used to be able to manage that but low-level Linux deps have gotten far more complex. Worse, Slackware on these Toughbooks with the Intel WiFi hardware -- and this blows my mind but it's perfectly repeatable -- loses my packets in Chicago. I don't think it's the Intel WiFi either, though the firmware crashes constantly and constantly gets restarted. traceroute in Knoppix on the same machine shows almost no packet loss or lag; traceroute in Slackware on the same machine shows massive packet loss, terrible network performance, and high ping times. It may or may not impact wired connections. It may not really have anything to do with the router in Chicago; the WiFi stack may have just been systematically losing replies and leaning heavily on retransmits of unack'd packets. This problem proved disastrous.
Trips to Seattle and Minnesota as well as in-town coffee shops (often when the home network was down!) left me still stranded without network. An income left me the chance to upgrade hardware and that chance bit me in the ass. But, on the bright side, it's sorted out and next go, I won't be fighting with this one. I should be able to squeeze a couple years out of this machine. And I have a spare. These puppies cost me $60 each on eBay. I've been bashing the Linux users that act like Microsoft users in reinstalling the latest version of the OS at the first sign of trouble, but I have to give it to these guys for being plucky and jumping into battle with just the tools du jour and doing a fantastic job of wielding them.</p><p>The thing with Slackware and Chicago reminds me of a certain FreeBSD NAT at a small company in Scottsdale that absolutely would not speak to a Windows webserver that one of their clients needed for work.</p><p>I got to spend some more time with jquery and I love it. I almost hate to say it, but HTML (heh, DHTML) is turning into a first-class windowing toolkit. Compared to AWT or Swing or most things built on top of X, all of the redraw event and clipping stuff is hidden from you and still optimized, and HTML+CSS is far richer for describing a GUI than the XML that can be used to build GTK windows. HTML isn't a programming language but it's a fantastic example of a declarative language nevertheless. It declares things. Perl does things, one at a time. Creating apps that run in a web browser feels like a terrible abstraction inversion but I have to remember that these things change with time. Hell, Apples run "display PDF" and render HTML widgets in the toolbar. Anything is possible. I was playing with JavaScript in Netscape 1.2 (or was that 1.3?). Almost everything was arrays with numeric subscripts. Elements didn't have ids. You'd diddle document.forms[5]. Things were buggy beyond description.
It's come a looong way, baby.</p><p>I got to spend some serious quality time with DBIx::Class. I tried hard to be open-minded but this has really cemented my feelings about ORMs. SQL is a powerful, expressive language. We're living in an age that finally values domain-specific languages. Regex is one; it rocks. SQL is another. It rocks at declarative "programming" against relational datasets. Trying to replace SQL with Perl is dumb. That would be like trying to rewrite a good awk script in QBASIC. Or like writing a Perl program to, step by step, add visual elements to a screen (hey, that's what Java Swing does!). Sure, QBASIC is a more general-purpose language, but it does not do what awk does, at least not cleanly, and that's in the simple case. In the complex case, it's just downright painful. I know people don't like to mix Perl and SQL but for chrissakes, we're working with HTML, JavaScript and CSS already and probably lots of other things. There are some useful abstractions in DBIx::Class. My stabs at abstracting DBI dealt with those rather nicely, I think. I should release some of that. I guess I was doubting that it's still relevant but I think it is. One thing DBIx::Class does do that's neat is deploy the DDL (data definition) for you. If you deploy Perl somewhere and point it at a database, it'll create the tables, constraints, and so on. Sweet! I described using DBIx::Class as reminding me of a Mr. Bean episode where he winds up trying to carry a sofa and other furniture home in a tiny car and has to rig up a system for steering, braking and accelerating using broomsticks that he actuates from on top of the sofa on top of the car. All of the indirection did not help; it didn't even just get in the way; it made the job almost impossible, comically so. Rather than just using identifiers that exist in the database, relations get nicknames in the Schema files.
With this auto-generated stuff, you're referring to foreign tables by using the name that that table uses to refer to the current table's primary key. I think I have a hard time wrapping my head around anything I cannot fathom; so much of my hackery has been based on colliding my own design sense with others and anticipating good designs that I find it almost impossible to anticipate a bad design. But I'm now better educated in this department, enough to brush up my code and release it.</p><p>Then I got to spend some quality time with git. I did sit down and read about its internal datastructures. Bootstrapping knowledge is hard. A lot of stuff on the Web is misleading. This happens with young technologies -- people who really shouldn't, pretending to be experts. Tutorials assume overly simplistic scenarios that get blown away when you're working with people that know what they're doing. You need to know a certain amount to be able to see signs that something is misleading or incorrect. I think git's reputation stems from the sort of alpha geek who early adopted git. These people get excited by cool technology but lack some human-human teaching instinct. They declare things to be "easy!" very readily and rattle off strings of commands that can easily go wrong if considerations aren't taken into account. They have no idea they're doing this. I'm generalizing from several git users I've been exposed to for some time, here. I think Perl users tend to fit this same class. We're so good at what we do and we've been doing it for so long, we forget the pitfalls that novices step into. Every problem is "easy!". We're jumping to give the correct bit of code but failing to communicate what's needed to conceptualize what's happening or to otherwise generalize the knowledge. To those on the outside, this creates the impression that the technology in question is overly fickle and overly complex -- somewhat ironically, since the attempt was to portray it as easy.
Anything unpredictable and hard to conceptualize is going to seem "hard". But I'm beginning to be able to conceptualize this thing. At least one individual in the git camp showed enough self-awareness here to communicate that understanding git's datastructures is the key to understanding git. git does awesome things, no doubt, and with power comes a certain amount of necessary complexity. This complexity, be it in git or Perl, cannot be swept under the rug.</p><p>Then there's Zeus. I spent a week, on and off, just looking all over creation for the WSDL, to use as a cheat to figure out what the data being sent to the thing was supposed to look like. Turns out that even though the Zeus site insists that it has it, it doesn't, but it can be found in the most unlikely place -- the help screen of the Zeus machine itself. Even though Zeus's admin and API are implemented in Perl, there are no Perl examples in the docs of using the datastructures. The various resources that must get built to set up a functioning load balancer through the API are numerous, haphazard, and badly non-normalized. The documentation lists almost every function as taking (Char[] names, Char[] values). Bloody thanks a lot. Names of which other resource? What's valid for the values? Sorting out the API took a lot of the same sort of reverse engineering I was doing right around 2000 trying to puzzle out the numerous bank credit card processing gateways before Authorize.net came along, published some *good* documentation, and ran everyone else out of business overnight (even though Authorize.net ran on Windows and had outages that would sometimes be long enough that the credit card companies reversed the charges -- something like 5 days). It's always good to get the chance to work with a product that's *expensive*. I can play with Linux all day long but you have to get a job at the right place to get to touch certain things. I should do an online article detailing what I've learned.</p><p>Oh yeah.
And I got to spend a little time with Nagios. The logic for sucking Perl plugins in is just cranky. It falls into a classic anti-pattern in caching failure, and it doesn't signal failure in any useful way. I actually doubt that it knows the difference itself.</p><p>I have to thank these guys for investing in me to learn one thing after another. I wish I could repay that investment.</p><p>I haven't lost my touch in tickling bizarre bugs.</p><p>I practiced being nice, at the urging of friends. I'm still a deer in the headlights when confronted with hu-mans; I can never conceptualize what's going on in their heads, and I think my stress in facing them stresses them back out. It's like if you encounter an alien in the woods and you're all like "ARHGH!" and he's all like "ARHGGH!" and so you're all like "ARGHGH!". It's just no good. Even being nice, I have to learn how to distribute warm fuzzies. My normal model seems to be to antagonize people into answering questions. If things don't make sense, I tease people for being apparently silly. People *hate* being unintentionally, apparently silly, so this is a fantastic way to get them to answer things -- they stop what they're doing and vigorously explain away. Putting down that tool was hard. Learning other tools, I expect, will also be hard. Anyway, this was a welcome chance to experiment with that one.</p> scrottie 2010-03-26T06:36:40+00:00 journal A programmer just like me! http://use.perl.org/~scrottie/journal/40194?from=rss <p>There was a Red Dwarf episode (spoofish British low-brow sci-fi) where Rimmer, the officious, slimy roommate, managed to get another hologram/clone of himself going. For a while, they annoyed the hell out of Lister, the other roommate. But then the two Rimmers became each other's worst nightmare.</p><p>Everyone thinks, gee, wouldn't it be great if I had another programmer just like myself? You kick out tons of code each day. You don't really get stuck on stupid things.
Everything is logical and orderly, usually with all the right technologies.</p><p>Everyone wants this mysterious ninja coder. I'm sick of hearing about him. He writes lots of code and gets things done fast. He doesn't get stuck on things or need a lot of help. He agrees with your style and module fashion and agrees that you know software architecture.</p><p>Today, I kept hearing "we need really good programmers". I also kept hearing "I'm concerned those guys have been doing things their own way too long and aren't adaptable enough to our way".</p><p>You think you know how to architect but you don't. If you had someone just like you, they'd still think your code sucks.</p><p>The people that wrote the code can quickly add features to it and work on it. That's always how it is. That means nothing. Look at awstats. It's a fucking nightmare. Yet, for ages, it grew rapidly, adding features left and right, doing more and more things, and was one of the de facto stats packages. It's a mess of global code, global data structures, and undocumented transforms.</p><p>But really I'm making excuses for myself. I suck. I've taken stabs at it already but I need to compile a list of things I've done to blow this gig. That's another post. Subconsciously not being willing to put code in unless I can convince myself that it's correct, coupled with this mass of undocumented, side-effect-riddled code, is perhaps the worst offender. Two gigs ago didn't work out so well, last time I worked for a medium-sized company, so I swore to do things differently. I think I preferred that approach -- I instrumented when I needed to instrument and when people didn't like it, I told them to piss off. Trying to do things other people's way was educational. I know a lot more about several technologies now. Trying to swap out my development philosophy was just destructive and pointless. I will never, ever get the hang of this.</p><p>-scott</p> scrottie 2010-02-19T07:23:19+00:00 journal "Mozilla too much like Windows?"
addendum http://use.perl.org/~scrottie/journal/40185?from=rss <p>http://news.cnet.com/8301-13505_3-10453399-16.html?part=rss&amp;subj=news&amp;tag=2547-1_3-0-20</p><p>I thought, oh, goody, finally someone else has woken up to this and documented it in a mainstream news outlet. But no. His analogy is limited to sluggishness and "community".</p><p>Look, people. Mozilla crashes. It lacks the stability that used to be the hallmark of Unix software. Unix software didn't blame the user -- "you have too many tabs open, no wonder it's crashing". Unix software has traditionally withstood abuse.</p><p>If a user on my system launches firefox while another user is running it, the first user gets a new window with all of their start tabs in it open again and the second user doesn't get anything. What the fucking hell?! That's another way that Firefox is like Windows -- it's so preoccupied with the single-user experience that it throws security out the window.</p><p>There's this bug report of Mozilla ruining an engagement when it leaked data from one user profile to another: https://bugzilla.mozilla.org/show_bug.cgi?id=330884</p><p>That's not an isolated case. Any "privacy" has been half-assed and the creators repeatedly forget about the foibles of before and repeat them in various incarnations.</p><p>If one user launching firefox affects another user running it, then I seriously doubt that there's any real security -- security that's *free* with Unix unless you work to defeat it -- to protect the first user's data from the second. Actively working to negate free security in the name of rushing a whiz-bang product to market is decidedly from the Windows camp and Firefox does that over and over again.</p><p>It nags you to be your "primary browser". On Unix. I'm probably not running IE on Unix and if I am, it's not successfully hijacking "primary browserhood" from Firefox. Instead, it's fighting against things like Konq.
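(An aside on the multi-user gripe above: scoping an instance lock per user is nearly free on Unix. Here's a sketch in core Perl -- the paths and names are made up, and this is not Firefox's actual mechanism, just what well-behaved per-user scoping looks like.)

```perl
#!/usr/bin/perl
# Sketch of a per-user single-instance lock, the way a well-behaved Unix
# program might scope it.  Paths and names here are hypothetical; this is
# not how Firefox actually does it.
use strict;
use warnings;
use Fcntl qw(LOCK_EX LOCK_NB);
use File::Temp ();

# One lock file per uid, so user A holding the lock can never swallow or
# get signaled by user B's launch.
my $dir  = File::Temp::tempdir(CLEANUP => 1);   # stand-in for /var/run or ~
my $path = "$dir/browser.$<.lock";              # $< is the real uid

sub grab_lock {
    my ($path) = @_;
    open my $fh, '>>', $path or die "open $path: $!";
    # Non-blocking exclusive lock: returns the handle on success, undef if
    # another instance (for this uid) already holds it.
    return flock($fh, LOCK_EX | LOCK_NB) ? $fh : undef;
}

my $first  = grab_lock($path);   # this "instance" wins
my $second = grab_lock($path);   # second launch, same user: politely refused

print $first  ? "first instance got the lock\n" : "no lock\n";
print $second ? "oops, two instances\n"         : "second launch deferred\n";
```

Keying the lock file on the uid is the whole trick: each user fights only with their own previous instance, never with anyone else's.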
Unix software shouldn't and, aside from Firefox, generally doesn't fight with other software in petty little squabbles like this. This was entirely a Windows thing.</p><p>Security. Abysmal. Even the legendarily bad Unix software such as sendmail and NFS/rpc required extremely few patches and settled down into extremely infrequent vulnerabilities much more quickly. Firefox makes sendmail look secure. That's pathetic. Bind 9 is a fortress by comparison to Firefox. And no, frequent security fixes are not the same as security. Bugs are inevitable, but endless torrents of them with no slowdown in sight was not the norm before and it's tragic that it is now.</p><p>It's very much the Unix way that there be multiple competing offerings to do any task. Unix is modular. People readily replace 'ls' and 'more' with versions they like better. Firefox effectively had complete market share. The next contender was Konqueror, which in olden days was fast and light but quickly turned into another bloated, crashing, sluggish monster. Now there's Chrome.</p><p>I held out with links -g and w3m (with the graphics patch, inside screen) as long as I could. Both of those are stable, well-behaved programs. galeon was also fun. Eventually, I found I had to use Firefox or Seamonkey. Granted, the Web has long caused problems for Unix. Anyone remember client-side VBScript/ASP, or Microsoft's IIS forcing clients to do NT domain authentication to access Web resources?</p><p>User interface race conditions were squarely in the domain of Windows. You should never have a field selected and then find that the program is modifying it at the same time you are.
Yet Firefox's URL bar will suggest completions and then take unrelated keystrokes as confirmation that you want them; moments after you click into the URL box and after you've started typing, it'll highlight a chunk of text causing you to start typing over stuff; it'll move the cursor moments after you carefully click into the field where you want the cursor, either to the end or beginning. I recently found that I couldn't disable searching Google. I was trying to paste some javascript: bookmarklet into the URL bar and every time I hit enter or clicked the button, it searched Google for the JavaScript. I found an explanation of how to disable that behavior -- remove all of the .xml in wherever/firefox/searchplugins and restart. The behavior continued. I looked in searchplugins and Firefox had replaced the .xml files. Fuck! There's no allowance for people who hate Google. The choice was made for you and thrust upon you -- they want money from Google, therefore you're going to be sacrificed at Google's altar, and if you don't like that, fuck you.</p><p>Just like Windows software, Firefox treats its users like shit. It's controlling, patronizing, combative, and demanding.</p><p>-scott</p> scrottie 2010-02-16T17:41:10+00:00 journal Appalling unprofessionalism and sexism at Frozen Perl http://use.perl.org/~scrottie/journal/40171?from=rss <p>Someone -- male -- noticed and loudly proclaimed having noticed that a "sexist" word was used on the workshop's IRC channel. Of course, they didn't say what or who said it. I'm concerned it might have been me. I said that a certain talk "sucked donkey dong. and not in a good way". Perhaps I said other things.</p><p>Since the Ruby conf fiasco (link below), we've been a bit trigger-happy to identify, point out, and fix anything possibly offensive to...
well, I guess, anyone who could be offended.</p><p>http://theworkinggeek.com/2009/06/dirty-presentations-xkcd-and-the-perils-of-140-cha.html</p><p>Commenting on that link (which commented on the Ruby presentation): It seems like they're saying that because it's stereotypical of adolescent geek sexuality, it's bad. Yes, there are bad (especially underdeveloped) aspects of geek sexuality, but not everything in geek sexuality is automatically bad.</p><p>Grown men can't be juvenile; neither can geeks. The other day, I threw a small beanbag embroidered with the word "douchelord" at a woman (a well-humored one -- she's married to a sometimes adolescent college roommate I had and she's the one who bought the beanbag) who was going off about the plot intricacies of the _three_ Twilight books, which she read. I was being subjected to female adolescent sexuality and it was painful. Thank heavens (I believe in empty heavens free of deities and free of undead humans) that, as adults, we're able to communicate with the opposite gender. Or not.</p><p>I don't claim to offer wisdom here. Perhaps I should claim to offer the point of view of the average sexist idiot.</p><p>As it stands, (and I was told this -- I didn't count) two women attended the workshop which (ditto) had about 140 people at it. Should our concern really be to try to make it even more "professional" in hopes that the extremely few extremely tiny omissions are what's keeping the floodgate of women closed? And if it's so important that gender balance be struck in gatherings, why aren't we going to pottery class and book club?</p><p>Often I suggest on IRC and in person, through off-color jokes, that I like sheep. I am essentially from Minnesota, after all. Sheep are not offended by this, I'm pretty sure -- the jokes, I mean.
But humans often are, often in the same way and to the same degree as with any other remark I make, and as far as I know, there's no goal to try to make Frozen Perl more friendly to sheep.</p><p>I suppose my goal is, as it has always been, to offend people. It's my non-outgrown geek adolescent sexuality behind the wheel. I realize that it's possible to make inside jokes where the only people who would be offended don't get the joke, but that doesn't serve my goal. I suppose I want to tweak the normals. I hang out with freaks. I adore them. Lack of freaks makes me anxious. Frozen Perl makes me anxious. The phrase "white bread" comes to mind. The one mohawk went a long way to making me feel at home. Minnesota does pretty well on the freak scale -- better than Phoenix. Perhaps -- I can't say -- this feeling out of place gives me some shared empathy with the primary demographic that we're so eager to make feel comfortable.</p><p>I guess when it boils down to it, on one hand, we're a barren wasteland of human sexuality trying to make people of different sexes and sexualities feel comfortable, and on the other hand, even though we've trained ourselves to react with horror and disapproval at anything resembling a gender issue, geeks are generally among the most non-judgmental and open-minded.</p><p>If a certain -- and that standard is set to a very low, non-demanding level -- if a certain level of color can't be established at a con, my bored brain might make it its goal to get thrown out. I might fancy myself the title of the first person to get thrown out of a Perl con.</p><p>Hmm. "Uncomfortable and boring"... I've heard that phrase before.</p><p>-scott</p> scrottie 2010-02-09T17:04:06+00:00 journal Frozen Perl and Genetic Algorithms + Floating Point Tests http://use.perl.org/~scrottie/journal/40168?from=rss <p>My Frozen Perl Lightning Talk was on the topic of large legacy codebases, sort of.
Specifically, it was on how to continue to grow them without having to look at them. A more serious talk might follow on this one -- one that involves useful code visualization and inspection. It's a topic that's been on my mind for a long time and one that my friends and I often discuss.</p><p>I monkey patched Test::Harness to accept floating point values for test results rather than merely "ok" and "not ok" and then used that as a fitness test for a genetic algorithm combined with a Markov-chains chatterbox trained from Perl source code. It kept alive and bred permutations of the Markov-chains-generated code that did better on the tests.</p><p>It was part political commentary -- "Perl programmers pioneered building massive unmaintainable codebases and it's Perl programmers who will take it to the next level!". Also with my monkey patching and the genetic algorithm's mucking about in the Markov-chains package, I was showing off _hackish_-styled code. In what I consider good Perl spirit, interesting modules were wired together in interesting ways.</p><p>The first thing people heard during the day was that you shouldn't use anonymous subroutines (!!) or vi or emacs but instead use an IDE (!!). This was from a presentation done in Keynote with Comic Sans. The last thing they heard during the day was "experiment, have fun, explore".</p><p>I'm tired of the message at these sorts of things being "Perl is dangerous and has lots of sharp pointy bits that'll put your eye out... we don't want Perl to get any more of a bad rep so stay on the trail and keep your arms in the bus". Fuck that. There are precious few "at your own risk" languages. Assembly is one of them. C++ can be. Ruby hackers indulge playfulness in its place.</p><p>I keep telling people that if Perl turns its back on hackery, it'll lose that ecosystem *and* it still won't break into Java's and Python's. It'll wind up without a user base at all.
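Backing up to the lightning-talk hack for a second: the heart of it -- scoring test output as a float instead of a boolean, so a genetic algorithm has a gradient to climb -- fits in a few lines. This is an illustration of the idea, not the released Test-Float code linked below:

```perl
#!/usr/bin/perl
# Sketch of scoring TAP-ish output with floating point results instead of
# just ok/not ok, for use as a genetic-algorithm fitness function.  The
# "ok 0.85" syntax is the redefined semantics from the talk, not real TAP.
use strict;
use warnings;

sub fitness {
    my ($tap) = @_;
    my @scores;
    for my $line (split /\n/, $tap) {
        if    ($line =~ /^ok\s+([\d.]+)/) { push @scores, $1 }  # "ok 0.85"
        elsif ($line =~ /^ok\b/)          { push @scores, 1 }   # plain pass
        elsif ($line =~ /^not ok\b/)      { push @scores, 0 }   # plain fail
    }
    return 0 unless @scores;
    my $sum = 0;
    $sum += $_ for @scores;
    return $sum / @scores;    # average score = fitness, 0..1
}

# A mutated Markov-chains program might earn partial credit like this:
my $tap = "ok 0.85\nok\nnot ok\nok 0.25\n";
printf "fitness: %.3f\n", fitness($tap);   # prints "fitness: 0.525"
```

Breeding then just means keeping the permutations whose fitness beats the rest of the population, instead of demanding all-or-nothing passes.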
Computer companies, languages, automobiles -- any product at all -- that's unclear on its identity winds up with no user base. Perl programmers need to know that Perl came out of sed, awk, BASIC-PLUS, and a pile of other languages, and that the sharp edges exist for a reason, and a lot of parts of Perl don't make sense in the context of yet-another-C-like-language.</p><p>Also, FYI... I've been writing a lot on http://scrottie.livejournal.com lately, including stuff that I should probably put or copy here. And there's me on Twitter too.</p><p>Oh yeah... the code is at http://slowass.net/~scott/tmp/Test-Float-0.1.tar.gz and in http://slowass.net/~scott/tmp/Test-Float-0.1/</p><p>-scott</p> scrottie 2010-02-08T20:55:11+00:00 journal Don't worry about it http://use.perl.org/~scrottie/journal/40119?from=rss <p>I struggle with corporate culture. Last time I worked for a medium sized company, I resolved to adopt a "don't worry about it" attitude. I was worrying about things that were other people's jobs; things that could not be fixed; but worst of all, I was worrying about things involved in my interpretation of my project, as compared to my actual assignment. Right now, I'm looking at a device that has to be programmatically configured. The Web UI maintains and pulls state from the database, not the device, creating the possibility that the two will get out of sync. That's one of many examples. I keep running through "what if" scenarios in my head. Working on code alone or in very small teams, I have to. Here, I can't do this. I have to do something, get feedback, and then go from there. I have to leave most of the problems unsolved. 
For this feature, I need to start with an on/off switch.</p> scrottie 2010-01-21T23:22:50+00:00 journal Less non-constructive thoughts on DBIx::Class http://use.perl.org/~scrottie/journal/40105?from=rss <p>Working with the Web, there's going to be some kind of dispatch system for getting requests to handlers, and there are going to be the handlers. Then with a database there is logic to fetch from and store to the database. You've probably just stepped into a trap. In a useful object design, objects are named after the nouns the system was written to operate on, and after the verbs it performs on those objects. The program is a reflection of the problem it's meant to solve and enough of the problem's world. If you're writing an accounting system, you would, ideally, have objects for line items, accounts, etc, and if you're doing a visitor-type thing, then for actions that can be done on other objects. But, chances are, that's not what you're doing. Your problem has shifted from accounting to that of dealing with the computer you're programming. Your objects are named after this problem space instead: request handlers; the database; and so on.</p><p>MVC won't save you from defaulting to dumping business logic into big classes because OO-ifying the business logic isn't as pressing as setting up the web side and database glue.</p><p>You could subclass your DBIx::Class resultset objects and trust your schema to model the actual problem. That could work nicely if your database is full of useful views and uses views to abstract away the changing, growing actual schema. Or you could make a dedicated effort at putting objects in front of recordset objects for important things ("account", "customer", etc) and letting those serialize/deserialize themselves using DBIx::Class with a has-a sort of relationship to the recordset objects they correspond to.
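A stripped-down sketch of that last arrangement -- business logic in a domain object, persistence delegated to a has-a row object. A plain in-memory class stands in for the DBIx::Class row here, and all of the names are hypothetical:

```perl
#!/usr/bin/perl
# Sketch of putting a domain object in front of a recordset object:
# business logic lives in Account; persistence is delegated to a has-a
# "row" object.  Fake::Row is an in-memory stand-in for a DBIx::Class
# row; every name here is hypothetical.
use strict;
use warnings;

{
    package Fake::Row;                     # stand-in for a DBIx::Class row
    my %table;                             # the "database", keyed by id
    sub new    { my ($class, %f) = @_; bless { %f }, $class }
    sub update { my $self = shift; $table{ $self->{id} } = { %$self }; $self }
    sub find   { my ($class, $id) = @_; bless { %{ $table{$id} } }, $class }
}

{
    package Account;                       # the domain object
    sub new     { my ($class, $row) = @_; bless { row => $row }, $class }
    sub balance { $_[0]{row}{balance} }
    sub deposit {                          # business logic, no SQL in sight
        my ($self, $amount) = @_;
        die "deposits must be positive\n" if $amount <= 0;
        $self->{row}{balance} += $amount;
        $self->{row}->update;              # delegate persistence to the row
        return $self;
    }
}

my $acct = Account->new( Fake::Row->new( id => 7, balance => 100 ) );
$acct->deposit(25);

# Reload from "the database" through a fresh row object.
my $reloaded = Account->new( Fake::Row->find(7) );
print "balance: ", $reloaded->balance, "\n";   # prints "balance: 125"
```

The point of the indirection is that Account stays named after the accounting problem while the row object stays named after the database problem, instead of one class trying to be both.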
There are other things you could use instead of DBIx::Class if you only want persistence -- KiokuDB comes to mind.</p><p>I don't have any brilliant thoughts right now other than to say, don't do that.</p><p>-scott</p> scrottie 2010-01-15T15:01:41+00:00 journal Race condition Deja Vu http://use.perl.org/~scrottie/journal/40094?from=rss <p>Part of the last gig involved doing high availability and extreme high reliability (large amounts of money involved) between two systems without locking primitives. Perhaps a future version of XML or SOAP will include locking primitives. Rather than speaking HTTP, it was just raw XML over SSL (with, as per regs, authentication repeated inside of the SSL connection... no single point of failure was the guiding design). Either the server or any of the clients, or both, could be rebooted at any moment and they'd figure out where they were. This wasn't properly planned for at the start and it proved to be a major bugbear that kept cropping up. The regs also required that once something was done, it would not be un-done. The nanosecond that a random number was produced, it had to be preserved. Even if someone had an ultra-sensitive EMI reader and could pick the randoms out of RAM as they're generated (and the generator churned constantly so picking up the seeds with EMI would be of limited utility) and had the ability to crash the server at any moment if the randoms selected weren't to their liking, it still wouldn't matter because they would just re-appear after the server came back up. This means that the server could make an important decision such as what a random was going to be, send it to the client, then the client would crash before it actually got it, and when it came back up, it would have to figure out that the server was further ahead of it and it would have to replay things that, from its own point of view, hadn't happened yet.
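The shape of the fix, at toy scale: make every decision idempotent by persisting it under a sequence number before it's sent, so a crash on either side just means replay. A sketch with hypothetical names -- the real thing persisted to disk and spoke XML over SSL, but the skeleton is the same:

```perl
#!/usr/bin/perl
# Toy sketch of replay-based sync without locking primitives.  Names are
# hypothetical; the real system persisted its log to disk and spoke raw
# XML over SSL, but the skeleton is the same.
use strict;
use warnings;

{
    package Toy::Server;
    sub new { bless { log => [] }, shift }

    # Draw-or-replay a decision for sequence number $n.  The random is
    # recorded the nanosecond it's produced; asking again for the same
    # seq never re-draws it, so a crash on either side is survivable.
    sub decide {
        my ($self, $n) = @_;
        $self->{log}[$n] //= int rand 1_000_000;
        return $self->{log}[$n];
    }

    # A reconnecting client says "the last seq I saw was $n"; the server
    # replays everything it decided after that.
    sub replay_from {
        my ($self, $n) = @_;
        return @{ $self->{log} }[ $n + 1 .. $#{ $self->{log} } ];
    }
}

my $server = Toy::Server->new;
my @sent = map { $server->decide($_) } 0 .. 4;

# Client "crashes" after seeing seq 2; on reconnect it asks for a replay.
my @replayed = $server->replay_from(2);
print "replayed ", scalar @replayed, " decisions\n";
```

The invariant doing all the work is that decide() is idempotent: replaying never invents new decisions, it only re-delivers the ones already preserved.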
Sync without locking is a bugbear.</p><p>So, now I'm doing web stuff with XML/SOAP, memcached, DBIx::Class, etc with async agents that push...</p><p>-scott</p> scrottie 2010-01-12T17:07:57+00:00 journal Every Toughbook ever made http://use.perl.org/~scrottie/journal/39881?from=rss <p>I'm thinking out loud here but also telling a bit of a tale.</p><p>I could have taken a thousand-odd dollars and bought a new laptop and really hoped I'd get on well with it, or else psyched myself into liking it by virtue of cognitive dissonance. Instead, I decided to buy used and not spend too much. So I bought a CF-51. It has a 15" display. This is really bugging me. Among other things, it has worked the zipper loose on my backpack and leaped out of the bag while I was cruising down the canal on my bicycle. It survived, minus a chunk. It probably wouldn't survive again hitting the same corner. This is a semi-rugged Toughbook. I ordered the other machine I was eyeing, a CF-T2. Compared to the R1 it's meant to replace, it's only slightly larger, its touchpad works (the button on the R1 is on the motherboard and it's worn out), and, critically, it takes twice as much RAM. I tried and failed to get ACPI to work. This is a dark art. There are no BIOS updates published by Panasonic for it, and exactly which revision of the machine, the BIOS, and Linux all interact here. I demand suspend of some sort. So I've been shopping again. I bought a T4, which is newer and takes even more RAM and most critically has a better armoured LCD. I cracked the screen on the R1 twice, though both times under extreme circumstances (eg, getting run down by a Buick). But I'm also looking at the 73, which is a pound lighter than the 8 pound 51 and has a 13" rather than 15" screen. It's better armoured than the T4 but has worse battery life at 3 hours and still weighs 7 pounds vs the 3 pounds of the T series.</p><p>This must be awfully boring to read (seriously, why is anyone here?) 
but I'm obsessed with this.</p><p>I want light (my laptop goes lots of places), small enough screen to use on long Greyhound trips, good battery, durable (I wear laptops out but they also suffer harsh backpack conditions, and yes, the R1 was in a laptop case when it got smashed), suspend in Linux...</p><p>There's already a CF-27 fully rugged machine here. So if I order that 73, I'll have a model from almost every line of Toughbook that Panasonic makes, excepting only the swivel screen version of the fully rugged machines and the hand-held industrial computers. I think I probably should write some comparative reviews...</p><p>-scott</p> scrottie 2009-11-11T18:50:06+00:00 journal My own worst enemy http://use.perl.org/~scrottie/journal/39880?from=rss <p>"by revdiablo (1502) Neutral on Tuesday November 10, @11:32PM (#71086)<nobr> <wbr></nobr>...<br>I guess rants are meant to be hyperbolic, but what the hell? You might want to ask yourself whether your wounds are self-inflicted. Frankly, it sounds to me like you're the cause of your own problems."</p><p>I'm elevating this comment into a journal entry and duplicating (with modifications) my reply.</p><p>Yes, I struggle with that question -- to what degree my problems are self-inflicted. As I said (either here or previously), I've tried very hard to do exactly what other people do and just run Debian and let it update itself. I don't know why things that just work for other people don't work for me. Maybe I push them harder. Maybe my mindset is different. My brother and I used to hang out in an arcade and we knew how to crash half a dozen of the games there. It's well known that programmers make terrible testers of their own software. It's just too difficult to be caught up in the minutiae of the implementation of it all and at the same time keep the larger picture in mind of how people are going to try to use the whole thing. 
Vast numbers of Windows users swore that NT 3.51 was stable, and that NT 4 was rock solid, and so on. Turns out that they are if you rationalize away all of the crashes as somehow being your fault and accordingly constantly modify your behavior so as not to provoke the OS. I think there's some of that same mindset going on in the Linux camp.</p><p>As with the arcade, a lot of my problems stem from one thing: I try to do a _lot_ of stuff. I build hundreds of packages from source. I cluster machines using single image patches. When I learned to crash Primal Rage 2.x, I wasn't trying to learn how to crash it. I was merely trying to game the game so that only the challenger would ever have to put a new quarter in it. If you got challenged during the final sequence, where you had to fight each enemy from the entire game one at a time with one (double-sized) bar of health, the game would attempt to resume after the challenge. This worked the first time but the second time, after playing all the way through again, boom, red 68k register dump screen.</p><p>A lot of my motivation for posting these sorts of entries is because I really want to be contradicted. If I can be corrected, I can be educated, and sources of frustration eliminated. It also connects me to how other people think, even if I don't choose to adopt that way of thinking. Often I do choose to. Frankly, I spent a fuck of a lot of time hacking around by myself before I was exposed to other programmers and I willfully shut myself off from other people. It's easy for people to take for granted how socialized almost everyone is. "Common sense" is hard to define but it seems to include a lot of implicit knowledge picked up through observation about what works and what doesn't.</p><p>And part of it is that I'm stupid. 
I've accomplished a lot through doggedness but now I'm old, tired, lazy and burnt out, and very pissed about that fact.</p><p>Another part of my frustration is that, in doing things that other people don't, I notice when those things stop working. I used to be really good at de-hosing hosed Linux (and other Unix) systems. This was a time-honored sysadmin skill. It's gone by the wayside. It's just too difficult now. I'd have fun before installing Slackware over top of Knoppix, for example, then sorting out all of the libraries,<nobr> <wbr></nobr>/etc files, and so on, to come up with a system with all of Knoppix' canned packages but with system updates and the streamlined design of Slackware.</p><p>Sometimes my yelling at people works. I used to go around complaining all the time about how fragile five- and six-stage bootloaders are and how much they suck compared to the old two-stagers. Most people have no idea how convoluted Linux bootloaders have gotten and how many problems that can cause and all of the ways this can go wrong -- but eventually I ran into someone who *did*. The conversation was extremely educational. I learned a lot about what different kinds of diagnostics mean. How much of "LILO" gets printed ("L", "LI", etc) indicates how far it has gotten in early bootstrap before any wedge. To the lay user, it's just a big bag of features that works right if you baby it in that certain way you've learned, the way that enables you to be a Linux user. To a developer working on it, it's a complex and cranky beast with unresolvable edge cases due to the features and complexity.</p><p>I guess ignorant, mindless fanboy-ism pisses me off about as much as my bashing on things pisses the fanboys off. And that's probably by design. Again, I very much welcome being schooled by people who know better than I.</p><p>I like to encourage people not to read my journal. I know I'm not being especially constructive here, but dammit, it's my fucking journal. 
If you don't find my remarks constructive and you don't have any to make, go away. I won't miss you. I can say that with confidence, from experience. If you were trying to be helpful, great, but I'm waay ahead of ya there.</p><p>As far as the kernel not building goes, you were close -- it wasn't me per se, but my home directory, and specifically, my<nobr> <wbr></nobr>.bashrc. By way of experimenting, I eventually ssh'd back into the same machine as guest to ditch the environment and then it worked. My home directory was a constant between different OS installs. This was after I pruned out environment variables that seemed related -- LDFLAGS including<nobr> <wbr></nobr>/usr/local/lib, similar for CFLAGS -- with no love. Whichever variable or file in my home directory is responsible, I'll never know.</p><p>So Slackware is off the hook. RedHat probably is too. Still, from reading through Google, a _lot_ of things can cause this. It's one symptom with a myriad of causes. That's frustrating. I think you'll find that if you get into the implementation guts of gcc, you'll be revolted. I guess that's key... never look at things beyond the surface. Never have to. Refuse to. Maintain the illusion. That's the only way to keep a handle on this stuff.</p><p>-scott</p> scrottie 2009-11-11T18:38:04+00:00 journal The day Linux stopped being self hosting; or, Linux sucks http://use.perl.org/~scrottie/journal/39849?from=rss <p>Background: I bought a new used CF-51 semi-rugged Toughbook to replace the ailing and failing CF-R1 [1]. I've wasted entirely too much time trying to get all of my crap moved over to a new OS install on the new machine.</p><p>The history of which OSes I've tried and what I've done to them has gotten quite long now, but most recently, I blew away CentOS and stuck Slackware back in as CentOS couldn't even compile its own kernel. I cursed CentOS as being stupid and decided I'd deal with the limited number of packages afforded by Slackware. 
So I go to build a kernel in Slackware because there's always something you need that's disabled... and lo, exact same problem. I had just been Googling for the problem with the word "centos" tacked on but I got curious and dropped any mention of any vendor and discovered that Gentoo and other systems had floods of bug reports of the same problem:</p><p>http://bugs.gentoo.org/show_bug.cgi?id=8132<nobr> <wbr></nobr>... every Linux vendor on Earth took a broken GCC and shipped a major release version that's not capable of building its own kernel.</p><p>Linux went non-self-hosting and most people never noticed.</p><p>There are no Slashdot headlines.</p><p>Perhaps we're still distracted with the flood of security announcements and still reeling from the profound ways that RedHat fucked up Perl, and Debian fucked up ssh, and so on. Perhaps our expectations of Linux vendors delivering a working desktop are so smashed that we stopped caring that there's supposed to be a *Unix* *like* operating system underneath. All we care about is that it has "Linux Inside". We don't care how much more convoluted the init system is than HP/UX or how many more pointless CPU-eating extensions it has than Solaris... only that it's Linux. It'll take ages for these idiot OS vendors to undo all of the good will and tarnish the reputation of Linux. As long as it's pretty, no one will look under the hood. They'll happily reinstall Linux over and over again to fix all of the stupid problems.</p><p>I'd love to make a serious effort to migrate to DragonFlyBSD since we have a possibly non-fucked-up BSD again now [2] but nowadays, "open source" software doesn't even build cleanly on Linux. Someone somewhere gets it to build just once at great effort and then it gets packaged into a<nobr> <wbr></nobr>.deb and never builds again. Try to build on something other than Linux and it's a huge project. Back when people ran shit like AIX2 it was easier to get a random package to compile for your random system. 
Making something compile on Ultrix back then was easier than getting something to compile on BSD is now. Infinite numbers of operating-system-specific build crutches portability does not make.</p><p>I just want to say right now, all of you suck. This is the kind of crap that makes me just want to go work for Microsoft. Fuck you all.</p><p>Footnote 1: The cute, tiny CF-R1 is six or so years old now and the microswitch for the left trackpad button doesn't return any more. This same microswitch is soldered to the main board. Also it's maxed at 256 megs of RAM. It's also not as rugged as I'd like -- it's on its third screen, though both breaks happened in events such as my being hit by a car. I really want a laptop I can use as a weapon against motorists... also Xorg doesn't seem to like the SiliconMotion video hardware on the CF-R1 and the 2.6 kernel had some problems.</p><p>Footnote 2: Previous rants were about how NetBSD, OpenBSD, and FreeBSD each, mostly out of fear and envy of Linux, screwed the pooch and made themselves obsolete by giving up the only thing that they had that Linux didn't: stability and sanity.</p><p>-scott</p> scrottie 2009-11-06T19:43:35+00:00 journal Growing the team from small to medium sized http://use.perl.org/~scrottie/journal/39820?from=rss <p>A few things have to change as a software team grows.</p><p>Your code organization system might be entirely logical to you, but that might be because you were there when it was written. Do the module names re-use the same few words over and over? "Manager", "Handler", "Master", "Server", etc are commonly overused, meaningless identifiers, but any over-used word is a symptom of the same problem: meaningless names. New programmers will have to do archeology with grep to figure out how the thing is put together. Re-using the name of the company in multiple parts of the module path to somehow distinguish two parallel code hierarchies similarly creates confusion rather than organization. 
If a module lacks a meaningful name, it lacks a clear distinction for what goes in there as opposed to somewhere else. Also, the module names will start to sound like Monty Python's "Spam" skit. "Oh, did you mean to edit Data Manager Manager Manager Data Manager or Manager Manager Data Manager Manager Data Manager?".</p><p>Using a chat channel is generally a good idea. However, if new programmers are supposed to draw on the entire dev team for help rather than having any sort of per-project or general mentoring system, you have a lot of broadcast overhead. You also put each programmer in the awkward position of deciding at what point to finally step in and help rather than hoping another has more time and knowledge of the matter.</p><p>There's a whole code standards thing. This combines with the organization problem. A new organization system will develop for each new programmer's limited understanding of the existing organization system. More broadly, are future programmers going to be even more bogged down? And what's going to be the priority then?</p><p>-scott</p> scrottie 2009-10-29T23:10:50+00:00 journal Sorts of telecommute... my routine then and now http://use.perl.org/~scrottie/journal/39774?from=rss <p>For NG, I was a consultant in spirit even though I was W2, not 1099. On a typical day, I'd crawl out of bed somewhere between 10 and noon. I tried not to sleep past noon as I wanted to get to any important email in a reasonably timely fashion. Also Yahoo! Small Business email just loves to silently drop email. I can't tell you how much confusion and frustration all of us endured using this piece of shit that the company was actually paying money for. Yahoo! is a spectre of a previous Internet century and needs to die. For a while, I had everyone using codenames for things we were working on. "P" was the poker game (quotes included). "B" was the baccarat game, and the various slots games had two-letter abbreviations. 
The theory was that emails were scoring so high on the spam test that they didn't even make it into the spam folder. This hypothesis proved incorrect as Yahoo! Small Business Email continued to silently drop emails mailed between users in the same corporate domain.</p><p>Anyway... some evenings I'd work late into the night. Other evenings, after doing some email and hunting a few bugs, I'd go drink or engage in some other social activity. Sometimes these would be in clusters; I'd do no actual work for a week, only babysit getting a release packaged or debugged; I'd do nothing but bake goodies for an upcoming bike tour, bike tour, then recover. Other times, I wouldn't leave the house for a week, working 14-hour days.</p><p>When I got something done, I'd send an email saying as much. Very seldom would I be asked for a status report. Once I got grumpy because I was woken up to a page asking for a report in the morning after I'd been asleep mere hours and after I'd sent one before going to bed, hours before. Of course, it turned out that Yahoo! Small Business Email ate it.</p><p>I tried to push for chat just because that would allow you to get instant confirmation that an important factoid had been delivered, but one coworker was on wonky hours, as I was, but different ones, and the other had to run around to the physical locations of vendors, investors, and so on and really didn't have time to monitor chat.</p><p>The non-disclosure agreement was a verbal one. In Vegas, threats really aren't made. Rather than making pretenses that you'll be sued if you screw your employer over, employers actually ask around the grapevine about people. You don't get many chances. Of course, if you commit a felony, you'll either go to jail or else go on the lam. All of this self-importance of long contracts signed in triplicate and multiple agreements covering various aspects of work is just missing. Business is done on a handshake. 
The legal department can't keep people from being dumbasses.</p><p>Code is not over-engineered. Almost everything you see in Vegas is either Z80 assembly written for the bare metal or else Macromedia Flash. No one talks about design patterns, best practices, architecture, API design, or any of that. They do try very hard to write, as the famous quote says, "obviously no bugs rather than no obvious bugs". KISS is the guiding principle.</p><p>We'd all go out for beer after work, during office visits. Sometimes almost every night.</p><p>The code repo sat on a colocated Linux machine and a qualified sysadmin did what was needed for him to feel comfortable for us to be able to push and pull from it. There was a tacit understanding that we couldn't allow our laptops to get pwned or stolen with sensitive data on them.</p><p>Conference calls were done on an as-needed basis between those concerned when email wasn't cutting it.</p><p>I was at liberty to underbill rather than make a quota; this allowed me to research technologies or just read code without concern of logging unproductive hours. I never felt like I couldn't sit down with a book and read all about how to do something.</p><p>I wanted to write in praise of my experience working for Vegas, not to diss on Web companies, but I have to complete the contrast. I'll do it as quickly as possible. I've done a number of these now -- I'm not pointing my finger at any one company.</p><p>Regular hours; mornings; team dynamics; "good fit"; commitment to 40 hours; VPNs; scrum; speaker phone in to meetings;<nobr> <wbr></nobr>...</p><p>Sorry, I'm rehashing a common topic I've written about before. JaWS was similar to the Vegas gig; I've been in that basic situation twice now.</p><p>-scott</p> scrottie 2009-10-20T08:13:44+00:00 journal Optimism strategy http://use.perl.org/~scrottie/journal/39754?from=rss <p>Famous pro sports athletes sometimes try to transition from one sport to another. 
A basketball player will take up football, or martial arts, or whatever. They don't assume that they'll be successful. Often they'll say it's for the challenge and to expand themselves. This kind of optimism is why they did well in the sport that they were in. They seldom succeed in the sport they've joined. That doesn't mean that the attitude is a bad one though.</p><p>The world is full of kids who just learned a bit of programming and are making dynamic websites for people. Often this ends badly, as the sites get compromised or fail under load. But it's a good strategy for learning to program and to make websites.</p><p>I'm not good at being an optimist; out of necessity I decided to try. I'm jumping headfirst into a pile of technologies that I've been avoiding, and the idea of tackling another code base made my skin crawl. Honestly, the software market is better than the light industrial, retail, etc markets, at least in Phoenix. I know this. I tried. I learned that call centers really have vanished from the US. After the dot-com blowout, I worked in a call center for a while. This time, I failed to find such a job.</p><p>And, as I chose to ignore, the employer is going to be unhappy. But next time I try, I'll suck a little less. I hit the bottle pretty hard sometimes but I'm thinking at this point I need to explore other drugs that take the edge off of plummeting serotonin levels associated with exploring a new, huge code base. Or I need to transition away from working on large code bases. Branching into any new technology (for me) is going to involve a lot of not knowing.</p><p>I have to remember that optimism isn't easy. It doesn't have the predictability that cynicism does. It involves a lot of "I don't know, but I'm doing it anyway". If optimism about ActionScript doesn't pan out, I need to use some towards HP/UX. I've heard stories of idiot sysadmins. 
An optimist strategy is to think that I couldn't screw up any worse than them.</p><p>-scott</p> scrottie 2009-10-14T16:01:52+00:00 journal Cynicism strategy http://use.perl.org/~scrottie/journal/39753?from=rss <p>I badly need a newer, faster computer as the requirements of modern software have outpaced the 1.7GHz P4 Celeron desktop and the 800MHz fanless laptop I have. I have a little cash at the moment. Yet I'm far more inclined to spend it all stockpiling dry goods and canned goods from the dented can store. I'm extremely apprehensive about splurging for newer, faster hardware. Even a $230 gPC3 with a 2.0GHz Sempron would be a huge upgrade from this P4 Celeron -- two terrible things together at last! Obnoxious number of pipeline stages meets stripped-out ALUs and cache. What a dog.</p><p>In a very real sense, my cynicism here is a self-fulfilling prophecy. What kind of programmer carries around an 800MHz fanless computer as their main machine? How many programmers have to take pace breaks because the drive light is pegged yet again while it thrashes around in its maxed-out 256 megs of RAM?</p><p>In a sense, through my actions, I'm saying "I think I can't, I think I can't".</p><p>There are plenty of ways to divide programmers according to philosophy. I'm one who worries about the worst case -- the worst-case run time of the algorithm, the race conditions, the boogeymen lurking in the basement of the codebase, and so on. Other programmers are unarguably optimistic: they try daring things and celebrate the moment it works, perhaps rightly thinking that getting it to work well will be the easy part.</p><p>I'm drowning in leaky abstractions. I think the people who think VPNs are great are more easily able to ignore losing all of their sessions those several or hundreds of times a day the VPN disconnects. People who think that any technology is great are able to put up with the downside of that technology... 
whereas I'm stuck in a cost:benefit analysis that, even if it doesn't rule against the technology, makes me acutely aware of its downside. From my point of view, people are overly eager to assume the negative costs of technology for the promise of benefits. Optimistic programmers put the rest of us up to our necks in abstractions we don't need and often don't benefit from at all. Yet you can't yell at someone for being an optimist.</p><p>-scott</p> scrottie 2009-10-14T15:33:34+00:00 journal