Yesterday, I got the X-Surf 3cc Ethernet card that I broke down and ordered for my Amiga 3000. There's some backstory about serial consoles, Sparcs, and the cluster, but it's not important. The 3000 was also packaged as a Unix machine, running a pretty standard port of SysV. It was the first Amiga to come standard with an MMU and SCSI. It'll also kick out 1280x resolution graphics at 2bpp. Commodore sold an Ethernet board for it along with Unix on tape.
The X-Surf is really an ISA card, probably NE2000, mounted in a little carrier. There are confusingly few pins attached, and the logic on the carrier amounts to a few small 7400-series chips and one slightly larger chip that also couldn't possibly have enough logic on it to do what it does. And then, just to convince you that you're nuts, it adds an IDE port that alone has more lines than the one little adapter chip does. The Amiga really is a machine for psychopaths, by psychopaths. Everyone sits around all of the time trying to out-psycho everyone else. Just take a look at the demo scene for the thing. Amiga virtually defined the demo scene.
I have/had AmigaOS 3.9 on the thing. 3.9 is post-Commodore-death. Someone bought the rights and sold them, and someone bought them and sold them, and so on, until a sue-happy band of self-righteous ruffians managed to convince the remaining user base that buying the rights at garage-sale prices entitled them to be kings of the squalid kingdom, so that they could go around lynching anyone else trying to do anything for the Amiga. Anyway, OS 3.9 is pretty recent as far as Amiga stuff goes, even though it's ten years old. Most people stopped at 3.1. 3.9 only came out on CD-ROM. The 3000 doesn't have a bay, but it does have SCSI, so the CD-ROM, when needed, gets hung off the side with the case open. I could also set up an enclosure and plug it into the back. I could also probably buy one of those.
X-Surf's stuff did not want to install.
X-Surf actually had an installer, which is impressive. AmigaOS 3.x has a scripting language for installers and an interpreter for it. This installer gave you the choice of two TCP stacks. AmigaOS 3.9 comes with a TCP stack, but you can still swap it out. It's a bit Windows 3.1-like in that regard. The options are GENESiS/AmiTCP and Miami. GENESiS, the AmiTCP configurator and dialer that comes with AmiTCP, was shipped in a version requiring libraries not included in AmigaOS 3.9, so it wouldn't run. AmiTCP would, and AmiTCP was on the HD, though buried a bit. Miami is shareware/crippleware. It required the same library, MagicUI, that I didn't have.
I spent hours sorting out what required what, what I did and didn't have, and how these various packages work and fit together. That's ignoring the device driver for the Ethernet card, which is straightforward. The Amiga has a directory for libraries (which end in ".library").
Amiga programmers love to do ports of Unix software and add GUIs. They've been doing this for ages. They've had gcc since the early ages of gcc, and I ran the Amylaar MUD driver on AmigaOS 1.3 to do development locally, also in the dark ages. Kicking around on aminet.net from the Amiga, I see PHP, MySQL, Apache, BitTorrent, Python, bind9, Samba, VNC, and all sorts of stuff. No one ports just the client. If they port the client, they port the server, too. In the case of AmiTCP, the suite of utilities you'd expect is there, such as host, finger, traceroute, and so on, but to configure TCP/IP, you run a little GUI program and it asks you questions. It took Linux ages to get to this point, and the Amiga was doing it long before. One of the extras on the Extras disc, even as far back as 1.3, was a version of emacs with drop-down menus.
My ambition is to get a desk in a shared office space going and stick this baby there with an updated video card that does high-res, high-bit-depth graphics. If I'm willing to start replacing and upgrading chips on the motherboard, I can take the thing up to a gig of RAM, too, and NetBSD supports it if I ever decide I want to see how Firefox runs on a 16 MHz processor. What I'm really hoping for is for someone to take the latest ColdFire chips from Motorola's spin-off, Freescale, and do an 800 MHz accelerator card for the Amiga 2000/3000/4000. That would RULE.
1. Ran backups
2. Verified integrity of ssh on my local system versus last backup; changed local passwords
3. Verified integrity of my linode chpass with md5sum versus previous backup
4. Locked accounts; fixed changes to shell for system programs, removed additional accounts, changed passwords
5. Killed root processes and shells; accounted for all of the shells and processes in ps
6. Compared md5sums of everything in ps, login shells, rsync, inetd, su, vmlinuz, and various other things between the previous backup and the current one
7. Compared nmap output to netstat -lnp; accounted for all netstat -lnp entries
8. Ran find to find setuid/setgid programs; verified no additional ones exist; ran md5sum against existing ones
9. Replaced sshd, ssh, and their config files and host keys; restarted sshd; relogged and changed passwords
10. Upgraded sshd
11. Temporarily took some services down until I can decide whether I trust/replace them (squid, cron, sendmail)
12. diff -r'd between the two backups; read through the output to account for all changes to the system (new files and changed files) (several notable)
13. Ran find to find world-writable files; ran find to find device files in the wilds of the filesystem
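Several of these steps boil down to the same loop: checksum a list of binaries and compare them against the copies in the last backup. A minimal sketch in Perl using the core Digest::MD5 module; the list of suspect binaries and the backup mount point are assumptions, not what I actually ran:

```perl
#!/usr/bin/perl
# Compare live binaries against copies under a backup tree.
# The binary list and backup root below are hypothetical examples.
use strict;
use warnings;
use Digest::MD5;

sub md5_of {
    my ($path) = @_;
    open my $fh, '<', $path or return undef;
    binmode $fh;
    my $digest = Digest::MD5->new->addfile($fh)->hexdigest;
    close $fh;
    return $digest;
}

my @suspects    = qw(/bin/ps /bin/login /usr/bin/rsync /bin/su);  # assumed list
my $backup_root = '/mnt/backup';                                  # assumed mount point

for my $bin (@suspects) {
    my ($live, $saved) = (md5_of($bin), md5_of("$backup_root$bin"));
    next unless defined $live and defined $saved;   # skip anything missing
    print "CHANGED: $bin\n" if $live ne $saved;
}
```

Nothing here proves a binary is clean, of course; it only tells you when the live copy has drifted from the backup, which is the question most of the checklist keeps asking.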
My long-winded response to http://blogs.perl.org/users/carey_tilden/2010/08/removing-database-abstraction.
Don't get trapped in the mindset of "I have to use X because if I don't, I'm doing it wrong". Quite often, if you don't use X, it's entirely too easy to do it wrong if you don't know what you're doing. You probably don't want to re-implement CGI parameter parsing, for example. But that's not the same thing as saying that you should always use CGI because it's a solved problem, so never do anything else. Nothing is a solved problem. mod_perl is a valid contender to CGI, and now Plack is a valid contender to mod_perl. FastCGI was and is a valid contender to mod_perl, and servers like nginx do awesome things. Yet, tirelessly, the fans of one explain that the competing ideas are somehow not valid.
Sorry, I'm trying to do proof by analogy here. It isn't valid except to the degree that it is. I'll get to databases in a minute.
Quick recap: there are lots of valid approaches; using an alternative is not the same as re-inventing the wheel.
Furthermore, the heaviest technology is seldom the long term winner. Witness the return to lighter HTTP pipelines. For ages, Apache boasted being a bit faster than IIS, in response to which I could only wonder why Apache was so slow.
Okay, back to databases. DBIx::Class to a relational database is a valid option. It's also very heavy. It also doesn't really let you scale your web app out unless the database in question is DB2, Oracle, or one of the few of those that run on a cluster with a lot of processors rather than just one computer. Otherwise you've just added a new bottleneck. DBIx::Class makes it harder to do real relational work -- subqueries, HAVING, or anything hairy. At the very least, you have to create a class file with a lot of boilerplate, reference that file from other files that you made or generated, and stuff the same SQL into there. Abstracting the querying away in simple cases makes it easier to query the database without thinking about it. This leads you to query the database without thinking about it. That's a double-edged sword. In some cases, that's fantastic.
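To make that concrete, here's the kind of "hairy" query -- a subquery inside a HAVING -- done with plain DBI, in one string of SQL, with no class files to generate or reference. This is only a sketch: it assumes DBD::SQLite is installed, and the table and data are made up:

```perl
#!/usr/bin/perl
# Plain DBI with a HAVING-over-a-subquery, something ORMs make awkward.
# Assumes DBD::SQLite; the orders table and rows are invented for illustration.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '', { RaiseError => 1 });

$dbh->do('CREATE TABLE orders (customer TEXT, total REAL)');
my $ins = $dbh->prepare('INSERT INTO orders VALUES (?, ?)');
$ins->execute(@$_) for ['alice', 10], ['alice', 30], ['bob', 5];

# Customers whose order count is above the per-customer average --
# one string of SQL here, a pile of boilerplate in a class-per-table ORM.
my $rows = $dbh->selectall_arrayref(q{
    SELECT customer, COUNT(*) AS n
    FROM orders
    GROUP BY customer
    HAVING n > (SELECT AVG(cnt)
                FROM (SELECT COUNT(*) AS cnt FROM orders GROUP BY customer))
});
print "$_->[0]: $_->[1] orders\n" for @$rows;
```

When the query really is relational work, the SQL is the shortest and clearest spelling of it.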
Lego blocks make it easy to build things, but you seldom buy home appliances built out of Legos. Even more so for Duplo blocks. Sometimes easy tools are in order; sometimes, low-level engineering with an RPN HP calculator is absolutely in order.
Okay, I'll get back to databases in a minute here, but I want to talk about something outrageous for a moment -- not using a relational database at all.
I wrote and use Acme::State for simple command line and daemonized Web apps. It works well with Continuity and should work with the various Coro based Plack servers for the reason that the application stays entirely in one process. All it does is restore state of variables on startup and save them on exit or when explicitly requested. It kind of goes overboard on restoring state and does a good enough job that it breaks lots of modules if not confined to your own namespace, hence the Acme:: designation.
Similarly, people have used Data::Dumper or Storable directly (Acme::State uses Storable under the hood) to serialize data structures on startup and exit. In AnyEvent frameworks, it's easy to set a timer that, on expiration, saves a snapshot. Marc Lehmann, the man who created the excellent Coro module, has patches to Storable to make it re-entrant and incremental, so that the process (which might also be servicing network requests for some protocol) doesn't get starved for CPU while a large dump is made. His Perl/Coro-based multiplayer RPG is based on this idea. With hundreds of users issuing commands a few times a second, this is the only realistic option. If you tried to get this level of performance with a database, you'd find that you had to have the entire working set in RAM not once but several times over, in copies, in parallel database slaves. That's silly.
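The core of the serialize-on-exit idea needs nothing beyond Storable, which ships with Perl. A sketch; the snapshot path and the shape of the state are assumptions, and a real app would hang the nstore() off a timer or a signal handler rather than straight-line code:

```perl
#!/usr/bin/perl
# Restore state on startup, mutate it, checkpoint it back to disk.
# The snapshot path and the state's contents are made up for illustration.
use strict;
use warnings;
use Storable qw(nstore retrieve);

my $snapshot = '/tmp/app-state.storable';   # assumed location

my $state = -e $snapshot
    ? retrieve($snapshot)                   # warm start: reload the last snapshot
    : { players => {}, tick => 0 };         # cold start: fresh state

$state->{tick}++;                           # ... the app mutates $state here ...

nstore($state, $snapshot);                  # checkpoint before exit (or on a timer)
```

Run it twice and the tick survives between runs; that's the whole trick. Everything else -- incremental dumps, not starving the event loop -- is refinement on top of this.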
You can be very high tech and not use a database. If you're not actually using the relational capabilities (normalized tables, joining tables, filtering and aggregating, etc.), then a relational database is a slow and verbose replacement (even with DBIx::Class) for dbmopen() (perldoc -f dbmopen, but use the module instead). You're not gaining performance, elegance, or scalability, in most cases. People use databases automatically and mindlessly nowadays, to the point where they feel they have to, and by virtue of having to use a database, they have to ease the pain with an ORM.
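For the record, the modern spelling of dbmopen() is a tied hash. A sketch with core SDBM_File; the filename and keys are made up:

```perl
#!/usr/bin/perl
# A persistent key-value store in a handful of lines: tie a hash to an
# on-disk DBM file. SDBM_File ships with Perl; the path is an example.
use strict;
use warnings;
use Fcntl;
use SDBM_File;

my %db;
tie %db, 'SDBM_File', '/tmp/tinydb', O_RDWR | O_CREAT, 0644
    or die "can't tie: $!";

$db{"user:42"} = "alice";        # writes go straight to disk
print $db{"user:42"}, "\n";
untie %db;
```

If your access pattern is "look things up by key", this is the whole job, and there's no schema, no connection, and no ORM between you and it.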
Anytime someone says "always", be skeptical. You're probably talking to someone who doesn't understand or doesn't care about the tradeoffs involved.
Okay, back to databases. Right now, it's trendy to create domain specific languages. The Ruby guys are doing it and the Perl guys have been doing it for ages. Forth was created around the idea -- the Forth parser is written in Forth and is extensible. Perl 5.12 lets you hijack syntax parsing with Perl in a very Forth-ish style. Devel::Declare was used to create proper argument lists for functions inside of Perl. There's a MooseX for it and a standalone one, Method::Signatures. That idea got moved into core. XS::APItest::KeywordRPN is a demo. Besides that, regular expressions and POD are two other entire syntaxes that exist inside Perl.
Finally, part of Ruby's appeal -- or any new language's appeal -- is lightness. It's easy to keep adding cruft, indirection, and abstraction and not realize that you're slowly boiling yourself to death in it until you go to a new language and get away from it for a while. The Ruby guys, like the Python guys before them, have done a good job of building up simple but versatile APIs that combine well with each other and keep the language in charge rather than any monolithic "framework". Well, except for Rails, but maybe that was a counter example that motivated better behavior. Look at Scrapi (and Web::Scraper in Perl) for an example.
Too much abstraction not only makes your code slow but it makes it hard to change development direction in the future when something cooler, faster, lighter and more flexible comes out. Just as the whole Perl universe spent ten years mired down in and entrenched in mod_perl, so is there a danger that DBIx::Class and Moose will outlive their welcome. POE, which was once a golden child, has already outlived its welcome as Coro and AnyEvent stuff has taken off. Now there are a lot of Perl programs broken up into tiny non-blocking chunks that are five times as long as they could be, and the effort to move them away from POE is just too great. The utility of a package should be weighed against the commitment you have to make to it. The commitment you have to make to it is simply how much you have to alter your programming style. With Moose as with POE, this degree is huge. DBIx::Class is more reasonable. Still, it's a cost, and things have costs.
Thank you for your article.
Basically no mischief or craziness. Having DEFCON at a casino did to it exactly what I would have expected. No money pots to eat the cockroach, no naked fire jugglers, no getting thrown in the pool, no parties by the pool.
Bros outnumber the blackshirts now. They're talking loudly and proudly about how little they know and care.
Kilts are representing, too. The freaks are here. There's a Japanese gang dressed in kimonos. Some other Japanese guys walked by talking vigorously amongst themselves, laughing and pointing. Punks with combs, razors, and hairspray are in the contest area/lounge dispensing mohawks. They have their own official area. Strange hats abound. One kid has a fez. There are BDUs and lab coats. Lots of colored hair.
Aha! Finally spotted someone I knew -- Kevin, a friend of Ernie's, who also worked in the gaming industry, but on a different side of it.
People are sitting next to me reading long hex strings from the "background" of the talks description book.
They ran out of badges as usual. My flight got delayed two hours which seriously cut into my time here.
"My only crime is that of outsmarting you"... shirts have slogans.
There's a lot less interest in WiFi and a lot more in smartphones. The common area is woefully inadequate.
UnixSurplus is here again and people are packing in to see old Unix hardware. Some people are going to be coming home with O2s.
More later, perhaps.
Pasting an email reply since everyone seems to be stumbling over this one:
I've had this problem too.
This happens when the driver for the NIC on that machine isn't included in
the initrd. The node can't find the NIC because it doesn't have a driver
for it, and since it can't find the NIC, it doesn't know the MAC address,
and without the MAC address, it can't configure the node. Without the NIC,
it can't connect to the cluster.
You can add the names of the NIC devices for all of the machines during
install, or you can add them before you boot the other machines into the
cluster, and then run 'mkinitrd -o' (with arguments adjusted to match your
kernel version; see 'uname -a') and run 'ssi-ksync'.
It's possible to add the correct device names and rebuild the initrd but
still have the node fail to find the NIC or MAC address. For some reason,
some drivers just don't work correctly with how OpenSSI configures nodes.
The e100 and e1000 don't work correctly with OpenSSI for configuring nodes.
I got a bunch of 3Com 3c509 cards and stuck those in the nodes, and I use
only those for the cluster interconnect, for now. Later I hope to get
InfiniBand going.
So, make sure the device names are right, rebuild the initrd, run
'ssi-ksync', and if all else fails, use some old 3Com cards.
My pasted reply to a yapc mailing list message:
Missed this BoF too. I've written a lot about my experiences with
Phoenix.PM. As far as I can tell, the key ingredients are:
* organizer with the right personality
* audience happy with social meetings or else a few people willing to present over and over year after year
* central location to minimize excuses from people who have families to go home to
* core people to attend so the thing doesn't bottom out while it's ramping up
We had a mix of people hoping the meetings would have tangible,
immediate benefits for their careers (more immediate than teaching
them awesome Perl hackery, such as recruiters standing by),
maintenance programmers whose life with Perl just sucks and who won't
be made happy by anything that doesn't take a Roto-Rooter to their
codebase, weirdos like Marla from Fight Club who just go to every
meeting in town but never actually do anything or have any personal
interest in any of it, and hip kids looking for the cool thing.
Most of these groups saw we had nothing to offer (only awesome
technical presentations) and never came back.
I wrote this once and it didn't post (that often happens here, in various scenarios) or else it got deleted. Assuming the former. Here's a super short version of the same thing. The first go was much better. Dammit.
* College CS departments are owned by Microsoft and Sun. They use C# and Java out of consideration for strings-attached grant money.
* High-school and grade-school kids learn PHP, jQuery/JS, Squeak, Processing, Flash, or Python/PyGame. Perl has a foot in the web world (though not the appeal to the ultra-low-brow user base) but virtually no foot in the playing-with-graphics department. SDL is okay, but it isn't easy enough or flashy enough to compete.
* Perl teams are small. Fewer people are hired to do a project compared to Java -- radically fewer. And companies hire almost exclusively senior-level Perl programmers. It's hard to get a toehold in the industry. You virtually have to publish a lot of great stuff on CPAN to get a job. Strong typing in Java, I think, makes it easier to integrate weaker programmers into a team and keep them from doing damage. They aren't kept out of your living room with a shotgun but with locked doors. With the code compartmentalized, it's harder for someone new or intermediate to do damage; they just do or don't succeed. That's a big improvement.
* There is no perception that there's a lot of money to be made writing Perl. Java jobs with far lower expectations have paid me personally far better than Perl jobs with high expectations. Really good Perl programmers don't get actively recruited away. Countries with developing economies aren't tooling up on Perl, and Perl companies aren't sponsoring H1B visas. Nor are Perl companies making long-term investments in employees and letting them spend years only marginally productive with the idea that they'll be there for 10 or 20 years. Perl jobs just tend not to be that "milk". We all have just one battle story after another, complete with scars. The lack of any promise of money plus comfortable employment means Perl's job market does not draw adults in.
* Relatedly, no one is out there just making cool stuff in Perl and saying, "Hey, look what Perl can do". Yahoo! Pipes is Perl and it's way awesome, but they aren't playing up the Perl bit, and there are precious few examples like it. People's perception would be that Perl is only used for big, cranky, serious old web apps. And they'd mostly be right.
* Perl was and perhaps still is the language of choice for system administration on Unix but other things are competitive in this field now. This is probably the largest avenue by which people discover and learn Perl -- Unix admins. Shops that do a lot of Unix administration probably still take non-programmers and tell them to learn Perl.
* Microsoft, Sun, IBM (SAP really wins here) are buying their customers lunch and being buddy-buddy with them. They're listening to them, agreeing with them, sympathizing with them, and drinking with them. In this regard, Perl doesn't even exist. Microsoft and Sun get a lot of money from companies but they don't just walk over with their hand out -- they woo them. This is a bonus point that's unrelated to the main point.
Want to kill something? Cut the supply lines. There are precious few new projects using Perl (neither large-business nor garage-style maybe-the-next-Twitter types). The avenues by which people have discovered Perl in the past have almost entirely been closed off. Nothing dies quickly. COBOL is still around. COBOL is even in demand -- the dearth of new people learning it, relative to the amount of legacy code out there, itself creates demand. Except for system administration (I don't know, what else can you think of? Where do people come from?), our lines have been cut.
As I wrote in the last post, Perl is still assimilating and the community adapts very quickly after assimilating something. In about a year, half of the web infrastructure made for Perl switched to Plack. That's almost over night. Coro has been making waves. We ran with ORMs like tweaked out squirrels with scissors. It's disgusting, really. Collectively, we're having a lovefest with git.
Yeah, there's CPAN. Trying to figure out what happened to sunsite.unc.edu (go find out! You'll like the answer) I happened into CTAN -- the Comprehensive TeX Archive Network. No one gives a flying fuck how much stuff is in there because they have no plans to use TeX. CPAN isn't going to sell people on Perl. It did sell them on the idea of creating repo archives, though.
I hope you've enjoyed my little rant. My goal isn't to kick Perl while it's down nor is it to pointlessly piss people off. The subject is subjective and debatable and I don't mean to try to "win" the debate, only to add usefully to it. Past a certain point, saying "Perl isn't dead!" just is not constructive. We have a lot of work to do.
I flew USAir. This was the airline that promised me a $200 flight voucher and gave me nothing. They announced on the way back that the stop in Phoenix was actually a plane change for the people going through to CA. The airlines started doing "stops" rather than layovers because people hated switching planes -- too often they wouldn't make their connection and they were twice as likely to have a canceled flight.
As people boarded the plane, USAir was confiscating carry-ons, telling people that the plane was full. One guy was almost in tears -- he had a case that was full of glass. He boarded without it. The plane was not full. Not nearly. Nor were the overhead luggage racks. I've heard of airlines using this to hurry people onto planes before. I wonder what the next tactic will be when this one wears out. This "plane is full" stuff came after they called all of the stand-by passengers up to the podium. The lie was barefaced.
I don't mean to hate on the airlines; businesses are machines. Things devoid of soul aren't worthy of hate. But I'm trying to piece together thoughts on what happens when humans don't or aren't able to stand up for themselves.
From what I'm hearing on IRC, other people's experiences were actually legitimately bad, not just annoying. It rained somewhere so flights got canceled left and right. People barely made it home in time to eat at The Cracker Barrel.
The travel days are interesting. Between the bus, taxi, plane, and airport, it's nearly an all day affair.
I'd live in Columbus. People were laid back and earnest. Bikes represented well. The college town was bursting with interesting establishments. Awesome old buildings, brick or wood, with elaborate gabled roofs were all over, far outnumbering the new structures. It felt like a place that people cared about.
The whole time, people complained about the heat. I think it was mid-70s but muggy. It felt extremely comfortable to me -- I could roll naked in the grass. Okay, the chiggers might not be so comfortable later. Shirtless or scantily clad students were jogging, playing basketball, or walking around.
The Perl community... oh boy. Recent years have brought a push to organize. The Perl Foundation has been doing more and different things and recruiting people into roles doing specialized things that programmers generally can't do. I was steeped in organization and volunteerism for a different cause. I went around YAPC wearing my TBAG (Tempe Bike Action Group) shirt for two days. Smelly shirts go well with eye bags. It was interesting to see how well I do not function with an extended lack of sleep. I had the stupid, bad.
The Perl community is full of misfits, freaks, man-boys, server-room-dwelling shut-ins, gimps, maladapts, rejects from other cultures, and the curiously alternative, plus the suspiciously normal looking, and I love them. It took me a few years, but I learned that I can talk to nearly anyone there and have my mind blown. It's exciting to see and hear about the things everyone is working on. Every now and then, we make someone doing something especially nifty feel like a rockstar. It would be like going into a Hollywood studio and seeing a movie being made, if I actually cared about Hollywood. Some of the talks are riotous.
My "hey, look at me, I'm weird!" instinct is dying down. People know all about it for one, and secondly, I should be paying more attention to the ways other people are weird. Also, I've pretty much failed in making people hate me which is always an easy way out of social situations. Perhaps I should have taken lessons from buu. It just isn't as rewarding as it was.
It looks like the cluster going down was due to a power outage. The machines that survived on battery (laptops with longer battery lives) shat themselves when the initnode went down and laptops with shorter battery lives died before that happened, leaving notices on the screens of the other laptops that they went down and left the cluster. The last minute errand of putting new batteries in the old UPS saved my ass. While I was giving the talk, one of the machines went out. This old APC600 has a serial port on it. I guess I should see about actually plugging it in to something so I have some idea of what's going on when I'm remote at least. Clearly the service level needs to be bumped up. I might have to break down and do the Cox Internet thing, as much as I hate those fuckers.
Perl is still assimilating just as it always has. Perl 5 and Perl 6 are doing that concurrently. When Perl assimilates something, the community usually aggressively embraces it, even to the point of silliness. I kind of wish mixing in some strong typing had been embraced, but what are ya going to do. There's a lot Perl does not have. PyGame, for one. Because so much has been coming from outside of our own camp, it concerns me that we might be forgetting that we can start memes too. I want to resurrect the old clustering meme, and in a lot of ways, Perl and Perl programmers are perfect for it. Perl has infrastructure for working with Unix primitives and processes beyond what most languages offer. forks, the threads API implemented on top of fork(), comes to mind. And a lot of us are cranky old sysadmin types.
Just sitting around coding socially is something I really don't get to do in Phoenix, working from home. The coffee shops don't offer much in that way either.
I showed YAPC a photo of myself from Phoenix, the day before I left for Ohio.
It was good hanging out with beppu again, a man I admire and draw inspiration from.
I have code I started for my presentation that I need to finish still.
I got to put faces to coworkers.
In a Perl 6 talk, pmichaud felt it necessary to justify to the audience why something was a bit cryptic: he was running out of space on the line. dconway, from the back, commented "If I had a dollar for every time someone wrote unmaintainable code with that excuse...", prompting me to point out on IRC that putting lots of stuff on one line is an optimization technique in AtariBASIC. The moment I said that, I realized (or remembered) that Perl has exactly the same problem: it does a linear traversal through statements looking for the target label (or line, in AtariBASIC). The more statements it has to traverse to find the target, regardless of their size, the longer it takes.
Therefore, it follows that longer lines in Perl make for faster programs.
Awwaiid urls http://github.com/jasonmay/abermud# perl based mud with mooseyness (early stages)
You ack loudly.
You say in common: yuck
You say in common: every MUD knock-off in the last 20 years has sucked ass and completely missed the point.
You say in common: Aber, Tiny, Circle's take on Diku, UberMUD, Muck... people *knew* what they were doing for a while, but now everyone completely misses the point.
You say in common: it's all cargo-culting... taking the superficial stuff while missing the point. makes me sad.
You say in common: towards the end of MUD, people were far more interested in throwing tech at it randomly... TMI, EoTL II, etc
You say in common: that's what killed MUD... mindless wielding of tech
You say in common: I think I'm going to put the "TECHNOLOGY WANTS YOUR SOUL" sticker on this laptop.
Awwaiid says in common: haha
Awwaiid says in common: so bitter! Maybe jasonmay is having fun and you should join him... bring "the point" into his code
Awwaiid says in common: or maybe that's not possible, since this is apparently a mud-framework as opposed to a mud itself?
You say in common: the most successful MUD software of all time, LPMud 2.4.5, was distributed as a copy of a running game. you could fire it up and run it.
You say in common: it had a couple of security flaws in the "mudlib" code (the interpreted C-like stuff that's easy to change).
You say in common: it lacked a lot of features... races and classes most significantly
You say in common: but it had monsters, areas, spells and lots of stuff.
You say in common: you could fire one up, "wiz" the people who made it to level 20 and let them start adding to it.
You say in common: one thing that's well established is under no circumstances should a live game "wiz" people who have not played through and beat that particular game.
You say in common: that goes for the admins too. no one should ever start a MUD without wiz'ing on one MUD and having that MUD go down or them being forcibly ejected from it.
You say in common: for some profound human nature reason, doing otherwise is always the same... you have to profoundly care about a game to contribute to it and the only way to do that is to go through the long process of making friends, partying, helping novices, learning your way around it, etc, etc over months or years.
You say in common: I guess more fundamentally, MUDs need to be targeted towards *players*, not towards coders. players become coders.
You say in common: a Perl MUD is a good idea. the zombie game was a stab at that. but there are a pile of lessons like that... more than I could remember... that people haven't learned. either they haven't had the experience or else they second-guessed what they saw in the name of tech devotion.
By the time someone makes it to level 20 and is then generally permitted to code (a willing sponsor aside), seeing the code has to make the game even more magical. This is critical: seeing the code has to be a wondrous event. They should relive all of their adventures again from the point of view of how things actually worked rather than what their imagination thought was going on. Their imagination will have inflated the realities of the world. But they'll see the opportunity to create more of this illusion. This is analogous to working through a math problem or riddle before seeing the answer, rather than just being told the answer. Seeing the code should be a glorious "aha!" moment.
That MUDs are text is superficial to why they were successful and why so many people loved them and still do. What makes a MUD is being up to your eyeballs for months or years, making friends, fighting monsters together, exploring, the politics of the gods, the policies of the games, the dramas, the loss and gain, the interaction between people of different philosophies including personas of people's experimental alter egos. A profound love of the game, strong feelings (period) towards the admin, and a sense of serving people like your many friends drives you to create more adventures, places, creatures, oddities, and interactions. You're adding magic to the game. This is the only time and the only point that magic is added to the game -- when a player beats it and adds to it.
I lied: Diku was probably the most popular game. It too came ready to play. Rather than be coded on (Circle would later change this), creation was done with level editors. Creativity was mostly limited to economics and prose, and this was fine.
Diku was a clone of Aber, stealing and changing it slightly. Aber was a knock off of the original MUD which was a commercial enterprise; the creation of MUD was a blessed event. LP was a knock off of Aber. Lars was an Aber player and beat Aber.
No one is going to clone Aber, Diku, or LP in Perl because they want to do a "better" framework. How good the framework is matters not. In fact, TinyMUD was huge and it ran a very simple BASIC. Diku didn't run user code at all. You had to hack on the C source code to add features. Same for Aber, except Diku let you extend the map at run time.
To create a MUD -- in the spirit of MUD -- in Perl, you'd have to make a game. That's a much harder problem than creating a "framework" for making games. For the reasons outlined above, no one is going to take your framework and run with it. Anyone who cares is going to just make a game and worry about the "framework" later. Even myself, I had a hard time having energy left over for the game. It's profoundly hard to do the tech and the game at the same time. Stealing egregiously is the only way to do this. Previous efforts have used mechanical translation from one game to another, and un-tech-savvy friends cutting and pasting. People cared about the game that much. They knew. Even when a new game started playable, enough had to be added to it to make it interesting. Most people didn't have the energy for that. There were countless failed attempts at starting games where all anyone had to do was take a running game -- often 2.4.5 -- and add classes, races, and/or a mix of other unusual, new, unique, and interesting things to define that game. Taking an existing, ready-to-play game with a pile of features as long as your arm, dozens of areas, and thousands of monsters, and adding a few more features, takes more energy and creativity than the vast majority of people have to spare. And that's why you can't create a game framework in Perl and expect it to somehow turn into something. Your effort is insignificant and misguided. You make me sad. Go code on an existing MUD that has already had hundreds of wizards pour their lives and souls into it. Bolt Perl onto the side. But if you actually played and beat the thing and lived it, bolting on Perl would be the last thing you cared about.