This is a notebook, thinking aloud. Shared notebooks are fun!
Some background on Forth, not the article... Forth is still relevant. It runs on far smaller and simpler systems than anything other than assembly and stripped-down C variants. It's routinely used to write extremely reliable systems. It's aged well without bloating. It's extremely simple to bootstrap Forth onto a new system. It's written in itself, has a built-in compiler, and has an extensible grammar. Forth programming style is to first create a "vocabulary" for solving whatever problem is at hand and then solve the problem using it. This is far more Design Patterns-esque in the original Christopher Alexander sense than the rehashed OO version of it, imo.
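A minimal sketch of that vocabulary-first style, in Forth. The word names here are made up for illustration; the point is the shape of the program, not the domain:

```forth
\ First, build a vocabulary for the problem at hand -- say, a temperature logger.
\ c>f and reading are hypothetical words, defined just to show the style.
: c>f      ( celsius -- fahrenheit )  9 * 5 / 32 + ;
: reading  ( celsius -- )  dup . ." C = " c>f . ." F" cr ;

\ Then solve the problem in the vocabulary you just built:
: report   ( c1 c2 c3 -- )  reading reading reading ;

100 0 37 report
```

Each colon definition extends the language itself, so by the time you write `report` you're no longer writing "Forth" so much as the small domain language you grew for the problem.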
The article talks about good programming practice and being proud of the software you write, but at the same time it seems to contradict itself by criticizing the practice of building class hierarchies and other abstractions and patting yourself on the back for it. Clearly programmers have different concepts of beauty when it comes to internals.
The article also criticizes the practice of using ultra-high-level tools to quickly build and ship buggy, bloated programs. I'm not sure about this one. Most of what's written in C -- desktop apps -- should be written in something much higher level: the higher-level language would more quickly produce code that doesn't core dump or have exploits, and the result would execute faster, since programmer time would be freed up to optimize the bits that actually need it and to use better algorithms. Programming an entire desktop app in C is a case of Premature Optimization. But I also believe too many of us are using so much abstraction that we're getting bitten in the ass far more than bailed out. The old hammer-factory-factory-factory essay still rings true. There's a good case for building your own abstractions beyond a certain point rather than having someone else try to anticipate all of the abstractions you'll need and hand them to you.
Perl has a lot of abstraction. It bites novices on the ass. It feels good to have mastered as much of it as I have. But at the same time, I pine for the simpler ideal of a small, general core, such as that offered by Lisp or Forth. I'd do some abstractions differently, peek below others more readily, and do without others entirely. But I'm still very happy with the aesthetics of most of the Perl I write. I think abstractions and aesthetics can co-exist.
It makes good points about how, while computing power is treated as abundant and cheap, for most of the world it's unavailable. I've personally wondered, given that we ship all of our "ewaste" to China and create a toxic waste dump there, why the bastards don't pull more of the things out and use them. A lot of the hardware that goes to waste disposal places is only a generation or two out of date and worth plenty on eBay. Even if they didn't want to use it, they should be selling it back to us on eBay. I still think the ideal *really* $100 OLPC would be a C=64. There's a one-chip version of the thing that's been sold in a joystick form-factor multi-game gadget that's hackable into a full machine -- $20. Add a 320x240 LCD and a keyboard and you're up maybe another $30 or $40. And it could actually run on solar panels or rechargeable AA cells. Today's kids got their start on a Pentium-class machine. Us old timers had minis or 8-bits or kits or... *sigh*
"As every programmer probably knows, the complexity of the problems one has to deal with often increases exponentially with the size of the program being written . Even in a component based architecture, unexpected interactions between the underlying components and misdocumentation of functionality lead to many headaches." Right on!
It talks about the visions of various companies and the prevailing industry attitude of needing to abstract the computer further and further away from the user. Rather than being given a good interface to the computer, users are kept more and more at a distance, buried behind more and more abstractions. It had been a tradition for computers to come with a BASIC that let you control their various attached peripherals, including the graphics adapter. The Amiga carried this tradition all the way from kit computers that read from paper tape. Microsoft and Apple abandoned it. Now, rather than programming the computer -- the central tenet of the concept of computing -- users are pointing and clicking. Apple made the computer into an appliance and was wildly successful in doing so. Windows still shipped with GW-BASIC for a long time (does it still? I don't know), but it doesn't interact with graphics or anything.
This is the opposite of the vision of the famous SketchPad, or of Logo, and so on.
You know, I don't think modern computers are any easier to operate than the Commodore 64 or Atari 8 bit.
The article also starts from the premise that good programmers need to be writing the code. The industry's attitude is the opposite -- that programmers should be interchangeable, and that any technology requiring accomplished or skilled programmers should not be used. Industry has won this battle -- look at PHP.
"The final coded solution should be as "brutally simple" as possible, and should try to do as much work in representation as possible so that less work needs to be done at compile and runtime"... a lot of this article reminds me of Perl, or at least some popular Perl community attitudes. Someone complained recently about being asked to do a task and being offered months and a budget. He came back minutes later with a Perl one-liner. He was then told to "turn it into a real program". It's self destructive to pretend like a program is simpler than it actually is and ignore important things (bounds cases, security, worst case run-times, etc). The PHP folks are often guilty here. It's also self destructive to trump up every problem into an enterprise effort with UML, reusable classes, factories, architecture review meetings, and so on.
Interesting to see a lot of the same themes we're struggling with from that point of view...