
All the Perl that's Practical to Extract and Report


Ovid (2709)

Stuff with the Perl Foundation. A couple of patches in the Perl core. A few CPAN modules. That about sums it up.

Journal of Ovid (2709)

Thursday January 31, 2008
12:59 PM

Arc is Released - So What?

[ #35540 ]

Paul Graham has finally released Arc. After reading about it, I'm quite disappointed. I have a lot of respect for Paul Graham, so this surprised me. Here's a perfect example of why I'm disappointed:

Which is why, incidentally, Arc only supports Ascii. MzScheme, which the current version of Arc compiles to, has some more advanced plan for dealing with characters. But it would probably have taken me a couple days to figure out how to interact with it, and I don't want to spend even one day dealing with character sets. Character sets are a black hole. I realize that supporting only Ascii is uninternational to a point that's almost offensive, like calling Beijing Peking, or Roma Rome (hmm, wait a minute). But the kind of people who would be offended by that wouldn't like Arc anyway.

Not supporting Unicode? This is not merely a matter of being offended. It's a matter of "can I use this code or not?" The very next sentence reads "Arc embodies a similarly unPC attitude to HTML." ASCII-only isn't "unPC", it's stupid. It means, right off the bat, that many, if not most, of the world's programmers can't use it for any serious work. By his own admission, fixing this in a project he's talked about for years would only have delayed it a couple of days. I'm flabbergasted.

So what is his "unPC attitude to HTML"?

The predefined libraries just do everything with tables. Why? Because Arc is tuned for exploratory programming, and the W3C-approved way of doing things represents the opposite spirit.

In other words, Arc is a toy and not to be used for real projects (to be fair, Graham himself describes Arc as being for "exploratory programming", whatever that is). I assume we could write our own proper HTML libraries, but Graham seems to be encouraging developers to do the wrong thing.

I can forgive the tables, since one can presumably work around them, but failing to spend even a couple of days resolving the ASCII-only issue is disappointing.

But what does Arc code look like? Maybe it's clean enough that we should forgive its sins.

(def firstn (n xs)
  (if (and (> n 0) xs)
      (cons (car xs) (firstn (- n 1) (cdr xs)))))

Right. Now we have to try to explain to new Arc programmers that "car" and "cdr" originally stood for "Contents of the Address part of Register" and "Contents of the Decrement part of Register" (which makes sense only if you know the historical reasons). Why? Why not just call them "head" and "tail" and make it simple? Arc compiles down to Scheme, so this could have been built in up front. Instead, historical baggage that confuses new programmers has been left in. Here's what I would have liked to have seen:

(def first_n (n list)
  (if (and (> n 0) list)
      (concat (head list) (first_n (- n 1) (tail list)))))

OK, switching "cons" to "concat" may not be that big of a deal, but with the above code, programmers who don't know the language have a much better chance of understanding it. Further, note that I've changed the variable name xs to list. I see xs used all the time in languages like Prolog, Lisp, and Haskell; it points to their mathematical underpinnings, but to my mind it's a mental speed-bump. While this is certainly something a programmer can change, examples should be written to reduce cognitive load so long as they don't sacrifice correctness.
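The readability argument can be sketched outside Lisp, too. Here's a rough Python equivalent of firstn written with head/tail-style names (the function name and the list-slicing representation are my own illustration, not anything from Arc):

```python
def first_n(n, items):
    """Return the first n elements of items, in the recursive style of Arc's firstn."""
    if n > 0 and items:
        head, tail = items[0], items[1:]
        return [head] + first_n(n - 1, tail)
    return []

print(first_n(2, [10, 20, 30]))  # → [10, 20]
```

Even someone who has never seen the language can guess what head and tail mean here, which is exactly the point about car and cdr.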

I'm also not convinced about the lack of support for OO programming in Arc. He has a fascinating follow-up about confusion in the OO world, but his primary justification (first link in this paragraph) seems to be:

I personally have never needed object-oriented abstractions. Common Lisp has an enormously powerful object system and I've never used it once. I've done a lot of things (e.g. making hash tables full of closures) that would have required object-oriented techniques to do in wimpier languages, but I have never had to use CLOS.

If you read that carefully, it almost sounds like he's saying "OO programming isn't important because I don't use it." I haven't done a lot of purely functional programming, and after reading HOP (Higher Order Perl) and working through the examples, I can sympathize with Paul Graham's position; but given what I've seen in Arc, I'm far less ready to agree with him.
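For readers who haven't seen it, the "hash tables full of closures" trick Graham alludes to looks roughly like this in Python: closures over shared local state, dispatched through a dict, standing in for an object. This is a loose sketch of the general technique, not Graham's actual code:

```python
def make_counter(start=0):
    # Local state captured by the closures below; no class needed.
    state = {"n": start}

    def incr(by=1):
        state["n"] += by
        return state["n"]

    def value():
        return state["n"]

    # A "poor man's object": a hash table full of closures.
    return {"incr": incr, "value": value}

c = make_counter(10)
c["incr"]()
c["incr"](5)
print(c["value"]())  # → 16
```

It does get you encapsulation and dispatch, though without inheritance or any shared vocabulary between "objects" built this way.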

The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • "list" is already a function name, so I wouldn't use it as a variable. This wouldn't happen if the language had sigils...
    • That's why you have quoting... er... that's why function application only applies to the first element of a cons... er... that's why you have Lisp-2... er... sigils are ugly, darnit.

      • Incidentally, Lisp uses the ,@ sigil for list interpolation (unquote-splicing).  Example:

        (let ((foo '(1 and 2)))
          `(this is a new list with ,@foo in it))

        That evaluates to (this is a new list with 1 and 2 in it).  It amuses me anyway :)
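For non-Lispers, ,@ splices a list's elements into the surrounding list, which is roughly what iterable unpacking with * does in Python (an analogy only, not a Lisp implementation):

```python
foo = [1, "and", 2]
# * unpacks foo's elements into the enclosing list,
# much as ,@foo splices into a quasiquoted Lisp list.
result = ["this", "is", "a", "new", "list", "with", *foo, "in", "it"]
print(result)  # → ['this', 'is', 'a', 'new', 'list', 'with', 1, 'and', 2, 'in', 'it']
```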
    • This isn't a problem. Symbols get their values from different "slots" depending on context. When you write (list 'foo 'bar), list is called as a function. When you write (setq list 42) and then (+ list list), you get 84. It's very clear. The only problem is calling a function saved in the value slot of a symbol:

      (defun foo (a b) (- a b))
      (let ((foo (lambda (a b) (+ a b))))
        (foo 2 1)          ; 1
        (funcall foo 2 1)) ; 3

      Oh well, you can't have everything.
      • Ugh, apologies for the worthless formatting.
      • This isn't a problem.

        ... not for the compiler anyway. Some of the rest of us like to optimize for the slower parts of the process, often known as wetware.

        • FWIW, Perl suffers from this problem, sort of:

          sub foo {}
          my $foo = sub {};
          my $foo = "hello";
          $foo->(); # death

          Anyway, lisp is what it is.  It's straightforward to write a macro such that:

          (with-sigils (&foo) (foo $foo))

          expands to

          (progn (funcall foo) (foo foo))

          If you really care I'll try it out and blog the code :)
      • Disclaimer: the last Lisp I used was AutoLisp...many years ago. I don't remember if it behaved the way you describe or not. I later (after posting) figured this wouldn't be a problem in this instance anyway because "list" was (dynamically?) scoped to the function. But I still don't think I'd call a variable "list" in the language (w/o sigils) :-)
  • First, the ASCII issue. PHP, Ruby, Perl 5.6, etc., etc. don't support Unicode. Those saw plenty of use anyway. (PG mentions on his site that people wouldn't have complained about this if he hadn't brought it up. I agree; this is whining about the bikeshed color.)

    Next, car and cdr. Most LISP programmers I've talked to (myself included) prefer car and cdr to head and tail or first and rest. First, head/tail and first/rest don't make much sense when applied to an improper list (i.e. (cons 1 2)). car/c
    • Was that reply a joke? I hope it was, but in case it wasn't...

      PHP, Ruby, Perl 5.6, etc., etc. don't support Unicode. Those saw plenty of use anyway.

      Perl 5.6 does support Unicode... badly, but it does support it. And what was one of the major things we put into 5.8 (begun in 2000) despite it causing vast amounts of internals grief? Unicode. Also keep in mind that 5.6 development started in 1998 when you could still fool yourself that ASCII was all you needed. Even so, the diverse array of Perl developers recognized it was necessary even if it was very painful.

      It's not that Arc doesn't s

      • Even English-speaking programmers in the UK, for example, have to talk with the rest of Europe.

        English-speaking programmers in the UK who want to get paid have to deal with the euro symbol, which isn't in ASCII. They can't fall back on the pound symbol either.

        • Actually, programmers in the UK are more likely to be paid in British pounds. Of course the symbol for that (£) isn't ASCII either. Even Americans can't render the symbol for cents (¢) without venturing outside ASCII.
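The point is easy to demonstrate. In Python, used here only as a neutral illustration of the character sets involved (nothing to do with Arc):

```python
# The pound, euro, and cent signs all live outside ASCII
# but encode without fuss as UTF-8.
for symbol in ("£", "€", "¢"):
    print(symbol, "->", symbol.encode("utf-8"))
    try:
        symbol.encode("ascii")
    except UnicodeEncodeError:
        print(symbol, "cannot be encoded as ASCII")
```

An ASCII-only language can't even write a price tag for most of the world's currencies.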
    • Ruby, Perl and Python are all at least as old as Unicode itself. PHP is PHP. What excuse does Arc have to dismiss Unicode as “unimportant”?

      And I agree entirely with what Schwern said: if Paul Graham had written that Unicode doesn’t concern him now, but he’ll get around to it before he starts telling people to use Arc for serious work, I wouldn’t have said a peep about it.

      F.ex., who cares if he thinks using tables for layout is somehow more exploratory and agile. No one has

  • Don't ignore that Paul Graham doesn't want everyone to use his language (see "Beating the Averages"). He doesn't care about everyone; he's making this for a handful of people who know the secret handshake. It's not supposed to be easy to understand, because being easy to understand wouldn't filter out the people he thinks shouldn't be programmers.

    • Good point. I disagree with him on that, but that's possibly due to my naive fantasy of wanting to help average developers become good developers. This might not be possible for many (most?) of them. I really don't know.

      • Superstar programmers are born, not made. I don’t think there’s any way to change that. It takes a certain mental predisposition that does not appear to be teachable; either your mind works that way or it does not.

        Nevertheless, if you look at spreadsheets, you’ll see that by reducing the minimum required ability for abstraction and increasing the amount of computation state that's tangibly visible, people with little programming skill can be empowered to harness computing machines for th

        • Superstar programmers are born, not made. I don’t think there’s any way to change that.

          As someone who didn't start programming until my 20s, I must call bullshit. It's sort of like the people who think that if you haven't written your first masterpiece by age 9, you can never learn to play the piano.

          It takes a certain mental predisposition that does not appear to be teachable; either your mind works that way or it does not.

          I think you have reversed cause and effect. We teach algorithms and data structures, but we don't teach how to think like a programmer. We barely examine it, partly because we're bad at people, and partly because we assume it can't be taught, so why bother trying?

          This is why