
All the Perl that's Practical to Extract and Report


Putting the ``Backwards'' in ``Backwards Compatibility''?

posted by chip on 2000.05.30 14:22
Every time we upgrade Perl, it seems that something breaks. Sometimes that something is very small, but occasionally it's large like "@" in strings. Sometimes it's on purpose; usually it isn't. But what is the ideal? What can we -- and users -- reasonably expect from Perl?
This is a subject I'm obviously interested in because I'm working on Topaz, which will become Perl 6 if it works out. And reimplementing a whole language makes breakage almost inevitable. So since some things are going to break, it could be argued that we shouldn't worry about it. And maybe that's true for an upgrade from Perl 5 to Perl 6... In particular, Larry has said that for Perl 6, everything that's officially deprecated in Perl 5 is fair game for deletion.

But there's another perspective that's worth considering. I recently had a mail exchange with a programmer who loves Perl but decided not to use it for his product, because he couldn't rely on every Perl program he writes today continuing to work for the indefinite future on all upcoming versions of Perl.

I think there are three major questions raised by this story.

  • Is it appropriate to expect all Perl programs to work forever with all future versions of Perl?
  • If not, what does that mean for Perl advocacy? Should we really be encouraging people to use Perl for systems that are deployed far from maintenance programmers?
  • Would it be worthwhile to resynchronize the documentation and the regression tests so that every documented behavior is tested?
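To make the last point concrete, here is a sketch (not taken from the actual Perl test suite) of what a documentation-driven regression test might look like, in the bare "ok N" protocol that Perl's own t/ directory uses:

```perl
#!/usr/bin/perl -w
use strict;

# Hypothetical sketch: one test per documented behavior.
print "1..2\n";

# perldoc -f uc documents: returns an uppercased version of EXPR.
print uc("perl") eq "PERL" ? "ok 1\n" : "not ok 1\n";

# perldoc -f reverse documents: in scalar context, concatenates the
# elements and returns the string value with characters reversed.
print scalar reverse("lrep") eq "perl" ? "ok 2\n" : "not ok 2\n";
```

The idea is that each assertion cites the exact documentation sentence it verifies, so a doc change without a matching test change (or vice versa) is immediately visible.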

I invite perspectives on these issues from everyone....

  • C? (Score:1, Interesting)

    I think it is worth noting that not all C programs work forever, either. Compilers and libraries -- even *gasp* standard ones! -- change. Very well-written C programs should continue to run on modern compilers, but sloppy ones won't. C programs that exploited hidden or undocumented features can break. Hey, this is beginning to sound familiar ...

    Perl might be in worse shape than C in this regard, but it is something to ponder.
  • ... and the compiled form is likely to work for a very, very long time.

    Maybe what we're seeing is that byte-compiled Perl is more important for reliability than anything else.

  • It's fairly important, IMO, to be able to clean out old code every now and then. This means rewriting, and that means things will break.
    However, so what? If someone has some code that won't work in Perl 6, they can keep running perl 5.x indefinitely. It's not like we're taking anything away from anyone. Odds are fairly high that if something breaks, they'd be well advised to rewrite it anyway. I'd rather have a language that was well maintained and up to date, with efficient code, than one that was b
    -- Kirby
  • But can we really tell them to keep Perl 5 and Perl 6 forever?

    What if a security problem arises with Perl 5 ten years from now? Will we care? Do we care if someone finds a (new) security problem today with Perl 4?

    I think it's inevitable that, at some point, bit rot will set in and Perl 5 itself will be deprecated. The key questions are (1) how long that can be delayed, (2) how long we think it should be delayed, and (3) how that should affect what we do and recommend for users.

  • Sure, but how many unchanged, compiled programs from 10 years ago still work? And how many of those do you still use?
  • Aren't there people still using SunOS 4 binaries under Solaris? Isn't it possible for those binaries to be ten years old? How about commercial apps for Xenix that still run under emulation today?

    A raw count of old binaries doesn't really address the point, IMO, because the whole universe of computing has expanded so quickly that even if all old binaries were still used, they'd be outnumbered by the new ones.

    More to the point: For those who have chosen as their goal the creation of code that works for

    I just use the "count" as an illustration: for the overwhelming majority of code out there, such longevity is not an issue. Even when the compiled code is still in use, the OS has often been upgraded and the binaries break. Unless you are going to make sure the OS stays the same, you probably can't count on a program of significance still working in 10 years. And who is going to be using the same OS in 10 years?

    Yes, some people will. And I'd have to say that unless you can compile a Perl program, th
  • Working under MPE, we have a lot of programs compiled 10 years ago which are still happily working every day. I'm pretty sure we even have one or two which were compiled in the 70s and which still run today.

    For a shop like ours with lots of inhouse code, the idea that upgrading the language may suddenly cause widespread breakage is a pretty scary thing.

    At the very least, getting the test suite as complete as possible so that we can tell people what is going to break is important.

  • Wow, MPE. That brings back memories....

    If modern MPE is anything like the MPE IV (?) I used in the early 80s, it's extraordinarily stable--just the kind of system you'd expect to be upgraded about once per decade.

    I heard a story about an MPE system that got turned off in the middle of a really long COBOL compilation. When the system was repowered, the OS started doing lots of active disk stuff but didn't generate any output. Thinking that it was wedged, the operator write-protected the hard drive in

  • Can anyone (from p5p, maybe) look at Perl today, and tell us what the likely breaking points are? Functions will change their return values? Syntax changes? Garbage collection?

    Maybe a line should be drawn clearly in the documentation between what you can rely on and what behaviour is considered "undefined" or a "mere side-effect of a particular operation which you really can't rely on".

    Hmm. It is an interesting question.

    Maybe "Perl" should be defined separately (somehow) from "perl". Much like there a
  • Larry doesn't want to separate Perl and perl. There would be drift and differences, and that would be more harmful than the variety would be beneficial. Or so he surmises. I think he's right, too.

    We already have some classification of breakage.

    • Bugs. Oops. 'Nuff said.
    • New keywords can be introduced at any time, so calling non-imported subroutines without using "&" is asking for trouble.
    • Indirect object syntax for method calls is particularly fragile in the face of new keywords.
    • Some features are
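    The second and third classes above can be seen in a few lines. Here is a sketch using a hypothetical subroutine name (nothing here is a real Perl keyword today, which is exactly the point):

```perl
use strict;

# Sketch of the keyword-clash hazard: if some future perl ever made
# "tally" a builtin, the bare call below could silently change
# meaning, while the & form always means "my subroutine".
sub tally { return scalar @_ }

my $safe  = &tally(1, 2, 3);   # unambiguous: always the user's sub
my $risky = tally(1, 2, 3);    # parsed as a keyword if one appears
print "$safe $risky\n";        # both are 3 under today's perl
```

Indirect object syntax (`new Foo @args`) is even more fragile, because the parser has to guess whether `new` is a method name or a keyword with no sigil to help it.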
  • >New keywords can be introduced at any time, so calling non-imported
    >subroutines without using "&" is asking for trouble.

    I thought it was asking for non-avoidance of prototypes and cleaner looking code. Sigh.
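    The trade-off being lamented is real: the & form that protects against new keywords also bypasses prototypes. A small sketch:

```perl
use strict;

# A sub with a ($) prototype forces scalar context on its argument.
sub first_of ($) { return $_[0] }

my @items = (10, 20, 30);
my $with    = first_of(@items);    # prototype applies: @items in
                                   # scalar context, so it gets 3
my $without = &first_of(@items);   # & bypasses the prototype:
                                   # it gets the list, so $_[0] is 10
print "$with $without\n";          # prints "3 10"
```

So a style that always uses & is future-proof against keywords but forfeits both prototype checking and the cleaner call syntax, which is the sigh above.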
  • The chief problem is time -- I need to use Perl, not nursemaid it. I use Perl partially because it is a RAD language that lets me write a lot of features in a short time, and because it is so portable. I lost an entire weekend when I upgraded to Perl 5.6 and it broke my DBI and DBD modules! I had to dig out the old Perl, reinstall it, install my database modules again. In short, I don't have time to do full regression testing on every line of code I've ever written in Perl. I need to know Perl won't break,
  • Backwards compatibility is GOOD. I can still run (re-compile) most of my FORTRAN-IV programs, and yes, I consider F90 an abomination. However, some of my F-IV programs didn't work, but I was able to fix them because the changes in the language were well documented. That's OK, provided simple, routine structures aren't changed (for example, I would assume that $_ in Perl 6, Topaz, etc. would still be modified by the s/.../.../ command). Bob
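    The assumption in the parenthetical is easy to state as code, and it is hard to imagine any Perl successor not preserving it:

```perl
# The behavior assumed above: with no explicit =~ binding,
# s/// reads and modifies $_.
$_ = "Hello, world";
s/world/Perl/;
print "$_\n";   # prints "Hello, Perl"
```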
  • Like others have pointed out, it's not necessarily appropriate for all current Perl code to work in all future versions, although it's also obvious that breaking 'basic' features would cause lotsa angst.

    Besides, aren't deprecated features the icky stuff like allowing the user to set what the first element of an array is, etc. that really should go away anyhow?
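    For readers who don't recognize the feature: the "first element of an array" setting is the special variable $[, which let a script declare one-based (or any-based) array indexing. Code written against the default, as in this sketch, is unaffected by $[ going away:

```perl
use strict;
use warnings;

# $[ (deprecated) let a script change the index of the first array
# element, e.g. $[ = 1 for Fortran-style one-based arrays. Code
# that relies on the default zero base needs no such feature:
my @langs = ('perl', 'topaz');
print $langs[0], "\n";   # prints "perl": arrays index from 0
```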

    It's true that Perl is now being used in large-scale projects in many cases, sometimes replacing huge chunks of C/C++/Java, but one would suspe
  • Modern MPE is still extraordinarily stable :-). The HP3000 has become the HPe3000 and is generally built around a PA-RISC processor (the same hardware, pretty much, that HP-UX runs on) and will be available on IA-64.

    The story you tell sounds typical. I was recently talking to a linux fan about the stability of the box and went back and pulled the power loose from our 947. The box, of course, went immediately silent. Then I plugged the power back in. Once the drive spun up, and was active for a short
  • The problem is that so much code on CPAN is poorly written. Not most of it, but a significant enough amount of it to mention. Much of the time, when perl "breaks" something on CPAN, it is the CPAN author's fault.

    This is not always the case, especially with a major update like perl 5.6. However, I think making sure things on CPAN don't break is not a worthy goal for perl porters; it is a worthy goal for CPAN authors and CPAN testers.

    Oh, and the CGI module ships with perl; I think it is clear that it will a
  • You know, now that you say it that way ....

    I wonder if it might be possible to reverse the current rules someday. It's probably more robust over the long term to default to calling a subroutine with the same name as an operator.

  • Yes, the deprecated features are the ``yucky bits''.

    Your point about CPAN is well-taken. I think cpan-testers will help a lot.

    As for interfacing with C, I'm expecting to create a compatibility layer so many existing XSs can continue to work, albeit with some efficiency hit. Fortunately, writing the equivalent of an XS for Topaz will be a lot simpler than XSs today.

  • I hope not only to document language changes, but to detect any potential breakage at compile time.

    I know I don't mind language changes much, as long as there are no silent killers.

  • I think a lot of this depends upon the platform.

    At work I use Solaris, and we expect things to continue as they've been for quite a long while. At home I use Linux, and I don't get bothered when things stop working. It is a cultural issue. Some people prefer that everything remains stable, even at the expense of new features/functions. I usually prefer to update code and spend the time keeping my software up-to-date if I feel that I'm getting something for it (p6 must be better than p5). If I may (ov
  • Is it appropriate to expect all Perl programs to work forever with all future versions of Perl?

    Well, yes and no. I mean that there was a lot of chat in comp.lang.perl.misc after the release of 5.6.0 that would indicate that people never tested the new version of Perl with their existing programs before going live with the new release; this is quite simply bad practice. People blundering into the installation of a new version of anything without a proper implementation plan deserve everything they get.

  • I don't know if that is the answer, either. Talk about confusion. But maybe it is the best answer.
  • Is it appropriate to expect all Perl programs to work forever with all future versions of Perl?

    All the ones that use documented features, yep. It's one of the things that marks a solid, well-engineered piece of software.

    If not, what does that mean for Perl advocacy? Should we really be encouraging people to use Perl for systems that are deployed far from maintenance programmers?

    The existence of maintenance programmers is reasonably irrelevant here. Perl's a language, and that's a low-level-enough thi