  • Especially in the current environment, it's good to raise the point that open-source software is more auditable (and, some claim, more audited) than closed-source. You never know what sort of backdoors and security holes might be in the proprietary software you install. The fact that many jurisdictions are now using proprietary software for voting machines (with no paper trail) and are not even allowed to look at the software is making me far more paranoid than I used to be.

    Of course, the other side has its
    • That's a very good point. I'm certainly all for transparency in government and government software. It's also mentioned in the bill. Other people will likely cover that in their testimony and supporting documentation.

      Is there a way to cover the auditability aspect of Open Source while discussing open standards and protocols? I'd like to stay within that narrow topic -- it helps to be laser-precise when talking to lawyers and legislators. :)

      • by ziggy (25) on 2003.04.02 15:51 (#18699) Journal
        Is there a way to cover the auditability aspect of Open Source while discussing open standards and protocols? I'd like to stay within that narrow topic -- it helps to be laser-precise when talking to lawyers and legislators. :)
        Perhaps. Is there any chance you can get Whitfield Diffie on your side? (On second thought, he's a pretty great mind and a deep thinker, but he rambles almost incomprehensibly at times.)

        Diffie's argument revolves around a fundamental tenet of security: a secret is only as good as your ability to change it. For example, suppose you have a combination lock on the front door of the office. You may consider it secure because it cannot be cracked(!), and furthermore it is a closed, proprietary combination lock -- no one knows what goes on inside.

        Now suppose you have 100 employees in your facility. Each and every one of them knows the combination. That's fine, because each and every one of them is trustworthy(!) and no one would squeal.

        Now suppose your impenetrable combination lock gets cracked. What do you do? You can't change the combination, because your 100 employees wouldn't be able to get into the office anymore. And because the cracked combination can't be changed, someone untrustworthy can now enter your building any time he wants, without your permission.

        Oh, and let's not forget that an untrustworthy cracker is unlikely to be afraid of breaking the law or even the Patriot Act....

        This analogy is the basis of Diffie's refutation of security by obscurity. A closed source package has its source code obscured. You cannot change it easily. It will have security holes -- every piece of software of value does. However, you cannot fix them -- only your vendor can fix them, and only on his timescale. Furthermore, because closed source software relies on a secret that cannot be easily changed (the binaries), it is fundamentally insecure.

        Open source, on the other hand, does not have this property. It behaves like a secret that can be easily changed (much like a PGP key can be revoked and reissued). It is open to inspection, so it is more likely people will find security bugs. Furthermore, because it is easily changed, it is more likely to keep your secrets.
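        Diffie's tenet -- a secret is only as good as your ability to change it -- can be sketched in a few lines of Python. This is an illustrative toy, not any real security library: the `RotatableSecret` class and its methods are hypothetical. The point is that a leaked credential stops being a problem the moment it can be rotated, whereas the combination lock above is stuck with its cracked secret forever.

        ```python
        import secrets

        class RotatableSecret:
            """Toy credential whose value can be replaced the moment it leaks."""

            def __init__(self):
                self._value = secrets.token_hex(16)  # 128 bits of fresh randomness

            def verify(self, attempt):
                # Constant-time comparison, to avoid timing side channels.
                return secrets.compare_digest(attempt, self._value)

            def rotate(self):
                """Invalidate the old secret and issue a fresh one."""
                self._value = secrets.token_hex(16)
                return self._value
        ```

        After `rotate()`, the leaked value no longer verifies and the attacker is locked out again -- exactly the recovery step the unchangeable combination lock does not allow.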

        (I hope that makes some sense. It was early in the morning and Diffie was wandering pretty far afield when I heard him speak...)

        • Changeability: that is a brilliant insight. Excellent refutation of biometrics for free, too. Unfortunately, security-by-obscurity seems to be the first instinct of every layperson out there. Perhaps it's because their only experience of security is dealing with passwords. They know that they are supposed to be all tricksy with their passwords, so they have the sense that computer security == being tricksy. Perhaps one should meet that head on and explain that the ideal security system has no secrets at all.