Over at O'Reillynet, Ovid blogs:
The first big problem comes in defining “standard practices”. Any Perl code which doesn’t run under taint mode is immediately suspect. Buffer overflows using untrusted data should not be tolerated. Home brewed encryption? Out. [...] But there are problems there. Any of the aforementioned “issues” could potentially be defended. Someone has to be the first person to try a new encryption method. Also, there are too many other areas where standard practices is a terribly ephemeral thing. It’s not a problem easily solved.
Sorry, Ovid, but you're using a strawman to tear down your main point. Which, from a rhetorical perspective, is rather odd.
The problem does boil down to defining standard practices. Anyone who violates those standard practices, whether out of malice, negligence, or ignorance, is guilty of malpractice. Period.
There is no loophole for homebrewed encryption. There is no loophole for being the first to use a brand new encryption algorithm. Nor can "being first" serve as proof by induction that every new endeavor deserves an exemption from the strictures of good practice.
Why? Because cryptography is a branch of information theory, which is a branch of mathematics. If you set out to build a new encryption system, you need to follow the rigorous mathematical practices for designing encryption, not the lackadaisical hacking process of running rot13 over a stream of input an even number of times and declaring it "encrypted".
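To make the joke concrete: rot13 is its own inverse, so applying it an even number of times hands you back the plaintext untouched. A minimal sketch in Python (using the standard library's built-in rot13 codec):

```python
import codecs

plaintext = "attack at dawn"

once = codecs.encode(plaintext, "rot13")    # "nggnpx ng qnja"
twice = codecs.encode(once, "rot13")        # back to the original

# An even number of rot13 passes is the identity function:
assert twice == plaintext
```

Which is to say: the "even number of times" scheme performs literally zero encryption.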
In fact, Phil Zimmermann of PGP fame did this a few times before he sat down with a cryptographer, who showed him exactly how weak his homebrewed encryption schemes were. So Phil gave up on rolling his own and built a plain old public key cryptography system instead. Ignoring the body of work on strong crypto systems would have been malpractice out of ignorance.
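Homebrewed schemes tend to fall to embarrassingly cheap attacks. As an illustration (not Zimmermann's actual cipher, just a representative toy), consider "encrypting" with a single secret byte XORed across the message. An attacker needs at most 256 guesses, scored by whichever guess looks most like English:

```python
def xor_cipher(data: bytes, key: int) -> bytes:
    # Toy "encryption": XOR every byte with one secret key byte.
    # Note that encryption and decryption are the same operation.
    return bytes(b ^ key for b in data)

ciphertext = xor_cipher(b"meet me at the usual place", key=0x5A)

def english_score(text: bytes) -> int:
    # Crude fitness test: count bytes that are ASCII letters or spaces.
    return sum(chr(b).isalpha() or b == 0x20 for b in text)

# Brute-force the entire 8-bit keyspace and keep the best-scoring guess.
best_key = max(range(256), key=lambda k: english_score(xor_cipher(ciphertext, k)))

recovered = xor_cipher(ciphertext, best_key)
print(hex(best_key), recovered)
```

A 256-element keyspace falls in microseconds; real homebrew schemes are rarely much better off once someone who knows the published attack literature looks at them.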
Similarly, when NIST was poking around for something to replace DES, they didn't throw a half dozen homebrewed algorithms against the wall and hope for the best. They used the best practices for developing crypto algorithms (publishing papers, formulating attacks, detailed proofs, etc.), and determined that AES was suitable because it had withstood that scrutiny well enough to replace DES for the next few years.
So there are no loopholes when it comes to enforcing good practice. When Robert Jarvik developed the artificial heart, he didn't get a free pass from his medical responsibilities because no one had built an artificial heart before. Instead, he was still bound by his ethical obligations as a physician before experimenting on a human subject.
The same thing goes for other licensed professions, like engineers and lawyers.
The difference between licensed professions and software developers is that we have no agreed-upon "standard body of practice" to draw from, nor a "best standard practice" that practitioners must uphold or else be found guilty of malpractice. For example, what exactly belongs in the "standard body of practice"? Database design? Stored procedures? Race conditions? Taint mode? Class hierarchy design? Design pattern abuse? Secure data handling? Good crypto? Dropping permissions? Numerical analysis? Testing?
Is that list complete? Is it all taught in a 4-year degree program? Reliably?
Are software developers certified against that body of practice? Do we sit for the equivalent of a bar exam, medical boards, or engineering certifications?
Until we have a consensus view on all of these issues as an industry, any talk of software malpractice is premature at best, misleading and distracting at worst.