I'm working on my last code of the semester! Unfortunately that means I'm burning a day off tomorrow so I can stay up all night tonight. After my trip to Ukraine earlier this month I'd kind of gotten used to eight hours of sleep every day. I may make a habit of it this summer.
The good news is, I get to do this project in Perl. It's for my Neural Networks class. If you've never done neural networks before, you might share the preconception I used to have: that everything is best implemented as a collection of instances of a Neuron class. Actually, everything is best implemented as a ton of matrix math. Our professor has been encouraging us to use Matlab. I've been drinking FSF kool-aid for a couple of years now, and Matlab isn't (yet) available for Debian MacPPC GNU/Linux (go figure), so I've spent most of this semester with Octave, a less capable but perfectly adequate alternative. However, I started the class by trying to use PDL (the Perl Data Language), and with this project, I've returned to it.
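To illustrate what "a ton of matrix math" means: an entire layer's forward pass collapses into one matrix-vector product plus an elementwise activation. Here's a minimal sketch in Python/NumPy as a stand-in for PDL (the weights and inputs are made-up numbers, not anything from my code):

```python
import numpy as np

# One layer's forward pass: outputs = activation(W @ inputs + bias).
# A 3-input, 2-neuron layer is just a 2x3 weight matrix -- no Neuron objects.
W = np.array([[0.1, 0.2, 0.3],
              [0.4, 0.5, 0.6]])
b = np.array([0.1, -0.1])
x = np.array([1.0, 0.5, -1.0])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

y = sigmoid(W @ x + b)
print(y.shape)  # (2,) -- one value per neuron, computed in a single matrix op
```

The object-per-neuron design recomputes all of this with loops and method calls; the matrix form hands the whole layer to optimized array code at once.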
So now I have a pure-Perl implementation of a multilayer neural network class, complete with a backpropagation training algorithm. I've even tested it with an example from the book and found it to work. I am certain that during this night, however, my mileage will vary. Specifically, I've verified all the math through the one training iteration that my textbook goes over in great detail, and then ran enough training iterations on that same training set to verify that the thing really did converge.
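For anyone who hasn't seen it before, that verify-then-iterate process looks roughly like this. This is a sketch in Python/NumPy rather than my Perl, and the network shape, seed, learning rate, and target are all arbitrary choices for illustration: a two-layer sigmoid net trained repeatedly on a single example until the output converges to the target.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dsigmoid(z):
    s = sigmoid(z)
    return s * (1.0 - s)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 2))   # 2 inputs -> 3 hidden neurons
W2 = rng.normal(size=(1, 3))   # 3 hidden -> 1 output
x, target = np.array([0.5, -0.5]), np.array([0.8])
lr = 0.5

for _ in range(2000):
    # Forward pass, layer by layer
    z1 = W1 @ x
    a1 = sigmoid(z1)
    z2 = W2 @ a1
    a2 = sigmoid(z2)
    # Backward pass: each layer's delta uses that layer's
    # activation derivative, then gradients are outer products
    d2 = (a2 - target) * dsigmoid(z2)
    d1 = (W2.T @ d2) * dsigmoid(z1)
    W2 -= lr * np.outer(d2, a1)
    W1 -= lr * np.outer(d1, x)

print(a2)  # after training, this should sit very close to the target
```

Repeating one example until the error goes to zero proves the gradients are wired up right; it says nothing yet about doing something useful, which is the rest of the project.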
I'll pause here to mention that this kind of thing gives me an ecstatic creative thrill.
For the rest of the project, I have to do something useful with it.
Of course, this code will be free software. I'll try to get it into CPAN some time in the next three months, but if you're reading this in two years and don't see it, post here asking for it. I never throw anything away, so I'm sure I'll still have it.
So, what do I call this monstrosity? Currently it's just Network.pm. There seem to be two different neural network implementations on CPAN (plus several pointers to authors who apparently never finished or never released). One of them seems to tie in with another library. I haven't looked at the other one yet, but if it doesn't use PDL it's probably very different from mine. (If it does, I've probably just reinvented the wheel.) So, suggestions for a name are welcome.
No, there's no test suite. I'm not an XP (extreme programming) guy, at least not this year.
BTW, the class is general enough that you can have any number of layers, any number of neurons in each layer, and any type of activation function for each layer, as long as you provide coderefs for each activation function and its derivative. All neurons in a single layer have to share the same activation function (but I think that's part of the definition, anyway).
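The coderef idea translates to any language with first-class functions. Here's a hypothetical sketch of that interface in Python (the real class takes Perl coderefs, and every name below is invented for illustration): one (activation, derivative) pair per layer, one weight matrix per layer, and a forward pass that just walks the list.

```python
import numpy as np

# Per-layer configuration: each layer supplies its own activation
# function and derivative -- the Python analogue of a pair of coderefs.
def tanh(z):      return np.tanh(z)
def dtanh(z):     return 1.0 - np.tanh(z) ** 2
def identity(z):  return z
def didentity(z): return np.ones_like(z)

layers = [
    {"units": 4, "act": tanh,     "dact": dtanh},      # tanh hidden layer
    {"units": 1, "act": identity, "dact": didentity},  # linear output layer
]

# One weight matrix per layer; every neuron in a layer shares
# that layer's activation function.
rng = np.random.default_rng(1)
sizes = [2] + [layer["units"] for layer in layers]
weights = [rng.normal(size=(m, n)) for n, m in zip(sizes, sizes[1:])]

def forward(x):
    for W, layer in zip(weights, layers):
        x = layer["act"](W @ x)
    return x

out = forward(np.array([0.3, -0.7]))
print(out.shape)  # (1,)
```

The derivatives ride along in the same structure so the backpropagation pass can look them up layer by layer, which is exactly why the class wants both coderefs up front.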