
All the Perl that's Practical to Extract and Report


http://voiceofjohn.blogspot.com/

J. David Blackstone has a Bachelor of Science in Computer Science and Engineering and nine years of experience at a wireless telecommunications company, where he learned Perl and never looked back. J. David has an advantage in that he works really hard, he has a passion for writing good software, and he knows many of the world's best Perl programmers.

Journal of jdavidb (1361)

Monday April 29, 2002
01:29 AM

Multilayer neural network implementation

[ #4520 ]

I'm working on my last code of the semester! Unfortunately that means I'm burning a day off tomorrow so I can stay up all night tonight. After my trip to Ukraine earlier this month I'd kind of gotten used to eight hours of sleep every day. I may make a habit of it this summer.

The good news is, I get to do this project in Perl. It's for my Neural Networks class. If you've never done neural networks before, you might have a preconception, like I used to, that everything is probably best implemented as a collection of instances of a Neuron class. Actually, everything is best implemented as a ton of matrix math. Our professor has been encouraging us to use Matlab. I've been drinking FSF kool-aid for a couple of years now, and Matlab isn't (yet) available for Debian MacPPC GNU/Linux (go figure), so I've spent most of this semester with Octave, a less capable but entirely adequate alternative. However, I started the class by trying to use PDL, and with this project, I've returned to it.
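To make the matrix-math point concrete, here's a tiny plain-Perl sketch of a single layer's forward pass: a matrix-vector product followed by a squashing function, no Neuron objects in sight. The weights and inputs are made up for illustration; with PDL, the loop collapses into one overloaded `x` (matrix multiply) on piddles.

```perl
use strict;
use warnings;

# One layer: 2 inputs feeding 2 neurons (illustrative numbers only).
my @W = ( [ 0.5, -0.2 ],
          [ 0.3,  0.8 ] );   # one row of weights per neuron
my @x = ( 1, 2 );            # input vector

my @out;
for my $i (0 .. $#W) {
    my $net = 0;
    $net += $W[$i][$_] * $x[$_] for 0 .. $#x;   # dot product, row i
    push @out, 1 / (1 + exp(-$net));            # logistic activation
}
printf "%.5f %.5f\n", @out;   # prints 0.52498 0.86989
```

With PDL the whole thing is `my $out = 1 / (1 + exp(-($W x $x)));` for an entire layer at once, which is the main reason matrix math beats a forest of objects.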

So now I have a complete pure-Perl implementation of a multilayer neural network class complete with backpropagation training algorithm. I've even tested it with an example from the book and found it to work. I am certain that during this night, however, my mileage will vary. Specifically, I've verified all the math through one training iteration that my textbook goes over in great detail, and then ran through enough training iterations on the same single training set to verify that the thing really did converge.
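That convergence check boils down to repeating the delta-rule update on one example until the output settles on the target. Here's a minimal single-neuron sketch of the idea with a logistic activation (made-up starting weights; this is not the actual class, which generalizes the update across layers with matrix math):

```perl
use strict;
use warnings;

my @w = (0.5, -0.3);            # illustrative starting weights
my @x = (1.0, 2.0);             # the single training input
my ($target, $lr) = (1.0, 0.5); # target output and learning rate

my $out;
for my $iter (1 .. 2000) {
    my $net = 0;
    $net += $w[$_] * $x[$_] for 0 .. $#x;
    $out = 1 / (1 + exp(-$net));
    # delta rule: error times the logistic derivative f*(1-f)
    my $delta = ($target - $out) * $out * (1 - $out);
    $w[$_] += $lr * $delta * $x[$_] for 0 .. $#w;
}
printf "converged output: %.3f\n", $out;   # creeps toward the 1.0 target
```

Full backpropagation does the same thing, except each hidden layer's delta is computed from the deltas of the layer above it before the weights move.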

I'll pause here to mention that this kind of thing gives me an ecstatic creative thrill.

For the rest of the project, I have to do something useful with it.

Of course, this code will be free software. I'll try to get it into CPAN some time in the next three months, but if you're reading this in two years and don't see it, post here asking for it. I never throw anything away, so I'm sure I'll still have it.

So, what do I call this monstrosity? Currently it's just Network.pm. There seem to be two different neural network implementations in CPAN (and several pointers to authors who apparently never finished or never released). One of these seems to tie in with another library. I didn't look at the other one yet, but if it doesn't use PDL it's probably way different from mine. (If it does, I probably just reinvented the wheel.) So ... AI::NeuralNetwork::PDL? AI::NeuralNetwork::Multilayered? Ah, I'll worry about it some day when I've had enough sleep.

No, there's no test suite. I'm not an XP guy, at least not this year.

BTW, the class is general enough that you can have any number of layers, any number of neurons in each layer, and any type of activation function for each layer, as long as you provide coderefs for each activation function and its derivative. All neurons in a single layer have to share the same activation function (but I think that's part of the definition, anyway).
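The coderef-per-layer idea can be sketched like this; the hash layout and names below are illustrative only, not the real interface. Each layer carries its activation function and derivative as anonymous subs, so one network can mix, say, tanh hidden units with a logistic output:

```perl
use strict;
use warnings;

my %logistic = (
    f  => sub { 1 / (1 + exp(-$_[0])) },
    df => sub { my $y = 1 / (1 + exp(-$_[0])); $y * (1 - $y) },
);
my %tanh = (
    f  => sub { my $e = exp(2 * $_[0]); ($e - 1) / ($e + 1) },
    df => sub { my $e  = exp(2 * $_[0]);
                my $t  = ($e - 1) / ($e + 1);
                1 - $t ** 2 },
);

# Hidden layer uses tanh, output layer uses the logistic function.
my @layers = ( \%tanh, \%logistic );

# Push one (scalar) signal through both activations in order.
my $signal = 0.5;
$signal = $_->{f}->($signal) for @layers;
printf "%.5f\n", $signal;
```

The `df` coderefs are what the backpropagation pass consumes, which is why the class asks for the derivative alongside the function instead of trying to differentiate anything itself.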

  • Had the same assignment (graduated in 2000) in my final year... our team members were for VB, Java and Perl (Perl was just me).. but at that time, the intricacies of PDL were a bit too much for me, so I abandoned the idea of Perl and we ended up doing it in VB

    As it turned out, Perl benched slightly slower than Java and about the same as VB for a 3 layer back propagation network.. We had loads of fun though... the assignment required that we build a character recognition system (any alpha numeric char is va