Thursday, November 1, 2012

No More Coursera Starter Kit

I just finished the latest Coursera assignment minutes before the deadline (a deadline that was extended due to Hurricane Sandy).  The assignment was not hard, but it certainly was time-consuming.  Clearly I will be unable to put out a starter kit in any timely manner.  But, more importantly, I realized something while working through the assignment: the code they have offered is a mess.  Porting this stuff to Common Lisp would be a nightmare.  It would be much better to rewrite it.  The problem with this is that the actual assignments more or less require you to use the Octave code, since it does things like sample from a random distribution.  It would be annoyingly difficult to guarantee that any code I wrote would sample the distribution the same way, especially since they use a built-in MATLAB/Octave routine.  I would have to research which PRNG Octave uses, likely implement it myself, and make sure to seed it exactly as Octave does.
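The core difficulty is that reproducing a sampled stream requires both the same PRNG algorithm and the same seeding procedure; merely using "a" Mersenne Twister is not enough, since seeding conventions differ between implementations.  A rough sketch of the point in Python (whose random module, like Octave's rand as I understand it, is based on the Mersenne Twister; the seed values here are arbitrary):

```python
import random

# Reproducing a sampled data set requires the same PRNG algorithm *and*
# the same seed: two generators running the same algorithm with the same
# seed produce identical streams.
a = random.Random(2012)
b = random.Random(2012)
same_stream = [a.random() for _ in range(5)] == [b.random() for _ in range(5)]

# A different seed (standing in for a different seeding convention)
# yields an unrelated stream, which is why a home-grown sampler would
# almost certainly disagree with the numbers the Octave code produces.
c = random.Random(2013)
d = random.Random(2012)
diff_stream = [c.random() for _ in range(5)] != [d.random() for _ in range(5)]

print(same_stream, diff_stream)
```

So even a byte-for-byte faithful port of the learning algorithm would produce the wrong homework answers unless the sampling were replicated as well.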

The other thing I realized when working with the perceptron code is that writing starter code sucks for the programmer.  Writing code with trivial pieces deliberately left out is not how I want to spend my time.  Creating teaching materials requires a crazy amount of prep time, and I don't care enough about education to spend all of it only to be left with nothing but a classroom exercise that cannot even be used for tinkering at the REPL.

This has led me to the decision to stop my already feeble attempts to maintain a Common Lisp port of the neural network starter code.  However, I still find that porting code is extremely useful for making sure you understand the concepts involved.  I have always felt that once I can write the code for something, I truly understand it.  So I will be taking the provided code from the course as a starting point and creating my own neural network related code in Common Lisp, closely in parallel with the course.  Unlike the class materials, I will not be leaving out the missing pieces.

I have decided that this won't really violate any rules of the course.  Looking at my perceptron code and the code released by the class organizers, I doubt anyone would see one as derivative of the other without already knowing I was working from it.  Further, I doubt that anyone who couldn't answer the questions in the programming assignments using the MATLAB code could look at my code and identify the pieces that needed to be ported back.  And it isn't as if executing the Lisp code will help them at all.  As I said, I doubt I could write code that would give the correct numerical answers for the homework even if I wanted to, because of the provided code's heavy reliance on MATLAB/Octave.  I just barely managed it in the perceptron case, and roughly half of the development time went to figuring out how to load the MATLAB data and exactly match their form of the learning algorithm.

If you are following this work, I will probably stay on the same git repository for a while, but may switch to a more appropriately named one in due time.  Hopefully I will have a little toy neural network library to play with in a few weeks.
