The LSTM Reber grammar example

The Reber grammar is a simple string generator that will be used to showcase the LSTM. In this case it will be the more complex embedded version of the Reber grammar, which has long-term dependencies. The above is the string generator. In the example above, make_reber_set is called to generate 3,000 random unique strings and then…
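As a point of reference, here is a minimal Python sketch of an embedded Reber grammar generator. The transition table follows the standard Reber grammar diagram, and make_reber_set mirrors the function named above; everything else (the state encoding, the plain-string representation) is an assumption, not the post's actual code.

```python
import random

# Transition table for the inner Reber grammar: state -> [(symbol, next_state), ...]
# State 5 is the accepting state, at which point E is emitted.
REBER = {
    0: [('T', 1), ('P', 2)],
    1: [('S', 1), ('X', 3)],
    2: [('T', 2), ('V', 4)],
    3: [('X', 2), ('S', 5)],
    4: [('P', 3), ('V', 5)],
}

def make_reber():
    """Generate one inner Reber string by a random walk over the state machine."""
    out, state = ['B'], 0
    while state != 5:
        sym, state = random.choice(REBER[state])
        out.append(sym)
    out.append('E')
    return ''.join(out)

def make_embedded_reber():
    """Wrap an inner Reber string as B?...?E, where ? is T or P.
    The matching second and second-to-last symbol is the long-term dependency."""
    wrapper = random.choice('TP')
    return 'B' + wrapper + make_reber() + wrapper + 'E'

def make_reber_set(n):
    """Collect n unique embedded Reber strings (mirrors the post's make_reber_set)."""
    strings = set()
    while len(strings) < n:
        strings.add(make_embedded_reber())
    return list(strings)

if __name__ == '__main__':
    print(make_reber_set(5))
```

The long-term dependency comes from the wrapper symbol: the second character of every string must be reproduced just before the final E, no matter how long the inner string grows, which is exactly what makes the embedded variant a good LSTM benchmark.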

The pretrained net

Using the WTA function, one can pretrain the layers of a net and then fine-tune them together with the sigmoid layer added on top. This does not require any additions to the library apart from the BlockReverse() type. First we create all the layers individually, and then we create arrays of such layers grouped together,…
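As a rough illustration of the scheme (the post's own library is not shown here, so this sketch uses PyTorch as a stand-in), each layer is first pretrained greedily as a tied-weight autoencoder with a top-k WTA activation, and the whole stack is then fine-tuned with the sigmoid layer on top. The wta helper, the layer sizes, and the training loops are all assumptions.

```python
import torch
import torch.nn as nn

def wta(x, k):
    # Keep the k largest activations per sample, zero the rest.
    # This top-k form is an assumption; the post's WTA function is not shown.
    thresh = x.topk(k, dim=1).values[:, -1:]
    return x * (x >= thresh).float()

# First, create all the layers individually.
layer1 = nn.Linear(7, 64)   # 7 = one-hot over the grammar's seven symbols
layer2 = nn.Linear(64, 64)
top = nn.Linear(64, 1)      # the sigmoid layer added on top for fine-tuning

def pretrain(layer, data, k=8, epochs=100, lr=1e-3):
    # Greedy autoencoder pretraining of one layer with tied decoder weights.
    opt = torch.optim.Adam(layer.parameters(), lr=lr)
    for _ in range(epochs):
        h = wta(torch.relu(layer(data)), k)
        recon = h @ layer.weight           # decode with the transpose of W
        loss = ((recon - data) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return torch.relu(layer(data)).detach()  # features for the next layer

x = torch.randn(128, 7)                      # placeholder inputs
y = torch.randint(0, 2, (128, 1)).float()    # placeholder binary targets

h1 = pretrain(layer1, x)
pretrain(layer2, h1)

# Then fine-tune all the layers together with the sigmoid layer on top.
net = nn.Sequential(layer1, nn.ReLU(), layer2, nn.ReLU(), top, nn.Sigmoid())
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(100):
    loss = nn.functional.binary_cross_entropy(net(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```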

The WTA autoencoder

A few months ago, I spent an enormous amount of time implementing the k-sparse autoencoder as practice for machine learning. In fact, for this new year I wanted to make it the subject of the autoencoder tutorial, but the method in the paper suffers from some drawbacks. It was fun to play around with, but to…
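For context, the k-sparse autoencoder (Makhzani and Frey, 2013) keeps only the k largest hidden activations per sample and zeroes the rest before decoding. A minimal PyTorch sketch, with the sizes and the value of k chosen arbitrarily for illustration:

```python
import torch
import torch.nn as nn

class KSparseAutoencoder(nn.Module):
    """Minimal k-sparse autoencoder: after the encoder, only the k largest
    hidden activations per sample are kept; the rest are zeroed out."""
    def __init__(self, n_in, n_hidden, k):
        super().__init__()
        self.enc = nn.Linear(n_in, n_hidden)
        self.dec = nn.Linear(n_hidden, n_in)
        self.k = k

    def forward(self, x):
        h = self.enc(x)
        # k-sparse step: zero everything below the k-th largest value per row
        thresh = h.topk(self.k, dim=1).values[:, -1:]
        h = h * (h >= thresh).float()
        return self.dec(h)

model = KSparseAutoencoder(n_in=784, n_hidden=256, k=25)
x = torch.randn(32, 784)             # placeholder batch
loss = ((model(x) - x) ** 2).mean()  # plain reconstruction loss
```

One commonly cited drawback of the method is that a small k can leave many hidden units permanently inactive, which is why the paper schedules the sparsity level downward during training.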