The MNIST feedforward net example

A few posts ago I showed a feedforward pass for a neural net on the XOR example, but no reverse pass. This is pretty much that same net, except with a reverse pass, using the Spiral library. These are the contents of load_mnist.fsx: the code just loads the MNIST dataset from the given file…
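The loader has to parse the MNIST IDX file format before it can read pixel or label data. As a rough illustrative sketch (in Python rather than the post's F#, and with a hypothetical `parse_idx_header` helper, not the post's actual API), the files begin with a big-endian header:

```python
import struct

def parse_idx_header(data: bytes) -> dict:
    """Parse the big-endian header of an MNIST IDX file.

    Image files start with magic number 2051, then image count,
    rows and columns; label files start with 2049, then the count.
    """
    magic = struct.unpack(">i", data[0:4])[0]
    if magic == 2051:  # image file
        n, rows, cols = struct.unpack(">iii", data[4:16])
        return {"kind": "images", "count": n, "rows": rows,
                "cols": cols, "offset": 16}
    if magic == 2049:  # label file
        n = struct.unpack(">i", data[4:8])[0]
        return {"kind": "labels", "count": n, "offset": 8}
    raise ValueError(f"not an MNIST IDX file (magic={magic})")

# Synthetic header for a file holding 10 images of 28x28 pixels.
header = struct.pack(">iiii", 2051, 10, 28, 28)
print(parse_idx_header(header))
```

After the header, the pixel bytes follow row-major, one unsigned byte per pixel, starting at the reported offset.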

Main types, the tape and a few function examples

The following should be familiar from the basics of AD. There are some differences, the main one being that for some nodes we do not want to calculate the adjoints (because they might be input or target nodes), so we need a flag for that. The rest is boilerplate. At the time of writing, there…
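The idea of skipping adjoints for flagged nodes can be sketched in a few lines. This is a minimal reverse-mode tape in Python, not the post's F# types; the names (`Node`, `is_constant`, `backward`) are illustrative assumptions:

```python
# A minimal reverse-mode AD tape. The `is_constant` flag mirrors the
# idea in the post: for input or target nodes we skip adjoint
# accumulation entirely, since their gradients are never needed.

class Node:
    def __init__(self, value, parents=(), local_grads=(), is_constant=False):
        self.value = value
        self.parents = parents          # nodes this one was computed from
        self.local_grads = local_grads  # d(self)/d(parent) for each parent
        self.is_constant = is_constant  # inputs/targets: no adjoint needed
        self.adjoint = 0.0

def mul(a, b):
    return Node(a.value * b.value, (a, b), (b.value, a.value))

def add(a, b):
    return Node(a.value + b.value, (a, b), (1.0, 1.0))

def backward(output, tape):
    output.adjoint = 1.0
    # Walk the tape in reverse order of construction.
    for node in reversed(tape):
        for parent, g in zip(node.parents, node.local_grads):
            if not parent.is_constant:
                parent.adjoint += node.adjoint * g

# y = w * x + b, with x treated as a constant input node.
x = Node(3.0, is_constant=True)
w = Node(2.0)
b = Node(1.0)
t1 = mul(w, x)
y = add(t1, b)
backward(y, [t1, y])
print(w.adjoint, x.adjoint)  # dy/dw = 3.0; x keeps adjoint 0.0
```

Skipping the accumulation for flagged nodes saves both memory traffic and a kernel launch per node on the GPU, which is the practical reason for carrying the flag.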

Get and set slice modules

The following is just going to be a big code dump and there is no need to think about it too deeply. Even though it is 200 lines long, all the above does is let us access a matrix like a 2D array. With this extension it can be read and set using .[1..3,2..5] or something…
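For readers unfamiliar with F#'s slice members, here is the same idea in Python: a flat row-major buffer exposed as a 2D array, readable and writable through slice syntax. The class and field names are made up for illustration; note that F#'s `.[1..3,2..5]` has inclusive upper bounds, while Python's `m[1:3, 2:5]` excludes them:

```python
# A rough Python analogue of the GetSlice/SetSlice extensions:
# a flat row-major buffer that can be read and written as a 2D array.

class Matrix2D:
    def __init__(self, rows, cols, fill=0.0):
        self.rows, self.cols = rows, cols
        self.data = [fill] * (rows * cols)

    def _indices(self, key):
        rs, cs = key
        rs = range(*rs.indices(self.rows)) if isinstance(rs, slice) else [rs]
        cs = range(*cs.indices(self.cols)) if isinstance(cs, slice) else [cs]
        return rs, cs

    def __getitem__(self, key):
        rs, cs = self._indices(key)
        return [[self.data[r * self.cols + c] for c in cs] for r in rs]

    def __setitem__(self, key, value):
        rs, cs = self._indices(key)
        for i, r in enumerate(rs):
            for j, c in enumerate(cs):
                self.data[r * self.cols + c] = value[i][j]

m = Matrix2D(4, 6)
m[1:3, 2:5] = [[1, 2, 3], [4, 5, 6]]
print(m[1:3, 2:5])  # [[1, 2, 3], [4, 5, 6]]
```

The F# version is longer mostly because each combination of scalar and range argument needs its own overload, which is where the 200 lines go.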

Basics of automatic differentiation

In the last post I gave an example of a true feedforward net, if a small one. All neural nets are really just a class of functions R^n -> R. The XOR network rewritten in more mathy notation would be: y = sum((targets - sigmoid(W2 * tanh(W*input + bias) + bias2))^2). The challenge of optimizing this neural net is…
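That one-line formula can be spelled out directly. Below is a plain-Python sketch of the forward pass (the post's real code is F# on the GPU); the matrix shapes are assumptions, with W mapping inputs to the hidden layer and W2 mapping hidden to output:

```python
import math

# y = sum((targets - sigmoid(W2 * tanh(W*input + bias) + bias2))^2)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def matvec(M, v):
    return [sum(m_ij * v_j for m_ij, v_j in zip(row, v)) for row in M]

def forward(W, bias, W2, bias2, inp, targets):
    hidden = [math.tanh(h + b) for h, b in zip(matvec(W, inp), bias)]
    out = [sigmoid(o + b) for o, b in zip(matvec(W2, hidden), bias2)]
    return sum((t - o) ** 2 for t, o in zip(targets, out))

# A tiny 2-2-1 net evaluated on one XOR example.
W = [[1.0, -1.0], [-1.0, 1.0]]
bias = [0.0, 0.0]
W2 = [[1.0, 1.0]]
bias2 = [0.0]
print(forward(W, bias, W2, bias2, inp=[1.0, 0.0], targets=[1.0]))
```

Everything reduces to a single scalar, which is exactly why the net is a function R^n -> R and why reverse-mode AD is the right tool for its gradient.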

Map and reduce kernels and random functions.

Cuda runtime compilation. In order to make any kind of machine learning algorithm work, the map operations are essential: relu, sigmoid, tanh… For such functions, and for more complex kinds like k-selection, it is necessary to apply map operations to the result of Ax+b. Until just last month, when I started working with ManagedCuda, most of my…
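A "map" operation here just means applying a scalar function to every element of a matrix, such as the result of Ax+b. On the GPU each one becomes a small CUDA kernel compiled at runtime; the CPU-side idea, in an illustrative Python sketch with made-up names, is simply:

```python
import math

def relu(x):
    return max(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def map_matrix(f, m):
    """Apply f to every element, like launching an elementwise kernel."""
    return [[f(x) for x in row] for row in m]

# Pre-activations as they would come out of A*x + b.
pre_activation = [[-1.5, 0.0, 2.0], [0.5, -0.25, 3.0]]
print(map_matrix(relu, pre_activation))  # negatives clamped to 0.0
print(map_matrix(sigmoid, [[0.0]]))      # [[0.5]]
```

Since every map has the same launch structure and only the scalar body changes, runtime compilation lets a single kernel template be specialized per activation function instead of hand-writing each one.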

The dMatrix type and the multiply and add wrappers.

The dMatrix type. Before anything can be done, the basic 2D matrix type has to be defined. It would be far too unsafe to call Cuda library functions on raw Cuda matrices. Instead it is very helpful to wrap them inside a class, or in this case a record, which is similar to a class. The difference…
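The safety the excerpt is after comes from carrying the dimensions alongside the buffer, so calls can validate shapes instead of trusting bare pointers. A Python dataclass sketch of the same idea (the field names are assumptions, not the post's actual dMatrix fields):

```python
from dataclasses import dataclass

@dataclass
class DMatrix:
    num_rows: int
    num_cols: int
    data: list  # stands in for the raw device buffer

    def __post_init__(self):
        # The wrapper's invariant: the buffer matches the dimensions.
        assert len(self.data) == self.num_rows * self.num_cols

def matmul(a: DMatrix, b: DMatrix) -> DMatrix:
    # Shape check before touching the data, which a raw pointer
    # interface could never do.
    assert a.num_cols == b.num_rows, "inner dimensions must match"
    out = [0.0] * (a.num_rows * b.num_cols)
    for i in range(a.num_rows):
        for j in range(b.num_cols):
            out[i * b.num_cols + j] = sum(
                a.data[i * a.num_cols + k] * b.data[k * b.num_cols + j]
                for k in range(a.num_cols))
    return DMatrix(a.num_rows, b.num_cols, out)

a = DMatrix(2, 2, [1.0, 2.0, 3.0, 4.0])
b = DMatrix(2, 1, [1.0, 1.0])
print(matmul(a, b).data)  # [3.0, 7.0]
```

An F# record gives the same structure-with-fields shape; the post's actual type wraps GPU memory from ManagedCuda rather than a host list.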

The first steps to running the library

64-bit Mode. Assuming you have installed F#, the first step you should take is to run F# Interactive in 64-bit mode. In VS go into Tools -> Options and type F# in the search bar. Then go to F# Tools -> F# Interactive. Enable both debugging and 64-bit mode. Debugging is for…