Get and set slice modules

The following is just going to be a big code dump, and there is no need to think about it too deeply. Even though it is 200 lines long, all it does is let us access the matrix like a 2D array. With this extension it can be read and set using .[1..3, 2..5] or similar slice syntax…
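As a rough illustration of what such an extension provides, here is a minimal sketch (with hypothetical type and field names) of how F#'s GetSlice/SetSlice members enable the `.[1..3, 2..5]` syntax. The real post does this for a GPU-backed matrix; a host Array2D stands in here.

```fsharp
// Minimal sketch: a 2D array wrapper whose GetSlice/SetSlice members
// enable F#'s m.[a..b, c..d] read and write syntax.
type Matrix(rows: int, cols: int) =
    let data = Array2D.zeroCreate<float> rows cols
    member _.Item
        with get (r: int, c: int) = data.[r, c]
        and set (r: int, c: int) (v: float) = data.[r, c] <- v
    // Called by the compiler for m.[a..b, c..d] reads.
    member _.GetSlice(rStart: int option, rEnd: int option,
                      cStart: int option, cEnd: int option) =
        let r0, r1 = defaultArg rStart 0, defaultArg rEnd (rows - 1)
        let c0, c1 = defaultArg cStart 0, defaultArg cEnd (cols - 1)
        data.[r0..r1, c0..c1]
    // Called by the compiler for m.[a..b, c..d] <- value writes.
    member _.SetSlice(rStart: int option, _rEnd: int option,
                      cStart: int option, _cEnd: int option,
                      value: float[,]) =
        let r0, c0 = defaultArg rStart 0, defaultArg cStart 0
        value |> Array2D.iteri (fun i j v -> data.[r0 + i, c0 + j] <- v)
```

Omitting a slice bound (e.g. `m.[*, 1..2]`) passes `None` for that endpoint, which is why the members take `int option` parameters.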

Basics of automatic differentiation

In the last post I gave an example of a true feedforward net, if a small one. All neural nets are really just a class of functions R^n -> R. The XOR network rewritten in more mathy notation would be: y = sum((targets - sigmoid(W2 * tanh(W*input + bias) + bias2))^2). The challenge of optimizing this neural net is…
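The excerpt's expression can be typeset as follows, with σ denoting the sigmoid and the squared errors summed over the outputs:

```latex
y = \sum \Big( \text{targets} - \sigma\big( W_2 \tanh(W \, \text{input} + \text{bias}) + \text{bias}_2 \big) \Big)^2
```

Here W and bias belong to the hidden tanh layer, W_2 and bias_2 to the sigmoid output layer, and y is the scalar loss to be minimized.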

Map and reduce kernels and random functions.

Cuda runtime compilation In order to make any kind of machine learning algorithm work, the map operations are essential. ReLU, sigmoid, tanh… For such functions, and for more complex kinds like k-selection, it is necessary to apply map operations to the result of Ax+b. Until just last month, when I started working with ManagedCuda, most of my…
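As a rough sketch of what these map operations compute, here is a CPU stand-in in F#; in the post itself they would run on the GPU as runtime-compiled kernels via ManagedCuda. The function names here are illustrative, not the post's actual API.

```fsharp
// CPU stand-in sketch for element-wise "map" activation functions.
// On the GPU each of these would be a kernel applied to every element
// of the result of Ax+b.
let sigmoid (x: float32) = 1.0f / (1.0f + exp (-x))
let relu (x: float32) = max 0.0f x

// Apply an activation to every element of a (flattened) matrix.
let mapActivation (f: float32 -> float32) (m: float32[]) = Array.map f m
```

The key point is that the activation is a pure scalar function, which is what makes it trivially parallel on the GPU.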

The dMatrix type and the multiply and add wrappers.

The dMatrix type Before anything can be done, the basic 2D matrix type has to be defined. It would be far too unsafe to call Cuda library functions on raw Cuda matrices. Instead, it is very helpful to wrap them inside a class, or in this case a record, which is similar to a class. The difference…
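A minimal sketch of such a wrapper, with hypothetical field names: a record pairing the matrix dimensions with its storage, so wrapper functions can validate sizes before calling into the Cuda libraries. A host array stands in for the device buffer here; the real type would hold a ManagedCuda device variable instead.

```fsharp
// Sketch of a dimensioned matrix record. Tracking num_rows/num_cols
// alongside the buffer is what lets multiply/add wrappers check shapes.
type dMatrix =
    { num_rows: int
      num_cols: int
      dArray: float32[] }   // stand-in for device memory

    static member create (rows: int) (cols: int) =
        { num_rows = rows
          num_cols = cols
          dArray = Array.zeroCreate (rows * cols) }
```

Because records get structural construction and immutability by default, they are a lightweight alternative to a full class for this kind of plain data carrier.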

The first steps to running the library

64-bit Mode Assuming you have installed F#, the first step you should take would be to run F# Interactive in 64-bit mode. In VS go into Tools -> Options and just write F# in the search bar. Then go to F# Tools -> F# Interactive. Enable both debugging and the 64-bit mode. Debugging is for…

Installing Dependencies

The following tutorials are going to be written in F# and Cuda. Before we can start, here is what will have to be installed: F#. I cannot advise on how to get this on Linux, but on Windows the easiest option would be to download Microsoft Visual Studio 2015. I would recommend instead of…

Introduction

Over the last five months I have done nothing but program neural nets. The purpose of that was not that I had anything specific in mind when I started, but merely to get good at writing efficient numerical code. It had its ups and downs, and now I think I am rather decent at…