
Convolutional Neural Network Workbench

By Filip D'haene, 17 Mar 2014


This article describes a framework in C# 4.0 that lets you create, train, and test convolutional neural networks against the MNIST dataset of handwritten digits and the CIFAR-10 dataset of 10 classes of natural objects. I initially based it on an article by Mike O'Neill on The Code Project and gradually added new features that I found interesting in research documents on the internet. Dr. Yann LeCun's paper Gradient-Based Learning Applied to Document Recognition is a great starting point for understanding the principles of convolutional neural networks and the reason why they are so successful in the area of machine vision.

The Code

The main goal of this project was to build a more flexible and extendable managed version of Mike O'Neill's excellent C++ project. I've included and used the splendid WPF TaskDialog Wrapper from Sean A. Hanley, the Extended WPF Toolkit, and, for unzipping the CIFAR-10 dataset, the open-source SharpDevelop SharpZipLib module. Visual Studio 2010 and Windows Vista SP2 are the minimum requirements (or just the OS if you use the setup). I made maximal use of the parallel functionality offered in C# 4.0: the user can choose at any time how many logical cores are used in the parallel-optimized code parts, with a simple slider next to the View combo box.
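The core-count throttling described above maps naturally onto the ParallelOptions class in .NET 4.0. Here is a minimal sketch of the idea; the weight-update loop and the hard-coded core count are illustrative only, not code from the project:

```csharp
using System;
using System.Threading.Tasks;

public static class ParallelCoresSketch
{
    // Nudge every weight by a small step, using at most 'cores' logical cores.
    public static double[] UpdateWeights(double[] weights, int cores)
    {
        var options = new ParallelOptions { MaxDegreeOfParallelism = cores };

        // Parallel.For partitions the index range across the allowed cores.
        Parallel.For(0, weights.Length, options, i =>
        {
            weights[i] += 0.01;
        });
        return weights;
    }

    public static void Main()
    {
        // In the workbench, the slider value would replace the hard-coded 2.
        double[] weights = UpdateWeights(new double[1000], 2);
        Console.WriteLine(weights[0]);
    }
}
```

Environment.ProcessorCount gives the natural upper bound for such a slider.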

Using the Code

Here is the example code that constructs a LeNet-5 network (see the InitializeDefaultNeuralNetwork() function in MainViewWindows.xaml.cs):

NeuralNetworks cnn = new NeuralNetworks("LeNet-5", 10, 0.8D, LossFunctions.MeanSquareError, DataProviderSets.MNIST, TrainingStrategy.SGDLevenbergMarquardt, 0.02D);
cnn.AddLayer(new Layers(cnn, LayerTypes.Input, 1, 32, 32));
cnn.AddLayer(new Layers(cnn, LayerTypes.Convolutional, ActivationFunctions.Tanh, 6, 28, 28, 5, 5));
cnn.AddLayer(new Layers(cnn, LayerTypes.AveragePooling, ActivationFunctions.Tanh, 6, 14, 14, 2, 2));

bool[] maps = new bool[6 * 16]
{
    true, false,false,false,true, true, true, false,false,true, true, true, true, false,true, true,
    true, true, false,false,false,true, true, true, false,false,true, true, true, true, false,true,
    true, true, true, false,false,false,true, true, true, false,false,true, false,true, true, true,
    false,true, true, true, false,false,true, true, true, true, false,false,true, false,true, true,
    false,false,true, true, true, false,false,true, true, true, true, false,true, true, false,true,
    false,false,false,true, true, true, false,false,true, true, true, true, false,true, true, true
};

cnn.AddLayer(new Layers(cnn, LayerTypes.Convolutional, ActivationFunctions.Tanh, 16, 10, 10, 5, 5, new Mappings(2, maps)));
cnn.AddLayer(new Layers(cnn, LayerTypes.AveragePooling, ActivationFunctions.Tanh, 16, 5, 5, 2, 2));
cnn.AddLayer(new Layers(cnn, LayerTypes.Convolutional, ActivationFunctions.Tanh, 120, 1, 1, 5, 5));
cnn.AddLayer(new Layers(cnn, LayerTypes.FullyConnected, ActivationFunctions.Tanh, 10), true);
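The map sizes passed to each constructor above are not arbitrary: with 5×5 kernels, stride 1, and no padding, each convolutional layer produces maps of size (input − kernel + 1), and each 2×2 average-pooling layer halves the size. A quick sketch (with hypothetical helper names, not part of the framework) that reproduces the 32 → 28 → 14 → 10 → 5 → 1 chain:

```csharp
using System;

public static class LeNetSizes
{
    // Valid convolution: stride 1, no padding.
    public static int ConvOutput(int input, int kernel)
    {
        return input - kernel + 1;
    }

    // Non-overlapping 2x2 pooling halves each dimension.
    public static int PoolOutput(int input)
    {
        return input / 2;
    }

    public static void Main()
    {
        int size = 32;                // input layer
        size = ConvOutput(size, 5);   // 28: first convolutional layer
        size = PoolOutput(size);      // 14: first average-pooling layer
        size = ConvOutput(size, 5);   // 10: second convolutional layer
        size = PoolOutput(size);      // 5:  second average-pooling layer
        size = ConvOutput(size, 5);   // 1:  final convolutional layer
        Console.WriteLine(size);      // prints 1
    }
}
```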

Design View

In Design View you can see how your network is defined and get a good picture of the current distribution of weight values in all the layers concerned.

Training View

In Training View you train the network. The 'Play' button opens the 'Select Training Parameters' dialog, where you define all the training parameters. The 'Training Scheme Editor' button lets you build your own training schemes to experiment with. Training can be paused or aborted at any time. The 'Star' button forgets (resets) all the learned weight values.

Testing View

In Testing View you get a better picture of the testing samples (or, optionally, the training samples) that are not recognized correctly.

Calculate View

In Calculate View you can test a single testing or training sample with the desired properties and get a graphical view of all the output values in every layer.

Final Words

I would love to see a DirectCompute 5.0 integration for offloading the highly parallel task of training the neural network to a DirectX 11 compliant GPU if one is available. But I've never programmed with DirectX or any other shader-based language before, so if there's anyone out there with more experience in this area, any help is very welcome.

I made an attempt to use a simple MVVM structure in this WPF application. In the Model folder, you can find the files for the neural network class and also a DataProvider class which deals with loading and providing the necessary MNIST and CIFAR-10 training and testing samples. There is also a NeuralNetworkDataSet class that the project uses to load and save neural network definitions, weights, or both (full) from or to a file on disk. The View folder contains the four different PageViews in the project and a global PageView which acts as a container for the different views (Design, Training, Testing, and Calculate). In the ViewModel folder, you will find a PageViewModelBase class from which the corresponding four ViewModels are derived. All other coded functionality is found in the MainViewWindows.xaml.cs class.

I hope there's someone out there who can actually use this code and improve on it: extend it with an unsupervised learning stage, for example (encoder/decoder construction), implement better loss functions, implement stochastic conjugate gradient descent, extend it to more test databases, make use of more advanced squashing functions, etc.
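On the better-loss-functions suggestion: the cross-entropy loss combined with a SoftMax activation in the output layer (the combination the update notes mention) can be sketched as follows. This is a generic, numerically stabilized illustration, not code lifted from the project:

```csharp
using System;
using System.Linq;

public static class LossSketch
{
    // Softmax: exponentiate and normalize; subtracting the max avoids overflow.
    public static double[] Softmax(double[] logits)
    {
        double max = logits.Max();
        double[] exps = logits.Select(v => Math.Exp(v - max)).ToArray();
        double sum = exps.Sum();
        return exps.Select(v => v / sum).ToArray();
    }

    // Cross-entropy for a single sample whose one-hot target class is 'target'.
    public static double CrossEntropy(double[] probabilities, int target)
    {
        return -Math.Log(probabilities[target]);
    }

    public static void Main()
    {
        double[] p = Softmax(new double[] { 1.0, 2.0, 3.0 });
        // The loss is small when the correct class gets most of the probability mass.
        Console.WriteLine(CrossEntropy(p, 2));
    }
}
```

The appeal of this pairing is that the gradient of the loss with respect to the logits reduces to the simple difference between the predicted probabilities and the one-hot target, which keeps backpropagation through the output layer cheap and stable.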


  • (03-13-2014)
    • StochasticPooling and L2Pooling layers now correctly implemented.
    • Native C++ implementation plus a managed C++/CLI wrapper.
  • (12-10-2013)
    • Support for Stochastic Pooling layers.
    • Much faster binary load and save of a CNN.
    • ReLU activation function now working as expected.
  • (11-26-2013)
    • Bugfix: Download datasets now working as expected.
    • Bugfix: Softmax function corrected.
    • Bugfix: DropOut function corrected.
  • (11-15-2013)
    • Bugfix: Local layer and Convolutional layer now work properly.
    • Bugfix: Cross Entropy loss now works better (in combination with a SoftMax activation function in a final fully connected layer).
    • Added LeCun's standard LeNet-5 training scheme.
  • (11-09-2013)
    • The last min/max display preference is now saved.
    • Added some extra predefined training parameters and schemes.
    • Bugfix: Average error now shows correctly after switching between neural networks with a different objective function.
    • Bugfix: Sample now always shows correctly in Calculate View.
    • Bugfix: The end result in Testing View now displays the correct values.
  • (11-06-2013)
    • Supports dropout.
    • Supports the Local layer type (same as the convolutional layer but with non-shared kernel weights).
    • Supports padding in the Local and Convolutional layers (gives the ability to create deeper networks).
    • Supports overlapping receptive fields in the pooling layers.
    • Supports weightless pooling layers.
    • Supports all the commonly used activation functions.
    • Supports Cross Entropy objective function in combination with a SoftMax activation function in the output layer.
    • Ability to specify a density percentage to generate mappings between layers far more easily.
    • Improved DataProvider class with much reduced memory footprint and a common logical functionality shared across all datasets (easier to add datasets).
    • Much improved UI speed and general functionality.


    This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

    About the Author

    Filip D'haene
    Software Developer
    Belgium


    Article Copyright 2010 by Filip D'haene