Neural Networks on C#

19 Nov 2006
The article describes a C# library for neural network computations and its application to several problem-solving tasks.
Prize winner in Competition "C# Oct 2006"
neuro_demo.zip
  neuro_demo/
    Back Propagation/
      Approximation/
        AForge.Controls.dll
        AForge.dll
        AForge.Neuro.dll
        Approximation.exe
        Data Samples/
          sample1.csv
          sample2.csv
      TimeSeries/
        AForge.Controls.dll
        AForge.dll
        AForge.Neuro.dll
        Data Samples/
          exponent.csv
          growing sinusoid.csv
          parabola.csv
          sigmoid.csv
          sinusoid.csv
        TimeSeries.exe
      XORProblem/
        AForge.Controls.dll
        AForge.dll
        AForge.Neuro.dll
        XORProblem.exe
    Simple/
      Delta Rule Learning/
        AForge.Controls.dll
        AForge.dll
        AForge.Neuro.dll
        Classifier.exe
        Data Samples/
          and.csv
          cube.csv
          or.csv
          sample1.csv
          sample2.csv
      One-Layer Perceptron Classifier/
        AForge.Controls.dll
        AForge.dll
        AForge.Neuro.dll
        Classifier.exe
        Data Samples/
          sample1.csv
          sample2.csv
      Perceptron Classifier/
        AForge.Controls.dll
        AForge.dll
        AForge.Neuro.dll
        Classifier.exe
        Data Samples/
          and.csv
          cube.csv
          or.csv
    SOM/
      2DOrganizing/
        2DOrganizing.exe
        AForge.dll
        AForge.Neuro.dll
      Color/
        AForge.dll
        AForge.Neuro.dll
        Color.exe
      TSP/
        AForge.Controls.dll
        AForge.dll
        AForge.Neuro.dll
        TSP.exe

neuro_src.zip
  neuro_src/
    Docs/
      AForge.Core.chm
      AForge.Neuro.chm
    Release/
      AForge.Controls.dll
      AForge.dll
      AForge.Neuro.dll
    Samples/
      Neuro/
        Back Propagation/
          Approximation/
            App.ico
            Data Samples/
              sample1.csv
              sample2.csv
          TimeSeries/
            App.ico
            Data Samples/
              exponent.csv
              growing sinusoid.csv
              parabola.csv
              sigmoid.csv
              sinusoid.csv
          XORProblem/
            App.ico
        Simple/
          Delta Rule Learning/
            App.ico
            Data Samples/
              and.csv
              cube.csv
              or.csv
              sample1.csv
              sample2.csv
          One-Layer Perceptron Classifier/
            App.ico
            Data Samples/
              sample1.csv
              sample2.csv
          Perceptron Classifier/
            App.ico
            Data Samples/
              and.csv
              cube.csv
              or.csv
        SOM/
          2DOrganizing/
            App.ico
          Color/
            App.ico
          TSP/
            App.ico
    Sources/
      Controls/
      Core/
      Neuro/
        Activation Functions/
          Images/
            sigmoid.bmp
            sigmoid_bipolar.bmp
            threshold.bmp
        Layers/
        Learning/
        Networks/
        Neurons/
// AForge Neural Net Library
//
// Copyright © Andrew Kirillov, 2005-2006
// andrew.kirillov@gmail.com
//

namespace AForge.Neuro.Learning
{
	using System;

	/// <summary>
	/// Delta rule learning algorithm
	/// </summary>
	/// 
	/// <remarks>This learning algorithm is used to train one-layer neural
	/// networks of <see cref="AForge.Neuro.ActivationNeuron">Activation Neurons</see>
	/// with a continuous activation function, see <see cref="SigmoidFunction"/>
	/// for example.</remarks>
	/// 
	public class DeltaRuleLearning : ISupervisedLearning
	{
		// network to teach
		private ActivationNetwork network;
		// learning rate
		private double learningRate = 0.1;

		/// <summary>
		/// Learning rate
		/// </summary>
		/// 
		/// <remarks>The value determines the speed of learning, in the range [0, 1].
		/// The default value is 0.1.</remarks>
		/// 
		public double LearningRate
		{
			get { return learningRate; }
			set
			{
				learningRate = Math.Max( 0.0, Math.Min( 1.0, value ) );
			}
		}

		/// <summary>
		/// Initializes a new instance of the <see cref="DeltaRuleLearning"/> class
		/// </summary>
		/// 
		/// <param name="network">Network to teach</param>
		/// 
		public DeltaRuleLearning( ActivationNetwork network )
		{
			// check layers count
			if ( network.LayersCount != 1 )
			{
				throw new ArgumentException( "Invalid neural network. It should have one layer only." );
			}

			this.network = network;
		}

		/// <summary>
		/// Runs learning iteration
		/// </summary>
		/// 
		/// <param name="input">input vector</param>
		/// <param name="output">desired output vector</param>
		/// 
		/// <returns>Returns squared error divided by 2</returns>
		/// 
		/// <remarks>Runs one learning iteration and updates the neurons'
		/// weights.</remarks>
		///
		public double Run( double[] input, double[] output )
		{
			// compute output of network
			double[] networkOutput = network.Compute( input );

			// get the only layer of the network
			ActivationLayer layer = network[0];
			// get activation function of the layer
			IActivationFunction activationFunction = layer[0].ActivationFunction;

			// summary squared error of the network
			double error = 0.0;

			// update weights of each neuron
			for ( int j = 0, k = layer.NeuronsCount; j < k; j++ )
			{
				// get neuron of the layer
				ActivationNeuron neuron = layer[j];
				// calculate neuron's error
				double e = output[j] - networkOutput[j];
				// get activation function's derivative
				double functionDerivative = activationFunction.Derivative2( networkOutput[j] );

				// update weights
				for ( int i = 0, n = neuron.InputsCount; i < n; i++ )
				{
					neuron[i] += learningRate * e * functionDerivative * input[i];
				}

				// update threshold value
				neuron.Threshold += learningRate * e * functionDerivative;

				// sum error
				error += ( e * e );
			}

			return error / 2;
		}

		/// <summary>
		/// Runs learning epoch
		/// </summary>
		/// 
		/// <param name="input">array of input vectors</param>
		/// <param name="output">array of output vectors</param>
		/// 
		/// <returns>Returns sum of squared errors divided by 2</returns>
		/// 
		/// <remarks>Runs a series of learning iterations - one iteration
		/// for each input sample. Neurons' weights are updated after each
		/// sample is presented.</remarks>
		/// 
		public double RunEpoch( double[][] input, double[][] output )
		{
			double error = 0.0;

			// run learning procedure for all samples
			for ( int i = 0, n = input.Length; i < n; i++ )
			{
				error += Run( input[i], output[i] );
			}

			// return summary error
			return error;
		}
	}
}
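The class above can be driven as in the following minimal sketch, which trains a one-layer network on the AND function (similar to the and.csv sample in the demo). Inside Run, each weight is updated by the delta rule, w_i += learningRate * e * f'(y) * x_i, where e is the difference between the desired and actual output. The ActivationNetwork and SigmoidFunction constructor signatures are taken from the AForge.Neuro library the article describes; the sigmoid's alpha value of 2 is an assumption for illustration.

```csharp
// Usage sketch for DeltaRuleLearning: train a one-layer network
// (2 inputs, 1 sigmoid neuron) on the AND function.
using System;
using AForge.Neuro;
using AForge.Neuro.Learning;

class DeltaRuleDemo
{
	static void Main( )
	{
		// input samples and desired outputs for the AND function
		double[][] input = new double[][] {
			new double[] { 0, 0 },
			new double[] { 0, 1 },
			new double[] { 1, 0 },
			new double[] { 1, 1 }
		};
		double[][] output = new double[][] {
			new double[] { 0 },
			new double[] { 0 },
			new double[] { 0 },
			new double[] { 1 }
		};

		// one-layer network: 2 inputs, 1 neuron with sigmoid activation
		// (the alpha value of 2 is an assumption)
		ActivationNetwork network = new ActivationNetwork(
			new SigmoidFunction( 2 ), 2, 1 );

		// create the teacher and set its learning rate
		DeltaRuleLearning teacher = new DeltaRuleLearning( network );
		teacher.LearningRate = 0.5;

		// run epochs until the summary squared error is small enough
		for ( int epoch = 0; epoch < 10000; epoch++ )
		{
			double error = teacher.RunEpoch( input, output );
			if ( error < 0.01 )
				break;
		}

		// inspect the trained network's response to each sample
		foreach ( double[] sample in input )
		{
			Console.WriteLine( "{0}, {1} => {2:F3}",
				sample[0], sample[1], network.Compute( sample )[0] );
		}
	}
}
```

Note that a single sigmoid neuron can only learn linearly separable functions such as AND and OR; that is why the XOR demo in the download uses back propagation with a hidden layer instead.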


License

This article, along with any associated source code and files, is licensed under The GNU General Public License (GPLv3)

About the Author

Andrew Kirillov
Software Developer (Senior) Cisco Systems
United Kingdom United Kingdom
Started software development at about 15 years old, and it seems it has now lasted for most of my life. Fortunately I did not spend too much time with the Z80 and BK0010 and switched to the 8086 and beyond. Similarly with programming languages - luckily I managed to get away from BASIC and Pascal to things like Assembler, C, C++ and then C#. Apart from daily programming for food, I also do it as a hobby, where I mostly enjoy areas like Computer Vision, Robotics and AI. This led to some open source stuff like AForge.NET.

Outside of computers I am just a man who loves his family and enjoys travelling, a bit of books, a bit of movies and a mixture of everything else. I always wanted to learn to play guitar, but it seems like 6 strings are much harder than a few dozen keyboard keys. Will keep progressing ...

Article Copyright 2006 by Andrew Kirillov
Everything else Copyright © CodeProject, 1999-2014