Neural Networks on C#

Andrew Kirillov, 19 Nov 2006
This article describes a C# library for neural network computations and its application to several problems.
neuro_demo.zip
    neuro_demo
        Back Propagation
            Approximation
                AForge.Controls.dll
                AForge.dll
                AForge.Neuro.dll
                Approximation.exe
                Data Samples
                    sample1.csv
                    sample2.csv
            TimeSeries
                AForge.Controls.dll
                AForge.dll
                AForge.Neuro.dll
                Data Samples
                    exponent.csv
                    growing sinusoid.csv
                    parabola.csv
                    sigmoid.csv
                    sinusoid.csv
                TimeSeries.exe
            XORProblem
                AForge.Controls.dll
                AForge.dll
                AForge.Neuro.dll
                XORProblem.exe
        Simple
            Delta Rule Learning
                AForge.Controls.dll
                AForge.dll
                AForge.Neuro.dll
                Classifier.exe
                Data Samples
                    and.csv
                    cube.csv
                    or.csv
                    sample1.csv
                    sample2.csv
            One-Layer Perceptron Classifier
                AForge.Controls.dll
                AForge.dll
                AForge.Neuro.dll
                Classifier.exe
                Data Samples
                    sample1.csv
                    sample2.csv
            Perceptron Classifier
                AForge.Controls.dll
                AForge.dll
                AForge.Neuro.dll
                Classifier.exe
                Data Samples
                    and.csv
                    cube.csv
                    or.csv
        SOM
            2DOrganizing
                2DOrganizing.exe
                AForge.dll
                AForge.Neuro.dll
            Color
                AForge.dll
                AForge.Neuro.dll
                Color.exe
            TSP
                AForge.Controls.dll
                AForge.dll
                AForge.Neuro.dll
                TSP.exe

neuro_src.zip
    neuro_src
        Docs
            AForge.Core.chm
            AForge.Neuro.chm
        Release
            AForge.Controls.dll
            AForge.dll
            AForge.Neuro.dll
        Samples
            Neuro
                Back Propagation
                    Approximation
                        App.ico
                        Data Samples
                            sample1.csv
                            sample2.csv
                    TimeSeries
                        App.ico
                        Data Samples
                            exponent.csv
                            growing sinusoid.csv
                            parabola.csv
                            sigmoid.csv
                            sinusoid.csv
                    XORProblem
                        App.ico
                Simple
                    Delta Rule Learning
                        App.ico
                        Data Samples
                            and.csv
                            cube.csv
                            or.csv
                            sample1.csv
                            sample2.csv
                    One-Layer Perceptron Classifier
                        App.ico
                        Data Samples
                            sample1.csv
                            sample2.csv
                    Perceptron Classifier
                        App.ico
                        Data Samples
                            and.csv
                            cube.csv
                            or.csv
                SOM
                    2DOrganizing
                        App.ico
                    Color
                        App.ico
                    TSP
                        App.ico
        Sources
            Controls
            Core
            Neuro
                Activation Functions
                Images
                    sigmoid.bmp
                    sigmoid_bipolar.bmp
                    threshold.bmp
                Layers
                Learning
                Networks
                Neurons
// AForge Neural Net Library
//
// Copyright © Andrew Kirillov, 2005-2006
// andrew.kirillov@gmail.com
//

namespace AForge.Neuro.Learning
{
	using System;

	/// <summary>
	/// Perceptron learning algorithm
	/// </summary>
	/// 
	/// <remarks>This learning algorithm is used to train a one-layer neural
	/// network of <see cref="AForge.Neuro.ActivationNeuron">Activation Neurons</see>
	/// with the <see cref="AForge.Neuro.ThresholdFunction">Threshold</see>
	/// activation function.</remarks>
	/// 
	public class PerceptronLearning : ISupervisedLearning
	{
		// network to teach
		private ActivationNetwork network;
		// learning rate
		private double learningRate = 0.1;

		/// <summary>
		/// Learning rate
		/// </summary>
		/// 
	/// <remarks>The value determines the speed of learning, in the range of [0, 1].
	/// The default value is 0.1.</remarks>
		/// 
		public double LearningRate
		{
			get { return learningRate; }
			set
			{
				learningRate = Math.Max( 0.0, Math.Min( 1.0, value ) );
			}
		}

		/// <summary>
		/// Initializes a new instance of the <see cref="PerceptronLearning"/> class
		/// </summary>
		/// 
		/// <param name="network">Network to teach</param>
		/// 
		public PerceptronLearning( ActivationNetwork network )
		{
			// check layers count
			if ( network.LayersCount != 1 )
			{
				throw new ArgumentException( "Invalid neural network. It should have one layer only." );
			}

			this.network = network;
		}

		/// <summary>
		/// Runs learning iteration
		/// </summary>
		/// 
		/// <param name="input">input vector</param>
		/// <param name="output">desired output vector</param>
		/// 
	/// <returns>Returns the absolute error - the difference between the actual
	/// output and the desired output</returns>
		/// 
	/// <remarks>Runs one learning iteration and updates a neuron's
	/// weights whenever the neuron's output does not equal the
	/// desired output.</remarks>
		/// 
		public double Run( double[] input, double[] output )
		{
			// compute output of network
			double[] networkOutput = network.Compute( input );

			// get the only layer of the network
			ActivationLayer layer = network[0];

			// summary network absolute error
			double error = 0.0;

			// check output of each neuron and update weights
			for ( int j = 0, k = layer.NeuronsCount; j < k; j++ )
			{
				double e = output[j] - networkOutput[j];

				if ( e != 0 )
				{
					ActivationNeuron perceptron = layer[j];

					// update weights
					for ( int i = 0, n = perceptron.InputsCount; i < n; i++ )
					{
						perceptron[i] += learningRate * e * input[i];
					}

					// update threshold value
					perceptron.Threshold += learningRate * e;

					// make error to be absolute
					error += Math.Abs( e );
				}
			}

			return error;
		}

		/// <summary>
		/// Runs learning epoch
		/// </summary>
		/// 
		/// <param name="input">array of input vectors</param>
		/// <param name="output">array of output vectors</param>
		/// 
		/// <returns>Returns sum of absolute errors</returns>
		/// 
	/// <remarks>Runs a series of learning iterations - one iteration
	/// for each input sample. Updates a neuron's weights each time
	/// the neuron's output does not equal the desired output.</remarks>
		/// 
		public double RunEpoch( double[][] input, double[][] output )
		{
			double error = 0.0;

			// run learning procedure for all samples
			for ( int i = 0, n = input.Length; i < n; i++ )
			{
				error += Run( input[i], output[i] );
			}

			// return summary error
			return error;
		}
	}
}
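
For reference, the listing below is a minimal usage sketch of the PerceptronLearning class, in the spirit of the article's classifier samples. It is an illustration rather than an excerpt from the samples: it assumes the AForge.Neuro types referenced in the source above (ActivationNetwork, ThresholdFunction) and an ActivationNetwork constructor that takes an activation function, the number of inputs and the number of neurons in the single layer.

// Hypothetical usage sketch of PerceptronLearning - not part of the article's
// sample code. Assumes the AForge.Neuro API referenced in the class above.
using AForge.Neuro;
using AForge.Neuro.Learning;

class PerceptronLearningExample
{
	static void Main( )
	{
		// training set for the logical AND function (same idea as and.csv)
		double[][] input = new double[][]
		{
			new double[] { 0, 0 },
			new double[] { 0, 1 },
			new double[] { 1, 0 },
			new double[] { 1, 1 }
		};
		double[][] output = new double[][]
		{
			new double[] { 0 },
			new double[] { 0 },
			new double[] { 0 },
			new double[] { 1 }
		};

		// one-layer network: 2 inputs, 1 neuron with threshold activation
		// (the only configuration PerceptronLearning accepts)
		ActivationNetwork network = new ActivationNetwork(
			new ThresholdFunction( ), 2, 1 );

		// create the teacher and set the learning rate
		PerceptronLearning teacher = new PerceptronLearning( network );
		teacher.LearningRate = 0.1;

		// run learning epochs until every sample is classified correctly
		while ( teacher.RunEpoch( input, output ) > 0 )
		{
		}

		// the trained network now reproduces the AND function
		double[] result = network.Compute( new double[] { 1, 1 } );
		System.Console.WriteLine( result[0] );	// expected: 1
	}
}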


License

This article, along with any associated source code and files, is licensed under The GNU General Public License (GPLv3)


About the Author

Andrew Kirillov
Software Developer (Senior), Cisco Systems
United Kingdom
Started software development at about 15 years old, and it seems it has now lasted for most of my life. Fortunately I did not spend too much time with the Z80 and BK0010 and switched to the 8086 and beyond. The same with programming languages: luckily I managed to get away from BASIC and Pascal to things like Assembler, C, C++ and then C#. Apart from daily programming for food, I also do it as a hobby, where I mostly enjoy areas like Computer Vision, Robotics and AI. This led to some open source stuff like AForge.NET.

Away from computers I am just a man who loves his family, enjoys travelling, a bit of books, a bit of movies and a mixture of everything else. I have always wanted to learn to play the guitar, but it seems that 6 strings are much harder than a few dozen keyboard keys. Will keep progressing ...
