
Neural Dot Net Pt 3 The Adaline Network

23 Oct 2003
A neural network library in C#.

Introduction

Finally we are going to get something working. We've done the theory, you've been formally introduced to the parents, so now let's get down to the juicy bits: the Adaline neural network, running code demonstrations, and seeing how things really work when the code is put to the test. As mentioned earlier, the Adaline network is a rather simple classification network: it takes a set of data with predefined answers and, by learning from that data, comes to give the correct answer whenever the data is run.

The Adaline network is an example of a feed forward network, so named because all the data flows in one direction, from the input nodes to the output nodes. It should be noted that although the network contains a bias node, this is not used within the example code. The network is based on the two Adaline examples in Joey Rogers' book, Object-Oriented Neural Networks in C++, although the AdalineNeuron class is entirely mine; it has the same effective functionality as the network classes provided in those demonstrations.

The Adaline Network

There are three extra classes that make up the Adaline network used in the example program: the AdalineNeuron class, which inherits from BasicNeuron; the AdalineNode class, which inherits from BasicNode; and the AdalinePattern class, which inherits from Pattern. All three are located in the Adaline.cs file.

Sample image

The AdalineNeuron Class

The AdalineNeuron class contains the class members for creating the network. It is at this point, when deriving from the BasicNeuron class (see Neural .NET Pt 2: The Basic Classes for a description of this class), that you can specify the requirements specific to your network. The idea behind the neuron classes is that they encapsulate a part of the network, to make life a little easier later on when dealing with networks that have more than one neuron.

What we are creating with the Adaline network example is a single-neuron neural network. Hopefully, once you have looked at the class, you will see why even a single-neuron neural network is not written as a single class or function.

The AdalineNeuron class adds only one more member to the neuron class, because the basic setup of two input nodes, a bias node and the standard three links is all that is required. We shall look at examples in a later article where the nodes take inputs from more than two input nodes and use more linkages, but for now we'll stick to the basics. The node added to the AdalineNeuron class is an object of the AdalineNode class, and it is this that will be doing all the work for us while the program runs.

The useful work done by the AdalineNeuron class is that it builds the network for us from the constructor, by calling BuildLinks. This just made more sense to me than manually building the network every time you want to run it, and it means that once an Adaline network has been made to work, all future networks will build themselves. It does this by using the BasicNode Connect function, which is described in Neural .NET Pt 2: The Basic Classes, and which in essence creates the links between the input nodes and the Adaline node using the link classes.

The AdalinePattern Class

The AdalinePattern class inherits from the Pattern class in order to present the training and running data to the network.
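The article doesn't list this class, but from the way patterns are used in the training loop later on (`InSet[ 0 ]`, `InSet[ 1 ]`, `OutputValue( 0 )`), a minimal stand-in would look something like the sketch below. The member names mirror the calls made on it; the constructor shape is my assumption, not the library's actual implementation.

```csharp
using System.Collections;

// Hedged sketch of a pattern as the training loop uses it: two input
// values reachable through InSet, and one expected answer reachable
// through OutputValue. Illustrative only, not the library's real class.
class DemoPattern
{
    public ArrayList InSet = new ArrayList();   // input values for the network
    ArrayList outSet = new ArrayList();         // expected answers

    public DemoPattern(double inOne, double inTwo, double output)
    {
        InSet.Add(inOne);
        InSet.Add(inTwo);
        outSet.Add(output);
    }

    // Returns the expected answer at the given index, as the training
    // loop's OutputValue( 0 ) call does.
    public double OutputValue(int index)
    {
        return (double)outSet[index];
    }
}
```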

The AdalineNode Class

The AdalineNode class inherits from the BasicNode class and implements the Run, Learn and Transfer functions for the network.
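To make the node's job concrete, here is a stand-alone sketch of the standard Adaline behaviour covered in part one: Run passes the weighted sum of the inputs through a signum transfer function, and Learn applies the delta rule to the weights. The class and member names here are illustrative, not the library's actual API.

```csharp
using System;

// Minimal sketch of the Adaline rule: Run sums weighted inputs through
// a signum transfer function; Learn nudges the weights towards the
// desired answer using the delta rule. Names are hypothetical.
class AdalineSketch
{
    public double[] Weights;      // one weight per input link
    public double LearningRate;   // e.g. 0.45, as used in the article
    public double Output;         // last value produced by Run

    public AdalineSketch(int inputs, double learningRate)
    {
        Random rnd = new Random(1);
        Weights = new double[inputs];
        for (int i = 0; i < inputs; i++)
            Weights[i] = rnd.NextDouble() - 0.5;  // small random start
        LearningRate = learningRate;
    }

    // Transfer function: +1 if the weighted sum is positive, else -1.
    public double Run(double[] inputs)
    {
        double sum = 0.0;
        for (int i = 0; i < inputs.Length; i++)
            sum += Weights[i] * inputs[i];
        Output = sum >= 0.0 ? 1.0 : -1.0;
        return Output;
    }

    // Delta rule: adjust each weight in proportion to the error and
    // the input that contributed to it.
    public void Learn(double[] inputs, double desired)
    {
        double error = desired - Output;
        for (int i = 0; i < inputs.Length; i++)
            Weights[i] += LearningRate * error * inputs[i];
    }
}
```

Repeated Run/Learn calls on a sample quickly push the weighted sum to the correct side of zero, which is exactly the convergence behaviour the training loop below relies on.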

The Adaline One Function

Now we come to how the Adaline network does its required task. This is performed by the DoAdalineOne function in Form1.cs and the important code is listed below:

FileInfo info = new FileInfo( "Neural Network Tester.xml" ); 
if( info.Exists == true ) 
{ 
    info.Delete();
} 

log = new Logger( "Neural Network Tester.xml", 
                    "NeuralNetworkTester", true ); 
ArrayList patterns = LoadAdalineTrainingFile(); 
/// create the Adaline network 

AdalineNeuron neuron = new AdalineNeuron( log, new BasicNode( log ), 
             new BasicNode( log ), new BiasNode( log ), 
             new AdalineNode( log, 0.45 ) ); 

/// train the Adaline network 
int nIteration = 0; 
int nGood = 0; 
while( nGood < nNumberOfItemsInAdalineTrainingFile )
{
     nGood = 0; 
     for( int i=0; i<nNumberOfItemsInAdalineTrainingFile; i++ ) 
     { 
            neuron.InputNodeOne.SetValue( neuron.Node.Values.NodeValue, 
                        ( double )( ( Pattern )patterns[ i ] ).InSet[ 0 ] ); 
            neuron.InputNodeTwo.SetValue( neuron.Node.Values.NodeValue, 
                        ( double )( ( Pattern )patterns[ i ] ).InSet[ 1 ] ); 

            neuron.Node.Run( neuron.Node.Values.NodeValue ); 

            /// if the output value generated by run is not the 
            /// same as the output value 
            /// in the training file then it is an error 
            if( ( ( Pattern )patterns[ i ] ).OutputValue( 0 ) 
                   != neuron.Node.GetValue( neuron.Node.Values.NodeValue ) ) 
            { 
                /// run the learn function 
                log.Log( DebugLevelSet.Errors, 
                   "Learn called at  number " + i.ToString() 
                   + " Pattern value = " 
                   + ( ( Pattern )patterns[ i ] ).OutputValue( 0 ).ToString()
                   + " Neuron value = " 
                   + neuron.Node.GetValue( neuron.Node.Values.NodeValue ), 
                  "Form1" ); 
                netWorkText.AppendText( "Learn called at  number " 
                   + i.ToString() + " Pattern value = " 
                   + ( ( Pattern )patterns[ i ] ).OutputValue( 0 ).ToString()
                   + " Neuron value = " 
                   + neuron.Node.GetValue( neuron.Node.Values.NodeValue ) 
                   + "\n" ); 

                neuron.Node.Learn(); 
                break; 
            } 
            else 
                nGood++; 
    } 


    log.Log( DebugLevelSet.Progress, "Iteration number " 
        + nIteration.ToString() + " produced " 
        + nGood.ToString() + " Good values out of 250 ", "Form1" ); 

    netWorkText.AppendText( "Iteration number " + nIteration.ToString() 
        + " produced " + nGood.ToString() 
        + " Good values out of 250 \n" ); 

    nIteration++; 
} 

FileStream xmlstream = new FileStream( "adalinenetworkone.xml",
       FileMode.Create, FileAccess.Write, FileShare.ReadWrite, 8, true ); 
XmlWriter xmlWriter = new 
  XmlTextWriter( xmlstream, System.Text.Encoding.UTF8 );
xmlWriter.WriteStartDocument(); 

neuron.Save( xmlWriter ); 
xmlWriter.WriteEndDocument(); 
xmlWriter.Close(); 

/// now load the file
FileStream readStream = new FileStream( "adalinenetworkone.xml", 
          FileMode.Open, FileAccess.Read, FileShare.ReadWrite, 8, true ); 

XmlReader xmlReader = new XmlTextReader( readStream ); 

/// create the adaline network 
AdalineNeuron neuron2 = new AdalineNeuron( log, new BasicNode( log ), 
         new BasicNode( log ), new BiasNode( log ), 
         new AdalineNode( log, 0.45 ) ); 
neuron2.Load( xmlReader );

xmlReader.Close(); 

/// rerun the adaline should get 250 out of 250
for( int i=0; i<nNumberOfItemsInAdalineTrainingFile; i++ )
{
    neuron2.InputNodeOne.SetValue( neuron.Node.Values.NodeValue,
                  ( double )( ( Pattern )patterns[ i ] ).InSet[ 0 ] );
    neuron2.InputNodeTwo.SetValue( neuron.Node.Values.NodeValue,
                  ( double )( ( Pattern )patterns[ i ] ).InSet[ 1 ] );
    neuron2.Node.Run( neuron.Node.Values.NodeValue );
    netWorkText.AppendText( "Pattern " + i.ToString() 
        + " Input = ( " 
        + ( ( Pattern )patterns[ i ] ).InSet[ 0 ].ToString() 
        + "," 
        + ( ( Pattern )patterns[ i ] ).InSet[ 1 ].ToString() 
        + " ) Adaline = " 
        + neuron2.Node.GetValue( neuron2.Node.Values.NodeValue ) 
        + " Actual = "
        + ( ( Pattern )patterns[ i ] ).OutputValue(  
                neuron2.Node.Values.NodeValue ) 
        + "\n" );
}

log.Close();
thread.Suspend();

The Adaline network program is started from the Train menu by clicking on the "Adaline 1" option. This kicks off a thread that runs the code above; the reason for the thread is so that the user has some control over the interface while the program is running, because although the Adaline program is fairly quick to train, it is not unreasonable to suspect that this won't always be the case.

The code starts off by checking whether the Neural Network Tester.xml file exists; if it does, it deletes it, and then it creates a new Logger that will write to a fresh copy of the file. The Logger class can be found in the SharpUtils DLL and writes to an XML file, the registry, or both. Once the log is created, that log object is passed through to all the base classes, with its logging parameters set by the DebugLevel variable that is created in the constructor of the main form as:

debugLevel = new DebugLevel( DebugLevelSet.All );

The debug level set has a number of options available to it that are stored in an enumeration in the DebugLevel.cs file in the SharpUtils project. These levels are:

public enum DebugLevelSet{ All, WarningsAndErrors,
          Errors, Warning, Note, Important, HighLight, Progress };

The first three levels apply only to the registry, whereas the rest give greater flexibility in controlling the amount of information that the code outputs while it is running. It should be noted that I have set a lot of the levels to Progress and the debug level to All. This generates the maximum amount of information, which can become a problem in its own right if the network refuses to learn, as it did when I originally tried running it: I left the program running during the day while I went out, and it generated a two gigabyte log file. For general running, the debug level should be set to Warning, WarningsAndErrors or Errors, with the other options used only for debugging.

When using the debug levels you are only required to set the level once. You will notice in the other files that the level is set through DebugLevelSet.currentLevel, so that when you want to change the debug level, you don't have to start plowing through every single file that uses it.

Once the logger is created, the code calls the LoadAdalineTrainingFile function, which is standard file-loading code that reads the values in the file and stores them in a pattern array, before moving on to create the AdalineNeuron.

AdalineNeuron neuron = new AdalineNeuron( log, new BasicNode( log ), 
           new BasicNode( log ), new BiasNode( log ), 
           new AdalineNode( log, 0.45 ) );

The AdalineNeuron class inherits from the BasicNeuron class, and its constructor takes a number of parameters. The first is the log. The second is the first input node, which is of the BasicNode type, as its job is to feed information to the network and not to process it in any way. The third is another BasicNode, the second input node (see picture above), which also does nothing more than pass data to the network. The fourth is the bias node, which by default has a value of one and works as another input. The final parameter is the AdalineNode, which itself takes two parameters: the log object and the learning rate for the node.

The AdalineNeuron class builds the network that is pictured at the start of this section by calling the BuildLinks function that establishes the links between the separate nodes.

this.InputNodeOne.CreateLink( ( BasicNode )this.Node, 
               ( BasicLink )this.Links[ 0 ] );
this.InputNodeTwo.CreateLink( ( BasicNode )this.Node, 
               ( BasicLink )this.Links[ 1 ] );
this.BiasNode.CreateLink( ( BasicNode )this.Node, 
               ( BasicLink )this.Links[ 2 ] );

The BuildLinks function takes each node that is specific to the Adaline neuron and joins them by calling the BasicNode function CreateLink. This is called on the node that you want to create the link from, and takes the node that you are creating the link to, which in this case is the AdalineNode cast back to a BasicNode, and the link that will control the connection and contain the weight value for the link.

arrayOutputLinks.Add( blLink );
bnToNode.InputLinks.Add( blLink );
blLink.InputNode = this;
blLink.OutputNode = bnToNode;

First the link is added to the node's array of output links, and then it is added to the input links array of the node that the link goes to. Once this is done, the information for the link itself is updated so that the link knows which is its output node and which is its input node.

Training

Training is the process by which the network learns the task appointed to it. It should be noted, though, that the network learns how to do a complete task and not just how to get all the answers right in the example file. We are not talking Pavlovian conditioning here, where you get the same response to the same stimulus every time; we are teaching the program the ability to get the answers right regardless of the data set presented to it. I'll explain later how to run a trained network so that it can prove that it has learnt the task, by generating a new set of data and running it against the saved network.

Sample image

The code for the training loop is:

while( nGood < nNumberOfItemsInAdalineTrainingFile )
{
    nGood = 0;
    for( int i=0; i<nNumberOfItemsInAdalineTrainingFile; i++ )
    {
        neuron.InputNodeOne.SetValue( neuron.Node.Values.NodeValue, 
               ( double )( ( Pattern )patterns[ i ] ).InSet[ 0 ] );
        neuron.InputNodeTwo.SetValue( neuron.Node.Values.NodeValue, 
               ( double )( ( Pattern )patterns[ i ] ).InSet[ 1 ] );
        neuron.Node.Run( neuron.Node.Values.NodeValue );

        /// if the output value generated by run is not the 
        /// same as the output value
        /// in the training file then it is an error
        if( ( ( Pattern )patterns[ i ] ).OutputValue( 0 ) 
                != neuron.Node.GetValue( neuron.Node.Values.NodeValue ) )
        {
            /// run the learn function
            log.Log( DebugLevelSet.Errors, 
                "Learn called at number " + i.ToString() + " Pattern value = "
                +  ( ( Pattern )patterns[ i ] ).OutputValue( 0 ).ToString() 
                + " Neuron value = " 
                + neuron.Node.GetValue( neuron.Node.Values.NodeValue ), 
                "Form1" );
            netWorkText.AppendText( "Learn called at number " + i.ToString() 
                + " Pattern value = " 
                +  ( ( Pattern )patterns[ i ] ).OutputValue( 0 ).ToString() 
                + " Neuron value = "
                + neuron.Node.GetValue( neuron.Node.Values.NodeValue ) 
                + "\n" );
            neuron.Node.Learn();
            break;
        }
        else
            nGood++;
    }

    log.Log( DebugLevelSet.Progress, "Iteration number " 
        + nIteration.ToString() 
        + " produced " + nGood.ToString() 
        + " Good values out of 250 ", "Form1" );
    netWorkText.AppendText( "Iteration number " + nIteration.ToString() 
        + " produced " + nGood.ToString() + " Good values out of 250 \n" );
    nIteration++;
}

Sample image

Basically, the code loops until every item in the training set, whose size is held in the nNumberOfItemsInAdalineTrainingFile integer (declared as a static member of the class and set to 250), produces a correct answer. First the code sets the input values in the input nodes to the values that were loaded into the patterns array from the training file. Then we call Run on the AdalineNode itself, which is the only node in this network to have the Run function defined. I went through the details of the Run and Learn functions in part one, when looking at how neural networks learn, so I won't repeat it all here. Suffice to say that once the Run function has executed, the output value of the AdalineNode is checked against the answer defined in the training file, and if they differ, the Learn function is called and the weight for the node is updated, as is also described in the Neural Dot Net 1 Introduction article.

Saving And Loading

Next we come to the saving and loading part of the Adaline One demonstration. I decided right at the start that all the saving and loading in this program would use XML rather than some made-up file format. I did this because XML finally seems to be accepted as a standard way of doing things, which only makes me wonder why it's taken so long. All the logging for this program is done in XML, and the neural network is saved as XML too.

The strategy taken is that each class implements its own save functionality, which means that:

  1. I avoid having to write huge, extremely complicated saving functions for every network type that I write, and
  2. because of the inheritance hierarchy, most of the save functionality is already implemented by the time I move to a new network.

This is what the saved Adaline network looks like:

<?xml version="1.0" encoding="utf-8"?>
<AdalineNeuron>
    <BasicNeuron>
        <BasicNode>
            <Identifier>0</Identifier>
            <NodeValue>0.960490693785479</NodeValue>
            <NodeError>0</NodeError>
        </BasicNode>
        <BasicNode>
            <Identifier>1</Identifier>
            <NodeValue>0.541840747251101</NodeValue>
            <NodeError>0</NodeError>
        </BasicNode>
        <BiasNode>
            <BasicNode>
                <Identifier>2</Identifier>
                <NodeValue>1</NodeValue>
                <NodeError>0</NodeError>
            </BasicNode>
        </BiasNode>
    </BasicNeuron>
    <AdalineNode>
        <BasicNode>
            <Identifier>3</Identifier>
            <NodeValue>-1</NodeValue>
            <NodeValue>0.45</NodeValue>
            <NodeError>-2</NodeError>
        </BasicNode>
    </AdalineNode>
    <AdalineLink>
        <BasicLink>
            <Identifier>4</Identifier>
            <LinkValue>-5.14391275287787</LinkValue>
            <InputNodeID>0</InputNodeID>
            <OutputNodeID>3</OutputNodeID>
        </BasicLink>
    </AdalineLink>
    <AdalineLink>
        <BasicLink>
            <Identifier>5</Identifier>
            <LinkValue>4.19182542813562</LinkValue>
            <InputNodeID>1</InputNodeID>
            <OutputNodeID>3</OutputNodeID>
        </BasicLink>
    </AdalineLink>
    <AdalineLink>
        <BasicLink>
            <Identifier>6</Identifier>
            <LinkValue>-0.450444623246065</LinkValue>
            <InputNodeID>2</InputNodeID>
            <OutputNodeID>3</OutputNodeID>
        </BasicLink>
    </AdalineLink>
</AdalineNeuron>

I've printed this out in its entirety, as it not only shows the saved values for a fully trained network, but the XML format is also a convenient way of displaying the class hierarchy in an easy-to-understand fashion. The Save and Load functions are included in each of the network classes and, as a result of the hierarchy, each implements the saving and loading of only its own variables before calling the base class. The standard format for a Save function is to write an element identifying the current class, save any local class variables, and then call the base class before writing the end of the element.
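The Save pattern just described can be sketched in isolation like this; the class names and members below are hypothetical stand-ins for the library's real ones, but the write-element, save-locals, call-base, end-element shape is the one the article describes.

```csharp
using System.Xml;

// Illustrative sketch of the article's save pattern: each class writes
// an element identifying itself, saves its own members, and lets the
// base class save the inherited ones. Names are hypothetical.
class NodeBase
{
    public int Identifier;
    public double NodeValue;
    public double NodeError;

    public virtual void Save(XmlWriter writer)
    {
        writer.WriteStartElement("BasicNode");
        writer.WriteElementString("Identifier", Identifier.ToString());
        writer.WriteElementString("NodeValue", NodeValue.ToString());
        writer.WriteElementString("NodeError", NodeError.ToString());
        writer.WriteEndElement();
    }
}

class DemoAdalineNode : NodeBase
{
    public override void Save(XmlWriter writer)
    {
        writer.WriteStartElement("AdalineNode"); // identify this class
        base.Save(writer);                       // base saves shared members
        writer.WriteEndElement();                // close this class's element
    }
}
```

Because each level only closes the element it opened, the nesting of the saved file mirrors the inheritance hierarchy, exactly as in the listing above.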

The layout of the XML file follows this pattern because, even though the nodes contain a record of their links, it was not only more confusing to try to load the links along with the nodes, but, seeing as the links are themselves stored as separate objects, it would also have been an unwanted duplication of code.

Once the file has been saved and loaded again, it is run again using the same data file, and it should get full marks every time. There is a much surer test, though, built into the test application provided with the downloads for this program.

Testing

The testing portions of the code are located under the Run menu of the Neural Net Tester program. The test for this program is the "Load And Run Adaline 1" menu option. This will load a file that resembles the one above; I say resembles, as the link values won't be exactly the same for any two training runs.

The menu option will rerun the file generation for the Adaline network, which uses a time-based seed so that the numbers are almost certain to be different each time. The list will display a message saying that the file is being generated, and will then probably proceed in a blur of activity as the program runs the newly generated file through the Run function. Note that no learning functions are called this time through; the file is processed entirely with the data taken from the loaded network.
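The generation code itself isn't listed in the article, but the description above (a time-seeded random source producing input pairs with known answers) could be sketched along these lines. The class name, method name and file layout here are my assumptions, not necessarily the real layout of the generated file.

```csharp
using System;
using System.Globalization;
using System.IO;

// Hedged sketch of a training/test file generator as the article
// describes it: a time-seeded Random produces input pairs, and the
// answer column is derived from the comparison the network learns.
static class AdalineFileGenerator
{
    public static void Generate(string path, int count)
    {
        // The parameterless Random constructor seeds from the system
        // clock, so successive runs produce different data sets.
        Random rnd = new Random();
        using (StreamWriter writer = new StreamWriter(path))
        {
            for (int i = 0; i < count; i++)
            {
                double inOne = rnd.NextDouble();
                double inTwo = rnd.NextDouble();
                // 1 when the second value is larger, -1 otherwise,
                // matching the rule seen in the sample output.
                int answer = inTwo > inOne ? 1 : -1;
                writer.WriteLine(string.Format(CultureInfo.InvariantCulture,
                    "{0} {1} {2}", inOne, inTwo, answer));
            }
        }
    }
}
```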

At the end, the display will show a list of all the input data and the conclusion the Adaline network reached about each item. Next to this will be the answer that was generated with the test data in the pattern. So far in my testing, the function has performed with one hundred percent accuracy.

The quick guide is:

  • Menu :- Generate/Generate Adaline One Training File :- Generates the file that is used by the Adaline Load and Run menu option.
  • Menu :- Run/Load And Run Adaline 1 :- Loads the saved Adaline network from disk and then runs it against the Adaline file.
  • Menu :- Train/Train Adaline 1 :- Trains the network from scratch using the current Adaline training file (lin2var.trn) and then saves it to disk.
  • Menu :- Options/Adaline 1 Options :- Brings up a dialog that allows you to set certain parameters for running the Adaline network.

Options

Sample image

There are three options that can be set through the options dialog box for the Adaline One network. The first is the number of items in the file that is used for testing and training. This can be raised or lowered (as long as you make sure the file contains enough data items).

The second is the learning rate for the network, which is used whenever Learn is called. The third, the "Use Bias" option, adds the bias value to the result of the run function within the transfer function.

Understanding The Output

Training

Pattern ID = 248 Input Value 0 = 0.580523407357057 
   Input Value 1 = 0.856571342263637 Output Value 0 = 1
Pattern ID = 249 Input Value 0 = 0.144502778139199 
   Input Value 1 = 0.420550713045779 Output Value 0 = 1 
Pattern ID = 250 Input Value 0 = 0.708482148921342 
   Input Value 1 = 0.984530083827921 Output Value 0 = 1

The above is part of a training run for the Adaline One network and shows the first stage of the program's output. The Pattern ID value is the identifier number of the pattern and has no bearing on the program's final calculations. The input values are the values being loaded into the patterns array at the time; this section of the output is produced while the program loads the patterns array in preparation for training, and comes from the LoadAdalineTrainingFile function. This means that Input Value 0, Input Value 1 and Output Value 0 should be exactly as they are in the training file that the Adaline One network uses (lin2var.trn).

Learn called at number 0 Pattern value = 1 Neuron value = -1
Iteration number 8 produced 0 Good values out of 250
Learn called at number 2 Pattern value = -1 Neuron value = 1
Iteration number 9 produced 2 Good values out of 250
Learn called at number 0 Pattern value = 1 Neuron value = -1
Iteration number 10 produced 0 Good values out of 250

The next section of output comes from within the main training loop and indicates where the Learn function had to be called because the output returned by the network differed from that provided in the training file. The values output by the program indicate how far through the training array the code had got before Learn was called; in the case of the first line above, the network returned a wrong answer on the very first item fed into it. The next line shows how many times the code has cycled through the training data. Remember that this is not always a complete pass, as the Adaline network aborts the current iteration when it detects an incorrect answer and moves on to the next iteration.

Pattern 223 Input = ( 0.955437682548276,0.938066999864796 ) 
   Adaline = -1 Actual = -1 
Pattern 224 Input = ( 0.519417053330418,0.795464988236998 ) 
   Adaline = 1 Actual = 1
Pattern 225 Input = ( 0.0833964241125604,0.664746477578183 ) 
   Adaline = 1 Actual = 1
Pattern 226 Input = ( 0.647375794894703,0.228725848360325 ) 
   Adaline = -1 Actual = -1

The final part of the training for the Adaline One network is to run the pattern array that was used to train the network through a saved version of the trained network, loaded into a completely new Adaline network, and see what results it gives. As the network was trained on this data, we expect perfection. The output shows the values entered into the network in brackets, followed by the answer the network gave and the required output: the result should be -1 if the input value on the left is higher than the input value on the right, and 1 if the input value on the right is greater than the input value on the left.
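For reference, the rule the network reproduces can be written down directly. This helper is a sketch of mine, not part of the library, but it matches the sample output above (for instance, Pattern 223's inputs of 0.9554 and 0.9381 give -1, and Pattern 224's inputs of 0.5194 and 0.7955 give 1).

```csharp
// The linearly separable rule the Adaline One network is trained on:
// 1 when the right-hand input is greater than the left-hand one,
// otherwise -1. A sketch for illustration, not library code.
static class AdalineRule
{
    public static double ExpectedOutput(double left, double right)
    {
        return right > left ? 1.0 : -1.0;
    }
}
```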

Running

Adaline File Generated:

Pattern ID = 251 Input Value 0 = 0.421976388628584 
   Input Value 1 = 0.00332644209420608 Output Value 0 = -1
Pattern ID = 252 Input Value 0 = 0.985955759410726 
   Input Value 1 = 0.567305812876348 Output Value 0 = -1
Pattern ID = 253 Input Value 0 = 0.549935130192868 
   Input Value 1 = 0.131285183658491 Output Value 0 = -1

When running the Adaline One network, it starts by generating a completely new file of values to test, to ensure that the values the network runs against are different from those it was trained on. The network then loads the pattern array, producing output in the same form as during training, as shown above.

Pattern 0 Input = ( 0.421976388628584,0.00332644209420608 ) 
   Adaline = -1 Actual = -1
Pattern 1 Input = ( 0.985955759410726,0.567305812876348 ) 
   Adaline = -1 Actual = -1
Pattern 2 Input = ( 0.549935130192868,0.131285183658491 ) 
   Adaline = -1 Actual = -1
Pattern 3 Input = ( 0.712635237124113,0.695264554440633 ) 
   Adaline = -1 Actual = -1
Pattern 4 Input = ( 0.276614607906255,0.259243925222775 ) 
   Adaline = -1 Actual = -1 

Once the network has loaded the patterns array, it runs the patterns against the loaded network and outputs the details, as in the training code above.

Finally

As I've stated before, the Adaline network (well, neuron) is a simple neural network and as such has its limitations, the most obvious being the fact that the answers are input along with the questions. Its main purpose is simple classification where the desired output is known and there is a high volume of data to be processed. Most of the early neural networks we will come across are geared towards simple solutions applied to large volumes of data, with the classification of patterns within the data being the task of the network, though this classification will get more complicated even within the next few projects that we look at.

The next system we will look at is the Back Propagation network, which will expand on the things we have learnt here. This will be in the article Neural Dot Net 7 The Back Propagation Network, as Neural Dot Net 4 deals with the testing program provided with the download.

References

  • Tom Archer (2001) Inside C#, Microsoft Press
  • Jeffrey Richter (2002) Applied Microsoft .NET Framework Programming, Microsoft Press
  • Charles Petzold (2002) Programming Microsoft Windows With C#, Microsoft Press
  • Robinson et al (2001) Professional C#, Wrox
  • William R. Stanek (1997) Web Publishing Unleashed Professional Reference Edition, Sams.net
  • Robert Callan (1999) The Essence Of Neural Networks, Prentice Hall
  • Timothy Masters (1993) Practical Neural Network Recipes In C++, Morgan Kaufmann (Academic Press)
  • Melanie Mitchell (1999) An Introduction To Genetic Algorithms, MIT Press
  • Joey Rogers (1997) Object-Oriented Neural Networks in C++, Academic Press
  • Simon Haykin (1999) Neural Networks: A Comprehensive Foundation, Prentice Hall
  • Bernd Oestereich (2002) Developing Software With UML: Object-Oriented Analysis And Design In Practice, Addison Wesley
  • R. Beale & T. Jackson (1990) Neural Computing: An Introduction, Institute Of Physics Publishing

Thanks

  • Special thanks go to anyone involved in TortoiseCVS for version control
  • All UML diagrams were generated using Metamill version 2.2

History

  • 24 June 2003 :- Initial release
  • 24 October 2003 :- Review and edit for CP conformance


About the Author

pseudonym67, United Kingdom

Article Copyright 2003 by pseudonym67