Neural Dot Net Pt 7 The BackPropagation Network

3 Nov 2003

A neural network library in C#

Introduction

The BackPropagation network gets its name from the way that learning is done: it starts at the Learn function in the output nodes and proceeds backwards through the nodes, updating the weights on the links as it goes. This example is based on chapter 5 of Joey Rogers' Object-Orientated Neural Networks in C++ and has been expanded with a facility to generate test data and then run that data through the trained network.

The BackPropagation Network

There are five new classes to introduce with the BackPropagation program, most of which inherit directly from classes that have been seen previously. Structurally, the network is slightly more complicated than those in previous examples because we are now looking at networks that contain layers of nodes. These layers include what are called "hidden" nodes: nodes that are part of the actual network but are not directly accessed by code from the running program, lying between the input and the output nodes.

[Image: the BackPropagation network]

[Image: BackPropagation class diagram]

The Back Propagation Network Class

The BackPropagationNetwork class is used for building and controlling the network and inherits from the BasicNetwork class. It expands on the basic network class in that it must be capable of providing layers for the network and must know where those layers are. I say this because the nodes are still stored in a single array list; the layers are a conceptual construct that only exist where the code says they are. There is no attempt to build a code hierarchy that models the diagram of the BackPropagation network like the one above.

It should also be noted that although this example only has one output node, the code contains the facility to deal with more than one output node. This is noticeable in the code that gets and sets the output errors and values.

C#
public virtual double OutputError( int nID )
{
    if( debugLevel.TestDebugLevel( DebugLevelSet.Progress ) == true )
    {
        log.Log( DebugLevelSet.Progress, "Getting the Output Error at " 
             + nID.ToString() 
             + " from the Backpropagation network", ClassName );
    }

    if( this.Nodes.Count <= ( nID + nFirstOutputNode ) )
    {
        if( debugLevel.TestDebugLevel( DebugLevelSet.WarningsAndErrors ) 
                      == true )
        {
            log.Log( DebugLevelSet.WarningsAndErrors, 
                 "Warning the array count is less than the index you are"
                 + " using to access it, returning 0.0 ", ClassName );
        }

        return 0.0;
    }

    return this.GetNodeAt( nID + nFirstOutputNode ).
                                GetError( Values.NodeError );
}

public virtual void SetOutputError( int nID, double dNewValue )
{
    if( debugLevel.TestDebugLevel( DebugLevelSet.Progress ) == true )
    {
        log.Log( DebugLevelSet.Progress, 
               "Setting the output error for output node " 
                + nID.ToString() + " from " 
                + this.GetNodeAt( nID + nFirstOutputNode ).
                           GetError( Values.NodeError ).ToString() 
                + " to " 
                + dNewValue.ToString(), ClassName );
    }

    if( this.Nodes.Count <= ( nID + nFirstOutputNode ) )
    {
        if( debugLevel.TestDebugLevel( DebugLevelSet.WarningsAndErrors )
                                       == true )
        {
            log.Log( DebugLevelSet.WarningsAndErrors, 
                   "Warning the array count is less than the index"
                   + " you are using to access it,"
                   + " quitting set node error in BackPropagation network",
                   ClassName );
        }

        return;
    }

    this.GetNodeAt( nID + nFirstOutputNode ).
                        SetError( Values.NodeError, dNewValue );
}

public virtual void SetOutputError( Pattern pattern )
{
    if( debugLevel.TestDebugLevel( DebugLevelSet.Progress ) == true )
    {
        log.Log( DebugLevelSet.Progress, 
             "Set output error with pattern called for BackPropagation"
             + " network", ClassName );
    }

    for( int i=0; i<( ( int )this.Layers[ nNumLayers-1 ] ); i++ )
    {
        this.GetNodeAt( i + nFirstOutputNode ).SetError( Values.NodeError, 
                                                pattern.OutputValue( i ) );
    }
}

public virtual double GetOutputValue( int nID )
{
    if( debugLevel.TestDebugLevel( DebugLevelSet.Progress ) == true )
    {
        log.Log( DebugLevelSet.Progress,
              "Get output value called for BackPropagation network",
              ClassName );
    }

    if( this.Nodes.Count <= ( nID + nFirstOutputNode ) )
    {
        if( debugLevel.TestDebugLevel( DebugLevelSet.WarningsAndErrors ) 
                                             == true )
        {
            log.Log( DebugLevelSet.WarningsAndErrors, 
                 "Warning the array count is less than the index"
                 + " you are using to access it,"
                 + " returning 0.0", ClassName );
        }

        return 0.0;
    }

        return 0.0;
    }

    return this.GetNodeAt( nID + nFirstOutputNode ).
                                      GetValue( Values.NodeValue );
}

As you can see from the code, whenever the network's output values are accessed, the code calls GetNodeAt, taking the ID passed in (which will be zero for the first output node) and adding it to the nFirstOutputNode value, which is the network class' way of keeping track of the position in the ArrayList at which the first output node is stored.
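
As a concrete illustration, here is how the indexing works out for the 2-2-1 layer sizes used in the XOR example later in this article: the single ArrayList holds the two input nodes at indexes 0 and 1, the two middle nodes at indexes 2 and 3 (so nFirstMiddleNode is 2), and the output node at index 4 (so nFirstOutputNode is 4). A call to GetOutputValue( 0 ) therefore reads the node at index 0 + 4 = 4.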

The Back Propagation Network is built by the CreateNetwork function.

C#
protected override void CreateNetwork()
{
    if( debugLevel.TestDebugLevel( DebugLevelSet.Progress ) == true )
    {
        log.Log( DebugLevelSet.Progress, 
             "Create Network called for the BackPropagation network ",
             ClassName );
    }

    /// work out the number of nodes and links
    for( int i=0; i<arrayLayers.Count; i++ )
    {
        nNumberOfNodes += ( int )arrayLayers[ i ];
    }

    /// the number of links equals a link from each node in the preceding 
    /// layer to each node in the current layer
    for( int i=1; i<arrayLayers.Count; i++ )
    {
        nNumberOfLinks += ( ( int )arrayLayers[ i-1 ] 
                             * ( int )arrayLayers[ i ] );
    }

    /// fill out the node arrays
    nFirstMiddleNode = 0;
    for( int i=0; i<nNumLayers; i++ )
    {
        /// input layer
        if( i==0 )
        {
            for( int n=0; n<( int )arrayLayers[ i ]; n++ )
            {
                this.AddNode( new BasicNode( log ) );
            }
        }
        /// output layer
        else if( i+1 == nNumLayers ) /// numlayers not 0 based
        {
            nFirstOutputNode = this.Nodes.Count;

            for( int n=0; n<( int )arrayLayers[ i ]; n++ )
            {
                this.AddNode( 
                    new BackPropagationOutputNode( log, 
                         this.LearningRate, this.dMomentumTerm ) );
            }
        }
        /// middle layer (s)
        else
        {
            nFirstMiddleNode = this.Nodes.Count;

            for( int n=0; n<( int )arrayLayers[ i ]; n++ )
            {
                this.AddNode( 
                     new BackPropagationMiddleNode( log, 
                           this.LearningRate, this.dMomentumTerm ) );
            }
        }
    }

    /// create the links            
    for( int i=0; i<nNumberOfLinks; i++ )
    {
        this.AddLink( new BackPropagationLink( log ) );
    }

    /// now do all the connections
    int nLayerOne = 0;
    int nLayerTwo = nFirstMiddleNode;
    int nLinkNumber = 0;

    for( int i=0; i<nNumLayers-1; i++ )
    {
        /// outer layer ( starts with input layer )
        for( int n=0; n<( int )arrayLayers[ i ]; n++ )
        {
            /// next inner layer to link to the outer layer
            for( int k=0; k<( ( int )arrayLayers[ i + 1 ] ); k++ )
            {
                ( ( BasicNode )this.Nodes[ nLayerOne + n ] ).CreateLink( 
                    ( BasicNode )this.Nodes[ nLayerTwo + k ], 
                    ( BasicLink )this.Links[ nLinkNumber ] );
                nLinkNumber++;
            }
        }

        nLayerOne = nLayerTwo;
        nLayerTwo += ( int )Layers[ i + 1 ];
    }
}

The CreateNetwork function starts by calculating the number of nodes and links that are required for the creation of the network. There are a couple of ways that you can add layers to the BackPropagationNetwork class. The first, and the one used by this code, is to build a three layer network using the provided constructor, which takes the number of nodes for each layer. The second is to use the constructor that just takes the number of layers; you can then add the layers with:

C#
AddLayer = numberOfNodes;

passing the number of nodes that are to be in that specific layer.
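
Here is a minimal sketch of the two approaches. The exact constructor signatures are not shown in this article, so the log parameter and the argument order below are assumptions:

C#
/// assumed three-layer constructor: node counts for input, middle and output
BackPropagationNetwork bpNetwork = 
    new BackPropagationNetwork( log, 2, 2, 1 );

/// or: assumed constructor taking the number of layers, followed by one
/// AddLayer assignment per layer, giving that layer's node count
BackPropagationNetwork bpNetwork2 = new BackPropagationNetwork( log, 3 );
bpNetwork2.AddLayer = 2; /// input layer
bpNetwork2.AddLayer = 2; /// middle layer
bpNetwork2.AddLayer = 1; /// output layer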

The CreateNetwork function then creates the nodes according to where they fall in the layer structure, i.e. anything in the first layer is an input node, anything in the last layer is an output node, and anything in between is a middle node. Once the nodes have been created, the code creates the correct number of links to join the nodes together, before cycling through the nodes and creating the links between the separate layers.

The Back Propagation Output Node Class

The BackPropagationOutputNode class inherits from the Adaline node class so that it can use the Run function provided by that class. The most significant change in the BackPropagationOutputNode class is to the Transfer function. This is:

C#
protected override double TransferFunction( double dValue )
{
    return 1.0/( 1+Math.Exp( -dValue ) ); /// logistic sigmoid
}

which now uses a sigmoid function to guarantee that the output returned by the function is within the range 0 to 1.
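
In other words, the node's output is

    output = 1 / ( 1 + e^-value )

which squashes any weighted sum, however large or small, into the open interval ( 0, 1 ).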

The BackPropagationOutputNode class also has a new function, the ComputeError function:

C#
public virtual double ComputeError()
{
    return ( ( double )this.NodeValues[ Values.NodeValue ] ) 
            * ( 1.0-( double )this.NodeValues[ Values.NodeValue ] ) 
            * ( ( ( double )this.NodeErrors[ Values.NodeError ] ) 
            - ( ( double )this.NodeValues[ Values.NodeValue ] ) );
}

which computes the error for the output node as the current node value, multiplied by one minus the current node value, multiplied by the desired output (which SetOutputError has stored in the node's error slot) minus the current node value. This function is called from the BackPropagationOutputNode Learn function, which sets the node error value to the returned result.
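
The value times one-minus-value factor is not arbitrary: for the sigmoid transfer function y = 1/( 1 + e^-x ), the derivative is dy/dx = y( 1 - y ), so ComputeError is the standard output-layer delta

    error = y ( 1 - y ) ( target - y )

where y is the node's output and target is the desired output from the pattern.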

The BackPropagationOutputNode Learn function looks like:

C#
public override void Learn()
{
    if( debugLevel.TestDebugLevel( DebugLevelSet.Progress ) == true )
    {
        log.Log( DebugLevelSet.Progress, 
                 "Learn called for BackPropagation node ", ClassName );
    }

    double dDelta = 0.0;
    this.NodeErrors[ Values.NodeError ] = ComputeError();

    for( int i=0; i<this.InputLinks.Count; i++ )
    {
        dDelta = ( ( double )this.NodeValues[ Values.LearningRate ] ) 
                * ( ( double )this.NodeErrors[ Values.NodeError ] ) 
                * ( ( BasicLink )this.InputLinks[ i ] ).InputValue( Values.NodeValue );
        ( ( BackPropagationLink )this.InputLinks[ i ] ).UpdateWeight( dDelta );
    }
}

which starts off by computing and storing the error value for the current node, and then, for each input link, calculates the weight change as the learning rate multiplied by the node error value multiplied by the input value on that link.
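
As a worked example (the numbers are illustrative: 0.45 is the learning rate stored in the XML file later in this article, and -0.095 is of the order of the node errors seen in the training output), with an input value of 1.0 the weight change passed to UpdateWeight would be

    dDelta = 0.45 * -0.095 * 1.0 = -0.04275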

The Back Propagation Middle Node Class

The BackPropagationMiddleNode class inherits from the BackPropagationOutputNode class and overrides the ComputeError function.

C#
public override double ComputeError()
{
    double dTotal = 0.0;
    for( int i=0; i<this.OutputLinks.Count; i++ )
    {
        dTotal += ( ( BackPropagationLink )
           this.OutputLinks[ i ] ).WeightedOutputError( Values.NodeError );
    }

    return ( double )this.NodeValues[ Values.NodeValue ] 
          * ( 1.0-( ( double )this.NodeValues[ Values.NodeValue ] ) ) 
          * dTotal;
}

The difference between the two versions of the ComputeError function is in how the error is calculated. The BackPropagationMiddleNode version first totals the weighted errors on the node's output links, and then returns the current node value, multiplied by one minus the current node value, multiplied by that total weighted error value.
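
In symbols, for a middle node with output y, the error is

    error = y ( 1 - y ) * sum( w_k * error_k )

where the sum runs over the node's output links, w_k is the weight on link k and error_k is the error of the node that link feeds into; this is exactly the total that the WeightedOutputError loop accumulates.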

The Back Propagation Link Class

The BackPropagationLink class is an extension of the BasicLink class and is provided to enable the use of the delta and momentum values that are used by the Back Propagation Network. Its main difference from the BasicLink class comes with the UpdateWeight function.

C#
public override void UpdateWeight( double dNewValue )
{
    /// get the current momentum
    double dMomentum = this.OutputNode.GetValue( Values.Momentum );
    /// update the weight with the current change and a percentage 
    /// of the last change
    this.arrayLinkValues[ Values.Weight ] = 
          ( double )this.arrayLinkValues[ Values.Weight ] 
          + dNewValue + ( dMomentum * 
          ( double )this.arrayLinkValues[ Values.Delta ] );
    /// store the new value as passed 
    this.arrayLinkValues[ Values.Delta ] = dNewValue;
}

This gets the momentum from the node, which is set when the Back Propagation network is created. It then calculates the new weight by taking the current weight, adding the new value passed in, and then adding a proportion of the previous weight change. In this example, the momentum is set to 0.9, so this proportion will be 0.9 times whatever is stored in the delta value; as you can see, the value passed in is then stored in the delta on the next line of code, ready to supply that percentage of the previous update on the following call.
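
A quick worked example using the article's momentum of 0.9 (the other numbers are illustrative, rounded from the delta values in the XML file below): if the stored delta from the last update was -0.0096 and the new change passed in is -0.0131, the weight moves by

    -0.0131 + ( 0.9 * -0.0096 ) = -0.02174

and -0.0131 is then stored as the delta for the next update.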

Training

Training the Back Propagation Network is slightly more complicated than it was for the Adaline Network seen earlier. The reason is that the Back Propagation Network does the weight adjustments to the network all in one go: it starts at the output node and propagates the training backwards through the middle nodes.


The training loop for the Back Propagation Network looks like:

C#
while( nGood < 4 )
{
    nGood = 0;
    dTotalError = 0.0;

    for( int i=0; i<4; i++ )
    {
        /// set the input values
        bpNetwork.SetValue( ( ( Pattern )patterns[ i ] ) );

        /// run the network
        bpNetwork.Run();

        /// set the desired output
        bpNetwork.SetOutputError( ( ( Pattern )patterns[ i ] ) );

        /// run learn anyway
        bpNetwork.Learn();

        dTest = Math.Abs( bpNetwork.GetOutputValue( 0 ) 
                 - ( ( Pattern )patterns[ i ] ).OutputValue( 0 ) );
        netWorkText.AppendText( "Network output value = " 
                 + ( double )bpNetwork.GetOutputValue( 0 ) 
                 + " Pattern output value = " 
                 +  ( ( Pattern )patterns[ i ] ).OutputValue( 0 ) + "\n" );
        netWorkText.AppendText( "Absolute output value = " 
                 + dTest.ToString() + "\n" );

        if( dTest  < dTolerance )
        {
            if( debugLevel.TestDebugLevel( DebugLevelSet.Progress ) == true )
            {
                log.Log( DebugLevelSet.Progress, 
                         "Test falls within tolerance levels", ClassName );
            }

            netWorkText.AppendText( "Test = " 
                         + dTest.ToString() + " Tolerance = " 
                         + dTolerance.ToString() 
                         + " Test falls within tolerance levels\n" );

            nGood++;
        }

        dTotalError += Math.Abs( bpNetwork.OutputError( 0 ) );

        if( debugLevel.TestDebugLevel( DebugLevelSet.Progress ) == true )
        {
            log.Log( DebugLevelSet.Progress, 
                   "Back Propagation training run " + i.ToString() 
                   + " completed, total error = " 
                   + dTotalError.ToString(), ClassName );
        }

        netWorkText.AppendText( "Pattern " 
              + ( ( Pattern )patterns[ i ] ).InputValue( 0 ).ToString() 
              + "," + ( ( Pattern )patterns[ i ] ).InputValue( 1 ).ToString() 
              + " run through the network " + " output error at 0 = " 
              + bpNetwork.OutputError( 0 ).ToString() 
              + " total error = " + dTotalError.ToString() + "\n" );
    }
}

The training loop only tries to get four good results because we are trying to solve the XOR problem here. This means that, of the four input pairs 0 & 0, 0 & 1, 1 & 0, and 1 & 1, only the pairs that contain a single value of 1 should give an output value of 1.
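
For reference, the XOR truth table the network is being trained on is:

    Input One   Input Two   Output
    0           0           0
    0           1           1
    1           0           1
    1           1           0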

The Back Propagation Network figures out the answer to this problem by running through each epoch of the four patterns, or value pairs, indicated above, calling the network Run function for each pair. The essential part of the Back Propagation Network Run function is:

C#
for( int i=nFirstMiddleNode; i<this.Nodes.Count; i++ )
{
    ( ( AdalineNode )this.Nodes[ i ] ).Run( Values.NodeValue );
}

The Run function starts at the first middle node and then calls the Adaline node Run function on all of the nodes up to Nodes.Count, which includes calling the Run function on the output node.

The output error for the output nodes in the network is then set to the desired output that is stored in the pattern value, before the Learn function is called for every single run through the loop. The Learn function:


C#
for( int i=this.Nodes.Count-1; i>=nFirstMiddleNode; i-- )
{
    ( ( AdalineNode )this.Nodes[ i ] ).Learn();
}

cycles backwards from the last output node to the first middle node in the network, calling the Learn function on each node. Learn behaves differently for the Back Propagation Output Node and the Back Propagation Middle Node, as they have different ways of computing the error values for the nodes, as shown in the class descriptions above.

Saving And Loading

The Back Propagation network uses the same XML loading and saving techniques used throughout the library. Here is an example of a saved XML file.

XML
<?xml version="1.0" encoding="utf-8" ?>
<BackPropagationNetwork>
    <NumberOfLayers>3</NumberOfLayers>
    <FirstMiddleNode>2</FirstMiddleNode>
    <FirstOutputNode>4</FirstOutputNode>
    <Momentum>0.9</Momentum>
    <Layers>
        <Layer0>2</Layer0>
        <Layer1>2</Layer1>
        <Layer2>1</Layer2>
    </Layers>
    <BasicNode>
        <Identifier>0</Identifier>
        <NodeValue>1</NodeValue>
        <NodeError>0</NodeError>
        <Bias>
            <BiasValue>1</BiasValue>
        </Bias>
    </BasicNode>
    <BasicNode>
        <Identifier>1</Identifier>
        <NodeValue>1</NodeValue>
        <NodeError>0</NodeError>
        <Bias>
            <BiasValue>1</BiasValue>
        </Bias>
    </BasicNode>
    <BackPropagationMiddleNode>
        <BackPropagationOutputNode>
            <AdalineNode>
                <BasicNode>
                    <Identifier>2</Identifier>
                    <NodeValue>0.306374302171129</NodeValue>
                    <NodeValue>0.45</NodeValue>
                    <NodeValue>0.9</NodeValue>
                    <NodeError>-0.0212641438895606</NodeError>
                    <Bias>
                        <BiasValue>1</BiasValue>
                    </Bias>
                </BasicNode>
            </AdalineNode>
        </BackPropagationOutputNode>
    </BackPropagationMiddleNode>
    <BackPropagationMiddleNode>
        <BackPropagationOutputNode>
            <AdalineNode>
                <BasicNode>
                    <Identifier>3</Identifier>
                    <NodeValue>0.0793336908508232</NodeValue>
                    <NodeValue>0.45</NodeValue>
                    <NodeValue>0.9</NodeValue>
                    <NodeError>-0.00849829845549166</NodeError>
                    <Bias>
                        <BiasValue>1</BiasValue>
                    </Bias>
                </BasicNode>
            </AdalineNode>
        </BackPropagationOutputNode>
    </BackPropagationMiddleNode>
    <BackPropagationOutputNode>
        <AdalineNode>
            <BasicNode>
                <Identifier>4</Identifier>
                <NodeValue>0.39830515925165</NodeValue>
                <NodeValue>0.45</NodeValue>
                <NodeValue>0.9</NodeValue>
                <NodeError>-0.0954570813319005</NodeError>
                <Bias>
                    <BiasValue>1</BiasValue>
                </Bias>
            </BasicNode>
        </AdalineNode>
    </BackPropagationOutputNode>
    <BackPropagationLink>
        <BasicLink>
            <Identifier>5</Identifier>
            <LinkValue>0.156366589563507</LinkValue>
            <LinkValue>-0.00956886475030225</LinkValue>
            <InputNodeID>0</InputNodeID>
            <OutputNodeID>2</OutputNodeID>
        </BasicLink>
    </BackPropagationLink>
    <BackPropagationLink>
        <BasicLink>
            <Identifier>6</Identifier>
            <LinkValue>1.28844804384242</LinkValue>
            <LinkValue>-0.00382423430497125</LinkValue>
            <InputNodeID>0</InputNodeID>
            <OutputNodeID>3</OutputNodeID>
        </BasicLink>
    </BackPropagationLink>
    <BackPropagationLink>
        <BasicLink>
            <Identifier>7</Identifier>
            <LinkValue>0.657049900059346</LinkValue>
            <LinkValue>-0.00956886475030225</LinkValue>
            <InputNodeID>1</InputNodeID>
            <OutputNodeID>2</OutputNodeID>
        </BasicLink>
    </BackPropagationLink>
    <BackPropagationLink>
        <BasicLink>
            <Identifier>8</Identifier>
            <LinkValue>1.16778235004883</LinkValue>
            <LinkValue>-0.00382423430497125</LinkValue>
            <InputNodeID>1</InputNodeID>
            <OutputNodeID>3</OutputNodeID>
        </BasicLink>
    </BackPropagationLink>
    <BackPropagationLink>
        <BasicLink>
            <Identifier>9</Identifier>
            <LinkValue>1.04824365230734</LinkValue>
            <LinkValue>-0.0131605185061592</LinkValue>
            <InputNodeID>2</InputNodeID>
            <OutputNodeID>4</OutputNodeID>
        </BasicLink>
    </BackPropagationLink>
    <BackPropagationLink>
        <BasicLink>
            <Identifier>10</Identifier>
            <LinkValue>1.21888832568501</LinkValue>
            <LinkValue>-0.00340783316095809</LinkValue>
            <InputNodeID>3</InputNodeID>
            <OutputNodeID>4</OutputNodeID>
        </BasicLink>
    </BackPropagationLink>
</BackPropagationNetwork>

As you can see above, the layers section of the XML file stores each layer with the number of nodes that are to be in that layer of the network. The Back Propagation network also stores the array positions of the first middle node and the first output node, as well as the momentum for the network, which is additionally stored by each individual node at position two of its node value array, alongside the learning rate stored at position one. The Back Propagation links are also slightly different in that they now store the delta value at position one of the link value array.
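
Pulling that together, the repeated values in the file map onto the value arrays like this (as read from the XML above):

    Node value array:  [0] node value   [1] learning rate (0.45)   [2] momentum (0.9)
    Link value array:  [0] weight       [1] delta (the last weight change)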

Testing

The testing portions of the code are located under the Run menu of the Neural Net Tester program. The test for this program is the "Load And Run BackPropagation 1" menu option. This will load a file that resembles the one above; I say resembles because the link values won't be exactly the same on any two runs.

The menu option will load and run the backpropagtionworkingfile.wrk file and generate the log file Neural Network Tester Load And Run BackPropagation One Network.xml, which can be viewed using the LogViewer that is part of the Neural Net Tester program.

At the end, the display will show a list of all the input data and the conclusion the back propagation network reached about each item, next to the answer that was generated with the test data in the pattern. So far in my testing, the network has performed with one hundred percent accuracy.

The quick guide is:

  • Menu :- Generate/Generate BackPropagation Working File :- Generates the file that is used for the BackPropagation Load And Run menu option.
  • Menu :- Run/Load And Run BackPropagation 1 :- Loads the saved BackPropagation network from the disk and then runs it against the working file.
  • Menu :- Train/Train BackPropagation 1 :- Trains the network from scratch using the hard coded XOR data and then saves it to disk.
  • Menu :- Options/BackPropagation 1 Options :- Brings up a dialog that allows you to set certain parameters for the running of the network.

Options

[Image: the BackPropagation One options dialog]

The above is the options dialog for the Back Propagation One network, which contains the five options you can set. The first is the Number of Tests, which is the number of items that are generated into, and read from, the testing file for the network; in the case of this network, that file is BackPropagationOneWorkingFile.wrk. The second is the tolerance level that is acceptable to the program. This should always be a value that lets the code distinguish the two answers; if it were set to 0.6, the acceptable ranges would overlap, making any answers returned from the network meaningless. The third and fourth are the momentum and the learning rate, which are both used in the calculations that determine the weight values for each link to the nodes. The final option is a simple check box to specify whether you want to use the built-in bias, which always has a value of one in the calculations.
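
To see why a tolerance of 0.6 would be meaningless, consider an output of 0.5: it is within 0.6 of the desired value 0 and within 0.6 of the desired value 1, so it would count as a correct answer for both. With the default tolerance of 0.4, the acceptable ranges, below 0.4 for a desired 0 and above 0.6 for a desired 1, do not overlap.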

Understanding The Output

Training

Network output value = 0.397890715954771 Pattern output value = 0
Absolute output value = 0.397890715954771
Test = 0.397890715954771 Tolerance = 0.4 Test falls within tolerance levels
Pattern 0,0 run through the network output error 
  at 0 = -0.0953241486740618 total error = 0.0953241486740618
Network output value = 0.430815783338131 Pattern output value = 1
Absolute output value = -0.569184216661869
Test = -0.569184216661869 Tolerance = 0.4 Test falls within tolerance levels
Pattern 0,1 run through the network output error 
  at 0 = 0.13957167905037 total error = 0.234895827724432

The above shows two runs through the training loop for the Back Propagation Network. Unlike the Adaline network, the Back Propagation network calls Learn automatically on each pass through the loop, so there is no output to say that Learn has been called. The first line of the output shows the value that the network arrived at alongside the pattern output value that we want the network to arrive at. The second line shows the absolute difference between the network's output value and the pattern's output value. This value is then tested against the tolerance level, which in this case has been set to 0.4; if it is less than the tolerance value, the run is counted as a successful test. The final line shows the error values returned by the network and the pattern that was run through the network to begin with.
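
Taking the first run above as a worked example: the desired output is 0 and the network produced 0.397890715954771, so the absolute difference is

    | 0.397890715954771 - 0 | = 0.397890715954771

which is just inside the tolerance of 0.4 and so counts towards the four good results needed to stop training.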

Saving the BackPropagation Network
Loading the BackPropagation Network
Pattern Values = 0, 0 Pattern Output Value = 0 
   Network Output Value = 0.284630680477795
Pattern Values = 0, 1 Pattern Output Value = 1 
   Network Output Value = 0.651005374420711
Pattern Values = 1, 0 Pattern Output Value = 1 
   Network Output Value = 0.66443836682106
Pattern Values = 1, 1 Pattern Output Value = 0 
   Network Output Value = 0.395627748169065

Once the network has successfully trained against the test data, in this case the four acceptable value pairs for the XOR test, the network is saved and reloaded into a new Back Propagation network, and the values are entered into the network again to see how it performs. The final four lines above show the output of a run for the XOR test, each indicating the pattern that was entered for the test and the expected output value, along with the value that the network arrived at. As you can see, all the values fall within the tolerance levels: all the values where a 0 is expected are less than 0.4, and all the values where a 1 is expected are greater than 0.6.

Running

Generating Backpropagation working File... Please Wait
Backpropagation Working File Generated
Setting pattern 0 to, Input One = 1 Input Two = 1 Output Value = 0
Setting pattern 1 to, Input One = 0 Input Two = 0 Output Value = 0
Setting pattern 2 to, Input One = 1 Input Two = 0 Output Value = 1

As with the Adaline network samples, when load and run is selected, the Back Propagation network generates a file of data to run against the loaded network. This file is filled with pairs of values that are either 0 or 1, along with a third value that is the answer the network should arrive at.

Pattern Values = 0, 1 Pattern Output Value = 1 
    Network Output Value = 0.651005374420711 
    Output value is within tolerance level of 0.4 of 1
Pattern Values = 1, 1 Pattern Output Value = 0 
    Network Output Value = 0.395627748169065 
    Output value is within tolerance level of 0.4 of 0

Once the pattern array has been loaded, the data is run and the results are printed to the screen. The results show the original values entered and the required output, followed by the value that the network arrived at, and finally an indication of whether the value is within the tolerance level of the correct answer.

Fun And Games

As with the other network examples, the main aim is to produce a network that can be repeatedly trained and saved to disk for demonstration purposes. For this reason, the main parameter that was played with during testing was the tolerance parameter. For the purpose of the example it is set to a value of 0.4, which gives the network quite a large margin, one you might not want were you using the network in a production environment. There is the option to change the tolerance parameter within the options dialog, but it should be remembered that the smaller the tolerance gets, the longer the network training will take. Still, this shouldn't be too much of a problem, as once trained, the network can simply be reloaded as fully trained.

References

  • Tom Archer (2001) Inside C#, Microsoft Press
  • Jeffrey Richter (2002) Applied Microsoft .NET Framework Programming, Microsoft Press
  • Charles Petzold (2002) Programming Microsoft Windows With C#, Microsoft Press
  • Robinson et al (2001) Professional C#, Wrox
  • William R. Stanek (1997) Web Publishing Unleashed Professional Reference Edition, Sams.net
  • Robert Callan (1999) The Essence Of Neural Networks, Prentice Hall
  • Timothy Masters (1993) Practical Neural Network Recipes In C++, Morgan Kaufmann (Academic Press)
  • Melanie Mitchell (1999) An Introduction To Genetic Algorithms, MIT Press
  • Joey Rogers (1997) Object-Orientated Neural Networks in C++, Academic Press
  • Simon Haykin (1999) Neural Networks A Comprehensive Foundation, Prentice Hall
  • Bernd Oestereich (2002) Developing Software With UML Object-Orientated Analysis And Design In Practice, Addison Wesley
  • R Beale & T Jackson (1990) Neural Computing An Introduction, Institute Of Physics Publishing

Thanks

Special thanks go to anyone involved in TortoiseCVS for version control.

All UML diagrams were generated using Metamill version 2.2.

