See more: C++
I can't get my feedforward net to work. It's a single-layer net with a single neuron...
Are any of these equations wrong?
output = sum of all (inputs x corresponding weights)
error = desired output - neuron output
weight update: new weight = old weight + learning rate x error x input for that synapse
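The update rule above can be sketched in C++ like this (a minimal sketch; the function and variable names are mine, not from the post):

```cpp
#include <vector>

// Single neuron with a linear (identity) activation:
// output = sum of (input[i] * weight[i])
double forward(const std::vector<double>& in, const std::vector<double>& w) {
    double sum = 0.0;
    for (std::size_t i = 0; i < in.size(); ++i)
        sum += in[i] * w[i];
    return sum;
}

// Delta-rule update from the post:
// new weight = old weight + learningRate * error * input
void update(std::vector<double>& w, const std::vector<double>& in,
            double error, double learningRate) {
    for (std::size_t i = 0; i < w.size(); ++i)
        w[i] += learningRate * error * in[i];
}
```

Plugging in the first pattern below (inputs 334.6/340.1/340.7, the listed weights) reproduces the 257.953 output and, with a learning rate of about 0.01, the 281.892 new weight. Note what that implies: with inputs around 340 the per-step weight change is error x input x 0.01, i.e. in the hundreds, so the raw update rule diverges quickly unless the inputs and targets are rescaled or the learning rate is made much smaller.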
 
If not, maybe you can see what's wrong when I go through two patterns of 3 inputs each:
 
Data  | Weight     | Output  | Desired | Error  | New Weight
------|------------|---------|---------|--------|-----------
334.6 | 0.00125126 | 257.953 | 342.2   | 84.247 | 281.892
340.1 | 0.563585   |         |         |        |
340.7 | 0.193304   |         |         |        |
------|------------|---------|---------|--------|-----------
340   | 281.892    | 293,693 |         |        |
342   | 287.088    |         |         |        |
347   | 287.223    |         |         |        |

...the second pass makes the weights ridiculous and the program errors out, hitting #INF fast because it can't handle it.
 
Many thanks!
Posted 22-Nov-12 9:13am
Solution 2

The problem is that your activation function (i.e., the formula for calculating the output) is a linear function. Because of that, no matter what weights you set, your entire neural network is simply a linear function and can never approximate more than two target values at once.
 
Basically the best you can achieve is linear regression, but then there is no point in using NNs for linear regression, since you can solve that directly and a lot faster (and in any case your correction formula is wrong). The main point of NNs is that they can solve nonlinear problems, given the right type of activation function:
 
What you need is a nonlinear activation function, and a correction formula specifically suited to that function. Backpropagation uses a simple sigmoid function that allows for a simple error-correction formula, but it isn't hard to use different functions with similar results.
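A minimal C++ sketch of what this suggests, using the logistic sigmoid (the names are mine; note that with a sigmoid output in (0, 1), targets like 342.2 would first have to be rescaled into that range):

```cpp
#include <cmath>
#include <vector>

// Logistic sigmoid: squashes any input into (0, 1).
double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// The sigmoid's derivative has a convenient form in terms of its
// own output: f'(net) = out * (1 - out), which is what keeps the
// correction formula simple.
double sigmoidDeriv(double out) { return out * (1.0 - out); }

// One training step for a single sigmoid neuron:
// w_i += lr * (desired - out) * f'(net) * input_i
void trainStep(std::vector<double>& w, const std::vector<double>& in,
               double desired, double lr) {
    double net = 0.0;
    for (std::size_t i = 0; i < in.size(); ++i)
        net += in[i] * w[i];
    double out  = sigmoid(net);
    double grad = (desired - out) * sigmoidDeriv(out);
    for (std::size_t i = 0; i < w.size(); ++i)
        w[i] += lr * grad * in[i];
}
```

Because the output is bounded, the error term is bounded too, so the weights can no longer run off to #INF the way the linear version does.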
 
There are various articles on CP about backpropagation and other types of neural networks; I suggest you hit Search and see what helps you best.
Solution 1

You didn't mention the neural network algorithm. This actually looks like a backpropagation network (since you calculate an error).
 
In any case (either feed forward or backpropagation):
 
output = f( sum of all (inputs x corresponding weight) )
 
where f() is the activation function.
 
In binary neural nets (e.g. the Hopfield net), f() is a step function.
In continuous nets, f() is a sigmoid-shaped function such as tanh(x).
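For illustration, the two kinds of activation mentioned here look like this in C++ (a sketch; the names and the 0/1 step convention are my assumptions):

```cpp
#include <cmath>

// Step activation, as in a binary net (e.g. Hopfield-style):
// fires 1 if the weighted sum reaches the threshold, else 0.
double stepActivation(double net, double threshold = 0.0) {
    return net >= threshold ? 1.0 : 0.0;
}

// Continuous activation: tanh squashes the weighted sum into (-1, 1),
// which is what makes gradient-based corrections possible.
double tanhActivation(double net) { return std::tanh(net); }
```

Either one would replace the bare weighted sum in the questioner's output formula: output = f(sum of inputs x weights).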

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

Last Updated 23 Nov 2012