
I'm learning about neural networks. How do I scale data in a neural network multilayer backpropagation? I've found this formula for the input and test values:
I = Imin + (Imax-Imin)*(X-Dmin)/(Dmax-Dmin)

Input values are real numbers like in this multiplication table
1 1 1  (1 x 1 =1)
1 2 2  (1 x 2 = 2)
2 3 6  (2 x 3 =6)
5 5 25 (5 x 5=25)

I'd like to know how to un-scale the output data to get the real answers.

Thank you

Thanks to this answer I've started my scaling phase, though I'm still searching for good normalization formulas.
One can find many papers on neural networks, but very few on scaling and un-scaling your data.

I've started this coding from the article
Back-propagation Neural Net[^]
where someone has implemented the multiplication table solved by the backpropagation algorithm.

I keep on my search,
I'm using:
(double)LO+(HI-LO)*((X - minX)/(maxX - minX));

to scale, where:
maxX=25  (last result on my multiplication table)
X=input to scale

I need to find out how to un-scale, and whether I'm using the right scaling method.

Well, thanks again
Updated 3-Feb-10 10:41am

1 solution

Scale down the answers in the training set so they fall within the range of your sigmoid/squashing function, then scale the network's output back up by the same factor. This acts as a constant weight on the output neuron.
BPNs are poorly suited to this task, since the result set is limited by the weights, but it is a good way to watch the network learn. I also trained against a sine function and logical operations.

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)
