Hi,

I'm learning about neural networks. How do I scale data for a multilayer backpropagation neural network? I've found this formula for the input and test values:
`I = Imin + (Imax-Imin)*(X-Dmin)/(Dmax-Dmin)`

Input values are real numbers, as in this multiplication table:
```
1 1 1  (1 x 1 = 1)
1 2 2  (1 x 2 = 2)
.
.
.
2 3 6  (2 x 3 = 6)
.
.
5 5 25 (5 x 5 = 25)
```

I'd like to know how to un-scale the output data to get the real answers back.

Thank you

Thanks to this answer I've started my scaling phase, though I'm still searching for good normalization formulas.
One can find many papers on neural networks, but very few on scaling and un-scaling your data.

I've started coding from the article
Back-propagation Neural Net[^]
where someone implemented the multiplication table solved by the backpropagation algorithm.

I keep on my search,
I'm using:
```
(double)LO + (HI - LO) * ((X - minX) / (maxX - minX));
```

to scale, where:

LO = -1
HI = 1
minX = 1
maxX = 25 (last result in my multiplication table)
X = input to scale

I need to find out how to un-scale, and whether I'm using the right scaling method.

Well, thanks again
Updated 3-Feb-10 10:41am

## Solution 1

Scale the answers down when training so they fall within the range of your sigmoid/squashing function, then scale the network's output back up by the same factor. This acts as a constant weight on the output neuron.
BPNs are poorly suited to this task because the result set is limited by the weights, but it is a good way to watch the network learn. I also trained it on a sine function and on logical operations.

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

