<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<HTML
><HEAD
><TITLE
>Training an ANN</TITLE
><link href="../style.css" rel="stylesheet" type="text/css"><META
NAME="GENERATOR"
CONTENT="Modular DocBook HTML Stylesheet Version 1.7"><LINK
REL="HOME"
TITLE="Fast Artificial Neural Network Library"
HREF="index.html"><LINK
REL="UP"
TITLE="Neural Network Theory"
HREF="c225.html"><LINK
REL="PREVIOUS"
TITLE="Artificial Neural Networks"
HREF="x241.html"><LINK
REL="NEXT"
TITLE="API Reference"
HREF="c253.html"></HEAD
><BODY
CLASS="section"
BGCOLOR="#FFFFFF"
TEXT="#000000"
LINK="#0000FF"
VLINK="#840084"
ALINK="#0000FF"
><DIV
CLASS="NAVHEADER"
><TABLE
SUMMARY="Header navigation table"
WIDTH="100%"
BORDER="0"
CELLPADDING="0"
CELLSPACING="0"
><TR
><TH
COLSPAN="3"
ALIGN="center"
>Fast Artificial Neural Network Library</TH
></TR
><TR
><TD
WIDTH="10%"
ALIGN="left"
VALIGN="bottom"
><A
HREF="x241.html"
ACCESSKEY="P"
>Prev</A
></TD
><TD
WIDTH="80%"
ALIGN="center"
VALIGN="bottom"
>Chapter 4. Neural Network Theory</TD
><TD
WIDTH="10%"
ALIGN="right"
VALIGN="bottom"
><A
HREF="c253.html"
ACCESSKEY="N"
>Next</A
></TD
></TR
></TABLE
><HR
ALIGN="LEFT"
WIDTH="100%"></DIV
><DIV
CLASS="section"
><H1
CLASS="section"
><A
NAME="theory.training"
>4.3. Training an ANN</A
></H1
><P
> When training an ANN on a set of input and output data, we wish to adjust the weights so that the
ANN produces the same outputs as those seen in the training data. At the same time, we do not want the ANN
to become too specific, giving precise results for the training data but incorrect results for all other
data. When this happens, we say that the ANN has been over-fitted.
</P
><P
> The training process can be seen as an optimization problem, where we wish to minimize the mean square
error of the entire set of training data. This problem can be solved in many different ways, ranging from
standard optimization heuristics such as simulated annealing, through more specialized techniques such as
genetic algorithms, to gradient descent algorithms such as backpropagation.
</P
><P
> The most widely used algorithm is backpropagation, but it has some limitations concerning how much
the weights are adjusted in each iteration. This problem has been solved in more
advanced algorithms like RPROP [<A
HREF="b3048.html#bib.riedmiller_1993"
><I
>Riedmiller and Braun, 1993</I
></A
>]
and quickprop [<A
HREF="b3048.html#bib.fahlman_1988"
><I
>Fahlman, 1988</I
></A
>].
</P
></DIV
><DIV
CLASS="NAVFOOTER"
><HR
ALIGN="LEFT"
WIDTH="100%"><TABLE
SUMMARY="Footer navigation table"
WIDTH="100%"
BORDER="0"
CELLPADDING="0"
CELLSPACING="0"
><TR
><TD
WIDTH="33%"
ALIGN="left"
VALIGN="top"
><A
HREF="x241.html"
ACCESSKEY="P"
>Prev</A
></TD
><TD
WIDTH="34%"
ALIGN="center"
VALIGN="top"
><A
HREF="index.html"
ACCESSKEY="H"
>Home</A
></TD
><TD
WIDTH="33%"
ALIGN="right"
VALIGN="top"
><A
HREF="c253.html"
ACCESSKEY="N"
>Next</A
></TD
></TR
><TR
><TD
WIDTH="33%"
ALIGN="left"
VALIGN="top"
>Artificial Neural Networks</TD
><TD
WIDTH="34%"
ALIGN="center"
VALIGN="top"
><A
HREF="c225.html"
ACCESSKEY="U"
>Up</A
></TD
><TD
WIDTH="33%"
ALIGN="right"
VALIGN="top"
>API Reference</TD
></TR
></TABLE
></DIV
></BODY
></HTML
>