Artificial Neural Networks made easy with the FANN library

28 Aug 2013
Neural networks are typically associated with specialised applications, developed only by select groups of experts. This misconception has had a highly negative effect on their popularity. Hopefully, the FANN library will help to change that.

Downloads

fann-1_2_0.zip - FANN 1.2.0 source distribution (library sources, complete PDF and HTML documentation, examples including xor.data, benchmark datasets, Python bindings, and MSVC++ project files)
fann_win32_dll-1_2_0.zip - the same distribution plus precompiled Win32 DLLs and import libraries (float, double and fixed-point builds, single- and multi-threaded, debug and release)
vs_net2003.zip - VS.NET 2003 project files
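
The examples directory in the source archive ships with an XOR training set (xor.data) and matching example projects (xor_train, simple_train). As a rough sketch of what such an example looks like, assuming the fann_create(connection_rate, learning_rate, num_layers, ...) signature of the FANN 1.x C API, a minimal training program could be:

    #include "fann.h"

    int main(void)
    {
        /* Fully connected network (connection rate 1.0, learning rate 0.7)
           with 2 inputs, 3 hidden neurons and 1 output. */
        struct fann *ann = fann_create(1.0f, 0.7f, 3, 2, 3, 1);

        /* Train on the XOR data until the mean square error drops below
           0.0001, printing a status report every 1000 epochs. */
        fann_train_on_file(ann, "xor.data", 500000, 1000, 0.0001f);

        /* Save the trained network; it can be reloaded later with
           fann_create_from_file(). */
        fann_save(ann, "xor_float.net");
        fann_destroy(ann);
        return 0;
    }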

Training an ANN
(Section 4.3 of the Fast Artificial Neural Network Library documentation, Chapter 4: Neural Network Theory.)

When training an ANN with a set of input and output data, we wish to adjust the weights in the ANN so that it gives the same outputs as seen in the training data. On the other hand, we do not want to make the ANN too specific, giving precise results for the training data but incorrect results for all other data. When this happens, we say that the ANN has been over-fitted.
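
A simple way to detect over-fitting is to measure the error on data the network has never seen. The sketch below is only illustrative: it assumes the FANN C API calls fann_read_train_from_file, fann_test_data and fann_get_MSE, and the file names train.data and test.data are placeholders.

    #include <stdio.h>
    #include "fann.h"

    int main(void)
    {
        struct fann *ann = fann_create(1.0f, 0.7f, 3, 2, 4, 1);

        /* Fit the weights to the training file. */
        fann_train_on_file(ann, "train.data", 100000, 1000, 0.001f);
        printf("MSE on training data: %f\n", fann_get_MSE(ann));

        /* A much larger error on held-out data is a sign of over-fitting. */
        struct fann_train_data *test = fann_read_train_from_file("test.data");
        printf("MSE on test data:     %f\n", fann_test_data(ann, test));

        fann_destroy_train(test);
        fann_destroy(ann);
        return 0;
    }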

The training process can be seen as an optimization problem, where we wish to minimize the mean square error over the entire set of training data. This problem can be solved in many different ways, ranging from standard optimization heuristics like simulated annealing, through more specialized optimization techniques like genetic algorithms, to dedicated gradient descent algorithms like backpropagation.
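
Written out, the quantity being minimised is simply the average squared difference between the desired and the actual outputs over all training patterns. The symbols below are chosen for illustration (d is the desired output, y the network output, N the number of patterns, K the number of output neurons); this is the standard definition, not a statement about FANN's exact internal normalisation:

    \mathrm{MSE} = \frac{1}{NK} \sum_{p=1}^{N} \sum_{k=1}^{K} \bigl(d_{p,k} - y_{p,k}\bigr)^{2}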

The most commonly used algorithm is backpropagation, but it has some limitations concerning the size of the adjustment made to the weights in each iteration. This problem has been addressed in more advanced algorithms like RPROP [Riedmiller and Braun, 1993] and quickprop [Fahlman, 1988].
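
In FANN the training algorithm can be chosen at run time. A minimal sketch, assuming fann_set_training_algorithm and the FANN_TRAIN_RPROP / FANN_TRAIN_QUICKPROP constants from the FANN C API, with train.data again a placeholder file name:

    #include "fann.h"

    int main(void)
    {
        struct fann *ann = fann_create(1.0f, 0.7f, 3, 2, 4, 1);

        /* Use RPROP rather than plain incremental backpropagation;
           FANN_TRAIN_QUICKPROP, FANN_TRAIN_BATCH and FANN_TRAIN_INCREMENTAL
           are selected the same way. */
        fann_set_training_algorithm(ann, FANN_TRAIN_RPROP);

        fann_train_on_file(ann, "train.data", 100000, 1000, 0.001f);

        fann_destroy(ann);
        return 0;
    }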

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

About the Author

Software Developer's Journal
Publisher
Poland
Software Developer's Journal (formerly Software 2.0) is a magazine for professional programmers and developers, publishing news from the software world and practical articles that present ready-to-use programming solutions.

Article Copyright 2006 by Software Developer's Journal