Artificial Neural Networks made easy with the FANN library

Software Developer's Journal, 28 Aug 2013, CPOL
Neural networks are typically associated with specialised applications, developed only by select groups of experts. This misconception has had a highly negative effect on their popularity. Hopefully, the FANN library will help fill this gap.

Downloads:
- fann-1_2_0.zip: the FANN 1.2.0 source tree, including Debian packaging files, the complete documentation (fann_doc_complete_1.0.pdf plus HTML), benchmark datasets (building, diabetes, gene, mushroom, robot, soybean, thyroid, two-spiral, pumadyn-32fm, parity8, and parity13), examples, Python bindings, and MSVC++ project files
- fann_win32_dll-1_2_0.zip: the same tree together with prebuilt Win32 DLLs and import libraries for the float, double, and fixed-point builds
- vs_net2003.zip: VS.NET 2003 project files

Chapter 2. Advanced Usage

This section describes some of the low-level functions and how they can be used to gain more control over the fann library. For a full list of functions, please see the API Reference, which has an explanation of all the fann library functions. Also feel free to take a look at the source code.

This section describes different procedures that can help you get more power out of the fann library: Adjusting Parameters, Network Design, Understanding the Error Value, and Training and Testing.

2.1. Adjusting Parameters

An ANN has several different parameters. These parameters are given defaults in the fann library, but they can be adjusted at runtime. There is no sense in adjusting most of these parameters after training, since that would invalidate the training, but it does make sense to adjust some of them during training, as will be described in Training and Testing. Generally speaking, these are parameters that should be adjusted before training.

The learning rate is one of the most important parameters, but unfortunately it is also a parameter for which it is hard to find a reasonable default. I (SN) have several times ended up using 0.7, but it is a good idea to test several different learning rates when training a network. It is also worth noting that the activation function has a profound effect on the optimal learning rate [Thimm and Fiesler, 1997]. The learning rate can be set when creating the network, but it can also be set with the fann_set_learning_rate function.
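
As a concrete sketch (this assumes the FANN 1.2 API distributed with this article, in which fann_create takes the connection rate and the learning rate before the layer sizes; the layer sizes below are only an example):

#include "fann.h"

int main(void)
{
    /* A fully connected network (connection rate 1.0) created with a
       learning rate of 0.7: 2 inputs, 4 hidden neurons, 1 output. */
    struct fann *ann = fann_create(1.0f, 0.7f, 3, 2, 4, 1);

    /* The learning rate can also be changed afterwards, for instance
       when experimenting with several values. */
    fann_set_learning_rate(ann, 0.5f);

    fann_destroy(ann);
    return 0;
}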

The initial weights are random values between -0.1 and 0.1. If other weights are preferred, the weights can be altered with the fann_randomize_weights or fann_init_weights function.

In [Thimm and Fiesler, High-Order and Multilayer Perceptron Initialization, 1997], Thimm and Fiesler state that, "An (sic) fixed weight variance of 0.2, which corresponds to a weight range of [-0.77, 0.77], gave the best mean performance for all the applications tested in this study. This performance is similar or better as compared to those of the other weight initialization methods."
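
Both approaches can be sketched as follows (the [-0.77, 0.77] range follows the Thimm and Fiesler recommendation above; xor.data is the training file from the examples directory, and ann is assumed to be an existing network):

/* Draw the initial weights uniformly from [-0.77, 0.77] instead of
   the default [-0.1, 0.1]. */
fann_randomize_weights(ann, -0.77f, 0.77f);

/* Alternatively, let the library derive suitable initial weights
   from the training data. */
struct fann_train_data *data = fann_read_train_from_file("xor.data");
fann_init_weights(ann, data);
fann_destroy_train(data);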

The standard activation function is the sigmoid activation function, but it is also possible to use the threshold activation function. A list of the currently available activation functions is available in the Activation Functions section. The activation functions are chosen using the fann_set_activation_function_hidden and fann_set_activation_function_output functions.

These two functions set the activation function for the hidden layers and for the output layer. Likewise, the steepness parameter used in the sigmoid function can be adjusted with the fann_set_activation_steepness_hidden and fann_set_activation_steepness_output functions.
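
For example (a sketch; the steepness values here are arbitrary):

/* Explicitly select the sigmoid for both the hidden layers and the
   output layer (FANN_SIGMOID and FANN_THRESHOLD are among the
   constants listed in the Activation Functions section). */
fann_set_activation_function_hidden(ann, FANN_SIGMOID);
fann_set_activation_function_output(ann, FANN_SIGMOID);

/* A higher steepness gives a more abrupt, step-like sigmoid. */
fann_set_activation_steepness_hidden(ann, 1.0);
fann_set_activation_steepness_output(ann, 0.5);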

FANN distinguishes between the hidden layers and the output layer to allow more flexibility. This is especially useful for users who want discrete output from the network, since they can set the activation function for the output to threshold. Please note that it is not possible to train a network while the threshold activation function is in use, because it is not differentiable.
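
One workable pattern, sketched under the same assumptions as above, is to train with a differentiable sigmoid on the output layer and only then switch to threshold, so that fann_run produces discrete 0/1 output (the training parameters and file name are only illustrative):

/* Train while the output activation function is still differentiable. */
fann_set_activation_function_output(ann, FANN_SIGMOID);
fann_train_on_file(ann, "xor.data", 100000, 1000, 0.001f);

/* Switch to threshold afterwards for discrete output. */
fann_set_activation_function_output(ann, FANN_THRESHOLD);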

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

About the Author

Software Developer's Journal
Publisher
Poland
Software Developer's Journal (formerly Software 2.0) is a magazine for professional programmers and developers, publishing news from the software world and practical articles presenting ready-made programming solutions.

Article Copyright 2006 by Software Developer's Journal