Neural Networks
- Article
- Title - "Biologically Inspired Models to Train
Neural Networks", Neural Computing & Applications, Springer-Verlag
London Ltd, ISSN: 0941-0643, Volume 11, Numbers 3-4, June 2003, pp.
191-202, DOI: 10.1007/s00521-003-0350-7.
- Direct quote of the Editor's (Dr. Larry Medsker, American University) comments - this paper addresses several important problems in applying neural nets.
- Uniformity of fit: The usual backpropagation algorithm minimizes mean squared error on a training set, but says nothing about how well the learned function matches individual training points. The result may be a function that performs outside tolerance over part of its required domain. Lyons' error constraints algorithm adds error bounds at each training point to the learning algorithm (a formal statement of these bounds follows the comments below). This ensures that the solution uniformly approximates the training set. In practice this means that by choosing a sufficiently dense training set over the domain of interest, one could guarantee that a neural net performs within tolerance.
- Robustness of solution: The uniformity of fit achieved by the error constraints algorithm, together with the bounds on the individual weights included in the error minimization algorithm, results in solutions with good point-value agreement and relatively small derivatives. These properties improve the behavior of the network on the boundary of the training set. This is important because, once in production, unless care is taken, data outside a region well covered by the training set could be submitted to the trained network.
- Determining the right number of hidden layer nodes: The
error minimization algorithm presented in Lyons' paper provides a
practical way to address this problem, for which guidance has in the
past been confined to rules of thumb.
- Existing software: Finally, Lyons' algorithms are important because they are easy to incorporate into existing software and can readily be put into practice.
- Practice: This paper makes a real contribution to the
practice of using neural nets.
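- A formal reading of the uniformity-of-fit property above, sketched here in notation that is illustrative rather than taken from the paper: f is the trained network with weights w, and each training point (x_i, t_i) carries a tolerance epsilon_i.

    \[
      \lvert f(x_i; w) - t_i \rvert \le \varepsilon_i
      \qquad \text{for all training points } i = 1, \dots, N.
    \]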
- Training Methods
- Error Minimization Model
- Based on the traditional model of adjusting the weights of a neural network to minimize the sum of the squares of the deviations of the target and computed output.
- Also includes constraints on the size of the weights.
- The model is used in an iterative fashion with successively larger limits on the weights to determine a solution (see the sketch after this list).
- The process of letting the weights grow larger mimics the process in biological systems where the strength of the connection between neurons develops over time.
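- A minimal sketch of the error minimization model, assuming a single-hidden-layer sigmoid network and using scipy's SLSQP routine in place of the paper's Excel Solver setup; every name below (forward, sse, train_error_minimization, the particular weight limits) is an illustrative assumption, not taken from the paper.

    import numpy as np
    from scipy.optimize import minimize

    def forward(w, x, n_hidden):
        """Single-hidden-layer net with sigmoid hidden units (illustrative)."""
        n_in = x.shape[1]
        w1 = w[: n_in * n_hidden].reshape(n_in, n_hidden)
        b1 = w[n_in * n_hidden : n_in * n_hidden + n_hidden]
        w2 = w[n_in * n_hidden + n_hidden : -1]
        b2 = w[-1]
        h = 1.0 / (1.0 + np.exp(-(x @ w1 + b1)))  # sigmoid hidden layer
        return h @ w2 + b2                        # linear output node

    def sse(w, x, t, n_hidden):
        """Sum of squared deviations of target and computed output."""
        return np.sum((forward(w, x, n_hidden) - t) ** 2)

    def train_error_minimization(x, t, n_hidden, limits=(0.5, 1, 2, 4, 8)):
        """Minimize the SSE under bounds on each weight, re-solving with
        successively larger limits and warm-starting from the last solution."""
        rng = np.random.default_rng(0)
        n_w = x.shape[1] * n_hidden + n_hidden + n_hidden + 1
        w = rng.uniform(-0.1, 0.1, n_w)
        for limit in limits:
            bounds = [(-limit, limit)] * n_w  # explicit constraint on weight size
            w = minimize(sse, w, args=(x, t, n_hidden),
                         method="SLSQP", bounds=bounds).x
        return w

    # Example: the XOR problem from the workbooks below, with two hidden nodes.
    x = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    t = np.array([0.0, 1.0, 1.0, 0.0])
    w = train_error_minimization(x, t, n_hidden=2)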
- Error Constraints Model
- Acceptable deviations of the target and computed output are explicitly specified as constraints, and the objective function is to minimize the sum of the squared weights.
- The rationale for this model is also biologically inspired. Since the strength of the connection (weight) between two neurons can be thought of as proportional to the activation voltage (Kosko), the square of the weight can be thought of as proportional to the energy. Thus, this model mimics what nature probably does: use the least amount of energy to encode a fixed amount of information.
- The model uses the same iterative process of successively larger limits on the weights (a companion sketch follows this list).
- Example - Diminishing Returns - see below.
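- A companion sketch of the error constraints model, reusing forward, minimize, and the iterative weight limits from the sketch above; the tolerance eps and the other names are again illustrative assumptions. Each training point contributes two one-sided inequality constraints so that the deviation stays within +/- eps, while the objective is the "energy", i.e. the sum of squared weights.

    def train_error_constraints(x, t, n_hidden, eps=0.1, limits=(0.5, 1, 2, 4, 8)):
        """Minimize the sum of squared weights subject to
        |output_i - target_i| <= eps at every training point (illustrative)."""
        rng = np.random.default_rng(0)
        n_w = x.shape[1] * n_hidden + n_hidden + n_hidden + 1
        w = rng.uniform(-0.1, 0.1, n_w)
        cons = []
        for i in range(len(t)):
            # -eps <= error_i and error_i <= eps, as two "ineq" constraints.
            cons.append({"type": "ineq",
                         "fun": lambda w, i=i: eps - (forward(w, x, n_hidden)[i] - t[i])})
            cons.append({"type": "ineq",
                         "fun": lambda w, i=i: eps + (forward(w, x, n_hidden)[i] - t[i])})
        for limit in limits:
            bounds = [(-limit, limit)] * n_w
            w = minimize(lambda w: np.sum(w ** 2), w, method="SLSQP",
                         bounds=bounds, constraints=cons).x
        return w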
- Workbooks
- Right-click any of the following links and use the Save Target As option to download the Excel workbook implementations of the corresponding models. If asked for authentication information, just click Cancel.
- These workbooks contain the full Solver model; however, some worksheets containing the run histories have been deleted to reduce download times. Please note that the Solver add-in must be installed in your copy of Excel in order to run these models.
- Links
- Error Minimization Model for XOR (88KB)
- Error Constraints Model for XOR (95KB)
- Error Minimization Model for Diminishing Returns (133KB)
- Error Constraints Model for Diminishing Returns (206KB)
- Applications
- Forecasting
- Production Control
- Publications - see Selected Articles
(This page was last edited on April 26, 2011.)