In this chapter we introduce a general model of (artificial) neural networks that captures (more or less) all of the special forms that we consider in the following chapters. We start by defining the structure of an (artificial) neural network, then describe in general terms its operation, and finally its training.
A loop is an edge (connection) from a vertex to itself, that is, an edge \(e = (v,v)\) with a vertex \(v \in V\).
A topological ordering is a numbering of the vertices of a directed graph such that every edge leads from a vertex with a lower number to a vertex with a higher number. A topological ordering exists only for acyclic graphs, that is, for feed forward networks. For feed forward networks, updating the neurons in a topological ordering ensures that all inputs of a neuron are already available (have already been computed) before the neuron (re-)computes its own activation and output.
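A topological ordering can be computed with Kahn's algorithm: repeatedly output a vertex that has no remaining incoming edges and remove its outgoing edges. The sketch below (an illustration, not taken from the chapter) numbers the vertices of a small feed forward network in this way; if no such ordering exists, the graph contains a cycle.

```python
from collections import deque

def topological_order(vertices, edges):
    """Return a topological ordering of a directed graph (Kahn's
    algorithm), or None if the graph contains a directed cycle."""
    in_degree = {v: 0 for v in vertices}      # number of incoming edges
    successors = {v: [] for v in vertices}    # adjacency lists
    for u, v in edges:
        successors[u].append(v)
        in_degree[v] += 1

    # Vertices without incoming edges (e.g. input neurons) come first.
    queue = deque(v for v in vertices if in_degree[v] == 0)
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in successors[u]:               # remove u's outgoing edges
            in_degree[v] -= 1
            if in_degree[v] == 0:
                queue.append(v)

    # If some vertices were never emitted, a cycle blocked them.
    return order if len(order) == len(vertices) else None

# Hypothetical 2-1-1 feed forward network: inputs feed a hidden
# neuron, which feeds the output neuron.
vertices = ["in1", "in2", "hidden", "out"]
edges = [("in1", "hidden"), ("in2", "hidden"), ("hidden", "out")]
order = topological_order(vertices, edges)
```

Updating the neurons in the order returned here guarantees that `hidden` is computed only after `in1` and `in2`, and `out` only after `hidden`; adding a loop such as `("out", "hidden")` makes the function return `None`, signaling a recurrent network.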
The second formula is based on the maximum likelihood estimator for the variance of a normal distribution. In statistics the unbiased estimator is often preferred, which differs from the one used above only by dividing by \(|L|-1\) instead of \(|L|\). For the normalization this difference is negligible.
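The difference between the two estimators can be seen directly: both divide the same sum of squared deviations, once by the sample size and once by the sample size minus one. A minimal sketch (illustrative only; the variable names are not from the chapter):

```python
def variance_estimates(values):
    """Return the maximum likelihood (biased) and the unbiased
    variance estimators for a sample of values."""
    n = len(values)
    mean = sum(values) / n
    squared_deviations = sum((x - mean) ** 2 for x in values)
    ml_variance = squared_deviations / n          # divide by |L|
    unbiased_variance = squared_deviations / (n - 1)  # divide by |L| - 1
    return ml_variance, unbiased_variance
```

For a large sample the two results differ only by the factor \(n/(n-1)\), which is close to 1, so either can be used for normalizing the training data.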
- General Neural Networks
- Springer London
- Chapter 4