Analysis of Simple Neural Networks

Chedsada Chinrungrueng

EECS Department
University of California, Berkeley
Technical Report No. UCB/CSD-88-482
December 1988

http://www2.eecs.berkeley.edu/Pubs/TechRpts/1988/CSD-88-482.pdf

In a layered feed-forward network the error surface with respect to a desired goal function completely describes the potential of that network to carry out a particular task. Such an error surface can be obtained by plotting the maximum error or the sum of the squares of the errors, where the individual errors are the deviations of the actual output of the network from the desired goal function. For a network with W individual (synaptic) weights, these error surfaces are W-dimensional surfaces embedded in a (W + 1)-dimensional space and defined over the W-dimensional domain determined by the ranges of the W weights.
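For concreteness, the sum-of-squares error surface described above can be sampled numerically. The report does not specify the unit nonlinearity here; the sketch below assumes sigmoid units and a 2-2-1 feed-forward network for the two-input Exclusive-OR task (W = 9 weights, including biases), so each weight vector w maps to one point on the 9-dimensional error surface:

```python
import itertools
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def xor_net(x1, x2, w):
    """Output of a 2-2-1 feed-forward network with sigmoid units.

    w holds 9 weights: [w11, w12, b1, w21, w22, b2, v1, v2, c],
    i.e. weights and bias for each of two hidden units, then
    weights and bias for the single output unit. (This layout is
    an illustrative assumption, not taken from the report.)
    """
    h1 = sigmoid(w[0] * x1 + w[1] * x2 + w[2])
    h2 = sigmoid(w[3] * x1 + w[4] * x2 + w[5])
    return sigmoid(w[6] * h1 + w[7] * h2 + w[8])

def sse(w):
    """One point on the error surface: the sum of squared deviations
    of the network output from the XOR goal function, over all four
    input patterns."""
    total = 0.0
    for x1, x2 in itertools.product([0, 1], repeat=2):
        total += (xor_net(x1, x2, w) - (x1 ^ x2)) ** 2
    return total

# At the origin (all weights zero) every output is sigmoid(0) = 0.5,
# so each of the four squared errors is 0.25 and the surface height is 1.0.
print(sse([0.0] * 9))
```

Sweeping two of the nine weights over a grid while holding the rest fixed gives a two-dimensional slice of the surface, which is how the local minima and valleys discussed below can be visualized.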

We have studied in detail the error surfaces of a few small networks that we believe exhibit some of the typical characteristics also found in large networks. For the chosen networks we have determined the inherent symmetries of their associated error surfaces, the shape of the region around the origin, and the numbers and types of local minima. For each type of local minimum, we examined its multiplicity, the fractional coverage of the total domain space by its collection zone, and the approximate shape of the valleys leading into it. This study of small three-layered networks performing the Exclusive-OR function of two or three inputs provides some insight into the general structures of the error surfaces and thus the capabilities of more complicated networks.


BibTeX citation:

@techreport{Chinrungrueng:CSD-88-482,
    Author = {Chinrungrueng, Chedsada},
    Title = {Analysis of Simple Neural Networks},
    Institution = {EECS Department, University of California, Berkeley},
    Year = {1988},
    Month = {Dec},
    URL = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/1988/6063.html},
    Number = {UCB/CSD-88-482},
    Abstract = {In a layered feed-forward network the error surface with respect to a desired goal function completely describes the potential of that network to carry out a particular task. Such an error surface can be obtained by plotting the maximum error or the sum of the squares of the errors, where the individual errors are the deviations of the actual output of the network from the desired goal function. For a network with \emph{W} individual (synaptic) weights, these error surfaces are \emph{W}-dimensional surfaces embedded in a (\emph{W} + 1)-dimensional space and defined over the \emph{W}-dimensional domain determined by the ranges of the \emph{W} weights. \par We have studied in detail the error surfaces of a few small networks that we believe exhibit some of the typical characteristics also found in large networks. For the chosen networks we have determined the inherent symmetries of their associated error surfaces, the shape of the region around the origin, and the numbers and types of local minima. For each type of local minimum, we examined its multiplicity, the fractional coverage of the total domain space by its collection zone, and the approximate shape of the valleys leading into it. This study of small three-layered networks performing the Exclusive-OR function of two or three inputs provides some insight into the general structures of the error surfaces and thus the capabilities of more complicated networks.}
}

EndNote citation:

%0 Report
%A Chinrungrueng, Chedsada
%T Analysis of Simple Neural Networks
%I EECS Department, University of California, Berkeley
%D 1988
%@ UCB/CSD-88-482
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/1988/6063.html
%F Chinrungrueng:CSD-88-482