
Extensions of a Theory of Networks for Approximation and Learning: Dimensionality Reduction and Clustering

The theory developed in Poggio and Girosi (1989) shows the equivalence between regularization and a class of three-layer networks that we call regularization networks or Hyper Basis Functions. These networks are also closely related to the classical Radial Basis Functions used for interpolation tasks and to several pattern recognition and neural network algorithms. In this note, we extend the theory by defining a general form of these networks with two sets of modifiable parameters in addition to the coefficients $c_\alpha$: moving centers and adjustable norm weights.
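
As a rough sketch of the expansion the abstract refers to (the symbols $G$, $t_\alpha$, and $W$ follow the standard Hyper Basis Function formulation in the related literature and are not spelled out in this record), the network output can be written as

$$ f(x) = \sum_{\alpha=1}^{n} c_\alpha \, G\!\left(\|x - t_\alpha\|_W^2\right), \qquad \|x - t_\alpha\|_W^2 = (x - t_\alpha)^\top W^\top W \,(x - t_\alpha), $$

where the coefficients $c_\alpha$, the moving centers $t_\alpha$, and the norm-weight matrix $W$ are all modifiable parameters.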

Identifier: oai:union.ndltd.org:MIT/oai:dspace.mit.edu:1721.1/6014
Date: 01 April 1990
Creators: Poggio, Tomaso; Girosi, Federico
Source Sets: M.I.T. Theses and Dissertations
Language: en_US
Detected Language: English
Format: 18 p., 2271885 bytes, 901116 bytes, application/postscript, application/pdf
Relation: AIM-1167
