The authors discuss the architecture and training properties of a class of multilayer feedforward neural networks that use quadratic junctions, trained with the backpropagation learning algorithm given by P.J. Werbos (1989). Both the quadratic-junction architecture and the backpropagation procedure were adapted to endow the networks with appealing supervised-training properties and acceptable generalization. Complexity and learning aspects of this class are examined and compared with those of traditional networks that use linear junctions.
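The abstract does not specify the exact form of the quadratic junctions; a common formulation computes a quadratic form of the inputs before the activation, in contrast to the weighted sum of a linear junction. A minimal sketch, assuming a junction of the form sigmoid(x^T W x + w^T x + b), with illustrative (not the paper's) parameter names and values:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def quadratic_junction(x, W, w, b):
    """Assumed quadratic junction: sigmoid(x^T W x + w^T x + b)."""
    n = len(x)
    quad = sum(W[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
    lin = sum(w[i] * x[i] for i in range(n))
    return sigmoid(quad + lin + b)

def linear_junction(x, w, b):
    """Traditional linear junction: sigmoid(w^T x + b)."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# A single quadratic junction can realize XOR, which no single
# linear junction can, illustrating the added expressive power.
W = [[0.0, -20.0], [0.0, 0.0]]  # cross-term penalizes x1*x2
w = [10.0, 10.0]
b = -5.0
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, round(quadratic_junction(x, W, w, b)))  # 0, 1, 1, 0
```

The cross-term weight in W gives each unit a second-order decision surface, which is one way such networks can trade fewer units against more parameters per junction.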
Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, vol. 3, pp. 1557-1562.