A performance comparison of trained multilayer perceptrons and trained classification trees
Article Abstract:
Multilayer Perceptrons (MLPs) and trained classification trees can both perform nonlinear classification, but no theoretical analysis establishes the superiority of either method. MLPs and classification and regression trees (CART) are therefore tested empirically on three problems. The first problem requires each method to forecast electric power system loads in the Seattle/Tacoma area: the MLP produces a 1.39 percent error, while CART's error is 2.86 percent. The second test requires both methods to determine solutions for power system security; the difference between the MLP's 0.78 percent error rate and CART's 1.46 percent error rate is statistically significant. The third test requires speaker-independent vowel classification based on a single spectral slice: the MLP achieves a 47.4 percent classification rate, while CART achieves only 38.2 percent without using linear combinations.
Publication Name: Proceedings of the IEEE
Subject: Electronics
ISSN: 0018-9219
Year: 1990
On the decision regions of multilayer Perceptrons
Article Abstract:
The capabilities of two-layer Perceptrons are investigated. Multilayer Perceptrons are computational devices composed of nodes. Each node receives inputs and multiplies them by a set of weights; the weighted values are summed with a constant, and the node's output is determined by a nonlinear function of that sum. For a one-dimensional input space, two-layer Perceptrons have the same capabilities as higher-order Perceptrons, but the decision regions yielded by two-layer structures are more limited than those yielded by three-layer Perceptrons. The significance of these limitations is unknown.
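The node computation the abstract describes (inputs weighted, summed with a constant, and passed through a nonlinear function) can be sketched in plain Python. The tanh activation and the particular weights below are illustrative assumptions, not values taken from the paper:

```python
import math

def node(inputs, weights, bias):
    """One Perceptron node: a weighted sum of the inputs plus a
    constant, passed through a nonlinear (here, tanh) function."""
    s = bias + sum(w * x for w, x in zip(weights, inputs))
    return math.tanh(s)

def two_layer_perceptron(inputs, hidden_layer, output_node):
    """Two-layer structure: a layer of hidden nodes whose outputs
    feed a single output node. Each node is a (weights, bias) pair."""
    hidden = [node(inputs, w, b) for w, b in hidden_layer]
    w_out, b_out = output_node
    return node(hidden, w_out, b_out)

# Example: two hidden nodes on a two-dimensional input
# (weights chosen arbitrarily for illustration).
y = two_layer_perceptron(
    [0.5, -1.0],
    hidden_layer=[([1.0, 2.0], 0.1), ([-1.5, 0.5], 0.0)],
    output_node=([1.0, -1.0], 0.2),
)
```

The sign of the output can then be thresholded to assign one of two classes, which is what carves the input space into the decision regions the abstract analyzes.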
Publication Name: Proceedings of the IEEE
Subject: Electronics
ISSN: 0018-9219
Year: 1990
Nearest neighbor pattern classification Perceptrons
Article Abstract:
A three-layer Perceptron implementing the nearest neighbor pattern classification rule can be designed without training algorithms, because the training data can be incorporated directly into the network's construction. Such a Perceptron has advantages over traditionally trained neural networks for pattern classification: backpropagation in a single-layer Perceptron does not classify all linearly separable families, and the perceptron learning procedure can be used only in single-layer networks.
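The nearest neighbor rule that the abstract builds into the network's weights can be stated independently of any network. This plain-Python sketch shows the classification rule itself; the Euclidean distance metric and the example prototypes are assumptions for illustration, not details from the paper:

```python
import math

def nearest_neighbor_classify(x, prototypes):
    """Assign x the label of the closest stored prototype.
    A designed Perceptron encodes this rule directly in its
    weights rather than discovering it through training."""
    def dist(a, b):
        return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))
    best_label, _ = min(((label, dist(x, p)) for p, label in prototypes),
                        key=lambda t: t[1])
    return best_label

# Two stored training patterns with their class labels.
prototypes = [((0.0, 0.0), "A"), ((1.0, 1.0), "B")]
label = nearest_neighbor_classify((0.2, 0.1), prototypes)
```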
Publication Name: Proceedings of the IEEE
Subject: Electronics
ISSN: 0018-9219
Year: 1990