Abstracts - faqs.org

Electronics

A performance comparison of trained multilayer perceptrons and trained classification trees

Article Abstract:

Multilayer perceptrons (MLPs) and trained classification trees can both perform nonlinear classification, but no theoretical analysis establishes the superiority of either method. MLPs and classification and regression trees (CARTs) are therefore tested empirically on three problems. The first requires each method to forecast electric power system loads in the Seattle/Tacoma area: MLP produces a 1.39 percent error, while CART's error is 2.86 percent. The second requires both methods to determine efficient solutions for power system security; the difference between MLP's 0.78 percent error rate and CART's 1.46 percent error rate is statistically significant. The third requires speaker-independent vowel classification based on a single spectral slice: MLP achieves a 47.4 percent correct-classification rate, while CART reaches only 38.2 percent without using linear combinations.
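A toy sketch of the abstract's central point, that both model families can realize nonlinear decision boundaries. This is purely illustrative (not the paper's experiments): the weights and splits below are hand-chosen so that a two-layer perceptron and a depth-2 classification tree each compute XOR, a function no single linear boundary can separate.

```python
def mlp_xor(x1, x2):
    """Two-layer perceptron with a hard-threshold nonlinearity."""
    step = lambda s: 1 if s > 0 else 0
    h1 = step(x1 + x2 - 0.5)        # fires when at least one input is 1
    h2 = step(x1 + x2 - 1.5)        # fires when both inputs are 1
    return step(h1 - 2 * h2 - 0.5)  # output unit: h1 AND NOT h2

def tree_xor(x1, x2):
    """Depth-2 classification tree with axis-aligned splits."""
    if x1 < 0.5:
        return 0 if x2 < 0.5 else 1
    return 1 if x2 < 0.5 else 0

# Both nonlinear classifiers agree with XOR on all four inputs.
for a in (0, 1):
    for b in (0, 1):
        assert mlp_xor(a, b) == tree_xor(a, b) == (a ^ b)
```

The tree carves the plane into axis-aligned boxes, while the perceptron intersects oblique half-planes; the paper's comparison asks which family's regions better fit real data.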

Author: Atlas, Les; Cole, Ronald; Muthusamy, Yeshwant; Lippman, Alan; Connor, Jerome; Park, Dong; El-Sharkawi, Mohamed; Marks, Robert J., II
Publisher: Institute of Electrical and Electronics Engineers, Inc.
Publication Name: Proceedings of the IEEE
Subject: Electronics
ISSN: 0018-9219
Year: 1990
Applications, Comparative Study, Classification Systems


On the decision regions of multilayer Perceptrons

Article Abstract:

The capabilities of two-layer perceptrons are investigated. A multilayer perceptron is a computational device composed of nodes: each node multiplies its inputs by a set of weights, sums the weighted values with a constant, and produces its output by applying a nonlinear function to the sum. For a one-dimensional input space, two-layer perceptrons have the capabilities of higher-order perceptrons, but the decision regions yielded by two-layer structures are more limited than those yielded by three-layer perceptrons. The practical significance of these limitations is unknown.
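The node computation described above can be sketched directly; this is a minimal illustration with a logistic nonlinearity assumed for concreteness (the abstract does not specify which nonlinear function is used).

```python
import math

def node(inputs, weights, bias):
    """One perceptron node: weighted sum of inputs plus a constant,
    passed through a nonlinear (here, logistic sigmoid) function."""
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-s))

def two_layer(x, hidden, output):
    """A two-layer perceptron: `hidden` and `output` are lists of
    (weights, bias) pairs; the second layer takes the first layer's
    outputs as its inputs."""
    h = [node(x, w, b) for w, b in hidden]
    return [node(h, w, b) for w, b in output]

# Example: two hidden nodes feeding one output node.
y = two_layer([1.0, 0.0],
              hidden=[([2.0, -1.0], 0.5), ([1.0, 1.0], -1.0)],
              output=[([1.0, -1.0], 0.0)])
```

Each hidden node's decision boundary is a hyperplane; the output layer combines half-spaces, which is what limits the decision regions a two-layer structure can form relative to a three-layer one.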

Author: Gibson, Gavin J.; Cowan, Colin F.N.
Publisher: Institute of Electrical and Electronics Engineers, Inc.
Publication Name: Proceedings of the IEEE
Subject: Electronics
ISSN: 0018-9219
Year: 1990
Decision theory, Mathematics of Computing, Algorithm Analysis, Mathematical Proofs


Nearest neighbor pattern classification Perceptrons

Article Abstract:

A three-layer perceptron that implements the nearest neighbor pattern classification rule can be designed without any training algorithm, because the training data are incorporated directly into the network's weights. Such a perceptron has advantages over traditionally trained neural networks for pattern classification: backpropagation in a single-layer perceptron does not classify all linearly separable families, and the perceptron learning procedure can only be used in single-layer networks.
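The classification rule being built into the network is the 1-nearest-neighbor rule, sketched below on its own (this is the rule itself, not Murphy's network construction). Note why perceptron layers can realize it: comparing squared distances ||x - p||^2 across prototypes p cancels the common ||x||^2 term, leaving linear functions of x, so layers of linear-threshold units suffice.

```python
def nearest_neighbor(x, prototypes):
    """1-NN rule: return the label of the closest stored prototype.
    `prototypes` is a list of (vector, label) pairs; the training
    data appear only as stored vectors, so no learning step is run."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    vec, label = min(prototypes, key=lambda p: dist2(x, p[0]))
    return label

# Example: two stored prototypes, one query point near class 'A'.
protos = [((0.0, 0.0), 'A'), ((1.0, 1.0), 'B')]
assert nearest_neighbor((0.2, 0.1), protos) == 'A'
```

In the network version, each stored prototype corresponds to fixed first-layer weights, and the upper layers select the minimum-distance prototype's label.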

Author: Murphy, Owen J.
Publisher: Institute of Electrical and Electronics Engineers, Inc.
Publication Name: Proceedings of the IEEE
Subject: Electronics
ISSN: 0018-9219
Year: 1990
Pattern recognition (Computers), Pattern Recognition, Computer Learning, Methods


Subjects list: Neural networks, technical, Neural Network, Perceptrons
Similar abstracts:
  • Abstracts: A comparison of homogeneous hierarchical interconnection structures. The Special Issue on Open Systems Interconnection (OSI)- New International Standards Architecture and Protocols
  • Abstracts: Applications of VLSI circuits to medical imaging. Strained layer heterostructures, and their applications to MODFET's, HBT's, and lasers
  • Abstracts: Localized wave representations of acoustic and electromagnetic radiation
  • Abstracts: 30 years of adaptive neural networks: Perceptron, Madaline, and backpropagation. CMAC: an associative neural network alternative to backpropagation
  • Abstracts: CMAC: an associative neural network alternative to backpropagation. Backpropagation through time: what it does and how to do it