Show simple item record

dc.contributor.author	Chiang, Hsiao-Dong	en_US
dc.contributor.author	Reddy, Chandan	en_US
dc.description.abstract	Supervised learning using artificial neural networks has numerous applications in various domains of science and engineering. Efficient training mechanisms in a neural network play a vital role in deciding the network architecture and the accuracy of the classifier. The most popular training algorithms tend to be greedy and hence get stuck at the nearest local minimum of the error surface. To overcome this problem, some global methods (such as multiple restarts, genetic algorithms, and simulated annealing) combine stochastic search with local methods to obtain an effective set of training parameters. Owing to their stochastic nature and lack of effective fine-tuning capability, these algorithms often fail to obtain an optimal set of training parameters. In this paper, a new method to improve the subspace parameter-search capability of training algorithms is proposed. This new method takes advantage of TRUST-TECH (TRansformation Under STability-reTaining Equilibrium CHaracterization) to compute the neighboring local minima of the error surface. The proposed approach obtains multiple locally optimal solutions surrounding the current locally optimal solution in a systematic manner. Empirical results on different machine learning datasets indicate that the proposed algorithm outperforms current algorithms available in the literature.	en_US
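The abstract's core idea — start from the local minimum a greedy trainer finds, then systematically explore the surrounding basins for better minima — can be loosely illustrated with a toy 2-D surface. This is only a multi-directional escape-and-refine sketch, not the TRUST-TECH algorithm itself (which follows stability boundaries of an associated dynamical system); all function names here are hypothetical.

```python
import numpy as np

def descend(f_grad, x0, lr=0.01, steps=2000):
    """Plain gradient descent -- stands in for a greedy local training method."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x -= lr * f_grad(x)
    return x

def neighborhood_minima(f, f_grad, x_star, radius=1.5, n_dirs=8):
    """Toy escape-and-refine loop: perturb the current minimum along several
    directions, then re-run the local method from each perturbed point and
    collect the distinct minima reached. (TRUST-TECH proper characterizes
    stability regions; this is only a simplified illustration.)"""
    found = [np.array(x_star, dtype=float)]
    for k in range(n_dirs):
        theta = 2 * np.pi * k / n_dirs
        probe = x_star + radius * np.array([np.cos(theta), np.sin(theta)])
        cand = descend(f_grad, probe)
        # keep only minima not already found
        if all(np.linalg.norm(cand - m) > 1e-2 for m in found):
            found.append(cand)
    return min(found, key=f)  # best of the surrounding local minima

# Example: a surface with four local minima at (+/-1, +/-1)
f = lambda x: (x[0]**2 - 1)**2 + (x[1]**2 - 1)**2
grad = lambda x: np.array([4*x[0]*(x[0]**2 - 1), 4*x[1]*(x[1]**2 - 1)])

x_local = descend(grad, [0.2, 0.3])      # greedy descent lands in one basin
best = neighborhood_minima(f, grad, x_local)
```

In a neural-network setting, `descend` would be replaced by backpropagation-based training and `x` by the weight vector; the point of the abstract's method is that the neighboring minima are reached deterministically rather than by random restarts.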
dc.format.extent	303860 bytes
dc.publisher	Cornell University	en_US
dc.subject	computer science	en_US
dc.subject	technical report	en_US
dc.title	TRUST-TECH based Neural Network Training	en_US
dc.type	technical report	en_US
