Improving Speech Recognition Learning through Lazy Training

Authors: Martinez, Tony R.; Rimer, Michael E.; Wilson, D. Randall
Keywords: lazy training; overfit; generalization; neural networks; Computer Sciences
Article description:
Multi-layer backpropagation, like most learning algorithms capable of creating complex decision surfaces, is prone to overfitting. We present a novel approach, called lazy training, for reducing overfit in multi-layer networks. Lazy training consistently reduces the generalization error of optimized neural networks by more than half on a large OCR dataset and on several real-world problems from the UCI Machine Learning Repository. Here, lazy training is shown to be effective in a multi-layered adaptive learning system, reducing the error of an optimized backpropagation network in a speech recognition system by 50.0% on the TIDIGITS corpus.
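The core intuition behind lazy training is that overfit is reduced by not forcing outputs toward extreme target values once a pattern is already classified correctly. The sketch below is a hypothetical illustration of that idea (not the authors' exact error function): the per-output error signal is zeroed for patterns whose target-class output already beats every rival by a `margin`, so only misclassified or near-boundary patterns drive weight updates. The function name, the `margin` parameter, and the specific form of the error are assumptions made for illustration.

```python
import numpy as np

def lazy_error(outputs, target, margin=0.1):
    """Illustrative lazy-training error signal (assumed form, not the
    paper's exact objective).

    outputs: 1-D array of network output activations, one per class.
    target:  index of the correct class for this pattern.
    margin:  hypothetical confidence margin the target output must win by.
    """
    err = np.zeros_like(outputs)
    t = outputs[target]
    rivals = np.arange(len(outputs)) != target
    # A rival "violates" if it comes within `margin` of the target output.
    violators = rivals & (outputs >= t - margin)
    if not violators.any():
        return err                        # already classified correctly: be lazy, no update
    err[target] = 1.0 - t                 # raise the correct-class output
    err[violators] = -outputs[violators]  # lower only the offending outputs
    return err
```

On a confidently correct pattern the error is all zeros, so backpropagation leaves the weights alone; training effort is spent only where the decision is wrong or uncertain, which is the mechanism by which lazy training avoids the over-sharpened decision surfaces that cause overfit.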