PlumX Metrics

Improving generalization in deep neural network using knowledge transformation based on Fisher criterion

Journal of Supercomputing, ISSN: 1573-0484, Vol: 79, Issue: 18, Pages: 20899-20922, 2023
  • Citations: 3
  • Usage: 0
  • Captures: 3
  • Mentions: 0
  • Social Media: 0


Article Description

Most deep neural networks (DNNs) are trained in an over-parameterized regime: they have more parameters than available training data, which reduces their generalization capability and performance on new, unseen samples. Generalization of DNNs has been improved by various methods such as regularization techniques, data augmentation, network capacity restriction, and randomness injection. In this paper, we propose an effective generalization method, named multivariate statistical knowledge transformation, which learns the feature distribution to separate samples based on the variance of the deep hypothesis space in all dimensions. Moreover, the proposed method uses latent knowledge of the target to boost the confidence of its predictions. Compared with state-of-the-art methods, multivariate statistical knowledge transformation yields competitive results. Experimental results show that the proposed method achieved impressive generalization performance on CIFAR-10, CIFAR-100, and Tiny ImageNet, with accuracies of 91.96%, 97.52%, and 99.21%, respectively. Furthermore, the method converges faster during the initial epochs.
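To make the Fisher-criterion idea in the abstract concrete, the sketch below shows one common way such a criterion can be applied to deep features: a trace-ratio penalty that shrinks within-class scatter relative to between-class scatter. This is only a minimal illustration; the function name fisher_criterion_penalty, the use of penultimate-layer features, and the 0.1 weighting in the usage comment are assumptions for the example and do not reproduce the paper's exact multivariate statistical knowledge transformation.

```python
import torch

def fisher_criterion_penalty(features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Fisher-style trace-ratio penalty on a batch of deep features.

    features: (N, D) activations, e.g. from the penultimate layer (assumption)
    labels:   (N,)   integer class labels
    Returns a scalar that is small when classes are well separated
    (small within-class scatter, large between-class scatter).
    """
    eps = 1e-8
    global_mean = features.mean(dim=0)
    within = features.new_zeros(())
    between = features.new_zeros(())
    for c in labels.unique():
        class_feats = features[labels == c]
        class_mean = class_feats.mean(dim=0)
        # within-class scatter: spread of samples around their class mean
        within = within + ((class_feats - class_mean) ** 2).sum()
        # between-class scatter: class mean's deviation from the global mean,
        # weighted by class size
        between = between + class_feats.shape[0] * ((class_mean - global_mean) ** 2).sum()
    # minimizing within / between encourages Fisher-style class separation
    return within / (between + eps)

# Illustrative use next to a standard task loss (the 0.1 weight is a guess):
# loss = cross_entropy(logits, labels) + 0.1 * fisher_criterion_penalty(feats, labels)
```

In this sketch the penalty is simply added to the classification loss; how the paper actually combines the criterion with latent knowledge of the target is described in the full article, not here.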

