PlumX Metrics
Bayesian Optimization for Auto-tuning Convolution Neural Network on GPU

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), ISSN: 1611-3349, Vol: 14492 LNCS, Page: 478-489
2024
  • Citations: 0
  • Usage: 0
  • Captures: 0
  • Mentions: 0
  • Social Media: 0

Conference Paper Description

GPUs play an important role as hardware accelerators in the training of deep neural networks. However, when a GPU executes a convolutional neural network model, different combinations of kernel configuration parameters yield different performance. This paper therefore proposes BAGF, a Bayesian auto-tuning framework for GPU kernels, which parameterizes the factors affecting GPU program performance and uses Bayesian optimization to search for the best configuration in the space formed by those parameters. Compared with other optimization algorithms, BAGF obtains excellent configurations in fewer iterations. The paper evaluates BAGF on four benchmarks, compares it with other common optimization algorithms, and analyzes the performance improvement contributed by each configuration parameter. Finally, BAGF was tested on the convolution layers of AlexNet, and the results were analyzed with the Roofline model. Compared with the original parameter configuration, BAGF improved speed by 50.09%.
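The abstract describes the approach only at a high level: parameterize the kernel, then let Bayesian optimization pick which configuration to measure next. The sketch below illustrates that generic loop, not BAGF itself — the `(block_x, block_y)` search space, the synthetic `kernel_runtime` cost function (standing in for real GPU timing), and the fixed RBF length scale are all illustrative assumptions. It uses a minimal Gaussian-process surrogate with the expected-improvement acquisition function.

```python
import numpy as np
from math import erf

# Hypothetical stand-in for a GPU kernel benchmark. BAGF measures real kernel
# runtimes; here a smooth synthetic cost maps (block_x, block_y) to a "time",
# minimized at (32, 8).
def kernel_runtime(x):
    bx, by = x
    return (np.log2(bx) - 5) ** 2 + (np.log2(by) - 3) ** 2 + 1.0

# Illustrative discrete search space of thread-block configurations.
space = np.array([(bx, by) for bx in (8, 16, 32, 64, 128)
                           for by in (1, 2, 4, 8, 16)], dtype=float)

def gp_posterior(X, y, Xs, length=1.0, noise=1e-6):
    """GP posterior mean/std with an RBF kernel (no hyperparameter fitting)."""
    def k(A, B):
        d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d / length ** 2)
    K = k(X, X) + noise * np.eye(len(X))
    Ks = k(X, Xs)
    sol = np.linalg.solve(K, Ks)                 # K^{-1} Ks
    mu = sol.T @ y
    var = np.clip(np.diag(k(Xs, Xs)) - np.einsum('ij,ij->j', Ks, sol),
                  1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best):
    # EI for minimization: (best - mu) * Phi(z) + sigma * phi(z)
    z = (best - mu) / sigma
    cdf = 0.5 * (1 + np.vectorize(erf)(z / np.sqrt(2)))
    pdf = np.exp(-0.5 * z ** 2) / np.sqrt(2 * np.pi)
    return (best - mu) * cdf + sigma * pdf

rng = np.random.default_rng(0)
Z = np.log2(space)                               # log2 scale: uniform spacing
idx = list(rng.choice(len(space), size=3, replace=False))  # random warm-up
for _ in range(12):                              # Bayesian-optimization loop
    X, y = Z[idx], np.array([kernel_runtime(space[i]) for i in idx])
    mu, sigma = gp_posterior(X, y, Z)
    ei = expected_improvement(mu, sigma, y.min())
    ei[idx] = -np.inf                            # never re-measure a config
    idx.append(int(np.argmax(ei)))               # measure most promising next

best = space[min(idx, key=lambda i: kernel_runtime(space[i]))]
print(best)  # best (block_x, block_y) configuration found
```

The key property the paper exploits is visible here: each GPU measurement is expensive, so the surrogate model chooses where to measure next, letting the tuner converge on a good configuration in far fewer evaluations than exhaustive or random search over the full space.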
