Fast Convolution based on Winograd Minimum Filtering: Introduction and Development

Authors

Gan Tong and Libo Huang, National University of Defense Technology, China

Abstract

Convolutional Neural Networks (CNNs) are widely used across many fields and play an important role in them. The convolution operator is the fundamental component of a CNN and is also the most time-consuming part of network training and inference. In recent years, researchers have proposed several fast convolution algorithms, including FFT-based and Winograd convolution. Among them, Winograd convolution significantly reduces the number of multiplications required by convolution and occupies less memory than FFT-based convolution. As a result, Winograd convolution has quickly become the first choice for fast convolution implementations in recent years. At present, there is no systematic survey of the Winograd convolution algorithm. This article aims to fill that gap and to provide a detailed reference for follow-up researchers. It summarizes the development of Winograd convolution in terms of algorithm extension, algorithm optimization, implementation, and application, and finally offers a brief outlook on possible future directions.
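
To make the claim about multiplication savings concrete, the sketch below illustrates the smallest 1-D Winograd minimal filtering case, F(2,3): two outputs of a 3-tap filter are computed with 4 element-wise multiplications instead of the 6 needed by direct convolution. The transform matrices follow the formulation commonly cited in the literature (Lavin and Gray); the NumPy code and function names are illustrative assumptions, not taken from the paper itself.

    # Illustrative sketch of 1-D Winograd minimal filtering F(2,3):
    # two outputs of a 3-tap correlation from a 4-element input tile,
    # using 4 element-wise multiplications instead of 6.
    # Matrices follow the commonly used Lavin & Gray formulation;
    # names are hypothetical, not from the paper.
    import numpy as np

    B_T = np.array([[1,  0, -1,  0],   # input transform B^T
                    [0,  1,  1,  0],
                    [0, -1,  1,  0],
                    [0,  1,  0, -1]], dtype=float)
    G = np.array([[1.0,  0.0, 0.0],    # filter transform G
                  [0.5,  0.5, 0.5],
                  [0.5, -0.5, 0.5],
                  [0.0,  0.0, 1.0]])
    A_T = np.array([[1, 1,  1,  0],    # output transform A^T
                    [0, 1, -1, -1]], dtype=float)

    def winograd_f23(d, g):
        """F(2,3): d is a length-4 input tile, g is a length-3 filter."""
        U = G @ g      # transformed filter (4 values)
        V = B_T @ d    # transformed input tile (4 values)
        M = U * V      # 4 multiplications (direct method needs 6)
        return A_T @ M # 2 output values

    d = np.array([1.0, 2.0, 3.0, 4.0])
    g = np.array([0.5, 1.0, -1.0])
    print(winograd_f23(d, g))                  # Winograd result
    print(np.array([d[0:3] @ g, d[1:4] @ g]))  # direct correlation, same result

The same nesting of this 1-D transform in two dimensions gives the F(2x2, 3x3) tile commonly used for 3x3 convolution layers, where 16 multiplications replace the 36 of the direct method.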

Keywords

Winograd Minimum Filtering, Winograd Convolution, Fast Convolution, Convolution Optimization.

Volume 11, Number 17