
陈夕子

Master's Supervisor
Name: 陈夕子
English Name: Xizi Chen
Pinyin Name: chenxizi
Email:
Affiliation: College of Informatics
Position: Full-time Faculty
Education: Doctoral
Office: Room 310, Block C, Yifu Building, Huazhong Agricultural University
Degree: Doctorate
Professional Title: Associate Researcher
Employment Status: Active
Primary Position: Full-time Faculty
Graduated From: The Hong Kong University of Science and Technology
Department: College of Informatics
Disciplines: Computer Architecture; Computer Application Technology
Other Contact Information

Mailing/Office Address:

Publications
Accelerating Large Kernel Convolutions with Nested Winograd Transformation
Posted: 2023-12-11

DOI: 10.1109/VLSI-SoC57769.2023.10321932

Published in: 2023 IFIP/IEEE 31st International Conference on Very Large Scale Integration (VLSI-SoC)

Abstract: Recent literature has shown that convolutional neural networks (CNNs) with large kernels outperform vision transformers (ViTs) and CNNs with stacked small kernels in many computer vision tasks, such as object detection and image restoration. The Winograd transformation helps reduce the number of repetitive multiplications in convolution and is widely supported by many commercial AI processors. Researchers have proposed accelerating large kernel convolutions by linearly decomposing them into many small kernel convolutions and then sequentially accelerating each small kernel convolution with the Winograd algorithm. This work proposes a nested Winograd algorithm that iteratively decomposes a large kernel convolution into small kernel convolutions and proves it to be more effective than the linear decomposition Winograd transformation algorithm. Experiments show that compared to the linear decomposition Winograd algorithm, the proposed algorithm reduces the total number of multiplications by 1.4 to 10.5 times for computing 4×4 to 31×31 convolutions.

Translated Work:

Date of Publication: 2023-11-22

Indexed by: EI

Publication Link: https://ieeexplore.ieee.org/document/10321932
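
The abstract above builds on the standard Winograd convolution transform. For context only (this is not the paper's nested algorithm), below is a minimal sketch of the baseline 2D Winograd F(2×2, 3×3) transform, which computes a 2×2 output tile of a 3×3 convolution with 16 element-wise multiplications instead of the 36 required by the direct method. The function name and the verification code are illustrative assumptions; the transformation matrices are the standard F(2, 3) matrices from Lavin & Gray (2016).

```python
import numpy as np

# Standard F(2, 3) Winograd transformation matrices.
B_T = np.array([[1,  0, -1,  0],
                [0,  1,  1,  0],
                [0, -1,  1,  0],
                [0,  1,  0, -1]], dtype=np.float64)
G = np.array([[1.0,  0.0, 0.0],
              [0.5,  0.5, 0.5],
              [0.5, -0.5, 0.5],
              [0.0,  0.0, 1.0]])
A_T = np.array([[1, 1,  1,  0],
                [0, 1, -1, -1]], dtype=np.float64)

def winograd_f2x2_3x3(d, g):
    """Compute a 2x2 output tile of the 3x3 correlation over a 4x4 input tile d
    with kernel g, via Y = A^T [(G g G^T) * (B^T d B)] A."""
    U = G @ g @ G.T          # kernel transform (4x4)
    V = B_T @ d @ B_T.T      # input-tile transform (4x4)
    M = U * V                # 16 element-wise multiplications
    return A_T @ M @ A_T.T   # inverse transform to the 2x2 output tile

# Check against the direct sliding-window computation (36 multiplications).
rng = np.random.default_rng(0)
d = rng.standard_normal((4, 4))
g = rng.standard_normal((3, 3))
direct = np.array([[np.sum(d[i:i+3, j:j+3] * g) for j in range(2)]
                   for i in range(2)])
assert np.allclose(winograd_f2x2_3x3(d, g), direct)
```

Per the abstract, the paper's contribution is to decompose a large kernel convolution into such small kernel convolutions recursively (nested) rather than linearly; the sketch above only illustrates the basic building block that both approaches accelerate.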