
Xizi Chen

Supervisor of Master's Candidates
Name (Simplified Chinese): Xizi Chen
Name (English): Xizi Chen
Name (Pinyin): chenxizi
Administrative Position: Full-Time Teacher
Academic Titles: Full-Time Teacher
Status: Employed
Education Level: Doctorate
Degree: Doctoral degree
Business Address: Room 413, Block B, No. 1 Comprehensive Building, Huazhong Agricultural University
Alma Mater: The Hong Kong University of Science and Technology
Teacher College: College of Informatics
School/Department: Huazhong Agricultural University
Discipline: Computer Architecture; Computer Applications Technology

Paper Publications
SparseNN: An Energy-Efficient Neural Network Accelerator Exploiting Input and Output Sparsity
Release time: 2021-09-08

DOI number: 10.23919/DATE.2018.8342010

Affiliation of Author(s): Hong Kong University of Science and Technology (HKUST)

Venue: Design, Automation and Test in Europe Conference and Exhibition (DATE, CCF-B)

Key Words: Neural networks, training, sparsity, computer architecture, scheduling, prediction algorithms

Abstract: Contemporary Deep Neural Networks (DNNs) contain millions of synaptic connections across tens to hundreds of layers. This large computational complexity poses a challenge for hardware design. In this work, we leverage the intrinsic activation sparsity of DNNs to substantially reduce execution cycles and energy consumption. An end-to-end training algorithm is proposed to develop a lightweight (less than 5% overhead) run-time predictor that estimates the output activation sparsity on the fly. Furthermore, an energy-efficient hardware architecture, SparseNN, is proposed to exploit both input and output sparsity. SparseNN is a scalable architecture with distributed memories and processing elements connected through a dedicated on-chip network. Compared with state-of-the-art accelerators that exploit only input sparsity, SparseNN achieves a 10%-70% improvement in throughput and a power reduction of around 50%.
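
The abstract describes two complementary sparsity sources: zero input activations, whose multiply-accumulates can be skipped outright, and output activations that ReLU will zero out, which a lightweight predictor identifies ahead of time so their computation can be skipped as well. The sketch below is a minimal NumPy illustration of that idea, not the paper's implementation; the SVD-based low-rank sign predictor and the function names here are assumptions for demonstration only.

```python
import numpy as np

def dense_layer_sparse(x, W, b, U=None, V=None):
    """ReLU(W @ x + b), skipping work for zero inputs and for outputs
    that a low-rank predictor (W ~= U @ V) expects ReLU to zero out."""
    y = b.astype(np.float64)
    if U is not None and V is not None:
        # Output sparsity: a cheap low-rank estimate of the pre-activation
        # predicts which outputs will be negative (hence zero after ReLU).
        active = (U @ (V @ x) + b) > 0
    else:
        active = np.ones(b.shape[0], dtype=bool)
    # Input sparsity: accumulate only over the nonzero entries of x,
    # and only into the outputs predicted to survive ReLU.
    for j in np.flatnonzero(x):
        y[active] += W[active, j] * x[j]
    y[~active] = 0.0
    return np.maximum(y, 0.0)

# Toy usage: a ~75%-sparse input and a rank-4 predictor built from the SVD.
rng = np.random.default_rng(0)
W = rng.standard_normal((64, 128))
b = rng.standard_normal(64)
x = np.maximum(rng.standard_normal(128), 0.0) * (rng.random(128) < 0.25)
U, S, Vt = np.linalg.svd(W, full_matrices=False)
rank = 4
out = dense_layer_sparse(x, W, b, U[:, :rank] * S[:rank], Vt[:rank])
```

In the hardware architecture, these two decisions become skipped processing-element cycles rather than skipped loop iterations, which is where the reported throughput and power gains come from.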

Note: CCF-B

Co-authors: Jingbo Jiang, Xizi Chen, Chi-Ying Tsui

First Author: Jingyang Zhu

Translation or Not: no

Date of Publication: 2018-01-01