Laius: an energy-efficient FPGA CNN accelerator with the support of a fixed-point training framework
by Zikai Nie; Zhisheng Li; Lei Wang; Shasha Guo; Yu Deng; Rangyu Deng; Qiang Dou
International Journal of Computational Science and Engineering (IJCSE), Vol. 21, No. 3, 2020

Abstract: With the development of convolutional neural networks (CNNs), their high computational complexity and energy consumption have become significant problems. Many CNN inference accelerators have been proposed to reduce this consumption, but most are based on 32-bit floating-point matrix multiplication, where the data precision is over-provisioned. This paper presents Laius, an 8-bit fixed-point LeNet inference engine implemented on FPGA. To achieve low-precision computation and storage, we introduce our fixed-point training framework, FixCaffe. To economise FPGA resources, we propose a methodology for finding the optimal bit-lengths for the weights and biases in LeNet. We use pipelining, tiling, and theoretical analysis to improve performance. Experimental results show that Laius achieves a throughput of 44.9 Gops. Moreover, with only 1% accuracy loss, 8-bit Laius reduces delay by 31.43%, LUT consumption by 87.01%, BRAM consumption by 66.50%, DSP consumption by 65.11%, and power by 47.95% compared with a 32-bit version of the same structure.
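The abstract's bit-length methodology can be illustrated with a generic sketch: quantize trained weights to signed fixed-point at several fractional bit-widths and pick the allocation with the lowest quantization error. This is an assumption-laden illustration of the general idea, not the actual FixCaffe algorithm; the function name, toy weight distribution, and error metric are all hypothetical.

```python
import numpy as np

def quantize_fixed_point(x, total_bits=8, frac_bits=6):
    """Round a float array to signed fixed-point Q(m.f) and dequantize back.

    total_bits includes the sign bit; frac_bits is the fractional width.
    Generic illustration only -- not the FixCaffe implementation.
    """
    scale = 2 ** frac_bits
    qmin = -(2 ** (total_bits - 1))        # most negative representable code
    qmax = 2 ** (total_bits - 1) - 1       # most positive representable code
    q = np.clip(np.round(x * scale), qmin, qmax)
    return q / scale                       # value seen by the fixed-point datapath

# Sweep fractional bit-widths within an 8-bit word and keep the one that
# minimises mean squared quantization error on a toy set of "trained" weights.
rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.1, size=1000)  # hypothetical weight distribution
errors = {f: float(np.mean((weights - quantize_fixed_point(weights, 8, f)) ** 2))
          for f in range(1, 8)}
best_frac = min(errors, key=errors.get)
```

In a real flow, the search criterion would be end-to-end accuracy on a validation set rather than raw quantization error, and weights and biases could be swept independently.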

Online publication date: Fri, 27-Mar-2020

