Chao Wang (王超)

Personal Information

Researcher (Natural Sciences)   Doctoral Supervisor   Master's Supervisor

Gender: Male

Employment Status: Currently employed

Affiliation: School of Optical and Electronic Information

Education: Doctoral graduate

Degree: Ph.D. in Engineering

Alma Mater: Nanyang Technological University

Disciplines: Microelectronics and Solid-State Electronics; Circuits and Systems

Publications

A Novel Transpose 2T-DRAM Based Computing-in-Memory Architecture for On-Chip DNN Training and Inference

First Author: Y. Zhao

Corresponding Author: C. Wang*

Co-authors: Z. Shen, J. Xu, Kevin T. C. Chai, Y. Wu

Published In: 2023 IEEE International Symposium on Artificial Intelligence Circuits and Systems (AICAS 2023)

Indexed By: EI

Discipline Category: Engineering

First-Level Discipline: Electronic Science and Technology

Document Type: Conference Paper (C)

DOI: 10.1109/AICAS57966.2023.10168641

Publication Date: 2023-06-13

Abstract: Recently, DRAM-based Computing-in-Memory (CIM) has emerged as a promising CIM solution due to its unique advantages of high bit-cell density, large memory capacity, and CMOS compatibility. This paper proposes a 2T-DRAM based CIM architecture that can efficiently perform both CIM inference and training for deep neural networks (DNNs). The proposed architecture employs 2T-DRAM based transpose circuitry to implement a transpose weight memory array and uses digital logic in the array periphery to implement digital DNN computation in memory. A novel mapping method is proposed to map the convolutional and fully connected computations of the forward-propagation and back-propagation processes onto the transpose 2T-DRAM CIM array, achieving digital weight multiplexing and parallel computing. Simulation results show that the proposed transpose 2T-DRAM based CIM architecture achieves 11.26 GOPS with a 16K DRAM array accelerating a 4CONV+3FC network in a 40-nm CMOS process and reaches 82.15% accuracy on the CIFAR-10 dataset, figures well above those of state-of-the-art DRAM-based CIM accelerators that lack on-chip learning capability. A preliminary evaluation of retention time in DRAM CIM also shows that a refresh-less training-inference process for lightweight networks can be realized by an appropriately sized CIM array through the proposed mapping strategy, with negligible refresh-induced performance loss or power increase.
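As a minimal illustration of why a transpose weight array helps on-chip training (this sketch is not the paper's code; the matrices and shapes are hypothetical): the forward pass reads a weight matrix row-wise (y = W·x), while back-propagation needs the transposed view (dx = Wᵀ·dy). A transpose memory array can serve both access patterns from a single stored copy of W:

```python
import numpy as np

# Hypothetical sketch: one stored weight array serving both
# forward (row-wise) and backward (column-wise) access patterns,
# as a transpose CIM array would in hardware.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))   # the single stored weight array
x = rng.standard_normal(8)        # layer input
dy = rng.standard_normal(4)       # gradient arriving from the next layer

y = W @ x        # inference / forward propagation: row-wise reads of W
dx = W.T @ dy    # training / back propagation: column-wise reads of W

# A non-transpose CIM macro would need a second, transposed copy of W
# (or off-array data movement) to compute dx; the transpose array avoids
# that duplication, which is what enables training in the same macro.
assert y.shape == (4,) and dx.shape == (8,)
```

The point of the sketch is only the access-pattern asymmetry between inference and training; the paper's contribution is realizing both patterns in a 2T-DRAM array with peripheral digital logic.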

Publication Link: https://ieeexplore.ieee.org/document/10168641