An efficient resistive random access memory (ReRAM) structure is developed for accelerating convolutional neural networks (CNNs) powered by in-memory computation. A novel ReRAM cell circuit is designed with two-directional (2-D) accessibility. The entire memory system is organized as a 2-D array, in which specific memory cells can be identically accessed by both column and row locality. For the in-memory computation of CNNs, only the relevant cells in an identical sub-array are accessed by 2-D read-out operations, which is hardly achievable with conventional ReRAM cells. In this manner, the redundant (column or row) accesses of conventional ReRAM structures are prevented, eliminating unnecessary data movement when CNNs are processed in memory. From the simulation results, the energy and bandwidth efficiency of the proposed memory structure are 1.4x and 5x those of a state-of-the-art ReRAM architecture, respectively.
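To make the access pattern concrete, the following is a minimal behavioral sketch in Python. It is not the authors' circuit or simulator; the 128x128 sub-array size, the 3x3 window size, and the function names are illustrative assumptions. It only counts how many cells a single read-out touches when fetching one convolution window, contrasting a conventional row-only read-out with the row-column-oriented read-out described in the abstract.

# Behavioral sketch only (NOT the authors' circuit model): counts how many
# ReRAM cells a read-out touches when fetching one k x k convolution window.
# Sub-array and window sizes below are assumptions for illustration.

SUB_ARRAY_ROWS, SUB_ARRAY_COLS = 128, 128   # assumed sub-array dimensions
K = 3                                       # assumed convolution window size


def conventional_readout_cells(k=K, cols=SUB_ARRAY_COLS):
    # Conventional ReRAM: activating a word line senses every cell on that
    # row, so fetching a k x k window reads k full rows of the sub-array.
    return k * cols


def row_column_readout_cells(k=K):
    # Row-column-oriented cells (as the abstract describes): row and column
    # selection intersect, so only the k x k relevant cells are read out.
    return k * k


if __name__ == "__main__":
    conv = conventional_readout_cells()
    prop = row_column_readout_cells()
    print(f"cells touched per window, conventional: {conv}")
    print(f"cells touched per window, row-column:   {prop}")
    print(f"redundant accesses avoided:             {conv - prop}")

With these assumed sizes, the row-column read-out touches 9 cells instead of 384 per window, which is the kind of redundant access the proposed structure is designed to avoid. The paper's reported 1.4x energy and 5x bandwidth gains come from its own circuit-level simulations, not from this sketch.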
Yan CHEN
Hunan University, Nara Institute of Science and Technology
Jing ZHANG
Hunan University
Yuebing XU
Hunan University
Yingjie ZHANG
Hunan University
Renyuan ZHANG
Nara Institute of Science and Technology
Yasuhiko NAKASHIMA
Nara Institute of Science and Technology
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Yan CHEN, Jing ZHANG, Yuebing XU, Yingjie ZHANG, Renyuan ZHANG, Yasuhiko NAKASHIMA, "A ReRAM-Based Row-Column-Oriented Memory Architecture for Convolutional Neural Networks" in IEICE TRANSACTIONS on Electronics,
vol. E102-C, no. 7, pp. 580-584, July 2019, doi: 10.1587/transele.2018CTS0001.
Abstract: An efficient resistive random access memory (ReRAM) structure is developed for accelerating convolutional neural network (CNN) powered by the in-memory computation. A novel ReRAM cell circuit is designed with two-directional (2-D) accessibility. The entire memory system is organized as a 2-D array, in which specific memory cells can be identically accessed by both of column- and row-locality. For the in-memory computations of CNNs, only relevant cells in an identical sub-array are accessed by 2-D read-out operations, which is hardly implemented by conventional ReRAM cells. In this manner, the redundant access (column or row) of the conventional ReRAM structures is prevented to eliminate the unnecessary data movement when CNNs are processed in-memory. From the simulation results, the energy and bandwidth efficiency of the proposed memory structure are 1.4x and 5x of a state-of-the-art ReRAM architecture, respectively.
URL: https://global.ieice.org/en_transactions/electronics/10.1587/transele.2018CTS0001/_p
@ARTICLE{e102-c_7_580,
author={Yan CHEN and Jing ZHANG and Yuebing XU and Yingjie ZHANG and Renyuan ZHANG and Yasuhiko NAKASHIMA},
journal={IEICE TRANSACTIONS on Electronics},
title={A ReRAM-Based Row-Column-Oriented Memory Architecture for Convolutional Neural Networks},
year={2019},
volume={E102-C},
number={7},
pages={580-584},
abstract={An efficient resistive random access memory (ReRAM) structure is developed for accelerating convolutional neural network (CNN) powered by the in-memory computation. A novel ReRAM cell circuit is designed with two-directional (2-D) accessibility. The entire memory system is organized as a 2-D array, in which specific memory cells can be identically accessed by both of column- and row-locality. For the in-memory computations of CNNs, only relevant cells in an identical sub-array are accessed by 2-D read-out operations, which is hardly implemented by conventional ReRAM cells. In this manner, the redundant access (column or row) of the conventional ReRAM structures is prevented to eliminate the unnecessary data movement when CNNs are processed in-memory. From the simulation results, the energy and bandwidth efficiency of the proposed memory structure are 1.4x and 5x of a state-of-the-art ReRAM architecture, respectively.},
keywords={},
doi={10.1587/transele.2018CTS0001},
ISSN={1745-1353},
month={July},}
TY - JOUR
TI - A ReRAM-Based Row-Column-Oriented Memory Architecture for Convolutional Neural Networks
T2 - IEICE TRANSACTIONS on Electronics
SP - 580
EP - 584
AU - Yan CHEN
AU - Jing ZHANG
AU - Yuebing XU
AU - Yingjie ZHANG
AU - Renyuan ZHANG
AU - Yasuhiko NAKASHIMA
PY - 2019
DO - 10.1587/transele.2018CTS0001
JO - IEICE TRANSACTIONS on Electronics
SN - 1745-1353
VL - E102-C
IS - 7
JA - IEICE TRANSACTIONS on Electronics
Y1 - July 2019
AB - An efficient resistive random access memory (ReRAM) structure is developed for accelerating convolutional neural network (CNN) powered by the in-memory computation. A novel ReRAM cell circuit is designed with two-directional (2-D) accessibility. The entire memory system is organized as a 2-D array, in which specific memory cells can be identically accessed by both of column- and row-locality. For the in-memory computations of CNNs, only relevant cells in an identical sub-array are accessed by 2-D read-out operations, which is hardly implemented by conventional ReRAM cells. In this manner, the redundant access (column or row) of the conventional ReRAM structures is prevented to eliminate the unnecessary data movement when CNNs are processed in-memory. From the simulation results, the energy and bandwidth efficiency of the proposed memory structure are 1.4x and 5x of a state-of-the-art ReRAM architecture, respectively.
ER -