Yongshin KANG
Incheon National University
Jaeyong CHUNG
Incheon National University
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Yongshin KANG, Jaeyong CHUNG, "Dynamic Fixed-Point Design of Neuromorphic Computing Systems" in IEICE TRANSACTIONS on Electronics,
vol. E101-C, no. 10, pp. 840-844, October 2018, doi: 10.1587/transele.E101.C.840.
Abstract: Practical deep neural networks have a number of weight parameters, and dynamic fixed-point formats have been used to represent them efficiently. Dynamic fixed-point representations share a scaling factor among a group of numbers, and the weights in a layer have conventionally been formed into such a group. In this paper, we first explore a design space for dynamic fixed-point neuromorphic computing systems and show that it is indispensable to have a small group size in neuromorphic architectures, because it is appropriate to group the weights associated with a neuron into a group. We then present a dynamic fixed-point representation designed for neuromorphic computing systems. Our experimental results show that the proposed representation reduces the required weight bitwidth by about 4 bits compared to the conventional fixed-point format.
URL: https://global.ieice.org/en_transactions/electronics/10.1587/transele.E101.C.840/_p
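The abstract's core idea, sharing one scaling factor among a small group of weights (e.g. the weights feeding a single neuron), can be sketched as follows. This is an illustrative example of dynamic fixed-point quantization in general, not the paper's exact method; the function name, the power-of-two scaling choice, and the group layout are assumptions for illustration.

```python
import numpy as np

def quantize_dynamic_fixed_point(weights, bitwidth, group_size):
    """Quantize weights to a dynamic fixed-point format.

    Each consecutive group of `group_size` values shares one power-of-two
    scaling factor (a shared exponent), as in dynamic fixed-point formats.
    Returns the dequantized values so the rounding error can be inspected.
    """
    flat = weights.flatten()
    out = np.empty_like(flat)
    qmax = 2 ** (bitwidth - 1) - 1  # largest signed integer magnitude
    for start in range(0, flat.size, group_size):
        group = flat[start:start + group_size]
        max_abs = np.max(np.abs(group))
        # Shared exponent: smallest power of two whose scaled range
        # covers the group's largest magnitude.
        exp = int(np.ceil(np.log2(max_abs / qmax))) if max_abs > 0 else 0
        scale = 2.0 ** exp
        q = np.clip(np.round(group / scale), -qmax - 1, qmax)
        out[start:start + group_size] = q * scale
    return out.reshape(weights.shape)
```

With a per-neuron group size (e.g. one row of a weight matrix per group), each neuron's fan-in weights get their own exponent, which is the small-group regime the paper argues for in neuromorphic architectures.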
@ARTICLE{e101-c_10_840,
author={Yongshin KANG and Jaeyong CHUNG},
journal={IEICE TRANSACTIONS on Electronics},
title={Dynamic Fixed-Point Design of Neuromorphic Computing Systems},
year={2018},
volume={E101-C},
number={10},
pages={840-844},
abstract={Practical deep neural networks have a number of weight parameters, and dynamic fixed-point formats have been used to represent them efficiently. Dynamic fixed-point representations share a scaling factor among a group of numbers, and the weights in a layer have conventionally been formed into such a group. In this paper, we first explore a design space for dynamic fixed-point neuromorphic computing systems and show that it is indispensable to have a small group size in neuromorphic architectures, because it is appropriate to group the weights associated with a neuron into a group. We then present a dynamic fixed-point representation designed for neuromorphic computing systems. Our experimental results show that the proposed representation reduces the required weight bitwidth by about 4 bits compared to the conventional fixed-point format.},
keywords={},
doi={10.1587/transele.E101.C.840},
ISSN={1745-1353},
month={October},}
TY - JOUR
TI - Dynamic Fixed-Point Design of Neuromorphic Computing Systems
T2 - IEICE TRANSACTIONS on Electronics
SP - 840
EP - 844
AU - Yongshin KANG
AU - Jaeyong CHUNG
PY - 2018
DO - 10.1587/transele.E101.C.840
JO - IEICE TRANSACTIONS on Electronics
SN - 1745-1353
VL - E101-C
IS - 10
JA - IEICE TRANSACTIONS on Electronics
Y1 - October 2018
AB - Practical deep neural networks have a number of weight parameters, and dynamic fixed-point formats have been used to represent them efficiently. Dynamic fixed-point representations share a scaling factor among a group of numbers, and the weights in a layer have conventionally been formed into such a group. In this paper, we first explore a design space for dynamic fixed-point neuromorphic computing systems and show that it is indispensable to have a small group size in neuromorphic architectures, because it is appropriate to group the weights associated with a neuron into a group. We then present a dynamic fixed-point representation designed for neuromorphic computing systems. Our experimental results show that the proposed representation reduces the required weight bitwidth by about 4 bits compared to the conventional fixed-point format.
ER -