TY - GEN
T1 - Precompensation, BIST and Analogue Berger Codes for Self-Healing of Neuromorphic RRAM
AU - Huang, Tsung Chu
AU - Schroff, Jeffae
PY - 2018/12/6
Y1 - 2018/12/6
N2 - Neuromorphic RRAM has become the most promising candidate for AI applications, but it suffers from three issues: degradation, defects, and errors. To overcome these issues, we propose a precompensation technique that compensates for resistive degradation. A linear-system-based BIST architecture with the proposed diagonal sliding march test can effectively and efficiently screen out uncompensated degradation and permanent defects. Analog Berger codes are proposed for detecting transient errors for variation learning and for self-checking of asymmetric errors. Evaluations show that the precompensation takes only 5/B of the time for batch operations of B cycles. The proposed BIST approach and method can reduce 2LN march tests to 6N for L-level RRAMs. The self-healing ability is verified by analog-Berger-code error detection. In experiments using a typical neural network on the MNIST handwritten digit dataset, the network can be healed with only a 2% loss of accuracy and about 35% of the training steps.
AB - Neuromorphic RRAM has become the most promising candidate for AI applications, but it suffers from three issues: degradation, defects, and errors. To overcome these issues, we propose a precompensation technique that compensates for resistive degradation. A linear-system-based BIST architecture with the proposed diagonal sliding march test can effectively and efficiently screen out uncompensated degradation and permanent defects. Analog Berger codes are proposed for detecting transient errors for variation learning and for self-checking of asymmetric errors. Evaluations show that the precompensation takes only 5/B of the time for batch operations of B cycles. The proposed BIST approach and method can reduce 2LN march tests to 6N for L-level RRAMs. The self-healing ability is verified by analog-Berger-code error detection. In experiments using a typical neural network on the MNIST handwritten digit dataset, the network can be healed with only a 2% loss of accuracy and about 35% of the training steps.
UR - http://www.scopus.com/inward/record.url?scp=85060012892&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85060012892&partnerID=8YFLogxK
U2 - 10.1109/ATS.2018.00041
DO - 10.1109/ATS.2018.00041
M3 - Conference contribution
AN - SCOPUS:85060012892
T3 - Proceedings of the Asian Test Symposium
SP - 173
EP - 178
BT - Proceedings - 2018 IEEE 27th Asian Test Symposium, ATS 2018
PB - IEEE Computer Society
T2 - 27th IEEE Asian Test Symposium, ATS 2018
Y2 - 15 October 2018 through 18 October 2018
ER -