Artificial optic-neural synapse for colored and color-mixed pattern recognition

Nature Communications, Volume 9, Issue 1
Seunghwan Seo1, Seo-Hyeon Jo1, Sungho Kim1, Jaewoo Shim1, Seyong Oh1, Jeong Hoon Kim1, Keun Heo1, Jae Woong Choi2, Changhwan Choi3, Saeroonter Oh4, Duygu Kuzum5, H.‐S. Philip Wong6, Jin‐Hong Park6
1Department of Electrical and Computer Engineering, Sungkyunkwan University, Suwon, 16419, Korea
2SKKU Advanced Institute of Nanotechnology (SAINT), Sungkyunkwan University, Suwon, 16417, Korea
3Division of Materials Science and Engineering, Hanyang University, Seoul, 04763, Korea
4Division of Electrical Engineering, Hanyang University, Ansan, 15588, Korea
5Department of Electrical and Computer Engineering, University of California San Diego, San Diego, CA 92093, USA
6Department of Electrical Engineering, Stanford University, Stanford, CA 94305, USA

Abstract

Synaptic device research has so far prioritized proving that devices can emulate synaptic dynamics, rather than extending device functionality toward more complex learning. Here, we demonstrate an optic-neural synaptic device that implements synaptic and optical-sensing functions together on an h-BN/WSe2 heterostructure. When arranged in an optic-neural network, this device mimics the colored and color-mixed pattern recognition capabilities of the human vision system. Our synaptic device exhibits a close-to-linear weight-update trajectory while providing a large number of stable conduction states with less than 1% variation per state. The device operates with low-voltage spikes of 0.3 V and consumes only 66 fJ per spike. These characteristics enable accurate and energy-efficient colored and color-mixed pattern recognition. This work is an important step toward neural networks that combine neural sensing and training functions for more complex pattern recognition.
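To make the device figures quoted above concrete, the short Python sketch below models one potentiation/depression sweep of a hypothetical synaptic conductance with a near-linear update, roughly 1% per-state variation, and 66 fJ per 0.3 V spike. The conductance window, number of states, and noise model are illustrative assumptions, not parameters reported in the paper.

```python
# Illustrative sketch (not from the paper): a toy synaptic device with a
# near-linear conductance update, ~1% per-state variation, and 66 fJ per
# spike. The conductance window and state count below are assumptions.
import numpy as np

rng = np.random.default_rng(0)

G_MIN, G_MAX = 1e-9, 100e-9   # assumed conductance window (siemens)
N_STATES = 128                # assumed number of programmable states
E_SPIKE = 66e-15              # energy per spike (J), figure from the abstract
DELTA_G = (G_MAX - G_MIN) / N_STATES

def potentiate(g, variation=0.01):
    """One potentiation spike: near-linear step plus ~1% per-state noise."""
    step = DELTA_G * (1 + variation * rng.standard_normal())
    return min(g + step, G_MAX)

def depress(g, variation=0.01):
    """One depression spike: symmetric near-linear decrease."""
    step = DELTA_G * (1 + variation * rng.standard_normal())
    return max(g - step, G_MIN)

# Sweep a full potentiation/depression cycle and tally the spike energy.
g = G_MIN
trace = []
for _ in range(N_STATES):
    g = potentiate(g)
    trace.append(g)
for _ in range(N_STATES):
    g = depress(g)
    trace.append(g)

total_energy = len(trace) * E_SPIKE
print(f"spikes applied: {len(trace)}, energy spent: {total_energy * 1e12:.2f} pJ")
```

Multiplying the number of update spikes by the 66 fJ figure gives a simple back-of-the-envelope bound on the programming energy of one full weight sweep (about 17 pJ for the 256 spikes in this sketch).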

Keywords


References

Mead, C. Neuromorphic electronic systems. Proc. IEEE 78, 1629–1636 (1990).

Yu, S. et al. An electronic synapse device based on metal oxide resistive switching memory for neuromorphic computation. IEEE Trans. Electron Devices 58, 2729–2737 (2011).

Prezioso, M. et al. Training and operation of an integrated neuromorphic network based on metal-oxide memristors. Nature 521, 61–64 (2015).

Wong, H. S. P. et al. Phase change memory. Proc. IEEE 98, 2201–2227 (2010).

van de Burgt, Y. et al. A non-volatile organic electrochemical device as a low-voltage artificial synapse for neuromorphic computing. Nat. Mater. 16, 414–418 (2017).

Kim, S., Yoon, J., Kim, H.-D. & Choi, S.-J. Carbon nanotube synaptic transistor network for pattern recognition. ACS Appl. Mater. Interfaces 7, 25479–25486 (2015).

Wang, Z. et al. Memristors with diffusive dynamics as synaptic emulators for neuromorphic computing. Nat. Mater. 16, 101–108 (2017).

Choi, S. et al. SiGe epitaxial memory for neuromorphic computing with reproducible high performance based on engineered dislocations. Nat. Mater. 17, 335–340 (2018).

Sangwan, V. K. et al. Multi-terminal memtransistors from polycrystalline monolayer molybdenum disulfide. Nature 554, 500–504 (2018).

Yoon, S. M., Tokumitsu, E. & Ishiwara, H. An electrically modifiable synapse array of metal-ferroelectric-semiconductor (MFS) FETs using SrBi2Ta2O9 thin films. IEEE Electron Device Lett. 20, 229–231 (1999).

Silver, D. et al. Mastering the game of Go with deep neural networks and tree search. Nature 529, 484–489 (2016).

Wang, Z. et al. Fully memristive neural networks for pattern classification with unsupervised learning. Nat. Electron. 1, 137–145 (2018).

Sheridan, P. M. et al. Sparse coding with memristor networks. Nat. Nanotech. 12, 784–789 (2017).

Bi, G.-Q. & Poo, M.-M. Synaptic modifications in cultured hippocampal neurons: dependence on spike timing, synaptic strength, and postsynaptic cell type. J. Neurosci. 18, 10464–10472 (1998).

Whitlock, J. R., Heynen, A. J., Shuler, M. G. & Bear, M. F. Learning induces long-term potentiation in the hippocampus. Science 313, 1093–1097 (2006).

Ohno, T. et al. Short-term plasticity and long-term potentiation mimicked in single inorganic synapses. Nat. Mater. 10, 591–595 (2011).

Burr, G. W. et al. Experimental demonstration and tolerancing of a large-scale neural network (165,000 synapses) using phase-change memory as the synaptic weight element. IEEE Trans. Electron Devices 62, 3498–3507 (2015).

Yang, J. J., Strukov, D. B. & Stewart, D. R. Memristive devices for computing. Nat. Nanotech. 8, 13–24 (2013).

Chen, P. Y., Peng, X. & Yu, S. NeuroSim+: an integrated device-to-algorithm framework for benchmarking synaptic devices and array architectures. In 2017 IEEE International Electron Devices Meeting (IEDM) 135–138 (IEEE, San Francisco, 2017).

Yu, S. Neuro-inspired computing with emerging nonvolatile memory. Proc. IEEE 106, 260–285 (2018).

Laughlin, S. B., de Ruyter van Steveninck, R. R. & Anderson, J. C. The metabolic cost of neural information. Nat. Neurosci. 1, 36–41 (1998).

Shim, J. et al. Electronic and optoelectronic devices based on two-dimensional materials: from fabrication to application. Adv. Electron. Mat. 3, 1600364 (2017).

Geim, A. K. & Grigorieva, I. V. Van der Waals heterostructures. Nature 499, 419–425 (2013).

Lemme, M. C., Li, L.-J., Palacios, T. & Schwierz, F. Two-dimensional materials for electronic applications. MRS Bull. 39, 711–718 (2014).

Jo, S.-H. et al. A high-performance WSe2/h-BN photodetector using a triphenylphosphine (PPh3)-based n-doping technique. Adv. Mat. 28, 4824–4831 (2016).

Mochida, R. et al. A 4M synapses integrated analog ReRAM based 66.5 TOPS/W neural network processor with cell current writing and flexible network architecture. In 2018 Symposium on VLSI Technology (IEEE, Honolulu, 2018).

Ambrogio, S. et al. Equivalent-accuracy accelerated neural-network training using analogue memory. Nature 558, 60–67 (2018).

Querlioz, D., Bichler, O., Dollfus, P. & Gamrat, C. Immunity to device variations in a spiking neural network with memristive nanodevices. IEEE Trans. Nanotechnol. 12, 288–295 (2013).

Arnold, A. J. et al. Mimicking neurotransmitter release in chemical synapses via hysteresis engineering in MoS2 transistors. ACS Nano 11, 3110–3118 (2017).

Rumelhart, D. E., Hinton, G. E. & Williams, R. J. in Parallel Distributed Processing: Explorations in the Microstructure of Cognition. Vol. 1 (eds Rumelhart, D. E. & McClelland, J. L.) 318–362 (MIT Press, Cambridge, 1986).

Lim, S. et al. Adaptive learning rule for hardware-based deep neural networks using electronic synapse devices. Preprint available at http://arXiv.org/abs/1707.06381 (2017).

Han, S., Mao, H. & Dally, W. J. Deep compression: compressing deep neural networks with pruning, trained quantization and Huffman coding. In International Conference on Learning Representations 2016 (ICLR, 2016).

Moons, B. & Verhelst, M. A 0.3-2.6 TOPS/W precision-scalable processor for real-time large-scale ConvNets. In 2016 Symposium on VLSI Circuits (IEEE, Honolulu, 2016).

Dan, Y. & Poo, M.-M. Spike timing-dependent plasticity: from synapse to perception. Physiol. Rev. 86, 1033–1048 (2006).

Hertz, J., Krogh, A. & Palmer, R. G. Introduction to the Theory of Neural Computation (Perseus, Cambridge, 1991).

LeCun, Y., Bottou, L., Bengio, Y. & Haffner, P. Gradient-based learning applied to document recognition. Proc. IEEE 86, 2278–2324 (1998).