Neuromorphic Computing with Memristive Circuits

Monday, November 11, 2013, 4:00pm to 4:50pm
KEC 1001

Speaker Information

Dmitri Strukov
Assistant Professor
Department of Electrical and Computer Engineering
University of California, Santa Barbara


I will discuss recent experimental results on pattern classification and recognition tasks implemented with memristive [1] neural networks. The Pt/TiO2-x/Pt memristive devices used in both demonstrations are fabricated with a nanoscale e-beam-defined protrusion that localizes the active area during the forming process to a volume of roughly (20 nm)³ and, as a result, helps improve device yield. I will first discuss a demonstration of a pattern classification task for 3×3 binary images by a single-layer perceptron network implemented with a 10×2 memristive crossbar circuit, in which the synaptic weights are realized with memristive devices [2]. The perceptron circuit is trained by both ex-situ and in-situ methods to perform binary classification on a set of patterns from the original work of B. Widrow on "memistor" classifiers. Both approaches work successfully despite significant variations in the switching behavior of the memristive devices, as well as half-select and leakage problems in the crossbar circuits.

I will then present an experimental demonstration of a pattern recognition task, specifically 4-bit analog-to-digital conversion (ADC) implemented with a Hopfield recurrent neural network [3]. The 4-bit ADC is built from four inverting amplifiers (neurons), each made of three Si IC operational amplifiers, and a 4×6 memristor crossbar that defines the connectivity among the neurons (and the bias). In this work the memristors are tuned precisely to the values described in the original Hopfield work, using a previously developed tuning algorithm [4]. Although the circuits considered here are simple and hardly practical by themselves, this work presents a proof-of-concept demonstration of highly anticipated memristor-based artificial neural networks and paves the way toward extremely dense, high-performance information processing systems.

References

[1] J.J. Yang, D.B. Strukov, and D.R. Stewart, Nature Nanotechnology 8 (2013) 13-24.
[2] F. Alibart, E. Zamanidoost, and D.B. Strukov, Nature Communications, 25 June 2013.
[3] L. Gao, F. Merrikh-Bayat, F. Alibart, X. Guo, B.D. Hoskins, K.-T. Cheng, and D.B. Strukov, in: Proc. NanoArch (2013).
[4] F. Alibart, L. Gao, B. Hoskins, and D.B. Strukov, Nanotechnology 23 (2012) 075201.
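As a software analogy of the crossbar perceptron described in the abstract (a hypothetical sketch, not the hardware of [2]): each signed synaptic weight can be modeled as the difference of two non-negative conductances, one per crossbar column, and in-situ training nudges those conductances directly. The patterns, learning rate, and conductance ranges below are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch of a 10x2 memristive-crossbar perceptron:
# 9 pixel inputs plus one bias row, with each signed weight realized
# as the difference of two conductance columns, w = G_pos - G_neg.
rng = np.random.default_rng(0)

# Two 3x3 binary patterns encoded as +1/-1 and flattened to 9 elements.
X = np.array([
    [1, -1, -1, 1, -1, -1, 1, -1, -1],   # vertical bar   -> class +1
    [1, 1, 1, -1, -1, -1, -1, -1, -1],   # horizontal bar -> class -1
], dtype=float)
y = np.array([1.0, -1.0])

# "Conductance" columns for 10 crossbar rows (9 inputs + bias).
G_pos = rng.uniform(0.4, 0.6, size=10)
G_neg = rng.uniform(0.4, 0.6, size=10)

def output(x):
    """Neuron output: sign of the differential crossbar current."""
    xb = np.append(x, 1.0)                # append the bias input
    return np.sign(xb @ (G_pos - G_neg))

# In-situ-style training: on a mistake, nudge the two conductance
# columns in opposite directions (the perceptron rule on G_pos - G_neg).
eta = 0.05
for _ in range(20):
    for x, t in zip(X, y):
        if output(x) != t:
            xb = np.append(x, 1.0)
            G_pos += eta * t * xb
            G_neg -= eta * t * xb
```

Because the two patterns are linearly separable, the perceptron convergence theorem guarantees this loop settles on weights that classify both correctly; the hardware version in the talk must additionally cope with device variability, which this idealized sketch ignores.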
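The Hopfield-network ADC can likewise be sketched in software, here as a hypothetical discrete-update version of the Tank-Hopfield formulation (the talk's hardware uses analog op-amp neurons with the connection matrix set by the memristor crossbar; the hard-threshold variant below is only an illustration and, like the original network, can settle into a neighboring code for some inputs).

```python
import numpy as np

# Hypothetical discrete sketch of the Tank-Hopfield ADC: four binary
# "neurons" V[0..3] encode the bits of a 4-bit conversion of an analog
# input x in [0, 15].  The crossbar would set connection weights
# T[i][j] = -2**(i+j) (i != j) and bias currents I[i] = x*2**i - 2**(2i-1),
# so that each asynchronous update lowers the network energy
#   E = 0.5 * (x - sum_i 2**i * V[i])**2   (self-terms folded into I).
def hopfield_adc(x, sweeps=10):
    n = 4
    T = np.array([[-(2.0 ** (i + j)) if i != j else 0.0
                   for j in range(n)] for i in range(n)])
    I = np.array([x * 2.0 ** i - 2.0 ** (2 * i - 1) for i in range(n)])
    V = np.zeros(n)
    for _ in range(sweeps):                  # repeated asynchronous sweeps
        for i in range(n):
            u = T[i] @ V + I[i]              # net input to neuron i
            V[i] = 1.0 if u > 0 else 0.0     # hard-threshold "amplifier"
    return int(sum(int(V[i]) << i for i in range(n)))
```

Endpoint inputs convert exactly (e.g. `hopfield_adc(15.0)` yields 15 and `hopfield_adc(0.2)` yields 0), while some mid-range inputs settle one or more codes away from the ideal answer, reflecting the local minima of this energy landscape; the analog dynamics and precise memristor tuning described in the talk address exactly this kind of non-ideality.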

Speaker Bio

Dmitri Strukov is an assistant professor in the Department of Electrical and Computer Engineering at the University of California, Santa Barbara (UCSB). He received an MS in applied physics and mathematics from the Moscow Institute of Physics and Technology in 1999 and a PhD in electrical engineering from Stony Brook University in New York in 2006. He is broadly interested in the physical implementation of computation, including device physics, circuit design, and high-level architecture, with an emphasis on emerging device technologies. His current focus is on various aspects of reconfigurable hybrid nanoelectronic systems utilizing novel resistive switching ("memristive") devices, for applications in digital memories, programmable logic, and neuromorphic networks. Prior to joining UCSB, he worked as a postdoctoral associate at Hewlett-Packard Laboratories from 2007 to 2009.