CIRCUIT DESIGN WITH NANO ELECTRONIC DEVICES FOR
BIOMIMETIC NEUROMORPHIC SYSTEMS
by
Kun Yue
A Dissertation Presented to the
FACULTY OF THE USC GRADUATE SCHOOL
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF PHILOSOPHY
(ELECTRICAL ENGINEERING)
December 2021
Copyright 2021 Kun Yue
Acknowledgments
I would like to thank my advisor, Dr. Alice Parker, for her invaluable guidance and mentorship throughout my journey pursuing a Ph.D. at USC. Her continuous support and encouragement have always boosted my morale when facing difficulties and frustrations during the course of completing this dissertation. Her vision for this research has sparked the curiosity in me to explore the academic challenges in the neuromorphic engineering field and has inspired my motivation to prove that our research is valuable.
I would like to thank my dissertation and qualifying committee members, Dr. Pierluigi Nuzzo, Dr. Peter Beerel, Dr. Han Wang, and Dr. Aiichiro Nakano, whose advice and expertise have made this dissertation excel. In addition, I would like to thank my colleagues Rebecca Lee, Tiansong Cui, Fangzhou Wang, Haipeng Zha and faculty mentors in the Ming Hsieh Department of Electrical Engineering, with whom I have had productive and inspiring discussions about academic work and research. I also would like to thank my colleague and best friend Yizhou Liu from the University of California, Riverside, who has made our teamwork an extraordinary experience.
I would also like to acknowledge the funding support from the USC Research Enhancement Fellowship and the Defense Advanced Research Projects Agency under Grant W911NF-18-2-0264.

Last but not least, I would like to express my gratitude to my loving and supportive family, without whom this dissertation would not have been possible.
Table of Contents
Acknowledgments ii
List of Figures vi
Abstract xi
Chapter 1: Introduction 1
Chapter 2: Background and Related Work 7
2.1 Nano Electronic Devices Background . . . . . . . . . . . . . . . . . 7
2.2 Neuroscience Background of Sub-cortical Circuitry . . . . . . . . . 10
2.3 Neural Network Background . . . . . . . . . . . . . . . . . . . . . 12
2.4 State of the Art in Neuromorphic Systems . . . . . . . . . . . . . . 12
2.5 The BioRC Project . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Chapter 3: Synapse 17
3.1 Multi-state Synapse . . . . . . . . . . . . . . . . . . . . . . . . . . 17
3.1.1 Multi-state Synapse with MAM . . . . . . . . . . . . . . . . 19
3.2 Synapse with STDP Learning . . . . . . . . . . . . . . . . . . . . . 23
3.2.1 Spike Timing Dependent Plasticity (STDP) Algorithm and Circuit . . . . . 23
3.2.2 On-Chip Learning Synapse . . . . . . . . . . . . . . . . . . . 27
3.3 Synapse with Noise . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
3.3.1 Random Pulse Generator . . . . . . . . . . . . . . . . . . . . 29
3.3.2 Discrete Noise Signals . . . . . . . . . . . . . . . . . . . . . 30
3.3.3 Continuous Noise Signals . . . . . . . . . . . . . . . . . . . . 32
3.3.4 Noisy Synapse in Neuron Cell . . . . . . . . . . . . . . . . . 35
3.4 Synapse with Frequency Adaptation . . . . . . . . . . . . . . . . . . 38
3.4.1 Voltage-Dependent Variable-Frequency Axon Hillock . . . . 40
3.4.2 Frequency-selective synapse circuit . . . . . . . . . . . . . . 43
3.4.3 Example STDP-like Frequency Adjustment Circuit . . . . . 46
3.4.4 Frequency-Adaptive Synapse . . . . . . . . . . . . . . . . . . 51
3.4.5 Frequency-Adaptive STDP Synapse with Strength Variation 53
Chapter 4: Astrocyte Circuitry Neuromorphic Implementation 57
4.1 Sub-cortical Circuitry for Neuromodulation . . . . . . . . . . . . . . 57
4.1.1 Astrocyte Circuitry in Brain Stimulation for Parkinson's Disease . . . . . 58
4.1.2 Astrocytic Circuit with MoS2 Device . . . . . 58
Chapter 5: Neuronal Network Neuromorphic Implementation 67
5.1 Cortical Circuitry for Pattern Recognition . . . . . . . . . . . . . . 67
5.2 Memory Circuitry for Sequential Information . . . . . . . . . . . . . 71
Chapter 6: Blank-slate Network for Biomimetic Robot 80
6.1 Synapse with Dopamine Reward, Noise, and STDP . . . . . . . . . 81
6.2 The Blank Slate Network and Bimomimetic Robot . . . . . . . . . 85
Chapter 7: Future Research 90
Appendix A: Simulation: Astrocyte Deep Brain Stimulation in Parkinson's Disease in the Brain Blood Barrier 92
A.1 Motivation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92
A.2 Stimulation Method . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
A.3 Results of Astrocyte Stimulation for Parkinson's Disease Treatment 94
A.3.1 Voltage Gated Calcium Channels . . . . . . . . . . . . . . . 95
A.3.2 Calcium-induced Calcium Oscillation . . . . . . . . . . . . . 96
A.3.3 Glutamate and Receptors . . . . . . . . . . . . . . . . . . . 99
A.3.4 Neurological Modulation for Parkinson's Disease . . . . . . . 101
A.4 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
Bibliography 108
List of Figures
Figure 2.1 (a) A biomimetic excitatory synapse can be modulated by different neural mechanisms such as neurotransmitter concentration, reuptake, and ion pump. (b) A biomimetic hyperpolarizing inhibitory synapse can be modulated by different neural mechanisms such as neurotransmitter concentration, reuptake, and ion pump. [1] . . . . . 15
Figure 2.2 The Axon Hillock Circuit: The single-stage amplifier is used as a voltage interface between VSOMA and the inverter threshold. The inverter is used to generate a full swing from GND to VDD to the spiking circuit. First, the Na+-activation segment (modeled by X1, X2) turns on Na+ channels (X8) rapidly. After some delay (X4), Na+-inactivation (X3) then turns off Na+ channels. This portrays the gradual inactivation of Na+ channels when the action potential reaches a depolarizing state. After some more delay (X4), K+-activation (X10) turns on K+ channels (X7). K+-inactivation (X9) turns off K+ channels after a longer delay (X5, X6 in series). The delay is controlled by adjusting the strengths of transistors. We use the resistive and capacitive properties of the transistors to achieve the desired time constants. X11 is used to pull the AP back to its resting potential. [1] . . . . . 16
Figure 3.1 (a) BioRC synapse circuit modified for this dissertation. AP is Action Potential, NT is neurotransmitter quantity, Re is reuptake control, KR is K+ channel receptor quantity control, and EPSP is excitatory postsynaptic potential. (b) Simulation results of the synapse circuit with 45 nm CMOS. (c) Multi-state synapse circuit exploiting device capacitance. (d) Simulation result of capacitive multi-state synapse circuit with hybrid of 45 nm CMOS and MAM. (e) Resistive MAM multi-state synapse circuit. (f) Simulation result of resistive multi-state synapse circuit with hybrid of 45 nm CMOS and MAM. . . . . . 18
Figure 3.2 Schematic view of the magnetic domain wall analog memristor. Yellow arrows indicate the magnetization direction. An electrical current flow in the x-direction could induce domain wall motion in the magnetic free layer. The tunnel magnetoresistance (TMR) of this device is read out using a vertical current (z-direction). Inset shows the calculated resistance of the device after injecting both positive and negative current with amplitude 5 × 10^11 A/m^2 and 1 ns duration. . . . . . 21
Figure 3.3 (a) STDP learning circuit. (b) Simulation results of the
STDP learning circuit and the MAM response. . . . . . . . 27
Figure 3.4 Illustration of basic STDP learning element implementation (CMOS 45 nm) including pre/post-synaptic neurons simplified to the axon-hillock, synapse circuit with MAM, STDP learning circuit, current mirror for isolation, and capacitor for current integration. The resistance of the MAM is tuned according to the output spikes of the pre-/postsynaptic axon hillock following the STDP rule described above. . . . . . 28
Figure 3.5 A cross-section view of the single photon random pulse gen-
erator . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
Figure 3.6 Circuit of the discrete noise generator . . . . . . . . . . . . 31
Figure 3.7 Discrete noise signals: Random pulses (upper trace) and
simulated photon voltage (lower trace) . . . . . . . . . . . . 32
Figure 3.8 Corner situation of random pulses with positive and nega-
tive values . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
Figure 3.9 Current mirror circuit . . . . . . . . . . . . . . . . . . . . . 33
Figure 3.10 Circuit for continuous noise . . . . . . . . . . . . . . . . . . 34
Figure 3.11 Continuous noise signal and summed positive and negative
noise signals . . . . . . . . . . . . . . . . . . . . . . . . . . 34
Figure 3.12 Structure of the experimental noisy neuron [2] . . . . . . . 35
Figure 3.13 BioRC excitatory synapse circuit [3] . . . . . . . . . . . . . 36
Figure 3.14 Simulation result with variable neurotransmitter availabil-
ity in the excitatory synapse . . . . . . . . . . . . . . . . . 36
Figure 3.15 Variable threshold axon hillock circuit . . . . . . . . . . . . 37
Figure 3.16 Simulation result of variable threshold in a neuron circuit . 37
Figure 3.17 Spiking Axon Hillock Circuit with Voltage-Dependent Spiking Frequency. The triangles are buffers consisting of two serially connected inverters. . . . . . 40
Figure 3.18 Output spikes with different PSP levels. When PSP levels are 0.5 V, 0.6 V and 0.7 V, output spike frequencies are 20 MHz, 130 MHz and 280 MHz respectively. . . . . . 42
Figure 3.19 Output spikes with different PSP levels. When PSP levels are 0.5 V, 0.6 V and 0.7 V, output spike frequencies are 20 MHz, 130 MHz and 280 MHz correspondingly. . . . . . 43
Figure 3.20 Frequency-Selective Synapse Circuit . . . . . . . . . . . . . 44
Figure 3.21 PSP amplitude response respective to input spike frequency 45
Figure 3.22 Frequency-Selective Synapse Verification Circuit . . . . . 46
Figure 3.23 Frequency-Selective Synapse Verification Circuit Simulation Result . . . . . 46
Figure 3.24 Frequency-Selectivity learning circuit . . . . . . . . . . . . 48
Figure 3.25 Frequency-selective learning simulation result . . . . . . . . 50
Figure 3.26 Frequency adaptive synapse test circuit . . . . . . . . . . . 52
Figure 3.27 Frequency adaptive simulation result. The pre spikes fire at 300 MHz for the first 8 µs and then at 130 MHz, the post spikes fire at 300 MHz, the initial value of the memristor resistance is 2.8 MΩ and its minimum value is 800 kΩ. . . . . . 53
Figure 3.28 Frequency adaptive synapse with strength variation test circuit . . . . . 54
Figure 3.29 Frequency adaptive synapse with strength variation test circuit. The pre spikes fire at 300 MHz for the first 8 µs and then at 130 MHz, the post spikes fire at 300 MHz, the initial values of the MEM1 and MEM2 resistances are 2.8 MΩ and 800 kΩ, and the minimum value of the memristor resistance is 800 kΩ. . . . . . 56
Figure 4.1 I-V Curve of MoS2 FET . . . . . 61
Figure 4.2 Ids vs. Vds with Back-Gate Voltage and Fixed Top-Gate Voltage . . . . . 61
Figure 4.3 BioRC MoS2 excitatory synapse circuit [4] . . . . . 62
Figure 4.4 The Action Potential and EPSP of the MoS2 Synapse under Normal Operation . . . . . 62
Figure 4.5 Synapse EPSP Simulation Result under Back-Gate Control . . . . . 63
Figure 4.6 Delay and Maximum Value under Process Variation . . . . . 63
Figure 4.7 Astrocyte Microdomain Circuit [5] . . . . . 64
Figure 4.8 Sketch of Astrocyte Neural Network . . . . . . . . . . . . . 65
Figure 4.9 Simulation Result of Astrocytic Ca2+ Signal with Synaptic Cleft Signal . . . . . 66
Figure 4.10 Simulation Result of Astrocytic Modulation to Synapse 6 . 66
Figure 5.1 (a) Fully connected neural network implemented in this
neuromorphic system. (b) Patterns example input to the
neural network. (c) Simulation result of one pattern recog-
nition exercise. . . . . . . . . . . . . . . . . . . . . . . . . 68
Figure 5.2 STDP-Dopaminergic learning circuit . . . . . . . . . . . . . 69
Figure 5.3 (a) The three patterns fed into the feed-forward neuronal network. (b) The 20 output neurons' response during the on-the-fly learning process of the three successive patterns . . . . . 70
Figure 5.4 Recurrent neuronal network for sequential memory. . . . . 72
Figure 5.5 Recurrent neuronal network during learning process. . . . . 73
Figure 5.6 Recurrent neuronal network during recall process. . . . . . 74
Figure 5.7 Simulation result of the recurrent neuronal network for se-
quential memory. . . . . . . . . . . . . . . . . . . . . . . . 75
Figure 5.8 Extended recurrent neuronal network for sequential mem-
ory and recognition. . . . . . . . . . . . . . . . . . . . . . . 76
Figure 5.9 Extended Recurrent Neuronal Network during the Learning
Process. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
Figure 5.10 Extended recurrent neuronal network during recall process. 79
Figure 6.1 Dopamine-modulated STDP Synapse Circuit. The gate
names with arrows are inputs, while others are biasing con-
trols. No extra capacitance is added, charge storage relies
on the capacitances of transistors. . . . . . . . . . . . . . . 82
Figure 6.2 Dopamine-Modulated STDP synapse Waveforms . . . . . . 84
Figure 6.3 Two-Chain Cortical Neuronal Network Simulated with SPICE 86
Figure 6.4 Waveform showing two neurons signaling forward and back-
ward movement, NA1 and NA2. Simulated stimulus and
response of subcortical system when desire and noise were
introduced caused NA1 to spike, dopamine was released
and strengthened synapses in the A chain. Fear and noise
caused neuron NB1 to spike, and simulated movement caused
dopamine to be released, strengthening synapses in the B
chain. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
Figure 6.5 Waveform showing two neurons signaling forward and backward movement, NA1 and NA2. Simulated stimulus and response of the subcortical system when desire and noise were introduced caused NA1 to spike, dopamine was released and strengthened synapses in the A chain. Fear and noise caused neuron NB1 to spike, and simulated movement caused dopamine to be released, strengthening synapses in the B chain. This screenshot is from the video described earlier. . . . . . 89
Figure A.1 (a) SEM Image of Magneto-electric Nano Particles. (b)
Polarization by External Magnetic Field. (c) AFM Image
of Magneto-electric Nano Particles. . . . . . . . . . . . . . 94
Figure A.2 The Flow Diagram of the Astrocytes Stimulation for Parkin-
son's Disease Treatment . . . . . . . . . . . . . . . . . . . . 95
Figure A.3 Charge Moved into Astrocyte with Membrane potentials
and Accumulated Times . . . . . . . . . . . . . . . . . . . 97
Figure A.4 (a) The Diagram of CICR in Astrocytes. (b) The Ca2+ Oscillation in Astrocyte Cytosol. (c) The Ca2+ Oscillation in ER. (d) The IP3 Oscillation in ER. . . . . . 99
Figure A.5 AMPAR Conductance Simulation with Time and Neuro-
transmitter Concentration . . . . . . . . . . . . . . . . . . 100
Figure A.6 (a) Conductance of NMDAR. (b) Charge flow through NMDAR. . . . . . 101
Figure A.7 Simulation Result of Normal Control . . . . . . . . . . . . 102
Figure A.8 Simulation Result of Parkinson's Disease . . . . . . . . . . 102
Figure A.9 Connections within the network are according to known
biology of the astrocytes, basal ganglia, and Thalamus with
DBS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103
Figure A.10 Simulation Result of Parkinson's Disease with DBS . . . . 104
Figure A.11 Connections within the network are according to known
biology of the astrocytes, basal ganglia, and Thalamus with
MNPs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
Figure A.12 Simulation Result of Parkinson's Disease with Astrocyte
Stimulation of MNPs . . . . . . . . . . . . . . . . . . . . . 106
Abstract
This dissertation presents research on neuromorphic circuits and nanotechnologies that support learning. This work started from multi-state synapse circuit design and explored implementations in all-CMOS circuits and hybrids of CMOS and nanoelectronic devices. A magnetic analog memristor (MAM) is the nanoelectronic device used in this synapse circuit to realize multiple weight states. Based on the multi-state synapse circuits, features including short- and long-term memory and noise are added to enable reward-based learning using our synapse circuits. Furthermore, a frequency adaptation feature is explored to increase the nonlinearity for more complex learning algorithms.
Based on the improved synapse circuits and previous work done by the BioRC group, neuronal circuitries including cortical, subcortical, and memory networks are demonstrated in this work. The cortical circuit is a 2-layer feedforward network that can learn with spike-timing-dependent plasticity (STDP) in an online and on-chip manner to perform image pattern recognition. An astrocyte network that modulates neuronal behavior is also presented as subcortical circuitry, and another nanoelectronic device, MoS2, is used to optimize this implementation. We also studied the astrocyte network numerically for Parkinson's disease brain stimulation treatment. The neuronal memory circuitry, especially sequential memory, is implemented as a neuron chain with STDP synapses.
The last piece of this work is a blank-slate network for biomimetic robot control. This network enables a robot to learn movement from emotions such as desire and fear. The emotions map to possible motion neurons evoked in a cortical manner; each motion neuron is followed by a neuron chain communicating with the robot's actuators, sensors and reward neurons. With noisy random emotion inputs, the robot learns the corresponding movements and forms sequential memories. The learned result is not forgotten when learning new actions, which satisfies lifelong learning.

In conclusion, a multi-state synapse circuit with both linear and nonlinear functions is proposed to increase the parallelism of this neuromorphic system. A memristor device in the synapse circuit provides analog values to enable the multiple states and frequency nonlinearity. The subcortical parts are implemented as astrocytic and dopaminergic regulation circuits. These parts enable the transformation from short-term memory to long-term memory. With a sequential memory implemented by the multi-state synapse and neuron chain structure, a solution to the lifelong learning problem is demonstrated.
Chapter 1
Introduction
Capturing the behavior of biological neurons in the brain with neuromorphic circuits is difficult because of the scale of the problem (and resulting power issues), the complexity of the behavior of individual neurons, the density of interconnections between neurons, and the plasticity of the biological structures. Nanotechnology offers some significant advantages of scale, power, circuit complexity reduction, interconnect density and plasticity. Specifically, this thesis focuses on two nanotechnologies, MoS2 and analog memristors, that reduce circuit complexity and support plasticity, while contributing to interconnection density reduction, and potentially scale and power reduction.
As interdisciplinary research, engineering biomimetic neuromorphic systems with advanced nanoelectronics technologies presents challenges from different fields. The first challenge is understanding the essential reasons for neural behaviors or phenomena. For example, memory needs persistence, analog range and on-the-fly variability for lifelong learning. To satisfy these requirements, a domain-wall-based analog memristor is adopted as a neuromorphic memory device in this thesis. Based on this device, the circuit designs of the synapse, learning element and network are modified and optimized. The second challenge is comprehensively understanding both the state-of-the-art technology and alternative
technologies, including their features, advantages and disadvantages. Some advanced nanotechnologies are extremely attractive in certain aspects, but may have unexpected limitations for specific applications. For example, the two-dimensional material graphene is famous for its incredibly high electron mobility and ultra-low resistance [6], which looks perfect for the interconnect application in circuit layouts. However, its intrinsic low current density cannot support enough current flow for CMOS with reasonable layout geometry, which makes the application impractical. The third challenge is designing the neuromorphic circuits according to the knowledge of neuroscience and nanotechnology. Besides standard requirements of circuit design, including functionality and optimization, there can be tricky issues arising from nanotechnology or neuroscience. For example, to realize certain functions efficiently, multiple technologies may need to be hybridized, such as CMOS and carbon nanotubes [3]. However, the compatibility of different technologies in terms of voltage or current is always a challenge. Hence, we may need the circuitry to complement the compatibility of different technologies without compromising designs.
The hypothesis of this dissertation is that a more bio-plausible neuromorphic system including complex plasticity, localized memory, and an efficient network system could improve modern artificial intelligence technology in terms of computing asynchrony, hardware volume, and learning without forgetting. To demonstrate that, I will implement more biomimetic characteristics enabled by advanced nanotechnology in my pre-developed spiking-based neuromorphic system and show the improvement in learning and recognition tasks. I will also demonstrate how learning without forgetting can occur using the BioRC neural complexity that relies on biological neural complexity to capture more complex behavior. This complexity is based on synaptic memory persistence, neurohormones (especially dopamine), variable (noisy neural) behavior, and complex dendritic computations.
To engineer a bio-plausible neuromorphic system, practical problems of circuitry such as transistor count, interconnection density and persistent memory need to be taken into consideration first. I propose to construct neurons/synapses with persistent memory while reducing interconnection density and circuit complexity. Furthermore, I am going to show how to combine neuron behavior with astrocytic intervention while reducing circuit complexity.

Besides the above fundamental engineering issues, noise can play beneficial roles in the nervous system. For example, noise causes random neural firing and the learning of new results when synapses are strengthened. A proposed noisy neuron is demonstrated in the following chapters. I am going to apply the noisy neurons to develop an on-the-fly, dynamically learning neuronal network. The noise will help the learning network avoid trapped results, leading to a lifelong learning system. I will show how a neuronal network can learn to recognize a character by firing in a certain pattern, then learn to recognize a different character by firing with a different pattern, without forgetting the first character recognition.
The following work illustrating the engineering specifics of the proposed research is described. I investigate the quantitative improvement of introducing the MoS2 device in neuromorphic circuits and explore more neural functions that can be implemented with an MoS2 device. Moreover, the persistence capability of the MAM will be compared with CMOS using mathematical equations and simulation of leakage in SPICE. Furthermore, the large neural network developed using our circuits and advanced device technologies will be analyzed in detail, including power efficiency, parallelism, and asynchronous features.
The results of this thesis research lead to a spiking neuromorphic system framework including nanoelectronic device SPICE models, biomimetic neural element circuits, and a neuronal network that partially integrates fundamental results of BioRC circuits into a system-level implementation and serves as a platform for future development toward an artificial brain. At present, this framework is based on CMOS 45 nm technology and includes 3 nanoelectronics SPICE models, 3 basic nanotechnology device circuits with their subtypes, 2 kinds of learning methods and network examples. Moreover, applications like image pattern recognition and sequential memory have been achieved. The details will be explained in later chapters.
This spiking neuromorphic system framework is born of the BioRC project, hence it inherits CMOS analog neuromorphic circuits such as basic synapse and axon hillock circuits. To simulate the advanced nanoelectronic devices incorporated with the CMOS-based circuits, Verilog-A is used to model the nanoelectronic devices. The compact Verilog-A device models are calibrated with experimental data or numerical simulation data to achieve sufficient validation. The 3 nanoelectronics SPICE models mentioned above include a single photon avalanche diode (SPAD) model, a Molybdenum Disulfide (MoS2) FET model, and a domain-wall-based magnetic analog memristor (MAM) model. The SPAD model is based on a diode model, and the photons are simulated as voltage pulses. The MoS2 FET model is developed using physical equations with experimental data calibration. The MAM model is a behavioral model with look-up tables of numerical simulation data.
The nanoelectronic devices are implemented as circuits to realize certain neural functions, such as a random pulse generator, an astrocyte micro-domain, and a spike-timing-dependent-plasticity learning element. To allow hierarchical usage of these circuits in plug-and-play fashion, complex circuits are divided into small sub-circuits with ports, and all the circuits are collected into a library for reuse.

A netlist generator written in Python is developed to assemble neuromorphic circuits into networks of neurons. These neuronal/astrocytic networks with feedback and neuromodulation are distinguished from traditional neural networks in terms of temporal and spatial summation, neuromodulation, asynchrony, dendritic computations, dendritic spiking and plasticity, astrocytes and network heterogeneity. At present, the neuronal network can realize functions of image pattern recognition and sequential memory.
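As an illustration of this assembly flow, the short Python sketch below stitches library sub-circuits into a SPICE netlist. The subcircuit names, port orderings, and library file name here are hypothetical placeholders for illustration, not the actual BioRC library contents.

```python
# Minimal sketch of a Python netlist generator in the spirit of the one
# described above. Subcircuit names, port orders, and the library file
# name are hypothetical placeholders, not the actual BioRC library.

SUBCKT_LIB = {
    # subcircuit name -> ordered port list (assumed for illustration)
    "synapse_mam":  ["ap_in", "epsp_out", "set", "reset", "vdd", "gnd"],
    "axon_hillock": ["psp_in", "spike_out", "vdd", "gnd"],
}

def instance_line(name, subckt, net_map):
    """Emit one SPICE X-instance line, mapping each port to a net name."""
    nets = " ".join(net_map[port] for port in SUBCKT_LIB[subckt])
    return f"X{name} {nets} {subckt}"

def two_neuron_chain():
    """Assemble a toy pre-neuron -> synapse -> post-neuron netlist."""
    lines = [
        "* auto-generated neuromorphic netlist",
        '.include "biorc_library.sp"',  # hypothetical sub-circuit library file
        instance_line("syn1", "synapse_mam",
                      {"ap_in": "pre_spike", "epsp_out": "psp1",
                       "set": "set1", "reset": "reset1",
                       "vdd": "vdd", "gnd": "0"}),
        instance_line("ah1", "axon_hillock",
                      {"psp_in": "psp1", "spike_out": "post_spike",
                       "vdd": "vdd", "gnd": "0"}),
        ".end",
    ]
    return "\n".join(lines)

if __name__ == "__main__":
    print(two_neuron_chain())
```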
Chapter 2 introduces all the background of the adopted bio-inspired concepts and summarizes the implementation of neuromorphic circuits in comparison with related work. Chapters 3-5 present the details of the designs, from neural cell circuits and small networks to real-world applications. Chapter 7 states my future research plan based on the finished work. The appendix covers my early research on magneto-electrical nanoparticles for Parkinson's Disease treatment through astrocyte stimulation.
Chapter 2
Background and Related Work
In this chapter, I will introduce the biology implemented by our circuits and the advanced device technologies adopted to enable the circuit designs or improve the previous designs. Then I will compare this proposed research with completed work in our group and other groups to present the prospective improvements or trade-offs.
2.1 Nano Electronic Devices Background
Complex neuromorphic circuits that exhibit advanced learning and memory are believed to depend on the dense interconnectivity, scale and interactions of biological neural circuits with other brain cells, including astrocytes. Analog circuits are a leading candidate for implementation to compress the hardware required per neuron by exploiting the ability to "compute" using charge, current and voltage. Even so, constructing these circuits at scale (10,000 synapses per neuron, fan out of 10,000 to adjacent neurons, and billions of neurons in the artificial brain) will probably require innovations in nanoelectronics, including layers of nanomaterial in addition to 3D CMOS connectivity. A particular nanotechnology, the MoS2 transistor, possesses a dual-gate structure that supports tight integration of astrocytic circuits with neural circuits by exploiting the back gates of the MoS2 transistors to incorporate astrocytic control of the neural circuits. This innovation removes the need for an explicit adder that incorporates astrocyte intervention into the postsynaptic potential of the target neurons, possibly saving a dozen transistors for each synapse implemented with this MoS2 device.
Transition metal dichalcogenide (TMD) transistors, including the MoS2 FET, have high electron mobility, independent gate control, and strain flexibility that affects mobility. Prototypes of monolayer TMD transistors and circuits have been successfully demonstrated [7]. Similar to monolayer graphene, monolayer TMD is a 2-D honeycomb lattice and is a robust thin-film structure. Instead of carbon atoms, it consists of a transition metal (denoted as M) and chalcogen atoms (denoted as X), and thus its chemical formula is expressed as MX2 [8]. It has electrical and physical properties like graphene, and it has an intrinsic band gap that limits the short-circuit current better than graphene FETs. In this work, MoS2, the most well-studied TMD material, is chosen to implement synapses and astrocyte circuits. Circuits implemented with MoS2 FETs will have low short-circuit current and supply voltage. MoS2 can be deformed with strain, making it a semiconductor that is interesting for dual-gated electronics applications. A simulation was performed to study the strain effect on the dual-gated MoS2 FET [9]. Furthermore, Desai et al. [10] presented a 1 nm gate length transistor fabricated with MoS2 and carbon nanotubes, which raises the possibility of making neuromorphic circuits closer to the human brain in size, number and density of synapses and neurons.¹

¹ Working carbon nanotube synapses were demonstrated early by the BioRC group in conjunction with Zhou's Nanolab [11], and many circuit models of carbon nanotube neurons were simulated with carbon nanotube device models beginning with [4].
Spintronic devices are solid-state devices exploiting the intrinsic spin, electrical charge and magnetic moment of materials for computing purposes [12]. Spintronic devices have been proposed as promising hardware candidates for neuromorphic computing due to their prominent properties such as non-volatility and low power consumption. Neuromorphic computing with spin-based devices such as magnetic tunnel junctions has been theoretically proposed and recently experimentally demonstrated [13-15]. With a set of recently discovered physical phenomena such as the spin transfer torque and the spin Hall effect (SHE), spintronic devices have evolved significantly, with more flexible device architectures and higher power efficiencies [16-19]. Furthermore, spintronic devices are also compatible with CMOS, which is highly desirable for technological applications. In this proposed work, a spintronic device, the MAM, is incorporated into the BioRC synapse circuit to represent the strength states, and the STDP circuit implemented with CMOS is designed to adjust the resistance states of the MAM dynamically.
Hu et al. [20] reported a multi-state memristor implemented by parallel-connected stochastic memristive switches with 10 resistance states, an average state-switching time of 0.1 s, and 4.5 mW of power. Boybat et al. [21] reported IBM's most advanced result, which uses phase-change memory to achieve a multi-memristive synapse with temporal signal correlation detection. In this proposed research, the analog memristor has a maximum of 10 states with a deterministic switching property, a 5 ns switching time, and 0.125 mW of power.
2.2 Neuroscience Background of Sub-cortical Circuitry

Besides the cortex, subcortical parts of mammalian brains essential for survival can be found in all vertebrates, including the basal ganglia, which are responsible for reinforcement learning, and the cerebellum, which provides the brain with forward models of motor commands [22]. Long-range connection rather than local connection is the major feature of these brain parts. These subcortical parts are in charge of organizing thousands of cortical networks and managing the global flow of information in the cortex. For example, the dopamine neurons in the brainstem provide reward prediction and neurotransmitters, and are key computational cells in the temporal difference reinforcement learning process. The dopamine neurons in the midbrain are observed throughout the cortex and basal ganglia; these neurons modulate synaptic plasticity and provide motivation for obtaining long-term rewards [23]. Several other neuromodulatory systems also control global brain states to guide behavior, representing negative rewards, surprise, confidence, and temporal discounting [24]. Therefore, a biomimetic neuromorphic system should include not only the cortex for special tasks, but also system-level communication including the brain's subcortical regions.
The astrocyte plays an important role in information processing in the human brain [25], including neuronal communication. An astrocyte monitors synaptic activity in an autonomous manner through individual regions, known as microdomains. Astrocytic microdomains are activated according to the release of neurotransmitters from synapses in close proximity. Activation of microdomains induces calcium waves that propagate into different compartments of the astrocyte. It is the dynamics of calcium waves that evoke the release of transmitters, called gliotransmitters, from the microdomain. This endows the astrocyte with the ability to modulate synaptic information. Their capability to synchronize neuronal activity by inducing slow inward currents (SICs) on adjacent dendrites with a high degree of temporal correlation is currently a subject of interest due to its possible role in cognitive processing [26].

The BioRC project introduced astrocytes into neuromorphic circuits beginning in 2010 [27]. Irizarry and Parker [5] presented an astrocyte microdomain circuit implemented using CMOS 0.18 µm technology including a 10-transistor analog voltage adder. In this dissertation, the adder combining the effects of neurotransmitters and gliotransmitters is replaced by one dual-gated MoS2 FET, and the circuit implemented achieves similar circuit behavior with 90% fewer transistors in the circuit combining astrocyte and neural signaling.
2.3 Neural Network Background
Neural network models inspired by the cerebral cortex are high-dimensional dynamical systems that learn how to map input spaces into output spaces. They have achieved great success in solving specific problems, while taking advantage of their parallelism scaling power. The largest network today, presented by OpenAI, has about 175 billion synapses and requires 3640 petaflop/s-days for training. In comparison, the number of synapses in the human cerebral cortex is about 100,000 billion, roughly 500 times greater than the network. Hence, as a deployment paradigm, a primary goal of biomimetic neuromorphic systems is increasing parallelism as much as possible.
2.4 State of the Art in Neuromorphic Systems
Neurogrid was created at Stanford University as a programmable supercomputer designed to explore various hypotheses about the inner workings of a mammalian cortex [28]. The Neurogrid board includes 16 chips (Neurocores), each with a 256 × 256 matrix of real-time spiking neurons (eight-transistors-per-soma integrate-and-fire circuits), consuming 5 W of power. A Field-Programmable Gate Array (FPGA) and a bank of SRAM are used for interneural communication [29].
The reconfigurable online learning spiking neuromorphic processor (ROLLS) is capable of emulating the biophysics of real spiking neurons and dynamic synapses [30]. It was designed by the Institute of Neuroinformatics of ETH Zurich and the University of Zurich. The chip comprises 12.2 M transistors with 128K analog synapses and 256 neural circuits in a 51.4 mm² area with 4 mW of power. It supports online hardware learning of pattern recognition in a machine vision task through short- and long-term plasticity.
SPIking Neural Network Architecture (SpiNNaker) is a massively parallel neuromorphic supercomputer at the University of Manchester. The supercomputer consists of 57,600 chips, each containing 18 ARM9 processor cores. The complete design requires about 100 kW. It also uses address event representation (AER) to transmit spikes between individual cores through a digital bus. This supercomputer is part of the Human Brain Project as a hardware platform for various neural algorithms for both cortical and subcortical emulation [31].
In mid-2014, IBM announced the neuromorphic chip TrueNorth, with 1 M neurons and 256 M synapses (5.4 billion 28-nm transistors in 4.3 cm²) [32]. The utility of this off-line-trained processor was shown for real-time object detection in video, while consuming about 60 mW of power.
Loihi, by Intel, is a 60 mm² chip fabricated in a 14-nm process that integrates hierarchical connectivity, dendritic compartments, synaptic delays, and programmable synaptic learning rules. The chip instantiates a total of 2.07 billion transistors and 33 MB of SRAM (including 16 MB of synaptic memory) over its 128 neuromorphic cores (1024 neurons/core) and three x86 cores in a single die. Its embedded mesh protocol supports scaling to 4096 on-chip cores and, through hierarchical addressing, up to 16,384 chips [33].
Sengupta reported ultra-low-power neuron (1.6 fJ per neuron per time step) and stochastic binary synapse designs with novel post-CMOS technologies, and showed simulation results of MNIST dataset recognition in an STDP-based SNN with over 99% accuracy [34]. Several works reported neuromorphic STDP designs incorporating metal oxide memristor devices [35-38]. The neuromorphic STDP implementation with CMOS is explored thoroughly in [39-41]. Joshi et al. [42] implemented neuromorphic STDP learning with carbon nanotube transistors. In this dissertation, the neuromorphic STDP learning is implemented with a hybrid of 45-nm CMOS and MAM.
2.5 The BioRC Project
The work presented here is part of the BioRC Project at USC. The BioRC group designs neuromorphic hardware and systems in silico using analog electronic circuits.
Figure 2.1: (a) A biomimetic excitatory synapse can be modulated by different neural mechanisms such as neurotransmitter concentration, reuptake, and ion pump. (b) A biomimetic hyperpolarizing inhibitory synapse can be modulated by different neural mechanisms such as neurotransmitter concentration, reuptake, and ion pump. [1]
Previous work from our group includes circuit designs of excitatory and inhibitory synapses shown in Fig. 2.1 [4], the dendritic arbor [43], and the axon hillock [44] shown in Fig. 2.2. These circuit components are designed in a modular fashion and can be connected together to form neuron circuit models. Mahvash explored memristor device applications in neuromorphic circuits and variable behavior in synapses and firing thresholds [45]. Joshi, Irizarry and Lee developed glial microdomain and astrocyte circuitry using the neural cell circuits [27,46,47]. Hsu researched dendritic computations and developed a neuromorphic network embedded with border-ownership features as a visual cortex model [1]. Barzegarjalali developed neuromorphic circuits mimicking schizophrenic symptoms and Obsessive-Compulsive Disorder among many applications [2,48].
Figure 2.2: The Axon Hillock Circuit: The single-stage amplifier is used as a voltage interface between VSOMA and the inverter threshold. The inverter is used to generate a full swing from GND to VDD to the spiking circuit. First, the Na+-activation segment (modeled by X1, X2) turns on Na+ channels (X8) rapidly. After some delay (X4), Na+-inactivation (X3) then turns off Na+ channels. This portrays the gradual inactivation of Na+ channels when the action potential reaches a depolarizing state. After some more delay (X4), K+-activation (X10) turns on K+ channels (X7). K+-inactivation (X9) turns off K+ channels after a longer delay (X5, X6 in series). The delay is controlled by adjusting the strengths of transistors. We use the resistive and capacitive properties of the transistors to achieve the desired time constants. X11 is used to pull the AP back to its resting potential. [1]
Chapter 3
Synapse
The synapse researched in this dissertation has been modified to enhance learning and memory.
3.1 Multi-state Synapse
Fig. 3.1 (a, b) shows a Biomimetic Real-time Cortex (BioRC) analog CMOS excitatory synapse circuit in 45 nm technology designed for this dissertation research [49] and its transient simulation results. The input action potential (AP) consists of spikes generated by a CMOS axon-hillock circuit [50] with a maximum amplitude of 0.65 V. The excitatory postsynaptic potential (EPSP) magnitude of this particular synapse circuit is approximately 14% of the action potential, with about five times the duration of the action potential. This particular simplified BioRC synapse design realizes short-term memory through the duration of the EPSP, and can support long-term memory by adjusting the input of the neurotransmitter knob NT to control neurotransmitter concentration in the synapse. Other BioRC synapses also allow control of ion channel receptor concentration, providing another memory mechanism [42]. However, the BioRC synapse as implemented in CMOS does not provide persistent memory unless the NT and receptor controls are generated continuously, since charge leakage occurs. To mimic a multi-state human brain synapse biologically, the resistance properties of the MAM are exploited in this neuromorphic system.
Figure 3.1: (a) BioRC synapse circuit modified for this dissertation. AP is Action Potential, NT is neurotransmitter quantity, Re is reuptake control, KR is K+ channel receptor quantity control, and EPSP is excitatory postsynaptic potential. (b) Simulation results of the synapse circuit with 45 nm CMOS. (c) Multi-state synapse circuit exploiting device capacitance. (d) Simulation result of capacitive multi-state synapse circuit with hybrid of 45 nm CMOS and MAM. (e) Resistive MAM multi-state synapse circuit. (f) Simulation result of resistive multi-state synapse circuit with hybrid of 45 nm CMOS and MAM.
Fig. 3.1 (c, d) shows the multi-state synapse circuit exploiting device capacitance and its simulation results. The gate voltage of the neurotransmitter input (NT) changes with respect to the signals NT Increase and NT Decrease, and the output of the synapse circuit, EPSP, will change according to the voltage of NT. This design can theoretically achieve an unlimited number of states and ultra-low power consumption due to its charge-based operation.
3.1.1 Multi-state Synapse with MAM
The multi-state synapse design exploiting device capacitance is highly limited by device capacitance leakage and fabrication yield. To achieve a robust multi-state synapse circuit design, a nanoelectronic device, the magnetic domain wall analog memristor (MAM), is introduced to implement a resistive multi-state synapse design. The structure of a MAM is illustrated in Fig. 3.2. It consists of a heavy metal (HM) layer, a magnetic free layer with perpendicular magnetic anisotropy (PMA), a tunnel barrier and a magnetic fixed layer. The magnetic free layer hosts a magnetic domain wall (DW) that separates the spin-up (blue) and spin-down (red) regions. When an electrical current is flowing in the negative x-direction through the heavy metal, a y-polarized spin current is injected into the free layer via the spin Hall effect (Fig. 3.2) and drives a DW motion against the direction of the current. Due to the pinning effects (both intrinsic and extrinsic pinning), there exists a critical current to initiate the DW motion. Only current with amplitude above the critical value can trigger a DW motion, and the critical current here is 0.1 mA (about 1 × 10^11 A/m^2). The tunnel magnetoresistance (TMR) of this device can be read out by a vertical (z-direction) electrical current. Note that the reading current used here is well below the critical current of the DW motion, so it does not affect the position of the DW. The TMR is determined by the relative magnetization direction between the two magnetic layers and thus depends on the explicit position of the DW in the free layer. Since the DW motion is almost continuous depending on the sample shape, defects and other variations, the TMR also changes continuously, imitating the continuously-varying strength of an analog synapse.
Combining the in-plane and out-of-plane current, this MAM device is able to achieve the functionality of an analog memristor. A fixed in-plane current pulse with amplitude above the critical value is employed for the DW motion. Each incoming current pulse changes the DW position and the corresponding resistance state, and the resistance state can be read out via the out-of-plane current. Here we consider a magnetic free layer with geometry 1000 nm × 108 nm × 1 nm. A Néel-type magnetic domain wall is stabilized in the free layer. A series of micromagnetic simulations were performed in order to capture the microscopic dynamics of the MAM (for details of the micromagnetic simulation, see Methods in [51]). Furthermore, perpendicular magnetic anisotropy (PMA) variations, which are usual in experiments and could affect the device performance, were also considered in the simulations. 50 MAM devices with different PMA variations were simulated and implemented in our circuit simulations. The calculated resistance (averaged over 50 devices) as a function of the input current pulse is shown in the Fig. 3.2 inset, which is fitted to the reported experimental values [52]. Depending on the current direction, the device resistance either increases or decreases quasi-linearly, which is similar to the functionality of the Set/Reset signal for an analog memristor.
[Figure 3.2 schematic: layers labeled HM, free layer, tunnel barrier, fixed layer, with read current (z-direction) and write current (in-plane); inset plots Resistance (Ω) versus time (ns) for +x and -x current pulses.]
Figure 3.2: Schematic view of the magnetic domain wall analog memristor. Yellow arrows indicate the magnetization direction. An electrical current flow in the x-direction could induce domain wall motion in the magnetic free layer. The tunnel magnetoresistance (TMR) of this device is read out using a vertical current (z-direction). Inset shows the calculated resistance of the device after injecting both positive and negative current with amplitude 5 × 10^11 A/m^2 and 1 ns duration.
To demonstrate the potential of this novel spintronic analog memristor in neuromorphic circuits and systems, an integrated SPICE model is used for circuit simulation. We extracted the micromagnetic simulation results as a lookup table and implemented it in Verilog-A to create a SPICE model.

The MAM is assumed to be a 4-terminal device, where 2 terminals are used for the resistor and the other 2 terminals are used for controlling the resistance via Set/Reset. The resistance is decreased by the Set signal until reaching a minimum value and increased by the Reset signal until a maximum value is reached. The result is shown in Fig. 3.1 (f). The synaptic plasticity of the following neuromorphic circuits is based on this device. The MAM is connected to the EPSP node in this synapse circuit to achieve multi-level current output.

Fig. 3.1 (e, f) shows the circuit design and simulation results. The Set/Reset signal varies the resistance of the MAM, and the output voltage of the synapse circuit is input to the resistance terminal of the MAM, resulting in an excitatory postsynaptic current (EPSC) output. A load capacitor is connected at the MAM output to measure the EPSP variation in voltage. The input AP is generated by an axon-hillock circuit, and the Set/Reset signal is set as stimuli. The resistance of the MAM changes deterministically according to each Set/Reset signal, which is a fixed current pulse with ±0.5 mA amplitude and 1 ns duration generated by a pulse generator. The MAM also contains pinning areas for the DW near its ends, so that when the resistance of the MAM reaches its maximum (minimum) value, it will not respond to the Reset (Set) signal any more.
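For readers who want a software analogue of this Set/Reset behavior, the following is a minimal Python sketch of the MAM as a bounded resistance state machine. The resistance bounds reuse values quoted later in this chapter (2.8 MΩ initial, 800 kΩ minimum), while the per-pulse step size and the linear update are illustrative assumptions; the actual SPICE model is a Verilog-A lookup table fitted to the micromagnetic data.

```python
# Minimal behavioral sketch of the MAM Set/Reset response described above.
# The per-pulse step size and linear update are illustrative assumptions;
# the actual SPICE model is a Verilog-A lookup table fitted to
# micromagnetic data. Bounds reuse values quoted later in this chapter.

class MAMBehavioralModel:
    def __init__(self, r_init=2.8e6, r_min=0.8e6, r_max=2.8e6, r_step=0.1e6):
        self.r = r_init        # current resistance state (ohms)
        self.r_min = r_min     # pinning at one end of the free layer
        self.r_max = r_max     # pinning at the other end
        self.r_step = r_step   # assumed resistance change per write pulse

    def set_pulse(self):
        """One Set current pulse: the DW moves and the resistance decreases."""
        self.r = max(self.r - self.r_step, self.r_min)

    def reset_pulse(self):
        """One Reset pulse (opposite polarity): the resistance increases."""
        self.r = min(self.r + self.r_step, self.r_max)

    def read_current(self, v_epsp):
        """Out-of-plane read: current set by the EPSP voltage and TMR state."""
        return v_epsp / self.r

if __name__ == "__main__":
    mam = MAMBehavioralModel()
    for _ in range(5):
        mam.set_pulse()
    print(f"Resistance after 5 Set pulses: {mam.r / 1e6:.1f} Mohm")
```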
3.2 Synapse with STDP Learning
3.2.1 Spike Timing Dependent Plasticity (STDP) Algorithm and Circuit
Spike Timing Dependent Plasticity (STDP), also known as Hebbian learning, is induced by tight temporal correlations between the spikes of pre- and postsynaptic neurons. As with other forms of neural plasticity, it is considered a basis for learning and information storage in the brain, as well as for the development and refinement of neuronal circuits during brain development [53]. With STDP, repeated presynaptic spike arrival a few milliseconds before postsynaptic action potentials leads, in many synapse types, to Long-Term Potentiation (LTP) of the synapses, whereas repeated spike arrival after postsynaptic spikes leads to Long-Term Depression (LTD) of the same synapse [54]. The change of the synapse plotted as a function of the relative timing of pre- and postsynaptic action potentials is called the STDP function or learning window and varies between synapse types [55]. The rapid change of the STDP function with the relative timing of spikes suggests the possibility of temporal coding schemes on a millisecond time scale [56].

In the STDP algorithm, the weight dynamics depend not only on the current weight, but also on the relationship between presynaptic and postsynaptic action potentials [57].
This means that, besides the synaptic weight, each synapse keeps track of the recent presynaptic spike history. In terms of our STDP model, every time a presynaptic spike arrives at the synapse, the presynaptic spike will cause a charge accumulation Q_pre in the diffusion capacitance of a transistor, and then the charge will decay gradually. Every time the postsynaptic neuron spikes, a charge Q_post is also accumulated and then decays. When a postsynaptic spike arrives at the synapse due to back propagation from the axon hillock, the weight change Δω is calculated based on the presynaptic charges. A simplified STDP mathematical model is given below:

\Delta\omega =
\begin{cases}
\alpha\,(-1)\,\dfrac{\partial Q_{\mathrm{pre}}}{\partial t}, & \text{if } Q_{\mathrm{post}} > C \\[4pt]
0, & \text{if } Q_{\mathrm{post}} \le C
\end{cases}
\tag{3.1}
where α is the amount of weight change at each time step of a synapse, Q_pre is the accumulated charge from the pre signal, Q_post is the accumulated charge from the post signal, and C is a constant greater than Q_pre. The assumption of this equation is that both Q_pre and Q_post decay over time after rising high instantly. If the post signal arrives before the pre signal decays to 0, the weight of this synapse will be increased by α. If only the post signal arrives, which means the output neuron fires without the presynaptic neurons having fired within the relevant time window, the weight of this synapse will be decreased by α.
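The following Python sketch captures this rule in software form: a presynaptic spike charges a decaying Q_pre, and at each postsynaptic spike the synapse is either strengthened (if Q_pre is still high) or weakened. The decay constant, threshold, and step size are illustrative assumptions rather than fitted circuit parameters, and the intermediate no-change window realized by the circuit (Scenario 2 below) is omitted for brevity.

```python
# Minimal software analogue of the simplified STDP rule above. The decay
# constant, threshold, and step size are illustrative assumptions, not
# fitted circuit parameters.

import math

class SimpleSTDP:
    def __init__(self, alpha=1.0, tau_pre=5e-9, q_threshold=0.2):
        self.alpha = alpha              # weight step (one MAM Set/Reset pulse)
        self.tau_pre = tau_pre          # assumed decay time constant of Q_pre
        self.q_threshold = q_threshold  # assumed potentiation threshold on Q_pre
        self.last_pre_time = None

    def on_pre_spike(self, t):
        # Q_pre jumps high at a presynaptic spike and then decays from time t
        self.last_pre_time = t

    def on_post_spike(self, t):
        """Return the weight change evaluated at a postsynaptic spike."""
        if self.last_pre_time is not None:
            q_pre = math.exp(-(t - self.last_pre_time) / self.tau_pre)
            if q_pre > self.q_threshold:
                return +self.alpha      # recent pre spike: potentiate (Set)
        return -self.alpha              # no recent pre spike: depress (Reset)

if __name__ == "__main__":
    stdp = SimpleSTDP()
    stdp.on_pre_spike(0.0)
    print(stdp.on_post_spike(2e-9))     # pre shortly before post -> +alpha
    print(stdp.on_post_spike(50e-9))    # post long after pre -> -alpha
```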
The novel biomimetic circuit implementation of this STDP equation is shown in Fig. 3.3(a); only 8 transistors and 2 pulse generators are used in the circuit. This circuit operates on charges, thus avoiding continuous current to save power. Each synapse in this design has its own STDP circuit, and the possible power consumption of the memory operation can be reduced. α is analogous to the resistance change of the MAM for each pulse. Q_pre and Q_post are analogous to electrical charges, and the charge signals decay over time through biasing transistors connected to ground. If and only if the pre signal arrives first, the connected post-gated transistor will be charged.

Then, if the post signal arrives subsequently, the Set pulse will be triggered and the Reset signal is inhibited by discharging. The resistance of the MAM will be decreased by one Set pulse to increase the strength of this synapse. If only the post signal arrives, the Set signal will not be triggered, because the pre-charging of the pre transistor gate is absent, and the post signal will trigger the Reset pulse without discharging inhibition. The resistance of the MAM will be increased by one Reset pulse to decrease the strength of this synapse. All the charging nodes in this circuit (Fig. 3.3 a) are discharged by a constant bias transistor to implement the differential timing factor dQ/dt. In the mathematical model, the amplitude of Δω is a continuous value depending on the product of α and the time derivative. However, the amplitude of Δω in the circuit implementation is a discrete value, the resistance change of the MAM for each pulse. A positive-edge input will trigger this circuit to generate one current pulse output with fixed amplitude and duration,
in this case 0.5 mA and 1 ns. The simulation results shown in Fig. 3.3 (b) present three scenarios:

Scenario 1: If both pre and post spikes arrive in sequence and the timing interval is short enough, the Set pulse is triggered and the resistance of the MAM is decreased. If the resistance of the MAM reaches its minimum value, the MAM will no longer respond to the Set pulse.

Scenario 2: If both pre and post spikes arrive in sequence with the timing interval slightly longer than in the first scenario, neither the Set nor the Reset signal will be triggered. This scenario is shown in Fig. 3.3 (b) around 70 ns. The reason for this phenomenon is that the decayed Q_pre cannot serve as a source to enable the Set pulse trigger, but it is strong enough to enable the discharge of Q_post and thus inhibit the post signal from triggering the Reset signal.

Scenario 3: If only the post signal arrives, or the time interval between the pre and post signals is long enough, the Reset signal is triggered and the resistance is increased. If the resistance of the MAM reaches its maximum value, the MAM will no longer respond to the Reset pulse.
Figure 3.3: (a) STDP learning circuit. (b) Simulation results of the STDP learning
circuit and the MAM response.
3.2.2 On-Chip Learning Synapse
By adding the STDP circuit to the previous synapse, STDP learning ability is enabled for an on-chip learning implementation. Fig. 3.4 illustrates the connection details of the two circuits. The pre spike from the presynaptic neuron is fed to both the synapse and STDP circuits, and the synaptic output PSP is connected to the postsynaptic neuron. The post spike from the postsynaptic neuron back-propagates to the STDP circuit, which then generates Set and Reset signals to tune the synaptic weight.
Figure 3.4: Illustration of basic STDP learning element implementation (CMOS 45 nm) including pre/post-synaptic neurons simplified to the axon-hillock, synapse circuit with MAM, STDP learning circuit, current mirror for isolation, and capacitor for current integration. The resistance of the MAM is tuned according to the output spikes of the pre-/postsynaptic axon hillock following the STDP rule described above.
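To show how the pieces of Fig. 3.4 fit together at a behavioral level, the toy event loop below pairs the SimpleSTDP sketch from Section 3.2.1 with the MAMBehavioralModel sketch from Section 3.1.1. The spike times are made up for illustration, and the loop merely stands in for the axon-hillock spikes and back-propagation path of the real circuit.

```python
# Toy pairing of the two sketches above (SimpleSTDP and MAMBehavioralModel),
# mirroring the on-chip learning element of Fig. 3.4. Spike times are
# illustrative only.

stdp = SimpleSTDP()
mam = MAMBehavioralModel()

# (time in seconds, which terminal spiked)
events = [(0e-9, "pre"), (2e-9, "post"),     # causal pre -> post pair: Set
          (10e-9, "pre"), (12e-9, "post"),   # another causal pair: Set
          (60e-9, "post")]                   # lone post spike: Reset

for t, kind in events:
    if kind == "pre":
        stdp.on_pre_spike(t)
    else:
        dw = stdp.on_post_spike(t)
        if dw > 0:
            mam.set_pulse()      # strengthen the synapse: lower MAM resistance
        elif dw < 0:
            mam.reset_pulse()    # weaken the synapse: raise MAM resistance

print(f"MAM resistance after the spike train: {mam.r / 1e6:.2f} Mohm")
```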
3.3 Synapse with Noise
A strong argument for noisy neuromorphic neurons is presented in the Mahvash dissertation [45]. In this section, we present the use of a CMOS-compatible true-randomness generator using an optoelectronic device as a noise source for neuromorphic components in CMOS circuits. Photons generated by an LED, as a provably random quantum process, provide a way to realize a random pulse generator (RPG). Related work on noise generators has been widely reported, but use in neuromorphic circuits has not been reported to date [58]. In neuromorphic circuits, independent randomness is important for noise to be a significant factor in neural information processing. Thus, in extensive neural networks containing noisy neurons, a large number of RPGs integrated in circuits can be expected, and, as a result, the RPGs must be power and area efficient.
The signals generated by an LED and single-photon avalanche diode (SPAD) are uniformly distributed discrete pulses with Gaussian-distributed amplitudes. The photon intervals should follow a Poisson distribution, but the constant quenching time of the SPAD is much longer than the Poisson intervals [59]. To use the noise in this manner, we connect the RPG model, followed by an amplifier, to the neuromorphic circuit in our SPICE simulation. For continuous noise, we use a current mirror circuit with a certain leakage to convert the discrete signals to continuous signals, and then connect it in our circuits for SPICE simulation.
3.3.1 Random Pulse Generator
The RPG device consists of three parts: an LED, a single-photon avalanche diode (SPAD), and a waveguide. This structure has been experimentally verified by others [60]. Based on this structure, we assume a red LED and a single-photon avalanche diode to implement an on-chip random pulse generator. The cross-section of the device, built on the SOLES platform reported by others [61], is shown in Figure 3.5. The LED is compatible with CMOS technology.
The novel SPAD structure was emulated in SPICE using a CMOS 180 nm process [62]. The element circuits, including the current mirror and the amplifier, are also implemented in 45 nm CMOS in [51]; the detailed circuit parameters, including each transistor's width, are tuned to the requirements of each application in the rest of this thesis.
Figure 3.5: A cross-section view of the single-photon random pulse generator
A planar p-n junction is biased above breakdown in the core
of the SPAD, thus operating in Geiger mode. In this regime of operation, electron-hole pairs generated by photons can stimulate an avalanche breakdown by impact ionization. In contrast to conventional avalanche photodiodes, which operate just below the breakdown voltage, the optical gain of the SPAD is high enough to enable single-photon sensing. Each photon absorbed in the active region can cause a large current to flow through the SPAD. In 180 nm CMOS circuits, both active and passive avalanche quenching schemes can be integrated, which enables this design to have a higher detection rate.
3.3.2 Discrete Noise Signals
LEDs are direct-bandgap devices that produce incoherent light by spontaneous emission, an essentially random process. If operated at sufficiently low power, an LED emits photons that are virtually independent of each other; this photon emission is a Poisson process, and the photon wavelength follows a Gaussian distribution. The exponentially distributed time intervals of the Poisson emission process and the photon loss in the waveguide are neglected, because the quenching time of the SPAD is much longer.
In this RPG model, the output is uniformly distributed discrete pulses with Gaussian-distributed amplitudes on the µV scale. To use this signal in our neuromorphic circuits, we have to amplify it to mV-scale pulses. The complete circuit of the discrete noise generator is shown in Figure 3.6.
Figure 3.6: Circuit of the discrete noise generator
The photons are simulated as voltage responses in the SPAD. The strength of a photon depends on its wavelength, which follows a Gaussian distribution. The photon detection efficiency of the SPAD is also a Gaussian function of wavelength. Thus we choose a 615 nm peak wavelength and σ = 30 for both the LED and the SPAD as the simulation parameters. The simulation results for the simulated photon voltage and noise signals are shown in Figure 3.7.
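A minimal behavioral sketch of this RPG model is given below (Python); the mean photon interval, quenching time, and amplitude scaling are assumed values chosen only to illustrate the Poisson emission, the SPAD dead time, and the Gaussian amplitudes.

# Behavioral sketch of the random pulse generator (RPG): Poisson photon arrivals,
# a SPAD dead (quenching) time much longer than the mean photon interval, and
# Gaussian pulse amplitudes tied to the Gaussian wavelength spread.
# All numerical constants here are illustrative assumptions.
import random

MEAN_PHOTON_INTERVAL = 0.1e-9   # s, assumed mean of the exponential photon intervals
QUENCH_TIME = 5e-9              # s, assumed SPAD quenching (dead) time
PEAK_NM, SIGMA_NM = 615.0, 30.0 # wavelength distribution used in the simulations

def rpg_pulses(n_pulses):
    t, pulses = 0.0, []
    last_detect = -QUENCH_TIME
    while len(pulses) < n_pulses:
        t += random.expovariate(1.0 / MEAN_PHOTON_INTERVAL)   # Poisson emission
        if t - last_detect >= QUENCH_TIME:                    # SPAD is ready again
            wavelength = random.gauss(PEAK_NM, SIGMA_NM)
            amplitude = 1e-6 * wavelength / PEAK_NM           # ~uV pulse, assumed scaling
            pulses.append((t, amplitude))
            last_detect = t
    return pulses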
Figure 3.7: Discrete noise signals: random pulses (upper trace) and simulated photon voltage (lower trace)
The above noise signals only include positive pulses. In neural systems, the noise can be both positive and negative. To accomplish this with our RPG, we use two RPGs and connect them to one amplifier. The output can be calculated with the following equation:
V_out = A(V_1 - V_2) + V_1
where V_1 and V_2 are the inputs to the amplifier and A is the gain of the amplifier.
The corner situation is the two inputs having the exact same phase, so that the pulses could theoretically cancel each other. However, the amplitudes of the two signals usually differ, which allows the circuit to generate robust random positive and negative pulses (Figure 3.8).
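A short numerical sketch of this combining equation, with an assumed gain A, is given below.

# Sketch of combining two RPG pulse trains through one amplifier,
# V_out = A * (V1 - V2) + V1, to obtain positive and negative pulses.
# The gain A is an assumed value.
A = 10.0

def combined_output(v1, v2):
    return A * (v1 - v2) + v1

# Example: two pulses of slightly different amplitude never cancel exactly.
print(combined_output(1.2e-3, 1.0e-3))   # positive output
print(combined_output(1.0e-3, 1.2e-3))   # negative output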
3.3.3 Continuous Noise Signals
To generate continuous noise signals, we connect the RPG to a current mirror (Figure 3.9). The charge accumulates at the output and is gradually discharged by the leakage signal. Thus we create a continuous noise signal (Figure 3.11a), but in this manner the noise cannot be negative. The solution is the same as the method we used for the discrete noise signals. Two continuous signals are connected to an amplifier (Figure 3.10); one of the signals is inverted, and the output is the summation of the inputs. The result of the continuous noise generation is shown in Figure 3.11b.
Figure 3.8: Corner situation of random pulses with positive and negative values
Figure 3.9: Current mirror circuit
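A minimal sketch of this leaky accumulation, with assumed leak and pulse-charge values, is given below.

# Sketch of turning discrete RPG pulses into a continuous noise signal:
# pulses deposit charge at the output node and a constant leak discharges it.
# Leak rate and pulse charge are assumed values.
LEAK_PER_STEP = 0.02      # fraction of accumulated charge lost per time step (assumed)
PULSE_CHARGE = 1.0        # normalized charge delivered by one pulse (assumed)

def continuous_noise(pulse_train):
    """pulse_train: iterable of 0/1 samples from the RPG."""
    level, out = 0.0, []
    for p in pulse_train:
        level += PULSE_CHARGE * p          # charge mirrored onto the output node
        level *= (1.0 - LEAK_PER_STEP)     # gradual discharge through the leak
        out.append(level)
    return out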
Figure 3.10: Circuit for continuous noise
Figure 3.11: Continuous noise signal and summed positive and negative noise signals
The probability distribution of the true randomness from the single-photon random pulse generator is determined by the quenching time of the SPAD and the single-photon generation. By varying the quenching time constant and the photon generation time constant, different probabilities can be achieved. Furthermore, combining multiple RPGs can also change the probability significantly. The pulse statistics of the RPG are Poisson distributed, and the noise that comes from the LED and the SPAD is also Poisson distributed. The thermal noise from the amplifier could seriously affect the probability distribution, but with more advanced amplifier noise-reduction technology and SPAD technology, the signal-to-noise ratio (SNR) can be controlled effectively. Future work is planned to optimize the SNR. On the other hand, the thermal noise could be treated as another randomness source to be used in the neuromorphic system in future work.
3.3.4 Noisy Synapse in Neuron Cell
The neuron circuit consists of three types of submodules: the excitatory and inhibitory synapses, the dendritic arbor, and the axon hillock (Figure 3.12). We include variability in the excitatory and inhibitory synapse circuits to model neurotransmitter release variability. We mimic environmental variability by adding a noise signal as an input to the dendritic arbor; the summation of the synaptic outputs and the noise forms the postsynaptic potential (PSP) with variability. We also include variability in the axon hillock circuit to model a variable threshold.
Figure 3.12: Structure of the experimental noisy neuron [2]
Figure 3.13 shows a BioRC CMOS excitatory synapse circuit [3]. The voltage at the gate labeled Neurotransmitter controls the neurotransmitter release. In the circuit with no variability, the voltage applied to this gate could be a fixed biasing voltage. In this work, the gate voltage is connected to the noise signal to make the neurotransmitter release variable. This variable input makes the peak of the EPSP variable and varies the synapse strength stochastically.
Similarly, we include variability in a BioRC inhibitory synapse. The result of the
excitatory synapse simulations is shown in Figure 3.14. When periodic action
potentials with the same amplitude are applied to the synapse, the EPSP response
varies with the shape of the noise signal, as demonstrated earlier [63].
Figure 3.13: BioRC excitatory synapse circuit [3]
Figure 3.14: Simulation result with variable neurotransmitter availability in the excitatory synapse
The noise is connected directly to the dendritic arbor to mimic the environmental variability (Figure 3.12) [2]. The BioRC voltage adder [64] is used as a dendritic arbor compartment modeled with CMOS technology.
Figure 3.15: Variable threshold axon hillock circuit
Figure 3.16: Simulation result of variable threshold in a neuron circuit
In order to include variability of the neuron threshold, we use the threshold stage circuit developed by Mahvash and Parker [63]. The noise signal is applied to the threshold stage of the axon hillock module to vary the value of the threshold. In Figure 3.15, the authors added X1 and X3 to the original input-stage circuit. The variable signal varies from 0 V, preventing the neuron from firing, to Vdd, forcing the neuron to fire. When the variable signal is very low, X1 turns OFF, which can prevent the neuron from firing. When the variability signal rises, X1 turns ON and the circuit behaves like the first inverter in the circuit with no variability. When the variability signal increases sufficiently, X3 turns ON and pulls down the output, sending a rising edge to the spike-generation circuit so that the neuron fires. In this case, the variable signal forces the neuron to fire if the variability amplitude is sufficient. The results of the simulation using our noise inputs are shown in Figure 3.16. Stochastic resonance uses the added noise to effectively lower the threshold of the neurons; it is critical for kick-starting the neuronal circuits at the initial stage. We use this feature in the subsequent blank-slate network in Chapter 6 to enable a biomimetic robot to learn movement from scratch.
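A minimal sketch of this noisy-threshold behavior is given below; the threshold, input level, and noise amplitude are assumed values, and the model is a statistical abstraction rather than the transistor-level circuit.

# Sketch of the variable-threshold behavior: noise added to the membrane
# potential (equivalently, subtracted from the threshold) occasionally pushes a
# sub-threshold input over threshold, so weak inputs can still trigger spikes.
import random

THRESHOLD = 0.45     # V, assumed firing threshold

def fires(psp, noise_sigma=0.05):
    return (psp + random.gauss(0.0, noise_sigma)) > THRESHOLD

weak_input = 0.4     # V, below threshold without noise
spike_count = sum(fires(weak_input) for _ in range(1000))
print(spike_count)   # a fraction of trials fire, illustrating stochastic resonance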
3.4 Synapse with Frequency Adaptation
Experimental neuroscientists have revealed the physiological mechanism of adaptive synaptic frequency filters in forebrain regions such as the hippocampus [65]. Synaptic weight updating provides linear signal processing, while adaptive synaptic frequency filtering provides nonlinear signal processing in addition to a thresholding function for the neuron. By varying the physical characteristics of the neurotransmitter concentration and the neurotransmitter re-uptake mechanism in synapses, the post-synaptic potential (PSP) responds to the pre-synaptic action potential (AP) as the output of an adjustable band-pass filter. This synaptic function has been observed in memory formation, perception of sensory signals, and decision making [66-68].
Despite these neuroscience discoveries that indicate a potential frequency-adaptation mechanism in a single synapse, exactly how the frequency sensitivity adapts during learning is still under study, as is the learning process in large neuronal networks over the long term. Furthermore, specific frequency learning rules have not been thoroughly explored. Hence, we propose here a possible frequency-based STDP (fSTDP) method and have implemented it as a neuromorphic circuit.
In this section, we present the circuit design of a novel tunable band-pass-filter-like synapse, a frequency-adaptive synapse. It is loosely based on a BioRC synapse design [4], and a model of a memristor device [69] is used to achieve continuous (analog) resistance. Only a certain frequency range of the pre-synaptic spike train can lead to a high-gain PSP, and this frequency range depends on the resistance of the memristor. To mimic the voltage-dependent neurons communicating with the frequency-adaptive synapse, a voltage-dependent axon hillock is modified from our previous design [1]. The output spike train frequency of this axon hillock circuit is positively correlated with its input voltage value, if the input voltage is above the threshold.
Figure 3.17: Spiking Axon Hillock Circuit with Voltage-Dependent Spiking Frequency. The triangles are buffers consisting of two serially connected inverters.
3.4.1 Voltage-Dependent Variable-Frequency Axon Hillock
Our proposed relationship between neural computation and spiking frequency is the following: we assume that the membrane potential (the nonlinear summation of PSPs) is proportional to the spiking frequency of the axon hillock. As the synaptic frequency operation requires the postsynaptic neuron to fire in different frequency ranges corresponding to different PSPs, a voltage-dependent variable-frequency spiking axon hillock is necessary. Our proposed transistor-level circuit of the voltage-dependent axon hillock is shown in Figure 3.17. It is modified from Hsu's bursting axon hillock circuit by eliminating the single-spike mode and reducing the firing range from 40-60 GHz to 20-300 MHz, along with some transistor rearrangement.
This circuit generates a spike train when the membrane potential input (the nonlinear summation of all PSPs) is above its threshold, and the frequency of the output spike train is positively correlated with the input (PSP summation) voltage. An initial low input turns on transistor M1 to pull down the voltage of node N1, thus ensuring the output is low. Meanwhile, transistor M3 is turned on to charge node N2, so that the off transistor M5 and the on transistor M8 are ready for a future high input. If the input voltage is higher than the threshold of buffer B1, transistor M1 will be turned off to allow node N1 to be charged. In the meantime, transistor M7 is turned on to discharge node N3, turning transistor M6 on, and the input voltage supplies the charging current for node N1 through the source of transistor M6. When the voltage of node N1 is high enough to turn on transistor M4, node N2 is discharged to turn on transistor M2. Node N1 is then pulled down to ground directly, while node N2 recharges to turn off transistor M2. At this point, node N1 enters a loop of charging through transistor M6 and periodic discharging through transistor M2, generating a spike train.
The frequency of the output spike train depends on the charging current through transistor M6, which is related to the input voltage. The relation between the input voltage and the charging current flowing through M6 is given by Equation 3.2:
I_SD = μ_p C_ox (W/L) [ (V_SG - |V_T|) V_SD - V_SD^2 / 2 ]     (3.2)
where μ_p is the hole mobility, C_ox is the capacitance of the oxide, W is the width of the diffusion region, L is the length of the diffusion region, and V_T is the threshold voltage. When on, transistor M6 operates in the linear region, and the frequency of the output spike train is linearly related to the input voltage above the threshold of 0.6 V.
The waveform of the spike train at three different frequencies is shown in Figure 3.18, and the output frequency range corresponding to input voltages from 0 V to 0.7 V is shown in Figure 3.19. Besides changing the spike train frequency, the input voltage also changes the pulse width of each spike.¹ The pulse width variation affects the synapse circuit response; we will discuss this effect in detail in the next section.
Figure 3.18: Output spikes with different PSP levels. When the PSP levels are 0.5 V, 0.6 V, and 0.7 V, the output spike frequencies are 20 MHz, 130 MHz, and 280 MHz, respectively.
Figure 3.19: Output spikes with different PSP levels. When the PSP levels are 0.5 V, 0.6 V, and 0.7 V, the output spike frequencies are 20 MHz, 130 MHz, and 280 MHz, respectively.
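A small sketch of the resulting voltage-to-frequency behavior is given below; it simply interpolates between the simulated operating points quoted above and is not a transistor-level model.

# Sketch of the voltage-to-frequency behavior of the axon hillock, built by
# piecewise-linear interpolation over the simulated operating points reported
# above (0.5 V -> 20 MHz, 0.6 V -> 130 MHz, 0.7 V -> 280 MHz).
POINTS = [(0.5, 20e6), (0.6, 130e6), (0.7, 280e6)]   # (PSP level in V, spike rate in Hz)

def spike_frequency(v_in):
    if v_in < POINTS[0][0]:
        return 0.0                       # below the reported range: no sustained spiking
    for (v0, f0), (v1, f1) in zip(POINTS, POINTS[1:]):
        if v_in <= v1:
            return f0 + (f1 - f0) * (v_in - v0) / (v1 - v0)
    return POINTS[-1][1]                 # clamp at the maximum reported rate

print(spike_frequency(0.65) / 1e6, "MHz")   # roughly midway between 130 and 280 MHz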
3.4.2 Frequency-Selective Synapse Circuit
The transistor-level circuit of the frequency-selective synapse is shown in Figure 3.20. It is modified from the original BioRC synapse [4] by adding two PMOS transistors, one capacitor, and one memristor to the original synapse circuit design to implement frequency plasticity. The added part changes the current flowing from Vdd according to the input frequency.
¹ Previous spiking BioRC axon hillock circuits do not modulate spike width as spiking frequency varies and do not respond to membrane potential variations with variable spiking frequencies.
Figure 3.20: Frequency-Selective Synapse Circuit
For memristor resistance values in the range from 5 MΩ to 10 MΩ, the synapse circuit exhibits a band-pass-like function. The circuit generates a
biomimetic PSP signal from an input pulse through the following process. A high-voltage (0.7 V) input turns on transistor M1 and turns off transistor M2, so the Cleft node is charged and turns on transistor M3 to charge the output; a following low-voltage (0 V) input turns off transistor M1 and turns on transistor M2, so the Cleft node is discharged and the output is discharged by the biased transistor M4. If the input is a spike train of moderate frequency, charge accumulates at the Cleft node because the charging current is greater than the discharging current over the duration of the input spike train. Thus the output PSP is temporally summed and reaches a higher voltage. The conductance of transistor M5 is used to mimic the neurotransmitter concentration in a biological synapse. A high-frequency input spike train reduces the conductance of transistor M5 by charging node A, while the resistance of the memristor controls the discharging current of node A to mimic the neurotransmitter availability.
The reduced conductance of transistor M5 decreases the charging current of the Cleft node; hence a high-frequency input spike train cannot raise the cumulative PSP to as high a voltage as before. We characterized the frequency-selective synapse by testing it with input pulse signals of 1 ns pulse width at frequencies from 2 MHz to 200 MHz. The PSP voltage response for resistance settings from 1 MΩ to 50 MΩ is shown in Figure 3.21.
Figure 3.21: PSP amplitude response with respect to input spike frequency
As mentioned above, the pulse width of the input spike train also affects the synapse response; thus a verification of the merged synapse and axon hillock circuits is presented in this work. The circuit configuration is shown in Figure 3.22.
Figure 3.22: Frequency-Selective Synapse Verification Circuit
An axon hillock with given inputs of 0.6 V, 0.65 V, and 0.7 V is connected to a synapse, and the output of the synapse is connected to another axon hillock. The synapse, with a fixed memristor resistance of 1 MΩ, receives spike trains of 30, 132, and 330 MHz with the corresponding pulse widths. Only the
132 MHz spike train is able to evoke a PSP above 0.5 V. The simulation result is shown in Figure 3.23. The amplitude of the synapse PSP output increases with higher input frequency until the overcharged node A turns off transistor M5. With the threshold of the following axon hillock circuit, the high-frequency input signal is filtered out.
Figure 3.23: Frequency-Selective Synapse Verification Circuit Simulation Result
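A behavioral sketch of this band-pass mechanism is given below; the time constants, per-spike charge steps, and the assumed node A capacitance are illustrative placeholders, not extracted circuit values.

# Behavioral sketch of the band-pass response: each input spike adds charge to
# the Cleft node, which leaks between spikes; high input rates charge node A,
# which throttles the charging path (M5), while the memristor R discharges node A.
import math

def psp_amplitude(freq_hz, r_mem_ohm, n_spikes=200):
    cleft, node_a = 0.0, 0.0
    dt = 1.0 / freq_hz                               # interval between input spikes
    tau_cleft = 50e-9                                # Cleft leak time constant (assumed)
    tau_a = r_mem_ohm * 100e-15                      # node A discharge through R and ~100 fF (assumed)
    for _ in range(n_spikes):
        drive = max(0.0, 1.0 - node_a)               # node A reduces the charge added per spike
        cleft = (cleft + 0.05 * drive) * math.exp(-dt / tau_cleft)
        node_a = (node_a + 0.02) * math.exp(-dt / tau_a)
    return cleft

for f in (2e6, 50e6, 200e6):                         # low, mid and high input rates
    print(f / 1e6, "MHz ->", round(psp_amplitude(f, 5e6), 3))   # mid frequency gives the largest PSP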
3.4.3 Example STDP-like Frequency Adjustment Circuit
The previous STDP circuit, with modifications, is used to make the frequency adjustment of this synapse circuit adaptive. The learning method implemented here is not strictly STDP; it is only a control mechanism dependent on the pre and post spikes. The details of this implementation are explained in the following paragraphs.
STDP has been observed and computationally simulated with a few learning types. A variation of STDP learning is implemented in this work to adjust the pass band of the synapse adaptively by manipulating the memristor resistance of the synapse circuit. In this STDP implementation, the frequency-adjusting dynamics depend not only on the current state, but also on the relationship between presynaptic and postsynaptic action potentials [57]. This means that,
presynaptic and postsynaptic spike history. In terms of our STDP model, every
time a postsynaptic spike arrives at the synapse, the postsynaptic spike will ac-
cumulate charge
post
in diusion capacitance of transistors, and then the charge
will decay gradually. Every time the presynaptic neuron spikes, the charge
pre
is
also accumulated and then decays. Each time, a presynaptic spike arrives at the
synapse, the frequency change ! is calculated based on the presynaptic charges.
A simplied STDP mathematical model is given below:
Δω = α (∂Q_post/∂t),   if Q_pre > C
Δω = 0,                 if Q_pre ≤ C        (3.3)
where α is the amount of resistance change at each time step of the memristor in the synapse circuit, Q_pre is the accumulated charge from the pre signal, Q_post is the accumulated charge from the post signal, and C is a constant threshold for Q_pre. The assumption behind this equation is that both Q_pre and Q_post rise instantly and then decay over time. If the post signal arrives before the pre signal decays to zero, the frequency response range of this synapse is tuned down by increasing the resistance. If only the post signal arrives, which means the output neuron fires without a presynaptic firing within a finite time window, the frequency response range of this synapse is tuned up by decreasing the resistance.
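A direct sketch of Equation 3.3, with assumed values for α, C, and the decay time constant, is given below.

# Sketch of Equation 3.3: the frequency change is proportional to the decay
# rate of the post charge when the decayed pre charge exceeds the threshold C,
# and zero otherwise. All constants are illustrative assumptions.
import math

ALPHA = 1.0          # resistance change per time step (Eq. 3.3), assumed magnitude
C = 0.3              # threshold on the accumulated pre charge, assumed
TAU = 10e-9          # decay time constant of both charges, assumed

def delta_omega(q_pre, q_post, dt):
    """q_pre, q_post: charges just after the last pre/post spikes; dt: elapsed time."""
    q_pre_now = q_pre * math.exp(-dt / TAU)
    dq_post_dt = -(q_post / TAU) * math.exp(-dt / TAU)   # derivative of q_post * exp(-t/TAU)
    if q_pre_now > C:
        return ALPHA * dq_post_dt
    return 0.0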
Figure 3.24: Frequency-Selectivity learning circuit
The novel biomimetic circuit implementation of this STDP equation is shown in Fig. 3.24. The per-pulse resistance change of the memristor is analogous to α, and Q_pre and Q_post are analogous to electrical charges; the charge signals decay over time through biasing transistors connected to ground. If and only if the post signal arrives first, the connected pre-gated transistor will be charged. Then, if the pre signal arrives subsequently, the Set pulse will be triggered and the Reset signal is inhibited by discharging. The resistance of the memristor is decreased by one Set pulse to tune up the frequency response of this synapse. If only the pre signal arrives, the Set signal will not be triggered, because the pre-charging of the post transistor gate is absent, and the pre signal will trigger the Reset pulse without the discharging inhibition. The resistance of the memristor is increased by one Reset pulse to tune down the frequency response of this synapse. All the charging nodes in this circuit are discharged by a constant bias transistor to implement the differential timing factor dQ/dt. In the mathematical model, the amplitude of Δω is a continuous value depending on the product of α and the differential term. However, in the circuit implementation the amplitude of Δω is a discrete value: the resistance change of the memristor for each pulse. A positive-edge input triggers this circuit to generate one current pulse output with fixed amplitude and duration. The simulation results shown in Fig. 3.25 present three scenarios:
Scenario 1:
If both post and pre spikes arrive in sequence and the timing interval is short enough, the Set pulse is triggered and the resistance of the memristor is decreased. If the resistance of the memristor reaches its minimum value, the memristor no longer responds to the Set pulse.
Figure 3.25: Frequency-selective learning simulation result
Scenario 2:
If both post and pre spikes arrive in sequence with the timing interval slightly longer than in the first scenario, neither the Set nor the Reset signal is triggered. This scenario is shown in Fig. 3.25 around 70 ns. The reason for this behavior is that the decayed post charge cannot serve as a source to enable the Set pulse trigger, but it is strong enough to enable the discharge of the pre charge and thus prevents the pre signal from triggering the Reset signal.
Scenario 3:
If only the pre signal arrives, or the time interval between the post and pre signals is long enough, the Reset signal is triggered and the resistance is increased. If the resistance of the memristor reaches its maximum value, the memristor no longer responds to the Reset pulse.
The frequency adjustment circuit we have constructed uses our original STDP circuit reported earlier, but the rule for controlling the memristor is different.
3.4.4 Frequency-Adaptive Synapse
The frequency-adaptive synapse is formed by combining the STDP circuit and the frequency-selective synapse. The circuit diagram is shown in Fig. 3.26. The pre spike signals from the pre-synaptic axon hillock are connected to both the synapse and the STDP circuit, and the post spike signal is connected to the STDP circuit. The Set and Reset signals from the STDP circuit are connected to the synapse circuit to tune the frequency sensitivity by controlling the resistance of the memristor. When only pre spikes are present, the resistance of the memristor is decreased by the STDP circuit from its high value to increase the gain of the frequency-selective synapse, raising the probability of triggering post spikes. On the other hand, when pre-post spike pairs are present, the resistance of the memristor is increased by the STDP circuit to narrow the response frequency window.
The circuit simulation result is shown in Fig. 3.27. The pre and post spikes are generated independently by two voltage-dependent axon hillocks with given inputs.
Figure 3.26: Frequency adaptive synapse test circuit
The STDP circuit generates Set and Reset signals to change the resistance of the memristor in the synapse circuit according to the pre and post spikes. The synapse circuit generates a PSP according to the pre signal with the changing memristor resistance. Initially, only pre spikes are present at 300 MHz, so only Reset signals are generated, which decrease the memristor resistance to achieve higher gain. The minimum resistance value of the memristor is 800 kΩ; at this resistance the synapse has a low response to 300 MHz input spikes. Then the pre-spike frequency is reduced to 130 MHz and the post spikes fire at 300 MHz. Under this condition, the synapse responds to the pre spikes and generates a strong PSP output. Meanwhile, the STDP circuit generates both Set and Reset signals according to the timing of the pre and post pairs. As a result, the amplitude of the PSP changes according to the changes in memristor resistance.
Figure 3.27: Frequency adaptive simulation result. The pre spikes fire at 300 MHz for the first 8 µs and then at 130 MHz, the post spikes fire at 300 MHz, the initial value of the memristor resistance is 2.8 MΩ, and its minimum value is 800 kΩ.
3.4.5 Frequency-Adaptive STDP Synapse with Strength Variation
To implement strength variation and adaptation in the frequency-adaptive synapse, another memristor, MEM2, is added to the output of the synapse circuit, and a similar STDP circuit is used to change its resistance value. The same pre and post signals are connected to this STDP circuit. The synapse strength is expressed as the amplitude of the synapse output current, and the change in strength is regulated by the STDP circuit. The circuit diagram is shown in Fig. 3.28. The pre and post knobs are swapped in the newly added STDP circuit to implement the STDP learning rule most widely found in biological synapses. When pre and post pairs are present, the memristor resistance is decreased to enable a stronger current; when only post spikes are present, the memristor resistance is increased to weaken the synapse.
Figure 3.28: Frequency adaptive synapse with strength variation test circuit
To test this synapse, the same test setup is used as in the previous section. The pre and post spikes are generated independently by two voltage-dependent axon hillocks with given inputs. The upper STDP circuit generates Set 1 and Reset 1 signals to change the resistance of memristor MEM1 in the synapse circuit according to the pre and post spikes under the frequency adjustment rule. Simultaneously, the lower STDP circuit generates Set 2 and Reset 2 signals to change the resistance of memristor MEM2 in the synapse circuit according to the pre and post spikes under the Hebbian rule. The synapse circuit generates a PSP according to the pre signal with the changing MEM1 resistance and generates an output current from the PSP across the changing MEM2 resistance. Initially, only pre spikes are present at 300 MHz, and only Reset 1 signals are generated, which decrease the MEM1 resistance to achieve higher gain. The minimum resistance value of the memristors is 800 kΩ; with this resistance the synapse has a low response to 300 MHz input spikes. Then the pre-spike frequency is reduced to 130 MHz and the post spikes fire at 300 MHz. Under this condition, the synapse responds to the pre spikes, generating a strong PSP output. Because the initial resistance of MEM2 is at its minimum, the current output is also very strong. Meanwhile, the upper STDP circuit generates both Set 1 and Reset 1 signals according to the timing of the pre and post pairs, while the lower STDP circuit only generates Reset 2 signals according to its Hebbian learning rule. As a result, the amplitude of the PSP changes slightly, while the current output drops continuously as the MEM2 resistance increases. The simulation result is shown in Fig. 3.29.
Figure 3.29: Frequency adaptive synapse with strength variation simulation result. The pre spikes fire at 300 MHz for the first 8 µs and then at 130 MHz, the post spikes fire at 300 MHz, the initial values of the MEM1 and MEM2 resistances are 2.8 MΩ and 800 kΩ, and the minimum value of the memristor resistance is 800 kΩ.
Chapter 4
Astrocyte Circuitry Neuromorphic Implementation
4.1 Sub-cortical Circuitry for Neuromodulation
Subcortical parts of mammalian brains essential for survival can be found in all vertebrates, including the basal ganglia, which are responsible for reinforcement learning, and the cerebellum, which provides the brain with forward models of motor commands [22]. Long-range rather than local connectivity is the major feature of these brain parts. These subcortical parts are in charge of organizing thousands of cortical networks and managing the global flow of information in the cortex. For example, the dopamine neurons in the brainstem provide reward prediction and neurotransmitters; they are key computational cells in the temporal-difference reinforcement-learning process. The dopamine neurons in the midbrain are observed throughout the cortex and basal ganglia; these neurons modulate synaptic plasticity and provide motivation for obtaining long-term rewards [23]. Several other neuromodulatory systems also control global brain states to guide behavior, representing negative rewards, surprise, confidence, and temporal discounting [24]. Therefore, a biomimetic neuromorphic system should include not only the cortical part for specialized tasks, but also system-level communication analogous to the brain's subcortical parts.
4.1.1 Astrocyte Circuitry in Brain Stimulation for Parkinson's Disease
The astrocyte is a type of glial cell that contiguously tiles the entire central nervous system in a non-overlapping fashion [70]. Astrocyte cells are spread through both cortical and subcortical regions as communication tunnels. The subcortical astrocyte is a potential brain stimulation target for the treatment of Parkinson's Disease, which would greatly reduce implantation risks and issues by allowing the brain stimulation sites to be located at the blood-brain barrier instead of the Thalamus [71]. We studied the information transmission from astrocyte to thalamus using a computational method; the details can be found in the Appendix.
4.1.2 Astrocytic Circuit with MoS2 Device
To explore a potential communication method in neuromorphic systems, the BioRC group proposed a neuromorphic astrocyte circuit and studied its functions. The compartmentalized construction of our neuromorphic circuits and the ability to control neural parameters directly by means of specific control voltages allow us to insert additional mechanisms intuitively. Irizarry-Valle [46] uses this compartmentalized approach to insert the uptake of glutamate by astrocytes and the synapse inactivation mechanism, along with the astrocytic calcium Ca2+ release causing glutamate release into the neurotransmitter section of our synapses. However, overhead in the form of an explicit voltage adder [43] is introduced to sum the transmitters, and a more complicated synapse circuit design is used to implement that model. Considering that the number of synapses in the human brain is in the trillions and that 60% of the synapses are tripartite, connecting with glial cells, reducing this overhead significantly is necessary for mimicking this feature in neuromorphic systems. In this work, by exploiting the back-gate control of an MoS2 FET, the explicit adder is not required and we can use the simple synapse design.
Based on successful demonstrations of flexible MoS2 FET instances and circuits fabricated with flexible MoS2 FETs [7], we have designed a SPICE model for neuromorphic circuit simulation. We then designed and simulated MoS2 FET analog circuit models of a neural synapse, dendritic arbor, axon hillock, and astrocyte microdomain that capture a spiking neural subsystem, including the actions of neurotransmitters, ion channel mechanisms, temporal summation of PSPs, neuron firing, and gliotransmitter-induced calcium current. An architecture including all the components is presented to emulate the tripartite synapses' communication between neurons through the astrocyte microdomain.
A long-channel drift-diffusion compact SPICE model for the MoS2 FET has been designed to model a device with more than 100 nm gate length and has been validated by experimental results [72]. When the transistor is further scaled down to sub-20 nm gate length, a ballistic transport model is more suitable for describing the current, since the channel length becomes comparable to or even less than the mean free path of the transition metal dichalcogenide field-effect transistor (TMDFET) (15 nm). However, the ballistic transport model requires numerical integration and is not directly SPICE-compatible, making it difficult to perform circuit-level simulations. Gholipour [9] introduced a ballistic enhancement factor (BEF) into the original drift-diffusion model to obtain an approximate ballistic transport model in SPICE-compatible compact form. Based on these two models, we designed our dual-gated MoS2 SPICE model. In Figure 4.1, our model simulations show the current with the drain-to-source voltage fixed at 0.5 V and the back gate grounded; the x axis is the top-gate voltage, the y axis is the current flowing through the material, and 0, 10%, and 50% strains are plotted. The geometry of the monolayer MoS2 is 32 nm wide and 16 nm long.
Due to the difference between the top-gate oxide and the back-gate oxide in material and geometry, the drain-to-source current responds to the two gate voltages differently. In Figure 4.2, our initial device simulations show the top-gate voltage fixed at 0.3 V while the back-gate voltage varies from 0 to 3 V. The top oxide is 2.8 nm thick, and the back oxide is SiO2 with 100 nm thickness.
Figure 4.1: I-V Curve of the MoS2 FET
Figure 4.2: I_ds vs. V_ds with Back-Gate Voltage and Fixed Top-Gate Voltage
Figure 4.3 shows a BioRC excitatory synapse circuit. All the transistors are MoS2 FETs, and the transient simulation result is shown in Figure 4.4. The input action potential (AP) is a spike with a maximum amplitude of 0.7 V; the neurotransmitter input is biased at 0.15 V; the reuptake input is biased at 0.05 V; the KR input is biased at 0.1 V; the gliotransmitter input is grounded.
Figure 4.3: BioRC MoS2 excitatory synapse circuit [4]
The excitatory postsynaptic potential (EPSP) magnitude is approximately 14% of the action potential and the duration is about 4 times as long as the action potential, somewhat shorter than EPSPs described in the literature.
Figure 4.4: The Action Potential and EPSP of the MoS2 Synapse under Normal Operation
To explore how the output of the synapse circuit is affected by the back-gate Gliotransmitter control, the circuit is simulated with the back-gate voltage swept from 0 to 3 V. The simulation result is shown in Figure 4.5, with the input spike omitted to illustrate the details of the EPSP. The input of this simulation is a 20 ns spike with a maximum of 0.7 V, and the neurotransmitter control is fixed at 0.15 V.
Figure 4.5: Synapse EPSP Simulation Result under Back-Gate Control
To validate this circuit model, simulations with process variations are necessary. The width, the length, the top-oxide thickness, and the back-oxide thickness are each varied within 10% following a Gaussian distribution; 10,000 simulations were run, and the results are shown in Figure 4.6. The maximum value of the EPSP varies by 15% and the delay by 7%, and the correlation between the amplitude and the delay is -0.6. This inverse relationship constrains the variation of the product of amplitude and delay.
Figure 4.6: Delay and Maximum Value under Process Variation
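A sketch of this Monte-Carlo procedure is given below; the response function standing in for the SPICE simulation is a hypothetical placeholder that merely reproduces the reported negative amplitude-delay correlation.

# Sketch of the process-variation experiment: width, length, and the two oxide
# thicknesses are each varied as Gaussians within roughly 10%, and the EPSP
# amplitude/delay statistics are collected over many runs. (Python 3.10+)
import random, statistics

NOMINAL = {"W": 32e-9, "L": 16e-9, "t_top": 2.8e-9, "t_back": 100e-9}

def sample_parameters(rel_sigma=0.10 / 3):          # ~10% spread assumed as +/-3 sigma
    return {k: v * random.gauss(1.0, rel_sigma) for k, v in NOMINAL.items()}

def epsp_response(p):
    # Placeholder response: amplitude rises with drive strength, delay falls with it,
    # qualitatively reproducing the negative amplitude-delay correlation in Figure 4.6.
    drive = (p["W"] / p["L"]) / (p["t_top"] / 2.8e-9)
    return 0.1 * drive / 2.0, 10e-9 * 2.0 / drive    # (amplitude in V, delay in s)

runs = [epsp_response(sample_parameters()) for _ in range(10000)]
amps, delays = zip(*runs)
print(statistics.correlation(amps, delays))          # negative, as in Figure 4.6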
Figure 4.7 shows several compartments of a simplified astrocyte analog circuit. It is a distributed resistive (pass-transistor) network that takes as inputs the voltages representing the synaptic cleft neurotransmitter concentrations of different synapse circuits. A voltage representing the effect of the neurotransmitter concentration from each synapse is fed into a non-inverting delay circuit whose output voltage, representing released neurotransmitters, is summed in the astrocyte with the delayed effects of neurotransmitter voltages from other synapses. This distributed sum represents calcium waves in the astrocyte process. In previous BioRC neural designs [4], an adder was used to sum the effects of synapse neurotransmitters, increasing complexity and power consumption. In this work, a single MoS2 FET replaces each adder to vastly simplify the design.
Figure 4.7: Astrocyte Microdomain Circuit [5]
The network shown in Figure 4.8 illustrates the configuration we used for this experiment. In a network of neurons with spiking inputs, an astrocyte spans several synapses to create the neural-astrocyte interactions. In our experiment, 5 synapse PSPs connect to the astrocytic microdomain. The input to the synapses is a sequence of spikes applied to each of the synapses. The astrocyte output connects with the gliotransmitter control of synapse S6. To show the modulation effect of the astrocytic mechanisms on synapse S6, we ran simulations of the circuit network and compared the PSPs with a blank control, S7. In Figure 4.9, the propagation and correlation of the astrocytic calcium ion signal is shown with respect to the independent synaptic cleft signals. In Figure 4.10, the PSP of synapse S6 is modulated in amplitude and duration by the astrocytic microdomain through the calcium ion signal Ca(Astro)5, which causes gliotransmitters to be released, increasing the PSP of synapse S6.
Figure 4.8: Sketch of Astrocyte Neural Network
Figure 4.9: Simulation Result of Astrocytic Ca2+ Signal with Synaptic Cleft Signal
Figure 4.10: Simulation Result of Astrocytic Modulation to Synapse 6
Chapter 5
Neuronal Network Neuromorphic Implementation
We connect neuron and synapse circuits in a networked manner to create a neuronal network. Unlike traditional artificial neural networks (ANNs), which process mathematical models of neural elements on digital processors, our neuromorphic hardware models an asynchronous neuronal network with custom analog circuits modeling biological neurons to first order. Analog circuit implementations of cortical circuitry, subcortical circuitry, and memory circuitry are presented in this chapter, and the details are discussed below.
5.1 Cortical Circuitry for Pattern Recognition
Fig. 5.1 (a) shows the configuration of the feed-forward neuronal network, which consists of 25 input neurons, 20 output neurons, and 500 synapses with initial random strengths. The input neuron is the axon-hillock circuit shown in Fig. 3.15. All the current PSP outputs of the synapses are connected to a current mirror [73], shown in Fig. 5.2. All the isolated current PSP outputs from the current mirrors are connected together to realize linear current addition. The summed current is connected to an RC element (500 fF / 10 MΩ) that constitutes a simplified dendritic arbor and converts the current to a voltage output. The output of the dendritic arbor is then connected to the axon hillock of the output neuron.
Figure 5.1: (a) Fully connected neural network implemented in this neuromorphic system. (b) Example patterns input to the neural network. (c) Simulation result of one pattern recognition exercise.
Figure 5.2: STDP-Dopaminergic learning circuit
In the simulation result in Fig. 5.1 (c), the second pattern in Fig. 5.1 (b) is fed into the neuronal network. The pattern is 5 x 5 pixels with 1-bit binary values; it is encoded as 0.7 V pulses of 0.1 ns duration, repeated 6 times with a 15 ns interval between repetitions. These pulses activate the input neurons, and the network then learns this pattern adaptively. After the first echo, five output neurons fired spikes in response to this pattern. If the input pattern stimulates an output neuron, the synapses that received both pre and post signals
will be strengthened according to STDP learning Scenario 1 or keep the same strength according to Scenario 2, while the synapses that received only the post signal will be weakened according to Scenario 3.
Based on the randomization, output neurons 8, 15, and 17 have relatively stronger initial states and timing dynamics, so they learned the pattern within the first trial. Output neuron 10 has fair initial synapse states, but the delay between the pre and post signals is too long to trigger strengthening of the synapses, so they are weakened instead. Output neuron 14 weakens its synapses in the first trial like output neuron 10, but the pre-post timing is favorable in Trials 3 and 4; this situation is due to Scenario 2 occurring during the first trial. Neuron 14 therefore still has a high enough PSP to trigger the output neuron, and the STDP eventually strengthens the synapses to learn this pattern. After training with 6 trials, this pattern is stably recognized by 4 output neurons.
A simulation of three successive pattern recognitions by the feed-forward neuronal network is presented in Fig. 5.3. The three patterns shown in Fig. 5.3 (a) are fed into the same feed-forward neuronal network described above (25 input neurons, 500 synapses, and 20 output neurons) at 0 ns, 65 ns, and 130 ns. Each pixel of a pattern is encoded as 8 spikes with a 6 ns interval. The learning result is shown in Fig. 5.3 (b). Neurons 9, 16, and 17 learned the first pattern; neurons 13 and 15 learned the second pattern; neurons 7 and 12 learned the third pattern.
Figure 5.3: (a) The three patterns fed into the feed-forward neuronal network. (b) The responses of the 20 output neurons during the on-the-fly learning of the three successive patterns.
5.2 Memory Circuitry for Sequential Information
The recurrent neuronal circuit is a proposed mechanism of sequential information processing used to explain decision making in neurophysiological studies [74]. Further experimental and computational results suggest that decision making and working memory can be served by the same cortical mechanism, presumably residing in the parietofrontal circuit [75]. However, the detailed circuitry incorporating memory with recurrent neuronal circuits has not been revealed. Additional studies indicate that decision making is highly related to sequential memory [76].
The authors of [77, 78] presented recurrent connections between VLSI neuron layers for orientation computation and other applications. However, VLSI implementation of recurrent neuronal connections for recording and abstracting timing-related information has not been explored. In contrast, recurrent artificial neural networks (RNNs) have been proposed and applied in software for decades [79]. The RNN algorithm records and abstracts timing-related information in a state-machine manner using digital computing chips. Diehl et al. [80] have proposed implementing RNNs in digital neuromorphic chips, such as TrueNorth.
In this work, we propose a recurrent neuronal network neuromorphic design for sequential memory realization and future sequence recognition, a step toward an artificial human brain. In this neuromorphic design, the MAM-based STDP is used as the learning method that constitutes the memory, and the network is recurrently connected to learn sequential information.
An illustration of the recurrent neuronal network is shown in Fig. 5.4: 25 neurons are connected as a sequential neuron chain with strong excitatory synapses; another 5 neurons representing patterns are fully connected with the neuron chain; the sequential neurons are the pre-synaptic side of the STDP excitatory synapses and the pattern neurons are the post-synaptic side. The sensory neurons are connected as inputs to the pattern neurons in a one-to-one manner through strong excitatory synapses. The pattern neurons can be treated as representations or abstractions of the signals from the sensory neurons.
Figure 5.4: Recurrent neuronal network for sequential memory.
The learning process of this recurrent neuronal network is shown in Fig. 5.5. After the start signal triggers the neuron chain, the neurons fire successively from N0 to N24, behaving like a clock. While the neuron chain is firing, any firing sequence of the pattern neurons from P0 to P4 will be learned by the STDP synapses between the sequential neurons and the pattern neurons.
Figure 5.5: Recurrent neuronal network during learning process.
After the learning process, this recurrent neuronal network enters the recall process, shown in Fig. 5.6. In the recall process, the sequential neuron chain is triggered by the recall signal to fire successively and the pattern neurons fire according to the learning result, while the sensory neurons are quiet. Due to the STDP, the strong excitatory synapses between the sensory neurons and the pattern neurons will be depressed to weak excitatory synapses. As a result of the recall process, the learning result is reinforced and the pattern neurons are isolated from the sensory neurons to complete the abstraction. In the case of multiple sequential neurons connecting to one pattern neuron, the STDP can lead to forgetting during the recall process. For example, N1 and N24 can both trigger P0 through learned strong excitatory synapses in Fig. 5.6. If N1 fires and triggers P0, the synapse between N24 and P0 would be weakened and forgotten. This issue is solved by setting the amplitude of strengthening to be 5 times larger than the amplitude of weakening. After several cycles of recall, the pattern neurons can be completely isolated from the sensory neurons with little influence on the synapses between the pattern neurons and the sequential neurons.
Figure 5.6: Recurrent neuronal network during recall process.
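A minimal sketch of this asymmetric update, with assumed step sizes, is given below.

# Sketch of the asymmetric update used in the recall process: each potentiation
# step is five times larger than each depression step, so a synapse reinforced
# once per recall cycle survives spurious depressions while unused
# sensory-to-pattern synapses fade away. Step sizes are assumed values.
W_MIN, W_MAX = 0.0, 1.0
DEPRESS = 0.02
POTENTIATE = 5 * DEPRESS

def recall_cycle(weight, reinforced):
    """One recall pass over a synapse: reinforced synapses gain, the rest decay."""
    if reinforced:
        return min(weight + POTENTIATE, W_MAX)
    return max(weight - DEPRESS, W_MIN)

w_learned, w_sensory = 0.6, 0.6
for _ in range(10):                       # several recall cycles
    w_learned = recall_cycle(w_learned, reinforced=True)
    w_sensory = recall_cycle(w_sensory, reinforced=False)
print(round(w_learned, 2), round(w_sensory, 2))   # learned link saturates, sensory link decays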
The circuit simulation result of sequential memory with complex patterns is shown in Fig. 5.7. The delay between sequential neurons is around 5.9 ns, so the start signal is given every 300 ns to start each period of the neuron chain. In the first period, the pattern neurons are given input sequence signals as the pattern to be learned, and the neuron chain finishes learning within this period. The following two periods recall what has been learned.
Figure 5.7: Simulation result of the recurrent neuronal network for sequential memory.
The most significant difference between the learning period and the recall period is the timing shift of the pattern neuron firing: during learning the pre and post signals occur within a small time window, but during replay the post signal is triggered by the pre signal. The shift is about 3 ns, so it does not significantly affect the accuracy of the learned sequence. However, our neuromorphic system is an online learning system, so this shift would trigger learning of the next sequential neuron during the replay period. As a result, the learned sequence would shift every period and corrupt the process. We solved this issue by finding the minimum overlap time window for two successive STDP processes in this circuit configuration. By applying and adjusting the overlap time window, this recurrent neuronal network can handle both the learning and recall processes without extra control. Specifically, if the time window allowing potentiation is shorter than the propagation delay from the pre neuron to the post neuron, the synaptic weight is decreased each time the post neuron fires. The missing signal in the third period of the S2 output is due to the case of multiple sequential neurons connecting to one pattern neuron, as discussed above.
After sequential memory is achieved by this recurrent neuronal network, the next challenge is solving a sequence recognition task. An extended recurrent neuronal network, shown in Fig. 5.8, is proposed to memorize and recognize sequences.
Figure 5.8: Extended recurrent neuronal network for sequential memory and recognition.
One more group of pattern neurons, without connections from the sequential neurons, is added to delay the newly arriving sequence for comparison with the memorized sequence. A group of detection neurons that fire only when they have two inputs is added to compare the new incoming sequence with the memorized sequence. The output of each detection neuron inhibits an always-firing check neuron, and the check neuron connects to all the sequential neurons through inhibitory synapses. If the new incoming sequence is the same as the memorized sequence, the detection neurons fire and inhibit the check neuron, so that the sequential neurons can keep firing successively. On the other hand, if the new incoming sequence is different from the memorized sequence, the detection neurons do not fire, so the check neuron is not inhibited and inhibits the sequential neurons to stop the comparison.
The learning process of this extended recurrent neuronal network, shown in Fig. 5.9, is almost the same as that of the original recurrent neuronal network. The start signal triggers the sequential neurons to fire successively. The pattern neurons fire according to the outputs of the sensory neurons. The excitatory synapses between the sequential neurons and the pattern neurons are potentiated or depressed through STDP. Since all the detection neurons have 2 inputs from pattern neurons, the check neuron is inhibited and thus the sequential neurons are not inhibited. An assumption is made here: the firing rate of the detection neurons is higher than the input decay rate of the check neuron, so that the check neuron is always inhibited during the learning process.
Figure 5.9: Extended Recurrent Neuronal Network during the Learning Process.
After the learning process, the circuit immediately enters the recall process, shown in Fig. 5.10. In the recall process, the sequential neuron chain is triggered by the recall signal to fire successively and the pattern neurons fire according to the learning result, while the sensory neurons are quiet. Due to the STDP, the strong excitatory synapses between the sensory neurons and the pattern neurons will be depressed to weak excitatory synapses. An assumption is made here that the PSP input of the check neuron has a long decay time, so several recall cycles can be processed without inhibition from the check neuron.
After the recall process, this extended recurrent neuronal network has memorized a sequence and is ready to recognize a new sequence. A circuit-level simulation is planned for the future; for that simulation, the details of the detection neuron, the check neuron, and the inhibitory synapse will be provided.
Figure 5.10: Extended recurrent neuronal network during recall process.
Chapter 6
Blank-slate Network for Biomimetic Robot
The human brain learns new information and skills without catastrophically forgetting existing information and skills. This work demonstrates that learning can occur in neuromorphic neuronal networks, without forgetting, with a combination of short-term synaptic plasticity implemented using STDP and long-term plasticity implemented with dopaminergic modulation of synaptic strength. We combine these plasticity mechanisms with synaptic variability to demonstrate learning from a "blank slate" neuronal network structured in a manner similar to the barrel cortex of the rat. We show this using a neuromorphic mechanism for dopaminergic spike-timing-dependent plasticity (STDP) that supports reinforcement learning. The blank-slate network is used as the control system of a biomimetic robot, serving as its cortex.
This work builds simulations of cortical electronic neurons connected in a way that models a cortical column in the biological brain. All synapses were initially too weak to evoke firing, and there was general connectivity within and across neuron chains. Inhibitory interneurons were used to focus firing on one neuron chain at a time, with weak connections that could be strengthened as firing occurred.
We implemented learning in our population of electronic neurons via malleable synaptic and dendritic computations, reinforced by successful interactions of the NeuRoBot's legs with the ground. Physical behavior was evoked by simulated neurons that initially fired only randomly and that were sensitive to neuromodulation, with dopamine-like mechanisms to inhibit or strengthen individual synapses and to silence or awaken synaptic networks to enable multiple context-dependent behaviors. Noisy synapses provided random neural behavior, and input signals invoking desire or fear, coupled with random firing, signaled the subcortical system to move forward or backward. The resultant movement, signaled back to the cortex, caused a dopaminergic neuron to release dopamine, strengthening synapses that had contributed to the movement. This created strengthened chains of neurons that learned a specific signal: move forward or move backward. A key element was the cortical-motor-physical-sensory feedback loop made possible by the NeuRoBot, which supported autonomous exploration and learning in the physical world.
6.1 Synapse with Dopamine Reward, Noise, and STDP
The neuromorphic multi-state synapse design with noise and STDP was introduced in Chapter 3; here a dopaminergic control is added to enable reward learning. The complete design is shown in Fig. 6.1.
Figure 6.1: Dopamine-modulated STDP Synapse Circuit. The gate names with arrows are inputs, while the others are biasing controls. No extra capacitance is added; charge storage relies on the capacitances of the transistors.
One more NT Increase gated transistor is added to make a copy of the weight-change trace as charge stored on node A, which connects to the Dopa-gated PMOS transistor, and a biased transistor named NT Trace Decay is used to decay this stored charge. If the Dopa signal arrives, the stored charge is released to the NT node, while the NT Decay node is temporarily discharged through the Dopa-gated NMOS transistor. Then the Dopa Decay gated transistor starts charging the NT Decay node back to 0.4 V. Due to the mutual exclusion of NT Increase and NT Decrease, the trace of NT Decrease is neglected in this design. Noise is injected through a capacitor. The STDP that changes synaptic strength is implemented using the existing 8-transistor STDP circuit of Fig. 3.3 (a), where the circuit detects the sequencing between the presynaptic spike pre and the postsynaptic spike post. Three scenarios can happen (a small behavioral sketch follows the list):
Scenario 1:
If both pre and post spikes arrive in sequence and the timing interval is short enough, the NT Increase pulse is generated to charge the NT node and node A.
Scenario 2:
If both pre and post spikes arrive in sequence with the timing interval slightly longer than in the first scenario, neither NT Increase nor NT Decrease is triggered.
Scenario 3:
If only the post signal arrives, or the time interval between the pre and post signals is long enough, the NT Decrease signal is triggered and releases a certain amount of charge from the NT node.
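A behavioral sketch of this dopamine-modulated update is given below; the decay time constants and step size are assumed values, and the model abstracts the charge dynamics rather than reproducing the transistor-level circuit.

# Sketch of the dopamine-modulated update: STDP events change the NT level in
# the short term and also deposit a decaying eligibility trace (node A); a
# dopamine pulse transfers the surviving trace to the NT node, making the
# change long-lasting. All constants are illustrative assumptions.
import math

TAU_NT_SHORT = 50e-9     # decay of the short-term NT change (assumed)
TAU_TRACE = 200e-9       # decay of the node A eligibility trace (assumed)
STEP = 0.05              # NT change per NT Increase / NT Decrease event (assumed)

class DopamineSTDPSynapse:
    def __init__(self):
        self.nt_short, self.nt_long, self.trace = 0.0, 0.0, 0.0

    def decay(self, dt):
        self.nt_short *= math.exp(-dt / TAU_NT_SHORT)
        self.trace *= math.exp(-dt / TAU_TRACE)

    def nt_increase(self):        # Scenario 1: pre closely followed by post
        self.nt_short += STEP
        self.trace += STEP        # copy stored at node A

    def nt_decrease(self):        # Scenario 3: post alone or long interval
        self.nt_short = max(self.nt_short - STEP, 0.0)

    def dopamine(self):           # reward: consolidate whatever trace survived
        self.nt_long += self.trace
        self.trace = 0.0

    def nt_level(self):
        return self.nt_long + self.nt_short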
The simulation result is shown in Fig. 6.2. In the first 50 ns, 4 pre-post spike pairs generate 3 NT Increase signals from the STDP circuit and change the voltage of the NT node in the short term, without a Dopa signal. All the nodes of the STDP circuit start from 0 V, so the charge from the first pre-post pair is insufficient to trigger NT Increase. The EPSP triggered by the pre spikes then shows responses with small variations. At about 60 ns, the Dopa signal enables long-term synapse weight change. The voltage of the NT node increases significantly and keeps increasing due to the following NT Increase signals. The EPSP triggered by pre spikes is also increased significantly. After 100 ns, post spikes without pre spikes trigger NT Decrease signals that reduce the voltage of the NT node. The EPSP decays without pre spikes.
Figure 6.2: Dopamine-Modulated STDP synapse Waveforms
6.2 The Blank Slate Network and Biomimetic Robot
We constructed a robot that had a cortex and associated neurons simulated in SPICE, a subcortical system in software, and the body of a robotic cat. The focus of the dissertation was on the cortical simulation.
Figure 6.3: Two-Chain Cortical Neuronal Network Simulated with SPICE
The robot we constructed learned to move and navigate in a biomimetic manner similar to a biological brain-body system. Its cortical neurons and other connected neurons in the brain began as a blank-slate neuronal network, with neurons arranged in chains that inhibited each other's activity and clustered in cortical `minicolumns.' The neurons initially had weak synapses and were densely connected, with inhibitory interneurons connecting chains of neurons to support mutual inhibition. Because the synapses were initially very noisy, neurons began firing spontaneously; with a stimulus like `desire to find target,' learning began to occur. These signals were sent to the subcortex to produce behavior and, if successful, signals were sent back to the cortex, where dopaminergic neurons rewarded the blank-slate neurons by giving persistence to synapse strength. As the NeuRoBot learned from successful experience, noise in the synapses decreased and cortical connectivity became more specific, representing learned behaviors. In conjunction with cortical learning, the NeuRoBot learned to associate neural firing with
useful movement, and was able to cause specific movement with peripheral neural firing. Once a movement behavior was learned, the NeuRoBot could learn a new behavior like moving backward when provided a different stimulus to the NeuRoBot's cortical neurons (e.g., fear). Moving in the (new) desired direction causes a signal representing dopamine to reward synapses in a chain or network of cortical neurons that contributed to the new movement by strengthening them over the long term. Importantly, this did not overwrite (forget) the ability to produce previously learned movements. The cortical subsystem that demonstrated learning was built on a blank slate neural network. Fig. 6.3 shows a fragment of a cortical column with a 2-chain neuronal network. The NA, D, and I circles are axon hillock circuits (the same as the axon hillock design in Fig. 3.4); excitatory synapses used the design of Fig. 6.1, and inhibitory synapses used the design of Fig. 2.1(b).
The waveform shown in Fig. 6.4 shows a simulation of the cortical subsystem without communication to the subcortical system. For learning to occur, neurons simulated in HSPICE were in a resting state and desire inputs signalled that action should occur, but the synapses were weak, so no spike occurred. A noisy synapse caused spiking in a neuron also being triggered by desire to signal the sub-cortical nervous system to move; any neurons that spiked as a result caused STDP, and short-term learning occurred. The sub-cortical system, if connected, would have caused the robot to move forward as a result of the cortical command. The robot would have signalled that forward motion had occurred, resulting in dopamine that created long-term strengthening of the synapses that experienced STDP. Long-term learning would have occurred. After enough iterations, synapses were strong enough that desire without noise could cause spiking and signaling of the sub-cortical system, boosting the signal. The learning process repeated for fear without forgetting the strengthened desire pathway.
Subsequent to this simulation, we demonstrated the same type of situation with subcortical subsystem involvement and actual movement from the robot, as shown in Fig. 6.5. In the cortex part (a Linux computer), Python code was in charge of iteratively running HSPICE simulations of this network for 1 ms of
Figure 6.4: Waveform showing two neurons signaling forward and backward movement,
NA1 and NA2. Simulated stimulus and response of subcortical system when desire and
noise were introduced caused NA1 to spike, dopamine was released and strengthened
synapses in the A chain. Fear and noise caused neuron NB1 to spike, and simulated
movement caused dopamine to be released, strengthening synapses in the B chain.
simulation time, recording the simulation results of each iteration, and communicating with the robot through a socket. In the subcortical part, MATLAB code communicates with the cortex and controls the robot. A video is available online (https://tinyurl.com/vv7ggmr) showing the robot's legs walking forwards under desire stimulation (to find a target) and walking backwards under fear stimulation (to retreat from danger) as the cortical neuron simulation proceeds. The video shows example cortical synapse strengthening as the simulation progresses, and the voice-over describes the simulation.
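A condensed sketch of the cortex-side loop is given below as an outline only; the HSPICE deck name, socket port, helper functions, and message format are hypothetical placeholders, not the exact scripts used in the demonstration.

# Sketch of the cortex-side control loop (all names and ports are assumptions).
import socket
import subprocess

HOST, PORT = "127.0.0.1", 5005           # assumed address of the MATLAB side
DECK = "cortex_2chain.sp"                # assumed HSPICE netlist (1 ms run)

def read_spike_result(waveform_file):
    """Placeholder: parse the HSPICE output and return 'FORWARD', 'BACKWARD',
    or 'NONE' depending on which motion neuron (NA1/NB1) spiked."""
    return "FORWARD"

def update_deck_with_dopamine(deck, reward):
    """Placeholder: rewrite the Dopa source in the netlist when the robot
    reports a successful movement, giving persistence to the STDP changes."""
    pass

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
    sock.connect((HOST, PORT))
    for it in range(100):
        # Run one 1 ms HSPICE simulation of the cortical network.
        subprocess.run(["hspice", "-i", DECK, "-o", f"iter_{it}"], check=True)
        command = read_spike_result(f"iter_{it}.tr0")
        sock.sendall(command.encode())            # cortical command to robot
        moved = sock.recv(64).decode() == "OK"    # did the robot move?
        update_deck_with_dopamine(DECK, reward=moved)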
Figure 6.5: Waveform showing two neurons signaling forward and backward movement, NA1 and NA2. Simulated stimulus and response of the subcortical system when desire and noise were introduced caused NA1 to spike; dopamine was released and strengthened synapses in the A chain. Fear and noise caused neuron NB1 to spike, and simulated movement caused dopamine to be released, strengthening synapses in the B chain. This screenshot is from the video described earlier.
Chapter 7
Future Research
All the proposed work targets a biomimetic neuromorphic system with universal capability like a brain. The noisy neurons will substitute for the work demonstrated in Chapter 5. I will then show that a new pattern whose features overlap with responses learned in previous training can be distinguished thanks to the noisy neuron, and the noise will be used as a kick-off condition for the neuronal network. This work will be finished by May 2021. The MoS2-based astrocytes are currently implemented only within an extremely small neural network to present their theoretical function. By incorporating the astrocytes in the proposed neuromorphic system framework, the power, area, and complexity will be compared with CMOS astrocyte designs, and astrocyte effects such as synaptic communication will be demonstrated. This work will be done by September 2021. In the proposed neuromorphic system framework, the MAM is used as persistence memory, but a comparison of the persistence capability is missing. A CMOS capacitance-based memory design will be compared with the MAM in mathematical calculation and SPICE simulation to demonstrate the persistence quantitatively. To show the differences between this neuromorphic system framework and other neuromorphic systems, more engineering details of asynchrony, power, and parallelism for different applications will be extracted and summarized. A lifelong-learning neuronal network with dendritic computation is also expected. This network should be capable of recognizing figure patterns with angle information in each pixel. The circuit-level simulation of the extended recurrent neuronal network will also be done. This work will be done by December 2021.
Appendix A
Simulation: Astrocyte Deep Brain Stimulation in Parkinson's Disease in the Blood-Brain Barrier
A.1 Motivation
Traditional brain stimulation methods such as deep brain stimulation have played critical roles as treatments for neurological disorders, but this method cannot be a candidate for large-scale neural modulation due to its limited stimulation spots and invasive procedure. Other brain stimulation methods, such as transcranial magnetic stimulation, have issues with accuracy. Therefore, a minimally invasive or non-invasive large-scale stimulation method is needed for neurological modulation. The modulation can be used as an improved treatment for neurological disorders such as Parkinson's Disease (PD), Epilepsy, and Alzheimer's Disease. Furthermore, the modulation can also affect human brain memory and learning systems, which could be used to delay the aging process.
Magneto-electric nanoparticles can be delivered through the blood and triggered by an external magnetic field as a minimally invasive large-scale method. The locations and amplitude of stimulation can also be precisely manipulated by the external magnetic field.
Astrocytes have been shown to act as neural modulators in neural networks and form part of the blood-brain barrier (BBB). Therefore, the particles can be located outside the BBB to reduce toxicity and invasive effects. Moreover, fewer side effects from faulty stimulation can be expected with this indirect stimulation method.
A.2 Stimulation Method
Unlike surface-limited electric fields that are typically generated by invasive contact electrodes, magnetic fields generated by injected ME nanoparticles can effectively penetrate the entire brain non-invasively, and be "switched" on and off remotely using external low-energy magnetic field sources. The nanoparticles shown in Fig. A.1 must satisfy certain requirements on the strength of magneto-electric (ME) coupling (defined by the ME coefficient). Magneto-electric nanoparticles (MNPs) must be smaller than approximately 20 nm in diameter to penetrate the blood-brain barrier (BBB). Having the size of the ME nanoparticles within the BBB-defined boundary enables adequate delivery of the nanoparticles into selected brain regions. Only a very low intensity external magnetic field is required to stimulate brain activity at any depth in the brain. The external magnetic field can be focused to act upon ME nanoparticles in any particular region of the brain. The external magnetic field generates AC signals in the ME nanoparticles that are correlated with
Figure A.1: (a) SEM Image of Magneto-electric Nano Particles. (b) Polarization by
External Magnetic Field. (c) AFM Image of Magneto-electric Nano Particles.
the frequency spectrum of the neural charge activity, which in turn causes neurons in the region to fire at similar frequencies.
A.3 Results of Astrocyte Stimulation for Parkinson's Disease Treatment
Compared to a traditional stimulation electrode, MNPs can only generate an electrical potential rather than a current. Hence, the response of astrocytes to electrical potential attracted our attention. The voltage-gated calcium channel of the astrocyte has been studied for years; this channel allows calcium ions to enter the astrocyte according to the membrane voltage. The electrical potential of the MNPs is used to change the membrane voltage of astrocytes to open the calcium channels and then stimulate a calcium wave due to calcium-induced calcium oscillation inside the astrocytes. The high concentration of calcium ions inside astrocytes leads to the release of glutamate, an excitatory neurotransmitter for AMPA and NMDA receptors. The released glutamate will excite the AMPAR/NMDAR, and the receptors will then open for sodium and calcium ions to generate depolarization of neurons. If the interneurons to the Thalamus are excited, the Thalamus will be inhibited by the GABA from the interneurons to reduce the abnormal behavior due to Parkinson's Disease. This process is similar to the way the Thalamus is GABAergically inhibited by the GPi under traditional DBS of the STN or GPi. The whole process is shown as a flow diagram in Fig. A.2. The simulation details are demonstrated in the following subsections.
Figure A.2: The Flow Diagram of the Astrocytes Stimulation for Parkinson's Disease
Treatment
A.3.1 Voltage Gated Calcium Channels
The astrocyte is separated from the extracellular space by the membrane, with two compartments representing extracellular space and intracellular space. The different types of VGCCs are distributed in the cellular membrane and form the pathway of calcium ion influx. The model of the VGCC has been improved progressively in recent years, revealing different sub-types with specific parameters [81]. A simplified voltage channel model [82], given by Eqs. A.1 and A.2, is adopted here to describe the approximate calcium current and demonstrate the charge flux, where $N_{ch}$ is the number of calcium channels, $g_{ca}$ is the conductance of the channels, $P_{ca}$ is the probability of activation and inactivation of the channels, and $E_{ca}$ is the chemical potential of the calcium ions. The relation between membrane potential and calcium current is then shown in Fig. A.3. The maximum conductance of these channels is reached at about 0.1 V; the polarization of the MNPs is sufficient to open the channels.
$$I_{ca} = N_{ch}\, g_{ca}\, P_{ca}(t,V)\,(V - E_{ca}) \qquad (A.1)$$
$$P_{ca}(t,V) = a(t,V)^{2} + i(t,V)^{2} \qquad (A.2)$$
where $a$ and $i$ denote the activation and inactivation variables, respectively.
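For illustration, Eq. A.1 can be evaluated numerically as in the Python sketch below; the channel count, conductance, gating function, and reversal potential are assumed values chosen only for illustration, not the fitted parameters used in the dissertation's simulations.

# Illustrative evaluation of Eq. A.1 (parameter values are assumptions only).
import math

N_CH = 1000          # number of VGCCs on the membrane (assumed)
G_CA = 2e-12         # single-channel conductance in siemens (assumed)
E_CA = 0.12          # calcium chemical (reversal) potential in volts (assumed)

def p_ca(v):
    """Toy gating term standing in for P_ca(t, V); not the fitted model."""
    activation = 1.0 / (1.0 + math.exp(-(v + 0.02) / 0.006))
    inactivation = 1.0 / (1.0 + math.exp((v - 0.15) / 0.02))
    return activation * inactivation

def i_ca(v):
    """Approximate calcium current of Eq. A.1 at membrane voltage v."""
    return N_CH * G_CA * p_ca(v) * (v - E_CA)

for v in (0.0, 0.05, 0.10):
    print(f"V = {v:.2f} V -> I_ca = {i_ca(v):.3e} A")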
A.3.2 Calcium-induced Calcium Oscillation
Many experimental results [83] suggested that an inositol 1,4,5-triphosphate (IP3) dependent calcium-induced calcium release (CICR) mechanism is involved in Ca2+ oscillations. This was central to the process in the intracellular space (ICS). In fact,
Figure A.3: Charge Moved into Astrocyte with Membrane Potentials and Accumulated Times
the Ca2+ oscillations need both extracellular and intracellular Ca2+ contributions. The Ca2+ ions entering through VGCCs increase the IP3 concentration; the IP3 receptors on the Endoplasmic Reticulum (ER) are then activated and release more Ca2+ ions. An illustration of the CICR process is shown in Fig. A.4a. Voltage-gated calcium channels (VGCCs) were found to be involved in Ca2+ oscillations in pharmacological trials. The putative notion of Ca2+ oscillations in astrocytes was that a small influx of Ca2+ into the cytoplasm via VGCCs induces CICR activated by IP3. To determine the threshold of the calcium flux through VGCCs and the amplitude of the calcium oscillation in astrocytes, the simulation of CICR is carried out using the model of [84].
Eq. A.3:
$$\frac{dCa_{cyt}}{dt} = J_{VGCC} - P_{out}\,Ca_{cyt} + J_{CICR} - J_{SERCA} + P_{f}\,(Ca_{ER} - Ca_{cyt}) \qquad (A.3)$$
describes the correlation between the small influx of Ca2+ from the VGCCs and the calcium oscillation inside the astrocyte cell, where $J_{VGCC}$ is the current density from the VGCCs, $P_{out}\,Ca_{cyt}$ represents the rate of calcium flux from the cytosol into the extracellular space, $J_{CICR}$ represents the IP3-mediated CICR flux to the cytosol from the ER, $J_{SERCA}$ is the flux from the sarcoplasmic/endoplasmic reticulum calcium ATPase (SERCA), which fills the ER with calcium ions from the cytosol, and $P_{f}\,(Ca_{ER} - Ca_{cyt})$ describes the leakage flux from the ER into the cytosol along the concentration gradient. The simulation result is shown in Fig. A.4b.
Eq. A.4:
$$\frac{dCa_{ER}}{dt} = J_{SERCA} - J_{CICR} - P_{f}\,(Ca_{ER} - Ca_{cyt}) \qquad (A.4)$$
describes the flux from the ER, and the simulation result is shown in Fig. A.4c.
Eq. A.5:
$$\frac{dIP3}{dt} = J_{PLC} - P_{deg}\,IP3_{cyt} \qquad (A.5)$$
describes the IP3 alteration inside the astrocyte cell, where $J_{PLC}$ denotes IP3 production and $P_{deg}\,IP3_{cyt}$ represents IP3 degradation. The simulation result is shown in Fig. A.4d. In the above three equations, $P_{out}$, $P_{f}$, and $P_{deg}$ are constants.
Through this CICR simulation, the initial concentration of calcium from the VGCCs is optimized to be about 0.1 µmol, and the product of the channel number and activation time should be above $2\times10^{-7}$ s to induce the CICR.
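The three equations can be integrated together with a simple forward-Euler loop, as in the sketch below. All flux expressions, rate constants, and initial values here are placeholder assumptions chosen only to show the coupling structure of Eqs. A.3 to A.5, not the calibrated model of [84].

# Forward-Euler sketch of Eqs. A.3-A.5 (all flux forms and constants assumed).
P_OUT, P_F, P_DEG = 0.5, 0.1, 0.8   # assumed rate constants
DT = 1e-3                            # time step, s

def j_vgcc(t):
    """Assumed small constant VGCC influx switched on after t = 1 s."""
    return 0.2 if t > 1.0 else 0.0

def j_cicr(ca_cyt, ca_er, ip3):
    """Toy IP3-gated CICR flux from the ER (Hill-type gating, assumed)."""
    gate = (ip3**2 / (ip3**2 + 0.3**2)) * (ca_cyt**2 / (ca_cyt**2 + 0.2**2))
    return 6.0 * gate * (ca_er - ca_cyt)

def j_serca(ca_cyt):
    """Toy SERCA pump refilling the ER from the cytosol (assumed)."""
    return 1.0 * ca_cyt**2 / (ca_cyt**2 + 0.1**2)

ca_cyt, ca_er, ip3 = 0.1, 1.5, 0.1   # assumed initial concentrations
t = 0.0
for _ in range(int(5.0 / DT)):       # integrate 5 s
    j_in, j_rel, j_pump = j_vgcc(t), j_cicr(ca_cyt, ca_er, ip3), j_serca(ca_cyt)
    leak = P_F * (ca_er - ca_cyt)
    d_cyt = j_in - P_OUT * ca_cyt + j_rel - j_pump + leak   # Eq. A.3
    d_er = j_pump - j_rel - leak                            # Eq. A.4
    d_ip3 = 0.5 * ca_cyt - P_DEG * ip3                      # Eq. A.5 (J_PLC assumed)
    ca_cyt += DT * d_cyt
    ca_er += DT * d_er
    ip3 += DT * d_ip3
    t += DT
print(f"final Ca_cyt = {ca_cyt:.3f}, Ca_ER = {ca_er:.3f}, IP3 = {ip3:.3f}")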
Figure A.4: (a) The Diagram of CICR in Astrocytes. (b) The Ca2+ Oscillation in Astrocyte Cytosol. (c) The Ca2+ Oscillation in ER. (d) The IP3 Oscillation in ER.
A.3.3 Glutamate and Receptors
The released glutamate from astrocytes will excite both the AMPAR and the NMDAR; the receptors will then open for Na+ and Ca2+ to generate depolarization in neurons. The AMPAR responds only to the neurotransmitter concentration, and its conductance can reach 1 with a sufficient glutamate supply, which means this receptor will open its ion channels given a sufficient duration of astrocyte calcium oscillation. The model of the AMPAR is adopted from [85]. The simulation result is shown in Fig. A.5; the red line indicates the concentration of glutamate and the black line is the conductance of the ion channel.
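The saturating AMPAR behavior described above can be approximated with a generic two-state (closed/open) kinetic scheme. The sketch below uses assumed rate constants and is not the specific receptor model adopted from [85].

# Generic two-state AMPAR gating sketch (rate constants are assumptions).
ALPHA = 1.1e6   # opening rate per (M*s): glutamate binding (assumed)
BETA = 190.0    # closing rate per s: unbinding / channel closure (assumed)
DT = 1e-5       # time step, s

def step(r_open, glutamate):
    """One Euler step of dr/dt = ALPHA*[Glu]*(1 - r) - BETA*r."""
    return r_open + DT * (ALPHA * glutamate * (1.0 - r_open) - BETA * r_open)

r = 0.0
for k in range(2000):
    t = k * DT
    glu = 1e-3 if t < 1e-2 else 0.0   # 1 mM glutamate pulse for the first 10 ms
    r = step(r, glu)
print(f"open fraction 10 ms after the pulse ends: {r:.4f}")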
Figure A.5: AMPAR Conductance Simulation with Time and Neurotransmitter Concentration
The extrasynaptic NMDAR response to Ca2+ oscillations in astrocytes has been reported by [26]. Both the membrane voltage and the neurotransmitter contribute to controlling the conductance of the NMDAR. The MNP polarization can supply the voltage for the neurons close to the BBB, and the astrocytic glutamate will mediate the NMDAR. The flux of the slow inward current through the NMDAR can then be calculated using the model of [86]. The conductance variation of the NMDAR is shown in Fig. A.6a, and the charge flow through the NMDAR is shown in Fig. A.6b.
Figure A.6: (a) Conductance of NMDAR. (b) Charge Flow through NMDAR.
A.3.4 Neurological Modulation for Parkinson's Disease
The neurological modulation using deep brain stimulation (DBS) for Parkinson's Disease (PD) has been widely studied with simulations and experiments in this decade. In this work, the computational procedure was built on top of a conventional model from [87] used to simulate the electric field dynamics in the neural network. In this basic model, the activities of the Thalamus and the Basal Ganglia are simulated to mimic the neural dynamics related to PD. Inside the Basal Ganglia, a small network including the STN, GPi, and external globus pallidus (GPe) is simulated. The initial input is applied to the STN, and the STN then excites both the GPe and the GPi. The GPe feeds back inhibition to the whole Basal Ganglia, while the output of the GPi inhibits the Thalamus. The simulation result of the normal control is shown in Fig. A.7.
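To make this connectivity concrete, the sketch below wires the four populations (STN, GPe, GPi, Thalamus) as simple leaky rate units with the excitatory and inhibitory signs described above. It is only a toy illustration of the loop structure; the weights, time constants, and drives are assumptions, and it is not the conductance-based model of [87].

# Toy rate-model sketch of the STN-GPe-GPi-Thalamus loop (signs only;
# weights, time constants, and drives are illustrative assumptions).
import math

DT, TAU = 1e-3, 0.02                 # time step and common time constant (s)

def relu(x):
    return x if x > 0.0 else 0.0

stn = gpe = gpi = thal = 0.0
for k in range(2000):                # simulate 2 s
    drive = 1.0                      # initial input applied to the STN
    sensory = 0.5 * (1 + math.sin(2 * math.pi * 10 * k * DT))  # thalamic input

    d_stn = (-stn + relu(drive - 0.6 * gpe)) / TAU        # GPe inhibits STN
    d_gpe = (-gpe + relu(0.8 * stn - 0.3 * gpe)) / TAU    # STN excites GPe
    d_gpi = (-gpi + relu(0.9 * stn - 0.6 * gpe)) / TAU    # STN excites, GPe inhibits GPi
    d_thal = (-thal + relu(sensory - 0.7 * gpi)) / TAU    # GPi inhibits Thalamus

    stn += DT * d_stn
    gpe += DT * d_gpe
    gpi += DT * d_gpi
    thal += DT * d_thal

print(f"steady activities: STN={stn:.2f} GPe={gpe:.2f} GPi={gpi:.2f} Thal={thal:.2f}")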
Figure A.7: Simulation Result of Normal Control
For comparison, Fig. A.8 illustrates typical signals in the same four parts of the brain of a patient suffering from Parkinson's Disease. The most drastic difference from the case of the healthy person is the appearance of pronounced lapses in the periodic pulsed sequences, particularly in the Thalamus region. Also, the periodicity of the pulses in the other regions is broken.
Figure A.8: Simulation Result of Parkinson's Disease
The traditional DBS located the stimulation site at the STN to excite more firing of the GPi and thereby inhibit the burst firing of the neurons in the Thalamus. The DBS-PD treatment neural diagram is shown in Fig. A.9.
The possible mechanism is that DBS led to improvements in thalamic throughput fidelity: either driving the output of the basal ganglia to reduce bursting in the GPi, producing tonic inhibition of the thalamus, or silencing the output of the basal ganglia to produce tonic disinhibition of the thalamus, consistent with the classical view of the "virtual lesion" effects of DBS [88]. The simulation of the DBS effect in the PD model is shown in Fig. A.10; 130 Hz DBS stimulation of the STN resulted in high-frequency regular firing of neurons in both GPe and GPi due to the excitatory projections from the STN to the GP.
Figure A.9: Connections within the network are according to known biology of the
astrocytes, basal ganglia, and Thalamus with DBS
Activation of the GPi produced regular high-frequency firing only in these GPi fibers, and since the GPi had no efferent connections to the STN or GPe, the firing patterns
Figure A.10: Simulation Result of Parkinson's Disease with DBS
of the GPe and STN neurons were not affected by activation of the GPi. The GPi has inhibitory connections to the thalamic cells, and regularization of the output from the GPi resulting from either STN or GPi activation resulted in the thalamus receiving regular tonic inhibition, which led to higher thalamic fidelity.
In this work, we propose to use MNPs to stimulate the astrocytes and thereby achieve Thalamus tonic inhibition through interneurons. The Astrocyte-PD treatment neural diagram is shown in Fig. A.11. To account for the effect of ME nanoparticles, we made the following assumptions, which can be justified at this early stage of research. First, with the average diameter of the ME nanoparticles below 50 nm in a neural system whose smallest feature size is at least an order of magnitude larger, we can use a trivial point-dipole approximation to model the electric field of each local nanoparticle [89]. Second, we assume a uniform distribution of nanoparticles in the BBB region under study. This assumption is valid because
Figure A.11: Connections within the network are according to known biology of the
astrocytes, basal ganglia, and Thalamus with MNPs
the nanoparticles, due to their average diameters, experience a relatively negligible resistance from the surrounding tissues and therefore their spatial distribution in the ground state can be controlled by an external magnetic field. The electric dipole moment of each nanoparticle, P, is determined by the external magnetic field, H, according to the expression
$$P_{i} = \sum_{j} \alpha_{ij} H_{j}, \qquad (A.6)$$
where $\alpha_{ij}$ is the first-order linear magneto-electric (ME) tensor coefficient. Therefore, assuming an isotropic matrix (with identical diagonal coefficients and zero off-diagonal coefficients) with a typical value for $\alpha_{ij}$ of 100 V cm$^{-1}$ Oe$^{-1}$ and a local magnetic field of 300 Oe, the polarization-induced field at the location of the nanoparticle would be 30 kV cm$^{-1}$.
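As a quick check, this number is just Eq. A.6 evaluated with a single isotropic coefficient:
$$P = \alpha H = \left(100\ \mathrm{V\,cm^{-1}\,Oe^{-1}}\right)\times\left(300\ \mathrm{Oe}\right) = 3\times10^{4}\ \mathrm{V\,cm^{-1}} = 30\ \mathrm{kV\,cm^{-1}}.$$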
In the current arrangement, the purpose of the ME nanoparticles is to act as additional stimulation sources of electric fields that
can be precisely controlled by external magnetic fields because of the non-zero ME constant. The nanoparticles can be considered as finely controlled stimulation switches that can enable high-precision (with nanoscale localization) and high-throughput (energy-efficient) non-invasive medical procedures. To artificially trigger (stimulate) electric pulses in the brain region under study, with the purpose of preventing or compensating for any illness-caused malfunctions or lapses in a periodic chain of electric signals in the parts of the neural system occupied by the nanoparticles, we would apply AC magnetic fields at the matching frequencies. Fig. A.12 shows the simulation result of the astrocyte modulation in the PD model. The interneurons inhibit the Thalamus directly without connection to the Basal Ganglia; higher thalamic fidelity is achieved by tuning the astrocyte stimulation while the Basal Ganglia retains the same activity as in PD.
Figure A.12: Simulation Result of Parkinson's Disease with Astrocyte Stimulation of
MNPs
A.4 Conclusion
Based on the research of astrocytes in neuroscience, this work shows a new large-scale neuromodulation method that targets astrocytes with magneto-electric nanoparticles. Compared with electrodes, nanoparticles can target the highly distributed astrocytes. On the other hand, astrocytes allow the nanoparticles to remain in the BBB to reduce toxicity. We simulate the whole system, from stimulation and brain circuitry to the modulation results. The magneto-electric nanoparticles non-invasively stimulate the astrocytes in the BBB of a patient with Parkinson's Disease, and the pulsed sequences of the neural networks were brought to levels comparable to those of healthy people. The preliminary results of this study suggest that ME nanoparticles can lead the way to implementing nanotechnology to improve our understanding of the astrocyte neural network and to developing new nano-medical methods for non-invasive treatment of neural system diseases through astrocyte stimulation.
Bibliography
[1] C.-C. Hsu, "Dendritic computation and plasticity in neuromorphic circuits," Ph.D. dissertation, University of Southern California, 2014.
[2] S. Barzegarjalali and A. C. Parker, "A hybrid neuromorphic circuit demonstrating schizophrenic symptoms," in Biomedical Circuits and Systems Conference (BioCAS), 2015 IEEE. IEEE, 2015, pp. 1–4.
[3] J. Joshi, A. Parker, and C.-C. Hsu, "A carbon nanotube cortical neuron with spike-timing-dependent plasticity," in Engineering in Medicine and Biology Society, 2009. EMBC 2009. Annual International Conference of the IEEE, Sept 2009, pp. 1651–1654.
[4] A. C. Parker, J. Joshi, C.-C. Hsu, and N. A. D. Singh, "A carbon nanotube implementation of temporal and spatial dendritic computations," in Circuits and Systems, 2008. MWSCAS 2008. 51st Midwest Symposium on. IEEE, 2008, pp. 818–821.
[5] Y. Irizarry-Valle and A. C. Parker, "An astrocyte neuromorphic circuit that influences neuronal phase synchrony," IEEE Transactions on Biomedical Circuits and Systems, vol. 9, no. 2, pp. 175–187, 2015.
[6] K. S. Novoselov, A. K. Geim, S. Morozov, D. Jiang, M. Katsnelson, I. Grigorieva, S. Dubonos, and A. A. Firsov, "Two-dimensional gas of massless Dirac fermions in graphene," Nature, vol. 438, no. 7065, p. 197, 2005.
[7] H. Wang, L. Yu, Y.-H. Lee, Y. Shi, A. Hsu, M. L. Chin, L.-J. Li, M. Dubey, J. Kong, and T. Palacios, "Integrated circuits based on bilayer MoS2 transistors," Nano Letters, vol. 12, no. 9, pp. 4674–4680, 2012.
[8] B. Radisavljevic, A. Radenovic, J. Brivio, V. Giacometti, and A. Kis, "Single-layer MoS2 transistors," Nature Nanotechnology, vol. 6, no. 3, pp. 147–150, 2011.
[9] M. Gholipour, Y.-Y. Chen, and D. Chen, "Flexible transition metal dichalcogenide field-effect transistor (TMDFET) HSPICE model," 2016.
[10] S. B. Desai, S. R. Madhvapathy, A. B. Sachid, J. P. Llinas, Q. Wang, G. H. Ahn, G. Pitner, M. J. Kim, J. Bokor, C. Hu et al., "MoS2 transistors with 1-nanometer gate lengths," Science, vol. 354, no. 6308, pp. 99–102, 2016.
[11] J. Joshi, J. Zhang, C. Wang, C. Hsu, A. Parker, C. Zhou, and U. Ravishankar, "A biomimetic fabricated carbon nanotube synapse for prosthetic applications," in IEEE/NIH 2011 Life Science Systems and Applications Workshop, 2011.
[12] S. A. Wolf, A. Y. Chtchelkanova, and D. M. Treger, "Spintronics – a retrospective and perspective," IBM Journal of Research and Development, vol. 50, no. 1, pp. 101–110, 2006.
[13] K. Yogendra, D. Fan, B. Jung, and K. Roy, "Magnetic pattern recognition using injection-locked spin-torque nano-oscillators," IEEE Transactions on Electron Devices, vol. 63, no. 4, pp. 1674–1680, Apr. 2016.
[14] G. Srinivasan, A. Sengupta, and K. Roy, "Magnetic tunnel junction based long-term short-term stochastic synapse for a spiking neural network with on-chip STDP learning," Scientific Reports, vol. 6, p. 29545, 2016.
[15] J. Torrejon, M. Riou, F. A. Araujo, S. Tsunegi, G. Khalsa, D. Querlioz, P. Bortolotti, V. Cros, K. Yakushiji, A. Fukushima, H. Kubota, S. Yuasa, M. D. Stiles, and J. Grollier, "Neuromorphic computing with nanoscale spintronic oscillators," Nature, vol. 547, no. 7664, p. 428, Jul. 2017. [Online]. Available: https://www.nature.com/articles/nature23011
[16] D. C. Ralph and M. D. Stiles, "Spin transfer torques," Journal of Magnetism and Magnetic Materials, vol. 320, no. 7, pp. 1190–1216, Apr. 2008. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0304885307010116
[17] J. Sinova, S. O. Valenzuela, J. Wunderlich, C. Back, and T. Jungwirth, "Spin Hall effects," Rev. Mod. Phys., vol. 87, no. 4, pp. 1213–1260, Oct. 2015. [Online]. Available: https://link.aps.org/doi/10.1103/RevModPhys.87.1213
[18] J. A. Katine, F. J. Albert, R. A. Buhrman, E. B. Myers, and D. C. Ralph, "Current-driven magnetization reversal and spin-wave excitations in Co/Cu/Co pillars," Phys. Rev. Lett., vol. 84, no. 14, pp. 3149–3152, Apr. 2000. [Online]. Available: https://link.aps.org/doi/10.1103/PhysRevLett.84.3149
[19] L. Liu, C.-F. Pai, Y. Li, H. W. Tseng, D. C. Ralph, and R. A. Buhrman, "Spin-torque switching with the giant spin Hall effect of tantalum," Science, vol. 336, no. 6081, pp. 555–558, May 2012. [Online]. Available: http://science.sciencemag.org/content/336/6081/555
[20] M. Hu, Y. Wang, Q. Qiu, Y. Chen, and H. Li, "The stochastic modeling of TiO2 memristor and its usage in neuromorphic system design," in Design Automation Conference (ASP-DAC), 2014 19th Asia and South Pacific. IEEE, 2014, pp. 831–836.
[21] I. Boybat, M. L. Gallo, T. Moraitis, T. Parnell, T. Tuma, B. Rajendran, Y. Leblebici, A. Sebastian, E. Eleftheriou et al., "Neuromorphic computing with multi-memristive synapses," arXiv preprint arXiv:1711.06507, 2017.
[22] T. J. Sejnowski, "The unreasonable effectiveness of deep learning in artificial intelligence," Proceedings of the National Academy of Sciences, 2020.
[23] P. R. Montague, P. Dayan, and T. J. Sejnowski, "A framework for mesencephalic dopamine systems based on predictive Hebbian learning," Journal of Neuroscience, vol. 16, no. 5, pp. 1936–1947, 1996.
[24] E. Marder, "Neuromodulation of neuronal circuits: back to the future," Neuron, vol. 76, no. 1, pp. 1–11, 2012.
[25] A. Volterra, N. Liaudet, and I. Savtchouk, "Astrocyte Ca2+ signalling: an unexpected complexity," Nature Reviews Neuroscience, vol. 15, no. 5, pp. 327–335, 2014.
[26] T. Fellin, O. Pascual, S. Gobbo, T. Pozzan, P. G. Haydon, and G. Carmignoto, "Neuronal synchrony mediated by astrocytic glutamate through activation of extrasynaptic NMDA receptors," Neuron, vol. 43, no. 5, pp. 729–743, 2004.
[27] J. Joshi, A. Parker, and K.-C. Tseng, "In-silico glial microdomain to invoke excitability in cortical neural networks," in Submitted to International Symposium on Circuits and Systems, 2011.
[28] P. Gao, B. V. Benjamin, and K. Boahen, "Dynamical system guided mapping of quantitative neuronal models onto neuromorphic hardware," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 59, no. 10, pp. 2383–2394, 2012.
[29] R. A. Nawrocki, R. M. Voyles, and S. E. Shaheen, "A mini review of neuromorphic architectures and implementations," IEEE Transactions on Electron Devices, vol. 63, no. 10, pp. 3819–3829, 2016.
[30] N. Qiao, H. Mostafa, F. Corradi, M. Osswald, F. Stefanini, D. Sumislawska, and G. Indiveri, "A reconfigurable on-line learning spiking neuromorphic processor comprising 256 neurons and 128k synapses," Frontiers in Neuroscience, vol. 9, p. 141, 2015.
[31] T. Sharp, F. Galluppi, A. Rast, and S. Furber, "Power-efficient simulation of detailed cortical microcircuits on SpiNNaker," Journal of Neuroscience Methods, vol. 210, no. 1, pp. 110–118, 2012.
[32] P. A. Merolla, J. V. Arthur, R. Alvarez-Icaza, A. S. Cassidy, J. Sawada, F. Akopyan, B. L. Jackson, N. Imam, C. Guo, Y. Nakamura et al., "A million spiking-neuron integrated circuit with a scalable communication network and interface," Science, vol. 345, no. 6197, pp. 668–673, 2014.
[33] M. Davies, N. Srinivasa, T.-H. Lin, G. Chinya, Y. Cao, S. H. Choday, G. Dimou, P. Joshi, N. Imam, S. Jain et al., "Loihi: A neuromorphic manycore processor with on-chip learning," IEEE Micro, vol. 38, no. 1, pp. 82–99, 2018.
[34] A. Sengupta, P. Panda, P. Wijesinghe, Y. Kim, and K. Roy, "Magnetic tunnel junction mimics stochastic cortical spiking neurons," Scientific Reports, vol. 6, p. 30039, 2016.
[35] T. Serrano-Gotarredona, T. Masquelier, T. Prodromakis, G. Indiveri, and B. Linares-Barranco, "STDP and STDP variations with memristors for spiking neuromorphic learning systems," Frontiers in Neuroscience, vol. 7, p. 2, 2013.
[36] J. A. Pérez-Carrasco, C. Zamarreño-Ramos, T. Serrano-Gotarredona, and B. Linares-Barranco, "On neuromorphic spiking architectures for asynchronous STDP memristive systems," in Circuits and Systems (ISCAS), Proceedings of 2010 IEEE International Symposium on. IEEE, 2010, pp. 1659–1662.
[37] S. H. Jo, T. Chang, I. Ebong, B. B. Bhadviya, P. Mazumder, and W. Lu, "Nanoscale memristor device as synapse in neuromorphic systems," Nano Letters, vol. 10, no. 4, pp. 1297–1301, 2010.
[38] G. Indiveri, B. Linares-Barranco, R. Legenstein, G. Deligeorgis, and T. Prodromakis, "Integration of nanoscale memristor synapses in neuromorphic computing architectures," Nanotechnology, vol. 24, no. 38, p. 384010, 2013.
[39] K. Cameron, V. Boonsobhak, A. Murray, and D. Renshaw, "Spike timing dependent plasticity (STDP) can ameliorate process variations in neuromorphic VLSI," IEEE Transactions on Neural Networks, vol. 16, no. 6, pp. 1626–1637, 2005.
[40] J.-s. Seo, B. Brezzo, Y. Liu, B. D. Parker, S. K. Esser, R. K. Montoye, B. Rajendran, J. A. Tierno, L. Chang, D. S. Modha et al., "A 45nm CMOS neuromorphic chip with a scalable architecture for learning in networks of spiking neurons," in Custom Integrated Circuits Conference (CICC), 2011 IEEE. IEEE, 2011, pp. 1–4.
[41] T. J. Koickal, A. Hamilton, S. L. Tan, J. A. Covington, J. W. Gardner, and T. C. Pearce, "Analog VLSI circuit implementation of an adaptive neuromorphic olfaction chip," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 54, no. 1, pp. 60–73, 2007.
[42] J. Joshi, A. C. Parker, and C.-C. Hsu, "A carbon nanotube cortical neuron with spike-timing-dependent plasticity," in Engineering in Medicine and Biology Society, 2009. EMBC 2009. Annual International Conference of the IEEE. IEEE, 2009, pp. 1651–1654.
[43] X. Zhou, Y. Guo, A. C. Parker, C.-C. Hsu, and J. Choma, "Biomimetic nonlinear CMOS adder for neuromorphic circuits," in Neural Engineering (NER), 2013 6th International IEEE/EMBS Conference on. IEEE, 2013, pp. 876–879.
[44] P. Mamdouh and A. C. Parker, "A switched-capacitor dendritic arbor for low-power neuromorphic applications," in 2017 IEEE International Symposium on Circuits and Systems (ISCAS). IEEE, 2017, pp. 1–4.
[45] M. Mahvash and A. C. Parker, "A memristor SPICE model for designing memristor circuits," in 2010 53rd IEEE International Midwest Symposium on Circuits and Systems. IEEE, 2010, pp. 989–992.
[46] Y. Irizarry-Valle, A. C. Parker, and J. Joshi, "A neuromorphic approach to emulate neuro-astrocyte interactions," in Neural Networks (IJCNN), The 2013 International Joint Conference on. IEEE, 2013, pp. 1–7.
[47] R. K. Lee and A. C. Parker, "A CMOS circuit implementation of retrograde signaling in astrocyte-neuron networks," in 2016 IEEE Biomedical Circuits and Systems Conference (BioCAS). IEEE, 2016, pp. 588–591.
[48] S. Barzegarjalali, K. Yue, and A. Parker, "Noisy neuromorphic circuit modeling obsessive compulsive disorder," 2016 29th IEEE International System-on-Chip Conference (SOCC), Sep 2016.
[49] W. Zhao and Y. Cao, "New generation of predictive technology model for sub-45 nm early design exploration," IEEE Transactions on Electron Devices, vol. 53, no. 11, pp. 2816–2823, 2006.
[50] S. Barzegarjalali, K. Yue, and A. C. Parker, "Noisy neuromorphic circuit modeling obsessive compulsive disorder," in System-on-Chip Conference (SOCC), 2016 29th IEEE International. IEEE, 2016, pp. 327–332.
[51] K. Yue, Y. Liu, R. K. Lake, and A. C. Parker, "A brain-plausible neuromorphic on-the-fly learning system implemented with magnetic domain wall analog memristors," Science Advances, vol. 5, no. 4, p. eaau8170, 2019.
[52] S. Lequeux, J. Sampaio, V. Cros, K. Yakushiji, A. Fukushima, R. Matsumoto, H. Kubota, S. Yuasa, and J. Grollier, "A magnetic synapse: multilevel spin-torque memristor with perpendicular anisotropy," Scientific Reports, vol. 6, p. 31510, Aug. 2016. [Online]. Available: https://www.nature.com/articles/srep31510
[53] G.-q. Bi and M.-m. Poo, "Synaptic modifications in cultured hippocampal neurons: dependence on spike timing, synaptic strength, and postsynaptic cell type," Journal of Neuroscience, vol. 18, no. 24, pp. 10464–10472, 1998.
[54] L. F. Abbott, J. A. Varela, K. Sen, and S. B. Nelson, "Synaptic depression and cortical gain control," Science, vol. 275, pp. 220–224, 1997.
[55] F. S. Chance, S. B. Nelson, and L. Abbott, "Complex cells as cortically amplified simple cells," Nature Neuroscience, vol. 2, no. 3, pp. 277–282, 1999.
[56] C. C. Bell, V. Z. Han, Y. Sugawara, and K. Grant, "Synaptic plasticity in a cerebellum-like structure depends on temporal order," Nature, vol. 387, no. 6630, p. 278, 1997.
[57] H. Markram, J. Lübke, M. Frotscher, and B. Sakmann, "Regulation of synaptic efficacy by coincidence of postsynaptic APs and EPSPs," Science, vol. 275, no. 5297, pp. 213–215, 1997.
[58] M. Stipčević and B. M. Rogina, "Quantum random number generator based on photonic emission in semiconductors," Review of Scientific Instruments, vol. 78, no. 4, p. 045104, 2007.
[59] S. Cova, M. Ghioni, A. Lacaita, C. Samori, and F. Zappa, "Avalanche photodiodes and quenching circuits for single-photon detection," Applied Optics, vol. 35, no. 12, pp. 1956–1976, 1996.
[60] B. Huang, X. Zhang, W. Wang, Z. Dong, N. Guan, Z. Zhang, and H. Chen, "CMOS monolithic optoelectronic integrated circuit for on-chip optical interconnection," Optics Communications, vol. 284, no. 16, pp. 3924–3927, 2011.
[61] K. Chilukuri, M. J. Mori, C. L. Dohrman, and E. A. Fitzgerald, "Monolithic CMOS-compatible AlGaInP visible LED arrays on silicon on lattice-engineered substrates (SOLES)," Semiconductor Science and Technology, vol. 22, no. 2, p. 29, 2006.
[62] N. Faramarzpour, M. J. Deen, S. Shirani, and Q. Fang, "Fully integrated single photon avalanche diode detector in standard CMOS 0.18-µm technology," Electron Devices, IEEE Transactions on, vol. 55, no. 3, pp. 760–767, 2008.
[63] M. Mahvash and A. C. Parker, "Modeling intrinsic ion-channel and synaptic variability in a cortical neuromorphic circuit," in Biomedical Circuits and Systems Conference (BioCAS), 2011 IEEE. IEEE, 2011, pp. 69–72.
[64] J. Joshi, C.-C. Hsu, A. Parker, and P. Deshmukh, "A carbon nanotube cortical neuron with excitatory and inhibitory dendritic computations," in Life Science Systems and Applications Workshop, 2009. LiSSA 2009. IEEE/NIH, April 2009, pp. 133–136.
[65] V. A. Klyachko and C. F. Stevens, "Excitatory and feed-forward inhibitory hippocampal synapses work synergistically as an adaptive filter of natural spike trains," PLoS Biology, vol. 4, no. 7, p. e207, 2006.
[66] S. J. Martin, P. D. Grimwood, and R. G. Morris, "Synaptic plasticity and memory: an evaluation of the hypothesis," Annual Review of Neuroscience, vol. 23, no. 1, pp. 649–711, 2000.
[67] S. Ahrens, S. Jaramillo, K. Yu, S. Ghosh, G.-R. Hwang, R. Paik, C. Lai, M. He, Z. J. Huang, and B. Li, "ErbB4 regulation of a thalamic reticular nucleus circuit for sensory selection," Nature Neuroscience, vol. 18, no. 1, p. 104, 2015.
[68] G. Deco, E. T. Rolls, and R. Romo, "Synaptic dynamics and decision making," Proceedings of the National Academy of Sciences, vol. 107, no. 16, pp. 7545–7549, 2010.
[69] Z. Wang, S. Joshi, S. E. Savel'ev, H. Jiang, R. Midya, P. Lin, M. Hu, N. Ge, J. P. Strachan, Z. Li et al., "Memristors with diffusive dynamics as synaptic emulators for neuromorphic computing," Nature Materials, vol. 16, no. 1, p. 101, 2017.
[70] R. K. Lee, "Astrocyte-mediated plasticity and repair in CMOS neuromorphic circuits," Ph.D. dissertation, University of Southern California, 2018.
[71] V. Vedam-Mai, E. Van Battum, W. Kamphuis, M. Feenstra, D. Denys, B. Reynolds, M. Okun, and E. Hol, "Deep brain stimulation and the role of astrocytes," Molecular Psychiatry, vol. 17, no. 2, p. 124, 2012.
[72] S. V. Suryavanshi and E. Pop, "Physics-based compact model for circuit simulations of 2-dimensional semiconductor devices," in Device Research Conference (DRC), 2015 73rd Annual. IEEE, 2015, pp. 235–236.
[73] C. Mead and M. Ismail, Analog VLSI Implementation of Neural Systems. Springer Science & Business Media, 2012, vol. 80.
[74] R. J. Douglas and K. A. Martin, "Recurrent neuronal circuits in the neocortex," Current Biology, vol. 17, no. 13, pp. R496–R500, 2007.
[75] X.-J. Wang, "Decision making in recurrent neuronal circuits," Neuron, vol. 60, no. 2, pp. 215–234, 2008.
[76] M. N. Shadlen and D. Shohamy, "Decision making and sequential sampling from memory," Neuron, vol. 90, no. 5, pp. 927–939, 2016.
[77] P. A. Merolla and K. Boahen, "Dynamic computation in a recurrent network of heterogeneous silicon neurons," in Circuits and Systems, 2006. ISCAS 2006. Proceedings. 2006 IEEE International Symposium on. IEEE, 2006, 4 pp.
[78] M. Giulioni, F. Corradi, V. Dante, and P. Del Giudice, "Real time unsupervised learning of visual stimuli in neuromorphic VLSI systems," Scientific Reports, vol. 5, p. 14730, 2015.
[79] S. Hochreiter and J. Schmidhuber, "Long short-term memory," Neural Computation, vol. 9, no. 8, pp. 1735–1780, 1997.
[80] P. U. Diehl, G. Zarrella, A. Cassidy, B. U. Pedroni, and E. Neftci, "Conversion of artificial recurrent neural networks to spiking neural networks for low-power neuromorphic hardware," in Rebooting Computing (ICRC), IEEE International Conference on. IEEE, 2016, pp. 1–8.
[81] M. D'Ascenzo, M. Vairano, C. Andreassi, P. Navarra, G. B. Azzena, and C. Grassi, "Electrophysiological and molecular evidence of L-(Cav1), N-(Cav2.2), and R-(Cav2.3) type Ca2+ channels in rat cortical astrocytes," Glia, vol. 45, no. 4, pp. 354–363, 2004.
[82] W. A. Catterall, "Voltage-gated calcium channels," Cold Spring Harbor Perspectives in Biology, vol. 3, no. 8, p. a003947, 2011.
[83] G. Sharma and S. Vijayaraghavan, "Nicotinic cholinergic signaling in hippocampal astrocytes involves calcium-induced calcium release from intracellular stores," Proceedings of the National Academy of Sciences, vol. 98, no. 7, pp. 4148–4153, 2001.
[84] S. Zeng, B. Li, S. Zeng, and S. Chen, "Simulation of spontaneous Ca2+ oscillations in astrocytes mediated by voltage-gated calcium channels," Biophysical Journal, vol. 97, no. 9, pp. 2429–2437, 2009.
[85] D. Sterratt, B. Graham, A. Gillies, and D. Willshaw, Principles of Computational Modelling in Neuroscience. Cambridge University Press, 2011.
[86] A. Veidenbaum, "Efficient simulation of large-scale spiking neural networks using CUDA graphics processors," in International Joint Conference on Neural Networks, 2009, pp. 2145–2152.
[87] R. Q. So, A. R. Kent, and W. M. Grill, "Relative contributions of local cell and passing fiber activation and silencing to changes in thalamic fidelity during deep brain stimulation and lesioning: a computational modeling study," Journal of Computational Neuroscience, vol. 32, no. 3, pp. 499–519, 2012.
[88] P. J. Hahn, G. S. Russo, T. Hashimoto, S. Miocinovic, W. Xu, C. C. McIntyre, and J. L. Vitek, "Pallidal burst activity during therapeutic deep brain stimulation," Experimental Neurology, vol. 211, no. 1, pp. 243–251, 2008.
[89] L. Abelmann, S. K. Khizroev, D. Litvinov, J.-G. Zhu, J. A. Bain, M. H. Kryder, K. Ramstöck, and C. Lodder, "Micromagnetic simulation of an ultrasmall single-pole perpendicular write head," Journal of Applied Physics, vol. 87, no. 9, pp. 6636–6638, 2000.
Abstract
This dissertation presents research on neuromorphic circuits and nanotechnologies that support learning. This work started from multi-state synapse circuit design and explored implementations of all-CMOS and nanoelectronic circuit hybrids. The magnetic analog memristor (MAM) is the nanoelectronic device used in this synapse circuit to realize multiple weight states. Based on the multi-state synapse circuits, features including short/long-term memory and noise are added to enable reward-based learning using our synapse circuits. Furthermore, a frequency adaptation feature is explored to increase the nonlinearity for more complex learning algorithms.

Based on the improved synapse circuits and previous work done by the BioRC group, neuronal circuitries including cortical, subcortical, and memory networks are demonstrated in this work. The cortical circuit is a 2-layer feedforward network that can learn with spike-timing-dependent plasticity (STDP) in an online and on-chip manner to do image pattern recognition. An astrocyte network that modulates neuronal behavior is also presented as subcortical circuitry, and another nanoelectronic device, MoS2, is used to optimize this implementation. We also studied the astrocyte network numerically for Parkinson's disease brain stimulation treatment. The neuronal memory circuitry, especially sequential memory, is implemented as a neuron chain with STDP synapses.

The last piece of this work is a blank-slate network for biomimetic robot control. This network enables a robot to learn movement from emotions such as desire and fear. The emotions map to possible neurons evoked in a cortical manner; each motion neuron is followed by a neuron chain communicating with the robot's actuators, sensors, and reward neurons. With noisy random emotion inputs, the robot will learn the corresponding movement and form sequential memories. This learned result will not be forgotten when learning new actions, which satisfies lifelong learning.

In conclusion, a multi-state synapse circuit with both linear and nonlinear functions is proposed to increase the parallelism of this neuromorphic system. A memristor device in the synapse circuit provides analog values to enable the multiple states and frequency nonlinearity. The subcortical parts are implemented as astrocytic and dopaminergic regulation circuits. These parts enable the transformation from short-term memory to long-term memory. With a sequential memory implemented by the multi-state synapse and neuron chain structure, a solution to the lifelong learning problem is demonstrated.