USC Digital Library / University of Southern California Dissertations and Theses / Emulating Variability in the Behavior of Artificial Central Neurons (USC Thesis)
EMULATING VARIABILITY IN THE BEHAVIOR OF ARTIFICIAL CENTRAL NEURONS

by

Mohammad Mahvash Mohammadi

A Dissertation Presented to the
FACULTY OF THE USC GRADUATE SCHOOL
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements of the Degree
DOCTOR OF PHILOSOPHY
(ELECTRICAL ENGINEERING)

May 2012

Copyright 2012 Mohammad Mahvash Mohammadi

Dedicated to my parents Iran and Reza

Acknowledgements

I thank my advisor Professor Alice C. Parker for her guidance, support and endless encouragement. Her enthusiasm towards research and her insight into the problems made this thesis possible. I will never forget her warm heart and the friendship that she showed towards me. I thank Professor Vasilis Z. Marmarelis and Professor Tansu Celikel for their valuable criticisms and suggestions, which enhanced both the technical content and the readability of this thesis. I thank my parents for their endless love, support and patience, which made it possible for me to continue my graduate study and finally to finish this thesis.

Table of Contents

Dedication
Acknowledgements
List of Figures
Abstract
Chapter 1: Introduction
  1.1 Research Overview
  1.2 Related Work
  1.3 Thesis Outline
Chapter 2: Variability in the Nervous System
  2.1 Introduction
  2.2 Synaptic Variability
  2.3 Ion-Channel Variability
    2.3.1 Spontaneous Firing
    2.3.2 Ion-Channel Variability in the Hodgkin-Huxley Model
    2.3.3 Ion-Channel Variability in the Integrate-and-Fire Model
    2.3.4 Ion-Channel Variability at the Circuit Level
Chapter 3: Chaotic Signal Generator
  3.1 Introduction
  3.2 Chaotic Map
  3.3 Chaotic Signal Generator Circuit
  3.4 Improved Chaotic Signal Generator Circuit
  3.5 Circuit Adjustment
Chapter 4: Circuit Implementation of Intrinsic Variability
  4.1 Introduction
  4.2 Neurotransmitter-Release Variability in the Synapse Circuit
  4.3 Ion-Channel Variability in the Axon Hillock Circuit
Chapter 5: Impact of the Variability
  5.1 Reliability of Spike Timing
  5.2 Reliability of Spike Train Propagation
  5.3 Infinite Looping Behavior in a Neural Network
Chapter 6: Conclusion and Future Research
Appendix A: Stochastic Resonance
Appendix B: Technologies beyond CMOS: Memristor
  B.1 Background
  B.2 SPICE Model of a Memristor
  B.3 Low Pass Filter Using a Memristor
  B.4 Integrator Using a Memristor
  B.5 Memristor in Neuromorphic Circuits
Bibliography

List of Figures

Figure 2.1: Overview of variability in the nervous system [33]
Figure 2.2: Sequence of events involved in transmission at a typical chemical synapse [70]
Figure 2.3: Action potential generation steps [16]
Figure 2.4: Spontaneous firing rate as a function of membrane surface area [20]
Figure 2.5: Spontaneous output from the simulation [20]
Figure 2.6: Interspike intervals from simulation [20]
Figure 2.7: 500 responses to the constant and fluctuating inputs [18]
Figure 3.1: Transfer function for the chaotic map
Figure 3.2: Sequence of x_n for a = 1.9, b = 1, and x_0 = 0.9
Figure 3.3: Block diagram of the chaotic generator circuit
Figure 3.4: Delay and scaling block at the transistor level
Figure 3.5: PWL function at the transistor level
Figure 3.6: Output current vs. input current for the PWL function when I_in > 0
Figure 3.7: Block diagram for the improved switched-current circuit
Figure 3.8: Scaled delay block at the transistor level
Figure 3.9: Output vs. input current for the PWL function
Figure 3.10: Chaotic voltage with V_pp = 400 mV, V_mid = 300 mV, V_init = 480 mV, and a sample period of 10 ns
Figure 4.1: A system block diagram of the cortical neuron model with a pyramidal neuron cartoon [47]
Figure 4.2: The carbon nanotube (a) excitatory and (b) inhibitory synapses, where R represents reuptake [48]
Figure 4.3: Input and output of the excitatory synapse: (a) no variability, (b) with variability included
Figure 4.4: Input spike, PSPs, dendritic output, and output spike for the neuron with no variability included
Figure 4.5: Probability of firing when a Gaussian voltage is included for neurotransmitter release control, resulting in synaptic variability; only one synapse has a spike at the input
Figure 4.6: Probability of firing when a Gaussian voltage is included for neurotransmitter release control, resulting in synaptic variability; only one synapse has a spike at the input (MATLAB simulation)
Figure 4.7: Probability of firing when a chaotic signal is included for neurotransmitter release control, resulting in synaptic variability
Figure 4.8: Probability of firing when neurotransmitter-release variability is included in one synapse; SPICE results (top), MATLAB results (bottom)
Figure 4.9: Probability of firing when neurotransmitter-release variability is included in two synapses; MATLAB results
Figure 4.10: Spike generation circuit in the axon hillock module [47]
Figure 4.11: Axon hillock input stage with included variability
Figure 4.12: Input and output in the axon hillock with variability included (chaotic signal)
Figure 4.13: Probability of firing vs. PSP amplitude for four types of variability
Figure 4.14: Probability of firing vs. PSP amplitude for Gaussian variability with two different mean values
Figure 5.1: PSP and output spike for three experiments
Figure 5.2: Reliability of spike timing for constant PSP vs. standard deviation of ion-channel variability
Figure 5.3: Reliability of spike timing for constant PSP vs. PSP amplitude
Figure 5.4: Raster plot of spike times over 100 trials for constant and variable PSP (stimulus)
Figure 5.5: Reliability of spike timing for variable PSP vs. standard deviation of synaptic variability, for three different standard deviations of ion-channel variability
Figure 5.6: Reliability of spike timing for variable PSP vs. standard deviation of synaptic variability, for three different standard deviations of ion-channel variability (MATLAB simulation)
Figure 5.7: The cortical neuron simulated with eight excitatory synapses in SPICE
Figure 5.8: Spikes arriving at the eight synapses, the dendrite output, and the output of the axon hillock when ion-channel variability is not included
Figure 5.9: Spikes arriving at the five synapses, the dendrite output, and the output of the axon hillock when ion-channel variability is not included
Figure 5.10: Probability of the neuron firing in response to a single spike arriving at eight (blue), six (red), and five (green) synapses vs. ion-channel variability standard deviation
Figure 5.11: Experiment 1: (a) a train of spikes arriving at eight synapses, (b) the neuron response with no ion-channel variability included. Experiment 2: (c) a train of spikes arriving at six synapses, (d) a train of spikes (with one missing spike) arriving at the rest of the synapses (two synapses), (e) the neuron response with no ion-channel variability included. Experiment 3: (f) a train of spikes arriving at six synapses, (g) a train of spikes (with two missing spikes) arriving at the rest of the synapses (two synapses), (h) the neuron response with no ion-channel variability included
Figure 5.12: Probability of the neuron firing in response to a train of spikes (five spikes) arriving at the synapses for experiments one (blue), two (red), and three (green) vs. ion-channel variability standard deviation
Figure 5.13: Probability of the neuron firing in response to a single spike arriving at eight (blue), six (red), and five (green) synapses vs. ion-channel variability standard deviation (MATLAB simulation results)
Figure 5.14: Probability of the neuron firing in response to a train of spikes (five spikes) arriving at the synapses for experiments one (blue), two (red), and three (green) vs. ion-channel variability standard deviation (MATLAB simulation results)
Figure 5.15: The cortical neuron simulated in SPICE with three excitatory synapses
Figure 5.16: Probability of firing vs. number of loops for three standard deviations
Figure 5.17: Probability of firing vs. number of loops for three PSP amplitudes
Figure A.1: Input and output of the axon hillock for input with no variability, low variability, and high variability
Figure A.2: Signal (blue) and noise (red) power vs. ion-channel noise standard deviation in the axon hillock
Figure A.3: Signal-to-noise ratio vs. ion-channel noise standard deviation; experiment (blue solid line) and mathematical calculation (red dashed line)
Figure A.4: Signal-to-noise ratio vs. ion-channel variability standard deviation with correlated synaptic variability, for three correlation factors (red 0.05, blue 0.1, and green 0.2)
Figure B.1: The coupled variable-resistor model
Figure B.2: 100-ohm resistor subcircuit
Figure B.3: Memristor subcircuit
Figure B.4: Voltage and current of the memristor
Figure B.5: I-V curve
Figure B.6: Low-pass filter with memristor
Figure B.7: Frequency response of the low-pass filter
Figure B.8: Integrator with memristor
Figure B.9: Input and output waveforms in a ramp-shape regime
Figure B.10: Input and output waveforms in the nonlinear regime
Figure B.11: Voltage and current waveforms of the memristor
Figure B.12: State variable w vs. time
Figure B.13: Three-terminal memristor circuit
Figure B.14: Input and output waveforms in the three-terminal memristor circuit
Figure B.15: State variable w in a three-terminal memristor circuit
Figure B.16: Input and output waveforms in a three-terminal memristor circuit
Figure B.17: State variable w in a three-terminal memristor circuit

Abstract

The variability in the behavior of artificial central neurons is the topic of this thesis. Variable behavior has been observed in biological neurons, resulting in changes in neural behavior that might be useful to capture in neuromorphic circuits. This thesis presents a neuromorphic cortical neuron with two main sources of intrinsic variability, synaptic neurotransmitter release and ion-channel variability, designed to be used in neural networks as part of the BioRC Biomimetic Real-Time Cortex project. This neuron has been designed and simulated using carbon nanotube transistors, one of several nanotechnologies under consideration to meet the challenges of scale presented by the cortex.
Research results suggest that some instances of variability are stochastic, while other studies indicate that some instances of variability are chaotic. In this thesis, both possible sources of variability are considered, by embedding either Gaussian noise or a chaotic signal into the neuromorphic circuit and observing the results.

Our overarching goal for the BioRC project is to demonstrate complex neural networks that possess memory and learning capability. To this end, we believe the behavior of such networks would be enhanced by the addition of variability. This thesis describes neurotransmitter-release variability and ion-channel variability modeled at the circuit level using carbon nanotube circuit elements. We include two different types of signal variability in the circuit: a signal with Gaussian noise and a chaotic signal. For neurotransmitter-release variability, these signals are simulated as if they were generated internally in a synapse circuit to vary the neurotransmitter release in an unpredictable manner. Variation in neurotransmitter concentration in the synaptic cleft causes a change in the peak magnitude and duration of the postsynaptic potential. For ion-channel variability, these signals are simulated as if they were generated internally in an axon hillock circuit to change the firing mechanism. The variable signal could force the neuron to fire if the variability strength were sufficient, or could prevent the neuron from firing. The variable signal is independent of the postsynaptic potential. When there is no postsynaptic potential applied to the axon hillock (the cell membrane is at resting potential), the variable signal forcing the neuron to fire in fact models spontaneous firing of the neuron. For Gaussian noise, we include a file in our SPICE simulation consisting of random voltage samples that control neurotransmitter release volume.
For chaotic signals, we present a chaotic signal generator circuit design and simulation using carbon nanotube transistor SPICE models, the output of which would likewise control neurotransmitter release. The circuit uses a chaotic piecewise-linear one-dimensional map, implemented with switched-current circuits that can operate at high frequencies to generate a chaotic output current.

The results presented in this thesis illustrate that neurotransmitter-release variability plays a beneficial role in the reliability of spike generation. In an examination of the reliability of spike generation, the precision of spike timing in the carbon nanotube circuit simulations was found to depend on stimulus (postsynaptic potential) transients. Postsynaptic potentials with low or no neurotransmitter-release variability produced imprecise spike trains, whereas postsynaptic potentials with high neurotransmitter-release variability produced spike trains with reproducible timing.

In simulation experiments, spontaneous firing of neurons due to ion-channel variability was demonstrated. The thesis illustrates how, in one particular case, ion-channel variability could play a beneficial role in the reliability of transferring a train of spikes. The thesis also shows how ion-channel variability can halt infinite looping behavior with positive feedback in a neural network, such as the loops thought to occur in obsessive-compulsive disorder. The design was simulated using carbon nanotube transistors and a SPICE simulation.

Chapter 1: Introduction

1.1 Research Overview

Variability is a prominent feature of biological behavior, playing a central role in the behavior of the neurons in the nervous system. While the purpose of such variability is not completely understood, recent studies, described in Section 1.2, indicate that variability might offer distinct advantages.
Variability in ion-channel behavior could lead to structural plasticity and other forms of learning by causing a post-synaptic neuron to fire. This firing could cause active synapses to be strengthened (spike-timing-dependent plasticity), so that subsequently the neuron could fire even with typical ion-channel behavior in the axon hillock. In terms of system behavior, variability in ion-channel behavior could assist in moving a neural network out of a local minimum. Any artificial neural system designed to be brain-like, or biomimetic, might be enhanced by some variability in behavior. Artificial neurons that spontaneously fire without sufficient post-synaptic potential could trigger brain activity that is unanticipated, but useful.

At the neuronal level, variability could enhance sensitivity to weak signals, a phenomenon known as stochastic resonance. Variability could also play a constructive role by increasing the reliability of firing in single neurons: inputs with low or no variability produce imprecise spike trains, whereas inputs with high variability arising from synaptic variability produce spike trains with reproducible timing. At the network level, variability could halt infinite looping behavior with positive feedback in a neural network, such as the loops thought to occur in obsessive-compulsive disorder.

The mechanisms underlying variability in neuronal behavior are not completely understood. According to Laplace, randomness is only a measure of our "ignorance of the different causes involved in the production of events." One of the most important features of modern scientific inquiry is the ability to find the sources of variability in neuronal behavior and to predict future events. There have been dramatic discoveries in neuroscience over the last 30 years. Each year brings substantial research toward understanding the mechanisms by which we perceive, remember, speak, and feel.
This research could help to predict the behavior of an animal at any time from its current environmental situation. Human behavior is not completely determined; however, every year our ability to predict behavior improves as knowledge accumulates [63]. Variability exists even in the best-understood behavioral systems; therefore it would be impossible to predict individual behavior exactly. A number of systems, from single neurons and synapses to invertebrate and vertebrate animals including humans, generate variable output despite no variations in input at all [63]. At the neuronal level, researchers typically deal with this variability in laboratory studies: they repeat measurements several times, at different times, and plot the results. Trial-to-trial variability has been observed by many researchers in neural networks [33], [77].

It is unclear whether neurons are noisy or whether there is an underlying, unknown process that appears to be random. If we assume the neural network is a deterministic system, then one source of this variability might be chaos [50]. Chaotic behavior is highly nonlinear behavior that can be characterized with nonlinear mathematics. If the system's dynamics are highly sensitive to the initial conditions and the initial state of the neural circuitry varies at the beginning of each trial, this leads to different neuronal and behavioral responses. Another source of variability is noise. It is important to understand the interaction of chaotic dynamics and noisiness and the resultant combined variability. In this thesis, we consider both sources of variability.

The nervous system is an organ system containing a network of specialized cells called neurons that coordinate the actions of an animal and transmit signals between different parts of its body. The nervous system of vertebrate animals (including humans) is divided into the central nervous system (CNS) and the peripheral nervous system (PNS).
The central nervous system (CNS) is the largest part of the nervous system and coordinates the activity of all parts of the body. It contains the majority of the nervous system and consists of the brain and the spinal cord, as well as the retina. The brain contains neurons connected primarily through synapses; the human brain contains about 100 billion (10^11) neurons and 100 trillion (10^14) synapses. The peripheral nervous system (PNS) resides or extends outside the CNS. The PNS consists of the sensory neurons, running from stimulus receptors that inform the CNS of the stimuli, and the motor neurons. Sensory neurons are typically classified as the neurons responsible for converting external stimuli from the environment into internal stimuli. They are activated by sensory inputs (vision, touch, hearing, etc.) and send projections into the central nervous system that convey sensory information to the brain or spinal cord. Motor neurons run from the CNS to the muscles and glands, called effectors, that take action. In fact, motor neurons convert information coming from the CNS into mechanical forces in their attached muscle fibers.

Any telecommunication system that transfers and processes information is noisy. Noise is a random or irregular signal that interferes with or obscures a main signal that contains information. The nervous system is no exception: like other systems, it is noisy, from receiving signals in the sensory neurons, to processing information in the CNS, to sending information to the motor neurons.

Some neuroscience researchers classify variability as sensory, motor, or cellular variability [33]. Others call cellular variability intrinsic variability [29], or use other names. We classify variability in the nervous system as extrinsic or intrinsic according to whether it arises outside or inside the CNS. Sensory variability and motor variability therefore belong to the extrinsic category, because they refer to variability outside the CNS.
Inside the central nervous system, the two main mechanisms that possess intrinsic variability are the neuron firing mechanism and the synapse. Neurons communicate with each other by generating streams of action potentials (spikes). Action potentials are triggered when signals combining from several synapses result in enough positive ions to bring the membrane potential up to the threshold voltage. However, intrinsic variability could cause some neurons to fire even when the membrane potential is not above the threshold, or even when there is no signal coming from the synapses. Conversely, intrinsic variability might prevent a neuron from firing even when the membrane voltage is above the threshold. Intrinsic variability therefore plays an important role in neuronal behavior.

Our overarching goal for the BioRC (Biomimetic Real-Time Cortex) project is to demonstrate complex neural networks that possess memory and learning capability. To this end, we believe the behavior of such networks would be enhanced by the addition of variability. This thesis describes the design of a carbon nanotube neuromorphic cortical neuron with two main sources of intrinsic variability: neurotransmitter-release variability and ion-channel variability. Both are modeled at the circuit level using carbon nanotube circuit elements. We include a choice of two different types of signal variability in the circuit: a signal with Gaussian noise and a chaotic signal. For neurotransmitter-release variability, these signals are simulated as if they were generated internally in a synapse circuit to vary the neurotransmitter release in an unpredictable manner. Variation in neurotransmitter concentration in the synaptic cleft causes a change in the peak magnitude and duration of the postsynaptic potential.
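This effect of release variability, changing both the peak magnitude and the duration of the PSP, can be sketched numerically. The alpha-function waveform below is a generic textbook PSP shape standing in for the BioRC circuit's actual response, and the time constant and noise level are illustrative assumptions, not values from the thesis:

```python
import math
import random

def psp(t, amp=1.0, tau=5e-3):
    """Alpha-function postsynaptic potential (a generic textbook shape,
    not the BioRC circuit's exact waveform). Peaks at t = tau with value amp."""
    return amp * (t / tau) * math.exp(1.0 - t / tau) if t >= 0 else 0.0

def noisy_psp(t, release):
    """Model release variability: the release factor scales the PSP's peak
    magnitude and stretches its duration, mimicking a change in
    neurotransmitter concentration in the synaptic cleft."""
    return psp(t, amp=release, tau=5e-3 * release)

rng = random.Random(0)
release = max(0.1, rng.gauss(1.0, 0.2))   # Gaussian release factor, sigma assumed
peak = max(noisy_psp(i * 1e-4, release) for i in range(400))
```

With a trial-to-trial Gaussian release factor, both the peak (which tracks the release factor) and the PSP's time-to-peak vary together, which is the qualitative behavior the synapse circuit is designed to exhibit.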
For ion-channel variability, these signals are simulated as if they were generated internally in an axon hillock circuit to change the firing mechanism. The variable signal could force the neuron to fire if the variability strength were sufficient, or could prevent the neuron from firing even with adequate membrane potential. When there is no post-synaptic potential at the axon hillock (the cell membrane is at resting potential), the variable signal forcing the neuron to fire in fact models spontaneous firing of the neuron. For Gaussian noise, we include a file in our SPICE simulation consisting of random voltage samples that control neurotransmitter release volume. In actual circuits, device variability due to thermal effects, when amplified, could be used as a source of variability. For chaotic signals, we present a chaotic signal generator circuit design and simulation using carbon nanotube transistor SPICE models, the output of which would likewise control neurotransmitter release. The circuit uses a chaotic piecewise-linear one-dimensional map, implemented with switched-current circuits that can operate at high frequencies to generate a chaotic output current.

1.2 Related Work

Variability in a single neuron's response and its consequences for neural networks have been under scrutiny for many decades. In 1964, the random-walk (RW) model was introduced to model the stochastic discharge of neurons that had been measured experimentally. The earliest solution of the integrate-and-fire (IAF) model that included stochastic activity modeled the incoming signal from the synapse (the postsynaptic potential) as a random walk [42]. In 1965, the IAF model was formulated with stochastic input to include the decay of the membrane potential [76]. Knight in 1972 pioneered the study of the effect of noise on the dynamics of a simple spiking neuron using the integrate-and-fire (IF) model [52]. The noise model he studied was a simplified model in which the threshold is drawn randomly after each spike.
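The noisy-threshold model just mentioned can be sketched as a leaky integrate-and-fire neuron whose threshold is redrawn at random after every spike. This is an illustrative reconstruction in the spirit of Knight [52], not his exact formulation, and every parameter value below is an arbitrary assumption:

```python
import random

def noisy_threshold_if(inputs, dt=1e-3, tau=20e-3, v_reset=0.0,
                       theta_mean=1.0, theta_sd=0.1, seed=42):
    """Leaky integrate-and-fire neuron in which a fresh firing threshold
    is drawn at random after every spike.
    Returns the indices of the time steps at which the neuron fired."""
    rng = random.Random(seed)
    v = v_reset
    theta = rng.gauss(theta_mean, theta_sd)
    spikes = []
    for i, inp in enumerate(inputs):
        # Euler step of the leaky membrane: dv/dt = (-v + inp) / tau
        v += dt * (-v + inp) / tau
        if v >= theta:
            spikes.append(i)
            v = v_reset
            theta = rng.gauss(theta_mean, theta_sd)  # redraw the threshold
    return spikes

# Constant suprathreshold drive: the neuron fires regularly on average, but
# the interspike intervals jitter because the threshold changes spike to spike.
spike_times = noisy_threshold_if([1.5] * 2000)
```

Even with a perfectly constant input, the spike train is irregular, which is exactly the kind of output variability without input variability discussed above.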
Gerstner extended these results and studied both slow-noise models and fast-escape-rate noise models [43]. Fourcaud and Brunel completed the previous work by studying the impact of synaptic variability on the dynamics of the firing probability of a spiking neuron using the IF model [40]. Stein discussed whether the variability is due to neural noise or is an important part of the signal [76]. Carelli claimed that the irregularities found in the membrane potential of bursting neurons are related to nonlinear and chaotic properties of the cells [17]. Faisal reviewed the sources of noise in the nervous system and showed how noise contributes to trial-to-trial variability [33].

The effects of intrinsic variability in the ion channels of neuronal membranes have been studied for a long time, with pioneering studies by Pecher [68], Fatt and Katz [36], [37], and numerous others. Ion-channel variability has important effects on a neuron's information-processing capabilities: changing action potential dynamics, enhancing signal detection, altering spike-timing reliability, and affecting the tuning properties of the cell [7], [24], [30], [57]. A number of authors performed theoretical and numerical analyses of the Hodgkin-Huxley (HH) equations with stochastic fluctuations of the ion channels [3], [20]. This fluctuation can cause spontaneous firing and places limits on the miniaturization of the brain's wiring [34]. Overall, ion-channel variability has been studied and analyzed theoretically by several neuroscience researchers using the IF and HH models; however, they did not implement ion-channel variability in circuits.

Several authors investigated advantages of variability in single neurons and neural networks. Mainen and Sejnowski studied the reliability of spike timing in rat neocortical slices [61]. They applied two types of inputs to the neuron: inputs with low or no noise, and inputs with high noise.
The timing of spikes drifted from one trial to the next. By comparing the trial-to-trial results for the two types of inputs, they demonstrated that the precision of spike timing depends on the level of noise in the input: inputs with high noise generate spike trains with reproducible timing. Mainen and Sejnowski used cortical neurons in their study, but this argument also applies to synaptic transmission in sensory pathways [49]. Reliability of spike timing was studied by Cecchi et al. using the leaky integrate-and-fire model [18]. Overall, several researchers have demonstrated this phenomenon in theory, simulation, and experiments [13], [31], [41]. These research findings encouraged us to examine spike-timing characteristics with neurotransmitter-release variability in our neuromorphic circuit simulations. At the network level, neuronal networks that have formed in the presence of noise will be more robust and explore more states, which facilitates learning and adaptation to the changing demands of a dynamic environment [33].

Manwani and Koch [62] provide arguments that variability is helpful, while others suggest that unreliability in cortical transmission due to variability is an energy-saving feature, with multiple pathways increasing the reliability.

In this thesis we include neurotransmitter-release variability and ion-channel variability in a neuromorphic neural circuit. One approach to including variability in a neuronal circuit is to implement a deterministic Hodgkin-Huxley model with added white noise at the circuit level [19]. However, we use circuits built in Parker's BioRC group (e.g. [66], [47], [48]) because they are designed to expand for greater control over specific mechanisms and to incorporate additional mechanisms. Also, our approach is to include variability in the transistors in the circuit that correspond to biological functions affected by variability; our approach is therefore more biomimetic than Chen's [19].
Many researchers have built neuromorphic circuits with no variability (e.g. [10], [35], [45], [58], [86]). To date, we are not aware of biomimetic ion-channel circuits that have variable behavior apart from the work by Chen.

1.3 Thesis Outline

Chapter 2 describes different sources of variability in the nervous system. We briefly review sensory variability and motor variability from the extrinsic variability category, and then focus on two main sources of intrinsic variability: synaptic variability and ion-channel variability. Ion-channel variability causing spontaneous firing, and ion-channel variability in the Hodgkin-Huxley and integrate-and-fire models, are also reviewed in this chapter.

In Chapter 3, we present a chaotic signal generator circuit. We discuss a chaotic map, then present and design the circuit at the transistor level and show the results. We also show how to improve the performance of the circuit.

Chapter 4 models intrinsic variability at the circuit level, the main purpose of the research described in this thesis. We focus on two main sources of intrinsic variability at the circuit level: neurotransmitter-release variability in the synapse circuit and ion-channel variability in the axon hillock circuit.

Chapter 5 describes the impact of variability on the behavior of the neuron. We study temporal information in a train of spikes, reliability of spike generation, and infinite looping behavior in a small neural network. Finally, conclusions and future research are summarized in Chapter 6.

Chapter 2: Variability in the Nervous System

2.1 Introduction

In order to study variability in the nervous system, we classify variability as extrinsic or intrinsic, referring to variability outside or inside the CNS. Figure 2.1 shows an overview of variability in the nervous system. As shown in this figure, extrinsic variability includes sensory variability and motor variability.
The figure also shows two main sources of intrinsic variability: ion-channel variability and synaptic variability. At the biochemical and biophysical level there are many stochastic processes in neurons, or at least processes that appear to be stochastic but may not be completely understood. These include protein production and degradation, the opening and closing of ion channels, the fusing of synaptic vesicles and the diffusion and binding of signaling molecules to receptors. The combination of all these stochastic elements could be a large source of randomness in the nervous system; however, by averaging large numbers of such stochastic elements, the randomness of the individual elements could be eliminated.

One source of extrinsic variability is variability in the sensory system. Variability in the sensory system is mainly in the form of noise; therefore, we use noise as the source of sensory variability. External sensory stimuli are intrinsically noisy because they come from a noisy environment.

Figure 2.1: Overview of variability in the nervous system [33]

In the auditory system, noise exists in the input in the form of random collisions of air molecules against the eardrum or of Brownian motion of cochlear components, which imposes a limit on the auditory system [46]. In the visual system, photoreceptors receive photons at a rate governed by a Poisson process. Similar to the auditory system, this noise limits contrast sensitivity in vision. This impact is most pronounced at low light levels, when the number of photons arriving at the photoreceptors is small [8]. In the olfactory and gustatory systems, chemical sensing is affected by thermodynamic noise, because molecules arrive at the receptor at random rates owing to diffusion and because receptor proteins are limited in their ability to accurately count the number of signaling molecules [6], [9].
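The Poisson statistics mentioned above make the low-light limit easy to see numerically: a Poisson count with mean N has standard deviation sqrt(N), so the relative fluctuation shrinks as 1/sqrt(N). A small illustrative Python sketch (not tied to any particular photoreceptor model; the trial count is an arbitrary choice):

```python
import numpy as np

def relative_photon_noise(mean_count, trials=100_000, seed=0):
    """Relative fluctuation (std/mean) of Poisson-distributed photon counts."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(mean_count, size=trials)
    return counts.std() / counts.mean()

# Relative noise falls roughly as 1/sqrt(mean count): dim light is noisier.
dim = relative_photon_noise(10)
bright = relative_photon_noise(1000)
```

With a mean of 10 photons per integration window the count fluctuates by roughly 30%, while at 1000 photons it fluctuates by about 3%, which is why photon noise limits contrast sensitivity mainly in dim conditions.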
As shown in Figure 2.1a, in the sensory transduction and amplification stage, a sensory signal is converted into a chemical signal (in the visual, olfactory and gustatory systems) or a mechanical signal (in the auditory system), and is then amplified and converted into an electrical signal. Noise in later stages is a combination of sensory noise plus noise introduced during the amplification process (transducer noise). Therefore signals that are weaker than the noise cannot be distinguished from it after amplification [2]. In a system consisting of multiple stages, it can be shown mathematically that reducing noise in the first stage is more important than in the other stages. For example, in telecommunication systems the first stage at the receiver side is an LNA (low-noise amplifier), which has better noise performance than the other stages in the receiver. Likewise, to reduce noise in the nervous system, organisms often pay a high metabolic and structural price at the first stage (the sensory stage). For example, a fly's photoreceptors account for 10% of its resting metabolic consumption, and the optics of its eyes make up over 20% of its flight payload [56].

Another source of extrinsic variability is variability in the motor system. We consider noise as the only source of motor variability. As shown in Figure 2.1b, motor neurons receive signals from the CNS and convert them into mechanical forces in the muscle fibers. The number of muscle fibers innervated by a single neuron is proportional to the force. In addition, when the whole-muscle force increases, the firing rates of the active motor neurons increase, such that those that innervate a small number of muscle fibers have the highest firing rate. Human skeletal muscle produces a force that has variability, and this variability is proportional to the average force produced by that muscle. Whole-muscle force is determined by the number of active motor neurons and the firing rates of these neurons.
The motor neuron that innervates the most fibers will have the lowest firing rate. Therefore, any variability in the force generated by the muscle fibers innervated by this motor neuron will contribute the most to whole-muscle force variability. Human motor behavior, from eye movements to hand trajectories, is organized in such a way as to eliminate or reduce the effect of motor noise by optimal control. However, it is still unclear how much of the observed trial-to-trial movement variability is due to motor variability and how much is due to other sources of variability in the motor system [4], [5]. In the following sections, we study two dominant sources of intrinsic variability: synaptic variability and ion-channel variability.

2.2 Synaptic Variability

Synapses are connections between neurons. Most synapses connect an axon of a pre-synaptic neuron to the dendrite of a post-synaptic neuron. The human brain has a huge number of synapses. Each of the 10^11 neurons has on average 7000 synaptic connections to other neurons, and in the cortex each neuron has an average of 10000 synapses. Figure 2.2 shows the structure of a typical synapse in the human brain with a summary of the sequence of events involved in transmission at a typical chemical synapse. The whole process in the synapse usually takes less than one millisecond. The process starts with receiving an action potential from the pre-synaptic neuron; the membrane is then depolarized, causing voltage-gated Ca2+ ion channels to open. That increases the Ca2+ concentration in the interior, which activates a set of calcium-sensitive proteins attached to vesicles that contain a neurotransmitter. The membranes of some vesicles then fuse with the membrane of the pre-synaptic neuron, opening the vesicles and dumping their neurotransmitter contents into the synaptic cleft. Some of the neurotransmitter escapes; however, some binds to the receptors.
This binding causes the receptor molecules to be activated, and this activation affects the behavior of the post-synaptic neuron. Finally, neurotransmitters break loose from the receptors; some of them are reabsorbed by the pre-synaptic neuron and some break down metabolically. As we mentioned earlier, there is variability in the responses of the synapse. Researchers have observed synapse variability in postsynaptic responses. It is unclear whether synapse responses are random or whether there is an underlying, unknown process in synapses that only appears to be random. There is evidence that the source of this variability is noisy behavior on the part of a neural mechanism [14], [15]. However, some researchers believe synapse variability arises from a complex deterministic process [38], [51]. This process could be chaotic. Chaotic behavior is deterministic, highly nonlinear behavior that can be characterized with nonlinear mathematics. King studied the diversity of nonlinear characteristics of the neuron and synapse and proposed several chaotic models for neural processes [50]. In this thesis, we consider both sources of variability, noise and chaos. Many neocortical cells receive an intense synaptic bombardment from thousands of synapses, which contains meaningful information and noise from other cells. The main component of the synaptic variability experienced by a neuron originates in the myriad of synapses made by other cells onto it. Every spike arriving at a synapse contributes a random amount of charge to the cell. This is usually called synaptic background noise.
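The background-noise picture can be caricatured in a few lines: many independent synapses deliver spikes at random times, each contributing a random amount of charge, and the summed input seen by the cell fluctuates from moment to moment. This is only a behavioral sketch; the firing probability and charge statistics below are arbitrary illustrative choices, not measured values:

```python
import random

def background_charge(n_synapses=1000, p_spike=0.005, steps=500,
                      q_mean=1.0, q_std=0.3, seed=1):
    """Summed charge per time bin from many independently firing synapses.

    Each synapse fires with probability p_spike per bin; each spike
    contributes a Gaussian-distributed amount of charge.
    """
    rng = random.Random(seed)
    charge = []
    for _ in range(steps):
        n_arrivals = sum(rng.random() < p_spike for _ in range(n_synapses))
        charge.append(sum(rng.gauss(q_mean, q_std) for _ in range(n_arrivals)))
    return charge

q = background_charge()
# The mean drive is steady (about n_synapses * p_spike * q_mean per bin),
# but every individual bin fluctuates around that mean.
```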
During synaptic processing of presynaptic action potentials, there are several steps that generate variability, such as the spontaneous opening of intracellular Ca2+ channels, synaptic Ca2+ channel noise, spontaneous triggering of the vesicle release pathway, spontaneous fusion of a vesicle with the membrane, and neurotransmitter-release variability [28], [39], [25], [84], [60].

Figure 2.2: Sequence of events involved in transmission at a typical chemical synapse [70]

Koch demonstrated that chemical synapses release transmitters probabilistically [53]. Variability in the number of neurotransmitter molecules released per vesicle (~2000) arises owing to variations in vesicle size [82] and vesicular neurotransmitter concentration [85]. The main source of synaptic variability is neurotransmitter-release variability. Neurotransmitters are involved in the chemical processing of the synapse. The release of neurotransmitters is not deterministic; in fact, synaptic vesicles release neurotransmitters variably. We will focus on this particular variable mechanism in the research described here. In this thesis, we model neurotransmitter-release variability at the circuit level. In particular, a Gaussian noise or chaotic signal is applied to a synapse circuit to vary the neurotransmitter release in an unpredictable manner. Variation in neurotransmitter concentration in the synaptic cleft causes a change in the peak post-synaptic potential. We discuss modeling neurotransmitter-release variability at the circuit level in Section IV.2.

2.3 Ion-Channel Variability

One main source of intrinsic variability is ion-channel variability. Nervous systems use the action potential (spike) to send information along axons to other synapses and neurons. The action potential is generated in the neuron by voltage-gated ion channels whose gating behavior is subject to thermodynamic fluctuations.
The reliability of the spike is an important requirement for encoding, transmitting and computing neural information. The precision of spike arrival times is on the order of 1-10 ms in many species [1], and cortical neurons have specialized to detect coincident arrival of spikes on millisecond timescales [81]. As shown in Figure 2.3, both Na+ and K+ ion channels are involved in the production of an action potential. Both Na+ and K+ channels are opened by depolarizing the membrane, but they respond independently and sequentially: Na+ channels open before K+ channels. Each Na+ channel has two gates, an activation gate and an inactivation gate. In order to have diffusion through the Na+ channel, both gates must be open. As shown in the figure, at step one (resting state) the activation gate is closed and the inactivation gate is open. At step two (depolarization) a stimulus opens the activation gates on some of the Na+ channels, allowing more Na+ ions to diffuse into the cell. At step three, depolarization opens the activation gates on most Na+ channels, while the K+ channels' activation gates remain closed. Na+ ions inside the cell make the inside of the membrane positive with respect to the outside. Once the threshold is crossed, a positive feedback cycle rapidly increases the membrane potential. At step four, the inactivation gates on most Na+ channels close and the activation gates on most K+ channels open, causing the membrane potential to come back toward the resting potential. As mentioned above, the action potential is generated by voltage-gated ion channels whose gating behavior is subject to thermodynamic fluctuations. The Na+ and K+ channels open and close in a stochastic fashion, following the laws of probability. However, distinct from tossing a coin or a die, the probability of finding the channel closed or open is not a fixed number but can be modified by an external stimulus, such as the voltage.
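This voltage-biased "coin" can be made concrete with a two-state (closed/open) Markov channel whose opening rate depends on the membrane voltage. The sigmoid rate expression and rate constants below are illustrative placeholders, not fitted Hodgkin-Huxley rates:

```python
import math
import random

def open_fraction(v_mV, steps=50_000, dt=1e-5, seed=2):
    """Fraction of time a single two-state channel spends open.

    The opening rate rises with depolarization (illustrative sigmoid);
    the closing rate is held fixed.
    """
    rng = random.Random(seed)
    alpha = 500.0 / (1.0 + math.exp(-(v_mV + 40.0) / 10.0))  # opening rate, 1/s
    beta = 500.0                                             # closing rate, 1/s
    is_open, open_steps = False, 0
    for _ in range(steps):
        if is_open:
            open_steps += 1
            if rng.random() < beta * dt:
                is_open = False
        elif rng.random() < alpha * dt:
            is_open = True
    return open_steps / steps

# Depolarization raises the open probability, but the single channel
# remains a random telegraph signal: only the bias of the coin moves.
p_rest = open_fraction(-70.0)
p_depol = open_fraction(0.0)
```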
This stochastic behavior produces random ionic current changes that cause variability in the responses, called ion-channel variability. It is unclear whether ion channels are stochastic or whether there is an underlying, unknown process that appears to be random. One source of ion-channel variability might be chaos; King studied the diversity of nonlinear characteristics of the neuron and synapse and proposed several chaotic models for neural processes [50]. Another source of variability is noise; in this case, individual ion channels are probabilistic devices [44]. In this thesis, we consider both sources of variability.

Figure 2.3: Action potential generation steps [16]

The impact of variability depends on the way in which spike trains carry information: frequency (rate) coding or temporal coding. In rate coding, the number of spikes per time period (the firing rate) carries the information. The firing rate is related to the stimulus intensity, i.e., the firing rate increases with increasing stimulus intensity. In rate coding, we can use statistical averages of many inputs to recover the information, so ion-channel variability does not have a large impact on the information. Temporal coding employs the timing of the spikes or the particular ordering of interspike intervals. In this case, because ion-channel variability causes jitter in interspike intervals, it has a large impact on the information. Ion-channel variability affects the propagation and initiation of the action potential; it can introduce uncertainty into threshold properties of action potentials and cause jitter in the interspike intervals of repetitive firing. Neuroscientists have shown that ion-channel variability can generate variability in the action potential threshold at nodes of Ranvier [72]. Also, ion-channel variability in the dendrite and in the soma produces membrane potential fluctuations that are large enough to affect action potential timing.
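The rate-versus-temporal distinction can be illustrated numerically: adding timing jitter to a spike train barely affects a rate-code readout (the spike count), while it scrambles a temporal-code readout (the interspike-interval pattern). The spike spacing and jitter magnitude below are arbitrary illustrative values:

```python
import random

rng = random.Random(5)
# A regular spike train: 50 spikes, 20 ms apart (times in ms).
spikes = [20.0 * i for i in range(50)]
# Add Gaussian timing jitter (std 3 ms) to every spike.
jittered = sorted(t + rng.gauss(0.0, 3.0) for t in spikes)

# Rate-code readout: the spike count is untouched by the jitter.
count_orig, count_jit = len(spikes), len(jittered)

# Temporal-code readout: interspike intervals shift by several ms.
isi_orig = [b - a for a, b in zip(spikes, spikes[1:])]
isi_jit = [b - a for a, b in zip(jittered, jittered[1:])]
mean_isi_shift = sum(abs(x - y) for x, y in zip(isi_jit, isi_orig)) / len(isi_orig)
```

The count is identical before and after jittering, while the average interval changes by a few milliseconds, which would corrupt any code built on interval patterns.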
Variability in action potential initiation in membrane patches might be because of ion-channel variability [32].

2.3.1 Spontaneous Firing

The strength of the ion-channel variability is proportional to the axonal input resistance, which depends inversely on axon diameter (scaling as D^(-3/2)). In axons less than 0.3 μm in diameter, the input resistance is large enough that the opening of just a single Na channel could generate an action potential in the absence of any other inputs. In this case, the neuron fires without any input from any synapses. This firing mechanism is called spontaneous firing. In this thesis, we model spontaneous firing at the circuit level. Spontaneous action potentials become exponentially more frequent as axon diameter decreases. Therefore, for axons below 0.08-0.10 μm, the rate of spontaneous firing is too high, which makes such axons useless for communication. This lower limit matches the smallest diameters of axons across species. Ion-channel variability likewise sets the lower limit for the diameter of excitable cell bodies to 3 μm [34]. Figure 2.4 shows the spontaneous firing rate as a function of axon area.

Figure 2.4: Spontaneous firing rate as a function of membrane surface area [20]

2.3.2 Ion-Channel Variability in the Hodgkin-Huxley Model

Neuronal action potentials, generated by the actions of populations of ion channels, can typically be modeled using some variant of the classic phenomenological model of Hodgkin and Huxley. The membrane equation of the Hodgkin-Huxley model describing the squid giant axon is given by:

C dV/dt = -g_Na m^3 h (V - E_Na) - g_K n^4 (V - E_K) - g_L (V - E_L)    (2.1)

where V is the membrane potential, E_Na and E_K are the sodium and potassium reversal (Nernst) potentials, E_L is the resting leakage potential for a leakage conductance g_L, C is the capacitance, and m, h and n are gating variables.
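For reference, the deterministic form of Equation 2.1 can be integrated with a simple forward-Euler loop. The rate functions below are the classic squid-axon expressions (voltages in mV relative to rest), and an injected-current term I_inj is added so that the model actually fires; this is a textbook sketch, not the stochastic Markov-kinetics version discussed in this section:

```python
import math

def hh_step(V, m, h, n, I_inj, dt=0.01):
    """One forward-Euler step of the Hodgkin-Huxley equations
    (V in mV relative to rest, t in ms, currents in uA/cm^2)."""
    # Classic squid-axon rate functions for the gating variables.
    am = 0.1 * (25 - V) / (math.exp((25 - V) / 10) - 1)
    bm = 4 * math.exp(-V / 18)
    ah = 0.07 * math.exp(-V / 20)
    bh = 1 / (math.exp((30 - V) / 10) + 1)
    an = 0.01 * (10 - V) / (math.exp((10 - V) / 10) - 1)
    bn = 0.125 * math.exp(-V / 80)
    C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3
    ENa, EK, EL = 115.0, -12.0, 10.6
    # Membrane equation: ionic terms enter with a negative sign.
    dV = (-gNa * m**3 * h * (V - ENa) - gK * n**4 * (V - EK)
          - gL * (V - EL) + I_inj) / C
    return (V + dt * dV,
            m + dt * (am * (1 - m) - bm * m),
            h + dt * (ah * (1 - h) - bh * h),
            n + dt * (an * (1 - n) - bn * n))

# Integrate 20 ms with a 10 uA/cm^2 current step: the model spikes.
V, m, h, n = 0.0, 0.05, 0.6, 0.32
trace = []
for _ in range(2000):
    V, m, h, n = hh_step(V, m, h, n, I_inj=10.0)
    trace.append(V)
peak = max(trace)
```

With this suprathreshold step the membrane depolarizes by roughly 100 mV at the spike peak and then repolarizes toward rest, reproducing the qualitative action-potential shape described above.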
If this equation is solved with discrete Markovian ion kinetics instead of the usual continuous rate equations, it can lead to spontaneous generation of action potentials. Figure 2.5 shows the output from the simulations for six values of membrane area.

Figure 2.5: Spontaneous output from the simulation [20]

As we mentioned previously, the spontaneous rate is high for small membrane areas, which correspond to small numbers of channels. As the membrane area is increased, the rate of spontaneous action potentials approaches zero, the expected rate from the deterministic model [20]. The probability density function (PDF) of the interspike intervals can also be estimated by solving the Hodgkin-Huxley equation. The firing of action potentials is governed approximately by a Poisson process, and therefore the interspike intervals have a negative exponential PDF, shown in Figure 2.6.

Figure 2.6: Interspike intervals from simulation [20]

2.3.3 Ion-Channel Variability in the Integrate-and-Fire Model

Ion-channel variability can be studied using the leaky integrate-and-fire model, which assumes the neuron is a leaky capacitor driven by a current that simulates the actual synaptic inputs. A noise term is added to the equation to represent several internal sources of noise, such as ion-channel noise and synaptic noise, to obtain a system described by the following Langevin equation:

C dV/dt = -gV + I(t) + ξ(t)    (2.2)

where C is the cell membrane capacitance, V is the membrane potential, -gV is the leakage term, g is a conductance, I(t) is the input current and ξ(t) is Gaussian noise. When the potential reaches the threshold V_0, an action potential is generated and the system returns to the equilibrium potential V_e [18].

Figure 2.7: 500 responses to the constant and fluctuating inputs [18]

Figure 2.7 shows the results of a numerical simulation of Equation 2.2 in response to two different signals, in both cases in the presence of internal noise.
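Equation 2.2 is straightforward to simulate with an Euler-Maruyama scheme. The sketch below repeats the same constant input over many trials and records the first threshold crossing, mirroring the trial-to-trial jitter visible in Figure 2.7; all parameter values here are arbitrary illustrative choices, not the values used in [18]:

```python
import math
import random

def first_spike_times(I_of_t, trials=500, C=1.0, g=0.1, sigma=0.5,
                      V_th=10.0, V_eq=0.0, dt=0.1, seed=3):
    """First spike time of the noisy leaky integrate-and-fire neuron
    C dV/dt = -g V + I(t) + xi(t), one value per independent trial."""
    rng = random.Random(seed)
    times = []
    for _ in range(trials):
        V, t = V_eq, 0.0
        for I_t in I_of_t:
            # Euler-Maruyama step: leak, input, and Gaussian noise term.
            V += dt * (-g * V + I_t) / C + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            t += dt
            if V >= V_th:
                break
        times.append(t)
    return times

# Constant input: the drive is identical on every trial, yet the noise
# term makes the threshold-crossing time jitter from trial to trial.
times = first_spike_times([1.5] * 2000)
mean_t = sum(times) / len(times)
spread = max(times) - min(times)
```

Even though every trial sees exactly the same input, the recorded spike times spread over a range of values, which is the constant-input panel of Figure 2.7 in miniature.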
Because of the ion-channel variability applied to the integrate-and-fire model, there is trial-to-trial variability in spike timing over the 500 trials. When I(t) contains high-frequency components, spikes cluster more tightly at the upward strokes of the input, and overall there is less trial-to-trial variability than in the constant-input case. This is actually an advantage of having synaptic variability in the input; we discuss it in more detail in Section V.1.

2.3.4 Ion-Channel Variability at the Circuit Level

In order to model ion-channel variability at the circuit level, a variable control signal is applied to the axon hillock circuit. The variable signal can force the neuron to fire if the variability strength is sufficient, or can prevent the neuron from firing. The variable signal is independent of the post-synaptic potential. When there is no post-synaptic potential applied to the axon hillock (the cell membrane is at resting potential), the variable signal forcing the neuron to fire in fact models spontaneous firing of the neuron. We discuss the circuit in more detail in Section IV.3.

Chapter 3: Chaotic Signal Generator

3.1 Introduction

As we mentioned in the previous chapter, the source of synaptic variability or ion-channel variability could be either noise or chaos. In this thesis, both possible sources of variability are considered by embedding either Gaussian noise or a chaotic signal into the neuromorphic cortical neuron circuit. Therefore we need to design two circuits, a noise generator circuit and a chaotic signal generator circuit. Designing a good noise generator is not an easy task and is still a major research topic. The core of any kind of random number generator must be an intrinsically random physical process.
Random generator designs range from tossing a coin, throwing dice, drawing from an urn, drawing from a deck of cards and spinning a roulette wheel, to measuring thermal noise from a resistor or shot noise from a Zener diode or a vacuum tube, measuring decay events from a radioactive source, integrating dark current from a metal-insulator-semiconductor capacitor, detecting locations of photo events, and sampling a stable high-frequency oscillator with an unstable low-frequency clock. The hard part of these designs is converting the noisiness of a physical process into a sequence of random numbers without suffering from the random and uncontrollable appearance of the physical process and consequently introducing biases into the binary sequence. Two well-known random generator designs are pseudo-random generators using an LFSR (linear feedback shift register) and chaos-based random generators. Since the seventies, the use of chaotic dynamics for the generation of random sequences has raised a lot of interest, and several authors have proposed using chaotic systems as sources of physical randomness [79]. Beside the chaos-based random generator, the LFSR is well known because of its simplicity. The output of an LFSR is not a completely random signal but a pseudo-random one, because the output signal repeats after a certain time; the signal has a period. In an LFSR, the period of the sequence is 2^n - 1, where n is the number of stages, or flip-flops, in the circuit. In order to get a signal closer to a real random signal, the period needs to be large, which requires a large number of flip-flops in the design. To get approximately the same amount of randomness as the chaos-based random generator, the number of flip-flops needs to be more than 15. Assuming each flip-flop takes at least 10 transistors, the total number of transistors in the LFSR design would be 150.
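The 2^n - 1 period is easy to verify in software. Below is a sketch of a maximal-length 4-bit Fibonacci LFSR; the tap positions (bits 3 and 2) correspond to a primitive polynomial, and the loop simply mirrors what the flip-flops plus an XOR gate would do in hardware:

```python
def lfsr_sequence(seed=0b1001, taps=(3, 2), nbits=4):
    """Generate the states of a Fibonacci LFSR until the seed state recurs."""
    state, states = seed, []
    while True:
        # XOR the tapped bits to form the feedback bit.
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        # Shift left, feed the XOR result into the low bit, mask to width.
        state = ((state << 1) | fb) & ((1 << nbits) - 1)
        states.append(state)
        if state == seed:
            return states

period = len(lfsr_sequence())  # 2**4 - 1 = 15 for a maximal-length LFSR
```

The register visits every nonzero 4-bit state exactly once before repeating; the all-zero state is excluded, which is why the period is 2^n - 1 rather than 2^n.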
In a chaos-based random number generator, by contrast, there are fewer than 40 transistors. The chaos-based random generator therefore uses far fewer transistors, which makes it much smaller than the LFSR circuit, and its total power is also lower. For our cortical neuron circuit, we require a noise generator circuit and a chaotic signal generator circuit. For the noise generator circuit, we use the chaos-based random generator; therefore we only need to design a chaotic signal generator circuit. The output of that circuit can be applied directly to the cortical neuron circuit as a chaotic signal, and, using the idea of chaos-based random number generators, we can generate random numbers after some data processing at the output of the chaotic signal generator. In this chapter, we design a chaotic signal generator using carbon nanotube transistors. We first discuss different types of chaotic maps and choose one for this design; then we present the design at the transistor level and improve it to obtain better performance.

3.2 Chaotic Map

It is common to use a chaotic map for pseudo-random number generation. A chaotic map is one or more nonlinear or piecewise-linear functions, forming a nonlinear mapping, whereas most conventional random generators (such as the LFSR) are linear. It is therefore more complex and a potential candidate for a good random source. Some chaotic maps commonly used for random number generator designs are the logistic map, the Chebyshev map, the skewed tent map and the sawtooth map. Among chaotic maps, we use a chaotic piecewise-linear one-dimensional map, because this map can be implemented simply using switched-capacitor or switched-current circuits. We implement the design with switched-current circuits, which can operate at high frequencies [26].
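The behavior of such a piecewise-linear map is easy to explore in software before committing to a circuit. The sketch below iterates the map adopted in this chapter (x_{n+1} = a*x_n + b for x_n < 0 and a*x_n - b otherwise, with a = 1.9, b = 1 and x_0 = 0.9 as in the text) and counts the ones in the sign-derived bit stream, paralleling the MATLAB analysis described later:

```python
def chaotic_bits(a=1.9, b=1.0, x0=0.9, n=100_000):
    """Iterate the piecewise-linear map and emit one bit per sample
    (the sign of x), as in a chaos-based random number generator."""
    x, xs, bits = x0, [], []
    for _ in range(n):
        x = a * x + b if x < 0 else a * x - b
        xs.append(x)
        bits.append(1 if x >= 0 else 0)
    return xs, bits

xs, bits = chaotic_bits()
ones = sum(bits)
# For a in [1, 2] the orbit stays bounded in [-b, b], and the derived
# bit stream is close to balanced (roughly half ones).
```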
The chaotic map used in the design is described by the following recurrence relationship:

x_{n+1} = f(x_n) = a*x_n + b,  if x_n < 0
                   a*x_n - b,  if x_n >= 0    (3.1)

where x_n is the nth sample of the generated sequence, and a and b are floating-point constants. As shown in Figure 3.1, the transfer function maps any -2b/a < x_n < 2b/a to -b < x_{n+1} < b. If a < 1, x_n converges, and if a > 2, x_n diverges. Hence a must be in the range [1, 2] to ensure the output x_n is in the range [-b, b].

Figure 3.1: Transfer function for the chaotic map

For a closer to 2, there is less redundancy and a better chaotic signal generator; however, as a gets closer to 2, there is a higher risk of the appearance of periodic attractors and of breakdown of the chaotic signal generator [78]. In this map, x_n is a chaotic signal; however, if a random binary number is desired, a binary one or zero is generated based on the sign of x_n, positive or negative. We analyzed this chaotic signal generator in MATLAB. Figure 3.2 shows a sequence of x_n for a = 1.9, b = 1 and x_0 = 0.9. In order to confirm that the RNG designed here is an acceptable random generator, we performed a random-property analysis (Bernoulli test) in MATLAB, analyzing the number of ones and zeros and the runs in a sequence of 100,000 bits of output.

Figure 3.2: Sequence of x_n for a = 1.9, b = 1 and x_0 = 0.9

3.3 Chaotic Signal Generator Circuit

Several authors have designed chaotic signal generators in CMOS technology [79] using the diagram shown in Figure 3.3. We used their design and then converted the circuit to carbon nanotube technology. The design has two blocks: the PWL function block and the delay and scaling block. Figure 3.4 shows the delay and scaling block at the transistor level. We use carbon nanotube (CNT) devices for the transistors.
In this device, instead of length and width as in CMOS, we set the length of the doped CNT source-side and drain-side extension regions (Lss and Ldd) and the number of parallel nanotubes in the device in order to control the current.

Figure 3.3: Block diagram of the chaotic generator circuit

The circuit is in fact a current mirror with three branches. We adjust resistor R1 and the geometric factors of the transistors so that the currents in the different branches are related as follows:

I(M5) = 2*I(M3)    (3.2)
I(M4) = I(M6)      (3.3)
I(M8) = a*I(M3)    (3.4)
I(M9) = a*I(M7)    (3.5)

When the input current I_in is injected, we have the following equations:

I(M6) = I(M4) = I_in + I(M3)            (3.6)
I(M7) = I(M5) - I(M6) = I(M3) - I_in    (3.7)
I(M9) = a*I(M7) = a*I(M3) - a*I_in      (3.8)

Figure 3.4: Delay and Scaling block at the transistor level

I_out = I(M8) - I(M9) = a*I(M3) + a*I_in - a*I(M3) = a*I_in    (3.9)

Therefore the output current is the input current scaled by a factor of a. M10, M11, M12 and M13 form two switches that operate with non-overlapping clocks; this part of the circuit creates the delay from the input current to the output current. In this design, all transistors must be in the saturation region; otherwise the current mirrors do not work properly. This condition is even harder to satisfy in carbon nanotube technology than in the same design in CMOS technology, because V_dd in carbon nanotube technology (900 mV for the technology we simulated [27]) is smaller than V_dd in CMOS technology, and the drain-to-source voltages for transistors in series are lower. This condition must also hold for any input current. In this design, the input current changes from -10 μA to 10 μA. That changes the drain voltage of M3, while the source and gate voltages of M3 are fixed; therefore M3 is at risk of going into the triode region. To make sure that M3 always stays in the saturation region, it is better to set the gate-source voltage of M3 very close to the threshold voltage.
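The current bookkeeping of Equations 3.2-3.9 can be sanity-checked numerically, assuming ideal mirrors (a pure algebra check that ignores the saturation and channel-length-modulation issues discussed in the text; the bias value I(M3) is an arbitrary choice):

```python
def scaled_delay_out(I_in, I_M3=5.0, a=1.9):
    """Ideal-mirror algebra of the delay-and-scaling block (currents in uA)."""
    I_M5 = 2 * I_M3          # Eq. 3.2: mirror ratio of 2
    I_M6 = I_in + I_M3       # Eq. 3.6 (with I(M4) = I(M6))
    I_M7 = I_M5 - I_M6       # Eq. 3.7: leaves I(M3) - I_in
    I_M8 = a * I_M3          # Eq. 3.4: scaled bias copy
    I_M9 = a * I_M7          # Eq. 3.5: scaled difference
    return I_M8 - I_M9       # Eq. 3.9: bias terms cancel, a * I_in remains

# The bias current I(M3) cancels out, so the block simply scales the
# input current by the factor a regardless of the bias value chosen.
```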
Other transistors, such as M5, M6, M8 and M9, are at risk of leaving saturation too, which makes it difficult for the design to work properly. Another drawback of this design is channel-length modulation: even when a transistor is in saturation, the drain current is not independent of the drain-to-source voltage V_ds. Therefore, even if the geometric factors of the transistors in the current mirror are fixed, the current can change. For example, the current in M5 changes slightly when the input current I_in, and hence the drain voltage of M5, changes. In some cases we can ignore this small change of current; however, we have to make the design robust for high input currents, which we do by making the range of input currents narrower. M10 and M12 behave like switches; we must make the number of tubes in these transistors large enough that they behave like ideal switches. Figure 3.5 shows the PWL function block. In this design, M14, M15, M16 and M17 form two current mirrors. We adjusted the geometry of the transistors so that M16 and M17 each carry a current of b. When the input current to this block is positive, the drain voltage of M18 and M19 increases. This node is also the input of the first of three cascaded inverters; therefore the outputs of these inverters become zero, one and zero, respectively. So for I_in > 0, M19, M21, M22 and M25 are ON and M18, M20, M23 and M24 are OFF.

Figure 3.5: PWL function at the transistor level

I(M25) + I(M17) = I(M19)                       (3.10)
I_out = I(M25) = I(M19) - I(M17) = I_in - b    (3.11)

In this case, the circuit generates a binary one at the binary output, and the output current of this block, as shown above, is equal to the input current minus the current b. When the input current is negative, the drain voltage of M18 and M19 decreases. Therefore M18, M20, M23 and M24 are ON and M19, M21, M22 and M25 are OFF.
I(M24) + I(M18) = I(M16)                       (3.12)
I_out = I(M24) = I(M16) - I(M18) = I_in + b    (3.13)

The binary output in this case becomes low (binary zero), and the output current is the sum of the input current and the constant current b. We first simulated the PWL function circuit separately. In this case, the input current is a current source sweeping from -19 μA to 19 μA. (We defined a = 1.9 and b = 10 μA.) When we changed the input current from 1 μA to 10 μA, the voltage at node 3 changed from 206 mV to 697 mV. As mentioned earlier, M17 must be in saturation in order to deliver a constant current of 10 μA while its drain voltage changes from 206 mV to 697 mV. Therefore we set the gate-source voltage of M17 very close to the threshold voltage and also made the number of tubes large. In this case, the current of M17 is 10 μA ± 1%. Figure 3.6 shows the output current versus the input current for the PWL function block. When the input current is positive and increasing from 1 μA to 10 μA, the output current rises linearly from -9 μA to 0 μA, as expected. However, when the input current goes above 10 μA, the output current becomes positive and no longer follows a linear change. When the output current switches from negative to positive, current flows from node 3 to node 6, and in fact the source and drain of transistor M25 are reversed. In this case, voltages at some nodes (node 3) rise even above V_dd, which forces the circuit to work improperly.

Figure 3.6: Output current vs. input current for PWL function when I_in > 0

We have the same problem for negative input currents. When the input current changes from -1 μA to -10 μA, the output current falls linearly from 9 μA to 0 μA. However, when the input current goes below -10 μA, the output current switches from positive to negative and its change is no longer linear.
The question, then, is whether there is any way to limit the input current to the PWL function block from [-19 μA, 19 μA] to [-10 μA, 10 μA] so that the circuit still works properly.

Figure 3.7: Block diagram for the improved switched current circuit

3.4 Improved Chaotic Signal Generator Circuit

As shown in Figure 3.7, by limiting the input current to the PWL function block from [-19 μA, 19 μA] to [-10 μA, 10 μA], the circuit operates properly. We divide the output current of the scaled delay block, send one half to the PWL function block and add the other half to the output of the PWL function block. Since the PWL function block is a linear function with unity gain, it does not matter whether we send half the current to its input and then add the other half at its output. At the transistor level, the scaled delay circuit changes as shown in Figure 3.8. We halved the width of M8 and M9 to get half the current, and then added M26 and M27, which are the same size as M8 and M9. There is no change in the PWL function block at the transistor level. We simulated the PWL function block for input currents from -9.5 μA to 9.5 μA and obtained approximately linear results, as shown in Figure 3.9.

Figure 3.8: Scaled delay block at the transistor level

Figure 3.9: Output vs. input current for PWL function

3.5 Circuit Adjustment

The chaotic generator circuit generates a current between -9.5 μA and 9.5 μA at I_out1 (shown in Figure 3.8) in a chaotic manner based on Equation (3.1). In order to use this chaotic current as an input to the neuron circuit, we converted the chaotic current to a chaotic voltage and then adjusted the voltage level and the frequency of samples. After the adjustment, the chaotic voltage is described by the following recurrence relationship:

v_{n+1} = 1.9*v_n + 0.5*V_pp - 0.9*V_mid,  if v_n < V_mid
          1.9*v_n - 0.5*V_pp - 0.9*V_mid,  if v_n >= V_mid    (3.14)

where V_pp is the peak-to-peak voltage of the chaotic signal and V_mid is the midpoint voltage.
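Equation 3.14 can be iterated directly in software to preview the waveform applied to the neuron circuit; the parameter values below match the example given for Figure 3.10 (V_pp = 400 mV, V_mid = 300 mV, V_init = 480 mV, with values expressed in volts):

```python
def chaotic_voltage(v_pp=0.4, v_mid=0.3, v_init=0.48, n=1000):
    """Sample sequence of the chaotic voltage of Equation 3.14 (volts)."""
    v, out = v_init, []
    for _ in range(n):
        if v < v_mid:
            v = 1.9 * v + 0.5 * v_pp - 0.9 * v_mid
        else:
            v = 1.9 * v - 0.5 * v_pp - 0.9 * v_mid
        out.append(v)
    return out

vs = chaotic_voltage()
# The samples stay inside [v_mid - v_pp/2, v_mid + v_pp/2] = [0.1 V, 0.5 V]:
# Equation 3.14 is just Equation 3.1 shifted to V_mid and scaled to V_pp.
```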
In order to convert the output current to a chaotic voltage, we can use a small resistor at the output I_out1; a comparator then senses the voltage drop across this resistor. We then use an amplifier or attenuator to change the peak-to-peak voltage, and add or remove a DC offset to adjust the midpoint. In the remainder of this thesis, we characterize the chaotic voltage applied to the neuron circuit by four parameters: V_pp, V_mid, V_init (the initial condition) and the period of each sample. For example, Figure 3.10 shows a chaotic voltage with V_pp = 400 mV, V_mid = 300 mV, V_init = 480 mV and a sample period of 10 ns.

Figure 3.10: Chaotic voltage with V_pp = 400 mV, V_mid = 300 mV, V_init = 480 mV and a sample period of 10 ns.

Chapter 4: Circuit Implementation of Intrinsic Variability

4.1 Introduction

As mentioned earlier, the main goal of this project is modeling intrinsic variability in neural networks at the circuit level and demonstrating the value of variability in the behavior of neural networks. There is a great deal of research about variability in the nervous system; however, there has been no circuit implementation of neurons with variability included. In this chapter, we first explain the circuit implementation of the synapse and axon hillock with no variability included, designed in Parker's BioRC group. Then we discuss how we can include ion-channel variability and synaptic variability in the cortical neuron circuit. Figure 4.1 shows a simplified model of a cortical neuron, consisting of synapses, the dendritic arbor and the axon hillock. Excitatory and inhibitory synapses receive action potentials from pre-synaptic neurons and transfer a post-synaptic potential to the dendritic arbor. The dendritic arbor consists of several adders that sum all signals coming from the synapses and generate a single signal. The arrangement of the adders and their linearities vary depending on the individual neuron and the synaptic location.
The amplitude of the dendritic arbor potential signal is compared with the threshold voltage in the axon hillock.

Figure 4.1: A system block diagram of the cortical neuron model with a pyramidal neuron cartoon [47].

If it is more than the threshold, a spike is generated at the output of the axon hillock; otherwise there is no spike at the output. Variability is not included in any part of this model. In order to include intrinsic variability in the model, we will include neurotransmitter-release variability in the synapse circuit and ion-channel variability in the axon hillock circuit. Other aspects of neural activity could also exhibit variable behavior, but the two aspects mentioned here are thought to be the most likely sources of variability.

4.2 Neurotransmitter-Release Variability in the Synapse Circuit

Figure 4.2 shows the BioRC carbon nanotube excitatory and inhibitory synapse circuits [48]. These circuits model cell potentials and neurotransmitter concentrations with voltages, with a correspondence between circuit elements and biological mechanisms. In Figure 4.2a, the excitatory synapse circuit, the action potential turns M5 ON. M7 and M6 are already ON. Therefore the voltage at the source of M5 (the gate of M7) increases until M7 turns OFF, limiting the rise of voltage at the synaptic cleft. M6 controls the neurotransmitter concentration in the cleft. M3 models the reuptake process. The inverter containing M1 and M2 generates a delay between the action potential and the neurotransmitter reuptake. The neurotransmitter release causes the ion channels to open. This depolarization is modeled by M13. The pull-up transistor in the neurotransmitter section (M6) modulates the neurotransmitter concentration in the synaptic cleft (the voltage at the synaptic cleft node). The voltage at the gate labeled Neurotransmitter controls the neurotransmitter release. This causes a change in the EPSP peak amplitude, directly altering the synapse strength.
In the circuit with no variability, the voltage applied to the gate labeled Neurotransmitter could be a fixed biasing voltage, or the voltage could vary as the result of some retrograde process in the synapse arising in the postsynaptic neuron. In the case described in this thesis, the gate voltage is either an analog noise signal or a chaotic signal that makes the neurotransmitter release variable. This variable input causes the peak amplitude of the EPSP to be variable and varies the synapse strength stochastically or chaotically.

Figure 4.2: The carbon nanotube a) excitatory and b) inhibitory synapses, where R represents Reuptake [48].

Figure 4.3: Input and output of the excitatory synapse a) no variability, b) with variability included.

Similarly, we produce variability in a BioRC inhibitory synapse. Figure 4.3 shows the result of the synapse circuit with and without variability. We simulated the carbon nanotube cortical neuron, consisting of three excitatory synapses with different strengths, one inhibitory synapse, a dendritic arbor and the axon-hillock circuits, in SPICE. We performed several experiments for neurotransmitter-release variability. First we simulated the neuron with no variability. A spike was applied to the four synapses at the same time. The neurotransmitter concentration voltage controls for excitatory synapses one, two and three are 850mV, 700mV and 550mV respectively. Therefore the three excitatory synapses had different strengths (different peak EPSPs). The neurotransmitter concentration voltage control for the inhibitory synapse is 700mV.

Figure 4.4: Input spike, PSPs, dendritic output and output spike for the neuron with no variability included.

Figure 4.4 shows the result. As shown in the figure, the EPSP for excitatory synapse one has the highest peak, since this is the strongest synapse, and the PSPs for the other synapses follow in order of their strengths.
The EPSP peak for the second spike is slightly more than the first EPSP peak, and so on. The reason is that the first spike is applied when the neuron is at the resting potential, while the second and third spikes are applied before the neuron returns to the resting potential, so there is temporal summation. The output of the dendrite is a summation of all PSPs, and when it crosses the threshold voltage (170mV) the neuron fires. As shown in the figure, the neuron fires when all three excitatory PSPs peak. If we apply a spike to only one synapse instead of all synapses, the dendritic output is not sufficient to fire the neuron. We included neurotransmitter-release variability and calculated the probability of firing. For example, we applied a spike to the first synapse and, instead of an 850mV fixed biasing voltage for the neurotransmitter control voltage, we included a Gaussian voltage with mean µ = 850mV, standard deviation changing from σ = 0mV to σ = 500mV, and a sample period of 10ps. We assume all other synapses have no spikes as inputs. The probability of firing is shown in Figure 4.5. We changed the standard deviation from 0mV to 500mV with a step size of 25mV. For each standard deviation we ran the SPICE and MATLAB experiments 100 times with 100 different Gaussian samples. Among the 100 experiments, we counted how many times the neuron fired. The probability is calculated by dividing the number of experiments in which the neuron fired by 100. When σ = 0mV, meaning there is no synaptic variability, the PSP from one synapse is not strong enough to fire the neuron, so the probability is zero. When we allow synaptic variability and increase the standard deviation, the probability increases. Since synapse one is stronger, the PSP generated from synapse one is closer to the threshold voltage, and therefore the probability that variability in neurotransmitter release at synapse one causes the neuron to fire is higher.
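The Monte Carlo procedure just described (count firings over 100 random trials, divide by 100) can be sketched in a few lines of Python. Here `neuron` behavior is reduced to a hypothetical linear control-voltage-to-PSP mapping (`psp_gain`) followed by a threshold test; this stands in for one SPICE run and is not the actual circuit model:

```python
import random

def firing_probability(mean_mv, sigma_mv, threshold_mv, psp_gain,
                       trials=100, seed=0):
    """Estimate P(fire) by sampling the Gaussian neurotransmitter control.

    psp_gain maps the control voltage to a PSP peak; this linear mapping
    is a stand-in for the SPICE synapse simulation, not the thesis circuit.
    """
    rng = random.Random(seed)
    fires = 0
    for _ in range(trials):
        control = rng.gauss(mean_mv, sigma_mv)   # variable release control
        psp_peak = psp_gain * control            # hypothetical synapse response
        if psp_peak > threshold_mv:              # axon hillock threshold test
            fires += 1
    return fires / trials

# With sigma = 0 the single-synapse PSP never crosses threshold,
# so the probability is zero, as in the text.
assert firing_probability(850, 0, 170, 0.18) == 0.0
```

Increasing `sigma_mv` lets occasional large control samples push the PSP past threshold, reproducing the rising probability curve of Figure 4.5.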
Synapse three generates a PSP that is much smaller than the threshold, and even strong synaptic variability cannot help the neuron fire without spikes at the other synapses in the dendritic arbor. If we assume the PSP signal is Gaussian, we can calculate the probability of firing using the cumulative distribution function of the Gaussian and obtain results that match the experimental curves; this assumption is not necessarily true. The following function gives the probability of firing for different PSP values, where each PSP value is a Gaussian random number. Figure 4.6 shows the result of plotting this function versus σ in MATLAB.

Figure 4.5: Probability of firing when a Gaussian voltage is included for neurotransmitter-release control, resulting in synaptic variability; only one synapse has a spike at the input.

    p(V_PSP > V_th) = 0.5 − 0.5 erf((V_th − µ) / √(2σ²))        (4.1)

where µ is the mean of the PSP and σ is the standard deviation of the PSP signal, substituting the non-variable PSP values for the three synapses as the means of the PSPs. Based on this function, we can conclude that the maximum probability is 0.5 if we include a strong Gaussian neurotransmitter-release control (large mean). We did a similar experiment but, instead of Gaussian release control, we used a chaotic control mechanism. V_pp changes from 0mV to 1000mV.

Figure 4.6: Probability of firing when a Gaussian voltage is included for neurotransmitter-release control, resulting in synaptic variability; only one synapse has a spike at the input (MATLAB simulation).

V_mid is the same as the fixed biasing voltage for the neurotransmitter control voltage, and V_init changes from V_mid − V_pp/2 to V_mid + V_pp/2. As shown in Figure 4.7, traces with the same color are the results for one synapse with different initial conditions.
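Equation (4.1) can be evaluated directly with the standard-library error function. A short Python check of the 0.5 ceiling mentioned in the text (the example voltages are illustrative):

```python
import math

def p_fire(mu, sigma, v_th):
    """Equation (4.1): probability that a Gaussian PSP exceeds the threshold."""
    if sigma == 0:
        return 1.0 if mu > v_th else 0.0
    return 0.5 - 0.5 * math.erf((v_th - mu) / math.sqrt(2 * sigma ** 2))

# When the mean PSP is below threshold, the probability approaches 0.5
# from below as sigma grows, matching the 0.5 maximum noted in the text.
assert p_fire(150, 25, 170) < p_fire(150, 500, 170) < 0.5
```

When the mean equals the threshold, the function returns exactly 0.5 for any σ > 0, which is the limiting value the thesis derives for a strong release control.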
When V_pp is low, the results for different initial conditions are almost the same; however, for higher V_pp, when the PSP is close to the threshold, the results differ, meaning that the effect of the initial condition on the probability of firing increases for PSPs close to the threshold voltage. We applied a spike to all four synapses and included neurotransmitter-release variability in just one of them at a time. We included Gaussian neurotransmitter-release variability with µ = 850mV and σ = 0mV to σ = 200mV in the first synapse and calculated the probability of firing.

Figure 4.7: Probability of firing when a chaotic signal is included for neurotransmitter-release control, resulting in synaptic variability.

The mean of the Gaussian signal is the same as the fixed biasing voltage in the no-variability experiment because the Gaussian signal has a symmetric variation around this voltage, meaning that the probability of being above 850mV is the same as the probability of being below it (50%). The dendritic output has a variable amplitude, and therefore the neuron sometimes fires and sometimes does not, depending on the dendritic output amplitude. We calculated the probability of firing. We did the same experiments for variability in synapses two and three and in the inhibitory synapse. Figure 4.8 shows the probability of firing when synaptic variability is included in the excitatory and inhibitory synapses. All probabilities start from 1 because, when there is no variability included, the peak of the dendritic output (173.8mV) is slightly more than the threshold voltage (170mV) and the neuron fires, as shown in Figure 4.4. When synaptic variability is included, the variability can push the output of the dendrite below the threshold; therefore the probability goes down from 1 toward 0.5. The probability of firing is lowest when the variability is in the inhibitory synapse, meaning that the inhibitory synapse is more sensitive to neurotransmitter-release variability than the excitatory synapses.
Also, by comparing the probabilities for the three synapses, we can conclude that the neuron is more sensitive to neurotransmitter-release variability included in a weak synapse than to variability included in a strong synapse. We did another experiment similar to the previous one, but included the variability in two synapses: synapse one and synapse two. The Gaussian signals included in the two synapses have the same standard deviation and could be correlated. Figure 4.9 shows the probability of firing versus standard deviation in the MATLAB simulation. As shown in the figure, for a higher correlation factor (ρ) the probability is lower, meaning that when the variability of the synapses is correlated, the variability is more effective and changes the firing probability more. We can verify this statement by calculating the standard deviation of a Gaussian signal that is the summation of two Gaussian signals. The standard deviation is as follows:

    σ_PSP = √(σ_PSP1² + σ_PSP2² + 2 ρ σ_PSP1 σ_PSP2)        (4.2)

As shown in the above equation, a higher correlation factor means a higher standard deviation and a stronger Gaussian signal.

4.3 Ion-Channel Variability in the Axon Hillock Circuit

Ion-channel variability is included in the axon hillock module, which contains the input stage and the spike-generation stage. The input stage consists of an amplifier and two inverters in cascade. It receives the summation of EPSPs and IPSPs from the dendritic arbor and, after amplifying the signal and shaping the pulse, sends a rising edge to the spike-generation stage circuit.

Figure 4.8: Probability of firing when neurotransmitter-release variability is included in one synapse, SPICE results (top), MATLAB results (bottom).

Figure 4.9: Probability of firing when neurotransmitter-release variability is included in two synapses, MATLAB results.

The spike-generation circuit shown in Figure 4.10 behaves like a self-resetting CMOS circuit.
If the rising-edge output from the input stage of the axon hillock crosses the threshold voltage, X2 turns ON. X1 is already ON in the resting state. Therefore X1 and X2 pull the gate voltage of X8 low, opening the Na+ ion channel. The output rises quickly to Vdd. As soon as the output goes high, the circuit resets itself: X4 turns ON, X1 turns OFF, and finally X8 and X7 turn OFF and ON respectively. X7 models the K+ ion channels. When X7 is ON, the K+ ion channels open and the output returns to the low voltage (the resting potential).

Figure 4.10: Spike generation circuit in the axon hillock module [47]

The width of the output spike is controlled by the delay of the inverter providing input to X7. This spike-generation model can be used with no variability, with ion-channel inputs controlled by the output of a simple amplifier (only necessary because the simplified experimental BioRC neurons used here have a small quantity of synapses) followed by an inverter pair that creates the fast rising edge required by the self-resetting circuit to generate a spike; i.e., the opening and closing of the ion channels modeled by X8 and X7 are deterministic because the controls for these two channels have no variability. In order to include variability, we added X9 and X12 to the original input-stage circuit (Figure 4.11) and applied a Gaussian noise or chaotic signal to the gates of these transistors (labeled Variability). The variable signal can prevent the neuron from firing or force the neuron to fire, depending on the level of variability. When the variable signal is very small, X9 turns OFF, which can prevent the neuron from firing. When the variability signal rises to around 450mV, X9 turns ON and the circuit behaves like the first inverter in the circuit with no variability. When the variability signal increases sufficiently, X12 turns ON and pulls down the output, sending a rising edge to the spike-generation circuit, and the neuron fires.
In this case, the variable signal forces the neuron to fire if the variability amplitude is sufficient. With a moderate variability input, the neuron fires reliably depending on the membrane potential. We also show how the variability changes the threshold voltage for firing. When the variable signal is low, it pushes the threshold voltage for firing close to Vdd. In this case, an input to the axon that caused firing in the non-variable neuron is not able to cross the threshold, and the variability prevents the neuron from firing. Increasing the variable signal reduces the threshold voltage, causing the neuron to fire even with a low input for which the non-variable neuron would not fire. For very high variability, the variable neuron fires with no input, which models spontaneous firing. Figure 4.12 shows the result of the experiments including ion-channel variability in the axon. We included a chaotic voltage signal with V_mid = 450mV, V_pp = 900mV and a sample period of 10ps (green).

Figure 4.11: Axon hillock input stage with included variability

At time 130ps, the PSP applied to the axon (purple) has an amplitude (200mV) greater than the threshold voltage (170mV) and the neuron is expected to fire; however, because of the low chaotic signal at time 130ps, the neuron does not fire (blue). In fact, in this case variability prevents the neuron from firing. At time 730ps, the PSP (150mV) is less than the threshold voltage, but the chaotic signal at that time is strong enough to help the PSP cross the threshold, and the neuron fires. At time 400ps, the PSP is 0V, but variability forces the neuron to fire, which models spontaneous firing. We included Gaussian noise for the variability signal with µ = 450mV and σ = 100mV in the axon. We applied several PSP samples with amplitudes changing from 0mV to 340mV with a 10mV step size. For each PSP sample, we simulated the circuit 100 times with 100 Gaussian samples for ion-channel variability and summed the firing occurrences over the 100 runs.
We calculated the probability of firing for each PSP sample, roughly, by dividing the number of firing occurrences by 100. We repeated the experiment for Gaussian noise with a different standard deviation (σ = 200mV) and also for two types of chaotic signals, one with V_mid = 450mV and V_pp = 350mV, and another with V_mid = 450mV and V_pp = 600mV.

Figure 4.12: Input and output in the axon hillock with variability included (chaotic signal).

Figure 4.13 shows the result. The orange trace shows the result for the circuit with no variability. The threshold voltage is about 165mV. When the PSP changes from below the threshold to above the threshold, the probability for the circuit with no variability jumps from zero to one. When small variability is included (green and blue traces), the probability changes more slowly around the threshold voltage than in the result with no variability. When the variability is larger (purple and red traces), the probability changes much more slowly around the threshold voltage. The results in Figure 4.13 are approximately symmetric around the threshold because both the mean of the Gaussian signal and the mid point of the chaotic voltage are 450mV. Therefore the probability of variability forcing the neuron to fire is the same as the probability of it preventing the neuron from firing. We ran the circuit for Gaussian noise with a different mean (µ = 750mV and σ = 100mV).

Figure 4.13: Probability of firing vs. PSP amplitude for four types of variability.

Figure 4.14 shows the result. When the mean of the Gaussian signal is higher, the probability of firing for the same PSP amplitude is higher, meaning that the variability is more likely to force the neuron to fire than to prevent it from firing. The probability of spontaneous firing (when the PSP is zero) is higher when the mean of the Gaussian noise is higher.

Figure 4.14: Probability of firing vs. PSP amplitude for Gaussian variability with two different mean values.
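The qualitative shape of Figure 4.13 follows from treating the variability as a random shift of the effective firing threshold. The Python sketch below uses a zero-mean Gaussian threshold shift; this linear threshold-shift model is an assumption for illustration, not the circuit equation:

```python
import math

def p_fire_vs_psp(psp_mv, sigma_mv, v_th=165.0):
    """P(fire) when Gaussian ion-channel variability (std sigma_mv,
    zero-mean effective shift) perturbs the firing threshold:
    P(psp > v_th + noise)."""
    if sigma_mv == 0:
        return 1.0 if psp_mv > v_th else 0.0
    return 0.5 + 0.5 * math.erf((psp_mv - v_th) / (sigma_mv * math.sqrt(2)))

# No variability: a hard step at the threshold, like the orange trace.
assert p_fire_vs_psp(160, 0) == 0.0 and p_fire_vs_psp(170, 0) == 1.0
# A larger sigma flattens the transition around the threshold,
# as with the purple and red traces in Figure 4.13.
assert p_fire_vs_psp(200, 200) < p_fire_vs_psp(200, 100)
```

Giving the threshold shift a nonzero mean (as with the µ = 750mV experiment) slides the whole curve, raising the spontaneous-firing probability at PSP = 0.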
Chapter 5: Impact of the Variability

As we mentioned in Section 1.2, recent studies indicate that variability might offer distinct advantages. In this chapter we discuss three impacts of the variability. We demonstrate how neurotransmitter-release variability could play a beneficial role, leading to increased reliability of neuronal firing in single neurons. Then we illustrate how, in one particular case, ion-channel variability could play a beneficial role in the reliability of transferring a train of spikes. We finally show how ion-channel variability can halt infinite looping behavior with positive feedback in a neural network, such as those thought to occur in obsessive-compulsive disorder.

5.1 Reliability of Spike Timing

The nervous system sends information using trains of action potentials. Neuroscientists have long debated how spike trains carry information. There are two main approaches by which spike trains may carry information: rate coding and temporal coding. In temporal coding, the exact timing of the spikes is important and carries information. Therefore temporal coding involves precise patterns of spikes, and in this case the reliability of spike timing has an important impact on the information content of the spike train. As we mentioned in Section 1.2, several researchers such as Mainen and Sejnowski have studied the reliability of spike timing in the neuron. Their research findings encouraged us to examine spike-timing characteristics with neurotransmitter-release variability in our neuromorphic circuit simulations. In this section, we demonstrate how neurotransmitter-release variability could play a constructive role, leading to increased reliability of neuronal firing in single neurons. A postsynaptic potential (PSP) with no neurotransmitter-release variability is applied to an axon hillock SPICE simulation model and leads to a train of spikes in the simulation. Ion-channel variability is included in the axon hillock.
Trial-to-trial results are studied. Then neurotransmitter-release variability is included in the synapse circuit and a PSP with neurotransmitter-release variability is applied to the axon. Trial-to-trial results are compared with the previous results. The timing of the spikes drifts from one trial to the next, and the amount of drift shows how timing-reliable and timing-reproducible the spikes are. The results for the PSP with variability are more timing-reliable than those for the PSP with no variability. We did an experiment without neurotransmitter-release variability. We applied a constant total PSP to the axon hillock. The amplitude of the total PSP (175mV) was fixed over time and is slightly more than the threshold voltage (170mV). We included ion-channel variability in the axon hillock: a Gaussian control of the ion channels with µ = 450mV and σ = 360mV. The result is shown in Figure 5.1. Since the total PSP is more than the threshold, the neuron keeps spiking. The refractory period is 100ps, meaning that after each spike the neuron will not fire for 100ps. Since the spike times depend on the levels of ion-channel variability, the spikes for the three experiments (red, blue and green) have different spike times.

Figure 5.1: PSP and output spike for three experiments.

We characterize spike-timing reliability with a variable called Reliability. We ran the experiment 100 times to get a raster plot. This quantity is formed from the number of spikes in the 100 trials of the raster plot that occurred at approximately the same time, divided by the total number of spikes (100 spikes). This quantity is therefore between 0 and 1. For example, when the neuron fires with perfect spike-timing reliability, all spikes in the 100 trials of the raster plot occur at the same time; therefore the reliability is 1. Figure 5.2 shows the reliability of spike timing for a constant PSP versus the standard deviation of ion-channel variability.
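The Reliability measure described above (spikes landing at approximately the same time across trials, divided by the total spike count) can be sketched in Python. The binning tolerance is a hypothetical choice, since the thesis specifies only "approximately the same time":

```python
def spike_timing_reliability(trials, tolerance_ps=5.0):
    """Fraction of spike times (one per trial) that fall within
    tolerance_ps of the most densely populated spike time across trials.

    trials: list of spike times in picoseconds, one entry per trial.
    """
    best = 0
    for ref in trials:
        close = sum(1 for t in trials if abs(t - ref) <= tolerance_ps)
        best = max(best, close)
    return best / len(trials)

# Perfectly reproducible spiking across 100 trials gives reliability 1.
assert spike_timing_reliability([100.0] * 100) == 1.0
# Widely scattered spike times give low reliability.
assert spike_timing_reliability([i * 20.0 for i in range(100)]) < 0.05
```

This matches the endpoints in the text: identical spike times across all trials give a reliability of 1, and fully scattered times approach 1/N.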
For ion-channel variability, we included Gaussian variability with µ = 450mV and σ from 0mV to 0.5V. When there is no ion-channel variability (σ = 0), the reliability is 1. As ion-channel variability increases, the reliability decreases, as expected; the spike times change randomly depending on the ion-channel variability level.

Figure 5.2: Reliability of spike timing for constant PSP vs. standard deviation of ion-channel variability.

Figure 5.3 shows the reliability of spike timing for a constant PSP versus the PSP amplitude. When the PSP amplitude increases, it overcomes the impact of ion-channel variability on spike times and therefore the reliability increases. Now we include neurotransmitter-release variability (Gaussian with µ = 0.6V and σ up to 2.8V) in the synapse circuit. The PSP amplitude changes randomly. Figure 5.4 shows a raster plot of 100 trials of spike times for two types of PSP: a constant PSP and a PSP with synaptic variability included. The reliability of spike timing for the noiseless PSP is 0.105 and for the noisy PSP is 0.412, meaning that when the variable PSP is applied to the axon, spike times in the neuron are more reliable. This situation happens when synaptic variability is stronger than ion-channel variability (in Figure 5.4, σ(ion-channel variability) = 0.2V and σ(synaptic variability) = 2.8V).

Figure 5.3: Reliability of spike timing for constant PSP vs. PSP amplitude

Figure 5.5 shows the reliability of spike timing for the variable PSP versus the standard deviation of synaptic variability for three different standard deviations of ion-channel variability. When the synaptic-variability standard deviation increases, the reliability increases. This is because, in order to have higher reliability, the synaptic variability must be stronger than the ion-channel variability, as explained below.
When the PSP is constant and has an amplitude around the threshold voltage, ion-channel variability defines the spike times across trials, and therefore the neuron generates unreliable or irreproducible spikes.

Figure 5.4: 100-trial raster plot of spike times for constant and variable PSP (stimulus).

If the PSP amplitude is much higher than the threshold, the neuron fires at the same time in all trials and low ion-channel variability cannot change the spike times.

Figure 5.5: Reliability of spike timing for variable PSP vs. standard deviation of synaptic variability for three different standard deviations of ion-channel variability.

The same situation holds for PSP amplitudes much lower than the threshold: ion-channel variability does not have a considerable impact on spike timing. However, when the PSP amplitude is around the threshold, spike times are mainly determined by ion-channel variability and therefore the neuron generates unreliable spikes across trials. Now a strong synaptic variability is included in the synapse, making the PSP amplitude strongly variable. The variability in the PSP is frozen, meaning that the same variable PSP waveform is used repeatedly across all trials. For example, in the raster plot shown in Figure 5.4, the same Gaussian signal was used for all 100 trials. In this case, the high synaptic variability moves the PSP far away from the threshold, either much above it or much below it, and since the waveform is frozen, the neuron generates reliable spikes. Therefore a PSP with synaptic variability generates reliable spiking, provided the amplitude of the synaptic variability is large enough to overcome the ion-channel variability. We can obtain results similar to the experimental results in Figure 5.5 by probabilistic calculation with the Gaussian distribution. As we mentioned earlier, in order to get reliable spike timing, the PSP amplitude must be much higher or much lower than the threshold voltage.
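One way to quantify "far from the threshold" is the Gaussian tail probability 1 − erf(∆/(σ√2)), i.e., the chance a PSP with mean at threshold lands more than ∆ away from it. A minimal Python sketch (the values of ∆ and σ are illustrative):

```python
import math

def p_far_from_threshold(delta, sigma):
    """P(|V_PSP - V_th| > delta) for a Gaussian PSP with mean at the
    threshold and standard deviation sigma:
    1 - erf(delta / (sigma * sqrt(2))), valid for sigma > 0."""
    return 1.0 - math.erf(delta / (sigma * math.sqrt(2.0)))

# Increasing the PSP standard deviation raises the chance of landing
# far from threshold, i.e. more reliable (frozen) spiking.
assert p_far_from_threshold(0.5, 1.0) < p_far_from_threshold(0.5, 4.0)
# Requiring a larger distance delta lowers the probability.
assert p_far_from_threshold(2.0, 1.0) < p_far_from_threshold(0.5, 1.0)
```

Sweeping σ for a few fixed values of ∆ reproduces the rising curves of Figure 5.5, with ∆ playing the role of the ion-channel-variability strength.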
If we assume the variable PSP is a Gaussian signal with mean equal to the threshold voltage, the probability of the PSP being far away from the threshold voltage is:

    p(∆, σ) = 1 − p(|V_PSP − V_th| < ∆) = 1 − erf(∆ / √(2σ²))        (5.1)

where σ is the standard deviation of the PSP and ∆ is a quantity that determines how far the PSP amplitude is from the threshold voltage; this quantity changes based on the standard deviation of the ion-channel variability. We plot this function by changing the standard deviation of the PSP from 0V to 6V for three different values of ∆, as shown in Figure 5.6. The results are approximately the same as the results in Figure 5.5.

Figure 5.6: Reliability of spike timing for variable PSP vs. standard deviation of synaptic variability for three different standard deviations of ion-channel variability, MATLAB simulation.

5.2 Reliability of Spike Train Propagation

The research described in this thesis demonstrates how ion-channel variability could play a beneficial role in neural coding when a train of spikes contains information based on temporal coding and arrives at a neuron possessing ion-channel variability. Trains of spikes arrive at the neuron from several presynaptic neurons, with multiple spike trains arriving from a single neuron. Multiple synapses originating at the same presynaptic neuron provide stronger signals as well as some redundancy. If any spike in any duplicated train is missing because of variability in any part of the network, such as variability in presynaptic neurons, ion-channel variability could help to recover that missed spike. Small amounts of ion-channel variability included in the cortical neuron circuit can allow weak signals (the total PSPs coming from several synapses) to fire the neuron and, in essence, recover the missed spike. However, variability that is too strong can be detrimental to transferring the train, as we show.
Therefore ion-channel variability at a certain level helps the neuron to transfer a train of spikes reliably, even when the input sequence is noisy (missing spikes). Of course, if there is variability in non-duplicated spike trains, then ion-channel variability could also force firing or prevent firing, and it is not clear whether ion-channel variability plays a beneficial role in this case. One impact of ion-channel variability is that it can help the neuron fire when the neuron is expected to fire but, because the PSP amplitude is slightly less than the threshold voltage, possibly due to synaptic variability, the neuron does not fire. In order to demonstrate this impact of ion-channel variability at the circuit level by simulating our cortical neuron circuit, we devised a specific example; the impact of variability could be demonstrated with other examples as well. In our example, we used the carbon nanotube cortical neuron with eight excitatory synapses, a dendritic arbor and the axon hillock circuits, shown in Figure 5.7. All synapses are identical, with a neurotransmitter concentration voltage control of 580mV. We applied a train of spikes to all synapses. We assume that the spikes arriving at all synapses are synchronized and are sent from the same presynaptic neuron. The arrival times would in practice be unsynchronized unless all spikes originated at a single source. However, temporal summation allows PSPs from unsynchronized spikes arriving at different synapses to sum, and in many cases synaptic strength is heightened with parallel synapses to the same presynaptic neuron. If the spikes arrive at all synapses synchronously, the output of the dendrite is 239mV, more than the threshold voltage (180mV). If the spikes arrive at only seven, six or five synapses synchronously, the outputs of the dendrite are 208.1mV, 178.2mV and 147.7mV respectively.
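The expected-firing logic above can be captured in a few lines. The dendritic output values come from the text; the simple table lookup is a sketch, not the SPICE model:

```python
# Dendritic output (mV) vs. number of synchronously arriving spikes,
# values taken from the SPICE simulation described in the text.
DENDRITE_MV = {8: 239.0, 7: 208.1, 6: 178.2, 5: 147.7}
THRESHOLD_MV = 180.0

def fires_without_variability(n_synapses):
    """True when the dendritic output crosses the axon hillock threshold."""
    return DENDRITE_MV[n_synapses] > THRESHOLD_MV

# Eight or seven synchronized synapses fire the neuron; six or five do not,
# which is where ion-channel variability can recover the missed spikes.
assert [n for n in (8, 7, 6, 5) if fires_without_variability(n)] == [8, 7]
```

Note how close the six-synapse case (178.2mV) sits to the 180mV threshold: a small downward shift of the effective threshold from ion-channel variability is enough to fire the neuron.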
Our assumption is that when the spikes arrive at at least five of the eight synapses, the neuron is expected to fire. When the spikes arrive at eight or seven synapses, the neuron fires, since the output of the dendrite is more than the threshold voltage. However, when the spikes arrive at only six or five synapses, the output of the dendrite is slightly less than the threshold voltage. In this case, ion-channel variability could lower the threshold slightly, causing the neuron to fire. However, the variability might also prevent the neuron from firing, even when the spikes arrive at seven or eight synapses, by increasing the firing threshold.

Figure 5.7: The cortical neuron simulated with eight excitatory synapses in SPICE.

In the following experiments, we show the impact of variability with the simulation results. Figure 5.8 shows the input to the synapses, the output of the dendrite and the output of the axon hillock when a train of three spikes arrives at all eight synapses synchronously and ion-channel variability is not included. The output of the dendrite is more than the threshold voltage and the neuron fires in response to each spike in the train. Therefore, for the case in which ion-channel variability is not included and spikes arrive at all eight synapses synchronously, the probability of the neuron firing in response to all spikes in the train is one. If the spikes arrive at only five synapses, the neuron with no variability included does not fire, as shown in Figure 5.9. The reason the spikes did not arrive at three of the eight synapses might be some kind of variability in the pre-synaptic neuron, such as synaptic variability or ion-channel variability, that prevents the pre-synaptic neuron's spike from reaching each synapse, or prevents the synapse from responding.

Figure 5.8: Spikes arriving at the eight synapses, the dendrite output and the output of the axon hillock when ion-channel variability is not included.

Figure 5.9: Spikes arriving at the five synapses, the dendrite output and the output of the axon hillock when ion-channel variability is not included.

There is also a possibility that the spikes arrive at all synapses, but the spikes are synchronized at only five synapses. The output of the dendritic arbor has an amplitude less than the threshold voltage, and therefore the probability of the neuron firing in response is zero. The ion-channel variability is included in the axon hillock circuit by including Gaussian noise with µ = 450mV and the standard deviation changing from σ = 0mV to 50mV with a step size of 2mV. First we calculated the probability of the neuron firing in response to one single spike arriving at the synapses. To calculate the probability, we ran the experiment 30 times and tabulated the instances of the neuron firing; the probability is then the number of firings divided by 30. Figure 5.10 shows the results. In the blue trace, the neuron responded to the spike arriving at all eight synapses and, as mentioned in the first experiment, the probability of firing when there is no ion-channel variability included is one. By including the ion-channel variability and increasing the standard deviation, beyond some magnitude of the standard deviation the probability decreases. In this case, the ion-channel variability prevents the neuron from firing and is in fact detrimental to the spike transfer. In the red and green traces, the neuron responded to the input spike arriving at only six and five synapses out of the eight. The membrane potential of the dendritic arbor at the cell body is lower than the threshold, the neuron does not fire, and therefore the probability is zero. By increasing the standard deviation, the probability of firing increases. In this case, ion-channel variability is sufficient to force the neuron to fire with some probability greater than zero. We repeated the experiment with a train of spikes instead of a single spike.
A train of five spikes arrives at the synapses, and the neuron might fire in response to some of the spikes in the train. We calculated the probability of the neuron firing in response to all spikes in the train, shown in Figure 5.11, for three experiments. In the first experiment, with no ion-channel variability, the neuron responded to all five spikes arriving at all eight synapses. In the second experiment, the neuron did not respond to the third spike in the train, since the third spike arrived at only six of the eight synapses.
Figure 5.10: Probability of the neuron firing in response to a single spike arriving at eight (blue), six (red) and five (green) synapses vs. ion-channel variability standard deviation.
In the last experiment, the neuron did not respond to the second and fourth spikes. Figure 5.12 shows the probability of the neuron firing in response to all spikes in the train, i.e., the probability of a train of spikes propagating, when ion-channel variability is included. In the first experiment (red trace) the neuron responded to all spikes in the train; the probability of firing is around one for standard deviations below 20 mV and then decreases. In the second experiment (blue trace), the neuron did not respond to one of the five spikes (the third spike).
Figure 5.11: Experiment 1: a) a train of spikes arriving at eight synapses, b) the neuron response with no ion-channel variability included. Experiment 2: c) a train of spikes arriving at six synapses, d) a train of spikes (with one missing spike) arriving at the rest of the synapses (two synapses), e) the neuron response with no ion-channel variability included. Experiment 3: f) a train of spikes arriving at six synapses, g) a train of spikes (with two missing spikes) arriving at the rest of the synapses (two synapses), h) the neuron response with no ion-channel variability included.
The probability of firing starts from zero and increases for standard deviations below 8 mV, with a declining trend after about 20 mV. In this case, the ion-channel variability helps the neuron respond to the third spike and in fact recovers the missed spike in the train arriving at two synapses (Figure 5.11.d) by forcing the neuron to fire. The probability of firing then has a decreasing trend for standard deviations above 20 mV. In the green trace, the neuron did not respond to two of the five spikes with no ion-channel variability, but the ion-channel variability could help recover the two missed spikes.
Figure 5.12: Probability of the neuron firing in response to a train of spikes (five spikes) arriving at the synapses for experiments one (blue), two (red) and three (green) vs. ion-channel variability standard deviation.
As the results show, when ion-channel variability with a standard deviation around 20 mV is included, the probabilities of firing in all three experiments (red, blue and green traces) are approximately at their maximum. The results shown above are from SPICE simulations. We can model the experimental results with several equations and obtain approximately the same results by plotting the equations in MATLAB.
Figure 5.13: Probability of the neuron firing in response to a single spike arriving at eight (blue), six (red) and five (green) synapses vs. ion-channel variability standard deviation (MATLAB simulation results).
If we assume the threshold voltage has a Gaussian distribution with mean μ and standard deviation σ, then the probability of the neuron firing in response to a single spike is as follows:

p(V_PSP > V_th) = 0.5 - 0.5 erf((μ - V_PSP)/sqrt(2σ^2))   (5.2)

When the spikes arrive at all synapses, the dendrite potential peak is 237.7 mV. When the spikes arrive at six or five synapses, the peak is 178 mV or 147.7 mV, respectively. Figure 5.13 shows the result for p(237.7 mV > V_th), p(178 mV > V_th) and p(147.7 mV > V_th). The results are approximately similar to the SPICE simulation results in Figure 5.10.
We can calculate the probability of the neuron firing in response to a train of five spikes arriving at the synapses by multiplying the probabilities of firing for each single spike in the train. Therefore the probabilities for the first, second and third experiments in Figure 5.11 are calculated as follows:

p_1 = p(237.7 mV > V_th)^5   (5.3)

p_2 = p(237.7 mV > V_th)^4 · p(178 mV > V_th)   (5.4)

p_3 = p(237.7 mV > V_th)^3 · p(178 mV > V_th)^2   (5.5)

The result of plotting these equations in MATLAB is shown in Figure 5.14, which is visually similar to the result in Figure 5.12. Based on the results shown in this section, we can conclude that ion-channel variability can force neurons to fire under certain conditions and for some levels of variability. As shown in Figure 5.12, when a small amount of ion-channel variability is included (for example, Gaussian with a standard deviation around 20 mV), the ion-channel variability tends to recover any missed spike in the train arriving at the synapses by forcing the neuron to fire. However, too much variability is depressive.
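Equations 5.2 through 5.5 can be checked numerically. The sketch below (in Python rather than the MATLAB used in the thesis) evaluates the erf expression directly; the threshold mean and standard deviation passed in are illustrative placeholders, not fitted circuit values.

```python
import math

def p_fire(v_psp, mu, sigma):
    # Equation 5.2: probability that V_PSP exceeds a Gaussian-distributed
    # threshold with mean mu and standard deviation sigma (all in mV).
    return 0.5 - 0.5 * math.erf((mu - v_psp) / math.sqrt(2 * sigma**2))

# Dendrite potential peaks from the SPICE simulations (mV).
V_EIGHT, V_SIX = 237.7, 178.0

def train_probabilities(mu, sigma):
    # Equations 5.3-5.5: the per-spike probabilities multiply, since
    # every spike in the train must fire for the train to propagate.
    p8, p6 = p_fire(V_EIGHT, mu, sigma), p_fire(V_SIX, mu, sigma)
    return p8**5, p8**4 * p6, p8**3 * p6**2
```

Because the six-synapse peak of 178 mV yields a lower per-spike probability than the eight-synapse peak, p_1 >= p_2 >= p_3 for any choice of mean and standard deviation.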
For example, in Figure 5.12, Gaussian noise with a high standard deviation (more than 20 mV) might have an undesirable effect on the neuron's response to the spikes by preventing the neuron from firing.
Figure 5.14: Probability of the neuron firing in response to a train of spikes (five spikes) arriving at the synapses for experiments one (blue), two (red) and three (green) vs. ion-channel variability standard deviation (MATLAB simulation results).
5.3 Infinite Looping Behavior in a Neural Network
Ion-channel variability could prevent infinite loops in a neural network. The brain contains a huge number of connections between neurons through synapses. In the human brain, there are about 100 billion (10^11) neurons and 100 trillion (10^14) synapses. It is theoretically possible to place an embedded neural network into an infinite loop. That causes the network to work improperly and might be a cause of obsessive-compulsive disorder (OCD). Ion-channel variability in the neural network could disrupt an infinite loop. If there is any unproductive loop in the network, it could eventually be interrupted after a short time because of the variability. This thesis shows how ion-channel variability could block any possible loop. If several neurons in the network are connected in such a way that the spike arriving at a synapse depends on the neuron's response to the previous spike arriving at a synapse connected to the same neuron (positive feedback), there is a possibility that the neurons fire in a loop forever. Ion-channel variability, however, can halt the looping. In order to demonstrate this impact of variability at the circuit level, we connected the output of the neuron to its input to make a loop, as shown in Figure 5.15. For simplicity we used only one neuron, although realistic neural circuits have many more neurons in each loop. The impact of this variability could also be demonstrated in any loop with more than one neuron. The neuron has three excitatory synapses.
The output of the dendrite is 200 mV, more than the threshold voltage (170 mV). Without ion-channel variability, the circuit might generate spikes forever. We included ion-channel variability by adding Gaussian noise with μ = 500 mV and three different standard deviations (σ = 100 mV, σ = 200 mV and σ = 300 mV) and calculated the probability of firing versus the number of loops. As shown in Figure 5.16, the probability drops rapidly and the neuron stops spiking after several spikes. Increasing the standard deviation makes the probability drop faster and reduces the number of spikes before the neuron stops firing. We repeated the experiment for different total PSP amplitudes. We included Gaussian noise with μ = 500 mV and σ = 150 mV and calculated the probability for three different total PSP amplitudes. When the total PSP is close to the threshold voltage, the probability drops faster. For a strong PSP, however, it takes several spikes before the ion-channel variability prevents the neuron from spiking.
Figure 5.15: The cortical neuron simulated in SPICE with three excitatory synapses.
Figure 5.16: Probability of firing vs. number of loops for three standard deviations.
Figure 5.17: Probability of firing vs. number of loops for three PSP amplitudes.
Chapter 6: Conclusion and Future Research
In this thesis, we considered ion-channel variability and neurotransmitter-release variability only in the axon hillock and the synapse. One direction for future research is to consider other sources of variability in other parts of the neuron, such as ion-channel variability in the dendrite. We designed and simulated all circuits using carbon nanotube SPICE models. It would be worthwhile to build the circuits and test all experiments in hardware. If nanotube transistors are not available, the circuits could be designed in CMOS and fabricated on a CMOS chip, and all experiments in this thesis could be performed on the CMOS chip.
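The geometric decay seen in Figure 5.16 follows directly from the threshold picture: if one pass around the loop succeeds with probability p < 1, then surviving n passes has probability p^n. The sketch below is a simplification of the circuit-level noise injection; it assumes zero-mean Gaussian noise around a fixed threshold, with the 200 mV PSP and 170 mV threshold taken from the text and the noise model itself an assumption.

```python
import math

def p_one_pass(v_psp=200.0, v_th=170.0, sigma=100.0):
    # Probability that the PSP plus zero-mean Gaussian channel noise
    # still crosses the threshold on a single pass around the loop.
    return 0.5 - 0.5 * math.erf((v_th - v_psp) / math.sqrt(2 * sigma**2))

def p_still_firing(n, **kwargs):
    # The loop survives n passes only if every pass fires, so the
    # survival probability decays geometrically with the loop count n.
    return p_one_pass(**kwargs) ** n
```

Raising the standard deviation lowers the per-pass probability, so the loop dies out after fewer spikes, matching the trend of Figure 5.16.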
As we mentioned in the introduction, it is important to understand the interaction of chaotic dynamics and noisiness and the resultant combined variability. A more detailed treatment of the chaotic behavior of the neuron is beyond this thesis; however, a good future research topic is to study the chaotic dynamics of neural networks in more detail and to implement them at the circuit level. Some researchers have studied the benefits of variability or noise at the network level: variability contributes to the robustness of a network, and noise facilitates learning and adaptation to the changing demands of a dynamic network. In this thesis, we focused on a single neuron. A further research plan could be to study the impact of the variability in a network.
In this thesis, we did not complete a noise generator circuit. A noise generator circuit needs to be designed using the output of the chaotic signal generator and embedded into the neuron circuit.
Appendix A: Stochastic Resonance
At the neuronal level, noise as a source of variability can enhance sensitivity to weak (subthreshold) signals in threshold-detecting systems, a phenomenon known as stochastic resonance. Stochastic resonance occurs if the signal-to-noise ratio of a nonlinear system increases for a certain level of noise. It often occurs in systems with threshold behavior when the input signal to the system is slightly below threshold (sub-threshold). Sub-threshold signals are not strong enough to cross the threshold, but a low noise intensity helps the input signal pass through, effectively increasing the signal-to-noise ratio. However, if a large noise intensity is injected, the output is dominated by the noise, which reduces the signal-to-noise ratio. Stochastic resonance in neural networks is an important and growing area of stochastic neural networks. There are a large number of publications about stochastic resonance in neural networks.
About 500 papers on stochastic resonance contain a reference in the title, abstract or keywords to the words neuron or neural, which illustrates the significant interest in studying a positive role for randomness in neural function [64]. Stochastic resonance has been found and studied in the sensory systems of animals by several researchers. Since its first discovery in cat visual neurons [59], stochastic resonance effects have been demonstrated in a range of sensory systems. Evidence for stochastic resonance was first found in nerve signals from mechanoreceptors located on the tail fan of crayfish [30], and it was subsequently found in shark multimodal sensory cells [11], cricket [57], rat [24] and paddlefish [73]. There is evidence that human sensory systems make use of stochastic resonance as well, for example the visual [74] and auditory [87] systems. The behavioral impact of stochastic resonance has been directly demonstrated and manipulated in human balance control [69]. Stochastic resonance was demonstrated in a mathematical model of a single neuron using a dynamical-systems approach [12]. Kosko observed stochastic resonance in the neural tissue of sensory systems [54]. He studied stochastic resonance in a noisy threshold model and also in continuous and spiking neuron models with Levy noise [55], [67]. In this appendix, we demonstrate stochastic resonance at the circuit level. As discussed in Section IV.3, the ion-channel variability included in the axon hillock can help weak PSPs, which are slightly less than the threshold, to fire the neuron. This variability enhances the sensitivity of the axon hillock to weak signals. However, if the ion-channel variability is too strong, the neuron fires mainly because of the variability, not the PSPs. The sensitivity of the output response decreases with increasing ion-channel variability; too much variability deteriorates the sensitivity of signal detection. We performed an experiment on the ion-channel variability.
As shown in Figure A.1, a PSP signal with three different amplitudes (40 mV, 175 mV, 300 mV) is applied to the axon hillock. When there is no ion-channel variability, the neuron fires only for PSPs above the threshold voltage (180 mV). When ion-channel variability, Gaussian with μ = 450 mV and σ = 180 mV, is included in the axon, the neuron might fire for PSPs slightly less than the threshold voltage. When the variability is strong, Gaussian with μ = 450 mV and σ = 350 mV, the neuron might fire even for PSPs that are far too weak, much less than the threshold.
Figure A.1: Input and output of the axon hillock for input with no variability, low variability and high variability.
In this case, the output spikes are dominated by the ion-channel variability and therefore the ion-channel variability is not desirable.
Another common approach to characterizing stochastic resonance is the signal-to-noise ratio (SNR), the ratio of the signal power to the noise power. If the neuron fires when the PSP is more than the threshold or slightly less than the threshold, we consider that spike a valid spike. We add the power of all valid spikes and call the sum the signal power. The remaining spikes in the output of the axon hillock are due to noise, and therefore we take the summation of their power as the noise power. Based on this definition, we applied the same PSP as shown in Figure A.1. The signal power is the summation of all spikes that happen when the PSP is 175 mV or 300 mV, and the noise power is the summation of all spikes for a PSP of 40 mV or 0 mV. Figure A.2 shows the signal and noise power versus the ion-channel variability standard deviation in the axon hillock. As shown in this figure, when the standard deviation is less than 60 mV, the noise power is negligible but the signal power increases; therefore the SNR, shown in Figure A.3, increases. For standard deviations above 60 mV, the noise power starts increasing while the signal power is almost constant, and the SNR decreases.
As we expected, the SNR increases with increasing ion-channel variability until it reaches a maximum. After the maximum the SNR goes down, meaning that more noise is no longer desirable. We can also calculate the SNR with a mathematical analysis. We have a Gaussian random variable v_n. The probability of firing when v_n is applied to the axon is equal to the probability of v_n plus the PSP crossing the threshold:

p(v_n > V_th - V_PSP) = 0.5 - 0.5 erf((V_th - V_PSP - μ)/sqrt(2σ^2))   (A.1)

Each PSP sample that is more than the threshold, or slightly less than the threshold, adds p(v_n > V_th - V_PSP)·P_spike to the signal power. Otherwise it adds the same power to the noise power. P_spike is a fixed number for the power of one single spike. Based on these equations the SNR is calculated, and the result is shown as the red dashed line in Figure A.3.
Figure A.2: Signal (blue) and noise (red) power vs. ion-channel noise standard deviation in the axon hillock.
PSPs applied to the axon might be variable because of synaptic release variability. We want to find out the effect of synaptic variability on stochastic resonance. We included synaptic release variability with different strengths and calculated the SNR. Synaptic variability could be correlated or uncorrelated with ion-channel variability. Figure A.4 shows the signal-to-noise ratio when synaptic variability is applied to the PSP with three different correlation factors (red 0.05, blue 0.1 and green 0.2). As shown in this figure, the peak of the green graph is slightly higher than the peaks of the blue and red graphs, and the peak occurs at a lower standard deviation of the ion-channel variability. The synaptic variability might help the PSP signal cross the threshold so that the neuron fires, or it might have a negative effect.
Figure A.3: Signal-to-noise ratio vs. ion-channel noise standard deviation, experiment (blue solid line) and mathematical calculation (red dashed line).
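The SNR bookkeeping described around Equation A.1 can be sketched as follows. For simplicity this sketch assumes zero-mean Gaussian noise, treats the 175 mV and 300 mV PSPs as "valid" and the 40 mV PSP as noise-only, per the definition above; the zero mean, the 175 mV validity cutoff and the unit spike power are simplifying assumptions, not the circuit's parameters.

```python
import math

def p_cross(v_psp, v_th, mu, sigma):
    # Equation A.1: probability that noise v_n ~ N(mu, sigma) plus the
    # PSP crosses the threshold, i.e. p(v_n > V_th - V_PSP).
    return 0.5 - 0.5 * math.erf((v_th - v_psp - mu) / math.sqrt(2 * sigma**2))

def snr(psps, v_th=180.0, mu=0.0, sigma=100.0, p_spike=1.0):
    # Valid (at- or near-threshold) PSPs contribute expected spike power
    # to the signal; clearly sub-threshold PSPs contribute to the noise.
    signal = sum(p_cross(v, v_th, mu, sigma) * p_spike
                 for v in psps if v >= 175.0)
    noise = sum(p_cross(v, v_th, mu, sigma) * p_spike
                for v in psps if v < 175.0)
    return signal / noise if noise > 0 else float("inf")
```

With the PSP amplitudes of Figure A.1, the SNR computed this way falls as the noise grows large, reproducing the right-hand side of Figure A.3.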
However, when the synaptic variability is correlated with the ion-channel variability, the two together help the PSP signal fire the neuron.
Figure A.4: Signal-to-noise ratio vs. ion-channel variability standard deviation with correlated synaptic variability for three correlation factors (red 0.05, blue 0.1 and green 0.2).
Appendix B: Technologies beyond CMOS: Memristor
Since the 1970s, people have been predicting the end of CMOS because of several limitations. Although massive research has overcome all previous barriers, many experts are now claiming that the industry is reaching limits that won't be overcome. Scaling in CMOS is limited by three main factors: the minimum dimensions that can be fabricated, diminishing returns in switching performance, and off-state leakage [71]. In order to implement neuromorphic circuits, technologies beyond CMOS are required. In the human brain, there are 100 billion (10^11) neurons and 100 trillion (10^14) synapses; size and power are therefore two key factors that make CMOS less desirable than other technologies. One of the most popular alternative technologies is the carbon nanotube transistor. This concept is very appealing because it is still a transistor and can make use of all the architectural knowledge developed for CMOS. Fortunately, SPICE models of this type of transistor were created by a group at Stanford [27] and are available online [65]. In our circuits, all transistors are carbon nanotube transistors, and we used the original SPICE models from Stanford. Another technology that might be useful in designing neuromorphic circuits is the memristive nanodevice. In order to use memristors in neuromorphic circuits, we need a SPICE model of the memristor. In this appendix, we first present a SPICE model for a memristor using dependent voltage sources. The model is validated by simulating simple circuits and comparing with the expected results.
We then show the simulation results for two circuits: a low-pass filter in which a memristor is in series with a resistor, and an integrator circuit with an operational amplifier. Finally, we discuss how memristors can be used in neuromorphic circuits.
B.1 Background
In 1971, Leon Chua presented the missing circuit element that he called a memristor (memory + resistor) [21]. He noted that among the six possible pairings of the four fundamental circuit variables i, v, q and φ, five have well-known relationships. Two of them are the definition of current and Faraday's law, and the rest are given by the three circuit elements: the resistor (R = dv/di), the inductor (L = dφ/di) and the capacitor (C = dq/dv). Based on this symmetry, he claimed there should be a fourth fundamental element, called a memristor, relating charge and magnetic flux (M = dφ/dq). Although he presented a laboratory realization of the memristor in the form of active circuits, before May 2008 a memristor in the form of a physical device had not been discovered. In May 2008, scientists at HP Labs, led by R. Stanley Williams, announced the invention of a physical memristor device [83], [80]. They also presented a physical model of the memristor, called the coupled variable-resistor model, which works like a perfect memristor under certain conditions. In this thesis, we present a simplified SPICE model for memristors based on the coupled variable-resistor model. While a proprietary SPICE model for the HP technology exists, no model has been available for general use. In order to design and simulate circuits with memristor elements, a SPICE model is required.
Figure B.1: The coupled variable-resistor model.
Applications of the memristor include implementing neuromorphic circuits using memristive nanodevices [22], [23], [75] and ultra-dense nonvolatile memories.
B.2 SPICE Model of a Memristor
In the SPICE model of a memristor presented here, a thin semiconductor film has two regions: one with a high dopant concentration that behaves like a low resistance, called R_ON, and one with a low dopant concentration and a higher resistance, called R_OFF. The film is sandwiched between two metal contacts (Figure B.1). When we apply a voltage v(t) to the device, the dopants drift from low to high or from high to low concentration, depending on the voltage polarity. For simplicity, we assume linear ionic drift in a uniform electric field with average ion mobility μ_V. In this case, the V-I characteristic of the device is

v(t) = (R_ON w(t)/D + R_OFF (1 - w(t)/D)) i(t)   (B.1)

where w is the state variable, the length of the doped region in the thin film (Figure B.1). dw(t)/dt is proportional to the current, and therefore w is a function of charge:

w(t) = (μ_V R_ON / D) q(t)   (B.2)

Based on Equations B.1 and B.2, the memristor in fact acts like a variable resistor; its resistance M(q) is a function of charge:

M(q) = R_OFF - (R_OFF - R_ON)(μ_V R_ON / D^2) q(t)   (B.3)

In order to model a variable resistor in SPICE, we first model a regular resistor with a dependent voltage source, without using a resistor element. Figure B.2 shows how to model a 100-ohm resistor in SPICE. To sense the current in the circuit, we use an independent voltage source V_sense of 0.0 volts, which therefore has no effect on the output voltage. The other source, V_r, is a dependent voltage source that generates the voltage across the resistor from the sensed current times the desired resistance (V_r = I · 100), where I is measured at V_sense. Now, to create a model for the memristor, which is a variable resistor, all we need to do is change the value of the voltage source V_r to a function of q, based on Equation B.3. As shown in Figure B.3, a capacitor C_sense is added to sense the charge in the circuit.
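The behavior captured by Equations B.1 through B.3 can also be reproduced outside SPICE with a simple forward-Euler loop. The sketch below uses the device values quoted later for Figure B.5 (R_ON = 1 Ω, R_OFF = 160 Ω, D = 10 nm, μ_V = 10^-14 m²/(s·V)); the drive waveform and the Euler step size are arbitrary choices for illustration.

```python
import math

R_ON, R_OFF, D, MU_V = 1.0, 160.0, 10e-9, 1e-14

def memristance(q):
    # Equation B.3: resistance as a function of accumulated charge.
    return R_OFF - (R_OFF - R_ON) * MU_V * R_ON / D**2 * q

def simulate(v_of_t, t_end, dt=1e-4):
    # Forward-Euler integration of dq/dt = i = v / M(q), i.e. Equation
    # B.1 with the state variable w eliminated through Equation B.2.
    q, t, history = 0.0, 0.0, []
    while t < t_end:
        i = v_of_t(t) / memristance(q)
        q += i * dt
        history.append((t, q, memristance(q)))
        t += dt
    return history

# One positive half-cycle of a 1 V, 1 Hz sine drives charge into the
# device and lowers its resistance.
half_cycle = simulate(lambda t: math.sin(2 * math.pi * t), 0.5)
```

For this drive the accumulated charge stays well below the regime bound D^2/(μ_V R_ON) of Equation B.4, so the model remains valid throughout the run.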
Note that, to cancel the effect of the capacitor voltage on the output voltage, we subtract it from the V_r voltage value (V_r = I·M - V_c), where V_c is the voltage across C_sense.
Figure B.2: 100-ohm resistor subcircuit.
Figure B.3: Memristor subcircuit.
We validated the model by simulating it for different input voltages, such as DC, sinusoidal and square-wave signals, and we show a curve here compared with the expected results. Figure B.4 shows the current of the memristor when a sinusoidal input voltage is applied. The V-I characteristic of the memristor is shown in Figure B.5 (R_ON = 1 Ω, R_OFF = 160 Ω, D = 10 nm, μ_V = 10^-14 m²/(s·V)). Note that this model is valid as long as the system remains in the memristor regime, which in this case is where the state variable w is bounded between zero and D.
Figure B.4: Voltage and current of the memristor.
Based on Equation B.2, the condition for the memristor regime is as follows:

0 ≤ q(t) ≤ D^2/(μ_V R_ON)   (B.4)

B.3 Low-Pass Filter Using a Memristor
A low-pass filter using a memristor in series with a resistor is proposed (Figure B.6). In order to obtain the gain-magnitude frequency response, we apply a sine wave and measure the magnitude of the fundamental frequency in the output. Since the circuit is non-linear, frequency components other than the fundamental are generated; however, the fundamental has the highest magnitude.
Figure B.5: I-V curve.
Figure B.7 shows the frequency response when 0.3 sin(ωt) is applied. The result is similar to that of a simple RL filter (inductor in series with resistor); however, when the input signal amplitude changes, the response changes, which means the memristor is not acting like a fixed inductor: the equivalent inductance depends on the level of the voltage applied to the memristor. As shown in Figure B.7, at high frequency the memristor acts like a resistance and the circuit becomes a voltage divider.
Figure B.6: Low-pass filter with memristor.
Figure B.7: Frequency response of the low-pass filter.
B.4 Integrator Using a Memristor
Figure B.8 shows an integrator using a memristor and an operational amplifier. When a square wave is applied to the input, the output is a ramp signal, which is in fact the integral of the input voltage (Figure B.9).
Figure B.8: Integrator with memristor.
As compared to an integrator with an inductor, the memristor acts like an inductor under certain conditions. If the period or the amplitude of the input square wave increases, the output is no longer a ramp (Figure B.10). We now calculate the conditions under which the memristor can be replaced by an inductor, as well as the equivalent inductance. Based on Equation B.3 and the definition of the memristor (dφ = M(q) dq), we obtain the flux as a function of the charge:

φ(t) = R_OFF q(t) - (R_OFF - R_ON)(μ_V R_ON/(2D^2)) q^2(t)   (B.5)

We define r = (R_OFF - R_ON) μ_V R_ON / D^2. We assume the input is v_0 during the interval [0, T/2], where T is the period of the signal. Using Faraday's law (v(t) = dφ/dt), we find the flux as a function of time and then substitute it into Equation B.5 to obtain the charge as a second-order equation in time:
Figure B.9: Input and output waveforms in the ramp-shape regime.

(r/2) q^2(t) - R_OFF q(t) + v_0 t = 0   (B.6)

q(t) = (R_OFF - sqrt(R_OFF^2 - 2 v_0 r t))/r   (B.7)

(We retain only the negative root of "±" to satisfy the memristor-regime condition in Equation B.4.) Using i = dq/dt, we have
Figure B.10: Input and output waveforms in the non-linear regime.

i(t) = v_0 / sqrt(R_OFF^2 - 2 v_0 r t)   (B.8)

i(t) = v_0/R_OFF + (v_0^2 r/R_OFF^3) t + (12 v_0^3 r^2/R_OFF^5) t^2 + ...   (B.9)

We can approximate the current using the first two terms of the Taylor series if the third term is much smaller than the second term.
This is the condition under which the memristor is equivalent to an inductor:

v_0 T << R_OFF^2/(6r)   (B.10)

As compared to the output of an integrator with an inductor, the equivalent inductance is calculated as

L_eqv = R_OFF^3 D^2 / (R_ON (R_OFF - R_ON) μ_V v_0)   (B.11)

Therefore we can conclude that the memristor acts like an inductor under certain conditions.
B.5 Memristor in Neuromorphic Circuits
In order to use a memristor in any circuit, we have to make sure the state variable w, the length of the doped region in the thin film (Figure B.1), is always between 0 and D. Otherwise the device is no longer in the memristor regime and the SPICE model cannot reproduce its behavior. For example, if we apply the voltage waveform shown in Figure B.11 to the memristor, the state variable w rises as shown in Figure B.12. As shown in the figure, whenever a positive voltage is applied to the memristor, w rises; if a negative voltage is applied, w falls. At time 5 ms, w gets close to D. If we applied a positive voltage for a longer time, w would reach D and the circuit would no longer work correctly. This is one limitation of using memristors in neuromorphic circuits, or in any other circuits. Another difficulty with using the memristor in circuits is that the memristor is a two-terminal device. We designed a circuit that uses the memristor as a three-terminal device. As shown in Figure B.13, we used a memristor in series with a resistor, with the other side of the resistor connected to a negative voltage Vss in order to reduce w. Figure B.14 shows the input and output of the circuit, and Figure B.15 shows w/D versus time. As shown in the figure, the circuit responds to the second pulse more strongly than to the first pulse. The circuit therefore has a memory behavior, and if we keep applying pulses to the circuit, its responses become stronger.
Figure B.11: Voltage and current waveforms of the memristor.
Figure B.12: State variable w vs. time.
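The derivation in Section B.4 can be checked numerically: the closed-form charge of Equation B.7 should satisfy the quadratic of Equation B.6, and for small t the exact current of Equation B.8 should match the equivalent-inductor approximation implied by Equation B.11. The sketch below reuses the thesis's example device values (R_ON = 1 Ω, R_OFF = 160 Ω, D = 10 nm, μ_V = 10^-14 m²/(s·V)); v_0 = 1 V and t = 1 ms are arbitrary test points.

```python
import math

R_ON, R_OFF, D, MU_V = 1.0, 160.0, 10e-9, 1e-14
r = (R_OFF - R_ON) * MU_V * R_ON / D**2   # as defined after Equation B.5

def q_of_t(t, v0):
    # Equation B.7 (negative root, to stay in the memristor regime).
    return (R_OFF - math.sqrt(R_OFF**2 - 2 * v0 * r * t)) / r

def i_of_t(t, v0):
    # Equation B.8: exact current during the constant-input half-period.
    return v0 / math.sqrt(R_OFF**2 - 2 * v0 * r * t)

def l_eqv(v0):
    # Equation B.11: equivalent inductance in the ramp regime.
    return R_OFF**3 * D**2 / (R_ON * (R_OFF - R_ON) * MU_V * v0)
```

At v_0 = 1 V the equivalent inductance comes out around 258 H, and the inductive part of the current, i(t) - v_0/R_OFF, agrees with v_0·t/L_eqv to within about a tenth of a percent at t = 1 ms.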
This circuit might be a good example to model long-term potentiation (LTP) in neuromorphic circuits. If the circuit does not receive any pulse for a long time (shown in Figure B.16), the negative voltage Vss forces w to decrease, and the circuit does not respond more strongly, or might even respond more weakly than before, which is a model for long-term depression (LTD).
Figure B.13: Three-terminal memristor circuit.
Figure B.14: Input and output waveforms in the three-terminal memristor circuit.
Figure B.15: State variable w in the three-terminal memristor circuit.
Figure B.16: Input and output waveforms in the three-terminal memristor circuit.
Figure B.17: State variable w in the three-terminal memristor circuit.
Bibliography
[1] W. Bair, "Spike timing in the mammalian visual system," Curr Opin Neurobiol, 9, pp. 447–453, 1999.
[2] H. B. Barlow, W. R. Levick, and M. Yoon, "Responses to single quanta of light in retinal ganglion cells of the cat," Vision Res., 11, pp. 87–101, 1971.
[3] F. Bazsó, L. Zalányi, and G. Csárdi, "Channel noise in Hodgkin-Huxley model neurons," Physics Letters A, vol. 311, issue 1, pp. 13–20, 2003.
[4] R. J. van Beers, "The sources of variability in saccadic eye movements," J. Neurosci., 27, pp. 8757–8770, 2007.
[5] R. J. van Beers, P. Haggard, and D. M. Wolpert, "The role of execution noise in movement variability," J. Neurophysiol, 91, pp. 1050–1063, 2004.
[6] H. C. Berg, and E. M. Purcell, "Physics of chemoreception," Biophys. J., 20, pp. 193–219, 1977.
[7] S. M. Bezrukov, and I. Vodyanoy, "Noise-induced enhancement of signal transduction across voltage-dependent ion channels," Nature, 378, pp. 362–364, 1995.
[8] W. Bialek, "Physical limits to sensation and perception," Annu. Rev. Biophys. Biophys. Chem., 16, pp. 455–478, 1987.
[9] W. Bialek, and S. Setayeshgar, "Physical limits to biochemical signaling," Proc. Natl Acad. Sci. USA, 102, pp. 10040–10045, 2005.
[10] K. Boahen, "Neuromorphic microchips," Scientific American, pp.
56–63, May 2005.
[11] H. A. Braun, H. Wissing, K. Schafer, and M. C. Hirsch, "Oscillation and noise determine signal transduction in shark multimodal sensory cells," Nature 367, pp. 270–273, 1994.
[12] A. Bulsara, E. W. Jacobs, T. Zhou, F. Moss, and L. Kiss, "Stochastic resonance in a single neuron model: theory and analog simulation," Journal of Theoretical Biology 152 (4), pp. 531–555, October 1991.
[13] D. A. Butts, C. Weng, J. Jin, C. I. Yeh, N. A. Lesica, J. M. Alonso, G. B. Stanley, "Temporal precision in the neural code and the timescales of natural vision," Nature 449, pp. 92–95, 2007.
[14] W. H. Calvin, and C. F. Stevens, "Synaptic noise and other sources of randomness in motoneuron interspike intervals," J. Neurophysiol, 31, pp. 574–587, 1968.
[15] W. H. Calvin, and C. F. Stevens, "Synaptic noise as a source of variability in the interval between action potentials," Science, 155, pp. 842–844, 1967.
[16] N. A. Campbell, J. B. Reece, "Biology," 7th Edition, Benjamin-Cummings Pub Co, 2007.
[17] P. V. Carelli, M. B. Reyes, J. C. Sartorelli and R. D. Pinto, "Whole Cell Stochastic Model Reproduces the Irregularities Found in the Membrane Potential of Bursting Neurons," J Neurophysiol 94, pp. 1169–1179, 2005.
[18] G. A. Cecchi, M. Sigman, J. M. Alonso, L. Martinez, D. R. Chialvo, and M. O. Magnasco, "Noise in neurons is message dependent," PNAS, Vol. 97, No. 10, pp. 5557–5561, 2000.
[19] H. Chen, S. Saïghi, L. Buhry, and S. Renaud, "Real-time Simulation of Biologically-realistic Stochastic Neurons in VLSI," IEEE Transactions on Neural Networks, 21, pp. 1511–1517, 2010.
[20] C. C. Chow, and J. A. White, "Spontaneous action potentials due to channel fluctuations," Biophys. J., 71, pp. 3012–3021, 1996.
[21] L. Chua, "Memristor-the missing circuit element," Circuits Theory, IEEE Transactions on [legacy, pre-1988], vol. 18, no. 5, pp. 507–519, 1971.
[22] L. Chua, "Nonlinear circuit foundations for nanodevices, I - The four element torus," Proc IEEE, vol. 91, no.
11, pp. 1830{1859, 2003. [23] L. O. Chua and S. M. Kang, \Memristive devices and systems," Proc IEEE, vol. 64, no. 2, pp. 209-223, 1976. [24] J. J. Collins, T. T. Imho, and P. Grigg, \Noise-enhanced information transmis- sion in rat SA1 cutaneous mechanoreceptors via aperiodic stochastic resonance," J. Neurophysiol, 76, pp. 642{645, 1996. [25] R. Conti, Y. Tan, and I. Llano, \Action potential-evoked and ryanodine-sensitive spontaneous Ca2+ transients at the presynaptic terminal of a developing CNS in- hibitory synapse," J. Neurosci, 24, pp. 6946{6957, 2004. [26] M. Delgado-Restituto, F. Medeiro, and A. Rodriguez-Vazquez, \Nonlinear switched- current CMOS IC for random signal generation," Electron. Lett., vol. 29, no. 25, pp. 2190{2191, 1993. 107 [27] J. Deng and H. S. P. Wong, \A Circuit-Compatible SPICE model for Enhancement Mode Carbon Nanotube Field Eect Transistors," International Conference on In Simulation of Semiconductor Processes and Devices, pp. 166{169, 2006. [28] A. Destexhe, M. Rudolph, J. M. Fellous, and T. J. Sejnowski, \Fluctuating synaptic conductances recreate in vivo-like activity in neocortical neurons," Neuroscience, 107, pp. 13-24, 2001. [29] K. Diba, H. A. Lester, and C. Koch, \Intrinsic noise in cultured hippocampal neurons: Experiment and modeling," J Neuroscience, vol. 24, pp. 9723{9733, 2004. [30] J. K. Douglass, L. Wilkens, E. Pantazelou, and F. Moss, \Noise enhancement of information transfer in craysh mechanoreceptors by stochastic resonance," Nature 365 (6444): pp. 337{340, September, 1993. [31] G. B. Ermentrout, R. F. Galn, and N. N. Urban, \Reliability, synchrony and noise," Trends Neurosci, 31(8), pp. 428{434, 2008. [32] A. A. Faisal, and S. B. Laughlin, \Stochastic simulations on the reliability of action potential propagation in thin axons," PLoS Comput. Biol., vol. 3, pp. 0783{0795, 2007. [33] A. A. Faisal, L. P. J. Selen, and D. M. Wolpert, \Noise in the nervous system," Nature Reviews Neuroscience, vol. 9, pp. 292{303, 2008. 
[34] A. A. Faisal, J. A. White, and S. B. Laughlin, \Ion-channel noise places limits on the miniaturization of the brain's wiring," Current Biology, vol. 5, pp. 1143{1149, 2005. [35] E. Farquhar, and P. Hasler, \Bio-Physically Inspired Silicon Neuron," IEEE Trans. on Circuits and Systems, vol. 52, No. 3, pp. 477{488, March 2005. [36] P. Fatt, and B. Katz, \Some observations on biological noise," Nature 166, pp. 597{ 598, 1950. [37] P. Fatt, and B. Katz, \Spontaneous subthreshold activity at motor nerve endings," J Physiol (Lond) 117, pp. 109{128, 1952. [38] P. Faure, D. Kaplan, and H. Korn, \Synaptic ecacy and the transmission of complex ring patterns between neurons," J. Neurophysiol, 84, pp. 3010{3025, 2000. [39] J. M. Fellous, M. Rudolph, A. Destexhe, and T. J. Sejnowski, \Synaptic background noise controls the input-output characteristics of single cells in an in vitro model of in vivo activity," Neuroscience, 122, pp. 811-829, 2003. [40] N. Fourcaud, and N. Brunel, \Dynamics of the ring probability of noisy integrate- and-re neurons," Neural Computation, 14, pp. 2057{2110, 2002. 108 [41] R. F. Galan, G. B. Ermentrout, and N. N. Urban, \Optimal time scale for spike-time reliability: theory, simulations and experiments," J. Neurophysiol, 99, pp. 277-283, 2008. [42] G. L. Gerstein, and B. Mandelbrot, \Random walk models for the spike activity of a single neuron," Biophys. J., 4, pp. 41{68, 1964. [43] W. Gerstner, \Population dynamics of spiking neurons: Fast transients, asyn- chronous states, and locking." Neural Computation, 12, pp. 43{89, 2000. [44] B. Hille, \Ionic Channels of Excitable Membranes," 2nd ed. Sinauer Associates, Sun- derland, MA., 1992. [45] K.M. Hynna, and K. Boahen, \Neuronal ion-channel dynamics in silicona," IEEE International Symposium on Circuits and Systems, pp. 21{24, May 2006. [46] F. Jaramillo, and K. Wiesenfeld, \Mechanoelectrical transduction assisted by Brow- nian motion: a role for noise in the auditory system." 
Nature Neuroscience, vol. 1, n. 5, pp. 384{388, 1998. [47] J. Joshi, C. Hsu, A. C. Parker, and P. Deshmukh, \A carbon nanotube cortical neuron with excitatory and inhibitory dendritic computations," in IEEE/NIH 2009 LIfe Science Systems and Applications Workshop (LiSSA 2009), 2009. [48] J. Joshi, A. C. Parker, and C-C Hsu, \A carbon nanotube cortical neuron with spike- timing-dependent plasticity," Conference proceedings :Annual International Confer- ence of the IEEE Engineering in Medicine and Biology Society, EMBC, pp. 1651{ 1654, 2009. [49] P. Kara, P. Reinagel, and R. C. Reid, \Low response variability in simultaneously recorded retinal, thalamic, and cortical neurons," Neuron 27, pp. 635-646, 2000. [50] C. C. King, \Fractal and Chaotic Dynamics in Nervous Systems," Progress in Neu- robiology, 36, pp. 279{308, 1991. [51] I. C. Kleppe, and H. P. C. Robinson, \Correlation entropy of synaptic input-output dynamics," Phys. Rev. E Stat. Nonlin. Soft Matter Phys, 74, 041909, 2006. [52] B. W. Knight, \Dynamics of encoding in a population of neurons." J. Gen. Physiol., 59, pp. 734{766, 1972. [53] C. Koch, \Biophysics of Computation," Oxford University Press, New York, 1999. [54] B. Kosko, \Noise" Viking Press. ISBN 0-670-03495-9., 2006. [55] B. Kosko, and S. Mitaim, \Stochastic resonance in noisy threshold neurons," Neural Networks, pp 755{761, 2003. 109 [56] S. B. A. Laughlin, C. John, D. C. OCarroll, and R. R. de Ruyter van Steveninck, \Information Theory and the Brain," eds Baddeley, R. H. R. and Foldiak, R. , pp. 46{61 ,Cambridge Univ. Press, 2000. [57] J. E. Levin, and J. P. Miller, \Broadband neural encoding in the cricket cercal sensory system enhanced by stochastic resonance," Nature, 380, pp. 165{168, 1996. [58] B. Liu, and J.F. Frenzel, \A CMOS neuron for VLSI implementation of pulsed neural networks," Proc. 28th Ann. Conf. Ind. Electron. IECON02, Sevilla, Spain, pp. 3182{ 3185, Nov. 2002. [59] A. Longtin, A. Bulsara, and F. 
Moss, \Time-interval sequences in bistable systems and the noise-induced transmission of information by sensory neurons," Phys. Rev. Lett. 67, pp. 656{659, 1991. [60] X. Lou, V. Scheuss, and R. Schneggenburger, \Allosteric modulation of the presy- naptic Ca2+ sensor for vesicle fusion," Nature, 435, pp. 497-501, 2005. [61] Z. F. Mainen, and T. J. Sejnowski, \Reliability of spike timing in neocortical neu- rons." Science 268, 1503{1506 1995. [62] A. Manwani, \Detecting and Estimating Signals over Noisy and Unreliable Synapses: Information-Theoretic Analysis," Neural Comp 13, pp. 1{33, 2001. [63] A. Maye, C. Hsieh, G. Sugihara, and B. Brembs, \Order in spontaneous behavior," PLoS One, May 16. 2007. [64] M. D. McDonnell, and D. Abbott, \What is Stochastic Resonance? Denitions, misconceptions, debates, and its relevance to biology," PLOS Computational Biology 5 (5): e1000348, 2009. [65] http://nano.stanford.edu/models.php [66] A. C. Parker, J. Joshi, C-C Hsu, and N. A. D. Singh, \A Carbon Nanotube Imple- mentation of Temporal and Spatial Dendritic Computations," 51st IEEE Midwest Symposium on Circuits and Systems, pp. 818{821, Aug. 2008. [67] A. Patel, and B. Kosko, \Stochastic Resonance in Continuous and Spiking Neuron Models With Levy Noise," IEEE Transactions on Neural Networks, pp. 1993{2008, 2008. [68] C. Pecher, \La uctuation dexcitabilite de la bre nerveuse," Arch Intern Physiol 49, pp. 129{152, 1939. [69] A. A. Priplata, J. B. Niemi, J. D. Harry, L. A. Lipsitz, and J. J. Collins, \Vibrating insoles and balance control in elderly people," Lancet 362, pp. 1123{1124, 2003. 110 [70] D. Purves, G. A. Augustine, D. Fitzpatrick, W. Hall W, A. S. LaMantia, J.O. Mc- Namaraand and L. White, \Neuroscience." 4th edition. Sunderland, MA, Sinauer Associates. 2008. [71] D. Rairigh, \Limits of CMOS technology scaling and technologies beyond-CMOS," IEEE, 2005. [72] J. T. Rubinstein, \Threshold uctuations in an N sodium channel model of the node of Ranvier," . Biophys. J. 
68, pp. 779{785, 1995. [73] D. F. Russell, L. A. Wilkens, and F. Moss, \Use of behavioural stochastic resonance by paddle sh for feeding," Nature 402 (6759), pp. 291{294, November 1999. [74] E. Simonotto, M. Riani, C. Seife, M. Roberts, J. Twitty, and F. Moss, \Visual Perception of Stochastic Resonance," Physical Review Letters 78 (6), pp. 1186, 1997. [75] G. S. Sinder, \Spike-timing-dependent learning in memristive nanodevices," IEEE/ACM International Symposium on Nanoscale Architectures, Anaheim, CA, pp. 85{92. June 2008. [76] R. B. Stein, \A theoretical analysis of neuronal variability," Biophys. J., vol. 5, pp. 173, 1965. [77] R. B. Stein, E. R. Gossen, and K. E. Jones, \Neuronal variability: noise or part of the signal? " Nature Reviews Neuroscience, vol. 6, pp. 389{397, 2005. [78] T. Stojanovski and L. Kocarev, \Chaos based random number generators Part I: Analysis," IEEE Trans. Circuits Syst. I, vol. 48, pp. 281{288, 2001. [79] T. Stojanovski and L. Kocarev, \Chaos based random number generators Part II: Practical Realization," IEEE Trans. Circuits Syst. I, vol. 48, pp. 382{385, 2001. [80] D. B. Strukov, G. S. Snider, D. R. Stewart, and S. R. Williams, \The missing mem- ristor found," Nature, vol. 453, no. 7191, pp. 80{83, 2008. [81] G. J. Stuart, and M. Hausser, \Dendritic coincidence detection of epsps and action potentials," Nat Neurosci, 4, pp. 63{71, 2011. [82] D. Sulzer, and R. Edwards, \Vesicles: equal in neurotransmitter concentration but not in volume," Neuron, 28, pp. 5{7, 2000. [83] J. M. Tour and T. He, \The forth element," Nature, vol. 453, pp. 42{43, 2008. [84] S. Q. Wang, L. S. Song, E. G. Lakatta, and H. Cheng, \Ca2+ signalling between single L.type Ca2+ channels and ryanodine receptors in heart cells," Nature, 410, pp. 592{596, 2001. 111 [85] X-S Wu, L. Xue, R. Mohan, K. Paradiso, K. D. Gillis, and L-G Wu, \The origin of quantal size variation: vesicular glutamate concentration plays a signicant role," J. Neurosci, 27, pp. 3046{3056, 2007. 
[86] K. A. Zaghloul, and K. Boahen, \Optic nerve signals in a neuromorphic chip II: testing and results," Biomedical Engineering, IEEE Transactions on, vol. 51, no. 4, pp. 667{675, April 2004. [87] F. G. Zeng, Q. J. Fu, and R. Morse, \Human hearing enhanced by noise," Brain Research 869 (12), pp. 251{255, June 2000. 112
Abstract
The variability in the behavior of artificial central neurons is the topic of this thesis. Variable behavior has been observed in biological neurons, resulting in changes in neural behavior that might be useful to capture in neuromorphic circuits. This thesis presents a neuromorphic cortical neuron with two main sources of intrinsic variability: synaptic neurotransmitter-release variability and ion-channel variability. The neuron is designed to be used in neural networks as part of the BioRC Biomimetic Real-Time Cortex project, and has been designed and simulated using carbon nanotube transistors, one of several nanotechnologies under consideration to meet the challenges of scale presented by the cortex.

Some research results suggest that instances of this variability are stochastic, while other studies indicate that some instances are chaotic. This thesis considers both possibilities by embedding either Gaussian noise or a chaotic signal into the neuron or synapse circuit and observing the results.

Our overarching goal for the BioRC project is to demonstrate complex neural networks that possess memory and learning capability, and we believe the behavior of such networks would be enhanced by the addition of variability. This thesis describes neurotransmitter-release variability and ion-channel variability modeled at the circuit level using carbon nanotube circuit elements, with two types of variable signals: a signal with Gaussian noise and a chaotic signal. For neurotransmitter-release variability, these signals are simulated as if they were generated internally in a synapse circuit, varying the neurotransmitter release in an unpredictable manner. Variation in neurotransmitter concentration in the synaptic cleft in turn changes the peak magnitude and duration of the postsynaptic potential.
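As an illustration of how such a stream of Gaussian random voltage samples can be supplied to a circuit simulator, the sketch below generates the samples and renders them as a SPICE piecewise-linear (PWL) voltage source. The time step, standard deviation, and source/node names are illustrative assumptions, not values taken from the thesis.

```python
import random

def gaussian_noise_samples(n_samples=1000, dt=1e-9, mean_v=0.0,
                           sigma_v=0.05, seed=0):
    """Generate (time, voltage) pairs of Gaussian noise.

    The 1 ns step and 50 mV sigma are illustrative assumptions; in
    practice they would be tuned to the synapse circuit's bandwidth
    and the desired variability strength.
    """
    rng = random.Random(seed)
    return [(i * dt, rng.gauss(mean_v, sigma_v)) for i in range(n_samples)]

def to_pwl(samples, name="Vnoise", node="noise"):
    """Render the samples as a SPICE PWL voltage-source card.

    Most SPICE dialects accept the form 'Vxxx n+ n- PWL(t1 v1 t2 v2 ...)';
    the source then drives the node that modulates neurotransmitter
    release volume in the synapse circuit.
    """
    pairs = " ".join(f"{t:.3e} {v:.4f}" for t, v in samples)
    return f"{name} {node} 0 PWL({pairs})"
```

Writing `to_pwl(...)` output to a netlist file (or an included `.lib` fragment) reproduces the thesis's approach of feeding pre-generated random voltage samples into the simulation, without requiring simulator-specific noise sources.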
For ion-channel variability, the same signals are simulated as if they were generated internally in an axon hillock circuit, where they perturb the firing mechanism: if its strength is sufficient, the variable signal can force the neuron to fire, or it can prevent the neuron from firing. The variable signal is independent of the postsynaptic potential; when no postsynaptic potential is applied to the axon hillock (the cell membrane is at resting potential), a variable signal that forces the neuron to fire models spontaneous firing of the neuron.

For Gaussian noise, we include a file in the SPICE simulation consisting of random voltage samples that control neurotransmitter release volume. For chaotic signals, we present a chaotic signal generator circuit, designed and simulated using carbon nanotube transistor SPICE models, whose output would likewise control neurotransmitter release. The circuit implements a chaotic piecewise-linear one-dimensional map with switched-current circuits that can operate at high frequencies to generate a chaotic output current.

The results presented in this thesis illustrate that neurotransmitter-release variability can play a beneficial role in the reliability of spike generation: in the carbon nanotube circuit simulations, the precision of spike timing depended on the stimulus (postsynaptic potential) transients. Postsynaptic potentials with low or no neurotransmitter-release variability produced imprecise spike trains, whereas postsynaptic potentials with high neurotransmitter-release variability produced spike trains with reproducible timing.

Simulation experiments also demonstrated spontaneous firing of neurons due to ion-channel variability.
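The abstract does not reproduce the generator's map equations, so as a hedged illustration the sketch below iterates the tent map, a standard chaotic piecewise-linear one-dimensional map of the kind such switched-current generators realize in hardware. The slope parameter and initial condition are illustrative assumptions; the resulting sample stream is the sort of signal that the circuit would scale into a chaotic output current.

```python
def tent_map(x, mu=1.9999):
    """One iteration of the tent map, a chaotic piecewise-linear 1-D map.

    For 1 < mu <= 2 the map is chaotic on [0, 1]; mu slightly below 2 is
    used here so that floating-point iteration does not degenerate (exact
    mu = 2 collapses to zero in binary arithmetic).
    """
    return mu * x if x < 0.5 else mu * (1.0 - x)

def chaotic_sequence(x0=0.321, n=1000, mu=1.9999):
    """Iterate the map to produce a chaotic sample stream.

    In the hardware analogue, each iterate would be held as a current
    sample by the switched-current stages and delivered as the chaotic
    output current that perturbs neurotransmitter release or firing.
    """
    xs = [x0]
    for _ in range(n - 1):
        xs.append(tent_map(xs[-1], mu))
    return xs
```

The hallmark of such a map is sensitive dependence on initial conditions: two orbits started a billionth apart diverge to order one within a few dozen iterations, which is what makes the generated current stream effectively unpredictable.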
The thesis illustrates how, in one particular case, ion-channel variability can play a beneficial role in the reliability of transferring a train of spikes. It also shows how ion-channel variability can halt unbounded looping behavior in a neural network with positive feedback, such as the loops thought to occur in obsessive-compulsive disorder. The design was simulated in SPICE using carbon nanotube transistor models.
Asset Metadata
Creator: Mahvash Mohammadi, Mohammad (author)
Core Title: Emulating variability in the behavior of artificial central neurons
School: Viterbi School of Engineering
Degree: Doctor of Philosophy
Degree Program: Electrical Engineering
Degree Conferral Date: 2012-05
Publication Date: 03/12/2012
Defense Date: 12/13/2011
Publisher: University of Southern California, Los Angeles, California (original); University of Southern California. Libraries (digital)
Tags: carbon nanotube; chaotic signal; cortical neuron; ion-channel variability; noisy neuron; OAI-PMH Harvest; reliability of spike; stochastic neuron; synaptic variability
Format: theses (aat)
Language: English
Contributor: Electronically uploaded by the author (provenance)
Advisor: Parker, Alice C. (committee chair); Celikel, Tansu (committee member); Marmarelis, Vasilis Z. (committee member)
Creator Email: m_mahvashm@yahoo.com
Permanent Link (DOI): https://doi.org/10.25549/usctheses-oUC11290579
Unique Identifier: UC11290579
Identifier: etd-MahvashMoh-493.pdf (filename)
Legacy Identifier: etd-MahvashMoh-493
Dmrecord: 204254
Document Type: Dissertation
Rights: Mahvash Mohammadi, Mohammad
Internet Media Type: application/pdf
Type: texts
Source: University of Southern California (contributing entity); University of Southern California Dissertations and Theses (collection)
Access Conditions: The author retains rights to his/her dissertation, thesis or other graduate work according to U.S. copyright law. Electronic access is being provided by the USC Libraries in agreement with the author, as the original true and official version of the work, but does not grant the reader permission to use the work if the desired use is covered by copyright. It is the author, as rights holder, who must provide use permission if such use is covered by copyright.
Repository Name: University of Southern California Digital Library
Repository Location: USC Digital Library, University of Southern California, University Park Campus MC 2810, 3434 South Grand Avenue, 2nd Floor, Los Angeles, California 90089-2810, USA
Repository Email: cisadmin@lib.usc.edu