Index
-
action potential 1.1.2, 2.2.3
-
activity, see population activity
-
A-current 2.3.3
-
adaptation 1.4.1, 13.6, Fig. 2.17, Chapter 6
-
adaptive integrate-and-fire 6.1
-
AdEx, see adaptive integrate-and-fire
-
afterpotential 1.2.2, 6.4.2
-
all-to-all coupling 12.3.1
-
AMPA receptor 3.1.2
-
Arrhenius & Current model 9.4.1
-
Arrhenius formula 9.4.1
-
assembly, neuron
-
asynchronous firing 12.4.1
-
asynchronous irregular 13.4.2
-
attractor network 17.2.3
-
autocorrelation function 7.4
-
autocorrelation-renewal 7.5.3
-
axon 1.1.1, 3.3
-
balanced excitation and inhibition 12.3.4, 12.4.4, 8.2.2
-
Bayesian decoding 11.3
-
Bayesian parameter estimation 10.2.1
-
Bayesian regularization 10.2.2
-
BCM rule 19.2.1
-
bifurcation 4.4
-
biophysical neuron model 2.3.1
-
blobs of activity 18.3.1
-
Brunel network 12.4.4, 13.4.2
-
bump attractors 18.3
-
bursting, see firing pattern
-
cable equation 3.2.1
-
calcium-dependent potassium current 2.3.3
-
calcium spike 2.3.5
-
closed-loop stimulus design 10.4
-
cluster states 14.2.3
-
coding 7.6
-
coefficient of variation 7.3.1
-
compartmental model 3.4
-
competition 16.2
-
competitive network 16.3.3
-
conductance-based neuron model 2.2.1
-
conductance input 13.6.3
-
connectivity
-
conservation equation 14.1.3
-
continuity equation 13.1.2
-
continuum model 18.1
-
of population activity 18.1
-
contrast enhancement 18.2.3
-
correlation
-
cortex 1.1.4, 11.3.3, 12.1, 12.1.2, 18.2
-
coupling
-
covariance matrix 19.3.2
-
covariance rule 19.2.1
-
current
-
cut-off frequency 8.1.3
-
Dale’s law 17.3.2
-
decision making Chapter 16
-
decoding 11.3
-
Deep Brain Stimulation (DBS) 20.3
-
dendrite 1.1.1, 1.4.4, 3.2, 6.3.2, 6.4.4
-
dendritic spike 1.4.4, 3.4
-
density equation
-
for membrane potential 13.2.3
-
for refractory variable 14.4.1
-
relation with integral equations 14.4
-
depression of synapses 3.1.3
-
diffusion model 13.3, 8.4
-
diffusive noise 15.2.3
-
drift-diffusion model 16.4.2
-
echo state network 20.1
-
encoding models 11.1
-
escape model 9.1
-
escape noise 15.2.2, Chapter 9
-
escape rate Chapter 9, 9.1.1
-
event-based moment expansion 14.5.1
-
excitable system 4.5
-
exponential integrate-and-fire 5.2
-
facilitation of synapses 3.1.3
-
field equation 18.1
-
field model 18.1
-
finite-size effects 14.6
-
f-I-plot 2.2.3
-
firing intensity 9.1.1
-
firing pattern Chapter 6, 6.1, 6.2
-
firing rate 7.2, 7.3, 7.6.2
-
firing regime
-
first passage time 8.4.2
-
first principal component 19.3.2
-
FitzHugh-Nagumo model 3.3.1, 4.2.1, 4.4.3
-
fixed point 4.3.2
-
flow field 4.3
-
flux 13.1.2
-
Fokker-Planck equation 13.3, 8.4
-
Fourier transform 1.3.6
-
frequency-current relation 2.2.3
-
full coupling 12.3.1
-
gain
-
gain function
-
Gamma distribution 6.
-
gating variable 2.2.1
-
Generalized Linear Model (GLM) 10.1.2
-
generative model 9.2
-
ghost of fixed point 4.4.1
-
GLM, Generalized Linear Model 10.1.2
-
Green’s function 3.2.2
-
hazard 7.5.1
-
h-current 2.3.4
-
Hebb’s postulate 19.1
-
Hodgkin-Huxley model 2.2
-
Hopf bifurcation 4.4.3
-
Hopfield model 17.2
-
hypercolumn 18.2.3
-
illusion 18.1, 18.2.3
-
impulse response 6.4.2
-
inhibition
-
inhibition-stabilized network 18.2.4
-
inhibitory plasticity 20.1.2
-
inhibitory rebound 1.4.1, 4.1.2
-
integral equation 14.1.2
-
integral-equation approach Chapter 14
-
integrate-and-fire model 1.3
-
interspike interval 7.3
-
interval distribution 14.1, 7.5.1
-
for periodic input 9.3
-
input-dependent 7.5.4
-
ion channel 2.2, 2.3
-
Kramers-Moyal expansion 8.4
-
Langevin equation 8.1.1, 8.4
-
leaky integrate-and-fire model 1.3
-
learning window 19.1.2
-
Liapunov function 16.4.1, 17.2.5
-
Libet experiment 16.5.1
-
likelihood of spike train 9.2
-
limit cycle 4.3.2
-
Linear-Nonlinear Poisson Model (LNP) 11.2.1, 15.3.3
-
linear regression 10.1.1
-
liquid computing 20.1
-
LNP model, see Linear-Nonlinear Poisson Model
-
locking 20.2.1
-
log-likelihood of a spike train 9.2
-
long-term depression Chapter 19
-
long-term potentiation Chapter 19, 19.1.1
-
low-connectivity network 12.4.4
-
low-threshold calcium current 2.3.5
-
LTD, see long-term depression
-
LTP, see long-term potentiation
-
MAP, see maximum a posteriori
-
Markov process 8.4
-
matrix
-
maximum a posteriori 11.3.1
-
M-current 2.3.3
-
membrane potential 1.2
-
memory Chapter 17
-
Mexican hat 18.1.1
-
Mexican hat connectivity 18.2.2, 18.2.4
-
Morris-Lecar model 4.2.1, 4.4.1
-
motor cortex 19.4
-
MT neuron 16.1.1
-
Nernst potential 2.1.1
-
network, see population
-
neural mass models 12.2
-
neuron 1.1
-
neurotransmitter 1.1.3
-
NMDA receptor 3.1.2
-
noise Chapter 7
-
noise model
-
diffusive noise 8.1
-
escape noise 9.1
-
noisy integration 8.1
-
noisy threshold 9.1
-
random connectivity 12.4.4
-
stochastic spike arrival 8.2
-
noise spectrum 7.4
-
nullclines 4.3.1
-
Oja’s rule 19.2.1
-
orientation selectivity, model of 18.2.3
-
Ornstein-Uhlenbeck process 8.1.1, 8.4
-
oscillation 13.4.2, 14.2.3, 20.3
-
overlap 17.2.3
-
pairing experiment 19.1.2
-
parameter estimation
-
Parkinson disease 20.3
-
pattern recognition 17.2.3
-
with spiking neurons 17.3
-
peri-stimulus-time histogram 7.2.2
-
persistent sodium current 2.3.3
-
phase code 20.2, 7.6.2
-
phase plane 6.2.2, 6.2.4
-
phase plane analysis 4.3
-
phase portrait 4.3
-
phase response curve 20.2.3
-
plasticity
-
plasticity, synaptic Chapter 19
-
point process 7.5
-
Poisson neuron 7.5.3
-
Poisson process
-
population 12.1.2, 7.2.3
-
population activity 13.2.3, 14.1, 7.2.3
-
population dynamics 12.2
-
population vector 7.6.1
-
postsynaptic potential 1.2.1
-
power spectrum 7.4
-
prediction Chapter 11
-
priming 17.1
-
principal component analysis 19.3.2
-
principal components 19.3.2
-
probability current 13.2.1
-
PSTH, see peri-stimulus-time histogram
-
pyramidal cell 3.4
-
quasi-renewal theory 14.1.4, 14.5
-
quasi steady state 4.2.1
-
random connectivity 12.3.2, 12.3.3, 12.4.4
-
random walk 8.2.2
-
random weight matrix 20.1.2
-
rate 7.2
-
rebound, inhibitory 1.4.1, 2.3.5, 4.1.2
-
rebound spike 2.2.3
-
receptive field 1.1.4, 11.2.1, 12.1.1
-
reconstruction kernel 7.6.2
-
refractoriness 6.4.2
-
refractory density 14.4.1
-
refractory period 1.1.2
-
regression
-
regularization 10.2.2
-
relaxation oscillation 4.6
-
renewal
-
renewal theory
-
reservoir computing 20.1
-
resting potential 1.2
-
reversal potential 1.4.2, 2.1.2
-
reverse correlation 11.2.1, 7.6.2
-
rheobase current 4.4
-
ring model 18.1.1, 18.2.3
-
ruins of fixed point 4.4.1
-
saddle point 4.3.2
-
scaling behavior 12.3
-
self-averaging 17.2.3
-
separation of time scales 4.2.1, 4.6
-
shunting inhibition 1.4.2
-
signal-to-noise ratio 7.4, 9.4.2
-
similarity measure 10.3.5
-
singular perturbation 4.6
-
slow sodium current 2.3.4
-
soft threshold Chapter 9
-
soma 1.1.1
-
spectral radius 20.1.2
-
spike
-
spike afterpotential, see afterpotential
-
spike response model 6.4
-
spike train 1.1.2
-
spike train decoding
-
spike-triggered average 10.2.1, 11.2.1
-
spiking neuron model
-
spontaneous activity 12.4.4, Fig. 7.1, Chapter 7
-
STA, see spike-triggered average
-
stability 4.3.2
-
stable manifold 4.5.1
-
stationary state 14.2.2
-
STDP 19.1.2
-
Stein’s model 8.2
-
stimulation
-
stimulus reconstruction
-
stochastic
-
STP, see short-term plasticity
-
Stroop effect 17.1
-
subthreshold oscillation 6.2.4
-
survivor function 7.5.1
-
synapse 1.1.1, 1.1.3, 3.1
-
synaptic
-
transmission failures 7.1.2
-
synaptic depression 3.1.3
-
synaptic plasticity Chapter 19
-
synchronization 20.3
-
synchronous regular 13.4.2, 14.2.3
-
synchrony 7.6.2
-
threshold 1.2.2, 5.1.1
-
time-rescaling theorem 10.3.3
-
transfer function
-
transient spike 2.2.3
-
type I/II model
-
visual illusion 18.1, 18.2.3
-
Vogels-Abbott network 12.4.4
-
volition 16.5
-
weight matrix 20.1.2
-
Wiener-Khinchin theorem 7.4
-
will, see volition
-
Wilson-Cowan model 15.3.1
-
winner-take-all 16.3.4
-
working memory 17.1.3, 18.3.1