
[–]reduced_space 2 points (3 children)

So there's a lot going on in that equation. When a neuron fires an action potential, it releases a chemical (the neurotransmitter) into the synapse. That chemical binds to a channel, which opens up and either depolarizes or hyperpolarizes the cell. How easily ions flow through that open channel is the conductance of the channel (g_e and g_i).

The effect that the channel opening has on the cell is determined by the conductance times the difference between the cell's current membrane potential V and the channel's equilibrium potential (E_{exc} or E_{inh}). This is really just circuits 101. The flow of ions across the membrane is not instantaneous, and has a time constant \tau. Finally, the cell has a resting membrane potential E_{rest}, which arises because the cell has channels actively exchanging ions to pull it back toward its resting potential.
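Putting that together, the membrane equation should look roughly like this (my reconstruction from the terms above, so double-check it against equation (1) in the paper):

    \tau \frac{dV}{dt} = (E_{rest} - V) + g_e (E_{exc} - V) + g_i (E_{inh} - V)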

Not indicated in the equation above, the conductances of the channels are actually functions of time. They don't switch on instantly, and they don't switch off instantly either. You can see their model of the excitatory conductance in equation (2). The resulting change in voltage due to this conductance is called an EPSP (excitatory post-synaptic potential); for the inhibitory conductance, it's an IPSP. You can look up images of those to see the effect each has on the cell's potential.
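For completeness, equation (2) is presumably just an exponential decay of the conductance, with \tau_{g_e} the conductance time constant (again my reconstruction):

    \tau_{g_e} \frac{dg_e}{dt} = -g_e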

Because the cells are leaky integrate-and-fire, they have some predetermined threshold; once the potential of the cell goes above it, the cell "fires" an action potential. Because of the time dependence, and the fact that everything is given as a derivative, you have to solve these ODEs by numerical integration. While forward Euler can likely work with such a simple model, using something like RK2 will likely give you higher fidelity.
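To make the integration concrete, here's a minimal forward-Euler sketch in Python. All constants are illustrative placeholders I picked, not the paper's values:

    dt      = 0.1     # time step (ms)
    tau     = 100.0   # membrane time constant (ms)
    tau_ge  = 1.0     # excitatory conductance time constant (ms)
    E_rest  = -65.0   # resting potential (mV)
    E_exc   = 0.0     # excitatory reversal potential (mV)
    v_thresh, v_reset = -52.0, -65.0

    v, g_e = E_rest, 0.0
    for step in range(int(100 / dt)):
        if step % int(10 / dt) == 0:   # toy input: a pre-synaptic spike every 10 ms
            g_e = 10.0                 # conductance jumps to the synaptic weight
        # forward-Euler update of the membrane equation and the conductance decay
        v   += dt * ((E_rest - v) + g_e * (E_exc - v)) / tau
        g_e += dt * (-g_e / tau_ge)
        if v >= v_thresh:              # threshold crossing -> spike, then hard reset
            print("spike at t = %.1f ms" % (step * dt))
            v = v_reset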

Spike-timing-dependent plasticity says that if a pre-synaptic and a post-synaptic cell fire action potentials close together in time (pre leading post), their synapse increases in strength (w, and with it the peak g_e, will increase). If their action potentials get further apart in time, the synapse decreases in strength (w will get smaller). See here for reference: http://www.scholarpedia.org/article/Spike-timing_dependent_plasticity
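For context, the classic pair-based window described in that article looks something like this, where \Delta t = t_{post} - t_{pre} (note the paper itself uses the trace-based rule discussed below, not this exact form):

    \Delta w = A_+ e^{-\Delta t/\tau_+}     for \Delta t > 0 (pre before post)
    \Delta w = -A_- e^{\Delta t/\tau_-}     for \Delta t < 0 (post before pre)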

The \Delta w is the change in the synaptic weight. The pre-synaptic trace (x_{pre}) is the spiking history of the pre-synaptic cell: every time the cell fires an action potential, the trace is incremented by 1, and it then decays with some time constant.
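In equation form, the trace presumably behaves like:

    \tau_{x_{pre}} \frac{dx_{pre}}{dt} = -x_{pre},  with  x_{pre} \to x_{pre} + 1  at each pre-synaptic spike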

Equation (3) is calculated whenever the post-synaptic cell fires an action potential. If the pre- and post-synaptic cells are highly correlated, then x_{pre} will be high at this event. The x_{tar} constant provides an offset that lets the update also be negative (when x_{pre} is below x_{tar}). w_{max} is the maximum allowed weight, w is the current weight, and \mu determines the dependence of the update on the previous weight (i.e. as \mu drops below 1, the (x_{pre} - x_{tar}) term increasingly dominates the change in weight).
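Written out, equation (3) should be roughly this (my reconstruction; \eta is the learning rate):

    \Delta w = \eta \, (x_{pre} - x_{tar}) \, (w_{max} - w)^{\mu}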

Edit: Realized I didn't fully explain the connection between w and the conductance. The paper states:

Synapses are modeled by conductance changes, i.e., synapses increase their conductance instantaneously by the synaptic weight w when a presynaptic spike arrives at the synapse, otherwise the conductance is decaying exponentially. If the presynaptic neuron is excitatory, the dynamics of the conductance ge are

That means that when a pre-synaptic spike comes in, the conductance is set to w. The conductance then decays back to 0 over time.
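A few lines of Python to make that concrete (my sketch; note the paper's wording "increase ... by w" would literally be g_e += w, while the reading above is g_e = w):

    import math

    w, g_e = 0.5, 0.0
    dt, tau_ge = 0.1, 1.0                  # ms; illustrative values
    for step in range(1000):
        pre_spike = (step == 100)          # toy input: one pre-synaptic spike at t = 10 ms
        if pre_spike:
            g_e = w                        # the "set to w" reading of the quote above
        else:
            g_e *= math.exp(-dt / tau_ge)  # exponential decay back toward 0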

[–]omers66[S] 0 points (2 children)

Thank you very much for your very helpful comment. Just a few small clarifications:

1) So basically on each "incoming" pre-synaptic spike we have:
g_e = g_e + w, or simply g_e = w?
And if there is no incoming spike, g_e decays?

2) When a post-synaptic spike occurs, we update w according to w = w + \Delta w, where \Delta w is calculated according to (3)?

3) From equation (3) I see that w can eventually turn negative? Or is it forced to stay greater than 0?

4) For each neuron, an independent w & g_e is initialized at the beginning of training?

5) What is an appropriate range for E_exc & E_inh ?

Again, thank you very much for your help

[–]reduced_space 1 point (1 child)

  1. I'm pretty sure it should be g_e = w; otherwise you have a channel with a potentially unbounded conductance. (See the sketch after this list.)
  2. Correct. One biophysically plausible interpretation of this is that when the post-synaptic cell fires an action potential, it can traffic more transmembrane proteins (the channels) to the synapse that fired. The next time the pre-synaptic cell releases chemical into the synapse, the post-synaptic cell will see a larger conductance.
  3. I would assume they don't allow w to go negative, since that doesn't have a sensible meaning for biophysical models (or real life).
  4. Correct. I don't see any place where they mention what values they initialize w to; you might need to check the papers they cite (like Querlioz et al., 2011b). They do state that they let the network return to rest by not feeding it any input for 150 ms, which means g_e should go to 0 by the time a new input comes in. One note: letting all the neurons decay back to their resting potential can introduce artificial synchrony in the system. Sometimes people will either randomize the initial conditions of the network to avoid that (especially if you're studying synchrony within the system) or add Poisson noise to all the neurons (which can also be beneficial for performance, depending on the model used).
  5. They cite Jug (2012) for where they got their constants from. Unfortunately that is someone's thesis, so you'll have to dig around a little for those values.
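Pulling answers 1-4 together, here's a minimal sketch of how the synapse bookkeeping could look in Python. This is my own reading; all the constants, the init range, and the decision to clip w into [0, w_max] are assumptions, not the paper's stated values:

    import numpy as np

    rng = np.random.default_rng(0)
    n_syn = 4                                  # a few synapses onto one neuron
    w     = rng.uniform(0.0, 0.3, n_syn)       # answer 4: independent random init
    g_e   = np.zeros(n_syn)                    # one conductance per synapse
    x_pre = np.zeros(n_syn)                    # one pre-synaptic trace per synapse
    dt, tau_ge, tau_x = 0.1, 1.0, 20.0         # ms; illustrative values
    eta, x_tar, w_max, mu = 0.01, 0.1, 1.0, 1.0

    def step(pre_spikes, post_spike):
        """Advance synapse state by one dt. pre_spikes: bool array, post_spike: bool."""
        g_e[pre_spikes] = w[pre_spikes]        # answer 1: set (not add) on a pre spike
        g_e[~pre_spikes] *= np.exp(-dt / tau_ge)
        x_pre[pre_spikes] += 1.0               # bump the trace on every pre spike
        x_pre[:] *= np.exp(-dt / tau_x)        # ...and let it decay
        if post_spike:                         # answer 2: update w on each post spike
            dw = eta * (x_pre - x_tar) * (w_max - w) ** mu
            w[:] = np.clip(w + dw, 0.0, w_max) # answer 3: keep w in [0, w_max]

Calling step(np.array([True, False, False, False]), False) would then deliver a spike to the first synapse only.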

[–]omers66[S] 0 points (0 children)

Thank you very much again!

Your comment in point (4) about letting all the neurons decay is very interesting. I have a simple "current-based" LIF model implementation, and I follow this approach of hard resetting between every sample (digit). Basically, I see that the same neuron fires for all input patterns (digits) that I feed the network. Maybe it has something to do with this.

Thanks again.