from typing import Callable

import torch

# NOTE: these imports assume the SpikingJelly package layout this class
# appears to be based on (BaseNode and surrogate come from spikingjelly).
from spikingjelly.clock_driven import surrogate
from spikingjelly.clock_driven.neuron import BaseNode


class LIFNode2(BaseNode):
    def __init__(self, tau: float = 2., v_threshold: float = 1.,
                 v_reset: float = 0., surrogate_function: Callable = surrogate.Sigmoid(),
                 detach_reset: bool = False):
        """
        .. _LIFNode.__init__-en:

        :param tau: membrane time constant
        :type tau: float
        :param v_threshold: threshold voltage of the neurons
        :type v_threshold: float
        :param v_reset: reset voltage of the neurons. If not ``None``, the voltage of neurons that
            just fired spikes will be set to ``v_reset``; if ``None``, ``v_threshold`` will be
            subtracted from the voltage instead
        :type v_reset: float
        :param surrogate_function: surrogate function used to compute the gradient of the spiking
            function during back-propagation
        :type surrogate_function: Callable
        :param detach_reset: whether to detach the reset step from the computation graph
        :type detach_reset: bool

        The Leaky Integrate-and-Fire neuron, which can be seen as a leaky integrator.
        Its subthreshold neural dynamics are:

        .. math::
            V[t] = V[t-1] + \\frac{1}{\\tau}(X[t] - (V[t-1] - V_{reset}))

        Note that :meth:`neuronal_charge` additionally scales this update by a fixed
        integration timestep ``dt``.
        """
        assert isinstance(tau, float) and tau > 1e-6
        super().__init__(v_threshold, v_reset, surrogate_function, detach_reset)
        self.tau = tau

    def extra_repr(self):
        return super().extra_repr() + f', tau={self.tau}'

    def neuronal_charge(self, x: torch.Tensor):
        # Fixed integration timestep: the membrane update below is an explicit
        # Euler step scaled by dt, unlike the dt-free update of the standard LIFNode.
        dt = 3.5e-6
        if self.v_reset is None:
            self.v = self.v + ((x - self.v) / self.tau) * dt
        else:
            if isinstance(self.v_reset, float) and self.v_reset == 0.:
                # v_reset == 0: the (v - v_reset) term reduces to v.
                self.v = self.v + ((x - self.v) / self.tau) * dt
            else:
                # General case: the membrane leaks towards the resting potential v_reset.
                self.v = self.v + ((x - (self.v - self.v_reset)) / self.tau) * dt
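The charge-fire-reset cycle implemented by this class can be sketched without any dependencies. The following is a minimal illustration, not the class itself: it assumes scalar inputs, a hard reset, and an illustrative `dt = 1.0` (the class above uses tensors, `dt = 3.5e-6`, and inherits spike generation and surrogate gradients from `BaseNode`; the function name `lif_step` is made up for this sketch).

```python
def lif_step(v, x, tau=2.0, v_threshold=1.0, v_reset=0.0, dt=1.0):
    """One charge-fire-reset step of a leaky integrate-and-fire neuron."""
    # Charge: explicit Euler step of dV/dt = (X - (V - V_reset)) / tau.
    v = v + ((x - (v - v_reset)) / tau) * dt
    # Fire: emit a spike when the membrane potential reaches the threshold.
    spike = 1.0 if v >= v_threshold else 0.0
    # Reset: hard reset back to v_reset after a spike.
    if spike:
        v = v_reset
    return v, spike
```

For example, with the defaults a constant input `x = 2.0` charges the membrane from `v = 0.0` to threshold in a single step and triggers a spike plus reset, while `x = 1.0` only charges it to `0.5` and stays subthreshold.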