Your understanding is correct. When all initial values are identical, for example when every weight is initialized to 0, then during backpropagation all weights receive the same gradient, and hence the same update. This is what is referred to as symmetry.
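This is easy to see numerically. Below is a minimal sketch (the 2-2-1 network shape, the random data, and the constant 0.5 initialization are all illustrative assumptions) showing that two identically initialized hidden units receive identical gradients:

```python
import numpy as np

# Tiny 2-2-1 network where both layers start from the same constant.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 2))        # 4 samples, 2 features
y = rng.normal(size=(4, 1))

W1 = np.full((2, 2), 0.5)          # identical initial values everywhere
W2 = np.full((2, 1), 0.5)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# forward pass
h = sigmoid(X @ W1)                # both hidden units compute the same value
out = h @ W2
err = out - y                      # dL/d(out) for 0.5 * squared error

# backward pass
dW2 = h.T @ err
dW1 = X.T @ ((err @ W2.T) * h * (1 - h))

# the two hidden units receive identical gradient columns, so they get
# identical updates and the symmetry is never broken
print(np.allclose(dW1[:, 0], dW1[:, 1]))  # True
```

Breaking the tie requires making the initial values differ, which is why random initialization is the standard remedy.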
In particular, backreferences to capturing parentheses let "regular expressions" match languages that are no longer regular (and not even context-free). The name is simply historical, as with many terms. See also this section in Wikipedia and this explanation with an example from Perl.
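As a concrete illustration (a small Python sketch; the specific pattern is just an example), a single backreference already matches the copy language { ww }, which no regular or context-free grammar can describe:

```python
import re

# \1 must repeat exactly what group 1 captured, so this pattern matches
# the copy language { ww : w a non-empty word }.
pattern = re.compile(r'(\w+)\1')

print(bool(pattern.fullmatch('abcabc')))  # True: 'abc' repeated
print(bool(pattern.fullmatch('abcabd')))  # False: second half differs
```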
The term "context" in programming languages, and how is context affected by loading and updating?
What does the term "context" mean in context-free and context-sensitive languages? Can a variable have multiple contexts? If I need to store a particular value at a particular memory address, how does this affect the context of that memory address? And if I want to update a variable, how does the context of the variable change? In a context-sensitive grammar, productions have the general form
aBc -> ab'c
that is, B -> b' is allowed only when B appears between a and c (its "context").
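To make the role of the context concrete, here is a toy Python sketch (a hypothetical helper, not a general grammar engine) that fires the rewrite B -> b' only when the surrounding symbols match:

```python
# The context-sensitive rule aBc -> ab'c rewrites B only when it is
# surrounded by 'a' and 'c'; a B in any other context is left alone.
def apply_rule(s):
    return s.replace("aBc", "ab'c")

print(apply_rule("aBc"))   # ab'c  -- context matches, rule fires
print(apply_rule("dBe"))   # dBe   -- B present, but wrong context
```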
What is the "bluebook" in the context of network programming?
This isn't a typical SO question, but here's the answer. Of course your model doesn't answer your question, as it is a neuron model; for connections (synapses, in the brain or elsewhere), you need a model of the synapse as well. In biology, a presynaptic spike (i.e. an "input spike" to a synapse) causes a time-dependent change of the postsynaptic membrane conductance. The shape of this conductance change approximately follows a so-called double-exponential shape:
// processing at time step t
I_syn *= exp(-delta_t / tau_synapse)   // exponential decay; delta_t is your simulation time step
foreach presynaptic_spike of neuron j:
    I_syn += weight_of_connection(j)   // each arriving spike adds its connection weight
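The pseudocode above can be turned into a small runnable sketch. Everything here (the function name, parameter values, and the dict-based spike representation) is an illustrative assumption; note it implements the single-exponential decay of the pseudocode, not the full double-exponential kernel:

```python
import math

def simulate_synapse(spike_times, weights, t_end, delta_t=0.1, tau_synapse=5.0):
    """Return the synaptic current trace, one value per time step.

    spike_times: presynaptic neuron id -> list of spike times
    weights:     presynaptic neuron id -> connection weight
    """
    decay = math.exp(-delta_t / tau_synapse)
    # bin each spike into the time step it falls in
    spike_steps = {j: {round(s / delta_t) for s in times}
                   for j, times in spike_times.items()}
    i_syn = 0.0
    trace = []
    for step in range(round(t_end / delta_t)):
        i_syn *= decay                      # exponential decay each step
        for j, steps_j in spike_steps.items():
            if step in steps_j:             # presynaptic spike arrives
                i_syn += weights[j]
        trace.append(i_syn)
    return trace

# two presynaptic neurons spiking at t=1.0 and t=2.0
trace = simulate_synapse({0: [1.0], 1: [2.0]}, {0: 0.5, 1: 0.3}, t_end=10.0)
```

Each spike produces a jump of the connection weight followed by an exponential decay with time constant tau_synapse, matching the pseudocode step by step.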
Are "data races" and "race conditions" actually the same thing in the context of concurrent programming?