EGR 183: Modeling Neural Networks in silico
Dr. Needham - Fall 2007
Daniel Calrin, B.S.E.; Daniel Cook; Joshua Mendoza-Elias
Background: Neurons
Background: LTP and LTD The mechanisms of Long-term Potentiation
Background continued: The mechanisms of Long-term Depression
Short-term and Long-term Effects
The Basis of Hebbian Learning
Foundation for our Computer Model
Types of Neuron
- Input Neuron: voltage-clamped, presynaptic to all neurons in the model
- Inhibitory Neurons: depress post-synaptic neurons
- Excitatory Neurons: average firing rates solved at each time step
- Output Neuron: learning rule determines the change in synaptic strength
[Slide diagram: neurons 1-4 linked by inhibitory and excitatory synapses; key distinguishes the two synapse types]
Synaptic strengths
- In phase: two neurons v_i and v_j are both firing full-speed (rates 1.0 and 1.0), so the strength between them increases by a factor of α from t = t_0 to t = t_0+1.
- Out of phase: one neuron v_i is firing (1.0) but v_j is not (0.0), so the strength decreases by a factor of β.
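The update rule on this slide can be sketched as follows. This is a minimal illustration, not the original model's code: the values of alpha and beta, the 0.5 firing threshold, and the reading of "increase by a factor of α" as multiplicative growth are all assumptions.

```python
def update_strength(w, v_i, v_j, alpha=0.1, beta=0.1):
    """Hebbian-style update sketched from the slide.

    If both neurons fire together (in phase), the synaptic strength
    grows by a factor of alpha; if v_i fires while v_j is silent
    (out of phase), it shrinks by a factor of beta.
    """
    if v_i > 0.5 and v_j > 0.5:   # both firing: potentiate
        return w * (1 + alpha)
    if v_i > 0.5 and v_j <= 0.5:  # presynaptic fires alone: depress
        return w * (1 - beta)
    return w                      # otherwise unchanged

print(update_strength(1.0, 1.0, 1.0))  # in phase: strengthened
print(update_strength(1.0, 1.0, 0.0))  # out of phase: weakened
```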
Inhibitory Neurons
- Excitatory: an excitatory neuron v_i is firing, potentiating the post-synaptic neuron; the weighted rate is summed positively into v_j (v_j = ... + w_i,j v_i + ...).
- Inhibitory: an inhibitory neuron v_i is firing, depressing the post-synaptic neuron; the weighted rate is summed negatively into v_j (v_j = ... - w_i,j v_i + ...).
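The signed summation described on this slide can be sketched as below; the weight values, firing rates, and the sign-vector representation of excitatory vs. inhibitory neurons are hypothetical choices for illustration.

```python
import numpy as np

def postsynaptic_input(rates, weights, signs):
    """Sum weighted presynaptic rates into a post-synaptic neuron.

    Excitatory presynaptic neurons (sign +1) add w_ij * v_i to the
    sum; inhibitory ones (sign -1) subtract it, as on the slide.
    """
    return float(np.sum(signs * weights * rates))

rates   = np.array([1.0, 0.5, 0.8])  # presynaptic firing rates v_i
weights = np.array([0.2, 0.4, 0.1])  # synaptic strengths w_ij
signs   = np.array([+1, -1, +1])     # excitatory, inhibitory, excitatory

print(postsynaptic_input(rates, weights, signs))  # 0.2 - 0.2 + 0.08
```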
Results
The model learned the in-phase and out-of-phase components, but we could not teach it complete phasic inversion. Doing so needs further development: one-way connections (i.e., some strengths are 0).
Phase components
Learning rule: α = |v_1 - v_N|
[Plot: firing rate vs. time for the input and output neurons, showing convergence of the in-phase and out-of-phase components]
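The learning rule on this slide ties the update size to the rate difference between the input and output neurons; a minimal sketch (the example rates are made up):

```python
def alpha(v_input, v_output):
    """Learning rate from the slide's rule: alpha = |v_1 - v_N|.

    v_1 is the input neuron's firing rate and v_N the output
    neuron's; a large rate difference drives large strength
    changes, and alpha shrinks to zero as the two converge.
    """
    return abs(v_input - v_output)

print(alpha(1.0, 0.25))  # rates far apart -> large update
print(alpha(0.6, 0.6))   # converged -> no further change
```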
Output Neuron Maxima & Minima
[Plot: firing rate vs. time; the output neuron shows a local max near the input's minimum, a local min near the input's maximum, and its maximum at the input's maximum]
Further developments
Short-term:
- Sparse synapse matrix (i.e., some synapses have strength 0)
- Asynchronous firing
- Multi-dimensional training (e.g., for character recognition, sound recognition)
Long-term:
- Ca2+ modeling
- Gene expression profile (DNA microarray data to reflect changes in synaptic efficacy)
Biological parallel with In Silico
Starovoytov et al. 2005. Light-directed stimulation of neurons on silicon wafers. J Neurophysiol 93: 1090-1098.
LDS in concert with Computer Simulation
MEAs vs. LDS: LDS scans yield more real-time data, more quickly, and work on variably connected neural networks.
