Monte Carlo-based method for
large-scale network analysis
   with Gibbs distribution

      Hassan Nasser & Bruno Cessac
 Neuromathcomp team – INRIA Sophia-Antipolis
Spike train dictionary
Observable
●   Observable: a function which associates a real
    number with a spike train.
●   Ex: firing rate, pairwise correlation.

    [Figure: example observables — firing rate;
    neuron k1 fires at time n1 while neuron k2
    fires at time n2; neuron k1 fires at time n1
    while neuron k2 is silent at time n2]
Empirical average
●   The empirical average of an observable is its
    frequency in the spike train.
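As a concrete illustration, the two example observables above (firing rate and pairwise correlation) can be estimated as empirical averages over a binary raster. This is a minimal sketch in Python; the toy raster, neuron indices, and helper names are illustrative, not from the talk:

```python
import numpy as np

# Hypothetical toy raster: N neurons x T time bins, binary spike events.
rng = np.random.default_rng(0)
raster = (rng.random((5, 1000)) < 0.2).astype(int)

def firing_rate(raster, k):
    """Empirical average of the observable "neuron k spikes at time n"."""
    return raster[k].mean()

def pairwise_correlation(raster, k1, k2):
    """Empirical average of the observable "k1 and k2 spike in the same bin"."""
    return (raster[k1] * raster[k2]).mean()

print(firing_rate(raster, 0))
print(pairwise_correlation(raster, 0, 1))
```

Both estimators are time averages over the raster, which is the sense in which the frequency of an observable defines its empirical average.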
Gibbs potential
●   Gibbs potential: a model of the spike train,
    written as a weighted sum of observables,
    H = sum_l lambda_l O_l, where the lambda_l
    are the parameters (weights) and the O_l are
    the observables.
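The structure of such a potential (a weighted sum of observables) can be sketched as follows; the particular observables and weights here are hypothetical placeholders, not the ones from the talk:

```python
import numpy as np

def make_potential(observables, weights):
    """Return H(raster) = sum_l weights[l] * observables[l](raster)."""
    def H(raster):
        return sum(w * O(raster) for w, O in zip(weights, observables))
    return H

# Illustrative observables: a firing rate and a pairwise correlation.
observables = [
    lambda r: r[0].mean(),             # firing rate of neuron 0
    lambda r: (r[0] * r[1]).mean(),    # pairwise correlation of neurons 0 and 1
]
H = make_potential(observables, weights=[1.5, -0.3])

raster = np.ones((2, 10), dtype=int)   # toy raster: both neurons always fire
print(H(raster))                       # 1.5*1.0 + (-0.3)*1.0 = 1.2
```

Fitting the model then amounts to choosing the weights so that the model's observable averages match the empirical ones.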
Modeling a spike train with a
          Gibbs potential
●   Given a spike train (and its empirical averages
    = empirical probability distribution).
●   Given a Gibbs potential and the associated
    Gibbs probability distribution.
●   Our aim is to find parameters such that the
    KL distance between the empirical distribution
    and the theoretical Gibbs distribution is
    minimal --> maximum entropy.
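For a memoryless (range-1) potential, the equivalence between minimizing the KL distance and the maximum entropy principle can be written out explicitly. This is the standard maximum-entropy identity, in generic notation not taken from the slides (pi is the empirical measure, mu_lambda the Gibbs measure, Z its normalization):

```latex
% Gibbs measure with potential H = \sum_l \lambda_l O_l:
%   \mu_\lambda(\omega) = e^{H(\omega)} / Z(\lambda).
% The KL divergence from the empirical measure \pi expands as
D_{KL}(\pi \,\|\, \mu_\lambda)
  = \sum_\omega \pi(\omega)\,\log\frac{\pi(\omega)}{\mu_\lambda(\omega)}
  = -S[\pi] \;-\; \sum_l \lambda_l \langle O_l\rangle_\pi \;+\; \log Z(\lambda),
% a convex function of \lambda whose critical-point condition is
% moment matching:
\frac{\partial \log Z}{\partial \lambda_l}
  = \langle O_l\rangle_{\mu_\lambda}
  \;=\; \langle O_l\rangle_\pi \quad \text{for all } l.
```

Minimizing the KL distance in lambda therefore amounts to finding the Gibbs distribution whose observable averages equal the empirical ones, i.e. the maximum entropy distribution under those constraints.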
Relevant previous works
●   Vasquez et al. 2012 showed that Gibbs
    potential models with memory reproduce the
    statistical distribution of a spike train more
    precisely --> small networks.
●   Tkacik et al. 2008 showed a Monte Carlo-based
    method to reproduce the statistics for an
    Ising model --> large-scale networks.
Maximum entropy vs.
               Monte Carlo
●   Maximum entropy:
      ●   Precise (solving with the transfer matrix).
      ●   Computation time grows exponentially with the
           network and memory size (computation of the
           transfer matrix).
●   Monte Carlo:
      ●   Not precise (the error depends on the chain length).
      ●   Fast (linear growth with network and memory
           size).
My work
●   Reproducing the statistics for large networks
    using Monte Carlo.
●   I began by implementing the classical
    Monte Carlo method in order to reproduce the
    statistics, but the problem didn't converge.
●   The situation is different in the spatio-
    temporal case, since the normalization factor
    changes there. However, the classical Monte
    Carlo method works for Ising and memory-1
    models (taking into account a detailed
    balance assumption).
Then
How does it work?
●   Given a real spike train.
●   We generate a random spike train of length
    (Ntimes) and a random set of parameters.
●   We choose a random event in the raster, flip it
    (0 --> 1 or 1 --> 0), and compute the resulting
    change in the potential.
●   If the potential increases, we accept the new
    state; otherwise, we reject it (in the full
    Metropolis rule, a decrease dH < 0 is accepted
    with probability exp(dH)).
●   We repeat this flipping (Nflip) times.
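The steps above can be sketched as a minimal Metropolis sampler for P(raster) proportional to exp(H(raster)). The potential H, network size, and chain lengths below are toy choices, not the ones used in the talk; the acceptance step shown is the standard Metropolis rule:

```python
import numpy as np

rng = np.random.default_rng(42)

def metropolis_sample(H, N, Ntimes, Nflip):
    """Sample a raster from P(raster) ~ exp(H(raster)) by single-event flips."""
    raster = rng.integers(0, 2, size=(N, Ntimes))  # random initial spike train
    h = H(raster)
    for _ in range(Nflip):
        k = rng.integers(N)            # random neuron
        n = rng.integers(Ntimes)       # random time bin
        raster[k, n] ^= 1              # flip the event (0 -> 1 or 1 -> 0)
        h_new = H(raster)
        dh = h_new - h
        # Metropolis rule: always accept if the potential increases;
        # otherwise accept with probability exp(dh), else undo the flip.
        if dh >= 0 or rng.random() < np.exp(dh):
            h = h_new
        else:
            raster[k, n] ^= 1          # reject: flip back
    return raster

# Toy Ising-like potential with a negative field on every event,
# so silence is favoured: P(event = 1) = 1 / (1 + e) ~ 0.27.
H = lambda r: -1.0 * r.sum()
sample = metropolis_sample(H, N=4, Ntimes=200, Nflip=20000)
print(sample.mean())   # empirical firing rate, should approach ~0.27
```

Recomputing H on the whole raster at each flip is O(N * Ntimes) per step; an efficient implementation would instead compute the local change in the potential caused by the single flipped event.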
Reproducing the statistics
●   With a simple raster drawn from a known
    distribution, where the empirical probabilities
    are known, we could generate another raster
    with the same probability distribution.

    [Figure: theoretical probability vs. empirical
    probability]
How fast is Monte Carlo?

    [Figure: CPU time growth — exponential
    (maximum entropy) vs. linear (Monte Carlo)]
CPU time vs. Ntimes
Influence of Ntimes and Nflip
        on the error
    [Panels: 3 neurons | 7 neurons]
●   What we presented was the application of
    Monte Carlo to small networks.
●   Why?
●   Because with small networks, we can compute
    the error committed in the observable
    computation.
●   Why?
●   Because the maximum entropy method can
    compute observable averages only for small
    networks.
Bruno invented a “particular
              potential”
●   A potential whose observable averages can be
    computed analytically, even for large
    networks.
●   This potential is a set of pairs of events
    (neuron, time).

    This is the estimated average with Monte Carlo.
    We can compute the error by comparing this
    estimated average with the empirical average
    given by the real raster.
Error and CPU time
What we did - what we haven't
             done yet.
●   We implemented a Monte Carlo-based algorithm
    that estimates spike train statistics under a
    Gibbs distribution, for large networks and for
    models with memory.
●   Now we can deal with large networks.
●   We still want to compute the parameters.
●   Application of our method to real data.
