Slideshows by User: UmbertoPicchini (feed last updated Fri, 17 Mar 2023 11:41:39 GMT)

Guided sequential ABC schemes for simulation-based inference
/slideshow/guided-sequential-abc-schemes-for-simulationbased-inference/256590007
Presented at BayesComp 2023 in Levi (Finland). Based on Picchini and Tamborrino (2022), Guided sequential ABC schemes for intractable Bayesian models, arXiv:2206.12235.

Guided sequential ABC schemes for simulation-based inference from Umberto Picchini
Bayesian inference for mixed-effects models driven by SDEs and other stochastic models: a scalable approach
/slideshow/bayesian-inference-for-stoch-memsbayesian-inference-for-mixedeffects-models-driven-by-sdes-and-other-stochastic-models-a-scalable-approach/251492401
An important, and well-studied, class of stochastic models is given by stochastic differential equations (SDEs). In this talk, we consider Bayesian inference based on measurements from several individuals, to provide inference at the "population level" using mixed-effects modelling. We consider the case where dynamics are expressed via SDEs or other stochastic (Markovian) models. Stochastic differential equation mixed-effects models (SDEMEMs) are flexible hierarchical models that account for (i) the intrinsic random variability in the latent-state dynamics, (ii) the variability between individuals, and (iii) measurement error. This flexibility gives rise to methodological and computational difficulties. Fully Bayesian inference for nonlinear SDEMEMs is complicated by the typical intractability of the observed-data likelihood, which motivates the use of sampling-based approaches such as Markov chain Monte Carlo. A Gibbs sampler is proposed to target the marginal posterior of all parameters of interest. The algorithm is made computationally efficient through careful use of blocking strategies, particle filters (sequential Monte Carlo) and correlated pseudo-marginal approaches. The resulting methodology is flexible, general and able to deal with a large class of nonlinear SDEMEMs [1]. In more recent work [2], we also explored ways to make inference even more scalable to an increasing number of individuals, while also dealing with state-space models driven by stochastic dynamic models other than SDEs, e.g. Markov jump processes and nonlinear solvers typically used in systems biology.

[1] S. Wiqvist, A. Golightly, A. T. McLean, U. Picchini (2020). Efficient inference for stochastic differential mixed-effects models using correlated particle pseudo-marginal algorithms. CSDA, https://doi.org/10.1016/j.csda.2020.107151
[2] S. Persson, N. Welkenhuysen, S. Shashkova, S. Wiqvist, P. Reith, G. W. Schmidt, U. Picchini, M. Cvijovic (2021). PEPSDI: Scalable and flexible inference framework for stochastic dynamic single-cell models. bioRxiv, doi:10.1101/2021.07.01.450748.
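The pseudo-marginal ingredient mentioned above is an unbiased likelihood estimate produced by a particle filter. A minimal bootstrap-filter sketch, for a toy linear-Gaussian state-space model rather than the SDEMEMs of the talk (the model, parameterisation and defaults are all illustrative assumptions):

```python
import numpy as np

def bootstrap_pf_loglik(y, theta, n_particles=500, rng=None):
    """Log of an unbiased likelihood estimate for a toy model:
        X_t = phi * X_{t-1} + sigma_x * eps_t,   Y_t = X_t + sigma_y * eta_t.
    Illustrative only; not the SDEMEM sampler of the talk."""
    rng = np.random.default_rng() if rng is None else rng
    phi, sigma_x, sigma_y = theta
    x = rng.normal(0.0, 1.0, n_particles)  # particles for X_0
    loglik = 0.0
    for yt in y:
        x = phi * x + sigma_x * rng.normal(size=n_particles)   # propagate
        logw = (-0.5 * np.log(2 * np.pi) - np.log(sigma_y)
                - 0.5 * ((yt - x) / sigma_y) ** 2)             # weigh
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())   # log p_hat(y_t | y_{1:t-1})
        idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
        x = x[idx]                       # multinomial resampling
    return loglik
```

Plugging such an estimate into a Metropolis-Hastings acceptance ratio gives a pseudo-marginal chain; the correlated variant of the talk additionally correlates the random numbers across successive likelihood estimates.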

Published: Fri, 01 Apr 2022 15:24:48 GMT
Bayesian inference for mixed-effects models driven by SDEs and other stochastic models: a scalable approach from Umberto Picchini
Stratified Monte Carlo and bootstrapping for approximate Bayesian computation
/slideshow/stratified-monte-carlo-and-bootstrapping-for-approximate-bayesian-computation/234125228
Presented on 7 May 2020 at the "One World Approximate Bayesian Computation (ABC) Seminar". A video is available at https://youtu.be/IOPnRfAJ_W8

Approximate Bayesian computation (ABC) is computationally intensive for complex model simulators. To exploit expensive simulations, data resampling via bootstrapping was used with success in [1] to obtain many artificial datasets at little cost and construct a synthetic likelihood. When the same approach is used within ABC to produce a pseudo-marginal ABC-MCMC algorithm, the posterior variance is inflated, producing biased posterior inference. Here we use stratified Monte Carlo to considerably reduce the bias induced by data resampling. We also show that, by employing stratified Monte Carlo, it is possible to obtain reliable inference using a larger-than-usual ABC threshold. Finally, we show that with stratified sampling we obtain a less variable ABC likelihood. In our paper [2] we consider simulation studies for static models (Gaussian, g-and-k distribution, Ising model) and a dynamic model (Lotka-Volterra). For the Lotka-Volterra case study, we compare our results against standard pseudo-marginal ABC and find that our approach is four times more efficient and, given a limited computational budget, explores the posterior surface more thoroughly. A comparison against state-of-the-art sequential Monte Carlo ABC is also reported.

References
[1] R. G. Everitt (2017). Bootstrapped synthetic likelihood. arXiv:1711.05825.
[2] U. Picchini, R. G. Everitt (2019). Stratified sampling and resampling for approximate Bayesian computation. arXiv:1905.07976.
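The variance-reduction idea behind stratified Monte Carlo can be illustrated in a generic setting (a textbook sketch, not the ABC construction of the paper): draw exactly one point in each of n equal strata of [0, 1) instead of n i.i.d. uniforms.

```python
import numpy as np

def plain_mc(f, n, rng):
    """Crude Monte Carlo estimate of E[f(U)], U ~ Uniform(0, 1)."""
    return f(rng.random(n)).mean()

def stratified_mc(f, n, rng):
    """Stratified estimate: one uniform draw in each equal-width
    stratum [i/n, (i+1)/n), which removes the between-strata variance."""
    u = (np.arange(n) + rng.random(n)) / n
    return f(u).mean()
```

For smooth integrands the stratified estimator's variance decays much faster than the O(1/n) rate of crude Monte Carlo, which is the kind of gain the paper exploits when averaging over resampled datasets.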

Published: Sun, 17 May 2020 06:07:42 GMT
Stratified Monte Carlo and bootstrapping for approximate Bayesian computation from Umberto Picchini
Stratified sampling and resampling for approximate Bayesian computation
/UmbertoPicchini/stratified-sampling-and-resampling-for-approximate-bayesian-computation
Presented at the "Second Italian Meeting on Probability and Mathematical Statistics", Vietri.

Published: Tue, 25 Jun 2019 15:35:57 GMT
Stratified sampling and resampling for approximate Bayesian computation from Umberto Picchini
A likelihood-free version of the stochastic approximation EM algorithm (SAEM) for parameter estimation in complex models
/slideshow/a-likelihoodfree-version-of-the-stochastic-approximation-em-algorithm-saem-for-parameter-estimation-in-complex-models/67391141
I show how to obtain approximate maximum likelihood inference for "complex" models having some latent (unobservable) component. By "complex" I mean models with a so-called intractable likelihood, i.e. a likelihood that is unavailable in closed form or too difficult to approximate. I construct a version of SAEM (an EM-type algorithm) that makes it possible to conduct inference for complex models. Traditionally, SAEM is implementable only for models that are fairly tractable analytically. By introducing the concept of a synthetic likelihood, where information is captured by a series of user-defined summary statistics (as in approximate Bayesian computation), it is possible to automate SAEM to run on any model having some latent component.
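The synthetic likelihood (in the sense of Wood, 2010) replaces the intractable likelihood with a Gaussian fitted to simulated summary statistics. A generic sketch, where `simulate` and `summaries` are user-supplied placeholders, not functions from the talk:

```python
import numpy as np

def synthetic_loglik(theta, s_obs, simulate, summaries, M=200, rng=None):
    """Gaussian synthetic log-likelihood of observed summaries s_obs:
    simulate M datasets at theta, reduce each to summary statistics,
    fit a multivariate normal, evaluate its log-density at s_obs."""
    rng = np.random.default_rng() if rng is None else rng
    S = np.array([summaries(simulate(theta, rng)) for _ in range(M)])
    mu = S.mean(axis=0)
    Sigma = np.cov(S, rowvar=False) + 1e-10 * np.eye(S.shape[1])  # jitter
    diff = s_obs - mu
    _, logdet = np.linalg.slogdet(Sigma)
    quad = diff @ np.linalg.solve(Sigma, diff)
    return -0.5 * (quad + logdet + len(mu) * np.log(2 * np.pi))
```

In the likelihood-free SAEM of the talk, a quantity of this type stands in for the analytically intractable complete-data likelihood inside the stochastic approximation updates.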

Published: Wed, 19 Oct 2016 06:41:08 GMT
A likelihood-free version of the stochastic approximation EM algorithm (SAEM) for parameter estimation in complex models from Umberto Picchini
Inference via Bayesian Synthetic Likelihoods for a Mixed-Effects SDE Model of Tumor Growth
/slideshow/inference-via-bayesian-synthetic-likelihoods-for-a-mixedeffects-sde-model-of-tumor-growth/65135203
Presented at the European Meeting of Statisticians 2017, Helsinki, 24-28 July 2017.

Published: Thu, 18 Aug 2016 17:11:08 GMT
Inference via Bayesian Synthetic Likelihoods for a Mixed-Effects SDE Model of Tumor Growth from Umberto Picchini
My data are incomplete and noisy: Information-reduction statistical methods for knowledge extraction can save your day: tools and opportunities for modelling
/slideshow/my-data-are-incomplete-and-noisy-informationreduction-statistical-methods-for-knowledge-extraction-can-save-your-day-tools-and-opportunities-for-modelling/62773944
We review parameter inference for stochastic modelling in complex scenarios, such as bad parameter initialization and near-chaotic dynamics. We show how state-of-the-art methods for state-space models can fail while, in some situations, reducing data to summary statistics (information reduction) enables robust estimation. Wood's synthetic likelihood method is reviewed, and the lecture closes with an example of approximate Bayesian computation methodology. Accompanying code is available at https://github.com/umbertopicchini/pomp-ricker and https://github.com/umbertopicchini/abc_g-and-k

Readership lecture given at Lund University on 7 June 2016. The lecture is of a popular-science nature, hence mathematical detail is kept to a minimum; however, numerous links and references are offered for further reading.

Published: Mon, 06 Jun 2016 15:04:34 GMT
My data are incomplete and noisy: Information-reduction statistical methods for knowledge extraction can save your day: tools and opportunities for modelling from Umberto Picchini
Inference for stochastic differential equations via approximate Bayesian computation
/slideshow/inference-for-stochastic-differential-equations-via-approximate-bayesian-computation/58504552
Despite the title, the methods are appropriate for more general dynamical models (including state-space models). Presentation given at Nordstat 2012, Umeå. Relevant research paper at http://arxiv.org/abs/1204.5459 and software code at https://sourceforge.net/projects/abc-sde/
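As a toy illustration of ABC applied to an SDE (a generic rejection sketch, not the scheme of the paper): simulate Euler-Maruyama paths of a hypothetical Ornstein-Uhlenbeck model and keep parameter draws whose summary statistics land close to the observed ones.

```python
import numpy as np

def simulate_ou(theta, x0, T, dt, rng):
    """Euler-Maruyama discretisation of dX = -a*X dt + s dW,
    with theta = (a, s); the model is purely illustrative."""
    n = int(T / dt)
    x = np.empty(n + 1)
    x[0] = x0
    noise = rng.normal(size=n) * np.sqrt(dt)
    for i in range(n):
        x[i + 1] = x[i] - theta[0] * x[i] * dt + theta[1] * noise[i]
    return x

def abc_rejection(x_obs, prior_sample, n_draws, eps, rng):
    """Vanilla ABC rejection: keep draws whose simulated summaries
    (path mean and std) fall within eps of the observed ones."""
    s_obs = np.array([x_obs.mean(), x_obs.std()])
    kept = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        x = simulate_ou(theta, x_obs[0], T=10.0, dt=0.01, rng=rng)
        s = np.array([x.mean(), x.std()])
        if np.linalg.norm(s - s_obs) < eps:
            kept.append(theta)
    return np.array(kept)
```

In practice, as in the paper, the threshold eps is decreased over a sequence of rounds and the summaries are chosen with care, since they determine how much information about the parameters survives the reduction.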

Published: Sat, 20 Feb 2016 18:42:10 GMT
Inference for stochastic differential equations via approximate Bayesian computation from Umberto Picchini
]]>
742 6 https://cdn.slidesharecdn.com/ss_thumbnails/abcumea-160220184210-thumbnail.jpg?width=120&height=120&fit=bounds presentation Black http://activitystrea.ms/schema/1.0/post http://activitystrea.ms/schema/1.0/posted 0
ABC with data cloning for MLE in state space models /slideshow/abc-with-data-cloning-for-mle-in-state-space-models/58395964 abcdatacloning-160217215913
An application of the "data cloning" method for parameter estimation via MLE aided by Approximate Bayesian Computation. The relevant paper is http://arxiv.org/abs/1505.06318]]>
Wed, 17 Feb 2016 21:59:12 GMT /slideshow/abc-with-data-cloning-for-mle-in-state-space-models/58395964 UmbertoPicchini@slideshare.net(UmbertoPicchini) ABC with data cloning for MLE in state space models UmbertoPicchini An application of the "data cloning" method for parameter estimation via MLE aided by Approximate Bayesian Computation. The relevant paper is http://arxiv.org/abs/1505.06318 <img style="border:1px solid #C3E6D8;float:right;" alt="" src="https://cdn.slidesharecdn.com/ss_thumbnails/abcdatacloning-160217215913-thumbnail.jpg?width=120&amp;height=120&amp;fit=bounds" /><br> An application of the &quot;data cloning&quot; method for parameter estimation via MLE aided by Approximate Bayesian Computation. The relevant paper is http://arxiv.org/abs/1505.06318
ABC with data cloning for MLE in state space models from Umberto Picchini
]]>
485 5 https://cdn.slidesharecdn.com/ss_thumbnails/abcdatacloning-160217215913-thumbnail.jpg?width=120&height=120&fit=bounds presentation Black http://activitystrea.ms/schema/1.0/post http://activitystrea.ms/schema/1.0/posted 0
Accelerated approximate Bayesian computation with applications to protein folding data /slideshow/accelerated-approximate-bayesian-computation-with-applications-to-protein-folding-data/58395566 abcprotein-160217214818
Slides for a seminar given at Dept. Mathematics, Uppsala University, 4 September 2014. Relevant paper is http://arxiv.org/abs/1310.0973]]>
Wed, 17 Feb 2016 21:48:18 GMT /slideshow/accelerated-approximate-bayesian-computation-with-applications-to-protein-folding-data/58395566 UmbertoPicchini@slideshare.net(UmbertoPicchini) Accelerated approximate Bayesian computation with applications to protein folding data UmbertoPicchini Slides for a seminar given at Dept. Mathematics, Uppsala University, 4 September 2014. Relevant paper is http://arxiv.org/abs/1310.0973 <img style="border:1px solid #C3E6D8;float:right;" alt="" src="https://cdn.slidesharecdn.com/ss_thumbnails/abcprotein-160217214818-thumbnail.jpg?width=120&amp;height=120&amp;fit=bounds" /><br> Slides for a seminar given at Dept. Mathematics, Uppsala University, 4 September 2014. Relevant paper is http://arxiv.org/abs/1310.0973
Accelerated approximate Bayesian computation with applications to protein folding data from Umberto Picchini
]]>
366 5 https://cdn.slidesharecdn.com/ss_thumbnails/abcprotein-160217214818-thumbnail.jpg?width=120&height=120&fit=bounds presentation Black http://activitystrea.ms/schema/1.0/post http://activitystrea.ms/schema/1.0/posted 0
Intro to Approximate Bayesian Computation (ABC) /slideshow/intro-to-approximate-bayesian-computation-abc/58394921 abcslides-160217212955
A 3-hour introductory lecture on Approximate Bayesian Computation (ABC), given as part of a PhD course at Lund University, February 2016. For sample codes see http://www.maths.lu.se/kurshemsida/phd-course-fms020f-nams002-statistical-inference-for-partially-observed-stochastic-processes/]]>
Wed, 17 Feb 2016 21:29:55 GMT /slideshow/intro-to-approximate-bayesian-computation-abc/58394921 UmbertoPicchini@slideshare.net(UmbertoPicchini) Intro to Approximate Bayesian Computation (ABC) UmbertoPicchini A 3-hour introductory lecture on Approximate Bayesian Computation (ABC), given as part of a PhD course at Lund University, February 2016. For sample codes see http://www.maths.lu.se/kurshemsida/phd-course-fms020f-nams002-statistical-inference-for-partially-observed-stochastic-processes/ <img style="border:1px solid #C3E6D8;float:right;" alt="" src="https://cdn.slidesharecdn.com/ss_thumbnails/abcslides-160217212955-thumbnail.jpg?width=120&amp;height=120&amp;fit=bounds" /><br> A 3-hour introductory lecture on Approximate Bayesian Computation (ABC), given as part of a PhD course at Lund University, February 2016. For sample codes see http://www.maths.lu.se/kurshemsida/phd-course-fms020f-nams002-statistical-inference-for-partially-observed-stochastic-processes/
Intro to Approximate Bayesian Computation (ABC) from Umberto Picchini
]]>
1374 21 https://cdn.slidesharecdn.com/ss_thumbnails/abcslides-160217212955-thumbnail.jpg?width=120&height=120&fit=bounds presentation Black http://activitystrea.ms/schema/1.0/post http://activitystrea.ms/schema/1.0/posted 0
https://cdn.slidesharecdn.com/profile-photo-UmbertoPicchini-48x48.jpg?cb=1679053208 Interested in statistical inference and computational statistics for stochastic processes and stochastic modelling. umbertopicchini.github.io/ https://cdn.slidesharecdn.com/ss_thumbnails/slidesguidedabc-230317114139-0d7ea702-thumbnail.jpg?width=320&height=320&fit=bounds slideshow/guided-sequential-abc-schemes-for-simulationbased-inference/256590007 Guided sequential ABC ... https://cdn.slidesharecdn.com/ss_thumbnails/main-220401152448-thumbnail.jpg?width=320&height=320&fit=bounds slideshow/bayesian-inference-for-stoch-memsbayesian-inference-for-mixedeffects-models-driven-by-sdes-and-other-stochastic-models-a-scalable-approach/251492401 Bayesian inference for... https://cdn.slidesharecdn.com/ss_thumbnails/stratifiedabc-200517060742-thumbnail.jpg?width=320&height=320&fit=bounds slideshow/stratified-monte-carlo-and-bootstrapping-for-approximate-bayesian-computation/234125228 Stratified Monte Carlo...