際際滷shows by user azeari (Matt Moores), hosted on SlideShare. Feed last updated Thu, 14 Oct 2021.

Bayesian Inference and Uncertainty Quantification for Inverse Problems
Thu, 14 Oct 2021 | /azeari/bayesian-inference-and-uncertainty-quantification-for-inverse-problems
So-called inverse problems arise when the parameters of a physical system cannot be directly observed. The mapping between these latent parameters and the space of noisy observations is represented as a mathematical model, often involving a system of differential equations. We seek to infer the parameter values that best fit our observed data. However, it is also vital to quantify the uncertainty in these parameters accurately, particularly when the output of the model will be used for forecasting. Bayesian inference provides well-calibrated uncertainty estimates, represented by the posterior distribution over the parameters. In this talk, I will give a brief introduction to Markov chain Monte Carlo (MCMC) algorithms for sampling from the posterior distribution and describe how they can be combined with numerical solvers for the forward model. We apply these methods to two examples of ODE models: growth curves in ecology, and thermogravimetric analysis (TGA) in chemistry. This is joint work with Matthew Berry, Mark Nelson, Brian Monaghan and Raymond Longbottom.
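
To make the workflow concrete, here is a minimal sketch of random-walk Metropolis wrapped around a numerical solver for a logistic growth curve, the kind of ODE used in the ecology example. The data, priors, proposal scale, and the choice of the deSolve package are illustrative assumptions, not the implementation from the talk.

```r
## Minimal sketch: random-walk Metropolis for a logistic growth ODE.
## All data, priors and tuning constants are invented for illustration.
library(deSolve)

logistic <- function(t, y, parms) {
  with(as.list(c(y, parms)), list(r * N * (1 - N / K)))
}

times <- 0:20
N0 <- c(N = 5)
truth <- ode(N0, times, logistic, c(r = 0.4, K = 100))[, "N"]
set.seed(1)
y_obs <- truth + rnorm(length(times), sd = 3)    # noisy observations

log_post <- function(theta) {                    # theta = (log r, log K)
  pars <- c(r = exp(theta[1]), K = exp(theta[2]))
  pred <- ode(N0, times, logistic, pars)[, "N"]  # forward model solve
  sum(dnorm(y_obs, pred, sd = 3, log = TRUE)) +  # Gaussian likelihood
    sum(dnorm(theta, 0, 10, log = TRUE))         # vague prior on log scale
}

niter <- 5000
chain <- matrix(NA_real_, niter, 2)
theta <- c(log(0.5), log(80)); lp <- log_post(theta)
for (i in 1:niter) {
  prop <- theta + rnorm(2, sd = 0.05)            # random-walk proposal
  lp_prop <- log_post(prop)
  if (log(runif(1)) < lp_prop - lp) { theta <- prop; lp <- lp_prop }
  chain[i, ] <- theta
}
exp(colMeans(chain[-(1:1000), ]))  # crude plug-in estimates of (r, K)
```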

bayesImageS: an R package for Bayesian image analysis
Tue, 10 Jul 2018 | /slideshow/bayesimages-an-r-package-for-bayesian-image-analysis/105239891
There are many approaches to Bayesian computation with intractable likelihoods, including the exchange algorithm, approximate Bayesian computation (ABC), thermodynamic integration, and composite likelihood. These approaches vary in accuracy, as well as in scalability to datasets of significant size. The Potts model is an example where such methods are required, due to its intractable normalising constant. This model is a type of Markov random field, which is commonly used for image segmentation. The dimension of its parameter space increases linearly with the number of pixels in the image, making this a challenging application for scalable Bayesian computation. My talk will introduce various algorithms in the context of the Potts model and describe their implementation in C++, using OpenMP for parallelism. I will also discuss the process of releasing this software as an open source R package on the CRAN repository.
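
For intuition about why the normalising constant is intractable, here is a naive base-R sketch of single-site Gibbs sampling from a k-state Potts prior at a fixed inverse temperature: each full conditional only needs the neighbouring labels, yet the joint normalising constant sums over k^(n*n) configurations. The lattice size, beta, and the plain loops are illustrative; bayesImageS performs blocked updates in compiled C++.

```r
## Single-site Gibbs sweep for a k-state Potts model on an n x n lattice
## with 4 nearest neighbours and inverse temperature beta. Illustrative only.
potts_gibbs_sweep <- function(z, beta, k) {
  n <- nrow(z)
  for (i in 1:n) for (j in 1:n) {
    nb <- c(if (i > 1) z[i - 1, j], if (i < n) z[i + 1, j],
            if (j > 1) z[i, j - 1], if (j < n) z[i, j + 1])
    s <- tabulate(nb, nbins = k)          # neighbour matches per label
    p <- exp(beta * s)                    # unnormalised full conditional
    z[i, j] <- sample.int(k, 1, prob = p)
  }
  z
}

set.seed(42)
k <- 3; n <- 25
z <- matrix(sample.int(k, n * n, replace = TRUE), n, n)
for (sweep in 1:100) z <- potts_gibbs_sweep(z, beta = 0.8, k = k)
image(z, col = grey.colors(k))            # clusters emerge as beta grows
```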

Exploratory Analysis of Multivariate Data
Tue, 12 Sep 2017 | /slideshow/exploratory-analysis-of-multivariate-data/79697426
Exploratory data analysis in R using the 'visdat' and 'FactoMineR' packages.
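
A minimal sketch of the workflow on built-in datasets (the talk's own data are not reproduced here):

```r
library(visdat)
library(FactoMineR)

vis_dat(airquality)                     # column types and missingness at a glance

res <- PCA(iris[, 1:4], graph = FALSE)  # principal component analysis
summary(res)
plot(res, choix = "var")                # correlation circle of the variables
```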

R package bayesImageS: Scalable Inference for Intractable Likelihoods
Mon, 04 Sep 2017 | /slideshow/r-package-bayesimages-scalable-inference-for-intractable-likelihoods/79426257
There are many approaches to Bayesian computation with intractable likelihoods, including the exchange algorithm and approximate Bayesian computation (ABC). A serious drawback of these algorithms is that they do not scale well for models with a large state space. Markov random fields, such as the Ising/Potts model and the exponential random graph model (ERGM), are particularly challenging because the number of discrete variables increases linearly with the size of the image or graph. The likelihood of these models cannot be computed directly, due to the presence of an intractable normalising constant. In this context, it is necessary to employ algorithms that provide a suitable compromise between accuracy and computational cost. Bayesian indirect likelihood (BIL) is a class of methods that approximate the likelihood function using a surrogate model. This model can be trained using a pre-computation step, utilising massively parallel hardware to simulate auxiliary variables. We review various types of surrogate model that can be used in BIL. In the case of the Potts model, we introduce a parametric approximation to the score function that incorporates its known properties, such as heteroskedasticity and critical temperature. We demonstrate this method on 2D satellite remote sensing and 3D computed tomography (CT) images. We achieve a hundredfold improvement in elapsed runtime compared to the exchange algorithm or ABC. Our algorithm has been implemented in the R package bayesImageS, which is available from CRAN.
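
As a toy of the pre-computation idea, a 1D Ising chain can be simulated exactly one site at a time, so the score function (the expected sufficient statistic) can be tabulated on a grid of beta in advance and replaced by a cheap surrogate. The interpolation below stands in for the parametric approximation described above, and the exact per-bond value tanh(beta) gives a check.

```r
## Pre-computed surrogate for the score function of a 1D Ising chain.
sim_S <- function(beta, m = 500) {      # sufficient statistic of one draw
  z <- numeric(m); z[1] <- sample(c(-1, 1), 1)
  p_same <- exp(beta) / (exp(beta) + exp(-beta))
  for (i in 2:m) z[i] <- if (runif(1) < p_same) z[i - 1] else -z[i - 1]
  sum(z[-m] * z[-1])
}

set.seed(7)
grid <- seq(0, 2, by = 0.1)
Ebar <- sapply(grid, function(b) mean(replicate(100, sim_S(b))))
score_hat <- approxfun(grid, Ebar)      # interpolating surrogate

score_hat(0.5) / 499                    # per-bond estimate ...
tanh(0.5)                               # ... versus the exact value
```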

bayesImageS: Bayesian computation for medical Image Segmentation using a hidden Potts Model
Mon, 24 Jul 2017 | /slideshow/bayesimages-bayesian-computation-for-medical-image-segmentation-using-a-hidden-potts-model/78194683
There are many approaches to Bayesian computation with intractable likelihoods, including the exchange algorithm, approximate Bayesian computation (ABC), thermodynamic integration, and composite likelihood. These approaches vary in accuracy, as well as in scalability to datasets of significant size. The Potts model is an example where such methods are required, due to its intractable normalising constant. This model is a type of Markov random field, which is commonly used for image segmentation. The dimension of its parameter space increases linearly with the number of pixels in the image, making this a challenging application for scalable Bayesian computation. My talk will introduce various algorithms in the context of the Potts model and describe their implementation in C++, using OpenMP for parallelism.
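
For contrast with the approximate methods above, here is a sketch of the exchange algorithm for the inverse temperature of a 1D Ising chain, where the exact simulation of an auxiliary variable that the algorithm requires is easy; in 2D and 3D that step is exactly what becomes expensive. The prior, proposal, and chain length are invented for illustration.

```r
## Exchange algorithm (Murray et al. 2006) for beta in a 1D Ising chain.
sim_chain <- function(beta, m) {
  z <- numeric(m); z[1] <- sample(c(-1, 1), 1)
  p_same <- exp(beta) / (exp(beta) + exp(-beta))
  for (i in 2:m) z[i] <- if (runif(1) < p_same) z[i - 1] else -z[i - 1]
  z
}
S <- function(z) sum(head(z, -1) * tail(z, -1))   # sufficient statistic

set.seed(3)
m <- 500
z_obs <- sim_chain(0.7, m); S_obs <- S(z_obs)

niter <- 2000; beta <- 0.1; out <- numeric(niter)
for (t in 1:niter) {
  beta_p <- abs(beta + rnorm(1, sd = 0.1))  # reflecting random walk, flat prior
  w <- sim_chain(beta_p, m)                 # exact auxiliary draw
  ## the intractable normalising constants cancel in this ratio:
  log_a <- (beta_p - beta) * S_obs + (beta - beta_p) * S(w)
  if (log(runif(1)) < log_a) beta <- beta_p
  out[t] <- beta
}
mean(out[-(1:500)])                         # posterior mean, close to 0.7
```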

Approximate Bayesian computation for the Ising/Potts model
Mon, 12 Jun 2017 | /slideshow/approximate-bayesian-computation-for-the-isingpotts-model/76855973
Bayes' formula involves the likelihood function, p(y|theta), which poses a problem when the likelihood is unavailable in closed form. ABC is a method for approximating the posterior p(theta|y) without evaluating the likelihood. Instead, pseudo-data are simulated from a generative model and compared with the observations. This talk will give an introduction to ABC algorithms: rejection sampling, ABC-MCMC and ABC-SMC. The application of these algorithms to image analysis will be presented as an illustrative example. These methods have been implemented in the R package bayesImageS. This is joint work with Christian Robert (Warwick/Dauphine), Kerrie Mengersen and Christopher Drovandi (QUT).
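
A minimal sketch of the rejection sampler on a toy Gaussian model, where the summary statistic of the pseudo-data can be simulated directly; the prior, summary, and tolerance are illustrative choices.

```r
## ABC rejection sampling for the mean of a Gaussian with known sd = 1.
set.seed(11)
y <- rnorm(50, mean = 2)                     # observed data
s_obs <- mean(y)                             # summary statistic

n_sim <- 1e5
theta <- rnorm(n_sim, 0, 5)                  # draws from the prior
s_sim <- rnorm(n_sim, theta, 1 / sqrt(50))   # summary of simulated pseudo-data
keep <- abs(s_sim - s_obs) < 0.05            # tolerance epsilon
post <- theta[keep]
c(mean = mean(post), sd = sd(post))          # compare with conjugate posterior
```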

Importing satellite imagery into R from NASA and the U.S. Geological Survey
Tue, 21 Mar 2017 | /slideshow/importing-satellite-imagery-into-r-from-nasa-and-the-us-geological-survey/73419852
Warwick R Users' Group, March 16, 2017: https://www2.warwick.ac.uk/fac/sci/wdsi/events/wrug/resources
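
As a pointer to the topic, a minimal sketch of reading downloaded Landsat bands with the raster package and computing a vegetation index; the file names are hypothetical, and the talk itself may have used different packages.

```r
## Hypothetical file names: scenes can be downloaded from USGS EarthExplorer.
library(raster)
b4 <- raster("LC08_L1TP_203024_B4.TIF")   # red band
b5 <- raster("LC08_L1TP_203024_B5.TIF")   # near-infrared band
ndvi <- (b5 - b4) / (b5 + b4)             # normalised difference vegetation index
plot(ndvi)
```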

Accelerating Pseudo-Marginal MCMC using Gaussian Processes
Tue, 24 Jan 2017 | /slideshow/accelerating-pseudomarginal-mcmc-using-gaussian-processes/71339983
The grouped independence Metropolis-Hastings (GIMH) and Markov chain within Metropolis (MCWM) algorithms are pseudo-marginal methods used to perform Bayesian inference in latent variable models. These methods replace intractable likelihood calculations with unbiased estimates within Markov chain Monte Carlo algorithms. The GIMH method has the posterior of interest as its limiting distribution, but suffers from poor mixing if it is too computationally intensive to obtain high-precision likelihood estimates. The MCWM algorithm has better mixing properties, but less theoretical support. In this paper we accelerate the GIMH method by using a Gaussian process (GP) approximation to the log-likelihood, training the GP on a short pilot run of the MCWM algorithm. Our new method, GP-GIMH, is illustrated on simulated data from a stochastic volatility model and a gene network model. Our approach produces reasonable estimates of the univariate and bivariate posterior distributions, as well as the posterior correlation matrix, with at least an order of magnitude improvement in computing time.
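
A one-dimensional sketch of the idea: fit a Gaussian process to noisy log-likelihood estimates from a pilot run, then run Metropolis-Hastings on the cheap GP posterior mean. The stand-in target, kernel, noise level, and tuning constants are invented for illustration.

```r
## GP surrogate for a noisy log-likelihood, then MH on the surrogate.
set.seed(5)
sqexp <- function(a, b, ell = 0.5) exp(-outer(a, b, "-")^2 / (2 * ell^2))

x_pilot <- seq(-3, 3, length.out = 25)            # pilot-run design points
loglik <- function(th) -0.5 * (th - 1)^2 / 0.3^2  # stand-in exact log-likelihood
ll_hat <- loglik(x_pilot) + rnorm(25, sd = 0.5)   # noisy unbiased estimates

K <- sqexp(x_pilot, x_pilot) + diag(0.5^2, 25)    # kernel + known noise variance
alpha <- solve(K, ll_hat)
gp_mean <- function(th) drop(sqexp(th, x_pilot) %*% alpha)  # GP posterior mean

niter <- 5000; th <- 0; out <- numeric(niter)
for (i in 1:niter) {                              # MH on the cheap surrogate
  prop <- th + rnorm(1, sd = 0.5)
  if (log(runif(1)) < gp_mean(prop) - gp_mean(th)) th <- prop
  out[i] <- th
}
c(mean(out), sd(out))            # roughly N(1, 0.3) under the stand-in target
```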

R package 'bayesImageS': a case study in Bayesian computation using Rcpp and OpenMP
Fri, 02 Dec 2016 | /slideshow/r-package-bayesimages-a-case-study-in-bayesian-computation-using-rcpp-and-openmp/69771984
There are many approaches to Bayesian computation with intractable likelihoods, including the exchange algorithm, approximate Bayesian computation (ABC), thermodynamic integration, and composite likelihood. These approaches vary in accuracy, as well as in scalability to datasets of significant size. The Potts model is an example where such methods are required, due to its intractable normalising constant. This model is a type of Markov random field, which is commonly used for image segmentation. The dimension of its parameter space increases linearly with the number of pixels in the image, making this a challenging application for scalable Bayesian computation. My talk will introduce various algorithms in the context of the Potts model and describe their implementation in C++, using OpenMP for parallelism. I will also discuss the process of releasing this software as an open source R package on the CRAN repository.
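
A minimal sketch of the implementation strategy: an OpenMP-parallel C++ kernel compiled from R with Rcpp, here computing the Potts sufficient statistic (the number of matching 4-neighbour pairs). This assumes a compiler toolchain with OpenMP support and is not the package's actual code.

```r
library(Rcpp)
cppFunction(plugins = "openmp", code = '
  double suffStat(IntegerMatrix z) {
    int n = z.nrow(), m = z.ncol();
    double s = 0.0;
    // parallel reduction over rows of the lattice
    #pragma omp parallel for reduction(+:s)
    for (int i = 0; i < n; i++)
      for (int j = 0; j < m; j++) {
        if (i + 1 < n && z(i, j) == z(i + 1, j)) s += 1.0;
        if (j + 1 < m && z(i, j) == z(i, j + 1)) s += 1.0;
      }
    return s;
  }')

z <- matrix(sample.int(3, 400 * 400, replace = TRUE), 400, 400)
suffStat(z)
```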

Bayesian modelling and computation for Raman spectroscopy
Tue, 31 May 2016 | /slideshow/bayesian-modelling-and-computation-for-raman-spectroscopy/62574345
Raman spectroscopy can be used to identify molecules by the characteristic scattering of light from a laser. Each Raman-active dye label has a unique spectral signature, comprising the locations and amplitudes of its peaks. The Raman spectrum is discretised into a multivariate observation that is highly collinear, so it lends itself to a reduced-rank representation. We introduce a sequential Monte Carlo (SMC) algorithm to separate this signal into a series of peaks plus a smoothly-varying baseline, corrupted by additive white noise. By incorporating this representation into a Bayesian functional regression, we can quantify the relationship between dye concentration and peak intensity. We also estimate the model evidence using SMC to investigate long-range dependence between peaks. These methods have been implemented as an R package, using RcppEigen and OpenMP.
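
A toy version of the signal model, with a single Gaussian peak on a linear baseline plus white noise, fitted by nonlinear least squares; the SMC sampler described above handles many peaks and full posterior uncertainty, which this sketch does not attempt.

```r
## One Gaussian peak on a linear baseline, corrupted by additive white noise.
set.seed(9)
wavenum <- seq(200, 1800, by = 4)
baseline <- 50 + 0.02 * wavenum                      # smooth background
peak <- 400 * exp(-(wavenum - 1000)^2 / (2 * 15^2))  # Raman peak
y <- baseline + peak + rnorm(length(wavenum), sd = 5)

fit <- nls(y ~ b0 + b1 * wavenum + A * exp(-(wavenum - mu)^2 / (2 * s^2)),
           start = list(b0 = 40, b1 = 0.01, A = 300, mu = 990, s = 20))
coef(fit)                # recovers baseline slope and peak location/amplitude

plot(wavenum, y, type = "l", xlab = "Raman shift", ylab = "intensity")
lines(wavenum, predict(fit), col = "red")
```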

Final PhD Seminar
Wed, 03 Jun 2015 | /slideshow/final-phd/48955085

Precomputation for SMC-ABC with undirected graphical models
Mon, 07 Jul 2014 | /slideshow/precomputation-for-smcabc-with-undirected-graphical-models/36722342
ABC in Sydney, July 4, 2014.
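
A toy sketch of the pre-computation idea on a model where simulation is cheap: summary statistics are simulated on a parameter grid in advance (in parallel, in the real application) and then re-used inside ABC rejection, instead of simulating afresh at every proposal. The Poisson model, grid, and tolerance are illustrative stand-ins for the undirected graphical models of the talk.

```r
set.seed(13)
y <- rpois(100, lambda = 4); s_obs <- mean(y)   # observed data and summary

grid <- seq(0.5, 10, by = 0.1)                  # parameter grid
table_s <- lapply(grid, function(l) replicate(100, mean(rpois(100, l))))

eps <- 0.1                                      # ABC tolerance
post <- unlist(lapply(seq_along(grid), function(i) {
  n_acc <- sum(abs(table_s[[i]] - s_obs) < eps) # matches at this grid point
  rep(grid[i], n_acc)                           # uniform prior over the grid
}))
c(mean = mean(post), sd = sd(post))             # concentrated near lambda = 4
```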

Intro to ABC
Mon, 07 Jul 2014 | /slideshow/intro-to-abc/36722283
ABC in Sydney, July 3, 2014.

Pre-computation for ABC in image analysis
Wed, 14 May 2014 | /slideshow/moores-talk/34695475
MCMSki IV (the 5th IMS-ISBA joint meeting), January 2014, Chamonix Mont-Blanc, France. The associated journal article is available on arXiv: http://arxiv.org/abs/1403.4359

Variational Bayes
Tue, 09 Oct 2012 | /slideshow/variational-bayes/14648147
Variational Bayesian inference using the R package VBmix.
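
A minimal sketch of mean-field variational Bayes for a Gaussian with unknown mean and precision, following the conjugate coordinate-ascent updates in Bishop (2006, ch. 10); VBmix applies the same machinery to Gaussian mixtures.

```r
## CAVI for y ~ N(mu, 1/tau) with a Normal-Gamma prior on (mu, tau).
set.seed(2)
y <- rnorm(200, mean = 3, sd = 2)
N <- length(y); ybar <- mean(y)

mu0 <- 0; lambda0 <- 1e-3; a0 <- 1e-3; b0 <- 1e-3   # prior hyperparameters

a <- a0 + (N + 1) / 2       # shape of q(tau) is fixed by conjugacy
b <- 1                      # initial rate of q(tau)
for (iter in 1:50) {
  Etau <- a / b
  kappa <- (lambda0 + N) * Etau                     # precision of q(mu)
  m <- (lambda0 * mu0 + N * ybar) / (lambda0 + N)   # mean of q(mu)
  Emu2 <- m^2 + 1 / kappa                           # E[mu^2] under q(mu)
  b <- b0 + 0.5 * (sum(y^2) - 2 * m * sum(y) + N * Emu2) +
       0.5 * lambda0 * (Emu2 - 2 * mu0 * m + mu0^2) # update q(tau)
}
c(mean_mu = m, sd_mu = sqrt(1 / kappa), sd_y = sqrt(b / a))  # plug-in summaries
```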

Parallel R
Tue, 25 Sep 2012 | /slideshow/parallel-r/14461458
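
No abstract accompanies this deck; as a pointer to the topic, here are the two standard approaches from the 'parallel' package that ships with R:

```r
library(parallel)

slow_task <- function(i) { Sys.sleep(0.1); sqrt(i) }

## forked workers (mc.cores > 1 is not supported on Windows)
res1 <- mclapply(1:40, slow_task, mc.cores = 4)

## socket cluster, portable across platforms
cl <- makeCluster(4)
res2 <- parLapply(cl, 1:40, slow_task)
stopCluster(cl)
```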

Informative Priors for Segmentation of Medical Images
Thu, 16 Feb 2012 | /slideshow/bayes-on-the/11623518
There is an abundance of prior information available for image-guided radiotherapy, making it ideally suited to Bayesian techniques. I will demonstrate some results from applying the method of Teo, Sapiro & Wandell (1997) to cone-beam computed tomography (CT). A previous CT scan of the same object forms the prior expectation. The posterior probabilities of class membership are smoothed by diffusion, before each pixel is labelled according to the maximum a posteriori (MAP) estimate. The effects of the prior and of the smoothing are discussed, and some potential extensions to this method are proposed.
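
A simulated sketch of the scheme described above: per-pixel Gaussian class likelihoods are combined with a prior probability map, the posterior probabilities are smoothed (a box filter below stands in for diffusion), and each pixel is labelled by the MAP rule. All images and parameters are invented; a registered prior CT scan would replace the flat prior map.

```r
set.seed(8)
n <- 64
truth <- outer(1:n, 1:n, function(i, j)            # disc phantom, labels 1/2
  1 + (sqrt((i - 32)^2 + (j - 32)^2) < 15))
y <- matrix(rnorm(n * n, mean = c(100, 160)[truth], sd = 20), n, n)

mu <- c(100, 160); sigma <- 20                     # class-wise intensity model
prior <- array(0.5, dim = c(n, n, 2))              # stand-in for the prior scan
post <- array(dim = c(n, n, 2))
for (k in 1:2) post[, , k] <- prior[, , k] * dnorm(y, mu[k], sigma)
post <- post / c(post[, , 1] + post[, , 2])        # normalise over the 2 slices

box <- function(p) {                               # 3x3 mean filter
  q <- p
  for (i in 2:(n - 1)) for (j in 2:(n - 1))
    q[i, j] <- mean(p[(i - 1):(i + 1), (j - 1):(j + 1)])
  q
}
for (k in 1:2) post[, , k] <- box(post[, , k])     # diffusion stand-in

labels <- apply(post, c(1, 2), which.max)          # MAP label per pixel
mean(labels == truth)                              # segmentation accuracy
```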

About the author: PDRA on the EPSRC-funded project "In Situ Nanoparticle Assemblies for Healthcare Diagnostics and Therapy" with Prof. Mark Girolami, in collaboration with the University of Strathclyde, Glasgow. My dissertation involved the development of Bayesian computational methods for spatial analysis of images, with applications to medical imaging and satellite remote sensing. I was previously involved in the Visible Cell project at the Institute for Molecular Bioscience, UQ. I also have over a decade of experience in R&D, having worked for various international companies. Blog: mattstats.wordpress.com/