PhD Seminar: Lijin Zhang, Convergence Analysis of Manifold MALA with Application to Generalized Linear Mixed Models

Nov 13, 2023, 2:00 PM to 3:00 PM

Speaker: Lijin Zhang, PhD Candidate, Department of Statistics, Iowa State University

Title: Convergence analysis of manifold MALA with application to generalized linear mixed models

Abstract: Markov chain Monte Carlo (MCMC) is widely applied to fit complex, practical models in various disciplines, such as engineering, physics, and agriculture. Indeed, when the target density f is not available in closed form, a Markov chain with stationary density f is constructed to form Monte Carlo estimators of means with respect to f. Although a Harris ergodic Markov chain converges to the target distribution as the Monte Carlo sample size tends to infinity, so that population means can be consistently estimated by the corresponding MCMC estimators, in practice it is important to estimate the standard errors associated with these Monte Carlo estimates, since only a finite sample is available. Geometric ergodicity of a Markov chain is therefore highly desirable, as it is the most standard route to a central limit theorem (CLT) for MCMC estimators and to valid standard errors.
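To illustrate why the CLT matters in practice, here is a minimal Python sketch (not part of the talk) of the batch-means standard error for a scalar MCMC estimate; the estimator is consistent when the chain admits a CLT, for example under geometric ergodicity. The sqrt(n) batch size is a common default, not a prescription.

```python
# Minimal sketch: batch-means standard error for the mean of a scalar
# MCMC output. Valid asymptotically when a CLT holds for the chain,
# e.g., under geometric ergodicity.
import numpy as np

def batch_means_se(samples):
    """Standard error of the MCMC sample mean via non-overlapping batch means."""
    n = len(samples)
    b = int(np.floor(np.sqrt(n)))   # batch size ~ sqrt(n), a common default
    a = n // b                      # number of batches
    batches = np.asarray(samples[:a * b]).reshape(a, b).mean(axis=1)
    mu_hat = batches.mean()
    var_hat = b * np.sum((batches - mu_hat) ** 2) / (a - 1)  # estimates asymptotic variance
    return np.sqrt(var_hat / n)
```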

Among the different MCMC algorithms, Metropolis-Hastings (MH) algorithms are predominant. This thesis establishes conditions under which MH algorithms with a position-dependent proposal covariance matrix do or do not converge at a geometric rate. Indeed, we provide these results for MH algorithms whose normal proposal density has a general mean function c(x) and a covariance matrix G(x) depending on the current position x of the Markov chain. For modern variants of the popular Metropolis-adjusted Langevin algorithm (MALA), such as the manifold MALA (MMALA), which adapts to the geometry of the target distribution, the proposal covariance matrix changes in every iteration. Thus, we provide conditions for the geometric ergodicity of different variants of the Langevin algorithms, e.g., MMALA, position-dependent MALA (PMALA), and pre-conditioned MALA (PCMALA). We also provide conditions guaranteeing geometric ergodicity of the pre-conditioned unadjusted Langevin algorithm (PCULA).
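To make the setup concrete, the following minimal Python sketch (not from the thesis) performs a single MH step with a normal proposal whose mean c(x) and covariance depend on the current position x. The drift used here is the simplified PCMALA/PMALA-style form x + (h/2) G(x)^{-1} grad log f(x); the full MMALA mean includes additional curvature-correction terms involving derivatives of G that are omitted. Function names and the step size h are illustrative.

```python
# Minimal sketch: one Metropolis-Hastings step with a position-dependent
# normal proposal N(c(x), h * G(x)^{-1}). Simplified MALA-type drift;
# full MMALA adds curvature terms omitted here.
import numpy as np

def log_normal_density(y, mean, cov):
    """Log density of N(mean, cov) evaluated at y."""
    d = len(mean)
    diff = y - mean
    _, logdet = np.linalg.slogdet(cov)
    quad = diff @ np.linalg.solve(cov, diff)
    return -0.5 * (d * np.log(2 * np.pi) + logdet + quad)

def mh_step(x, log_f, grad_log_f, G, h, rng):
    """One MH step targeting density f with a position-dependent proposal."""
    G_inv_x = np.linalg.inv(G(x))
    mean_x = x + 0.5 * h * G_inv_x @ grad_log_f(x)     # proposal mean c(x)
    y = rng.multivariate_normal(mean_x, h * G_inv_x)   # draw proposal
    # Reverse-move quantities for the MH acceptance ratio
    G_inv_y = np.linalg.inv(G(y))
    mean_y = y + 0.5 * h * G_inv_y @ grad_log_f(y)     # c(y)
    log_alpha = (log_f(y) - log_f(x)
                 + log_normal_density(x, mean_y, h * G_inv_y)
                 - log_normal_density(y, mean_x, h * G_inv_x))
    return y if np.log(rng.uniform()) < log_alpha else x
```

Iterating this step produces the Markov chain whose geometric ergodicity the conditions in the talk address; dropping the accept/reject decision gives an unadjusted (ULA-type) variant.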

Generalized linear mixed models (GLMMs), in which the response variable is allowed to follow a non-Gaussian distribution, are popular statistical models, although their likelihood functions are not available in closed form. GLMMs can handle the over-dispersion often present in real-world data, and MCMC methods are generally used to fit both frequentist and Bayesian GLMMs. Since the likelihood function for GLMMs is not available in closed form, Monte Carlo EM or Monte Carlo maximum likelihood methods are often used for inference, and implementing these methods requires simulation from the conditional (on the data) distribution of the random effects. In this thesis, our general conditions for geometric ergodicity of variants of Langevin algorithms are verified in the context of conditional simulation from the two most popular GLMMs, namely the binomial GLMM with the logit link and the Poisson GLMM with the log link. These results have important practical implications, as users can now confidently fit these models by attaching asymptotically valid standard errors to their Markov chain-based Monte Carlo estimates.
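For concreteness, here is a minimal sketch (not from the thesis) of the conditional log density of the random effects and its gradient for a Poisson GLMM with log link, assuming the standard formulation y_i | u ~ Poisson(exp(eta_i)) with eta = X beta + Z u and u ~ N(0, Sigma); the function names are hypothetical.

```python
# Minimal sketch: conditional (on data y) log density of random effects u
# in a Poisson GLMM with log link, plus its gradient -- the two ingredients
# a Langevin-type sampler needs.
# Assumed model: y_i | u ~ Poisson(exp(eta_i)), eta = X @ beta + Z @ u,
# u ~ N(0, Sigma).
import numpy as np

def make_conditional(y, X, Z, beta, Sigma):
    Sigma_inv = np.linalg.inv(Sigma)

    def log_f(u):
        eta = X @ beta + Z @ u
        # Poisson log likelihood (dropping the log(y!) constant) + N(0, Sigma) prior
        return y @ eta - np.exp(eta).sum() - 0.5 * u @ Sigma_inv @ u

    def grad_log_f(u):
        eta = X @ beta + Z @ u
        return Z.T @ (y - np.exp(eta)) - Sigma_inv @ u

    return log_f, grad_log_f
```

These two functions could be passed directly to a Langevin-type update such as the mh_step sketch above to simulate from the conditional distribution of the random effects.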