
Browsing by Subject "Hamiltonian Monte Carlo"


  • Räisä, Ossi (2021)
    Differential privacy has over the past decade become a widely used framework for privacy-preserving machine learning. At the same time, Markov chain Monte Carlo (MCMC) algorithms, particularly Metropolis-Hastings (MH) algorithms, have become an increasingly popular method of performing Bayesian inference. Surprisingly, their combination has not received much attention in the literature. This thesis introduces the existing research on differentially private MH algorithms, proves tighter privacy bounds for them using recent developments in differential privacy, and develops two new differentially private MH algorithms: an algorithm that uses subsampling to lower privacy costs, and a differentially private variant of the Hamiltonian Monte Carlo algorithm. Privacy bounds are proved for both new algorithms, and convergence to the exact posterior is proved for the latter. The performance of the existing and the new algorithms is compared on several Bayesian inference problems, revealing that no algorithm is clearly better than the others, and that subsampling is likely useful only for lowering computational costs.
  • Penttinen, Jussi (2021)
    HMC is a computational method built to sample efficiently from a high-dimensional distribution. Sampling from a distribution is typically a statistical problem, and hence much of the literature on Hamiltonian Monte Carlo is written in the mathematical language of probability theory, which is perhaps not ideally suited to HMC, since HMC is at its core differential geometry. The purpose of this text is to present the differential geometric tools needed in HMC and then methodically build the algorithm itself. Since Lee's introductory book on smooth manifolds already covers the foundations, and to avoid reproducing his work, some basic knowledge of differential geometry is assumed of the reader. Similarly, since the author is more comfortable with the notions of differential geometry, and to keep this text shorter, most theorems connected to measure and probability theory are omitted from this work. The first chapter is an introduction that covers the bare minimum of measure theory needed to motivate Hamiltonian Monte Carlo. The bulk of this text is in the second and third chapters. The second chapter presents the concepts of differential geometry needed to understand the abstract construction of Hamiltonian Monte Carlo. Readers familiar with differential geometry can skip the second chapter, though it may be worthwhile to at least flip through it to pick up the notation used in this text. The third chapter is the core of this text: there the algorithm is methodically built on the groundwork laid in the previous chapters. The most important part, and the theoretical heart of the algorithm, is presented in the sections discussing the lift of the target measure. The fourth chapter provides brief practical insight into implementing HMC and briefly discusses how HMC is currently being improved.
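The Metropolis-Hastings algorithm that the first abstract builds on can be sketched minimally. This is the generic random-walk version, not either thesis's algorithm; the function names, the Gaussian proposal, and the standard-normal target are illustrative assumptions.

```python
import math
import random

def metropolis_hastings(log_post, x0, n_samples, step=1.0):
    """Random-walk Metropolis-Hastings with a Gaussian proposal (1-D sketch)."""
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step)
        # Accept with probability min(1, pi(proposal) / pi(x)),
        # computed on the log scale for numerical stability.
        log_alpha = log_post(proposal) - log_post(x)
        if math.log(random.random()) < log_alpha:
            x = proposal
        samples.append(x)
    return samples

# Illustrative target: standard normal, log-density up to an additive constant.
random.seed(0)
samples = metropolis_hastings(lambda x: -0.5 * x * x, 0.0, 20000)
```

A differentially private variant would additionally perturb or bound the information the acceptance test reveals about individual data points; the details are the subject of the first thesis and are not reproduced here.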
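The HMC algorithm outlined in the second abstract can likewise be sketched in one dimension: resample a momentum, simulate Hamilton's equations with a leapfrog integrator, and apply a Metropolis correction for the discretisation error. The step size, trajectory length, and standard-normal target below are illustrative assumptions, and the differential geometric machinery the thesis develops is elided.

```python
import math
import random

def hmc_sample(U, grad_U, x0, n_samples, eps=0.1, n_leapfrog=20):
    """Hamiltonian Monte Carlo with a leapfrog integrator (1-D sketch)."""
    x = x0
    samples = []
    for _ in range(n_samples):
        p = random.gauss(0.0, 1.0)  # resample momentum from N(0, 1)
        x_new, p_new = x, p
        # Leapfrog integration: half momentum step, alternating full steps,
        # final half momentum step.
        p_new -= 0.5 * eps * grad_U(x_new)
        for _ in range(n_leapfrog - 1):
            x_new += eps * p_new
            p_new -= eps * grad_U(x_new)
        x_new += eps * p_new
        p_new -= 0.5 * eps * grad_U(x_new)
        # Metropolis correction: accept based on the change in the
        # Hamiltonian H(x, p) = U(x) + p^2 / 2.
        h_old = U(x) + 0.5 * p * p
        h_new = U(x_new) + 0.5 * p_new * p_new
        if math.log(random.random()) < h_old - h_new:
            x = x_new
        samples.append(x)
    return samples

# Illustrative target: standard normal, so U(x) = x^2 / 2 and U'(x) = x.
random.seed(0)
samples = hmc_sample(lambda x: 0.5 * x * x, lambda x: x, 0.0, 2000)
```

The gradient of the potential is what lets HMC propose distant states with high acceptance probability, which is the practical payoff of the Hamiltonian construction the thesis builds geometrically.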