Talk Abstract: Quadrature is the problem of approximating intractable integrals, a common task in many machine learning and STEM applications. Bayesian quadrature is a model-based method that utilizes convenient properties of Gaussian processes to make probabilistic estimates. Furthermore, Bayesian quadrature can leverage prior knowledge or domain expertise for increased accuracy and efficiency. The content and style of this presentation will mirror a lecture delivered in CSE 515T: Bayesian Methods in Machine Learning, an upper-level graduate course in machine learning. This talk will broadly cover Gaussian processes, Bayesian quadrature, and active learning, and will hopefully convince you to consider Bayesian quadrature the next time you run into an intractable integral.
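As a taste of the idea, here is a minimal sketch of Bayesian quadrature (not code from the talk): a Gaussian process prior with a squared-exponential kernel is placed on the integrand, and the posterior mean of the integral against a standard normal measure reduces to a weighted sum of function values, z @ K⁻¹ y, where the kernel-integral vector z has a closed form. The lengthscale, jitter, and node placement below are illustrative choices, not prescriptions.

```python
import numpy as np

def bayesian_quadrature(f, nodes, ell=0.8, jitter=1e-8):
    """Estimate the integral of f(x) against N(x; 0, 1) via Bayesian quadrature.

    A GP prior with squared-exponential kernel k(x, x') = exp(-(x - x')^2 / (2 ell^2))
    is placed on f; the posterior mean of the integral is z @ K^{-1} y, where
    z_i = integral of k(x, x_i) N(x; 0, 1) dx, which is available in closed form.
    """
    x = np.asarray(nodes, dtype=float)
    y = f(x)
    # Gram matrix of the kernel at the observation nodes (with jitter for stability)
    K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell**2)
    K += jitter * np.eye(len(x))
    # Closed-form kernel mean embedding against the standard normal measure
    z = ell / np.sqrt(ell**2 + 1) * np.exp(-0.5 * x**2 / (ell**2 + 1))
    return z @ np.linalg.solve(K, y)

# Example: the integral of x^2 against N(0, 1) is exactly 1 (the variance)
nodes = np.linspace(-4, 4, 25)
estimate = bayesian_quadrature(lambda x: x**2, nodes)
```

With only 25 evaluations, the estimate lands very close to the true value of 1; the same posterior also yields a variance quantifying the estimate's uncertainty, which is what active-learning node selection exploits.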
Location: Sennott Square room 5317, 10:00-11:00 a.m. on Tuesday, March 15th, 2022.
Biosketch: Henry Chai is a postdoctoral teaching fellow at Carnegie Mellon University, where he teaches an assortment of courses titled "Introduction to Machine Learning" at various levels. He received his Ph.D. from Washington University in St. Louis, where he was advised by Dr. Roman Garnett. His research interests lie at the intersection of Bayesian machine learning, probabilistic numerics, and active learning; they can be succinctly summarized by the following question: “how can we efficiently and accurately reason about inherently intractable quantities?” Henry has taught numerous introductory and advanced machine learning courses and has been awarded the CSE departmental teaching award as well as a teaching citation from the Center for Teaching and Learning, both from his time teaching at Washington University in St. Louis.
View this event on the School of Computing and Information events calendar.