Description
ABSTRACT
We introduce stochastic variational inference for Gaussian process models. This enables the application of Gaussian process (GP) models to data sets containing millions of data points. We show how GPs can be variationally
decomposed to depend on a set of globally relevant inducing variables which factorize the model in the necessary manner to perform variational inference. Our approach is readily extended to models with non-Gaussian likelihoods and latent variable models based around Gaussian processes. We demonstrate the approach on a simple toy
problem and two real-world data sets.
INTRODUCTION
Gaussian processes [GPs, Rasmussen and Williams 2006] are perhaps the dominant approach for inference on functions. They underpin a range of algorithms for regression, classification and unsupervised learning. Unfortunately, when applying a Gaussian process to a data set of size n, exact inference has complexity O(n^3) with storage demands of O(n^2). This hinders the application of these models in many domains: in particular, large spatiotemporal data sets, video, large social network data (e.g. from Facebook), population-scale medical data sets, and models that correlate across multiple outputs or tasks (for these models complexity is O(n^3 p^3) and storage is O(n^2 p^2), where p is the number of outputs or tasks). Collectively we can think of these applications as belonging to the domain of 'big data'.
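To make the cubic cost concrete, here is a minimal sketch of exact GP regression in NumPy. The RBF kernel, its hyperparameters, and the toy data are illustrative assumptions, not the paper's setup; the point is that the n x n kernel matrix must be stored (O(n^2) memory) and factorised (O(n^3) time), which is exactly what the paper's inducing-variable decomposition avoids.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) kernel matrix between two sets of inputs."""
    sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2.0 * X1 @ X2.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def gp_posterior_mean(X, y, X_star, noise=0.1):
    """Exact GP regression posterior mean at test inputs X_star.

    Builds and Cholesky-factorises the n x n kernel matrix:
    O(n^2) storage and O(n^3) time -- the bottleneck for big data.
    """
    n = X.shape[0]
    K = rbf_kernel(X, X) + noise * np.eye(n)   # n x n matrix: O(n^2) memory
    L = np.linalg.cholesky(K)                  # O(n^3) factorisation
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return rbf_kernel(X_star, X) @ alpha

# Tiny demo: recover a noisy sine from 50 observations.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
mu = gp_posterior_mean(X, y, X)
```

Doubling n here roughly octuples the Cholesky cost, which is why exact inference stalls long before the millions of points that the stochastic variational approach targets.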
Year : 2013
By : James Hensman , Nicolo Fusi , Neil D. Lawrence
File information: English / 9 pages / Size: 1.3 MB
Download: click