
Asymptotic Behavior of Bayesian Nonparametric Procedures

Xing, Yang (2009) In Acta Universitatis Agriculturae Sueciae
Abstract
Asymptotics plays a crucial role in statistics. The theory of asymptotic consistency of Bayesian nonparametric procedures has been developed by many authors, including Schwartz (1965), Barron, Schervish and Wasserman (1999), Ghosal, Ghosh and Ramamoorthi (1999), Ghosal, Ghosh and van der Vaart (2000), Shen and Wasserman (2001), Walker and Hjort (2001), Walker (2004), Ghosal and van der Vaart (2007) and Walker, Lijoi and Prunster (2007). This theory rests mainly on the existence of uniformly exponentially consistent tests, the computation of a metric entropy, and a measure of prior concentration around the true value of the parameter. However, both the test condition and the metric entropy condition depend on the model but not on the prior distribution. Because a posterior distribution depends on the complexity of the model only through its prior distribution, it is natural to explore conditions that incorporate prior distributions.

In this thesis we introduce the Hausdorff $\alpha$-entropy and an integration condition, both of which incorporate prior distributions and are weaker than the metric entropy condition and the test condition, respectively. Furthermore, we provide an improved method for measuring prior concentration. By means of these new quantities, we derive several types of general posterior consistency theorems and general posterior convergence rate theorems for i.i.d. and non-i.i.d. models, which improve a number of currently known theorems and their applications. We also study rate adaptation for density estimation within the Bayesian framework and show, in particular, that the Bayesian procedure with hierarchical prior distributions for log spline densities and a finite number of models achieves the optimal minimax rate when the true density is H\"older-continuous. This result disproves a conjecture of Ghosal, Lember and van der Vaart (2003). Finally, we give a new necessary and sufficient condition for Bayesian exponential consistency for prior distributions with the Kullback-Leibler support property.
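For reference, the Kullback-Leibler support property mentioned in the abstract is the standard condition going back to Schwartz (1965); a sketch of its usual statement follows. The notation ($\Pi$, $\mathcal{F}$, $f_0$, $\mu$) is ours for illustration and need not match the thesis.

```latex
% A prior \Pi on a family of densities \mathcal{F} (with respect to a
% dominating measure \mu) has the Kullback-Leibler support property at
% the true density f_0 if every KL neighborhood of f_0 receives
% positive prior mass:
\[
  \Pi\bigl\{\, f \in \mathcal{F} : K(f_0, f) < \varepsilon \,\bigr\} > 0
  \quad \text{for every } \varepsilon > 0,
  \qquad \text{where } K(f_0, f) = \int f_0 \log\frac{f_0}{f}\, d\mu .
\]
```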
author
opponent
  • Professor Hjort, Nils Lid, Department of Mathematics, Blindern NO-0316, Oslo
publishing date
type
Thesis
publication status
published
subject
keywords
rate of convergence, adaptation, density function, Hausdorff entropy, Hellinger metric, Kullback-Leibler divergence, log spline density, Markov chain, posterior distribution, consistency, nonparametrics, sieve
in
Acta Universitatis Agriculturae Sueciae
defense location
Björken, SLU, Umeå
defense date
2009-05-28 13:00
ISSN
1652-6880
language
English
LU publication?
no
id
18e77b36-1f0c-407e-bbc3-d0767b5b495d (old id 1465103)
date added to LUP
2009-09-02 10:03:17
date last changed
2016-09-19 08:45:01
@phdthesis{18e77b36-1f0c-407e-bbc3-d0767b5b495d,
  abstract     = {Asymptotics plays a crucial role in statistics. The theory of asymptotic consistency of Bayesian nonparametric procedures has been developed by many authors, including Schwartz (1965), Barron, Schervish and Wasserman (1999), Ghosal, Ghosh and Ramamoorthi (1999), Ghosal, Ghosh and van der Vaart (2000), Shen and Wasserman (2001), Walker and Hjort (2001), Walker (2004), Ghosal and van der Vaart (2007) and Walker, Lijoi and Prunster (2007). This theory rests mainly on the existence of uniformly exponentially consistent tests, the computation of a metric entropy, and a measure of prior concentration around the true value of the parameter. However, both the test condition and the metric entropy condition depend on the model but not on the prior distribution. Because a posterior distribution depends on the complexity of the model only through its prior distribution, it is natural to explore conditions that incorporate prior distributions. In this thesis we introduce the Hausdorff $\alpha$-entropy and an integration condition, both of which incorporate prior distributions and are weaker than the metric entropy condition and the test condition, respectively. Furthermore, we provide an improved method for measuring prior concentration. By means of these new quantities, we derive several types of general posterior consistency theorems and general posterior convergence rate theorems for i.i.d. and non-i.i.d. models, which improve a number of currently known theorems and their applications. We also study rate adaptation for density estimation within the Bayesian framework and show, in particular, that the Bayesian procedure with hierarchical prior distributions for log spline densities and a finite number of models achieves the optimal minimax rate when the true density is H\"older-continuous. This result disproves a conjecture of Ghosal, Lember and van der Vaart (2003). Finally, we give a new necessary and sufficient condition for Bayesian exponential consistency for prior distributions with the Kullback-Leibler support property.},
  author       = {Xing, Yang},
  issn         = {1652-6880},
  keyword      = {rate of convergence,adaptation,density function,Hausdorff entropy,Hellinger metric,Kullback-Leibler divergence,log spline density,Markov chain,posterior distribution,consistency,nonparametrics,sieve},
  language     = {eng},
  series       = {Acta Universitatis Agriculturae Sueciae},
  title        = {Asymptotic Behavior of Bayesian Nonparametric Procedures},
  year         = {2009},
}