Date of Award:

5-2000

Document Type:

Dissertation

Degree Name:

Doctor of Philosophy (PhD)

Department:

Mathematics and Statistics

Committee Chair(s):

Kevin Hestir

Committee:

Kevin Hestir

Abstract

In this dissertation, new theoretical results are obtained for bounding convergence and mean-square error in conditional coding. Further, new statistical methods for the practical application of conditional coding are developed.

Criteria for uniform convergence are examined first. Conditional coding Markov chains are aperiodic, π-irreducible, and Harris recurrent. By applying the general theory of uniform ergodicity for Markov chains on a general state space, one can conclude that conditional coding Markov chains are uniformly ergodic and, further, that theoretical convergence rates based on Doeblin's condition can be found.
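
The convergence bound behind Doeblin's condition can be made concrete with a short calculation. The Python sketch below is not taken from the dissertation; the function names and the constants epsilon and m are illustrative assumptions. It turns a minorization constant into the standard total-variation bound (1 - epsilon)^floor(n/m).

    import math

    def doeblin_tv_bound(n, epsilon, m=1):
        """Crude uniform-ergodicity bound: if P^m(x, .) >= epsilon * nu(.)
        for every state x (Doeblin's condition), then the total-variation
        distance to stationarity after n steps is at most
        (1 - epsilon)**floor(n / m)."""
        return (1.0 - epsilon) ** math.floor(n / m)

    def steps_for_accuracy(tol, epsilon, m=1):
        """Smallest n for which the bound above is <= tol."""
        return m * math.ceil(math.log(tol) / math.log(1.0 - epsilon))

    # Illustrative numbers: minorization constant 0.05 achieved in m = 10 steps.
    print(doeblin_tv_bound(200, 0.05, m=10))     # bound after 200 iterations
    print(steps_for_accuracy(1e-3, 0.05, m=10))  # iterations to reach 0.001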

Conditional coding Markov chains can also be viewed as having a finite state space. This allows the use of techniques for bounding the second-largest eigenvalue, which in turn yield bounds on the convergence rate and on the mean-square error of sample averages. The results are applied in two examples, showing that these bounds are useful in practice.
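
As a hedged illustration of this eigenvalue approach (a generic sketch, not the dissertation's computation; the toy matrix P and the function f are invented for the example), the snippet below computes the second-largest eigenvalue modulus of a small reversible chain and plugs it into standard bounds on total-variation distance and on the mean-square error of sample averages.

    import numpy as np

    def second_eigenvalue_modulus(P):
        """Second-largest eigenvalue modulus of transition matrix P."""
        mags = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
        return mags[1]

    def stationary_distribution(P):
        """Left eigenvector of P for eigenvalue 1, normalized to sum to 1."""
        vals, vecs = np.linalg.eig(P.T)
        pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
        return pi / pi.sum()

    # Toy symmetric (hence reversible) 3-state chain, for illustration only.
    P = np.array([[0.6, 0.3, 0.1],
                  [0.3, 0.4, 0.3],
                  [0.1, 0.3, 0.6]])
    lam = second_eigenvalue_modulus(P)
    pi = stationary_distribution(P)

    # Geometric convergence bound for a reversible chain started at state x:
    # ||P^n(x, .) - pi||_TV <= 0.5 * sqrt((1 - pi[x]) / pi[x]) * lam**n
    x, n = 0, 50
    tv_bound = 0.5 * np.sqrt((1 - pi[x]) / pi[x]) * lam ** n

    # Rough mean-square-error bound for the average of f over N samples:
    # MSE <~ (1 + lam) / (1 - lam) * Var_pi(f) / N
    f = np.array([1.0, 2.0, 3.0])
    var_f = np.sum(pi * (f - np.sum(pi * f)) ** 2)
    N = 10_000
    mse_bound = (1 + lam) / (1 - lam) * var_f / N
    print(tv_bound, mse_bound)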

Next, some algorithms for perfect sampling in conditional coding are studied. An application of exact sampling to the independence sampler is shown to be equivalent to standard rejection sampling. In the case of single-site updating, traditional perfect sampling is not directly applicable when the state space has large cardinality and is not stochastically ordered, so a new procedure is developed that gives perfect samples at a predetermined confidence level.
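
To make the stated equivalence concrete, here is a minimal rejection sampler for a discrete target. It is a generic sketch rather than code from the dissertation, and the target, proposal, and envelope constant M are hypothetical.

    import random

    def rejection_sample(p, q_sample, q_pmf, M):
        """Draw one exact sample from target pmf p by rejection:
        propose x ~ q and accept with probability p(x) / (M * q(x)),
        where M bounds the ratio p/q. Exact sampling applied to the
        independence sampler reduces to this scheme."""
        while True:
            x = q_sample()
            if random.random() < p(x) / (M * q_pmf(x)):
                return x

    # Toy example: target on {0, 1, 2}, uniform proposal.
    target = {0: 0.5, 1: 0.3, 2: 0.2}
    proposal = {0: 1 / 3, 1: 1 / 3, 2: 1 / 3}
    M = max(target[x] / proposal[x] for x in target)  # = 1.5

    draw = rejection_sample(lambda x: target[x],
                            lambda: random.choice(list(proposal)),
                            lambda x: proposal[x],
                            M)
    print(draw)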

In the last chapter, procedures and possibilities for applying conditional coding to mixture models are explored. Conditional coding can be used for the analysis of a finite mixture model, and the methodology is general and easy to use.
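
As a rough illustration of MCMC for a finite mixture (a generic data-augmentation Gibbs sketch; the dissertation's conditional coding procedure may differ, and every name, prior choice, and parameter here is assumed for the example), the following Python code alternates between sampling latent component labels and component means for a two-component Gaussian mixture with known variance.

    import numpy as np

    def mixture_gibbs(y, n_iter=2000, sigma=1.0, rng=None):
        """Minimal Gibbs sketch for a two-component Gaussian mixture with
        known common variance sigma**2: alternately sample the latent
        component labels z and the component means mu (flat priors on mu,
        fixed equal weights). An illustrative stand-in, not the
        dissertation's algorithm."""
        rng = rng or np.random.default_rng(0)
        mu = np.array([y.min(), y.max()])  # crude initialization
        for _ in range(n_iter):
            # Sample labels given means: p(z = k | y) is proportional
            # to N(y | mu_k, sigma^2).
            logp = -0.5 * ((y[:, None] - mu[None, :]) / sigma) ** 2
            p1 = 1.0 / (1.0 + np.exp(logp[:, 0] - logp[:, 1]))
            z = (rng.random(len(y)) < p1).astype(int)
            # Sample means given labels (conjugate normal update, flat prior).
            for k in (0, 1):
                yk = y[z == k]
                if len(yk):
                    mu[k] = rng.normal(yk.mean(), sigma / np.sqrt(len(yk)))
        return mu, z

    # Synthetic data with means near -2 and 3, for illustration only.
    y = np.concatenate([np.random.default_rng(1).normal(-2, 1, 100),
                        np.random.default_rng(2).normal(3, 1, 100)])
    print(mixture_gibbs(y)[0])  # one posterior draw of the two means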

