Bayesian k-nearest neighbours


Our paper with Lionel Cucala, Jean-Michel Marin and Mike Titterington on A Bayesian Reassessment of Nearest Neighbor Classification has now appeared in JASA. Recall that the standard k-nearest-neighbour (knn) procedure is a deterministic supervised-classification method that assigns each point the label held by the majority of its k nearest neighbours. In this paper, we propose a reassessment of the knn method as a statistical technique derived from a proper probabilistic model; in particular, we differ from the assessment found in Holmes and Adams (2002, 2003), where the underlying probabilistic model is not completely coherent, its conditionals being incompatible with a single joint distribution. In addition, we evaluate computational tools for Bayesian inference on the parameters of the corresponding model, highlighting the difficulties raised by both pseudo-likelihood and path-sampling approximations of the intractable normalizing constant, and demonstrating the limitations of the pseudo-likelihood approximation in this setup.
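To make the contrast concrete, here is a minimal sketch (in Python, not the code from the paper) comparing the deterministic majority-rule knn classifier with the pseudo-likelihood of a Boltzmann/Potts-type model on the labels: each full conditional p(y_i | y_-i, β, k) is a tractable softmax, while the joint normalizing constant Z(β, k) is not available in closed form. The exact model form and the helper names (knn_indices, majority_vote, log_pseudo_likelihood) are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def knn_indices(X, k):
    """Indices of the k nearest neighbours of each point (excluding itself)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return np.argsort(d, axis=1)[:, :k]

def majority_vote(y, neigh):
    """Standard deterministic knn: label each point by the majority of its neighbours."""
    preds = []
    for nb in neigh:
        vals, counts = np.unique(y[nb], return_counts=True)
        preds.append(vals[np.argmax(counts)])
    return np.array(preds)

def log_pseudo_likelihood(y, neigh, beta, n_classes):
    """Sum of log full conditionals log p(y_i | y_-i, beta, k).

    Each conditional is proportional to exp(beta * #{neighbours sharing the
    candidate label} / k), hence a tractable softmax, whereas the joint model
    involves an intractable normalizing constant Z(beta, k)."""
    k = neigh.shape[1]
    logpl = 0.0
    for i, nb in enumerate(neigh):
        counts = np.array([(y[nb] == c).sum() for c in range(n_classes)])
        logits = beta * counts / k
        logpl += logits[y[i]] - np.log(np.exp(logits).sum())
    return logpl

# Toy illustration on two well-separated Gaussian clusters.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2)) + np.repeat([[0.0, 0.0], [2.5, 2.5]], 30, axis=0)
y = np.repeat([0, 1], 30)
neigh = knn_indices(X, k=5)
print("majority-vote accuracy:", (majority_vote(y, neigh) == y).mean())
print("log pseudo-likelihood at beta=1:", log_pseudo_likelihood(y, neigh, 1.0, 2))
```

In such a model, the pseudo-likelihood above can stand in for the true likelihood when estimating β, which is precisely the approximation whose limitations the paper examines.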

One Response to “Bayesian k-nearest neighbours”

  1. […] especially since the paper by Neal Friel and Tony Pettitt builds upon our JASA k-nearest neighbour paper. […]
