Bayesian k-nearest neighbours


Our paper with Lionel Cucala, Jean-Michel Marin and Mike Titterington on A Bayesian Reassessment of Nearest Neighbor Classification has now appeared in JASA. Recall that the standard k-nearest-neighbour (knn) procedure is a deterministic method used in supervised classification, where a point is classified by a majority rule over its neighbours. In this paper, we propose a reassessment of the knn method as a statistical technique derived from a proper probabilistic model; in particular, we differ from the assessment found in Holmes and Adams (2002, 2003), where the underlying probabilistic model is not completely coherent in terms of conditionals versus joint. In addition, we evaluate computational tools for Bayesian inference on the parameters of the corresponding model, highlighting the difficulties inherent in both pseudo-likelihood and path sampling approximations of an intractable normalizing constant and demonstrating the limitations of the pseudo-likelihood approximation in this setup.
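For readers unfamiliar with the baseline, the deterministic majority rule mentioned above can be sketched in a few lines of Python. This is only the standard knn classifier, not the probabilistic model developed in the paper, and the function and variable names are illustrative:

```python
from collections import Counter
from math import dist


def knn_predict(query, points, labels, k=3):
    """Classify `query` by a majority vote among its k nearest training points.

    A minimal sketch of the standard deterministic knn rule; `points` are
    coordinate tuples and `labels` the corresponding class labels.
    """
    # Rank training points by Euclidean distance to the query.
    ranked = sorted(zip(points, labels), key=lambda pl: dist(query, pl[0]))
    # Majority vote among the k nearest labels.
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]
```

For instance, with training points `[(0, 0), (0, 1), (5, 5), (5, 6)]` labelled `['a', 'a', 'b', 'b']` and `k=3`, a query at `(0, 0.5)` is classified as `'a'`, since two of its three nearest neighbours carry that label.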

One Response to “Bayesian k-nearest neighbours”

  1. […] especially since the paper by Neal Friel and Tony Pettitt builds upon our JASA k-nearest neighbour paper. […]
