Asymptotic Analysis of the Nearest Neighbor Decision Rule
Department of Electrical Engineering, Stanford University, 1966 - Statistical decision - 65 pages
The nearest-neighbor decision rule (NN rule) assigns to an unclassified sample the classification of the nearest of n previously classified samples. A large-sample analysis shows, under very weak regularity conditions, that the risk incurred by this nonparametric rule is less than twice the Bayes risk. A variety of standard decision problems are treated, and in some cases the bounds given on the NN risk are the best possible. The natural extension of the NN rule to a rule that considers several nearby neighbors and takes a majority vote among them is also treated. Some implementation problems arising in connection with the NN rule are considered: in particular, a method of optical computation is suggested to carry out the necessary calculations, and a performance-feedback technique is proposed to determine sample-size requirements.
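The rule the abstract describes can be sketched in a few lines. The following is a minimal illustration, not the report's implementation; the function name, the toy data, and the choice of Euclidean distance are assumptions made for the example. Setting k=1 gives the NN rule; larger odd k gives the voting extension mentioned above.

```python
from collections import Counter
import math

def knn_classify(samples, labels, x, k=1):
    """Classify x by majority vote among its k nearest stored samples
    (Euclidean distance). k=1 is the nearest-neighbor (NN) rule."""
    by_dist = sorted(range(len(samples)),
                     key=lambda i: math.dist(samples[i], x))
    votes = Counter(labels[i] for i in by_dist[:k])
    return votes.most_common(1)[0][0]

# Two-class toy problem: class 0 clustered near the origin, class 1 near (5, 5).
samples = [(0.0, 0.2), (0.5, -0.1), (0.1, 0.4),
           (5.0, 5.1), (4.8, 5.3), (5.2, 4.9)]
labels = [0, 0, 0, 1, 1, 1]

print(knn_classify(samples, labels, (0.3, 0.1), k=1))  # 1-NN rule -> 0
print(knn_classify(samples, labels, (4.9, 5.0), k=3))  # 3-NN vote -> 1
```

The stored samples play the role of the n previously classified samples; the asymptotic bound in the abstract concerns the risk of this procedure as n grows.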