2016 conference paper

Non-parametric bounds on the nearest neighbor classification accuracy based on the Henze-Penrose metric

2016 IEEE International Conference on Image Processing (ICIP), 1364–1368.

By: S. Ghanem, E. Skau, H. Krim, H. Clouse & W. Sakla

Co-author countries: United States of America 🇺🇸
Source: NC State University Libraries
Added: August 6, 2018

Analysis procedures for high-dimensional data are generally computationally costly, which accounts for the strong research interest in this area. Entropy-based divergence measures have proven effective in many areas of computer vision and pattern recognition. However, the complexity of their implementation may be prohibitive in resource-limited applications, as they require estimates of probability densities that are very difficult to compute directly for high-dimensional data. In this paper, we investigate the use of a non-parametric, distribution-free metric, known as the Henze-Penrose test statistic, to estimate the divergence between different classes of vehicles. To this end, we apply common feature extraction techniques to further characterize the distributional separation relative to the original data. Moreover, we employ the Henze-Penrose metric to obtain bounds on the Nearest Neighbor (NN) classification accuracy. Simulation results demonstrate the effectiveness and reliability of this metric in estimating inter-class separability. In addition, the proposed bounds are exploited to select the smallest number of features that retains sufficient discriminative information.
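
A minimal sketch may make the minimum-spanning-tree construction behind the Henze-Penrose statistic concrete. The function below pools two samples, builds a Euclidean MST over the pooled set, and counts the Friedman-Rafsky statistic (the number of MST edges joining points from different samples); the closing normalization follows the commonly used MST-based estimator from the divergence literature, not necessarily the exact form used in this paper, and the function name and toy data are illustrative assumptions.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.sparse.csgraph import minimum_spanning_tree

def henze_penrose_divergence(X, Y):
    """Sketch of an MST-based Henze-Penrose divergence estimate.

    Pools the two samples, builds a Euclidean minimum spanning tree,
    and counts the Friedman-Rafsky statistic R: the number of MST
    edges that join a point of X to a point of Y.
    """
    m, n = len(X), len(Y)
    pooled = np.vstack([X, Y])
    labels = np.concatenate([np.zeros(m), np.ones(n)])

    # Dense pairwise-distance graph over the pooled sample. Note that
    # scipy treats zero entries as missing edges, so exact duplicate
    # points would need a small jitter; continuous data is fine as-is.
    dists = cdist(pooled, pooled)
    mst = minimum_spanning_tree(dists).tocoo()

    # Friedman-Rafsky statistic: cross-sample edges in the MST.
    R = np.sum(labels[mst.row] != labels[mst.col])

    # Commonly used normalization (an assumption here): near 0 when
    # the samples overlap heavily, near 1 when they are well separated.
    return 1.0 - R * (m + n) / (2.0 * m * n)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A1 = rng.normal(0.0, 1.0, size=(200, 5))  # class 1, draw 1
    A2 = rng.normal(0.0, 1.0, size=(200, 5))  # class 1, draw 2
    B = rng.normal(3.0, 1.0, size=(200, 5))   # well-separated class 2
    print(henze_penrose_divergence(A1, A2))   # near 0: same distribution
    print(henze_penrose_divergence(A1, B))    # near 1: separated classes
```

On the toy data, the estimate comes out near 0 for two independent draws from the same Gaussian and near 1 for well-separated classes, which matches the abstract's reading of the metric as an inter-class separability score.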