
2019

Cost-Constrained Classifier Cascade using Sparse Probit Models


Abstract

Feature selection is a common problem in pattern recognition. Though often motivated by the curse of dimensionality, feature selection also has the added benefit of reducing the cost of extracting features from test data. In this work, sparse probit models are modified to incorporate feature costs. A single-classifier approach, Cost-Constrained Feature Optimization (CCFO), is compared to a new ensemble method referred to as the Cost-Constrained Classifier Cascade (C4). The C4 method utilizes a boosting framework that accommodates per-sample feature selection. Experimental results compare C4, CCFO, and baseline sparse kernel classification on two data sets with asymmetric feature costs, illustrating that C4 can yield similar or better accuracy and more economical use of expensive features.
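The cascade idea in the abstract — extract cheap features for every sample and pay for expensive features only when a cheap-feature classifier is uncertain — can be sketched as below. This is an illustrative toy, not the paper's C4 algorithm: the synthetic data, the confidence margin, and the gradient-ascent probit fit are all assumptions, and the actual method uses a boosting framework with cost-penalized sparse priors.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Synthetic two-class data: columns 0-1 play the role of "cheap" features,
# columns 2-3 the role of "expensive" but more discriminative features.
n = 400
y = rng.integers(0, 2, n)
X = rng.normal(size=(n, 4)) + y[:, None] * np.array([0.8, 0.8, 2.0, 2.0])

def fit_probit(X, y, iters=200, lr=0.1):
    """Crude probit regression via gradient ascent (illustrative only)."""
    Xb = np.hstack([np.ones((len(X), 1)), X])  # prepend bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        z = Xb @ w
        p = np.clip(norm.cdf(z), 1e-6, 1 - 1e-6)
        # Gradient of the probit log-likelihood.
        grad = Xb.T @ ((y - p) * norm.pdf(z) / (p * (1 - p))) / len(y)
        w += lr * grad
    return w

def probit_prob(w, X):
    Xb = np.hstack([np.ones((len(X), 1)), X])
    return norm.cdf(Xb @ w)

cheap = X[:, :2]
w1 = fit_probit(cheap, y)  # stage 1: cheap features only
w2 = fit_probit(X, y)      # stage 2: all features

# Cascade: decide on cheap features when confident, otherwise pay the
# extraction cost for the expensive features and use the stage-2 model.
p1 = probit_prob(w1, cheap)
confident = np.abs(p1 - 0.5) > 0.35  # hypothetical confidence margin
p2 = probit_prob(w2, X)
pred = np.where(confident, p1 > 0.5, p2 > 0.5).astype(int)

accuracy = (pred == y).mean()
expensive_rate = 1 - confident.mean()  # fraction of samples paying the cost
print(f"accuracy={accuracy:.2f}, expensive feature rate={expensive_rate:.2f}")
```

The per-sample behavior is the point: easy samples exit at stage 1 without the expensive features ever being extracted, which is how a cascade can keep accuracy while spending less on feature acquisition.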

Citation

@INPROCEEDINGS{8693031,
  author={C. Ratto and W. Malik and I. Kabir and R. Newsome},
  booktitle={2019 53rd Annual Conference on Information Sciences and Systems (CISS)},
  title={Cost-Constrained Classifier Cascade using Sparse Probit Models: Invited Presentation},
  year={2019},
  pages={1-6},
  doi={10.1109/CISS.2019.8693031}}
