Interval Pattern Concept Lattice as a Classifier Ensemble
Decision tree learning is one of the most popular classification techniques. However, by its nature it is a greedy approach to finding a classification hypothesis that optimizes some information-based criterion. It is very fast but may yield suboptimal classification hypotheses. Moreover, although individual decision trees are easily interpretable, ensembles of trees (random forests and gradient-boosted trees) are not, which is crucial in some domains, such as medical diagnostics or bank credit scoring. For such “small but important data” problems one is not obliged to perform a greedy search for classification hypotheses, and alternatives to decision tree learning techniques may therefore be considered. In this paper, we propose an FCA-based classification technique in which each test instance is classified by a set of the best (in terms of some information-based criterion) classification rules. In a series of benchmarking experiments, the proposed strategy is compared with decision tree and nearest-neighbor learning.
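The core idea of the abstract, classifying a test instance with the set of best-scoring rules that cover it, can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the rule representation, the scores (standing in for an information-based criterion), and the majority-vote aggregation are all assumptions made for the example.

```python
from collections import Counter

def classify(instance, rules, k=3):
    """Classify `instance` by majority vote among the k best
    (highest-scoring) rules whose premise covers it.

    `rules` is a list of (predicate, label, score) triples, where the
    score plays the role of an information-based quality criterion.
    """
    matching = [(score, label) for predicate, label, score in rules
                if predicate(instance)]
    if not matching:
        return None  # no covering rule; a fallback (e.g. majority class) would go here
    best = sorted(matching, reverse=True)[:k]
    votes = Counter(label for _, label in best)
    return votes.most_common(1)[0][0]

# Toy interval-pattern rules over a single numeric feature x
# (each premise is an interval constraint, as in interval patterns):
rules = [
    (lambda inst: 0.0 <= inst["x"] <= 5.0, "neg", 0.4),
    (lambda inst: 3.0 <= inst["x"] <= 9.0, "pos", 0.7),
    (lambda inst: 4.0 <= inst["x"] <= 6.0, "pos", 0.9),
]

print(classify({"x": 4.5}, rules, k=2))  # -> pos
```

Here the instance x = 4.5 is covered by all three rules; the two best (scores 0.9 and 0.7) both predict "pos", so the vote returns "pos". In contrast to a greedy decision tree, which commits to one split at a time, such a rule ensemble can keep several competing, individually interpretable hypotheses.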