Abstract
Decision trees are a popular technique in statistical data classification. They recursively
partition the feature space into disjoint sub-regions until each sub-region becomes
homogeneous with respect to a particular class. The basic Classification and Regression
Tree (CART) algorithm partitions the feature space using axis-parallel splits. When the true
decision boundaries are not aligned with the feature axes, this approach can produce a
complicated boundary structure. Oblique decision trees use oblique decision boundaries
to potentially simplify the boundary structure. The major limitation of this approach is
that the tree induction algorithm is computationally expensive. Hence, as an alternative,
a new decision tree algorithm called HHCART is presented. The method uses a series of
Householder matrices to reflect the training data at each non-terminal node during tree
construction. Each reflection is based on the directions of the eigenvectors from each class's
covariance matrix. Considering axis-parallel splits in the reflected training data provides
an efficient way of finding oblique splits in the unreflected training data. Experimental
results show that the accuracy and size of HHCART trees are comparable with some
benchmark methods. The appealing feature of HHCART is that it can handle both qualitative
and quantitative features in the same oblique split.
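The core reflection step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a Householder matrix built from the dominant eigenvector of one class's covariance matrix, so that an axis-parallel split in the reflected data corresponds to an oblique split in the original feature space. The function name and toy data are hypothetical.

```python
import numpy as np

def householder_reflect(X, d):
    """Build a Householder matrix H that maps the unit direction d onto the
    first coordinate axis, and return the reflected data X @ H together
    with H. (Hypothetical helper for illustration.)"""
    e1 = np.zeros_like(d)
    e1[0] = 1.0
    w = e1 - d
    norm = np.linalg.norm(w)
    if norm < 1e-12:                 # d already aligned with e1: H = I
        return X.copy(), np.eye(len(d))
    w = w / norm
    H = np.eye(len(d)) - 2.0 * np.outer(w, w)   # Householder matrix
    return X @ H, H

# Toy data: two classes separated by an oblique boundary x0 = x1.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] > X[:, 1]).astype(int)

# Dominant eigenvector of one class's covariance matrix.
cov = np.cov(X[y == 0].T)
eigvals, eigvecs = np.linalg.eigh(cov)
d = eigvecs[:, np.argmax(eigvals)]

Xr, H = householder_reflect(X, d)
# An axis-parallel split on Xr[:, 0] at some threshold t is equivalent to
# the oblique split (X @ H[:, 0]) <= t in the unreflected space.
assert np.allclose(Xr[:, 0], X @ H[:, 0])
```

Because a Householder matrix is symmetric and orthogonal, the split coefficients in the original space are simply the corresponding column of H, which is what makes the search over axis-parallel splits in the reflected space efficient.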