Alternatives for select-feature
- Random selection: guaranteed to give a decision tree that is
consistent with the training set, but with no guarantee of optimality:
the resulting tree may be far larger than necessary.
- Information theoretic selection: select the feature that maximally
partitions the set of instances, i.e. the one with the highest
information gain. A heuristic for finding the decision tree with
minimum expected classification time.
- Minimal cost selection: allow for the fact that some features are
costly to evaluate. For example, body temperature is easier to determine
than lung capacity. Put the least costly features high in the tree.
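
The selection strategies above can be sketched in a few lines. The following is an illustrative sketch, not a prescribed implementation: `entropy`, `information_gain`, and `select_feature` are hypothetical names, the dict-based instance representation is an assumption, and cost-sensitive selection is modeled by the simple heuristic of dividing gain by feature cost.

```python
from collections import Counter
import math

def entropy(labels):
    # Shannon entropy (in bits) of a list of class labels.
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(instances, labels, feature):
    # Entropy reduction from partitioning on `feature`.
    # `instances` is a list of dicts mapping feature name -> value (an assumption).
    n = len(labels)
    partitions = {}
    for inst, label in zip(instances, labels):
        partitions.setdefault(inst[feature], []).append(label)
    remainder = sum(len(p) / n * entropy(p) for p in partitions.values())
    return entropy(labels) - remainder

def select_feature(instances, labels, features, costs=None):
    # Pick the highest-gain feature; when costs are given, divide gain by
    # cost so cheap features tend to land high in the tree (one simple
    # way to realize minimal cost selection).
    costs = costs or {f: 1.0 for f in features}
    return max(features,
               key=lambda f: information_gain(instances, labels, f) / costs[f])
```

For instance, if `"temp"` perfectly separates the labels while `"noise"` does not, `select_feature` returns `"temp"`; random selection would just be `random.choice(features)` instead.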