Alternatives for select-feature

  1. Random selection: any choice is guaranteed to give a decision tree that is consistent with the training set, but there is no guarantee of optimality.
  2. Information-theoretic selection: select the feature with maximal information gain, i.e., the one that best partitions the set of instances into homogeneous subsets. This is a heuristic for finding a decision tree with minimum expected classification time (fewest expected tests per instance).
  3. Minimal-cost selection: allows for the fact that some features are costly to evaluate. For example, body temperature is easier to determine than lung capacity. Put the least costly features high in the tree (all three strategies are sketched below).

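The following minimal Python sketch illustrates one way the three alternatives could be implemented, assuming examples are (feature-values, label) pairs. The function names, the gain-per-cost ratio used for minimal-cost selection, and the example features are illustrative assumptions, not part of the notes.

    import math
    import random
    from collections import Counter

    def entropy(examples):
        """Shannon entropy (in bits) of the class labels."""
        counts = Counter(label for _, label in examples)
        total = len(examples)
        return -sum((c / total) * math.log2(c / total)
                    for c in counts.values())

    def info_gain(examples, feature):
        """Expected reduction in entropy from splitting on `feature`."""
        total = len(examples)
        by_value = {}
        for values, label in examples:
            by_value.setdefault(values[feature], []).append((values, label))
        remainder = sum((len(subset) / total) * entropy(subset)
                        for subset in by_value.values())
        return entropy(examples) - remainder

    def select_random(examples, features):
        """1. Random selection: any feature still yields a consistent tree."""
        return random.choice(features)

    def select_info_gain(examples, features):
        """2. Information-theoretic selection: maximal information gain."""
        return max(features, key=lambda f: info_gain(examples, f))

    def select_min_cost(examples, features, cost):
        """3. Minimal-cost selection: here, gain per unit evaluation cost
        (one simple heuristic), so cheap, informative features land
        high in the tree."""
        return max(features, key=lambda f: info_gain(examples, f) / cost[f])

For instance, with `cost = {"temperature": 1, "lung_capacity": 10}`, `select_min_cost` would prefer splitting on temperature unless lung capacity is dramatically more informative.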