Applications of Depth Minimization of Decision Trees Containing Hypotheses for Multiple-Value Decision Tables

Entropy (Basel). 2023 Mar 23;25(4):547. doi: 10.3390/e25040547.

Abstract

In this research, we consider decision trees that incorporate standard queries, which test the value of a single feature, as well as hypotheses, each of which proposes values for all features at once. These decision trees are used to represent knowledge and are comparable to those investigated in exact learning, where membership queries and equivalence queries are used. As an application, we study the construction of decision trees for two cases: the sorting of a sequence that may contain equal elements, and multiple-value decision tables derived from datasets in the UCI Machine Learning Repository. We compare the depth of several types of decision trees with hypotheses that are optimal with respect to this parameter for the aforementioned applications. We also compare decision trees built by dynamic programming algorithms with those built by an entropy-based greedy method. We found that the greedy algorithm produces results very close to those of the dynamic programming algorithms. Therefore, since the dynamic programming algorithms are time-consuming, the greedy algorithms can readily be applied instead.
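The entropy-based greedy method mentioned above can be illustrated with a minimal sketch: at each node, choose the feature whose split minimizes the weighted Shannon entropy of the resulting sub-tables. This is only an illustration of the standard greedy impurity criterion under assumed names (`entropy`, `best_feature`); the paper's actual algorithm also considers hypothesis queries, which are not modeled here.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a list of decision labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_feature(rows, labels):
    """Greedy step: return the index of the feature whose split
    minimizes the weighted entropy of the resulting sub-tables."""
    n = len(labels)
    best_i, best_h = None, float("inf")
    for i in range(len(rows[0])):
        # group the decision labels by the value of feature i
        groups = {}
        for row, lab in zip(rows, labels):
            groups.setdefault(row[i], []).append(lab)
        # weighted entropy of the sub-tables induced by feature i
        h = sum(len(g) / n * entropy(g) for g in groups.values())
        if h < best_h:
            best_i, best_h = i, h
    return best_i

# Toy decision table: the decision equals feature 0, so feature 0
# gives a pure (zero-entropy) split and is selected.
rows = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 0, 1, 1]
print(best_feature(rows, labels))  # → 0
```

Applying this selection recursively to each sub-table yields a decision tree; the dynamic programming algorithms instead explore all subtables to guarantee minimum depth, which explains their higher running time.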

Keywords: decision tree; depth; hypothesis; multiple-value decision table.

Grants and funding

The APC was funded by King Abdullah University of Science & Technology.