A Fast Way to Produce Near-Optimal Fixed-Depth Decision Trees   
Abstract: Decision trees play an essential role in many classification tasks. In some circumstances, we only want to consider fixed-depth trees. Unfortunately, finding the optimal depth-d decision tree can require time exponential in d. This paper presents a fast way to produce a fixed-depth decision tree that is optimal under the Naïve Bayes (NB) assumption. Here, we prove that the optimal depth-d feature essentially depends only on the posterior probability of the class label given the tests previously performed, but not directly on either the identity or the outcomes of these tests. We can therefore precompute, in a fast pre-processing step, which features to use at the final layer. This results in a speedup of O(n/log n), where n is the number of features. We apply this technique to learning fixed-depth decision trees from standard UCI repository datasets, and find this model improves the computational cost significantly. Surprisingly, this approach still yields relatively high classification accuracy, despite the NB assumption.
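The abstract's key claim can be sketched in code. The following is a hypothetical illustration (not the paper's implementation): for a binary class and binary features, under the NB assumption the expected accuracy of ending the tree with a test on feature f depends only on the current class posterior p and on f's class-conditional probabilities, so a table mapping posterior buckets to the best final-layer feature can be precomputed once. The feature names, the posterior grid, and the helper functions below are all assumptions made for illustration.

```python
# Hypothetical sketch of the precomputation idea: under the Naive Bayes
# assumption, the best final-layer test depends only on the class posterior
# p = P(Y=1 | tests so far), so we tabulate, over a grid of posteriors,
# which feature maximizes expected accuracy.

def expected_accuracy(p, cond):
    """Expected accuracy of predicting the class after one binary test.

    p:    posterior P(Y=1) before the test
    cond: (P(f=1 | Y=0), P(f=1 | Y=1))
    P(correct) = sum over outcomes v of max_y P(Y=y) * P(f=v | Y=y),
    since after seeing v we predict the class with the larger joint weight.
    """
    q0, q1 = cond
    acc = 0.0
    for v in (0, 1):
        l0 = (1 - p) * (q0 if v else 1 - q0)  # joint weight of Y=0, f=v
        l1 = p * (q1 if v else 1 - q1)        # joint weight of Y=1, f=v
        acc += max(l0, l1)
    return acc

def precompute_last_layer(features, grid=101):
    """Map each posterior bucket to the feature with highest expected accuracy."""
    table = []
    for i in range(grid):
        p = i / (grid - 1)
        best = max(features, key=lambda f: expected_accuracy(p, features[f]))
        table.append(best)
    return table

# Illustrative (made-up) feature statistics:
features = {
    "informative": (0.1, 0.9),   # strongly class-dependent
    "noisy":       (0.5, 0.5),   # independent of the class
}
table = precompute_last_layer(features)
print(table[50])  # at p = 0.5, the informative feature should win
```

At prediction time, a path through the tree only needs its running posterior to look up the precomputed final-layer feature, which is what makes the pre-processing step pay off.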
