ASYMPTOTICS AND INTERPRETABILITY OF DECISION TREES AND DECISION TREE ENSEMBLES
dc.contributor.author | Zhou, Yichen | |
dc.date.accessioned | 2019-10-15T15:30:49Z | |
dc.date.available | 2019-10-15T15:30:49Z | |
dc.date.issued | 2019-05-30 | |
dc.description.abstract | Decision trees and decision tree ensembles are widely used nonparametric statistical models. A decision tree is a binary tree that recursively segments the covariate space along the coordinate directions, creating hyperrectangles as basic prediction units within which constant values are fitted. A decision tree ensemble combines multiple decision trees, either in parallel or in sequence, to increase model flexibility and accuracy and to reduce prediction variance. Although tree models are used extensively in practice, results on their asymptotic behavior are scarce. In this thesis we present analyses of tree asymptotics from the perspectives of tree terminal nodes, tree ensembles, and models incorporating tree ensembles. Our study introduces several new tree-related learning frameworks that come with provable statistical guarantees and interpretations. Our study of the Gini index used in the greedy tree building algorithm reveals its limiting distribution, leading to a test of better splitting that quantifies the uncertainty in the optimality of a decision tree split. We combine this test with decision tree distillation, which fits a decision tree to mimic the behavior of a black-box model, to generate stable interpretations by guaranteeing a unique distillation tree structure given sufficiently many random sample points. We also apply mild modifications and regularization to standard tree boosting to create a new boosting framework named Boulevard. Boulevard differs from the original framework by integrating two new mechanisms: honest trees, which isolate the tree terminal values from the tree structure, and adaptive shrinkage, which scales the boosting history to create an equally weighted ensemble. With carefully chosen rates, we establish consistency and asymptotic normality for Boulevard predictions. This theoretical development provides the prerequisites for statistical inference with boosted trees. Lastly, we investigate the feasibility of combining existing semiparametric models with tree boosting. We study the varying coefficient modeling framework with boosted trees as its nonparametric effect modifiers, as it generalizes several popular learning models including partially linear regression and functional trees. We demonstrate that the new framework is not only theoretically sound, achieving consistency, but also empirically intelligible, producing comprehensible model structures and intuitive visualizations. Illustrative sketches of these mechanisms follow the record below. | |
dc.identifier.doi | https://doi.org/10.7298/8eat-hb86 | |
dc.identifier.other | Zhou_cornellgrad_0058F_11397 | |
dc.identifier.other | http://dissertations.umi.com/cornellgrad:11397 | |
dc.identifier.other | bibid: 11050370 | |
dc.identifier.uri | https://hdl.handle.net/1813/67388 | |
dc.language.iso | en_US | |
dc.rights | Attribution-ShareAlike 4.0 International | |
dc.rights.uri | https://creativecommons.org/licenses/by-sa/4.0/ | |
dc.subject | Statistics | |
dc.subject | Boosting | |
dc.subject | Asymptotics | |
dc.subject | Decision Tree | |
dc.subject | Interpretability | |
dc.subject | Model Distillation | |
dc.title | ASYMPTOTICS AND INTERPRETABILITY OF DECISION TREES AND DECISION TREE ENSEMBLES | |
dc.type | dissertation or thesis | |
dcterms.license | https://hdl.handle.net/1813/59810 | |
thesis.degree.discipline | Statistics | |
thesis.degree.grantor | Cornell University | |
thesis.degree.level | Doctor of Philosophy | |
thesis.degree.name | Ph.D., Statistics |
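The abstract references several concrete mechanisms; the sketches below illustrate them in Python. First, the Gini index and the impurity decrease it induces, which is the quantity greedily maximized when growing a CART-style classification tree. This is the standard criterion whose limiting distribution the thesis analyzes; the test of better splitting itself is developed in the thesis and not reproduced here.

import numpy as np

def gini(y):
    # Gini impurity of a label vector: 1 - sum of squared class proportions.
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def gini_gain(y, x, threshold):
    # Impurity decrease of the axis-aligned split x <= threshold; greedy
    # tree building picks the (coordinate, threshold) pair maximizing this.
    left, right = y[x <= threshold], y[x > threshold]
    if len(left) == 0 or len(right) == 0:
        return 0.0
    n = len(y)
    return gini(y) - (len(left) / n) * gini(left) - (len(right) / n) * gini(right)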
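Second, decision tree distillation as described in the abstract: query a black-box teacher on freshly drawn covariates and fit a student tree to its predictions. The teacher model, sampling distribution, sample sizes, and depth below are placeholder choices, not those of the thesis; the thesis's result is that with sufficiently many pseudo-sample points the student's structure becomes unique and therefore stable.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Hypothetical black-box teacher trained on some dataset (X, y).
X = rng.uniform(-1, 1, size=(2000, 5))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.1, size=2000)
teacher = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Distillation: sample fresh covariate points, label them with the teacher's
# predictions, and fit a shallow, interpretable student tree to the labels.
X_pseudo = rng.uniform(-1, 1, size=(20000, 5))
student = DecisionTreeRegressor(max_depth=3, random_state=0)
student.fit(X_pseudo, teacher.predict(X_pseudo))

Refitting with a different random draw of X_pseudo and comparing the resulting split structures gives a rough empirical check of the stability the abstract refers to.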
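Third, a rough sketch of the two Boulevard mechanisms named in the abstract, honest trees and adaptive shrinkage, using scikit-learn trees. This is an illustration under our own simplifying assumptions (the half-half data split, lam, depth, and update rule are our choices), not the thesis's exact algorithm or rate conditions.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def boulevard_sketch(X, y, n_trees=100, lam=0.8, max_depth=3, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)
    f = np.zeros(n)                      # ensemble prediction on training set
    trees, leaf_values = [], []
    for b in range(1, n_trees + 1):
        resid = y - f
        # Honest trees: learn the split structure on one random half...
        idx = rng.permutation(n)
        struct, est = idx[: n // 2], idx[n // 2 :]
        tree = DecisionTreeRegressor(max_depth=max_depth, random_state=seed + b)
        tree.fit(X[struct], resid[struct])
        # ...then recompute the terminal values from the held-out half,
        # isolating leaf values from the structure search.
        leaves = tree.apply(X)
        vals = {}
        for leaf in np.unique(leaves):
            mask = (leaves[est] == leaf)
            vals[leaf] = resid[est][mask].mean() if mask.any() else 0.0
        contrib = np.array([vals[l] for l in leaves])
        # Adaptive shrinkage: rescale the boosting history so every tree
        # carries equal weight lam / b in the final ensemble.
        f = ((b - 1) / b) * f + (lam / b) * contrib
        trees.append(tree)
        leaf_values.append(vals)          # honest values used at predict time
    return trees, leaf_values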
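Finally, a hypothetical sketch of the varying coefficient framework: the response is modeled as y ≈ sum_j beta_j(z) * x_j, with each coefficient surface beta_j built up by boosting small trees on the effect modifiers z. The function names, learning rate nu, and gradient update below are our own illustration, not the thesis's estimator.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def vcm_boost_sketch(X, Z, y, n_rounds=200, nu=0.1, max_depth=2):
    n, p = X.shape
    beta = np.zeros((n, p))              # beta_j(z_i) evaluated at training z's
    ensembles = [[] for _ in range(p)]
    for _ in range(n_rounds):
        for j in range(p):
            resid = y - np.sum(beta * X, axis=1)
            # Negative gradient of squared loss w.r.t. beta_j(z_i) is
            # resid_i * x_ij, so fit a small tree to that target over Z.
            tree = DecisionTreeRegressor(max_depth=max_depth)
            tree.fit(Z, resid * X[:, j])
            beta[:, j] += nu * tree.predict(Z)
            ensembles[j].append(tree)
    return ensembles, beta

Including a constant column in X makes beta_0(z) play the role of the nonparametric term in partially linear regression, which is one sense in which the abstract calls this framework a generalization of that model.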
Files
Original bundle
- Name: Zhou_cornellgrad_0058F_11397.pdf
- Size: 1.87 MB
- Format: Adobe Portable Document Format