Recursive binary partitioning is a popular tool for regression analysis. Two fundamental problems of the exhaustive search procedures usually applied to fit such models have been known for a long time: overfitting and a selection bias towards covariates with many possible splits or missing values. While pruning procedures are able to solve the overfitting problem, the variable selection bias still seriously affects the interpretability of tree-structured regression models. For some special cases, unbiased procedures have been suggested, but they lack a common theoretical foundation. We propose a unified framework for recursive partitioning which embeds tree-structured regression models into a well-defined theory of conditional inference procedures. Stopping criteria based on multiple test procedures are implemented, and it is shown that the predictive performance of the resulting trees is as good as that of established exhaustive search procedures. It turns out that the partitions, and therefore the models, induced by the two approaches are structurally different, indicating the need for unbiased variable selection. The methodology presented here is applicable to all kinds of regression problems, including nominal, ordinal, numeric, censored, and multivariate response variables, as well as arbitrary measurement scales of the covariates. Data from studies on animal abundance, glaucoma classification, node-positive breast cancer, and mammography experience are re-analyzed.
Series | Research Report Series, Department of Statistics and Mathematics |
Identifier | oai:union.ndltd.org:VIENNA/oai:epub.wu-wien.ac.at:epub-wu-01_756 |
Date | January 2004 |
Creators | Hothorn, Torsten, Hornik, Kurt, Zeileis, Achim |
Publisher | Institut fĂĽr Statistik und Mathematik, WU Vienna University of Economics and Business |
Source Sets | Wirtschaftsuniversität Wien |
Language | English |
Detected Language | English |
Type | Paper, NonPeerReviewed |
Format | application/pdf |
Relation | http://epub.wu.ac.at/676/ |
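The stopping rule described in the abstract (test the association between the response and each covariate, adjust for multiple testing, and stop splitting when the adjusted p-value exceeds a pre-specified level) can be illustrated with a short sketch. The code below is a minimal, illustrative Python version assuming a numeric response and numeric covariates, a permutation test based on absolute correlation, a Bonferroni adjustment, and a median cutpoint; these are simplifications chosen for brevity, not the authors' actual procedure, which is available in the R package party (function ctree).

```python
# Minimal sketch of conditional-inference-style recursive partitioning.
# NOT the authors' implementation: the association test, multiplicity
# adjustment, and cutpoint choice below are simplified placeholders.

import numpy as np

def perm_pvalue(x, y, n_perm=999, rng=None):
    """Permutation p-value for independence of x and y (absolute correlation)."""
    rng = rng or np.random.default_rng(0)
    obs = abs(np.corrcoef(x, y)[0, 1])
    perm = np.array([abs(np.corrcoef(x, rng.permutation(y))[0, 1])
                     for _ in range(n_perm)])
    return (1 + np.sum(perm >= obs)) / (n_perm + 1)

def grow(X, y, alpha=0.05, min_node=20, rng=None):
    """Recursively partition rows of X; stop when no covariate is
    significantly associated with y after Bonferroni adjustment."""
    n, p = X.shape
    if n < min_node:
        return {"prediction": float(np.mean(y))}
    pvals = np.array([perm_pvalue(X[:, j], y, rng=rng) for j in range(p)])
    pvals_adj = np.minimum(pvals * p, 1.0)        # Bonferroni adjustment
    j = int(np.argmin(pvals_adj))
    if pvals_adj[j] > alpha:                      # global null not rejected: stop
        return {"prediction": float(np.mean(y))}
    cut = float(np.median(X[:, j]))               # simplistic cutpoint choice
    left = X[:, j] <= cut
    if left.all() or (~left).all():
        return {"prediction": float(np.mean(y))}
    return {"var": j, "cut": cut, "p": float(pvals_adj[j]),
            "left": grow(X[left], y[left], alpha, min_node, rng=rng),
            "right": grow(X[~left], y[~left], alpha, min_node, rng=rng)}

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 3))
    y = 2.0 * (X[:, 0] > 0) + rng.normal(scale=0.5, size=200)
    print(grow(X, y, rng=rng))
```

In the framework the paper proposes, the association tests are permutation tests of linear statistics conditional on the observed data, and the cutpoint within the selected covariate is itself chosen by maximizing a two-sample statistic; the median split above is used only to keep the example short.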