
Leave-Group-Out Cross-Validation for Latent Gaussian Models

Cross-validation is a widely used technique in statistics and machine learning
for predictive performance assessment and model selection. It involves dividing
the available data into multiple sets, training the model on some of the data and
testing it on the rest, and repeating this process multiple times. The goal of
cross-validation is to assess the model’s predictive performance on unseen data.
Two standard methods for cross-validation are leave-one-out cross-validation
and K-fold cross-validation. However, these methods may not be suitable for
structured models with many potential prediction tasks, as they do not take into
account the structure of the data. As a solution, leave-group-out cross-validation
extends cross-validation by letting the left-out groups, and hence the training sets
and testing points, adapt to different prediction tasks. In this dissertation,
we propose an automatic group construction procedure for leave-group-out
cross-validation to estimate the predictive performance of the model when the
prediction task is not specified. We also propose an efficient approximation of
leave-group-out cross-validation for latent Gaussian models. Both of these procedures
are implemented in the R-INLA software.
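
In standard notation (a sketch, not quoted from the dissertation; the grouping sets
$I_i$ and the predictive density $\pi$ are labels introduced here), the leave-group-out
criterion being estimated can be written as
$$
\mathrm{LGOCV} = \frac{1}{n} \sum_{i=1}^{n} \log \pi\!\left(y_i \mid \mathbf{y}_{-I_i}\right),
$$
where $I_i$ is the automatically constructed group of observations left out when
predicting $y_i$ and $\mathbf{y}_{-I_i}$ is the remaining training data; leave-one-out
cross-validation is recovered when $I_i = \{i\}$.
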
We demonstrate the usefulness of our proposed leave-group-out cross-validation
method through an application to the joint modeling of survival and longitudinal
data. This example illustrates the method's effectiveness in real-world scenarios.
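
As a minimal sketch of the R-INLA workflow (assuming the inla.group.cv() interface;
the simulated data, the num.level.sets value, and the cv field read at the end are
illustrative assumptions, not code taken from the dissertation):

library(INLA)

## Simulated data with a latent AR(1) structure, standing in for a structured
## latent Gaussian model (illustrative only).
set.seed(1)
n   <- 200
idx <- 1:n
x   <- as.numeric(arima.sim(n = n, model = list(ar = 0.9)))
dat <- data.frame(y = 1 + x + rnorm(n, sd = 0.3), idx = idx)

## Fit a latent Gaussian model with an intercept and an AR(1) random effect.
fit <- inla(y ~ 1 + f(idx, model = "ar1"),
            family = "gaussian", data = dat)

## Leave-group-out cross-validation with automatically constructed groups;
## num.level.sets (assumed argument) controls how the left-out group around
## each testing point is built.
lgocv <- inla.group.cv(result = fit, num.level.sets = 3)

## Mean log predictive density over the testing points (assuming the cv field
## holds the cross-validated predictive densities).
mean(log(lgocv$cv))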

Identifier: oai:union.ndltd.org:kaust.edu.sa/oai:repository.kaust.edu.sa:10754/692371
Date: 04 1900
Creators: Liu, Zhedong
Contributors: Rue, Haavard; Computer, Electrical and Mathematical Science and Engineering (CEMSE) Division; Al-Naffouri, Tareq Y.; Huser, Raphaël; Mira, Antonietta
Source Sets: King Abdullah University of Science and Technology
Language: English
Detected Language: English
Type: Dissertation
Relation: N/A
