  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
91

Sequential Change-point Detection in Linear Regression and Linear Quantile Regression Models Under High Dimensionality

Ratnasingam, Suthakaran 06 August 2020 (has links)
No description available.
92

Sparse Latent-Space Learning for High-Dimensional Data: Extensions and Applications

White, Alexander James 05 1900 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / The successful treatment, and potential eradication, of many complex diseases such as cancer begins with elucidating the convoluted mapping from molecular profiles to phenotypic manifestation. Observed molecular profiles (e.g., genomics, transcriptomics, epigenomics) are high-dimensional and are collected from patient samples falling into heterogeneous disease subtypes. Interpretable learning from such data calls for sparsity-driven models. This dissertation addresses the high-dimensionality, sparsity, and heterogeneity issues that arise in analyzing multi-omics data; each method is implemented in a companion R package. First, we examine challenges in submatrix identification, which aims to find subgroups of samples that behave similarly across a subset of features. We resolve issues such as two-way sparsity, non-orthogonality, and parameter tuning with an adaptive thresholding procedure applied to the singular vectors computed via orthogonal iteration. We validate the method in simulation and apply it to an Alzheimer’s disease dataset. The second project models relationships between large, matched datasets: exploring regression structure between such datasets can reveal, for example, the effect of long-range epigenetic influences on gene expression. We present a high-dimensional version of mixture multivariate regression to detect patient clusters, each with a different correlation structure between the matched omics datasets. Results are validated via simulation and applied to matched multi-omics datasets. In the third project, we introduce a novel approach to modeling spatial transcriptomics (ST) data with a spatially penalized multinomial model of the expression counts. The method captures the low-rank structure of zero-inflated ST data under spatial smoothness constraints. We validate the model against manual cell-structure annotations of human brain samples and then apply it to additional ST datasets. / 2025-05-22
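The submatrix-identification idea in the abstract above — extracting sparse leading singular vectors by interleaving orthogonal (subspace) iteration with thresholding — can be sketched as follows. This is a minimal toy version: the function name, the fixed `threshold` cutoff, and all parameter choices are illustrative assumptions, and the dissertation's *adaptive* thresholding rule is not reproduced here.

```python
import numpy as np

def sparse_leading_vectors(X, k=2, threshold=0.1, n_iter=100, seed=0):
    """Toy sketch: top-k right singular vectors of X via orthogonal
    iteration, with hard thresholding each pass to encourage sparsity.
    A fixed cutoff stands in for the adaptive rule described in the text."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    V = rng.standard_normal((p, k))
    for _ in range(n_iter):
        # One step of subspace iteration on X'X.
        W = X.T @ (X @ V)
        # Re-orthonormalize the basis via QR.
        V, _ = np.linalg.qr(W)
        # Hard-threshold small loadings; zeroed rows mark excluded features.
        V[np.abs(V) < threshold] = 0.0
        # Renormalize columns (leave all-zero columns alone).
        norms = np.linalg.norm(V, axis=0)
        V = V / np.where(norms > 0, norms, 1.0)
    return V

# Planted-block example: samples 0-19 share elevated signal on features 0-9.
rng = np.random.default_rng(1)
X = 0.1 * rng.standard_normal((50, 40))
X[:20, :10] += 2.0
V = sparse_leading_vectors(X, k=1, threshold=0.05)
```

Features with nonzero loadings in `V` then define the column set of a candidate submatrix; a matching left-vector pass would select the rows.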
93

Nonlocal Priors in Generalized Linear Models and Gaussian Graphical Models

Yang, Fang 23 August 2022 (has links)
No description available.
94

Feature Screening for High-Dimensional Variable Selection In Generalized Linear Models

Jiang, Jinzhu 02 September 2021 (has links)
No description available.
95

Energy Distance Correlation with Extended Bayesian Information Criteria for feature selection in high dimensional models

Ocloo, Isaac Xoese 22 September 2021 (has links)
No description available.
96

Methodology for Estimation and Model Selection in High-Dimensional Regression with Endogeneity

Du, Fan 05 May 2023 (has links)
No description available.
97

High Dimensional Data Methods in Industrial Organization Type Discrete Choice Models

Lopez Gomez, Daniel Felipe 11 August 2022 (has links)
No description available.
98

Two Essays on High-Dimensional Inference and an Application to Distress Risk Prediction

Zhu, Xiaorui 22 August 2022 (has links)
No description available.
99

Human Decidual CD8+ T Cells have Phenotypic and Functional Heterogeneity

Alexander, Aria January 2021 (has links)
No description available.
100

Sparse Ridge Fusion For Linear Regression

Mahmood, Nozad 01 January 2013 (has links)
For linear regression, traditional techniques handle the case where the number of observations n exceeds the number of predictor variables p (n > p); when n < p, the classical least-squares method fails to estimate the coefficients. This thesis provides a solution for the case of correlated predictors: a new regularization and variable-selection method called Sparse Ridge Fusion (SRF). For highly correlated predictors, simulated examples and a real dataset show that the SRF consistently outperforms the lasso, the elastic net, and the S-Lasso, and that the SRF can select more predictor variables than the sample size n, whereas the lasso selects at most n variables.
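The n &lt; p breakdown described in this abstract can be illustrated with a short numpy sketch. This does not reproduce Sparse Ridge Fusion itself; it only shows, under assumed toy dimensions (n = 10, p = 25), why the normal equations of ordinary least squares are underdetermined and why a ridge-type penalty, one building block of SRF-style methods, restores a unique estimate.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 10, 25  # fewer observations than predictors
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]  # a sparse true coefficient vector
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# X'X is p-by-p but has rank at most n < p, so the OLS normal
# equations (X'X) beta = X'y have no unique solution.
XtX = X.T @ X
print(np.linalg.matrix_rank(XtX))  # prints 10, i.e. rank-deficient

# Adding a ridge penalty lam * I makes the system full rank and solvable.
lam = 1.0
beta_ridge = np.linalg.solve(XtX + lam * np.eye(p), X.T @ y)
print(beta_ridge.shape)  # (25,): a unique, finite estimate exists
```

Penalized methods differ in how many variables they can keep: the lasso retains at most n nonzero coefficients in this regime, while ridge-combined penalties can retain more, which is the behavior the abstract reports for the SRF.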
