A Unified Robust Minimax Framework for Regularized Learning Problems

Zhou, Hongbo 01 May 2014 (has links)
Regularization techniques have become a principled tool in model-based statistics and artificial intelligence research. In most situations, however, the regularization terms are not well interpreted, especially with regard to how they relate to the loss function and the data matrix of a given statistical model. In this work, we propose a robust minimax formulation that interprets the relationship between the data and the regularization terms for a large class of loss functions. We show that various regularization terms essentially correspond to different distortions of the original data matrix. This supplies a unified framework for understanding existing regularization terms, designing novel regularization terms based on perturbation-analysis techniques, and inspiring new generic algorithms. To show how minimax-related concepts apply to real-world learning tasks, we develop a new fault-tolerant classification framework that combats class noise in general multi-class classification problems; further, by studying the relationship between the class of majorizable functions and the minimax framework, we develop an accurate, efficient, and scalable algorithm for solving a large family of learning formulations. We also extend this work to several important matrix-decomposition-related learning tasks and validate it on real-world applications, including structure-from-motion (with missing data) and latent-structure dictionary learning. This work, comprising a unified formulation, a scalable algorithm, and promising applications to many real-world learning problems, contributes to the understanding of the robustness hidden in many learning models. As we show, many classical statistical machine learning models can be unified under this formulation, and accurate, efficient, and scalable algorithms become available from our research.
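The abstract's central claim, that regularization terms correspond to worst-case distortions of the data matrix, can be illustrated by a classical instance of this equivalence (due to El Ghaoui and Lebret): for any fixed candidate solution, the worst-case residual of least squares under a spectral-norm-bounded perturbation of the data matrix equals the unperturbed residual plus a norm penalty on the solution. The sketch below is a numerical illustration of that known special case under an assumed distortion budget `rho`, not a reproduction of the thesis's own framework.

```python
import numpy as np

# Classical robust least-squares identity:
#   max_{||D||_2 <= rho} ||(A + D) x - b||_2 = ||A x - b||_2 + rho * ||x||_2
# i.e. a norm-type regularizer corresponds to a worst-case
# spectral-norm distortion D of the data matrix A.

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
x = rng.standard_normal(5)   # any fixed candidate solution
rho = 0.3                    # distortion budget (assumed for illustration)

r = A @ x - b
inner_max = np.linalg.norm(r) + rho * np.linalg.norm(x)

# The worst case is attained by the rank-one distortion
# D* = rho * u v^T with u = r / ||r|| and v = x / ||x||.
u = r / np.linalg.norm(r)
v = x / np.linalg.norm(x)
D_star = rho * np.outer(u, v)

attained = np.linalg.norm((A + D_star) @ x - b)
assert np.isclose(attained, inner_max)

# No feasible distortion (spectral norm <= rho) can exceed the bound.
for _ in range(100):
    D = rng.standard_normal((20, 5))
    D *= rho / np.linalg.norm(D, 2)   # rescale to spectral norm rho
    assert np.linalg.norm((A + D) @ x - b) <= inner_max + 1e-9
```

Minimizing the right-hand side over x recovers a regularized learning problem, which is the sense in which the regularizer encodes robustness to data-matrix perturbations.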
