  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Pattern Recognition Applied to the Computer-aided Detection and Diagnosis of Breast Cancer from Dynamic Contrast-enhanced Magnetic Resonance Breast Images

Levman, Jacob 21 April 2010 (has links)
The goal of this research is to improve the breast cancer screening process based on magnetic resonance imaging (MRI). In a typical breast MRI examination, a radiologist visually examines the acquired MR images and identifies suspect tissues for biopsy. It is known that if multiple radiologists independently analyze the same examinations, and any lesion flagged as suspicious by any of them is biopsied, the overall screening process becomes more sensitive but less specific. Unfortunately, cost factors prohibit the use of multiple radiologists for the screening of every breast MR examination. It is thought that, instead of having a second expert human radiologist examine each set of images, this second reading of the examination can be performed by a computer-aided detection and diagnosis system. The research presented in this thesis focuses on the development of a computer-aided detection and diagnosis system for breast cancer screening from dynamic contrast-enhanced magnetic resonance imaging examinations. The thesis presents new computational techniques in supervised learning, unsupervised learning and classifier visualization. Applied to breast MR lesion data, these techniques outperform existing methods, yielding a computer-aided detection and diagnosis system with a sensitivity of 89% and a specificity of 70%.
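The reported operating point combines the two standard screening metrics. As a quick illustration of how they are computed (the confusion-matrix counts below are made up for the example, not data from the thesis):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity: fraction of true lesions that are flagged.
    Specificity: fraction of benign cases that are correctly cleared."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Illustrative counts only (not from the thesis):
sens, spec = sensitivity_specificity(tp=89, fn=11, tn=70, fp=30)
```

A second reader (human or machine) that flags additional lesions raises `tp` at the cost of `fp`, which is exactly the sensitivity/specificity trade-off the abstract describes.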
22

Training Recurrent Neural Networks

Sutskever, Ilya 13 August 2013 (has links)
Recurrent Neural Networks (RNNs) are powerful sequence models that were believed to be difficult to train, and as a result they were rarely used in machine learning applications. This thesis presents methods that overcome the difficulty of training RNNs, and applications of RNNs to challenging problems. We first describe a new probabilistic sequence model that combines Restricted Boltzmann Machines and RNNs. The new model is more powerful than similar models while being less difficult to train. Next, we present a new variant of the Hessian-free (HF) optimizer and show that it can train RNNs on tasks that exhibit extremely long-range temporal dependencies, which were previously considered impossibly hard. We then apply HF to character-level language modelling and obtain excellent results. We also apply HF to optimal control and obtain RNN control laws that can successfully operate under conditions of delayed feedback and unknown disturbances. Finally, we describe a random parameter initialization scheme that allows gradient descent with momentum to train RNNs on problems with long-term dependencies. This directly contradicts widespread beliefs about the inability of first-order methods to do so, and suggests that previous attempts at training RNNs failed partly due to flaws in the random initialization.
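As a rough illustration of the initialization idea (a generic scaled-Gaussian sketch, not necessarily the thesis's exact scheme), the recurrent weights can be drawn with a variance chosen so that their typical spectral radius sits near a target slightly above 1, which helps gradient signal survive over long time spans:

```python
import math
import random

def init_rnn_params(n_hidden, n_input, scale=1.1, seed=0):
    """Random initialization for a vanilla RNN.
    W_hh entries ~ N(0, (scale/sqrt(n_hidden))^2), so the recurrent matrix's
    typical spectral radius is near `scale`. The exact recipe and the
    value 1.1 are illustrative assumptions, not the thesis's prescription."""
    rng = random.Random(seed)
    std_hh = scale / math.sqrt(n_hidden)
    W_hh = [[rng.gauss(0.0, std_hh) for _ in range(n_hidden)]
            for _ in range(n_hidden)]
    W_xh = [[rng.gauss(0.0, 1.0 / math.sqrt(n_input)) for _ in range(n_input)]
            for _ in range(n_hidden)]
    b_h = [0.0] * n_hidden  # biases start at zero
    return W_hh, W_xh, b_h

W_hh, W_xh, b_h = init_rnn_params(n_hidden=4, n_input=3)
```

With such an initialization, plain gradient descent with momentum becomes a viable trainer for long-term-dependency tasks, which is the abstract's central claim about first-order methods.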
23

Monitoring the Generation and Execution of Optimal Plans

Fritz, Christian Wilhelm 24 September 2009 (has links)
In dynamic domains, the state of the world may change in unexpected ways during the generation or execution of plans. Regardless of the cause of such changes, they raise the question of whether they interfere with ongoing planning efforts. Unexpected changes during plan generation may invalidate the current planning effort, while discrepancies between the expected and actual state of the world during execution may render the executing plan invalid or sub-optimal with respect to previously identified planning objectives. In this thesis we develop a general monitoring technique that can be used during both plan generation and plan execution to determine the relevance of unexpected changes, and which supports recovery. This way, time-intensive replanning from scratch in the new and unexpected state can often be avoided. The technique can be applied to a variety of objectives, including monitoring the optimality of plans, rather than just their validity. Intuitively, the technique operates in two steps: during planning, the plan is annotated with additional information that is relevant to the achievement of the objective; then, when an unexpected change occurs, this information is used to determine the relevance of the discrepancy with respect to the objective. We substantiate the claim of broad applicability of this relevance-based technique by developing four concrete applications: generating optimal plans despite frequent, unexpected changes to the initial state of the world, monitoring plan optimality during execution, monitoring the execution of near-optimal policies in stochastic domains, and monitoring the generation and execution of plans with procedural hard constraints. In all cases, we use the formal notion of regression to identify what is relevant for achieving the objective.
We prove the soundness of these concrete approaches and present empirical results demonstrating that in some contexts orders-of-magnitude speed-ups can be gained by our technique compared to replanning from scratch.
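The two-step idea, annotate during planning and check relevance on discrepancy, can be sketched in miniature. Here each remaining plan step is tagged with the set of fluents its regressed condition mentions; the step names and fluents are invented for illustration:

```python
def plan_still_valid(annotations, changed_fluents):
    """Decide whether an unexpected change touches anything the remaining
    plan depends on. `annotations` maps each remaining step to the set of
    fluents its (regressed) condition mentions -- a toy stand-in for the
    thesis's regression-based annotations."""
    for step, relevant in annotations.items():
        if relevant & changed_fluents:
            return False, step   # change is relevant: repair/replan at this step
    return True, None            # change is irrelevant: keep executing

annotations = {
    "pickup(box)": {"at(robot,depot)", "clear(box)"},
    "load(truck)": {"holding(box)", "at(truck,depot)"},
}
ok, where = plan_still_valid(annotations, {"weather(rain)"})  # irrelevant change
```

An irrelevant change (here, the weather) is dismissed in one set intersection per step, which is where the speed-up over replanning from scratch comes from.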
26

Machine Learning Methods and Models for Ranking

Volkovs, Maksims 13 August 2013 (has links)
Ranking problems are ubiquitous and occur in a variety of domains that include social choice, information retrieval, computational biology and many others. Recent advancements in information technology have opened new data processing possibilities and significantly increased the complexity of computationally feasible methods. Through these advancements, ranking models are now beginning to be applied to many new and diverse problems. Across these problems, data, which ranges from gene expressions to images and web documents, has vastly different properties and is often not human generated. This makes it challenging to apply many of the existing models for ranking, which primarily originate in social choice and are typically designed for human-generated preference data. As the field continues to evolve, a new trend has recently emerged where machine learning methods are used to automatically learn the ranking models. While these methods typically lack the theoretical support of the social choice models, they often show excellent empirical performance and are able to handle large and diverse data, placing virtually no restrictions on the data type. These models have now been successfully applied to many diverse ranking problems including image retrieval, protein selection, machine translation and many others. Inspired by these promising results, the work presented in this thesis aims to advance machine learning methods for ranking and to develop new techniques that allow effective modeling of existing and future problems. The presented work concentrates on three different but related domains: information retrieval, preference aggregation and collaborative filtering. In each domain we develop new models together with learning and inference methods and empirically verify our models on real-life data.
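As one concrete example of a learned ranking objective (a generic RankNet-style pairwise loss, shown for illustration and not a model from the thesis), a scoring model can be trained so that items that should rank higher receive larger scores:

```python
import math

def pairwise_logistic_loss(score_i, score_j):
    """RankNet-style loss for a pair where item i should rank above item j:
    -log sigmoid(s_i - s_j). Minimizing it over all labeled pairs pushes
    the model toward score differences that reproduce the target order."""
    return math.log(1.0 + math.exp(-(score_i - score_j)))

# A correctly ordered pair incurs less loss than an inverted one:
good = pairwise_logistic_loss(2.0, 0.5)
bad = pairwise_logistic_loss(0.5, 2.0)
```

Because the loss depends only on score differences, the same objective applies unchanged whether the items are documents, proteins, or images, which mirrors the data-agnosticism the abstract attributes to learned ranking models.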
27

Knowledge Provenance: An Approach to Modeling and Maintaining The Evolution and Validity of Knowledge

Huang, Jingwei 28 July 2008 (has links)
The Web has become an open, decentralized global information and knowledge repository and a platform for distributed computing and global electronic markets, where people are confronted with information of unknown origin and need to interact with “strangers”. This makes trust and the validity of information in cyberspace crucial issues. This thesis proposes knowledge provenance (KP) as a formal approach to determining the origin and validity of information and knowledge on the Web, by means of modeling and maintaining information sources, information dependencies, and trust structures. We conceptualize and axiomatize a KP ontology comprising static KP and dynamic KP. The proposed KP ontology provides a formal representation linking trust in information creators to belief in the information they create, lays a foundation for further study of knowledge provenance, and provides logical systems for provenance reasoning by machines. The KP web ontology can be used to annotate web information, and the KP reasoner can be used as a tool to trace the origin and determine the validity of Web information. Since knowledge provenance rests on trust in information sources, this thesis also proposes a logical theory of trust in epistemic logic and the situation calculus. In particular, we formally define the semantics of trust and, from it, identify two types of trust: trust in belief and trust in performance. We reveal and prove that trust in belief is transitive while trust in performance is not, but that, through trust in belief, trust in performance can propagate in social networks. By formalizing trust in the situation calculus, the context of trust is represented formally by reified fluents. We also propose a distributed logical model for trust reasoning over social networks, in which each agent's private data about trust relationships can be protected. This study provides a formal theoretical analysis of the transitivity of trust, which supports trust propagation in social networks.
This study of trust supports not only knowledge provenance but also general trust modeling in cyberspace.
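The interplay of the two trust types can be illustrated with a toy propagation check over a social network: trust-in-belief edges are followed transitively, and a final trust-in-performance edge completes the chain. The agent names, edge sets, and graph encoding below are hypothetical; the thesis works in epistemic logic and the situation calculus, not graph search:

```python
from collections import deque

def trusts_performance(belief_edges, perf_edges, a, b):
    """Does agent a come to trust agent b's performance?
    Trust-in-belief chains are transitive; a single trust-in-performance
    edge at the end of such a chain lets performance trust propagate."""
    # All agents a reaches via trust-in-belief edges (including a itself).
    reachable, queue = {a}, deque([a])
    while queue:
        x = queue.popleft()
        for y in belief_edges.get(x, ()):
            if y not in reachable:
                reachable.add(y)
                queue.append(y)
    # a trusts b's performance if someone a trust-in-believes does.
    return any((x, b) in perf_edges for x in reachable)

belief = {"alice": ["bob"], "bob": ["carol"]}   # alice -> bob -> carol
perf = {("carol", "dave")}                      # carol trusts dave's performance
```

Note the asymmetry the thesis proves: the chain propagates only through the transitive trust-in-belief edges, never through trust-in-performance edges themselves.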
28

Astrometry.net: Automatic Recognition and Calibration of Astronomical Images

Lang, Dustin 03 March 2010 (has links)
We present Astrometry.net, a system for automatically recognizing and astrometrically calibrating astronomical images, using the information in the image pixels alone. The system is based on the geometric hashing approach in computer vision: we use the geometric relationships between low-level features (stars and galaxies), which are relatively indistinctive, to create geometric features that are distinctive enough that we can recognize images covering less than one-millionth of the area of the sky. The geometric features are used to rapidly generate hypotheses about the location---the pointing, scale, and rotation---of an image on the sky. Each hypothesis is then evaluated in a Bayesian decision theory framework in order to ensure that most correct hypotheses are accepted while false hypotheses are almost never accepted. The feature-matching process is accelerated by a new fast and space-efficient kd-tree implementation. The Astrometry.net system is available via a web interface, and the software is released under an open-source license. It is being used by hundreds of individual astronomers and several large-scale projects, so we have at least partially achieved our goal of helping ``to organize, annotate and make searchable all the world's astronomical information.''
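The core geometric-hashing step can be sketched as follows: the two most widely separated stars of a four-star quad define a local coordinate frame, and the frame coordinates of the remaining two stars form a code that is invariant to translation, rotation, and scale. This is a simplified illustration of the idea, not the production code:

```python
import itertools
import math

def quad_hash(stars):
    """Geometric-hash code for four stars (x, y tuples), in the spirit of
    Astrometry.net's quads: the most distant pair defines the frame, and
    the other two stars' frame coordinates form the code. Simplified:
    the real system also canonicalizes orientation and handedness."""
    a, b = max(itertools.combinations(stars, 2),
               key=lambda pair: math.dist(pair[0], pair[1]))
    rest = [s for s in stars if s is not a and s is not b]
    ax, ay = a
    dx, dy = b[0] - ax, b[1] - ay
    d2 = dx * dx + dy * dy
    def to_frame(p):
        # Coordinates of p in the frame where a -> (0, 0) and b -> (1, 0);
        # dot product gives the along-axis part, cross product the offset.
        px, py = p[0] - ax, p[1] - ay
        return ((px * dx + py * dy) / d2, (py * dx - px * dy) / d2)
    (cx, cy), (ex, ey) = sorted(to_frame(p) for p in rest)
    return (round(cx, 6), round(cy, 6), round(ex, 6), round(ey, 6))
```

Because the code depends only on relative geometry, the same four stars yield the same hash however the image is shifted, rotated, or zoomed, which is what makes indexed lookup against the whole sky possible.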
29

Machine Learning Approaches to Biological Sequence and Phenotype Data Analysis

Min, Renqiang 17 February 2011 (has links)
To understand biology at a system level, this thesis presents novel machine learning algorithms that reveal the underlying mechanisms of how genes and their products function at different biological levels. Specifically, at the sequence level, based on kernel Support Vector Machines (SVMs), I propose a learned random-walk kernel and a learned empirical-map kernel to identify protein remote homology solely from sequence data, and a discriminative motif discovery algorithm to identify sequence motifs that characterize protein sequences' remote homology membership. The proposed approaches significantly outperform previous methods, especially on some challenging protein families. At the expression and protein level, using hierarchical Bayesian graphical models, I develop the first high-throughput computational predictive model that filters sequence-based predictions of microRNA targets by incorporating the proteomic data of putative microRNA target genes, and I propose another probabilistic model that explores the underlying mechanisms of microRNA regulation by combining the expression profile data of messenger RNAs and microRNAs. At the cellular level, I further investigate how yeast genes manifest their functions in cell morphology by performing gene function prediction from the morphology data of yeast temperature-sensitive alleles. The developed prediction models enable biologists to choose interesting yeast essential genes and study their predicted novel functions.
30

Kernel-based Copula Processes

Ng, Eddie Kai Ho 22 February 2011 (has links)
The field of time-series analysis has made important contributions to a wide spectrum of applications such as tide-level studies in hydrology, natural-resource prospecting in geostatistics, speech recognition, weather forecasting, financial trading, and economic forecasting and analysis. Nevertheless, the analysis of the non-Gaussian and non-stationary features of time series remains challenging for current state-of-the-art models. This thesis proposes an innovative framework that leverages the theory of copulas, combined with a probabilistic framework from the machine learning community, to produce a versatile tool for multiple time-series analysis. I coin this new model Kernel-based Copula Processes (KCPs). Under the proposed framework, various idiosyncrasies can be modeled compactly via a kernel function for each individual time series, and long-range dependency can be captured by a copula function. The copula function separates the marginal behavior and serial dependency structures, thus allowing them to be modeled separately and with much greater flexibility. Moreover, the codependent structure of a large number of time series with potentially vastly different characteristics can be captured in a compact and elegant fashion through the notion of a binding copula. This feature allows a highly heterogeneous model to be built, breaking free from the homogeneous limitation of most conventional models. KCPs demonstrate superior predictive power when used to forecast a multitude of data sets from the meteorological and financial domains. Finally, the versatility of the KCP model is exemplified by its successful application, unaltered, to non-trivial classification problems.
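The copula idea, dependence modeled separately from the margins, can be illustrated with a plain Gaussian copula. This is a generic sketch of the principle; KCPs replace these fixed pieces with kernel-based, learned components:

```python
import math
import random
from statistics import NormalDist

def gaussian_copula_pair(rho, inv_cdf_x, inv_cdf_y, n, seed=0):
    """Sample n pairs whose dependence comes from a Gaussian copula with
    correlation rho, while each margin follows its own inverse CDF.
    The margins and rho below are illustrative choices."""
    rng = random.Random(seed)
    phi = NormalDist()
    out = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        u, v = phi.cdf(z1), phi.cdf(z2)   # correlated uniform margins
        out.append((inv_cdf_x(u), inv_cdf_y(v)))
    return out

# Exponential and uniform margins coupled with rho = 0.8:
pairs = gaussian_copula_pair(0.8, lambda u: -math.log(1.0 - u), lambda v: v, 200)
```

Swapping either inverse CDF changes a margin without touching the dependence structure, which is exactly the separation the abstract credits to the copula function.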
