281

Limiting behaviours in physics: From Duality to Super-resolution

Piche, Kevin January 2016 (has links)
In this thesis, we discuss several phenomena exhibiting 'limiting behaviour' in physics: the duality principle, delegated quantum computation, and super-resolution. The duality principle places a limit on the coexistence of wave and particle behaviours. We develop a framework that explains apparent violations of this principle while staying within the scope of quantum mechanics. In addition, we relate the duality principle to the sub-fidelity and weak values. We also show that the maximum recoverable coherence of a qubit undergoes a sharp transition from 0 to 1 when we gain access to half of the environment with which the qubit is correlated. Delegated quantum computation involves a computationally weak client who wishes to delegate a complex quantum computation to a powerful quantum server. We develop a new protocol for delegated quantum computation requiring less quantum power than its predecessor. Finally, we develop and test a new theory of eigenmode super-resolution.
282

Optimization Frameworks for Graph Clustering

Luke N Veldt (6636218) 15 May 2019 (has links)
In graph theory and network analysis, communities or clusters are sets of nodes in a graph that share many internal connections with each other but are only sparsely connected to nodes outside the set. Graph clustering, the computational task of detecting these communities, has been studied extensively due to its widespread applications and its theoretical richness as a mathematical problem. This thesis presents novel optimization tools for addressing two major challenges associated with graph clustering.

The first major challenge is that there already exists a plethora of algorithms and objective functions for graph clustering. The relationship between different methods is often unclear, and it can be difficult to determine in practice which approach is best for a specific application. To address this challenge, we introduce a generalized discrete optimization framework for graph clustering called LambdaCC, which relies on a single tunable parameter. The value of this parameter controls the balance between the internal density and external sparsity of the clusters formed by optimizing the underlying objective function. LambdaCC unifies the landscape of graph clustering techniques, as a large number of previously developed approaches can be recovered as special cases for a fixed value of the LambdaCC input parameter.

The second major challenge is the computational intractability of finding the best clustering of a graph with respect to a given NP-hard objective function. To address this intractability, we present new optimization tools and results that apply to LambdaCC as well as a broader class of graph clustering problems. We develop polynomial-time approximation algorithms for LambdaCC and other, more general clustering objectives; in particular, we show how to obtain a polynomial-time 2-approximation for cluster deletion, which improves upon the previous best approximation factor of 3. We also present a new optimization framework for solving convex relaxations of NP-hard graph clustering problems, which are frequently used in the design of approximation algorithms. Finally, we develop a new framework for efficiently setting tunable parameters of graph clustering objective functions, so that practitioners can work with graph clustering techniques that are especially well suited to their application.
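The role of the tunable parameter can be illustrated with a small sketch. The formulation below, where a clustering pays 1 − λ for every edge cut between clusters and λ for every missing edge kept inside a cluster, is one common reading of a λ-parameterized correlation-clustering objective, offered as an illustration of the density/sparsity trade-off rather than the thesis's exact definition.

```python
from itertools import combinations

def lambda_cc_cost(nodes, edges, clustering, lam):
    """Toy LambdaCC-style objective: pay (1 - lam) for each edge cut
    between clusters, and lam for each non-edge kept inside a cluster.
    A small lam favors large, sparse clusters; a large lam favors
    small, dense ones."""
    edge_set = {frozenset(e) for e in edges}
    cost = 0.0
    for u, v in combinations(nodes, 2):
        same = clustering[u] == clustering[v]
        if frozenset((u, v)) in edge_set:
            if not same:          # edge cut between clusters
                cost += 1 - lam
        elif same:                # missing edge inside a cluster
            cost += lam
    return cost

# Two triangles joined by a single edge.
nodes = range(6)
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
two_clusters = {0: 0, 1: 0, 2: 0, 3: 1, 4: 1, 5: 1}
one_cluster = {n: 0 for n in nodes}

# With a small lambda, merging everything is cheap; with a larger
# lambda, the two-triangle split wins.
print(lambda_cc_cost(nodes, edges, two_clusters, 0.1))  # 0.9 (one cut edge)
print(lambda_cc_cost(nodes, edges, one_cluster, 0.1))   # 0.8 (8 non-edges)
```

Sweeping λ from 0 to 1 and re-optimizing recovers coarser or finer clusterings, which is the sense in which a single parameter spans a family of clustering objectives.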
283

The Evaluation and Control of the Changes in Basic Statistics Encountered in Grouped Data

Scott, James P. 05 1900 (has links)
This dissertation describes the effect that the construction of frequency tables has on basic statistics computed from those frequency tables. It is directly applicable only to normally distributed data summarized by Sturges' Rule. The purpose of this research was to identify factors tending to bias sample statistics when data are summarized, and thus to allow researchers to avoid such bias. The methodology employed was a large scale simulation where 1000 replications of samples of size n = 2^(k-1) for k = 2 to 12 were drawn from a normally distributed population with a mean of zero and a standard deviation of one. A FORTRAN IV source listing is included. The report concludes that researchers should avoid the use of statistics computed from frequency tables in cases where raw data are available. Where the use of such statistics is unavoidable, the researchers can eliminate their bias by the use of empirical correction factors provided in the paper. Further research is suggested to determine the effect of summarization of data drawn from various non-normal distributions.
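The kind of simulation described above can be sketched in a few lines; the binning details here are illustrative assumptions (and modern Python stands in for the original FORTRAN IV), not the dissertation's actual code. The sketch draws a normal sample, groups it into a Sturges'-Rule frequency table, and compares statistics computed from class midpoints against the raw-data values.

```python
import math
import random

def sturges_bins(n):
    """Sturges' Rule: about 1 + log2(n) class intervals."""
    return 1 + math.ceil(math.log2(n))

def grouped_stats(data):
    """Mean and (population) standard deviation computed from a
    Sturges'-Rule frequency table, treating class midpoints as if
    they were the data."""
    lo, hi = min(data), max(data)
    k = sturges_bins(len(data))
    width = (hi - lo) / k
    counts = [0] * k
    for x in data:
        i = min(int((x - lo) / width), k - 1)   # clamp the maximum into the last bin
        counts[i] += 1
    mids = [lo + (i + 0.5) * width for i in range(k)]
    n = len(data)
    mean = sum(c * m for c, m in zip(counts, mids)) / n
    var = sum(c * (m - mean) ** 2 for c, m in zip(counts, mids)) / n
    return mean, math.sqrt(var)

random.seed(0)
sample = [random.gauss(0, 1) for _ in range(1024)]
raw_mean = sum(sample) / len(sample)
g_mean, g_sd = grouped_stats(sample)
print(f"raw mean {raw_mean:.4f}  grouped mean {g_mean:.4f}  grouped sd {g_sd:.4f}")
```

Repeating this over many replications and sample sizes, as the dissertation does, exposes the systematic component of the grouped-versus-raw discrepancy that correction factors can remove.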
284

Accurate and budget-efficient text, image, and video analysis systems powered by the crowd

Sameki, Mehrnoosh 22 February 2018 (has links)
Crowdsourcing systems empower individuals and companies to outsource labor-intensive tasks that cannot currently be solved by automated methods and are expensive to tackle by domain experts. Crowdsourcing platforms are traditionally used to provide training labels for supervised machine learning algorithms. Crowdsourced tasks are distributed among internet workers who typically have a range of skills and knowledge, differing previous exposure to the task at hand, and biases that may influence their work. This inhomogeneity of the workforce makes the design of accurate and efficient crowdsourcing systems challenging. This dissertation presents solutions to improve existing crowdsourcing systems in terms of accuracy and efficiency. It explores crowdsourcing tasks in two application areas: political discourse and annotation of biomedical and everyday images. The first part of the dissertation investigates how workers' behavioral factors and their unfamiliarity with data can be leveraged by crowdsourcing systems to control quality. Through studies that involve familiar and unfamiliar image content, the thesis demonstrates the benefit of explicitly accounting for a worker's familiarity with the data when designing annotation systems powered by the crowd. The thesis next presents Crowd-O-Meter, a system that automatically predicts the vulnerability of crowd workers to believing "fake news" in text and video. The second part of the dissertation explores the reversed relationship between machine learning and crowdsourcing by incorporating machine learning techniques for quality control of crowdsourced end products. In particular, it investigates whether machine learning can be used to improve the quality of crowdsourced results while also respecting budget constraints.
The thesis proposes an image analysis system called ICORD that utilizes behavioral cues of the crowd worker, augmented by automated evaluation of image features, to dynamically infer the quality of a worker-drawn outline of a cell in a microscope image. ICORD determines the need to seek additional annotations from other workers in a budget-efficient manner. Next, the thesis proposes a budget-efficient machine learning system that uses fewer workers to analyze easy-to-label data and more workers for data that require extra scrutiny. The system learns a mapping from data features to the number of allocated crowd workers for two case studies: sentiment analysis of Twitter messages and segmentation of biomedical images. Finally, the thesis uncovers the potential for design of hybrid crowd-algorithm methods by describing an interactive system for cell tracking in time-lapse microscopy videos, based on a prediction model that determines when automated cell tracking algorithms fail and human interaction is needed to ensure accurate tracking.
285

Robust Computational Tools for Multiple Testing with Genetic Association Studies

Welbourn, William L., Jr. 01 May 2012 (has links)
Resolving the interplay of the genetic components of a complex disease is a challenging endeavor. Over the past several years, genome-wide association studies (GWAS) have emerged as a popular approach for locating common genetic variation within the human genome associated with disease risk. Assessing genotype-phenotype associations across hundreds of thousands of genetic markers, as the GWAS approach does, introduces the potential for a high number of false-positive signals and requires statistical correction for multiple hypothesis testing. Permutation tests are considered the gold standard for multiple testing correction in GWAS, because they simultaneously provide unbiased Type I error control and high power. However, they demand heavy computational effort, especially with the large-scale data sets of modern GWAS. In recent years, the computational problem has been circumvented by using approximations to permutation tests, but several studies have identified sampling conditions under which these approximations appear to be biased. We have developed an optimized parallel algorithm for the permutation testing approach to multiple testing correction in GWAS, whose implementation essentially abates the computational problem. Applied to GWAS data, our algorithm yields rapid, precise, and powerful multiplicity adjustment, many orders of magnitude faster than existing GWAS statistical software. Although GWAS have identified many potentially important genetic associations which will advance our understanding of human disease, the common variants with modest effects on disease risk discovered through this approach likely account for a small proportion of the heritability in complex disease. On the other hand, interactions between genetic and environmental factors could account for a substantial proportion of the heritability in a complex disease and are overlooked within the GWAS approach.
We have developed an efficient and easily implemented tool for genetic association studies, whose aim is identifying genes involved in a gene-environment interaction. Our approach is amenable to a wide range of association studies and assorted densities in sampled genetic marker panels, and incorporates resampling for multiple testing correction. Within the context of a case-control study design we demonstrate by way of simulation that our proposed method offers greater statistical power to detect gene-environment interaction, when compared to several competing approaches to assess this type of interaction.
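As a rough illustration of permutation-based multiplicity adjustment in this setting, the sketch below implements a simplified Westfall-Young-style max-T correction using a difference-of-means statistic. The statistic, data shapes, and permutation count are hypothetical stand-ins; this is not the thesis's optimized parallel algorithm.

```python
import numpy as np

def maxT_adjusted_pvalues(genotypes, phenotype, n_perm=2000, seed=1):
    """Simplified max-T permutation correction.  `genotypes` is an
    (n_samples, n_markers) array and `phenotype` a binary vector.
    Each marker is scored by the absolute difference of group means;
    the null distribution of the *maximum* score across markers
    controls the family-wise error rate."""
    rng = np.random.default_rng(seed)

    def scores(y):
        g1 = genotypes[y == 1].mean(axis=0)
        g0 = genotypes[y == 0].mean(axis=0)
        return np.abs(g1 - g0)

    observed = scores(phenotype)
    max_null = np.empty(n_perm)
    for b in range(n_perm):
        max_null[b] = scores(rng.permutation(phenotype)).max()
    # Adjusted p-value: fraction of permutations whose max beats the marker.
    return (1 + (max_null[:, None] >= observed[None, :]).sum(axis=0)) / (n_perm + 1)

rng = np.random.default_rng(0)
n, m = 200, 50
genos = rng.integers(0, 3, size=(n, m)).astype(float)  # 0/1/2 allele counts
pheno = rng.integers(0, 2, size=n)
genos[:, 0] += 0.8 * pheno          # plant one truly associated marker
adj_p = maxT_adjusted_pvalues(genos, pheno)
print(adj_p[0], adj_p[1:].min())    # marker 0 should stand out
```

The computational burden the thesis attacks is visible even here: the permutation loop dominates the cost and grows with both the marker count and the number of permutations, which is why parallelizing it pays off at GWAS scale.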
286

Intervention of Performance: Applications of Parametric Studies on Connective Design in Public Interaction

Funderburk, Joshua A. 28 June 2021 (has links)
No description available.
287

Efficient Mobile Computation Offloading with Hard Task Deadlines and Concurrent Local Execution

Teymoori, Peyvand January 2021 (has links)
Mobile computation offloading (MCO) can alleviate the hardware limitations of mobile devices by migrating heavy computational tasks from mobile devices to more powerful cloud servers. This can lead to better performance and energy savings for the mobile devices. This thesis considers MCO over stochastic wireless channels when task completion times are subject to hard deadline constraints. Hard deadlines, however, are difficult to meet in conventional computation offloading due to the randomness caused by the wireless channels. In the proposed offloading policies, concurrent local execution (CLE) is used to guarantee task execution time constraints. By sometimes allowing simultaneous local and remote execution, CLE ensures that job deadlines are always satisfied in the face of any unexpected wireless channel conditions. The thesis introduces online optimal algorithms that reduce the remote and local execution overlap so that energy wastage is minimized. Markov processes are used to model the communication channels. MCO is addressed for three different job offloading schemes: continuous, multi-part, and preemptive. In the case of continuous offloading, referred to as 1-Part offloading, the mobile device will upload the entire job in one piece without interruption, when the scheduler decides to do so. In multi-part computation offloading, the job is partitioned into a known number (K) of parts, and each part is uploaded separately. In this offloading mechanism, which is referred to as K-Part Offloading, the upload initiation times of each part must be determined dynamically during runtime, and there may be waiting time periods between consecutive upload parts. Preemptive offloading is a generalization of K-Part Offloading where the number of task upload parts is unknown. In this scheme, a decision to either continue offloading or to temporarily interrupt the offload is made at the start of each time slot. 
Compared to conventional contiguous computation offloading, interrupted offloading mechanisms (i.e., K-Part and preemptive offloading) allow the system to adapt when channel conditions change and may therefore result in lower mobile device energy consumption. This energy reduction is obtained at the expense of higher computational complexity. In this thesis, for each offloading scheme, an online computation offloading algorithm is introduced by constructing a time-dilated absorbing Markov chain (TDAMC) and applying dynamic programming (DP). These algorithms are shown to be energy-optimal while ensuring that the hard task deadline constraints are always satisfied. The optimality of these algorithms is proved using Markovian decision process stopping theory. Since the computational complexity of the proposed online algorithms, especially in the case of preemptive offloading, can be significant, three simpler and computationally efficient approximation methods are introduced: Markovian Compression (MC), Time Compression (TC), and Preemption Using Continuous Offloading (Preemption-CO). MC and TC reduce the state space of the offloading Markovian process by using a novel notion of geometric similarity or by running an optimal online offloading algorithm in periodic time steps. In Preemption-CO, while a task is offloaded preemptively, the offloading decision at every time slot is based on non-preemptive calculations. These methods are used alone or in combination to construct practical offloading algorithms. A variety of results are presented that show the tradeoffs between complexity and mobile energy-saving performance for the different algorithms.
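The flavor of the dynamic programs involved can be conveyed with a toy finite-horizon sketch: a two-state Markov channel, a hard deadline, and a local-execution fallback playing the role of CLE. All parameters below are invented for illustration, and the recursion is far simpler than the thesis's TDAMC construction.

```python
from functools import lru_cache

# Two-state Gilbert-Elliott-style channel: 0 = bad, 1 = good.
P = [[0.7, 0.3],     # transition probabilities from the bad state
     [0.2, 0.8]]     # transition probabilities from the good state
RATE    = [1, 3]     # bits delivered per slot if we transmit
E_TX    = [3.0, 1.0] # transmit energy per slot in each channel state
E_LOCAL = 20.0       # energy of finishing the job locally (the CLE fallback)

@lru_cache(maxsize=None)
def min_energy(slots_left, bits_left, ch):
    """Minimum expected energy to upload `bits_left` bits within
    `slots_left` slots, starting in channel state `ch`.  If the
    deadline arrives with bits remaining, the concurrent local
    execution guarantees on-time completion at cost E_LOCAL."""
    if bits_left <= 0:
        return 0.0
    if slots_left == 0:
        return E_LOCAL           # hard deadline met by the local copy
    nxt = lambda b: sum(P[ch][c] * min_energy(slots_left - 1, b, c)
                        for c in (0, 1))
    idle = nxt(bits_left)                              # wait for a better channel
    transmit = E_TX[ch] + nxt(bits_left - RATE[ch])    # spend energy now
    return min(idle, transmit)

# A 10-bit job with a 6-slot deadline, starting in the good channel state:
print(f"expected energy: {min_energy(6, 10, 1):.2f}")
```

Note how the deadline constraint never fails: the worst the policy can do is fall back to the local copy at cost E_LOCAL, which mirrors how CLE converts a hard deadline into a bounded energy penalty.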
288

Quantum circuit analysis using analytic functions

Abobakr, Mona R.H. January 2019 (has links)
In this thesis, classical computation is first introduced. Finite quantum systems are considered with a D-dimensional Hilbert space, and position x and momentum p taking values in Z(D) (the integers modulo D). An analytic representation of finite quantum systems that uses Theta functions is presented and considered. The first novel part of this thesis is a contribution to the study of reversible classical CNOT gates and their binary inputs and outputs within reversible circuits. Furthermore, reversible classical Toffoli gates are considered, as well as the implementation of a Boolean expression with classical CNOT and Toffoli gates. Reversible circuits with classical CNOT and Toffoli gates are also considered. The second novel part of this thesis is the study of quantum computation in terms of CNOT and Toffoli gates. Analytic representations and their zeros are considered, and the zeros of the inputs and outputs of quantum CNOT and Toffoli gates are studied. Approximate computation of the zeros of the outputs is also presented. Finally, some quantum circuits are discussed.
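The classical reversible gates discussed above are easy to demonstrate directly. This minimal sketch (not taken from the thesis) shows CNOT and Toffoli acting on classical bits and checks the reversibility property that motivates their use.

```python
def cnot(control, target):
    """Classical reversible CNOT: flips the target bit when the control is 1."""
    return control, target ^ control

def toffoli(a, b, target):
    """Classical reversible Toffoli: flips the target when both controls are 1."""
    return a, b, target ^ (a & b)

# Toffoli with target = 0 computes AND reversibly (it is universal for
# reversible classical computation); CNOT with target = 0 copies a bit.
print(toffoli(1, 1, 0))   # (1, 1, 1)
print(cnot(1, 0))         # (1, 1)

# Each gate is its own inverse: applying it twice restores the input.
for bits in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert cnot(*cnot(*bits)) == bits
```

Because no information is discarded, any Boolean expression built from these gates can be run backwards, which is the classical property the thesis carries over into the quantum setting.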
289

Evolutionary Optimization of Decision Trees for Interpretable Reinforcement Learning

Custode, Leonardo Lucio 27 April 2023 (has links)
While Artificial Intelligence (AI) is making giant steps, it is also raising concerns about its trustworthiness, because widely used black-box models cannot be exactly understood by humans. One way to improve humans' trust in AI is to use interpretable AI models, i.e., models that can be thoroughly understood by humans and thus trusted. However, interpretable AI models are not typically used in practice, as they are thought to be less performant than black-box models. This is especially evident in Reinforcement Learning, where relatively little work addresses the problem of performing Reinforcement Learning with interpretable models. In this thesis, we address this gap by proposing methods for Interpretable Reinforcement Learning. For this purpose, we optimize Decision Trees by combining Reinforcement Learning with Evolutionary Computation techniques, which allows us to overcome some of the challenges tied to optimizing Decision Trees in Reinforcement Learning scenarios. The experimental results show that these approaches are competitive with the state of the art while being far easier to interpret. Finally, we show the practical importance of Interpretable AI by digging into the inner workings of the solutions obtained.
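A minimal sketch of the general idea — evolving a decision-tree policy by evolutionary search rather than gradient-based RL — is given below on an invented one-dimensional control task. The environment, fitness function, and (1+λ)-style loop are illustrative assumptions, not the methods of the thesis.

```python
import random

def episode_return(theta, steps=30, seed=0):
    """Toy 1-D control task: the agent moves +/-1 each step and the
    reward is -|x|.  The policy is a depth-1 decision tree:
    'if x < theta: move right, else move left'."""
    rng = random.Random(seed)
    x, total = rng.uniform(-5, 5), 0.0
    for _ in range(steps):
        x += 1.0 if x < theta else -1.0
        x += rng.uniform(-0.1, 0.1)   # small dynamics noise
        total += -abs(x)
    return total

def evolve(generations=60, pop=20, sigma=0.5, seed=1):
    """(1+lambda)-style evolution of the single tree threshold:
    mutate the incumbent, keep whichever candidate scores best over
    a fixed set of evaluation seeds."""
    rng = random.Random(seed)
    fitness = lambda th: sum(episode_return(th, seed=s) for s in range(5))
    best = rng.uniform(-5, 5)
    for _ in range(generations):
        cands = [best] + [best + rng.gauss(0, sigma) for _ in range(pop)]
        best = max(cands, key=fitness)
    return best

theta = evolve()
print(f"evolved threshold: {theta:.2f}")
```

The evolved policy is a single human-readable rule, which is the interpretability payoff: unlike a neural policy, the learned behavior can be stated in one sentence and audited directly.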
290

Disentanglement Puzzles and Computation

Miller, Jacob K. January 2017 (has links)
No description available.
