51

Defining Display Complexity in Electric Utility System Operator Displays

McElhaney, Steven Hunt 14 December 2013 (has links)
In the electric utility industry, displays provide power system operators with information on the status of the system; operators then use that information to decide how to maintain the safe, reliable, and efficient operation of the utility generation and transmission grid. Complexity in the presented data and in the display itself can lead to errors or misjudgments that cause power system operators to make unwise decisions. The primary goal of this research was to develop a method to quantify display complexity for select displays used by system operators when operating the electric generation and transmission grids. Three studies were performed: (1) complexity measure development, (2) validation of the measure using usability and situation awareness (SA) techniques, and (3) display revisions based on complexity measure findings. Fifteen different complexity metrics were originally considered (additive models, multiplicative models, and combination models, each with five different weighting schemes). The additive model with equal weighting was found to be the most sensitive in differentiating displays and was used in the later studies. For the validation study, system operators completed a usability questionnaire and a paper-based SA test using the current displays. Correlation and scatter plot analyses were used to determine whether the complexity metric was related to usability and SA scores. Results of the validation study indicated that usability and SA scores for the studied displays were not well correlated with the complexity metric. In study 3, the highest- and lowest-scoring displays were redesigned with an emphasis on maintaining functionality while reducing the aspects of complexity that were driving the complexity score. System operators again completed the usability and SA testing using the redesigned displays, and correlation analysis was performed again. As in study 2, usability scores were not correlated with the complexity metric; however, SA scores were significantly correlated. The complexity metric developed here can be used to quantify the complexity of a display and to identify redesign opportunities that reduce non-essential information, since less complex displays should yield improved operator performance and satisfaction.
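The abstract does not reproduce the winning metric itself, but an additive model with equal weighting has a simple generic form. The Python sketch below illustrates that form; the component names and scores are hypothetical, not the dissertation's actual complexity components.

```python
# A minimal sketch of an additive display-complexity metric with equal
# weighting, in the spirit of the model the abstract describes. The
# component names and scores are hypothetical illustrations.

def additive_complexity(component_scores):
    """Combine per-component scores into one value using equal weights."""
    weight = 1.0 / len(component_scores)
    return sum(weight * score for score in component_scores.values())

# Hypothetical component scores for one display, each normalized to 0-100.
display_scores = {
    "element_count": 62.0,   # number of on-screen objects
    "color_variety": 35.0,   # distinct colors in use
    "line_crossings": 48.0,  # overlapping connectors
    "text_density": 71.0,    # characters per unit area
}

print(f"Display complexity: {additive_complexity(display_scores):.1f}")
```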
52

Membership testing in transformation monoids

Beaudry, Martin January 1987 (has links)
No description available.
53

The conceptual design of robotic architectures using complexity criteria

Khan, Waseem A. January 2007 (has links)
No description available.
54

Computational complexity analysis of decision tree algorithms

Sani, Habiba M., Lei, Ci, Neagu, Daniel 16 November 2018 (has links)
Decision trees are a simple but powerful learning technique, widely regarded as among the most successful algorithms in practice for various classification tasks. They have the advantage of producing a comprehensible classification model with satisfactory accuracy levels in several application domains. In recent years, the volume of data available for learning has increased dramatically. As a result, many application domains face large amounts of data, which poses a major bottleneck for the computational feasibility of learning techniques. There are different implementations of decision trees using different techniques. In this paper, we theoretically and experimentally study and compare the computational cost of the most common classical top-down decision tree algorithms (C4.5 and CART). This work can serve as a review that analyses the computational complexity of existing decision tree classifier algorithms, with the aim of understanding their operational steps and optimizing the learning algorithm for large datasets.
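As a rough illustration of the kind of empirical comparison the paper describes, the sketch below times tree induction under the entropy criterion (the impurity measure behind C4.5) against the Gini criterion (used by CART), assuming scikit-learn is available. This is an illustrative setup, not the authors' experiment.

```python
# Sketch of an empirical runtime comparison of entropy-based (C4.5-style)
# vs. Gini-based (CART-style) decision tree induction, using scikit-learn
# as a stand-in for the algorithms' reference implementations.
import time
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Synthetic classification data; size chosen only for illustration.
X, y = make_classification(n_samples=50_000, n_features=20, random_state=0)

for criterion in ("entropy", "gini"):
    clf = DecisionTreeClassifier(criterion=criterion, random_state=0)
    start = time.perf_counter()
    clf.fit(X, y)
    print(f"{criterion}: trained in {time.perf_counter() - start:.2f}s")
```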
55

Essays in Microeconomic Theory

Dall'Ara, Pietro January 2024 (has links)
Thesis advisor: Mehmet Ekmekci / This dissertation consists of two independent essays. In the first essay, Coordination in Complex Environments, I introduce a framework to study coordination in highly uncertain environments. Coordination is an important aspect of innovative contexts, where the more innovative a course of action, the more uncertain its outcome. To explore the interplay of coordination and informational complexity, I embed a beauty-contest game into a complex environment. I uncover a new conformity phenomenon. The new effect may push towards the exploration of unknown alternatives or constitute a status quo bias, depending on the network structure of the connections among players. In the second essay, The Extensive Margin of Bayesian Persuasion, I study the persuasion of a receiver who accesses information only if she exerts attention effort. The sender uses the information to incentivize the receiver to pay attention. I show that persuasion mechanisms are equivalent to signals. In a model of media capture, the sender finds it optimal to censor high states. / Thesis (PhD) — Boston College, 2024. / Submitted to: Boston College. Graduate School of Arts and Sciences. / Discipline: Economics.
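For readers unfamiliar with the setup, the textbook beauty-contest payoff (not necessarily the exact specification used in the essay) trades off matching a fundamental against matching the average action:

```latex
% Canonical beauty-contest payoff for player $i$: $\theta$ is the
% fundamental, $\bar{a}$ the average action, and $r \in (0,1)$ weights
% the coordination motive. A textbook form, not the essay's own model.
u_i(a_i, \bar{a}, \theta) = -(1 - r)\,(a_i - \theta)^2 - r\,(a_i - \bar{a})^2
```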
56

Instance compression of parametric problems and related hierarchies

Chakraborty, Chiranjit January 2014 (has links)
We define instance compressibility ([13, 17]) for parametric problems in the classes PH and PSPACE. We observe that the problem Σ_i CIRCUITSAT of deciding satisfiability of a quantified Boolean circuit with i-1 alternations of quantifiers starting with an existential quantifier is complete for parametric problems in the class Σ^p_i with respect to w-reductions, and that, analogously, the problem QBCSAT (Quantified Boolean Circuit Satisfiability) is complete for parametric problems in PSPACE with respect to w-reductions. We show the following results about these problems: 1. If CIRCUITSAT is non-uniformly compressible within NP, then Σ_i CIRCUITSAT is non-uniformly compressible within NP, for any i ≥ 1. 2. If QBCSAT is non-uniformly compressible (or even if satisfiability of quantified Boolean CNF formulae is non-uniformly compressible), then PSPACE ⊆ NP/poly and PH collapses to the third level. Next, we define Succinct Interactive Proof (Succinct IP) and, by adapting the proof of IP = PSPACE ([11, 6]), show that QBCNFSAT (Quantified Boolean Formula (in CNF) Satisfiability) is in Succinct IP. In contrast, if QBCNFSAT has Succinct PCPs ([32]), the Polynomial Hierarchy (PH) collapses. After extending the notion of instance compression to higher classes, we study the hierarchical structure of parametric problems with respect to compressibility. For that purpose, we extend the existing definition of the VC-hierarchy ([13]) to parametric problems. We then consider a long list of natural NP problems and classify them into levels of the VC-hierarchy, establishing several new w-reductions along the way and pointing out a few interesting results, including the following. 1. CLIQUE is VC_1-complete (using the results in [14]). 2. SET SPLITTING and NAE-SAT are VC_2-complete. We also introduce a new complexity class, VCE, in this context and show some hardness and completeness results for this class, and we compare the VC-hierarchy with other related hierarchies in the parameterized complexity domain. Next, we define the compression of counting problems and an analogous classification with respect to instance compression. We define the #VC-hierarchy for this purpose and classify a large number of natural counting problems with respect to this hierarchy, showing some interesting hardness and completeness results. Beyond popular NP problems, we also consider some interesting practical problems (e.g., #MULTICOLOURED CLIQUE, #SELECTED DOMINATING SET, etc.) and study their complexity in both decision and counting versions. We also consider a large variety of circuit satisfiability problems (e.g., #MONOTONE WEIGHTED-CNFSAT, #EXACT DNF-SAT, etc.) and prove some interesting results about them with respect to the theory of instance compressibility.
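For context, instance compression is commonly formalized as a polynomial-time reduction whose output size is bounded by the parameter alone; the thesis's precise definition may differ in details:

```latex
% A common formalization of instance compression for a parametric
% problem, where each instance $x$ carries a parameter $n(x)$ (e.g.,
% the number of variables of a circuit). $L$ is compressible within a
% class $\mathcal{C}$ if there exist a polynomial-time computable $f$
% and a language $L' \in \mathcal{C}$ such that
|f(x)| \le \mathrm{poly}(n(x))
\qquad \text{and} \qquad
x \in L \iff f(x) \in L'.
```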
57

ON THE PROPERTIES AND COMPLEXITY OF MULTICOVERING RADII

Mertz, Andrew Eugene 01 January 2005 (has links)
People rely on the ability to transmit information over channels of communication that are subject to noise and interference. This makes the ability to detect and recover from errors extremely important. Coding theory addresses this need for reliability. A fundamental question of coding theory is whether and how we can correct the errors in a message that has been subjected to interference. One answer comes from structures known as error correcting codes. A well studied parameter associated with a code is its covering radius. The covering radius of a code is the smallest radius such that every vector in the Hamming space of the code is contained in a ball of that radius centered around some codeword. The covering radius relates to an important decoding strategy known as nearest neighbor decoding. The multicovering radius is a generalization of the covering radius that was proposed by Klapper [11] in the course of studying stream ciphers. In this work we develop techniques for finding the multicovering radius of specific codes. In particular, we study the even weight code, the 2-error correcting BCH code, and linear codes with covering radius one. We also study questions involving the complexity of finding the multicovering radius of codes. We show: Lower bounding the m-covering radius of an arbitrary binary code is NP-complete when m is polynomial in the length of the code. Lower bounding the m-covering radius of a linear code is Σ^p_2-complete when m is polynomial in the length of the code. If P is not equal to NP, then the m-covering radius of an arbitrary binary code cannot be approximated within a constant factor or within a factor n^ϵ, where n is the length of the code and ϵ < 1, in polynomial time. Note that the case m = 1 was also previously unknown. If NP is not equal to Σ^p_2, then the m-covering radius of a linear code cannot be approximated within a constant factor or within a factor n^ϵ, where n is the length of the code and ϵ < 1, in polynomial time.
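The prose definitions admit a compact formalization; the following is the standard way these quantities are written in the covering-codes literature (the notation is ours, not necessarily the dissertation's):

```latex
% Covering radius of a binary code $C \subseteq \mathbb{F}_2^n$, with
% $d$ the Hamming distance: the farthest any vector lies from $C$.
t(C) = \max_{x \in \mathbb{F}_2^n} \, \min_{c \in C} \, d(x, c)

% $m$-covering radius (Klapper [11]): the smallest radius $r$ such that
% every set of $m$ vectors lies in a ball of radius $r$ centered at
% some codeword.
t_m(C) = \max_{\substack{S \subseteq \mathbb{F}_2^n \\ |S| = m}} \,
         \min_{c \in C} \, \max_{x \in S} \, d(x, c)
```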
58

The Coordination Dynamics of Multiple Agents

Unknown Date (has links)
A fundamental question in Complexity Science is how numerous dynamic processes coordinate with each other on multiple levels of description to form a complex whole: a multiscale coordinative structure (e.g., a community of interacting people, organs, cells, molecules, etc.). This dissertation includes a series of empirical, theoretical and methodological studies of rhythmic coordination between multiple agents to uncover dynamic principles underlying multiscale coordinative structures. First, a new experimental paradigm was developed for studying coordination at multiple levels of description in intermediate-sized (N = 8) ensembles of humans. Based on this paradigm, coordination dynamics in 15 ensembles were examined experimentally, where the diversity of subjects' movement frequency was manipulated to induce different grouping behavior. Phase coordination between subjects was found to be metastable with inphase and antiphase tendencies. Higher frequency diversity led to segregation between frequency groups, reduced intragroup coordination, and dispersion of dyadic phase relations (i.e. relations at different levels of description). Subsequently, a model was developed, successfully capturing these observations. The model reconciles the Kuramoto and the extended Haken-Kelso-Bunz model (for large- and small-scale coordination respectively) by adding the second-order coupling from the latter to the former. The second-order coupling is indispensable in capturing experimental observations and connects behavioral complexity (i.e. multistability) of coordinative structures across scales. Both the experimental and theoretical studies revealed multiagent metastable coordination as a powerful mechanism for generating complex spatiotemporal patterns. Coexistence of multiple phase relations gives rise to many topologically distinct metastable patterns with different degrees of complexity. Finally, a new data-analytic tool was developed to quantify complex metastable patterns based on their topological features. The recurrence of topological features revealed important structures and transitions in high-dimensional dynamic patterns that eluded its non-topological counterparts. Taken together, the work has paved the way for a deeper understanding of multiscale coordinative structures. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2018. / FAU Electronic Theses and Dissertations Collection
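Schematically, the reconciliation described above amounts to augmenting the Kuramoto phase equations with the second-harmonic coupling term characteristic of the extended Haken-Kelso-Bunz model. The form below is a generic sketch with first- and second-order coupling strengths a and b, not necessarily the dissertation's exact parameterization:

```latex
% Kuramoto-type dynamics for agent $i$'s phase $\theta_i$ with natural
% frequency $\omega_i$, augmented by a second-order (second-harmonic)
% coupling term as in the extended HKB model. Schematic form only.
\dot{\theta}_i = \omega_i + \frac{1}{N} \sum_{j=1}^{N}
  \Big[ a \sin(\theta_j - \theta_i) + 2b \sin\!\big( 2 (\theta_j - \theta_i) \big) \Big]
```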
59

Parallelism with limited nondeterminism

Finkelstein, Jeffrey 05 March 2017 (has links)
Computational complexity theory studies which computational problems can be solved with limited access to resources. The past fifty years have seen a focus on the relationship between intractable problems and efficient algorithms. However, the relationship between inherently sequential problems and highly parallel algorithms has not been as well studied. Are there efficient but inherently sequential problems that admit some relaxed form of highly parallel algorithm? In this dissertation, we develop the theory of structural complexity around this relationship for three common types of computational problems. Specifically, we show tradeoffs between time, nondeterminism, and parallelizability. By clearly defining the notions and complexity classes that capture our intuition for parallelizable and sequential problems, we create a comprehensive framework for rigorously proving parallelizability and non-parallelizability of computational problems. This framework provides the means to prove whether otherwise tractable problems can be effectively parallelized, a need highlighted by the current growth of multiprocessor systems. The views adopted by this dissertation—alternate approaches to solving sequential problems using approximation, limited nondeterminism, and parameterization—can be applied practically throughout computer science.
60

Computational Structure of GPSG Models: Revised Generalized Phrase Structure Grammar

Ristad, Eric Sven 01 September 1989 (has links)
The primary goal of this report is to demonstrate how considerations from computational complexity theory can inform grammatical theorizing. To this end, generalized phrase structure grammar (GPSG) linguistic theory is revised so that its power more closely matches the limited ability of an ideal speaker-hearer: GPSG Recognition is EXP-POLY time hard, while Revised GPSG Recognition is NP-complete. A second goal is to provide a theoretical framework within which to better understand the wide range of existing GPSG models, embodied in formal definitions as well as in implemented computer programs. A grammar for English and an informal explanation of the GPSG/RGPSG syntactic features are included in appendices.
