91

On the complexity of finding optimal edge rankings /

Yue, Fung-ling. January 1996 (has links)
Thesis (M. Phil.)--University of Hong Kong, 1997. / Includes bibliographical references (leaf 83-84).
92

Algorithmic applications of propositional proof complexity /

Sabharwal, Ashish, January 2005 (has links)
Thesis (Ph. D.)--University of Washington, 2005. / Vita. Includes bibliographical references (p. 155-165).
93

An integrated complexity analysis of problems from computational biology

Hallett, Michael Trevor 17 July 2018 (has links)
We perform an integrated complexity analysis on a number of combinatorial problems arising from the field of computational biology. The classic framework of NP-completeness, algorithmic design techniques for bounded-width graphs, and parameterized computational complexity together provide a clear and detailed map of the intrinsic hardness of the following problems: INTERVALIZING COLORED GRAPHS and SHORTEST COMMON SUPERSEQUENCE. The fundamental concern of parameterized complexity is the apparent qualitative difference in algorithmic behaviour displayed by many problems when one or more input parameters are bounded. For many problems, only a small range of values for these parameters captures most instances arising in practice. This is certainly the case in computational biology in several specific arenas such as DNA physical mapping or multiple sequence alignment. At its most general level, parameterized complexity partitions problems into two classes: fixed-parameter tractable (FPT) and fixed-parameter intractable (hard for classes of the W-hierarchy). The former indicates that the particular parameterization may allow for efficient practical algorithms, whilst the latter indicates the parameterization is not effective (asymptotically) in alleviating the intractability. The problem INTERVALIZING COLORED GRAPHS (ICG) models in a straightforward albeit limited way the determination of contig assemblies in the mapping of DNA. We show ICG to be NP-complete (no polynomial-time algorithm unless P=NP), not finite-state (a very general algorithmic design technique for bounded-width graphs fails), and hard for the parameterized complexity class W[1] (a specific parameterized version of ICG does not admit an efficient algorithm unless many other well-known - and apparently hard - problems admit efficient algorithms). Both SHORTEST COMMON SUPERSEQUENCE and its sister problem LONGEST COMMON SUBSEQUENCE have applications in multiple sequence alignment. We show that SHORTEST COMMON SUPERSEQUENCE PARAMETERIZED BY THE NUMBER OF INPUT STRINGS AND THE SIZE OF THE ALPHABET is hard for the complexity class W[1]. As is the case with ICG, this implies that it does not admit efficient algorithms unless some unlikely computational complexity collapses occur. / Graduate
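By way of contrast with the hardness results above, the two-string case of SHORTEST COMMON SUPERSEQUENCE is solvable by a classic polynomial-time dynamic program; the W[1]-hardness shown in the thesis concerns the problem parameterized by the number of input strings and the alphabet size. A minimal illustrative sketch, not taken from the thesis itself:

```python
def scs_length(a: str, b: str) -> int:
    """Length of a shortest common supersequence of two strings, via the
    classic O(|a|*|b|) dynamic program. Illustrative only: with k input
    strings the natural DP needs roughly O(n^k) time, and the thesis shows
    the problem parameterized by k and the alphabet size is W[1]-hard, so
    no f(k)*poly(n) algorithm is expected."""
    n, m = len(a), len(b)
    # dp[i][j] = SCS length of the prefixes a[:i] and b[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        dp[i][0] = i          # only characters of a remain
    for j in range(m + 1):
        dp[0][j] = j          # only characters of b remain
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1   # share the common character
            else:
                dp[i][j] = min(dp[i - 1][j], dp[i][j - 1]) + 1
    return dp[n][m]

# Example: a shortest common supersequence of "AGGTAB" and "GXTXAYB"
# is "AGXGTXAYB", of length 9.
assert scs_length("AGGTAB", "GXTXAYB") == 9
```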
94

L'influence de la reconnaissance sur la puissance d'agir : une approche biographique de personnes en situation de vulnérabilité / Mechanisms which play a role in the development of the empowerment : a biographical approach of people in a vulnerability situation

Lehoux-Fleury, Catherine 22 December 2017 (has links)
Cette thèse étudie les mécanismes qui entrent en action dans le développement du pouvoir d’agir des individus en situation de vulnérabilité. À cette fin, nous nous appuyons d’une part sur des travaux en sociologie, philosophie, linguistique, et d’autre part sur notre expérience professionnelle dans le domaine du socio-éducatif (expérience retransmise et analysée ici via un récit d’investigation professionnelle), qui nous a permis de recueillir le témoignage de praticiens et de patients. L’auteure a ainsi procédé à douze échanges conversationnels avec des personnes en situation de handicap et de précarité. Ces entretiens sont retranscrits en annexe dans le tome II de cette thèse, tandis que le tome I procède à leur analyse avec les outils de la recherche biographique (socianalyse issue de l’analyse institutionnelle, étude via les catégories de Walter R. Heinz : des grilles d’analyses adaptées par Christine Delory-Momberger). Nous montrons aussi comment un tiers-lieu, un groupe d’échanges de pratiques, a été utilisé pour « déplier la complexité », au sens d’Edgar Morin, de l’approche biographique et de la question de l’implication du chercheur. Les intrications entre pouvoir d’agir et reconnaissance se retrouvent incarnées, au sein de cette recherche, par l’ensemble des personnes dont nous analysons les entretiens. À cette occasion, nous revisitons notamment les réflexions de Blaise Pascal, Hegel, Hannah Arendt, et les travaux de Paul Ricoeur, d’Axel Honneth et d’Emmanuel Renault. Cette démarche théorique s’est doublée de deux interactions concrètes avec la caisse primaire d’assurance maladie et avec un foyer d’accueil de personnes handicapées : nous illustrons comment, via la jonction de nos considérations universitaires et de notre sensibilité de praticienne dans le social, nous avons pu amener ces deux institutions à modifier leur approche, leur regard. / This PhD thesis studies the mechanisms which play a role in the development of the empowerment of people in a vulnerable situation. To this end, we base our approach on previous works in sociology, philosophy, linguistics, and also on our professional experiences in the socio-educative area, which allowed us to collect testimonies of social workers and their patients. The author thus proceeded to twelve conversational exchanges with people with disabilities or in situations of precariousness. These exchanges are transcribed into an annex, which constitutes the second volume of this thesis, while the first volume provides their analysis via biographical research tools (socianalysis stemming from institutional analysis, analysis via Walter R. Heinz's categories: an analysis framework adapted by Christine Delory-Momberger). What is more, our experience is summarized and analyzed here via a professional investigation narrative account. We also show how a « Third Place » (a group of exchanges of practices) was used in order to, to quote Edgar Morin, « unfold the complexity » of the biographical approach and of the issue of the researcher’s implication. The entanglements between empowerment and social recognition are here embodied, in this research, by each person we interviewed. On this occasion, we revisit in particular some thoughts by Blaise Pascal, Hegel, Hannah Arendt, and the works of Paul Ricoeur, Axel Honneth, and Emmanuel Renault.
This theoretical approach is accompanied by two concrete interactions with the CPAM (the main institution of the French Health System) and a group home for people with disabilities: we illustrate how, via the junction of our university point of view and our sensibility as a social worker, we have been able to convince these two institutions to adapt their approach and their perspectives.
95

Essays in Decision Theory

Gu, Yuan 12 June 2023 (has links)
This dissertation studies decision theories for both individual and interactive choice problems, proposing three non-standard models that modify assumptions and settings of standard models. Chapter 1 provides an overview of the dissertation. In the second chapter I present a model of decision-making under uncertainty in which an agent is constrained in her cognitive ability to consider complex acts. The complexity of an act is identified by the corresponding partition of the state space. The agent ranks acts according to the expected utility net of complexity cost. I introduce a new axiom called Aversion to Complexity, which depicts an agent's aversion to complex acts. This axiom, together with other modified classical expected utility axioms, characterizes a Complexity Aversion Representation. In addition, I present applications to competitive markets with uncertainty and to optimal contract design. The third chapter discusses how a complexity-averse agent measures the complexity cost of an act after she receives new information. I propose an updating rule for the complexity cost function called Minimal Complexity Updating. The idea is that if the agent is told that the true state must belong to a particular event, she need not consider the complexity of an act outside of this event. The main result axiomatically characterizes the Minimal Complexity Aversion Representation. Lastly, I apply the idea of Minimal Complexity Updating to the theory of rational inattention. The last chapter deals with a variant model of fictitious play, in which each player has a perturbation term that measures to what extent his rival will stick to the rules of traditional fictitious play. I find that the empirical distribution can converge to a pure Nash equilibrium if the perturbation term is bounded. Furthermore, I introduce an updating rule for the perturbation term. I prove that if the perturbation term is updated in accordance with this rule, then play can converge to a pure Nash equilibrium. / Doctor of Philosophy
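The ranking by "expected utility net of complexity cost" described in the abstract can be written schematically. The following is an illustrative sketch based only on the abstract; the prior μ, utility index u, partition Π_f induced by the act f, and cost function c are assumptions of this sketch, and the exact representation axiomatized in the dissertation may differ.

```latex
% Illustrative sketch only: one natural reading of "expected utility net of
% complexity cost". Here \mu is a prior over states, u a utility index,
% \Pi_f the partition of the state space induced by the act f, and c a
% complexity cost function; these symbols are assumptions of this sketch,
% not notation quoted from the dissertation.
\[
V(f) \;=\; \sum_{E \in \Pi_f} \mu(E)\, u\bigl(f(E)\bigr) \;-\; c(\Pi_f),
\qquad
f \succsim g \iff V(f) \ge V(g).
\]
```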
96

DWELLING AND WORK PLACES IN THE POST-INDUSTRIAL ERA

Proefrock, Philip S. January 2006 (has links)
No description available.
97

Ambiguity in Public Organizations - Is it always Negative and Difficult to Manage or does Theory Assume Too Much?: A Case Study Review of Customs and Border Protection's Container Security Initiative

Mullins, Dustin Ashley 26 September 2011 (has links)
This research study provides a conceptual framework to understand how public managers strategically engage ambiguity and translate the complexity associated with ambiguity to manageable objectives to control complex work within federal programs. A central assumption for this study is that ambiguity is an organizational reality due to the social nature of administrative systems, influencing how managers approach and understand problem sets. This research study demonstrates the impact of management strategies in combating organizational ambiguity, at a strategic level, as well as mitigating and reducing uncertainty at more tactical levels of an organization. Theoretically, this study engages the current divide between organizational theory and public management scholarship by providing an empirical perspective on the management and execution of a key national security program. Through examination of the Container Security Initiative (CSI) program within the Department of Homeland Security (DHS) Customs and Border Protection (CBP), this research explores how ambiguity and uncertainty, within bureaucratic settings, are managed on an ongoing basis in the pursuit of defined goals and objectives. Examination of how public managers strategically engage ambiguity and implementation pressures, which manifest as a result of systemic external and internal pressures, to translate complexity associated with the ambiguity into manageable program objectives, provides valuable insight into the impact of managerial processes within public organizations. Through this managerial process and by setting priorities and objectives, public managers decompose and translate ambiguity and complexity in order to more actively and effectively utilize strategies and resources in support of those defined objectives. Central to the translation process is managing the interface between the strategic and tactical dimensions of programs, through goal setting and priority definition, enabling the execution of key program activities and operations. This study's findings build upon existing research that examines the role of management within public organizations, as well as challenge several assumptions within the extant literature regarding the influence and consequences of ambiguity within public organizations. Addressing the need to empirically demonstrate how management matters, this research emphasizes the role public managers play in actively engaging and managing organizational and program complexity in order to accomplish the objectives of public bureaucracies. / Ph. D.
98

Managing Complexity

Mariotti, Humberto, Zauhy, Cristina 14 October 2014 (has links)
This article is a brief introduction to complexity, complex thinking and complexity management. Its purpose is to present an update on the applications of the complexity sciences, particularly to the universe of corporations and management. It includes an example taken from the globalized world and two more stories from the corporate environment. Some details on how to think about complexity and how to apply the conceptual and operative tools of complex thinking are provided. The article ends with some remarks on the personal, interpersonal and corporate benefits of complex thinking.
99

Information theory for multi-party peer-to-peer communication protocols / Théorie de l’information pour protocoles de communication peer-to-peer

Urrutia, Florent 25 May 2018 (has links)
Cette thèse a pour sujet les protocoles de communication peer-to-peer asynchrones. Nous introduisons deux mesures basées sur la théorie de l'information, la Public Information Complexity (PIC) et la Multi-party Information Complexity (MIC), étudions leurs propriétés et leur relation avec d'autres mesures fondamentales en calcul distribué, telles que la communication complexity et la randomness complexity. Nous utilisons ensuite ces deux mesures pour étudier la fonction parité et la fonction disjointness. / This thesis is concerned with the study of multi-party communication protocols in the asynchronous message-passing peer-to-peer model. We introduce two new information measures, the Public Information Complexity (PIC) and the Multi-party Information Complexity (MIC), study their properties and how they are related to other fundamental quantities in distributed computing such as communication complexity and randomness complexity. We then use these two measures to study the parity function and the disjointness function.
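For reference, the two benchmark functions named at the end of the abstract can be stated concisely. The k-player formulations sketched below follow one common convention and are illustrative assumptions, not definitions quoted from the thesis.

```python
from functools import reduce

def parity(bits: list[int]) -> int:
    """k-party parity: player i holds the bit bits[i]; the answer is the XOR
    of all the bits."""
    return reduce(lambda a, b: a ^ b, bits, 0)

def disjointness(sets: list[set[int]]) -> bool:
    """k-party disjointness (one common convention): player i holds the set
    sets[i] over some universe; the answer is True iff the sets have an empty
    common intersection."""
    if not sets:
        return True
    return len(set.intersection(*sets)) == 0

# Example: three players holding {1, 2}, {2, 3} and {3, 4} share no common
# element, so disjointness is True; parity of the bits 1, 0, 1 is 0.
assert disjointness([{1, 2}, {2, 3}, {3, 4}]) is True
assert parity([1, 0, 1]) == 0
```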
100

New bounds for information complexity and quantum query complexity via convex optimization tools

Brandeho, Mathieu 28 September 2018 (has links) (PDF)
This thesis brings together three works on information complexity and on quantum query complexity. These areas of study have in common the mathematical tools used to analyse them, namely optimization problems. The first two works concern quantum query complexity, generalizing the following important result: in [LMRSS11], the authors characterize quantum query complexity by means of the adversary method, a semidefinite program introduced by A. Ambainis in [Ambainis2000]. However, this characterization is restricted to discrete-time models with bounded error. The first work thus generalizes their result to continuous-time models, while the second is an attempt, not carried to completion, to characterize quantum query complexity in the exact and unbounded-error cases. In the first work, to characterize quantum query complexity in continuous-time models, we adapt the discrete-time proof by constructing a universal adiabatic query algorithm. The principle of this algorithm rests on the adiabatic theorem [Born1928] together with an optimal solution of the dual of the adversary method. Note that the running-time analysis of our adiabatic algorithm is based on a proof that does not require a gap in the spectrum of the Hamiltonian. In the second work, we seek to characterize quantum query complexity for unbounded or zero error. To this end we revisit and improve the adversary method with a Lagrangian-mechanics approach, in which we construct a Lagrangian indicating the number of queries needed to move through phase space, which allows us to define a "query action". Since this Lagrangian is expressed as a semidefinite program, its classical study via the Euler-Lagrange equations requires the envelope theorem, a powerful tool from mathematical economics. The last work, further afield, concerns information complexity (and by extension communication complexity) for simulating non-local correlations, or more precisely the amount of (Shannon) information that two parties must exchange to obtain these correlations. To this end, we define a new complexity measure, called the zero information complexity IC_0, via the model without communication. This complexity has the advantage of being expressed as a convex optimization problem. For the CHSH correlations, we solve the optimization problem in the single-direction case, where we recover a known result. For the two-direction scenario, we provide numerical evidence for the validity of this bound and solve a relaxed form of IC_0, which is a new result. / Doctorat en Sciences de l'ingénieur et technologie / info:eu-repo/semantics/nonPublished
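For background, the adversary method referred to in the abstract is, in its general (negative-weight) form, the semidefinite program below; this is the standard statement from the literature ([LMRSS11] is presumably Lee, Mittal, Reichardt, Špalek and Szegedy, 2011), recalled here as context rather than quoted from the thesis.

```latex
% Illustrative statement (background, not quoted from the thesis): the general
% (negative-weight) adversary bound for a function f : D -> E with D a subset
% of Sigma^n. Gamma ranges over real symmetric matrices indexed by inputs in D.
\[
\mathrm{ADV}^{\pm}(f) \;=\; \max_{\Gamma}\ \lVert \Gamma \rVert
\quad\text{s.t.}\quad
\lVert \Gamma \circ \Delta_i \rVert \le 1 \ \ (1 \le i \le n),
\qquad
\Gamma[x,y] = 0 \ \text{whenever } f(x) = f(y),
\]
% where (\Delta_i)[x,y] = 1 if x_i \neq y_i and 0 otherwise, and \circ is the
% entrywise (Hadamard) product. The characterization generalized in the thesis
% is Q(f) = \Theta(\mathrm{ADV}^{\pm}(f)) for bounded-error, discrete-time
% quantum query algorithms.
```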
