  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
211

Complexidade computacional e o problema P vs NP / Computational complexity and the P vs NP problem

Oliveira, Igor Carboni 08 February 2010 (has links)
Advisor: Arnaldo Vieira Moura / Master's dissertation - Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica / Previous issue date: 2010 / Abstract: Computational complexity theory is the field of theoretical computer science that aims to establish limits on the efficiency of algorithms. The main open question in computational complexity is the P vs NP problem. Intuitively, it asks whether, for an important class of computational problems, there is any algorithm that performs essentially better than trivial exhaustive search. This dissertation offers both a classical introduction to the subject and an exposition of several more advanced theorems, recent results, and open problems. In particular, the diagonalization method is discussed in detail. Although it is a classical technique in computational complexity, it remains the only method that has been able to separate strong complexity classes. Some of the most important results in computational complexity theory have been proven by diagonalization; in particular, Hartmanis and Stearns [54, 104] proved that, given more resources, one can solve more computational problems. These results are known as the time and space hierarchy theorems. We present a generalization of the deterministic hierarchy theorems, recovering the classical results of Hartmanis and Stearns as corollaries. This is the first time such a unified treatment has appeared in the literature. / Master's / Theory of Computation / Master in Computer Science
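The hierarchy theorems of Hartmanis and Stearns that the abstract refers to admit compact statements (these are the standard textbook formulations, not the dissertation's generalization):

```latex
% Time Hierarchy Theorem: for time-constructible f and g,
f(n)\,\log f(n) = o\big(g(n)\big)
  \;\Longrightarrow\;
  \mathrm{DTIME}\big(f(n)\big) \subsetneq \mathrm{DTIME}\big(g(n)\big)

% Space Hierarchy Theorem: for space-constructible g,
f(n) = o\big(g(n)\big)
  \;\Longrightarrow\;
  \mathrm{DSPACE}\big(f(n)\big) \subsetneq \mathrm{DSPACE}\big(g(n)\big)
```

Both are proved by diagonalization: one constructs a machine within the larger resource bound that simulates every machine obeying the smaller bound and deliberately disagrees with each of them on some input.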
212

Complexidade dos padrões espaciais e espectrais de fitofisionomias de cerrado no estado de São Paulo / Complexity of spatial and spectral patterns of Cerrado's phytophysiognomies in São Paulo state (Brazil)

Mattos, Sergio Henrique Vannucchi Leme de 16 August 2018 (has links)
Advisor: Archimedes Perez Filho / Doctoral thesis - Universidade Estadual de Campinas, Instituto de Geociências / Previous issue date: 2010 / Abstract: The Brazilian Cerrado is a mosaic of phytophysiognomies distinguished from one another by their spatial patterns. Although its characteristics and dynamics suggest that the Cerrado should be understood as a complex system, the complexity paradigm and the methods derived from it are still little used in Cerrado studies. The general objective of this research was to evaluate the complexity of the spatial (textural) and spectral patterns of Cerrado phytophysiognomies in order to determine which properties of their organization and dynamics those patterns can reveal. Images from the Aster multispectral sensor were used to study Cerrado areas in conservation reserves of São Paulo State (Brazil), in the municipalities of Assis, Itirapina, and Luiz Antônio. Complexity measures based on information entropy and fractal dimension were applied to the physiognomy images and to the corresponding spectral response curves. A test area was selected to evaluate correlations between physiognomies, their textural and spectral patterns, and pedological and geomorphological factors. Textural and spectral image analyses showed that each physiognomy presents statistically equal values at the different extents considered, revealing self-similarity of the patterns across several scales. There was also a strong tendency for each physiognomy to present the same values at different localities, which makes it possible to establish a typical range of values for each one, independent of the area studied. However, no measure was fully able to distinguish the different Cerrado physiognomies of a single locality, especially those with similar patterns. As for correlations, significant associations between physiognomies and pedological-geomorphological factors were found, but no single factor accounted exclusively for the vegetation characteristics of a given physiognomy or for the configuration of its patterns, suggesting that these depend on the interrelations of many factors. The results confirm that the Cerrado is a complex dynamical system and that understanding its organization and dynamics therefore demands the concepts, models, and methods of the complexity paradigm. A notable finding is the scale invariance of the patterns, which indicates that the Cerrado may exhibit self-organized criticality, with some of its physiognomies representing states close to critical points. The results suggest that intermediate physiognomies, such as cerrado denso, cerrado sensu stricto, and campo cerrado, present this type of organization, while physiognomies near the extremes of the Cerrado vegetation gradient (such as campo sujo and cerradão) represent more stable states of the system. / Doctorate / Environmental Analysis and Territorial Dynamics / Doctor of Science
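The two families of measures named in the abstract — information entropy and box-counting fractal dimension — can be sketched in a few lines. This is an illustrative reconstruction under stated assumptions, not the study's exact procedures:

```python
from collections import Counter
from math import log, log2

def shannon_entropy(pixels):
    """Shannon (information) entropy of a gray-level sequence, in bits."""
    counts = Counter(pixels)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

def box_counting_dimension(points, box_sizes=(2, 4, 8, 16)):
    """Estimate the fractal (box-counting) dimension of a set of (x, y)
    pixel coordinates, e.g. the foreground of a thresholded image."""
    xs, ys = [], []
    for s in box_sizes:
        # count boxes of side s that contain at least one foreground pixel
        occupied = {(x // s, y // s) for x, y in points}
        xs.append(log(1.0 / s))
        ys.append(log(len(occupied)))
    # least-squares slope of log(box count) vs log(1/box size)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
```

A constant image has entropy 0, and a solid filled region has box-counting dimension close to 2; self-similar vegetation mosaics fall in between, which is what makes these measures usable as texture descriptors.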
213

Navigational Complexity Within Building Codes

McLean, James Stephen 01 January 2017 (has links)
The premise that building codes have become too complex has been discussed, commented on, and documented by practicing engineers; however, prior to this research there was little scientific evidence that codes have increased in complexity over time. There are many aspects of building codes that are complicated, and this reflects a combination of the inherent complexity of building design and the dynamical processes that produce the codes. This research focuses on navigational complexity, and specifically on the aspects that can be quantified to demonstrate that current codes are more complex than their predecessors. Navigational complexity is defined as the complexity created by document cross-referencing and other unintended structural features of a code. A metric for quantifying navigational complexity has been developed based on estimates of the time consumed by an engineer stepping and navigating through codes. The metric can be used to quantify navigational complexity within a given code and between different codes. Although it is unclear to what extent navigational complexity contributes to the overall level of complexity within a code, this research affirms that navigational complexity has increased in various codes over the years and can be used to compare complexity between different codes. The complexity of building codes has been shown to be increasing in several commonly used codes, and it may be necessary to simplify some codes. Additionally, this research postulates that it is possible for codes to become too complex and that there may be instances where the cognitive limit of navigational complexity within any given code is exceeded. However, building codes are complex for several reasons, and attempting to make codes less complex is not trivial. Without a method to reduce complexity, the task of simplification may be intractable.
The developed metric for navigational complexity has been coupled with graphical representations to identify areas where navigational complexity can be reduced and areas where it may be beyond the cognitive limit of code users. The combination of numerical data and graphical representations may provide additional significant advantages that are not yet realized. Measuring and understanding navigational complexity within any code opens up the possibility of mitigation through reorganization and developing better navigational tools for future editions.
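The thesis's metric is based on engineer navigation time and is not specified in the abstract. Purely as an illustration of the underlying idea, a code's cross-reference structure can be modeled as a directed graph, with the size of a section's reference closure serving as a crude proxy for navigational burden; all names and numbers here are hypothetical:

```python
from collections import deque

# Hypothetical model: an edge A -> B means "section A cross-references
# section B". This is NOT the thesis's time-based metric, only a sketch of
# how navigational complexity can be quantified from cross-references.

def reference_closure(code_graph, section):
    """Return every section reachable by chasing cross-references."""
    seen, queue = {section}, deque([section])
    while queue:
        for ref in code_graph.get(queue.popleft(), []):
            if ref not in seen:
                seen.add(ref)
                queue.append(ref)
    return seen

def mean_closure_size(code_graph):
    """Average closure size over all sections; this grows as editions add
    cross-references, mirroring the increase the research documents."""
    sections = set(code_graph) | {r for refs in code_graph.values() for r in refs}
    return sum(len(reference_closure(code_graph, s)) for s in sections) / len(sections)
```

Comparing `mean_closure_size` across successive editions of the same (toy) code graph gives one concrete way to say "this edition is navigationally more complex than its predecessor."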
214

The Role of the Goal in Problem Solving Hard Computational Problems: Do People Really Optimize?

Carruthers, Sarah 03 September 2015 (has links)
Understanding how humans cope with complexity is perhaps one of the most important targets of scientific research. Humans not only excel at solving complex tasks in their day-to-day lives but also take on objectively difficult problems recreationally. Research, which to date has focused on the famous hard optimization problem, the Euclidean Traveling Salesperson Problem (E-TSP), has indicated that humans are able to find near-optimal solutions in linear time to hard optimization problems despite the objective difficulty of the task. The research presented in this work contributes to this area by comparing human performance on the search and optimization versions of two other visually presented computationally hard problems: Vertex Cover and Independent Set. These two problems were selected in part to explore how human performance might differ on non-Euclidean problems. Performance on the optimization version of the problems used in this study is in keeping with the previously reported results; however, performance on the search version is even better, suggesting that previous problem-solving research might have underestimated the power of the human problem-solving system. A key result of this work is that differences in performance between the optimization and search versions of these hard problems can be attributed to differences in how problem solvers encode the goal of the task. Consequently, subjects in these conditions, tasked with identical instances of two very nearly identical versions of a problem, are in fact solving very different problems. This work presents a framework to improve how human performance results on hard optimization problems are interpreted. It also demonstrates how the search version of hard computational problems can be used to investigate how people cope with complexity, free of the confounding aspect of an ill-defined goal. / Graduate
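The distinction between the two task framings the study compares can be made concrete with a brute-force sketch of Vertex Cover (illustrative only; the experiments used visually presented instances, and this exhaustive search is feasible only for tiny graphs):

```python
from itertools import combinations

def is_vertex_cover(edges, cover):
    """A vertex set covers a graph if every edge has an endpoint in it."""
    return all(u in cover or v in cover for u, v in edges)

def search_version(vertices, edges, k):
    """Search framing: find SOME vertex cover of size at most k, or None.
    The goal is explicit — any qualifying set ends the task."""
    for size in range(k + 1):
        for cover in combinations(vertices, size):
            if is_vertex_cover(edges, set(cover)):
                return set(cover)
    return None

def optimization_version(vertices, edges):
    """Optimization framing: find a MINIMUM vertex cover. The goal is
    implicit — the solver must also rule out all smaller covers."""
    for size in range(len(vertices) + 1):
        for cover in combinations(vertices, size):
            if is_vertex_cover(edges, set(cover)):
                return set(cover)
```

The two functions accept the same instance, and an optimal answer satisfies both; what differs is only how the goal is encoded, which is exactly the variable the study manipulates.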
215

Success factors in the transition towards distributed leadership in large organisations

Hayward, Simon January 2015 (has links)
The thesis reviews recent and current literature on leadership, and in particular on distributed leadership and complexity leadership theories. It describes my research into the factors affecting the success of transition towards a more distributed approach to leadership in two cases: one a large UK private company and the other a large UK university. The longitudinal research was conducted over the period from 2011 to 2013, using repeated interviews at senior and middle management levels, document analysis and observation to collect a rich set of data about both cases. I used a template to help analyse the data from each case. Through subsequent cross-case analysis the thesis identifies certain factors that influence the degree of success in making the transition to a distributed form of leadership, which involves not only devolved decision making but also increased levels of collaboration and organisational agility — key concerns of leaders of large organisations according to recent research across the top 250 companies in the UK (Ipsos MORI, 2015: 5). The conclusion from my research is a framework called connected leadership, which describes the critical success factors and how they inter-relate. The first factor is having senior leadership committed to being role models, which is a pre-requisite for successful transition. There are then two factors that lay a strong foundation for the transition, namely having a shared organisational purpose and vision, and a values-based approach to leadership behaviour. Finally, there are factors that make distributed leadership work in practice: consistently devolved decision making, an emphasis on collaborative achievement, and agility and learning. The thesis provides practitioners with insight at both the organisational and leadership role levels, based on the connected leadership model.
At the organisational level, I have derived from the research certain indicators for each factor that help diagnose and plan for the introduction of a distributed leadership approach. At the leadership role level the framework provides a helpful guide to developing leadership capability and role definition. The connected leadership model represents a coherent guide for leaders to use as a template for successful transition to a more distributed, collaborative and agile organisation, which is able to compete effectively in the 21st century networked society. Academically, this thesis provides a synthesis of distributed and complexity leadership theories, as well as drawing on authentic leadership theory, in order to understand the organisational and human dynamics that influence the transition to a more distributed leadership approach. Both cases are large organisations, which means that the factor framework provides relevant insight into how distributed leadership can be effective in large and relatively complex organisations.
216

Predicting program complexity from Warnier-Orr diagrams

White, Barbara January 1982 (has links)
Typescript (photocopy).
217

Algorithms and complexity for annotated sequence analysis

Evans, Patricia Anne 14 December 2017 (has links)
Molecular biologists use algorithms that compare and otherwise analyze sequences that represent genetic and protein molecules. Most of these algorithms, however, operate on the basic sequence and do not incorporate the additional information that is often known about the molecule and its pieces. This research describes schemes to combinatorially annotate this information onto sequences so that it can be analyzed in tandem with the sequence; the overall result would thus reflect both types of information about the sequence. These annotation schemes include adding colours and arcs to the sequence. Colouring a sequence produces a same-length sequence of colours or other symbols that highlight or label parts of the sequence. Arcs can be used to link sequence symbols (or coloured substrings) to indicate molecular bonds or other relationships. Adding these annotations to sequence analysis problems such as sequence alignment or finding the longest common subsequence can make the problem more complex, often depending on the complexity of the annotation scheme. This research examines the different annotation schemes and the corresponding problems of verifying annotations, creating annotations, and finding the longest common subsequence of pairs of annotated sequences. The work involves both the conventional complexity framework and parameterized complexity, and includes algorithms and hardness results for both frameworks. Automata and transducers are created for some annotation verification and creation problems. Different restrictions on layered substring and arc annotation are considered to determine what properties an annotation scheme must have to make its incorporation feasible. Extensions to the algorithms that use weighting schemes are explored. / Graduate
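As one concrete instance of the colour-annotation scheme described, the longest common subsequence of two coloured sequences can be computed with the standard dynamic program, tightening the match test to require agreement on both symbol and annotation. This is a minimal illustrative sketch, not the thesis's algorithm:

```python
def annotated_lcs(seq1, seq2):
    """Length of the longest common subsequence of two colour-annotated
    sequences. Each sequence is a list of (symbol, colour) pairs; two
    positions match only when symbol AND colour both agree."""
    m, n = len(seq1), len(seq2)
    # dp[i][j] = LCS length of seq1[:i] and seq2[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            if seq1[i] == seq2[j]:  # compares the whole (symbol, colour) pair
                dp[i + 1][j + 1] = dp[i][j] + 1
            else:
                dp[i + 1][j + 1] = max(dp[i][j + 1], dp[i + 1][j])
    return dp[m][n]
```

Note how the annotation can only shorten the result: two sequences identical as plain strings may share a shorter annotated subsequence once their colourings disagree, which is one way adding annotations makes the problem richer.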
218

Ethics and complexity : exploring the significance and application of complexity thinking in ethical theory with special reference to the graphic novel Watchmen

Lochhead, Hayley January 2011 (has links)
The current era, or postmodern context, is characterized by an overwhelming amount of anxiety concerning humanity’s future, tied to a general perception of the postmodern context as one that is defined by ‘crisis.’ This anxiety-provoking sense of crisis is, I believe, the product of a much more significant destabilization of the paradigmatic base upon which the human worldview is established. The period that extends from the Newtonian era to the late Twentieth Century has been informed by what Edgar Morin calls a ‘paradigm of simplicity’ in terms of which the universe was understood as a perfectly knowable, deterministic system. Following a series of revolutionizing discoveries throughout the Twentieth Century, however, it has come to light that this deterministic paradigm is no longer suitable as a way of understanding the universe. Instead, a ‘paradigm of complexity,’ in which the universe is understood as a complex, self-organizing system that is never totally knowable, has been posited by thinkers such as Morin and Paul Cilliers. Further, both of these thinkers, among others, argue that the acknowledgment of complexity is an inherently ethical matter, since complex systems (such as human communities) present one with difficult choices to make in uncertain situations, rather than determinate sets of rules to follow. This study aims to show that a complexity view of the human lifeworld, does not cast the uncertainty of humanity’s future as a threatening cause of anxiety and dread, but offers us a valuable opportunity for growth, adaptation and the creation of new ethical values. Certainly, an inability to come to grips with this new paradigm has led to desperately reactionary measures on the part of some to secure a semblance of the stability and control that the ‘paradigm of simplicity’ made normative. 
As a result, the complex reality of the human lifeworld is negated in favour of the misleading belief in the certainty and security provided by a particular metanarrative. By contrast, Jacques Lacan’s fecund poststructuralist theory of subjectivity and Jacques Derrida’s poststructuralist logic offer useful heuristics for the navigation of complexity thinking that neither mistakenly negate moments of uncertainty, anomaly and paradox for the sake of certainty, nor swing to the opposite, equally unacceptable, extreme of absolute relativism. Poststructuralist logic points to the notion of a ‘complexity ethics’ which issues a challenge to the idea, stemming from the ‘paradigm of simplicity’, that it is ever possible for agents to adopt an uncompromised ethical stance. The upshot of this is to argue that it is essential for contemporary humans to learn to live with ethical uncertainty, paradox, compromise, contamination and other figures of complexity, rather than search for an impossible certainty, since this strategy leads to more realistic, moderate, and therefore less dangerous, ethical reasoning. Good popular culture texts, such as the graphic novel Watchmen, which represent the reality of a complex human lifeworld, have the power to communicate these rather difficult philosophical ideas concerning the complexity of the human lifeworld to a wide audience in a very accessible format. An analysis of the varying ethical stances taken by certain characters demonstrates, in concrete terms, just how precisely the novel confirms the poststructuralist argument concerning inescapable ethical contamination. Thus, it is with recourse to such texts that one might begin to answer more concretely the questions, ‘what does complexity imply for ethical theory?’ and ‘what might an ethics for the complex lifeworld entail?'.
219

Retrospective cohort study of type 2 diabetes mellitus (T2DM) in the Wirral peninsula : complexity science

Nwaneri, Chukwuemeka L. January 2014 (has links)
T2DM continues to be a public health burden with its increasing incidence, prevalence, and mortality risks. The aim of this thesis was to examine a population-based cohort of 22,000 people with T2DM diagnosed between 1 January 2000 and 31 December 2010 within the Wirral Peninsula, UK, with the aims of: assessing factors relating to all-cause, cardiovascular, malignancy-related, and non-cardiovascular mortality; evaluating the role of glycaemic control, socioeconomic status, smoking, dyslipidaemia, blood pressure, obesity, and nephropathy as predicting risk factors for mortality; assessing the influence of age at diagnosis, duration of diabetes, year of diagnosis and gender on mortality; examining life expectancy and mortality patterns and measuring the years of life lost as a result of a diagnosis of T2DM; and applying Complexity Science to the dynamic interplay of the various factors in T2DM that lead to unpredictability in health outcomes.
220

Les maires confrontés à l'apprentissage de l'intercommunalité : l'exemple de l'agglomération toulousaine / Mayors confronted with learning inter-municipal cooperation: the example of the Toulouse urban area

Loubet, Lilian 05 October 2011 (has links)
Le développement de l’intercommunalité donne lieu à des recompositions territoriales et gouvernementales majeures. Les élus se confrontent alors à l’exercice complexe de la coopération et à la définition d’un projet territorial. Il s’agira d’étudier la coopération intercommunale au regard des apprentissages (techniques, politiques et territoriaux) opérés par les maires. Ce propos est illustré par l’étude de l’agglomération toulousaine qui, fragmentée en trois communautés d’agglomérations ou urbaine, donne à voir des territoires plus ou moins avancés dans le mouvement d’intégration communautaire. A cette occasion, plus de 80 maires, ainsi que des responsables techniques, ont été interrogés. / The development of the inter-municipal cooperation gives rise to some major territorial and governmental reorganization. The elected in charge are then confronted to the complex exercise of the inter-municipal cooperation and the definition of a territorial project. It will be a question of studying the inter-municipal cooperation towards the technical, political and territorial apprenticeships operated by the mayors. That is illustrated by the study of the Toulouse city région, which is splited in three communities of town : it shows more or less advanced territories in the movement of community integration. Some technical managers and more than eighty mayors were asked about that.
