  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
611

A Dynamic and Thermodynamic Approach to Complexity.

Yang, Jin 08 1900 (has links)
The problem of establishing the correct approach to complexity is a crucial open issue, to which this dissertation makes some contributions. The dissertation considers two main possibilities: one, advocated by Tsallis and co-workers, sets the foundation of complexity on a generalized, non-extensive form of thermodynamics; the other, proposed by the UNT Center for Nonlinear Science, regards complexity as a new condition that, for physical systems, would be equivalent to a state of matter intermediate between dynamics and thermodynamics. In the first part of this dissertation, the concept of Kolmogorov-Sinai entropy is introduced. The Pesin theorem is generalized in the formalism of Tsallis non-extensive thermodynamics. This generalized form of the Pesin theorem is used in the study of two major classes of problems, whose prototypes are given by the Manneville map and the logistic map, respectively. The results of these studies convince us that the approach to complexity must be made along lines different from those of non-extensive thermodynamics. They also convince us that the Lévy walk can be used as a prototype model of complexity, as a condition of balance between order and randomness that yields new phenomena such as aging and multifractality, and we conclude that these properties must be studied from a dynamic rather than a thermodynamic perspective. The second part focuses on the heartbeat problem, studied with a dynamic model, the so-called memory beyond memory, based on the Lévy walker model. It is shown that the memory beyond memory effect is more pronounced in healthy heartbeat sequences. The concepts of fractals, multifractals, the wavelet transform, and the wavelet transform modulus maxima (WTMM) method are introduced. Artificial time sequences are generated by the memory beyond memory model to mimic heartbeat sequences. Using the WTMM method, the multifractal singularity spectra of the sequences are calculated: sequences with a strong memory beyond memory effect have a broader singularity spectrum.
2003-08
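In the fully chaotic regime, the generalized Pesin theorem discussed above reduces to the ordinary Pesin identity: the Kolmogorov-Sinai entropy of a one-dimensional chaotic map equals its positive Lyapunov exponent, which is easy to estimate numerically. A minimal sketch for the logistic map (illustrative only; not code from the dissertation):

```python
import math

def lyapunov_logistic(r, n_iter=100_000, n_burn=1_000, x0=0.3):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x).

    For a fully chaotic one-dimensional map, Pesin's identity equates
    the Kolmogorov-Sinai entropy with this (positive) exponent.
    """
    x = x0
    for _ in range(n_burn):              # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n_iter):
        x = r * x * (1.0 - x)
        d = abs(r * (1.0 - 2.0 * x))     # |f'(x)|
        acc += math.log(max(d, 1e-300))  # guard against log(0)
    return acc / n_iter
```

At r = 4 the exact exponent is ln 2 ≈ 0.693. In the intermittent and edge-of-chaos regimes probed by the Manneville and logistic prototypes, the exponent vanishes, this ordinary identity breaks down, and the generalized or dynamic treatments discussed above become necessary.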
612

Graph colourings and games

Meeks, Kitty M. F. T. January 2012 (has links)
Graph colourings and combinatorial games are two very widely studied topics in discrete mathematics. This thesis addresses the computational complexity of a range of problems falling within one or both of these subjects. Much of the thesis is concerned with the computational complexity of problems related to the combinatorial game (Free-)Flood-It, in which players aim to make a coloured graph monochromatic ("flood" the graph) with the minimum possible number of flooding operations; such problems are known to be computationally hard in many cases. We begin by proving some general structural results about the behaviour of the game, including a powerful characterisation of the number of moves required to flood a graph in terms of the number of moves required to flood its spanning trees; these structural results are then applied to prove tractability results about a number of flood-filling problems. We also consider the computational complexity of flood-filling problems when the game is played on a rectangular grid of fixed height (focussing in particular on 3xn and 2xn grids), answering an open question of Clifford, Jalsenius, Montanaro and Sach. The final chapter concerns the parameterised complexity of list problems on graphs of bounded treewidth. We prove structural results determining the list edge chromatic number and list total chromatic number of graphs with bounded treewidth and large maximum degree, which are special cases of the List (Edge) Colouring Conjecture and Total Colouring Conjecture respectively. Using these results, we show that the problem of determining either of these quantities is fixed parameter tractable, parameterised by the treewidth of the input graph. Finally, we analyse a list version of the Hamilton Path problem, and prove it to be W[1]-hard when parameterised by the pathwidth of the input graph. These results answer two open questions of Fellows, Fomin, Lokshtanov, Rosamond, Saurabh, Szeider and Thomassen.
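To make the flooding operation concrete, here is one way the basic move can be modelled on an arbitrary coloured graph (a sketch with illustrative names, not code from the thesis, which studies the abstract problem and fixed-height grids):

```python
from collections import deque

def flood_move(adj, colours, pivot, new_colour):
    """One Flood-It move: recolour the monochromatic region containing
    `pivot` to `new_colour`, in place, so it merges with any adjacent
    region that already has that colour.

    `adj` maps each vertex to its neighbours; `colours` maps vertices
    to colours.
    """
    old = colours[pivot]
    if old == new_colour:
        return
    seen, queue = {pivot}, deque([pivot])
    while queue:
        v = queue.popleft()
        colours[v] = new_colour
        for w in adj[v]:
            if w not in seen and colours[w] == old:
                seen.add(w)
                queue.append(w)

def is_flooded(colours):
    """The game is won when the whole graph is monochromatic."""
    return len(set(colours.values())) == 1
```

On the path a-b-c coloured 1-2-1, flooding at a with colour 2 and then with colour 1 makes the graph monochromatic in two moves; (Free-)Flood-It asks for the minimum number of such moves, the quantity the spanning-tree characterisation above bounds.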
613

Temporal Complexity and Stochastic Central Limit Theorem

Pramukkul, Pensri 08 1900 (has links)
Complex processes whose evolution in time rests on the occurrence of a large and random number of intermittent events are the systems under study. The mean time distance between two consecutive events is infinite, thereby violating the ergodic condition and activating at the same time a stochastic central limit theorem that explains why the Mittag-Leffler function is a universal property of nature. The time evolution of these complex systems is properly generated by means of fractional differential equations, thus leading to the interpretation of fractional trajectories as the average over many random trajectories, each of which fits the stochastic central limit theorem and the condition for the Mittag-Leffler universality. Additionally, the effect of noise on the generation of the Mittag-Leffler function is discussed. Fluctuations of relatively weak intensity can conceal the asymptotic inverse power law behavior of the Mittag-Leffler function, providing a reason why stretched exponentials are frequently found in nature. These results afford a more unified picture of complexity resting on the Mittag-Leffler function and encompassing the standard inverse power law definition.
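The Mittag-Leffler function mentioned above interpolates between a stretched exponential at short times and an inverse power law at long times. A truncated-series sketch (adequate only for moderate arguments; the asymptotic regime needs other methods):

```python
import math

def mittag_leffler(alpha, z, n_terms=80):
    """Truncated series E_alpha(z) = sum_k z**k / Gamma(alpha*k + 1).

    A sketch: the series converges everywhere, but is numerically
    useful only for moderate |z|.
    """
    return sum(z**k / math.gamma(alpha * k + 1) for k in range(n_terms))

# Sanity checks: E_1(z) = exp(z), and E_{1/2}(-t) = exp(t**2) * erfc(t).
```

For 0 < alpha < 1, E_alpha(-t**alpha) behaves like exp(-t**alpha / Gamma(1 + alpha)) at small t — the stretched exponential that weak fluctuations leave visible — and like t**(-alpha) / Gamma(1 - alpha) at large t, the inverse power law of the asymptotic regime.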
614

Att implementera en handlingsplan : En kvalitativ studie om förändringsprojekt i kommunal verksamhet / Implementing an action plan : A qualitative study about change projects in public service organisations

Janulf, Sophie January 2019 (has links)
Studiens syfte är att generera kunskap om vad som påverkar förändringsprojekts utfall, genom att studera kommunalt anställdas upplevelser av att ansvara för interna förändringsprojekt. Studien är ett försök att belysa förändringsprojekt sett som processer som drivs med hjälp av projektlogik, för att ytterligare förstå svårigheter och möjligheter i kommunala projektsammanhang. För att besvara syftet utfördes semistrukturerade intervjuer med efterföljande analys utifrån ett teoretiskt ramverk, baserad på tidigare forskning inom projektledning. Intervjuerna omfattar personer anställda i olika svenska kommuner, ansvariga för framtagande och implementering av handlingsplaner mot våldsbejakande extremism. Resultatet påvisar att bristande samsyn gällande mål och prioritering samt osäkerhet gällande tillvägagångssätt upplevs försvåra projekten, medan personliga relationer och metoder som främjar samarbete, samsyn och kunskapsdelning upplevs underlätta projekten. Studien drar slutsatsen att svåra genomföranden hänger samman med låg prioriteringsnivå av projektet i organisationen, samt att de metoder som avses användas för att hantera svårigheter är oanvändbara utan handlingsutrymme genom personliga relationer inom organisationen. Genom att definiera förändringsprojekt som process med själva förändringen i fokus stärks också förutsättningarna för att kunna sätta projektmål och ramar som möjliggör ett lyckat resultat. / The aim of this study is to generate knowledge about what affects the outcome of change projects, by studying how employees in public organisations experience their work as responsible for managing change projects. The study is an attempt to shed light on change projects seen as processes that are managed by project logic, to further understand difficulties and possibilities in public project contexts. 
To this end, semi-structured interviews were conducted and then analysed through a theoretical framework based on previous research on project management. The interviewees are employees of different Swedish municipalities, responsible for developing and implementing action plans addressing violent extremism. The results show that a lack of consensus regarding goals and priorities, as well as uncertainty regarding procedures, is experienced as hindering the projects, while personal relations and methods supporting cooperation, consensus and knowledge-sharing are experienced as facilitating them. The study concludes that difficult implementations are connected to a low priority of the project within the organisation, and that the methods intended to handle such difficulties are useless without the room for action afforded by personal relations within the organisation. By defining change projects as processes with the actual change in focus, the prerequisites for setting project targets and frames that enable a successful result are also strengthened.
615

Development of Written Complexity and Accuracy in an Intermediate to Advanced German L2 Setting Using Weighted Clause Ratio

Gemini Fox (6634193) 11 June 2019 (has links)
The primary focus of this study is to determine how clausal complexity and accuracy develop over the course of three academic years of intermediate- to advanced-level German. The study aims to shed light on issues related to learner development of writing during advanced stages of language acquisition, particularly after a study abroad. Its main goal is to track the writing complexity and accuracy of multiple students longitudinally. The paper introduces the Weighted Clause Ratio (Foster & Wigglesworth, 2016), and addresses Skill Acquisition Theory (DeKeyser, 2007), the Interaction Hypothesis (Swain, 1985), and the Limited Attentional Capacity Theory (Skehan, 1998). In addition, the study discusses the impact of study abroad on the language-learning process, task complexity, and the language-learning plateau. Following a review of terminology, I discuss how the Weighted Clause Ratio is used to determine clausal accuracy and complexity. The data are analyzed with results shown in intervals throughout the three academic years, comparing each year with the others. Results indicate that accuracy increases drastically over the final two years compared with the change over the first two, confirming the effect that study abroad has on learners' written accuracy; complexity improved on some measures over the course of the study but varied on others. I conclude the thesis by discussing the implications of these findings for our understanding of writing complexity and accuracy, and the long-term effects of study abroad.
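The Weighted Clause Ratio scores each clause by how accurate it is, rather than counting clauses as simply right or wrong. A toy sketch of such a computation — the accuracy levels and weights below are illustrative placeholders, not the values published by Foster & Wigglesworth (2016):

```python
def weighted_clause_ratio(clause_levels, weights=None):
    """Toy weighted-clause-ratio computation.

    Each clause is tagged with an accuracy level:
      0 = fully accurate, 1 = minor error,
      2 = serious error,  3 = gravely inaccurate.
    The weights are illustrative placeholders, NOT the published values.
    """
    if weights is None:
        weights = {0: 1.0, 1: 0.8, 2: 0.5, 3: 0.1}
    if not clause_levels:
        return 0.0
    return sum(weights[lvl] for lvl in clause_levels) / len(clause_levels)
```

A text of four clauses — two fully accurate, one with a minor error, one gravely inaccurate — would score (1.0 + 1.0 + 0.8 + 0.1) / 4 = 0.725 under these placeholder weights; a rising ratio over time then signals growing accuracy without discarding partially correct clauses.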
616

Microculturas em laboratório: construção de complexidade e seleção por metacontingências / Laboratory microcultures: selection by metacontingencies and the emergence of complexity

Queiroz, Anna Beatriz Müller 08 April 2015 (has links)
The selection of cultural units by metacontingencies was manipulated in laboratory microcultures in an effort to build and study cultural complexity. The following experimental manipulations were conducted: (a) increasing and decreasing the number of participants in the IBCs (component complexity and complexity of the task), (b) the complexity of the selected aggregate product of group interaction (environmental complexity), and (c) the environmental criteria (for cultural consequences). Five experiments were conducted with university students distributed among generations of 2 to 4 participants. Throughout the experiments, experienced participants were replaced with naive ones. The task of each participant was to enter numbers from 0 to 9 into a computer. Consequences were contingent on specific patterns: individual consequences were contingent on individual patterns, and cultural consequences were contingent on products of the interactions and organization among the participants (called aggregate products, or APs). The experiments comprised 4 phases: (1) selection of AP1, (2) selection of AP2, (3) selection of two previously produced APs, and (4) extinction. The order in which the phases were presented varied between Experiments 1 and 2. Experiments 3 and 4 systematically replicated Experiments 1 and 2, respectively, with variation (a decrease) in the number of participants performing the task in phase 3. Experiment 5 replicated Experiment 2 without instructions specifically related to the expected performances. Results showed that the order of the phases did not affect the production of the required AP, but determined which AP was produced when either one was possible.
Results also indicated that selection by metacontingencies involves the repeated production of the aggregate product and of patterns of interaction or organization, and that increases and decreases in the number of participants may both be related to the production of complexity, depending on other variables. A discussion of the dimensions of environmental and component complexity studied in the selection of cultural units by metacontingencies is presented. / A seleção das unidades culturais por metacontingências foi manipulada em microculturas de laboratório em um esforço para construir e estudar complexidade cultural. As seguintes manipulações experimentais foram realizadas: (a) aumentar e diminuir o número de participantes das CCEs (complexidade de componente e complexidade da tarefa), (b) a complexidade do produto agregado produzido pela interação do grupo (complexidade ambiental) e (c) o critério ambiental (para as consequências culturais). Cinco experimentos foram realizados com estudantes universitários distribuídos entre gerações de 2 a 4 participantes. Ao longo dos experimentos, participantes experientes foram substituídos por outros ingênuos. A tarefa de cada participante foi inserir números de 0 a 9 no computador. Consequências foram contingentes a padrões específicos: consequências individuais foram contingentes a padrões individuais, e as consequências culturais foram contingentes aos produtos de interações e organização entre os participantes (chamados produtos agregados, ou PAs). Os experimentos foram compostos de 4 fases: (1) seleção de PA1, (2) seleção de PA2, (3) seleção de dois PAs anteriormente produzidos, e (4) extinção. A ordem em que as fases foram apresentadas variou entre os Experimentos 1 e 2. Os Experimentos 3 e 4 replicaram sistematicamente os Experimentos 1 e 2, respectivamente, com variação (diminuição) do número de participantes executando a tarefa na fase 3.
O Experimento 5 replicou o Experimento 2 sem instruções especificamente relacionadas aos desempenhos esperados. Os resultados mostraram que a ordem das fases não afetou a produção do PA requerido, mas determinou o PA produzido quando um ou outro era possível. Os resultados também indicaram que a seleção por metacontingências envolve a produção repetida do produto agregado e de padrões de interação ou organização. Os resultados também mostraram que o aumento e a diminuição de participantes podem ambos estar relacionados com a produção de complexidade, dependendo de outras variáveis. Uma discussão sobre as dimensões da complexidade ambiental e de componentes estudadas na seleção de unidades culturais por metacontingências é apresentada.
617

Complexidade organizacional e governança corporativa nas empresas listadas na BM&FBOVESPA / Organizational complexity and corporate governance in companies listed on the BM&FBOVESPA

Renata Rouquayrol Assunção 07 July 2014 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / A complexidade das organizações é vista como o ponto de partida para a separação das atividades de propriedade e controle, enquanto que a governança corporativa, que promove o alinhamento de interesses entre o principal e o agente, pode ser compreendida como um meio de aperfeiçoar os mecanismos de controle das empresas. Nesse contexto, o presente estudo tem por objetivo geral investigar a relação entre a complexidade organizacional e das operações e a governança corporativa das companhias listadas na BM&FBovespa, à luz dos preceitos da Teoria da Contingência, em que os fatores contingenciais podem influenciar as características organizacionais relacionadas à estrutura e à estratégia das empresas (DONALDSON, 2007). Considera-se nesta pesquisa que a complexidade organizacional se refere à complexidade da estrutura e dos aspectos formais necessários para inserção da empresa no mercado, e a complexidade das operações se relaciona com o complexo conjunto dos recursos e dos processos válidos para o desenvolvimento das suas atividades. Cabe destacar que esta pesquisa se justifica em razão dos poucos estudos encontrados na literatura nacional que relacionam as duas temáticas, bem como para que os resultados possam se tornar ponto de partida para análise da estrutura das empresas e da eventual necessidade de implantação ou melhoria de controles mais eficientes. Trata-se de pesquisa descritiva, de natureza quantitativa, reunindo uma amostra de 162 empresas listadas na BM&FBovespa. Para a análise dos dados foram empregados os seguintes testes estatísticos: Análise Fatorial, Regressão Linear Múltipla, Análise de Correspondência e Análise de Correlação. 
Para a mensuração da complexidade, são adotadas em conjunto as variáveis contingenciais idade, tamanho, diversificação e internacionalização; e para examinar a governança corporativa, utiliza-se um check-list composto por 16 itens baseado nos estudos de Silveira (2004), Silva e Leal (2005) e Lameira e Ness Jr. (2011) e nas recomendações propostas pela Comissão de Valores Mobiliários - CVM (2002) e pelo Instituto Brasileiro de Governança Corporativa - IBGC (2009). A partir da Análise Fatorial, identificou-se que a complexidade organizacional é explicada pelas variáveis tamanho e diversificação, enquanto que a complexidade das operações é explicada pelas variáveis tamanho, diversificação e internacionalização. Na análise de Regressão Linear Múltipla, constatou-se que, nas duas dimensões da complexidade, a governança corporativa sofre influência das variáveis diversificação, internacionalização e idade, sendo esta última uma relação inversa. Por meio da Análise de Correspondência e da Análise de Correlação, conclui-se que, independentemente da dimensão em análise, a complexidade se relaciona positiva e significativamente à governança corporativa, isto é, as empresas listadas na BM&FBovespa de maior complexidade possuem maior índice de governança corporativa, confirmando-se a hipótese da pesquisa. Os resultados contribuem para aprofundar o conhecimento das temáticas na medida em que os pressupostos da teoria contingencial são confirmados e ao revelar a existência de um campo em que é necessário um gerenciamento da complexidade organizacional e das operações por parte dos gestores, no sentido de perceber e de aderir a mais e melhores mecanismos de controle por meio da adoção de boas práticas de governança corporativa. 
/ Organizational complexity is seen as the starting point for the separation of ownership and control activities, while corporate governance, which promotes the alignment of interests between the principal and the agent, can be understood as a means to enhance business control mechanisms. In this context, the general objective of the present study is to investigate the relation between organizational/operational complexity and corporate governance in companies listed on the BM&FBovespa, in light of the precepts of Contingency Theory, in which contingency factors may influence the organizational characteristics related to the structure and strategy of companies (DONALDSON, 2007). In this research, organizational complexity refers to the complexity of the structure and of the formal aspects necessary for the insertion of a company into the market, while operational complexity relates to the complex set of resources and processes required for the development of its activities. This research is justified by the few studies in the national literature relating the two themes, and by the intention that its results become a starting point for the analysis of business structure and of the eventual need to implement or improve more efficient controls. It is a descriptive study of quantitative nature, gathering a sample of 162 companies listed on the BM&FBovespa. For data analysis the following statistical tests were used: Factor Analysis, Multiple Linear Regression, Correspondence Analysis, and Correlation Analysis. For measuring complexity, the contingency variables age, size, diversification and internationalization were adopted as a set; and to assess corporate governance, a checklist of 16 items was used, based on the studies of Silveira (2004), Silva and Leal (2005) and Lameira and Ness Jr. (2011) and on recommendations proposed by the Comissão de Valores Mobiliários - CVM (2002) and by the Instituto Brasileiro de Governança Corporativa - IBGC (2009). Based on Factor Analysis, it was possible to identify that organizational complexity is explained by the size and diversification variables, while operational complexity is explained by the size, diversification and internationalization variables. In the Multiple Linear Regression analysis, it was found that, in both dimensions of complexity, corporate governance is influenced by the diversification, internationalization and age variables, the latter in an inverse relation. By means of Correspondence Analysis and Correlation Analysis, it is concluded that, regardless of the dimension under analysis, complexity relates positively and significantly to corporate governance; that is, the most complex companies listed on the BM&FBovespa display the highest levels of corporate governance, confirming the research hypothesis. These results deepen knowledge of the themes insofar as the presuppositions of contingency theory are confirmed, and reveal a field in which managers need to manage organizational and operational complexity so as to perceive and adhere to more and better control mechanisms through the adoption of good corporate governance practices.
618

Complexity Studies of Firm Dynamics

January 2018 (has links)
abstract: This thesis consists of three projects employing complexity economics methods to explore firm dynamics. The first is the Firm Ecosystem Model, which addresses the institutional conditions of capital access and entrenched competitive advantage. Larger firms will be more competitive than smaller firms due to efficiencies of scale, but the persistence of larger firms is also supported institutionally through mechanisms such as tax policy, capital-access mechanisms and industry-favorable legislation. At the same time, evidence suggests that small firms innovate more than larger firms, and an aggressive firm-as-value perspective incentivizes early investment in new firms in an attempt to capture that value. The Firm Ecosystem Model explores the effects of the differences in innovation and investment patterns and persistence rates between large and small firms. The second project is the Structural Inertia Model, which is intended to build theory around why larger firms may be less successful in capturing new market share than smaller firms, as well as to advance fitness landscape methods. The model explores the possibility that firms with larger scopes may be less effective in mitigating the costs of cooperation because conditions may arise that cause intrafirm conflicts. The model is implemented on structured fitness landscapes derived using the maximal order of interaction (NM) formulation and described using local optima networks (LONs), thus integrating these novel techniques. Finally, firm dynamics can serve as a proxy for the ease with which people can voluntarily enter into the legal cooperative agreements that constitute firms. The third project, the Emergent Firm Model, explores how this dynamic of voluntary association may be affected by differing capital institutions, and examines the macroeconomic implications of the economies that emerge out of the various resulting firm populations. 
/ Dissertation/Thesis / Doctoral Dissertation Applied Mathematics for the Life and Social Sciences 2018
619

Complexity issues in counting, polynomial evaluation and zero finding

Briquel, Irénée 29 November 2011 (has links) (PDF)
In this thesis we compare classical boolean complexity with algebraic complexity by studying problems related to polynomials. We consider the algebraic models of Valiant and of Blum, Shub and Smale (BSS). To study the algebraic complexity classes, one can start from results and open questions in the boolean case and look at their translation into the algebraic context; comparing the results obtained in the two settings then deepens our understanding of both complexity theories. The first part follows this framework. By considering a polynomial canonically associated to a boolean formula, we obtain a link between boolean complexity issues on the formula and algebraic complexity problems on the polynomial. We study the complexity of computing the polynomial in Valiant's model as a function of the complexity of the boolean formula. We find algebraic counterparts to some boolean results, and along the way we are also able to use algebraic methods to improve boolean results, in particular by obtaining better counting reductions. Another motivation for algebraic models of computation is to offer an elegant framework for the study of numerical algorithms; the second part of this thesis follows this approach. We start from recent algorithms for the search of approximate zeros of complex systems of n polynomials in n variables. Until now, these were algorithms for BSS machines. We study the implementation of these algorithms on digital computers, and propose an algorithm using floating-point arithmetic for this problem.
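The algorithms in question build on Newton's method: an approximate zero, in Smale's sense, is a point from which Newton iteration converges quadratically to a true zero. A minimal floating-point Newton sketch for a hypothetical system of two polynomials in two variables (an illustration of the basic iteration, not the thesis's certified algorithm):

```python
def newton_step(F, J, x, y):
    """One Newton step for a system F(x, y) = (f1, f2) of two
    polynomials, where J(x, y) returns the Jacobian entries
    (a, b, c, d) in row-major order.  The 2x2 linear update is
    solved by Cramer's rule in ordinary floating-point arithmetic."""
    f1, f2 = F(x, y)
    a, b, c, d = J(x, y)
    det = a * d - b * c
    dx = (d * f1 - b * f2) / det
    dy = (a * f2 - c * f1) / det
    return x - dx, y - dy

# Hypothetical example system: x^2 + y^2 - 2 = 0 and x - y = 0,
# which has a zero at (1, 1).
F = lambda x, y: (x * x + y * y - 2.0, x - y)
J = lambda x, y: (2.0 * x, 2.0 * y, 1.0, -1.0)

x, y = 1.5, 0.5            # an approximate zero: Newton converges
for _ in range(8):         # quadratically from here
    x, y = newton_step(F, J, x, y)
```

The thesis's contribution lies in carrying such iterations over to digital computers with controlled round-off, rather than running them on an idealised real-arithmetic (BSS) machine.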
620

On the parameterized complexity of finding short winning strategies in combinatorial games

Scott, Allan Edward Jolicoeur 29 April 2010 (has links)
A combinatorial game is a game in which all players have perfect information and there is no element of chance; some well-known examples include Othello, checkers, and chess. When people play combinatorial games they develop strategies; a strategy can be viewed as a function that takes as input a game position and returns a move to make from that position. A strategy is winning if it guarantees the player victory regardless of what legal moves any opponent makes in response. The classical complexity of deciding whether a winning strategy exists for a given position in some combinatorial game has been well studied, both in general and for many specific combinatorial games. The vast majority of these problems are, depending on the specific properties of the game or class of games being studied, complete for either PSPACE or EXP. In the parameterized complexity setting, Downey and Fellows initiated a study of "short" (or k-move) winning strategy problems. These can be seen as a generalization of "mate-in-k" chess problems, in which the goal is to find a strategy that checkmates the opponent within k moves regardless of how they respond. In their monograph on parameterized complexity, Downey and Fellows suggested that AW[*] was the "natural home" of short winning strategy problems, but there has been little work in this field since then. In this thesis, we study the parameterized complexity of finding short winning strategies in combinatorial games, considering both the general case and several specific games. In the general case we show that many short games are as hard classically as their original variants, and that finding a short winning strategy is hard for AW[P] when the rules are implemented as succinct circuits. For specific short games, we show that endgame problems for checkers and Othello are in FPT, that alternating hitting set, hex, and the non-endgame problem for Othello are in AW[*], and that short chess is AW[*]-complete. 
We also consider pursuit-evasion parameterized by the number of cops. We show that two variants of pursuit-evasion are AW[*]-hard, and that the short versions of these problems are AW[*]-complete.
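The k-move winning strategy question has a natural recursive shape: the player to move needs some move after which every opponent reply still leads to a forced win within the remaining budget. A generic sketch (illustrative interface, not code from the thesis):

```python
def has_short_win(position, k, moves, is_win, first_to_move=True):
    """Can the first player force a win within k moves from `position`?

    `moves(position)` yields the legal successor positions, and
    `is_win(position)` says whether the first player has already won.
    """
    if is_win(position):
        return True
    if k == 0:
        return False
    successors = list(moves(position))
    if first_to_move:
        # the first player needs just one good move
        return any(has_short_win(p, k - 1, moves, is_win, False)
                   for p in successors)
    # the opponent moves: every reply must still lose within budget
    # (by convention here, an opponent with no moves is not a win)
    return bool(successors) and all(has_short_win(p, k - 1, moves, is_win, True)
                                    for p in successors)
```

The recursion visits up to b^k positions for branching factor b — exponential in the parameter k, which is exactly why the parameterized rather than the classical complexity of these problems is the interesting lens.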
