61

Finite Memory Policies for Partially Observable Markov Decision Processes

Lusena, Christopher 01 January 2001 (has links)
This dissertation makes contributions to two areas of research on planning with partially observable Markov decision processes (POMDPs): complexity-theoretic results and heuristic techniques. The most important contributions are probably the complexity of approximating the optimal history-dependent finite-horizon policy for a POMDP, and the idea of heuristic search over the space of FFTs.
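The finite-memory policies discussed in this abstract can be illustrated with a small sketch. The toy POMDP below (its transition, observation, and reward tables, the discount factor, and the two-node controller) is entirely hypothetical and not taken from the dissertation; it only shows the standard way a fixed finite-state controller is evaluated, by value iteration over the cross-product of controller nodes and hidden states.

```python
GAMMA = 0.9  # discount factor (illustrative choice)

# Hypothetical two-state, two-action, two-observation POMDP.
# T[a][s][s2]: transition probability, Z[a][s2][o]: probability of
# observing o after landing in s2, R[a][s]: immediate reward.
T = [[[0.8, 0.2], [0.2, 0.8]],
     [[0.5, 0.5], [0.5, 0.5]]]
Z = [[[0.7, 0.3], [0.3, 0.7]],
     [[0.5, 0.5], [0.5, 0.5]]]
R = [[1.0, -1.0],
     [0.0, 0.0]]

# A two-node finite-state controller: act[n] is the action taken in
# node n; nxt[n][o] is the successor node after observing o.
act = [0, 1]
nxt = [[0, 1], [0, 1]]

def evaluate(act, nxt, iters=500):
    """Value V[n][s] of a fixed controller, by value iteration over
    (controller node, hidden state) pairs."""
    n_nodes, n_states = len(act), len(T[0])
    V = [[0.0] * n_states for _ in range(n_nodes)]
    for _ in range(iters):
        V = [[R[act[n]][s] + GAMMA * sum(
                  T[act[n]][s][s2] * sum(
                      Z[act[n]][s2][o] * V[nxt[n][o]][s2]
                      for o in range(len(nxt[n])))
                  for s2 in range(n_states))
              for s in range(n_states)]
             for n in range(n_nodes)]
    return V

V = evaluate(act, nxt)
```

A heuristic search over controllers, in the spirit the abstract describes, would repeatedly mutate `act` and `nxt` and keep changes that raise these values.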
62

The Approximability of Learning and Constraint Satisfaction Problems

Wu, Yi 07 October 2010 (has links)
An α-approximation algorithm is an algorithm guaranteed to output a solution that is within an α ratio of the optimal solution. We are interested in the following question: given an NP-hard optimization problem, what is the best approximation guarantee that any polynomial-time algorithm could achieve? We mostly focus on studying the approximability of two classes of NP-hard problems: Constraint Satisfaction Problems (CSPs) and Computational Learning Problems. For CSPs, we mainly study the approximability of MAX CUT, MAX 3-CSP, MAX 2-LINZ, VERTEX-PRICING, as well as several variants of the UNIQUE GAMES problem.
• The problem of MAX CUT is to find a partition of a graph so as to maximize the number of edges between the two parts. Assuming the Unique Games Conjecture, we give a complete characterization of the approximation curve of the MAX CUT problem: for every optimum value of the instance, we show that a certain SDP algorithm with RPR² rounding always achieves the optimal approximation curve.
• The input to a 3-CSP is a set of Boolean constraints such that each constraint contains at most 3 Boolean variables. The goal is to find an assignment to these variables that maximizes the number of satisfied constraints. We are interested in the case when a 3-CSP is satisfiable, i.e., there exists an assignment that satisfies every constraint. Assuming the d-to-1 conjecture (a variant of the Unique Games Conjecture), we prove that it is NP-hard to give a better than 5/8-approximation for the problem. This result matches an SDP algorithm by Zwick which gives a 5/8-approximation for satisfiable 3-CSP. In addition, our result also conditionally resolves a fundamental open problem in PCP theory on the optimal soundness of a 3-query nonadaptive PCP system for NP with perfect completeness.
• The problem of MAX 2-LINZ involves a linear system of integer equations; these equations are so simple that each equation contains at most 2 variables. The goal is to find an assignment to the variables so as to maximize the total number of satisfied equations. It is a natural generalization of the Unique Games problem, which addresses the hardness of the same equation systems over finite fields. We show that, assuming the Unique Games Conjecture, for a MAX 2-LINZ instance, even if there exists a solution that satisfies 1−ε of the equations, it is NP-hard to find one that satisfies ε of the equations, for any ε > 0.
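The SDP-with-RPR² algorithm referenced in this abstract is involved; as a minimal, hedged illustration of the MAX CUT objective itself, the sketch below evaluates cuts and applies the classical randomized 1/2-approximation (assign each vertex a uniformly random side), which is unrelated to the thesis's optimal algorithm. The 4-cycle example graph is invented for illustration.

```python
import random

def cut_value(edges, side):
    """Number of edges crossing the partition given by side[v] in {0, 1}."""
    return sum(1 for u, v in edges if side[u] != side[v])

def random_cut(n, edges, trials=200, seed=0):
    """Best cut found over repeated uniformly random assignments.
    Each single trial cuts half the edges in expectation, giving the
    classical 1/2-approximation guarantee."""
    rng = random.Random(seed)
    best, best_side = -1, None
    for _ in range(trials):
        side = [rng.randrange(2) for _ in range(n)]
        v = cut_value(edges, side)
        if v > best:
            best, best_side = v, side
    return best, best_side

# 4-cycle: the optimum cuts all 4 edges (bipartition {0, 2} vs {1, 3}).
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
best, side = random_cut(4, edges)
```

On such a tiny instance the random search finds the optimum; the point of the thesis, by contrast, is the best ratio achievable in polynomial time on worst-case instances.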
63

Family Maths and Complexity Theory

Webb, Paul, Austin, Pam 11 May 2012 (has links) (PDF)
The importance of family involvement is highlighted by findings that parents’ behaviours, beliefs and attitudes affect children’s behaviour in a major way. The Family Maths programme, which is the focus of this study, provides support for the transformative education practices targeted by the South African Department of Education by offering an intervention which includes teachers, learners and their families in an affirming learning community. In this study participating parents were interviewed to investigate their perceptions of the Family Maths programme mainly in terms of their engagement, enjoyment and confidence levels. The major themes and ideas that were generated in this study include the development of positive attitudes, parents and children working and talking together, and the skills exhibited by Family Maths facilitators. These findings are analysed within the parameters of complexity science and the pre-requisite conditions for developing a complex learning community, viz. internal diversity, redundancy, decentralized control, organised randomness and neighbour interactions.
64

Unpredictable predictables: complexity theory and the construction of order in intensive care.

Carroll, Katherine Emily January 2009 (has links)
The Intensive Care Unit (ICU) is a unit that manages the most critically ill, complex and unstable patients in the hospital. As a result, the ICU is characterised by a high degree of clinical and organisational unpredictability and uncertainty. In Western discourse, uncertainty is often portrayed as problematic, and as something to be controlled and reduced. This research challenges this discourse by examining the productive relationship between certainty and uncertainty in the work practices of ICU clinicians, and subsequently, how intensive care clinicians utilise uncertainty to construct order in a highly unpredictable work environment. To understand how order can coexist with ICU’s unremitting unpredictability, complexity theory is used to frame this investigation. This research engaged an emergent, interventionist methodology, deploying multiple methods. Using ethnography, video-ethnography, and video-reflexivity, this research relied on clinicians’ participation in the construction and analysis of video data of the ICU clinicians’ work practices. This resulted in clinician-led practice change in the ICU. This research suggests that methods need to be deployed adaptively in order to deal with the complexity of ICU, in addition to the moment-to-moment emergence of events that require the researcher’s own work plans to be revisited. Moreover, in order to gain traction with, and understand highly complex and changeable environments, the researcher needs to also enter and experience uncertainty herself. Using complexity theory as its analytical tool, this research shows an inseparability of uncertainty and certainty in the ICU which is labeled ‘un/certainty’. Three main conclusions emerge from this research. First, un/certainty predominates in intensive care, and due to this, ordering is a process rather than a final state. Un/certainty is at the heart of the adaptive practices that clinicians enact. 
These adaptive practices are highly interconnected to the changes that the ICU environment may require, and thus produce a dynamic order in the unit. Second, the researcher herself, in order to come to terms with the complexity and un/certainty of the ICU environment must also enter un/certainty in order to gain traction with the ICU environment: unpredictability and complexity cannot be studied from a neat and disengaged distance. Third, the presence of un/certainty in the ICU can be significant and enabling rather than disabling for clinicians in their ongoing pursuit of dynamically ordering practice. The contribution of un/certainty to frontline practice is as a central driver to managing change and complexity. Therefore it should be positively revalued by health services researchers, policy makers and clinicians alike.
65

Mapping the complexity of computer learning: journeying beyond teaching for computer competency to facilitating computer capability

Phelps, Renata Unknown Date (has links)
For future generations to maximise their capability to operate within technologically driven economies, it is critical to foster computer abilities at every level of the schooling process. Teachers are central to this process. Yet, for many teachers, the need to integrate computer use in their teaching is threatening and overwhelming. This thesis argues that, given the rapid rate of technological change, skills-based approaches to computer education inadequately prepare teachers for a career of continued technological change. Effective computer education for teachers requires more than skills training. It involves changes in attitudes, values and beliefs that provide confidence for ongoing learning. Furthermore, it involves learning to adapt to change, to be flexible, intuitive and above all persistent. It requires the fostering of teachers who know how to be self-directed and independent in their computer learning, rather than those dependent on structured routines or guidelines. This thesis is the ‘story’ of an action research initiative underpinned by a belief in the importance of approaches to computer education which foster lifelong computer learning. It traces the journey of a reflexive process of change and iterative development in the teaching of an educational information technology (computer) unit to pre-service teacher education students. Over a period of three years (1999-2001) I pursued a central research question, namely: How can I develop my teaching practice to better facilitate the development of capable computer users? The research explores the distinction between a ‘competent’ and a ‘capable’ computer user and trials a range of teaching and learning approaches that aim to facilitate the development of capable computer users. From this constructivist research and teaching process a multidimensional approach to computer education emerged, founded on metacognition and reflection.
This approach is demonstrated to offer many advantages over a skills-focused approach. This thesis maps the complexity of the computer learning and teaching context, arguing that simplistic approaches to teaching will produce narrow and limited learning outcomes. Rather, a holistic approach is proposed, one that moves beyond the development of computer competency toward a longer term vision of facilitating computer capability. It is argued that the role of the computer ‘teacher’ is to foster reflective awareness and develop a learning environment that can assist computer learners to become comfortable existing on the ‘edge of chaos’. This research supports previous studies which indicate the important role of computer self-efficacy and the influence of factors such as perceived usefulness, anxiety, support and frequency and duration of use. However, the research also documents the unpredictable influence of these factors on individuals’ resultant approach to computers and challenges dichotomous interpretations of such factors. Appropriate attribution is also shown to be a major influence on computer capability, as are factors such as help-seeking, motivation and goal-setting, although again, these influences are non-linear. It is argued that computer capability cannot be ‘taught’ but, rather, computer educators should look to creating environments where its emergence can be facilitated. The metacognitive computer learning context developed and explored through this research is one such approach.
67

Intellectual capital in action: Australian studies

Dumay, Johannes Cornelius January 2008 (has links)
Doctor of Philosophy (PhD) / The overarching objective of this thesis is to investigate and examine several contemporary IC theories and how they are utilised in practice so that understandings of the IC concept can be developed, in order to answer in part the main research question of “How does IC in action influence organisations?” The content of the thesis is based on a review of IC from both a theory and practice perspective and four empirical papers that examine IC theory as it is implemented in practice. In combining these papers into a coherent piece of work, a critical research perspective, as outlined by Alvesson and Deetz (2000), has been utilised as the theoretical framework. The term ‘critical’ is used in this thesis not to find fault with contemporary theory and practice of IC but rather to examine and question the application of IC theory into practice. The end result of doing so is the narrowing of an identified gap between IC theory and practice. A ‘critical’ analysis of IC in action is justified because the development of the concept of IC parallels that of ‘critical’ theory, in that both have evolved from changing conditions in society, as technology and the proliferation of knowledge have fundamentally altered the conditions under which organisations operate. The overarching findings of the thesis are based on three outcomes of critical research: insight, critique and transformative re-definition. Insight into IC is developed by examining contemporary IC frameworks as they have been applied. Critique is developed by putting to the test the implications for organisations as a result of implementing these contemporary IC frameworks. Last, transformative re-definition is achieved by opening a discourse on the impact of implementing IC practices so that academics and practitioners can develop critical, relevant and practical understandings that begin the process of change and develop practical managerial skills.
More importantly, this thesis identifies how the development of tools to reduce ‘causal ambiguity’ about how intangible resources help create (or destroy) value has the potential to raise the profile of IC as a strategic management technology. But from the wider view of the critical perspective, it is not the intention of this thesis to prescribe specific formulae for the measuring, management and reporting of IC, nor does it intend to further develop theory. So while the individual papers may proffer that certain avenues proved productive in developing insights, critique and transformative re-definition, these avenues are not offered as the preferred way of investigating IC. More specifically, the goal of a critical perspective is to open a discourse. The opportunity for academics and practitioners to engage in discourse is enabled by the thesis’ focus on the issues identified by highlighting the gap between IC theory and practice. Furthermore, each of the included papers offers the opportunity for further discourse by way of the opportunities that remain for future research. Additionally, the thesis exemplifies a number of different approaches to conducting research into IC practice that put to the test particular aspects of IC theory in order to develop insights and understandings of IC in practice. As the empirical material only examines a fraction of contemporary IC theory, there is scope for further research, and thus discourse, into the implementation of IC theory into IC practice. This future research should not be constrained by a particular method of research, as exemplified in the variety of methods employed to gather the empirical material for the papers, which stretches along the continuum of qualitative and quantitative research. This too provides an avenue for future discourse.
68

Complexities of Parsing in the Presence of Reordering

Berglund, Martin January 2012 (has links)
The work presented in this thesis discusses various formalisms for representing the addition of order-controlling and order-relaxing mechanisms to existing formal language models. An immediate example is shuffle expressions, which not only can represent all regular languages (every regular expression is a shuffle expression), but also feature additional operations that generate arbitrary interleavings of their argument strings. This defines a language class which, on the one hand, does not contain all context-free languages but, on the other hand, contains an infinite number of languages that are not context-free. Shuffle expressions are, however, not themselves the main interest of this thesis. Instead we consider several formalisms that share many of their properties, where some are direct generalisations of shuffle expressions, while others feature very different methods of controlling order. Notably, all formalisms studied here have a semi-linear Parikh image; are structured so that each derivation step generates at most a constant number of symbols (as opposed to the parallel derivations in, for example, Lindenmayer systems); feature interesting ordering characteristics, created either by derivation steps that may generate symbols in multiple places at once, or by multiple generating processes that produce output independently in an interleaved fashion; and are limited enough to make the question of efficient parsing an interesting and reasonable goal. This description already hints towards the formalisms considered: the different classes of mildly context-sensitive devices and concurrent finite-state automata. This thesis will first explain and discuss these formalisms, and will then primarily focus on the associated membership problem (or parsing problem). Several parsing results are discussed here, and the papers in the appendix give a more complete picture of these problems and some related ones.
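Parsing full shuffle expressions is the hard problem the thesis studies, but the basic interleaving operation mentioned above is easy to illustrate: deciding whether a string is a shuffle of two given strings admits a classic quadratic dynamic program. This textbook sketch is not code from the thesis:

```python
def in_shuffle(u, v, w):
    """Is w an interleaving (shuffle) of u and v?
    Standard O(|u| * |v|) dynamic program."""
    if len(u) + len(v) != len(w):
        return False
    # dp[i][j]: can u[:i] and v[:j] interleave to form w[:i+j]?
    dp = [[False] * (len(v) + 1) for _ in range(len(u) + 1)]
    dp[0][0] = True
    for i in range(len(u) + 1):
        for j in range(len(v) + 1):
            if i and dp[i - 1][j] and u[i - 1] == w[i + j - 1]:
                dp[i][j] = True
            if j and dp[i][j - 1] and v[j - 1] == w[i + j - 1]:
                dp[i][j] = True
    return dp[len(u)][len(v)]
```

For example, "acbd" interleaves "ab" and "cd", while "dcab" does not. The thesis's formalisms generalise exactly this kind of order relaxation, which is where the parsing complexity questions arise.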
69

"Walking the line between structure and freedom" : a case study of teachers' responses to curriculum change using complexity theory

Hetherington, Lindsay Ellen Joan January 2012 (has links)
This thesis uses complexity theory to explore education in the context of a changing curriculum called ‘Opening Minds’. This new curriculum was introduced in the case study school in response to a wider curriculum change which emphasised ‘learning to learn’ and the development of ‘skills for the 21st Century’. In this study, a ‘complexity thinking’ theoretical framework was adopted, drawing especially on the work of Osberg and Biesta (Osberg et al., 2008, Osberg and Biesta, 2007, Biesta and Osberg, 2007) and Davis and Sumara (2006; 2007), paying particular attention to concepts of emergence and complexity reduction. Complexity theory, through the ‘logic of emergence’, offers a challenge to mechanistic approaches to understanding the world which, despite the work of postmodern and poststructural scholars in education, remain dominant in educational practice. The Opening Minds curriculum that is the focus of this case study demonstrated the potential to challenge this mechanistic approach, as the teachers expressed a desire to work in different, flexible and creative ways: this thesis therefore explores complexity theory’s challenge to a mechanistic approach in this particular case. It also addresses the relationship between Opening Minds and science education using complexity thinking. To facilitate exploration and analysis of the case, concepts of temporal and relational emergence and complexity reduction were used to develop a ‘complexity thinking’ understanding of concepts of agency/structure, power, identity and reflexivity. This entailed reconceptualisation of these ideas in a temporal-relational sense that explicitly incorporates a sensitivity to emergence. Specifically, an additional dimension was added to Emirbayer and Mische’s (1998) construction of multidimensional agency: that of creative agency.
The research was conducted as a case study in which a ‘bricolage’ approach to data collection and analysis was used as part of an explicitly ‘complex’ methodology, addressing questions of the challenge of complexity reduction and ethics in research drawing on complexity theory. The findings indicated a challenge for teachers in negotiating tensions as they attempted to adopt approaches that could be considered ‘emergent’ alongside other ‘mechanistic’ practices. These tensions were explored in detail in relation to the concept of ‘reflection’, and in the interaction between science and Opening Minds. Bringing together the empirical and theoretical work in this study, it is suggested that mechanistic and emergent aspects may helpfully be viewed as a ‘vital simultaneity’ within the educational relationship (Davis, 2008), with the interaction between them facilitated by creative agency within a ‘pedagogy of interruption’ (Biesta, 2006). It was further argued that reflection could be used in responsive and flexible ways to support both learning and assessment as a crucial aspect of a pedagogy of interruption. Such a ‘contingently responsive and creative pedagogy’ may support the interaction between science and Opening Minds productively. It is suggested that a complex approach to a pedagogy of interruption could support teachers in engaging with the creative and diverse elements of science or learning-to-learn curricula whilst maintaining the mechanistic aspects of teaching that support students in learning key concepts and skills.
70

Quantum stochastic processes and quantum many-body physics

Bausch, Johannes Karl Richard January 2017 (has links)
This dissertation investigates the theory of quantum stochastic processes and its applications in quantum many-body physics. The main goal is to analyse complexity-theoretic aspects of both static and dynamic properties of physical systems modelled by quantum stochastic processes. The thesis consists of two parts: the first addresses the computational complexity of certain quantum and classical divisibility questions, whereas the second addresses the topic of Hamiltonian complexity theory. In the divisibility part, we discuss whether one can efficiently sub-divide a map describing the evolution of a system in a noisy environment, i.e. a CPTP or stochastic map for quantum and classical processes, respectively, and we prove that taking the $n$th root of a CPTP or stochastic map is an NP-complete problem. Furthermore, we show that deciding whether one can divide up a random variable $X$ into a sum of $n$ iid random variables $Y_i$, i.e. $X=\sum_{i=1}^n Y_i$, is poly-time computable; relaxing the iid condition renders the problem NP-hard. In the local Hamiltonian part, we study computation embedded into the ground state of a many-body quantum system, going beyond "history state" constructions with a linear clock. We first develop a series of mathematical techniques which allow us to study the energy spectrum of the resulting Hamiltonian, and extend classical string rewriting to the quantum setting. This allows us to construct the most physically realistic QMAEXP-complete instances for the LOCAL HAMILTONIAN problem (i.e. the question of estimating the ground state energy of a quantum many-body system) known to date, in both one and three dimensions. Furthermore, we study weighted versions of linear history state constructions, allowing us to obtain tight lower and upper bounds on the promise gap of the LOCAL HAMILTONIAN problem in various cases.
We finally study a classical embedding of a Busy Beaver Turing Machine into a low-dimensional lattice spin model, which allows us to dictate a transition from a purely classical phase to a Toric Code phase at arbitrarily large and potentially even uncomputable system sizes.
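The abstract does not spell out its polynomial-time algorithm for the iid divisibility question, but for finitely supported distributions one illustrative approach works through probability generating functions: $X$ is a sum of $n$ iid copies of some $Y$ exactly when the pgf of $X$ has an $n$th root that is again a pgf (nonnegative coefficients summing appropriately). The sketch below, including all function names and the tolerance choice, is an assumption-laden illustration of that idea, not the thesis's algorithm:

```python
def poly_mul(a, b, m):
    """Product of polynomials a, b (coefficient lists), truncated to degree m."""
    out = [0.0] * (m + 1)
    for i, ai in enumerate(a[:m + 1]):
        for j, bj in enumerate(b[:m + 1 - i]):
            out[i + j] += ai * bj
    return out

def poly_pow(a, n, m):
    """a(x)**n truncated to degree m."""
    out = [1.0] + [0.0] * m
    for _ in range(n):
        out = poly_mul(out, a, m)
    return out

def iid_root(p, n, tol=1e-9):
    """If the pmf p on {0, ..., len(p)-1} (with p[0] > 0) is the law of
    a sum of n iid copies of some Y, return Y's pmf; else None.
    Solves for the n-th root of the generating function coefficient
    by coefficient, then verifies it is a valid pmf."""
    m = len(p) - 1
    if m % n != 0 or p[0] <= 0:
        return None
    d = m // n
    q = [p[0] ** (1.0 / n)] + [0.0] * d
    for k in range(1, d + 1):
        # [x^k] q(x)^n with q[k] still zero; then solve for q[k] from
        # p[k] = partial + n * q[0]**(n-1) * q[k].
        partial = poly_pow(q, n, k)[k]
        q[k] = (p[k] - partial) / (n * q[0] ** (n - 1))
    r = poly_pow(q, n, m)
    ok = all(qk >= -tol for qk in q) and all(
        abs(rk - pk) <= tol for rk, pk in zip(r, p))
    return q if ok else None

# Binomial(2, 1/2) splits into two iid Bernoulli(1/2) variables:
print(iid_root([0.25, 0.5, 0.25], 2))   # ~[0.5, 0.5]
# A fair coin on {0, 2} is NOT a sum of two iid variables:
print(iid_root([0.5, 0.0, 0.5], 2))     # None
```

The NP-hardness result for the non-iid relaxation indicates that no such simple coefficient-chasing scheme can survive once the summands are merely independent rather than identically distributed.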
