431

Complexity and the practices of communities in healthcare : implications for an internal practice consultant

Briggs, Marion Christine Elizabeth January 2012 (has links)
Current literature regarding quality health services frequently identifies interprofessional collaboration (IPC) as essential to patient-centred care, sustainable health systems, and a productive workforce. The IPC literature tends to focus on interprofessionalism and collaboration and pays little attention to the concept of practice, which is thought to be a represented world of objects and processes that have pre-given characteristics practitioners can know cognitively and apply or manage correctly. Many strategies intended to support IPC simplify and codify the complex, contested, and unpredictable day-to-day interactions among interdependent agents that I argue constitute the practices of a community. These strategies are based in systems thinking, which understands the system as distinct from experience and subject to rational, linear logic. In this thinking, a leader can step outside of the system to develop an ideal plan, which is then implemented to unfold the predetermined ideal future. However, changes in health services and healthcare practices are often difficult to enact and sustain. This thesis problematises the concept of ‘practice’, and claims practices as thoroughly social and emergent phenomena constituted by interdependent and iterative processes of representation (policies and practice guidelines), signification (sense making through negotiation and reflective and reflexive practices), and improvisation (acting into the circumstances that present at the point and in the moments of care). I argue that local and population-wide patterns are negotiated and iteratively co-expressed through relations of power, values, and identity. Moreover, practice (including the practice of leadership or consulting) is inherently concerned with ethics, which I also formulate as both normative and social/relational in nature. 
I argue that theory and practice are not separate but paradoxical phenomena that remain in generative tension, which in healthcare is often felt as tension between what we should do (best practice) and what we actually do (best possible practice in the contingent circumstances we find ourselves in). I articulate the implications this has for how knowledge and knowing are understood, how organisations change, and how the role of an internal practice consultant is understood. An important implication is that practice-based evidence and evidence-based practice are iterative and co-expressed (not sequential), and while practice is primordial, it is not privileged over theory. I propose that a practice consultant could usefully become a temporary participant in the practices of a particular community. Through a position of ‘involved detachment’, a consultant can more easily notice and articulate the practices of a community that for participants are most often implicit and taken for granted. Reflective and reflexive consideration of what is taken for granted may change conversations and thus be transformative.
432

Det är enklare i teorin… Om skolutveckling i praktiken : En fallstudie av ett skolutvecklingsprojekt i en gymnasieskola

von Schantz Lundgren, Ina January 2008 (has links)
This dissertation is a case study of a school development project that took place in an upper secondary school following a merger of two schools with different cultures. The project used a method called “Frirumsmodellen” and was planned in three steps: first, a cultural analysis to map the preconditions for starting a school development project; second, concrete actions; and finally, a second cultural analysis to study any effects of those activities. My role was to supervise the school development work while at the same time studying how the work was conducted and its impact on the ordinary school day. The dissertation takes its point of departure in the fact that schools are politically governed. The mission of schools is never neutral; it is always an expression of underlying social forces, ideologies, and ideals of the contemporary society. For this reason, there is a close connection between the macro-political level and the micro-political level. Another point of departure is the transition from a modern to a postmodern society, which gives its character to the changes taking place in schools. The steering of schools has partly been treated as a technical implementation problem. Schools contain ongoing conflicts between different interest groups that, more or less regularly, end up in educational reforms, and these reforms generate school development activities in the individual school. Undoubtedly, this makes school development a complex process. At a rather late stage of the study I decided not to follow the original plan; instead I put the school development project itself, as a model, in focus. The overall purpose was formulated as: how is it possible to understand what happened in the school development project at Falkgymnasiet, and why was it not possible to carry it out as stated in the project plan? 
To interpret what took place during the project, I created an interpretive frame from implementation theory and complexity theory that also made it possible to critically scrutinise “Frirumsmodellen”. Already at an early stage of the process it was obvious that “Frirumsmodellen” did not supply any usable tools, and it became disconnected from the project. The project itself was marginalised and made invisible. The headmaster used the situation to change things she thought were important to develop. As a result, things happened, but most of the people involved did not at first connect this to the project. It is, of course, difficult to say in detail what caused what. Complexity theory successively revealed hidden patterns, made hidden unofficial power-holders visible, and exposed unpredictable conditions that generated reactions from the personnel facing a development effort; together, these were rather effective obstacles to changing this school. I also discuss school development and implementation problems on a general level, for example the possibility of transforming a top-down initiated project into a bottom-up driven one, and of using projects as tools for school development work. It was obvious that headmasters and teachers must be prepared to handle the ideological dimensions of the problems schools have to face. Consequently, development work is about making problems visible and handling them at the intersection of the intentions of educational policies, pedagogical researchers, school administrators, headmasters, teachers, and pupils. The ideological dimension also contains an existential issue: do I as a teacher share the intentions of the development work? If not, how must I act? / Also available as a talking book, read by speech synthesis from Växjö University Press, 2008; the talking book comprises 1 CD-ROM (18 hr, 33 min).
433

Role Based Hedonic Games

Spradling, Matthew 01 January 2015 (has links)
In the hedonic coalition formation game model Role Based Hedonic Games (RBHG), agents view teams as compositions of available roles. An agent's utility for a partition is based upon which role she fulfills within the coalition and which additional roles are being fulfilled within the coalition. I consider optimization and stability problems for settings with variable power on the part of the central authority and on the part of the agents. I prove several of these problems to be NP-complete or coNP-complete. I introduce heuristic methods for approximating solutions for a variety of these hard problems. I validate the heuristics on real-world data scraped from League of Legends games.
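The role-based utility structure described above can be sketched in a few lines. The role names, preference encoding, and the default utility of 0 for unlisted compositions are illustrative assumptions, not details taken from the dissertation:

```python
# Sketch of utility evaluation in a Role Based Hedonic Game (RBHG).
# Role names and the preference encoding below are hypothetical.

def agent_utility(agent, partition, role_of, prefs):
    """Utility of `agent` for a partition, given each agent's assigned role.

    prefs[agent] maps (own_role, sorted tuple of teammates' roles) to a
    numeric utility; compositions not listed default to 0.
    """
    for coalition in partition:
        if agent in coalition:
            own = role_of[agent]
            others = tuple(sorted(role_of[a] for a in coalition if a != agent))
            return prefs[agent].get((own, others), 0)
    raise ValueError(f"{agent} is not in any coalition")

# A small example: agent "a" enjoys tanking alongside a healer.
partition = [{"a", "b"}, {"c"}]
role_of = {"a": "tank", "b": "healer", "c": "carry"}
prefs = {"a": {("tank", ("healer",)): 5},
         "b": {("healer", ("tank",)): 3},
         "c": {("carry", ()): 1}}
print(agent_utility("a", partition, role_of, prefs))  # → 5
```

Note that the utility depends only on the roles present in the agent's own coalition, not on which specific agents fill them, which is what distinguishes the role-based model from general hedonic games.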
434

Diagnostic Evaluation of Watershed Models

Martinez Baquero, Guillermo Felipe January 2007 (has links)
With increasing model complexity there is a pressing need for new methods that can be used to mine information from large volumes of model results and available data. This work explores strategies to identify and evaluate the causes of discrepancy between models and data related to hydrologic processes, and to increase our knowledge about watershed input-output relationships. In this context, we evaluate the performance of the abcd monthly water balance model for 764 watersheds in the conterminous United States. The work required integration of the Hydro-Climatic Data Network dataset with various kinds of spatial information, and a diagnostic approach to relating model performance with assumptions and characteristics of the basins. The diagnostic process was implemented via classification of watersheds, evaluation of hydrologic signatures and the identification of dominant processes. Knowledge acquired during this process was used to test modifications of the model for hydrologic regions where the performance was "poor".
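For reference, one common statement of the abcd model's monthly update (the Thomas, 1981 formulation) can be sketched as below. The parameter values are illustrative, and the exact variant used in the thesis may differ:

```python
import math

def abcd_step(P, PE, S_prev, G_prev, a, b, c, d):
    """One monthly step of the abcd water balance model (Thomas 1981 form).

    P: precipitation; PE: potential evapotranspiration (same units, e.g. mm)
    S: soil moisture storage; G: groundwater storage
    a (0 < a <= 1), b, c (0 <= c <= 1), d (>= 0): the four parameters.
    """
    W = P + S_prev                            # available water
    u = (W + b) / (2 * a)
    Y = u - math.sqrt(u * u - W * b / a)      # "evapotranspiration opportunity"
    S = Y * math.exp(-PE / b)                 # end-of-month soil moisture
    E = Y - S                                 # actual evapotranspiration
    G = (G_prev + c * (W - Y)) / (1 + d)      # groundwater storage
    Q = (1 - c) * (W - Y) + d * G             # direct runoff + baseflow
    return S, G, E, Q

# Illustrative values (mm); the monthly water balance closes exactly.
S, G, E, Q = abcd_step(P=100.0, PE=50.0, S_prev=50.0, G_prev=20.0,
                       a=0.98, b=250.0, c=0.3, d=0.1)
```

By construction, P = E + Q + ΔS + ΔG at every step, which is one reason the model lends itself to the kind of diagnostic, signature-based evaluation described above.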
435

Measuring Nursing Care Complexity in Nursing Homes

Velasquez, Donna Marie January 2005 (has links)
The quality of care in nursing homes has generally improved since the implementation of OBRA-1987; however, reports of serious problems such as inadequate pain management, pressure sores, malnutrition, and urinary incontinence persist. While the primary concern remains lack of staffing, investigators have found that even the highest-staffed nursing homes are deficient in some care processes. It has been suggested that a lack of effective management structure may be a contributing factor. There is theoretical and empirical evidence to suggest that effective management structure is best guided by the complexity of work performed by the organization. The purpose of this study was to develop a reliable and valid instrument to measure nursing care complexity in nursing homes. Items were developed based on a comprehensive review of the literature and the adaptation of items from existing instruments to make them relevant to the nursing home setting. Content validity was evaluated by nurse experts with extensive knowledge of the theory and/or nursing home care. One hundred sixty-eight direct care providers from seven nursing homes located in central and southern Arizona participated in the study. Reliability was estimated using Cronbach's alpha. Reliabilities using individual-level data were generally acceptable for a new scale; however, the alpha for the client technology subscale was low (total scale = .78, client technology = .65, operations technology = .78, and knowledge technology = .79). Exploratory factor analysis demonstrated three domains of nursing care complexity as conceptualized. Explained variance for the 3 factors was 36.19%. There was a very modest correlation of the instrument with an established instrument of work unit technology and a modified magnitude estimate of nursing care complexity. 
One subscale (knowledge technology) discriminated between nursing subunits in the nursing home. The instrument demonstrated modest psychometric properties in measuring nursing care complexity in nursing homes. The strength of the instrument is its ability to measure domains of work complexity based on theory from organizational and nursing science. Further investigation is needed to strengthen the psychometric properties of the instrument and to determine its usefulness in measuring nursing care complexity in nursing homes.
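Cronbach's alpha, the reliability estimate reported above, is computed from item-level scores as the ratio of shared to total variance; a minimal sketch (the toy data here are illustrative, not the study's):

```python
def cronbach_alpha(items):
    """Cronbach's alpha: items is one list of scores per item,
    all scored over the same respondents, in the same order."""
    k = len(items)
    n = len(items[0])
    def svar(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(svar(it) for it in items) / svar(totals))

# Three perfectly correlated items yield alpha = 1.0.
alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]])
print(round(alpha, 3))  # → 1.0
```

Values around .78 for an exploratory total scale, as reported above, are commonly treated as acceptable, while the .65 subscale alpha signals the weaker internal consistency the author notes.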
436

Low complexity channel models for approximating flat Rayleigh fading in network simulations

McDougall, Jeffrey Michael 30 September 2004 (has links)
The intricate dependency of networking protocols upon the performance of the wireless channel motivates the investigation of network channel approximations for fading channels. Wireless networking protocols are increasingly being designed and evaluated with the assistance of networking simulators. While evaluating networking protocols such as medium access control, routing, and reliable transport, the network channel model, and its associated capacity, will drastically impact the achievable network throughput. Researchers relying upon simulation results must therefore use extreme caution to ensure the use of similar channel models when performing protocol comparisons. Some channel approximations have been created to mimic the behavior of a fading environment; however, little to no justification exists for these approximations. This dissertation addresses the need for a computationally efficient fading channel approximation for use in network simulations. A rigorous flat fading channel model was developed for use in accuracy measurements of channel approximations. The popular two-state Markov model channel approximation is analyzed and shown to perform poorly for low to moderate signal-to-noise ratios (SNR). Three novel channel approximations are derived, with multiple methods of parameter estimation. Each model is analyzed for both statistical performance and network performance. The final model is shown to achieve very accurate network throughput performance by closely matching the frame run distributions. This work provides a rigorous evaluation of the popular two-state Markov model, and of three novel low complexity channel models, in both statistical accuracy and network throughput performance. The novel models are formed through attempts to match key statistical parameters of frame error run and good frame run statistics. 
It is shown that matching key parameters alone is insufficient to achieve an acceptable channel approximation, and that it is necessary to approximate the distributions of frame error run duration and good frame run duration. The final novel channel approximation, the three-state run-length model, is shown to achieve a good approximation of the desired distributions when some key statistical parameters are matched.
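For context, the two-state Markov channel approximation analyzed above (often called the Gilbert-Elliott model) can be simulated in a few lines. The parameter values are illustrative and do not reproduce any model from the dissertation:

```python
import random

def two_state_markov_channel(n_frames, p_gb, p_bg, e_good, e_bad, seed=None):
    """Simulate per-frame errors (1 = frame error) under a two-state
    Markov channel.

    p_gb / p_bg: per-frame transition probabilities good->bad / bad->good.
    e_good / e_bad: frame error probability within each state.
    """
    rng = random.Random(seed)
    good = True
    errors = []
    for _ in range(n_frames):
        p_err = e_good if good else e_bad
        errors.append(1 if rng.random() < p_err else 0)
        if good:
            good = rng.random() >= p_gb
        else:
            good = rng.random() < p_bg
    return errors

errs = two_state_markov_channel(10_000, p_gb=0.05, p_bg=0.4,
                                e_good=0.01, e_bad=0.5, seed=42)
```

The long-run fraction of frames in the bad state is p_gb / (p_gb + p_bg) ≈ 0.11 here, giving a mean frame error rate near 0.065. As the dissertation argues, matching such first-order statistics still leaves the run-length distributions, and hence network throughput, poorly approximated.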
437

Separation logic : expressiveness, complexity, temporal extension

Brochenin, Rémi 25 September 2013 (has links) (PDF)
This thesis studies logics which express properties on programs. These logics were originally intended for the formal verification of programs with pointers. Overall, no automated verification method will be proved tractable here; rather, we give a new insight on separation logic. The complexity and decidability of some essential fragments of this logic for Hoare triples were not known before this work. Also, its combination with some other verification methods had been little studied. Firstly, in this work we isolate the operator of separation logic which makes it undecidable. We describe the expressive power of this logic, comparing it to second-order logics. Secondly, we try to extend decidable subsets of separation logic with a temporal logic, and with the ability to describe data. This allows us to give boundaries to the use of separation logic. In particular, we give boundaries to the creation of decidable logics using this logic combined with a temporal logic or with the ability to describe data.
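As a reminder of the setting (stated here for orientation, not taken from the thesis itself), separation logic is interpreted over a store $s$ and a heap $h$, and its two separating connectives, the separating conjunction $\ast$ and its adjoint the magic wand $-\!\!\ast$, are standardly defined roughly as follows; the wand's quantification over arbitrary disjoint extension heaps is the usual source of second-order expressive power:

```latex
s,h \models \varphi_1 \ast \varphi_2
  \;\iff\; \exists h_1, h_2.\;\; h = h_1 \uplus h_2
    \;\wedge\; s,h_1 \models \varphi_1 \;\wedge\; s,h_2 \models \varphi_2
\\[4pt]
s,h \models \varphi_1 \mathrel{-\!\!\ast} \varphi_2
  \;\iff\; \forall h'.\;\; (h' \mathbin{\bot} h \;\wedge\; s,h' \models \varphi_1)
    \;\Rightarrow\; s, h \uplus h' \models \varphi_2
```

Here $\uplus$ denotes union of heaps with disjoint domains and $h' \mathbin{\bot} h$ that the two heaps have disjoint domains.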
438

Processing terror : an investigation into the immediate and short-term psychological effects of a terrorist attack

Jhangiani, Rajiv Sunil 05 1900 (has links)
In the years since the 9/11 attacks the incidence of terrorism has been on the rise. At the same time, news media coverage of major terrorist attacks has reached epic proportions, greatly expanding the number of individuals psychologically affected by terrorism. The goal of this dissertation is to better understand how individuals cope with terrorism experienced at a distance. Specifically, this investigation focuses on the impact of stress on integrative complexity (IC; a measure of cognitive processing; Suedfeld, Tetlock, & Streufert, 1992) during and shortly after a major terrorist event. Taken together, the findings from the three studies reported in this dissertation provide several insights into this process. Study 1 replicates and extends results from an earlier study of television newscasters reporting live on 9/11 (Jhangiani & Suedfeld, 2005), in the context of the 2005 London bombings and the medium of radio. In doing so, it provides the first empirical evidence outside of the research laboratory for the curvilinear relationship between stress and IC. Specifically, during the early stages of reports concerning the London bombings, a positive relationship is found between negative emotion and IC. However, once the nature and extent of the event become clearer, increases in negative emotion are related to decreases in IC (the disruptive stress hypothesis). Study 2 replicates this curvilinear relationship in the short-term reactions of two prominent political leaders to 9/11 and the 2005 London bombings. For one of these political leaders, the magnitude of his psychological reaction is moderated by the psychological distance between him and the victims of the attacks. Finally, Study 3 finds that two key personality variables, neuroticism and empathy, play important roles in determining the magnitude of the short-term psychological reactions to 9/11 of more than 250 students from Canada and the United States. 
This finding is particularly true for those students who were psychologically closer to the victims of the attacks. Implications, strengths and limitations of this research, and possible future directions are discussed.
439

Computing sparse multiples of polynomials

Tilak, Hrushikesh 20 August 2010 (has links)
We consider the problem of finding a sparse multiple of a polynomial. Given a polynomial f ∈ F[x] of degree d over a field F, and a desired sparsity t = O(1), our goal is to determine if there exists a multiple h ∈ F[x] of f such that h has at most t non-zero terms, and if so, to find such an h. When F = Q, we give a polynomial-time algorithm in d and the size of coefficients in h. For finding binomial multiples we prove a polynomial bound on the degree of the least degree binomial multiple independent of coefficient size. When F is a finite field, we show that the problem is at least as hard as determining the multiplicative order of elements in an extension field of F (a problem thought to have complexity similar to that of factoring integers), and this lower bound is tight when t = 2.
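The binomial-multiple case (t = 2) can be illustrated with a small exact-arithmetic sketch over Q: compute x^n mod f iteratively and stop when the remainder is a constant c, so that x^n − c is a binomial multiple of f. The degree cap passed in is the caller's choice for the search, not the degree bound proved in the thesis:

```python
from fractions import Fraction

def binomial_multiple(f_coeffs, max_degree):
    """Search for a binomial multiple x^n - c of f over Q.

    f_coeffs lists coefficients from low to high degree.  Returns (n, c)
    with x^n - c divisible by f, or None if no such n <= max_degree exists.
    """
    f = [Fraction(c) for c in f_coeffs]
    f = [c / f[-1] for c in f]            # make f monic
    d = len(f) - 1
    r = [Fraction(0)] * d                 # r = x^0 mod f
    r[0] = Fraction(1)
    for n in range(1, max_degree + 1):
        top = r[-1]                       # coefficient spilling into x^d
        r = [Fraction(0)] + r[:-1]        # multiply the remainder by x
        for i in range(d):                # reduce via x^d = -sum_i f[i] x^i
            r[i] -= top * f[i]
        if all(c == 0 for c in r[1:]):    # remainder is the constant r[0]
            return n, r[0]
    return None

# f = x^2 + x + 1 divides x^3 - 1, so the search finds n = 3, c = 1.
print(binomial_multiple([1, 1, 1], 10))   # → (3, Fraction(1, 1))
```

Each step costs O(d) field operations, so the search is polynomial in the degree bound; the finite-field hardness result quoted above explains why no analogous shortcut is expected there.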
440

Assessment of Watershed Model Simplification and Potential Application in Small Ungaged Watersheds: A Case Study of Big Creek, Atlanta, GA

Comarova, Zoia A, Ms 11 August 2011 (has links)
Technological and methodological advances of the past few decades have provided hydrologists with advanced and increasingly complex hydrological models. These models improve our ability to simulate hydrological systems, but they also require a lot of detailed input data and, therefore, have a limited applicability in locations with poor data availability. From a case study of Big Creek watershed, a 186.4 km2 urbanizing watershed in Atlanta, GA, for which continuous flow data are available since 1960, this project investigates the relationship between model complexity, data availability and predictive performance in order to provide reliability factors for the use of reduced complexity models in areas with limited data availability, such as small ungaged watersheds in similar environments. My hope is to identify ways to increase model efficiency without sacrificing significant model reliability that will be transferable to ungaged watersheds.
