321

Evaluation of information systems development in the NHS using NIMSAD framework

Kheong Lye, Sue January 1996 (has links)
The principal focus of the research effort was the management of information systems development to support the increased information needs arising from the radical health reforms of 1989. This was undertaken in collaboration with a purchaser and a provider within the health service. An action research approach was adopted wherein the researcher was actively involved in the development and successful implementation of an information system. Initial findings revealed a variety of factors hindering the purchaser and the provider from successfully developing the intended information systems to support the contracting process required in the reforms. A disparity in relative strengths between the purchaser and provider was considered a major constraint hindering the purchaser from developing the intended information system and performing their designated role in the new internal market system of the NHS. Through the rapid development of a computer-based information system the immediate needs of the purchaser and the provider were satisfied, and development of the individuals and the organisation took place. Subsequent to the development, a reflective post-intervention evaluation was carried out using a conceptual problem solving framework.

Three important findings emerged from the systems development effort: [1] The employment of prototyping in the evolutionary development of the intended information system is considered to be particularly pertinent and responsive to the uncertain requirements of organisations undergoing change. [2] The embracing of a flexible blend of expert intervention and facilitation is an important element in the information systems development process. [3] The development of the individuals and the organisation is an intrinsic part of developing information systems.

Using the NIMSAD framework for post-intervention evaluation of the development effort, various additional findings were abstracted from the critical evaluation and reflection on the adopted approach. The systems development process was evaluated against three identified elements - the problem situation, the problem solving process and the problem solver. Results of the evaluation and reflection revealed deficiencies in the research, which indicate that: [1] The appreciation of the context and content of the problem situation increases the level of understanding of the 'problems', leading to the adoption of appropriate methodologies for conducting the problem solving process. [2] The effectiveness of the adopted problem solving process can be enhanced by the validation of the client's definition of the problem, the facilitation of involvement from participants, the innovative use of prototyping and the need for evaluation of the process. [3] The personal characteristics of the problem solver significantly influence the possible solutions to the identified problems.

Contributions from the evaluation of the research effort can be seen in: [1] The suggested reflexive model for action research, with emphasis on evaluation of the actions of the researcher as a problem solver. [2] The need to maintain close links with the client and communicate disparate perceptions of the problem and problem situation. [3] The employment of a flexible blend of expert intervention and facilitation (a hybrid approach enables the resolution of the problem from a multidisciplinary perspective). [4] Suggestions for further research into the personal characteristics of an effective problem solver.
322

Méthanisation par voie sèche discontinue des fumiers : optimisation des paramètres opérationnels du procédé / Optimization of the process parameters controlling dry anaerobic digestion of spent animal bedding in leach-bed reactors

Riggio, Silvio 29 June 2017 (has links)
Anaerobic Digestion (AD) is a process that allows the treatment of organic waste and the production of renewable energy in the form of biogas. In particular, dry AD allows the treatment of solid organic substrates, offering several possibilities for the valorization of agricultural waste such as spent livestock bedding (a mixture of straw, faeces and urine accumulated in barn litter). Among the available AD biotechnologies, the leach-bed reactor (LBR) is a promising but still poorly known process at both the scientific and industrial levels. In order to develop this process, several issues were studied: (i) the bio-physico-chemical characterization of spent animal bedding and its digestion potential in LBRs; (ii) the optimization of the start-up and operating temperature of the digesters; (iii) the co-digestion of spent animal bedding with an easily degradable substrate and the issues connected to the management of the volatile fatty acids (VFAs) produced.

The results showed that spent animal bedding is a slowly degradable substrate that needs a long digestion time. However, it is a substrate suitable for treatment through AD, displaying degradation and methane production rates of industrial interest when processed in LBRs; it is therefore a valuable organic resource in the agricultural context. Spent animal bedding was shown to contain an active methanogenic population able to start the digestion process efficiently, at both thermophilic and mesophilic temperatures, without requiring a specific external inoculum. An economic study at industrial scale showed that this peculiarity can be used to reduce the initial investment costs of a full-scale project and thus promote the development of the process. Moreover, thermophilic operation proved to offer no advantage over mesophilic operation: despite very similar methane yields in both temperature ranges, the very high methane production rate at the start of thermophilic digestion would reduce the final electric energy recovered by cogeneration. Mesophilic temperature therefore appears to be the most suitable operating condition for this process.

Finally, the role played by leachate recirculation in the mobilization of the VFAs accumulating in the solid bulk was highlighted in a reactor co-digesting a slowly degradable fraction (spent livestock bedding) and an easily degradable one. A strategy was also proposed to address this problem efficiently by optimizing both VFA extraction and consumption, with the objective of increasing the overall process efficiency.

In the end, this work allowed the optimization of several important parameters for the correct management of LBRs. The technology proved efficient in the treatment of spent livestock bedding, both as a sole substrate and in co-digestion with an easily degradable substrate. This research demonstrates that the LBR is a process adapted to the agricultural context and that adjusting its control parameters allows it to respond effectively to the issues usually encountered at full scale. This work represents a significant advance towards the comprehension and development of LBRs for the treatment of agricultural waste and, more generally, towards the development of renewable energies based on agricultural biomass.
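The comparison between thermophilic and mesophilic operation above hinges on how a cogeneration (CHP) unit of fixed rated power captures an early methane production peak. As a rough illustration only (the daily production profiles, CHP rating and electrical efficiency below are invented for the example and are not taken from the thesis), the following sketch estimates how much of the produced methane is actually converted to electricity when production peaks early versus when it is spread out:

```python
# Hypothetical sketch: electricity recovered by a CHP unit from a batch LBR,
# assuming a fixed CHP rating; all values are illustrative, not from the thesis.
LHV_CH4_KWH_M3 = 9.97   # approximate lower heating value of methane, kWh per Nm3
ETA_EL = 0.38           # assumed electrical efficiency of the CHP unit
CHP_RATED_KW = 30.0     # assumed rated electrical power of the CHP unit

def electricity_recovered(daily_ch4_m3):
    """Return (recovered, potential) electrical energy in kWh over the batch."""
    recovered, potential = 0.0, 0.0
    for v in daily_ch4_m3:                    # v = methane produced that day, Nm3
        e_day = v * LHV_CH4_KWH_M3 * ETA_EL   # electricity if nothing were lost
        cap = CHP_RATED_KW * 24.0             # most the CHP can deliver in a day
        recovered += min(e_day, cap)          # gas above the cap is lost/flared
        potential += e_day
    return recovered, potential

# Early, sharp peak vs. a flatter profile, with the same total methane volume.
peaked = [300, 250, 150, 80, 50, 30, 20, 10, 5, 5]
flat   = [120, 120, 115, 110, 105, 100, 90, 70, 40, 30]
for name, profile in [("peaked", peaked), ("flat", flat)]:
    rec, pot = electricity_recovered(profile)
    print(f"{name}: {rec:.0f} kWh recovered of {pot:.0f} kWh potential")
```

With these made-up numbers the peaked profile loses part of its potential electricity to the CHP power cap, while the flatter profile recovers all of it, which is the qualitative effect described for thermophilic versus mesophilic operation.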
323

Improving Storage with Stackable Extensions

Guerra, Jorge 13 July 2012 (has links)
Storage is a central part of computing. Driven by an exponentially increasing content generation rate and a widening performance gap between memory and secondary storage, researchers are in a perennial quest to push for further innovation. This has resulted in novel ways to “squeeze” more capacity and performance out of current and emerging storage technology. Adding intelligence and leveraging new types of storage devices has opened the door to a whole new class of optimizations to save cost, improve performance, and reduce energy consumption.

In this dissertation, we first develop, analyze, and evaluate three storage extensions. Our first extension tracks application access patterns and writes data in the way individual applications most commonly access it to benefit from the sequential throughput of disks. Our second extension uses a lower power flash device as a cache to save energy and turn off the disk during idle periods. Our third extension is designed to leverage the characteristics of both disks and solid state devices by placing data in the most appropriate device to improve performance and save power.

In developing these systems, we learned that extending the storage stack is a complex process. Implementing new ideas incurs a prolonged and cumbersome development process and requires developers to have advanced knowledge of the entire system to ensure that extensions accomplish their goal without compromising data recoverability. Furthermore, storage administrators are often reluctant to deploy specific storage extensions without understanding how they interact with other extensions and whether the extension ultimately achieves the intended goal. We address these challenges by using a combination of approaches. First, we simplify the storage extension development process with system-level infrastructure that implements core functionality commonly needed for storage extension development. Second, we develop a formal theory to assist administrators in deploying storage extensions while guaranteeing that the given high-level goals are satisfied. There are, however, some cases for which our theory is inconclusive. For such scenarios we present an experimental methodology that allows administrators to pick the extension that performs best for a given workload. Our evaluation demonstrates the benefits of both the infrastructure and the formal theory.
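The third extension described above places data on whichever device suits it best. As a loose illustration of that general idea only (the thresholds, capacities and interface below are invented for the example and do not reflect the dissertation's actual policy, which also weighs power and sequential throughput), a tiering policy can be sketched as an access counter that promotes frequently used blocks to the SSD and leaves cold ones on the disk:

```python
# Hypothetical sketch of a hot/cold tiering policy between an SSD and an HDD.
from collections import defaultdict

class TieringPolicy:
    def __init__(self, ssd_capacity_blocks=1000, hot_threshold=4):
        self.ssd_capacity = ssd_capacity_blocks
        self.hot_threshold = hot_threshold
        self.access_count = defaultdict(int)   # block id -> accesses this epoch
        self.on_ssd = set()                    # blocks currently placed on the SSD

    def record_access(self, block_id):
        self.access_count[block_id] += 1

    def rebalance(self):
        """Promote the hottest blocks to the SSD, demote everything else."""
        ranked = sorted(self.access_count, key=self.access_count.get, reverse=True)
        hot = [b for b in ranked if self.access_count[b] >= self.hot_threshold]
        self.on_ssd = set(hot[: self.ssd_capacity])
        self.access_count.clear()              # start a fresh observation epoch

    def placement(self, block_id):
        return "ssd" if block_id in self.on_ssd else "hdd"

# Example: block 42 is accessed often and ends up on the SSD after rebalancing.
policy = TieringPolicy()
for _ in range(10):
    policy.record_access(42)
policy.record_access(7)
policy.rebalance()
print(policy.placement(42), policy.placement(7))   # -> ssd hdd
```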
324

Multirobot Localization Using Heuristically Tuned Extended Kalman Filter

Masinjila, Ruslan January 2016 (has links)
A mobile robot needs to know its pose (position and orientation) in order to navigate and perform useful tasks. The problem of determining this pose with respect to a global or local frame is called localisation, and is a key component in providing autonomy to mobile robots. Thus, localisation answers the question "Where am I?" from the robot's perspective. Localisation involving a single robot is a widely explored and documented problem in mobile robotics. The basic idea behind most documented localisation techniques involves the optimal combination of noisy and uncertain information that comes from the robot's various sensors. However, many complex robotic applications require multiple robots to work together and share information among themselves in order to successfully and efficiently accomplish certain tasks. This leads to research in collaborative localisation involving multiple robots. Several studies have shown that when multiple robots collaboratively localise themselves, the resulting accuracy in their estimated positions and orientations outperforms that of a single robot, especially in scenarios where robots do not have access to information about their surrounding environment. This thesis presents the main theme of most of the existing collaborative, multi-robot localisation solutions, and proposes an alternative or complementary solution to some of the existing challenges in multirobot localisation. Specifically, in this thesis, a heuristically tuned Extended Kalman Filter is proposed to localise a group of mobile robots. Simulations show that when certain conditions are met, the proposed tuning method significantly improves the accuracy and reliability of poses estimated by the Extended Kalman Filter. Real-world experiments performed on custom-made robotic platforms validate the simulation results.
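The abstract does not spell out the filter equations, but the standard Extended Kalman Filter cycle it builds on is compact enough to sketch. The generic predict/update step below is a minimal reference implementation, not the thesis's code; the function names and interface are generic, and the heuristic tuning described above would typically act on the noise covariances Q and R rather than on the equations themselves:

```python
# Minimal generic EKF predict/update step; the state, models and noise
# covariances are supplied by the caller.
import numpy as np

def ekf_predict(x, P, f, F_jac, Q, u):
    """Propagate state estimate x and covariance P through motion model f."""
    F = F_jac(x, u)                 # Jacobian of f at the current estimate
    x_pred = f(x, u)
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_update(x_pred, P_pred, z, h, H_jac, R):
    """Correct the prediction with measurement z and observation model h."""
    H = H_jac(x_pred)               # Jacobian of h at the predicted state
    y = z - h(x_pred)               # innovation
    S = H @ P_pred @ H.T + R        # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_new, P_new

# Toy usage: a robot moving on a line, observed directly with noise.
f = lambda x, u: x + u                      # motion: position += commanded step
F_jac = lambda x, u: np.array([[1.0]])
h = lambda x: x                             # measurement: noisy position
H_jac = lambda x: np.array([[1.0]])
x, P = np.array([0.0]), np.array([[1.0]])
Q, R = np.array([[0.05]]), np.array([[0.5]])
x, P = ekf_predict(x, P, f, F_jac, Q, u=np.array([1.0]))
x, P = ekf_update(x, P, z=np.array([1.2]), h=h, H_jac=H_jac, R=R)
print(x, P)
```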
325

Zhodnocení nástrojů provozní analýzy jako podklad pro optimalizaci provozu / Evaluation of operating analysis tools as a basis for traffic optimization

Hrstka, Daniel January 2013 (has links)
The goal of the thesis is to explore and evaluate the tools currently used for operating analysis, which serves as one of the bases for optimizing public transport operations, and, where possible, to suggest new tools or improvements to the existing ones. Another element of the thesis is examining quality standards and determining, from the collected data, whether these standards are being met. First, the existing tools of operating analysis are described and evaluated. After possible shortcomings are identified, new outputs or tools are created, or the existing ones are adjusted where needed. Everything is based on real data collected from traffic surveys carried out in 2008 and 2012 in selected cities. The thesis then focuses on the quality standards and whether they are being met, using graphs, text, and figures for better understanding. Again, everything is based on real data obtained either from the selected transport companies or from traffic surveys.
326

Interprofessional Collaboration in the Operating Room: A Nursing Perspective

Levesque, Marie-Julie 28 September 2021 (has links)
The aim of this thesis was to examine the contribution of nurses to interprofessional collaboration (IPC) in the operating room (OR), guided by the Interprofessional Education Collaborative Patient Care Practice (IECPCP) framework. First, a secondary analysis of interviews with 19 registered nurses was conducted, and twenty emergent themes were identified. Of the framework's four dimensions (internalization; shared goals and vision; governance; and formalization), the most prevalent was internalization, relating to human interaction and the sense of belonging within the interprofessional team. A scoping review then identified 20 studies evaluating four interventions (briefings, checklists, team training, and debriefings) used to improve IPC in the OR. Despite weak study designs, these interventions showed improvements in communication, teamwork, and safety outcomes. OR nurses contribute mainly through interactional processes, and they require organizational support to foster their efforts in IPC. Nurses are involved in all IPC interventions, and their contribution is important for supporting IPC in the OR.
327

Vlastnosti systému reálného času v LabVIEW / Features of the real-time operating system for LabVIEW

Válek, Petr January 2012 (has links)
This thesis deals with selected properties of the real-time operating systems (RTOS) Ardence PharLap ETS and WindRiver VxWorks and of the LabVIEW RT module. Three methodologies for comparing RTOS quality are proposed, two of which are tested in practice in the described experiments on CompactRIO hardware from NI. Furthermore, the need for using an RTOS is explained, and an experimental comparison of the influence of jitter on total harmonic distortion (THD), evaluated using two different definitions of THD, is presented.
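The two common definitions of THD alluded to above differ only in the denominator: one references the fundamental component (often written THD_F), the other the total RMS of the signal (THD_R). A small sketch, using made-up harmonic amplitudes rather than the thesis's measurements, shows the difference:

```python
# THD computed from harmonic amplitudes under the two usual definitions.
# The amplitude values below are made up for illustration.
import math

def thd_f(amplitudes):
    """THD referenced to the fundamental: sqrt(sum of harmonics^2) / fundamental."""
    fundamental, harmonics = amplitudes[0], amplitudes[1:]
    return math.sqrt(sum(a * a for a in harmonics)) / fundamental

def thd_r(amplitudes):
    """THD referenced to the total RMS of all components (always <= 1)."""
    harmonics = amplitudes[1:]
    total = math.sqrt(sum(a * a for a in amplitudes))
    return math.sqrt(sum(a * a for a in harmonics)) / total

# Fundamental plus a few harmonics (arbitrary units).
spectrum = [1.0, 0.10, 0.05, 0.02]
print(f"THD_F = {thd_f(spectrum):.4f}")   # ~0.1136
print(f"THD_R = {thd_r(spectrum):.4f}")   # ~0.1129
```

For low distortion the two values nearly coincide; they diverge as the harmonic content grows, which is why it matters which definition a measurement reports.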
328

Návrh pracovního bodu odstředivého čerpadla / The Operating Point of the Centrifugal Pump.

Konečná, Kateřina January 2008 (has links)
The aim of the diploma thesis The Operating Point of the Centrifugal Pump was to create a program for analyzing pumping systems that would serve as an effective tool when designing such systems. The program is divided into two parts: the system characteristic and the pump performance characteristic. The subject of the thesis is the creation of a pump database, the entry of performance characteristics, and subsequent work with them. The program allows pump regulation by speed variation and the cooperation of two pumps operating in parallel or in series. The result of the program is a pumping system designed for the entered parameters with more economical operation.
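The operating point referred to above is the flow rate at which the pump curve and the system curve intersect. A minimal sketch, assuming the usual quadratic forms for both curves (the coefficients below are invented, not taken from the thesis's database), finds that intersection and also shows the effect of running two identical pumps in parallel:

```python
# Operating point of a centrifugal pump: intersection of the pump curve
# H_pump(Q) = H0 - a*Q^2 with the system curve H_sys(Q) = H_stat + k*Q^2.
# Coefficients are illustrative only.
import math

def operating_point(H0, a, H_stat, k):
    """Return (Q, H) where the pump and system curves intersect."""
    if H0 <= H_stat:
        raise ValueError("pump shut-off head is below the static head")
    Q = math.sqrt((H0 - H_stat) / (a + k))
    return Q, H_stat + k * Q * Q

# Single pump (H in metres, Q in m3/h; a and k lump the quadratic losses).
H0, a = 40.0, 0.02          # shut-off head and pump-curve coefficient
H_stat, k = 10.0, 0.03      # static head and system-curve coefficient
Q1, H1 = operating_point(H0, a, H_stat, k)

# Two identical pumps in parallel deliver twice the flow at the same head,
# so the combined curve is H = H0 - a*(Q/2)^2 = H0 - (a/4)*Q^2.
Q2, H2 = operating_point(H0, a / 4.0, H_stat, k)

print(f"single pump:     Q = {Q1:.1f} m3/h at H = {H1:.1f} m")
print(f"two in parallel: Q = {Q2:.1f} m3/h at H = {H2:.1f} m")
```

Note that the parallel arrangement increases the delivered flow by less than a factor of two, because the system curve steepens the combined operating point; this is the kind of comparison such a program is meant to make quick.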
329

An operating strategy of run-of-river abstractions for typical rural water supply schemes using Siloam Village as a case study

Makungo, Rachel 10 1900 (has links)
MESHWR / Department of Hydrology and Water Resources / See the attached abstract below
330

PRACTICAL CLOUD COMPUTING INFRASTRUCTURE

James A Lembke (10276463) 12 March 2021 (has links)
Cloud and parallel computing are fundamental components in the processing of large data sets. Deployments of distributed computers require network infrastructure that is fast, efficient, and secure. Software Defined Networking (SDN) separates the forwarding of network data by switches (data plane) from the setting and managing of network policies (control plane). While this separation provides flexibility for setting network policies affecting the establishment of network flows in the data plane, it provides little to no fault tolerance for failures, either benign or caused by corrupted/malicious applications. Such failures can cause network flows to be incorrectly routed through the network or stop such flows altogether. Without protection against faults, cloud network providers using SDN run the risk of inefficient allocation of network resources or even data loss. Furthermore, the asynchronous nature of existing SDN protocols does not provide a mechanism for consistency in network policy updates across multiple switches.

In addition, cloud and parallel applications require an efficient means for accessing local system data (input data sets, temporary storage locations, etc.). While in many cases it may be possible for a process to access this data by making calls directly to a file system (FS) kernel driver, this is not always possible (e.g. when using experimental distributed FSs where the needed libraries for accessing the FS only exist in user space).

This dissertation provides a design for fault tolerance of SDN and infrastructure for advancing the performance of user space FSs. It is divided into three main parts. The first part describes a fault tolerant, distributed SDN control plane framework. The second part expands upon the fault tolerant approach to the SDN control plane by providing a practical means for dynamic control plane membership as well as a simple mechanism for controller authentication through threshold signatures. The third part describes an efficient framework for user space FS access.

This research makes three contributions. First, the design, specification, implementation, and evaluation of a method for a fault tolerant SDN control plane that is interoperable with existing control plane applications and involves minimal instrumentation of the data plane runtime. Second, the design, specification, implementation, and evaluation of a mechanism for dynamic SDN control plane membership that ensures consistency of network policy updates and minimizes switch overhead through the use of distributed key generation and threshold signatures. Third, the design, specification, implementation, and evaluation of a user space FS access framework that is correct to the Portable Operating System Interface (POSIX) specification and offers significantly better performance than existing user space access methods, while requiring no implementation changes for application programmers.
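The controller-authentication part of this design rests on (t, n) threshold cryptography: any t of the n controllers can jointly act, while fewer cannot. The dissertation uses distributed key generation and threshold signatures; the toy sketch below only illustrates the underlying threshold idea with Shamir secret sharing over a small prime field (the prime, threshold and share count are arbitrary examples, and a real deployment would use an established threshold-signature library rather than ever reconstructing the key in one place):

```python
# Toy (t, n) threshold sharing with Shamir's scheme over a prime field.
# Illustrative only; not the dissertation's protocol.
import random

P = 2**61 - 1  # a Mersenne prime, large enough for the toy example

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

key = 123456789
shares = make_shares(key, t=3, n=5)
assert reconstruct(shares[:3]) == key          # any 3 shares recover the key
assert reconstruct(shares[1:4]) == key
print("threshold reconstruction OK")
```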
