  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
151

Heat Units and Stages of Plant Development

Fisher, W. D., Pegelow, E. J. 03 1900 (has links)
The 1985 and 1986 Cotton Reports have the same publication and P-Series numbers.
152

Relationships among patient characteristics, care processes, and outcomes for patients in coronary care units (CCUs)

Chao, Shir-Ley January 1988 (has links)
The purpose of this research was to describe the relationships among patient characteristics, care processes, and care outcomes for patients in a coronary care unit (CCU). The sample consisted of 179 CCU patients. Data collectors reviewed charts and retrieved the information needed to measure the operational variables: APACHE II score (Acute Physiology and Chronic Health Evaluation II), age in years, CCU length of stay, nurse-to-patient ratio, and mortality. Descriptive statistics were used to analyze the demographic data on patient characteristics, and correlational statistics were used to analyze the five operational variables in the "CCU Patient Outcomes Model." Pearson correlations revealed significant positive relationships of APACHE II score with both age and nurse-to-patient ratio. Point-biserial correlations revealed significant positive relationships of mortality with both APACHE II score and nurse-to-patient ratio. Patient characteristics were related to care processes, and patient characteristics and care processes were related to patient outcomes.
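A point-biserial correlation is mathematically a Pearson correlation in which one variable is dichotomous, so both statistics used in this abstract can be computed with the same formula. A minimal sketch in Python, with fabricated illustrative data (none of these numbers come from the study):

```python
import math

def pearson(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative, fabricated data (NOT from the study):
apache = [12, 25, 8, 30, 18, 22]       # APACHE II severity scores
age    = [54, 71, 48, 80, 62, 69]      # years
died   = [0, 1, 0, 1, 0, 1]            # binary mortality indicator

print(round(pearson(apache, age), 3))  # Pearson correlation
print(round(pearson(apache, died), 3)) # point-biserial = Pearson with a binary variable
```

Because the mortality indicator takes only the values 0 and 1, the second call yields exactly the point-biserial coefficient without needing a separate formula.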
153

Experimental pool boiling investigation of FC-72 on silicon with artificial cavities, integrated temperature micro-sensors and heater

Hutter, Christian January 2010 (has links)
Today nucleate boiling is widely used in numerous industrial applications, such as cooling processes, because of the high heat transfer rates achieved at low temperature differences. It remains a possible cooling solution for the next generation of central processing units (CPUs), which dissipate heat fluxes exceeding the capabilities of today's conventional forced-air cooling. However, nucleate boiling is a very complex and elusive process involving many mechanisms that are not yet fully understood, and a comprehensive model is still missing. For this study a new experimental setup was designed, constructed and commissioned to investigate bubble nucleation, growth, departure and interaction during nucleate pool boiling from a silicon device fully immersed in fluorinert FC-72. The location of bubble nucleation is controlled by artificial cavities etched into the silicon substrate. Boiling is initiated with a heater integrated on the back, and micro-sensors indicate the wall temperature at the bubble nucleation site. During this work three different silicon test section designs were fabricated and boiling experiments on these substrates were successfully conducted. Bubble growth, bubble departure frequencies and bubble departure diameters were measured from high-speed imaging sequences for artificial cavities of different dimensions, varied pressures and increasing wall temperatures. Bubble interactions such as vertical and horizontal coalescence were visualised and their impact on the boiling heat transfer investigated. The influence of the spacing between two neighbouring artificial cavities on bubble nucleation and departure frequencies, vertical coalescence frequencies and departure diameters was analysed. The acquired data are used as input for a numerical code developed by our collaborators (Brunel University, UK and Los Alamos National Laboratories, USA) and are a first step to validate the code.
The code studies the interactions between bubble nucleation sites on solid surfaces as a network. The simulations will help design boiling substrates utilised for chip cooling applications with optimal artificial cavity distribution to maximise the cooling heat transfer.
154

DNA barcodes and meiofaunal identification

Mann, Jenna D. January 2010 (has links)
In recent years there has been a desire to definitively catalogue the life on our planet. In light of the increasing extinction rates driven by human activities, it is unlikely that this will be achieved using traditional methods. Whilst most organisms with a body size of more than 1 cm have been described, the vast majority of animal life is smaller than this; these organisms, collectively known as meiofauna, are yet to be catalogued. Meiofaunal organisms present a range of problems for traditional taxonomy. Firstly, they are microscopic, meaning that morphological features are often difficult to resolve. Secondly, they often exhibit cryptic diversity, meaning that different species look the same. Thirdly, the organisms are often poorly described in the literature, making it very difficult to confirm an identification, assuming the species has been described at all. It is possible, however, to obtain DNA sequences from these organisms. DNA barcoding, the use of short sequences of DNA to identify individuals, is now commonly used in a wide range of applications. It has been proposed that a single target gene should be sufficient to describe all organisms this way. Barcodes can be acquired from individuals or from bulk extractions from environmental samples. In the latter case, many of the sequences obtained are novel and unlikely ever to have a type specimen associated with them. When this is the case, assessing the diversity of a sample becomes a computational exercise. However, as yet there is no agreed standard method for analysing the barcodes produced; indeed, most methods currently employed lack objectivity. This thesis investigates the efficiency of a range of gene targets and analysis methods for DNA barcoding, with an emphasis on meiofaunal organisms (nematodes, tardigrades and thrips). DNA barcodes were generated for up to three genes for each specimen.
Sequences for each gene were analysed using two programs, MOTU_define.pl and DOTUR. These programs use different methods to assign sequences to operational taxonomic units (OTUs), which were then compared. An objective method such as that implemented in MOTU_define.pl, which relies only on the information contained in the sequences and does not attempt to apply evolutionary models to the data before inferring taxa, was found to be most suitable for designating taxa. In addition to barcoding, some samples were pre-processed using video capture and editing (VCE). This creates a virtual slide of a specimen so that a sequence can be linked to a morphological identification. VCE proved to be an efficient method of preserving morphological data from specimens.
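The abstract does not spell out the algorithm used by MOTU_define.pl, but the general idea of assigning sequences to OTUs by a pairwise distance cutoff can be sketched as follows: an illustrative single-linkage clustering over aligned, equal-length sequences, with an invented cutoff and invented data.

```python
def hamming_fraction(a, b):
    """Fraction of mismatched positions between two aligned, equal-length sequences."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def assign_otus(seqs, cutoff=0.03):
    """Single-linkage clustering: a sequence joins an OTU if it lies within
    `cutoff` distance of any member (union-find over all pairs)."""
    parent = list(range(len(seqs)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(len(seqs)):
        for j in range(i + 1, len(seqs)):
            if hamming_fraction(seqs[i], seqs[j]) <= cutoff:
                parent[find(i)] = find(j)

    return {find(i) for i in range(len(seqs))}  # one representative index per OTU

seqs = ["ACGTACGTAC", "ACGTACGTAT", "TTGGCCAATT"]
print(len(assign_otus(seqs, cutoff=0.2)))  # 2 OTUs: the first two sequences cluster
```

This captures why such methods are objective: the grouping depends only on the sequences and a fixed threshold, with no model-based inference step.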
155

Explaining anomalies : an approach to anomaly-driven revision of a theory

Moss, Laura Elizabeth January 2010 (has links)
This thesis focuses on the explanation of anomalies as an approach to anomaly-driven revision of a theory. An anomaly is identified when a theory (or model of a domain) does not accurately reflect a domain observation, indicating that the theory (or model) requires refinement. In some cases an explanation can be generated for an anomalous observation using existing domain knowledge and hence a revision to the existing theory can be provided. Ontologies have been used in both stages of an investigation presented in this thesis; in the first stage, a domain ontology and expert-acquired strategies have been used as part of a knowledge-based system, EIRA (Explaining, Inferencing, and Reasoning about Anomalies), to generate explanations for an anomaly; in the second stage, domain ontologies have been used to suggest refinements to an incorrect or incomplete domain ontology. In the first stage of the investigation, extensive interviews were held with domain experts; the analysis of which led to the identification of both examples of anomalies encountered in the domain and the strategies used by the domain experts to provide (appropriate) explanations for the anomalies. EIRA is able to replicate these explanations; when EIRA is presented with an anomaly, potential explanations are generated by the application of expert-acquired strategies to the domain knowledge, patient data, and information about the clinical situation. To evaluate this approach, EIRA has been applied in the Intensive Care Unit (ICU) domain and ICU clinicians have evaluated the explanations produced by EIRA. The strategies used by EIRA have been abstracted further to form generic strategies for anomaly resolution. In the second stage, EIRA has been extended to investigate the use of domain ontologies to suggest refinements to an incomplete or incorrect ontology. These additional refinements are generated by reasoning about analogous concepts from the domain ontology. 
The findings described in this thesis support the belief that ontologies can be used to generate explanations to refine a theory and, further, that the extensive domain knowledge contained in an ontology allows for sophisticated refinements of a knowledge base. Previous approaches to theory revision have largely focused on the refinement of an instantiated rule base, in which limited domain knowledge is incorporated in the rules and hence the refinements are essentially captured in a particular knowledge base. In these earlier approaches, refinements to remove the anomaly were generally suggested after applying machine learning techniques to data from the domain; however, this process requires large datasets, the refinements generated are not always acceptable to domain experts, and providing explanations (using an ontology) to account for anomalies had not been investigated. I believe that the findings reported in this thesis are significant and make a number of contributions, including a novel approach to anomaly-driven revision of a theory.
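As an illustration only (EIRA's actual data structures and strategies are not given in this abstract), strategy-based explanation generation can be sketched as a set of expert-acquired strategy functions applied to an anomaly and its clinical context; every name and rule below is hypothetical.

```python
# Hypothetical sketch of strategy-based anomaly explanation (all names invented;
# this is not EIRA's published implementation).

def sensor_fault_strategy(anomaly, context):
    """Suggest a measurement artefact when the sensor was not recently calibrated."""
    if context.get("recent_calibration") is False:
        return f"{anomaly['variable']} reading may reflect an uncalibrated sensor"

def intervention_strategy(anomaly, context):
    """Suggest a recent clinical intervention whose known effects match the anomaly."""
    for drug in context.get("recent_drugs", []):
        if anomaly["variable"] in context["drug_effects"].get(drug, []):
            return f"{anomaly['variable']} change may be explained by recent {drug}"

STRATEGIES = [sensor_fault_strategy, intervention_strategy]

def explain(anomaly, context):
    """Apply each strategy; keep the explanations that fire."""
    return [e for s in STRATEGIES if (e := s(anomaly, context))]

anomaly = {"variable": "blood pressure", "direction": "drop"}
context = {
    "recent_calibration": True,
    "recent_drugs": ["propofol"],
    "drug_effects": {"propofol": ["blood pressure", "heart rate"]},
}
print(explain(anomaly, context))
# → ['blood pressure change may be explained by recent propofol']
```

The point of the sketch is the architecture: strategies are independent of any particular rule base, so new domain knowledge (e.g. from an ontology) extends the explanations without rewriting the rules.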
156

Understanding the water balance of basement complex areas in Sokoto Basin, North-West Nigeria for improved groundwater management

Abubakar Gada, Murtala January 2014 (has links)
Understanding water balances is essential for sustainable water resource management, especially in semi-arid basement complex areas where there are large demands for water supplies but complex hydrogeological conditions limit groundwater development. This research presents an approach to water balance estimation based on the conceptual and computational modelling of six major landscape units, classified on the basis of their differing hydrological responses. Detailed conceptual models of the hydrological processes operating in each of the different landscapes in a catchment, and of the processes that control water movement between them, are developed based on data analysis, fieldwork and literature. Two computational models (the WaSim soil water balance model and a new water balance model for bare rock) are used to estimate the daily water balance of each of the landscape units, taking cognisance of their interconnectivity, which includes runoff becoming run-on. Water balance simulations were run for the individual landscapes using input data from the semi-arid Sokoto Basin in Northwest Nigeria, and outputs for representative wet and dry years are used to demonstrate the reliability of the model responses. The individual landscape outputs were subsequently integrated, taking account of their area-weighted contributions, to give a catchment-scale water balance which compares favourably with the observed river discharge at Fokku. The catchment water balance results reveal that AET accounts for the largest loss in the catchment, at 72% of the 37-year average rainfall, followed by groundwater flow to rivers and then runoff to rivers, representing 16% and 11% of the average rainfall respectively. This research has provided valuable insights into the hydrological behaviour of the basement complex system and the effect of landscape variability on the water balance of these areas. 
The research suggests a rational approach to groundwater resource management in basement complex areas that takes cognisance of the hydrological behaviour of the different landscape units, focussing on the areas with deep weathered material within the catchment identified in this research. The research specifically stresses the need to apply methods of water conservation during periods of excess rainfall for use in the dry season.
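The area-weighted integration step described above amounts to a weighted average of each flux term over the landscape units. A minimal sketch, with invented unit names, areas and fluxes (these are not the Sokoto Basin values):

```python
# Sketch of the area-weighted integration step; all numbers are illustrative.
units = {
    # name: (area_km2, {"aet": mm/yr, "runoff": mm/yr, "gw_flow": mm/yr})
    "bare rock":        (120.0, {"aet": 300.0, "runoff": 180.0, "gw_flow": 20.0}),
    "weathered slopes": (480.0, {"aet": 550.0, "runoff": 60.0,  "gw_flow": 120.0}),
    "valley alluvium":  (200.0, {"aet": 620.0, "runoff": 30.0,  "gw_flow": 150.0}),
}

def catchment_balance(units):
    """Area-weighted average of each water-balance term across landscape units."""
    total_area = sum(area for area, _ in units.values())
    terms = {}
    for area, fluxes in units.values():
        for term, mm in fluxes.items():
            terms[term] = terms.get(term, 0.0) + mm * area / total_area
    return terms

print(catchment_balance(units))
# catchment-scale aet / runoff / gw_flow in mm/yr
```

A real application would also route runoff from upslope units as run-on to downslope units before aggregating, as the conceptual models in the research describe.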
157

Testování anglických kolokací u českých studentů angličtiny / Testing English collocations in Czech learners of English

Vaňková, Lenka January 2012 (has links)
The present study focuses on English collocations and their testing in Czech learners of English. Since the importance of collocations, and of phraseology as a linguistic discipline, has been recognised only recently, these concepts are introduced from a general point of view. First, attention is given to the history of phraseology and its underlying principles, as well as to the major approaches to it. Second, phraseological units are described in terms of their categorisation and of some influential typologies of these units. The main focus of interest is collocations: the major approaches to them, the criteria commonly used to describe them and to delimit them from other types of prefabricated units, and their definition and classification. Previous research on collocations in learner English is also outlined. The analytical part (based on Granger's (1998) study) analyses the results of testing English collocations in Czech learners of English. The main focus is on whether these results confirm or contradict the results of Granger's study, as well as on the learners' knowledge of and ability to use collocations, particularly on their (non-)attainment of the native "ideal."
158

Application of stream processing to hydraulic network solvers

24 October 2011 (has links)
M.Ing. / The aim of this research was to investigate the use of stream processing on the graphics processing unit (GPU) and to apply it to the hydraulic modelling of a water distribution system. The stream processing model was programmed and compared to the conventional sequential programming platform, namely the CPU. The use of the GPU as a parallel processor has been widely adopted in many non-graphics applications, and the benefits of implementing parallel processing in these fields have been significant. GPUs have the capacity to perform billions to trillions of floating-point operations per second using programmable shader programs. These great advances in GPU architecture have been driven by the gaming industry and the demand for better gaming experiences, and the computational performance of the GPU is now much greater than that of CPU processors. Hydraulic modelling has become vital to the construction of new water distribution systems, because water distribution networks are complex and nonlinear in nature; furthermore, modelling can anticipate and prevent problems in a system without physically building it. The hydraulic model used was the Gradient Method, the hydraulic model employed in the EPANET software package. The Gradient Method produces a linear system which is both positive definite and symmetric. The Cholesky method is currently used in the EPANET algorithm to solve the linear equations produced by the Gradient Method. Thus, a linear solution method had to be selected that is suitable both for parallel processing on the GPU and for use in a hydraulic network solver. The Conjugate Gradient algorithm was selected as ideal, since it works well with the hydraulic solver and could be converted into a parallel algorithm on the GPU. 
The Conjugate Gradient Method is one of the best-known iterative techniques for solving sparse symmetric positive-definite linear systems. It was implemented both in the sequential programming model and in the stream processing model, using the CPU and the GPU respectively, on two different computer systems. The Cholesky method was also programmed in the sequential model on both systems, and the Cholesky and Conjugate Gradient Methods were compared in order to evaluate them relative to each other. The findings of this study show that stream processing on the GPU can be used to perform general-purpose algorithms on the parallel GPU architecture. The results further affirm that iterative linear solution methods should only be used for large linear systems.
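The Conjugate Gradient iteration chosen in this work can be sketched in pure Python; this is a small dense illustration of the standard algorithm, not the thesis's GPU code (which targets large sparse systems).

```python
import math

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A (dense lists, for illustration)."""
    n = len(b)
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    matvec = lambda M, v: [dot(row, v) for row in M]

    x = [0.0] * n
    r = [bi - axi for bi, axi in zip(b, matvec(A, x))]  # residual r = b - A x
    p = r[:]                                            # initial search direction
    rs_old = dot(r, r)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rs_old / dot(p, Ap)                     # step length along p
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        if math.sqrt(rs_new) < tol:
            break
        p = [ri + (rs_new / rs_old) * pi for ri, pi in zip(r, p)]  # new conjugate direction
        rs_old = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
print(conjugate_gradient(A, b))  # ≈ [1/11, 7/11] ≈ [0.0909, 0.6364]
```

Each iteration consists only of dot products, vector updates and one matrix-vector product, all of which are data-parallel; this is why CG maps naturally onto GPU stream processing, whereas Cholesky factorisation is inherently more sequential.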
159

Sistema simulador del sonar de los submarinos. Una aplicación para la Marina de Guerra del Perú / Submarine sonar simulator system: an application for the Peruvian Navy

Arias Bailly, Bruno January 2008 (has links)
This thesis proposes the development of simulation software that provides training closer to reality for sonar operators, both in training and qualified, with the objective of presenting the operation of the sonar equipment on board the submarine units, as well as its interaction with the existing simulation modules of the attack simulator of the Submarine Force of the Peruvian Navy, located at the Submarine School. The thesis is divided into four parts: the introduction to the proposed subject and its problems; the theoretical framework, covering the variables that define the scope of this work; the state of the art, which presents a particular case of sonar simulation software built in another programming language that bears no similarity to the sonar equipment on board the submarine units; and finally the proposal for the development, improvement and deployment of a sonar simulator that fulfils the functionality of the original on-board equipment and integrates with the existing simulation system. The hypothesis is that it is possible to develop sonar simulation software that reproduces the functionality of the original electronic equipment. This is demonstrated throughout the thesis, achieving the objective: to show that it is possible to build a sonar simulator that displays contacts, generates the synthetic propeller noise, sends the necessary data to the other simulation modules, and simulates the self-tests contained in the original equipment.
160

Erfarenhet av debriefing inom akutsjukvården / Experience of debriefing in emergency care

Bjurlefält, Peter January 2019 (has links)
Background: In everyday life the world over, people of all ages and genders are affected by accidents, deaths and other traumatic events. Some professions carry a greater risk of exposure to traumatic events; examples are ambulance and rescue personnel, police, and healthcare staff, primarily in emergency care. These events can sometimes overwhelm the individual. One way to counteract burnout and post-traumatic stress disorder (PTSD) in healthcare staff is to carry out debriefing sessions with the staff who were involved in a traumatic event.   Aim: The aim of this study was to describe the experience of debriefing among healthcare staff in emergency care in connection with traumatic events in their professional practice.   Method: A literature study comprising 10 scientific articles, five qualitative and five quantitative.   Results: The results show that debriefing is perceived as a positive tool for emotional relief after a traumatic event for healthcare staff in emergency care. The main factors raised in the study are the time and place of debriefing, the impact of debriefing on communication between healthcare staff, the choice of debriefing leader, and the need for well-designed debriefing guidelines.   Conclusion: The study demonstrates that debriefing should be regarded as an effective tool for counteracting mental ill-health in healthcare staff, and that debriefing also reinforces communication between the different professional groups in emergency care. The study also shows that well-designed guidelines are important if the purpose of the debriefing is to be achieved.
