351

Provision of Hospital-based Palliative Care and the Impact on Organizational and Patient Outcomes

Roczen, Marisa L 01 January 2016 (has links)
Hospital-based palliative care services aim to streamline medical care for patients with chronic and potentially life-limiting illnesses by focusing on individual patient needs and efficient use of hospital resources, and by providing guidance for patients, patients’ families, and clinical providers toward making optimal decisions concerning a patient’s care. This study examined the nature of palliative care provision in U.S. hospitals and its impact on selected organizational and patient outcomes, including hospital costs, length of stay, in-hospital mortality, and transfer to hospice. Hospital costs and length of stay are viewed as important economic indicators. Specifically, lower hospital costs may increase a hospital’s profit margin, and shorter lengths of stay can enable patient turnover and efficiency of care. Higher rates of hospice transfers and lower in-hospital mortality may be considered positive outcomes from a patient perspective, as the majority of patients prefer to die at home or outside of the hospital setting. Several data sources were utilized to obtain information about patient, hospital, and county characteristics; patterns of hospitals’ palliative care provision; and patients’ hospital costs, length of stay, in-hospital mortality, and transfer to hospice (if a patient survived hospitalization). The study sample consisted of 3,763,339 patients; 348 urban, general, short-term, acute care, non-federal hospitals; and 111 counties located in six states over a 5-year study period (2007-2011). Hospital-based palliative care provision was measured by the presence of three palliative care services: inpatient palliative care consultation services (PAL), inpatient palliative care units (IPAL), and hospice programs (HOSPC). Derived from Institutional Theory, Resource Dependence Theory, and Donabedian’s Structure-Process-Outcome framework, 13 hypotheses were tested using a hierarchical (generalized) linear modeling approach. The study findings suggested that hospital size was associated with a higher probability of hospital-based palliative care provision. Conversely, the presence of palliative care services through a hospital’s health system, network, or joint venture was associated with a lower probability of hospital-based palliative care provision. The study findings also indicated that hospitals with an IPAL or HOSPC incurred lower hospital costs, whereas hospitals with PAL incurred higher hospital costs. The presence of PAL, IPAL, and HOSPC was generally associated with a lower probability of in-hospital mortality and transfer to hospice. Finally, the effects of hospital-based palliative care services on length of stay were mixed, and further research is needed to understand this relationship.
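The abstract above describes testing hypotheses with hierarchical (generalized) linear models of patient outcomes nested within hospitals. As a rough illustration of that kind of two-level model, and not the dissertation's actual specification, the sketch below fits a random-intercept model of log hospital cost on a hypothetical palliative-care indicator using statsmodels; all variable names and data are simulated.

```python
# Illustrative sketch only: patients (level 1) nested in hospitals (level 2),
# with a hypothetical hospital-level palliative-care indicator. Simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_hosp, n_pat = 40, 50
hosp = np.repeat(np.arange(n_hosp), n_pat)
has_ipal = rng.integers(0, 2, n_hosp)[hosp]      # hospital offers an inpatient palliative unit
age = rng.normal(70, 10, n_hosp * n_pat)
hosp_effect = rng.normal(0, 0.3, n_hosp)[hosp]   # random hospital intercept
log_cost = 9.0 - 0.15 * has_ipal + 0.01 * age + hosp_effect + rng.normal(0, 0.5, n_hosp * n_pat)

df = pd.DataFrame({"log_cost": log_cost, "ipal": has_ipal, "age": age, "hospital": hosp})

# Random-intercept linear mixed model of log cost
model = smf.mixedlm("log_cost ~ ipal + age", df, groups=df["hospital"]).fit()
print(model.summary())
```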
352

A Comparison of Flare Forecasting Methods. III. Systematic Behaviors of Operational Solar Flare Forecasting Systems

Leka, K.D., Park, S-H., Kusano, K., Andries, J., Barnes, G., Bingham, S., Bloomfield, D.S., McCloskey, A.E., Delouille, V., Falconer, D., Gallagher, P.T., Georgoulis, M.K., Kubo, Y., Lee, K., Lee, S., Lobzin, V., Mun, J., Murray, S.A., Nageem, T.A.M.H., Qahwaji, Rami S.R., Sharpe, M., Steenburgh, R., Steward, G., Terkilsden, M. 08 October 2019 (has links)
A workshop was recently held at Nagoya University (31 October – 02 November 2017), sponsored by the Center for International Collaborative Research at the Institute for Space-Earth Environmental Research, Nagoya University, Japan, to quantitatively compare the performance of today’s operational solar flare forecasting facilities. Building upon Paper I of this series (Barnes et al. 2016), in Paper II (Leka et al. 2019) we described the participating methods for this latest comparison effort and the evaluation methodology, and presented quantitative comparisons. In this paper we focus on the behavior and performance of the methods when evaluated in the context of broad implementation differences. Acknowledging the short testing interval and the small number of methods available, we do find that forecast performance: 1) appears to improve by including persistence or prior flare activity, region evolution, and a human “forecaster in the loop”; 2) is hurt by restricting data to disk-center observations; 3) may benefit from long-term statistics, but mostly when combined with modern data sources and statistical approaches. These trends are arguably weak and must be viewed with numerous caveats, as discussed both here and in Paper II. Following this present work, we present in Paper IV a novel analysis method to evaluate temporal patterns of forecasting errors of both types (i.e., misses and false alarms; Park et al. 2019). Hence, most importantly, with this series of papers we demonstrate the techniques for facilitating comparisons in the interest of establishing performance-positive methodologies. / We wish to acknowledge funding from the Institute for Space-Earth Environmental Research, Nagoya University, for supporting the workshop and its participants. We would also like to acknowledge the “big picture” perspective brought by Dr. M. Leila Mays during her participation in the workshop. K.D.L. and G.B. acknowledge that the DAFFS and DAFFS-G tools were developed under NOAA SBIR contracts WC-133R-13-CN-0079 (Phase-I) and WC-133R-14-CN-0103 (Phase-II) with additional support from Lockheed-Martin Space Systems contract #4103056734 for Solar-B FPP Phase E support. A.E.McC. was supported by an Irish Research Council Government of Ireland Postgraduate Scholarship. D.S.B. and M.K.G. were supported by the European Union Horizon 2020 research and innovation programme under grant agreement No. 640216 (FLARECAST project; http://flarecast.eu). M.K.G. also acknowledges research performed under the A-EFFort project and subsequent service implementation, supported under ESA Contract number 4000111994/14/D/MPR. S.A.M. is supported by the Irish Research Council Postdoctoral Fellowship Programme and the US Air Force Office of Scientific Research award FA9550-17-1-039. The operational Space Weather services of ROB/SIDC are partially funded through the STCE, a collaborative framework funded by the Belgian Science Policy Office.
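The comparisons in this series rest on quantitative verification of probabilistic flare forecasts. As a minimal, hypothetical illustration of two widely used verification measures (the papers themselves employ a broader battery of metrics on real operational forecasts), the sketch below computes a Brier score and a True Skill Statistic on simulated forecasts.

```python
# Illustrative sketch: Brier score and True Skill Statistic for binary flare
# forecasts. The forecast probabilities and observed events are made up.
import numpy as np

rng = np.random.default_rng(1)
prob = rng.uniform(0, 1, 1000)                          # hypothetical forecast probabilities
event = (rng.uniform(0, 1, 1000) < prob).astype(int)    # hypothetical observed flares

# Brier score: mean squared difference between forecast probability and outcome
brier = np.mean((prob - event) ** 2)

# True Skill Statistic at a 50% probability threshold
pred = prob >= 0.5
hits = np.sum(pred & (event == 1))
misses = np.sum(~pred & (event == 1))
false_alarms = np.sum(pred & (event == 0))
correct_nulls = np.sum(~pred & (event == 0))
tss = hits / (hits + misses) - false_alarms / (false_alarms + correct_nulls)

print(f"Brier score: {brier:.3f}, TSS: {tss:.3f}")
```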
353

Data Analysis Using Experimental Design Model Factorial Analysis of Variance/Covariance (DMAOVC.BAS)

Newton, Wesley E. 01 May 1985 (has links)
DMAOVC.BAS is a computer program written in the compiler version of Microsoft BASIC which performs factorial analysis of variance/covariance with expected mean squares. The program accommodates factorial and other hierarchical experimental designs with balanced sets of data. The program is written for use on most modest-sized microprocessors for which the compiler is available. The program is parameter-file driven, where the parameter file consists of the response variable structure, the experimental design model expressed in a structure similar to that seen in most textbooks, information concerning the factors (i.e., fixed or random, and the number of levels), and the information necessary to perform covariance analysis. The results of the analysis are written to separate files in a format that can be used for reporting purposes and further computations if needed.
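DMAOVC.BAS itself is not reproduced here, but the analysis it performs maps closely onto modern statistical libraries. Below is a hedged sketch of a balanced two-factor analysis of covariance in Python; the factor names, covariate, and data are invented for illustration.

```python
# Two-factor factorial ANCOVA on a balanced synthetic data set using statsmodels.
import numpy as np
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(2)
a = np.repeat(["a1", "a2"], 30)                      # fixed factor A, 2 levels
b = np.tile(np.repeat(["b1", "b2", "b3"], 10), 2)    # fixed factor B, 3 levels (balanced)
x = rng.normal(size=60)                              # covariate
y = (a == "a2") * 1.5 + (b == "b3") * 0.8 + 0.5 * x + rng.normal(size=60)
df = pd.DataFrame({"y": y, "A": a, "B": b, "x": x})

# Full factorial model with covariate; Type II sums of squares
fit = ols("y ~ C(A) * C(B) + x", data=df).fit()
print(anova_lm(fit, typ=2))
```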
354

Anomaly detection and root cause diagnosis in cellular networks

Mdini, Maha 20 September 2019 (has links)
With the evolution of automation and artificial intelligence tools, mobile networks have become more and more machine reliant. Today, a large part of network management tasks runs autonomously, without human intervention. In this thesis, we focus on using data analysis techniques to automate and consolidate the network troubleshooting process. To do so, we define two main objectives: anomaly detection and root cause diagnosis. The first objective is to detect anomalies in the network automatically, without relying on expert knowledge. To meet this objective, we propose an algorithm, Watchmen Anomaly Detection (WAD), based on pattern recognition: it learns a model of the network traffic from periodic time series and detects distortions from that model in the flow of new data. The second objective aims at identifying the root causes of network issues without any prior knowledge of the network topology and services. To address this question, we design an algorithm, Automatic Root Cause Diagnosis (ARCD), which localizes the sources of inefficiency in the network. ARCD is composed of two independent processes: identification of the major contributors to the overall network inefficiency, and incompatibility detection. WAD and ARCD have proven to be effective; however, both algorithms can still be improved in several respects.
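The published WAD algorithm is not reproduced here, but its core idea, learning a periodic traffic profile and flagging distortions in new data, can be sketched as follows; the period length, threshold, and KPI values are assumptions made purely for illustration.

```python
# Minimal sketch of periodic-pattern anomaly detection (not the published WAD
# algorithm): learn a typical daily profile from historical KPI data, then flag
# new points that deviate too far from that profile.
import numpy as np

PERIOD = 24  # hourly samples, daily periodicity (assumption)

def learn_profile(history):
    """Median and MAD of the KPI at each position in the period."""
    history = np.asarray(history).reshape(-1, PERIOD)
    median = np.median(history, axis=0)
    mad = np.median(np.abs(history - median), axis=0) + 1e-9
    return median, mad

def detect(new_day, median, mad, k=5.0):
    """Return indices whose deviation exceeds k robust deviations."""
    score = np.abs(np.asarray(new_day) - median) / mad
    return np.where(score > k)[0]

# Hypothetical traffic KPI: 30 normal days, then a day with an outage at hour 15
rng = np.random.default_rng(3)
base = 100 + 40 * np.sin(2 * np.pi * np.arange(PERIOD) / PERIOD)
history = base + rng.normal(0, 3, (30, PERIOD))
median, mad = learn_profile(history)
bad_day = base + rng.normal(0, 3, PERIOD)
bad_day[15] = 10
print(detect(bad_day, median, mad))  # expect hour 15 to be flagged
```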
355

Trend Analysis and Modeling of Health and Environmental Data: Joinpoint and Functional Approach

Kafle, Ram C. 04 June 2014 (has links)
The present study is divided into two parts: the first develops the statistical analysis and modeling of mortality (or incidence) trends using Bayesian joinpoint regression, and the second fits differential equations from time series data to derive the rate of change of carbon dioxide in the atmosphere. A joinpoint regression model identifies significant changes in the trends of the incidence, mortality, and survival of a specific disease in a given population. The Bayesian approach to joinpoint regression is widely used in modeling statistical data to identify the points in the trend where significant changes occur. The purpose of the present study is to develop an age-stratified Bayesian joinpoint regression model to describe mortality trends, assuming that the observed counts are probabilistically characterized by the Poisson distribution. The proposed model is based on Bayesian model selection criteria, with the smallest number of joinpoints that are sufficient to explain the Annual Percentage Change (APC). The prior probability distributions are chosen in such a way that they are automatically derived from the model index contained in the model space. The proposed model and methodology estimate the age-adjusted mortality rates in different epidemiological studies to compare trends while accounting for the confounding effects of age. Future mortality rates are predicted using the Bayesian Model Averaging (BMA) approach. As an application of the Bayesian joinpoint regression, we first study the childhood brain cancer mortality rates (non-age-adjusted rates) and their Annual Percentage Change (APC) per year using the existing Bayesian joinpoint regression models in the literature. We use annual observed mortality counts of children ages 0-19 from 1969-2009, obtained from the Surveillance Epidemiology and End Results (SEER) database of the National Cancer Institute (NCI). The predictive distributions are used to predict future mortality rates, and we compare this result with the mortality trend obtained using the joinpoint software of the NCI. To fit the age-stratified model, we use the cancer mortality counts of adult lung and bronchus cancer (25-85+ years) and brain and other Central Nervous System (CNS) cancer (25-85+ years) patients, obtained from the same SEER database. The second part of this study is the statistical analysis and modeling of noisy data using a functional data analysis approach. Carbon dioxide is one of the major contributors to Global Warming. In this study, we develop a system of differential equations using time series data for the major contributing sources of carbon dioxide in the atmosphere. We define the differential operator as a data smoother and use the penalized least squares fitting criterion to smooth the data. Finally, we optimize the profile error sum of squares to estimate the necessary differential operator. The proposed models give an estimate of the rate of change of carbon dioxide in the atmosphere at a particular time. We apply the model to carbon dioxide emission data for the continental United States. The data set is obtained from the Carbon Dioxide Information Analysis Center (CDIAC), the primary climate-change data and information analysis center of the United States Department of Energy.
The first four chapters of this dissertation contribute to the development and application of joinpoint regression, and the last chapter discusses the statistical modeling and application of differential equations fitted to data using a functional data analysis approach.
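As a simplified, frequentist illustration of the joinpoint idea (not the Bayesian age-stratified model developed in the dissertation), the sketch below fits a single-joinpoint Poisson trend to synthetic mortality counts and reports the Annual Percentage Change on each segment, using APC = 100 * (exp(slope) - 1).

```python
# Single-joinpoint Poisson trend fit on synthetic mortality counts; the
# joinpoint location is chosen by grid search over deviance.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
years = np.arange(1969, 2010)
t = years - years[0]
pop = np.full(t.size, 1e6)
true_rate = np.exp(-9 + np.where(t < 20, 0.03 * t, 0.03 * 20 - 0.02 * (t - 20)))
deaths = rng.poisson(true_rate * pop)

def fit_joinpoint(tau):
    """Piecewise-linear Poisson GLM with a slope change at year index tau."""
    X = sm.add_constant(np.column_stack([t, np.maximum(t - tau, 0)]))
    return sm.GLM(deaths, X, family=sm.families.Poisson(),
                  offset=np.log(pop)).fit()

best_tau = min(range(3, t.size - 3), key=lambda tau: fit_joinpoint(tau).deviance)
res = fit_joinpoint(best_tau)
b1, b2 = res.params[1], res.params[2]
print("joinpoint year:", years[best_tau])
print("APC before: %.1f%%, after: %.1f%%" % (100 * (np.exp(b1) - 1),
                                             100 * (np.exp(b1 + b2) - 1)))
```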
356

The quality of institutions and economic growth in Africa?: An empirical analysis of the relationship between quality of institutions and economic growth in Africa

Ghebresus, Semhar, Luzze, Sauda January 2019 (has links)
Africa is one of the richest continents on the planet in terms of natural resources, but it has the highest poverty rate, the fastest growing population, and many of the world's most corrupt countries. There is an ongoing discussion about whether the quality of institutions affects economic growth, and about the applicability of economic institutional theory to the African continent. North and Thomas (1973) argue that indicators such as education, innovation, capital accumulation, etc. are not causes of growth but rather growth itself. Instead, they suggest that economic institutions are the fundamental reason behind economic growth, since they allow new ideas, firms, and stakeholders to exist in the market. Our research question therefore asks whether the quality of institutions can explain the rate of economic growth in Africa. In this study we used a panel data analysis based on 12 variables, including GDP per capita growth, Rule of Law, Control of Corruption, Voice and Accountability, Government Effectiveness, Regulatory Quality, Political Stability and Absence of Violence/Terrorism, Education, Population, Foreign Direct Investment, and Gross Capital Formation, as well as initial GDP, over the years 2003-2017, to examine the relationship between economic performance and institutional quality in 50 African countries. Our results showed that six out of six institutional variables had a positive, significant effect on economic growth. This supports the theory that institutional quality impacts economic growth.
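As a hedged sketch of the kind of panel regression described above (not the thesis's exact specification), the following regresses GDP per capita growth on a single institutional indicator with country and year fixed effects; the panel is simulated and the variable set is reduced for brevity.

```python
# Two-way fixed-effects panel regression on a simulated 50-country panel,
# 2003-2017. Variable names are hypothetical stand-ins for the governance
# indicators used in the thesis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
rows = []
for c in [f"c{i}" for i in range(50)]:
    quality_base = rng.normal(0, 1)
    for y in range(2003, 2018):
        rule_of_law = quality_base + rng.normal(0, 0.2)
        growth = 2.0 + 0.8 * rule_of_law + rng.normal(0, 1.5)
        rows.append({"country": c, "year": y,
                     "rule_of_law": rule_of_law, "gdp_growth": growth})
panel = pd.DataFrame(rows)

# Country and year fixed effects via dummy variables (within-estimator equivalent)
fit = smf.ols("gdp_growth ~ rule_of_law + C(country) + C(year)", data=panel).fit()
print(fit.params["rule_of_law"], fit.pvalues["rule_of_law"])
```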
357

Analysis and integration of genomic data for the study of transcription and regulation networks in the hematopoietic system

Lepoivre, Cyrille 14 November 2012 (has links)
One of the fundamental challenges of modern biology is to better understand the mechanisms regulating gene expression, on which the functioning and differentiation of cells depend. Moreover, disorders in these mechanisms can be the cause of diseases such as cancer. The high-throughput technologies of the post-genomic era allow the mass production of data on, among other things, gene expression, transcription factor binding sites, and chromatin state. These data are a wealth of information for the study of regulatory mechanisms. However, the quantity and heterogeneity of these data raise many bioinformatics issues related to their access, visualization, analysis, and integration. This thesis addresses a number of these aspects through several projects:
- bioinformatics characterization of antisense transcripts produced by bidirectional promoters during thymocyte development;
- development and integration of a compendium of gene interactions of various kinds (physical interactions, regulations, etc.), together with a suitable graph visualization tool;
- the study of a transdifferentiation system of pre-B lymphocytes into macrophages by induction of CEBPa, and the construction of a regulation model through the integrated analysis of DNA microarray, ChIP-seq, and sequence data.
This work illustrates some of the bioinformatics issues related to exploiting these data and presents methodologies to efficiently extract biological information, particularly to answer questions regarding the mechanisms of transcription and its regulation in the hematopoietic system.
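The compendium and visualization tool developed in the thesis are not reproduced here; the toy sketch below only illustrates the general idea of merging heterogeneous interaction types into a single queryable graph, with invented genes and source labels.

```python
# Toy sketch: merge gene interactions of different kinds into one graph
# compendium and query it for a gene of interest. All edges are invented.
import networkx as nx

g = nx.MultiDiGraph()
# each source contributes edges tagged with their interaction type
g.add_edge("CEBPA", "SPI1", kind="regulation", source="chipseq_assumed")
g.add_edge("SPI1", "CEBPA", kind="regulation", source="literature_assumed")
g.add_edge("SPI1", "RUNX1", kind="physical", source="ppi_db_assumed")

# query: all interactions involving a gene of interest
for u, v, data in g.edges(data=True):
    if "SPI1" in (u, v):
        print(u, "->", v, data["kind"], data["source"])
```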
358

A neural fuzzy approach for well log and hydrocyclone data interpretation.

Wong, Kok W. January 1999 (has links)
A novel data analysis approach that is automatic, self-learning and self-explained, and which provides accurate and reliable results is reported. The data analysis tool is capable of performing multivariate non-parametric regression analysis, as well as quantitative inferential analysis using predictive learning. Statistical approaches such as multiple regression or discriminant analysis are usually used to perform this kind of analysis. However, they lack universal capabilities and their success in any particular application is directly affected by the problem complexity.
The approach employs the use of Artificial Neural Networks (ANNs) and Fuzzy Logic to perform the data analysis. The features of these two techniques are the means by which the developed data analysis approach has the ability to perform self-learning as well as allowing user interaction in the learning process. Further, they offer a means by which rules may be generated to assist human understanding of the learned analysis model, and so enable an analyst to include external knowledge.
Two problems in the resource industry have been used to illustrate the proposed method, as these applications contain non-linearity in the data that is unknown and difficult to derive. They are well log data analysis in petroleum exploration and hydrocyclone data analysis in mineral processing. This research also explores how this proposed data analysis approach could enhance the analysis process for problems of this type.
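The thesis method itself is not reproduced here; as a loose illustration of combining fuzzy encoding with a neural network for non-parametric regression, the sketch below maps a single input through Gaussian fuzzy membership functions and trains a small MLP on synthetic data.

```python
# Very simplified neuro-fuzzy-style regression: fuzzify the input with Gaussian
# membership functions, then fit a small neural network. Data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(6)
x = rng.uniform(0, 1, (500, 1))                        # e.g. a normalized log response
y = np.sin(6 * x[:, 0]) + 0.1 * rng.normal(size=500)   # unknown non-linear target

centers = np.linspace(0, 1, 7)                         # fuzzy set centers
width = 0.15

def fuzzify(x):
    """Degree of membership of each sample in each Gaussian fuzzy set."""
    return np.exp(-((x - centers) ** 2) / (2 * width ** 2))

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000, random_state=0)
model.fit(fuzzify(x), y)
print("R^2 on training data:", model.score(fuzzify(x), y))
```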
359

Understanding IS development and acquisition: a process approach

McLeod, Laurie Carina January 2008 (has links)
Computer-based information systems (IS) play an increasingly pervasive and important role in contemporary organisations. Despite decades of continuing research and the development of an extensive prescriptive literature, IS development projects continue to be problematic, with many failing or being seriously challenged. In addition, the IS development environment has changed significantly in recent years, with rapid advances or shifts in technology, increasing devolution of IS responsibility and expenditure to user groups, high levels of packaged software acquisition and customisation, greater outsourcing of IS development, and an increasing emphasis on enterprise-wide and inter-organisational IS. In many cases these changes are interrelated and involve more flexible, ad hoc or non-traditional development approaches. Combined with the fact that at the same time IS have become increasingly sophisticated and integrated, the potential for unpredictable or unintended consequences has also increased. Together, the continued problematic nature of many IS projects and the changing IS development environment suggest that there is an ongoing need for a fuller understanding of IS development processes and practices. Given the limitations of factor-based, prescriptive studies, an understanding of how contemporary IS development is enacted needs to be grounded in and built upon the cumulative body of research that attempts to understand the complexity and dynamic nature of IS development. Accordingly, this study uses a conceptualisation of IS development as a process in which an IS emerges from a dynamic and interactive relationship between the technology, its social and organisational context, and the negotiated actions of various individuals and groups. The thesis presents the results of an extensive empirical investigation into contemporary IS development practices based on data collected from New Zealand. The study uses a range of research methods and ultimately develops a sociotechnical process model of IS development as situated action. Following Walsham’s (1993) emphasis on the content, context and process of IS-related organisational change, the methods used in this study are three-fold. First, an extensive literature review is undertaken to provide a comprehensive synthesis of contemporary empirical knowledge about the content of IS development. Second, a survey is used to collect contextual data about IS development and acquisition practices in New Zealand. Finally, these both support an in-depth longitudinal case study of the IS development process in an organisational setting. The literature review synthesises the results of recent empirical studies of the various influences that shape IS development, using a classificatory framework based around actors, project content, IS development processes, and context. The review shows that, while a number of traditional factors influencing IS development continue to be relevant, other factors have emerged as important as a result of changes to the IS development environment and to IS development practice. In particular, increasing recognition within the IS literature has been given to the relative importance of people and process and of the organisational and environmental context in which IS development takes place. The results of the literature review inform the design of a survey instrument intended to provide an updated assessment of IS development and acquisition practices in New Zealand organisations.
A Web-based survey was administered to a sample of senior IS managers in 460 public and private sector organisations with 200 or more FTEs. Based on the 106 usable responses, the results of the survey confirm the ongoing relevance of a number of traditional factors identified in the IS literature as facilitating or inhibiting IS development. However, a number of factors were identified as emerging or increasing in relevance in light of changes in the IS development environment. While the survey provides a useful description of contemporary IS development and acquisition practice in New Zealand, it does not enable a detailed understanding of IS development in action. To address this, an IS project in a large New Zealand organisation was followed in action for over two years. The project involved the development of a sophisticated financial database model using a purchased commercial software package and external consultants. As such, it provides a useful exemplar of development in a contemporary IS environment. The case study illustrates how a seemingly small, well-defined project experienced delays and difficulties as might be expected in larger, more complex projects. It offers insights into the significance of external actors, the importance of full stakeholder participation, the influence of initial characterisations of the nature of the project, and the observance of project management processes. Consideration of the project outcome reveals its multi-dimensional, subjective and temporal nature. A process approach (Markus & Robey, 1988) is employed to structure the analysis of the case study. A combination of temporal bracketing, narrative analysis and visual representation is used to analyse the sequence of social action and organisational processes involved in the project and to develop a process explanation of how and why the particular project outcome in this case study developed over time. Underpinning and informing this analysis is the construction and utilisation of a model of IS development as a situated, sociotechnical process. Drawing on theoretical concepts from structuration theory and the sociology of technology, the model considers the situated actions and practices of various individuals and groups involved in IS development, the ways in which these are enacted within different contextual elements, and the role of existing and new technological artefacts in this process. IS development is characterised as iterative and emergent, with change occurring dynamically from a trajectory of situated interactions (in which meanings and actions are negotiated) and intended and unintended consequences. As a whole, this PhD highlights the changing nature of the IS development environment and the way a complex ensemble of ‘factors’ interact to influence IS project outcomes. Common themes emerge around the importance of people and process, and the context in which IS development takes place, while at the same time explicitly including a consideration of technology in the analysis.
360

Gelation mechanisms and rheological behavior of emulsions of partially crystallized alkanes

Springinsfeld, Fabrice 21 October 2009 (has links) (PDF)
Formulating emulsions of binary alkane mixtures yields a system whose phenomenology is close to that of dairy triglyceride emulsions. From these emulsions it is thus possible to form a gel and to reproduce the churning phenomenon, that is, to produce under mechanical stress what could be called an "alkane butter". The dispersed phase contains two alkanes, one of which is crystallized at the study temperature. The crystalline network inside the droplets creates protrusions at their surface, which induce gelation of the system through two distinct mechanisms: jamming, a blocking of the system caused by surface roughness without rupture of the interfacial films, and partial coalescence, the partial merging of droplets without shape relaxation, made possible by the mechanical rigidity of the crystalline network. The elasticity of these gels can be modified by varying the solid fraction, the droplet size, or the cooling rate. Indeed, these three parameters influence the size and morphology of the crystals, which can change from a platelet shape to a needle shape. Applying large deformations, in shear or in compression, leads to fragmentation of the gels in some cases, but also to the formation of new connections between droplets in other cases. The formation of new bonds can then lead to the appearance of macroscopic grains, called "churning grains", or cause separation between the oil and the continuous aqueous phase if there are not enough crystals in the oil droplets.
