771 |
A Clinical Decision Support System for the Identification of Potential Hospital Readmission Patients. Unknown Date (has links)
Recent federal legislation has incentivized hospitals to focus on quality of patient
care. A primary metric of care quality is the patient readmission rate. Many methods exist to
statistically identify patients most likely to require hospital readmission. Correct
identification of high-risk patients allows hospitals to intelligently utilize limited resources
in mitigating hospital readmissions. However, these methods have seen little practical
adoption in the clinical setting. This research attempts to identify the many open research
questions that have impeded widespread adoption of predictive hospital readmission
systems.
Current systems often rely on structured data extracted from health records systems, which can be expensive and time consuming to obtain. Unstructured clinical notes are agnostic to the underlying records system and would decouple the predictive analytics from that system. However, additional concerns in clinical natural language processing must be addressed before such a system can be implemented. Current systems also often perform poorly on standard statistical measures. The misclassification cost of patient readmissions has yet to be addressed, and a gap remains between the evaluation metrics used to assess current readmission systems and those most appropriate in the clinical setting. Additionally, the research community has not yet addressed data availability for building localized models. Large research hospitals may
have sufficient data to build models, but many others do not. Simply combining data from
many hospitals often results in a model which performs worse than using data from a single
hospital.
Current systems often produce a binary readmission classification. However, patients are often readmitted for reasons that differ from those of the index admission. There is little research into predicting the primary cause of readmission. Furthermore, only simplistic methods have been applied to discovering clinical terms that co-occur with the primary diagnosis.
This research addresses these concerns to increase adoption of predictive hospital
readmission systems. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2017. / FAU Electronic Theses and Dissertations Collection
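The gap between standard statistical measures and clinically appropriate evaluation can be made concrete with a small, hypothetical sketch (not drawn from the dissertation): two readmission classifiers that tie on accuracy can differ sharply once a missed readmission (false negative) is weighted more heavily than an unnecessary intervention (false positive). The cost weights below are assumptions chosen only for illustration.

```python
import numpy as np
from sklearn.metrics import confusion_matrix

def expected_cost(y_true, y_pred, c_fn=5.0, c_fp=1.0):
    """Average misclassification cost per patient; the cost weights are assumed, not from the study."""
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    return (c_fn * fn + c_fp * fp) / len(y_true)

# Toy labels: 1 = readmitted within 30 days, 0 = not readmitted.
y_true = np.array([0, 0, 0, 0, 0, 0, 0, 0, 1, 1])
model_a = np.zeros(10, dtype=int)                    # never flags anyone
model_b = np.array([0, 0, 0, 1, 1, 0, 0, 0, 1, 1])   # catches both readmissions, two false alarms

for name, y_pred in [("model A", model_a), ("model B", model_b)]:
    accuracy = (y_true == y_pred).mean()
    print(f"{name}: accuracy = {accuracy:.2f}, expected cost = {expected_cost(y_true, y_pred):.2f}")
```

Both toy models score 0.80 on accuracy, yet their expected costs differ by a factor of five (1.00 versus 0.20), which is the kind of distinction a clinically oriented evaluation metric needs to surface.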
|
772 |
Emprego da análise por injeção sequencial (SIA) e de métodos estatísticos para a otimização de processos oxidativos avançados visando o tratamento de amostras da indústria de tintas / Application of Sequential Injection Analysis (SIA) and Statistical Methods for the Optimization of Advanced Oxidative Processes Aiming the Treatment of Coatings Industry Samples. Santos, Allan Cezar Vieira dos, 30 July 2010 (has links)
This thesis describes the use of Sequential Injection Analysis (SIA) and statistical methods to optimize the treatment of coatings-industry samples by Advanced Oxidative Processes (AOPs), specifically the Fenton and photo-Fenton reactions. Studies were performed with model compounds: alizarin red S for the optimization of the Fenton reaction parameters, and the copper phthalocyanine-3,4′,4″,4″′-tetrasulfonic acid tetrasodium salt dye for the optimization of the photo-Fenton parameters in a photoreactor developed in the laboratory. Under the conditions established with SIA, degradation of 99.7% of the alizarin and 97% of the phthalocyanine dye was observed. A wastewater sample from the coatings industry was also considered; for this matrix, the results showed 79% mineralization of the sample after the photo-Fenton reaction and 45% after the Fenton reaction.
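As a generic sketch of the kind of statistical optimization involved, the example below estimates main and interaction effects from an invented two-level factorial in H2O2 and Fe(II) dose for the Fenton step; the design, levels, and responses are hypothetical and are not taken from the thesis.

```python
import numpy as np

# Invented 2x2 factorial screening for the Fenton step: coded levels of H2O2 dose (x1)
# and Fe(II) dose (x2), two replicates per run; response = % degradation of the dye.
x1 = np.array([-1, -1,  1,  1, -1, -1,  1,  1])
x2 = np.array([-1,  1, -1,  1, -1,  1, -1,  1])
y  = np.array([55, 63, 78, 93, 57, 66, 76, 95])   # invented responses

# Effect estimates for a two-level factorial: twice the regression slope on coded units.
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"grand mean = {beta[0]:.1f}%")
print(f"H2O2 main effect = {2 * beta[1]:.1f}, Fe(II) main effect = {2 * beta[2]:.1f}, "
      f"interaction = {2 * beta[3]:.1f}")
```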
|
773 |
Pedologia quantitativa: espectrometria VIS-NIR-SWIR e mapeamento digital de solos / Quantitative pedology: VIS-NIR-SWIR spectrometry and digital soil mapping. Ramírez López, Leonardo, 17 June 2009 (has links)
Routine chemical and physical analyses are the methods conventionally used to evaluate soil characteristics related to land-use potential and to assess fertility. They are costly and time consuming, which in Brazil has limited their use by small farmers and hampered the application of precision agriculture to soil management. Pedometrics now offers the possibility of incorporating into soil science sophisticated techniques that can reduce the cost of obtaining information and improve the understanding of soil processes. One of the most recent topics in pedometrics research is soil reflectance spectroscopy. Although it has been shown that many soil attributes can be estimated from the spectral response of the soil, optimum levels of accuracy relative to the conventional methodologies have not yet been reached, especially for chemical attributes. This study was therefore developed to answer the following questions: (a) do the exchangeable bases have specific spectral bands, or do their spectral responses change with the clay mineral that provides the cation exchange capacity?; (b) can calibrating models on only a few specific spectral bands improve their performance?; (c) what is the influence of the accuracy of the spectral models on maps constructed from the attributes they estimate?; (d) how does the size of the calibration sample set affect the accuracy of the models?; and (e) how can the calibration of attributes related to weathering assist the mapping of soil classes?
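The calibrations these questions refer to are multivariate regressions from reflectance spectra to laboratory-measured attributes. The sketch below uses invented spectra and partial least squares regression, one common choice for this task (the thesis may rely on other calibration methods), to show calibration on one subset of samples and validation on a hold-out set; rerunning it with different calibration-set sizes is one way to probe question (d).

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)

# Invented data: 200 soil samples x 300 VIS-NIR-SWIR bands, plus a chemical attribute
# (standing in for, e.g., the sum of exchangeable bases) tied to two spectral regions.
X = rng.normal(size=(200, 300)).cumsum(axis=1)     # random-walk curves standing in for spectra
y = 0.02 * X[:, 120] - 0.015 * X[:, 250] + rng.normal(scale=0.3, size=200)

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=1)

pls = PLSRegression(n_components=8).fit(X_cal, y_cal)
y_hat = pls.predict(X_val).ravel()

rmse = mean_squared_error(y_val, y_hat) ** 0.5
print(f"validation R2 = {r2_score(y_val, y_hat):.2f}, RMSE = {rmse:.3f}")
```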
|
774 |
Population distribution, habitat selection, and life history of the slough crayfish (Procambarus fallax) in the ridge-slough landscape of the central Everglades. Unknown Date (has links)
Understanding where and why organisms are distributed in the environment is a central theme in ecology. Animals live in environments in which they are subject to competing demands, such as the need to forage, to find mates, to reproduce, and to avoid predation. Optimal habitats for these various activities are usually distributed heterogeneously in the landscape and may vary both spatially and temporally, causing animals to adjust their locations in space and time to balance these conflicting demands. In this dissertation, I outline three studies of Procambarus fallax in the ridge-slough landscape of Water Conservation Area 3A (WCA-3A). The first section outlines an observational sampling study of crayfish population distribution in a four-hectare plot, where I statistically model the density distribution at two spatial scales. ... Secondly, I use radio telemetry to study individual adult crayfish movements at two study sites and evaluate habitat selection using Resource Selection Functions. In the third section, I test a theory of habitat selection, the ideal free distribution, by assessing performance measures (growth and mortality) of crayfish in the two major vegetation types in the late wet season (November 2007) and the early wet season (August 2009). / by Craig van der Heiden. / Thesis (Ph.D.)--Florida Atlantic University, 2012. / Includes bibliographical references at the end of each chapter. / Mode of access: World Wide Web. / System requirements: Adobe Reader.
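Resource Selection Functions of the kind used in the telemetry study are commonly estimated by contrasting used locations (telemetry fixes) with available points through a logistic model. The sketch below illustrates that general recipe with invented data and two hypothetical covariates (water depth and a slough indicator); it is not the dissertation's actual analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Invented used/available design: 1 = telemetry fix (used), 0 = random available point.
n_used, n_avail = 150, 600
used_depth   = rng.normal(45, 8,  n_used)     # water depth (cm) at used points, invented
avail_depth  = rng.normal(35, 12, n_avail)    # water depth at available points, invented
used_slough  = rng.binomial(1, 0.8, n_used)   # 1 = slough, 0 = ridge
avail_slough = rng.binomial(1, 0.5, n_avail)

X = np.column_stack([
    np.concatenate([used_depth, avail_depth]),
    np.concatenate([used_slough, avail_slough]),
])
y = np.concatenate([np.ones(n_used), np.zeros(n_avail)])

# Exponential RSF: w(x) proportional to exp(b1*depth + b2*slough); the intercept is ignored.
rsf = LogisticRegression().fit(X, y)
print("selection coefficients (depth, slough):", np.round(rsf.coef_[0], 3))
```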
|
775 |
Detection of multiple change-points in hazard models. Unknown Date (has links)
Change-point detection in the hazard rate function is an important research topic in survival analysis. In this dissertation, we first review existing methods for single change-point detection in the piecewise exponential hazard model. We then consider the problem of estimating the change point in the presence of right censoring and long-term survivors, using the Kaplan-Meier estimator for the susceptible proportion. The maximum likelihood estimators are shown to be consistent. Taking one step further, we propose a counting-process-based and a least-squares-based change-point detection algorithm. For the single change-point case, consistency results are obtained. We then consider the detection of multiple change-points in the presence of long-term survivors via maximum-likelihood-based and counting-process-based methods. Last but not least, we use a weighted-least-squares-based and counting-process-based method for the detection of multiple change-points with long-term survivors and covariates. For multiple change-point detection, simulation studies show good performance of our estimators under various parameter settings for both methods. All methods are applied to real data analyses. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2014. / FAU Electronic Theses and Dissertations Collection
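For the baseline single change-point case without a cured fraction, the maximum likelihood estimator has a simple profile-likelihood form: for each candidate change-point the two piecewise exponential rates have closed-form estimates, and the candidate that maximizes the profiled log-likelihood is chosen. The sketch below illustrates this idea on simulated right-censored data; it is a minimal illustration under those assumptions, not the estimators proposed in the dissertation.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate piecewise exponential lifetimes with a hazard change-point at tau = 2.0:
# hazard 0.4 before tau and 1.2 after, with independent uniform right censoring.
n, tau_true, lam1, lam2 = 500, 2.0, 0.4, 1.2
u = rng.uniform(size=n)
t = np.where(u > np.exp(-lam1 * tau_true),
             -np.log(u) / lam1,
             tau_true - (np.log(u) + lam1 * tau_true) / lam2)
c = rng.uniform(0, 8, size=n)
time, event = np.minimum(t, c), (t <= c).astype(int)

def profile_loglik(tau):
    """Log-likelihood at tau with the two rate MLEs (events / exposure) plugged in."""
    exposure1 = np.minimum(time, tau).sum()
    exposure2 = np.maximum(time - tau, 0).sum()
    d1 = event[time <= tau].sum()
    d2 = event[time > tau].sum()
    if min(d1, d2) == 0:
        return -np.inf
    return d1 * np.log(d1 / exposure1) - d1 + d2 * np.log(d2 / exposure2) - d2

grid = np.linspace(0.2, 6.0, 300)
tau_hat = grid[np.argmax([profile_loglik(g) for g in grid])]
print("estimated change-point:", round(float(tau_hat), 2), "(true value 2.0)")
```

The same profiling idea extends to multiple change-points by searching over ordered candidate grids, at a rapidly growing computational cost.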
|
776 |
Análise estatística da qualidade na produção de farelo e óleo degomado de soja, estudo de caso em empresa de médio porte em Rio Verde - GO / Statistical analysis of quality in the production of soybean meal and degummed soybean oil: a case study at a medium-sized company in Rio Verde, GO. Arantes, Cássia da Silva Castro, 10 March 2016 (has links)
This study addresses the application of statistical methods to the analysis of quality in the production of soybean meal and degummed soybean oil. The case study subject is Indústria Guará, which provided the necessary data. The data were analyzed using statistical methods such as ANOVA and Tukey's test, stability analysis with control charts, and process capability analysis. The analyses led to the conclusion that the storage warehouses do influence the quality of the soybeans, the company's main raw material. It was also found that, for a large share of the quality characteristics of the products, the processes are neither stable nor capable. The study also surveyed the main quality problems the company faces, as well as their causes. Finally, this work presents important information about the company and suggests improvements to secure effective gains in the quality of the final products and, consequently, better results for the organization, by preventing and eliminating unnecessary quality costs.
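As an illustration of the stability and capability analyses mentioned above, the sketch below builds X-bar control limits from subgroup ranges and computes Cp and Cpk for an invented protein-content characteristic of soybean meal; the data, subgroup structure, and specification limits are assumptions, not the company's figures.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented protein content (%) of soybean meal: 25 subgroups of 5 samples each.
data = rng.normal(loc=46.5, scale=0.4, size=(25, 5))

xbar = data.mean(axis=1)
rbar = (data.max(axis=1) - data.min(axis=1)).mean()

# Shewhart X-bar chart limits from the average range (tabulated constants for subgroups of 5).
A2, d2 = 0.577, 2.326
center = xbar.mean()
ucl, lcl = center + A2 * rbar, center - A2 * rbar
out_of_control = np.flatnonzero((xbar > ucl) | (xbar < lcl))
print(f"X-bar limits: [{lcl:.2f}, {ucl:.2f}], subgroups out of control: {out_of_control}")

# Capability against assumed specification limits of 45.0 to 48.0 % protein.
lsl, usl = 45.0, 48.0
sigma_hat = rbar / d2
cp = (usl - lsl) / (6 * sigma_hat)
cpk = min(usl - center, center - lsl) / (3 * sigma_hat)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```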
|
777 |
Otimização e análise do desempenho de sistemas frigoríficos utilizando o método de superfície de resposta, o planejamento de experimentos e ensaios de protótipos / Optimization and analysis of the performance of refrigeration systems using response surface methodology, experimental design and prototype experiments. Oliveira, Sidnei José de, 20 June 2001 (has links)
Response surface methodology and design of experiments were applied to the analysis and optimization of refrigeration systems. The capillary tube dimensions and the refrigerant charge that provided the best operating conditions for a prototype were determined. The behavior of eight response variables was studied: refrigeration capacity, coefficient of performance, discharge temperature, superheating, subcooling, refrigerant mass flow rate, evaporating temperature, and condensing temperature. Response surfaces and contour plots were constructed for several situations of interest in order to reveal the behavior and sensitivity of the system. Some factor levels yielded reduced variability for certain response variables, illustrating the concept of a robust system. The method proved well suited to the task, contributing valuable results to the optimization and analysis of the behavior of refrigeration systems, and its applicability can readily be extended to thermal systems in general.
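A minimal sketch of the response surface step, assuming a central composite design in two coded factors (capillary tube length and refrigerant charge) and an invented coefficient of performance (COP) response: fit a second-order model by least squares and locate its stationary point. The numbers are illustrative and are not the thesis's measurements.

```python
import numpy as np

# Invented central composite design in coded units: capillary tube length (x1) and
# refrigerant charge (x2) versus measured COP.
x1  = np.array([-1, -1,  1,  1, -1.41, 1.41,  0,    0,    0,   0,   0])
x2  = np.array([-1,  1, -1,  1,  0,    0,    -1.41, 1.41, 0,   0,   0])
cop = np.array([2.1, 2.4, 2.3, 2.5, 2.0, 2.4, 2.2, 2.5, 2.8, 2.7, 2.8])

# Second-order response surface fitted by least squares:
# COP = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
b, *_ = np.linalg.lstsq(X, cop, rcond=None)

# Stationary point of the fitted surface: solve grad = g + 2*B*x = 0.
B = np.array([[b[4], b[3] / 2],
              [b[3] / 2, b[5]]])
g = b[1:3]
x_star = -0.5 * np.linalg.solve(B, g)
print("fitted coefficients:", np.round(b, 3))
print("stationary point (coded units):", np.round(x_star, 2))
```

Because the invented center-point responses are the highest, the fitted quadratic terms come out negative and the stationary point is a maximum, which is the situation in which this canonical analysis directly suggests the operating condition to adopt.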
|
778 |
Impact of telecommunication deregulation on international telephone traffic. January 1992 (has links)
by Leung Hon-Kit. / Thesis (M.B.A.)--Chinese University of Hong Kong, 1992. / Includes bibliographical references (leaves 92-93). / ABSTRACT --- p.ii / TABLE OF CONTENTS --- p.iv / ACKNOWLEDGEMENTS --- p.vi / CHAPTER / Chapter I. --- INTRODUCTION --- p.1 / Chapter II. --- METHODOLOGY --- p.5 / "An Account of Telecommunications Deregulation in the U.S., U.K. and Japan" --- p.6 / Major Determinants of International Telephone Demand --- p.7 / Sources of Data --- p.7 / Analysis Method --- p.9 / Chapter III. --- MODELS AND RESULTS --- p.11 / Econometric --- p.11 / Box-Jenkins --- p.29 / Chapter IV. --- SUMMARY AND CONCLUSIONS --- p.38 / Multi-Carriers Effect --- p.39 / IVANS/Leased Circuit Effect --- p.41 / Price Elasticity of Demand --- p.42 / Impact on Dominant Carriers --- p.43 / Inference to Hong Kong Situation --- p.44 / APPENDIXES --- p.46 / Chapter A. --- Traffic Statistics --- p.46 / Chapter B. --- Collection Charge Statistics --- p.54 / Chapter C. --- Economic Statistics --- p.56 / Chapter D. --- Charts of U.S. Telephone Traffic to Hong Kong and ARIMA Modelling --- p.59 / Chapter E. --- Charts of Hong Kong Telephone Traffic to US and ARIMA Modelling --- p.64 / Chapter F. --- Charts of UK Telephone Traffic to Hong Kong and ARIMA Modelling --- p.70 / Chapter G. --- Charts of Hong Kong Telephone Traffic to U.K. and ARIMA Modelling --- p.75 / Chapter H. --- Charts of Japan Telephone Traffic to Hong Kong and ARIMA Modelling --- p.80 / Chapter I. --- Charts of Hong Kong Telephone Traffic to Japan and ARIMA Modelling --- p.85 / Chapter J. --- Plots of Residuals of the Econometric Models --- p.90 / BIBLIOGRAPHY AND REFERENCE --- p.92
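The Box-Jenkins portion of such a study amounts to fitting a (seasonal) ARIMA model to each bilateral traffic series and examining its forecasts and residuals. The sketch below uses an invented monthly traffic series and an arbitrary (1,1,1)x(0,1,1,12) specification rather than the orders identified in the thesis, simply to show the mechanics with statsmodels.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(11)

# Invented monthly outgoing-traffic series (thousands of minutes) with a trend and a
# yearly cycle, standing in for one bilateral telephone traffic stream.
months = pd.date_range("1984-01", periods=96, freq="MS")
t = np.arange(96)
traffic = 200 + 3.5 * t + 25 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 10, 96)
series = pd.Series(traffic, index=months)

# Illustrative Box-Jenkins fit: one regular difference for the trend, one seasonal
# difference and a seasonal MA term for the yearly cycle; in practice the orders are
# identified from ACF/PACF plots and residual diagnostics.
result = ARIMA(series, order=(1, 1, 1), seasonal_order=(0, 1, 1, 12)).fit()
print(result.summary())
print(result.forecast(steps=12))
```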
|
779 |
An application of Cox hazard model and CART model in analyzing the mortality data of elderly in Hong Kong. January 2002 (has links)
Pang Suet-Yee. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2002. / Includes bibliographical references (leaves 85-87). / Abstracts in English and Chinese. / Chapter 1 --- Introduction --- p.1 / Chapter 1.1 --- Overview --- p.1 / Chapter 1.1.1 --- Survival Analysis --- p.2 / Chapter 1.1.2 --- Tree-structured Statistical Method --- p.2 / Chapter 1.1.3 --- Mortality Study --- p.3 / Chapter 1.2 --- Motivation --- p.3 / Chapter 1.3 --- Background Information --- p.4 / Chapter 1.4 --- Data Content --- p.7 / Chapter 1.5 --- Thesis Outline --- p.8 / Chapter 2 --- Imputation and File Splitting --- p.10 / Chapter 2.1 --- Imputation of Missing Values --- p.10 / Chapter 2.1.1 --- Purpose of Imputation --- p.10 / Chapter 2.1.2 --- Procedure of Hot Deck Imputation --- p.11 / Chapter 2.1.3 --- List of Variables for Imputation --- p.12 / Chapter 2.2 --- File Splitting --- p.14 / Chapter 2.2.1 --- Splitting by Gender --- p.14 / Chapter 2.3 --- Splitting for Validation Check --- p.16 / Chapter 3 --- Cox Hazard Model --- p.17 / Chapter 3.1 --- Basic Idea --- p.17 / Chapter 3.1.1 --- Survival Analysis --- p.17 / Chapter 3.1.2 --- Survivor Function --- p.18 / Chapter 3.1.3 --- Hazard Function --- p.18 / Chapter 3.2 --- The Cox Proportional Hazards Model --- p.19 / Chapter 3.2.1 --- Kaplan-Meier Estimate and Log-Rank Test --- p.20 / Chapter 3.2.2 --- Hazard Ratio --- p.23 / Chapter 3.2.3 --- Partial Likelihood --- p.24 / Chapter 3.3 --- Extension of the Cox Proportional Hazards Model for Time-dependent Variables --- p.25 / Chapter 3.3.1 --- Modification of the Cox's Model --- p.25 / Chapter 3.4 --- Results of Model Fitting --- p.26 / Chapter 3.4.1 --- Extract the Significant Covariates from the Models --- p.31 / Chapter 3.5 --- Model Interpretation --- p.32 / Chapter 4 --- CART --- p.37 / Chapter 4.1 --- CART Procedure --- p.38 / Chapter 4.2 --- Selection of the Splits --- p.39 / Chapter 4.2.1 --- Goodness of Split --- p.39 / Chapter 4.2.2 --- Type of Variables --- p.40 / Chapter 4.2.3 --- Estimation --- p.40 / Chapter 4.3 --- Pruning the Tree --- p.41 / Chapter 4.3.1 --- Misclassification Cost --- p.42 / Chapter 4.3.2 --- Class Assignment Rule --- p.44 / Chapter 4.3.3 --- Minimal Cost Complexity Pruning --- p.44 / Chapter 4.4 --- Cross Validation --- p.47 / Chapter 4.4.1 --- V-fold Cross-validation --- p.47 / Chapter 4.4.2 --- Selecting the right sized tree --- p.49 / Chapter 4.5 --- Missing Value --- p.49 / Chapter 4.6 --- Results of CART program --- p.51 / Chapter 4.7 --- Model Interpretation --- p.53 / Chapter 5 --- Model Prediction --- p.58 / Chapter 5.1 --- Application to Test Sample --- p.58 / Chapter 5.1.1 --- Fitting test sample to Cox's Model --- p.59 / Chapter 5.1.2 --- Fitting test sample to CART model --- p.61 / Chapter 5.2 --- Comparison of Model Prediction --- p.62 / Chapter 5.2.1 --- Misclassification Rate --- p.62 / Chapter 5.2.2 --- Misclassification Rate of Cox's model --- p.63 / Chapter 5.2.3 --- Misclassification Rate of CART model --- p.64 / Chapter 5.2.4 --- Prediction Result --- p.64 / Chapter 6 --- Conclusion --- p.67 / Chapter 6.1 --- Comparison of Results --- p.67 / Chapter 6.2 --- Comparison of the Two Statistical Techniques --- p.68 / Chapter 6.3 --- Limitation --- p.70 / Appendix A: Coding Description for the Health Factors --- p.72 / Appendix B: Log-rank Test --- p.75 / Appendix C: Longitudinal Plot of Time Dependent Variables --- p.76 / Appendix D: Hypothesis Testing of Suspected Covariates --- p.78 / Appendix E: Terminal node report for both gender --- p.81 / Appendix F: Calculation of Critical Values --- p.83 / Appendix G: Distribution of Missing Value in Learning sample and Test Sample --- p.84 / Bibliography --- p.85
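The CART side of the comparison hinges on minimal cost-complexity pruning with V-fold cross-validation (Chapters 4.3.3 and 4.4 above). The sketch below reproduces that recipe on an invented elderly-cohort data set using scikit-learn's cost-complexity pruning path; the variables, coefficients, and outcome are all hypothetical, and the Cox arm of the comparison would be fitted analogously with a survival analysis library.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score, train_test_split

rng = np.random.default_rng(5)

# Invented cohort: age, self-rated health (1-5), ADL limitation score, smoker flag,
# with a binary outcome standing in for death within the follow-up window.
n = 1000
X = np.column_stack([
    rng.integers(65, 95, n),      # age
    rng.integers(1, 6, n),        # self-rated health
    rng.integers(0, 11, n),       # ADL limitation score
    rng.binomial(1, 0.3, n),      # smoker
])
logit = -9.0 + 0.08 * X[:, 0] + 0.3 * X[:, 1] + 0.15 * X[:, 2] + 0.5 * X[:, 3]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_learn, X_test, y_learn, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Minimal cost-complexity pruning: grow a large tree, then keep the penalty alpha whose
# pruned subtree has the best V-fold cross-validated accuracy (V = 5 here).
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X_learn, y_learn)
scores = [
    (alpha, cross_val_score(DecisionTreeClassifier(ccp_alpha=alpha, random_state=0),
                            X_learn, y_learn, cv=5).mean())
    for alpha in path.ccp_alphas
]
best_alpha = max(scores, key=lambda s: s[1])[0]

tree = DecisionTreeClassifier(ccp_alpha=best_alpha, random_state=0).fit(X_learn, y_learn)
print("chosen alpha:", round(float(best_alpha), 5), "| leaves:", tree.get_n_leaves())
print("misclassification rate on the test sample:", round(1 - tree.score(X_test, y_test), 3))
```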
|
780 |
Développement d'algorithmes de détection et d'identification gamma : application à la spectrométrie gamma embarquée / Embedded gamma spectrometry: development of gamma detection and identification algorithms. Wilhelm, Emilien, 24 November 2016 (has links)
Since the beginning of the 1980s, the Commissariat à l'Énergie Atomique (CEA) has been developing and operating an airborne gamma spectrometry system called HELINUC™. The system, built around 16 L NaI(Tl) detectors, is used to establish a radiological survey of the sites overflown. The main missions of HELINUC are environmental monitoring, response to radiological emergencies, and the search for orphan point sources, which requires the continuous development of suitable analysis methods. The approach considered in this thesis rests on a conceptual break with the spectrum analysis methods used at the CEA until now: the airborne measurements are no longer treated individually and sequentially, but globally and simultaneously. The study and development of statistical methods suited to the quantification of natural radionuclides and 137Cs (600 keV to 3 MeV), to the estimation of 241Am contamination (low energy, below 100 keV) in a radiological emergency, and to the detection of point sources (medium energy, between 100 keV and 600 keV) improve the accuracy of the activities determined in flight and the detection of low-activity sources.
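One building block of treating the airborne spectra globally is expressing each measured spectrum as a non-negative mixture of reference component spectra and reading the component contributions off the fitted coefficients. The sketch below illustrates that decomposition with invented, Gaussian-shaped reference spectra for three components and a Poisson-sampled measurement; it is a generic illustration, not the algorithms developed in the thesis.

```python
import numpy as np
from scipy.optimize import nnls

channels = np.arange(1024)                  # invented calibration: roughly 3 keV per channel

def component(center, width, scatter_scale, scatter_frac):
    """Invented reference spectrum: one Gaussian full-energy peak plus a scattering tail."""
    peak = np.exp(-0.5 * ((channels - center) / width) ** 2)
    return peak + scatter_frac * np.exp(-channels / scatter_scale)

ref_k40   = component(487, 12, 300, 0.3)    # 1461 keV line of 40K
ref_cs137 = component(221, 9, 200, 0.4)     # 662 keV line of 137Cs
ref_tl208 = component(872, 15, 400, 0.2)    # 2615 keV line of 208Tl
A = np.column_stack([ref_k40, ref_cs137, ref_tl208])

# Simulate one airborne spectrum as a Poisson realisation of a known mixture.
rng = np.random.default_rng(1)
true_coeffs = np.array([80.0, 15.0, 40.0])
measured = rng.poisson(A @ true_coeffs)

# Non-negative least squares unmixing of the measured spectrum.
coeffs, _ = nnls(A, measured.astype(float))
print("true coefficients  :", true_coeffs)
print("fitted coefficients:", np.round(coeffs, 1))
```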
|