71

Uso da cartografia e da geomorfologia na construção de mapas de potencial arqueológico para sítios pré-coloniais: um estudo de caso em Pindamonhangaba, estado de São Paulo / The use of cartography and geomorphology in the creation of archaeological potential maps for pre-colonial sites: a case study in Pindamonhangaba, São Paulo State

Leal, Silvia Kameyama Domingos 14 December 2017 (has links)
In this dissertation, we present the procedures adopted to build a geoarchaeological model for locating archaeological material related to pre-colonial sites. The study area covered the extent of the topographic sheet of the municipality of Pindamonhangaba, in the Vale do Paraíba region of the state of São Paulo. To construct the model, we carried out morphological mapping of landforms judged favorable to human settlement, using stereoscopy of aerial photographs and geoprocessing techniques. From the topographic sheet we also produced a hypsometric map and topographic profiles. After compiling a preliminary archaeological potential map, which highlighted the large depressions, the hills, and the fluvial terrace of the Paraíba do Sul River, three sectors were selected for archaeological survey. The field-control stages yielded a ceramic fragment, found at a depth of 3 meters near a depression, as well as some quartz fragments collected on escarpment terrain which, after laboratory analysis, were classified as of doubtful archaeological potential. We conclude that applying geoarchaeology to the study of pre-colonial groups in the Vale do Paraíba Paulista proved an efficient approach to understanding settlement patterns as a function of the distribution of natural resources.
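As a rough illustration of the hypsometric-mapping step described above, the minimal sketch below classifies a digital elevation grid into hypsometric bands; the grid values and class limits are invented placeholders, not data from the Pindamonhangaba topographic sheet.

```python
# Sketch: deriving hypsometric classes from a DEM grid (synthetic elevations;
# not the Pindamonhangaba topographic sheet).
import numpy as np

rng = np.random.default_rng(6)
dem = rng.uniform(520, 1100, size=(200, 200))   # elevation in metres

bands = [520, 600, 700, 800, 900, 1100]         # hypsometric class limits (m)
hypso = np.digitize(dem, bands[1:-1])           # class index per grid cell
for i in range(len(bands) - 1):
    share = (hypso == i).mean()
    print(f"{bands[i]}-{bands[i + 1]} m: {share:.1%} of the map")
```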
72

Investigation of multivariate prediction methods for the analysis of biomarker data

Hennerdal, Aron January 2006 (has links)
This thesis describes predictive modelling of biomarker data from patients suffering from multiple sclerosis. Improvements to multivariate analyses of the data are investigated, with the goal of increasing the ability to assign samples to the correct subgroups from the data alone. The effects of different preceding scalings of the data are investigated, and combinations of multivariate modelling methods and variable selection methods are evaluated. Attempts are made to merge the predictive capabilities of the method combinations through voting procedures. A technique for improving the results of PLS modelling, called bagging, is also evaluated. Of the methods tried, partial least squares (PLS) and support vector machines (SVM) are found to perform best. It is concluded that the scaling has little effect on prediction performance for most methods. The method combinations have interesting properties: the default variable selections of the multivariate methods are not always the best. Bagging improves performance, but at a high cost. No reasons are found for drastically changing the workflow of the biomarker data analysis, but slight improvements are possible. Further research is needed.
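As a hedged sketch of the kind of comparison the thesis describes, the snippet below pits a PLS-based classifier against an SVM under two scalings using cross-validation; the synthetic data, the `PLSDA` helper class, and all parameter choices are illustrative assumptions, not the thesis's actual pipeline.

```python
# Sketch: comparing scalings and classifiers for biomarker-style data
# (hypothetical feature matrix X and subgroup labels y; not the thesis data).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler, MinMaxScaler
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))      # 60 samples, 200 biomarker variables
y = rng.integers(0, 2, size=60)     # two subgroups

class PLSDA(PLSRegression):
    """PLS used as a classifier by thresholding its continuous prediction."""
    def predict(self, X):
        return (super().predict(X).ravel() > 0.5).astype(int)

for scale_name, scaler in [("autoscale", StandardScaler()), ("range", MinMaxScaler())]:
    for model_name, model in [("PLS", PLSDA(n_components=5)), ("SVM", SVC(kernel="linear"))]:
        acc = cross_val_score(make_pipeline(scaler, model), X, y,
                              cv=5, scoring="accuracy").mean()
        print(f"{scale_name:>9s} + {model_name}: {acc:.2f}")
```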
73

Predictive Modeling in Western Louisiana: Prehistoric and Historic Settlement in the Kisatchie National Forest

Johanson, Erik Nicholas 01 August 2011 (has links)
This thesis is an effort to provide the US Forest Service with a tool to effectively and efficiently protect and manage the cultural resource heritage of the Kisatchie National Forest. The development and subsequent evaluation of modeling efforts are vital to the archaeology of the region. This modeling project has two goals: to evaluate the active US Forest Service predictive model and, where warranted (as it proved to be), to improve upon previous models in the region. To do so, 23 environmental variables, many of them inter-related, were analyzed to develop a new set of probability zones while considering temporal and geographic variability in the Forest. Distance to frequently flooded soils and distance to permanent streams proved the most significant variables, and each plays a prominent role in the proposed 2011 Kisatchie National Forest Model. The proposed model exhibits ideal gain values for each probability zone while accounting for the geographic and temporal variability present within the Kisatchie National Forest. This thesis recommends implementing the proposed 2011 Kisatchie National Forest model in place of both the 1995 and 2010 Fort Polk Predictive Models for the Kisatchie National Forest and its surrounding region.
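Gain values for probability zones in archaeological predictive modeling are commonly computed with Kvamme's gain statistic; a minimal sketch follows, with made-up zone percentages rather than the thesis's figures.

```python
# Sketch: Kvamme's gain statistic for archaeological probability zones.
# Zone percentages are illustrative placeholders, not results from the thesis.
def kvamme_gain(pct_area: float, pct_sites: float) -> float:
    """Gain = 1 - (% of study area in zone / % of known sites in zone).
    Values near 1 indicate a small zone that captures most sites."""
    return 1.0 - (pct_area / pct_sites)

zones = {"high": (20.0, 70.0), "medium": (30.0, 20.0), "low": (50.0, 10.0)}
for zone, (area, sites) in zones.items():
    print(f"{zone:>6s}: gain = {kvamme_gain(area, sites):+.2f}")
```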
74

A Study of Missing Data Imputation and Predictive Modeling of Strength Properties of Wood Composites

Zeng, Yan 01 August 2011 (has links)
Problem: Real-time process and destructive test data were collected from a wood composite manufacturer in the U.S. to develop real-time predictive models of two key strength properties, modulus of rupture (MOR) and internal bond (IB), of a wood composite manufacturing process. Sensor malfunction and data send/retrieval problems led to null fields in the company's data warehouse, resulting in information loss. Many manufacturers attempt to build accurate predictive models by excluding entire records with null fields or by substituting summary statistics such as the mean or median for the null field. However, predictive model errors in validation may be higher in the presence of information loss. In addition, the selection of predictive modeling methods poses another challenge to many wood composite manufacturers. Approach: This thesis consists of two parts addressing the above issues: 1) how to improve data quality using missing data imputation; 2) which predictive modeling method is better in terms of prediction precision, measured by root mean square error (RMSE). The first part summarizes an application of missing data imputation methods in predictive modeling. After variable selection, two missing data imputation methods were selected from a comparison of six candidate methods. Predictive models on the imputed data were developed using partial least squares regression (PLSR) and compared with models on non-imputed data using ten-fold cross-validation; root mean square error of prediction (RMSEP) and normalized RMSEP (NRMSEP) were calculated. The second part presents a series of comparisons among four predictive modeling methods using imputed data without variable selection. Results: The first part concludes that the expectation-maximization (EM) algorithm and multiple imputation (MI) using Markov chain Monte Carlo (MCMC) simulation achieved the most precise results. Predictive models based on imputed datasets generated more precise predictions (average NRMSEP of 5.8% for the MOR model and 7.2% for the IB model) than models based on non-imputed datasets (average NRMSEP of 6.3% for MOR and 8.1% for IB). The second part finds that Bayesian additive regression trees (BART) produced more precise predictions (average NRMSEP of 7.7% for the MOR model and 8.6% for the IB model) than the other three methods: PLSR, LASSO, and adaptive LASSO.
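A minimal sketch of the impute-then-model workflow with ten-fold cross-validation and RMSEP/NRMSEP is shown below; the synthetic data, the use of scikit-learn's `IterativeImputer` (a round-robin imputer standing in for EM/MCMC-based imputation), and all settings are assumptions, not the thesis's implementation.

```python
# Sketch: imputing missing process data, then PLSR with 10-fold CV and
# RMSEP/NRMSEP. Data are synthetic placeholders, not the mill data.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.cross_decomposition import PLSRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 30))                               # process variables
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=200)   # e.g. MOR
X[rng.random(X.shape) < 0.1] = np.nan                        # ~10% missing fields

model = make_pipeline(IterativeImputer(max_iter=10, random_state=0),
                      PLSRegression(n_components=5))
y_hat = cross_val_predict(model, X, y, cv=10).ravel()

rmsep = np.sqrt(np.mean((y - y_hat) ** 2))
nrmsep = rmsep / (y.max() - y.min())        # normalized by the response range
print(f"RMSEP = {rmsep:.3f}, NRMSEP = {nrmsep:.1%}")
```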
75

Statistical and engineering methods for model enhancement

Chang, Chia-Jung 18 May 2012 (has links)
Models which describe the performance of physical processes are essential for quality prediction, experimental planning, process control, and optimization. Engineering models developed from the underlying physics or mechanics of the process, such as analytic models or finite element models, are widely used to capture its deterministic trend. However, there is usually stochastic randomness in the system, which may introduce discrepancies between physics-based model predictions and real-world observations. Alternatively, statistical models can be used to obtain predictions purely from the data generated by the process; such models, however, tend to perform poorly when predictions are made away from the observed data points. This dissertation contributes to model enhancement research by integrating physics-based and statistical models to mitigate their individual drawbacks and to provide models with better accuracy by combining the strengths of both. The proposed model enhancement methodologies follow two streams: (1) a data-driven enhancement approach and (2) an engineering-driven enhancement approach. Through these efforts, more adequate models are obtained, which leads to better performance in system forecasting, process monitoring, and decision optimization. Among data-driven enhancement approaches, the Gaussian process (GP) model provides a powerful methodology for calibrating a physical model in the presence of model uncertainties. However, if the data contain systematic experimental errors, the GP model can lead to an unnecessarily complex adjustment of the physical model. In Chapter 2, we propose a novel enhancement procedure, named "Minimal Adjustment", which brings the physical model closer to the data by making minimal changes to it. This is achieved by approximating the GP model with a linear regression model and then applying simultaneous variable selection over the model and experimental bias terms. Two real examples and simulations are presented to demonstrate the advantages of the proposed approach. Rather than enhancing the model from a data-driven perspective, an alternative approach is to adjust the model by incorporating additional domain or engineering knowledge when available; this often leads to models that are very simple and easy to interpret. The concepts of engineering-driven enhancement are demonstrated through two applications. In the first application, which concerns polymer composite quality, nanoparticle dispersion has been identified as a crucial factor affecting mechanical properties; transmission electron microscopy (TEM) images are commonly used to represent nanoparticle dispersion without further quantification of its characteristics. In Chapter 3, we develop an engineering-driven nonhomogeneous Poisson random field modeling strategy to characterize the nanoparticle dispersion status of nanocomposite polymer, quantitatively representing the nanomaterial quality conveyed by the image data. The model parameters are estimated through Bayesian MCMC techniques to overcome the challenge of the limited amount of accessible data caused by time-consuming sampling schemes. The second application statistically calibrates the engineering-driven force models of the laser-assisted micro-milling (LAMM) process, which facilitates a systematic understanding and optimization of the targeted process. In Chapter 4, the force prediction interval is derived by incorporating the variability in the runout parameters as well as the variability in the measured cutting forces. The experimental results indicate that the model predicts the cutting force profile with good accuracy using a 95% confidence interval. To conclude, this dissertation draws attention to model enhancement, which has considerable impact on the modeling, design, and optimization of various processes and systems. The fundamental methodologies of model enhancement are developed and applied to several applications. These research activities develop engineering-compliant models for adequate system prediction from observational data with complex variable relationships and uncertainty, which facilitates process planning, monitoring, and real-time control.
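As a generic illustration of data-driven model enhancement (not the dissertation's "Minimal Adjustment" procedure), the sketch below corrects a simplified physics model with a Gaussian-process fit to its residuals; the toy physics model, kernel, and data are assumptions.

```python
# Sketch: enhancing a physics-based model with a Gaussian-process discrepancy
# term fitted to residuals (generic illustration, not the dissertation's method).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def physics_model(x):
    return 2.0 * x          # simplified deterministic engineering model

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 25)
y_obs = 2.0 * x + 0.3 * np.sin(6 * x) + rng.normal(scale=0.05, size=x.size)

# Fit a GP to the residuals (observation minus physics prediction).
residuals = y_obs - physics_model(x)
gp = GaussianProcessRegressor(kernel=RBF(0.2) + WhiteKernel(0.01),
                              normalize_y=True).fit(x[:, None], residuals)

x_new = np.linspace(0, 1, 5)
enhanced = physics_model(x_new) + gp.predict(x_new[:, None])
print(np.round(enhanced, 3))
```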
76

Data mining methods applied to healthcare problems

Espinoza, Sofia Elizabeth 02 July 2012 (has links)
Growing adoption of health information technologies is allowing healthcare providers to capture and store enormous amounts of patient data. In order to use these data effectively to improve healthcare outcomes and processes, clinicians need to identify the relevant measures and apply the correct analysis methods for the type of data at hand. In this dissertation, we present various data mining and statistical methods that can be applied to the types of datasets found in healthcare research. We discuss the process of identifying appropriate measures and statistical tools, the analysis and validation of mathematical models, and the interpretation of results to improve healthcare quality and safety. We illustrate the application of statistics and data mining techniques on three real-world healthcare datasets. In the first chapter, we develop a new method to assess hydration status using breath samples; through analysis of the more than 300 volatile organic compounds contained in human breath, we aim to identify markers of hydration. In the second chapter, we evaluate the impact of the implementation of an electronic medical record system on the rate of inpatient medication errors and adverse drug events. The objective is to understand the impact on patient safety of different information technologies in a specific environment (inpatient pediatrics) and to provide recommendations on how to correctly analyze count data with a large number of zeros. In the last chapter, we develop a mathematical model to predict the probability of developing post-operative nausea and vomiting based on patient demographics and clinical history, and to identify the group of patients at high risk.
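For the point above about count data with many zeros, a hedged sketch of a zero-inflated Poisson fit is shown below; the before/after EMR indicator and the synthetic counts are assumptions, not the pediatric dataset.

```python
# Sketch: modeling medication-error counts with excess zeros using a
# zero-inflated Poisson model (synthetic data, not the study's dataset).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 500
post_emr = rng.integers(0, 2, size=n)        # 0 = before EMR, 1 = after EMR
rate = np.exp(0.5 - 0.4 * post_emr)          # lower error rate after EMR
counts = rng.poisson(rate)
counts[rng.random(n) < 0.4] = 0              # inject excess zeros

X = sm.add_constant(post_emr.astype(float))
zip_model = sm.ZeroInflatedPoisson(counts, X, exog_infl=np.ones((n, 1)))
result = zip_model.fit(maxiter=200, disp=False)
print(result.summary())
```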
77

The self in action - electrophysiological evidence for predictive processing of self-initiated sounds and its relation to the sense of agency

Timm, Jana 15 January 2014 (has links)
Stimuli caused by our own voluntary actions receive special treatment in the brain. In auditory processing, the N1 and/or P2 components of the auditory event-related brain potential (ERP) to self-initiated sounds are attenuated compared to passive sound exposure, which has been interpreted as an indicator of a predictive internal forward mechanism. Such a predictive mechanism enables the sensory consequences of one's own actions to be differentiated from other sensory input and allows the mind to attribute actions to agents, and particularly to the self, usually called the "sense of agency". However, the notion that N1 and/or P2 attenuation effects to self-initiated sounds reflect internal forward model predictions is still controversial. Furthermore, little is known about the relationship between N1 and/or P2 attenuation effects and the sense of agency. Thus, the aim of the present thesis was to further investigate the nature of the N1 and/or P2 attenuation effect to self-initiated sounds and to examine its specific relationship to the sense of agency. The present thesis provides evidence that N1 and/or P2 attenuation effects to self-initiated sounds are mainly determined by movement intention and by predictive internal motor signals involved in movement planning, and it rules out non-predictive explanations of these effects. Importantly, it is shown that sensory attenuation effects in audition are directly related to the feeling of agency but occur independently of agency judgments. Taken together, the present thesis supports the assumptions of internal forward model theories.
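A minimal sketch of how an N1 attenuation effect can be quantified from epoch averages is given below; the synthetic epochs, sampling rate, and N1 time window are illustrative assumptions, not the study's EEG data.

```python
# Sketch: quantifying N1 attenuation as the ERP amplitude difference between
# self-initiated and externally generated sounds (synthetic epochs).
import numpy as np

rng = np.random.default_rng(7)
fs = 500                                    # sampling rate (Hz)
t = np.arange(-0.1, 0.4, 1 / fs)            # epoch from -100 to 400 ms
n1 = -2e-6 * np.exp(-((t - 0.1) ** 2) / (2 * 0.02 ** 2))   # N1-like deflection

external = n1[None, :] + rng.normal(0, 1e-6, (100, t.size))
self_init = 0.6 * n1[None, :] + rng.normal(0, 1e-6, (100, t.size))  # attenuated

win = (t >= 0.08) & (t <= 0.12)             # N1 window, 80-120 ms
amp_ext = external.mean(axis=0)[win].mean()
amp_self = self_init.mean(axis=0)[win].mean()
print(f"N1 attenuation: {(amp_ext - amp_self) * 1e6:.2f} uV")
```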
79

Statistical lifetime modeling of FeNiCr alloys for high temperature corrosion in waste to energy plants and metal dusting in syngas production plants / Modélisation statistique de la durée de vie des alliages Fe-Ni-Cr soumis à la corrosion à haute température en environnement UVEOM et metal dusting en installations Syngas

Camperos Guevara, Sheyla Herminia 20 January 2016 (has links)
Over the last decades, controlling the corrosion of alloys exposed to severe and complex conditions in industrial applications has been a great challenge. Corrosion costs are increasing and preventive strategies have become an important industrial demand. The SCAPAC project, funded by the French National Research Agency (ANR), proposed to study corrosion in two separate processes: steam methane reforming (SMR) and waste-to-energy (WtE). Although the operating conditions of the two processes are different, the modeling approaches can be similar. Metallic components in the SMR process are subject to metal dusting corrosion, a catastrophic form of damage that affects alloys exposed to highly carburising gases (aC > 1) at high temperatures (400–800 °C) [1]. Likewise, metallic components in the WtE process are subject to high-temperature corrosion under deposits, which occurs in equipment exposed to atmospheres with a high content of corrosive combustion products. Metal dusting is considered a critical phenomenon that has caused material losses worldwide for 50 years. A basic understanding of the degradation mechanisms is available; however, the effect of some process parameters is still not well understood in the current literature and requires further study. For high-temperature corrosion, by contrast, a considerable amount of literature has been published over the last few decades, the mechanisms are well documented, and many materials and coatings have been developed. However, material performance in different environments has not been understood well enough to define suitable criteria for lifetime prediction models with respect to operating conditions, owing to the high complexity of the corrosion phenomena involved. A literature review in both fields revealed modeling approaches for various kinds of complex conditions and applications; nevertheless, no lifetime models are currently available in the open literature for commercial materials that consider a wide range of conditions and the relative weight of the variables involved in the corrosion processes. This dissertation presents a methodology for developing lifetime prediction models to evaluate material performance under metal dusting and high-temperature corrosion conditions. Two databases were created to integrate experimental results from the SCAPAC project as well as results from the literature, providing a sufficient amount of data for modeling. The databases allowed approximately 4000 corrosion rates to be analyzed with different statistical methods over different scenarios. Principal component analysis (PCA) was performed to identify the key parameters, which were then used to build lifetime prediction models by multiple linear regression (MLR). For high-temperature corrosion, three models were obtained in the thermal-gradient scenario for three families of alloys: low-alloyed steels, Fe/Ni-based high-temperature alloys, and Ni-based alloys, showing encouraging results. For metal dusting corrosion, two models were obtained to describe incubation times and the kinetics of pit-depth growth, with satisfactory results. The statistical models in both cases were compared with experimental and theoretical results and showed good agreement, which allows the lifetime of materials under defined conditions to be assessed.
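As a hedged sketch of the PCA-then-MLR workflow described above, the snippet below screens a few corrosion test variables with PCA and then fits a multiple linear regression; the variables, ranges, and coefficients are synthetic placeholders, not the SCAPAC databases.

```python
# Sketch: PCA screening of corrosion variables followed by a multiple linear
# regression lifetime model (synthetic placeholders, not the SCAPAC data).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
n = 300
temperature = rng.uniform(400, 800, n)          # degC
cr_content = rng.uniform(10, 30, n)             # wt.% Cr
exposure_h = rng.uniform(100, 5000, n)          # exposure time (h)
corrosion_rate = 0.02 * temperature - 0.5 * cr_content + rng.normal(0, 5, n)

X = np.column_stack([temperature, cr_content, exposure_h])
Xs = StandardScaler().fit_transform(X)

pca = PCA().fit(Xs)
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))
print("loadings (PC1):", np.round(pca.components_[0], 2))

mlr = LinearRegression().fit(Xs, corrosion_rate)
print("MLR coefficients (standardized):", np.round(mlr.coef_, 2))
```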
80

Predicting Machining Rate in Non-Traditional Machining using Decision Tree Inductive Learning

Konda, Ramesh 01 January 2010 (has links)
Wire Electrical Discharge Machining (WEDM) is a nontraditional machining process used for machining intricate shapes in high strength and temperature resistive (HSTR) materials. WEDM provides high accuracy, repeatability, and a better surface finish; however, the tradeoff is a very slow machining rate. Because of the slow machining rate in WEDM, machining tasks take many hours depending on the complexity of the job. For this reason, users of WEDM try to predict the machining rate beforehand so that input parameter values can be pre-programmed to achieve automated machining. However, only partial success with traditional methodologies such as thermal modeling, artificial neural networks, and mathematical, statistical, and empirical models has left this problem open for further research and exploration of alternative methods. Earlier efforts to apply decision tree rule induction algorithms to predicting the machining rate in WEDM also had limitations, such as the use of a coarse-grained method for discretizing the target and the exploration of C4.5 as the only learning algorithm. The goal of this dissertation was to address the limitations reported in the literature on using decision tree rule induction algorithms for WEDM. In this study, the three decision tree inductive algorithms C5.0, CART, and CHAID were applied to predicting material removal rate, with the target discretized into varying numbers of classes (two, three, four, and five) by three discretization methods. Combining learning algorithms, discretization methods, and numbers of target classes gave a total of 36 distinct combinations; all 36 models were developed and evaluated on prediction accuracy. A total of 21 models were found to be suitable for WEDM, with prediction accuracy ranging from 71.43% to 100%. The models identified in the current study not only achieved better prediction accuracy than previous studies but also allow users much better control over WEDM than was previously possible. The application of inductive learning and the development of suitable predictive models for WEDM, incorporating varying numbers of target classes, different learning algorithms, and different discretization methods, are the major contributions of this research.
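A minimal sketch of the discretize-then-classify workflow is given below; scikit-learn's CART-style tree stands in for C5.0/CART/CHAID, equal-frequency binning stands in for the study's discretization methods, and the WEDM-like parameters are synthetic assumptions, not the study's data.

```python
# Sketch: discretizing material removal rate (MRR) into classes and fitting a
# CART-style decision tree on synthetic WEDM-like data.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n = 400
pulse_on = rng.uniform(2, 30, n)       # pulse-on time (us)
wire_feed = rng.uniform(1, 10, n)      # wire feed rate
mrr = 0.8 * pulse_on + 2.0 * wire_feed + rng.normal(0, 3, n)

X = np.column_stack([pulse_on, wire_feed])
for n_classes in (2, 3, 4, 5):
    y = pd.qcut(mrr, q=n_classes, labels=False)   # equal-frequency discretization
    acc = cross_val_score(DecisionTreeClassifier(max_depth=4, random_state=0),
                          X, y, cv=10).mean()
    print(f"{n_classes} classes: mean accuracy = {acc:.2%}")
```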
