  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
41

Aplicação de redes neurais artificiais e de quimiometria na modelagem do processo de craqueamento catalitico fluido / Application of artificial neural networks and chemometrics in the modeling of fluid catalytic cracking process

Pimentel, Wagner Roberto de Oliveira 18 March 2005 (has links)
Advisor: Antonio Carlos Luz Lisboa / Doctoral thesis, Universidade Estadual de Campinas, Faculdade de Engenharia Química / Abstract: The fluid catalytic cracking (FCC) process is one of the most important refining processes today, producing, among other products, gasoline and liquefied petroleum gas (LPG). It is a process that is very difficult to model phenomenologically. In this context, artificial neural networks (ANN) emerge as a convenient modelling tool: they are able to learn what happens in the process from a limited set of data and require less computing time than phenomenological models. The main objective of this work was to develop empirical models, based on ANNs and chemometrics, capable of relating the input variables to the output variables of the FCC process (pilot plant and industrial unit). Experimental data were obtained from the Petrobras FCC pilot unit located at the oil-shale plant in São Mateus do Sul, Paraná, and industrial data were obtained from the Petrobras Landulpho Alves Refinery (RLAM) FCC unit located in São Francisco do Conde, Bahia. For good network performance, principal component analysis (PCA) was first used to preprocess the data, and MLP networks were then trained with the following supervised training algorithms: Broyden-Fletcher-Goldfarb-Shanno (BFGS), Scaled Conjugate Gradient (SCG) and Levenberg-Marquardt (LM). Methods devised to increase the networks' predictive power were also used... Note: the complete abstract is available in the full text of the digital thesis. / Doctorate / Process Engineering / Doctor in Chemical Engineering
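For readers who want a concrete picture of the pipeline this abstract describes (PCA preprocessing followed by an MLP with quasi-Newton training), the snippet below is a minimal sketch using scikit-learn on synthetic placeholder data; the L-BFGS solver stands in for the BFGS, SCG and LM variants named above, which scikit-learn does not expose, and the feature and target arrays are purely illustrative.

```python
# Minimal sketch of a PCA + MLP regression pipeline on placeholder data.
# L-BFGS stands in for the BFGS/SCG/LM training methods named in the abstract.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))                               # placeholder process inputs
y = X[:, :3].sum(axis=1) + rng.normal(scale=0.1, size=200)   # placeholder product yield

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = make_pipeline(
    StandardScaler(),                    # scale before PCA
    PCA(n_components=5),                 # keep the leading principal components
    MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                 max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print("test R^2:", model.score(X_test, y_test))
```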
42

Analysing and predicting differences between methylated and unmethylated DNA sequence features

Ali, Isse January 2015 (has links)
DNA methylation is involved in various biological phenomena, and its dysregulation has been shown to be correlated with a number of human disease processes, including cancers, autism, and autoimmune, mental health and neurodegenerative disorders. Characterising and modelling these phenomena has therefore become important for understanding the underlying mechanisms in both health and disease. Attempts have previously been made to map DNA methylation across human tissues; however, the means of distinguishing between methylated, unmethylated and differentially methylated groups using DNA sequence features remains unclear. The aims of this study are therefore, firstly, to investigate DNA methylation classes and predict them from DNA sequence features, and secondly, to identify methylation-associated DNA sequence features and distinguish methylation differences between males and females in both healthy and diseased states. The research is conducted on three samples within nine biological feature subsets extracted from DNA sequence patterns (human genome database). Two samples contain the three classes (methylated, unmethylated and differentially methylated) within a total of 642 samples with 3,809 attributes derived from four human chromosomes (6, 20, 21 and 22); the third sample covers all human chromosomes and encompasses 1,628 individuals, from which 1,505 CpG loci (features) were extracted using hierarchical clustering (heatmap) with pairwise correlation distance, followed by feature selection methods. From this analysis, the author extracts 47 features associated with gender and age, 17 of which reveal significant methylation differences between males and females. Methylation class prediction was performed with a k-nearest-neighbour classifier combined with ten-fold cross-validation. Because some of the data were severely imbalanced (i.e. split into sub-classes), and direct analysis in machine learning is known to be biased towards the majority class, the author proposes a Modified Leave-One-Out (MLOO) cross-validation combined with AdaBoost to tackle these issues, aiming to produce a balanced outcome and limit the bias arising from inter-class differences; this approach provided predictive accuracies between 75% and 100%, based on the DNA sequence context.
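As a rough illustration of the classification setup described above, the sketch below runs a k-nearest-neighbour classifier under stratified ten-fold cross-validation and an AdaBoost ensemble on synthetic, imbalanced placeholder data; the thesis's Modified Leave-One-Out procedure is specific to that work and is not reproduced here.

```python
# Illustrative sketch only: KNN with stratified 10-fold CV and AdaBoost on
# synthetic imbalanced data (the thesis's Modified-LOO method is not reproduced).
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import AdaBoostClassifier

# Three imbalanced classes, loosely mirroring the 642-sample setting above.
X, y = make_classification(n_samples=642, n_features=50, n_informative=10,
                           n_classes=3, weights=[0.6, 0.3, 0.1], random_state=0)

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
knn = KNeighborsClassifier(n_neighbors=5)
ada = AdaBoostClassifier(n_estimators=200, random_state=0)

print("KNN mean accuracy:     ", cross_val_score(knn, X, y, cv=cv).mean())
print("AdaBoost mean accuracy:", cross_val_score(ada, X, y, cv=cv).mean())
```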
43

thesis.pdf

Sonali D Digambar Patil (14228030) 08 December 2022 (has links)
Accurate 3D landscape models of cities or mountains have wide applications in mission planning, navigation, geological studies, etc. Lidar scanning using drones can provide high-accuracy 3D landscape models, but the data is more expensive to collect as the area of each scan is limited. Thanks to the recent maturation of Very-High-Resolution (VHR) optical imaging on satellites, people nowadays have access to stereo images that are collected over a much larger area than Lidar scanning. My research addresses unique challenges in satellite stereo, including stereo rectification with pushbroom sensors, dense stereo matching using image pairs with varied appearance, e.g. sun angles and surface plantation, and rasterized digital surface model (DSM) generation. The key contributions include Continuous 3D-Label Semi-Global Matching (CoSGM) and a large-scale dataset for satellite stereo processing and DSM evaluation.
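For context on the dense-matching step mentioned above, the sketch below runs conventional semi-global matching with OpenCV on an already-rectified image pair; the file names and parameters are placeholders, and this is the standard SGM baseline that CoSGM builds on, not the thesis's CoSGM method itself.

```python
# Conventional semi-global matching baseline (OpenCV), shown only to illustrate
# the dense stereo step; file names and parameters are placeholders, not CoSGM.
import cv2

left = cv2.imread("left_rectified.tif", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right_rectified.tif", cv2.IMREAD_GRAYSCALE)

matcher = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=128,       # must be divisible by 16
    blockSize=5,
    P1=8 * 5 * 5,             # smoothness penalty for small disparity changes
    P2=32 * 5 * 5,            # smoothness penalty for large disparity changes
    uniquenessRatio=10,
)
# compute() returns fixed-point disparities scaled by 16.
disparity = matcher.compute(left, right).astype("float32") / 16.0
cv2.imwrite("disparity.tif", disparity)
```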
44

INFERRING STRUCTURAL INFORMATION FROM MULTI-SENSOR SATELLITE DATA FOR A LOCALIZED SITE

Arnav Goel (17683527) 05 January 2024 (has links)
<p dir="ltr">Canopy height is a fundamental metric for extracting valuable information about forested areas. Over the past decade, Lidar technology has provided a straightforward approach to measuring canopy height using various platforms such as terrestrial, unmanned aerial vehicle (UAV), airborne, and satellite sensors. However, satellite Lidar data, even with its global coverage, has a sparse sampling pattern that doesn’t provide continuous coverage over the globe. In contrast, satellites like LANDSAT offer seamless and widespread coverage of the Earth's surface through spectral data. Can we exploit the abundant spectral information from satellites like LANDSAT and ECOSTRESS to infer structural information obtained from Lidar satellites like Global Ecosystem Dynamic Investigation (GEDI)? This study aims to develop a deep learning model that can infer canopy height derived from sparsely observed Lidar waveforms using multi-sensor spectral data from spaceborne platforms. Specifically designed for localized site, the model focuses on county-level canopy height estimation, taking advantage of the relationship between canopy height and spectral reflectance that can be established in a local setting – something which might not exist universally. The study hopes to achieve a framework that can be easily replicable as height is a dynamic metric which changes with time and thus requires repeated computation for different time periods.</p><p dir="ltr">The thesis presents a series of experiments designed to comprehensively understand the influence of different spectral datasets on the model’s performance and its effectiveness in different types of test sites. Experiment 1 and 2 utilize Landsat spectral band values to extrapolate canopy height, while Experiment 3 and 4 incorporate ECOSTRESS land surface temperature and emissivity band values in addition to Landsat data. Tippecanoe County, predominantly composed of cropland, serves as the test site for Experiment 1 and 3, while Monroe County, primarily covered by forests, serves as the test site for Experiment 2 and 4. When compared to the Airborne Lidar dataset from the United States Geological Survey (USGS) – 3D Elevation Program (3DEP), the model achieves a Root Mean Square Error (RMSE) of 4.604m for Tippecanoe County using Landsat features while 5.479m for Monroe County. After integrating Landsat and ECOSTRESS features, the RMSE improves to 4.582m for Tippecanoe County but deteriorates to 5.860m for Monroe County. Overall, the study demonstrates comparable results to previous research without requiring feature engineering or extensive pre-processing. Furthermore, it successfully introduces a novel methodology for integrating multiple sources of satellite data to address this problem.</p>
45

Sparse Ensemble Networks for Hyperspectral Image Classification

Rakesh Kumar Iyer (18424698) 23 April 2024 (has links)
<p dir="ltr">We explore the efficacy of sparsity and ensemble model in the classification of hyperspectral images, a pivotal task in remote sensing applications. While Convolutional Neural Networks (CNNs) and Transformer models have shown promise in this domain, each exhibits distinct limitations; CNNs excel in capturing the spatial/local features but falter to capture spectral features, whereas Transformers captures the spectral features at the expense of spatial features. Furthermore, the computational cost associated with training several independent CNN and Transformer networks becomes expensive. To address these limitations, we propose a novel ensemble framework comprising pruned CNNs and Transformers, optimizing both spatial and spectral feature utilization while curbing computational costs. By integrating sparsity through model pruning, our approach effectively reduces redundancy and computational complexity without compromising accuracy. Through extensive experimentation, we find that our method achieves comparable accuracy to its non-sparse counterparts while decreasing the computational cost. Our contribution enhances remote sensing analytics by demonstrating the potential of sparse and ensemble models in improving the precision and computational efficiency of hyperspectral image classification.</p>
46

An investigation into the use of ORM as a conceptual modelling technique with the UML domain model class diagram as benchmark

John, Manju Mereen 02 1900 (has links)
This study investigated the use of ORM as a conceptual modelling technique by using the UML domain model class diagram as benchmark. The rationale was that if the ORM-class diagram compared favourably with the benchmark, then ORM could be proposed as an alternate conceptual modelling technique. Proponents of ORM suggest that it has significant advantages over other techniques for conceptual modelling. The benchmark UML class diagram was developed according to the Unified Process through use-cases and collaboration diagrams. The ORM-class diagram was derived using the Conceptual Schema Design Process and ORM-UML Mapping Process. The evaluation of the two class diagrams was conducted by means of a questionnaire, based on a set of principles for conceptual models. The study concluded that ORM could not be proposed as a conceptual modelling technique up to the UML domain class diagram level without considering additional techniques for capturing the dynamics of the system. / Computer Science / M.Sc. (Computer Science)
47

Analyse des modèles résines pour la correction des effets de proximité en lithographie optique / Resist modeling analysis for optical proximity correction effect in optical lithography

Top, Mame Kouna 12 January 2011 (has links)
The progress made in microelectronics responds to the need to reduce production costs and to open new markets. This progress has been possible largely thanks to advances in projection optical lithography, the printing process principally used in integrated circuit (IC) manufacturing. The miniaturisation of integrated circuits has only been possible by pushing the limits of optical resolution. However, as transistor widths and the spacing between them shrink, pattern transfer becomes increasingly sensitive to so-called optical proximity effects at the most advanced technology nodes (45 and 32 nm transistor gate sizes). Correcting these optical proximity effects is therefore indispensable in photolithographic processes for advanced nodes. Optical proximity correction (OPC) techniques preserve pattern fidelity on the wafer through corrections applied to the mask. The accuracy of these corrections depends on the quality of the OPC models used, which connect the image printed in the resist to the changes made on the mask, so the reliability of these models is essential. This thesis analyses and evaluates OPC resist models, which simulate the behaviour of the resist after exposure. Data modelling and statistical analysis were used to study these increasingly empirical resist models. Besides improving the reliability of the model calibration data, the use of dedicated model-building platforms in an industrial environment and the methodology for creating and validating OPC models were also studied. The thesis presents the results of this analysis and proposes a new methodology for creating, analysing and validating OPC resist models.
49

ASSESSMENT OF VARIABILITY OF LAND USE IMPACTS ON WATER QUALITY CONTAMINANTS

Johann Alexander Vera (14103150), Bernard A. Engel (5644601) 10 December 2022 (has links)
The hydrological cycle is affected by land use variability. Spatial and temporal variability in land use can alter watershed runoff, water resource quantity and quality, ecosystems, and environmental sustainability. In recent decades, agricultural land, pastures, plantations, and urban areas have expanded, resulting in significant increases in energy, water, and fertilizer usage, as well as significant biodiversity losses.
50

QUALITY ASSESSMENT OF GEDI ELEVATION DATA

Wildan Firdaus (12216200) 13 December 2023 (has links)
<p dir="ltr">As a new spaceborne laser remote sensing system, the Global Ecosystem Dynamics Investigation, or GEDI, is being widely used for monitoring forest ecosystems. However, its measurements are subject to uncertainties that will affect the calculation of ground elevation and vegetation height. This research intends to investigate the quality of the GEDI elevation data and its relevance to topography and land cover.</p><p dir="ltr">In this study, the elevation of the GEDI data is compared to 3DEP DEM, which has a higher resolution and accuracy. All the experiments in this study are conducted for two locations with vastly different terrain and land cover conditions, namely Tippecanoe County in Indiana and Mendocino County in California. Through this investigation we expect to gain a comprehensive understanding of GEDI’s elevation quality in various terrain and land cover conditions.</p><p dir="ltr">The results show that GEDI data in Tippecanoe County has better elevation accuracy than the GEDI data in Mendocino County. GEDI in Tippecanoe County is almost four times more accurate than in Mendocino County. Regarding land cover, GEDI have better accuracy in low vegetation areas than in forest areas. The ratio can be around three times better in Tippecanoe County and around one and half times better in Mendocino County. In terms of slope, GEDI data shows a clear positive correlation between RMSE and slope. The trend indicates as slope increases, the RMSE increases concurrently. In other words, slope and GEDI elevation accuracy are inversely related. In the experiment involving slope and land cover, the results show that slope is the most influential factor to GEDI elevation accuracy.</p><p dir="ltr">This study informs GEDI users of the factors they must consider for forest biomass calculation and topographic mapping applications. When high terrain slope and/or high vegetation is present, the GEDI data should be checked with other data sources like 3DEP DEM or any ground truth measurements to assure its quality. We expect these findings can help worldwide users understand that the quality of GEDI data is variable and dependent on terrain relief and land cover.</p>
