
Hydrologic Impacts of Climate Change: Uncertainty Modeling

Ghosh, Subimal
General Circulation Models (GCMs) are tools designed to simulate time series of climate variables globally, accounting for the effects of greenhouse gases in the atmosphere. They attempt to represent the physical processes of the atmosphere, ocean, cryosphere and land surface, and they are currently the most credible tools available for simulating the response of the global climate system to increasing greenhouse gas concentrations and for providing estimates of climate variables (e.g., air temperature, precipitation, wind speed, pressure) on a global scale. GCMs demonstrate significant skill at continental and hemispheric spatial scales and incorporate a large proportion of the complexity of the global system; they are, however, inherently unable to represent local subgrid-scale features and dynamics. The spatial scale on which a GCM operates (e.g., 3.75° longitude × 3.75° latitude for the Coupled Global Climate Model, CGCM2) is very coarse compared with that of the hydrologic processes (e.g., precipitation in a region, streamflow in a river) of interest in climate change impact assessment studies. Moreover, the accuracy of GCMs generally decreases from climate-related variables, such as wind, temperature, humidity and air pressure, to hydrologic variables such as precipitation, evapotranspiration, runoff and soil moisture, which are also simulated by GCMs. These limitations restrict the direct use of GCM output in hydrology.

This thesis develops statistical downscaling models to assess climate change impacts, together with methodologies to address GCM and scenario uncertainties in assessing those impacts on hydrology. Downscaling, in the context of hydrology, is a method to project hydrologic variables (e.g., rainfall and streamflow) at a smaller scale based on large-scale climatological variables (e.g., mean sea level pressure) simulated by a GCM.

A statistical downscaling model is first developed to predict the rainfall over the Orissa meteorological subdivision from GCM output of large-scale Mean Sea Level Pressure (MSLP). Gridded monthly MSLP data for the period 1948 to 2002 are obtained from the National Centers for Environmental Prediction / National Center for Atmospheric Research (NCEP/NCAR) reanalysis project for a region spanning 15° N to 25° N in latitude and 80° E to 90° E in longitude, which encapsulates the study region. The downscaling model comprises Principal Component Analysis (PCA), fuzzy clustering and linear regression. PCA is carried out to reduce the dimensionality of the large-scale MSLP field and to convert the correlated variables to uncorrelated variables. Fuzzy clustering is performed to derive the membership of the principal components in each of the clusters, and the memberships obtained are used in regression to statistically relate MSLP and rainfall. The statistical relationship thus obtained is used to predict rainfall from GCM output. For the case study, the rainfall predicted with the GCM developed by CCSR/NIES under the B2 scenario shows a decreasing trend for the non-monsoon period.

Climate change impact assessment models built on downscaled GCM output are subject to a range of uncertainties due to both 'incomplete knowledge' and 'unknowable future scenario' (New and Hulme, 2000). 'Incomplete knowledge' mainly arises from inadequate information and understanding about the underlying geophysical processes of global change, leading to limitations in the accuracy of GCMs; this is termed GCM uncertainty.
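Before turning to scenario uncertainty, a minimal sketch of the downscaling chain just described may help fix ideas: PCA on the gridded MSLP field, fuzzy c-means memberships of the principal components, and a linear regression from memberships (and PCs) to subdivision rainfall. This is an illustrative reconstruction, not the thesis code; the file names, component count, cluster count and fuzzifier m are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

def fcm_memberships(X, centers, m=2.0):
    """Fuzzy c-means membership of each sample in each cluster."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum(axis=1, keepdims=True)

def fcm_centers(X, n_clusters=3, m=2.0, iters=100, seed=0):
    """Alternate membership and center updates for a fixed number of steps."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(iters):
        W = fcm_memberships(X, centers, m).T ** m
        centers = W @ X / W.sum(axis=1, keepdims=True)
    return centers

# Hypothetical inputs: mslp (n_months, n_gridpoints), rain (n_months,)
mslp = np.load("ncep_mslp.npy")
rain = np.load("orissa_rainfall.npy")

pca = PCA(n_components=10).fit(mslp)     # decorrelate and reduce dimension
pcs = pca.transform(mslp)
centers = fcm_centers(pcs)
U = fcm_memberships(pcs, centers)

# Relate MSLP (via memberships and PCs) to rainfall; the same fitted chain
# is then applied to GCM-simulated MSLP fields to project future rainfall.
model = LinearRegression().fit(np.hstack([U, pcs]), rain)
```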
Uncertainty due to 'unknowable future scenario' is associated with the unpredictability of the socio-economic and human behaviour that determines future greenhouse gas (GHG) emission scenarios, and can be termed scenario uncertainty. The downscaled output of a single GCM with a single climate change scenario represents a single trajectory among a number of realizations derived using various GCMs and scenarios. Such a single trajectory alone cannot represent a future hydrologic scenario and will not be useful in assessing hydrologic impacts of climate change.

Nonparametric methods are developed in the thesis to model GCM and scenario uncertainty in the prediction of drought, with the Orissa meteorological subdivision as a case study. Using the downscaling technique described above, future rainfall scenarios are obtained for all available GCMs and scenarios. After correcting for bias, an equiprobability transformation is used to convert the precipitation into the 12-month Standardized Precipitation Index (SPI-12), an annual drought indicator on the basis of which a drought may be classified as severe, mild, and so on. Disagreements are observed between the predictions of SPI-12 resulting from different GCMs and scenarios. Treating SPI-12 as a random variable at every time step, nonparametric methods based on kernel density estimation and orthonormal series are used to determine its nonparametric probability density function (pdf). Probabilities for the different categories of drought are computed from the estimated pdf. An increasing trend is observed in the probability of extreme drought and a decreasing trend in the probability of near-normal conditions in the Orissa meteorological subdivision.

The single-valued Cumulative Distribution Functions (CDFs) obtained from the nonparametric methods suffer from the following limitations: (a) simulations for all scenarios are not available for all GCMs, so incorporating the missing climate experiments might result in a different CDF; (b) the method may simply overfit a multimodal distribution to a relatively small sample of GCMs with a limited number of scenarios; and (c) the set of all scenarios may not fully compose the universal sample space, so a precise single-valued probability distribution may not be representative enough for applications. To overcome these limitations, an interval regression is performed to fit an imprecise normal distribution to SPI-12, providing a band of CDFs instead of a single-valued CDF. Such a band represents the incomplete nature of knowledge, reflecting the extent of what is ignored in the climate change impact assessment. From the imprecise CDFs, imprecise probabilities of the different categories of drought are computed. These results also show an increasing trend in the bounds of the probability of extreme drought and a decreasing trend in the bounds of the probability of near-normal conditions in the Orissa meteorological subdivision.

Water resources planning requires information about future streamflow scenarios in a river basin to combat hydrologic extremes resulting from climate change. It is therefore necessary to downscale GCM projections for streamflow prediction at river basin scales.
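As an illustration of the nonparametric step, the sketch below pools bias-corrected SPI-12 values for one future year across GCM/scenario combinations, estimates the pdf by Gaussian kernel density estimation, and integrates it over conventional SPI class boundaries. The sample values are invented for illustration, and the thesis's orthonormal-series estimator is not shown.

```python
import numpy as np
from scipy.stats import gaussian_kde

# One bias-corrected SPI-12 value per available GCM/scenario (made-up numbers).
spi = np.array([-2.3, -1.1, -1.8, -0.4, -2.6, -0.9, -1.5])
kde = gaussian_kde(spi)

def prob(lo, hi):
    """Probability mass of SPI-12 falling in (lo, hi) under the KDE."""
    return kde.integrate_box_1d(lo, hi)

print("P(extreme drought, SPI <= -2):", prob(-np.inf, -2.0))
print("P(severe drought, -2 < SPI <= -1.5):", prob(-2.0, -1.5))
print("P(near normal, -1 < SPI < 1):", prob(-1.0, 1.0))
```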
A statistical downscaling model based on PCA, fuzzy clustering and the Relevance Vector Machine (RVM) is developed to predict the monsoon streamflow of the Mahanadi river at Hirakud reservoir from GCM projections of large-scale climatological data. Surface air temperature at 2 m, Mean Sea Level Pressure (MSLP), geopotential height at the 500 hectopascal (hPa) pressure level and surface specific humidity are the predictors for modeling Mahanadi streamflow in the monsoon season. PCA is used to reduce the dimensionality of the predictor dataset and to convert the correlated variables to uncorrelated variables. Fuzzy clustering is carried out to derive the membership of the principal components in each of the clusters, and the memberships thus obtained are used in the RVM regression model. The RVM retains fewer relevant vectors and is less prone to overfitting than the Support Vector Machine (SVM). Different kernel functions are compared, and it is concluded that a heavy-tailed Radial Basis Function (RBF) performs best for streamflow prediction with GCM output in the case considered. The GCM CCSR/NIES under the B2 scenario projects a decreasing trend in future monsoon streamflow of the Mahanadi, which is likely due to high surface warming.

A possibilistic approach is developed next for modeling GCM and scenario uncertainty in projections of monsoon streamflow of the Mahanadi river. Three GCMs, the Center for Climate System Research / National Institute for Environmental Studies (CCSR/NIES) model, the Hadley Centre Coupled Model 3 (HadCM3) and the Coupled Global Climate Model 2 (CGCM2), with two scenarios, A2 and B2, are used for this purpose. Possibilities are assigned to the GCMs and scenarios based on their system performance measure in predicting the streamflow during the years 1991-2005, when signals of climate forcing are visible. The possibilities are used as weights for deriving the possibilistic mean CDF for three standard time slices: the 2020s, 2050s and 2080s. The streamflow value at which the possibilistic mean CDF reaches 1 decreases with time, indicating a reduced probability of extreme high-flow events in the future and thus a likely decreasing trend in the monthly peak flow. One possible reason for such a trend is the significant increase in temperature due to climate warming. The simultaneous reduction in Mahanadi streamflow and increase in extreme drought in the Orissa meteorological subdivision is likely to pose a challenge for water resources engineers in meeting future water demands.
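Scikit-learn has no RVM, but a close stand-in for the regression step described above is ARD (automatic relevance determination) regression over an RBF kernel basis: like the RVM, it prunes most basis weights to zero, leaving a sparse set of "relevant" vectors. The sketch below uses that substitution; the predictor files, kernel width and pruning threshold are assumptions.

```python
import numpy as np
from sklearn.linear_model import ARDRegression
from sklearn.metrics.pairwise import rbf_kernel

X = np.load("memberships_and_pcs.npy")    # hypothetical predictor matrix
y = np.load("mahanadi_monsoon_flow.npy")  # hypothetical observed streamflow

Phi = rbf_kernel(X, X, gamma=0.1)         # one RBF basis per training sample
rvm_like = ARDRegression().fit(Phi, y)

# Basis functions whose weights survive ARD pruning play the role of the
# relevance vectors; GCM-derived predictors are projected the same way.
relevant = np.flatnonzero(np.abs(rvm_like.coef_) > 1e-6)
print(f"{relevant.size} of {len(X)} basis functions retained")
```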

Vibrational spectroscopy of keratin fibres: A forensic approach

Panayiotou, Helen, January 2004
Human hair profiling is an integral part of a forensic investigation, but it is one of the most technically difficult subjects in forensic science. This thesis describes the research and development of a novel approach for the rapid identification of unknown human and other related keratin fibres found at a crime scene. The work is developed systematically and considers sample collection, sample preparation, analysis and interpretation of spectral data for the profiling of hair fibres encountered in criminal cases. Spectral comparison of fibres was facilitated by chemometrics methods such as PCA, SIMCA and fuzzy clustering, and by the less common approach of multi-criteria decision making (MCDM). The aim of the thesis was to investigate the potential of vibrational spectroscopy techniques for matching and discriminating single keratin hair fibres in the context of forensic evidence.

The first objective (chapter 3) was to evaluate Raman and FT-IR micro-spectroscopy for the forensic sampling of hair fibres and to propose the preferred technique for future forensic hair comparisons. The selection was based on criteria such as spectral quality, ease of use, rapid analysis and universal applicability to different hair samples. FT-IR micro-spectroscopy was found to be the most appropriate technique for hair analysis because it enabled the rapid collection of spectra from a wide variety of hair fibres. Raman micro-spectroscopy, on the other hand, was hindered by fluorescence and did not allow the collection of spectra from pigmented fibres. This objective therefore showed that FT-IR micro-spectroscopy is the preferred spectroscopic technique for forensic analysis of hair fibres, whilst Raman spectroscopy is the least preferred.

The second objective (chapter 3) was to investigate, through a series of experiments, the effect of chemical treatment on the micro-environment of human hair fibres. The effect of bleaching agents on hair fibres was studied in some detail at different treatment times, and the results indicate a significant change in the chemical environment of the secondary structure of the fibre along with changes in the C-C backbone structure. One of the most important outcomes of this research concerns the behaviour of the α-helix during chemical treatment. The hydrogen bonding in the α-helix provides the stable structure of the fibre, and any disruption to the α-helix will therefore inevitably damage the molecular structure of the fibre. The results highlighted the behaviour of the α-helix, which undergoes a significant decrease in content during oxidation and is partly converted to a random-coil structure, whilst the β-sheet component of the secondary structure remains unaffected. The reported investigations show that the combination of FT-IR and Raman micro-spectroscopy can provide insight into the complex chemical properties and reactions within a treated hair fibre. Importantly, this work demonstrates that, with the aid of chemometrics, FT-IR and Raman micro-spectroscopic information can be investigated simultaneously for oxidised hair fibres collected from one subject and treated for different times. The discrimination and matching of hair fibres on the basis of treatment has potential forensic applications.
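A minimal sketch of the chemometric comparison, assuming the fibre spectra are available as a NumPy array: second-derivative preprocessing (Savitzky-Golay) to suppress baseline effects, then PCA scores in which treated and untreated fibre spectra can be compared or clustered. The file name, window length and component count are illustrative assumptions.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA

spectra = np.load("ftir_hair_spectra.npy")   # (n_fibres, n_wavenumbers)

# Second derivatives suppress baseline drift and sharpen overlapping bands.
d2 = savgol_filter(spectra, window_length=15, polyorder=3, deriv=2, axis=1)

scores = PCA(n_components=3).fit_transform(d2)  # fibres compared in PC space
```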
The third objective (chapter 4) expanded the forensic application of FT-IR micro-spectroscopy to other keratin fibres. Animal fibres are commonly encountered at crime scenes, and it is thus important to establish the origin of those fibres. The aim of this work was to establish the forensic applications of FT-IR micro-spectroscopy to animal fibres and to investigate fundamental molecular differences between these fibres. The results established a discrimination between fibres consisting predominantly of α-helix and those containing mainly β-sheet structure. More importantly, it was demonstrated through curve-fitting and chemometrics that each keratin fibre contains a characteristic secondary-structure arrangement. The work presented here is the first detailed FT-IR micro-spectroscopic study, utilising chemometrics as well as MCDM methods, of a wide range of keratin fibres commonly found as forensic evidence. Furthermore, it was demonstrated with the aid of the rank-ordering MCDM methods PROMETHEE and GAIA that keratin fibres can be ranked and discriminated according to their molecular characteristics obtained from direct measurements together with information sourced from the literature.

The final objective (chapter 5) was to propose an alternative method for the discrimination and matching of single scalp human hair fibres through FT-IR micro-spectroscopy and chemometrics. The work successfully demonstrated, through a number of case scenarios, the application of the technique to the identification of variables such as gender and race for an unknown single hair fibre. In addition, it was illustrated that known hair fibres (from the suspect or victim) can be readily matched to unknown hair fibres found at the crime scene. This is the first time that a substantial, systematic FT-IR study of forensic hair identification has been presented. The research has shown that it is possible to model and correlate an individual's characteristics with hair properties at the molecular level with the use of chemometrics methods. A number of important forensic variables of immediate use to police in a crime scene investigation, such as gender, race, treatment, and black versus white hair fibres, were investigated. Blind samples were successfully applied both to validate the available experimental data and to extend the current database of experimental determinations. Protocols were proposed for the application of this methodology in the future.

The proposed FT-IR methodology has provided an alternative approach to the characterisation of single scalp human hair fibres. The technique enables the rapid collection of spectra, followed by the objective analytical capabilities of chemometrics, to successfully discriminate animal fibres, human hair fibres from different sources, treated from untreated hair fibres, and black from white hair fibres on the basis of their molecular structure. The results can be readily produced and explained in courts of law. Although the proposed, relatively fast FT-IR technique is not aimed at displacing the two slower existing methods of hair analysis, namely comparative optical microscopy and DNA analysis, it gives a new dimension to the characterisation of hair fibres at the molecular level, providing a powerful tool for forensic investigations.
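The curve-fitting mentioned above can be sketched as a band decomposition of the amide I region: Gaussian components at commonly cited positions for β-sheet, α-helix and turns, with relative areas read as secondary-structure fractions. The band centres and data files here are generic literature-style assumptions, not the thesis's fitted values.

```python
import numpy as np
from scipy.optimize import curve_fit

def amide_i(x, a1, a2, a3, w1, w2, w3):
    """Gaussians near 1630 (beta-sheet), 1655 (alpha-helix), 1680 (turns) cm^-1."""
    g = lambda a, c, w: a * np.exp(-((x - c) ** 2) / (2.0 * w ** 2))
    return g(a1, 1630.0, w1) + g(a2, 1655.0, w2) + g(a3, 1680.0, w3)

x = np.load("wavenumbers.npy")         # hypothetical amide I axis
y = np.load("amide_i_absorbance.npy")  # hypothetical baseline-corrected band

p, _ = curve_fit(amide_i, x, y, p0=[1, 1, 1, 10, 10, 10])
areas = np.array([p[i] * p[i + 3] for i in range(3)])  # area ~ amplitude*width
print("beta-sheet, alpha-helix, turns fractions:", areas / areas.sum())
```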

AIM - A Social Media Monitoring System for Quality Engineering

Bank, Mathias, 14 June 2013
In the last few years the World Wide Web has dramatically changed the way people communicate with each other. The growing availability of Social Media systems like Internet forums, weblogs and social networks ensures that the Internet today is what it was originally designed for: a technical platform in which all users are able to interact with each other. Nowadays, billions of user comments are available discussing all aspects of life, and the data source is still growing.

This thesis investigates whether this growing amount of freely provided user comments can be used to extract quality-related information. The concept is based on the observation that customers do not only post marketing-relevant information; they also publish product-oriented content including positive and negative experiences. It is assumed that this information represents a valuable data source for quality analyses: the original voices of the customers promise a more exact and more concrete definition of "quality" than the one available to manufacturers or market researchers today.

However, the huge amount of unstructured user comments makes their evaluation very complex. It is impossible for an analyst to manually investigate the provided customer feedback. Therefore, Social Media-specific algorithms have to be developed to collect, pre-process and finally analyze the data. This has been done by the Social Media monitoring system AIM (Automotive Internet Mining), the subject of this thesis. It investigates how manufacturers, products, product features and related opinions are discussed, in order to estimate the overall product quality from the customers' point of view.

AIM is able to track different types of data sources using a flexible multi-agent based crawler architecture. In contrast to classical web crawlers, the multi-agent based crawler supports individual crawling policies to minimize the download of irrelevant web pages. In addition, an unsupervised wrapper induction algorithm is introduced to automatically generate content extraction parameters specific to the crawled Social Media systems. The extracted user comments are analyzed by different content analysis algorithms to gain a deeper insight into the discussed topics and opinions. Three different topic types are supported, depending on the analysis needs:

* Highly reliable analysis results are produced by a special context-aware, taxonomy-based classification system.
* Fast ad-hoc analyses are applied on top of classical full-text search capabilities.
* Finally, AIM supports the detection of blind spots using a new fuzzified hierarchical clustering algorithm, which generates topical clusters while supporting multiple topics within each user comment (a sketch of this idea appears below).

All three topic types are treated in a unified way, enabling an analyst to apply all methods simultaneously and interchangeably. The systematically processed user comments are visualized within an easy and flexible interactive analysis frontend. Special abstraction techniques support the investigation of thousands of user comments with minimal time effort.
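A toy sketch of the blind-spot idea referenced in the list above, under stated assumptions: TF-IDF vectors, a crisp agglomerative clustering, then a soft reassignment so a comment touching several topics receives graded membership in several clusters. The similarity-based membership rule stands in for the thesis's fuzzified agglomerative algorithm, which is not reproduced here.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.feature_extraction.text import TfidfVectorizer

comments = ["brake noise when cold", "paint peeling on the hood",
            "squeaking brakes and peeling paint", "infotainment freezes"]
X = TfidfVectorizer().fit_transform(comments).toarray()

labels = AgglomerativeClustering(n_clusters=3).fit_predict(X)
centroids = np.stack([X[labels == k].mean(axis=0) for k in range(3)])

# Soft memberships: cosine similarity to each topical centroid, normalized
# per comment, so one comment may carry weight in several topic clusters.
sim = X @ centroids.T
sim /= (np.linalg.norm(X, axis=1, keepdims=True)
        * np.linalg.norm(centroids, axis=1) + 1e-12)
U = sim / sim.sum(axis=1, keepdims=True)
```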
Within this frontend, specially created indices show the relevance of, and customer satisfaction with, a given topic.

/ In the last few years the World Wide Web has changed dramatically. While a few years ago it was still primarily an information source in which only a small share of users could publish content, it has since evolved into a communication platform in which every user can participate actively. The resulting volume of data covers every aspect of daily life, including quality topics. Analyzing these data promises to improve quality assurance measures significantly, as it allows topics to be addressed that are difficult to measure with classical sensors. The systematic and reproducible analysis of user-generated data, however, requires the adaptation of existing tools as well as the development of new Social Media-specific algorithms. To this end, this thesis creates an entirely new Social Media monitoring system with which an analyst can analyze thousands of user comments with minimal time effort. Applying the system has revealed several advantages that make it possible to recognize the customer-driven definition of "quality".

Contents:
1 Introduction
1.1 Chapter Overview
2 Problem Definition and Data Environment
2.1 Commonly Applied Quality Sensors
2.2 The Growing Importance of Social Media
2.3 Social Media based Quality Experience
2.4 Change to the Holistic Concept of Quality
2.5 Definition of User Generated Content and Social Media
2.6 Social Media Software Architectures
3 Data Collection
3.1 Related Work
3.2 Requirement Analysis
3.3 A Blackboard Crawler Architecture
3.4 Semi-supervised Wrapper Generation
3.5 Structure Modification Detection
3.6 Conclusion
4 Hierarchical Fuzzy Clustering
4.1 Related Work
4.2 Generalization of Agglomerative Crisp Clustering Algorithms
4.3 Topic Groups Generation
4.4 Evaluation
4.5 Conclusion
5 A Social Media Monitoring System for Quality Analyses
5.1 Related Work
5.2 Pre-Processing Workflow
5.3 Quality Indices
5.4 AIM Architecture
5.5 Evaluation
5.6 Conclusion
6 Conclusion and Perspectives
6.1 Contributions and Conclusions
6.2 Perspectives
Bibliography

Organization and management of image databases based on manifold learning techniques

Μακεδόνας, Ανδρέας, 22 December 2009
The subject of this doctoral thesis is color image processing using graph-theoretic methods, image retrieval, and image database management and organization in a reduced feature space using pattern recognition analysis, with multimedia applications. The problems were approached in their full generality and addressed through the following points:
1. Development of techniques for extracting image visual attributes based on low-level features (color and texture information), for use in image similarity and retrieval applications.
2. Calculation of metrics and distances in the feature space.
3. Study of the image manifolds created in the selected feature space.
4. Application of dimensionality-reduction techniques and production of two-dimensional representations (biplots).
5. Application of the proposed methodologies to perceptual image distances.
Graph theory and pattern recognition methodologies were incorporated to provide novel solutions both to color image retrieval from image databases and to image database management and organization. The thesis brings image processing closer to graph-theoretic methods, statistical analysis and pattern recognition. Throughout the thesis, particular emphasis was placed on finding the best trade-off between system effectiveness and efficiency when applying the proposed algorithmic procedures. Extensive experimental results carried out at all stages of the study demonstrate the enhanced performance of the proposed methodologies.
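A compact sketch of such a pipeline under stated assumptions: a normalized RGB histogram as the low-level color feature for each image, and Isomap to embed the image manifold in two dimensions for visualization and organization. File names, histogram bins and the neighbourhood size are illustrative, not the thesis's choices.

```python
import numpy as np
from PIL import Image
from sklearn.manifold import Isomap

def color_histogram(path, bins=8):
    """Normalized 3-D RGB histogram, flattened to a bins**3 feature vector."""
    img = np.asarray(Image.open(path).convert("RGB"))
    h, _ = np.histogramdd(img.reshape(-1, 3), bins=(bins,) * 3,
                          range=((0, 256),) * 3)
    return (h / h.sum()).ravel()

paths = ["img_%03d.jpg" % i for i in range(200)]    # hypothetical database
X = np.stack([color_histogram(p) for p in paths])

emb2d = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
```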

Channel Probing for an Indoor Wireless Communications Channel

Hunter, Brandon, 13 March 2003
The statistics of the amplitude, time and angle of arrival of multipath components in an indoor environment are all necessary inputs to the multipath models used to simulate the performance of spatial diversity in receive antenna configurations. The model presented by Saleh and Valenzuela, later extended by Spencer et al., includes all three of these parameters for a 7 GHz channel. A system was built to measure these multipath parameters at 2.4 GHz for multiple locations in an indoor environment. Another system was built to measure the angle of transmission for a 6 GHz channel; the addition of this parameter allows spatial diversity to be simulated at the transmitter as well as the receiver. The process of going from raw measurement data to discrete arrivals, and then to clustered arrivals, is analyzed. Many possible errors associated with discrete-arrival processing are discussed along with possible solutions. Four clustering methods are compared and their relative strengths and weaknesses pointed out. The effects that errors in the clustering process have on parameter estimation and model performance are also simulated.
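For readers unfamiliar with the model, a minimal generator of Saleh-Valenzuela arrivals is sketched below: cluster arrivals form a Poisson process with rate Λ, ray arrivals within a cluster form a Poisson process with rate λ, and mean power decays exponentially with cluster and ray time constants Γ and γ. The parameter values are illustrative, not the fitted 2.4 GHz results.

```python
import numpy as np

rng = np.random.default_rng(0)
LAM, lam, GAM, gam = 0.02, 0.4, 30.0, 5.0   # rates in 1/ns, decays in ns

def sv_arrivals(n_clusters=5, rays_per_cluster=20):
    """Delays and mean powers of multipath arrivals, Saleh-Valenzuela style."""
    taus, powers = [], []
    T = 0.0
    for _ in range(n_clusters):
        T += rng.exponential(1.0 / LAM)      # Poisson cluster arrival time
        t = 0.0
        for _ in range(rays_per_cluster):
            t += rng.exponential(1.0 / lam)  # Poisson ray delay in cluster
            taus.append(T + t)
            powers.append(np.exp(-T / GAM) * np.exp(-t / gam))
    return np.array(taus), np.array(powers)

tau, p = sv_arrivals()
```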
