About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations, provided by the Networked Digital Library of Theses and Dissertations (NDLTD). Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
1

Spatial data quality management

He, Ying, Surveying & Spatial Information Systems, Faculty of Engineering, UNSW, January 2008
The applications of geographic information systems (GIS) in various areas have highlighted the importance of data quality. Data quality research has been given priority by GIS academics for three decades. However, the outcomes of data quality research have not been sufficiently translated into practical applications, and users still need a GIS capable of storing, managing and manipulating data quality information. To fill this gap, this research investigates how to develop a tool that effectively and efficiently manages data quality information to help data users better understand and assess the quality of their GIS outputs. Specifically, this thesis aims: 1. to develop a framework for establishing a systematic linkage between data quality indicators and appropriate uncertainty models; 2. to propose an object-oriented data quality model for organising and documenting data quality information; 3. to create data quality schemas for defining and storing the contents of metadata databases; 4. to develop a new conceptual model of data quality management; and 5. to develop and implement a prototype system for enhancing the capability of data quality management in commercial GIS. Based on reviews of error and uncertainty modelling in the literature, a conceptual framework has been developed to establish the systematic linkage between data quality elements and appropriate error and uncertainty models. To overcome the limitations identified in the review and to satisfy a series of requirements for representing data quality, a new object-oriented data quality model is proposed. It enables data quality information to be documented and stored in a multi-level structure and to be integrally linked with spatial data for access, processing and graphic visualisation. A conceptual model for data quality management is proposed in which a data quality storage model, uncertainty models and visualisation methods are the three basic components. This model establishes the processes involved in managing data quality, emphasising the integration of uncertainty modelling and visualisation techniques. These studies lay the theoretical foundations for the development of a prototype system able to manage data quality. An object-oriented approach, database technology and programming techniques were integrated to design and implement the prototype within the ESRI ArcGIS software. The object-oriented approach allows the prototype to be developed in a flexible and easily maintained manner. The prototype allows users to browse and access data quality information at different levels, and a set of error and uncertainty models is embedded within the system. Data quality elements can be extracted from the database and automatically linked with the appropriate error and uncertainty models, with their implications presented in the form of simple maps. The system thus proposes a set of candidate uncertainty models from which users can choose when assessing how the uncertainty inherent in the data affects their specific application, significantly increasing users' confidence in using the data for a particular situation. To demonstrate the enhanced capability of the prototype, the system has been tested against real data. The implementation has shown that the prototype can efficiently assist data users, especially non-expert users, to better understand data quality and use it in a more practical way.
The methodologies and approaches for managing quality information presented in this thesis should serve as an impetus for further research.
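To make the multi-level, object-oriented data quality model concrete, the following minimal Python sketch shows one way such a structure could look. The class names, levels and quality elements are illustrative assumptions, not the thesis's actual design, which was implemented within ESRI ArcGIS.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, Optional

@dataclass
class QualityElement:
    """One data quality element (e.g. positional accuracy) with its value."""
    name: str
    value: float
    unit: str
    # Uncertainty model associated with this element, e.g. a function that
    # draws perturbed realisations of a coordinate given its accuracy.
    uncertainty_model: Optional[Callable[[float], float]] = None

@dataclass
class QualityRecord:
    """Quality information attached to one level of the spatial data."""
    level: str                      # "dataset", "feature class" or "feature"
    elements: Dict[str, QualityElement] = field(default_factory=dict)

@dataclass
class SpatialObject:
    """A spatial feature integrally linked with its quality information."""
    object_id: int
    geometry: object                # placeholder for an actual geometry type
    quality: QualityRecord

# Multi-level storage: dataset-level quality plus per-feature quality.
dataset_quality = QualityRecord(level="dataset")
dataset_quality.elements["completeness"] = QualityElement("completeness", 98.5, "%")

feature = SpatialObject(
    object_id=1,
    geometry=None,
    quality=QualityRecord(
        level="feature",
        elements={"positional_accuracy": QualityElement("positional accuracy", 0.5, "m")},
    ),
)
```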
2

Evaluating the performance of aggregate production planning strategies under uncertainty

Jamalnia, Aboozar, January 2017
The thesis is presented in a three-paper format. Paper 1 presents the first bibliometric literature survey of its kind on aggregate production planning (APP) in the presence of uncertainty. It surveys a wide range of literature that employs operations research/management science methodologies to deal with APP under uncertainty, classifying it into six main categories such as stochastic mathematical programming, fuzzy mathematical programming and simulation. After a preliminary literature analysis, e.g. with regard to the number of publications by journal and publication frequency by country, the literature in each of these categories is briefly reviewed. A more detailed statistical analysis of the surveyed research is then presented with respect to the source of uncertainty, the trend in the number of publications over time, the adopted APP strategies, the applied management science methodologies and their sub-categories, and so on. Finally, possible future research paths are discussed on the basis of identified research trends and research gaps. The second paper proposes a novel decision model for the APP decision-making problem based on a mixed chase and level strategy under uncertainty, where market demand acts as the main source of uncertainty. Owing to its novel features, the constructed model is stochastic, nonlinear, multi-stage and multi-objective. APP in practice entails multiple objectives; the model therefore involves objectives such as total revenue, total production costs, total labour productivity costs, optimum utilisation of production resources and capacity, and customer satisfaction, and is validated on the basis of real-world data from the beverage manufacturing industry. Applying the recourse approach in stochastic programming leads to an empty feasible space, so the wait-and-see approach is used instead. After solving the model using the real-world industrial data, sensitivity analysis and several forms of trade-off analysis are conducted by changing different parameters/coefficients of the constructed model and by analysing the compromise between objectives, respectively. Finally, possible future research directions are discussed with regard to the limitations of the present study. The third paper appraises the performance of different APP strategies in the presence of uncertainty. The relevant models for the various APP strategies, including the pure chase, the pure level, the modified chase and the modified level strategies, are derived from the fundamental model developed for the mixed chase and level strategy in paper 2. The same procedure used in paper 2 is followed to solve the models constructed for these strategies with respect to the aforementioned objectives/criteria, in order to provide business and managerial insights to operations managers about the effectiveness and practicality of these APP policies under uncertainty. Multiple criteria decision making (MCDM) methods such as the additive value function (AVF), the technique for order of preference by similarity to ideal solution (TOPSIS) and VIKOR are also used alongside multi-objective optimisation to assess the overall performance of each APP strategy.
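As a concrete illustration of one of the MCDM methods named above, the following minimal Python sketch implements TOPSIS for ranking APP strategies against several criteria. The strategy names, criteria, weights and scores are invented for illustration and are not the paper's data.

```python
import numpy as np

def topsis(scores: np.ndarray, weights: np.ndarray, benefit: np.ndarray) -> np.ndarray:
    """Rank alternatives by relative closeness to the ideal solution.

    scores:  (n_alternatives, n_criteria) decision matrix
    weights: criteria weights, summing to 1
    benefit: True where larger is better, False where smaller is better
    """
    # Vector-normalise each criterion column, then apply the weights.
    norm = scores / np.linalg.norm(scores, axis=0)
    v = norm * weights
    # Ideal and anti-ideal points depend on criterion direction.
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_plus = np.linalg.norm(v - ideal, axis=1)
    d_minus = np.linalg.norm(v - anti, axis=1)
    return d_minus / (d_plus + d_minus)  # closeness coefficient, higher is better

# Hypothetical scores for four APP strategies on three criteria:
# total revenue, total production cost, customer satisfaction.
scores = np.array([
    [9.2, 7.1, 0.80],   # mixed chase and level
    [8.5, 6.0, 0.70],   # pure chase
    [8.0, 8.2, 0.75],   # pure level
    [8.8, 6.8, 0.78],   # modified chase
])
weights = np.array([0.4, 0.35, 0.25])
benefit = np.array([True, False, True])   # cost is a "smaller is better" criterion
print(topsis(scores, weights, benefit))   # closeness coefficient per strategy
```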
3

A Bayesian approach to financial model calibration, uncertainty measures and optimal hedging

Gupta, Alok, January 2010
In this thesis we address problems associated with financial modelling from a Bayesian point of view. Specifically, we look at the problems of calibrating financial models, measuring the model uncertainty of a claim, and choosing an optimal hedging strategy. Throughout the study, the local volatility model is used as a working example to clarify the proposed methods. The thesis assumes a prior probability density for the unknown parameter of the model we try to calibrate; this prior regularises the ill-posedness of the calibration problem. Observations of market prices are then used to update the prior via Bayes' law, giving a posterior probability density for the unknown model parameter. The resulting Bayes estimators are shown to be consistent for finite-dimensional model parameters. The posterior density is then used to compute the Bayesian model average price. In tests on local volatility models, this price is shown to be closer to the price given by the true model than the prices produced by comparable calibration methods. The second part of the thesis focuses on quantifying model uncertainty. Using the framework for market risk measures, we propose axioms for new classes of model uncertainty measures. As in the market risk case, we prove representation theorems for coherent and convex model uncertainty measures. Example measures from the latter class are constructed using the Bayesian posterior and are used to value the model uncertainty for a range of financial contracts priced in the local volatility model. In the final part of the thesis we propose a method for selecting, from a set of candidate models, the model that optimises the hedging of a specified financial contract. In particular, we choose the model whose corresponding price and hedge optimise some hedging performance indicator. The selection problem is solved using Bayesian loss functions to encapsulate the loss from using one model to price and hedge when the true model is different. Links are made with convex model uncertainty measures and traditional utility functions. Numerical experiments on a stochastic volatility model and the local volatility model show that the Bayesian strategy can outperform traditional strategies, especially for exotic options.
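The following minimal Python sketch illustrates the core calibration idea: update a prior over a model parameter with an observed market price via Bayes' law, then compute the Bayesian model average price. The pricing function, parameter grid and noise level are toy assumptions, not the thesis's local volatility setup.

```python
import numpy as np

def price_model(theta):
    """Toy pricing function: claim price as a function of the model parameter."""
    return 100.0 * np.exp(-0.5 * theta)

thetas = np.linspace(0.1, 0.9, 81)          # candidate parameter grid
prior = np.ones_like(thetas) / len(thetas)  # flat prior regularises the calibration

observed_price = 72.0   # a single observed market quote (illustrative)
noise_std = 1.5         # assumed pricing-error standard deviation

# Bayes' law: posterior proportional to likelihood times prior,
# with a Gaussian likelihood for the observed price.
likelihood = np.exp(-0.5 * ((price_model(thetas) - observed_price) / noise_std) ** 2)
posterior = likelihood * prior
posterior /= posterior.sum()

# Bayesian model average price: posterior-weighted average over candidate models.
bma_price = np.sum(posterior * price_model(thetas))
print(f"Bayesian model average price: {bma_price:.2f}")
```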
4

Structure from Motion Using Optical Flow Probability Distributions

Merrell, Paul Clark, 18 March 2005
Several novel structure from motion algorithms are presented that are designed to manage noise more effectively. In many practical applications, structure from motion algorithms fail to work properly because of the noise in the optical flow values. Most structure from motion algorithms implicitly assume that the noise is identically distributed and that it is white. Both assumptions are false: some points can be tracked more easily than others, and some points can be tracked more easily in a particular direction. The accuracy of each optical flow value can be quantified using an optical flow probability distribution. By using optical flow probability distributions in place of point estimates of optical flow in a structure from motion algorithm, a better understanding of the noise is developed and a more accurate solution is obtained. Two different methods of calculating the optical flow probability distributions are presented: the first calculates non-Gaussian probability distributions, the second Gaussian ones. Three different methods for calculating structure from motion using these probability distributions are presented. The first works on two frames and can handle any kind of noise; the second works on two frames and is restricted to Gaussian noise; the final method works on multiple frames and assumes Gaussian noise. A simulation was created to directly compare the performance of methods that use optical flow probability distributions and methods that do not. The simulation results show that the methods which use the probability distributions better estimate the camera motion and the structure of the scene.
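The following minimal Python sketch illustrates the central idea in the Gaussian case: weighting each optical flow measurement by its per-point uncertainty rather than treating all flows as equally reliable. The translation-only motion model and all data are synthetic assumptions, far simpler than the thesis's structure from motion algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)
true_motion = np.array([1.0, 0.5])   # ground-truth 2D image translation

n = 200
# Per-point noise levels: some points track well in x but poorly in y, etc.,
# so the noise is neither identically distributed nor isotropic.
sigmas = rng.uniform(0.05, 1.0, size=(n, 2))
flows = true_motion + rng.normal(0.0, sigmas)

# Under independent Gaussian noise, the maximum-likelihood motion estimate is
# the inverse-variance weighted mean, not the plain (unweighted) mean.
w = 1.0 / sigmas**2
estimate = (w * flows).sum(axis=0) / w.sum(axis=0)

print("plain mean:    ", flows.mean(axis=0))
print("weighted mean: ", estimate)  # typically much closer to true_motion
```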
5

Modeling and State-of-Charge Estimation of Lithium-Ion Cells (Modellierung und Ladezustandsdiagnose von Lithium-Ionen-Zellen)

Bartholomäus, Ralf; Wittig, Henning, 28 February 2020
In this paper, a new approach to modeling lithium-ion cells is presented. In addition to a model that describes the nominal behavior of the cell, an uncertainty model is parameterized which quantifies the unavoidable deviation between the nominal model and the true cell behavior. For this model description, a new algorithm for state-of-charge estimation is developed which provides a confidence interval instead of a single (error-prone) value for the state of charge, and which avoids artifacts in the progression of the estimated state of charge over time. The properties of the state-of-charge estimation are demonstrated on a lithium-ion cell in an automotive application scenario.
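The following minimal Python sketch illustrates how a bounded uncertainty model can turn coulomb counting into an interval-valued state-of-charge estimate. The capacity, sensor error bound and load profile are illustrative assumptions; the paper's actual algorithm is not reproduced here.

```python
import numpy as np

capacity_ah = 50.0      # assumed cell capacity in Ah
sensor_bound = 0.2      # assumed worst-case current-sensor error in A
dt_h = 1.0 / 3600.0     # 1 s time step expressed in hours

soc_lo, soc_hi = 0.80, 0.80          # both bounds start at a known initial SoC
currents = np.full(3600, -10.0)      # one hour of 10 A discharge (discharge < 0)

for i in currents:
    # Propagate the lower and upper SoC bounds under the worst-case
    # sensor error; the true SoC is guaranteed to lie in [soc_lo, soc_hi].
    soc_lo += (i - sensor_bound) * dt_h / capacity_ah
    soc_hi += (i + sensor_bound) * dt_h / capacity_ah

print(f"SoC confidence interval: [{soc_lo:.3f}, {soc_hi:.3f}]")
```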
6

Computational modelling of the neural systems involved in schizophrenia

Thurnham, A. J., January 2008
The aim of this thesis is to improve our understanding of the neural systems involved in schizophrenia by suggesting possible avenues for future computational modelling, in an attempt to make sense of the vast number of studies on the symptoms and cognitive deficits associated with the disorder. This multidisciplinary research covers three levels of analysis: abnormalities in microscopic brain structure, dopamine dysfunction at a neurochemical level, and interactions between cortical and subcortical brain areas connected by cortico-basal ganglia circuit loops; it has culminated in the production of five models that provide useful clarification in this difficult field. The thesis comprises three major modelling themes. Firstly, in Chapter 3 I looked at an existing neural network model addressing the neurodevelopmental hypothesis of schizophrenia by Hoffman and McGlashan (1997). However, it soon became clear that such models were overly simplistic and brittle when it came to replication. While they focused on hallucinations and connectivity in the frontal lobes, they ignored other symptoms and the evidence of reductions in the volume of the temporal lobes in schizophrenia. No mention was made of the considerable evidence of dysfunction of the dopamine system and associated areas, such as the basal ganglia. This led to my second line of reasoning: dopamine dysfunction. Initially I helped create a novel model of dopamine neuron firing based on the computational substrate for incentive salience by McClure, Daw and Montague (2003), incorporating temporal difference (TD) reward prediction errors (Chapter 5). I adapted this model in Chapter 6 to address the ongoing debate as to whether or not dopamine encodes uncertainty in the delay period between the presentation of a conditioned stimulus and the receipt of a reward, as suggested by the sustained activation seen in single dopamine neuron recordings (Fiorillo, Tobler & Schultz 2003). An answer to this question could yield a better understanding of the nature of dopamine signalling, with implications for the psychopathology of cognitive disorders, like schizophrenia, in which dopamine is commonly regarded as having a primary role. Computational modelling enabled me to suggest that while sustained activation is common in single trials, it may increase with increasing reward probability, in which case dopamine would not be encoding uncertainty in this manner. Importantly, these predictions can be tested against experimental data. My third modelling theme arose from the limitations of using TD alone to account for a reinforcement learning model of action control in the brain. In Chapter 8 I introduce a dual-weighted artificial neural network, originally designed by Hinton and Plaut (1987) to address the problem of catastrophic forgetting in multilayer artificial neural networks. I suggest an alternative use for a model with fast and slow weights: arbitration between two systems of control. This novel approach is capable of combining the benefits of model-free and model-based learning in one simple model, without the need for a homunculus, and may have important implications for understanding how goal-directed and stimulus-response learning coexist.
Modelling cortical-subcortical loops offers the potential to incorporate both the symptoms and the cognitive deficits associated with schizophrenia, by taking into account the interactions between the midbrain/striatum and cortical areas.
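The following minimal Python sketch shows the TD reward prediction error at the heart of the Chapter 5 and 6 models, learning the value of time steps within a probabilistically rewarded conditioning trial. The trial structure and parameters are illustrative assumptions, not the thesis's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 11                 # time steps per trial; reward arrives at the last step
alpha, gamma = 0.1, 1.0
p_reward = 0.5         # reward probability, the variable manipulated by
                       # Fiorillo, Tobler & Schultz (2003)

V = np.zeros(T + 1)    # learned value of each time step; V[T] is terminal

for trial in range(5000):
    r = np.zeros(T)
    r[-1] = float(rng.random() < p_reward)   # probabilistic reward delivery
    for t in range(T):
        # TD prediction error: delta = r(t) + gamma * V(t+1) - V(t)
        delta = r[t] + gamma * V[t + 1] - V[t]
        V[t] += alpha * delta

# After learning, the prediction error at reward time is positive on rewarded
# trials and negative on omitted ones; its trial-to-trial variability is
# largest at p_reward = 0.5, the condition linked to sustained "uncertainty"
# responses in dopamine neurons.
print(np.round(V[:T], 2))
```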
7

Uncertainty in the first principle model based condition monitoring of HVAC systems

Buswell, Richard A., January 2001
Model-based techniques for automated condition monitoring of HVAC systems have been under development for some years. Results from the application of these methods to systems installed in real buildings have highlighted robustness and sensitivity issues. The generation of false alarms has been identified as a principal factor limiting the usefulness of condition monitoring in HVAC applications. The robustness issue is a direct result of the uncertain measurements and the lack of experimental control that are characteristic of HVAC systems. This thesis investigates the uncertainties associated with implementing a condition monitoring scheme based on simple first-principles models in HVAC subsystems installed in real buildings. The uncertainties present in typical HVAC control system measurements are evaluated. A sensor validation methodology is developed and applied to a cooling coil subsystem installed in a real building. The uncertainty in steady-state analysis based on transient data is investigated. The uncertainties in the simplifications and assumptions associated with the derivation of simple first-principles models of heat exchangers are established. A subsystem model is developed and calibrated to the test system, and the relationship between the uncertainties in the calibration data and the parameter estimates is investigated. The uncertainties from all sources are evaluated and used to generate a robust indication of the subsystem condition. The sensitivity and robustness of the scheme are analysed on the basis of faults implemented in the test system during summer, winter and spring conditions.
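The following minimal Python sketch illustrates the basic robustness idea: a residual between a measurement and a first-principles model prediction only triggers an alarm when it exceeds the combined measurement and model uncertainty. The toy coil model and all numbers are illustrative assumptions, not the thesis's calibrated subsystem model.

```python
import numpy as np

def predicted_coil_leaving_temp(t_in: float, water_flow: float) -> float:
    """Toy first-principles cooling-coil model (stand-in for the real one)."""
    return t_in - 8.0 * water_flow / (1.0 + water_flow)

sensor_std = 0.3    # assumed temperature sensor uncertainty (K)
model_std = 0.5     # assumed model/calibration uncertainty (K)
# Alarm threshold set from the combined uncertainty to suppress false alarms.
threshold = 3.0 * np.hypot(sensor_std, model_std)

t_in, water_flow = 24.0, 1.2
measured = 21.5     # measured coil leaving air temperature (illustrative)

residual = measured - predicted_coil_leaving_temp(t_in, water_flow)
if abs(residual) > threshold:
    print(f"fault indicated: residual {residual:+.2f} K exceeds ±{threshold:.2f} K")
else:
    print(f"no alarm: residual {residual:+.2f} K within ±{threshold:.2f} K")
```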
8

Structural Safety Analysis with Alternative Uncertainty Models

Karuna, K, January 2015
Probabilistic methods have been widely used in structural engineering to model uncertainties in loads and structural properties. The subjects of structural reliability analysis, random vibrations, and structural system identification have been extensively developed and provide the basic framework for developing rational design and maintenance procedures for engineering structures. One of the crucial requirements for the successful application of probabilistic methods in these contexts is access to an adequate amount of empirical data from which to form acceptable probabilistic models for the uncertain variables. When this requirement is not met, it becomes necessary to explore alternative methods for uncertainty modeling. Such efforts have indeed been made in structural engineering, albeit to a much lesser extent than the efforts expended in developing probabilistic methods. The alternative frameworks for uncertainty modeling include methods based on interval analysis, convex function representations, the theory of fuzzy variables, polymorphic models for uncertainties, and hybrid models which combine two or more of these frameworks within the context of a given problem. The work reported in this thesis lies in the broad area of modeling uncertainties using non-probabilistic and combined probabilistic and non-probabilistic methods. The thesis document is organized into 5 chapters and 6 annexures. A brief overview of alternative frameworks for uncertainty modeling and their mathematical basis is provided in chapter 1. This includes discussion of modeling uncertainties using intervals and issues related to uncertainty propagation using interval algebra; details of convex function models and the relevance of optimization tools in characterizing uncertainty propagation; discussion of fuzzy variables and their relation to intervals and convex functions; and issues arising from treating uncertainties using combined probabilistic and non-probabilistic methods. The notion of aleatoric and epistemic uncertainties is also introduced, and brief mention is made of polymorphic models for uncertainty, which aim to accommodate alternative forms of uncertainty within a single mathematical model. A review of the literature on applications of non-probabilistic and combined probabilistic and non-probabilistic methods for uncertainty modeling in structural engineering is presented in chapter 2. The topics covered include: (a) solutions of simultaneous algebraic equations, eigenvalue problems and ordinary differential equations, and the extension of finite element models to include non-probabilistic uncertainties; (b) issues related to methods for arriving at uncertainty models based on empirical data; and (c) applications to problems of structural safety and structural optimization. The review identifies scope for further research into the following aspects: (a) development of methods for arriving at optimal convex function models for uncertain variables based on limited data, and embedding the models thus developed into problems of structural safety assessment; and (b) treatment of inverse problems arising in structural-safety-based design and optimization, taking into account the possible use of combined probabilistic and non-probabilistic modeling frameworks. Chapter 3 considers situations in which adequate empirical data on the uncertain variables are lacking, necessitating the use of non-probabilistic approaches to quantify uncertainty.
The study discusses such situations in the context of structural safety assessment. The problem of developing convex function and fuzzy set models for uncertain variables based on limited data, and their subsequent application in structural safety assessment, is considered. Strategies to develop convex set models from limited data, based on minimum-volume super-ellipsoids and a Nataf-transformation-based method, are proposed. These models are shown to be fairly general (for instance, approximations to interval-based models emerge as special cases). Furthermore, the proposed convex functions are mapped to a unit multi-dimensional sphere. This enables the evaluation of a unified measure of safety, defined as the shortest distance from the origin to the limit surface in the transformed standard space, akin to the notion used in defining the Hasofer-Lind reliability index. Also discussed are issues related to safety assessment when a mixed uncertainty modeling approach is used. Illustrative examples include the safety assessment of an inelastic frame with uncertain properties. The study reported in chapter 4 considers a few inverse problems of structural safety analysis aimed at determining system parameters to ensure a target level of safety and (or) to minimize a cost function for problems involving combined probabilistic and non-probabilistic uncertainty modeling. The development of a load and resistance factor design format for problems with combined uncertainty models is also presented. Super-ellipsoid-based convex function/fuzzy variable models are employed for representing non-probabilistic uncertainties. The target safety levels are taken to be specified in terms of indices defined in the standard space of uncertain variables, involving standard normal random variables and (or) unit hyper-spheres. A class of problems amenable to exact solutions is identified, and a general procedure for dealing with more general problems involving nonlinear performance functions is developed. Illustrations include studies on an inelastic frame with uncertain properties. A summary of the contributions made in the thesis, along with a few suggestions for future research, is presented in chapter 5. Annexures A-F contain details of the derivation of alternative forms of safety measures, the Newton-Raphson-based optimization methods used in solving the inverse problems, and details of combining Matlab-based programs for uncertainty modeling with Abaqus-based models for structural analysis.
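The following minimal Python sketch illustrates the unified safety measure described above for the simplest convex model, an ellipsoid: map the uncertain variables to a standard space and find the shortest distance from the origin to the limit surface, analogous to the Hasofer-Lind index. The limit state function and ellipsoid are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

x0 = np.array([10.0, 5.0])            # nominal values of the two uncertain variables
half_widths = np.array([1.0, 0.8])    # ellipsoid semi-axes (uncertainty radii)

def g(x: np.ndarray) -> float:
    """Limit state: safe while g > 0 (e.g. capacity minus demand)."""
    return x[0] - 0.8 * x[1] - 4.0

def g_std(u: np.ndarray) -> float:
    """Limit state expressed in the standard (unit-sphere) space."""
    return g(x0 + half_widths * u)

# Safety index: minimise ||u|| subject to lying on the limit surface g_std(u) = 0.
res = minimize(
    lambda u: np.linalg.norm(u),
    x0=np.array([1.0, 1.0]),
    constraints={"type": "eq", "fun": g_std},
)
eta = np.linalg.norm(res.x)
# eta > 1 means the uncertainty ellipsoid does not reach the limit surface.
print(f"non-probabilistic safety index: {eta:.3f}")
```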
9

Uncertainty and correlation modeling for load flow analysis of future electricity distribution systems : Probabilistic modeling of low voltage networks with residential photovoltaic generation and electric vehicle charging

Ramadhani, Umar Hanif, January 2021
The penetration of photovoltaics (PV) and electric vehicles (EVs) continues to grow and is predicted to claim a vital share of the future energy mix. It poses new challenges in the built environment, as both PV systems and EVs are widely dispersed in the electricity distribution system. One of the vital tools for analyzing these challenges is load flow analysis, which provides insights into power system performance. Traditionally, for simplicity, load flow analysis has used deterministic approaches and neglected correlations between units in the system. However, the growth of distributed PV systems and EVs increases the uncertainties and correlations in the power system; hence, probabilistic methods are more appropriate. This thesis contributes to the knowledge of how uncertainty and correlation models can improve the quality of load flow analysis for electricity distribution systems with large numbers of residential PV systems and EVs. The thesis starts with an introduction to probabilistic load flow analysis of future electricity distribution systems. Uncertainty and correlation models are explained, as well as two energy management strategies: EV smart charging and PV curtailment. The probabilistic impact of these energy management systems on the electricity distribution system has been assessed through a comparison of allocation methods and a correlation analysis of the two technologies. The results indicate that these energy management schemes improve electricity distribution system performance. Furthermore, an increase in correlations between nodes is also observed due to these schemes. The results also indicate that concentrated allocation has more severe impacts, particularly at lower penetration levels. A combined PV-EV hosting capacity assessment shows that combining EV smart charging with PV curtailment in all buildings can further improve the voltage profile and increase the hosting capacity; the smart charging scheme also slightly increased the PV hosting capacity. The slight correlation between PV and EV hosting capacities suggests that a combined hosting capacity analysis of PV systems and EVs is beneficial and is best carried out in one framework. Overall, this thesis concludes that improved uncertainty and correlation modeling is vital in probabilistic load flow analysis of future electricity distribution systems.
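The following minimal Python sketch illustrates the correlated sampling step of a Monte Carlo probabilistic load flow: drawing correlated PV outputs and EV demand via a Cholesky factor of a target correlation matrix. The marginals and correlations are illustrative assumptions; each sampled row would then be run through a deterministic load flow solver.

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples = 10_000

# Target correlation: PV outputs at nearby nodes are strongly correlated,
# while EV demand is weakly (negatively) correlated with PV output.
#                 PV1   PV2   EV1
corr = np.array([[1.0,  0.8, -0.2],
                 [0.8,  1.0, -0.2],
                 [-0.2, -0.2, 1.0]])

L = np.linalg.cholesky(corr)
z = rng.standard_normal((n_samples, 3)) @ L.T   # correlated standard normals

# Map to assumed marginals: PV in kW (normal, clipped at zero),
# EV charging demand in kW (lognormal).
pv1 = np.clip(4.0 + 1.5 * z[:, 0], 0.0, None)
pv2 = np.clip(4.0 + 1.5 * z[:, 1], 0.0, None)
ev1 = np.exp(1.0 + 0.4 * z[:, 2])

# Each row (pv1[i], pv2[i], ev1[i]) defines one deterministic load flow case.
print("empirical PV1-PV2 correlation:", np.corrcoef(pv1, pv2)[0, 1].round(2))
```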
