About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
1. Density estimates of monarch butterflies overwintering in central Mexico

Thogmartin, Wayne E., Diffendorfer, Jay E., López-Hoffman, Laura, Oberhauser, Karen, Pleasants, John, Semmens, Brice X., Semmens, Darius, Taylor, Orley R., Wiederholt, Ruscena 26 April 2017
Given the rapid population decline and recent petition for listing of the monarch butterfly (Danaus plexippus L.) under the Endangered Species Act, an accurate estimate of the Eastern, migratory population size is needed. Because of the difficulty in counting individual monarchs, the number of hectares occupied by monarchs in the overwintering area is commonly used as a proxy for population size, which is then multiplied by the density of individuals per hectare to estimate population size. There is, however, considerable variation in published estimates of overwintering density, ranging from 6.9 to 60.9 million ha⁻¹. We develop a probability distribution for overwinter density of monarch butterflies from six published density estimates. The mean density among the mixture of the six published estimates was approximately 27.9 million butterflies ha⁻¹ (95% CI [2.4, 80.7] million ha⁻¹); the mixture distribution is approximately log-normal, and as such is better represented by the median (21.1 million butterflies ha⁻¹). Based upon assumptions regarding the number of milkweed stems needed to support monarchs, the amount of milkweed (Asclepias spp.) lost (0.86 billion stems) in the northern US plus the amount of milkweed remaining (1.34 billion stems), we estimate that >1.8 billion stems are needed to return monarchs to an average population size of 6 ha. Considerable uncertainty exists in this required amount of milkweed because of the considerable uncertainty in overwinter density estimates. Nevertheless, the estimate is on the same order as other published estimates. The studies included in our synthesis differ substantially by year, location, method, and measures of precision. A better understanding of the factors influencing overwintering density across space and time would be valuable for increasing the precision of conservation recommendations.
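As a rough illustration of the mixture approach described above, the sketch below draws Monte Carlo samples from an equal-weight mixture of six density estimates and summarizes the mean, median, and 95% interval. The abstract does not list the six published values or their standard errors, so the numbers here are placeholders only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical placeholders: the six published density estimates
# (millions of butterflies per hectare) and their standard errors
# are NOT given in the abstract; these are illustrative only.
means = np.array([6.9, 10.3, 21.1, 28.1, 44.5, 60.9])
ses = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])

# Equal-weight mixture: pick one study at random, then draw from its
# sampling distribution (truncated at zero, since density is positive).
idx = rng.integers(0, 6, size=100_000)
draws = rng.normal(means[idx], ses[idx])
draws = draws[draws > 0]

print(f"mean   = {draws.mean():.1f} million ha^-1")
print(f"median = {np.median(draws):.1f} million ha^-1")
lo, hi = np.percentile(draws, [2.5, 97.5])
print(f"95% CI = [{lo:.1f}, {hi:.1f}] million ha^-1")
```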
2. A Belief Theoretic Approach for Automated Collaborative Filtering

Wickramarathne, Thanuka Lakmal 01 January 2008
Automated Collaborative Filtering (ACF) is one of the most successful strategies available for recommender systems. Application of ACF in more sensitive and critical applications, however, has been hampered by the absence of better mechanisms to accommodate imperfections (ambiguities and uncertainties in ratings, missing ratings, etc.) that are inherent in user preference ratings and to propagate such imperfections throughout the decision-making process. Thus one is compelled to make various "assumptions" regarding the user preferences, giving rise to predictions that lack sufficient integrity. With its Dempster-Shafer belief theoretic basis, CoFiDS, the automated collaborative filtering algorithm proposed in this thesis, can (a) represent a wide variety of data imperfections; (b) propagate the partial knowledge that such data imperfections generate throughout the decision-making process; and (c) conveniently incorporate contextual information from multiple sources. The "soft" predictions that CoFiDS generates provide substantial flexibility to the domain expert. Depending on the associated DS theoretic belief-plausibility measures, the domain expert can either render a "hard" decision or narrow down the possible set of predictions to as small a set as necessary. With its capability to accommodate data imperfections, CoFiDS widens the applicability of ACF from the more popular domains, such as movie and book recommendations, to more sensitive and critical problem domains, such as medical expert support systems, homeland security, and surveillance. We use a benchmark movie dataset and a synthetic dataset to validate CoFiDS and compare it to several existing ACF systems. (M.S. thesis, Electrical and Computer Engineering, University of Miami, May 2008; supervised by Professor Kamal Premaratne; 84 pages.)
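The Dempster-Shafer machinery behind this kind of system can be illustrated with a toy mass function over candidate ratings. The masses below are invented for illustration, and the `belief`/`plausibility` helpers are the generic DS-theoretic definitions, not the CoFiDS implementation.

```python
# Frame of discernment: candidate ratings 1-5.
frame = frozenset({1, 2, 3, 4, 5})

# A Dempster-Shafer mass function assigns belief mass to *sets* of
# ratings, which lets ambiguous or missing evidence stay ambiguous.
# Illustrative masses only -- not taken from the thesis.
mass = {
    frozenset({4}): 0.5,     # strong evidence for rating 4
    frozenset({3, 4}): 0.3,  # ambiguous between 3 and 4
    frame: 0.2,              # total ignorance
}

def belief(A, mass):
    # Bel(A): total mass committed to subsets of A.
    return sum(m for B, m in mass.items() if B <= A)

def plausibility(A, mass):
    # Pl(A): total mass not contradicting A.
    return sum(m for B, m in mass.items() if B & A)

A = frozenset({4})
print(belief(A, mass), plausibility(A, mass))  # 0.5 0.8
```

The belief-plausibility interval [0.5, 0.8] is the "soft" prediction: a domain expert can harden it to a single rating or keep the candidate set, as described in the abstract.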
3. Probabilistic Databases and Their Applications

Zhao, Wenzhong 01 January 2004
Probabilistic reasoning in databases has been an active area of research during the last two decades. However, the previously proposed database approaches, including the probabilistic relational approach and the probabilistic object approach, are not good fits for storing and managing diverse probability distributions along with their auxiliary information. The work in this dissertation significantly extends the initial semistructured probabilistic database framework proposed by Dekhtyar, Goldsmith and Hawkes in [20]. We extend the formal Semistructured Probabilistic Object (SPO) data model of [20]. Accordingly, we also extend the Semistructured Probabilistic Algebra (SP-algebra), the query algebra proposed for the SPO model. Based on the extended framework, we have designed and implemented a Semistructured Probabilistic Database Management System (SPDBMS) on top of a relational DBMS. The SPDBMS is flexible enough to meet the need of storing and manipulating diverse probability distributions along with their associated information. Its query language supports standard database queries as well as queries specific to probabilities, such as conditionalization and marginalization. Currently the SPDBMS serves as a storage backbone for the project Decision Making and Planning under Uncertainty with Constraints, which involves managing large quantities of probabilistic information. We also report our experimental results evaluating the performance of the SPDBMS. We describe an extension of the SPO model for handling interval probability distributions. The Extended Semistructured Probabilistic Object (ESPO) framework improves the flexibility of the original semistructured data model in two important features: (i) support for interval probabilities and (ii) association of context and conditionals with individual random variables. An extended SPO (ESPO) data model has been developed, and an extended query algebra for ESPO has also been introduced to manipulate probability distributions with probability intervals. The Bayesian Network Development Suite (BaNDeS), a system which builds Bayesian networks with full data management support of the SPDBMS, is also described. It allows experts with particular expertise to work independently and asynchronously on specific subsystems during the Bayesian network construction process while updating the model in real time. There are three major foci of our ongoing and future work: (1) implementation of a query optimizer and performance evaluation of query optimization, (2) extension of the SPDBMS to handle interval probability distributions, and (3) incorporation of machine learning techniques into BaNDeS. (This project is partially supported by the National Science Foundation under Grant No. ITR-0325063.)
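Conditionalization and marginalization, the probability-specific queries mentioned above, reduce to simple table operations on a stored joint distribution. The sketch below shows the idea on a toy pandas table; it is not the SPDBMS query language, which the abstract does not detail.

```python
import pandas as pd

# A toy joint distribution over two discrete variables, stored as a
# table -- a stand-in for one probabilistic object's distribution
# (illustrative only; not the SPDBMS storage format).
joint = pd.DataFrame({
    "weather": ["sun", "sun", "rain", "rain"],
    "traffic": ["light", "heavy", "light", "heavy"],
    "p": [0.4, 0.1, 0.2, 0.3],
})

def marginalize(table, var):
    # Sum out `var`, keeping the remaining variables.
    keep = [c for c in table.columns if c not in (var, "p")]
    return table.groupby(keep, as_index=False)["p"].sum()

def conditionalize(table, var, value):
    # Restrict to var == value and renormalize.
    sub = table[table[var] == value].copy()
    sub["p"] /= sub["p"].sum()
    return sub.drop(columns=[var])

print(marginalize(joint, "traffic"))
print(conditionalize(joint, "weather", "rain"))
```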
4. Distribution network architecture in the presence of distributed generation: planning under uncertainty and decentralized operation modes / Multi-objective distributed generation planning in a flexible environment

Soroudi, Alireza 04 October 2011
The process of deregulation of electricity markets has introduced several new and interesting research topics in the power system area. This thesis addresses one of the fascinating issues among them: the study of the integration of distributed generation, both renewable and conventional, in distribution networks. From the distribution network operator (DNO)'s point of view, it is interesting to develop a comprehensive methodology which considers various distributed generation technologies as an option for supplying the demand. In this thesis, the planning problem has been modeled multi-objectively. This helps the planner in decision making while knowing the trade-offs between the objective functions. For finding the Pareto-optimal front of the problem, a hybrid genetic-immune algorithm is proposed. The fuzzy satisfying method is used to find the final solution. Various objectives such as cost, active losses, emissions, and technical constraint satisfaction have been taken into account. The decision variables are the distribution network reinforcement strategies and also the investment decisions regarding DG units, in the case where the DNO can invest in DG units too. Another aspect which makes the proposed models more flexible is the consideration of uncertainties in the input parameters. The uncertainties of the input data have been treated in three different ways, namely probabilistic, possibilistic, and finally mixed possibilistic-probabilistic methods. In this thesis, two types of models have been developed: a centralized and an unbundled DG planning model. In both models, the DNO is responsible for providing a reliable and efficient network for its customers in its territory. In the centrally controlled planning context, the DNO is authorized to invest in DG units. In this model, the optimal size, number of DG units, location, DG technology, and timing of investment in both DG units and network components are determined. The developed model will not only be useful in the centrally controlled planning context but will also be applicable to other power markets that need to assess, monitor, and guide the decisions of DG developers. In the unbundled DG planning model, the DNO is not authorized to make investment decisions in DG options. The decision variables of the DNO are limited to feeder/substation expansion and reinforcement, capacitor placement, network reconfiguration, and smart grid technologies.
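The fuzzy satisfying step, selecting one compromise solution from the Pareto front, can be sketched as a max-min choice over linear membership functions. The objective values below are illustrative placeholders, not results from the thesis.

```python
import numpy as np

# Pareto front of a 3-objective DG-planning problem (cost, losses,
# emissions), all to be minimized. Values are illustrative only.
pareto = np.array([
    [100.0, 12.0, 50.0],
    [120.0,  9.0, 45.0],
    [140.0,  7.5, 30.0],
    [160.0,  6.0, 35.0],
])

# Fuzzy satisfying method: each objective gets a linear membership
# mu in [0, 1] (1 = best value on the front, 0 = worst); the final
# compromise solution maximizes the minimum membership.
f_min = pareto.min(axis=0)
f_max = pareto.max(axis=0)
mu = (f_max - pareto) / (f_max - f_min)  # per-solution, per-objective

best = np.argmax(mu.min(axis=1))
print("compromise solution:", pareto[best], "memberships:", mu[best])
```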
5. Mathematical Model Validation of a Center of Gravity Measuring Platform Using Experimental Tests and FEA

Lashore, Michael 01 June 2015
This thesis sets out to derive an analytical model for a center of gravity (CG) measuring platform and examines its validity through experimental testing and finite element modeling. The method uses a two-stage platform tilting process to first locate the planar CG coordinates and then find the third CG coordinate normal to the platform. An uncertainty model of the measuring platform was also developed; both the CG and uncertainty models were implemented as MATLAB code. A load cell sizing task was also added to the code to assist the integration engineers at the Jet Propulsion Laboratory in selecting load cells to design their own version of the CG platform. The constructed CG platform for this project used an array of six strain gauges, four C2A-06-062LT-120 tee rosettes and two C2A-06-031WW-120 stacked rosettes, bonded onto the legs of three truss-shaped bipods. Results from the platform tilting tests could not be used to validate the CG model, as the measured CG and weight values found from the experimental tests contained a considerable amount of error. The errors in the platform tilting tests are believed to stem from the initial errors observed during the bipod rod and strain gauge calibration tests. As an alternative, an FE model of the CG measuring platform was created as another means of validation. The math model of the CG measuring platform was successfully validated by showing that there was less than a 0.01% difference between the bipod loads predicted by the MATLAB code and the FE model. Using the FEM-generated loads as inputs to the CG code, the calculated CG matched the initial point mass CG created in the FE model to within 0.01%. To validate the CG model even further, another test should be performed using a CG platform prototype instrumented with load cells to generate new experimental data and compare the results with those from the FE model.
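The first stage of such a platform method, recovering the planar CG coordinates from support loads, is a moment-equilibrium calculation. The sketch below shows it for three supports; the geometry and loads are made up for illustration and are not the thesis's actual platform dimensions or its two-stage tilting algebra.

```python
import numpy as np

# Planar CG from three support-load readings via moment equilibrium:
# sum(F_i) = W and sum(F_i * r_i) = W * r_cg. Geometry and loads are
# illustrative placeholders.
supports = np.array([   # (x, y) positions of the three bipods [m]
    [0.0, 0.0],
    [1.0, 0.0],
    [0.5, 0.9],
])
loads = np.array([180.0, 210.0, 160.0])  # measured vertical loads [N]

W = loads.sum()                                      # total weight
cg_xy = (loads[:, None] * supports).sum(axis=0) / W  # weighted centroid
print(f"W = {W:.1f} N, CG = ({cg_xy[0]:.3f}, {cg_xy[1]:.3f}) m")
# The z-coordinate requires the second, tilted measurement: tilting by
# theta shifts the planar projection of the CG by z_cg * tan(theta).
```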
6. Operation and planning of distribution networks with integration of renewable distributed generators considering uncertainties: a review

Zubo, Rana H.A., Mokryani, Geev, Rajamani, Haile S., Aghaei, J., Niknam, T., Pillai, Prashant 29 October 2016
Distributed generators (DGs) are a reliable solution for supplying economic and reliable electricity to customers. Distributed generation is the last stage in the delivery of electric power and can be defined as an electric power source connected directly to the distribution network or on the customer site. It is necessary to allocate DGs optimally (size, placement, and type) to obtain the commercial, technical, environmental, and regulatory advantages of power systems. In this context, this paper presents a comprehensive literature review of uncertainty modeling methods used for modeling the uncertain parameters related to renewable DGs, as well as of the methodologies used for the planning and operation of DG integration into distribution networks. / This work was supported in part by the SITARA project funded by the British Council and the Department for Business, Innovation and Skills, UK and in part by the University of Bradford, UK under the CCIP grant 66052/000000.
7. Encapsulation and abstraction for modeling and visualizing information uncertainty

Streit, Alexander January 2008
Information uncertainty is inherent in many real-world problems and adds a layer of complexity to modeling and visualization tasks. This often causes users to ignore uncertainty, especially when it comes to visualization, thereby discarding valuable knowledge. A coherent framework for the modeling and visualization of information uncertainty is needed to address this issue. In this work, we have identified four major barriers to the uptake of uncertainty modeling and visualization. Firstly, there are numerous uncertainty modeling techniques, and users are required to anticipate their uncertainty needs before building their data model. Secondly, parameters of uncertainty tend to be treated at the same level as variables, making it easy to introduce avoidable errors. This causes the uncertainty technique to dictate the structure of the data model. Thirdly, propagation of uncertainty information must be manually managed. This requires user expertise, is error prone, and can be tedious. Finally, uncertainty visualization techniques tend to be developed for particular uncertainty types, making them largely incompatible with other forms of uncertainty information. This narrows the choice of visualization techniques and results in a tendency for ad hoc uncertainty visualization. The aim of this thesis is to present an integrated information uncertainty modeling and visualization environment that has the following main features: information and its uncertainty are encapsulated into atomic variables, the propagation of uncertainty is automated, and visual mappings are abstracted from the uncertainty information data type. Spreadsheets have previously been shown to be well suited as an approach to visualization. In this thesis, we devise a new paradigm extending the traditional spreadsheet to intrinsically support information uncertainty. Our approach is to design a framework that integrates uncertainty modeling techniques into a hierarchical order based on levels of detail. The uncertainty information is encapsulated and treated as a unit, allowing users to think of their data model in terms of the variables instead of the uncertainty details. The system is intrinsically aware of the encapsulated uncertainty and is therefore able to automatically select appropriate uncertainty propagation methods. A user-objectives based approach to uncertainty visualization is developed to guide the visual mapping of abstracted uncertainty information. Two main abstractions of uncertainty information are explored for the purpose of visual mapping: the Unified Uncertainty Model and the Dual Uncertainty Model. The Unified Uncertainty Model provides a single view of uncertainty for visual mapping, whereas the Dual Uncertainty Model distinguishes between possibilistic and probabilistic views. Such abstractions provide a buffer between the visual mappings and the uncertainty type of the underlying data, enabling the user to change the uncertainty detail without causing the visualization to fail. Two main case studies are presented. The first case study covers exploratory and forecasting tasks in a business planning context. The second case study investigates sensitivity analysis for financial decision support. Two minor case studies are also included: one to investigate the relevancy visualization objective applied to business process specifications, and the second to explore the extensibility of the system through General Purpose Graphics Processing Unit (GPGPU) use. A quantitative analysis compares our approach to traditional analytical and numerical spreadsheet-based approaches. Two surveys were conducted to gain feedback on the system from potential users. The significance of this work is that we reduce barriers to uncertainty modeling and visualization in three ways. Users do not need a mathematical understanding of the uncertainty modeling technique to use it; uncertainty information is easily added, changed, or removed at any stage of the process; and uncertainty visualizations can be built independently of the uncertainty modeling technique.
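The core encapsulation idea, wrapping a value and its uncertainty in a single atomic variable so that spreadsheet-style formulas propagate uncertainty automatically, can be sketched with a small Monte Carlo class. This is a simplified assumption-laden illustration, not the thesis's system, which also supports other uncertainty representations.

```python
import numpy as np

class Uncertain:
    """A value and its uncertainty encapsulated as Monte Carlo samples,
    so arithmetic propagates uncertainty automatically. A minimal
    sketch of the encapsulation idea, not the thesis's framework."""

    def __init__(self, samples):
        self.samples = np.asarray(samples, dtype=float)

    @classmethod
    def normal(cls, mean, std, n=10_000, rng=np.random.default_rng(0)):
        return cls(rng.normal(mean, std, n))

    def __add__(self, other):
        other = other.samples if isinstance(other, Uncertain) else other
        return Uncertain(self.samples + other)

    def __mul__(self, other):
        other = other.samples if isinstance(other, Uncertain) else other
        return Uncertain(self.samples * other)

    def summary(self):
        return self.samples.mean(), self.samples.std()

# Spreadsheet-style formula: profit = price * volume - fixed cost.
# The user writes the formula in terms of the variables; propagation
# through the operators is automatic.
price = Uncertain.normal(10.0, 1.0)
volume = Uncertain.normal(500.0, 50.0)
profit = price * volume + (-3000.0)
print("profit mean/std:", profit.summary())
```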
8. Uncertainty Modeling for Nonlinear and Linear Heated Structures

January 2019
This investigation focuses on the development of uncertainty modeling methods applicable to both the structural and thermal models of heated structures, as part of an effort to enable the design under uncertainty of hypersonic vehicles. The maximum entropy-based nonparametric stochastic modeling approach is used within the context of coupled structural-thermal reduced order models (ROMs). Not only does this strategy allow for a computationally efficient generation of samples of the structural and thermal responses, but the maximum entropy approach also allows the introduction of both aleatoric and some epistemic uncertainty into the system. While the nonparametric approach has a long history of applications to structural models, the present investigation was the first to consider it for the heat conduction problem. In this process, it was recognized that the nonparametric approach had to be modified to maintain the localization of the temperature near the heat source, which was successfully achieved. The introduction of uncertainty in coupled structural-thermal ROMs of heated structures was addressed next. It was first recognized that the structural stiffness coefficients (linear, quadratic, and cubic) and the parameters quantifying the effects of the temperature distribution on the structural response can be regrouped into a matrix that is symmetric and positive definite. The nonparametric approach was then applied to this matrix, allowing the assessment of the effects of uncertainty on the resulting temperature distributions and structural response. The third part of this document focuses on introducing uncertainty using the maximum entropy method at the finite element level by randomizing elemental matrices, for instance elemental stiffness, mass, and conductance matrices. This approach brings some epistemic uncertainty not present in the parametric approach (e.g., by randomizing the elasticity tensor) while retaining a more local character than operating at the ROM level. The last part of this document focuses on the development of "reduced ROMs" (RROMs), which are reduced order models with small bases constructed in a data-driven process from a "full" ROM with a much larger basis. The development of the RROM methodology is motivated by the desire to optimally reduce the computational cost, especially in multi-physics situations where a lack of prior understanding or knowledge of the solution typically leads to the selection of ROM bases that are excessively broad to ensure the necessary accuracy in representing the response. It is additionally emphasized that the ROM reduction process can be carried out adaptively, i.e., differently over different ranges of loading conditions. / Doctoral Dissertation, Mechanical Engineering, 2019
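The key construction, randomizing a symmetric positive definite matrix while preserving positive definiteness and the mean model, can be sketched as follows. This is a simplified Wishart-based stand-in for the maximum-entropy nonparametric construction, not its exact normalization; the stiffness matrix, the dispersion parameter `delta`, and its mapping to the Wishart degrees of freedom are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_spd_family(K_mean, delta, n_samples):
    """Random SPD matrices around a mean model, in the spirit of the
    nonparametric maximum-entropy approach: write K = L G L^T with
    K_mean = L L^T and G a random SPD "germ" with unit mean.
    Simplified sketch; `delta` loosely controls the dispersion."""
    n = K_mean.shape[0]
    L = np.linalg.cholesky(K_mean)
    # Wishart-based germ: E[G] = I, dispersion shrinks as dof grows.
    dof = max(n + 1, int(round((n + 1) / delta**2)))
    for _ in range(n_samples):
        X = rng.normal(size=(dof, n)) / np.sqrt(dof)
        G = X.T @ X              # SPD with E[G] = I
        yield L @ G @ L.T        # SPD sample with mean K_mean

K_mean = np.array([[4.0, 1.0], [1.0, 3.0]])  # illustrative stiffness
samples = list(random_spd_family(K_mean, delta=0.2, n_samples=1000))
print("sample mean:\n", np.mean(samples, axis=0))  # close to K_mean
```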
9. Numerical analysis of the nonlinear dynamics of a drill-string with uncertainty modeling

Ritto, Thiago 07 April 2010
This thesis analyzes the nonlinear dynamics of a drill-string, including uncertainty modeling. A drill-string is a slender flexible structure that rotates and digs into the rock in search of oil. A mathematical-mechanical model is developed for this structure including fluid-structure interaction, impact, geometrical nonlinearities, and bit-rock interaction. After the derivation of the equations of motion, the system is discretized by means of the finite element method, and a computer code is developed for the numerical computations using the software MATLAB. The normal modes of the dynamical system in the prestressed configuration are used to construct a reduced order model for the system. To take uncertainties into account, the nonparametric probabilistic approach, which is able to take into account both system-parameter and model uncertainties, is used. The probability density functions related to the random variables are constructed using the maximum entropy principle, and the stochastic response of the system is calculated using the Monte Carlo method. A novel approach to take into account model uncertainties in a nonlinear constitutive equation (the bit-rock interaction model) is developed using the nonparametric probabilistic approach. To identify the probabilistic model of the bit-rock interaction, the maximum likelihood method together with a statistical reduction in the frequency domain (using Principal Component Analysis) is applied. Finally, a robust optimization problem is solved to find the operational parameters of the system that maximize its performance while respecting the integrity limits of the system, such as fatigue and instability.
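The modal reduction step described above, projecting the full finite element matrices onto a truncated basis of normal modes, can be sketched as follows. The matrices here are random stand-ins rather than the drill-string model, and a static solve is used as the accuracy check.

```python
import numpy as np
from scipy.linalg import eigh

# Reduced-order model by projection onto normal modes: solve the
# generalized eigenproblem K phi = lambda M phi, keep the first m
# modes, and project M, K, and forces onto that basis. The matrices
# below are illustrative stand-ins for the FE drill-string model.
rng = np.random.default_rng(2)
n, m = 50, 6
A = rng.normal(size=(n, n))
K = A @ A.T + n * np.eye(n)   # SPD "stiffness"
M = np.eye(n)                 # lumped "mass"

evals, Phi = eigh(K, M)       # generalized symmetric eigenproblem
Phi_m = Phi[:, :m]            # retain the m lowest modes

K_r = Phi_m.T @ K @ Phi_m     # m x m reduced matrices
f = rng.normal(size=n)
f_r = Phi_m.T @ f

# Static check: the reduced solution approximates the full one.
u_full = np.linalg.solve(K, f)
u_rom = Phi_m @ np.linalg.solve(K_r, f_r)
err = np.linalg.norm(u_full - u_rom) / np.linalg.norm(u_full)
print("relative error:", err)
```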
10. Simulation of lithotypes of an iron ore deposit with multiple-point geostatistics

Silva Júnior, Antônio Alves da January 2013
The spatial distribution and volumes of lithological domains are often the biggest sources of uncertainty in geological modeling. Usually, the interpretation of these characteristics is based on subjective criteria of observation, without taking into account the uncertainty inherent in this process. There are geostatistical simulation methods capable of quantifying this typological uncertainty of the geological units. Most of these methods use covariance models as the measure of geological continuity. However, these two-point statistics are rarely sufficient to capture the patterns of complex geometries. An alternative to this limitation is to use multiple-point statistical methods to reproduce the spatial patterns of heterogeneity that are informed by a training image. In this dissertation, multiple-point geostatistics (SNESIM) is applied to simulate the lithotypes of an iron ore deposit. The training image was based on interpreted sections. The drillholes are used as primary samples. Geological information is accessed through probability maps used as secondary information. The methodology is tested in the simulation of a Brazilian iron ore deposit with three different lithotypes. The simulation results are compared against a reference model and new drillholes. The geometries and spatial distribution of the lithotypes were reproduced consistently, and the uncertainty in the distributions and volumes of the typological domains was quantified. The multiple-point algorithm and the proposed methodology showed great potential for application in the simulation of mineral deposits.
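The heart of multiple-point geostatistics is tabulating, from the training image, the conditional probability of a facies given the pattern of its neighbors. The sketch below shows that tabulation with a toy template on a random categorical image; it illustrates the principle only and is not the SNESIM implementation, which uses a search tree and sequential simulation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 3-lithotype "training image" (illustrative; a real training
# image would come from interpreted geological sections).
ti = rng.integers(0, 3, size=(60, 60))

offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # 4-neighbor template
counts = {}
rows, cols = ti.shape
for i in range(1, rows - 1):
    for j in range(1, cols - 1):
        pattern = tuple(ti[i + di, j + dj] for di, dj in offsets)
        center = ti[i, j]
        counts.setdefault(pattern, np.zeros(3))[center] += 1

def conditional_probs(pattern):
    # P(center facies | neighbor pattern) from training-image counts;
    # returns None if the pattern was never seen.
    c = counts.get(pattern)
    return c / c.sum() if c is not None else None

print(conditional_probs((0, 0, 1, 2)))
```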
