11

Simulação de litotipos de depósito de minério de ferro com geoestatística de múltiplos pontos

Silva Júnior, Antônio Alves da January 2013 (has links)
The spatial distribution and volume of lithological domains are often the largest sources of uncertainty in geological modeling. Usually, the interpretation of these characteristics is based on subjective observational criteria, without taking into account the uncertainty inherent in this process. There are geostatistical simulation methods capable of quantifying this typological uncertainty of the geological units. Most of these methods use covariance models as the measure of geological continuity; however, these two-point statistical tools rarely capture the patterns of complex geometries. An alternative is to use multiple-point statistical methods to reproduce the spatial patterns of heterogeneity informed by a training image. In this dissertation, multiple-point geostatistics (the SNESIM algorithm) is applied to simulate the lithotypes of an iron ore deposit. The training image was based on interpreted geological sections. Drillhole data are used as primary samples, and geological information is incorporated through probability maps used as secondary information. The methodology is tested on the simulation of a Brazilian iron ore deposit with three different lithotypes, and the simulation results are compared against a reference model and against new drillholes. The geometries and spatial distribution of the lithotypes were reproduced consistently, and the uncertainty in the distribution and volumes of the typological domains was quantified. The multiple-point algorithm and the proposed methodology showed great potential for application in the simulation of mineral deposits.
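To make the idea concrete, below is a minimal Python sketch of the core step of SNESIM-style multiple-point simulation: scanning a training image for replicates of the local conditioning pattern and drawing the lithotype at a simulation node from the resulting conditional proportions. The tiny 2-D grid, template offsets, and facies codes are illustrative assumptions, not the setup used in the dissertation.

```python
import numpy as np

def mps_simulate_node(training_image, conditioning, rng):
    """Draw a facies code for one simulation node from the conditional
    distribution implied by the training image (single-grid, SNESIM-like idea).

    training_image : 2-D integer array of facies codes
    conditioning   : dict mapping template offsets (di, dj) to known facies codes
    """
    counts = {}
    n_i, n_j = training_image.shape
    for i in range(n_i):
        for j in range(n_j):
            # a training event matches if every conditioning offset shows the same facies
            match = True
            for (di, dj), code in conditioning.items():
                ii, jj = i + di, j + dj
                if not (0 <= ii < n_i and 0 <= jj < n_j) or training_image[ii, jj] != code:
                    match = False
                    break
            if match:
                center = int(training_image[i, j])
                counts[center] = counts.get(center, 0) + 1
    if not counts:
        # no replicate found: fall back to the global facies proportions (simplification)
        values, freq = np.unique(training_image, return_counts=True)
        return int(rng.choice(values, p=freq / freq.sum()))
    codes = list(counts)
    probs = np.array([counts[c] for c in codes], dtype=float)
    return int(rng.choice(codes, p=probs / probs.sum()))

# Toy example: three lithotype codes in a tiny training image, two conditioning data.
rng = np.random.default_rng(0)
ti = np.array([[0, 0, 1], [0, 1, 1], [2, 1, 1]])
simulated = mps_simulate_node(ti, {(0, 1): 1, (1, 0): 1}, rng)
```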
13

Modeling of structured 3-D environments from monocular image sequences

Repo, T. (Tapio) 08 November 2002 (has links)
Abstract: The purpose of this research has been to show with applications that polyhedral scenes can be modeled in real time with a single video camera. Sometimes this can be done very efficiently without any special image processing hardware. The developed vision sensor estimates its three-dimensional position with respect to the environment and models the environment simultaneously. Estimates become recursively more accurate as objects are approached and observed from different viewpoints. The modeling process starts by extracting interesting tokens, such as lines and corners, from the first image. These features are then tracked in subsequent image frames; some previously taught patterns can also be used in tracking. Only a few features are extracted from each image, so that processing can be done at video frame rate. Newly appearing features can also be added to the environment structure. Kalman filtering is used in estimation. The motion-estimation parameters are location and orientation and their first derivatives, and the environment is treated as a rigid object with respect to the camera. The environment structure consists of the 3-D coordinates of the tracked features. The initial model lacks depth information. Relative depth is obtained by exploiting facts such as that, during translational motion, closer points move faster on the image plane than more distant ones; additional information is needed to obtain absolute coordinates. Special attention has been paid to modeling uncertainties. Measurements with high uncertainty get less weight when the motion and environment model is updated. The rigidity assumption is exploited by giving the initial structure uncertainties a thin, pencil-like shape, and by continuously observing the motion uncertainties the performance of the modeler can be monitored. In contrast to the usual solution, the estimates are kept in separate state vectors, which allows motion and 3-D structure to be estimated asynchronously. In addition to giving a more distributed solution, this technique provides an efficient failure detection mechanism: several trackers can estimate motion simultaneously, and only those with the most confident estimates are allowed to update the common environment model. Tests showed that motion with six degrees of freedom can be estimated in an unknown environment while the 3-D structure of the environment is estimated simultaneously. The achieved accuracies were on the order of millimeters at distances of 1-2 meters in tests with simple toy scenes and more demanding industrial pallet scenes. This is sufficient for manipulating objects when the modeler is used to provide visual feedback.
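As a rough illustration of the estimation scheme described above (motion and structure kept in separate Kalman states, with high-uncertainty measurements receiving little weight), here is a schematic Python sketch. The state dimensions, measurement model, and noise values are placeholder assumptions, not the thesis implementation.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update for state x with covariance P,
    given measurement z with model z = H x + noise(R)."""
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain: small when R is large
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Motion state: position, orientation, and their first derivatives (12-D here).
x_motion = np.zeros(12)
P_motion = np.eye(12) * 1e2

# Structure state for one tracked feature: 3-D coordinates, initialized with
# a thin, pencil-like uncertainty (large variance along the viewing direction).
x_struct = np.zeros(3)
P_struct = np.diag([1e-2, 1e-2, 1e2])

# A noisy image measurement of the first two motion components gets little weight.
H = np.zeros((2, 12))
H[0, 0] = H[1, 1] = 1.0
z = np.array([0.01, -0.02])
R = np.eye(2) * 5.0                          # high measurement uncertainty -> low weight
x_motion, P_motion = kalman_update(x_motion, P_motion, z, H, R)
```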
14

Two Applications of Combinatorial Branch-and-Bound in Complex Networks and Transportation

Rasti, Saeid January 2020 (has links)
In this dissertation, we show two significant applications of combinatorial branch-and-bound as an exact solution methodology for combinatorial optimization problems. In the first problem, we propose a set of new group centrality metrics and show their performance in estimating protein importance in protein-protein interaction networks. The centrality metrics introduced here are extensions of well-known nodal metrics (degree, betweenness, and closeness) to a set of nodes that is required to induce a specific pattern. The structures investigated range from the "stricter" induced stars and cliques to a "looser" definition of a representative structure. We derive the computational complexity of each of the newly proposed metrics. Then, we provide mixed-integer programming formulations to solve the problems exactly; due to the computational complexity of the problem and the sheer size of protein-protein interaction networks, using a commercial solver with these formulations is not always a viable option. Hence, we also propose a combinatorial branch-and-bound approach to solve the problems introduced. Finally, we conclude this work with a presentation of the performance of the proposed centrality metrics in identifying essential proteins in Helicobacter pylori. In the second problem, we introduce the asymmetric probabilistic minimum-cost Hamiltonian cycle problem (APMCHCP), in which both arcs and vertices of the graph may fail. APMCHCP has applications in many emerging areas, such as post-disaster recovery, electronic circuit design, and security maintenance of wireless sensor networks. For each vertex, we define a chance constraint to guarantee that the probability of arriving at that vertex is greater than or equal to a given threshold. Four mixed-integer programming (MIP) formulations are proposed for modeling the problem, including two direct formulations and two recursive formulations. A combinatorial branch-and-bound (CBB) algorithm is proposed for solving the APMCHCP, with data preprocessing steps, feasibility rules, and approaches for finding upper and lower bounds. In the numerical experiments, the CBB algorithm is compared with the formulations on a test bed of two popular benchmark instance sets. The results show that the proposed CBB algorithm significantly outperforms the Gurobi solver in terms of both the size of instances solved to optimality and the computing time.
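A minimal sketch of how the chance constraints and combinatorial branch-and-bound pruning can interact is given below. It solves a simplified chance-constrained Hamiltonian path (not the full asymmetric cycle problem), and all survival probabilities, costs, and thresholds are illustrative assumptions rather than the CBB algorithm of the dissertation.

```python
def branch_and_bound(n, cost, vertex_surv, arc_surv, threshold):
    """Tiny combinatorial branch-and-bound sketch for a chance-constrained
    Hamiltonian path: extend partial paths vertex by vertex, pruning branches
    whose cost bound or arrival probability is already violated."""
    best = {"cost": float("inf"), "path": None}

    def extend(path, prob, c):
        if c >= best["cost"]:                      # bound: cannot improve incumbent
            return
        if len(path) == n:                         # complete Hamiltonian path found
            best["cost"], best["path"] = c, path
            return
        u = path[-1]
        for v in range(n):
            if v in path:
                continue
            # arrival probability only decreases along the path, so pruning is safe
            p = prob * arc_surv[(u, v)] * vertex_surv[v]
            if p < threshold:                      # prune: chance constraint violated
                continue
            extend(path + [v], p, c + cost[(u, v)])

    for s in range(n):
        if vertex_surv[s] >= threshold:
            extend([s], vertex_surv[s], 0.0)
    return best

# Tiny 4-vertex example with hypothetical survival probabilities and costs.
n = 4
vertex_surv = [0.99, 0.95, 0.98, 0.97]
arc_surv = {(u, v): 0.96 for u in range(n) for v in range(n) if u != v}
cost = {(u, v): abs(u - v) + 1.0 for u in range(n) for v in range(n) if u != v}
solution = branch_and_bound(n, cost, vertex_surv, arc_surv, threshold=0.75)
```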
15

Bayesian Uncertainty Modeling in Decomposed Multilevel Optimization

Dettwiller, Ian Daniel 06 May 2017 (has links)
Bayesian updating is used to approximate discontinuous multi-interval uncertainty representations (i.e., belief structures) of epistemic uncertainty. Several Bayesian-based approaches are examined for assessing the accuracy of approximating the mean and standard deviation of a belief structure and calculating reliability using posterior distributions. Moreover, a Bayesian-based belief structure approximation is integrated with a decomposed multilevel optimization solution strategy through analytical target cascading, where the ensuing reliability-based design optimization problem within each decomposed element is solved using a single loop single vector approach. The non-deterministic decomposed multilevel optimization approach is demonstrated through solutions to four analytical benchmark problems with mixed aleatory and epistemic uncertainties as well as a nano-enhanced composite sandwich plate problem. Consistent with the integrated computational materials engineering philosophy, the proposed solution strategy for the sandwich plate problem combines micro- and macro-level material modeling and design with structural level analysis and optimization. The orientation distribution of the carbon nanofibers in the micro-mechanical model is described through a belief structure and modeled using a Bayesian approach. Aleatory uncertainty in the ply thickness of the composite facesheets is also considered. This problem is used to demonstrate computationally efficient integration of epistemic uncertainty described through a belief structure for a complex design problem with mixed uncertainties. The results of this study show that the posterior distributions from some of the Bayesian-based approaches are suitable for direct calculation of reliability through joint probability density functions. Moreover, the Bayesian-based approach can provide a computationally efficient method for integrating epistemic and aleatory uncertainties in decomposed multilevel optimization of complex problems.
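As a simplified stand-in for the approximation described above, the following Python sketch shows how the mean, standard deviation, and a reliability estimate of a multi-interval belief structure can be approximated by sampling. The intervals, masses, and limit value are hypothetical, and the thesis's actual approach fits posterior distributions via Bayesian updating rather than this Monte Carlo shortcut.

```python
import numpy as np

def sample_belief_structure(intervals, masses, n, rng):
    """Draw samples from a multi-interval belief structure by selecting an
    interval with probability equal to its basic probability assignment and
    sampling uniformly within it (a common approximation of the structure)."""
    idx = rng.choice(len(intervals), size=n, p=masses)
    lo = np.array([intervals[i][0] for i in idx])
    hi = np.array([intervals[i][1] for i in idx])
    return rng.uniform(lo, hi)

rng = np.random.default_rng(1)
intervals = [(0.8, 1.0), (1.0, 1.3), (1.3, 1.4)]   # hypothetical expert intervals
masses = [0.3, 0.5, 0.2]                            # basic probability assignments
x = sample_belief_structure(intervals, masses, 5000, rng)

# Moments of the approximating distribution and a simple reliability estimate.
mean_approx, std_approx = x.mean(), x.std(ddof=1)
reliability = float((x > 0.9).mean())               # e.g., P(parameter exceeds a limit)
```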
16

Generalized Empirical Bayes: Theory, Methodology, and Applications

Fletcher, Douglas January 2019 (has links)
The two key issues of modern Bayesian statistics are: (i) establishing a principled approach for distilling a statistical prior distribution that is consistent with the given data from an initial believable scientific prior; and (ii) development of a consolidated Bayes-frequentist data analysis workflow that is more effective than either of the two separately. In this thesis, we propose generalized empirical Bayes as a new framework for exploring these fundamental questions, along with a wide range of applications spanning fields as diverse as clinical trials, metrology, insurance, medicine, and ecology. Our research marks a significant step towards bridging the "gap" between the Bayesian and frequentist schools of thought that has plagued statisticians for over 250 years. Chapters 1 and 2, based on Mukhopadhyay (2018), introduce the core theory and methods of our proposed generalized empirical Bayes (gEB) framework, which solves a long-standing puzzle of modern Bayes originally posed by Herbert Robbins (1980). One of the main contributions of this research is to introduce and study a new class of nonparametric priors DS(G, m) that allows exploratory Bayesian modeling. At a practical level, the major advantages of our proposal are: (i) computational ease (it does not require Markov chain Monte Carlo (MCMC), variational methods, or any other sophisticated computational techniques); (ii) simplicity and interpretability of the underlying theoretical framework, which is general enough to include almost all commonly encountered models; and (iii) easy integration with mainstream Bayesian analysis, which makes it readily applicable to a wide range of problems. Connections with other Bayesian cultures are also presented. Chapter 3 deals with the topic of measurement uncertainty from a new angle by introducing the foundations of nonparametric meta-analysis; the proposed methodology is applied to real data examples from astronomy, physics, and medicine. Chapter 4 discusses further extensions and applications of the theory to distributed big-data modeling and the missing species problem. The dissertation concludes by highlighting two important areas of future work: a full Bayesian implementation workflow and potential applications in cybersecurity. / Statistics
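The gEB framework and the DS(G, m) priors are specific to the thesis, but as generic background the classical Robbins-style empirical Bayes estimator for Poisson counts, the setting in which Robbins posed the original problem, can be sketched in a few lines. The simulated prior and data below are purely illustrative.

```python
import numpy as np
from collections import Counter

def robbins_poisson_eb(counts):
    """Classical Robbins empirical Bayes estimator for Poisson means:
    E[theta | x] is approximated by (x + 1) * f(x + 1) / f(x), where f is the
    empirical frequency of each observed count."""
    counts = np.asarray(counts)
    freq = Counter(int(c) for c in counts)
    n = len(counts)
    f = lambda k: freq.get(k, 0) / n
    return np.array([(int(c) + 1) * f(int(c) + 1) / max(f(int(c)), 1.0 / n)
                     for c in counts])

rng = np.random.default_rng(2)
theta = rng.gamma(2.0, 1.0, size=2000)      # latent means drawn from an unknown prior
x = rng.poisson(theta)                       # observed counts
theta_hat = robbins_poisson_eb(x)            # shrinkage estimates without specifying the prior
```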
17

Information-Based Sensor Management for Static Target Detection Using Real and Simulated Data

Kolba, Mark Philip January 2009 (has links)
In the modern sensing environment, large numbers of sensor tasking decisions must be made using an increasingly diverse and powerful suite of sensors in order to best fulfill mission objectives in the presence of situationally-varying resource constraints. Sensor management algorithms allow the automation of some or all of the sensor tasking process, meaning that sensor management approaches can either assist or replace a human operator, as well as ensure the safety of the operator by removing that operator from a dangerous operational environment. Sensor managers also provide improved system performance over unmanaged sensing approaches through the intelligent control of the available sensors. In particular, information-theoretic sensor management approaches have shown promise for providing robust and effective sensor manager performance. This work develops information-theoretic sensor managers for a general static target detection problem. Two types of sensor managers are developed. The first considers a set of discrete objects, such as anomalies identified by an anomaly detector or grid cells in a gridded region of interest. The second considers a continuous spatial region in which targets may be located at any point in continuous space. In both types, the sensor manager uses a Bayesian, probabilistic framework to model the environment and tasks the sensor suite to make new observations that maximize the expected information gain for the system. The sensor managers are compared to unmanaged sensing approaches using simulated data and using real data from landmine detection and unexploded ordnance (UXO) discrimination applications, and it is demonstrated that the sensor managers consistently outperform the unmanaged approaches, enabling targets to be detected more quickly. The performance improvement represented by the rapid detection of targets is of crucial importance in many static target detection applications, resulting in higher rates of advance and reduced costs and resource consumption in both military and civilian applications. / Dissertation
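To illustrate the information-gain criterion for the discrete (grid-cell) case, here is a small Python sketch that computes the expected reduction in Bernoulli entropy of each cell's target probability for one observation and tasks the sensor to the best cell. The detection and false-alarm probabilities and the prior values are hypothetical, not taken from the dissertation.

```python
import numpy as np

def entropy(p):
    """Bernoulli entropy in bits, clipped to avoid log(0)."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def expected_info_gain(p, pd, pfa):
    """Expected reduction in entropy of a cell's target probability after one
    observation with detection probability pd and false-alarm probability pfa."""
    p_pos = pd * p + pfa * (1 - p)                     # probability of a positive reading
    post_pos = pd * p / p_pos                          # posterior given a positive reading
    post_neg = (1 - pd) * p / (1 - p_pos)              # posterior given a negative reading
    return entropy(p) - (p_pos * entropy(post_pos) + (1 - p_pos) * entropy(post_neg))

# Grid of cells with prior target probabilities; task the sensor to the cell
# with the largest expected information gain.
prior = np.array([0.05, 0.30, 0.50, 0.80, 0.95])
gains = expected_info_gain(prior, pd=0.9, pfa=0.1)
next_cell = int(np.argmax(gains))
```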
18

Material Substitution in Legacy System Engineering (LSE) With Fuzzy Logic Principles

January 2011 (has links)
abstract: The focus of this research is to investigate methods for material substitution for the purpose of re-engineering legacy systems, which involves incomplete information about the form, fit, and function of replacement parts. The primary motive is to extract as much useful information about a failed legacy part as possible and to use fuzzy logic rules for identifying the unknown parameter values. Machine elements can fail by any number of failure modes, but the most probable failure modes given the service conditions are considered the critical failure modes. Three main parameters are of key interest in identifying the critical failure mode of the part. Critical failure modes are then directly mapped to material properties. Target material property values are calculated from the property values of the originally used material and from the design goals. The material database is searched for new candidate materials that satisfy the goals and the constraints on manufacturing and raw stock availability. Uncertainty in the extracted data is modeled using fuzzy logic: fuzzy membership functions model the imprecise nature of the data for each available parameter, and rule sets characterize the imprecise dependencies between the parameters and make decisions in identifying the unknown parameter values despite the incompleteness. A final confidence level for each material in the pool of candidate materials is a direct indication of this uncertainty. All the candidates satisfy the goals and constraints to varying degrees, and the final selection is left to the designer's discretion. The process is automated by software that takes the incomplete data as input, uses fuzzy logic to extract more information, and queries the material database with a constrained search to find candidate alternatives. / Dissertation/Thesis / Collection of Images used in Thesis / M.S. Mechanical Engineering 2011
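A minimal sketch of the fuzzy-membership-and-rules idea is shown below: each candidate material gets a membership grade per target property and a final confidence level from a min-based fuzzy AND. The property names, triangular membership parameters, and candidate values are invented for illustration and do not come from the thesis.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: grade 0 at a and c, grade 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def candidate_confidence(material, targets):
    """Degree to which a candidate material satisfies the target properties:
    each property gets a membership grade, and the grades are combined with
    min (a common fuzzy AND) to give the candidate's confidence level."""
    grades = [triangular(material[prop], *tri) for prop, tri in targets.items()]
    return min(grades) if grades else 0.0

# Hypothetical targets derived from the failed legacy part and the design goals.
targets = {
    "yield_strength_MPa": (250.0, 350.0, 500.0),
    "hardness_HB":        (120.0, 180.0, 260.0),
}
candidates = {
    "steel_A": {"yield_strength_MPa": 340.0, "hardness_HB": 175.0},
    "alloy_B": {"yield_strength_MPa": 260.0, "hardness_HB": 250.0},
}
ranked = sorted(candidates,
                key=lambda m: candidate_confidence(candidates[m], targets),
                reverse=True)
```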
19

Incertitude des données biomécaniques : modélisation et propagation dans les modèles de diagnostic des pathologies du système musculosquelettique / Uncertainty of biomechanical data : modeling and propagation in the diagnostics models of diseases of musculoskeletal system

Hoang, Tuan Nha 16 December 2014 (has links)
Pathologies of the musculoskeletal system involving bone and muscle deformities or anomalies (e.g., cerebral palsy) have a strong impact on the quality of life of the people concerned. The objective of this thesis is to extend previous studies by integrating the modeling of the uncertainty of biomechanical and biomedical data (medical images, kinematic/kinetic/EMG data, etc.) into predictive diagnostic models of musculoskeletal system pathologies. Intervals were chosen to represent the uncertainty of the biomechanical data; this formalism is simple and computationally inexpensive. Data (physiological, morphological, mechanical, and movement analysis properties) were collected from the literature, using search engines for reliable scientific articles (e.g., Science Direct and PubMed), to establish an uncertainty space representing fused data from multiple sources. The proposed new clustering method (named US-ECM) is a semi-supervised clustering method that uses a credal partition to represent partial knowledge about the clusters; using belief functions to represent these elements of knowledge allows them to be combined in a flexible and robust way. Moreover, extending this method with a multidimensional uncertainty space showed better performance than the original version. The reliability of the biomechanical data was evaluated through a fusion of expert opinions: the reliability criteria for a data source (i.e., a published scientific paper) focus on the technique used, the acquisition and measurement protocol, and the amount of data, and a system of questionnaires was developed to collect the expert opinions. The theory of belief functions was then applied to merge these opinions and establish a confidence level for each data source, so that sources can be included or excluded according to their reliability. Finally, the developed clustering model (US-ECM) will be applied to a new database to evaluate the impact of data reliability on diagnostic performance.
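As an illustration of the belief-function fusion used to grade data-source reliability, the following Python sketch combines two hypothetical expert opinions over the frame {reliable, unreliable} with Dempster's rule. The mass values and the two-element frame are assumptions, not the questionnaire data of the thesis.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for mass functions given as dicts
    mapping frozenset focal elements to masses (frame {'R', 'U'} here)."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb                      # mass assigned to the empty set
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

R, U, RU = frozenset("R"), frozenset("U"), frozenset("RU")
# Hypothetical expert opinions on whether a literature source is reliable (R):
expert1 = {R: 0.6, RU: 0.4}                  # leans reliable, partly ignorant
expert2 = {R: 0.5, U: 0.2, RU: 0.3}
fused = dempster_combine(expert1, expert2)   # fused confidence level for the source
```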
20

Data Summarization for Large Time-varying Flow Visualization and Analysis

Chen, Chun-Ming 29 December 2016 (has links)
No description available.
