1. Distributed Estimation in Sensor Networks with Modeling Uncertainty
Zhou, Qing, 03 October 2013
A major issue in distributed wireless sensor networks (WSNs) is the design of efficient distributed algorithms for network-wide dissemination of information acquired by individual sensors, where each sensor, by itself, is unable to access enough data for reliable decision making. Without a centralized fusion center, network-wide reliable inference can be accomplished by recovering meaningful global statistics at each sensor through iterative inter-sensor message passing.
In this dissertation, we first consider the problem of distributed estimation of an unknown deterministic scalar parameter (the target signal) in a WSN, where each sensor receives a single snapshot of the field. An iterative distributed least-squares (DLS) algorithm is investigated with and without the consideration of node failures. In particular, without sensor node failures it is shown that every instantiation of the DLS algorithm converges, i.e., consensus is reached among the sensors, with the limiting agreement value being the centralized least-squares estimate. With node failures during the iterative exchange process, the convergence of the DLS algorithm is still guaranteed; however, an error exists between the limiting agreement value and the centralized least-squares estimate. To reduce this error, a modified DLS scheme, the M-DLS, is provided. The M-DLS algorithm involves an additional weight compensation step, in which a sensor performs a one-time weight compensation procedure whenever it detects the failure of a neighbor. Through analytical arguments and simulations, it is shown that the M-DLS algorithm leads to a smaller error than the DLS algorithm, where the magnitude of the improvement depends on the network topology.
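The abstract does not spell out the update rule, but the flavor of such consensus-based estimation can be sketched. Below is a minimal illustration assuming equal-quality scalar observations and a Laplacian-based averaging step; the function name, weights, and topology are illustrative assumptions, not the dissertation's algorithm.

```python
import numpy as np

def distributed_least_squares(observations, adjacency, num_iters=200, step=0.2):
    """Consensus-style iteration toward the network-wide average.

    Hypothetical sketch: with scalar observations x_i = theta + n_i and
    equal weights, the centralized least-squares estimate is the sample
    mean, which average consensus recovers at every node.
    """
    x = observations.astype(float).copy()      # each node's running estimate
    degree = adjacency.sum(axis=1)
    laplacian = np.diag(degree) - adjacency    # graph Laplacian of the WSN
    for _ in range(num_iters):
        x = x - step * laplacian @ x           # local neighbor averaging only
    return x                                   # all entries ~ mean(observations)

# Example: 4 sensors on a ring graph observing theta = 1.0 in noise
rng = np.random.default_rng(0)
obs = 1.0 + 0.1 * rng.standard_normal(4)
A = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]], dtype=float)
print(distributed_least_squares(obs, A))       # every node agrees on ~mean(obs)
```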
We then investigate the case when the observation or sensing mode is only partially known at the corresponding nodes, perhaps due to their limited sensing capabilities or other unpredictable physical factors. Specifically, it is assumed that the observation validity at a node switches stochastically between two modes, with mode I corresponding to the desired signal plus noise observation mode (a valid observation), and mode II corresponding to pure noise with no signal information (an invalid observation). With no prior information on the local sensing modes (valid or invalid), we introduce a learning-based distributed estimation procedure, the mixed detection-estimation (MDE) algorithm, based on closed-loop interactions between the iterative distributed mode learning and the target estimation. The online learning (or sensing mode detection) step re-assesses the validity of the local observations at each iteration, thus refining the ongoing estimation update process. The convergence of the MDE algorithm is established analytically, and the asymptotic performance analysis shows that, in the high signal-to-noise ratio (SNR) regime, the MDE estimation error converges to that of an ideal (centralized) estimator with perfect information about the node sensing modes. This is in contrast with the estimation performance of a naive average-consensus-based distributed estimator (with no mode learning), whose estimation error grows unbounded as the SNR increases.
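As a rough illustration of the closed-loop idea, here is a toy, centralized caricature of alternating mode detection and estimation. It is an assumption-laden sketch (a simple distance-based mode test and mean re-estimation), not the distributed MDE algorithm itself.

```python
import numpy as np

def mixed_detection_estimation(obs, num_iters=20):
    """Toy caricature of the detection-estimation loop: alternately
    re-classify each observation as 'signal + noise' (mode I) or
    'pure noise' (mode II) against the current estimate, then
    re-estimate from the observations judged valid.

    Illustrative assumption only, not the dissertation's MDE algorithm.
    """
    theta_hat = obs.mean()                    # naive initial estimate
    valid = np.ones_like(obs, dtype=bool)
    for _ in range(num_iters):
        # mode I if the observation is better explained by theta_hat than by 0
        valid = np.abs(obs - theta_hat) < np.abs(obs)
        if valid.any():
            theta_hat = obs[valid].mean()     # refine using valid nodes only
    return theta_hat, valid

rng = np.random.default_rng(1)
signal_plus_noise = 2.0 + 0.2 * rng.standard_normal(6)   # mode I nodes
pure_noise = 0.2 * rng.standard_normal(4)                # mode II nodes
obs = np.concatenate([signal_plus_noise, pure_noise])
print(mixed_detection_estimation(obs))        # estimate near 2.0 at high SNR
```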
2. Uncertainty analysis of integrated powerhead demonstrator mass flowrate testing and modeling
Molder, King Jeffries, 06 August 2005
A methodology has been developed to quantify the simulation uncertainty of a computational model calibrated against test data. All test data used in the study undergo an experimental uncertainty analysis. The modeling software ROCETS is used and its structure is explained, and the way the model was calibrated is presented. Next, a general simulation uncertainty analysis methodology is presented that is valid for calibrated models. Finally, the simulation uncertainty of the calibrated ROCETS model is calculated using the general methodology and compared against a second set of comparison test data. The simulation uncertainty analysis methodology developed and implemented can be used with any calibrated model, and it works well for a process of incremental testing and recalibration of the model whenever new test data become available.
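The abstract does not give the uncertainty equations; one common recipe, shown here only as a hedged sketch, is to combine the model-versus-data residual with the experimental uncertainty in quadrature. The function and the mass flow rate numbers below are invented for illustration, not taken from the thesis.

```python
import numpy as np

def simulation_uncertainty(model_pred, test_data, u_experimental):
    """Hedged sketch of one common recipe (not necessarily the thesis's
    exact methodology): estimate a calibrated model's simulation
    uncertainty from its residuals against comparison test data,
    combined in quadrature with the experimental uncertainty.
    """
    residuals = model_pred - test_data
    u_model = np.sqrt(np.mean(residuals**2))        # RMS model-data disagreement
    return np.sqrt(u_model**2 + u_experimental**2)  # root-sum-square combination

# Example with made-up mass flow rate numbers (kg/s)
pred = np.array([10.1, 12.3, 14.8])
data = np.array([10.0, 12.5, 14.6])
print(simulation_uncertainty(pred, data, u_experimental=0.15))
```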
3. An Investigation of the Behavior of Structural Systems with Modeling Uncertainties
Hardyniec, Andrew B., 24 March 2014
Recent advancements in earthquake engineering have caused a movement toward a probabilistic quantification of the behavior of structural systems. Analysis characteristics, such as ground motion records, material properties, and structural component behavior, are defined by probabilistic distributions. The response is also characterized probabilistically, with distributions fitted to analysis results at intensity levels ranging from the maximum considered earthquake ground motion to collapse. Despite the progress toward a probabilistic framework, the variability in structural analysis results due to modeling techniques has not been considered.
This work investigates the uncertainty associated with modeling geometric nonlinearities and Rayleigh damping models on the response of planar frames at multiple ground motion intensity levels. First, an investigation is presented on geometric nonlinearity approaches for planar frames, followed by a critical review of current damping models. Three frames, a four-story buckling restrained braced frame, a four-story steel moment resisting frame, and an eight-story steel moment resisting frame, are compared using two geometric nonlinearity approaches and five Rayleigh damping models. Static pushover analyses are performed on the models in the geometric nonlinearities study, and incremental dynamic analyses are performed on all models to compare the response at the design basis earthquake (DBE) ground motion, maximum considered earthquake (MCE) ground motion, and collapse intensity levels. The results indicate noticeable differences in the responses at the DBE and MCE levels and significant differences in the responses at the collapse level. Analysis of the sidesway collapse mechanisms indicates a shift in the behavior corresponding to the different modeling assumptions, though the effects were specific to each frame.
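For reference, the classical Rayleigh damping construction that underlies such damping models can be written down directly. The sketch below computes the mass- and stiffness-proportional coefficients from two target frequencies; it shows the standard textbook formulation, not the five specific model variants compared in the thesis.

```python
import numpy as np

def rayleigh_coefficients(zeta, omega_i, omega_j):
    """Classical Rayleigh damping C = a0*M + a1*K, with coefficients
    chosen so the damping ratio equals zeta at the two target circular
    frequencies omega_i and omega_j (standard construction).
    """
    a0 = 2.0 * zeta * omega_i * omega_j / (omega_i + omega_j)  # mass-proportional
    a1 = 2.0 * zeta / (omega_i + omega_j)                      # stiffness-proportional
    return a0, a1

def damping_ratio(a0, a1, omega):
    """Resulting damping ratio at any circular frequency omega."""
    return 0.5 * (a0 / omega + a1 * omega)

# Anchor 5% damping at 1 Hz and 3 Hz; ratio dips slightly between anchors
a0, a1 = rayleigh_coefficients(zeta=0.05, omega_i=2 * np.pi * 1.0, omega_j=2 * np.pi * 3.0)
print(damping_ratio(a0, a1, 2 * np.pi * 2.0))
```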
The FEMA P-695 Methodology provided a framework that defined the static and dynamic analyses performed during the modeling uncertainties studies. However, the Methodology is complex and the analyses are computationally expensive. To expedite the analyses and manage the results, a toolkit was created that streamlines the process using a set of interconnected modules. The toolkit organizes data and reduces mistakes for users familiar with the process, while serving as an educational tool that steps newcomers through the intricacies of the Methodology.
The collapse margin ratio (CMR), calculated in the Methodology, was used to compare the collapse behavior of the models in the modeling uncertainties study. Though it provides a simple scalar quantity for comparison, calculating the CMR typically requires the full set of incremental dynamic analysis curves, which demands prohibitively long analysis times for complex models. To reduce the computational cost of calculating the CMR, a new parallel computing method, referred to as the fragility search method, was devised that uses approximate collapse fragility curves to converge quickly on the median collapse intensity value. The new method is shown to have favorable attributes compared to other parallel computing methods for determining the CMR.
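The abstract only summarizes the method, so the following is a hedged sketch of the general idea: run batches of ground-motion analyses in parallel at a trial intensity, fit a lognormal fragility curve to the observed collapse fractions, and jump to the fitted median. The run_analysis callback, the search bounds, and the fitting details are all assumptions, not the thesis's implementation.

```python
import numpy as np
from scipy.stats import norm

def fragility_search(run_analysis, records, s_lo, s_hi, tol=0.01, max_rounds=10):
    """Sketch of a fragility-curve-guided search for the median collapse
    intensity. run_analysis(record, s) is a hypothetical callback that
    returns True if the record causes collapse at intensity s.
    """
    trials, collapse_fracs = [], []
    s = 0.5 * (s_lo + s_hi)
    for _ in range(max_rounds):
        # the batch over ground-motion records is embarrassingly parallel
        fails = [run_analysis(rec, s) for rec in records]
        trials.append(s)
        collapse_fracs.append(np.mean(fails))
        # fit lognormal fragility P(collapse|s) = Phi((ln s - ln m) / beta)
        z = norm.ppf(np.clip(collapse_fracs, 0.01, 0.99))
        if len(trials) >= 2 and np.ptp(z) > 0:
            beta, c = np.polyfit(z, np.log(trials), 1)  # ln s = beta*z + ln m
            s_new = float(np.exp(c))                    # fitted median intensity
        else:
            s_new = s * (2.0 if collapse_fracs[-1] < 0.5 else 0.5)
        if abs(s_new - s) < tol * s:
            return s_new
        s = float(np.clip(s_new, s_lo, s_hi))
    return s
```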
4. Encapsulation and abstraction for modeling and visualizing information uncertainty
Streit, Alexander, January 2008
Information uncertainty is inherent in many real-world problems and adds a layer of complexity to modeling and visualization tasks. This often causes users to ignore uncertainty, especially when it comes to visualization, thereby discarding valuable knowledge. A coherent framework for the modeling and visualization of information uncertainty is needed to address this issue.

In this work, we have identified four major barriers to the uptake of uncertainty modeling and visualization. Firstly, there are numerous uncertainty modeling techniques, and users are required to anticipate their uncertainty needs before building their data model. Secondly, parameters of uncertainty tend to be treated at the same level as variables, making it easy to introduce avoidable errors; this causes the uncertainty technique to dictate the structure of the data model. Thirdly, propagation of uncertainty information must be manually managed, which requires user expertise, is error prone, and can be tedious. Finally, uncertainty visualization techniques tend to be developed for particular uncertainty types, making them largely incompatible with other forms of uncertainty information. This narrows the choice of visualization techniques and results in a tendency for ad hoc uncertainty visualization.

The aim of this thesis is to present an integrated information uncertainty modeling and visualization environment with the following main features: information and its uncertainty are encapsulated into atomic variables, the propagation of uncertainty is automated, and visual mappings are abstracted from the uncertainty information data type. Spreadsheets have previously been shown to be well suited as an approach to visualization; in this thesis, we devise a new paradigm extending the traditional spreadsheet to intrinsically support information uncertainty.

Our approach is to design a framework that integrates uncertainty modeling techniques into a hierarchical order based on levels of detail. The uncertainty information is encapsulated and treated as a unit, allowing users to think of their data model in terms of the variables instead of the uncertainty details. The system is intrinsically aware of the encapsulated uncertainty and is therefore able to automatically select appropriate uncertainty propagation methods. A user-objectives-based approach to uncertainty visualization is developed to guide the visual mapping of abstracted uncertainty information.

Two main abstractions of uncertainty information are explored for the purpose of visual mapping: the Unified Uncertainty Model and the Dual Uncertainty Model. The Unified Uncertainty Model provides a single view of uncertainty for visual mapping, whereas the Dual Uncertainty Model distinguishes between possibilistic and probabilistic views. Such abstractions provide a buffer between the visual mappings and the uncertainty type of the underlying data, enabling the user to change the uncertainty detail without causing the visualization to fail.

Two main case studies are presented. The first covers exploratory and forecasting tasks in a business planning context; the second investigates sensitivity analysis for financial decision support. Two minor case studies are also included: one investigates the relevancy visualization objective applied to business process specifications, and the other explores the extensibility of the system through General Purpose Graphics Processing Unit (GPGPU) use.
A quantitative analysis compares our approach to traditional analytical and numerical spreadsheet-based approaches, and two surveys were conducted to gain feedback from potential users. The significance of this work is that we reduce barriers to uncertainty modeling and visualization in three ways: users do not need a mathematical understanding of the uncertainty modeling technique to use it; uncertainty information is easily added, changed, or removed at any stage of the process; and uncertainty visualizations can be built independently of the uncertainty modeling technique.
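The encapsulation idea can be made concrete with a small sketch: a value type that carries its own uncertainty and propagates it through arithmetic automatically, so formulas need no extra bookkeeping. This assumes first-order Gaussian propagation with independent operands, which is only one of the propagation methods such a system might select; the class and its API are illustrative, not the thesis's implementation.

```python
import math

class Uncertain:
    """Minimal sketch of encapsulating a value with its uncertainty so
    that propagation through arithmetic is automatic (first-order
    Gaussian propagation under an independence assumption).
    """
    def __init__(self, value, sigma=0.0):
        self.value, self.sigma = value, sigma

    def __add__(self, other):
        other = other if isinstance(other, Uncertain) else Uncertain(other)
        # absolute uncertainties add in quadrature for a sum
        return Uncertain(self.value + other.value,
                         math.hypot(self.sigma, other.sigma))

    def __sub__(self, other):
        other = other if isinstance(other, Uncertain) else Uncertain(other)
        return Uncertain(self.value - other.value,
                         math.hypot(self.sigma, other.sigma))

    def __mul__(self, other):
        other = other if isinstance(other, Uncertain) else Uncertain(other)
        v = self.value * other.value
        # relative uncertainties add in quadrature for a product
        s = abs(v) * math.hypot(self.sigma / self.value if self.value else 0.0,
                                other.sigma / other.value if other.value else 0.0)
        return Uncertain(v, s)

    def __repr__(self):
        return f"{self.value:.4g} ± {self.sigma:.2g}"

# spreadsheet-style formula: cells carry uncertainty without extra bookkeeping
revenue = Uncertain(1200.0, 50.0) * Uncertain(0.9, 0.05)
costs = Uncertain(800.0, 30.0)
print(revenue - costs)   # uncertainty propagated automatically
```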
5. Improved Robust Stability Bounds for Sampled Data Systems with Time Delayed Feedback Control
Kurudamannil, Jubal J., 15 May 2015
No description available.