1

Statistics of Met-Ocean Conditions Between West and Central Gulf of Mexico Based on Field Measurements

Su, Lin May 2012 (has links)
Statistics of met-ocean conditions, including wind, current, and waves, at a location between the west and central Gulf of Mexico (GOM) are derived from about three years of field measurements. A two-parameter Weibull distribution was employed to fit wind speed at 10 m above sea level and current speeds at various depths. Joint probability contours were derived using the First-Order Reliability Method (FORM). In addition, the joint distribution of wind speed and direction was visualized with a wind-rose diagram. The results characterize the probability distributions of met-ocean conditions at this location and can serve as a reference for future designs.
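As an illustration of the fitting step, the sketch below fits a two-parameter Weibull distribution to a wind-speed record with SciPy; the synthetic sample and the 20 m/s design threshold are placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical hourly wind speeds at 10 m (m/s); in the study these would
# come from the ~3 years of field measurements.
rng = np.random.default_rng(0)
wind_speed = rng.weibull(2.0, size=26280) * 8.0  # shape k=2, scale 8 m/s

# Fit a two-parameter Weibull (location fixed at 0, as is standard
# for wind-speed statistics).
shape, loc, scale = stats.weibull_min.fit(wind_speed, floc=0)
print(f"Weibull shape k = {shape:.2f}, scale c = {scale:.2f} m/s")

# Probability of exceeding a design wind speed, e.g. 20 m/s.
p_exceed = stats.weibull_min.sf(20.0, shape, loc=0, scale=scale)
print(f"P(V > 20 m/s) = {p_exceed:.4f}")
```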
2

Experimental observation of turbulent structure at region surrounding the mid-channel braid bar

Khan, M.A., Sharma, N., Pu, Jaan H., Pandey, M., Azamathulla, H. 08 April 2021 (has links)
No / River morphological processes are among the most complex and least understood phenomena in nature. Recent research indicates that braiding of marine waterways in estuary zones occurs at aspect ratios similar to those of alluvial braided rivers. The instability of complex, sporadic fluvial processes at the river-sea interface is responsible for bar formation in alluvial as well as marine waterbodies. Because flow characteristics around bars are poorly understood, this study analyzes the flow structure around a mid-channel sand bar. Bursting events play a crucial role in understanding fluvial behaviour in the vicinity of a submerged structure, and their study around mid-channel bars has so far been undertaken only by the present authors. The effect of submergence ratio on turbulence behaviour near the bar is analyzed, and the flow turbulence generated by the mid-channel bar is examined in detail. Extreme turbulent bursts are segregated from low-intensity events using the hole-size concept. The effect of hole size on the Momentum Dominance Function (MDF), not previously examined for mid-channel bars, is analysed: the MDF increases with hole size, indicating that the magnitude of the upward flux grows as weaker events are excluded. The effect of bar height on turbulent bursts, likewise unstudied until now, is also analyzed. The joint probability distribution of bursting events is modelled using the Gram-Charlier bivariate joint probability function, which gives a probabilistic description of the flow structure near the bar. The effect of the bar is predominant only in the lower flow layer, and the joint probability distribution becomes more eccentric toward the dominant quadrants as the submergence ratio increases, indicating that the probability of dominant events rises with the submergence ratio.
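The hole-size segregation of bursting events can be sketched as a quadrant analysis of velocity fluctuations; the synthetic u', w' series below stands in for measured velocities, and the hole-size value is illustrative.

```python
import numpy as np

def quadrant_fractions(u, w, hole_size=0.0):
    """Classify velocity fluctuations into the four bursting quadrants,
    discarding weak events inside the 'hole' |u'w'| <= H * mean|u'w'|."""
    up, wp = u - u.mean(), w - w.mean()          # fluctuations u', w'
    uw = up * wp
    threshold = hole_size * np.abs(uw).mean()    # hole-size cutoff
    strong = np.abs(uw) > threshold              # events outside the hole
    quads = {
        "Q1 outward interaction": (up > 0) & (wp > 0) & strong,
        "Q2 ejection":            (up < 0) & (wp > 0) & strong,
        "Q3 inward interaction":  (up < 0) & (wp < 0) & strong,
        "Q4 sweep":               (up > 0) & (wp < 0) & strong,
    }
    return {name: mask.mean() for name, mask in quads.items()}

# Synthetic correlated fluctuations standing in for velocimeter data.
rng = np.random.default_rng(1)
u = rng.normal(0.5, 0.1, 10000)
w = -0.3 * (u - 0.5) + rng.normal(0.0, 0.05, 10000)
print(quadrant_fractions(u, w, hole_size=2.0))
```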
3

Probabilistic Tropical Cyclone Surge Hazard Under Future Sea-Level Rise Scenarios: A Case Study in The Chesapeake Bay Region, USA

Kim, Kyutae 11 July 2023 (has links)
Storm surge flooding caused by tropical cyclones is a devastating threat to coastal regions, and this threat is growing due to sea-level rise (SLR). Therefore, accurate and rapid projection of the storm surge hazard is critical for coastal communities. This study focuses on developing a new framework that can rapidly predict storm surges under SLR scenarios for any synthetic storm of interest and assign a probability to its occurrence. The framework leverages the Joint Probability Method with Response Surfaces (JPM-RS) for probabilistic hazard characterization, a storm surge machine learning model, and an SLR model. The JPM probabilities are based on historical tropical cyclone track observations. The storm surge machine learning model was trained on high-fidelity storm surge simulations provided by the U.S. Army Corps of Engineers (USACE). SLR was incorporated by adding the product of the normalized nonlinearity arising from surge-SLR interaction and the sea-level change from 1992 to the target year, where the nonlinearities are based on high-fidelity storm surge simulations and subsequent analysis by USACE. The framework was applied to the Chesapeake Bay region of the U.S. to estimate the SLR-adjusted probabilistic tropical cyclone flood hazard in two areas: an urban Virginia site and a rural Maryland site. This new framework has the potential to aid in reducing future coastal storm risks in coastal communities by providing robust and rapid hazard assessment that accounts for future sea-level rise. / Master of Science / Storm surge flooding, the rise in sea level caused by tropical cyclones and other storms, is a devastating threat to coastal regions, and its impact is increasing due to sea-level rise (SLR). This poses a considerable risk to communities living near the coast, so it is crucial to predict potential storm surge flooding accurately and quickly. This study developed a new approach that rapidly estimates peak storm surges under different sea-level rise scenarios for any synthetic storm of interest and assesses the likelihood of its occurrence. The approach is based on historical tropical cyclone datasets and a machine learning model trained on high-quality simulations provided by the US Army Corps of Engineers (USACE). The study focused on the Chesapeake Bay area of the US and estimated the probabilistic tropical cyclone flood hazard at two locations, an urban site in Virginia and a rural site in Maryland. This new approach has the potential to assist in reducing coastal storm risks in vulnerable communities by providing a quick and reliable hazard assessment that takes into account the effects of future sea-level rise.
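The core JPM bookkeeping, summing storm occurrence rates over storms whose surge exceeds a threshold, can be sketched as follows; the storm set, rates, and surge values are placeholders (in the framework above the surges would come from the machine learning model plus the SLR adjustment).

```python
import numpy as np

# Hypothetical synthetic storm set: each storm carries an annual
# occurrence rate from the storm climatology and a peak surge at the site.
rng = np.random.default_rng(2)
n_storms = 500
rate = np.full(n_storms, 0.05 / n_storms)   # total rate 0.05 storms/yr
surge = rng.lognormal(mean=0.3, sigma=0.5, size=n_storms)  # metres

def annual_exceedance(threshold):
    """JPM-style annual rate of surge exceeding a threshold:
    the sum of occurrence rates of storms whose surge exceeds it."""
    return rate[surge > threshold].sum()

for eta in [1.0, 2.0, 3.0]:
    lam = annual_exceedance(eta)
    if lam > 0:
        print(f"surge > {eta:.1f} m: rate {lam:.5f}/yr, "
              f"return period ~{1.0 / lam:.0f} yr")
    else:
        print(f"surge > {eta:.1f} m: not exceeded by any synthetic storm")
```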
4

Elevation based classification of streams and establishment of regime equations for predicting bankfull channel geometry

Jha, Rajan 06 September 2013 (has links)
For more than a hundred years, fluvial geomorphologists across the globe have been trying to understand the basic phenomena and processes that control the behaviour of streams. A large number of stream classification systems have been proposed to date, but none has been universally accepted. Lately, considerable effort has been devoted to developing bankfull relations for estimating channel geometry that can be employed in stream restoration practice. Addressing these two objectives, this study develops a new stream classification system based on elevation above mean sea level and then, using elevation as an independent and nondimensionalising parameter, develops universal and regional dimensionless regime equations for predicting channel geometry at bankfull conditions. For the first objective, 873 field measurements describing the hydraulic geometry and morphology of streams, mainly from Canada, the UK, and the USA, were compiled and statistically analyzed. Based on similar modal values of three dimensionless channel variables (aspect ratio, sinuosity, and channel slope), several fine elevation ranges were merged to produce five final elevation ranges. These five zones form the basis of the new elevation-based classification system and are identified by their unique modal values of the dimensionless variables. Joint probability distributions computed for each zone reveal trends in the behaviour of channel variables when moving from lowland to upland. For the second objective, 405 of the initial 873 data points were selected and used to develop bankfull relations, with bankfull discharge and watershed variables as input variables. Regression equations developed for width and depth established bankfull discharge as the only required input variable, with all other watershed variables proving relatively insignificant. The channel slope equation showed no dependence on bankfull discharge and was influenced only by drainage area and valley slope. When bankfull discharge was replaced by annual average rainfall as the input variable, watershed parameters (drainage area, forest cover, urban cover, etc.) became significant in the bankfull width and depth regressions. This suggests that bankfull discharge itself encompasses the effects of all watershed variables and associated processes and is thus sufficient for estimating channel dimensions; indeed, a regression of bankfull discharge on watershed and rainfall variables demonstrated its strong dependence on them. / Master of Science
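Regime equations of this kind are power laws fitted in log-log space; the sketch below fits w = a·Q^b to a small hypothetical sample, not the study's 405-point dataset.

```python
import numpy as np

# Hypothetical bankfull observations: discharge Q (m^3/s), width w (m).
Q = np.array([2.1, 5.5, 12.0, 30.0, 85.0, 210.0, 540.0])
w = np.array([3.2, 5.1, 7.9, 12.5, 21.0, 33.0, 52.0])

# Regime equations are power laws, w = a * Q^b, so fit a straight line
# in log-log space: log w = log a + b log Q.
b, log_a = np.polyfit(np.log(Q), np.log(w), 1)
a = np.exp(log_a)
print(f"w = {a:.2f} * Q^{b:.2f}")   # an exponent near 0.5 is typical

# Predict bankfull width for a new discharge.
print(f"predicted width at Q = 100 m^3/s: {a * 100**b:.1f} m")
```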
5

An evaluation of changing profit risks in Kansas cattle feeding operations

Herrington, Matthew Abbott January 1900 (has links)
Master of Science / Department of Agricultural Economics / Ted C. Schroeder / Glynn T. Tonsor / Cattle feeders face significant profit risk when placing cattle on feed. Risks arise from both financial and biological sources. To date, few standardized measures exist for comparing current risks against historic levels or for obtaining forward-looking risk estimates, and those that do exist would benefit from updates and the inclusion of additional risk elements. This study measures the risk of expected profits at the time cattle are placed on feed. It creates a forward-looking estimate of expected feedlot profits using futures and options market data as price forecasts. Joint probability distributions are created for the prices and cattle performance variables affecting feedlot profit margins, and Monte Carlo simulation is then employed to generate probability distributions of expected feedlot profits. Results show cattle feeding is a risky business: cattle feeders have been placing cattle on feed facing significantly negative expected returns since June 2010, an assessment consistent with other findings. Over the study's 2002 to 2013 time frame, the share of cattle feeding profit risk accounted for by feed costs increased, while the relative risk contributions of feeder cattle and fed cattle prices remained steady. Additionally, the probability of realized per-head profits greater than $100 has been decreasing since 2009, while the probability of realized per-head losses greater than $100 has been increasing rapidly.
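The simulation step can be sketched as follows: draw correlated prices from a joint distribution and accumulate a per-head profit distribution. All means, correlations, and cost figures below are placeholders, not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Hypothetical joint distribution of three price risks:
# feeder cattle ($/cwt), fed cattle ($/cwt), corn ($/bu).
mean = np.array([155.0, 125.0, 6.50])
sd   = np.array([8.0, 7.0, 0.80])
corr = np.array([[1.0, 0.7, 0.2],
                 [0.7, 1.0, 0.3],
                 [0.2, 0.3, 1.0]])
cov = corr * np.outer(sd, sd)
feeder, fed, corn = rng.multivariate_normal(mean, cov, size=n).T

# Simple per-head profit: revenue on a 13.5 cwt fed animal minus the
# cost of a 7.5 cwt feeder, 55 bu of corn, and fixed yardage.
profit = fed * 13.5 - feeder * 7.5 - corn * 55 - 120.0

print(f"E[profit]        = ${profit.mean():.0f}/head")
print(f"P(profit > 100)  = {(profit > 100).mean():.2f}")
print(f"P(profit < -100) = {(profit < -100).mean():.2f}")
```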
6

A random matrix model for two-colour QCD at non-zero quark density

Phillips, Michael James January 2011 (has links)
We solve a random matrix ensemble called the chiral Ginibre orthogonal ensemble, or chGinOE. This non-Hermitian ensemble has applications to modelling particular low-energy limits of two-colour quantum chromodynamics (QCD). In particular, the matrices model the Dirac operator for quarks in the presence of a gluon gauge field of fixed topology, with an arbitrary number of flavours of virtual quarks and a non-zero quark chemical potential. We derive the joint probability density function (JPDF) of eigenvalues for this ensemble for finite matrix size N, which we then write in a factorised form. We then present two different methods for determining the correlation functions, resulting in compact expressions involving Pfaffians containing the associated kernel. We determine the microscopic large-N limits at strong and weak non-Hermiticity (required for physical applications) for both the real and complex eigenvalue densities. Various other properties of the ensemble are also investigated, including the skew-orthogonal polynomials and the fraction of eigenvalues that are real. A number of the techniques that we develop have more general applicability within random matrix theory, some of which we also explore in this thesis.
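The fraction of real eigenvalues can be illustrated numerically. The sketch below samples the plain (non-chiral) real Ginibre ensemble as a simpler stand-in for the chGinOE, which has an additional chiral block structure.

```python
import numpy as np

def real_eigenvalue_fraction(n, trials=200, rng=None):
    """Sample real Ginibre matrices (iid N(0,1) entries) and return the
    average fraction of eigenvalues that are real."""
    rng = rng or np.random.default_rng(4)
    frac = 0.0
    for _ in range(trials):
        G = rng.normal(size=(n, n))
        ev = np.linalg.eigvals(G)
        frac += (ev.imag == 0).mean()   # real matrices give exactly real pairs
    return frac / trials

# For the real Ginibre ensemble the expected number of real eigenvalues
# grows like sqrt(2N/pi), so the fraction decays as N grows.
for n in [4, 16, 64]:
    print(n, f"{real_eigenvalue_fraction(n):.3f}")
```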
7

Computation of context as a cognitive tool

Sanscartier, Manon Johanne 09 November 2006
In the field of cognitive science, as well as the area of Artificial Intelligence (AI), the role of context has been investigated in many forms and for many purposes. It is clear in both areas that consideration of contextual information is important. However, the significance of context has not been emphasized in the Bayesian networks literature. We suggest that consideration of context is necessary for acquiring knowledge about a situation and for refining current representational models that are potentially erroneous due to hidden independencies in the data.

In this thesis, we make several contributions towards the automation of contextual consideration by discovering useful contexts from probability distributions. We show how context-specific independencies in Bayesian networks, and discovery algorithms traditionally used for efficient probabilistic inference, can contribute to the identification of contexts and in turn provide insight into otherwise puzzling situations. Consideration of context can also help clarify otherwise counterintuitive puzzles, such as those that result in instances of Simpson's paradox. In the social sciences, the branch of attribution theory is context-sensitive; we suggest a method to distinguish between dispositional causes and situational factors by means of contextual models. Finally, we address the work of Cheng and Novick dealing with causal attribution by human adults. Their probabilistic contrast model makes use of contextual information, called focal sets, that must be determined by a human expert. We suggest a method for discovering complete focal sets from probability distributions, without the human expert.
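Context-specific independence can be read directly off a conditional probability table: in a context where the table's rows coincide, the other parent is irrelevant. A minimal sketch with a hypothetical CPT:

```python
import numpy as np

# Hypothetical CPT for P(X=1 | Y, C), indexed as cpt[y, c].
cpt = np.array([[0.9, 0.3],    # P(X=1 | Y=0, C=0), P(X=1 | Y=0, C=1)
                [0.9, 0.7]])   # P(X=1 | Y=1, C=0), P(X=1 | Y=1, C=1)

def csi_contexts(cpt, tol=1e-9):
    """Return the contexts c in which X is independent of Y given C=c,
    i.e. the columns of the CPT that are constant across values of Y."""
    return [c for c in range(cpt.shape[1])
            if np.ptp(cpt[:, c]) <= tol]   # ptp = max - min over Y

print(csi_contexts(cpt))   # [0]: in context C=0, P(X | Y, C=0) ignores Y
```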
8

Development and Application of Probabilistic Decision Support Framework for Seismic Rehabilitation of Structural Systems

Park, Joonam 22 November 2004 (has links)
Seismic rehabilitation of structural systems is an effective approach for reducing potential seismic losses such as social and economic losses. However, little or no effort has been made to develop a framework for making decisions on seismic rehabilitation of structural systems that systematically incorporates conflicting multiple criteria and uncertainties inherent in the seismic hazard and in the systems themselves. This study develops a decision support framework for seismic rehabilitation of structural systems incorporating uncertainties inherent in both the system and the seismic hazard, and demonstrates its application with detailed examples. The decision support framework developed utilizes the HAZUS method for a quick and extensive estimation of seismic losses associated with structural systems. The decision support framework allows consideration of multiple decision attributes associated with seismic losses, and multiple alternative seismic rehabilitation schemes represented by the objective performance level. Three multi-criteria decision models (MCDM) that are known to be effective for decision problems under uncertainty are employed and their applicability for decision analyses in seismic rehabilitation is investigated. These models are Equivalent Cost Analysis (ECA), Multi-Attribute Utility Theory (MAUT), and Joint Probability Decision Making (JPDM). Guidelines for selection of a MCDM that is appropriate for a given decision problem are provided to establish a flexible decision support system. The resulting decision support framework is applied to a test bed system that consists of six hospitals located in the Memphis, Tennessee, area to demonstrate its capabilities.
9

LANE TRACKING USING DEPENDENT EXTENDED TARGET MODELS

Akbari, Behzad January 2021 (has links)
Detection of multiple lane markings (lane-lines) on road surfaces is an essential aspect of autonomous vehicles. Although several approaches have been proposed to detect lanes, detecting multiple lane-lines consistently, particularly across a stream of frames and under varying lighting conditions, is still a challenging problem. Since road markings are designed to be smooth and parallel, lane-line sampled features tend to be spatially and temporally correlated within and between frames. In this thesis, we develop novel methods to model these spatial and temporal dependencies in the form of a target tracking problem. Instead of resorting to the conventional method of processing each frame to detect lanes only in the space domain, we treat the overall problem as a Multiple Extended Target Tracking (METT) problem.

In the first step, we modelled lane-lines as multiple "independent" extended targets and developed a spline mathematical model for the shape of the targets, showing that extending the estimation across the time domain improves the results. We identify a set of control points for each spline, which are tracked over time. To overcome the clutter problem, we developed an integrated probabilistic data association filter (IPDAF) as our basis and formulated a METT algorithm to track multiple splines corresponding to each lane-line.

In the second part of our work, we investigated the coupling between multiple extended targets. We considered the non-parametric case and modelled target dependency using a Multi-Output Gaussian Process, showing that accounting for the dependency between extended targets improves shape estimation. We exploit this dependency through a novel recursive approach called the Multi-Output Spatio-Temporal Gaussian Process Kalman Filter (MO-STGP-KF), which we used to estimate and track multiple dependent lane markings that are possibly degraded or obscured by traffic. Our method was tested on tracking multiple lane-lines but can be employed to track multiple dependent rigid-shape targets by using the measurement model in the radial space.

In the third section, we developed a Spatio-Temporal Joint Probabilistic Data Association Filter (ST-JPDAF). In multiple extended target tracking problems with clutter, extended targets sometimes share measurements: for example, in lane-line detection, when two lane markings pass or merge together. In single-point target tracking, this problem can be solved using the well-known Joint Probabilistic Data Association (JPDA) filter; in the single-point case, even when measurements are dependent, they can be stacked in the coupled form of JPDA. In this last chapter, we extended JPDA to track multiple dependent extended targets using ST-JPDAF, managing the dependency of measurements in space (within a frame) and time (between frames) with different kernel functions that can be learned from training data. This extension can track the shape and dynamics of dependent extended targets within clutter when targets share measurements.

The performance of the proposed methods in all three parts is quantified on real data scenarios, and their results are compared against well-known model-based, semi-supervised, and fully-supervised methods, offering very promising results. / Thesis / Doctor of Philosophy (PhD)
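As a minimal illustration of the recursive estimation underlying these filters, the sketch below runs a linear Kalman filter on the lateral position of a single spline control point across frames; the motion and measurement models are simple placeholders, not the MO-STGP-KF or ST-JPDAF themselves.

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of a linear Kalman filter."""
    # Predict with the motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the measurement z.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Track the lateral position of one spline control point across frames
# with a near-constant-position motion model.
F = np.eye(1); Q = 0.01 * np.eye(1)   # dynamics and process noise
H = np.eye(1); R = 0.25 * np.eye(1)   # direct, noisy observation
x, P = np.array([0.0]), np.eye(1)

rng = np.random.default_rng(5)
truth = 2.0                            # true lateral offset (m)
for frame in range(5):
    z = np.array([truth + rng.normal(0, 0.5)])   # noisy detection
    x, P = kalman_step(x, P, z, F, Q, H, R)
    print(f"frame {frame}: estimate {x[0]:.2f} m, var {P[0, 0]:.3f}")
```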
