111 |
Improving Detection And Prediction Of Bridge Scour Damage And Vulnerability Under Extreme Flood Events Using Geomorphic And Watershed DataAnderson, Ian 01 January 2018 (has links)
Bridge scour is the leading cause of bridge damage nationwide. Successfully mitigating bridge scour problems depends on our ability to reliably estimate scour potential, design safe and economical foundation elements that account for scour potential, identify vulnerabilities related to extreme events, and recognize changes to the environmental setting that increase risk at existing bridges.
This study leverages available information, gathered from several statewide resources, and adds watershed metrics to create a comprehensive, georeferenced dataset to identify parameters that correlate to bridges damaged in an extreme flood event. Understanding the underlying relationships between existing bridge condition, fluvial stresses, and geomorphological changes is key to identifying vulnerabilities in both existing and future bridge infrastructure. In creating this comprehensive database of bridge inspection records and associated damage characterization, features were identified that correlate to and discriminate between levels of bridge damage.
Stream geomorphic assessment features were spatially joined to every bridge, marking the first time that geomorphic assessments have been broadly used for estimating bridge vulnerability. Stream power assessments and watershed delineations for every bridge and stream reach were generated to supplement the comprehensive database. Individual features were tested for their significance in discriminating bridge damage, and then used to create empirical fragility curves and probabilistic prediction maps to aid future bridge vulnerability detection. Damage to over 300 Vermont bridges from a single extreme flood event, Tropical Storm Irene on August 28, 2011, was used as the basis for this study. Damage to historic bridges was also summarized and tabulated. In some areas of Vermont, the storm rainfall recurrence interval exceeded 500 years, causing widespread flooding and damaging over 300 bridges. With a dataset of over 330 features for more than 2,000 observations of bridges that were damaged or undamaged in the storm, an advanced evolutionary algorithm performed multivariate feature selection to overcome the shortfalls of traditional logistic regression analysis. The analysis identified distinct combinations of variables that correlate to the observed bridge damage under extreme flood events.
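The empirical fragility curves described above can be sketched, under assumed synthetic data, as a logistic regression mapping a hydraulic predictor to damage probability. The predictor name `stream_power` and all values below are invented for illustration, not taken from the study's dataset:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical illustration: fit a damage-probability ("fragility")
# curve from binary damage observations and a single predictor
# (e.g. unit stream power). Data are synthetic.
rng = np.random.default_rng(0)
stream_power = rng.uniform(0, 500, size=2000)          # W/m^2, synthetic
p_true = 1 / (1 + np.exp(-(stream_power - 250) / 60))  # latent fragility
damaged = rng.random(2000) < p_true                    # observed 0/1 damage

model = LogisticRegression()
model.fit(stream_power.reshape(-1, 1), damaged)

# Empirical fragility curve: P(damage | stream power) on a grid
grid = np.linspace(0, 500, 6).reshape(-1, 1)
fragility = model.predict_proba(grid)[:, 1]
print(fragility.round(2))  # probabilities rise with stream power
```

A multivariate version of this fit, with feature subsets chosen by an evolutionary search rather than by hand, is the kind of analysis the abstract describes.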
|
112 |
Local farmer knowledge of adaptive management on diversified vegetable and berry farms in the northeastern USWhite, Alissa 01 January 2019 (has links)
Agricultural adaptation to climate change is notoriously context specific. Recently updated projections for the Northeastern US forecast increasingly severe and erratic precipitation events which pose significant risks to every sector of agricultural production in the region. Vegetable and berry farmers are among the most vulnerable to the risks of severe precipitation and drought due to the intensive soil and crop management strategies which characterize this kind of production. To successfully adapt to a changing climate, these farmers need information which is tailored for the unique challenges of vegetable and berry production, framed at the level of climate impacts, and delivered through the familiar lexicon used by farmers in the region.
My approach is grounded in partnerships with farmer networks, which inform both the relevance of this information and my outreach strategy for sharing results. This research presents complementary quantitative and qualitative data sets on adaptive management, and highlights the insights of farmers' voices on innovative and promising solutions for managing climate-related risks.
The goal of the project was to create usable information for producers through a Farmer First approach, which privileges the voices and experiences of farmers in determining the information and resources they need. As part of a broader project, this thesis analyzed the results of a regional survey of vegetable and berry growers conducted over the winter months of 2017-2018. The first chapter reviews theoretical foundations for academic study of agricultural management and climate change, with a focus on information usability. The second chapter applies theories of adaptation and resilience to identify agroecological principles for adapting farm management to water extremes and innovative practices emerging in the region. The third chapter uses a regression modelling approach to explore how adaptive management practices vary across site-specific characteristics.
Our analysis identifies trends and principles for adapting to water excess and water deficits on diversified vegetable and berry farms in the Northeast. The research findings highlight how site characteristics influence the selection of adaptive management practices on farms in the Northeast.
|
113 |
Hypothesis Testing for High-Dimensional Regression Under Extreme Phenotype Sampling of Continuous TraitsJanuary 2018 (has links)
Extreme phenotype sampling (EPS) is a broadly used design to identify candidate genetic factors contributing to the variation of quantitative traits. By enriching the signals in the extreme phenotypic samples within the top and bottom percentiles, EPS can boost study power compared with random sampling of the same sample size. The existing statistical methods for EPS data test the variants/regions individually. However, many disorders are caused by multiple genetic factors. Therefore, it is critical to simultaneously model the effects of genetic factors, which may increase the power of current genetic studies and identify novel disease-associated genetic factors under EPS. The challenge of the simultaneous analysis of genetic data is that the number (p ~10,000) of genetic factors is typically greater than the sample size (n ~1,000) in a single study. The standard linear model would be inappropriate for this p>n problem due to the rank deficiency of the design matrix. An alternative solution is to apply a penalized regression method, the least absolute shrinkage and selection operator (LASSO).
LASSO can deal with this high-dimensional (p>n) problem by forcing certain regression coefficients to be zero. Although the application of LASSO in genetic studies under random sampling has been widely studied, its statistical inference and testing under EPS remain unknown. We propose a novel sparse model (EPS-LASSO) with a hypothesis test for high-dimensional regression under EPS, based on a decorrelated score function, to investigate genetic associations, including gene expression and rare variant analyses. Comprehensive simulation shows that EPS-LASSO outperforms existing methods, with superior power when effects are large and with stable type I error and FDR control. Together with a real data analysis of a genetic study of obesity, our results indicate that EPS-LASSO is an effective method for EPS data analysis that can account for correlated predictors.
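The p > n setting under EPS can be illustrated with a minimal sketch: simulate a trait, keep only the top and bottom phenotype percentiles, and fit a LASSO that zeroes out most coefficients. This uses plain LASSO, not the EPS-LASSO inference procedure proposed in the thesis; all data are simulated and `alpha` is an arbitrary choice:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Simulated genotype-like predictors with a handful of true effects
rng = np.random.default_rng(1)
n_full, p = 5000, 2000
X = rng.standard_normal((n_full, p))
beta = np.zeros(p)
beta[:5] = 1.0                                  # 5 true causal factors
y = X @ beta + rng.standard_normal(n_full)

# EPS: retain only samples in the top and bottom 10% of the trait
lo, hi = np.quantile(y, [0.10, 0.90])
keep = (y <= lo) | (y >= hi)
X_eps, y_eps = X[keep], y[keep]                 # n ~ 1000 < p = 2000

# LASSO handles p > n by shrinking most coefficients exactly to zero
lasso = Lasso(alpha=0.2).fit(X_eps, y_eps)
selected = np.flatnonzero(lasso.coef_)
print(len(selected))  # a sparse support despite p > n
```

The thesis's contribution is the valid hypothesis test for such coefficients under the biased EPS design, which plain LASSO does not provide.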
|
114 |
Ray stretching statistics, hot spot formation, and universalities in weak random disorderJanuary 2018 (has links)
I review my three papers on ray stretching statistics, hot spot formation, and universality in motion through weak random media.
In the first paper, we study the connection between stretching exponents and ray densities in weak ray scattering through a random medium. The stretching exponent is a quantitative measure that describes the degree of exponential convergence or divergence among nearby ray trajectories. In the context of non-relativistic particle motion through a correlated random potential, we show how particle densities are strongly related to the stretching exponents, where the `hot spots' in the intensity profile correspond to minima in the stretching exponents. This strong connection is expected to be valid for different random potential distributions, and is also expected to apply to other physical contexts, such as deep ocean waves. The surprising minimum in the average stretching exponent is of great interest due to the associated appearance of the first generation of hot spots, and a detailed discussion will be found in the third paper.
In the second paper, we study the stretching statistics of weak ray scattering in various physical contexts and for different types of correlated disorder. The stretching exponent is mathematically linked to the monodromy matrix that evolves the phase space vector over time. From this point of view, we demonstrate analytically and numerically that the stretching statistics along the forward direction follow universal scaling relationships for different dispersion relations and in disorders of differing correlation structures. Predictions about the location of first caustics can be made using the universal evolution pattern of stretching exponents. Furthermore, we observe that the distribution of stretching exponents in 2D ray dynamics with small angular spread is equivalent to the same distribution in a simple 1D kicked model, which allows us to further explore the relation between stretching statistics and the form of the disorder.
Finally, the third paper focuses on the 1D kicked model with stretching statistics that resemble 2D small-angle ray scattering. While the long time behavior of the stretching exponent displays a simple linear growth, the behavior on the scale of the Lyapunov time is mathematically nontrivial. From an analysis of the evolving monodromy matrices, we demonstrate how the stretching exponent depends on the statistics of the second derivative of the random disorder, especially the mean and standard deviation. Furthermore, the maximal Lyapunov exponent or the Lyapunov length can be expressed as nontrivial functions of the mean and standard deviation of the kicks. Lastly, we show that the higher moments of the second derivative of the disorder have small or negligible effect on the evolution of the stretching exponents or the maximal Lyapunov exponents.
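A 1D kicked model of this type can be sketched in a minimal, assumed form: the monodromy matrix is a product of free-drift and random-kick shear matrices, and the stretching exponent is the log of its norm. The map form and parameters below are illustrative assumptions, not the paper's exact model:

```python
import numpy as np

# Assumed kicked-map sketch: between kicks the monodromy matrix
# evolves ballistically; each kick applies a shear whose strength is
# the random second derivative of the disorder. The stretching
# exponent is log ||M|| after t steps.
rng = np.random.default_rng(2)
dt, t_steps = 1.0, 200
mean_k, std_k = 0.0, 0.05          # kick statistics (mean, std dev)

M = np.eye(2)
drift = np.array([[1.0, dt], [0.0, 1.0]])
for _ in range(t_steps):
    k = rng.normal(mean_k, std_k)              # random kick strength
    kick = np.array([[1.0, 0.0], [-k, 1.0]])
    M = kick @ drift @ M

stretching_exponent = np.log(np.linalg.norm(M, 2))
lyapunov_estimate = stretching_exponent / t_steps
print(round(lyapunov_estimate, 4))
```

Averaging `stretching_exponent` over many disorder realizations, as a function of `mean_k` and `std_k`, is the kind of dependence the paper analyzes.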
|
115 |
Extreme-day return as a measure of stock market volatility : comparative study developed vs. emerging capital markets of the worldKabir, Muashab, Ahmed, Naeem January 2010 (has links)
This paper uses a new measure of volatility based on the occurrence of extreme-day returns and examines relative volatility among worldwide stock markets during 1997-2009. Several global stock market indexes of countries categorized as emerging and developed capital markets are utilized. Additionally, this study investigates well-known anomalies, namely the Monday effect and the January effect. Correlation analysis of co-movement and the extent of integration further highlights the opportunities for international diversification among those markets. Evidence during this time period suggests volatility is not a phenomenon of emerging capital markets alone. Emerging markets offer opportunities for higher returns during volatility. Cross-correlation analysis shows markets have become more integrated during this time frame; still, opportunities for higher returns prevail through global portfolio diversification.
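An extreme-day occurrence measure of the kind described can be sketched as the frequency of daily returns beyond some multiple of the standard deviation. The exact threshold definition used in the paper is an assumption here, and the returns are synthetic:

```python
import numpy as np

# Hedged sketch: flag daily returns beyond k standard deviations
# and report their frequency as a volatility measure.
def extreme_day_frequency(returns, k=2.0):
    returns = np.asarray(returns, dtype=float)
    mu, sigma = returns.mean(), returns.std(ddof=1)
    extreme = np.abs(returns - mu) > k * sigma
    return extreme.sum() / returns.size

rng = np.random.default_rng(3)
daily_returns = rng.normal(0.0005, 0.012, size=3000)  # synthetic index
freq = extreme_day_frequency(daily_returns, k=2.0)
print(round(freq, 3))  # close to the theoretical ~4.6% for normal data
```

Comparing this frequency across emerging and developed market indexes over the same window is the paper's core comparison.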
|
116 |
L'imaginaire du complot. Discours d'extrême droite en France et aux Etats-UnisJamin, Jérôme 04 July 2007 (has links)
Nationalism, xenophobia, racism and antisemitism, opposition to elites, the stigmatization of foreigners, anti-immigrant discourse, but also authoritarianism, "law and order" ideology, antiparliamentarism and anticommunism, among other characteristic traits, are some of the descriptors most often cited in the literature on populism and the far right. Depending on the political parties concerned, the institutional contexts, and national and geographic particularities, these descriptors take on a central or a secondary dimension according to whether a populist current or a far-right party is being characterized.
Drawing on a comparison between France and the United States, the work aims to demonstrate that all of these descriptors maintain, to varying degrees, a fundamental relationship with a conspiracy imaginary, that is, with a structured and coherent world of meanings (norms, significations, images, symbols, values and beliefs) that privileges conspiracy theory as an explanation of politics and history.
|
117 |
Diving In Extreme Environments: The Scientific Diving ExperienceLang, Michael A. January 2012 (has links)
The scope of extreme-environment diving defined within this work encompasses diving modes outside of the generally accepted no-decompression, open-circuit, compressed-air diving limits on self-contained underwater breathing apparatus (scuba) in temperate or warmer waters. Extreme-environment diving is scientifically and politically interesting. The scientific diving operational safety and medical framework is the cornerstone from which diving takes place in the scientific community. From this effective baseline, as evidenced by decades of very low DCS incidence rates, the question of whether compressed air is the best breathing medium under pressure was addressed, with findings indicating that in certain depth ranges a higher fraction of oxygen (while not exceeding a PO2 of 1.6 ATA) and a lower fraction of nitrogen result in extended bottom times and more efficient decompression. Extreme-environment diving under ice presents a set of physiological, equipment, training and operational challenges beyond regular diving that have also been met through almost 50 years of experience with diving as an underwater research tool. Diving modes such as mixed-gas, surface-supplied diving with helmets may mitigate risk factors that the diver incurs as a result of depth, inert gas narcosis or gas consumption. A close approximation of inert gas loading and decompression status monitoring is a function met by dive computers, a necessity in particular when the diver ventures outside of the single-dive profile into the realm of multi-level, multi-day repetitive diving or decompression diving. The monitoring of decompression status in extreme environments is now done exclusively through the use of dive computers, and evaluations of the performance of regulators under ice have determined the characteristics of the next generation of life-support equipment for extreme-environment diving for science.
These polar, deep and contaminated water environments require risk assessment that analyzes hazards such as cold stress, hydration, overheating, narcosis, equipment performance and decompression sickness. Scientific diving is a valuable research tool that has become an integral methodology in the pursuit of scientific questions in extreme environments of polar regions, in contaminated waters, and at depth.
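The 1.6 ATA oxygen partial-pressure limit mentioned above implies the standard maximum operating depth (MOD) formula for oxygen-enriched mixes: ambient pressure rises roughly 1 ATA per 10 m of seawater, so MOD = (PO2 limit / O2 fraction - 1) x 10 metres. A minimal sketch:

```python
# Standard maximum operating depth (MOD) for a given oxygen fraction,
# assuming ~1 ATA of added ambient pressure per 10 m of seawater.
def mod_metres(fo2, po2_limit=1.6):
    return (po2_limit / fo2 - 1.0) * 10.0

print(mod_metres(0.21))  # air (21% O2): roughly 66 m
print(mod_metres(0.32))  # EAN32 (32% O2): 40 m
```

This is why the abstract notes that a higher oxygen fraction extends bottom time only within certain depth ranges: richer mixes hit the 1.6 ATA ceiling at shallower depths.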
|
118 |
Extreme Value Theory with an Application to Bank Failures through ContagionNikzad, Rashid 03 October 2011 (has links)
This study attempts to quantify the shocks to a banking network and analyze the transfer of shocks through the network. We consider two sources of shocks: external shocks due to market and macroeconomic factors which impact the entire banking system, and idiosyncratic shocks due to the failure of a single bank. The external shocks are estimated using two methods: (i) non-parametric simulation of the time series of shocks that occurred to the banking system in the past, and (ii) using extreme value theory (EVT) to model the tail part of the shocks. The external shocks we consider in this study are due to exchange rate and treasury bill rate volatility. An ARMA/GARCH model is used to extract iid residuals for this purpose. In the next step, the probability of bank failures in the system is studied using Monte Carlo simulation. We calibrate the model such that the network resembles the Canadian banking system.
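The EVT tail-modelling step can be sketched as a peaks-over-threshold fit of a generalized Pareto distribution (GPD) to residual exceedances above a high quantile. The residuals here are synthetic heavy-tailed stand-ins for ARMA/GARCH output, and the 95% threshold is an arbitrary illustrative choice:

```python
import numpy as np
from scipy.stats import genpareto

# Synthetic heavy-tailed "residuals" standing in for filtered shocks
rng = np.random.default_rng(4)
residuals = rng.standard_t(df=4, size=5000)

# Peaks over threshold: exceedances above the 95th percentile
threshold = np.quantile(residuals, 0.95)
exceedances = residuals[residuals > threshold] - threshold

# Fit a GPD with location fixed at 0 (shape xi and scale estimated)
xi, _, scale = genpareto.fit(exceedances, floc=0)

# Tail probability of a shock beyond a level deep in the tail:
# P(exceed threshold) * P(GPD excess beyond level - threshold)
level = threshold + 2.0
p_tail = 0.05 * genpareto.sf(level - threshold, xi, loc=0, scale=scale)
print(round(xi, 2), round(p_tail, 4))
```

Shocks drawn from this fitted tail, fed into a network of interbank exposures, would drive the Monte Carlo contagion step the abstract describes.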
|
120 |
Användarcentrerad design och agila metoder : Integrering av prototyping och Extreme ProgrammingLundgren, Jens January 2008 (has links)
Agile methods are a relatively new approach in the software development field, seen as a reaction against plan-driven methods, which struggle to handle unpredictable and shifting requirements. Agile methods advocate close and frequent customer communication and iterative work in order to continuously create, prioritize and verify requirements. However, agile methods pay little attention to aspects concerning the usability of the software. The purpose of this report is therefore to integrate user-centered design in order to increase the focus on software usability within agile methods. User-centered design is a process that focuses on usability by making users a central aspect of the development process. Through a literature study, prerequisites as well as obstacles and difficulties for an integration of agile methods and user-centered design were identified. On the basis of the identified prerequisites, the user-centered design process of prototyping was then integrated with the agile method Extreme Programming. The result is an artifact that integrates Extreme Programming and prototyping and improves the possibility of developing usable software.
|