1

Optimization Models and Algorithms for Vulnerability Analysis and Mitigation Planning of Pyro-Terrorism

Rashidi, Eghbal 12 August 2016 (has links)
In this dissertation, an important homeland security problem is studied, with a focus on wildfire and pyro-terrorism management. We begin by studying the vulnerability of landscapes to pyro-terrorism. We develop a maximal-covering-based optimization model to investigate the impact of a pyro-terror attack on landscapes as a function of the fires' ignition locations. We use three test case landscapes for experimentation and compare the impact of a pyro-terror wildfire with that of naturally caused wildfires with randomly located ignition points. Our results indicate that a pyro-terror attack, on average, has more than twice the impact on landscapes of wildfires with randomly located ignition points. In the next chapter, we develop a Stackelberg game model, a min-max network interdiction framework that identifies a fuel management schedule that, with a limited budget, maximally mitigates the impact of a pyro-terror attack. We develop a decomposition algorithm called MinMaxDA to solve the model for three test case landscapes located in the western U.S. Our results indicate that fuel management, even when conducted on a small scale (when 2% of a landscape is treated), mitigates a pyro-terror attack by 14% on average, compared to doing nothing; with 5% and 10% budgets, it reduces the damage by 27% and 43% on average, respectively. Finally, we extend our study to the problem of suppression response after a pyro-terror attack. We develop a max-min model to identify the vulnerability of initial attack resources when used to fight a pyro-terror attack. We use a test case landscape for experimentation and develop a decomposition algorithm called the Bounded Decomposition Algorithm (BDA), since the model has a bilevel max-min structure with binary variables in the lower level and is therefore not solvable by conventional methods.
Our results indicate that although pyro-terror attacks with one ignition point can be controlled by an initial attack, attacks with two or more ignition points may not be. A faster response is also more effective in controlling pyro-terror fires.
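The maximal covering idea behind the vulnerability model can be sketched with a greedy heuristic: pick the ignition points whose fire spread covers the most landscape. The candidate points, cells, and coverage sets below are hypothetical toy data, not the dissertation's landscapes or its exact formulation, which is an exact optimization model.

```python
# Greedy sketch of a maximal-covering vulnerability analysis: an attacker
# choosing k ignition points to burn the largest area. Toy data only.
def greedy_max_cover(coverage, k):
    """Pick k ignition points maximizing total cells burned.

    coverage: dict mapping ignition point -> set of landscape cells it burns.
    Greedy selection gives a (1 - 1/e) approximation for maximal covering.
    """
    chosen, burned = [], set()
    for _ in range(k):
        best = max(coverage, key=lambda p: len(coverage[p] - burned), default=None)
        if best is None or not (coverage[best] - burned):
            break
        chosen.append(best)
        burned |= coverage.pop(best)
    return chosen, burned

# Hypothetical landscape: three candidate ignition points and the cells
# each fire would reach before suppression.
coverage = {
    "A": {1, 2, 3, 4},
    "B": {3, 4, 5},
    "C": {5, 6},
}
points, cells = greedy_max_cover(coverage, k=2)
```

Contrasting the objective value of such a worst-case choice with that of randomly drawn ignition points is the kind of comparison the abstract reports (a deliberate attack doing roughly twice the damage of random ignitions).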
2

A sampling-based decomposition algorithm with application to hydrothermal scheduling : cut formation and solution quality

Queiroz, Anderson Rodrigo de 06 February 2012 (has links)
We consider a hydrothermal scheduling problem with a mid-term planning horizon (HTSPM), modeled as a large-scale multistage stochastic program with stochastic monthly water inflows to each hydro generator. In the HTSPM we seek an operating policy that minimizes the sum of present and expected future costs, which include thermal generation costs and load curtailment costs. In addition to various simple bounds, problem constraints involve water balance, demand satisfaction, and power interchanges. Sampling-based decomposition algorithms (SBDAs) have been used in the literature to solve the HTSPM. SBDAs can approximately solve problem instances with many time stages and with inflows that exhibit interstage dependence; such dependence requires care in computing valid cuts for the decomposition algorithm. To help maintain tractability, we employ an aggregate reservoir representation (ARR), in which all the hydro generators inside a specific region are grouped to effectively form one hydro plant with reservoir storage and generation capacity proportional to the parameters of the plants forming that aggregate reservoir. The ARR has been used in the literature with energy balance constraints, rather than water balance constraints, coupled with time series forecasts of energy inflows. Instead, we prefer as a model primitive to have the time series model forecast water inflows. This, in turn, requires that we extend existing methods to compute valid cuts for the decomposition method under the resulting form of interstage dependence. We form a sample average approximation of the original problem, solve it with these special-purpose algorithms, and assess the quality of the resulting policy for operating the system. In our analysis, we compute a confidence interval on the optimality gap of a policy generated by solving an approximation on a sampled scenario tree.
We present computational results on test problems with 24 monthly stages in which the interstage dependence of hydro inflows is modeled using a dynamic linear model. We further develop a parallel implementation of an SBDA and apply it to solve the HTSPM for the Brazilian power system, which has 150 hydro generators, 151 thermal generators, and 4 regions that each characterize an aggregate reservoir. We create and solve four HTSPM instances in which we vary the input parameters for generation capacity, transmission capacity, and load in order to analyze the difference in total expected cost.
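The sampling and policy-assessment step of the abstract can be illustrated in miniature: evaluate a fixed operating policy on sampled inflow scenarios and form a normal-approximation confidence interval on its expected cost. The inflow distribution, demand, and thermal cost below are hypothetical single-stage stand-ins, not the multistage HTSPM or its cut-generation machinery.

```python
# Minimal sketch of the sample-average idea behind SBDA policy evaluation:
# Monte Carlo cost of a fixed policy plus a ~95% confidence half-width.
import math
import random

def evaluate_policy(policy, scenarios):
    """Return the mean scenario cost and a 95% normal-approximation CI half-width."""
    costs = [policy(s) for s in scenarios]
    n = len(costs)
    mean = sum(costs) / n
    var = sum((c - mean) ** 2 for c in costs) / (n - 1)
    return mean, 1.96 * math.sqrt(var / n)

# Toy policy: thermal generation covers whatever demand the water inflow cannot.
demand, thermal_cost = 100.0, 2.0
policy = lambda inflow: thermal_cost * max(0.0, demand - inflow)

random.seed(0)
scenarios = [random.gauss(80.0, 10.0) for _ in range(1000)]
mean_cost, ci_half = evaluate_policy(policy, scenarios)
```

In the actual SBDA setting, an analogous sampled estimate of the policy's cost is compared against a lower bound from the decomposition to bound the optimality gap.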
3

INVESTIGATION OF FACTORS AFFECTING COLLISION CVD ESTIMATION AND THE IMPACT OF DECOMPOSITION ERRORS ON THE EMG SIGNAL COHERENCE

Majeti, Srivatsa Subba Rao 20 July 2010 (has links)
Experimental measurements are never perfect, even with sophisticated modern instruments. One of the fundamental problems in signal measurement is distinguishing the noise from the signal. Sometimes the two can be partly distinguished on the basis of frequency components: for example, the signal may contain mostly low-frequency components while the noise is located at higher frequencies. This is the basis of filtering. This thesis discusses changes in the experimental protocol, such as determining a suitable stimulation site to elicit full compound nerve action potentials (CNAPs). The effects of sampling frequency and of smoothing techniques on the resolution of conduction velocity distribution (CVD) estimates are also discussed. A change in stimulation site is proposed after realizing that it is relatively difficult to repeatedly stimulate the same location at the elbow to recruit the nerve fibers; thus, the stimulation site was moved from the elbow to the wrist. The simulations made it evident that there was signal information beyond 2.5 kHz, motivating an increase in the sampling rate from 5 kHz to 10 kHz. Employing smoothing techniques improved the CVD resolution, and the simulation results were corroborated by the experimental results. Another aspect of this thesis is checking the error tolerance of the EMG decomposition algorithm. Once the muscle electrical activity is recorded, motor unit (MU) trains undergo an automatic decomposition process. Decomposition errors are present in most contractions, so a human operator has to correct the motor unit firing times. Into the acquired data, false-negative, false-positive, and combined false negative-positive errors were introduced at different levels to measure the coherence between two motor-unit firing trains from a muscle contraction.
Firing rate curves are computed for each MU to analyze the interactions between two motor units. False-negative errors were found to be the least detrimental, whereas false-positive and combined false negative-positive errors affected coherence the most; their error tolerance was only a single error per 5 seconds.
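The abstract's premise, that signal and noise can be partly separated by frequency, can be sketched with the simplest low-pass filter: a moving average. The 50 Hz toy sine, noise level, and 11-sample window below are illustrative choices, not the thesis's actual smoothing technique or sampling protocol.

```python
# Sketch of frequency-based noise separation: a moving-average low-pass
# filter applied to a low-frequency signal buried in wideband noise.
import math
import random

def moving_average(x, window):
    """Low-pass filter: each output sample averages `window` consecutive inputs."""
    return [sum(x[i:i + window]) / window for i in range(len(x) - window + 1)]

random.seed(1)
n, fs = 500, 5000.0                  # number of samples, sampling rate (Hz)
t = [i / fs for i in range(n)]
clean = [math.sin(2 * math.pi * 50.0 * ti) for ti in t]   # 50 Hz signal
noisy = [c + random.gauss(0.0, 0.5) for c in clean]       # add wideband noise
smooth = moving_average(noisy, window=11)

# The filtered trace should sit closer to the clean signal than the raw one
# (smooth[i] is centered on input sample i + 5, hence the clean[5:] offset).
err_noisy = sum((a - b) ** 2 for a, b in zip(noisy, clean)) / n
err_smooth = sum((a - b) ** 2 for a, b in zip(smooth, clean[5:])) / len(smooth)
```

Because averaging 11 samples attenuates broadband noise power roughly elevenfold while barely distorting a 50 Hz component at a 5 kHz rate, the filtered error is much smaller: this is the trade-off any smoothing choice for CVD estimation has to balance.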
4

Predictability of Nonstationary Time Series using Wavelet and Empirical Mode Decomposition Based ARMA Models

Lanka, Karthikeyan January 2013 (has links) (PDF)
The premise of time series forecasting is that the past contains information about the future; how that information is encoded in the past, and how it can be interpreted and used to extrapolate future events, constitutes the crux of time series analysis and forecasting. Several families of methods have been developed, including qualitative techniques (e.g., the Delphi method), causal techniques (e.g., least squares regression), and quantitative techniques (e.g., smoothing methods and time series models); each establishes a model, theoretically or mathematically, from past observations and estimates the future from it. Of these, time series methods such as the autoregressive moving average (ARMA) process have gained popularity because of their simplicity of implementation and accuracy of forecasts. These models, however, are formulated on the basis of properties the time series is assumed to possess. Classical decomposition techniques were developed to supplement the requirements of time series models: they define a time series in terms of simple patterns, namely trend, cyclical, and seasonal patterns, along with noise. Decomposing a time series into component patterns, modeling each component with a forecasting process, and combining the component forecasts to obtain predictions of the actual series has yielded superior performance over standard forecasting techniques. All these methods rest on the basic principle of moving average computation. Classical decomposition methods are disadvantageous, however, in that they produce a fixed number of components for any time series and the decomposition is independent of the data. Moreover, during the moving average computation the edges of the time series may not be modeled properly, which affects long-range forecasting. These issues call for more efficient and advanced decomposition techniques such as wavelets and Empirical Mode Decomposition (EMD).
Wavelets and EMD are among the most innovative concepts in time series analysis and are aimed at processing nonlinear and nonstationary time series. Hence, this research ascertains the predictability of nonstationary time series using wavelet- and EMD-based ARMA models. The development of wavelets builds on the concepts of Fourier analysis and the windowed Fourier transform. Accordingly, the thesis first presents the need that motivated the advent of wavelets, followed by a discussion of the advantages they provide. Wavelets were originally defined for continuous time series; to match real-world requirements, wavelet analysis was later defined in the discrete setting as the Discrete Wavelet Transform (DWT). The current thesis uses the DWT for time series decomposition. A detailed discussion of the theory behind time series decomposition is presented, followed by a description of the mathematical viewpoint of decomposition using the DWT, including the decomposition algorithm. EMD belongs to the same class of decomposition methods as wavelets. It grew out of the observation that most time series in nature contain multiple frequencies, so that different scales exist simultaneously. Compared with standard Fourier analysis and wavelet algorithms, EMD adapts more readily to a wide variety of nonstationary time series. The method decomposes any complicated time series into a small, finite number of empirical modes, called Intrinsic Mode Functions (IMFs), each of which carries information about the original series. The EMD decomposition algorithm is presented in the thesis after this conceptual elucidation.
Later, the proposed forecasting algorithm coupling EMD with an ARMA model is presented; it also accounts for the number of time steps ahead for which forecasting is to be performed. To test the wavelet- and EMD-based methodologies on nonstationary series, streamflow data from the USA and rainfall data from India are used: four nonstationary streamflow sites (USGS data resources) with monthly total volumes and two nonstationary gridded rainfall sites (IMD) with monthly total rainfall. Predictability is checked in two scenarios, six-month-ahead and twelve-month-ahead forecasts. The Normalized Root Mean Square Error (NRMSE) and the Nash-Sutcliffe Efficiency Index (Ef) are used to evaluate the performance of the proposed techniques. Based on these measures, wavelet-based analyses generate good variations for six-month-ahead forecasts, maintaining harmony with the observed values at most sites. Although both methods capture the minima of the time series effectively for both six- and twelve-month-ahead predictions, the wavelet-based method yields better twelve-month-ahead forecasts than the EMD-based method. It is therefore inferred that the wavelet-based method has better prediction capability, despite some limitations of time series methods and of the manner in which the decomposition takes place. Finally, the study concludes that the wavelet-based time series algorithm could model events such as droughts with reasonable accuracy, and suggests modifications that could extend its applicability to other areas of hydrology.
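The decompose-then-model idea can be illustrated with one level of the Haar wavelet transform, the simplest DWT: the series splits into a smooth approximation (trend-like) and a detail (fluctuation-like) component, and the inverse transform reconstructs the original exactly. This is a minimal sketch, not the thesis's multilevel DWT or its ARMA coupling.

```python
# One level of the Haar DWT and its exact inverse, on a toy series of even
# length. Each component could then be forecast separately (e.g., by ARMA).
def haar_dwt(x):
    """One-level Haar transform: approximation (trend) and detail coefficients."""
    approx = [(x[i] + x[i + 1]) / 2 ** 0.5 for i in range(0, len(x) - 1, 2)]
    detail = [(x[i] - x[i + 1]) / 2 ** 0.5 for i in range(0, len(x) - 1, 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse Haar transform: reconstructs the original series exactly."""
    x = []
    for a, d in zip(approx, detail):
        x.append((a + d) / 2 ** 0.5)
        x.append((a - d) / 2 ** 0.5)
    return x

series = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0]
approx, detail = haar_dwt(series)
recon = haar_idwt(approx, detail)
```

Because the reconstruction is exact, combining per-component forecasts is a coherent way to forecast the original series, which is the property the wavelet-ARMA and EMD-ARMA schemes exploit.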
5

Beyond hairballs: depicting complexity of a kinase-phosphatase network in the budding yeast

Abd-Rabbo, Diala 01 1900 (has links)
No description available.
