About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
361

Cardinality Constrained Robust Optimization Applied to a Class of Interval Observers

McCarthy, Philip James January 2013 (has links)
Observers are used in the monitoring and control of dynamical systems to deduce the values of unmeasured states. Designing an observer requires an accurate model of the plant; if the model parameters are characterized imprecisely, the observer may not provide reliable estimates. An interval observer, which comprises an upper and lower observer, bounds the plant's states from above and below, given the range of values of the imprecisely characterized parameters, i.e., it defines an interval in which the plant's states must lie at any given instant. We propose a linear programming-based method of interval observer design for two cases: 1) only the initial conditions of the plant are uncertain; 2) the dynamical parameters are also uncertain. In the former, we optimize the transient performance of the interval observers, in the sense that the volume enclosed by the interval is minimized. In the latter, we optimize the steady-state performance of the interval observers, in the sense that the norm of the width of the interval is minimized at steady state. Interval observers are typically designed to characterize the widest interval that bounds the states. This thesis proposes an interval observer design method that utilizes additional, but still incomplete, information that enables the designer to identify tighter bounds on the uncertain parameters under certain operating conditions. The number of bounds that can be refined defines a class of systems. The definition of this class is independent of the specific parameters whose bounds are refined. Applying robust optimization techniques under a cardinality constrained model of uncertainty, we design a single observer for an entire class of systems. These observers guarantee a minimum level of performance with respect to the aforementioned metrics, as we optimize the worst-case performance over a given class of systems. The robust formulation allows the designer to tune the level of uncertainty in the model.
If many of the uncertain parameter bounds can be refined, the nominal performance of the observer can be improved; however, if few or none of the parameter bounds can be refined, the nominal performance of the observer can be designed to be more conservative.
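The cardinality constrained model of uncertainty mentioned above is usually formalized as a budget of uncertainty: at most Γ of the uncertain parameters deviate from their nominal values at once. Below is a minimal stdlib-only sketch (hypothetical function names and data, not the thesis's observer design) of checking robust feasibility of a single linear constraint under such a budget:

```python
def protection(x, d, gamma):
    """Worst-case extra constraint load when at most `gamma` of the
    uncertain coefficients deviate by up to d_j (cardinality budget).
    For a fixed x, this is the sum of the gamma largest d_j * |x_j|."""
    devs = sorted((dj * abs(xj) for dj, xj in zip(d, x)), reverse=True)
    return sum(devs[:gamma])

def robust_feasible(a, d, b, x, gamma):
    """Check that a'x plus the worst-case deviation stays within b."""
    nominal = sum(aj * xj for aj, xj in zip(a, x))
    return nominal + protection(x, d, gamma) <= b

# Toy numbers: tightening the budget gamma trades conservatism for risk.
print(robust_feasible([1, 2], [0.5, 0.5], 3.6, [1, 1], 1))  # True
print(robust_feasible([1, 2], [0.5, 0.5], 3.6, [1, 1], 2))  # False
```

Tuning `gamma` is exactly the knob the abstract describes: a larger budget guards against more simultaneous deviations at the cost of a more conservative design.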
362

Approximation algorithms for minimum knapsack problem

Islam, Mohammad Tauhidul, University of Lethbridge. Faculty of Arts and Science January 2009 (has links)
The knapsack problem has been widely studied in computer science for years. There exist several variants of the problem, with the zero-one maximum knapsack problem in one dimension being the simplest. In this thesis we study several existing approximation algorithms for the minimization version of the problem and propose a scaling-based fully polynomial time approximation scheme for the minimum knapsack problem. We compare the performance of this algorithm with existing algorithms. Our experiments show that the proposed algorithm runs fast and has a good performance ratio in practice. We also conduct extensive experiments on the data provided by Canadian Pacific Logistics Solutions during the MITACS internship program. We propose a scaling-based ε-approximation scheme for the multidimensional (d-dimensional) minimum knapsack problem and compare its performance with a generalization of a greedy algorithm for minimum knapsack in d dimensions. Our experiments show that the ε-approximation scheme exhibits a good performance ratio in practice. / x, 85 leaves ; 29 cm
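The minimization version of the knapsack problem asks for a minimum-cost subset of items whose total weight meets a demand. The sketch below is the exact pseudo-polynomial dynamic program over costs; rounding the costs down by a factor K = ε·c_max/n before running it is the usual route to a scaling-based FPTAS of the kind the abstract describes (an illustration, not the thesis's implementation):

```python
def min_knapsack(costs, weights, demand):
    """Exact DP for minimum knapsack: choose items of minimum total
    cost whose weights sum to at least `demand`. Runs in O(n * sum(costs)),
    which scaling the costs turns into a polynomial-time approximation."""
    total = sum(costs)
    # best[c] = maximum weight coverable at total cost exactly c (-1 = unreachable)
    best = [-1] * (total + 1)
    best[0] = 0
    for cost, w in zip(costs, weights):
        for c in range(total, cost - 1, -1):  # backwards: each item used once
            if best[c - cost] >= 0:
                best[c] = max(best[c], best[c - cost] + w)
    feasible = [c for c in range(total + 1) if best[c] >= demand]
    return min(feasible) if feasible else None

print(min_knapsack([3, 2, 4], [4, 3, 5], 7))  # -> 5 (items with costs 3 and 2)
```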
363

Economic Dispatch using Advanced Dynamic Thermal Rating

Milad, Khaki Unknown Date
No description available.
364

Small-Scale Biogas Upgrading with Membranes: A Farm Based Techno-Economic and Social Assessment for Sustainable Development

Mamone, Richard Michael January 2014 (has links)
Membrane technology can help alleviate the problems of matching supply and demand associated with small-scale upgrading through its flexibility in operation. This paper provides a techno-economic assessment of the use of membrane technology at the farm level via a quantitative and partly qualitative analysis. The purpose of the analysis is to investigate how the economic and environmental utility of the membranes can be maximised, and to outline possible reasons for their lack of diffusion. It combines an applied systems research method, by way of linear programming, with interviews and the innovation-decision process theory. A framework was set out to deliver hard and soft data that could also support contextual in-depth analysis and discussion. It was found that membranes are well suited to farm-based upgrading systems, with desirable outcomes from both economic and environmental viewpoints. More specifically, upgrading to 80 percent (below the natural gas standard of 96 percent) was found to be more favourable than upgrading to 96 percent. However, much further research and deliberation are needed before 80 percent biogas can be used commercially in tractors. The study also highlights the priority that must be given to local market demand, and the need for closer, more personal engagement with farmers and for better-facilitated and better-funded trials of membrane technology, so as to increase its adoption.
365

Robust techniques for regression models with minimal assumptions / M.M. van der Westhuizen

Van der Westhuizen, Magdelena Marianna January 2011 (has links)
Good quality management decisions often rely on the evaluation and interpretation of data. One of the most popular ways to investigate possible relationships in a given data set is to follow a process of fitting models to the data. Regression models are often employed to assist with decision making. In addition to decision making, regression models can also be used for the optimization and prediction of data. The success of a regression model, however, relies heavily on assumptions made by the model builder. In addition, the model may also be influenced by the presence of outliers; a more robust model, which is not as easily affected by outliers, is necessary in making more accurate interpretations about the data. In this research study robust techniques for regression models with minimal assumptions are explored. Mathematical programming techniques such as linear programming, mixed integer linear programming, and piecewise linear regression are used to formulate a nonlinear regression model. Outlier detection and smoothing techniques are included to address the robustness of the model and to improve predictive accuracy. The performance of the model is tested by applying it to a variety of data sets and comparing the results to those of other models. The results of the empirical experiments are also presented in this study. / Thesis (M.Sc. (Computer Science))--North-West University, Potchefstroom Campus, 2011.
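One standard way to obtain an outlier-robust regression model via linear programming, as in the techniques this abstract surveys, is to minimize the sum of absolute residuals (LAD) rather than squared residuals. The sketch below exploits the fact that, for a simple line fit, an optimal LAD solution sits at a vertex of the underlying LP and therefore passes through at least two data points, so small instances can be solved by enumeration without an LP solver (a hypothetical illustration, not the thesis's MILP model):

```python
from itertools import combinations

def lad_line(xs, ys):
    """Least-absolute-deviations fit of y = a + b*x by enumerating all
    candidate lines through pairs of points; the one with the smallest
    total absolute residual is an optimal LAD fit for small data sets."""
    best = None
    for (x1, y1), (x2, y2) in combinations(zip(xs, ys), 2):
        if x1 == x2:
            continue  # vertical candidate, skip
        b = (y2 - y1) / (x2 - x1)
        a = y1 - b * x1
        err = sum(abs(y - (a + b * x)) for x, y in zip(xs, ys))
        if best is None or err < best[0]:
            best = (err, a, b)
    return best[1], best[2]

# Four collinear points plus one gross outlier: LAD ignores the outlier,
# whereas least squares would be dragged far off the true line y = 1 + 2x.
print(lad_line([0, 1, 2, 3, 4], [1, 3, 5, 7, 100]))  # -> (1.0, 2.0)
```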
366

P-Cycle-based Protection in Network Virtualization

Song, Yihong 25 February 2013 (has links)
As the "network of networks", the Internet has played a central and crucial role in modern society, culture, knowledge, and business for over two decades by supporting a wide variety of network technologies and applications. However, due to its popularity and multi-provider nature, the future development of the Internet is limited to simple incremental updates. To address this challenge, network virtualization has been proposed as a potential candidate to provide the essential basis for the future Internet architecture. Network virtualization is capable of providing an open and flexible networking environment in which service providers are allowed to dynamically compose multiple coexisting heterogeneous virtual networks on a shared substrate network. Such a flexible environment will foster the deployment of diversified services and applications. A major challenge in network virtualization is Virtual Network Embedding (VNE), which aims to statically or dynamically allocate virtual nodes and virtual links on substrate resources: physical nodes and paths. Making effective use of substrate resources requires highly efficient and survivable VNE techniques. The main contributions of this thesis are two high-performance p-Cycle-based survivable virtual network embedding approaches. These approaches take advantage of p-Cycle-based protection techniques that minimize the backup resources while providing a full VN protection scheme against link and node failures.
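A p-cycle (pre-configured protection cycle) reserves spare capacity on a ring of the substrate so it can restore both on-cycle links and straddling links. The classic a priori efficiency metric counts one protection path for each on-cycle link and two for each straddler, per unit of spare capacity reserved on the cycle. A minimal sketch (hypothetical node names; unit link capacities assumed):

```python
def pcycle_efficiency(cycle, edges):
    """A priori efficiency of a candidate p-cycle: on-cycle links get one
    protection path, straddling links (both endpoints on the cycle but the
    link itself off-cycle) get two. Efficiency is protected working
    capacity per unit of spare capacity (one unit per cycle hop)."""
    n = len(cycle)
    on_cycle = {frozenset((cycle[i], cycle[(i + 1) % n])) for i in range(n)}
    nodes = set(cycle)
    protected = 0
    for e in edges:
        e = frozenset(e)
        if e in on_cycle:
            protected += 1
        elif e <= nodes:
            protected += 2  # straddler: two protection paths around the ring
    return protected / n

# A 4-node ring with one chord: the chord is a straddler worth double.
print(pcycle_efficiency([0, 1, 2, 3], [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]))  # -> 1.5
```

Cycles with many straddlers score above 1.0, which is why p-cycle designs can protect more working capacity than they reserve as spare.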
368

An investigation of computer based tools for mathematical programming modelling

Lucas, Cormac Anthony January 1986 (has links)
No description available.
369

Strategic Forest Management Planning Under Uncertainty Due to Fire

Savage, David William 23 February 2010 (has links)
Forest managers throughout Canada must contend with natural disturbance processes that vary over both time and space when developing and implementing forest management plans designed to provide a range of economic, ecological, and social values. In this thesis, I develop a stochastic simulation model with an embedded linear programming (LP) model and use it to evaluate strategies for reducing uncertainty due to forest fires. My results showed that frequent re-planning was sufficient to reduce variability in harvest volume when the burn fraction was low; however, as the burn fraction increased above 0.45%, the best strategy to reduce variability in harvest volume was to account for fire explicitly in the planning process using Model III. A risk analysis tool was also developed to demonstrate a method for managers to improve decision making under uncertainty. The impact of fire on mature and old forest areas was examined; it showed that LP forest management planning models reduce the areas of mature and old forest to the minimum required area, and fire further reduces the seral area. As the burn fraction increased, the likelihood of the mature and old forest areas satisfying the minimum area requirements decreased. However, if the seral area constraint was strengthened (i.e., the right-hand side of the constraint was increased), the likelihood improved. When the planning model was modified to maximize mature and old forest areas, the two fixed harvest volumes (i.e., 2.0 and 8.0 million m3/decade) had very different impacts on the areas of mature and old forest when the burn fraction was greater than 0.45%. Bootstrapped burn fraction confidence intervals were used to examine the impact of uncertain burn fraction estimates when using Model III to develop harvest schedules. I found that harvest volume bounds were large when the burn fraction was ≥0.45%.
I also examined how uncertainty in natural burn fraction estimates (i.e., estimates of the pre-fire-suppression average annual area burned) used for ecosystem management can affect old forest area requirements and the resulting timber supply.
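To see why the burn fraction drives harvest volume variability, consider a deliberately crude deterministic toy: a fixed fraction of the operable area burns each period before a planned cut is taken. All numbers and names here are hypothetical; the thesis embeds an LP (Model III) inside a stochastic fire simulation rather than anything this simple:

```python
def simulate_timber(area, burn_fraction, harvest, periods):
    """Toy interaction of fire and harvest with no regrowth: each period
    a fixed fraction of the operable area burns, then up to `harvest`
    units are cut. Returns the realized cut per period; shortfalls appear
    once fire has eroded the area below the planned cut."""
    realized = []
    for _ in range(periods):
        area -= area * burn_fraction   # fire removes a fraction first
        cut = min(harvest, area)       # take the planned cut if available
        area -= cut
        realized.append(cut)
    return realized

print(simulate_timber(100, 0.0, 5, 3))  # no fire: plan is met every period
print(simulate_timber(100, 0.5, 5, 6))  # heavy fire: cuts collapse to zero
```

Even this caricature shows the qualitative effect the thesis quantifies: a fixed harvest plan that ignores fire delivers its volume reliably at low burn fractions and fails abruptly at high ones.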
370

Numerically Efficient Water Quality Modeling and Security Applications

Mann, Angelica 02 October 2013 (has links)
Chemical and biological contaminants can enter a drinking water distribution system through one of the many access points to the network and can spread quickly, affecting a very large area. This is of great concern, and water utilities need effective tools and mitigation strategies to improve water network security. This work presents two components that have been integrated into EPA's Water Security Toolkit, an open-source software package that includes a set of tools to help water utilities protect the public against potential contamination events. The first component is a novel water quality modeling framework referred to as Merlion. The linear system describing contaminant spread through the network at the core of Merlion provides several advantages and potential uses that are aligned with emerging water security applications. This computational framework efficiently generates an explicit mathematical model that can easily be embedded into a larger mathematical system. Merlion can also be used to efficiently simulate a large number of scenarios, speeding up current water security tools by an order of magnitude. The second component is a pair of mixed-integer linear programming (MILP) formulations for efficient source inversion and optimal sampling. The contaminant source inversion problem involves determining the source of contamination given a small set of measurements. The source inversion formulation is able to handle discrete positive/negative measurements from manual grab samples taken at different sampling cycles. In addition, sensor/sample placement formulations are extended to determine the optimal locations for the next manual sampling cycle. This approach is enabled by a strategy that significantly reduces the size of the Merlion water quality model, giving rise to a much smaller MILP that is solvable in a real-time setting.
The approach is demonstrated on a large-scale water network model with over 12,000 nodes while considering over 100 timesteps. The results show the approach is successful in finding the source of contamination remarkably quickly, requiring a small number of sampling cycles and a small number of sampling teams. These tools are being integrated and tested with a real-time response system.
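The discrete positive/negative grab-sample inversion described above can be caricatured as pattern matching: precompute each candidate source's predicted detection signature under the linear water quality model, then pick the candidate that best explains the observations. A stdlib-only sketch with made-up node names (the actual formulation is an MILP over the Merlion linear model, not a nearest-signature search):

```python
def invert_source(signatures, observed):
    """Return the candidate injection node whose predicted
    positive/negative detection pattern has the smallest Hamming
    distance to the grab-sample observations."""
    def mismatch(node):
        return sum(p != o for p, o in zip(signatures[node], observed))
    return min(signatures, key=mismatch)

# Hypothetical 3-sample signatures for three candidate injection nodes.
signatures = {"N1": [1, 0, 1], "N2": [0, 1, 1], "N3": [1, 1, 0]}
print(invert_source(signatures, [1, 0, 1]))  # -> N1
```

Because grab samples arrive in cycles, each new round of samples extends `observed` and re-running the inversion narrows the candidate set, which mirrors the iterative sampling workflow the abstract describes.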
