31

Combining Decomposition and Hybrid Algorithms for the Satellite Range Scheduling Problems

Feng, Ti Kan 21 March 2012 (has links)
The multiple-resource satellite range scheduling problem (MuRRSP) is a complex and difficult scheduling problem in which a large number of task requests must be scheduled onto ground-station antennas in order to communicate with satellites. We first examined several exact algorithms previously applied in the machine scheduling field: column generation and logic-based Benders decomposition. A new hybrid approach that combines column generation and logic-based Benders decomposition is proposed; the hybrid performed well when the number of machines is large. Next, we presented a connection between the parallel machine scheduling problem and the MuRRSP so that the MuRRSP can be solved with exact algorithms. Furthermore, we proposed a strengthened cut for the sub-problem of the logic-based Benders decomposition. The resulting algorithm proved to be very effective: Barbulescu's benchmark problems were solved and proved optimal with an average run time of less than one hour. To the best of our knowledge, previous efforts to solve the MuRRSP were all heuristic-based, and no provably optimal schedules were previously known.
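As a rough illustration of how logic-based Benders style decomposition separates assignment from sequencing, the sketch below runs a tiny Benders loop: a brute-force master assigns tasks to antennas, per-antenna sub-problems check sequencing, and basic no-good cuts are added when a schedule is infeasible. The toy instance, the enumerative master, and the simple cut (not the strengthened cut proposed in the thesis) are all illustrative assumptions.

```python
# A minimal, self-contained sketch of a logic-based Benders loop for a toy
# antenna-assignment problem. All names and the tiny instance are
# illustrative assumptions, not the thesis's actual formulation.
from itertools import product

# Each task: (release time, deadline, processing time). Two antennas.
tasks = {"t1": (0, 4, 3), "t2": (0, 4, 3), "t3": (2, 8, 4)}
antennas = ["a1", "a2"]

def feasible_on_antenna(assigned):
    """Sub-problem: can the assigned tasks be sequenced on one antenna so
    each fits inside its [release, deadline] window? Uses a simple
    earliest-deadline-first check, which suffices for this toy instance."""
    t = 0
    for name in sorted(assigned, key=lambda n: tasks[n][1]):
        r, d, p = tasks[name]
        t = max(t, r) + p
        if t > d:
            return False
    return True

def solve():
    cuts = []  # each cut: a set of (task, antenna) pairs that must not all recur
    while True:
        # "Master": enumerate assignments (a MIP in the real algorithm),
        # maximizing the number of scheduled tasks while respecting the cuts.
        best = None
        for choice in product(antennas + [None], repeat=len(tasks)):
            assign = dict(zip(tasks, choice))
            pairs = {(t, a) for t, a in assign.items() if a is not None}
            if any(cut <= pairs for cut in cuts):
                continue  # violates a Benders cut
            if best is None or len(pairs) > len(best[1]):
                best = (assign, pairs)
        assign, pairs = best
        # Sub-problems: check each antenna; add a basic no-good cut if infeasible.
        bad = [a for a in antennas
               if not feasible_on_antenna([t for t, x in assign.items() if x == a])]
        if not bad:
            return assign
        for a in bad:
            cuts.append({(t, x) for t, x in assign.items() if x == a})

print(solve())
```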
33

Law as Information Processes

Collecchia, Lucas 21 November 2013 (has links)
This thesis describes a new theoretical framework for characterizing legal systems and legal thought. Broadly speaking, legal systems can be characterized as undertaking three functional activities: the intake, processing, and distribution of information. The thesis defines and explains what those three activities consist of and how they interrelate, and describes some of the emergent phenomena that arise from their co-existence. Additionally, examples are provided of elements of legal systems whose behavior is neatly predicted by information-first methods of analysis. The aim is to develop information-related tools for understanding the function of legal systems and subsystems in society by reference to those three activities, and a robust set of fields and concepts is presented for future development.
34

Stochastic Models For Evolution Of Tumor Geometry for Cervical Cancer During Radiation Therapy

Yifang, Liu 05 December 2013 (has links)
Adaptive radiation therapy re-optimizes treatment plans based on updated tumor geometries from magnetic resonance imaging (MRI) scans. However, the imaging process is costly in labor and equipment. In this study, we develop a mathematical model that describes tumor evolution based on a Markov assumption. We then extend the model to predict tumor evolution with any level of information from a new patient: weekly MRI scans are used to estimate transition probabilities when available; otherwise, historical MRI scans are used. In the latter case, patients in the historical data are clustered into two groups, and the model relates the new patient's behavior to these two groups. The models are evaluated on 33 cervical cancer patients from the Princess Margaret Cancer Centre. The results indicate that our models outperform the constant-volume model, which replicates current clinical practice.
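To make the Markov assumption above concrete, here is a minimal sketch of estimating a transition matrix from weekly state sequences and predicting the next week's state distribution; the discretization into three volume states and the toy histories are illustrative assumptions, not the thesis's actual model or data.

```python
# A minimal sketch, assuming tumor volumes are discretized into a few states;
# the state definitions and data are illustrative, not taken from the thesis.
import numpy as np

# Historical patients: weekly sequences of discretized tumor-volume states
# (0 = large, 1 = medium, 2 = small), one list per patient.
histories = [
    [0, 0, 1, 1, 2],
    [0, 1, 1, 2, 2],
    [0, 0, 0, 1, 1],
]
n_states = 3

# Maximum-likelihood transition probabilities: count observed transitions.
counts = np.zeros((n_states, n_states))
for seq in histories:
    for s, s_next in zip(seq, seq[1:]):
        counts[s, s_next] += 1
P = counts / counts.sum(axis=1, keepdims=True)

# Predict the state distribution one week ahead for a new patient
# currently in state 0 (large tumor).
current = np.array([1.0, 0.0, 0.0])
print("Transition matrix:\n", P)
print("Predicted distribution next week:", current @ P)
```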
36

Problems in Supply Chain Location and Inventory under Uncertainty

Hajizadeh Saffar, Iman 13 August 2010 (has links)
We study three problems in supply chain location and inventory under uncertainty. In Chapter 2, we study the inventory purchasing and allocation problem in a movie rental chain under demand uncertainty. We formulate this problem as a newsvendor-like problem with multiple rental opportunities. We study several demand and return forecasting models based on comparable films, using iterative maximum likelihood estimation and Bayesian estimation via Markov chain Monte Carlo simulation. Test results on data from a large movie rental firm reveal systematic under-buying of movies purchased through revenue-sharing contracts and over-buying of movies purchased through standard ones. For the movies considered, the model estimates increases in average profit per title for new movies of 15.5% and 2.5% for revenue-sharing and standard titles, respectively. We discuss the implications of revenue sharing on the profitability of both the rental firm and the studio. In Chapter 3, we focus on the effect of travel time uncertainty on the location of facilities that provide service within a given coverage radius on a transportation network. Three models - expected covering, robust covering, and expected p-robust covering - are studied, each appropriate for a different type of facility. Exact and approximate algorithms are developed. The models are used to analyze the location of fire stations in the city of Toronto. Using real traffic data, we show that the current system design is quite far from optimal and provide recommendations for improving its performance. In Chapter 4, we extend the analysis of Chapter 3 to study the trade-off between adding new facilities and relocating existing ones. We consider a multi-objective problem that minimizes the number of facility relocations while maximizing expected and worst-case network coverage. Exact and approximate algorithms are developed to solve three variations of the problem and to find expected versus worst-case trade-off curves for any given number of relocations. The models are used to analyze the addition of four new fire stations to the city of Toronto. Our results suggest that the benefit of adding four new stations is achievable, at a lower cost, by relocating four to five existing stations.
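As a hedged illustration of the newsvendor-style purchasing decision in Chapter 2, the sketch below computes the classical single-period critical-fractile order quantity under an assumed normal demand; the contract figures are invented for illustration, and the thesis's model with multiple rental opportunities and Bayesian demand estimation is richer than this.

```python
# A minimal sketch of the classical newsvendor critical-fractile rule under
# a normal demand assumption; all figures are illustrative, not thesis data.
from statistics import NormalDist

mu, sigma = 100.0, 30.0      # assumed weekly rental demand for a title
price = 4.0                  # revenue per rental satisfied
cost = 1.5                   # per-copy cost charged under the contract
salvage = 0.5                # value of an unused copy

underage = price - cost      # profit lost per unit of unmet demand
overage = cost - salvage     # loss per unsold copy
critical_ratio = underage / (underage + overage)

q_star = NormalDist(mu, sigma).inv_cdf(critical_ratio)
print(f"critical ratio = {critical_ratio:.3f}, order about {q_star:.0f} copies")
```

Under a revenue-sharing contract the per-copy cost falls while the studio takes a share of revenue, which shifts the critical ratio and hence the order quantity; the abstract's finding of systematic under-buying on such contracts is consistent with this kind of sensitivity.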
37

An Integrated Two-stage Innovation Planning Model with Market Segmented Learning and Network Dynamics

Ferreira, Kevin D. 28 February 2013 (has links)
Innovation diffusion models have been studied extensively to forecast and explain the adoption process for new products or services. These models are often formulated using one of two approaches. The first, and most common, is a macro-level approach that aggregates much of the market behaviour; an advantage of this method is that forecasts and other analyses may be performed while estimating only a few parameters. The second is a micro-level approach that utilizes microeconomic information about the potential market and the innovation; the advantage of this methodology is that analyses allow a direct understanding of how potential customers view the innovation. Nevertheless, when individuals make adoption decisions, the process in reality consists of at least two stages: first, a potential adopter must become aware of the innovation; and second, the aware individual must decide to adopt. Researchers have studied multi-stage diffusion processes in the past; however, a majority of these works employ a macro-level approach to model market flows. As a result, a direct understanding of how individuals value the innovation is lacking, making it impossible to use this information to model realistic word-of-mouth behaviour and other network dynamics. Thus, we propose an integrated two-stage model that combines the benefits of the macro- and micro-level approaches. In the first stage, potential customers become aware of the innovation; this requires no decision making by the individual, so we employ a macro-level diffusion process to describe it. In the second stage, potential customers decide whether or not to adopt the innovation, and we use a micro-level methodology to model this decision. We further extend the application to include forward-looking behaviour, heterogeneous adopters, and segmented Bayesian learning, and we use adopters' satisfaction levels to describe biasing and word-of-mouth behaviour. We apply the proposed model to Canadian colour-TV data, and cross-validation results suggest that the new model has excellent predictive capabilities. We also apply the two-stage model to early U.S. hybrid-electric vehicle data, and the results provide insightful managerial observations.
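The macro-level awareness stage described above can be pictured with a standard Bass-type recursion; the sketch below is a minimal illustration under assumed parameter values and is not the thesis's calibrated two-stage model.

```python
# A minimal sketch of a macro-level (Bass-type) awareness diffusion stage;
# the parameter values and the use of the Bass form are illustrative
# assumptions, not the thesis's calibrated model.
def bass_awareness(p, q, market_size, periods):
    """Discrete-time Bass recursion: new awareness is driven by external
    influence (p) and internal, word-of-mouth-like influence (q)."""
    aware = 0.0
    path = []
    for _ in range(periods):
        new = (p + q * aware / market_size) * (market_size - aware)
        aware += new
        path.append(aware)
    return path

# Example: 3% external influence, 38% internal influence, market of 10,000.
for t, a in enumerate(bass_awareness(p=0.03, q=0.38, market_size=10_000, periods=10), 1):
    print(f"period {t:2d}: {a:8.0f} aware")
```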
38

Sequential and simultaneous lifting in the node packing polyhedron

Pavelka, Jeffrey William January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / Todd W. Easton / Integer programs (IPs) are a commonly researched class of decision problems. These problems are used in various applications to help companies, governments, or individuals make better decisions by determining optimal resource allocations. While IPs are practical tools, they can require an exponential amount of effort to solve unless P = NP. This fact has led to much research focused on reducing the time required to solve IPs. Cutting planes are a commonly used tool for reducing IP solving time. Lifting, a process of changing the coefficients in an inequality, is often employed to strengthen cutting planes. When lifting, the goal is often to create a facet defining inequality, which is theoretically the strongest cutting plane. This thesis introduces two new lifting procedures for the Node Packing problem. The Node Packing problem seeks to select the maximum number of nodes in a graph such that no two selected nodes are adjacent. The first lifting method, the Simultaneous Lifting Expansion, takes two inequalities and combines them to make a stronger cut. It works for any two general classes of inequalities, as long as the requisite graph structures are present. The second method, the Cliques On Odd-holes Lifting (COOL) procedure, lifts from an odd-hole inequality to a facet defining inequality. COOL makes use of the Odd Gap Lifting procedure, an efficient method for finding lifting coefficients on odd holes. A computational study shows COOL to be effective in creating cuts in graphs with low edge densities.
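To illustrate the kind of odd-hole lifting the abstract refers to, the sketch below lifts a hub node into the odd-hole inequality of a 5-wheel by brute force; the graph and the single-variable lifting step are illustrative assumptions and do not reproduce the COOL or Simultaneous Lifting Expansion procedures themselves.

```python
# A minimal sketch of lifting a node into an odd-hole inequality for node
# packing, using brute force on a 5-wheel; the graph and the lifting step
# are illustrative, not the thesis's procedures.
from itertools import combinations

# 5-wheel: odd hole 0-1-2-3-4-0 plus hub node 5 adjacent to every hole node.
hole = [0, 1, 2, 3, 4]
edges = {(0, 1), (1, 2), (2, 3), (3, 4), (0, 4)} | {(i, 5) for i in hole}

def is_packing(nodes):
    """A node packing (independent set): no two chosen nodes are adjacent."""
    return all((a, b) not in edges and (b, a) not in edges
               for a, b in combinations(nodes, 2))

def max_packing_size(candidates):
    """Brute-force the largest packing contained in a candidate node set."""
    best = 0
    for r in range(len(candidates) + 1):
        for subset in combinations(candidates, r):
            if is_packing(subset):
                best = max(best, r)
    return best

# Odd-hole inequality on the 5-cycle: x0 + x1 + x2 + x3 + x4 <= 2.
rhs = 2
assert max_packing_size(hole) == rhs

# Lift node 5: its coefficient is rhs minus the best packing still possible
# among hole nodes not adjacent to node 5 (here none remain), i.e. 2 - 0.
free = [v for v in hole if (v, 5) not in edges and (5, v) not in edges]
alpha_5 = rhs - max_packing_size(free)
print(f"lifted inequality: x0 + x1 + x2 + x3 + x4 + {alpha_5}*x5 <= {rhs}")
```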
39

Simultaneously lifting multiple sets in binary knapsack integer programs

Kubik, Lauren Ashley January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / Todd W. Easton / Integer programs (IPs) are mathematical models that can provide organizations with the ability to optimally achieve their goals through appropriate utilization and allocation of available resources. Unfortunately, IPs are NP-complete in the strong sense, and many integer programs cannot be solved in practice. Introduced by Gomory, lifting is a technique that takes a valid inequality and strengthens it. Lifting can result in facet defining inequalities, which are the theoretically strongest inequalities; because of this, lifting techniques are commonly used in commercial IP software to reduce the time required to solve an IP. This thesis introduces two new algorithms for exact simultaneous uplifting of multiple sets into binary knapsack problems and introduces sequential simultaneous lifting. The Dynamic Programming Multiple Lifting Set Algorithm (DPMLSA) is a pseudopolynomial-time algorithm, bounded by O(nb) effort, that can exactly uplift an arbitrary number of sets. The Three Set Simultaneous Lifting Algorithm (TSSLA) is a polynomial-time algorithm, bounded by O(n^2), that can exactly simultaneously uplift three sets. The simultaneously lifted inequalities generated by the DPMLSA and the TSSLA can be facet defining, and neither algorithm requires starting with a minimal cover. A brief computational study shows that the DPMLSA is fast, requiring an average of only 0.070 seconds. The computational study also shows that these sequential simultaneously lifted inequalities are useful, as solution time decreased by an overall average of 18.4%. Therefore, implementing the DPMLSA should be beneficial for large IPs.
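As a rough companion to the pseudopolynomial flavour of the DPMLSA, the sketch below uplifts a single variable into a knapsack cover inequality using a small dynamic program over capacities; the instance is invented, and this is ordinary single-variable sequential lifting rather than the thesis's simultaneous multiple-set procedure.

```python
# A minimal sketch of uplifting one variable into a knapsack cover inequality
# using a pseudopolynomial dynamic program; the instance is illustrative and
# this is single-variable sequential lifting, not the thesis's DPMLSA.
weights = {"x1": 6, "x2": 5, "x3": 4, "x4": 3}
b = 10                                  # knapsack: 6x1 + 5x2 + 4x3 + 3x4 <= 10
cover = ["x1", "x2", "x3"]              # 6 + 5 + 4 = 15 > 10, so a cover
rhs = len(cover) - 1                    # cover inequality: x1 + x2 + x3 <= 2

def max_cover_vars(capacity, variables):
    """DP over capacities: most cover variables that fit within 'capacity'."""
    best = [0] * (capacity + 1)
    for v in variables:
        w = weights[v]
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + 1)
    return best[capacity]

# Uplift x4: if x4 = 1, only b - weights["x4"] capacity remains for the cover
# variables, so its coefficient is rhs minus what can still be packed.
alpha = rhs - max_cover_vars(b - weights["x4"], cover)
print(f"lifted inequality: x1 + x2 + x3 + {alpha}*x4 <= {rhs}")
```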
40

Optimizing quarantine regions through graph theory and simulation

Carlyle, Kyle R. January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / Todd W. Easton / Epidemics have been modeled mathematically as a way to understand them safely. For many of these mathematical models, the underlying assumptions yield elegant mathematical results but are unrealistic for practical use. This research branches out from previous work by providing a model of the spread of infectious disease and a model for quarantining the disease without the limiting assumptions of previous research. One of the main results of this thesis is a core simulation that rapidly simulates the spread of an epidemic on a contact network; the simulation can be easily adapted to any disease by adjusting its many parameters. This research also provides the first definitions of a quarantine cut and an ellipsoidal geographic network. The thesis uses the ellipsoidal geographic network to determine what is, and what is not, a feasible quarantine region. The quarantine cut is a new approach to partitioning quarantined and saved individuals in an optimized way. To find an optimal quarantine cut, an integer program was developed. Although this integer program runs in polynomial time, the preparation required to execute it is unrealistic in a disease outbreak scenario. To provide implementable results, a heuristic and some general theory are provided. In a computational study, the heuristic performed within 10% of the optimal quarantine cut, which shows that the theory developed in this thesis can be successfully used in a disease outbreak scenario.
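In the spirit of the core contact-network simulation described above, here is a minimal sketch of a discrete-day outbreak simulation on a toy network; the network, transmission probability, and infectious period are illustrative assumptions rather than parameters from the thesis. A quarantine cut would then seek a partition of such a network that separates infected regions from saved ones.

```python
# A minimal sketch of simulating an outbreak on a contact network; the
# network, infection probability, and recovery time are illustrative.
import random

random.seed(1)
contacts = {                      # a tiny contact network (adjacency lists)
    0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1, 5], 4: [2, 5], 5: [3, 4],
}
p_transmit = 0.4                  # per-contact, per-day transmission probability
days_infectious = 3

state = {v: "S" for v in contacts}
days_left = {}
state[0], days_left[0] = "I", days_infectious   # index case

for day in range(1, 15):
    newly_infected = []
    for v, s in state.items():
        if s != "I":
            continue
        for u in contacts[v]:                   # try to infect susceptible contacts
            if state[u] == "S" and random.random() < p_transmit:
                newly_infected.append(u)
        days_left[v] -= 1
    for v, s in list(state.items()):            # recoveries at end of day
        if s == "I" and days_left[v] == 0:
            state[v] = "R"
    for u in newly_infected:                    # new cases become infectious tomorrow
        if state[u] == "S":
            state[u], days_left[u] = "I", days_infectious
    print(day, "".join(state[v] for v in sorted(contacts)))
```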
