11

Accounting for individual choice in public health emergency response planning

Martin, Christopher A. January 1900 (has links)
Master of Science / Department of Industrial and Manufacturing Systems Engineering / Jessica L. Heier Stamm / During public health emergencies, organizations in charge require an immediate and efficient method of distributing supplies over a large-scale area. Because it is uncertain where individuals will choose to receive supplies, these distribution strategies have to account for the unknown demand at each facility. Current techniques rely on population ratios or requests by health care providers, which can lead to increased disparity in individuals' access to medical supplies. This research proposes a mathematical programming model, along with a solution methodology, to inform distribution system planning for public health emergency response. The problem is motivated by distribution planning for pandemic influenza vaccines or countermeasures. The model uses an individual choice constraint to determine which facility each individual will choose to receive supplies from, and it also determines where to allocate supplies in order to meet the demand at each facility. The model was solved using a decomposition method, which allows large problems to be solved quickly without losing equity in the solution. In the absence of publicly available data on actual distribution plans from previous pandemic response efforts, the method is applied to another representative data set. A computational study of the equity and number of people served depicts how the model performed compared to the actual data. The results show that implementing an individual choice constraint will improve the effectiveness of a public health emergency response campaign without losing equity. The thesis provides several contributions to prior research. The first contribution is an optimization model that implements individual choice in a constraint.
This determines where individuals will choose to receive their supplies so that improved decisions can be made about where to allocate resources. Another contribution is a solution methodology for solving large problems using a decomposition method, which provides a faster response to the public health emergency by splitting the problem into smaller subproblems. Finally, this research provides a computational study, using a large data set, of the impact of accounting for individual choice in a distribution campaign.
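The core idea behind an individual-choice constraint can be sketched in a few lines. This is a minimal illustration, not the thesis's model: it assumes each person visits the nearest open facility, derives the demand that choice behavior induces, and then allocates a fixed supply proportionally. All coordinates and quantities below are hypothetical.

```python
# Sketch: demand induced by individual choice, then proportional allocation.
# Assumption (not from the thesis): people choose the nearest facility.

def induced_demand(people, facilities):
    """Map each person to their nearest facility; return demand counts."""
    demand = {f: 0 for f in facilities}
    for px, py in people:
        nearest = min(facilities, key=lambda f: (f[0] - px) ** 2 + (f[1] - py) ** 2)
        demand[nearest] += 1
    return demand

def allocate(supply, demand):
    """Allocate a fixed supply proportionally to the induced demand."""
    total = sum(demand.values())
    return {f: supply * d / total for f, d in demand.items()}

people = [(0, 0), (1, 0), (5, 5), (6, 5), (6, 6)]   # hypothetical locations
facilities = [(0, 1), (5, 6)]
d = induced_demand(people, facilities)
print(d)                    # {(0, 1): 2, (5, 6): 3}
print(allocate(100, d))     # {(0, 1): 40.0, (5, 6): 60.0}
```

Allocating against the choice-induced demand, rather than raw population ratios, is what keeps supply aligned with where people actually show up.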
12

Generating cutting planes through inequality merging for integer programming problems

Hickman, Randal Edward January 1900 (has links)
Doctor of Philosophy / Department of Industrial and Manufacturing Systems Engineering / Todd Easton / Integer Programming (IP) problems are a common type of optimization problem used to solve numerous real-world problems. IPs can require exponential computational effort to solve using the branch and bound technique. A popular method to improve solution times is to generate valid inequalities that serve as cutting planes. This dissertation introduces a new category of cutting planes for general IPs called inequality merging. The inequality merging technique combines two or more low-dimensional inequalities, yielding valid inequalities of potentially higher dimension. The dissertation describes several theoretical results of merged inequalities. This research applies merging inequalities to a frequently used class of IPs called multiple knapsack (MK) problems. Theoretical results related to merging cover inequalities are presented. These results include: conditions for validity, conditions for facet-defining inequalities, merging simultaneously over multiple cover inequalities, sequentially merging several cover inequalities on multiple variables, and algorithms that facilitate the development of merged inequalities. Examples demonstrate each of the theoretical discoveries. A computational study experiments with inequality merging techniques using benchmark MK instances. This computational study provides recommendations for implementing merged inequalities, which result in an average decrease of about 9% in computational time for both small and large MK instances. The research validates the effectiveness of using merged inequalities for MK problems and motivates substantial theoretical and computational extensions as future research.
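A candidate inequality's validity can always be checked by brute force on a small instance. The sketch below is not the dissertation's merging procedure: it simply enumerates all binary points of a tiny, hypothetical multiple-knapsack region and verifies that two cover inequalities, and a naive "merged" candidate formed from both of them, never cut off a feasible point.

```python
# Sketch: brute-force validity check of (possibly merged) inequalities
# for a tiny multiple-knapsack feasible region. Instance is hypothetical.
from itertools import product

def is_valid(ineq_coeffs, rhs, constraints, n):
    """True iff sum(c_i * x_i) <= rhs holds at every feasible 0/1 point."""
    for x in product((0, 1), repeat=n):
        feasible = all(sum(a[i] * x[i] for i in range(n)) <= b
                       for a, b in constraints)
        if feasible and sum(ineq_coeffs[i] * x[i] for i in range(n)) > rhs:
            return False
    return True

# Two knapsack rows over 4 binary variables.
constraints = [([5, 4, 3, 2], 8),   # cover {x0, x1}: 5 + 4 > 8
               ([2, 3, 5, 4], 8)]   # cover {x2, x3}: 5 + 4 > 8
print(is_valid([1, 1, 0, 0], 1, constraints, 4))  # True: x0 + x1 <= 1
print(is_valid([0, 0, 1, 1], 1, constraints, 4))  # True: x2 + x3 <= 1
print(is_valid([1, 1, 1, 1], 2, constraints, 4))  # True: merged candidate
```

The merged candidate involves all four variables, so it can have higher dimension than either two-variable cover inequality alone, which is the payoff the dissertation pursues with its (far more sophisticated) merging conditions.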
13

The existence and usefulness of equality cuts in the multi-demand multidimensional knapsack problem

DeLissa, Levi January 1900 (has links)
Master of Science / Department of Industrial and Manufacturing Systems Engineering / Todd Easton / Integer programming (IP) is a class of mathematical models useful for modeling and optimizing many theoretical and industrial problems. Unfortunately, IPs are NP-complete, and many integer programs cannot currently be solved. Valid inequalities and their respective cuts are commonly used to reduce the effort required to solve IPs. This thesis poses the questions: do valid equality cuts exist, and can they be useful for solving IPs? Several theoretical results related to valid equalities are presented in this thesis. It is shown that equality cuts exist if and only if the convex hull is not full dimensional. Furthermore, the addition of an equality cut can arbitrarily reduce the dimension of the linear relaxation. In addition to the theory on equality cuts, the idea of infeasibility conditions is presented. Infeasibility conditions introduce a set of valid inequalities whose intersection is the empty set, and they can be used to rapidly terminate a branch and cut algorithm. Applying the idea of equality cuts to the multi-demand multidimensional knapsack problem (MDMKP) resulted in a new class of cutting planes named anticover cover equality (ACE) cuts. A simple algorithm, FACEBT, is presented for finding ACE cuts in a branching tree with complexity O(m n log n). A brief computational study shows that ACE cuts arise frequently in the MDMKP instances studied. Every instance had at least one equality cut, while one instance had over 500,000. Additionally, computationally challenging instances saw an 11% improvement in computational effort. Therefore, equality cuts are a new topic of research in IP that can be beneficial for solving some IP instances.
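The central observation, that equality cuts exist exactly when the convex hull is not full dimensional, can be seen on a toy instance. The sketch below is hypothetical and not from the thesis: a demand row and a capacity row together force an equality that every feasible point satisfies, so that equality is a valid cut which reduces the dimension of the relaxation.

```python
# Sketch: when constraints force x0 + x1 == 1 at every feasible point,
# that equality is a valid equality cut. Tiny hypothetical instance.
from itertools import product

# Feasible region: 0/1 points with x0 + x1 >= 1 (a demand row) and
# x0 + x1 <= 1 (a capacity row). Together they pin x0 + x1 to 1.
feasible = [x for x in product((0, 1), repeat=3)
            if x[0] + x[1] >= 1 and x[0] + x[1] <= 1]

# Every feasible point lies on the hyperplane x0 + x1 = 1, so the
# convex hull is not full dimensional in 3 variables.
assert all(x[0] + x[1] == 1 for x in feasible)
print(feasible)  # [(0, 1, 0), (0, 1, 1), (1, 0, 0), (1, 0, 1)]
```

Adding `x0 + x1 == 1` to the relaxation cannot cut off any integer point, yet it removes an entire dimension of fractional solutions, which is why such cuts can speed up branch and cut.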
14

The NFL true fan problem

Whittle, Scott January 1900 (has links)
Master of Science / Department of Industrial and Manufacturing Systems Engineering / Todd Easton / Throughout an NFL season, 256 games are played over 17 weeks. For a given fan who follows one team, only 16 of those games usually matter, and the rest carry little significance. The goal of this research is to provide substantial reasons for fans to watch other games. This research finds the easiest path to a division championship for each team, where the easiest path requires winning the fewest games. Due to the NFL's complicated tiebreaker rules, games not involving the fan's team can have major implications for that team. The research calls these games critical, because if the wrong team wins, then the fan's team must win additional games to become the division champion. To identify both the easiest path and the critical games, integer programming is used. Given the number of two-team, three-team, and four-team division tie scenarios that can occur, 31 separate integer programs are solved for each team to identify the easiest path to the division championship. A new algorithm, Shortest Path of Remaining Teams (SPORT), is used to iteratively search through every game of the upcoming week to determine critical games. These integer programs and the SPORT algorithm were applied to data from the previous two NFL seasons. Across these two seasons, the earliest a team was eliminated from the possibility of winning a division championship was week 12, which occurred in both 2012 and 2013. There was an average of 65 critical games per season, with more critical games occurring in the 2013-2014 season. Additionally, the 2012 season was used to compare flex-scheduled games with the critical games for those weeks, and it was found that the NFL missed three weeks of potentially scheduling a critical game.
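On a small enough instance the "easiest path" can be found by brute force rather than integer programming. The sketch below is a hypothetical three-team division, not the thesis's models or the SPORT algorithm, and it ignores the NFL's tiebreaker rules: it enumerates all outcomes of the remaining games and reports the fewest additional wins our team needs on some path to the outright division lead. Results elsewhere must fall favorably on that path; those other games are exactly what the thesis calls critical.

```python
# Sketch: brute-force "easiest path" for a tiny hypothetical division.
# Tiebreakers are ignored; an outright lead is required.
from itertools import product

wins = {'A': 5, 'B': 6, 'C': 4}                    # hypothetical standings
remaining = [('A', 'B'), ('A', 'C'), ('B', 'C')]   # hypothetical games left

def fewest_wins_to_lead(team):
    """Fewest extra wins for `team` on any outcome path where it leads outright."""
    best = None
    for outcome in product((0, 1), repeat=len(remaining)):
        final = dict(wins)
        for (first, second), first_won in zip(remaining, outcome):
            final[first if first_won else second] += 1
        if all(final[team] > w for t, w in final.items() if t != team):
            extra = final[team] - wins[team]
            if best is None or extra < best:
                best = extra
    return best

# A must win both its games, AND B must lose to C (a critical game).
print(fewest_wins_to_lead('A'))  # 2
```

Even here the structure shows through: no combination of A's own results suffices without the right outcome in the B-C game, which is why games not involving the fan's team are worth watching.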
15

Octanary branching algorithm

Bailey, James Patrick January 1900 (has links)
Master of Science / Department of Industrial and Manufacturing Systems Engineering / Todd Easton / Integer Programs (IPs) are a class of discrete optimization problems that have been used commercially to improve various systems. IPs are often used to reach an optimal financial objective with constraints based upon resources, operations and other restrictions. While incredibly beneficial, IPs have been shown to be NP-complete, with many IPs remaining unsolvable. Traditionally, Branch and Bound (BB) has been used to solve IPs. BB is an iterative algorithm that enumerates all potential integer solutions for a given IP, and it can guarantee an optimal solution, if one exists, in finite time. However, BB can require an exponential number of nodes to be evaluated before terminating. As a result, the memory of a computer using BB can be exceeded, or it can take an excessively long time to find the solution. This thesis introduces a modified BB scheme called the Octanary Branching Algorithm (OBA). OBA introduces eight children in each iteration to more effectively partition the feasible region of the linear relaxation of the IP. OBA also introduces equality constraints in four of the children in order to reduce the dimension of the remaining nodes. OBA can guarantee an optimal solution, if it exists, in finite time, and it has been shown to have some theoretical improvements over traditional BB. During computational tests, OBA was able to find the first, second and third integer solutions with 64.8%, 27.9% and 29.3% fewer nodes evaluated, respectively, than CPLEX. These integer solutions were 44.9%, 54.7% and 58.2% closer to the optimal solution, respectively, when compared to CPLEX. It is recommended that commercial solvers incorporate OBA in the initialization and random diving phases of BB.
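One way to realize an eight-way branching partition can be sketched concretely. To be clear, this is a guess at the flavor of the scheme and not the thesis's actual branching rule: four children fix the branching variable to the integers nearest its fractional LP value (the equality children that reduce dimension), and each of the two remaining tails of the domain is split into two range children.

```python
# Sketch of a hypothetical eight-way partition in the spirit of OBA.
# Not the thesis's exact rule: 4 equality children near the LP value,
# plus up to 4 range children covering the rest of the domain.
import math

def octanary_children(lb, ub, frac_value):
    """Partition the integer domain [lb, ub] into up to 8 disjoint children."""
    f = math.floor(frac_value)
    pinned = [v for v in (f - 1, f, f + 1, f + 2) if lb <= v <= ub]
    children = [('eq', v) for v in pinned]          # equality children
    for lo, hi in ((lb, f - 2), (f + 3, ub)):       # the two tails
        if lo > hi:
            continue
        mid = (lo + hi) // 2                        # split each tail in half
        children.append(('range', lo, mid))
        if mid + 1 <= hi:
            children.append(('range', mid + 1, hi))
    return children

print(octanary_children(0, 10, 4.3))
# [('eq', 3), ('eq', 4), ('eq', 5), ('eq', 6),
#  ('range', 0, 1), ('range', 2, 2), ('range', 7, 8), ('range', 9, 10)]
```

The equality children each fix a variable outright, so their subproblems live in one fewer dimension, which is the mechanism the abstract credits for OBA's node savings.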
16

Primary and secondary log breakdown simulation

Todoroki, Christine Louisa January 1997 (has links)
Log breakdown by sawing can be viewed as a multi-phase process that converts logs into boards by a series of cutting operations. In the primary phase, logs are sawn into slabs of wood known as flitches or cants. These are further processed by secondary operations that resaw, edge (cut lengthwise) and trim (cut widthwise) the raw material, resulting in the manufacture of the board product, whose value is influenced by its composite dimensions and quality (as indicated by a grade). Board grade is in turn determined by the number, type, size, and location of defects. Owing to its biological origins, each log, and each subsequent board, is unique. Furthermore, as each sawmill, and each processing centre within the mill, has a unique configuration, the problem of determining how each log entering a mill should be sawn is very complex. Effective computer simulation of log breakdown processes must therefore entail detailed descriptions of both the geometry and quality of individual logs. Appropriate strategies at each breakdown phase are also required. In this thesis, models for emulating log breakdown are developed in conjunction with an existing sawing simulation system which requires, as input, detailed three-dimensional descriptions of both internal and external log characteristics. Models based on heuristic and enumerative procedures, and those based upon the principles of dynamic programming (DP), are formulated, encoded, and compared. Log breakdown phases are considered both independently and in a combined integrated approach, working backwards from the board product through to the primary log breakdown phase. This approach permits methodology developed for the later processes to be embedded within the primary phase, thus permitting the determination of a global rather than local solution to the log breakdown problem, whose objective is to seek the highest possible solution quality within the minimum possible time.
Simulation results indicate that solution quality and processing speeds are influenced by both solution methodology and degree of data complexity. When the structure of either factor is simplified, solutions are generated more rapidly, but with an accompanying reduction in solution quality. A promising compromise that combines DP techniques with mathematical functions based on a subset of the original data is presented. / Subscription resource available via Digital Dissertations only.
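The flavor of the DP approach can be shown with a much-simplified, one-dimensional analogue (the thesis works in three dimensions with log geometry and defects): classic dynamic programming that cuts a log of a given length into boards so as to maximize total value. The board lengths and prices below are hypothetical.

```python
# Sketch: 1-D dynamic program for cutting a log into boards of maximum
# total value. A toy analogue of the thesis's DP, not its formulation.

def best_breakdown(log_len, board_value):
    """board_value[L] = price of a board of length L; returns (value, cuts)."""
    best = [0] * (log_len + 1)       # best[n] = max value from length n
    choice = [0] * (log_len + 1)     # first board cut at length n
    for n in range(1, log_len + 1):
        for length, value in board_value.items():
            if length <= n and value + best[n - length] > best[n]:
                best[n] = value + best[n - length]
                choice[n] = length
    # Recover the cutting pattern by walking the choices back.
    cuts, n = [], log_len
    while n > 0 and choice[n]:
        cuts.append(choice[n])
        n -= choice[n]
    return best[log_len], cuts

print(best_breakdown(10, {2: 5, 3: 8, 5: 13}))  # (26, [2, 2, 3, 3])
```

As in the thesis, working backwards from the board product means `best[n - length]` is already a globally optimal answer for the remainder, so the primary cut decision inherits downstream optimality rather than being made locally.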
