71
Algorithms for derivative-free optimization / Rios, Luis Miguel. January 2009 (has links)
Thesis (Ph.D.)--University of Illinois at Urbana-Champaign, 2009. / Source: Dissertation Abstracts International, Volume: 70-06, Section: B, page: 3764. Adviser: Nikolaos Sahinidis. Includes bibliographical references (leaves 109-119). Available on microfilm from ProQuest Information and Learning.
72
Mathematical programming in data mining: Models for binary classification with application to collusion detection in online gambling / Domm, Maryanne. January 2003 (has links)
Data mining is a semi-automated technique to discover patterns and trends in large amounts of data and can be used to build statistical models to predict those patterns and trends. One type of prediction model is a classifier, which attempts to predict to which group a particular item belongs. An important binary classifier, the Support Vector Machine classifier, uses non-linear optimization to find a hyperplane separating the two classes of data. This classifier has been reformulated as a linear program and as a pure quadratic program. We propose two modeling extensions to the Support Vector Machine classifier. The first, the Linearized Proximal Support Vector Machine classifier, linearizes the objective function of the pure quadratic version. This reduces the importance the classifier places on outlying data points. The second extension improves the conceptual accuracy of the model. The Integer Support Vector Machine classifier uses binary indicator variables to indicate potential misclassification errors and minimizes these errors directly. Performance of both these new classifiers was evaluated on a simple two-dimensional data set as well as on several data sets commonly used in the literature and was compared to the original classifiers. These classifiers were then used to build a model to detect collusion in online gambling. Collusion occurs when two or more players play differently against each other than against the rest of the players. Because their communication cannot be intercepted, collusion is easier for online gamblers. However, collusion can still be identified by examining the playing style of the colluding players. By analyzing the record of play from online poker, a model can be built to predict whether or not a hand contains colluding players. We found that these new classifiers performed about as well as the original classifiers, sometimes better and sometimes worse. We also found that one form of online collusion could be detected, though not perfectly.
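The soft-margin objective underlying these Support Vector Machine variants can be sketched in a few lines. The following is a simplified illustration, not the dissertation's models: a linear classifier for two-dimensional points trained by subgradient descent on the hinge-loss-plus-regularization objective. All function names and parameters here are invented for the example.

```python
def train_svm(points, labels, lam=0.01, epochs=200, lr=0.05):
    """points: list of (x1, x2); labels: +1 or -1. Returns (w, b)."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):
            margin = y * (w[0] * x1 + w[1] * x2 + b)
            if margin < 1:
                # Inside the margin: the hinge term's subgradient is active
                w[0] += lr * (y * x1 - 2 * lam * w[0])
                w[1] += lr * (y * x2 - 2 * lam * w[1])
                b += lr * y
            else:
                # Outside the margin: only the regularizer shrinks w
                w[0] -= lr * 2 * lam * w[0]
                w[1] -= lr * 2 * lam * w[1]
    return w, b

def predict(w, b, point):
    return 1 if w[0] * point[0] + w[1] * point[1] + b >= 0 else -1
```

The hinge loss penalizes points on the wrong side of the margin linearly; the Integer SVM described above instead counts misclassifications directly with binary variables, which this continuous sketch cannot express.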
73
Agent-based heuristics for large, multiple-mode, resource-constrained project scheduling problems / Knotts, Gary Wayne, 1962- January 1998 (has links)
In this dissertation we address large, multiple-mode, resource-constrained project scheduling problems with the objective of minimizing makespan. After noting that projects often fail and new research is needed, we provide the formal definition of the resource-constrained project scheduling problem and review the existing literature. We then introduce a new model based on digital electronics. We conceptualize our model using agent technology and discuss it as an extension of existing models with more representational power. We also describe how our model supports distributed planning. After implementing our model, we conduct two computational studies. In the first, we develop two agent types, basic and enhanced, where the enhanced agent is more sophisticated in selecting an activity execution mode. We apply these agents to the scheduling of 500 instances of a small project originally published by Maroto and Tormos (1994). We evaluate the performance of the agents in conjunction with their use of eight heuristic prioritization rules: shortest and longest processing time, fewest and most immediate successors, smallest and greatest resource demand, earliest start time, and earliest due date. Our results show that enhanced agents consistently outperform basic agents, while the results regarding priority rules were mixed. In the second computational study, we further develop our enhanced agents by providing still more sophisticated mode selection. We also evaluate static versus dynamic prioritization and two more priority rules: shortest and longest critical-path duration. For this study we generated projects with 2500, 5000, 7500, and 10000 activities. For each of these sizes, we generated networks with complexities of 1.5, 1.8, and 2.1. For these twelve networks, we generated 20 problem instances for every possible combination of resource factor = 0.25, 0.50, 0.75, 1.0 and resource strength = 0.2, 0.5, 0.8.
We graphically evaluated scheduling performance, computation times, and failure rates and conducted an extensive statistical analysis. We found that enhanced agents using the shortest processing time priority consistently produced the shortest schedules. However, these agents failed more often than basic agents. We found that dynamic prioritization requires more computation time but provides little improvement in scheduling performance. We conclude this work with suggestions for future research.
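The shortest-processing-time priority rule can be illustrated with a much-reduced serial schedule generator: single execution mode, a single renewable resource, and no agents, which is a drastic simplification of the dissertation's model. The data layout and function name below are invented for the sketch.

```python
def spt_schedule(acts, capacity):
    """acts: {name: (duration, demand, [predecessors])}.
    Returns {name: start_time} for a single renewable resource."""
    start, finish = {}, {}
    usage = {}  # time unit -> resource units already committed
    while len(start) < len(acts):
        # Eligible: unscheduled activities whose predecessors have finished
        eligible = [a for a in acts if a not in start
                    and all(p in finish for p in acts[a][2])]
        # Shortest-processing-time priority rule
        a = min(eligible, key=lambda x: acts[x][0])
        dur, dem, preds = acts[a]
        t = max([finish[p] for p in preds], default=0)
        # Slide right until resource capacity holds for the whole duration
        while any(usage.get(u, 0) + dem > capacity for u in range(t, t + dur)):
            t += 1
        for u in range(t, t + dur):
            usage[u] = usage.get(u, 0) + dem
        start[a], finish[a] = t, t + dur
    return start
```

For example, with unit capacity and activities a (duration 2), b (duration 1), and c (duration 2, after both), the rule schedules b first at time 0, a at time 1, and c at time 3, for a makespan of 5.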
74
Diverse routing in network planning / Vohnout, Sonia Isabel, 1964- January 1990 (has links)
This thesis discusses an algorithm and two heuristics for solving a particular network optimization problem: the node-disjoint paths problem. The goal of this optimization problem is to find two node-disjoint paths between a given origin-destination pair whose total cost is minimum. This problem is shown to be NP-hard. Two heuristics are investigated in this thesis. The sequential shortest paths heuristic is the faster of the two methods, but solution quality may be sacrificed. On the other hand, the simultaneous shortest paths heuristic, which yields very good solutions, has higher complexity. We also discuss an implicit enumeration algorithm that is used to verify the quality of the solutions obtained from the heuristics.
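The sequential shortest paths idea can be sketched directly (an illustrative simplification, not the thesis's implementation): run Dijkstra once, delete the internal nodes of the path found, and run Dijkstra again. As the abstract notes, this greedy step can sacrifice solution quality, and it can even fail to find a second path when two disjoint paths exist.

```python
import heapq

def shortest_path(adj, s, t):
    """Dijkstra; adj: {u: {v: cost}}. Returns (cost, path) or None."""
    dist, prev = {s: 0}, {}
    pq = [(0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == t:
            path = [t]
            while path[-1] != s:
                path.append(prev[path[-1]])
            return d, path[::-1]
        if d > dist.get(u, float('inf')):
            continue  # stale queue entry
        for v, c in adj.get(u, {}).items():
            if d + c < dist.get(v, float('inf')):
                dist[v], prev[v] = d + c, u
                heapq.heappush(pq, (d + c, v))
    return None

def sequential_disjoint_pair(adj, s, t):
    """Greedy heuristic: take one shortest path, delete its internal
    nodes, then take a second shortest path in the pruned graph."""
    first = shortest_path(adj, s, t)
    if first is None:
        return None
    cost1, path1 = first
    banned = set(path1[1:-1])
    pruned = {u: {v: c for v, c in nbrs.items() if v not in banned}
              for u, nbrs in adj.items() if u not in banned}
    second = shortest_path(pruned, s, t)
    if second is None:
        return None  # heuristic failure: disjoint pair may still exist
    return cost1 + second[0], path1, second[1]
```

On a four-node example with routes s-a-t (cost 2) and s-b-t (cost 4), the heuristic returns the pair with total cost 6.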
75
Independence systems and stable set relaxations / McClosky, Benjamin. January 2008 (has links)
Many fundamental combinatorial optimization problems involve the search for subsets of graph elements which satisfy some notion of independence. This thesis develops techniques for optimizing over a class of independence systems and focuses on systems having the vertex set of a finite graph as a ground set. The search for maximum stable sets in a graph offers a well-studied example of such a problem. More generally, for any integer k ≥ 1, the maximum co-k-plex problem fits into this framework as well. Co-k-plexes are defined as a relaxation of stable sets.
This thesis studies co-k-plexes from polyhedral, algorithmic, and enumerative perspectives. The polyhedral analysis explores the relationship between the stable set polytope and co-k-plex polyhedra. Results include generalizations of odd holes, webs, wheels, and the claw. Sufficient conditions for the integrality of some related linear systems and results on the composition of stable set polyhedra are also given. The algorithmic analysis involves the development of heuristic and exact algorithms for finding maximum k-plexes. This problem is closely related to the search for co-k-plexes. The final chapter includes results on the enumerative structure of co-k-plexes in certain graphs.
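Under the usual definition (assumed here, since the abstract does not state it), a co-k-plex is a vertex subset whose induced subgraph has maximum degree at most k − 1, so a co-1-plex is exactly a stable set. A minimal checker and brute-force maximizer, for illustration only:

```python
from itertools import combinations

def is_co_k_plex(adj, subset, k):
    """adj: {v: set(neighbors)}.  True if every vertex of `subset` has
    at most k-1 neighbors inside `subset` (co-1-plex == stable set)."""
    s = set(subset)
    return all(len(adj[v] & s) <= k - 1 for v in s)

def max_co_k_plex(adj, k):
    """Exhaustive search from largest subsets down; tiny graphs only."""
    for r in range(len(adj), 0, -1):
        for subset in combinations(adj, r):
            if is_co_k_plex(adj, subset, k):
                return subset
    return ()
```

On a triangle, the maximum stable set (k = 1) has one vertex, the maximum co-2-plex has two, and for k = 3 the whole vertex set qualifies, showing how the relaxation grows with k.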
76
Ellipsoidal approximation to polytopes and computational study of Lenstra's algorithm / Gao, Liyan. January 2002 (has links)
General integer programming is an important mathematical approach for many decision-making problems. In this field, a major theoretical breakthrough came in 1983, when H. W. Lenstra, Jr. proposed a polynomial-time algorithm for general integer programs in which the number of variables is fixed. Two key ingredients of Lenstra's algorithm are ellipsoidal approximation of polytopes and lattice basis reduction. However, the lack of practically efficient algorithms and software for the ellipsoidal approximation of polytopes has made it difficult to study the computational properties of Lenstra's algorithm.
To bridge the gap between theory and computational practice for Lenstra's algorithm, we study both the ellipsoidal approximation of polytopes and the computational properties of Lenstra's algorithm in this thesis. We have developed a reliable and efficient algorithm for computing the maximum-volume ellipsoid inscribed in a given polytope. This algorithm effectively exploits the problem-specific structure and uses a primal-dual interior-point method. We show that this algorithm has a sound theoretical foundation, and demonstrate through extensive numerical experiments that it performs considerably better than a number of other algorithms.
Using our ellipsoidal approximation algorithm as a subroutine, we have implemented a version of Lenstra's algorithm for general integer programming feasibility problems. At each node, the method uses ellipsoidal approximation and lattice basis reduction to find a "thin" direction of the polytope, and branches on hyperplanes rather than on variables as in the traditional branch-and-bound method. This procedure guarantees that the number of branches at each node is bounded and small. Our numerical results on small- to medium-sized test instances suggest that Lenstra's algorithm examines far fewer nodes than the traditional branch-and-bound method. However, there is a tradeoff: traditional branch-and-bound explores many nodes but reoptimizes quickly at each, whereas Lenstra's algorithm explores fewer nodes but spends more time on each branching decision. Currently, the main bottleneck in the performance of the algorithm is the lattice basis reduction step. If this step is sufficiently improved, then Lenstra's algorithm, when combined with other techniques such as cutting planes, promises to be efficient for certain classes of difficult problems.
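The value of branching on hyperplanes along a thin direction can be seen in a toy two-dimensional sketch. Nothing here is from the thesis's implementation: vertices are enumerated by brute-force line intersection instead of via the ellipsoidal and lattice machinery, and the example polytope is invented. The point is only that for a polytope thin along a direction d, few (possibly zero) integer hyperplanes d·x = v can intersect it.

```python
from itertools import combinations
import math

def polytope_vertices(cons):
    """cons: list of (a1, a2, b) meaning a1*x + a2*y <= b.  Intersects
    constraint lines pairwise and keeps points satisfying all constraints."""
    verts = []
    for (a1, a2, b1), (c1, c2, b2) in combinations(cons, 2):
        det = a1 * c2 - a2 * c1
        if abs(det) < 1e-12:
            continue  # parallel boundary lines
        x = (b1 * c2 - a2 * b2) / det
        y = (a1 * b2 - b1 * c1) / det
        if all(p * x + q * y <= r + 1e-9 for p, q, r in cons):
            verts.append((x, y))
    return verts

def integer_hyperplanes(cons, d):
    """Integer values v for which the hyperplane d.x = v can meet the
    polytope -- the branches a Lenstra-style method would explore."""
    verts = polytope_vertices(cons)
    if not verts:
        return []
    vals = [d[0] * x + d[1] * y for x, y in verts]
    lo = math.ceil(min(vals) - 1e-9)
    hi = math.floor(max(vals) + 1e-9)
    return list(range(lo, hi + 1))
```

For the classic thin polytope 0 ≤ x ≤ 10, 0.1 ≤ x − y ≤ 0.9, branching on the variable x yields eleven hyperplanes to explore, while branching along the thin direction d = (1, −1) yields none, proving integer infeasibility at the root.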
77
New algorithms for pathwidth computation / Li, Ming. January 2004 (has links)
The notions of pathwidth and the closely related treewidth have become increasingly important, not only in theory but also in practice. Theoretically, many NP-hard problems become polynomially solvable when restricted to graphs of bounded pathwidth (or treewidth). Practically, pathwidth and treewidth have significant applications in many different fields, such as searching games, VLSI design, and matrix computation. Computing pathwidth is an NP-complete problem for general graphs, but polynomially solvable for treewidth-bounded graphs. However, there is no known practical algorithm to compute pathwidth for treewidth-bounded graphs. In this dissertation, a new algorithm for computing the pathwidth of a treewidth-bounded graph and finding an optimal path-decomposition is presented. This algorithm uses an interval completion technique and the branch-and-bound method to make the pathwidth computation more efficient, practical, and easy to implement. It can also be easily converted to a parallel algorithm. The data structure for implementing this algorithm is presented, and some computational results are shown. Some heuristic methods to approximate pathwidth for general graphs are also given, especially for series-parallel graphs.
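The branch-and-bound idea can be illustrated through the equivalent vertex-separation formulation (pathwidth equals the vertex separation number: the minimum over vertex orderings of the maximum number of already-placed vertices that still have a neighbor among the unplaced ones). The interval-completion machinery of the dissertation is omitted; this exhaustive sketch is usable only on very small graphs.

```python
def pathwidth(adj):
    """Exact vertex-separation search over vertex orderings with a
    simple bound; adj: {v: set(neighbors)}.  Exponential time."""
    n = len(adj)
    best = [n]  # trivial upper bound

    def boundary(prefix):
        # placed vertices that still have a neighbor outside the prefix
        return sum(1 for v in prefix if adj[v] - prefix)

    def search(prefix, width):
        if width >= best[0]:
            return  # bound: this branch cannot improve the incumbent
        if len(prefix) == n:
            best[0] = width
            return
        for v in adj:
            if v not in prefix:
                new = prefix | {v}
                search(new, max(width, boundary(new)))

    search(frozenset(), 0)
    return best[0]
```

A path on four vertices has pathwidth 1, while a four-cycle has pathwidth 2: after any two cycle vertices are placed, both still have unplaced neighbors.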
78
DECOMPOSITION IN NONLINEAR AND STOCHASTIC PROGRAMMING / HSIA, WEI SHEN. January 1973 (has links)
No description available.
79
A ROBUST MODIFICATION OF NEWTON'S METHOD FOR NONLINEAR OPTIMIZATION / GARG, NARESH KUMAR. January 1979 (has links)
No description available.
80
ON SUBJECTIVE DATA IN THE MULTICRITERIA DECISION PROBLEM / HAMMONS, CHARLES BARCLAY. January 1979 (has links)
No description available.