  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
191

Statistical learning for decision making : interpretability, uncertainty, and inference

Letham, Benjamin January 2015 (has links)
Thesis: Ph. D., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2015. / This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. / Cataloged from student-submitted PDF version of thesis. / Includes bibliographical references (pages 183-196). / Data and predictive modeling are an increasingly important part of decision making. Here we present advances in several areas of statistical learning that are important for gaining insight from large amounts of data, and ultimately using predictive models to make better decisions. The first part of the thesis develops methods and theory for constructing interpretable models from association rules. Interpretability is important for decision makers to understand why a prediction is made. First we show how linear mixtures of rules can be used to make sequential predictions. Then we develop Bayesian Rule Lists, a method for learning small, ordered lists of rules. We apply Bayesian Rule Lists to a large database of patient medical histories and produce a simple, interpretable model that solves an important problem in healthcare, with little sacrifice to accuracy. Finally, we prove a uniform generalization bound for decision lists. In the second part of the thesis we focus on decision making from sales transaction data. We develop models and inference procedures for using transaction data to estimate quantities such as willingness-to-pay and lost sales due to stock unavailability. We develop a copula estimation procedure for making optimal bundle pricing decisions. We then develop a Bayesian hierarchical model for inferring demand and substitution behaviors from transaction data with stockouts. We show how posterior sampling can be used to directly incorporate model uncertainty into the decisions that will be made using the model. 
In the third part of the thesis we propose a method for aggregating relevant information from across the Internet to facilitate informed decision making. Our contributions here include an important theoretical result for Bayesian Sets, a popular method for identifying data that are similar to seed examples. We provide a generalization bound that holds for any data distribution, and moreover is independent of the dimensionality of the feature space. This result justifies the use of Bayesian Sets on high-dimensional problems, and also explains its good performance in settings where its underlying independence assumption does not hold. / by Benjamin Letham. / Ph. D.
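The "small, ordered lists of rules" that Bayesian Rule Lists produce can be sketched in a few lines: prediction walks the list and returns the probability attached to the first rule whose condition the input satisfies. The feature names and probabilities below are hypothetical illustrations, not the rule list learned in the thesis.

```python
# Minimal sketch of decision-list prediction (not Letham's implementation).
# Each rule is a (condition, probability) pair; the first match wins.

def predict_rule_list(patient, rule_list, default_prob):
    """Return P(outcome) from the first matching rule, else the default."""
    for condition, prob in rule_list:
        if condition(patient):
            return prob
    return default_prob

# A toy rule list in the spirit of a medical-history application;
# conditions and probabilities here are made up for illustration.
rules = [
    (lambda p: p.get("hemiplegia") and p["age"] > 60, 0.58),
    (lambda p: p.get("cerebrovascular_disorder"), 0.47),
    (lambda p: p.get("transient_ischaemic_attack"), 0.23),
]

risk = predict_rule_list({"age": 72, "hemiplegia": True}, rules, default_prob=0.05)
```

The ordered structure is what makes the model interpretable: the reason for any prediction is exactly one human-readable rule.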
192

The Gomory-Chvátal closure : polyhedrality, complexity, and extensions

Dunkel, Juliane January 2011 (has links)
Thesis (Ph. D.)--Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2011. / Vita. / Cataloged from PDF version of thesis. / Includes bibliographical references (p. 163-166). / In this thesis, we examine theoretical aspects of the Gomory-Chvátal closure of polyhedra. A Gomory-Chvátal cutting plane for a polyhedron P is derived from any rational inequality that is valid for P by shifting the boundary of the associated half-space towards the polyhedron until it intersects an integer point. The Gomory-Chvátal closure of P is the intersection of all half-spaces defined by its Gomory-Chvátal cuts. While it was known that the separation problem for the Gomory-Chvátal closure of a rational polyhedron is NP-hard, we show that this remains true for the family of Gomory-Chvátal cuts for which all coefficients are either 0 or 1. Several combinatorially derived cutting planes belong to this class. Furthermore, as the hyperplanes associated with these cuts have very dense and symmetric lattices of integer points, these cutting planes are in some sense the "simplest" cuts in the set of all Gomory-Chvátal cuts. In the second part of this thesis, we answer a question raised by Schrijver (1980) and show that the Gomory-Chvátal closure of any non-rational polytope is a polytope. Schrijver (1980) had established the polyhedrality of the Gomory-Chvátal closure for rational polyhedra. In essence, his proof relies on the fact that the set of integer points in a rational polyhedral cone is generated by a finite subset of these points. This is not true for non-rational polyhedral cones. Hence, we develop a completely different proof technique to show that the Gomory-Chvátal closure of a non-rational polytope can be described by a finite set of Gomory-Chvátal cuts. Our proof is geometrically motivated and applies classic results from polyhedral theory and the geometry of numbers. 
Last, we introduce a natural modification of Gomory-Chvátal cutting planes for the important class of 0/1 integer programming problems. If the hyperplane associated with a Gomory-Chvátal cut for a polytope P ⊆ [0, 1]ⁿ does not contain any 0/1 point, shifting the hyperplane further towards P until it intersects a 0/1 point guarantees that the resulting half-space contains all feasible solutions. We formalize this observation and introduce the class of M-cuts that arises by strengthening the family of Gomory-Chvátal cuts in this way. We study the polyhedral properties of the resulting closure, its complexity, and the associated cutting plane procedure. / by Juliane Dunkel. / Ph.D.
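The two rounding operations described in this abstract can be made concrete. For a valid inequality a·x ≤ b with integral coefficient vector a, shifting the half-space boundary to the nearest integer point amounts to rounding the right-hand side down; the M-cut for a 0/1 problem pushes it further, to the largest value of a·x actually attained by a 0/1 point. The brute-force enumeration below is only a sketch for tiny instances, not the procedure studied in the thesis.

```python
from itertools import product
from math import floor

def gc_cut(a, b):
    """Gomory-Chvatal cut from the valid inequality a.x <= b, for an
    integral coefficient vector a: round the right-hand side down."""
    return a, floor(b)

def m_cut(a, b):
    """M-cut sketch for 0/1 programs (brute force over all 0/1 points):
    shift the right-hand side down to the largest value of a.x attained
    by some 0/1 point, so the cut's hyperplane touches a 0/1 point."""
    rhs = floor(b)
    attained = {sum(ai * xi for ai, xi in zip(a, x))
                for x in product((0, 1), repeat=len(a))}
    return a, max(v for v in attained if v <= rhs)
```

For example, from 2x₁ + 2x₂ ≤ 3.5 the GC cut gives right-hand side 3, while the M-cut tightens it to 2 (equivalent to x₁ + x₂ ≤ 1), since 2x₁ + 2x₂ only takes even values on 0/1 points.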
193

Integer optimization in data mining / Data mining via integer optimization

Shioda, Romy, 1977- January 2003 (has links)
Thesis (Ph. D.)--Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2003. / Includes bibliographical references (p. 103-107). / While continuous optimization methods have been widely used in statistics and data mining over the last thirty years, integer optimization has had very limited impact in statistical computation. Thus, our objective is to develop a methodology utilizing state-of-the-art integer optimization methods to exploit the discrete character of data mining problems. The thesis consists of two parts: The first part illustrates a mixed-integer optimization method for classification and regression that we call Classification and Regression via Integer Optimization (CRIO). CRIO separates data points into different polyhedral regions. In classification each region is assigned a class, while in regression each region has its own distinct regression coefficients. Computational experimentation with real data sets shows that CRIO is comparable to and often outperforms the current leading methods in classification and regression. The second part describes our cardinality-constrained quadratic mixed-integer optimization algorithm, used to solve subset selection in regression and portfolio selection in asset allocation. We take advantage of the special structures of these problems by implementing a combination of implicit branch-and-bound, Lemke's pivoting method, variable deletion and problem reformulation. Testing against popular heuristic methods and CPLEX 8.0's quadratic mixed-integer solver, we see that our tailored approach to these quadratic variable selection problems has significant advantages over simple heuristics and generalized solvers. / by Romy Shioda. / Ph.D.
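The cardinality-constrained regression problem this abstract refers to is: minimize ‖y − Xβ‖² subject to at most k nonzero coefficients. The thesis attacks it at scale with implicit branch-and-bound; the sketch below is only a brute-force reference for tiny instances, with ordinary least squares solved by hand via the normal equations.

```python
from itertools import combinations

def lstsq(X, y):
    """Ordinary least squares via normal equations + Gaussian elimination."""
    n, p = len(X), len(X[0])
    A = [[sum(X[i][j] * X[i][k] for i in range(n)) for k in range(p)]
         for j in range(p)]                      # A = X^T X
    b = [sum(X[i][j] * y[i] for i in range(n)) for j in range(p)]  # X^T y
    for c in range(p):                           # elimination with pivoting
        piv = max(range(c, p), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, p):
            f = A[r][c] / A[c][c]
            for k in range(c, p):
                A[r][k] -= f * A[c][k]
            b[r] -= f * b[c]
    beta = [0.0] * p
    for r in range(p - 1, -1, -1):               # back substitution
        beta[r] = (b[r] - sum(A[r][k] * beta[k]
                              for k in range(r + 1, p))) / A[r][r]
    return beta

def best_subset(X, y, k):
    """Brute-force reference for min ||y - X beta||^2 with at most k
    nonzero coefficients: enumerate all size-k column subsets."""
    best = None
    for S in combinations(range(len(X[0])), k):
        Xs = [[row[j] for j in S] for row in X]
        beta = lstsq(Xs, y)
        rss = sum((yi - sum(bj * xj for bj, xj in zip(beta, row))) ** 2
                  for row, yi in zip(Xs, y))
        if best is None or rss < best[0]:
            best = (rss, S, beta)
    return best
```

Enumeration is exponential in p, which is exactly why the thesis's tailored branch-and-bound matters for realistic problem sizes.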
194

Revenue management and learning in systems of reusable resources

Owen, Zachary Davis January 2018 (has links)
Thesis: Ph. D., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2018. / This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. / Cataloged from student-submitted PDF version of thesis. / Includes bibliographical references (pages 183-186). / Many problems in revenue management and operations management more generally can be framed as problems of resource allocation. This thesis focuses on developing policies and guarantees for resource allocation problems with reusable resources and on learning models for personalized resource allocation. First, we address the problem of pricing and assortment optimization for reusable resources under time-homogeneous demand. We demonstrate that a simple randomized policy achieves at least one half of the optimal revenue in both the pricing and assortment settings. Further, when prices are fixed a priori, we develop a method to compute the optimal randomized state-independent assortment policy. The performance of our policies is evaluated in numerical experiments based on arrival rate and parking time data from a municipal parking system. Though our algorithms perform well, our computational results suggest that dynamic pricing strategies are of limited value in the face of a consistent demand stream. Motivated in part by the computational results of the previous section, in the second section, we consider the problem of pricing and assortment optimization for reusable resources under time-varying demand. We develop a time-discretization strategy that yields a constant-factor performance guarantee relative to the optimal continuous-time policy. Additionally, we develop heuristic methods that implement a bid-price strategy over available resources, based on pre-computed statistics, that is computable in real time. 
These methods effectively account for the future value of resources, which in turn depends on future patterns of demand. We validate our methods on arrival patterns derived from real arrival rate patterns in a parking context. In the third part, we consider the problem of learning contextual pricing policies more generally. We propose a framework for making personalized pricing decisions based on a multinomial logit model with features built from customer attributes, item attributes, and their interactions. We demonstrate that our modeling procedure is coherent, and in the well-specified setting we demonstrate finite-sample bounds on the performance of our strategy based on the size of the training data. / by Zachary Davis Owen. / Ph. D.
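The multinomial logit model mentioned above has a standard closed form: each item's choice probability is its exponentiated utility divided by the sum over all options, including a no-purchase option of utility zero. The sketch below uses a simple utility of the form (base utility − price sensitivity × price); the thesis's feature-based utilities over customer and item attributes are richer than this.

```python
from math import exp

def mnl_choice_probs(utilities):
    """Multinomial logit choice probabilities, with an outside
    (no-purchase) option of utility 0 included in the denominator."""
    w = [exp(u) for u in utilities]
    denom = 1.0 + sum(w)
    return [wi / denom for wi in w]

def expected_revenue(prices, base_utils, price_sens):
    """Expected revenue from one customer under the hypothetical linear
    utility u_j = base_utils[j] - price_sens * prices[j]."""
    utils = [b - price_sens * p for b, p in zip(base_utils, prices)]
    probs = mnl_choice_probs(utils)
    return sum(p * q for p, q in zip(prices, probs))
```

Personalization enters by letting the base utilities and price sensitivity depend on the customer's features, so the revenue-maximizing price differs across customers.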
195

Dynamic prediction of terminal-area severe convective weather penetration

Schonfeld, Daniel (Daniel Ryan) January 2015 (has links)
Thesis: S.M., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2015. / This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. / Cataloged from student-submitted PDF version of thesis. / Includes bibliographical references (pages [110]-112). / Despite groundbreaking technology and revised operating procedures designed to improve the safety of air travel, numerous aviation accidents still occur every year. According to a recent report by the FAA's Aviation Weather Research Program, over 23% of these accidents are weather-related, typically taking place during the takeoff and landing phases. When pilots fly through severe convective weather, regardless of whether or not an accident occurs, they cause damage to the aircraft, increasing maintenance cost for airlines. These concerns, coupled with the growing demand for air transportation, put an enormous amount of pressure on the existing air traffic control system. Moreover, the degree to which weather impacts airspace capacity, defined as the number of aircraft that can simultaneously fly within the terminal area, is not well understood. Understanding how weather impacts terminal area air traffic flows will be important for quantifying the effect that uncertainty in weather forecasting has on flows, and developing an optimal strategy to mitigate this effect. In this thesis, we formulate semi-dynamic models and employ Multinomial Logistic Regression, Classification and Regression Trees (CART), and Random Forests to accurately predict the severity of convective weather penetration by flights in several U.S. airport terminal areas. Our models perform consistently well when re-trained on each individual airport rather than using common models across airports. 
Random Forests achieve the lowest prediction error with accuracies as high as 99%, false negative rates as low as 1%, and false positive rates as low as 3%. CART is the least sensitive to differences across airports, exhibiting very steady performance. We also identify weather-based features, particularly those describing the presence of fast-moving, severe convective weather within the projected trajectory of the flight, as the best predictors of future penetration. / by Daniel Schonfeld. / S.M.
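The three evaluation metrics quoted above (accuracy, false negative rate, false positive rate) are computed from the binary confusion matrix; a minimal sketch, assuming label 1 means "flight penetrated severe weather":

```python
def classification_rates(y_true, y_pred):
    """Accuracy, false negative rate, and false positive rate for a
    binary classifier (1 = penetration, 0 = no penetration)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    accuracy = (tp + tn) / len(y_true)
    fnr = fn / (fn + tp) if fn + tp else 0.0   # missed penetrations
    fpr = fp / (fp + tn) if fp + tn else 0.0   # false alarms
    return accuracy, fnr, fpr
```

In this setting false negatives (missed penetrations) and false positives (false alarms) carry different operational costs, which is why the abstract reports them separately rather than accuracy alone.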
196

Fairness in operations : from theory to practice / Fairness in operations.

Trichakis, Nikolaos K January 2011 (has links)
Thesis (Ph. D.)--Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2011. / Cataloged from PDF version of thesis. / Includes bibliographical references (p. 131-136). / This thesis deals with two basic issues in resource allocation problems. The first issue pertains to how one approaches the problem of designing the "right" objective for a given resource allocation problem. The notion of what is "right" can be fairly nebulous; we consider two issues that we see as key: efficiency and fairness. We approach the problem of designing objectives that account for the natural tension between efficiency and fairness in the context of a framework that captures a number of problems of interest to operations managers. We state a precise version of the design problem, provide a quantitative understanding of the tradeoff between efficiency and fairness inherent to this design problem and demonstrate the approach in a case study that considers air traffic management. Secondly, we deal with the issue of designing implementable policies that serve such objectives, balancing efficiency and fairness in practice. We do so specifically in the context of organ allocation for transplantation. In particular, we propose a scalable, data-driven method for designing national policies for the allocation of deceased donor kidneys to patients on a waiting list, in a fair and efficient way. We focus on policies that have the same form as the one currently used in the U.S., that is, policies based on a point system, which ranks patients according to some priority criteria, e.g., waiting time, medical urgency, etc., or a combination thereof. Rather than making specific assumptions about fairness principles or priority criteria, our method offers the designer the flexibility to select his desired criteria and fairness constraints from a broad class of allowable constraints. 
The method then designs a point system that is based on the selected priority criteria, and approximately maximizes medical efficiency, i.e., life year gains from transplant, while simultaneously enforcing selected fairness constraints. Using our method, we design a point system that has the same form, uses the same criteria and satisfies the same fairness constraints as the point system that was recently proposed by U.S. policymakers. In addition, the point system we design delivers an 8% increase in extra life year gains. We evaluate the performance of all policies under consideration using the same statistical and simulation tools and data as the U.S. policymakers use. We perform a sensitivity analysis which demonstrates that the increase in extra life year gains by relaxing certain fairness constraints can be as high as 30%. / by Nikolaos K. Trichakis. / Ph.D.
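A point system of the kind described above is just a weighted sum of priority criteria, with candidates ranked by descending score. The criteria names and weights below are hypothetical placeholders, not the fitted values from the thesis or the U.S. policy.

```python
def kidney_points(candidate, weights):
    """Score a waitlist candidate as a weighted sum of priority criteria;
    missing criteria contribute zero. Weights here are illustrative."""
    return sum(weights[c] * candidate.get(c, 0.0) for c in weights)

# Hypothetical criteria/weights for illustration only.
weights = {"waiting_years": 1.0, "prior_donor": 4.0, "cpra_sensitization": 2.0}

waitlist = [
    {"id": "A", "waiting_years": 5.0},
    {"id": "B", "waiting_years": 1.0, "prior_donor": 1.0,
     "cpra_sensitization": 0.8},
]
ranked = sorted(waitlist, key=lambda c: kidney_points(c, weights),
                reverse=True)
```

The design problem the thesis solves is choosing such weights so that expected life year gains are (approximately) maximized subject to the selected fairness constraints.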
197

New applications in Revenue Management / New applications in RM

Thraves Cortés-Monroy, Charles Mark January 2017 (has links)
Thesis: Ph. D., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2017. / Cataloged from PDF version of thesis. / Includes bibliographical references. / Revenue Management (RM) is an area with important advances in theory and practice in the last thirty years. This thesis presents three different new applications in RM with a focus on: the firms' perspective, the government's perspective as a policy maker, and the consumers' perspective (in terms of welfare). In this thesis, we first present a two-part tariff pricing problem faced by a satellite data provider. We estimate unobserved data with parametric density functions in order to generate instances of the problem. We propose a mixed integer programming formulation for pricing. As the problem is hard to solve, we propose heuristics that make use of the MIP formulation together with intrinsic properties of the problem. Furthermore, we contrast this approach with a dynamic programming approach. Both methodologies outperform the current pricing strategy of the satellite provider, even assuming misspecifications in the assumptions made. Subsequently, we study how the government can encourage green technology adoption through a rebate to consumers. We model this setting as a Stackelberg game in which the government gives a rebate to consumers in the first stage and firms then interact in a price-setting competing newsvendor problem. We show the trade-off in social welfare that arises when the government pursues an adoption target instead of a utilitarian objective. Then, we study the impact of competition and demand uncertainty on the three agents involved: firms, government, and consumers. This thesis recognizes the need to measure consumers' welfare for multiple items under demand uncertainty. As a result, this thesis builds on existing theory in order to incorporate demand uncertainty into Consumer Surplus. 
In many settings, produced quantities might not meet the realized demand at a given market price. This comes as an obstacle in the computation of consumer surplus. To address this, we define the concept of an allocation rule. In addition, we study the impact of uncertainty on consumers for different demand noise (additive and multiplicative) and for various allocation rules. / by Charles Mark Thraves Cortés-Monroy. / Ph. D.
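The role of an allocation rule can be made concrete with linear inverse demand v(q) = a − bq: when supply falls short of demand at price p, consumer surplus depends on which consumers get the scarce units. The two rules below (highest-value-first vs. uniform random rationing) are standard textbook examples of allocation rules, not necessarily the ones analyzed in the thesis.

```python
def cs_efficient(a, b, p, q_supplied):
    """Consumer surplus under linear inverse demand v(q) = a - b*q when
    only q_supplied units sell at price p and the highest-value
    consumers are served first."""
    demand = max((a - p) / b, 0.0)
    q = min(q_supplied, demand)
    return (a - p) * q - 0.5 * b * q * q   # area between v(q) and p

def cs_random(a, b, p, q_supplied):
    """Same market, but scarce units are rationed uniformly at random
    among all consumers willing to buy at price p."""
    demand = max((a - p) / b, 0.0)
    if demand == 0.0:
        return 0.0
    fill = min(q_supplied / demand, 1.0)    # fraction of demand served
    full_cs = (a - p) * demand - 0.5 * b * demand * demand
    return fill * full_cs
```

With a = 10, b = 1, p = 4 and only 3 of the 6 demanded units supplied, efficient allocation yields surplus 13.5 while random rationing yields 9 — the wedge the allocation rule concept is designed to capture.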
198

A data-driven approach to mitigate risk in global food supply chains

Anoun, Amine January 2017 (has links)
Thesis: S.M., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2017. / Cataloged from PDF version of thesis. / Includes bibliographical references (pages 141-143). / Economically motivated adulteration of imported food poses a serious threat to public health, and has contributed to several poisoning incidents in the past few years in the U.S. [1]. Prevention is achieved by sampling food shipments coming to the U.S. However, the sampling resources are limited: all shipments are electronically sampled [2], but only a small percentage of shipments are physically inspected. In an effort to mitigate risk in shipping supply chains, we develop a data-driven approach to identify risky shippers and manufacturers exporting food to the U.S., as well as U.S. based consignees and importers receiving imported products. We focus our analysis on honey and shrimp, two products that are routinely imported and frequently adulterated. We obtain over 62,000 bills of lading of honey entering the U.S. between 2006 and 2015 from public sources, and over a million shipment records of shrimp entering the U.S. between 2007 and 2015 from the Food and Drug Administration (FDA). We analyze these data to identify common patterns between high risk shippers, manufacturers, U.S. consignees and importers, and use them to determine structural features of shipping supply chains that correlate with risk of adulteration. In our analysis of shrimp manufacturers, we distinguish two types of adulteration: intentional (driven by economic motivation) and unintentional (due to negligence or poor sanitary conditions). We use a Bayesian approach to model both the sampling or inspection procedure of the FDA, and the risk of adulteration. Our model is able to predict which companies are at risk of committing adulteration with high out-of-sample accuracy. 
We find that both geographical features (e.g., travel route, country of origin and transnational paths) and network features (e.g., number of partners, weight dispersion and diversity of the product portfolio) are significant and predictive of suspicious behavior. These outcomes can inform various decisions faced by the FDA in their sampling policy for honey and shrimp shipments, and their site inspection policy for consignees and importers. This work can also extend to other commodities with similar mechanisms, and provides a general framework to better detect food safety failures and mitigate risk in food supply chains. / by Amine Anoun. / S.M.
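The flavor of the Bayesian risk estimate can be conveyed with the simplest possible version: a Beta prior over a company's adulteration probability updated by Binomial inspection outcomes. This is a textbook Beta-Binomial sketch, far simpler than the thesis's model, which also accounts for the FDA's targeted (non-random) sampling.

```python
def posterior_risk(prior_a, prior_b, n_inspected, n_violations):
    """Posterior mean adulteration probability under a Beta(a, b) prior
    and n_violations positives out of n_inspected inspections."""
    return (prior_a + n_violations) / (prior_a + prior_b + n_inspected)
```

With a Beta(1, 9) prior (mean 0.1) and 3 violations in 10 inspections (raw rate 0.3), the posterior mean is 0.2: the estimate shrinks toward the prior, which stabilizes rankings for companies with few inspections.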
199

Joint pricing and inventory decision for competitive products

Ye, Kelly (Kelly Yunqing) January 2008 (has links)
Thesis (S.M.)--Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2008. / Includes bibliographical references (p. 37-38). / We consider the joint pricing and inventory decision problem for a single retailer who orders, stocks and sells multiple products. The products are competitive in nature, e.g., these may be similar products from multiple brands. Demand for a product depends on its price as well as the price of all competing products. We show that the optimal pricing and inventory policy is similar to the base-stock, list-price policy which is known to be optimal for the single product case. In addition, the base-stock level of each product is nonincreasing with the inventory level of other products. This structure suggests that one can improve profit by simultaneously managing all the products rather than managing each product independently of other products. / by Kelly (Yunqing) Ye. / S.M.
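The single-product base-stock, list-price policy mentioned above has a simple structure: order up to the base-stock level when below it and charge the list price; when overstocked, order nothing and discount. The proportional discount below is an illustrative choice only — the optimal markdown schedule is not proportional in general.

```python
def base_stock_list_price(inventory, base_stock, list_price, discount_rate):
    """Single-product base-stock, list-price policy sketch: returns the
    order quantity and the price to charge this period. The proportional
    overstock discount is a hypothetical simplification."""
    if inventory < base_stock:
        order = base_stock - inventory       # order up to base stock
        price = list_price                   # charge the list price
    else:
        order = 0                            # overstocked: order nothing
        price = list_price - discount_rate * (inventory - base_stock)
    return order, price
```

The thesis's multi-product result says the analogue of `base_stock` for each product should be nonincreasing in the inventory levels of the competing products, which is why joint management beats managing each product in isolation.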
200

The minority achievement gap in a suburban school district

Chandler, Lincoln J., 1977- January 2008 (has links)
Thesis (Ph. D.)--Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2008. / Includes bibliographical references (p. 189-192). / For many decades, the American educational system has yielded significant differences in achievement among students in different racial groups, a phenomenon commonly known as the "Achievement Gap". Despite the volume of research devoted to studying achievement gaps, school administrators faced with the challenge of reducing these gaps have had limited success. There are a number of factors, regarding the individual, the school, and the setting, which can contribute to achievement gaps, but in a particular community, the prevalence of such factors, and their individual contribution to the gap, is unclear. In this dissertation, we employ a variety of statistical methods that provide a bridge between large-scale studies of achievement gaps and the analyses necessary to address the needs of a single community. First, we establish a collection of metrics designed to measure relative and absolute differences in achievement, for groups of arbitrary size and distribution. Using data from a middle-class, racially integrated school district, we employ these metrics to measure the magnitude of the achievement gap for individual students from grades three through eight. We also assess the potential role of previously-identified correlates of low achievement, such as poverty and student mobility. Last, we evaluate the potential impact of strategies for narrowing the gap. / by Lincoln Jamond Chandler. / Ph.D.
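Two of the simplest metrics in the family the dissertation describes are an absolute gap (difference in group mean scores) and a relative gap (difference in the share of each group at or above a proficiency threshold). The sketch below illustrates only these two basic forms, not the full collection of metrics developed in the dissertation.

```python
def achievement_gaps(scores_a, scores_b, threshold):
    """Absolute gap: difference in mean scores between two groups.
    Relative gap: difference in the proportion of each group scoring
    at or above a proficiency threshold."""
    mean_a = sum(scores_a) / len(scores_a)
    mean_b = sum(scores_b) / len(scores_b)
    prof_a = sum(s >= threshold for s in scores_a) / len(scores_a)
    prof_b = sum(s >= threshold for s in scores_b) / len(scores_b)
    return mean_a - mean_b, prof_a - prof_b
```

The two metrics can disagree (a threshold-based gap can shrink while the mean gap grows), which is one reason the dissertation argues for a collection of metrics rather than a single summary number.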
