251.
Integer optimization in data mining / Data mining via integer optimization. Shioda, Romy, 1977-. January 2003 (has links)
Thesis (Ph. D.)--Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2003. / Includes bibliographical references (p. 103-107). / While continuous optimization methods have been widely used in statistics and data mining over the last thirty years, integer optimization has had very limited impact in statistical computation. Thus, our objective is to develop a methodology utilizing state-of-the-art integer optimization methods to exploit the discrete character of data mining problems. The thesis consists of two parts: The first part illustrates a mixed-integer optimization method for classification and regression that we call Classification and Regression via Integer Optimization (CRIO). CRIO separates data points into different polyhedral regions. In classification each region is assigned a class, while in regression each region has its own distinct regression coefficients. Computational experimentation with real data sets shows that CRIO is comparable to and often outperforms the current leading methods in classification and regression. The second part describes our cardinality-constrained quadratic mixed-integer optimization algorithm, used to solve subset selection in regression and portfolio selection in asset allocation. We take advantage of the special structures of these problems by implementing a combination of implicit branch-and-bound, Lemke's pivoting method, variable deletion and problem reformulation. Testing against popular heuristic methods and CPLEX 8.0's quadratic mixed-integer solver, we see that our tailored approach to these quadratic variable selection problems has significant advantages over simple heuristics and generalized solvers. / by Romy Shioda. / Ph.D.
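The cardinality-constrained subset-selection problem attacked in the second part can be stated compactly: choose at most k predictors so as to minimize the residual sum of squares. The brute-force sketch below illustrates the combinatorial structure only; the thesis solves the same problem at scale with implicit branch-and-bound and Lemke's pivoting method, and all names here are illustrative, not the thesis's implementation:

```python
import itertools

def solve_normal_equations(X, y):
    # Least squares via the normal equations: build A = X^T X, b = X^T y,
    # then solve A beta = b by Gaussian elimination with partial pivoting.
    n, p = len(X), len(X[0])
    A = [[sum(X[i][j] * X[i][k] for i in range(n)) for k in range(p)] for j in range(p)]
    b = [sum(X[i][j] * y[i] for i in range(n)) for j in range(p)]
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for k in range(col, p):
                A[r][k] -= f * A[col][k]
            b[r] -= f * b[col]
    beta = [0.0] * p
    for r in range(p - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][k] * beta[k] for k in range(r + 1, p))) / A[r][r]
    return beta

def rss(X, y, beta):
    # Residual sum of squares for a fitted coefficient vector.
    return sum((y[i] - sum(X[i][j] * beta[j] for j in range(len(beta)))) ** 2
               for i in range(len(y)))

def best_subset(X, y, k):
    # Enumerate every predictor subset of size k and keep the lowest-RSS fit.
    # This is exponential in k; branch-and-bound prunes most of this tree.
    p = len(X[0])
    best = None
    for S in itertools.combinations(range(p), k):
        Xs = [[row[j] for j in S] for row in X]
        beta = solve_normal_equations(Xs, y)
        r = rss(Xs, y, beta)
        if best is None or r < best[0]:
            best = (r, S, beta)
    return best
```

On data generated as y = 2·x0 + 3·x2, the size-2 search recovers the subset (0, 2) with near-zero residual, which is the behavior a branch-and-bound solver reproduces without full enumeration.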
252.
Revenue management and learning in systems of reusable resources. Owen, Zachary Davis. January 2018 (has links)
Thesis: Ph. D., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2018. / This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. / Cataloged from student-submitted PDF version of thesis. / Includes bibliographical references (pages 183-186). / Many problems in revenue management and operations management more generally can be framed as problems of resource allocation. This thesis focuses on developing policies and guarantees for resource allocation problems with reusable resources and on learning models for personalized resource allocation. First, we address the problem of pricing and assortment optimization for reusable resources under time-homogeneous demand. We demonstrate that a simple randomized policy achieves at least one half of the optimal revenue in both the pricing and assortment settings. Further, when prices are fixed a priori, we develop a method to compute the optimal randomized state-independent assortment policy. The performance of our policies is evaluated in numerical experiments based on arrival rate and parking time data from a municipal parking system. Though our algorithms perform well, our computational results suggest that dynamic pricing strategies are of limited value in the face of a consistent demand stream. Motivated in part by the computational results of the previous section, in the second section, we consider the problem of pricing and assortment optimization for reusable resources under time-varying demand. We develop a time-discretization strategy that yields a constant-factor performance guarantee relative to the optimal continuous-time policy. Additionally, we develop heuristic methods that implement a bid-price strategy across available resources, based on pre-computed statistics, that can be applied in real time.
These methods effectively account for the future value of resources, which in turn depends on future patterns of demand. We validate our methods on arrival patterns derived from real arrival rate patterns in a parking context. In the third part, we consider the problem of learning contextual pricing policies more generally. We propose a framework for making personalized pricing decisions based on a multinomial logit model with features based on customer attributes, item attributes, and their interactions. We demonstrate that our modeling procedure is coherent, and, in the well-specified setting, we establish finite-sample bounds on the performance of our strategy based on the size of the training data. / by Zachary Davis Owen. / Ph. D.
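The personalized-pricing model in the third part rests on multinomial logit choice probabilities. A minimal sketch of that computation is below, with a linear utility in price, customer features, item features, and one interaction term; the parametrization, parameter names, and numbers are assumptions for illustration, not the thesis's actual specification:

```python
import math

def mnl_probabilities(utilities):
    # Multinomial logit choice probabilities; subtracting the max
    # before exponentiating avoids overflow.
    m = max(utilities)
    w = [math.exp(u - m) for u in utilities]
    total = sum(w)
    return [x / total for x in w]

def utility(price, customer_feats, item_feats, theta):
    # Hypothetical linear utility: a price term plus customer, item,
    # and customer-item interaction features.
    u = theta["price"] * price
    u += sum(a * b for a, b in zip(theta["customer"], customer_feats))
    u += sum(a * b for a, b in zip(theta["item"], item_feats))
    u += theta["interaction"] * sum(a * b for a, b in zip(customer_feats, item_feats))
    return u
```

With the no-purchase option's utility normalized to zero, the purchase probability is `mnl_probabilities([0.0, u])[1]`, and it rises as the offered price falls, which is the lever a personalized-pricing optimizer exploits.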
253.
Dynamic prediction of terminal-area severe convective weather penetration. Schonfeld, Daniel (Daniel Ryan). January 2015 (has links)
Thesis: S.M., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2015. / This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. / Cataloged from student-submitted PDF version of thesis. / Includes bibliographical references (pages [110]-112). / Despite groundbreaking technology and revised operating procedures designed to improve the safety of air travel, numerous aviation accidents still occur every year. According to a recent report by the FAA's Aviation Weather Research Program, over 23% of these accidents are weather-related, typically taking place during the takeoff and landing phases. When pilots fly through severe convective weather, regardless of whether or not an accident occurs, they cause damage to the aircraft, increasing maintenance cost for airlines. These concerns, coupled with the growing demand for air transportation, put an enormous amount of pressure on the existing air traffic control system. Moreover, the degree to which weather impacts airspace capacity, defined as the number of aircraft that can simultaneously fly within the terminal area, is not well understood. Understanding how weather impacts terminal area air traffic flows will be important for quantifying the effect that uncertainty in weather forecasting has on flows, and developing an optimal strategy to mitigate this effect. In this thesis, we formulate semi-dynamic models and employ Multinomial Logistic Regression, Classification and Regression Trees (CART), and Random Forests to accurately predict the severity of convective weather penetration by flights in several U.S. airport terminal areas. Our models perform consistently well when re-trained on each individual airport rather than using common models across airports.
Random Forests achieve the lowest prediction error with accuracies as high as 99%, false negative rates as low as 1%, and false positive rates as low as 3%. CART is the least sensitive to differences across airports, exhibiting very steady performance. We also identify weather-based features, particularly those describing the presence of fast-moving, severe convective weather within the projected trajectory of the flight, as the best predictors of future penetration. / by Daniel Schonfeld. / S.M.
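A CART tree is grown by repeatedly choosing the split that most reduces impurity. The single-feature threshold search below is a stdlib-only sketch of that building block, assuming distinct feature values; the thesis uses full CART and Random Forest implementations, and the feature here (storm speed) and its values are made up:

```python
def gini(labels):
    # Gini impurity for binary labels (0/1).
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2.0 * p * (1.0 - p)

def best_split(xs, ys):
    # Scan midpoints between consecutive sorted feature values and return
    # (weighted impurity, threshold) for the best split.
    best = (float("inf"), None)
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    for j in range(1, len(order)):
        t = 0.5 * (xs[order[j - 1]] + xs[order[j]])
        left = [ys[i] for i in order[:j]]
        right = [ys[i] for i in order[j:]]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[0]:
            best = (score, t)
    return best
```

On a toy sample where only fast-moving storms are penetrated, the search finds a perfectly separating speed threshold (impurity 0), mirroring the finding that fast-moving severe weather in the projected trajectory is the strongest predictor.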
254.
Fairness in operations : from theory to practice / Fairness in operations. Trichakis, Nikolaos K. January 2011 (has links)
Thesis (Ph. D.)--Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2011. / Cataloged from PDF version of thesis. / Includes bibliographical references (p. 131-136). / This thesis deals with two basic issues in resource allocation problems. The first issue pertains to how one approaches the problem of designing the "right" objective for a given resource allocation problem. The notion of what is "right" can be fairly nebulous; we consider two issues that we see as key: efficiency and fairness. We approach the problem of designing objectives that account for the natural tension between efficiency and fairness in the context of a framework that captures a number of problems of interest to operations managers. We state a precise version of the design problem, provide a quantitative understanding of the tradeoff between efficiency and fairness inherent to this design problem and demonstrate the approach in a case study that considers air traffic management. Secondly, we deal with the issue of designing implementable policies that serve such objectives, balancing efficiency and fairness in practice. We do so specifically in the context of organ allocation for transplantation. In particular, we propose a scalable, data-driven method for designing national policies for the allocation of deceased donor kidneys to patients on a waiting list, in a fair and efficient way. We focus on policies that have the same form as the one currently used in the U.S., that is, policies based on a point system, which ranks patients according to some priority criteria, e.g., waiting time, medical urgency, etc., or a combination thereof. Rather than making specific assumptions about fairness principles or priority criteria, our method offers the designer the flexibility to select his desired criteria and fairness constraints from a broad class of allowable constraints.
The method then designs a point system that is based on the selected priority criteria, and approximately maximizes medical efficiency, i.e., life year gains from transplant, while simultaneously enforcing selected fairness constraints. Using our method, we design a point system that has the same form, uses the same criteria and satisfies the same fairness constraints as the point system that was recently proposed by U.S. policymakers. In addition, the point system we design delivers an 8% increase in extra life year gains. We evaluate the performance of all policies under consideration using the same statistical and simulation tools and data as the U.S. policymakers use. We perform a sensitivity analysis which demonstrates that the increase in extra life year gains by relaxing certain fairness constraints can be as high as 30%. / by Nikolaos K. Trichakis. / Ph.D.
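Mechanically, a point system of the kind described ranks waiting-list patients by a weighted sum of priority criteria. The sketch below shows only that mechanism; the criteria names and weights are hypothetical and reflect neither the U.S. policy nor the thesis's optimized design:

```python
def priority_score(patient, weights):
    # Weighted sum of priority criteria (hypothetical criteria and weights).
    return sum(w * patient[criterion] for criterion, w in weights.items())

def rank_waitlist(patients, weights):
    # Highest score first; the next available kidney is offered to the
    # top-ranked compatible patient.
    return sorted(patients, key=lambda p: priority_score(p, weights), reverse=True)
```

The design problem the thesis solves is choosing the weights: the point system's scores must approximately maximize life-year gains while the induced allocation satisfies the selected fairness constraints.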
255.
New applications in Revenue Management / New applications in RM. Thraves Cortés-Monroy, Charles Mark. January 2017 (has links)
Thesis: Ph. D., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2017. / Cataloged from PDF version of thesis. / Includes bibliographical references. / Revenue Management (RM) is an area with important advances in theory and practice in the last thirty years. This thesis presents three different new applications in RM with a focus on: the firms' perspective, the government's perspective as a policy maker, and the consumers' perspective (in terms of welfare). In this thesis, we first present a two-part tariff pricing problem faced by a satellite data provider. We estimate unobserved data with parametric density functions in order to generate instances of the problem. We propose a mixed integer programming formulation for pricing. As the problem is hard to solve, we propose heuristics that make use of the MIP formulation together with intrinsic properties of the problem. Furthermore, we contrast this approach with a dynamic programming approach. Both methodologies outperform the current pricing strategy of the satellite provider, even assuming misspecifications in the assumptions made. Subsequently, we study how the government can encourage green technology adoption through a rebate to consumers. We model this setting as a Stackelberg game in which the government gives a rebate to consumers in the first stage and firms then interact in a price-setting competing-newsvendor problem. We show the trade-off in social welfare when the government pursues an adoption target instead of a utilitarian objective. Then, we study the impact of competition and demand uncertainty on the three agents involved: firms, government, and consumers. This thesis recognizes the need to measure consumers' welfare for multiple items under demand uncertainty. As a result, this thesis builds on existing theory in order to incorporate demand uncertainty in Consumer Surplus.
In many settings, produced quantities might not meet the realized demand at a given market price, which is an obstacle to computing consumer surplus. To address this, we define the concept of an allocation rule. In addition, we study the impact of uncertainty on consumers for different demand noise (additive and multiplicative) and for various allocation rules. / by Charles Mark Thraves Cortés-Monroy. / Ph. D.
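To make the surplus computation concrete: with linear demand d = a − b·p + ε and additive noise ε, realized surplus is the usual triangle under the inverse demand curve when supply covers demand; when it does not, an allocation rule decides who is served. The sketch below uses proportional rationing (each consumer served with probability q/d) and averages over noise draws; the functional forms and numbers are illustrative assumptions, not the thesis's model:

```python
def realized_surplus(a, b, price, supply, eps):
    # Linear demand d = a - b*price + eps; full surplus is the triangle
    # area d^2 / (2b). Under proportional rationing, each consumer is
    # served with probability supply/d, scaling surplus by that factor.
    d = max(0.0, a - b * price + eps)
    full = d * d / (2.0 * b)
    if d <= supply or d == 0.0:
        return full
    return full * (supply / d)

def expected_surplus(a, b, price, supply, noises):
    # Average realized surplus over a set of additive noise draws.
    return sum(realized_surplus(a, b, price, supply, e) for e in noises) / len(noises)
```

Swapping in a different allocation rule (e.g., serving the highest-valuation consumers first) changes only the rationed branch, which is exactly the degree of freedom the allocation-rule concept isolates.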
256.
A data-driven approach to mitigate risk in global food supply chains. Anoun, Amine. January 2017 (has links)
Thesis: S.M., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2017. / Cataloged from PDF version of thesis. / Includes bibliographical references (pages 141-143). / Economically motivated adulteration of imported food poses a serious threat to public health, and has contributed to several poisoning incidents in the past few years in the U.S. [1]. Prevention is achieved by sampling food shipments coming to the U.S. However, the sampling resources are limited: all shipments are electronically sampled [2], but only a small percentage of shipments are physically inspected. In an effort to mitigate risk in shipping supply chains, we develop a data-driven approach to identify risky shippers and manufacturers exporting food to the U.S., as well as U.S. based consignees and importers receiving imported products. We focus our analysis on honey and shrimp, two products that are routinely imported and frequently adulterated. We obtain over 62,000 bills of lading of honey entering the U.S. between 2006 and 2015 from public sources, and over a million shipment records of shrimp entering the U.S. between 2007 and 2015 from the Food and Drug Administration (FDA). We analyze these data to identify common patterns between high risk shippers, manufacturers, U.S. consignees and importers, and use them to determine structural features of shipping supply chains that correlate with risk of adulteration. In our analysis of shrimp manufacturers, we distinguish two types of adulteration: intentional (driven by economic motivation) and unintentional (due to negligence or poor sanitary conditions). We use a Bayesian approach to model both the sampling or inspection procedure of the FDA, and the risk of adulteration. Our model is able to predict which companies are at risk of committing adulteration with high out-of-sample accuracy.
We find that both geographical features (e.g., travel route, country of origin and transnational paths) and network features (e.g., number of partners, weight dispersion and diversity of the product portfolio) are significant and predictive of suspicious behavior. These outcomes can inform various decisions faced by the FDA in their sampling policy for honey and shrimp shipments, and their site inspection policy for consignees and importers. This work can also extend to other commodities with similar mechanisms, and provides a general framework to better detect food safety failures and mitigate risk in food supply chains. / by Amine Anoun. / S.M.
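One simple way to combine sparse inspection outcomes with a prior belief, in the spirit of the Bayesian model described, is a Beta-Binomial update: a shipper's posterior adulteration probability shrinks toward the prior when it has few inspections. This is a deliberately simplified stand-in for the thesis's model, and the prior parameters are arbitrary:

```python
def posterior_risk(violations, inspections, alpha=1.0, beta=4.0):
    # Posterior mean of a Beta(alpha, beta) prior on the adulteration
    # probability, updated with `violations` positives out of
    # `inspections` Binomial inspection trials.
    return (alpha + violations) / (alpha + beta + inspections)
```

Under this update, a shipper with 2 violations in 2 inspections looks far riskier than one with 2 in 100, and a never-inspected shipper sits at the prior mean, which is the shrinkage behavior that makes ranking with limited sampling resources workable.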
257.
Joint pricing and inventory decision for competitive products. Ye, Kelly (Kelly Yunqing). January 2008 (has links)
Thesis (S.M.)--Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2008. / Includes bibliographical references (p. 37-38). / We consider the joint pricing and inventory decision problem for a single retailer who orders, stocks and sells multiple products. The products are competitive in nature, e.g., they may be similar products from multiple brands. Demand for a product depends on its price as well as the price of all competing products. We show that the optimal pricing and inventory policy is similar to the base-stock, list-price policy which is known to be optimal for the single product case. In addition, the base-stock level of each product is nonincreasing with the inventory level of other products. This structure suggests that one can improve profit by simultaneously managing all the products rather than managing each product independently of other products. / by Kelly (Yunqing) Ye. / S.M.
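For a single product, the base-stock, list-price policy referenced above has a simple shape: at or below the base-stock level, order up to it and charge the list price; above it, order nothing and mark the price down. The sketch below is a hypothetical single-product version with an assumed markdown schedule; the thesis treats the multi-product case, where each base-stock level also falls as competitors' inventories rise:

```python
def base_stock_list_price(inventory, base_stock, list_price, markdown=0.05):
    # Returns (order_quantity, price). The linear markdown in overstock
    # is an assumed functional form for illustration only.
    if inventory <= base_stock:
        return base_stock - inventory, list_price
    overstock = inventory - base_stock
    return 0, list_price * max(0.0, 1.0 - markdown * overstock)
```

In the multi-product setting the base-stock argument itself would be a function of the other products' inventory levels, which is what couples the ordering decisions across brands.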
258.
The minority achievement gap in a suburban school district. Chandler, Lincoln J., 1977-. January 2008 (has links)
Thesis (Ph. D.)--Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2008. / Includes bibliographical references (p. 189-192). / For many decades, the American educational system has yielded significant differences in achievement among students in different racial groups, a phenomenon commonly known as the "Achievement Gap". Despite the volume of research devoted to studying achievement gaps, school administrators faced with the challenge of reducing these gaps have had limited success. There are a number of factors, regarding the individual, the school, and the setting, which can contribute to achievement gaps, but in a particular community, the prevalence of such factors, and their individual contribution to the gap, is unclear. In this dissertation, we employ a variety of statistical methods that provide a bridge between large-scale studies of achievement gaps and the analyses necessary to address the needs of a single community. First, we establish a collection of metrics designed to measure relative and absolute differences in achievement, for groups of arbitrary size and distribution. Using data from a middle-class, racially integrated school district, we employ these metrics to measure the magnitude of the achievement gap for individual students from grades three through eight. We also assess the potential role of previously-identified correlates of low achievement, such as poverty and student mobility. Last, we evaluate the potential impact of strategies for narrowing the gap. / by Lincoln Jamond Chandler. / Ph.D.
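Two of the simplest metrics in the family the dissertation formalizes are the absolute gap (difference in mean scores) and a standards-based gap (difference in the share of students reaching a proficiency cutoff). A stdlib sketch with made-up scores, purely to show the two definitions:

```python
def absolute_gap(group_a, group_b):
    # Difference in mean scores between two groups.
    return sum(group_a) / len(group_a) - sum(group_b) / len(group_b)

def proficiency_gap(group_a, group_b, cutoff):
    # Difference in the share of students at or above a proficiency cutoff.
    share = lambda g: sum(1 for s in g if s >= cutoff) / len(g)
    return share(group_a) - share(group_b)
```

The two metrics can disagree, e.g., when one group's distribution is wider, which is one reason a collection of metrics, rather than a single number, is needed for groups of arbitrary size and distribution.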
259.
Sequential data inference via matrix estimation : causal inference, cricket and retail. Amjad, Muhammad Jehangir. January 2018 (has links)
Thesis: Ph. D., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2018. / This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. / Cataloged from student-submitted PDF version of thesis. / Includes bibliographical references (pages 185-193). / This thesis proposes a unified framework to capture the temporal and longitudinal variation across multiple instances of sequential data. Examples of such data include sales of a product over a period of time across several retail locations; trajectories of scores across cricket games; and annual tobacco consumption across the United States over a period of decades. A key component of our work is the latent variable model (LVM) which views the sequential data as a matrix where the rows correspond to multiple sequences while the columns represent the sequential aspect. The goal is to utilize information in the data within the sequence and across different sequences to address two inferential questions: (a) imputation or "filling missing values" and "de-noising" observed values, and (b) forecasting or predicting "future" values, for a given sequence of data. Using this framework, we build upon the recent developments in "matrix estimation" to address the inferential goals in three different applications. First, a robust variant of the popular "synthetic control" method used in observational studies to draw causal statistical inferences. Second, a score trajectory forecasting algorithm for the game of cricket using historical data. This leads to an unbiased target resetting algorithm for shortened cricket games which is an improvement upon the biased incumbent approach (Duckworth-Lewis-Stern). Third, an algorithm which leads to a consistent estimator for the time- and location-varying demand of products using censored observations in the context of retail. 
As a final contribution, the algorithms presented are implemented and packaged as a scalable open-source library for the imputation and forecasting of sequential data with applications beyond those presented in this work. / by Muhammad Jehangir Amjad. / Ph. D.
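At the core of the framework is matrix estimation: fill in missing entries, then de-noise by projecting onto a low-rank approximation. The power-iteration sketch below computes the best rank-one approximation, with missing entries (None) first filled by column means; it is a minimal stand-in for the estimators the thesis actually uses, not their implementation:

```python
def rank_one_estimate(M, iters=200):
    # Fill None entries with column means, then project onto the best
    # rank-one approximation via power iteration on A^T A.
    rows, cols = len(M), len(M[0])
    col_means = []
    for j in range(cols):
        vals = [M[i][j] for i in range(rows) if M[i][j] is not None]
        col_means.append(sum(vals) / len(vals))
    A = [[M[i][j] if M[i][j] is not None else col_means[j]
          for j in range(cols)] for i in range(rows)]
    v = [1.0] * cols
    for _ in range(iters):
        u = [sum(A[i][j] * v[j] for j in range(cols)) for i in range(rows)]
        v = [sum(A[i][j] * u[i] for i in range(rows)) for j in range(cols)]
        norm = sum(x * x for x in v) ** 0.5
        v = [x / norm for x in v]
    # With v a unit right singular vector, u = A v already carries the
    # singular value, so u v^T is the rank-one estimate.
    u = [sum(A[i][j] * v[j] for j in range(cols)) for i in range(rows)]
    return [[u[i] * v[j] for j in range(cols)] for i in range(rows)]
```

In the latent variable model, rows are sequences (e.g., retail locations or cricket innings) and columns are time; the low-rank estimate supplies both the imputed and de-noised values that the forecasting step then extrapolates.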
260.
Safety at what price? : setting anti-terrorist policies for checked luggage on US domestic aircraft. Cohen, Jonathan E. W. (Jonathan Ephraim Weis), 1976-. January 2000 (has links)
Thesis (S.M.)--Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2000. / Includes bibliographical references (leaves 45-46). / In this thesis, we considered the costs and benefits of implementing Positive Passenger Bag Match (PPBM) - an anti-terrorist measure to keep bombs out of checked luggage - on US domestic passenger flights. We constructed a stochastic model for comparing the cost-effectiveness of three alternative approaches to PPBM: no PPBM implementation; a PPBM implementation that is applied to 5% of passengers; and a full (100%) implementation of PPBM. We made ranges of estimates concerning the level of terrorist risk, the costs of PPBM operation, the consequences of successful terrorist bombings, and the anti-terrorist effectiveness of both the partial and full PPBM implementations. Calculations showed that there were circumstances under which each policy was the most cost-effective of the three. Of the three options, not implementing PPBM at all was the most cost-effective approach for the largest percentage of the scenarios considered. We found that 5% PPBM captured the next largest portion of the scenarios, and was generally the optimal strategy when annual PPBM operation costs were low, when 5% PPBM anti-terrorist effectiveness was high, and when the consequences of successful bombings were severe. We found 100% PPBM to be the optimal strategy for most scenarios which involved highly costly terrorist bombings, a high level of terrorist risk, and a 100% PPBM policy that provided much added security over 5% PPBM. / by Jonathan E.W. Cohen. / S.M.
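In its simplest one-period form, the comparison described reduces to an expected-cost calculation per scenario: operating cost plus the expected damage from bombings the policy fails to prevent. All costs, probabilities, and effectiveness figures below are hypothetical scenario inputs, not the thesis's estimates:

```python
def expected_cost(operating_cost, expected_attacks, damage, effectiveness):
    # Annual operating cost plus expected damage from attacks
    # the policy fails to stop.
    return operating_cost + expected_attacks * (1.0 - effectiveness) * damage

def best_policy(policies, expected_attacks, damage):
    # The cost-effective policy for this scenario is the one with
    # the lowest expected total cost.
    return min(policies,
               key=lambda p: expected_cost(p["cost"], expected_attacks, damage, p["eff"]))
```

Sweeping the scenario inputs over ranges reproduces the qualitative finding: low-threat scenarios favor no PPBM, while scenarios combining high risk, costly bombings, and a large effectiveness edge favor 100% PPBM.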