  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
631

Algorithmic advancements in discrete optimization : applications to machine learning and healthcare operations

Pauphilet, Jean (Jean A.) January 2020 (has links)
Thesis: Ph. D., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, May, 2020 / Cataloged from the official PDF of thesis. / Includes bibliographical references (pages 235-253). / In the next ten years, hospitals will operate like air-traffic control centers whose role is to coordinate care across multiple facilities. Consequently, the future of hospital operations will have three salient characteristics. First, data. The ability to process, analyze and exploit data effectively will become a vital skill for practitioners. Second, a holistic approach, since orchestrating care requires the concurrent optimization of multiple resources, services, and time scales. Third, real-time personalized decisions, to respond to the increasingly close monitoring of patients. To support this transition and transform our healthcare system towards better outcomes at lower costs, research in operations and analytics should address two concurrent goals: First, develop new methods and algorithms for decision-making in a data-rich environment, which answer key concerns from practitioners and regulators, such as reliability, interpretability, and fairness. / Second, put its models and algorithms to the test of practice, to ensure a path towards implementation and impact. Accordingly, this thesis comprises two parts. The first three chapters present methodological contributions to the discrete optimization literature, with particular emphasis on problems emerging from machine learning under sparsity. Indeed, the most important operational decision-making problems are discrete by nature, and their sizes have increased with the widespread adoption of connected devices and sensors. In particular, in machine learning, the gigantic amount of data now available contrasts with our limited cognitive abilities. Hence, sparse models, i.e., models that involve only a small number of variables, are needed to ensure human understanding. The last two chapters present applications and implementations of machine learning and discrete optimization methods to improve operations at a major academic hospital. / From raw electronic health records of patients, we build predictive models of patient flows and prescriptive models to optimize patient-bed assignment in real time. More importantly, we implement our models in a 600-bed institution. Our impact is twofold: methodological and operational. Integrating advanced analytics into the hospital's daily operations and building a data-first culture constitutes a major paradigm shift. / by Jean Pauphilet. / Ph. D. / Ph.D. Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center
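A minimal sketch of the sparse-learning problem the first part of this thesis targets: choose the best subset of k features for least-squares regression. The brute-force enumeration below (Python, with illustrative toy data) only defines the discrete problem; it is not the thesis's method, which develops algorithms that scale this problem far beyond what enumeration can handle.

```python
import itertools
import numpy as np

def best_subset(X, y, k):
    """Exact k-sparse least squares by exhaustive search: try every
    subset of k features and keep the one with the smallest residual.
    Combinatorial in p -- shown only to state the discrete problem."""
    n, p = X.shape
    best_err, best = np.inf, None
    for S in itertools.combinations(range(p), k):
        beta = np.linalg.lstsq(X[:, S], y, rcond=None)[0]
        err = np.sum((y - X[:, S] @ beta) ** 2)
        if err < best_err:
            best_err, best = err, (S, beta)
    return best

# Toy data (illustrative): only features 0 and 3 carry signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))
y = 3 * X[:, 0] - 2 * X[:, 3] + rng.normal(scale=0.1, size=100)
print(best_subset(X, y, k=2))  # recovers support (0, 3)
```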
632

Modeling of Spaza shop operations using soft and hard operational research techniques

Sabwa, Jean-Marie January 2009 (has links)
Includes bibliographical references (leaves 90-93). / Globalization has transformed the world into a big village in which the rich become richer and the poor poorer. In the commercial world the trend is for big business to buy out smaller companies and consequently get bigger. Yet it is arguable that small businesses have assisted in providing much-needed services to small communities that occupy informal settlements and exist on or below the poverty datum line. The South African government has amongst its main objectives the alleviation of poverty and the improvement of life in previously disadvantaged communities. The government has allowed micro-enterprises and small businesses in the informal sector to thrive, and in this sector are Spaza shops that supply a wide range of grocery commodities to informal settlements. This thesis presents an application framework of soft and hard operational research (OR) techniques used to address the performance of micro-enterprises, with Spaza shops in the Western Cape as a specific case study. The techniques include Strategic Options Development and Analysis (SODA) using causal mapping, and Soft Systems Methodology (SSM). These were chosen because of their suitability for understanding the performance problems faced by Spaza shop owners and finding ways of improving the current situation by modelling the intervention of stakeholders. The improvement of Spaza shop businesses is a matter for all stakeholders. Causal mapping helped to identify and structure the multiple conflicting aspects of the Spaza shop business. Soft Systems Methodology made it possible to conceptualize the intervention model based on the rich picture and root definitions for relevant world-views, and to see what changes are culturally feasible and systemically desirable. Computer simulations were used to help design and test performance measurement indicators for the Spaza shops so as to enable decision-makers to choose the optimal strategy. Statistical analysis was used to capture seasonality and reveal clustering patterns.
633

The use of problem structuring methods to explore the functioning and management of a selected NGO

Anyogu, Alexander A January 2007 (has links)
Includes bibliographical references (leaves 61-64). / Poverty eradication is one of the major challenges facing South Africa and the rest of the continent. Concern around poverty alleviation in South Africa encompasses lack of capacity as well as inefficiency in the management and administration of poverty alleviation projects. Therefore, poverty alleviation agencies ought to be mindful of the issues that could affect their organizational efficiency, especially issues around organizational management. Addressing issues of management amongst poverty alleviation agencies is necessary to assist role players in the implementation of efficient and effective poverty alleviation programs. The research explored issues around the management structure of a selected non-governmental organisation (SHAWCO). The objective was to develop a shared understanding of the organizational structure amongst the members of the management team, and to identify any inefficiencies within the structure of the organisation. Problem Structuring Methods have been identified as a collection of tools that assist decision makers in addressing complex societal problems, and seek to alleviate or improve situations characterized by uncertainty, conflict and complexity. The study used Problem Structuring Methods to investigate the possible difficulties SHAWCO is facing as a result of management inefficiency. Interviews were used to uncover issues around the functioning and management of the organization, and an interactive problem structuring workshop was later conducted to develop a shared understanding of the identified issues.
634

The address sort and other computer sorting techniques

Underhill, Leslie G January 1971 (has links)
Originally this project was to have been a feasibility study of the use of computers in the library. It soon became clear that the logical place in the library at which to start making use of the computer was the catalogue. Once the catalogue was in machine-readable form it would be possible to work backwards to the book ordering and acquisitions system and forwards to the circulation and book issue system. One of the big advantages in using the computer to produce the catalogue would be the elimination of the "skilled drudgery" of filing. Thus vast quantities of data would need to be sorted. And thus the scope of this project was narrowed down from a general feasibility study, firstly to a study of a particular section of the library and secondly to one particularly important aspect of that section - that of sorting with the aid of the computer. I have examined many, but by no means all, computer sorting techniques, programmed them in FORTRAN as efficiently as I was able, and compared their performances on the IBM 1130 computer of the University of Cape Town. I have confined myself to internal sorts, i.e. sorts that take place in core. This thesis stops short of applying the best of these techniques to the library. I intend however to do so, and to work back to the original scope of my thesis.
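For readers unfamiliar with the address sort of the title: it computes each key's approximate final position directly from its value and resolves collisions by local insertion. The sketch below is a generic reconstruction in Python rather than the thesis's FORTRAN; the slot-table size and collision rule are illustrative choices.

```python
def address_sort(keys):
    """Address-calculation sort: map each key to a slot proportional
    to its value, then shift locally (insertion-style) on collision."""
    if not keys:
        return []
    lo, hi = min(keys), max(keys)
    span = (hi - lo) or 1
    slots = [None] * (2 * len(keys))   # spare slots reduce collisions
    for k in keys:
        i = int((k - lo) / span * (len(slots) - 1))  # address function
        while i < len(slots) and slots[i] is not None and slots[i] <= k:
            i += 1                     # walk right past smaller keys
        while i < len(slots) and slots[i] is not None:
            slots[i], k = k, slots[i]  # shift larger keys rightwards
            i += 1
        if i == len(slots):
            slots.append(k)
        else:
            slots[i] = k
    return [k for k in slots if k is not None]

print(address_sort([27, 3, 94, 61, 12, 50]))  # [3, 12, 27, 50, 61, 94]
```

When keys are roughly uniformly distributed, the address function places most keys at or near their final positions, which is what made the method attractive for catalogue filing.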
635

Fully polynomial time approximation schemes for sequential decision problems

Mostagir, Mohamed January 2005 (has links)
Thesis (S.M.)--Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2005. / Includes bibliographical references (p. 65-67). / This thesis is divided into two parts sharing the common theme of fully polynomial time approximation schemes. In the first part, we introduce a generic approach for devising fully polynomial time approximation schemes for a large class of problems that we call list scheduling problems. Our approach is simple and unifying, and many previous results in the literature follow as direct corollaries of our main theorem. In the second part, we tackle a more difficult problem: the stochastic lot sizing problem, and give the first fully polynomial time approximation scheme for it. Our approach is based on simple techniques that could arguably have wider applications beyond the design of fully polynomial time approximation schemes. / by Mohamed Mostagir. / S.M.
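As a generic illustration of what a fully polynomial time approximation scheme looks like, here is the textbook profit-scaling FPTAS for 0/1 knapsack (not the thesis's scheme for list scheduling or lot sizing): round item values down by a factor K = eps * v_max / n, solve the rounded instance exactly by dynamic programming over profit, and the result is within a factor (1 - eps) of optimal in time polynomial in n and 1/eps.

```python
def knapsack_fptas(values, weights, capacity, eps):
    """Profit-scaling FPTAS for 0/1 knapsack (standard textbook scheme).
    dp[p] = (min weight achieving rounded profit p, items used)."""
    n = len(values)
    K = eps * max(values) / n                    # rounding granularity
    scaled = [max(1, int(v / K)) for v in values]
    total = sum(scaled)
    INF = float("inf")
    dp = [(0, [])] + [(INF, None)] * total
    for i in range(n):
        for p in range(total, scaled[i] - 1, -1):  # reverse: item used once
            w_prev, items = dp[p - scaled[i]]
            if w_prev + weights[i] < dp[p][0]:
                dp[p] = (w_prev + weights[i], items + [i])
    best = max(p for p in range(total + 1) if dp[p][0] <= capacity)
    items = dp[best][1]
    return items, sum(values[i] for i in items)

# Illustrative instance: the optimum is items [0, 1, 3] with value 235.
print(knapsack_fptas([60, 100, 120, 75], [10, 20, 30, 15],
                     capacity=50, eps=0.1))
```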
636

Patterns of heart attacks

Shenk, Kimberly N January 2010 (has links)
Thesis (S.M.)--Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2010. / Cataloged from PDF version of thesis. / Includes bibliographical references (p. 66-68). / Myocardial infarction, an outcome of heart disease, is a growing concern in the United States today. With heart disease becoming increasingly predominant, it is important not only to take steps toward preventing myocardial infarction, but also toward predicting future myocardial infarctions. If we can determine that the dynamic pattern of an individual's diagnostic history matches a pattern already identified as high-risk for myocardial infarction, more rigorous preventative measures can be taken to alter that individual's trajectory of health so that it leads to a better outcome. In this thesis we utilize classification and clustering data mining methods concurrently to determine whether a patient is at risk for a future myocardial infarction. Specifically, we apply the algorithms to medical claims data from more than 47,000 members over five years to: 1) find different groups of members that have interesting temporal diagnostic patterns leading to myocardial infarction and 2) provide out-of-sample predictions of myocardial infarction for these groups. Using clustering methods in conjunction with classification algorithms yields improved predictions of myocardial infarction over using classification alone. In addition to improved prediction accuracy, we found that the clustering methods also effectively split the members into groups with different and meaningful temporal diagnostic patterns leading up to myocardial infarction. The patterns found can serve as a useful reference profile for identifying patients at high risk of myocardial infarction in the future. / by Kimberly N. Shenk. / S.M.
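A minimal sketch of the cluster-then-classify idea on synthetic data, with scikit-learn stand-ins for the claims data and algorithms; the cluster count, features, and models below are illustrative assumptions, not the study's choices.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for temporal diagnostic features and MI labels.
X, y = make_classification(n_samples=5000, n_features=20,
                           n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1: group members with similar feature patterns.
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X_tr)

# Step 2: fit one classifier per cluster (assumes each cluster
# contains both classes, which holds for this synthetic data).
models = {c: LogisticRegression(max_iter=1000).fit(
                 X_tr[km.labels_ == c], y_tr[km.labels_ == c])
          for c in range(4)}

# Out-of-sample: route each test member to its cluster's model.
te_c = km.predict(X_te)
pred = np.empty(len(y_te), dtype=int)
for c in range(4):
    mask = te_c == c
    if mask.any():
        pred[mask] = models[c].predict(X_te[mask])
print("cluster-then-classify accuracy:", (pred == y_te).mean())

# Baseline for comparison: a single global classifier.
glob = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("global classifier accuracy:   ", glob.score(X_te, y_te))
```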
637

Human machine collaborative decision making in a complex optimization system

Malasky, Jeremy S January 2005 (has links)
Thesis (S.M.)--Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2005. / Includes bibliographical references (p. 149-151). / Numerous complex real-world applications are either theoretically intractable or unable to be solved in a practical amount of time. Researchers and practitioners are forced to implement heuristics in solving such problems that can lead to highly sub-optimal solutions. Our research focuses on inserting a human "in-the-loop" of the decision-making or problem-solving process in order to generate solutions in a timely manner that improve upon those that are generated either solely by a human or solely by a computer. We refer to this as Human-Machine Collaborative Decision-Making (HMCDM). The typical design process for developing human-machine approaches either starts with a human approach and augments it with decision support or starts with an automated approach and augments it with operator input. We provide an alternative design process by presenting an HMCDM methodology that addresses collaboration from the outset of the design of the decision-making approach. We apply this design process to a complex military resource allocation and planning problem which selects, sequences, and schedules teams of unmanned aerial vehicles (UAVs) to perform sensing (Intelligence, Surveillance, and Reconnaissance - ISR) and strike activities against enemy targets. Specifically, we examined varying degrees of human-machine collaboration in the creation of variables in the solution of this problem. We also introduce an HMCDM method that combines traditional goal decomposition with a model formulation into an Iterative Composite Variable Approach for solving large-scale optimization problems. / Finally, we show through experimentation the potential for improvement in the quality and speed of solutions that can be achieved through the use of an HMCDM approach. / by Jeremy S. Malasky. / S.M.
638

Understanding the low volatility anomaly in the South African equity market

Khuzwayo, Bhekinkosi January 2015 (has links)
The Capital Asset Pricing Model (CAPM) advocates that expected return has a linear proportional relationship with beta (and subsequently volatility). As such, the higher the systematic risk of a security, the higher the CAPM expected return. However, empirical results have hardly supported this view, as argued as early as Black (1972). Instead, an anomaly has been evidenced across a multitude of developed and emerging markets, where portfolios constructed to have lower volatility have outperformed their higher volatility counterparts, as found by Baker and Haugen (2012). This result has been found to exist in most equity markets globally. In the South African market, the studies of Khuzwayo (2011), Panulo (2014) and Oladele (2014) focused on establishing whether low volatility portfolios had outperformed market-cap weighted portfolios. While they found this to be the case, it is important to understand if this is truly an anomaly or just a result of prevailing market conditions that rewarded lower volatility stocks over the back-test period. As such, those conditions might not exist in the future and low volatility portfolios might then underperform. This research does not aim to show, yet again, the existence of this 'anomaly'; instead, the aim is to dissect whether there is any theoretical backing for low volatility portfolios to outperform high volatility portfolios. If this can be uncovered, it should help one understand whether the 'anomaly' truly exists and whether it can be expected to continue into the future.
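The author's caution about back-test conditions is easy to demonstrate in a few lines: simulate stocks with equal mean returns but different volatilities, and a low-volatility portfolio "outperforms" on a risk-adjusted basis purely by construction of the simulation. All parameters below are illustrative assumptions, not estimates from South African data.

```python
import numpy as np

rng = np.random.default_rng(42)
n_stocks, n_days = 100, 504            # two years of daily returns

# Equal expected return for every stock; only volatility differs.
vols = rng.uniform(0.01, 0.04, n_stocks)
rets = rng.normal(0.0003, vols, size=(n_days, n_stocks))

# Rank stocks by volatility realised over year one only...
order = np.argsort(rets[:252].std(axis=0))
low_vol, high_vol = order[:20], order[-20:]

# ...then compare equally weighted portfolios out of sample (year two).
def sharpe(r):
    return r.mean() / r.std() * np.sqrt(252)

oos = rets[252:]
print("low-vol  Sharpe:", round(sharpe(oos[:, low_vol].mean(axis=1)), 2))
print("high-vol Sharpe:", round(sharpe(oos[:, high_vol].mean(axis=1)), 2))
```

Because the return-generating assumptions drive the result, a back-test alone cannot distinguish a genuine anomaly from conditions that happened to reward low-volatility stocks, which is precisely the question the thesis sets out to answer.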
639

An examination of heuristic algorithms for the travelling salesman problem

Höck, Barbar Katja January 1988 (has links)
The role of heuristics in combinatorial optimization is discussed. Published heuristics for the Travelling Salesman Problem (TSP) were reviewed, and morphological boxes were used to develop new heuristics for the TSP. New and published heuristics were programmed for symmetric TSPs where the triangle inequality holds, and were tested on a microcomputer. The best of the quickest heuristics was the furthest insertion heuristic, finding tours 3 to 9% above the best known solutions (2 minutes for 100 nodes). Better results were found by longer-running heuristics, e.g. the cheapest angle heuristic (CCAO), 0 to 6% above best (80 minutes for 100 nodes). The savings heuristic found the best results overall, but took more than 2 hours to complete. Of the new heuristics, the MST path algorithm at times improved on the results of the furthest insertion heuristic while taking the same time as the CCAO. The study indicated that there is little likelihood of improving on present methods unless a fundamentally new approach is discovered. Finally, a case study using TSP heuristics to aid the planning of grid surveys is described.
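For reference, the furthest insertion heuristic singled out above admits a compact implementation. The sketch below is a generic Euclidean version in Python, not the thesis's FORTRAN code:

```python
import math

def furthest_insertion(points):
    """Start with the two mutually most distant cities; repeatedly take
    the unvisited city furthest from the tour and splice it into the
    edge where it increases tour length the least."""
    d = lambda a, b: math.dist(points[a], points[b])
    n = len(points)
    i, j = max(((a, b) for a in range(n) for b in range(a + 1, n)),
               key=lambda p: d(*p))
    tour, rest = [i, j], set(range(n)) - {i, j}
    while rest:
        c = max(rest, key=lambda c: min(d(c, t) for t in tour))
        k = min(range(len(tour)),
                key=lambda k: d(tour[k], c)
                              + d(c, tour[(k + 1) % len(tour)])
                              - d(tour[k], tour[(k + 1) % len(tour)]))
        tour.insert(k + 1, c)
        rest.remove(c)
    return tour

pts = [(0, 0), (2, 0), (2, 2), (0, 2), (1, 3), (3, 1)]
t = furthest_insertion(pts)
print(t, round(sum(math.dist(pts[t[k]], pts[t[(k + 1) % len(t)]])
                   for k in range(len(t))), 2))
```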
640

Aiding decision making for Foodbank Cape Town

Blake, Timothy James January 2010 (has links)
No description available.
