21

Combinatorial Auctions for Truckload Transportation Procurement

Ma, Zhong 01 August 2008 (has links)
The goal of this dissertation is to understand market-based mechanisms that enable shippers to allocate lanes efficiently for truckload (TL) transportation procurement despite the self-interest of carriers. To that end, we propose novel models and mechanisms that enhance the use of combinatorial auctions for TL transportation procurement. Our approach consists of three parts:

1. We develop an optimal bid generation model for carriers (bidders) to discover the best sets of lanes to bid on at a given round. The model simultaneously generates alternative tours and selects the most profitable package bid for the carrier under a myopic strategy, an approach that has not been considered before. Simultaneous tour generation and selection significantly reduces the computational complexity of a carrier's optimization problem, since the carrier need not calculate values for all possible packages.

2. We present an iterative combinatorial auction design that integrates the optimization problems of both the shipper and the bidders, where approximate dual prices derived from a winner determination solution are used by the bidders to identify profitable lanes. The auction also allows bidders to submit exclusive-OR (XOR) bids and accommodates common business considerations. Extending the concept of active bids enables the mechanism to effectively mitigate the exposure, threshold, and free-riding problems. Furthermore, both shippers and carriers are better off compared with multi-round auctions that do not integrate the shippers' and carriers' optimizations.

3. We extend a deterministic winner determination model to a two-stage stochastic winner determination model for TL transportation procurement under shipment volume uncertainty. We demonstrate that the stochastic solution is always at least as good as the one obtained by a deterministic model based on expected shipment volumes. The stochastic winner determination problem (sWDP) model is, to the best of our knowledge, the first winner determination formulation of any kind that explicitly incorporates demand uncertainty.
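To make the winner determination step concrete, here is a toy, brute-force sketch (my own illustration, not the thesis formulation): each carrier submits XOR package bids over lanes at an asking price, and the shipper selects at most one bid per carrier so that every lane is assigned exactly once at minimum total cost. The lanes, carriers, and prices are invented.

```python
from itertools import product

# Toy winner determination: pick at most one XOR bid per carrier so that every
# lane is awarded exactly once at minimum total cost (brute-force enumeration).
def winner_determination(lanes, bids_by_carrier):
    best_cost, best_choice = float("inf"), None
    # one option per carrier: either one of its package bids or no award (None)
    options = [bids + [None] for bids in bids_by_carrier.values()]
    for choice in product(*options):
        awarded = [b for b in choice if b is not None]
        covered = [lane for package, _ in awarded for lane in package]
        if sorted(covered) == sorted(lanes):          # each lane exactly once
            cost = sum(price for _, price in awarded)
            if cost < best_cost:
                best_cost, best_choice = cost, awarded
    return best_cost, best_choice

lanes = ["A-B", "B-C", "C-A"]
bids = {  # carrier -> list of (package of lanes, asking price) XOR bids
    "carrier1": [(("A-B", "B-C"), 900), (("A-B",), 600)],
    "carrier2": [(("C-A",), 500), (("B-C", "C-A"), 850)],
}
print(winner_determination(lanes, bids))
```

A real sWDP instance would replace this enumeration with an integer program and add recourse variables for uncertain shipment volumes; the sketch only shows the allocation logic.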
22

Fútbol strategies applied to optimize combinatorial problems to create efficient results – the soccer heuristic

Kubik, Krista M January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / Todd Easton / Heuristics are often implemented to find better solutions to computationally challenging problems, and they use varying techniques to search for quality solutions. Several optimization heuristics have drawn inspiration from real-world practices: ant colony optimization mimics ants in search of food, genetic algorithms emulate traits being passed from parent to child, and simulated annealing imitates the annealing of metal. This thesis presents a new variable neighborhood search optimization heuristic, fútbol Strategies applied to Optimize Combinatorial problems to Create Efficient Results, called the SOCCER heuristic. The heuristic mimics fútbol: players are assigned different neighborhoods, and the player closest to the ball performs that player's neighborhood search. The SOCCER heuristic is the first variable neighborhood search heuristic to use such a complex structure for selecting neighborhoods, and it can be applied to a variety of optimization problems. This research implemented the SOCCER heuristic for job shop scheduling problems, focusing on creating a quality schedule for a local limestone company. A small computational study shows that the SOCCER heuristic can quickly solve complex job shop scheduling problems, with most instances finishing in under half an hour. The optimized schedules reduced the average production time by 7.27%, roughly a two-day decrease in the number of days required to produce a month's worth of orders. Thus, the SOCCER heuristic is a new optimization tool that can help companies and researchers find better solutions to complex problems.
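As a rough illustration of the variable neighborhood search idea described above (a generic VNS skeleton, not the SOCCER heuristic's player-based neighborhood assignment), the sketch below cycles through swap and insert neighborhoods to improve a job ordering on a single machine; the processing times are invented.

```python
import random

def total_completion_time(order, proc_times):
    t, total = 0, 0
    for job in order:
        t += proc_times[job]
        total += t
    return total

def swap_move(order):
    i, j = random.sample(range(len(order)), 2)
    new = order[:]
    new[i], new[j] = new[j], new[i]
    return new

def insert_move(order):
    i, j = random.sample(range(len(order)), 2)
    new = order[:]
    new.insert(j, new.pop(i))
    return new

def vns(proc_times, iters=2000, seed=0):
    """Generic variable neighborhood search over job permutations."""
    random.seed(seed)
    neighborhoods = [swap_move, insert_move]
    order = list(range(len(proc_times)))
    best = total_completion_time(order, proc_times)
    k = 0
    for _ in range(iters):
        candidate = neighborhoods[k](order)
        cost = total_completion_time(candidate, proc_times)
        if cost < best:
            order, best, k = candidate, cost, 0   # success: restart at first neighborhood
        else:
            k = (k + 1) % len(neighborhoods)      # failure: try the next neighborhood
    return order, best

proc_times = [7, 2, 9, 4, 1, 6]
print(vns(proc_times))   # the optimum here is the shortest-processing-time order
```

A job shop version would replace the permutation and objective with machine-specific operation sequences and makespan, but the neighborhood-cycling control loop stays the same.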
23

Multivariate Bayesian Process Control

Yin, Zhijian 01 August 2008 (has links)
Multivariate control charts are valuable tools for multivariate statistical process control (MSPC), used to monitor industrial processes and to detect abnormal process behavior. It has been shown in the literature that Bayesian control charts are optimal tools for controlling a process compared with non-Bayesian charts. To use any control chart, three parameters must be specified: the sample size, the sampling interval, and the control limit. Traditionally, control chart design has been based on statistical performance. Recently, industrial practitioners and academic researchers have increasingly recognized the cost benefits of applying economically designed control charts to quality control, equipment condition monitoring, and maintenance decision-making. The primary objective of this research is to design multivariate Bayesian control charts (MVBCH) both for quality control and for condition-based maintenance (CBM) applications. Although considerable research has been done to develop MSPC tools under the assumption that the observations are independent, little attention has been given to MSPC tools for monitoring multivariate autocorrelated processes. In this research, we compare the performance of the squared prediction error (SPE) chart using a vector autoregressive moving average with exogenous variables (VARMAX) model and a partial least squares (PLS) model for a multivariate autocorrelated process. The study shows that SPE control charts based on the VARMAX model allow rapid detection of process disturbances while reducing false alarms. Next, the economic and economic-statistical design of an MVBCH for quality control is developed, using the control limit policy proved to be optimal by Makis (2007). The computational results illustrate that the MVBCH performs considerably better than the MEWMA chart, especially for smaller mean shifts. Sensitivity analyses further explore the impact of a misspecified out-of-control mean on the actual average cost. Finally, the design of an MVBCH for CBM applications is considered, using the same control limit policy structure and including an observable failure state. Optimization models for the economic and economic-statistical design of the MVBCH for a three-state CBM model are developed, and comparison results show that the MVBCH performs better than the recently developed CBM Chi-square chart.
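To illustrate the SPE chart mechanics in a hedged way (a generic sketch on synthetic data with a naive one-step-ahead predictor, not the VARMAX or PLS models used in the research), the snippet below computes SPE values from prediction residuals and flags points above an empirical control limit.

```python
import numpy as np

def spe_statistics(observed, predicted):
    """SPE_t = ||x_t - x_hat_t||^2 for each time step."""
    residuals = np.asarray(observed) - np.asarray(predicted)
    return np.sum(residuals ** 2, axis=1)

def spe_control_limit(in_control_spe, alpha=0.01):
    """Empirical (1 - alpha) percentile of in-control SPE values
    (a simple nonparametric stand-in for the usual chi-square approximation)."""
    return np.percentile(in_control_spe, 100 * (1 - alpha))

# Synthetic 3-variable process; the "model" is a naive previous-observation
# predictor, used purely to exercise the chart logic.
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 3))
x[150:] += 2.0                      # simulated mean shift after t = 150
x_hat = np.vstack([x[:1], x[:-1]])  # naive one-step-ahead prediction

spe = spe_statistics(x, x_hat)
limit = spe_control_limit(spe[:100])        # early, in-control portion
alarms = np.where(spe > limit)[0]
print(f"control limit = {limit:.2f}, first alarm at t = {alarms.min()}")
```

In practice the residuals would come from a fitted VARMAX or PLS model of the autocorrelated process, which is what drives the false-alarm differences reported above.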
24

Measuring Emotional Responses to Interaction: Evaluation of Sliders and Physiological Reactions

Lottridge, Danielle 18 February 2011 (has links)
Recent work has proposed sliders as a useful way to measure self-reported emotion continuously. My dissertation extends this work to ask: what are the relevant properties of affective self-report on sliders and their variations? How reliable are affective self-reports? How do they relate to physiological data? What are the individual and cultural differences? How can this method be applied to ehealth? Three emotion self-report tools (one slider, two sliders, and a touchscreen) were developed and evaluated in four experiments. The first experiment was within-subjects: participants viewed short videos under four self-report conditions (including no reporting) with physiological capture (heart rate variability and skin conductance). In a re-rating task, the slider models were found to be more reliable than the touchscreen (Lottridge & Chignell, 2009a). The second and third experiments were between-subjects and examined individual and cultural differences. Canadian and Japanese participants watched a nature video while rating emotions and answering questions, and analyses were carried out within and across the datasets. Larger operation span showed a minor benefit. Valence and arousal ratings were not strongly related to skin conductance. The Japanese participants performed on par with the Canadians but reported worse performance. Based on the results, the recommendation was made that a single slider be used to rate valence, that arousal be estimated with skin conductance, and that slider psychometrics be used to assess cognitive load over time. In the fourth experiment, diabetic participants watched diabetes-related videos. They clustered into usage patterns: some moved the slider very little during videos and more afterward, some hardly moved the slider, and some used it as expected. Two novel metrics facilitated these analyses: Emotional Bandwidth, an application of information entropy that characterizes the granularity of the self-reports (Lottridge & Chignell, 2009b), and Emotional Majority Agreement, the amount of agreement relative to a sample's self-reports (Lottridge & Chignell, 2009c). In summary, this dissertation contributes a method of measuring emotion through sliders and skin conductance that has been evaluated in a number of experimental studies. It contributes the empirical results, design recommendations, and two novel metrics of emotional response. Limitations and implications for future research and practice are also discussed.
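To make the entropy-based Emotional Bandwidth idea concrete, here is a minimal sketch (my own reconstruction from the description above, not the authors' code): bin a continuous slider trace and compute the Shannon entropy of the bin frequencies. The bin count and slider range are assumptions.

```python
import math
from collections import Counter

def emotional_bandwidth(slider_values, n_bins=16, lo=0.0, hi=1.0):
    """Shannon entropy (bits) of binned slider positions: higher entropy means
    the participant used more of the slider's range, i.e. finer granularity."""
    bins = [min(int((v - lo) / (hi - lo) * n_bins), n_bins - 1) for v in slider_values]
    counts = Counter(bins)
    total = len(bins)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A participant who barely moves the slider has near-zero bandwidth; one who
# sweeps the full range approaches log2(n_bins) bits.
print(emotional_bandwidth([0.5] * 100))                   # ~0 bits
print(emotional_bandwidth([i / 99 for i in range(100)]))  # close to 4 bits
```

This kind of metric is what distinguishes the "hardly moved the slider" cluster from the participants who used the slider as expected.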
25

Bankruptcy Prediction of Companies in the Retail-apparel Industry using Data Envelopment Analysis

Kingyens, Angela Tsui-Yin Tran 17 December 2012 (has links)
Since 2008, the world has been in recession. As daily news outlets report, this crisis has prompted many small businesses and large corporations to file for bankruptcy, which has grave global social implications. Despite government intervention and incentives to stimulate the economy, which have put nations in hundreds of billions of dollars of debt and reduced prime rates to almost zero, efforts to combat the rising unemployment rate and the decline in discretionary income have been troublesome. It is a vicious cycle: consumers are apprehensive about spending due to the instability of their jobs and ensuing personal financial problems; businesses are weary from the lack of revenue and are forced to tighten their operations, which often translates to layoffs; and so on. Cautious movement of cash flows is rooted in and influenced by the psychology of the players (stakeholders) of the game (society). Understandably, the complexity of this economic fallout is the subject of much attention. And while the markets have recovered much of the lost ground as of late, there is still great opportunity to learn about all the possible factors of this recession, in anticipation of and bracing for one more downturn before we emerge from this crisis. In fact, there is no time more appropriate than today for research in bankruptcy prediction because of its relevance, and in an age where documentation is highly encouraged and often mandated by law, the amount and accessibility of data is paramount – an academic's paradise! The main objective of this thesis was to develop a model supported by Data Envelopment Analysis (DEA) to predict the likelihood of failure of US companies in the retail-apparel industry based on information available from annual reports – specifically from financial statements and their corresponding Notes, Management's Discussion and Analysis, and the Auditor's Report. It was hypothesized that including variables that reflect managerial decision-making and economic factors would enhance the predictive power of current mathematical models that consider financial data exclusively. With a unique and comprehensive dataset of 85 companies, new metrics based on different aspects of the annual reports were created and then combined with a slacks-based measure of efficiency DEA model and a modified layering classification technique to capture the multidimensional complexity of bankruptcy. This approach proved to be an effective prediction tool, separating companies at high risk of bankruptcy from those that were healthy with a reliable accuracy of 80% – an improvement over the widely used Altman bankruptcy model, which has 70%, 58% and 50% accuracy when predicting cases today, from one year back, and from two years back, respectively. The approach also provides a probability of bankruptcy based on a second-order polynomial function, along with targets for improvement, and was designed to be easily adapted for analysis of other industries. Finally, the contributions of this thesis benefit creditors with better risk assessment, owners with time to improve current operations so as to avoid failure altogether, and investors with information on which healthy companies to invest in and which unhealthy companies to short.
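For readers unfamiliar with DEA, the sketch below solves the basic input-oriented CCR envelopment model with PuLP on invented toy data. It is a simplified stand-in for illustration only, not the slacks-based measure (SBM) model or the layering classification technique actually used in the thesis.

```python
import pulp

def ccr_efficiency(inputs, outputs, dmu):
    """Input-oriented CCR efficiency of unit `dmu`:
       min theta  s.t.  sum_j lam_j * x_j <= theta * x_dmu,
                        sum_j lam_j * y_j >= y_dmu,  lam >= 0."""
    n = len(inputs)
    prob = pulp.LpProblem("CCR", pulp.LpMinimize)
    theta = pulp.LpVariable("theta", lowBound=0)
    lam = [pulp.LpVariable(f"lam_{j}", lowBound=0) for j in range(n)]
    prob += theta
    for i in range(len(inputs[0])):      # input constraints
        prob += pulp.lpSum(lam[j] * inputs[j][i] for j in range(n)) <= theta * inputs[dmu][i]
    for r in range(len(outputs[0])):     # output constraints
        prob += pulp.lpSum(lam[j] * outputs[j][r] for j in range(n)) >= outputs[dmu][r]
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    return pulp.value(theta)

# Toy data: 4 firms, 2 inputs (assets, expenses) and 1 output (revenue).
X = [[20, 30], [40, 15], [40, 55], [60, 30]]
Y = [[100], [100], [100], [100]]
for d in range(len(X)):
    print(f"firm {d}: efficiency = {ccr_efficiency(X, Y, d):.3f}")
```

The thesis replaces the inputs and outputs with metrics derived from financial statements, Notes, Management's Discussion and Analysis, and the Auditor's Report, and interprets low efficiency scores as elevated bankruptcy risk.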
26

Optimization Models and Techniques for Radiation Treatment Planning Applied to Leksell Gamma Knife(R) Perfexion(TM)

Ghaffari, Hamid 11 December 2012 (has links)
Radiation treatment planning is a process through which a plan is devised to irradiate tumors or lesions to a prescribed dose without exposing surrounding organs to the risk of receiving radiation. A plan comprises a series of shots at different positions with different shapes. The inverse planning approach we propose uses optimization techniques and mathematical models to determine the right location and shape for each shot, automating the whole process. The models we developed for the Perfexion(TM) unit (Elekta, Stockholm, Sweden) assist oncologists by automatically locating isocentres and defining sector durations. Sector duration optimization (SDO) and sector duration and isocentre location optimization (SDIO) are the two classes of these models. The SDO models, which are variations of an equivalent uniform dose optimization model, are solved by two nonlinear optimization techniques, namely Gradient Projection and our in-house Interior Point Constraint Generation method. To solve the SDIO model, a commercial optimization solver is employed. This study addresses both isocentre selection and sector duration optimization, and inverse planning is evaluated using clinical data throughout. The results show that automated inverse planning improves the quality of radiation treatment plans and significantly reduces computation time and treatment time.
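As a hedged illustration of the gradient projection idea behind sector duration optimization (a least-squares toy problem with invented dose-rate data, not the equivalent uniform dose objective or clinical data from the study), the snippet below projects each gradient step onto the nonnegative orthant of beam-on times.

```python
import numpy as np

def projected_gradient(D, d, step=None, iters=500):
    """Minimize ||D @ t - d||^2 subject to t >= 0 by projected gradient descent."""
    m, n = D.shape
    t = np.zeros(n)
    if step is None:
        step = 1.0 / np.linalg.norm(D, 2) ** 2   # 1/L for this quadratic objective
    for _ in range(iters):
        grad = D.T @ (D @ t - d)
        t = np.maximum(t - step * grad, 0.0)     # project onto t >= 0
    return t

rng = np.random.default_rng(1)
D = rng.uniform(0, 1, size=(50, 8))    # illustrative dose-rate matrix: voxels x sectors
d = np.full(50, 2.0)                   # prescribed dose per voxel
t = projected_gradient(D, d)
print("sector durations:", np.round(t, 3))
print("max dose error:", np.abs(D @ t - d).max().round(3))
```

The projection step is what keeps the sector durations physically deliverable (nonnegative) at every iteration, which is the core of the gradient projection approach named above.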
27

Evaluating the Efficiency, Usability and Safety of Computerized Order Sets in Sunnybrook Health Sciences Centre's Computerized Provider Order Entry System

Chan, Julie Min-Ting 13 January 2010 (has links)
Few studies examine technical barriers, such as usability, to successful Computerized Provider Order Entry (CPOE) system implementation. This thesis explores this gap in the literature through a heuristic evaluation of the Sunnybrook CPOE order set system, the development of a more user-friendly CPOE order set interface (Test Design), and a usability study comparing three order set formats (Sunnybrook CPOE, Test Design, and paper order sets). A randomized trial was conducted with 27 physicians at the hospital. Results showed that the Sunnybrook CPOE order set system was less efficient (task times were on average 364 seconds longer than with Test Design and 344 seconds longer than with paper), less user-friendly (users were less confident, less satisfied, and more frustrated with Sunnybrook CPOE tasks), and less safe than paper order sets (more harmful errors). Test Design was as efficient and safe as paper order sets.
28

Using Queueing Analysis to Guide Combinatorial Scheduling in Dynamic Environments

Tran, Tony 02 January 2012 (has links)
The central thesis of this dissertation is that insight from queueing analysis can effectively guide standard (combinatorial) scheduling algorithms in dynamic environments. Scheduling is generally concerned with complex combinatorial decisions for static problems, whereas queueing theory simplifies the combinatorics and focuses on dynamic systems. We examine a queueing network with flexible servers under both queueing and scheduling techniques. Based on the strengths of queueing analysis and scheduling, we develop a hybrid model that guides scheduling with results from the queueing model. To include setup times, we create a logic-based Benders decomposition model for a static representation of the queueing network. Our model is able to find optimal schedules up to five orders of magnitude faster than the only other model in the literature. A hybrid model is then developed for the dynamic problem and shown to achieve the best mean flow time while also guaranteeing maximal capacity.
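As a small, hedged example of the kind of queueing quantity that can guide a scheduler (textbook M/M/c formulas with invented rates, not the flexible-server network or Benders model from the dissertation), the snippet below computes the expected queueing delay via the Erlang C formula for a few server counts.

```python
from math import factorial

def erlang_c(arrival_rate, service_rate, servers):
    """P(wait) for an M/M/c queue; requires arrival_rate < servers * service_rate."""
    a = arrival_rate / service_rate                 # offered load
    rho = a / servers
    if rho >= 1:
        raise ValueError("queue is unstable: utilization >= 1")
    top = a ** servers / (factorial(servers) * (1 - rho))
    bottom = sum(a ** k / factorial(k) for k in range(servers)) + top
    return top / bottom

def mean_wait_in_queue(arrival_rate, service_rate, servers):
    pw = erlang_c(arrival_rate, service_rate, servers)
    return pw / (servers * service_rate - arrival_rate)

# e.g. gauge how many flexible servers a job class needs (capacity and delay)
# before committing to a detailed combinatorial schedule
for c in (2, 3, 4):
    wq = mean_wait_in_queue(arrival_rate=1.8, service_rate=1.0, servers=c)
    print(f"servers={c}: mean queueing delay = {wq:.3f}")
```

The dissertation's hybrid model uses richer queueing results than this, but the principle is the same: queueing analysis supplies capacity and delay estimates that steer the combinatorial scheduler.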
29

Evaluating a New Display of Information Generated from LiDAR Point Clouds

Barbut, Ori 21 March 2012 (has links)
The design of a texture display for three-dimensional Light Detection and Ranging (LiDAR) point clouds is investigated. The objective is to present a low-fidelity display that is simple to compute in real time and that uses the pattern-processing capabilities of a human operator to afford an understanding of the environment. The efficacy of the display is evaluated experimentally in comparison with a baseline point cloud rendering. Subjects were shown data based on virtual hills and were asked to plan the least-steep traversal and to identify the hill from a set of distractors. The major conclusions are: comprehension of LiDAR point clouds from the sensor origin is difficult without further processing of the data; a separated vantage point improves understanding of the data; and a simple computation that presents local point cloud derivative data significantly improves understanding of the environment, even when observed from the sensor origin.
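As a rough sketch of what a local point cloud derivative computation might look like (my own illustration on synthetic data, not the display evaluated in the thesis), the snippet below fits a plane to each point's nearest neighbours and reports the slope magnitude, which could then be mapped to a texture or colour.

```python
import numpy as np

def local_slopes(points, k=12):
    """For each point, fit z = a*x + b*y + c to its k nearest (x, y) neighbours
    by least squares and return the slope magnitude sqrt(a^2 + b^2)."""
    pts = np.asarray(points, dtype=float)
    slopes = np.empty(len(pts))
    for i, p in enumerate(pts):
        d = np.linalg.norm(pts[:, :2] - p[:2], axis=1)
        nbrs = pts[np.argsort(d)[:k]]
        A = np.column_stack([nbrs[:, 0], nbrs[:, 1], np.ones(len(nbrs))])
        (a, b, _), *_ = np.linalg.lstsq(A, nbrs[:, 2], rcond=None)
        slopes[i] = np.hypot(a, b)
    return slopes

# Synthetic "hill": a tilted, noisy plane; the recovered slope should be ~0.5.
rng = np.random.default_rng(2)
xy = rng.uniform(-5, 5, size=(300, 2))
z = 0.3 * xy[:, 0] + 0.4 * xy[:, 1] + rng.normal(scale=0.01, size=300)
cloud = np.column_stack([xy, z])
print("median slope:", np.median(local_slopes(cloud)).round(3))
```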
