941

An Analogy Based Method for Freight Forwarding Cost Estimation

Straight, Kevin Andrew, 25 September 2014
The author explored estimation by analogy (EBA) as a means of estimating the cost of international freight consignment. A version of the k-Nearest Neighbors (k-NN) algorithm was tested by predicting job costs from a database of over 5,000 actual jobs booked by an Irish freight forwarding firm over a seven-year period. The effect of a computationally intensive training process on the overall accuracy of the method was found to be insignificant when the method was implemented with four or fewer neighbors. Overall, the analogy-based method, while still significantly less accurate than manually worked-up estimates, might be worthwhile to implement in practice, depending on labor costs in the adopting firm. A simulation model was used to compare manual and analytical estimation methods; the point of indifference occurs when preparing a manual estimate takes a firm more than 1.5 worker-hours (at current Irish labor costs). Suggestions are given for future experiments to refine the method's sampling policy, with the aim of improving both accuracy and overall scalability.
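
A minimal sketch of the estimation-by-analogy idea described above, assuming each job is reduced to a numeric feature vector; the feature names, data, and cost model below are invented for illustration and are not the thesis's actual schema:

```python
# Estimation by analogy (EBA): predict a new job's cost as the average cost of
# its k most similar historical jobs. The features (weight_kg, distance_km,
# n_lines) are illustrative, not the thesis's actual schema.
import numpy as np

def knn_cost_estimate(history_X, history_cost, query, k=4):
    """Estimate cost for `query` from the k nearest historical jobs.

    history_X    : (n, d) array of job features
    history_cost : (n,) array of booked job costs
    query        : (d,) feature vector for the job to estimate
    """
    # Standardize features so no single attribute dominates the distance.
    mu, sigma = history_X.mean(axis=0), history_X.std(axis=0) + 1e-9
    Xz = (history_X - mu) / sigma
    qz = (query - mu) / sigma

    # Euclidean distance to every historical job, then average the k closest.
    dist = np.linalg.norm(Xz - qz, axis=1)
    nearest = np.argsort(dist)[:k]
    return history_cost[nearest].mean()

# Toy usage: 5,000 synthetic jobs with three features each.
rng = np.random.default_rng(0)
X = rng.uniform([10, 50, 1], [2000, 3000, 20], size=(5000, 3))
cost = 40 + 0.05 * X[:, 0] + 0.12 * X[:, 1] + 3.0 * X[:, 2] + rng.normal(0, 15, 5000)
print(knn_cost_estimate(X, cost, np.array([500, 1200, 4]), k=4))
```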
942

Critical systems thinking, theory and practice: a case study of an intervention in two British local authorities

Munlo, Isaac, January 1997
This thesis reports an intervention informed by critical systems thinking. The intervention drew upon a variety of systems and operational research methods to systemically explore the problems facing housing services for older people. Stakeholders were then supported in developing a response to these problems in the form of an integrated model of user involvement and multi-agency working. The methods used in this study included Cognitive Mapping, Critical Systems Heuristics, Interactive Planning and Viable System Modelling. Following a description of the project and its outcomes, the author's practical experiences are used to reflect back on critical systems thinking. Five innovations are presented in the thesis.

First, a new method called 'Problem Mapping' is developed. It has five stages: (i) interviewing stakeholders to surface problems and identify further potential interviewees; (ii) listing the problems as seen through the eyes of the various stakeholders; (iii) consolidating the list by removing duplicate problems and synthesising similar problems into larger 'problem statements'; (iv) mapping the relationships between problems; and (v) presenting the results back to stakeholders to inform the development of proposals for improvement. Reflection upon the use of this method indicates that it is particularly valuable where there are multiple stakeholders who are not initially visible to researchers, each of whom sees different aspects of a problem situation.

Second, Problem Mapping is used to systemically express the problems facing housing services for older people in two geographical areas in the UK. This shows how problems in the areas of assessment, information provision and planning are mutually reinforcing, making a strong case for change.

Third, a process of evolving an integrated model of user involvement and multi-agency working is presented. The model was designed in facilitated workshops by managers from statutory agencies, based on specifications developed by a variety of stakeholders (including service users and carers).

Fourth, the strengths and weaknesses of Cognitive Mapping (one of the methods used in the project) are discussed, and significant limitations of the method are highlighted.

Fifth, contributions to, and reflections on, the theoretical and practical basis of the research are presented. These focus, among other things, on the theory of boundary critique, an important aspect of critical systems thinking. It is often assumed that boundary critique is only undertaken at the start of an intervention, to ensure that its remit has been adequately defined. However, this project shows that it is both possible and desirable to use the theory of boundary critique on an ongoing basis, informing the creative design of methods throughout an intervention.
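
Stage (iv) of Problem Mapping, relating problems to one another, lends itself to a simple directed-graph representation. The sketch below finds a mutually reinforcing loop of the kind the thesis reports between assessment, information provision and planning; the problem statements and the plain-dictionary graph are invented for illustration:

```python
# Problem Mapping, stage (iv): represent "problem A aggravates problem B"
# relations as a directed graph and search for reinforcing cycles.
from collections import defaultdict

edges = [
    ("poor assessment of needs", "inappropriate housing allocations"),
    ("inappropriate housing allocations", "pressure on planning"),
    ("pressure on planning", "little time for information provision"),
    ("little time for information provision", "poor assessment of needs"),
]

graph = defaultdict(list)
for cause, effect in edges:
    graph[cause].append(effect)

def find_cycle(start, graph):
    """Depth-first search for a path that returns to `start` (a reinforcing loop)."""
    stack = [(start, [start])]
    while stack:
        node, path = stack.pop()
        for nxt in graph[node]:
            if nxt == start:
                return path + [start]
            if nxt not in path:
                stack.append((nxt, path + [nxt]))
    return None

print(find_cycle("poor assessment of needs", graph))
```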
943

Estimation of Travel Time Distribution and Travel Time Derivatives

Wan, Ke, 4 December 2014
Given the complexity of transportation systems, generating optimal routing decisions is a critical issue. This thesis focuses on how routing decisions can be computed by considering the distribution of travel time and the associated risks. More specifically, the routing decision process is modeled in a way that explicitly considers the dependence between the travel times of different links and the risks associated with the volatility of travel time. Furthermore, the computation of this volatility allows for the development of the travel time derivative, a financial derivative based on travel time. It serves as a value or congestion pricing scheme based not only on the level of congestion but also on its uncertainties. In addition to the introduction (Chapter 1), the literature review (Chapter 2), and the conclusion (Chapter 6), the thesis consists of two major parts.

In part one (Chapters 3 and 4), the travel time distribution for transportation links and paths, conditioned on the latest observations, is estimated to enable routing decisions based on risk. Chapter 3 sets up the basic decision framework by modeling the dependence structure between the travel time distributions of nearby links using the copula method. In Chapter 4, the framework is generalized to estimate the travel time distribution for a given path using Gaussian copula mixture models (GCMM). To exploit data on fundamental traffic conditions, a scenario-based GCMM is studied: a distribution over path scenarios representing path traffic status is first defined; the dependence structure between the links constituting the path is then modeled as a Gaussian copula for each scenario, and the scenario-wise path travel time distribution is obtained from this copula. The final estimate is calculated by integrating the scenario-wise path travel time distributions over the distribution of path scenarios; in a discrete setting, it is a weighted sum of these conditional travel time distributions. Different estimation methods are employed depending on whether the path scenarios are observable: an explicit two-step maximum likelihood method is used for GCMMs based on observable path scenarios, while for GCMMs based on unobservable path scenarios, extended Expectation-Maximization (EM) algorithms are designed to estimate the model parameters, introducing innovative copula-based machine learning methods.

In part two (Chapter 5), travel time derivatives are introduced as financial derivatives based on road travel time, a non-tradable underlying asset. This is proposed as a more fundamental approach to value pricing. The chapter addresses (a) the motivation for introducing such derivatives (that is, the demand for hedging), (b) the potential market, and (c) the product design and pricing schemes. Pricing schemes are designed based on travel time data captured by real-time sensors, modeled as Ornstein-Uhlenbeck processes and, more generally, continuous-time autoregressive moving-average (CARMA) models. The risk-neutral pricing principle is used to generate the derivative price, with suitably designed procedures to identify the market value of risk.
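
The copula idea at the core of Chapters 3 and 4 can be illustrated in a few lines: couple per-link travel-time marginals with a Gaussian copula and obtain the path travel-time distribution by Monte Carlo. The lognormal marginals, correlation value, and sample size below are illustrative assumptions; a full GCMM would weight several such copulas, one per traffic scenario, by the scenario probabilities:

```python
# Gaussian copula over two adjacent links: correlated normals -> uniforms ->
# per-link marginals, then sum to get the path travel time.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Two links with lognormal travel-time marginals (minutes).
marginals = [stats.lognorm(s=0.4, scale=8.0), stats.lognorm(s=0.5, scale=12.0)]

# Gaussian copula: draw correlated normals, map to uniforms via the normal CDF.
rho = 0.6  # dependence between the links' congestion states (assumed)
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=100_000)
u = stats.norm.cdf(z)  # uniform marginals, Gaussian dependence structure

# Push the correlated uniforms through each link's marginal, then sum.
link_times = np.column_stack([m.ppf(u[:, i]) for i, m in enumerate(marginals)])
path_time = link_times.sum(axis=1)
print("mean:", path_time.mean(), "95th percentile:", np.percentile(path_time, 95))
```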
944

System Goodput (GS): A modeling and simulation approach to refute current thinking regarding system-level quality of service

Sahlin, John P., 11 February 2014
This dissertation takes a modeling and simulation approach to determining whether distributed computing architectures (e.g., Cloud Computing) require state-of-the-art servers to ensure top performance, and whether alternate approaches can optimize Quality of Service by reducing operating costs while maintaining high overall system performance. The author first investigated the origins of Cloud Computing to confirm that the model of distributed computing architectures still applies to the Cloud Computing business model. After establishing that Cloud Computing is in fact a new iteration of an existing architecture, the author conducted a series of modeling and simulation experiments using the OPNET Modeler system dynamics tool to evaluate whether variations in server infrastructure alter the overall system performance of a distributed computing environment. The modeling exercise compared state-of-the-art commodity Information Technology (IT) servers with servers meeting the Advanced Telecommunications Computing Architecture (AdvancedTCA or ATCA) open standard, which are generally at least one generation behind commodity servers in component performance benchmarks. After modeling an enterprise IT environment and simulating network traffic with OPNET Modeler, the author concluded, based on ANOVA/Tukey and Kruskal-Wallis multivariate analysis of the simulation results, that using AdvancedTCA servers for the consolidation effort causes no system-level performance degradation. To conduct this comparison, the author developed a system-level performance benchmark, System Goodput (GS), representing the end-to-end performance of services, a more appropriate measure for distributed systems such as Cloud Computing. The analysis showed that individual component benchmarks are not an accurate predictor of system-level performance. Having established that slower servers (e.g., ATCA) do not affect overall system performance in a Cloud Computing environment, the author developed a model for optimizing system-level Quality of Service (QoS) for Cloud Computing infrastructures that relies on the more rugged ATCA servers to extend the service life of the environment, resulting in a much lower Total Ownership Cost (TOC) for the Cloud Computing infrastructure provider.
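
The analysis pattern (define a system-level, end-to-end metric, then compare platforms with a nonparametric test) can be sketched as follows. The goodput formula and latency distributions here are invented stand-ins; the dissertation defines GS precisely and draws its samples from OPNET simulations:

```python
# Compare an end-to-end metric across two server platforms with Kruskal-Wallis.
# Synthetic latency samples stand in for simulation output.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def goodput(payload_bytes, end_to_end_seconds):
    """Illustrative system-level goodput: payload delivered per unit of e2e latency."""
    return payload_bytes / end_to_end_seconds

# End-to-end latencies (s) for the same 1 MB transaction on two platforms;
# the ATCA platform is assumed only marginally slower per component.
commodity = goodput(1e6, rng.lognormal(mean=-0.70, sigma=0.25, size=500))
atca      = goodput(1e6, rng.lognormal(mean=-0.68, sigma=0.25, size=500))

# Kruskal-Wallis H-test: does platform choice shift system-level performance?
h, p = stats.kruskal(commodity, atca)
print(f"H = {h:.2f}, p = {p:.3f}")  # a large p suggests no detectable difference
```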
945

Modelling Forest Fire Initial Attack Airtanker Operations

Clark, Nicholas A., 21 November 2012
The Ontario Ministry of Natural Resources suppresses forest fires with airtankers that now carry onboard GPS units tracking their real-time location, velocity, and altitude. However, the GPS data does not indicate which fire is being fought, the time each airtanker spends travelling to and from each fire, or the time spent flying between each fire and the lake from which the airtanker scoops water to drop on the fire. A pattern recognition algorithm was developed and used to determine what was happening at each point along an airtanker's track, including the time and location of every water pickup. This pre-processed data was used to develop detailed models of the airtanker service process. A discrete-event simulation model of the initial attack airtanker system was also developed and used to show how service process models can be incorporated into other models to help solve complex airtanker management decision-making problems.
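
A sketch of the kind of rule-based pattern recognition the abstract describes: label each GPS fix by flight state and flag water pickups as sustained low-and-slow runs. The record layout, thresholds, and grouping rule are invented for illustration, not taken from the thesis:

```python
# Segment an airtanker GPS track and detect water pickups (scooping runs).
from dataclasses import dataclass

@dataclass
class Fix:
    t: float        # seconds since mission start
    alt_agl: float  # altitude above ground/water, metres (assumed available)
    speed: float    # ground speed, km/h

def label(fix: Fix) -> str:
    """Classify a single fix with illustrative thresholds."""
    if fix.alt_agl < 10 and fix.speed < 150:
        return "scooping"        # skimming a lake to take on water
    if fix.alt_agl < 60:
        return "drop_or_circuit"
    return "transit"

def water_pickups(track, min_duration=5.0):
    """Group consecutive 'scooping' fixes; keep runs longer than min_duration s."""
    pickups, start = [], None
    for prev, cur in zip(track, track[1:] + [None]):
        if label(prev) == "scooping" and start is None:
            start = prev.t
        if start is not None and (cur is None or label(cur) != "scooping"):
            if prev.t - start >= min_duration:
                pickups.append((start, prev.t))
            start = None
    return pickups

# Toy track: transit, two scooping fixes over a lake, then climb-out.
track = [Fix(0, 300, 280), Fix(5, 8, 120), Fix(10, 7, 115), Fix(15, 250, 260)]
print(water_pickups(track))  # -> [(5, 10)]
```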
946

Topics in Wind Farm Layout Optimization: Analytical Wake Models, Noise Propagation, and Energy Production

Zhang, Yun, 17 July 2013
Wind farm layout optimization (WFLO) is the design of wind turbine layouts subject to various financial and engineering objectives and constraints. The first topic of this thesis is the solution of two variants of WFLO that use different analytical aerodynamic models, illustrating deep integration of the wake models into mixed-integer programs (MIPs) and constraint programs (CPs). Formulating WFLO as a MIP or CP enables more quantitative analysis than earlier heuristic studies could provide, and allows practitioners to tackle the WFLO problem with an off-the-shelf optimization solver. The second topic is a version of WFLO with two competing objectives, minimization of noise and maximization of energy, addressed with a genetic algorithm (NSGA-II). Under these two objectives, solutions are presented that illustrate the flexibility of this optimization framework in supplying a spectrum of design choices with different numbers of turbines and different levels of noise and energy output.
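
One way to see what integrating layout decisions into a mixed-integer program can look like is a toy placement model. The sketch below (using the PuLP modeling library) maximizes a per-cell energy score under a turbine budget and pairwise separation constraints; the separation constraints are a crude stand-in for the thesis's analytical wake models, and all data are invented:

```python
# Toy WFLO as a MIP: choose K grid cells for turbines, maximizing a per-cell
# energy score subject to minimum spacing between any two chosen cells.
import itertools, math
import pulp

N, K, MIN_SEP = 6, 5, 2.0            # grid size, turbine budget, min spacing (cells)
cells = [(i, j) for i in range(N) for j in range(N)]
energy = {(i, j): 1.0 + 0.05 * i for (i, j) in cells}   # toy wind gradient

prob = pulp.LpProblem("wflo", pulp.LpMaximize)
x = pulp.LpVariable.dicts("x", cells, cat="Binary")     # 1 = turbine in cell

prob += pulp.lpSum(energy[c] * x[c] for c in cells)     # total energy score
prob += pulp.lpSum(x[c] for c in cells) == K            # place exactly K turbines
for a, b in itertools.combinations(cells, 2):           # spacing (wake proxy)
    if math.dist(a, b) < MIN_SEP:
        prob += x[a] + x[b] <= 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([c for c in cells if x[c].value() == 1])
```

The thesis's formulations go further, encoding the wake models themselves (and hence inter-turbine energy losses) in the constraints and objective rather than a simple spacing proxy.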
949

Applications of forecasting and optimisation in the Australian national electricity market

Baloi, C. A., date unknown
No description available.
