1 |
A Hybrid Cost Model for Evaluating Query Execution Plans. Wang, Ning. 22 January 2024.
Query optimization aims to select a query execution plan from among all candidate plans for a given query. The query optimizers of traditional relational database management systems (RDBMSs) rely on a cost model to estimate the cost of the alternative plans in the search space. The classic cost model (CCM) may lead the optimizer to choose query plans with poor execution time due to inaccurate cardinality estimations and simplifying assumptions. A learned cost model (LCM) based on machine learning does not rely on such estimations and instead learns costs from runtime observations. While learned cost models have been shown to improve average performance, they do not guarantee that good performance is achieved consistently. In addition, the query plans generated using the LCM may not necessarily outperform the query plans generated with the CCM. This thesis proposes a hybrid approach that strikes a balance between the LCM and the CCM. The hybrid model uses the LCM when it is expected to be reliable in selecting a good plan and falls back to the CCM otherwise. The evaluation results of the hybrid model demonstrate promising performance, indicating potential for successful use in future applications.
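A minimal sketch of such a hybrid selector is given below, assuming hypothetical plan attributes (ccm_cost, lcm_cost) and an ensemble-style uncertainty score as the reliability signal; these names and the threshold are illustrative assumptions, not part of the thesis.

```python
# Hedged sketch of a hybrid plan chooser: trust the learned cost model (LCM)
# only when its predictions look reliable, otherwise fall back to the
# classic cost model (CCM). All attribute names are assumptions.
from dataclasses import dataclass
from typing import List


@dataclass
class Plan:
    name: str
    ccm_cost: float         # cost estimated by the classic cost model
    lcm_cost: float         # cost predicted by the learned cost model
    lcm_uncertainty: float  # e.g. variance across an ensemble of learned models


def choose_plan(plans: List[Plan], uncertainty_threshold: float = 0.2) -> Plan:
    """Pick the cheapest plan under the LCM when it is deemed reliable,
    and under the CCM otherwise."""
    if max(p.lcm_uncertainty for p in plans) <= uncertainty_threshold:
        return min(plans, key=lambda p: p.lcm_cost)   # LCM is trusted
    return min(plans, key=lambda p: p.ccm_cost)       # fall back to the CCM


if __name__ == "__main__":
    candidates = [
        Plan("hash_join_first", ccm_cost=120.0, lcm_cost=95.0, lcm_uncertainty=0.10),
        Plan("index_nested_loop", ccm_cost=90.0, lcm_cost=110.0, lcm_uncertainty=0.35),
    ]
    print(choose_plan(candidates).name)   # falls back to the CCM here
```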
|
2 |
Why are dividends sticky? Tsai, Chun-Li. 01 November 2005.
This dissertation investigates the sluggish adjustment process of dividend payments in the stock market. First, I focus on individual stocks. A casual investigation of observed dividends for individual stocks shows that dividend adjustments are sluggish and discrete; this is not consistent with Lintner's (1956) stylized fact, in which dividend adjustments are assumed to change continuously. Thus, I examine three possible explanations to account for dividend stickiness and discreteness: menu costs (i.e., a constant adjustment cost), decision-making delays, and dividend adjustment asymmetry. I reject Dixit's menu-cost model as an appropriate specification for the sluggish adjustment process of dividends. The empirical results imply that decision-making delays and dividend adjustment asymmetry might be possible explanations for sticky and discrete dividends on selected individual stocks.
Second, I focus on the aggregate stock market. I use a quadratic adjustment cost model to examine whether adjustment costs can explain the slow adjustment of aggregate dividends. The empirical results suggest that adjustment costs might be a significant factor explaining the slow dividend adjustment for the S&P 500. The estimated relative weight on adjustment costs depends on how the target dividend is specified. If target dividends are related to earnings, the empirical results suggest that adjustment costs are about forty-fold more important than the deviation cost between the actual dividend and the target level in determining the dynamic dividend adjustment process. If target dividends are specified as proportional to stock prices, adjustment costs are about fourteen-fold more important than the deviation cost between the actual dividend and the target level when managers determine dividends.
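A quadratic adjustment cost model of this kind is commonly written as the minimization of a discounted weighted sum of deviation costs and adjustment costs; the formulation below is a hedged reconstruction rather than the dissertation's exact specification, with the relative weight $\lambda$ corresponding to the roughly forty-fold (earnings-based target) and fourteen-fold (price-based target) estimates quoted above.

```latex
\min_{\{D_{t+s}\}} \; \mathbb{E}_t \sum_{s=0}^{\infty} \beta^{s}
  \left[ \bigl(D_{t+s} - D^{*}_{t+s}\bigr)^{2}
       + \lambda \bigl(D_{t+s} - D_{t+s-1}\bigr)^{2} \right]
```

Here $D_t$ is the actual dividend, $D_t^{*}$ the target dividend (tied to earnings or to the stock price), $\beta$ a discount factor, and $\lambda$ the weight on adjustment costs relative to the deviation cost.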
|
3 |
Transferable rights in a recreational fishery: an application to the red snapper fishery in the Gulf of Mexico. Kim, Hwa Nyeon. 17 September 2007.
Overfishing of red snapper in the Gulf of Mexico has increased significantly in recent years. A major regulation to reduce the overfishing is a Total Allowable Catch (TAC) in combination with a season closure. The restrictions on entry lead to an inefficient outcome, however, because the resource is not used by the fishermen who value it the most. As an alternative to restricting entry, transferable rights (TR) programs are being increasingly considered. Under a TR program, a market is created to trade a right to use a resource, and the total benefits of the participants are maximized through such trade.

The principal objective of this dissertation is to comprehensively assess the economic and biological consequences of a TR program for the red snapper fishery. To date, the literature lacks sufficient discussion of how recreational TR programs would function. I therefore propose an economically desirable institutional framework for the TR program in the recreational fishery. I draw lessons from hunting programs and from applications of other TR programs to find better schemes for the TR program in the recreational fishery.

This dissertation uses theoretical and empirical models as well as institutional settings to develop the TR program. A theoretical model is provided to investigate which unit of measurement for the TRs is preferable. For the empirical models, I first estimate an empirically based recreation demand model that incorporates TR permit demand and then develop a simulation submodel using the estimated demand. I find that price instruments, such as fees or TR programs, are effective at reducing fishing trips, but they also have distributional impacts on trips by low-income (or low-cost) anglers. Partial simulation results indicate that the efficiency benefit of the TR program would be significant because recreational trip demand in the currently closed season is not trivial.

I conclude that a TR program in the recreational fishery offers considerable economic and biological merit for reducing overfishing and would provide a substantial efficiency gain to Gulf anglers. Some institutional barriers, especially large transaction costs, can also be overcome if electronic systems or the Internet are used.
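A hedged sketch of the kind of simulation submodel described above: an assumed exponential trip-demand curve is evaluated at different permit prices to show how a price instrument reduces trips and how the burden falls differently across low- and high-cost anglers. All coefficients and prices are illustrative, not the dissertation's estimates.

```python
# Illustrative trip-demand simulation; the demand form and parameters are assumed.
import math


def expected_trips(travel_cost, permit_price, base_trips=10.0, price_coef=-0.05):
    """Expected seasonal trips for one angler as a function of total per-trip cost."""
    return base_trips * math.exp(price_coef * (travel_cost + permit_price))


def aggregate_trips(travel_costs, permit_price):
    """Sum expected trips over a heterogeneous set of anglers."""
    return sum(expected_trips(c, permit_price) for c in travel_costs)


if __name__ == "__main__":
    anglers = [20.0, 40.0, 80.0]          # low-, mid- and high-cost anglers
    for price in (0.0, 10.0, 25.0):       # hypothetical permit/fee prices
        trips = [round(expected_trips(c, price), 2) for c in anglers]
        print(price, trips, round(aggregate_trips(anglers, price), 2))
```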
|
4 |
Decision Support System (DSS) for Machine Selection: A Cost Minimization Model. Mendez Pinero, Mayra I. 16 January 2010.
Within any manufacturing environment, the selection of production or assembly machines is part of the day-to-day responsibilities of management. This is especially true when there are multiple types of machines that can be used to perform each assembly or manufacturing process. As a result, it is critical to find the optimal way to select machines when there are multiple related assembly machines available. The objective of this research is to develop and present a model that can provide guidance to management when making machine selection decisions for parallel, non-identical, related electronics assembly machines. A model-driven Decision Support System (DSS) is used to solve the problem, with an emphasis on optimizing available resources and minimizing production disruption, thus minimizing cost. The variables that affect electronics product costs are considered in detail. The first part of the Decision Support System was developed using Microsoft Excel as an interactive tool. The second part was developed through mathematical modeling, with the AMPL9 mathematical programming language and the CPLEX90 solver as the optimization tools. The mathematical model minimizes the total cost of all products using logic similar to the shortest processing time (SPT) scheduling rule. The model balances machine workload up to an allowed imbalance factor. It also considers the impact on product cost when expediting production. Different scenarios were studied during the sensitivity analysis, including varying the number of assembled products, the quantity of machines at each assembly process, the imbalance factor, and the coefficient of variation (CV) of the assembly processes. The results show that the higher the CV, the higher the total cost of all assembled products, owing to the complexity of balancing machine workload for a large number of products. Also, when the number of machines increased for a constant number of products, the total cost of all assembled products increased because it is more difficult to keep the machines balanced. Similar results were obtained when a tighter imbalance factor was used.
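The flavour of the optimization can be illustrated with a greedy, SPT-ordered assignment plus an imbalance check; this is a simplified stand-in for the AMPL/CPLEX model, and the machine names, processing times and imbalance rule are assumptions.

```python
# Hedged sketch: assign products to parallel machines in shortest-processing-time
# order, always to the least-loaded machine, then verify the imbalance factor.
def assign_products(products, machines, imbalance_factor=1.5):
    """products: list of (name, processing_time); machines: list of machine names.
    Returns machine -> total load, or None if the balance limit is violated."""
    load = {m: 0.0 for m in machines}
    for name, t in sorted(products, key=lambda p: p[1]):   # SPT order
        least_loaded = min(load, key=load.get)
        load[least_loaded] += t
    max_load, min_load = max(load.values()), min(load.values())
    if min_load > 0 and max_load / min_load > imbalance_factor:
        return None   # this simple heuristic cannot meet the balance constraint
    return load


if __name__ == "__main__":
    prods = [("A", 3.0), ("B", 5.0), ("C", 2.0), ("D", 4.0)]
    print(assign_products(prods, ["M1", "M2"]))   # e.g. {'M1': 6.0, 'M2': 8.0}
```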
|
5 |
Cloud De-Duplication Cost Model. Hocker, Christopher. 22 June 2012.
No description available.
|
6 |
An investigation into the data collection process for the development of cost models. Delgado-Arvelo, Ysolina. January 2012.
This thesis is the result of many years of research in the field of manufacturing cost modelling. It focuses in particular on the data collection process for the development of manufacturing cost models in the UK aerospace industry, with no less important contributions from other areas such as construction, process industries and software development. The importance of adopting an effective model development process is discussed and a new Cost Model Development (CMD) Methodology is proposed. In this respect, little research has considered the development of cost models from the point of view of a standard and systematic methodology, which is essential if an optimum process is to be achieved. A Model Scoping Framework, a functional Data Source and Data Collection Library, and a referential Data Type Library are the core elements of the proposed Cost Model Development Methodology. The research identified a number of individual data collection methods, along with a comprehensive list of data sources and data types, from which the essential data for developing cost models could be collected. A taxonomy based upon sets of generic characteristics for describing the individual data collection methods, data sources and data types was developed. The methods, tools and techniques were identified and categorised according to these generic characteristics, providing information for selecting between alternative methods, tools and techniques. The need to perform frequent iterations of data collection, data identification, data analysis and decision-making tasks until an acceptable cost model has been developed is an inherent feature of the cost model development process (CMDP). It is expected that the proposed Model Scoping Framework will assist cost engineering and estimating practitioners in defining the features and activities of the process and the attributes of the product for which a cost model is required, and in identifying the cost model characteristics before the tasks of data identification and collection start. It offers a structured way of looking at the relationship between data sources, cost model characteristics, and data collection tools and procedures. The aim is to make the planning process for developing cost models more effective and efficient, and consequently to reduce the time needed to generate cost models.
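One way to picture the taxonomy is as records that attach a set of generic characteristics to each data collection method, so that alternatives can be compared and selected; the field names and example entries below are illustrative assumptions, not the thesis's actual taxonomy.

```python
# Hypothetical sketch of taxonomy records with generic selection characteristics.
from dataclasses import dataclass
from typing import List


@dataclass
class CollectionMethod:
    name: str
    typical_sources: List[str]   # e.g. ERP records, process owners
    data_types: List[str]        # e.g. labour hours, material rates
    effort: str                  # relative collection effort: low / medium / high
    reliability: str             # expected data quality: low / medium / high


methods = [
    CollectionMethod("structured interview", ["process owners"],
                     ["activity times"], effort="high", reliability="medium"),
    CollectionMethod("ERP extract", ["MRP/ERP system"],
                     ["material costs"], effort="low", reliability="high"),
]

# Simple illustrative selection rule: prefer low-effort, high-reliability methods.
ranked = sorted(methods, key=lambda m: (m.effort != "low", m.reliability != "high"))
print([m.name for m in ranked])   # ['ERP extract', 'structured interview']
```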
|
7 |
High Performance Analytics in Complex Event Processing. Qi, Yingmei. 02 January 2013.
Complex Event Processing (CEP) is the technology of choice for high-performance analytics in time-critical decision-making applications. Although current CEP systems support sequence pattern detection on continuous event streams, they do not support the computation of aggregated values over the matched sequences of a query pattern. Instead, aggregation is typically applied as a post-processing step after CEP pattern detection, leading to an extremely inefficient solution for sequence aggregation. Meanwhile, state-of-the-art aggregation techniques over traditional stream data are not directly applicable in the context of the sequence semantics of CEP. In this paper, we propose an approach, called A-Seq, that pushes the aggregation computation into the sequence pattern detection process. A-Seq computes aggregations online by dynamically recording compact partial sequence aggregates without ever constructing the to-be-aggregated matched sequences. Techniques are devised to tackle the key CEP-specific challenges for aggregation, including sliding window semantics, event purging, and sequence negation. For scalability, we further introduce the Chop-Connect methodology, which enables sequence aggregation sharing among queries with arbitrary substring relationships. Lastly, our cost-driven optimizer selects a shared execution plan for effectively processing a workload of CEP aggregation queries. Our experimental study using real data sets demonstrates over four orders of magnitude efficiency improvement for the proposed A-Seq approach, across a wide range of tested scenarios, compared to state-of-the-art solutions, thus achieving high-performance CEP aggregation analytics.
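The core idea of folding aggregation into pattern detection can be illustrated for a simple two-step pattern SEQ(A, B) with COUNT and SUM aggregates: partial aggregates are updated as events arrive, and the matched (A, B) pairs are never materialized. This is a hedged, simplified illustration of the general technique, not the A-Seq algorithm itself, and the event layout is assumed.

```python
# Hedged sketch: online COUNT/SUM over matches of SEQ(A, B) in a sliding window,
# without constructing the matched pairs. Event fields are assumptions.
from collections import namedtuple

Event = namedtuple("Event", "type value ts")


class SeqAggregator:
    """Every B event pairs with all A events still inside the time window."""

    def __init__(self, window):
        self.window = window
        self.a_times = []        # timestamps of live A events (for purging)
        self.match_count = 0     # COUNT over matched sequences
        self.value_sum = 0.0     # SUM(B.value) over matched sequences

    def _purge(self, now):
        # Event purging: drop A events that fell out of the sliding window.
        while self.a_times and now - self.a_times[0] > self.window:
            self.a_times.pop(0)

    def on_event(self, e):
        self._purge(e.ts)
        if e.type == "A":
            self.a_times.append(e.ts)
        elif e.type == "B":
            live_a = len(self.a_times)      # compact partial aggregate
            self.match_count += live_a
            self.value_sum += live_a * e.value


if __name__ == "__main__":
    agg = SeqAggregator(window=10.0)
    for ev in [Event("A", 0, 1), Event("A", 0, 3), Event("B", 5.0, 4),
               Event("A", 0, 15), Event("B", 2.0, 16)]:
        agg.on_event(ev)
    print(agg.match_count, agg.value_sum)   # 3 12.0
```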
|
8 |
The Optimal Strategy for Executing Vendor Managed Inventory. Lin, Wei-chih. 03 January 2006.
This paper evaluates how a firm uses vendor managed inventory (VMI) to manage inventories under cost constraints in an uncertain industrial environment with varying demand. The emphasis of this research is on how demand variation influences the effects of executing VMI and how different supplier levels (ABC) affect the firm's VMI strategy. This research built a cost model containing construction cost, overhead cost, inventory cost, and shortage cost, and compared the profits before using VMI with the profits when using VMI. Furthermore, the same model is used to find the most appropriate VMI decisions under different supplier-level conditions.
This paper reaches two main conclusions. First, a firm using VMI under uncertain demand gains more advantage than under certain demand. Second, the model can identify the best VMI strategy for a firm.
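A hedged sketch of the cost side of such a comparison is shown below: per-period cost is built from assumed construction, overhead, inventory and shortage components, and averaged over simulated demand under certain and uncertain scenarios. The parameters, stock policies and demand distributions are illustrative assumptions, not the paper's model.

```python
# Illustrative cost model with assumed parameters and demand distributions.
import random


def period_cost(demand, stock, unit_inventory_cost=1.0, unit_shortage_cost=5.0,
                overhead=10.0, construction_amortized=2.0):
    """Per-period cost: amortized construction + overhead + holding + shortage."""
    leftover = max(stock - demand, 0)
    shortfall = max(demand - stock, 0)
    return (construction_amortized + overhead
            + unit_inventory_cost * leftover
            + unit_shortage_cost * shortfall)


def average_cost(stock_level, demand_sampler, periods=10_000):
    return sum(period_cost(demand_sampler(), stock_level)
               for _ in range(periods)) / periods


if __name__ == "__main__":
    random.seed(0)
    scenarios = {"certain": lambda: 100, "uncertain": lambda: random.randint(60, 140)}
    for name, sampler in scenarios.items():
        # Compare a hypothetical pre-VMI stock policy (90 units) with a
        # hypothetical VMI-style policy that tracks demand more closely (105 units).
        print(name, round(average_cost(90, sampler), 2),
              round(average_cost(105, sampler), 2))
```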
|
9 |
A Process Based Cost Model for Multi-Layer Ceramic Manufacturing of Solid Oxide Fuel Cells. Koslowske, Mark T. 10 August 2003.
"Planar Solid Oxide Fuel Cell manufacturing can be considered in the pilot plant stage with efforts driving towards large volume manufacturing. The science of the solid oxide fuel cell is advancing rapidly to expand the knowledge base and use of material combinations and layer forming methods for the unit cell. Few of the many processing methods, over 15, reported in literature for layer formation are used today in high volume manufacturing. It is difficult to establish future market demand and cost levels needed to plan a course of action today. The need to select amongst different designs, materials and processes will require a tool to aid in these decisions. A modeling tool is presented to robustly compare the various process combinations and manufacturing variable to make solid oxide fuel cells in order to identify key trends prior to making strategic investment decisions. The ability to accurately forecast investment requirements and manufacturing cost for a given high volume manufacturing (HVM) process based on expected volume is critical for strategic decisions, product placement and investor communications. This paper describes the use of an updated process based cost model that permits the comparison of manufacturing cost data for various process combinations, production volumes, and electrolyte layer thickness tolerances. The effect of process yield is addressed. Processing methods discussed include tape casting, screen printing and sputtering."
|