About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

An empirical investigation into the estimation of software development effort

Hughes, Robert T. January 1997
Any guidance that might reduce the problems of accurately estimating software development effort could help software producers set more realistic budgets for software projects. This investigation contributes to that aim by documenting some of the practical problems of introducing structured effort estimation models at a United Kingdom site of an international supplier of telephone switching software. The theory of effort modelling was compared with actual practice by examining how the estimating experts at the telephone switching software producer currently carried out estimating. Two elements of the estimation problem emerged: judging the size of the job to be done and gauging the productivity of the development environment. Expert opinion was particularly important to the initial process, especially when existing software was being enhanced. The study then identified development effort drivers and customised effort models applicable to real-time telecommunications applications. Many practical difficulties were found concerning the actual methods used to record past project data, although the issues surrounding these protocols appeared to be rarely dealt with explicitly in the research literature. The effectiveness of the models was trialled by forecasting the effort for new projects and comparing these estimates with the actual effort. The key research outcomes were, firstly, the identification and validation of a set of relevant functional effort drivers applicable in a real-time telecommunications software development environment and the building of an effective effort model, and, secondly, the evaluation of alternative prediction approaches, including analogy or case-based reasoning. While analogy was a useful tool, some methods of implementing analogy were flawed theoretically and did not consistently outperform 'traditional' model-building techniques such as Least Squares Regression (LSR) in the environment under study. This study would, however, support analogy as a complementary technique to algorithmic modelling.
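Since the abstract contrasts analogy (case-based reasoning) with Least Squares Regression, a minimal sketch of the two approaches may help; the past-project data, the choice of effort drivers, and k are invented for illustration, not taken from the thesis.

    # Illustrative only: analogy-based estimation vs. least-squares
    # regression on made-up past-project data (drivers are hypothetical).
    import math

    # Each past project: ((function size, interface count), effort in person-months)
    past = [
        ((120, 3), 14.0),
        ((200, 5), 25.0),
        ((80, 2), 9.0),
        ((150, 4), 18.0),
    ]

    def analogy_estimate(target, k=2):
        """Mean effort of the k most similar past projects (Euclidean distance)."""
        ranked = sorted(past, key=lambda p: math.dist(p[0], target))
        return sum(effort for _, effort in ranked[:k]) / k

    def lsr_estimate(target):
        """One-variable least-squares fit: effort ~ a + b * size."""
        xs = [p[0][0] for p in past]
        ys = [p[1] for p in past]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
        a = my - b * mx
        return a + b * target[0]

    new_project = (140, 4)
    print(f"analogy: {analogy_estimate(new_project):.1f} person-months")
    print(f"LSR:     {lsr_estimate(new_project):.1f} person-months")

As the study found, neither estimator dominates in general; the analogy result depends heavily on how similarity and k are chosen.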
2

Descriptive multi-resource and multi-project cost models for subcontractors

Kim, Dae Young, 1977- 21 July 2015
Subcontractors have finite resources that must be allocated simultaneously and dynamically across many projects. Concurrent multi-projects with limited resources pose significant scheduling problems. Unfortunately, lacking appropriate models, subcontractors mostly rely on previous experience when allocating their resources to multiple projects. Moreover, subcontractors frequently reallocate their resources in response to schedule changes and site conditions. Before reallocating resources in response to schedule changes or project demand, these factors should be taken into account: site conditions, completion dates, overtime usage, productivity, and complementarity. This research developed a descriptive cost model for a subcontractor with multiple resources and multiple projects. The model was designed for a subcontractor to use as a decision-making tool for resource allocation and scheduling. It identified several factors affecting productivity and, when tested using hypothetical data, produced effective combinations of resource allocation with associated total costs. The objective of this thesis is to identify, using a descriptive model, the effect of productivity changes on the total cost of shifting crews across projects. The subcontractor minimizes total costs by balancing overtime costs, tardiness penalties, and incentive bonuses while satisfying available processing-time constraints.
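A small sketch of the trade-off the abstract describes, balancing regular and overtime labour against tardiness penalties and incentive bonuses across projects; the rates, hours, and field names are all hypothetical.

    # Hypothetical cost of one crew allocation across two projects:
    # labour (regular + overtime) plus tardiness penalties minus
    # early-completion bonuses. All figures are invented.

    def allocation_cost(projects, regular_rate=30.0, overtime_rate=45.0):
        """projects: dicts with required hours, regular-time hours available,
        days late (negative = early), and penalty/bonus rates per day."""
        total = 0.0
        for p in projects:
            regular = min(p["hours_needed"], p["regular_hours"])
            overtime = max(0.0, p["hours_needed"] - p["regular_hours"])
            total += regular * regular_rate + overtime * overtime_rate
            if p["days_late"] > 0:
                total += p["days_late"] * p["penalty_per_day"]
            else:
                total -= -p["days_late"] * p["bonus_per_day"]
        return total

    plan = [
        {"hours_needed": 400, "regular_hours": 320, "days_late": 2,
         "penalty_per_day": 500.0, "bonus_per_day": 200.0},
        {"hours_needed": 250, "regular_hours": 280, "days_late": -3,
         "penalty_per_day": 400.0, "bonus_per_day": 150.0},
    ]
    print(f"total cost of this allocation: ${allocation_cost(plan):,.2f}")

A descriptive model of the kind described would evaluate many such candidate allocations and report their total costs rather than prescribe a single optimum.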
3

TRICARE versus FEHBP: a pilot study of comparative inpatient costs in region 10.

Hone, Anne Burke. January 1997
Thesis (M.S. in Operations Research), Naval Postgraduate School, June 1997. Thesis advisors: Donald P. Gaver, James A. Scaramozzino. Includes bibliographical references (p. 77-79). Also available online.
4

Comparative Analysis of the Cost Models Used for Estimating Renovation Costs of Universities in Texas

Faquih, Yaquta Fakhruddin August 2010
Facility managers use various cost models and techniques to estimate the cost of renovating a building and to secure the funds required for renovation. A literature search indicates that these techniques offer both advantages and disadvantages that need to be studied and analyzed. Descriptive statistical methods and qualitative analysis are employed to identify and compare the techniques facility managers use to calculate expected building renovation costs. The cost models presently used to predict cost and assemble the budget required for renovating a building were determined through interviews with ten Texas-based university facilities managers, and the data and information gathered were analyzed and compared. Analysis of the results suggests that traditional methods such as the Floor Area Method (FAM) are the most accurate, the least time consuming, easy to use, and convenient for data collection. Case-Based Reasoning (CBR), though not as widely used as FAM, is known to facilities managers; its limited use stems from the fact that if a new type of project needs to be renovated and data for a similar project are not available, a completely new database must be created. This issue could be resolved by creating a common forum where data for all types of projects are made available to facilities managers. Methods such as regression analysis and neural networks are known to give more accurate results; however, of the ten interviewees, only one was aware of these newer models, and did not use them because they are chiefly helpful for very large projects and require expertise. Such models should therefore be simplified so that they not only give accurate results in less time but are also easy to use. These results may inform discussion of changes needed within the various cost models.
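The Floor Area Method the interviewees favoured reduces to simple arithmetic: gross floor area times a unit rate, possibly adjusted for condition. A minimal sketch follows; the unit rates and condition factor are invented, not drawn from the interviews.

    # Floor Area Method (FAM) sketch with hypothetical unit rates.
    UNIT_RATES = {          # $ per square foot, invented
        "classroom": 95.0,
        "laboratory": 210.0,
        "office": 80.0,
    }

    def fam_estimate(area_sqft, space_type, condition_factor=1.0):
        """Renovation cost = area x unit rate x condition adjustment."""
        return area_sqft * UNIT_RATES[space_type] * condition_factor

    # 12,000 sq ft of laboratory space in below-average condition
    print(f"estimate: ${fam_estimate(12_000, 'laboratory', 1.15):,.0f}")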
5

Querying sensor networks: requirements, semantics, algorithms and cost models

Brenninkmeijer, Christian Y. A. January 2010
No description available.
6

Applying Costing Models for Competitive Advantage

Petcavage, Sheila 01 January 2016
Making good supply management decisions is essential to competing in the global market, as these decisions often account for more than 60% of the average company's total costs. The purpose of this single case study was to explore the strategy that a large manufacturing firm in northeast Ohio used to identify costs when making effective purchasing decisions. The total cost of ownership (TCO) theory was the conceptual framework for the study. Data collection included a semistructured interview with a senior-level supply manager and a focus group consisting of mid-level supply managers. Member checking provided verification of the interpreted participants' responses, and methodological triangulation drew on 2 company documents pertinent to the supply management department. The analysis resulted in 4 emerging themes: identifying total costs, tools for implementing TCO, supplier rating and management, and detailed recordkeeping. The findings of this study revealed a simpler approach to capturing and organizing data than was acknowledged in the literature reviewed, and showed that TCO supported purchasing decisions that often resulted in domestically or regionally purchased products rather than offshore buys. Reassessment of true total costs by senior manufacturing supply managers might therefore contribute to social change as more procurement decisions forgo offshore sourcing and bring product manufacturing back to local communities.
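As an illustration of the TCO framing (not the firm's actual data), the sketch below compares an offshore and a domestic buy once freight, duties, inventory carrying, quality, and administrative costs are added to unit price; every figure is hypothetical.

    # Total cost of ownership: the cheapest unit price is not always
    # the cheapest purchase. All numbers below are invented.

    def total_cost_of_ownership(unit_price, units, freight, duties,
                                inventory_carrying, quality_costs,
                                admin_and_expediting):
        """Sum acquisition, possession, and post-purchase costs."""
        return (unit_price * units + freight + duties
                + inventory_carrying + quality_costs + admin_and_expediting)

    offshore = total_cost_of_ownership(8.20, 10_000, freight=14_000,
                                       duties=6_500, inventory_carrying=9_000,
                                       quality_costs=7_500,
                                       admin_and_expediting=5_000)
    domestic = total_cost_of_ownership(9.10, 10_000, freight=3_000,
                                       duties=0, inventory_carrying=2_500,
                                       quality_costs=2_000,
                                       admin_and_expediting=1_500)
    print(f"offshore TCO: ${offshore:,.0f}   domestic TCO: ${domestic:,.0f}")

Here the offshore buy wins on unit price but loses on total cost, the pattern the study's findings describe.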
7

Economically optimum design of cusum charts when there is a multiplicity of assignable causes

Hsu, Margaretha Mei-Ing 02 March 2010
This study concerns the design of cumulative sum charts based on a minimum-cost criterion when there are multiple assignable causes occurring randomly but with known effect. A cost model is developed that relates the design parameters of a cusum chart (sampling interval, decision limit, reference value, and sample size) and the cost and risk factors of the process to the long-run average loss-cost per hour for the process. Optimum designs for various sets of cost and risk factors are found by minimizing the long-run average loss-cost per hour with respect to the design parameters of the cusum chart. Optimization is accomplished by Brown's method, and a modified Brownian-motion approximation is used for calculating ARLs in the cost model. The nature of the loss-cost function is investigated numerically, and the effects of changes in the design parameters and in the cost and risk factors are also studied. An investigation of the limiting behavior of the loss-cost function as the decision limit approaches infinity reveals that in some cases there exist points that yield a lower loss-cost than the local minimum obtained by Brown's method. It is conjectured that if the model is extended to include a more realistic assumption about the occurrence of assignable causes, only the local minimum solutions will remain. This study also shows that the multiple-assignable-cause model can be well approximated by a matched single-cause model, so in practice it may be sufficient to find the optimum design for the matched single-cause model.
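For reference, a minimal sketch of the one-sided tabular cusum statistic such charts monitor; the reference value k and decision limit h correspond to two of the four design parameters above, while the data and parameter values are invented.

    # Upper one-sided cusum: S_i = max(0, S_{i-1} + (x_i - target - k)).
    # A signal is raised when S_i exceeds the decision limit h.

    def cusum_upper(samples, target, k, h):
        """Return the index of the first out-of-control signal, or None."""
        s = 0.0
        for i, x in enumerate(samples):
            s = max(0.0, s + (x - target - k))
            if s > h:
                return i
        return None

    data = [10.1, 9.8, 10.2, 10.6, 10.9, 11.2, 11.0, 11.4]  # invented
    print(f"signal at sample index: {cusum_upper(data, target=10.0, k=0.25, h=2.0)}")

The economic design problem in the thesis is then to choose k, h, the sample size, and the sampling interval so as to minimize the long-run average loss-cost per hour.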
8

Strategic and operational capabilities in steel production : Product variety and performance

Storck, Joakim January 2009
Steel producers that employ niche market strategies are continuously seeking to reduce production cost while maintaining a diverse product mix. The business model is typically based on marketing of high-strength special or stainless steels. However, the desire to avoid direct cost competition gradually leads, over time, to increased product variety and smaller order volumes (tonnes per order) for each product. This thesis analyses how production cost is linked to product variety in steel strip production. The results are based on new models for assessing opportunities for performance improvement in high product-variety steel production. The need for flexible production processes increases with increasing product variety. Operational capabilities linked to process flexibility determine the extent to which steel producers can eliminate in-process inventory and accomplish close coupling between process steps. Niche market producers that invest in process flexibility improvements can lower production costs both through reduced work-in-process and through lower energy consumption; an additional benefit is reduced environmental impact. The following problems are addressed:
• Development of a method to assess the influence of product variety on performance in steel production.
• Development of models of continuous casting and hot rolling that account for product variety and cost effects under varying degrees of process flexibility.
• Development of a strategy process model that focuses on the strategic value of operational capabilities related to process flexibility.
Investments in operational capabilities regarding process flexibility have a strategic impact. An appreciation for the effects of process flexibility should permeate the organisation's daily work, since the accumulated contribution of many seemingly unimportant incremental changes significantly influences the strategic opportunities of the company.
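As a purely illustrative reading of the volume effect described above (not a model from the thesis), the toy calculation below charges each order a fixed changeover cost that greater process flexibility scales down; all figures are invented.

    # Cost per tonne rises as order volumes shrink, because a fixed
    # changeover cost is spread over fewer tonnes; higher flexibility
    # (a hypothetical scale factor) damps the effect.

    def cost_per_tonne(order_tonnes, base_cost=400.0,
                       changeover_cost=15_000.0, flexibility=1.0):
        return base_cost + (changeover_cost / flexibility) / order_tonnes

    for tonnes in (500, 100, 20):
        rigid = cost_per_tonne(tonnes, flexibility=1.0)
        flexible = cost_per_tonne(tonnes, flexibility=3.0)
        print(f"{tonnes:>4} t/order: rigid {rigid:7.1f}  flexible {flexible:7.1f}")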
9

Une méthode d'optimisation hybride pour une évaluation robuste de requêtes / A Hybrid Method for Robust Query Processing

Moumen, Chiraz 29 May 2017
The quality of an execution plan generated by a query optimizer is highly dependent on the quality of the estimates produced by the cost model. Unfortunately, these estimates are often imprecise. A body of work has been done to improve estimate accuracy; however, obtaining accurate estimates remains very challenging, since it requires prior and detailed knowledge of the data properties and run-time characteristics. Motivated by this issue, two main optimization approaches have been proposed. The first relies on single-point estimates to choose an optimal execution plan. At run-time, statistics are collected and compared with the estimates; if an estimation error is detected, a re-optimization is triggered for the rest of the plan. At each invocation, the optimizer uses specific values for the parameters required for cost calculations, so this approach can induce several plan re-optimizations, resulting in poor performance. To avoid this, a second approach considers the possibility of estimation errors at optimization time, modelled by the use of multi-point estimates for each error-prone parameter, with the aim of anticipating the reaction to possible plan sub-optimality. Methods in this approach seek to generate robust plans, which are able to provide good performance under several run-time conditions. These methods often assume that it is possible to find a robust plan for all expected run-time conditions; this assumption remains unjustified. Moreover, the majority of these methods keep an execution plan unchanged until termination, which can lead to poor performance if robustness is violated at run-time. Based on these findings, this thesis proposes a hybrid optimization method with two objectives: the production of robust execution plans, particularly when the uncertainty in the estimates is high, and the correction of a robustness violation during execution. The method uses intervals of estimates around error-prone parameters to produce execution plans that are likely to perform reasonably well over different run-time conditions, so-called robust plans. Robust plans are then augmented with check-decide operators, which collect statistics at run-time and check the robustness of the current plan. If robustness is violated, check-decide operators can decide on modifications to the rest of the plan to correct the violation without recalling the optimizer. Performance studies of the method indicate that it provides significant improvements in the robustness of query processing.
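The interval idea can be sketched as follows; the two candidate plans, their toy cost functions, and the selectivity interval are invented for illustration.

    # Robust plan choice over an interval of an error-prone parameter:
    # cost each plan at the interval endpoints and keep the plan with
    # the best worst case. Plans and cost functions are hypothetical.

    def plan_cost(plan, selectivity):
        if plan == "index-nested-loop":
            return 100 + 50_000 * selectivity   # cheap when few rows match
        return 4_000 + 2_000 * selectivity      # "hash-join": flatter cost

    interval = (0.01, 0.30)                     # uncertain selectivity
    plans = ["index-nested-loop", "hash-join"]
    robust = min(plans, key=lambda p: max(plan_cost(p, s) for s in interval))
    print(f"plan with the best worst-case cost: {robust}")

At run-time, a check-decide operator would compare the observed selectivity against the interval and trigger a correction of the remaining plan if it falls outside.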
10

An integrated cost model for product line engineering

Nóbrega, Jarley Palmeira 31 January 2008
Within the software development community, reusing artifacts rather than building them from scratch, commonly known as software reuse, has proven to be an effective way of avoiding the problems associated with project budget and schedule overruns. Despite its immense potential, large-scale adoption of reuse still does not prevail within organizations. Among the contributing factors are the economic obstacles companies face, with a clear concern about the costs of developing software for and with reuse. Reuse-related decisions are currently handled from an economic point of view, because organizations regard the development of reusable software as an investment. Moreover, the adoption of software product lines in this context brings up several reuse inhibitors, such as the narrow application of cost models for reuse, the lack of a strategy for investment analysis, and the fact that few cost models take an approach based on reuse scenarios. In this context, this work presents an integrated cost model for product line engineering, aimed at supporting organizations in their decision-making when evaluating reuse investments. The foundations of the model are based on extensive research into cost models for reuse and their specialization for software product lines. The model defines cost and benefit functions, reuse scenarios, and an investment strategy for product lines. A simulation model based on the Monte Carlo technique is also presented. Finally, a case study discusses the results in the context of a real software development project in which the model was applied.
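A hedged sketch of what a Monte Carlo analysis in the spirit of the model might look like: draw uncertain development costs and per-product savings and estimate the probability that the product line investment pays off. The distributions, discount rate, and product count are invented, not the thesis's calibration.

    # Monte Carlo estimate of P(NPV > 0) for a reuse investment.
    import random

    def simulate_npv(n_trials=10_000, products=5, discount=0.10):
        wins = 0
        for _ in range(n_trials):
            upfront = random.triangular(300_000, 600_000, 450_000)
            npv = -upfront
            for year in range(1, products + 1):
                saving = random.triangular(80_000, 160_000, 120_000)
                npv += saving / (1 + discount) ** year
            wins += npv > 0
        return wins / n_trials

    random.seed(42)
    print(f"P(NPV > 0) ~ {simulate_npv():.2f}")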
