31

Modeling, measurement and performance of World Wide Web transactions

Barford, Paul R. January 2001 (has links)
Thesis (Ph.D.)--Boston University / PLEASE NOTE: Boston University Libraries did not receive an Authorization To Manage form for this thesis or dissertation. It is therefore not openly accessible, though it may be available by request. If you are the author or principal advisor of this work and would like to request open access for it, please contact us at open-help@bu.edu. Thank you. / The size, diversity and continued growth of the World Wide Web combine to make its understanding difficult even at the most basic levels. The focus of our work is in developing novel methods for measuring and analyzing the Web which lead to a deeper understanding of its performance. We describe a methodology and a distributed infrastructure for taking measurements in both the network and end-hosts. The first unique characteristic of the infrastructure is our ability to generate requests at our Web server which closely imitate actual users. This ability is based on detailed analysis of Web client behavior and the creation of the Scalable URL Request Generator (SURGE) tool. SURGE provides us with the flexibility to test different aspects of Web performance. We demonstrate this flexibility in an evaluation of the 1.0 and 1.1 versions of the Hypertext Transfer Protocol. The second unique aspect of our approach is that we analyze the details of Web transactions by applying critical path analysis (CPA). CPA enables us to precisely decompose latency in Web transactions into propagation delay, network variation, server delay, client delay and packet loss delays. We present analysis of performance data collected in our infrastructure. Our results show that our methods can expose surprising behavior in Web servers, and can yield considerable insight into the causes of delay variability in Web transactions. / 2031-01-01
32

The Concurrent Development Scheduling Problem (CDSP)

Paul, Leroy W 27 October 2005 (has links)
The concurrent development (CD) project is defined as the concurrent development of both hardware and software that is integrated together later for a deliverable product. The CD Scheduling Problem (CDSP) refers to the observation that most CD baseline project schedules being developed today are overly optimistic; that is, the projects finish late. This study researches the techniques being used today to produce CD project schedules and looks for ways to close the gap between the baseline project schedule and reality. In Chapter 1, the CDSP is introduced. In Chapter 2, a review is made of published works. A review is also made of commercial scheduling software applications to uncover their techniques, as well as a review of organizations doing research on improving project scheduling. In Chapter 3, the components of the CDSP are analyzed for ways to improve. In Chapter 4, the overall methodology of the research is discussed, including the development of the Concurrent Development Scheduling Model (CDSM), which quantifies the factors driving optimism. The CDSM is applied to typical CD schedules, with the results compared to Monte Carlo simulations of the same schedules. The results from using the CDSM on completed CD projects are also presented. The CDSM does well in predicting the outcome. In Chapter 5, the results of the experiments run to develop the CDSM are given. In Chapter 6, findings and recommendations are given. Specifically, a list of findings is given that a decision maker can use to analyze a baseline project schedule and assess the schedule's optimism. These findings will help define the risks in the CD schedule. Also included is a list of actions that the decision maker may be able to take to reduce the risk of the project and improve the chances of coming in on time.
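The gap between a deterministic baseline and a simulated schedule that the abstract describes can be illustrated with a minimal Monte Carlo sketch. This is not the author's CDSM; the serial tasks and their three-point duration estimates are hypothetical.

```python
import random

def simulate_duration(tasks, n_trials=10000):
    """Monte Carlo estimate of total duration for serial tasks whose
    durations follow a triangular(optimistic, likely, pessimistic) law."""
    total = 0.0
    for _ in range(n_trials):
        # random.triangular takes (low, high, mode).
        total += sum(random.triangular(o, p, m) for o, m, p in tasks)
    return total / n_trials

# Hypothetical serial tasks: (optimistic, most likely, pessimistic) days.
tasks = [(2, 3, 10), (1, 2, 6), (4, 5, 12)]
baseline = sum(m for _, m, _ in tasks)   # deterministic "most likely" plan
expected = simulate_duration(tasks)      # simulated mean exceeds the baseline
```

Because the pessimistic tails are long, the simulated mean sits well above the most-likely baseline, which is one mechanism behind the optimism the thesis studies.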
33

Methodology for the multi-objective, resource-constrained project scheduling problem

Nudtasomboon, Nudtapon 12 March 1993 (has links)
This study is concerned with the problem of resource-constrained project scheduling, which includes splittable and nonsplittable jobs, renewable and nonrenewable resources, variation in resource availability, time-resource tradeoff, time-cost tradeoff, and multiple objectives. The problem is formulated as a zero-one integer programming model. A specialized solution technique is developed for the preemptive goal programming, resource-constrained project scheduling problem for time, cost, and resource leveling objectives. In addition, single objective algorithms are also provided for the time, cost, and resource leveling objectives. These algorithms are based on the idea of the implicit enumeration process, and use the special structures of the problem to expedite the search process. Computer-generated problems are used to test each of the single objective algorithms. The results show that the algorithms give optimal solutions to tested problems with time and cost objectives using a reasonable computation time; however, heuristic solutions are more feasible for problems with the resource leveling objective. The multiple objective algorithm is illustrated through application to a warehouse project problem. / Graduation date: 1993
34

Branch and Bound Algorithm for Multiprocessor Scheduling

Rahman, Mostafizur January 2009 (has links)
The multiprocessor task graph scheduling problem has been extensively studied as an academic optimization problem which occurs in optimizing the execution time of parallel algorithms on parallel computers. The problem is known to be NP-hard. There are many good approaches, using many optimizing algorithms, to find the optimum solution for this problem with less computational time. One of them is the branch and bound algorithm. In this paper, we propose a branch and bound algorithm for the multiprocessor scheduling problem. We investigate the algorithm by comparing two different lower bounds with respect to their computational costs and the size of the pruned tree. Several experiments are made with a small set of problems and results are compared in different sections.
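The branch and bound idea, prune any partial assignment whose lower bound already meets or exceeds the best complete schedule found, can be sketched for the simplest variant (independent tasks, identical processors, minimize makespan). This is an illustrative sketch, not the thesis's algorithm, and the lower bound shown is the common "balanced remaining work" bound.

```python
def makespan_bb(durations, m):
    """Branch and bound: assign each task to one of m processors,
    pruning branches whose lower bound cannot beat the incumbent."""
    best = [sum(durations)]  # trivial upper bound: everything on one processor

    def bound(loads, i):
        # Lower bound: current max load vs. perfectly balanced remaining work.
        remaining = sum(durations[i:])
        return max(max(loads), (sum(loads) + remaining) / m)

    def branch(loads, i):
        if i == len(durations):
            best[0] = min(best[0], max(loads))
            return
        if bound(loads, i) >= best[0]:
            return  # prune this subtree
        for p in range(m):
            loads[p] += durations[i]
            branch(loads, i + 1)
            loads[p] -= durations[i]

    branch([0] * m, 0)
    return best[0]
```

A tighter lower bound prunes more of the tree at a higher per-node cost, which is exactly the trade-off the abstract says the thesis measures.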
35

Statistical Critical Path Identification and Classification

Panagiotakopoulos, Georgios 01 May 2011 (has links)
This thesis targets the problem of critical path identification in sub-micron devices. Delays are described using probability density functions (PDFs) in order to model the probabilistic nature of the problem. Thus, a deterministic critical path response is not possible. The probability that each path is critical is reported instead. An extensive literature review has been done and is presented in detail. Heuristics for accurate critical path calculations are described and results are compared to those from Monte Carlo simulations.
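The Monte Carlo baseline the abstract compares against can be sketched directly: sample every gate delay from its PDF, see which path is slowest in each trial, and report per-path criticality probabilities. The two-path circuit and Gaussian delay parameters below are hypothetical.

```python
import random

# Hypothetical circuit: two paths sharing no gates; each gate delay is
# modeled as Normal(mean, sigma).
paths = [
    [(1.0, 0.1), (1.0, 0.1), (1.0, 0.1)],   # nominal delay 3.0
    [(1.6, 0.3), (1.5, 0.3)],               # nominal delay 3.1, more variance
]

def criticality(paths, n_trials=20000):
    """Monte Carlo estimate of the probability that each path is critical."""
    wins = [0] * len(paths)
    for _ in range(n_trials):
        delays = [sum(random.gauss(mu, s) for mu, s in p) for p in paths]
        wins[delays.index(max(delays))] += 1
    return [w / n_trials for w in wins]

probs = criticality(paths)  # per-path criticality probabilities, summing to 1
```

Note how the nominally faster path still wins a sizable fraction of trials; that spread is why a single deterministic critical path is not a meaningful answer here.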
36

Geração de fraturas autossimilares em meios desordenados: técnicas do caminho crítico e do caminho mínimo / Generating self-similar fractures in disordered media: techniques of critical path and the minimal path

Oliveira, Erneson Alves de January 2008 (has links)
OLIVEIRA, Erneson Alves de. Geração de fraturas autossimilares em meios desordenados: técnicas do caminho crítico e do caminho mínimo. 2008. 54 f. Dissertação (Mestrado em Física) - Programa de Pós-Graduação em Física, Departamento de Física, Centro de Ciências, Universidade Federal do Ceará, Fortaleza, 2008. / In this work we propose two models for fracture generation in regular substrates. In the first model, we iteratively apply the concept of the critical path to systematically determine the element of lowest “conductivity” in the connected spanning network. At each iteration, once these elements are identified as local “cracks”, they are permanently removed from the structure up to the point at which a macroscopic fracture destroys the global network connectivity. This fracture is then topologically characterized as self-similar with fractal dimension Dp ≈ 1.21. In the second model, we employ Dijkstra's algorithm to determine the minimal path in a random energy landscape and remove its highest-energy element. As in the previous model, these elements are considered to be local “cracks” until a subset of them can be identified as a macroscopic fracture. The average over many samples of fractures calculated for different system sizes reveals the presence of a self-similar structure with fractal dimension Df ≈ 1.21. The resemblance between the two exponents Dp and Df suggests that the two models belong to the same universality class.
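The second model described above (Dijkstra minimal path, then removal of the path's highest-energy element) can be sketched on a small square bond lattice with uniformly random bond energies. Lattice size and energy distribution here are illustrative choices, not taken from the dissertation.

```python
import heapq
import random

def fracture(L, seed=0):
    """Minimal-path fracture sketch on an L x L bond lattice: repeatedly
    find the minimal-energy left-to-right path with Dijkstra and remove
    its highest-energy bond (a local "crack") until no path remains.
    Returns the number of bonds removed."""
    rng = random.Random(seed)
    nodes = [(i, j) for i in range(L) for j in range(L)]
    energy = {}
    for i in range(L):
        for j in range(L):
            if i + 1 < L:
                energy[((i, j), (i + 1, j))] = rng.random()
            if j + 1 < L:
                energy[((i, j), (i, j + 1))] = rng.random()

    def neighbors(u):
        for (a, b), e in energy.items():
            if a == u:
                yield b, e, (a, b)
            elif b == u:
                yield a, e, (a, b)

    def min_path():
        # Dijkstra from the whole left column to the right column.
        dist = {n: float("inf") for n in nodes}
        prev = {}
        pq = []
        for j in range(L):
            dist[(0, j)] = 0.0
            heapq.heappush(pq, (0.0, (0, j)))
        while pq:
            d, u = heapq.heappop(pq)
            if d > dist[u]:
                continue
            if u[0] == L - 1:            # reached the right column
                bonds = []
                while u in prev:
                    u, bond = prev[u]
                    bonds.append(bond)
                return bonds
            for v, e, bond in neighbors(u):
                if d + e < dist[v]:
                    dist[v] = d + e
                    prev[v] = (u, bond)
                    heapq.heappush(pq, (d + e, v))
        return None                      # lattice fractured: no crossing path

    removed = 0
    path = min_path()
    while path is not None:
        del energy[max(path, key=energy.get)]  # crack the weakest link's dual
        removed += 1
        path = min_path()
    return removed
```

By Menger's theorem at least L bonds must fail before the left and right columns disconnect, so the removed set always contains a spanning cut; the dissertation's result is that this cut is self-similar with Df ≈ 1.21.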
37

Optimalizace projektového managementu pomocí IT technologií / Optimization of Project Management using IT Technologies

Kořínek, Tomislav January 2014 (has links)
The goal of the dissertation is to find an effective way to optimize the progress of development projects using IT technology from the point of view of a project manager. The theoretical part of the dissertation is dedicated to project management theory and the clarification of all terms used. The theoretical part is followed by a description of the progress of a standard development project: the construction of a department store. The fourth part is dedicated to freely available project management software, examined for its applicability to project control, the documentation process, and the sharing of comments both within a company and with external companies. Then follows a description of a concrete development project using the management theory contained in the second part, including the definition of activities, their logical sequence, and a competence matrix. In the seventh part, a critical path is first identified using the CPM (Critical Path Method); the critical parts of the project are then described, along with the optimization of the project management provided by a selected software product. The end of the seventh part appraises how successfully the critical parts of the project were optimized and how the efficiency of the project team's work was increased.
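The CPM identification step mentioned above is the standard forward/backward pass: earliest times forward, latest times backward, and zero-slack tasks form the critical path. The mini-project below is hypothetical, not drawn from the thesis.

```python
def critical_path(tasks):
    """Classic CPM forward/backward pass. `tasks` maps name ->
    (duration, list of predecessor names); the graph must be a DAG.
    Returns (project duration, critical task names in schedule order)."""
    # Forward pass: earliest start/finish, in topological order.
    es, ef, order, placed = {}, {}, [], set()
    while len(order) < len(tasks):
        for name, (dur, preds) in tasks.items():
            if name not in placed and all(p in placed for p in preds):
                es[name] = max((ef[p] for p in preds), default=0)
                ef[name] = es[name] + dur
                placed.add(name)
                order.append(name)
    duration = max(ef.values())
    # Backward pass: latest start/finish; zero slack means critical.
    succs = {n: [m for m, (_, ps) in tasks.items() if n in ps] for n in tasks}
    lf, ls = {}, {}
    for name in reversed(order):
        lf[name] = min((ls[s] for s in succs[name]), default=duration)
        ls[name] = lf[name] - tasks[name][0]
    critical = [n for n in order if es[n] == ls[n]]
    return duration, critical

# Hypothetical mini-project, durations in days.
tasks = {
    "design":  (3, []),
    "permits": (5, []),
    "build":   (7, ["design", "permits"]),
    "inspect": (2, ["build"]),
}
# Expected: 14-day duration; critical path permits -> build -> inspect,
# while "design" carries 2 days of slack.
```

Any delay on a zero-slack task pushes the finish date directly, which is why the thesis focuses its optimization effort on these critical parts.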
38

Návrh projektu a aplikace metodiky projektového managementu v podniku / Project Design and Project Management Methodology Application in a Company

Klaudíny, Michal January 2012 (has links)
The master's thesis presents a project proposal for the improvement of hardware equipment in the company VURAL, a.s. Project management methods and procedures defined by the IPMA (International Project Management Association) are used in this thesis.
39

Design of time-phased critical path scheduling logic in remanufacturing Material Requirements Planning

Tucker, Gerald E 25 November 2020 (has links)
This thesis develops and presents a new remanufacturing MRP time-phased scheduling algorithm utilizing a critical path concept, as in the project management field, for incorporation into remanufacturing production planning MRP calculations. The algorithm automates the remanufacturing lead time allowance calculation for child subassemblies and component parts in the form of Stack Time, and as such creates a linkage between the parent remanufacturing routing operation to which a remanufactured subassembly or component part is allocated for further processing, and the parent routing operation from which it is disassembled. This new MRP scheduling algorithm is optimal for calculating the total planned production time of remanufacturing production routings, and is appropriate for even large, complex, multilevel BOM structures.
40

An investigation into the use of construction delay and disruption analysis methodologies

Braimah, Nuhu January 2008 (has links)
Delay and disruption (DD) to contractors' progress, often resulting in time and cost overruns, are a major source of claims and disputes in the construction industry. At the heart of the matter in dispute is often the question of the extent of each contracting party's responsibility for the delayed project completion and extra cost incurred. Various methodologies have been developed over the years as aids to answering this question. Whilst much has been written about DD, there is limited information on the extent of use of these methodologies in practice. The research reported in this thesis was initiated to investigate these issues in the UK, towards developing a framework for improving DD analysis. The methodology adopted in undertaking this research was the mixed method approach involving first, a detailed review of the relevant literature, followed by an industry-wide survey on the use of these methodologies and associated problems. Following this, interviews were conducted to investigate the identified problems in more depth. The data collected were analysed, with the aid of SPSS and Excel, using a variety of statistical methods including descriptive statistics analysis, relative index analysis, Kendall's concordance and factor analysis. The key finding was that DD analysis methodologies reported in the literature as having major weaknesses are the most widely used in practice, mainly due to deficiencies in programming and record keeping practice. To facilitate the use of more reliable methodologies, which ensure more successful claims resolution with fewer chances of disputes, a framework has been developed comprising: (i) best practice recommendations for promoting better record-keeping and programming practice; and (ii) a model for assisting analysts in their selection of an appropriate delay analysis methodology for any claims situation.
This model was validated by means of experts’ review via a survey and the findings obtained suggest that the model is valuable and suitable for use in practice. Finally, areas for further research were identified.
