1. Programontwikkelingsmetodologieë (Program Development Methodologies). 17 March 2015.
M.Sc. (Computer Science) / The data processing crisis in software development today can be ascribed firstly to insufficient requirements definition, which results from a lack of communication between developer and user, and secondly to insufficient project management. During the last decade we succeeded in adding more control and discipline to the traditional software development life cycle, but requirements specification remains a problem. The traditional software development life cycle is long and inflexible, and its results do not satisfy the requirements of the user. The prototyping approach can be part of a solution to these problems. The author proposes a four-dimensional conceptual model as a framework for the prototyping methodology developed as the basis of this study. In business practice today, confusion exists as to which prototypes are best to use: prototypes that are developed to become the complete system, or prototypes that are thrown away. Dimension one of the model is discussed in terms of the type of prototype, meaning one of the different approaches to prototyping in the software development process. The author standardized on throw-away prototypes and evolutionary prototypes. The most general and well-known use of prototyping is during the requirements definition phase, but this is not its only use. Dimension two of the model describes the different areas in which prototyping is used, e.g. requirements definition, as a technique during JAD sessions, during simulation, during the minimizing of risk, and in the development of working models. The development of prototypes should be an easy and rapid process; however, this depends on the tools used. Dimension three of the model is discussed in terms of tools.
2. Design metrics forensics: an analysis of the primitive metrics in the Zage design metrics. Kwan, Pak Leung, January 1994.
The Software Engineering Research Center (SERC) Design Metrics Research Team at Ball State University has developed a design metric D(G) of the form D(G) = De + Di, where De is the architectural design metric (external design metric) and Di is the detailed design metric (internal design metric). The questions investigated in this thesis are: Why can De be an indicator of potential error modules? Why can Di be an indicator of potential error modules? Are there any significant factors that dominate the design metrics? In this thesis, the STANFINS data is evaluated using correlation analysis, regression analysis, and several other statistical techniques. The STANFINS study was chosen because it contains approximately 532 programs, 3,000 packages and 2,500,000 lines of Ada. The design metrics study was completed on 21 programs (approximately 24,000 lines of code) selected by CSC development teams; error reports were also provided by CSC personnel. / Department of Computer Science
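The additive form of the metric can be sketched in code. This is an illustrative sketch only: the actual Zage primitives and weightings differ from the simple counts assumed here, and the names `fan_in`, `fan_out`, `central_calls` and `io_refs` are invented for illustration.

```python
# Hypothetical sketch of a two-part design metric D(G) = De + Di.
# De approximates external (architectural) coupling; Di approximates
# internal (detailed) complexity. Not the actual Zage formulas.

def external_metric(fan_in: int, fan_out: int) -> int:
    # Architectural component: data flowing through the module.
    return fan_in * fan_out

def internal_metric(central_calls: int, io_refs: int) -> int:
    # Detailed component: central calls plus I/O references.
    return central_calls + io_refs

def design_metric(fan_in: int, fan_out: int,
                  central_calls: int, io_refs: int) -> int:
    return external_metric(fan_in, fan_out) + internal_metric(central_calls, io_refs)

print(design_metric(3, 2, 4, 1))  # → 11
```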
3. Application of lessons learned into legacy software systems. Luce, Jennifer Lynne, 01 July 2000.
No description available.
4. Software reliability prediction based on design metrics. Stineburg, Jeffrey, January 1999.
This study presents a new model for predicting software reliability based on design metrics. An introduction to the problem of software reliability is followed by a brief overview of software reliability models, including a discussion of some of the issues associated with them. The intractability of validating life-critical software is presented: such validation is shown to require extended periods of test time that are impractical in real-world situations. This problem is also inherent in the fault-tolerant software systems currently being implemented in critical applications. The design metric developed at Ball State University is proposed as the basis of a new model for predicting software reliability from information available during the design phase of development. The thesis investigates the proposition that a relationship exists between the design metric D(G) and the errors found in the field. A study performed on a subset of a large defense software system discovered evidence to support the proposition. / Department of Computer Science
5. Using the Design Metrics Analyzer to improve software quality. Wilburn, Cathy A., January 1994.
Effective software engineering techniques are needed to increase the reliability of software systems, to increase the productivity of development teams, and to reduce the costs of software development. Companies search for an effective software engineering process as they strive to reach higher process-maturity levels and produce better software. To aid in this quest for better methods of software engineering, the Design Metrics Research Team at Ball State University has analyzed university and industry software in order to detect error-prone modules. The team has developed, tested and validated its design metrics and found them to be highly successful. These metrics were typically collected and calculated by hand; so that they can be collected more consistently, more accurately and more quickly, the Design Metrics Analyzer for Ada (DMA) was created. The DMA collects metrics from submitted files at the subprogram level. The results are then analyzed to yield a list of stress points: modules considered error-prone or difficult for developers. This thesis describes the Design Metrics Analyzer, explains its output, and shows how it functions. It also discusses ways the DMA can be used in the software development life cycle. / Department of Computer Science
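The stress-point idea the abstract describes can be sketched as follows. The per-subprogram metric values and the mean-plus-one-standard-deviation cutoff are assumptions made for the sketch, not the DMA's actual rule.

```python
# Sketch of stress-point detection: compute a metric per subprogram and
# flag outliers as potentially error-prone. The cutoff (mean + 1 stdev)
# and the metric values are illustrative assumptions.
from statistics import mean, stdev

metrics = {"parse": 12.0, "report": 5.5, "dispatch": 21.0, "init": 4.0, "audit": 6.5}

cutoff = mean(metrics.values()) + stdev(metrics.values())
stress_points = [name for name, m in metrics.items() if m > cutoff]
print(stress_points)  # → ['dispatch']
```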
6. An empirical study of software design balance dynamics. Bhattrai, Gopendra R., January 1995.
The Design Metrics Research Team in the Computer Science Department at Ball State University has been developing and validating quality design metrics since 1987, and a number of design metrics have since been developed and validated. One of these is design balance (DB), and this thesis is an attempt to validate it. Results of the analysis of five systems are presented. The main objective of this research is to examine whether DB can be used to evaluate the complexity of a software design, and hence the quality of the resulting software. Two of the five systems analyzed were student projects and the remaining three came from industry; the five systems were written in different languages, had different sizes and exhibited different error rates. / Department of Computer Science
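The abstract does not define DB, so any sketch is speculative. As a stand-in, the following measures how evenly complexity is spread across modules (coefficient of variation), on the assumption that a balanced design distributes complexity evenly; the real DB metric may be defined quite differently.

```python
# Speculative stand-in for a "balance" measure: coefficient of variation
# of per-module complexity. 0.0 means perfectly even; larger values mean
# complexity is concentrated in a few modules. NOT the thesis's DB formula.
from statistics import mean, pstdev

def balance(module_complexities):
    m = mean(module_complexities)
    return pstdev(module_complexities) / m

print(balance([10, 10, 10]))            # → 0.0 (perfectly even)
print(round(balance([2, 10, 30]), 3))   # skewed design scores higher
```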
7. Performance measurement as a tool for software engineering. Van Aardt, Jan Markus, 22 July 2005.
Some software development teams regard software performance measurement as a mere luxury. When it happens, it is often infrequent, insufficient and subjective. Countless software projects have been sent into an uncontrollable spiral of poor management and unsatisfactory results. By revisiting old ideas and policies, many companies have turned themselves around; to ensure that software engineering does the same, technologies and procedures have to be re-evaluated. Because many companies have decided to cut technology expenditure, software development teams must look for alternative options for deploying high-performance software systems. As more companies move into the electronic era and evolve to its next stage, electronic commerce, it has become more important to apply these concepts to Internet development projects and procedures. In the Internet market, two software providers, Microsoft and Apache, are aiming for worldwide dominance of Internet server deployment. The Apache web server is currently the most commonly used server on the Internet (60%), with Microsoft's Internet Information Server (25%) in a strong second place. The need for higher throughput and better services grows with each passing day, increasing the pressure on these two software vendors to provide the best architecture for their clients' needs. This study intends to provide the reader with an objective view of a basic performance comparison between these two products, and tries to find a correlation between the performance tests and the products' popularity standings. The tests for this study were performed on identical hardware architectures with one difference: the operating system. By comparing the costly proprietary Microsoft solution with its cheaper open-source rival, Linux, certain opinions were put to the test.
Would a product developed by a software company that invests millions of dollars in its products perform better than this free-for-all solution, or would the selfless inputs of hundreds of developers all over the world finally pay off through the creation of the world's best Internet server? The results of these tests were evaluated through formal statistical methods, providing overall comparisons of several common uses of web servers. These results were then applied in a small field test to validate the findings in practice, with some interesting outcomes in terms of supportive technologies, new rapid application development (RAD) tools and data access models. This research in itself will not change the mind of any Internet programmer. What it hopes to achieve is to demonstrate to software engineers that current processes and methods of developing software are not always the right way of doing things. Furthermore, it highlights many important factors often ignored or overlooked while managing software projects. Change management, process re-engineering and risk management form crucial elements of software development projects; projects that do not adhere to these critical elements risk falling short of their goals, or even failing completely. Performance measurement acts as a tool for software engineering, providing guidelines for technological decisions, project management and, ultimately, project success. / Dissertation (MSc (Computer Science))--University of Pretoria, 2005. / Computer Science / unrestricted
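A comparison of the kind the study describes can be sketched as a formal test on throughput samples. The figures below are invented, and Welch's t statistic stands in for whichever statistical methods the study actually used.

```python
# Sketch: comparing two servers' throughput with a formal statistic.
# Requests/second samples are invented; Welch's t is an illustrative choice.
from statistics import mean, stdev
import math

apache = [412, 398, 405, 420, 410]   # requests/sec (hypothetical)
iis    = [388, 395, 380, 391, 386]   # requests/sec (hypothetical)

def welch_t(a, b):
    # Welch's t statistic: mean difference scaled by combined standard error.
    va, vb = stdev(a) ** 2 / len(a), stdev(b) ** 2 / len(b)
    return (mean(a) - mean(b)) / math.sqrt(va + vb)

# A large |t| suggests the throughput difference is not just noise.
print(f"t = {welch_t(apache, iis):.2f}")
```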
8. Utilizing Runtime Information for Accurate Root Cause Identification in Performance Diagnosis. Weng, Lingmei, January 2023.
This dissertation highlights that existing performance diagnostic tools often become less effective on modern software due to their inherent inaccuracies. To overcome these inaccuracies and effectively identify the root causes of performance issues, supplementary runtime information must be incorporated into these tools. Within this context, the dissertation integrates specific runtime information into two typical classes of performance diagnostic tools: profilers and causal tracing tools.
The integration yields a substantial enhancement in the effectiveness of performance diagnosis. Among these tools, gprof stands out as a representative profiler. Nonetheless, its effectiveness diminishes because the time cost calculated from CPU sampling fails to accurately and adequately pinpoint the root causes of performance issues in complex software. To tackle this challenge, the dissertation introduces a methodology called value-assisted cost profiling (vProf), which incorporates variable values observed at runtime into the profiling process.
By continuously sampling variable values from both normal and problematic executions, vProf refines function cost estimates, identifies anomalies in value distributions, and highlights potentially problematic code areas that could be the actual sources of performance issues. The effectiveness of vProf is validated through the diagnosis of 18 real-world performance issues in four widely used applications. Remarkably, vProf outperforms other state-of-the-art tools, successfully diagnosing all issues, including three that had remained unresolved for over four years.
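vProf's core idea, comparing a variable's sampled value distributions between normal and problematic runs, can be sketched as follows. The divergence measure (mean shift in pooled standard deviations) and the sampled values are assumptions for the sketch, not vProf's actual statistic.

```python
# Sketch: flag a variable whose sampled values shift sharply between a
# normal run and a slow run. The shift measure is an illustrative choice.
from statistics import mean, pstdev

def value_shift(normal_samples, slow_samples):
    # Mean shift, scaled by the pooled spread (guard against zero spread).
    pooled = pstdev(normal_samples + slow_samples) or 1.0
    return abs(mean(slow_samples) - mean(normal_samples)) / pooled

buffer_len_normal = [64, 70, 66, 68]       # sampled values, healthy run (invented)
buffer_len_slow   = [900, 950, 880, 910]   # same variable, slow run (invented)

# A large shift marks this variable's code area as a diagnosis candidate.
print(round(value_shift(buffer_len_normal, buffer_len_slow), 2))
```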
Causal tracing tools reveal the root causes of performance issues in complex software by generating tracing graphs. However, these graphs often suffer from inherent inaccuracies: superfluous (over-connected) and missed (under-connected) edges, which arise from the diversity of programming paradigms. To mitigate these inaccuracies, the dissertation proposes an approach that derives strong and weak edges in tracing graphs from vertex semantics collected at runtime. Leveraging these edge types, a beam-search-based diagnostic algorithm identifies the most probable causal paths, and causal paths from normal and buggy executions are differentiated to provide key insights into the root causes of performance issues. To validate this approach, a causal tracing tool named Argus was developed and tested across multiple versions of macOS. Evaluated on 12 well-known spinning-pinwheel issues in popular macOS applications, Argus successfully diagnosed the root causes of all of them, including 10 that had remained unresolved for several years.
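The beam-search step over strong and weak edges can be sketched on a toy tracing graph. The graph, edge strengths, and beam width below are illustrative assumptions, not Argus's actual data or scoring.

```python
# Sketch: beam search over a tracing graph whose edges carry strengths
# (strong edges near 1.0, weak edges near 0.0). Graph and weights invented.

GRAPH = {  # node -> list of (successor, edge_strength)
    "click":   [("handler", 0.9), ("timer", 0.2)],
    "handler": [("render", 0.8), ("disk_io", 0.6)],
    "timer":   [("render", 0.3)],
    "render":  [],
    "disk_io": [],
}

def beam_search(start, width=2):
    beams = [([start], 1.0)]  # (path, cumulative strength)
    while True:
        expanded = []
        for path, score in beams:
            succs = GRAPH[path[-1]]
            if not succs:                        # terminal node: keep path
                expanded.append((path, score))
            for node, strength in succs:
                expanded.append((path + [node], score * strength))
        expanded.sort(key=lambda ps: ps[1], reverse=True)
        expanded = expanded[:width]              # prune to the beam width
        if expanded == beams:                    # no beam changed: done
            return expanded
        beams = expanded

print(beam_search("click")[0][0])  # most probable causal path
```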
The results from both tools exemplify a substantial enhancement of performance diagnostic tools achieved by harnessing runtime information. The integration can effectively mitigate inherent inaccuracies, lend support to inaccuracy-tolerant diagnostic algorithms, and provide key insights to pinpoint the root causes.
9. Analysis of PSP-like processes for software engineering. Conrad, Paul Jefferson, 01 January 2006.
The purpose of this thesis is to provide the California State University, San Bernardino, Department of Computer Science with an analysis of its software development process and a recommended solution for improving it.
10. Microcomputer-assisted site design in landscape architecture: evaluation of selected commercial software. Hahn, Howard Davis, January 1985.
Call number: LD2668 .T4 1985 H33 / Master of Landscape Architecture