  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
91

The formulation of performance measurements for software development projects

Hughes, Wayne Guy 05 February 2014 (has links)
M.B.A / Project delivery in the software development domain has a poor history of success. This research focused on identifying some of the reasons for this poor performance in software development projects, in order to propose a possible framework for measuring and evaluating a software development project's performance. The proposed framework is intended to link individual project performance into the strategic performance measurements of an organisation, and hence includes the ability to evaluate an individual project's performance relative to other projects within a programme, organisation or industry. The research was conducted through in-depth interviews and literature studies. The following three key findings resulted from this study: • Firstly, unless an organisation establishes upfront what the intended use of the measurements is, as well as how it and the project define "project success" in terms of their overall objectives, any measurements taken will be of little value. • Secondly, there is still strong support for the generic measurements of Time, Cost and Quality, all within good customer relations; however, these need to take into account project complexity, project management skills, team fit and the accuracy of the estimates. • Thirdly, the ultimate framework adopted by an organisation for measuring and evaluating project performance should be simple without being simplistic, and hence should be generic and easy to implement across a broad range of projects.
92

Multi-user interface for group ranking: a user-centered approach

Luk, Wai-Lan 11 1900 (has links)
The proliferation of collaborative computer applications in the past decade has resulted in a corresponding increase in the need for multi-user interfaces. The current research seeks to contribute to the design of a user-centered multi-user interface for a group ranking task. User requirements were identified by observing groups perform the ranking task in a non-computer environment. A design was proposed based on these identified requirements. The user-centered design was compared to preliminary designs based on the intuitions of programmers. The conclusions indicate that an analysis of observations in the non-computer environment does yield insight beyond the initial intuition of programmers. A prototype based on the user-centered design was implemented. Informal user evaluation was performed by observing users working with the prototype and obtaining verbal feedback both on the ease of use of the system and on possible improvements. The informal user evaluation provides evidence for the usefulness of user-centered design. The evaluation also suggests that not all features identified were found useful and not all features necessary were identified. / Business, Sauder School of / Management Information Systems, Division of / Graduate
93

An implementation and analysis of the configurable security architecture

Hardy, Alexandre 10 September 2012 (has links)
M.Sc. / The Configurable Security Architecture (ConSA) describes an architecture that may be used to implement a wide variety of security policies. The architecture supports application and system security, unlike traditional security systems. ConSA allows for various degrees of security and efficiency determined by the implementation of the system. Arbitrary security policies may be implemented and possibly changed even while the system is running. If such an architecture were adopted by industry, a wide variety of security policies could be assembled from off-the-shelf components. Such a situation is clearly desirable. This text describes the implementation of a ConSA prototype system. The prototype demonstrates that a configurable security system is possible and that the goals specified above can be met. The prototype is implemented in the Linux operating system due to the large number of UNIX-based machines used by corporations. To begin a discussion of a security architecture, classic security models must be revisited. Chapter 2 introduces these models. Chapter 4 describes Linux security features, and how classical security models may be implemented in Linux. Besides introducing the environment of the prototype, these chapters also serve to highlight the abilities of the ConSA model. Various obstacles are encountered in the implementation of a new security architecture. An implementation must strive to support existing applications (with little or no modification to the application) while supporting new features that increase the value of the system. The obstacles that are encountered in the implementation of a ConSA system are investigated and solutions for these obstacles are presented. The ConSA architecture is revised to provide a specification that supports the implementation of the architecture, and specifies the operation of each of the ConSA components sufficiently for an implementation on various platforms.
The prototype supports three different implementations of ConSA that demonstrate the ease with which the system can be moved to different architectures, operating environments or security requirements. There have been several extensions to the UNIX security model. Many of these are implemented in the Linux operating system. The ConSA system must improve on these extensions to be a viable security alternative for Linux. Chapter 15 introduces a few of these extensions, many of which provide innovative approaches to security not present in classical models. The implementation of these extensions in the ConSA architecture is described theoretically to illustrate that ConSA can indeed fulfil their role. A prototype must be evaluated to determine if the system is of value. The final chapter investigates the shortcomings of the prototype and, together with chapter 4, illustrates the benefits of the ConSA architecture.
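The central idea described above — a reference monitor whose policy component can be assembled from interchangeable parts and swapped at runtime — can be sketched in a few lines. This is a minimal illustration, not ConSA's actual interface; all class and method names here are hypothetical.

```python
from abc import ABC, abstractmethod

class SecurityPolicy(ABC):
    """A pluggable policy component; concrete policies can be swapped at runtime."""
    @abstractmethod
    def permits(self, subject: str, action: str, obj: str) -> bool: ...

class AccessControlList(SecurityPolicy):
    """One possible off-the-shelf policy: an explicit allow-list of triples."""
    def __init__(self, entries):
        self.entries = set(entries)  # set of allowed (subject, action, obj) triples
    def permits(self, subject, action, obj):
        return (subject, action, obj) in self.entries

class ReferenceMonitor:
    """Mediates every access through the currently installed policy."""
    def __init__(self, policy: SecurityPolicy):
        self.policy = policy
    def replace_policy(self, policy: SecurityPolicy):
        # the policy may be changed even while the system is running
        self.policy = policy
    def check(self, subject, action, obj):
        return self.policy.permits(subject, action, obj)

monitor = ReferenceMonitor(AccessControlList({("alice", "read", "/etc/motd")}))
print(monitor.check("alice", "read", "/etc/motd"))  # True
print(monitor.check("bob", "read", "/etc/motd"))    # False
```

Because applications only ever talk to the monitor, a capability-based or multilevel policy could be substituted for the access-control list without touching application code — the property the abstract claims for configurable security.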
94

Integrating formal specification and verification methods in software development

He, Xudong January 1989 (has links)
This dissertation is part of an intended long-term research project whose objectives are to make software development more scientific and rigorous, thereby achieving better software quality and facilitating automated software production. It has two major components: the design of the specification transition paradigm for software development, and the theoretical study of the system specification phase in the paradigm. First, after an extensive analysis and comparison of various formalisms, a paradigm for integrating various formal specification and verification methods (predicate transition Petri nets, first order temporal logic, the algebraic, the axiomatic, the denotational, and the operational approaches) in software development has been developed. The model incorporates foremost formalisms more effectively than other models (the Automatic Programming Project [Bal85], the CIP Project [CIP85], the Larch Project [GHW85] and the RAISE Project [MG87]) and has the following distinctive features: (1) specifications are viewed both as a set of products and as a set of well-defined steps of a process, (2) specifications (as a set of products) at different development steps are to be written and verified by different formalisms, (3) specification (as a process) spans from the requirement phase to the detailed design phase, (4) specification for both concurrent and sequential software is supported, and (5) specifications for different aspects (concurrent control abstraction, data abstraction, and procedural abstraction) of a piece of software are dealt with separately.
Second, an intensive and in-depth investigation of the system specification phase in the paradigm results in: - a design methodology for predicate transition nets, which incorporates the separate definition technique in Ada [Ada83] and state decomposition technique in Statechart [Har88] into the traditional transformation techniques for Petri nets, and therefore will significantly reduce the design complexity and enhance the comprehensibility of large predicate transition net specifications; - the establishment of a fundamental relationship between predicate transition nets and first order temporal logic and the design of an algorithm for systematically translating predicate transition nets into equivalent temporal logic formulae. Therefore the goal to combine the strengths of both formalisms, i.e. to use predicate transition nets as a specification method and to use temporal logic as a verification method, is achieved; and - the discovery of a special temporal logic proof technique based on a Hilbert-style logic system to verify various properties of predicate transition nets and the associated theorems. Thus temporal logic is effectively used as an analysis method for both safety and liveness properties of predicate transition nets. / Ph. D.
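A predicate transition net extends an ordinary Petri net by letting tokens carry values and attaching a predicate over those values to each transition. The following is a purely illustrative sketch of the firing rule in Python — an assumption for exposition, not the dissertation's formalism or its translation algorithm.

```python
# Toy predicate/transition net: places hold lists of token values, and a
# transition fires only when its input places hold tokens satisfying its
# predicate. This illustrates the firing rule only, not the full formalism.

class Transition:
    def __init__(self, inputs, outputs, predicate):
        self.inputs = inputs        # names of input places
        self.outputs = outputs      # names of output places
        self.predicate = predicate  # callable over the consumed token values

def fire(marking, t):
    """Fire transition t if enabled; return the new marking, else None."""
    if not all(marking.get(p) for p in t.inputs):
        return None                         # some input place is empty
    tokens = [marking[p][0] for p in t.inputs]
    if not t.predicate(*tokens):
        return None                         # tokens fail the predicate
    new = {p: list(v) for p, v in marking.items()}
    for p in t.inputs:
        new[p].pop(0)                       # consume one token per input
    for p in t.outputs:
        new.setdefault(p, []).append(tokens[0] if len(tokens) == 1 else tuple(tokens))
    return new

# Example: a transition that moves even-valued tokens from place "a" to "b".
t = Transition(["a"], ["b"], lambda x: x % 2 == 0)
print(fire({"a": [4], "b": []}, t))  # {'a': [], 'b': [4]}
```

The correspondence the dissertation establishes is that an enabling condition like this one can be rendered as a temporal-logic formula over markings (e.g. "whenever an even token is in a, eventually it is in b"), which is what makes temporal logic usable as the verification side of the pairing.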
95

Personal computer development system software architecture

Antia, Yezdi F. January 1985 (has links)
The rapid proliferation of microprocessor-based products has increased the need for Microcomputer Development Systems. The IBM PC's software architecture is modified to make it an Intel Series III compatible development system. The Universal Development Interface (UDI) is used to allow all Intel languages and object modules to execute on the IBM PC. The development languages available are the 8086/8088 assembly language, with FORTRAN-86 and Pascal-86 as the high level languages. The exact working and operating procedures of the software development tools, like an assembler, compiler, linker, locater, hex to object converter and a debugger, are explained in detail. Mathematical support is either through an 8087 or its emulator. Detailed explanation of high level language program execution is given, including the run time support needed. A serial loader program is also available to downline load programs from the IBM PC development system to other target machines, like the SDK-86 single board computer. / M.S.
96

Predictive software design measures

Love, Randall James 11 June 2009 (has links)
This research develops a set of predictive measures enabling software testers and designers to identify and target potential problem areas for additional and/or enhanced testing. Predictions are available as early in the design process as requirements allocation and as late as code walk-throughs. These predictions are based on characteristics of the design artifacts prior to coding. Prediction equations are formed at established points in the software development process called milestones. Four areas of predictive measurement are examined at each design milestone for candidate predictive metrics. These areas are: internal complexity, information flow, defect categorization, and the change in design. Prediction equations are created from the set of candidate predictive metrics at each milestone. The most promising of the prediction equations are selected and evaluated. The single "best" prediction equation is selected at each design milestone. The resulting predictions are promising in terms of ranking areas of the software design by the number of predicted defects. Predictions of the actual number of defects are less accurate. / Master of Science
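The core mechanic described above — fitting a prediction equation at a design milestone from metrics available before coding, then ranking modules by predicted defects — can be sketched as follows. The data and the single-predictor linear form are invented for illustration; the thesis's actual candidate metrics and equations are not reproduced here.

```python
# Illustrative only: form a "prediction equation" relating a design-time
# complexity measure to defect counts, then rank modules by predicted risk.
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical milestone data: (module design complexity, defects found later)
complexity = [4, 7, 9, 12, 15]
defects    = [1, 2, 3, 4, 5]

a, b = fit_line(complexity, defects)

# Rank module indices by predicted defect count, riskiest first -- the use
# the abstract says works better than predicting absolute defect counts.
predicted = sorted(range(len(complexity)),
                   key=lambda i: a + b * complexity[i], reverse=True)
print(predicted)  # [4, 3, 2, 1, 0]
```

The ranking is what testers would act on: modules at the head of the list get additional or enhanced testing, even when the absolute defect prediction is imprecise.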
97

A reliability model incorporating software quality metrics

Yerneni, Ashok January 1989 (has links)
Large scale software development efforts in the past decade have posed a problem in terms of the reliability of the software. The size and complexity of the software being developed are growing rapidly, and integrating diverse pieces of software in the operational environment also poses severe reliability issues, resulting in increased development and operational costs. A number of reliability models have been defined in the literature to deal with problems of this kind. However, most of these models treat the system as a "black box" and do not consider the complexity of the software in their reliability predictions. Also, reliability is predicted after the system has been completely developed, leaving little scope for any major design changes to improve system reliability. This thesis reports on an effort to develop a reliability model based on complexity metrics which characterize a software system and runtime metrics which reflect the degree of testing of the system. A complete development of the reliability model is presented here. The model is simple and reflects our intuition of the software development process and our understanding of the significance of the complexity metrics. Credibility analysis is done on the model by simulating a number of systems and applying the model. Data collected from three FORTRAN-coded systems developed for NASA Goddard was used as representative of actual software systems. An analysis of the results is finally presented. / Master of Science
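The contrast the abstract draws — complexity metrics characterizing the system plus runtime metrics reflecting its degree of testing, rather than a black-box failure count — can be illustrated with a toy model. The equation below is an assumption chosen for illustration, not the model developed in the thesis.

```python
import math

# Toy illustration: apportion an estimated total defect count across modules
# in proportion to a complexity metric (the white-box part), then model each
# module's residual defects as decaying exponentially with the testing time
# it has received (the runtime part).
def expected_residual_defects(total_defects, complexity, testing_time,
                              detect_rate=0.1):
    """complexity and testing_time are per-module lists of equal length."""
    total_c = sum(complexity)
    residual = []
    for c, t in zip(complexity, testing_time):
        initial = total_defects * c / total_c            # complexity-weighted share
        residual.append(initial * math.exp(-detect_rate * t))  # testing burns it down
    return residual

# Three modules: the most complex one (weight 6) has had no testing at all,
# so it dominates the predicted residual defects.
r = expected_residual_defects(100, [1, 3, 6], [10, 10, 0])
print([round(x, 1) for x in r])  # [3.7, 11.0, 60.0]
```

A black-box model would see only the aggregate failure history; a model of this shape can flag the untested complex module before the system is complete, which is the design-feedback opportunity the abstract says conventional models miss.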
98

CADMADE, an approach towards a device-independent standard for CAD/CAM software development

Jayaram, Sankar January 1989 (has links)
Every year thousands of specialized CAD/CAM applications programs are developed to meet the needs of industry, education and research. The international 3-D graphics standard, PHIGS, has proven to be very useful in the creation of custom CAD/CAM software. Although PHIGS+ promises to deliver some geometric modeling procedures, not nearly enough is being done to support the writing of CAD/CAM software. CAD/CAM applications programmers should have available a standardized high-level applications programming environment which supports the creation of device-independent and portable design and manufacturing software. In this dissertation, one approach towards the establishment of a CAD/CAM programming standard is presented. This programming environment is called CADMADE - the Computer-Aided Design and Manufacturing Applications Development Environment. CADMADE includes not only graphics programming support, but also high-level procedures to support the creation of geometric modeling, mechanical design, manufacturing, expert system and user interface software. The requirements of CADMADE have been defined. CADMADE consists of five environments: the User Interface Environment (UIE), the Design and Modeling Environment (DME), the Virtual Manufacturing Environment (VME), the Expert Consultation Environment (ECE) and the PHIGS+ Environment. The User Interface Environment has been designed in great detail. A prototype of the User Interface Environment has been created using PHIGS. Examples of applications programs which use the prototype User Interface Environment are presented. The Design and Modeling Environment has also been designed. A new set of logical input/output devices has been created for the Design and Modeling Environment. The requirements of the Expert Consultation Environment and some new concepts in expert system consultation are discussed. / Ph. D.
99

The use of reference process models to capture open source migration activities

Molefe, Onkgopotse 12 1900 (has links)
South Africa has shown an increased interest in and awareness of Open Source Software (OSS) in the past decade. One of the reasons for this was the support from the Shuttleworth Foundation for Open Source initiatives. Migrating to OSS is a difficult and time-consuming activity that should not be underestimated by the migration team. Careful planning and roll-out procedures should be in place before one commences on this journey. Process reference models are used in different fields to capture the generic process flow of activities. For the OSS domain, no process reference models could be found for migration purposes. Therefore, this study aims to suggest an initial set of process reference models for an organisational OSS migration. These process reference models were identified by capturing the process models for a case study that entailed the migration of the CSIR software systems and desktops from proprietary software to OSS. From this set of process models, the migration processes were identified and refined into a set of suggested process reference models for organisational OSS migrations. This set of process reference models is useful for determining the processes necessary for organisations considering migrating to OSS. The study is divided into four research questions, where the first focusses on the use and value of process reference models and the second on what is already known about OSS migration processes. The third deals with key processes within an organisational open source migration (OOSM) and the last with process reference models for an OOSM. For the first research question, the use and value of process reference models and the usefulness of utilising them are discussed, as well as the use of process models as a modelling tool to identify and capture processes.
For the second research question, a summary is provided of what we know about OSS migration processes, together with a description of what the researcher and others have learnt about OSS, OSS migrations, process reference models, the process and its structure. For the third research question, the key processes within an OOSM are discussed, as well as all the processes that took place during the OSS migration project, from basic administrative processes to complex processes, from the beginning of the project until its completion. Lastly, for the fourth research question, process reference models that are essential for an OOSM, and possible generic migration process models bound to reoccur, are identified by the researcher and validated using a focus group discussion. / M.Tech. (Information Technology)
100

Optimized and integrated alignment system for functional proton radiosurgery

Shihadeh, Fadi Easa 01 January 2007 (has links)
In this thesis, a system for proton beam alignment was studied and optimized in many of its functional areas. The resulting system was named the Positioning Alignment Control System (PACS). As a result of this work, PACS is an integrated and efficient system.
