151 |
Risk management in software development
Labuschagne, Mariet 01 1900 (has links)
This dissertation discusses risk management in the context of software development.
It commences by investigating why so many software development projects fail. It
then focuses on approaches to software development that emerged as attempts to
improve the success rate. A common shortcoming of these approaches is identified,
namely that they only cater for the tasks that need to be done, ignoring possible
unexpected problems. After having motivated the need for risk management, the
framework for a risk management methodology is discussed, outlining the steps in
the risk management process. Decision-making guidelines and best practices follow,
as well as a discussion about the way they should be implemented as part of the risk
management effort. Guidelines are provided for the implementation of risk
management as part of software development. Finally, the risks that may cause the
failure of the implementation of risk management are identified and guidelines
provided to address them. / Computing / M. Sc. (Information Systems)
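The abstract above outlines a stepwise risk management process without detailing the individual steps. As a purely illustrative sketch of one common step, quantifying and ranking risks by exposure (probability multiplied by impact), the Python fragment below uses invented risk descriptions and weights that are not taken from the dissertation.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    probability: float  # estimated likelihood, 0.0 to 1.0
    impact: float       # estimated cost of the consequence, e.g. person-days

    @property
    def exposure(self) -> float:
        # Classic risk exposure metric: likelihood times consequence.
        return self.probability * self.impact

# Hypothetical register for a small software project.
register = [
    Risk("Key developer leaves mid-project", 0.2, 40),
    Risk("Requirements change late in the cycle", 0.5, 25),
    Risk("Third-party API is delivered late", 0.3, 15),
]

# Rank risks so mitigation effort goes to the highest exposure first.
for risk in sorted(register, key=lambda r: r.exposure, reverse=True):
    print(f"{risk.exposure:5.1f}  {risk.description}")
```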
|
152 |
Program management practices in context of Scrum : a case study of two South African software development SMMEs
Singh, Alveen January 2015 (has links)
Submitted in fulfilment of the requirements for the degree of Doctor of Technology: Information Technology, Durban University of Technology, Durban, 2015. / Agile approaches have proliferated within the software development arena over the past decade. Derived mainly from Lean manufacturing principles, agile planning and control mechanisms appear minimal and fluid when compared to more traditional software engineering approaches. Scrum ranks among the more popular permutations of agile. Contemporary literature represents a rich source of contributions for agile in areas such as practice guidelines, experience reports, and methodology tailoring; but the vast majority of these publications focus on the individual project level only, leaving much uncertainty and persistent questions in the multi-project space. Questions have recently been raised, by both academics and practitioners alike, concerning the ability of Scrum to scale from the individual project level to the multi-project space.
Program management is an area of practice and research concerned mainly with harmonizing competing simultaneous projects. Existing literature on program management essentially perceives projects as endeavours that can be carefully planned at the outset, and controlled in accordance with strong emphasis placed on economic and schedule considerations. This perspective seems to be mostly a result of well-established and ingrained management frameworks from bodies such as the Project Management Institute (PMI), and is largely at odds with emerging practices like Scrum. This disparity represents a gap in the literature and supports the need for deeper exploration. The conduit for this exploration was found in two South African software development small to medium sized enterprises (SMMEs) practicing Scrum. The practical realities and constraints faced by these SMMEs elicited the need for more dynamic program management practices in support of their quest to maximize usage of limited resources. This thesis examines these practices with the aim of providing new insights into the program management discourse in the context of Scrum software development environments.
The research approach is qualitative and interpretive in nature. The in-depth exploratory case study research employed the two software SMMEs as units of analysis. Traditional ethnographic techniques were used alongside minimal researcher participation in project activities. Activity Theory guided the data analysis effort and helped to unearth the interrelationships between SMME characteristics, program management practices, and Scrum software development. The results of the data analysis are further refined and fashioned into eleven knowledge areas that represent containers of program management practices. This is the product of thematic analysis of literature and data generated from fieldwork. Because the observed practices were highly dynamic in nature, concept analysis provided a mechanism by which to depict them as snapshots in time. As a theoretical contribution, proposed frameworks were crafted to show how program management practices might be understood in the context of organizations striving towards agile implementation. Furthermore, representations of the mutually influential interfaces of SMME characteristics and Scrum techniques that give rise to the observed fluid nature of program management practices are brought to the fore.
|
153 |
Uniform framework for the objective assessment and optimisation of radiotherapy image quality
Reilly, Andrew James January 2011 (has links)
Image guidance has rapidly become central to current radiotherapy practice. A uniform framework is developed for evaluating image quality across all imaging modalities by modelling the ‘universal phantom’: breaking any phantom down into its constituent fundamental test objects and applying appropriate analysis techniques to these through the construction of an automated analysis tree. This is implemented practically through the new software package ‘IQWorks’ and is applicable to both radiotherapy and diagnostic imaging. For electronic portal imaging (EPI), excellent agreement was observed with two commercial solutions: the QC-3V phantom and PIPS Pro software (Standard Imaging) and EPID QC phantom and epidSoft software (PTW). However, PIPS Pro’s noise correction strategy appears unnecessary for all but the highest frequency modulation transfer function (MTF) point and its contrast to noise ratio (CNR) calculation is not as described. Serious flaws identified in epidSoft included erroneous file handling leading to incorrect MTF and signal to noise ratio (SNR) results, and a sensitivity to phantom alignment resulting in overestimation of MTF points by up to 150% for alignment errors of only ±1 pixel. The ‘QEPI1’ is introduced as a new EPI performance phantom. Being a simple lead square with a central square hole, it is inexpensive and straightforward to manufacture yet enables calculation of a wide range of performance metrics at multiple locations across the field of view. Measured MTF curves agree with those of traditional bar pattern phantoms to within the limits of experimental uncertainty. An intercomparison of the Varian aS1000 and aS500-II detectors demonstrated an improvement in MTF for the aS1000 of 50–100% over the clinically relevant range 0.4–1 cycles/mm, yet with a corresponding reduction in CNR by a factor of √2. Both detectors therefore offer advantages for different clinical applications. Characterisation of cone-beam CT (CBCT) facilities on two Varian On-Board Imaging (OBI) units revealed that only two out of six clinical modes had been calibrated by default, leading to errors of the order of 400 HU for some modes and materials – well outside the ±40 HU tolerance. Following calibration, all curves agreed sufficiently for dose calculation accuracy within 2%. CNR and MTF experiments demonstrated that a boost in MTF f50 of 20–30% is achievable by using a 512² rather than a 384² matrix, but with a reduction in CNR of the order of 30%. The MTF f50 of the single-pulse half-resolution radiographic mode of the Varian PaxScan 4030CB detector was measured in the plane of the detector as 1.0±0.1 cycles/mm using both a traditional tungsten edge and the new QEPI1 phantom. For digitally reconstructed radiographs (DRRs), a reduction in CT slice thickness resulted in an expected improvement in MTF in the patient scanning direction but a deterioration in the orthogonal direction, with the optimum slice thickness being 1–2 mm. Two general purpose display devices were calibrated against the DICOM Greyscale Standard Display Function (GSDF) to within the ±20% limit for Class 2 review devices. By providing an approach to image quality evaluation that is uniform across all radiotherapy imaging modalities this work enables consistent end-to-end optimisation of this fundamental part of the radiotherapy process, thereby supporting enhanced use of image guidance at all relevant stages of radiotherapy and better supporting the clinical decisions based on it.
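The detector comparisons above rest on standard image quality metrics such as CNR. The sketch below shows one widely used CNR definition (difference of region means over background noise) applied to synthetic regions of interest; it is not the specific formulation implemented in IQWorks or PIPS Pro, and all array values are invented for illustration.

```python
import numpy as np

def contrast_to_noise_ratio(signal_roi: np.ndarray, background_roi: np.ndarray) -> float:
    """One widely used CNR definition: difference of ROI means over background noise."""
    contrast = abs(signal_roi.mean() - background_roi.mean())
    noise = background_roi.std(ddof=1)
    return contrast / noise

# Synthetic example: a flat background and a slightly brighter insert, both with noise.
rng = np.random.default_rng(0)
background = rng.normal(loc=100.0, scale=5.0, size=(50, 50))
insert = rng.normal(loc=115.0, scale=5.0, size=(50, 50))
print(f"CNR = {contrast_to_noise_ratio(insert, background):.1f}")
```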
|
154 |
A database system architecture supporting coexisting query languages and data models
Hepp, Pedro E. January 1983 (has links)
Database technology is already recognised and increasingly used in administering and organising large bodies of data and as an aid in developing software. This thesis considers the applicability of this technology in small but potentially expanding application environments with users of varying levels of competence. A database system architecture with the following main characteristics is proposed:
1. It is based on a set of software components that facilitates the implementation and evolution of a software development environment centered on a database.
2. It enables the implementation of different user interfaces to provide adequate perceptions of the information content of the database according to the user's competence, familiarity with the system or the complexity of the processing requirements.
3. It is oriented toward databases that require moderate resources from the computer system to start an application. Personal or small-group databases are likely to benefit most from this approach.
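As a rough illustration of characteristic 2, coexisting interfaces of different sophistication over a single database, the sketch below exposes one small table through both a full SQL entry point and a restricted lookup for less experienced users. The table, columns and queries are invented, and SQLite is used purely for convenience; it is not the technology described in the thesis, which predates it.

```python
import sqlite3

# One shared database, two coexisting interfaces of different sophistication.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staff (name TEXT, department TEXT)")
conn.executemany("INSERT INTO staff VALUES (?, ?)",
                 [("Ana", "Research"), ("Ben", "Support"), ("Carla", "Research")])

def expert_query(sql: str):
    """Full SQL for competent users."""
    return conn.execute(sql).fetchall()

def novice_lookup(department: str):
    """A restricted, form-like interface for less experienced users."""
    return conn.execute(
        "SELECT name FROM staff WHERE department = ?", (department,)
    ).fetchall()

print(expert_query("SELECT department, COUNT(*) FROM staff GROUP BY department"))
print(novice_lookup("Research"))
```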
|
155 |
An Object-Oriented Systems Analysis and Design Technique for Telemetry Systems
Carroll, Don, Miller, Craig, Nickens, Don 11 1900 (has links)
International Telemetering Conference Proceedings / October 30-November 02, 1995 / Riviera Hotel, Las Vegas, Nevada / Object-Oriented techniques have been successfully applied to all phases of software development, including Requirements Analysis, Design, and Implementation. There has been some reluctance to extend the Object paradigm into the System Analysis, Architecture Development, and System Design phases, generally because of technology immaturity and concerns about applicability. Telemetry systems in particular appear to be somewhat slow in embracing object technology. The Range Standardization and Automation program has integrated proven techniques to successfully produce an Object-Oriented Systems Model. This paper presents the techniques and benefits.
|
156 |
Collaborative Software Development and Sustaining Engineering: An Improved Method to Meet the NASA Mission.
Mann, David 10 1900 (has links)
International Telemetering Conference Proceedings / October 25-28, 1999 / Riviera Hotel and Convention Center, Las Vegas, Nevada / This paper reports on the Space Shuttle Record and Playback Subsystem (RPS) upgrade
project turnaround brought about through extensive collaborative software development.
The new project and systems engineering methodologies employed on this project resulted
in many positive effects over the status quo method employed to develop and upgrade
systems. These effects include: 1) a reduction in the initial software development costs, 2)
a reduction in the development timeline, 3) improved marketability of the software
technology developed, 4) improved product quality deployed to operations, and 5)
improved maintainability. Attributes within each of the aforementioned are examined in
support of these assertions.
Prior to implementing this new method, the RPS upgrade project had been under
development for seven years using the standard software development method. This
involves developing custom applications using Commercial Off The Shelf (COTS)
hardware, operating systems and compilers. A change in strategy was effected on this
pathfinder project by adopting a COTS telemetry ground station software package to
provide basic ground station functionality and building additional required capabilities to
complete the project. The merits of having employed this methodology are explored using
the probable outcome of continued custom software development as a basis for
comparison.
This collaboration between the United Space Alliance (USA) and AP Data Systems
Inc. (an AP Labs company) resulted in software innovations in FM and PCM processing
software as well as general ground station management software. The four technology
transfer submittals for new software innovations resulting from this collaboration are
discussed.
|
157 |
AUTOMATION OF A CLOUD HOSTED APPLICATION : Performance, Automated Testing, Cloud Computing
Penmetsa, Jyothi Spandana January 2016 (has links)
Context: Software testing is the process of assessing the quality of a software product to determine whether or not it meets the customer's existing requirements. Software testing is one of the "Verification and Validation," or V&V, software practices. The two basic techniques of software testing are black-box testing and white-box testing. Black-box testing focuses solely on the outputs generated in response to the inputs supplied, neglecting the internal components of the software. White-box testing, in contrast, focuses on the internal mechanisms of the application. To explore the feasibility of black-box and white-box testing under a given set of conditions, a proper test automation framework needs to be deployed. Automation is deployed in order to reduce manual effort and to perform testing continuously, thereby increasing the quality of the product. Objectives: In this research, a cloud hosted application is automated using the TestComplete tool. The objective of this thesis is to verify the functionality of the cloud application, such as the test appliance library, through automation, and to measure the impact of the automation on the release cycles of the organisation. Methods: Automation is implemented using the Scrum methodology, an agile software development process. Using Scrum, working software can be delivered to customers incrementally and empirically, with functionality updated in each increment. Test appliance library functionality is verified by deploying a testing device and tracking automatic software downloads to the device and license updates on it. Results: The test appliance functionality of the cloud hosted application is automated using the TestComplete tool, and automation is found to shorten release cycles. Through automation of the cloud hosted application, a reduction of nearly 24% in release cycles is observed, thereby reducing manual effort and increasing the quality of delivery. Conclusion: Automation of a cloud hosted application removes manual effort, so time can be used effectively and the application can be tested continuously, increasing the efficiency and / AUTOMATION OF A CLOUD HOSTED APPLICATION
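The black-box/white-box distinction drawn above can be made concrete with a small sketch. The function under test and the test cases below are invented for illustration and are unrelated to the thesis, TestComplete, or the test appliance library; the black-box cases are written from the specification alone, while the white-box case is chosen after inspecting the code so that every branch and boundary is exercised.

```python
import unittest

def apply_discount(price: float, customer_years: int) -> float:
    """Toy function under test: loyal customers get 10% off."""
    rate = 0.10 if customer_years >= 3 else 0.0
    return round(price * (1.0 - rate), 2)

class BlackBoxTests(unittest.TestCase):
    # Only inputs and expected outputs are used; internals are ignored.
    def test_new_customer_pays_full_price(self):
        self.assertEqual(apply_discount(100.0, 1), 100.0)

    def test_loyal_customer_gets_discount(self):
        self.assertEqual(apply_discount(100.0, 5), 90.0)

class WhiteBoxTests(unittest.TestCase):
    # Cases chosen by reading the code so the loyalty branch and its
    # boundary at exactly three years are both covered.
    def test_boundary_of_loyalty_branch(self):
        self.assertEqual(apply_discount(100.0, 3), 90.0)
        self.assertEqual(apply_discount(100.0, 2), 100.0)

if __name__ == "__main__":
    unittest.main()
```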
|
158 |
What can the .NET RDBMS developer do? A brief survey of impedance mismatch solutions for the .NET developer
Fiduk, Kenneth Walter, 1980- 26 August 2010 (has links)
Nearly all modern software applications, from the simplest website user account system to the most complex, enterprise-level, completely-integrated infrastructure, utilize some sort of backend data storage and business logic that interacts with the backend. The ubiquitous nature of this backend/business dichotomy makes sense as the need to both store and manipulate data can be traced as far back as the Turing Machine in Computer Science. The most commonly used technologies for these two aspects are Relational Database Management Systems (RDBMS) for backend and Object-Oriented Programming (OOP) for business logic. However, these two methodologies are not immediately compatible and the inherent differences between data represented in RDBMS and data represented in OOP are not trivial.
Taking a .NET developer’s perspective, this report aims to explore the RDBMS/OO dichotomy and its inherent issues. Schema management theory and algebra are discussed to gain a better perspective of the domain, and a survey of existing solutions for the .NET environment is presented. Additionally, methods outside the mainstream are discussed. The advantages and disadvantages of each are weighed and presented to the reader to aid in future design implementations. / text
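As a language-neutral illustration of the impedance mismatch itself (sketched here in Python rather than C#), the fragment below maps a nested object graph onto two relational tables by hand, using an invented customer/order schema. This hand-written mapping is the kind of boilerplate that the .NET ORM solutions surveyed in the report aim to generate automatically.

```python
import sqlite3
from dataclasses import dataclass, field

# Object side: a customer holds its orders directly as a nested list.
@dataclass
class Order:
    item: str

@dataclass
class Customer:
    name: str
    orders: list[Order] = field(default_factory=list)

# Relational side: the same data must be flattened into two tables
# linked by a foreign key, and reassembled with extra queries on the way out.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE orders (customer_id INTEGER, item TEXT)")

def save(customer: Customer) -> int:
    cur = conn.execute("INSERT INTO customer (name) VALUES (?)", (customer.name,))
    cid = cur.lastrowid
    conn.executemany("INSERT INTO orders VALUES (?, ?)",
                     [(cid, o.item) for o in customer.orders])
    return cid

def load(cid: int) -> Customer:
    (name,) = conn.execute("SELECT name FROM customer WHERE id = ?", (cid,)).fetchone()
    rows = conn.execute("SELECT item FROM orders WHERE customer_id = ?", (cid,)).fetchall()
    return Customer(name, [Order(item) for (item,) in rows])

cid = save(Customer("Ada", [Order("keyboard"), Order("monitor")]))
print(load(cid))
```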
|
159 |
Key Elements of Software Product Integration Processes
Larsson, Stig January 2007 (has links)
Product integration is a particularly critical phase of the software product development process, as many problems originating from earlier phases become visible in this phase. Problems in product integration result in delays and rework. One measure to decrease the late discovery of problems is the use of development standards and guidelines that define practices to ensure correctness of the product integration. However, even where such standards and reference models exist, they are not used consistently. One of the reasons is the lack of proof that they indeed improve the integration process and, even more importantly, that they are sufficient for performing efficient and correct product integration. The conclusion of the presented research is that the descriptions available in standards and reference models, taken one by one, are insufficient and must be consolidated to help development organizations improve the product integration process. The research has resulted in a proposed combination of the activities included in the different reference models, based on a number of case studies. Through case studies performed in seven different product development organizations, a relationship is identified between observed problems and the failure to follow the recommendations in reference models. The analysis has indicated which practices are necessary, and how other practices support these. The goal of the research is to provide product development organizations with guidelines for how to perform software product integration. One additional finding of the research is the existence of a relation between software architecture and the development process. A method for identifying dependencies between the evolution of software architectures and the adaptation of integration practices has been demonstrated.
|
160 |
Traveling of Requirements in the Development of Packaged Software: An Investigation of Work Design and Uncertainty
Gregory, Thomas 27 June 2014 (has links)
Software requirements, and how they are constructed, shared and translated across software organizations, express uncertainties that software developers need to address through appropriate structuring of the process and the organization at large. To gain new insights into this important phenomenon, we rely on theory of work design and the travelling metaphor to undertake an in-depth qualitative inquiry into recurrent development of packaged software for the utility industry. Using the particular context of software provider GridCo, we examine how requirements are constructed, shared, and translated as they travel across vertical and horizontal boundaries. In revealing insights into these practices, we contribute to theory by conceptualizing how requirements travel, not just locally, but across organizations and time, thereby uncovering new knowledge about the responses to requirement uncertainty in development of packaged software. We also contribute to theory by providing narrative accounts of in situ requirements processes and by revealing practical consequences of organization structure on managing uncertainty.
|