111

A model for enhancing software project management using software agent technology

Nienaber, R. C. (Rita Charlotte) 06 1900 (has links)
The present study has originated from the realisation that numerous software development projects either do not live up to expectations or fail outright. The scope, environment and implementation of traditional software projects have changed due to various reasons such as globalisation, advances in computing technologies and, last but not least, the development and deployment of software projects in distributed, collaborative and virtual environments. As a result, traditional project management methods cannot and do not address the added complexities found in this ever-changing environment. In this study the processes and procedures associated with software project management (SPM) were explored. SPM can be defined as the process of planning, organising, staffing, monitoring, controlling and leading a software project. The current study is principally aimed at making a contribution to enhancing and supporting SPM. A thorough investigation into software agent computing resulted in the realisation that software agent technology can be regarded as a new paradigm that may be used to support the SPM processes. A software agent is an autonomous system that forms part of an environment, can sense the environment and act on it over a period of time, in pursuit of its own agenda. The software agent can also perceive, reason and act by selecting and executing an appropriate action. The unique requirements of SPM and the ways in which agent technology may address these were subsequently identified. It was concluded that agent technology is specifically suited to geographically distributed systems, large network systems and mobile devices. Agents provide a natural metaphor for support in a team environment where cooperation and the coordination of actions toward a common goal, as well as the monitoring and controlling of actions are strongly supported. Although it became evident that agent technology is indeed being applied to areas and sections of the SPM environment, it is not being applied to the whole spectrum, i.e. to all core and facilitating functions of SPM. If software agents were to be used across the whole spectrum of SPM processes, this could provide a significant advantage to software project managers who are currently using other contemporary methods. The "SPMSA" model (Software Project Management supported by Software Agents) was therefore proposed. This model aims to enhance SPM by taking into account the unique nature and changing environment of software projects. The SPMSA model is unique as it supports the entire spectrum of SPM functionality, thus supporting and enhancing each key function with a team of software agents. Both the project manager and individual team members will be supported during software project management processes to simplify their tasks, eliminate the complexities, automate actions and enhance coordination and communication. Virtual teamwork, knowledge management, automated workflow management and process and task coordination will also be supported. A prototype of a section of the risk management key function of the SPMSA model was implemented as `proof of concept'. This prototype may be expanded to include the entire SPMSA model and cover all areas of SPM. Finally, the SPMSA model was verified by comparing the SPM phases of the model to the Plan-Do-Check-Act (PDCA) cycle. These phases of the SPMSA model were furthermore compared to the basic phases of software development as prescribed by the ISO 10006:2003 standard for projects. 
In both cases the SPMSA model compared favourably. Hence it can be concluded that the SPMSA model makes a fresh contribution to the enhancement of SPM by utilising software agent technology. / School of Computing / Ph. D. (Computer Science)
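The agent behaviour the abstract describes — an autonomous component that senses its environment, reasons about what it perceives, and selects an appropriate action in pursuit of its goal — can be pictured as a simple sense-decide-act loop. The sketch below is only an illustration of that cycle applied to the risk-monitoring scenario the thesis prototyped; the RiskMonitorAgent class, its threshold and the project metrics it inspects are hypothetical and are not drawn from the SPMSA model itself.

```python
from dataclasses import dataclass

@dataclass
class ProjectState:
    """A snapshot of the environment the agent can perceive."""
    planned_progress: float  # fraction of work planned to be complete by now
    actual_progress: float   # fraction of work actually complete
    open_risks: int

class RiskMonitorAgent:
    """Illustrative sense-decide-act agent; not the SPMSA implementation."""

    def __init__(self, slippage_threshold: float = 0.10):
        self.slippage_threshold = slippage_threshold

    def sense(self, environment: ProjectState) -> float:
        # Perceive the environment: measure schedule slippage.
        return environment.planned_progress - environment.actual_progress

    def decide(self, slippage: float) -> str:
        # Reason over the percept and select an appropriate action.
        if slippage > self.slippage_threshold:
            return "escalate_to_project_manager"
        if slippage > 0:
            return "notify_team_leader"
        return "no_action"

    def act(self, environment: ProjectState) -> str:
        # One pass of the agent's autonomous cycle.
        return self.decide(self.sense(environment))

if __name__ == "__main__":
    agent = RiskMonitorAgent()
    state = ProjectState(planned_progress=0.60, actual_progress=0.45, open_risks=3)
    print(agent.act(state))  # -> escalate_to_project_manager
```

In a full multi-agent setting of the kind the model envisages, one such agent would serve each key SPM function and the agents would coordinate through messaging; the sketch deliberately omits that layer.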
112

Providing mechanical support for program development in a weakest precondition calculus

Ackerman, Charlotte Christene 04 1900 (has links)
Thesis (MSc)--Stellenbosch University, 1993. / ENGLISH ABSTRACT: Formal methods aim to apply the rigour of mathematical logic to the problem of guaranteeing that the behaviour of (critical) software conforms to predetermined requirements. The application of formal methods during program construction centers around a formal specification of the required behaviour of the program. A development attempt is successful if the resulting program can be formally proven to conform to its specification. For any substantial program, this entails a great deal of effort. Thus, some research efforts have been directed at providing mechanical support for the application of formal methods to software development. E.W. Dijkstra's calculus of weakest precondition predicate transformers [39,38] represents one of the first attempts to use program correctness requirements to guide program development in a formal manner. / AFRIKAANSE OPSOMMING: Formele metodes poog om die strengheid van wiskundige logika te gebruik om te waarborg dat die gedrag van (kritiese) programmatuur voldoen aan gegewe vereistes. Die toepassing van formele metodes tydens programontwikkeling sentreer rondom 'n formele spesifikasie van die verlangde programgedrag. 'n Ontwikkelingspoging is suksesvol as daar formeel bewys kan word dat die resulterende program aan sy spesifikasie voldoen. Vir enige substansiële program, verteenwoordig dit 'n aansienlike hoeveelheid werk. Verskeie navorsingspogings is gerig op die daarstelling van meganiese ondersteuning vir die gebruik van formele metodes tydens ontwikkeling van sagteware. E. W. Dijkstra se calculus van swakste voorkondisie (“weakest precondition”) predikaattransformators [39,38] is een van die eerste pogings om vereistes vir programkorrektheid op 'n formele en konstruktiewe wyse tydens programontwikkeling te gebruik.
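For reference, the core rules of the weakest-precondition calculus mentioned in the abstract can be summarised as follows; these are the standard definitions over Dijkstra's guarded-command language, quoted from the general literature rather than from this thesis.

```latex
\begin{align*}
wp(\mathbf{skip},\, R) &\;\equiv\; R\\
wp(x := E,\, R) &\;\equiv\; R[x := E]\\
wp(S_1 ; S_2,\, R) &\;\equiv\; wp(S_1,\, wp(S_2,\, R))\\
wp(\mathbf{if}\ G_1 \to S_1\ [\!]\ G_2 \to S_2\ \mathbf{fi},\, R)
  &\;\equiv\; (G_1 \lor G_2) \;\land\; (G_1 \Rightarrow wp(S_1, R)) \;\land\; (G_2 \Rightarrow wp(S_2, R))
\end{align*}
```

A program $S$ satisfies a specification with precondition $P$ and postcondition $R$ (in the total-correctness sense) exactly when $P \Rightarrow wp(S, R)$; discharging or simplifying this proof obligation is what mechanical support for the calculus aims at.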
113

An algebraic framework for reasoning about security

Rajaona, Solofomampionona Fortunat 03 1900 (has links)
Thesis (MSc)--Stellenbosch University, 2013. / ENGLISH ABSTRACT: Stepwise development of a program using refinement ensures that the program correctly implements its requirements. The specification of a system is “refined” incrementally to derive an implementable program. The programming space includes both specifications and implementable code, and is ordered with the refinement relation which obeys some mathematical laws. Morgan proposed a modification of this “classical” refinement for systems where the confidentiality of some information is critical. Programs distinguish between “hidden” and “visible” variables and refinement has to bear some security requirement. First, we review refinement for classical programs and present Morgan's approach for ignorance preserving refinement. We introduce the Shadow Semantics, a programming model that captures essential properties of classical refinement while preserving the ignorance of hidden variables. The model invalidates some classical laws which do not preserve security while it satisfies new laws. Our approach will be algebraic: we propose algebraic laws to describe the properties of ignorance preserving refinement, thus completing the laws proposed in. Moreover, we show that the laws are sound in the Shadow Semantics. Finally, following the approach of Hoare and He for classical programs, we give a completeness result for the program algebra of ignorance preserving refinement. / AFRIKAANSE OPSOMMING: Stapsgewyse ontwikkeling van 'n program met behulp van verfyning verseker dat die program voldoen aan die vereistes. Die spesifikasie van 'n stelsel word geleidelik “verfyn” wat lei tot 'n implementeerbare kode, en word georden met 'n verfyningsverhouding wat wiskundige wette gehoorsaam. Morgan stel 'n wysiging van hierdie klassieke verfyning voor vir stelsels waar die vertroulikheid van sekere inligting van kritieke belang is. Programme onderskei tussen “verborge” en “sigbare” veranderlikes en verfyning voldoen aan 'n paar sekuriteitsvereistes. Eers hersien ons verfyning vir klassieke programme en verduidelik Morgan se benadering tot onwetendheid behoud. Ons verduidelik die “Shadow Semantics”, 'n programmeringsmodel wat die noodsaaklike eienskappe van klassieke verfyning omskryf terwyl dit die onwetendheid van verborge veranderlikes laat behoue bly. Die model voldoen nie aan 'n paar klassieke wette, wat nie sekuriteit laat behoue bly nie, en dit voldoen aan nuwe wette. Ons benadering sal algebraïes wees. Ons stel algebraïese wette voor om die eienskappe van onwetendheid behoudende verfyning te beskryf, wat dus die wette voorgestel in voltooi. Verder wys ons dat die wette konsekwent is in die “Shadow Semantics”. Ten slotte, na aanleiding van die benadering in vir klassieke programme, gee ons 'n volledigheidsresultaat vir die program algebra van onwetendheid behoudende verfyning.
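A standard illustration of the kind of classical law that the Shadow Semantics invalidates — taken from the general literature on ignorance-preserving refinement rather than from this thesis — is the resolution of a demonic choice over a hidden variable h:

```latex
h :\in \{0,1\} \;\sqsubseteq\; h := 0 \quad\text{(valid classically)}
\qquad
h :\in \{0,1\} \;\not\sqsubseteq\; h := 0 \quad\text{(rejected when } h \text{ is hidden)}
```

Intuitively, after h :∈ {0,1} an observer of the visible variables only knows that h lies in {0,1}, whereas after h := 0 the value of h is determined; the refinement shrinks the shadow (the set of values h might still have) and therefore reduces the observer's ignorance, which ignorance-preserving refinement forbids.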
114

The science of determining norms for the planning and management of software development projects

Bannister, H. C. 12 1900 (has links)
ENGLISH ABSTRACT: Most people working in the software industry recognise that developing software to predictable schedules is risky and not easy. They experience problems estimating how long the development of software will take. Underestimation leads to understaffing and setting too short a schedule. That in turn leads to staff burnout, low quality and missed deadlines. Overestimation can be almost as bad: Parkinson's Law that work expands to fill available time comes into action, which means the project will take as long as estimated even if overestimated. Currently people do not put in much effort to estimate jobs and therefore projects take as long as they take. Methods to manage uncertainty lead to putting in excessive safety and then wasting it. Business usually presents a target for the project with tremendous pressure for low 'estimates' during the bidding process and in the end this target becomes the plan. Best practice appears to manage the gap between this target and the estimate as a risk on the project. Without an efficient work breakdown structure (WBS) one cannot accurately estimate. Subject experts should help the project manager to plan the detail of how the work should be done. A functional design by a systems architect helps to break down the technical tasks. This is very important because omitted tasks will not be estimated for. The first step towards sound estimates is to estimate the size. This is extremely difficult at the initial phase but can be overcome if the company stores history of size and completion time in a repository. Although lines of code are most often used as a size measure, function points or function blocks appear to be better, especially for the initial estimate. If an organisation has not kept historic data, now is the time to start doing it. The suggested procedure to follow before starting to gather information, is to define what is going to be kept (the norms), to delimit the defined data, to discipline the collection, to deposit it in an established repository and to deliver it in readily usable format. The tool used for storing these metrics must provide for building in factors that influence effort like complexity, skills level, elapsed time, staff turnover, etc. There are many different techniques for estimation. The best option seems to use a combination of estimates and include developer opinion and a mathematical method like function point analysis. Estimation of software development, like all other work processes, needs management control and this is called metrics management. This process includes establishing some kind of model. Empirical models, based on a database with data stored by one's own company, give the best results. Two very good models are the Putnam model and the Parr model (for smaller projects). Even the best model and process is never perfect. Therefore the estimation process must be continuously monitored by comparing actual duration times to estimates. Be careful with feedback on how accurate estimates were. No feedback is the worst response. Carefully discussing the implications of underestimation and putting heads together to solve the problem appears to be the best solution. / AFRIKAANSE OPSOMMING: Die meeste mense in die sagteware industrie erken dat om sagteware te ontwikkel teen voorspelbare tydskedules, gevaar inhou en nie maklik is nie. Hulle ondervind probleme om te skat hoe lank die ontwikkeling van sagteware hulle gaan neem. Onderskatting lei tot te min hulpbronne en te kort skedules.
Dit veroorsaak uitbrand van mense, lae kwaliteit en einddatums wat nie gehaal word nie. Oorskatting is byna net so erg. Parkinson se Wet dat werk geskep word om beskikbare tyd te vul, kom in aksie en aan die einde beteken dit die projek neem so lank as wat geskat is, selfs al is dit oorskat. Huidiglik doen mense nie veel moeite om tyd te skat op take nie en daarom neem projekte so lank as wat dit neem om te voltooi. Metodes om onsekerheid te bestuur lei tot die byvoeg van oormatige veiligheidstyd net om dit daarna weer te verkwis. Die besigheid verskaf gewoonlik 'n mikpunt vir die projek met geweldige druk vir lae skattings tydens bieery en op die ou end raak hierdie mikpunt die projekplan. Die beste manier om dit die hoof te bied is om die gaping tussen hierdie mikpunt en die skatting te bestuur as 'n projek risiko. Niemand kan akkuraat skat sonder 'n effektiewe metode van werk afbreek nie. Vakkundiges behoort die projekbestuurder te help om die detail van hoe die werk gedoen gaan word, te beplan. 'n Funksionele ontwerp deur 'n stelselsargitek help om die tegniese take verder af te breek. Dit is baie belangrik aangesien take wat uitgelaat word, nie in die skatting ingesluit gaan word nie. Die eerste stap om by gesonde skattings uit te kom, is om grootte te skat. Dit is besonder moeilik in die aanvanklike fase, dog kan oorkom word indien die maatskappy geskiedenis stoor van hoe groot voltooide take was en hoe lank dit geneem het. Alhoewel lyne kodering die mees algemeenste vorm van meting van grootte is, lyk dit asof funksie punte of funksie blokke beter werk, veral by die aanvanklike skatting. Indien 'n organisasie nie historiese data stoor nie, is dit nou die tyd om daarmee te begin. Die voorgestelde prosedure om te volg voordat informasie gestoor word, is om te definieer wat gestoor gaan word (norme te bepaal), om die data af te baken, dissipline toe te pas by die insameling, dit te stoor in 'n gevestigde databasis en dit beskikbaar te stel in bruikbare formaat. Die instrument wat gebruik word om hierdie syfers te stoor moet voorsiening maak vir die inbou van faktore wat produksie beïnvloed, soos kompleksiteit, vlak van vaardigheid, verstreke tyd, personeel omset, ens. Daar bestaan menige verskillende tegnieke vir skatting. Die beste opsie blyk 'n kombinasie van skattings te wees. Die opinie van die programmeur asook een wiskundige metode soos funksie punt analise, behoort deel te wees hiervan. Soos alle ander werksprosesse, moet skattings vir sagteware ontwikkeling ook bestuur word en dit word metrieke bestuur genoem. Hierdie proses behels dat daar besluit moet word op een of ander model. Empiriese modelle gebaseer op 'n databasis waarin data gestoor word deur die maatskappy self, gee die beste resultate. Twee baie goeie modelle is die Putnam model en die Parr model (vir kleiner projekte). Selfs die beste model en proses is egter nooit perfek nie. Die estimasie proses moet dus voortdurend gemonitor word deur werklike tye met geskatte tye te vergelyk. Wees versigtig met terugvoer aangaande hoe akkuraat skattings was. Geen terugvoer is die ergste oortreding. Die beste oplossing skyn te wees om die implikasie van die onderskatting met die persoon wat die skatting gedoen het, te bespreek en koppe bymekaar te sit om die probleem op te los.
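The repository-based approach the abstract argues for — record the size and actual effort of completed work, then calibrate new estimates against that history — can be sketched in a few lines. This is an illustrative example only: the function-point sizes, the effort figures and the choice of median productivity are assumptions made for the sketch, not values or methods prescribed by the thesis.

```python
from statistics import median

# Hypothetical repository of completed work:
# (size in function points, actual effort in person-days).
history = [
    (120, 95),
    (80, 70),
    (200, 180),
    (60, 40),
    (150, 130),
]

def estimate_effort(size_fp: float, history) -> float:
    """Estimate effort for a new piece of work of `size_fp` function points.

    Uses the median historical productivity (person-days per function point),
    so a single runaway project in the repository does not skew the estimate.
    """
    productivities = [effort / size for size, effort in history]
    return size_fp * median(productivities)

if __name__ == "__main__":
    new_size = 100  # function points, from the initial size estimate
    print(f"Estimated effort: {estimate_effort(new_size, history):.0f} person-days")
```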
115

A characterisation of open source software adoption decisions in South African organisations

Moolman, Lafras 03 1900 (has links)
Thesis (MBA)--University of Stellenbosch, 2011. / The objective of this research is to characterise the factors influencing Open Source Software (OSS) adoption decisions in South African organisations. OSS is used extensively throughout the world, but there is a large amount of fear, uncertainty and doubt surrounding decisions to adopt OSS. The research improved this situation by determining the adoption factors that are relevant to South African organisations. OSS adoption is influenced by individual and organisational technology adoption factors. An extensive literature review revealed the technology adoption factors relevant to OSS adoption. Adoption factors identified in the literature were localised to the South African context, taking into account both public and private sector organisations. The research found that OSS adoption factors identified in the literature are relevant in a South African context. Factors investigated include access to source code, adoption costs, software freedom and control, technological factors, support factors, organisational factors and environmental factors. An important factor in OSS adoption decisions is the choice between vendor-based and community-based OSS, with its associated skills and resource requirements. Choosing community-based software requires additional training, skills and resources. Organisations should take into account the effect of the OSS development methodology on adoption decision factors. Important adoption decision factors include software compatibility (open standards), compatibility with different hardware platforms (cross-platform capabilities) and software and hardware vendor independence. The research concludes with recommendations for approaching OSS adoption decisions when considering the choice between OSS and proprietary software.
116

An interpretive study of software risk management perspectives.

Padayachee, Keshnee. January 2002 (has links)
This dissertation addresses risk management in the software development context. The discussion commences with the risks in software development and the necessity for a software risk management process. The emergent discourse is based on the shortfalls in current risk management practices, elaborated in the software risk management literature. This research proposes a framework for a field investigation of risk management in the context of a particular software development organization. It was experimentally tested within several companies. This framework was designed to provide an understanding of the software development risk phenomena from a project manager's perspective and to understand how this perspective affects their perception. This was done with respect to the consideration of the advantages and disadvantages of software risk management as regards its applicability or inapplicability, respectively. This study can be used as a precursor to improving research into the creation of new software risk management frameworks. / Thesis (M.Sc.)-University of Natal, Pietermaritzburg, 2002.
117

Interactive CD-ROM computer tour of the Ball State University Department of Art

Pridemore, David H. January 1995 (has links)
For my creative thesis project I authored an interactive tour of the Ball State Department of Art. Many underlying factors went into this project. My desire to learn multimedia design, the department's desire to develop a new information tool and having the necessary hardware and software to do such a project were all key to its success. In the summer of 1994 I came to Ball State to learn multimedia authoring while getting a master's degree in art. Unknown to me at that time, the department had set a goal of increasing visibility both within and beyond the Ball State community. Faculty members Professor Phil Repp and Professor Christine Paul were collaborating on a promotional identity campaign. From these collaborations grew the idea of a departmental publication to promote the mission and programs of the Department of Art. With the rapid advancement of technology, it seemed appropriate to use computers as part of this promotional campaign. As Professors Paul and Repp researched the possible ways in which computers could be incorporated into this project, many questions remained. Exactly what form should a project like this take and who could do it? Careful discussion and planning also followed over what physical form the project should take (i.e. video tape, a computer disk, or printed material). Eventually the decision was made that an interactive tour of the Department of Art on CD-ROM was the most appropriate solution. For the amount of information that needed to be included and to engage the end user in a dynamic, interactive way, this medium was also the most logical. My decision to return to school coincided perfectly with the department's needs. Professor Paul's and Professor Repp's collaboration led to the conclusion that a third person would be needed: someone who was already literate in advanced computer graphics and had the desire for such an undertaking. Therefore, my goals of advancing my understanding of Macintosh-based digital imagery and learning multimedia are significant on two levels: my career as a teacher and a professional artist would realize significant gains, and this project is an outstanding addition to my portfolio. For the past several years, the primary area of artistic study for me has been in the area of computer graphics and I came to Ball State last summer with some very specific goals, one of them being to learn Macromedia Director (the authoring package I used to create the project). Director is nationally recognized by professionals in this field as the top program for this type of work. Therefore, this was both an opportunity to reach personal goals and to create a thesis project that could be used as an important part of the Department of Art's identity campaign. My thesis project is the result of my own goals and the Department of Art's goals to utilize cutting edge technology for designing innovative computer programs. I'm sure at the onset of this project that I did not understand the full magnitude of an undertaking such as this. However, it is very rewarding to look back and see both how far I've come personally and how the piece has progressed into a dynamic information tool. / Department of Art
118

SEM Predicting Success of Student Global Software Development Teams

Brooks, Ian Robert 05 1900 (has links)
The extensive use of global teams to develop software has prompted researchers to investigate various factors that can enhance a team’s performance. While a significant body of research exists on global software teams, previous research has not fully explored the interrelationships and collective impact of various factors on team performance. This study explored a model that added the characteristics of a team’s culture, ability, communication frequencies, response rates, and linguistic categories to a central framework of team performance. Data was collected from two student software development projects that occurred between teams located in the United States, Panama, and Turkey. The data was obtained through online surveys and recorded postings of team activities that occurred throughout the global software development projects. Partial least squares path modeling (PLS-PM) was chosen as the analytic technique to test the model and identify the most influential factors. Individual factors associated with response rates and linguistic characteristics proved to significantly affect a team’s activity related to grade on the project, group cohesion, and the number of messages received and sent. Moreover, an examination of possible latent homogeneous segments in the model supported the existence of differences among groups based on leadership style. Teams with assigned leaders tended to have stronger relationships between linguistic characteristics and team performance factors, while teams with emergent leaders had stronger relationships between response rates and team performance factors. The contributions in this dissertation are threefold: 1) novel analysis techniques using PLS-PM and clustering, 2) use of new, quantifiable variables in analyzing team activity, and 3) identification of plausible causal indicators for team performance and analysis of the same.
119

An Examination of the Effect of Decision Style on the Use of a Computerized Project Management Tool

Fox, Terry L., 1963- 08 1900 (has links)
Managing a software development project presents many difficulties. Most software development projects are considered less than successful, and many are simply canceled. Ineffective project management has been cited as a major factor contributing to these failures. Project management tools can greatly assist managers in tracking and controlling their projects. However, project management tools are very structured and analytical in nature, which does not necessarily suit the decision-making styles of the managers. This research examined the influence that decision style has on a project manager's use of a project management tool.
120

Improvement of the software systems development life cycle of the credit scoring process at a financial institution through the application of systems engineering

Meyer, Nadia 11 October 2016 (has links)
A Research Report Submitted to the Faculty of Engineering and the Built Environment in partial fulfilment of the Requirements for the degree of Master of Science in Engineering / The research centred on improving the current software systems development life cycle (SDLC) of the credit scoring process at a financial institution based on systems engineering principles. The research sought ways to improve the current software SDLC in terms of cost, schedule and performance. This paper proposes an improved software SDLC that conforms to the principles of systems engineering. As decisioning has been automated in financial institutions, various processes are developed according to a software SDLC in order to ensure accuracy and validity thereof. This research can be applied to various processes within financial institutions where software development is conducted, verified and tested. A comparative analysis between the current software SDLC and a recommended SDLC was performed. Areas within the current SDLC that did not comply with systems engineering principles were identified. These inefficiencies were found during unit testing, functional testing and regression testing. An SDLC is proposed that conforms to systems engineering principles and is expected to reduce the current SDLC schedule by 20 per cent. Proposed changes include the sequence of processes within the SDLC, increasing test coverage by extracting data from the production environment, filtering and sampling data from the production environment, automating functional testing using mathematical algorithms, and creating a test pack for regression testing which adequately covers the software change. / MT2016
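The abstract's proposal to increase test coverage by extracting, filtering and sampling production data could, for instance, be realised as stratified sampling over the segments the scoring logic distinguishes. The sketch below is a hypothetical illustration: the score_band field, the segment weights and the pack size are invented for the example and are not taken from the research report.

```python
import random
from collections import defaultdict

def build_test_pack(production_records, strata_key, per_stratum=50, seed=42):
    """Draw a stratified sample of production records for a regression test pack.

    Grouping by `strata_key` (e.g. the score band a record falls into) ensures
    that every segment the credit-scoring logic handles is represented, which a
    plain random sample from a skewed production population would not guarantee.
    """
    rng = random.Random(seed)
    strata = defaultdict(list)
    for record in production_records:
        strata[record[strata_key]].append(record)

    test_pack = []
    for records in strata.values():
        # Take at most `per_stratum` records from each segment.
        test_pack.extend(rng.sample(records, min(per_stratum, len(records))))
    return test_pack

if __name__ == "__main__":
    # Hypothetical production extract: applicant records tagged with a score band.
    bands = random.choices(["A", "B", "C", "D"], weights=[50, 30, 15, 5], k=10_000)
    production = [{"id": i, "score_band": band} for i, band in enumerate(bands)]

    pack = build_test_pack(production, strata_key="score_band", per_stratum=25)
    print(f"{len(pack)} records in the regression test pack")
```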
