11

Specifying, refining and verifying reactive system design with UML and CSP

Ng, Muan Yong January 2005 (has links)
No description available.
12

On the evolutionary optimisation of many objectives

Purshouse, Robin Charles January 2003 (has links)
No description available.
13

Continual resource estimation for evolving software

Fernández Ramil, Juan Carlos January 2003 (has links)
No description available.
14

Automatic graph layout in software architecture

Slade, Andrew John January 2006 (has links)
No description available.
15

Using requirements and design information to predict volatility in software development

Ingram, Claire January 2011 (has links)
We hypothesise that data about the requirements and design stages of a software development project can be used to make predictions about the subsequent number of development changes that software components will experience. This would allow managers to concentrate time-consuming efforts (such as traceability and staff training) on a few at-risk, cost-effective areas, and may also allow predictions to be made at an earlier stage than is possible using traditional metrics, such as lines of code. Previous researchers have studied links between change-proneness and metrics such as measures of inheritance, size and code coupling. We extend these studies by including measures of requirements and design activity as well. Firstly we develop structures to model the requirements and design processes, and then propose some new metrics based on these models. The structures are populated using data from a case study project and analysed alongside existing complexity metrics to ascertain whether change-proneness can be predicted. Finally we examine whether combining these metrics with existing metrics improves our ability to make predictions about change-proneness. First results show that our metrics can be linked to the quantity of change experienced by components in a software development project (potentially allowing predictions to be made earlier than before), but that the best results are obtained either by combining existing complexity metrics such as size, or by combining existing metrics with our newer metrics.
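Purely as an illustration of the kind of metrics-based change-proneness prediction this abstract describes (not the thesis's actual metrics, models, or data), the sketch below fits a simple regression that combines a traditional size metric with hypothetical requirements- and design-activity metrics; all names and figures are invented.

```python
# Hypothetical sketch: predicting per-component change counts from a mix of
# a traditional size metric and requirements/design-activity metrics.
# All metric names and figures are invented for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

# Each row is one component:
# [lines_of_code, n_linked_requirements, n_design_revisions]
X = np.array([
    [1200, 4, 1],
    [350,  9, 3],
    [780,  2, 0],
    [2100, 7, 5],
    [460,  5, 2],
])
# Observed number of changes each component experienced during development.
y = np.array([14, 22, 5, 31, 12])

model = LinearRegression().fit(X, y)

# Predict change-proneness for a new component from early-stage data only.
new_component = np.array([[900, 6, 2]])
print("predicted changes:", model.predict(new_component)[0])
```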
16

Agent-based modelling of transactive memory systems and knowledge processes in agile versus traditional software development teams

Corbett, Andrea J. January 2013 (has links)
The objective of this research is to develop an agent-based model of transactive memory systems (TMS: meta-knowledge of expertise and knowledge in a team) simulating software development teams using two different software development methodologies: Waterfall (a structured methodology with a series of large discrete phases) and the Agile eXtreme Programming methodology (XP: more recent, dynamic, and tuned to change and flexibility). There does exist research relating to TMS; comparisons of software development methodologies; cognitive processes of software development teams; and also agent-based modelling of social and cognitive systems. This is interdisciplinary research spanning psychology and computer science, aiming to consolidate these discrete streams of research. The model evaluated the parameters of small/large tasks and working solo/in pairs to investigate the effect on TMS, knowledge and team output. Over three simulations, increasing in cognitive realism, the model introduced greater complexity and novelty to the agents' work, and various initial conditions of team knowledge and team member familiarity. The results illustrated a number of differences in TMS, knowledge processes and output between XP and Waterfall teams. The main findings indicate that as the novelty and complexity of the task increase, the use of some XP techniques can lessen the reduction in output. Also, the dependence on TMS accuracy for teams using some XP techniques in complex, novel environments is high, while the team knowledge distribution becomes much more homogeneous. This contradicts the literature that asserts a positive relationship between TMS accuracy and knowledge heterogeneity. Results also suggest that XP techniques can compensate for the advantages relating to team members' prior knowledge of each other, allowing newly formed XP teams to perform better. The results contribute to understanding how knowledge and memory processes in software development teams affect team output, and how the adoption of XP practices can produce results that challenge the established TMS literature.
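The following is a minimal, hypothetical sketch of a transactive memory directory in a small team, loosely inspired by the agent-based approach the abstract describes; it is not the thesis's model. Agents know a few topics and keep a directory of who knows what, and pair work (as in XP) spreads both skills and directory entries faster than solo work.

```python
# Minimal, hypothetical TMS sketch: agents hold topic knowledge plus a
# directory of "who knows what"; pairing updates both. Not the thesis's model.
import random

random.seed(1)
TOPICS = list(range(20))

class Agent:
    def __init__(self, known):
        self.known = set(known)                     # topics the agent can handle
        self.directory = {t: None for t in TOPICS}  # believed expert per topic

def tms_accuracy(team):
    """Fraction of filled directory entries that point at a genuine expert."""
    correct = total = 0
    for agent in team:
        for topic, expert in agent.directory.items():
            if expert is not None:
                total += 1
                correct += topic in expert.known
    return correct / total if total else 0.0

def work_on_task(team, pair_work):
    """One task: pick a topic, assign workers, update knowledge and directories."""
    topic = random.choice(TOPICS)
    workers = random.sample(team, 2 if pair_work else 1)
    expert = next((a for a in team if topic in a.known), None)
    for w in workers:
        if expert is not None:
            w.directory[topic] = expert  # worker learns who holds the expertise
        if pair_work:
            w.known.add(topic)           # pairing also spreads the skill itself

team = [Agent(random.sample(TOPICS, 5)) for _ in range(4)]
for _ in range(100):
    work_on_task(team, pair_work=True)
print("TMS accuracy after 100 paired tasks:", round(tms_accuracy(team), 2))
```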
17

An assessment of existing component-based software development methodologies and a holistic approach to CBSD

Tun, Thein Than January 2005 (has links)
No description available.
18

Search-based and goal-oriented refactoring using unfolding of graph transformation systems

Qayum, Fawad January 2012 (has links)
To improve the automation and traceability of search-based refactoring, in this thesis we propose a formulation based on graph transformation, where graphs represent object-oriented software architectures at the class level and rules describe refactoring operations. This formalisation allows us to make use of partial order semantics and an associated analysis technique, the approximated unfolding of graph transformation systems. In the unfolding we can identify dependencies and conflicts between refactoring steps, leading to an implicit and therefore more scalable representation of the search space by sets of transformation steps equipped with relations of causality and conflict. To implement search-based refactoring we make use of the approximated unfolding of graph transformation systems. An optimisation algorithm based on the Ant Colony paradigm is used to explore the search space, aiming to find a sequence of refactoring steps that leads to the best design at a minimal cost. Alternatively, we propose a more targeted approach, aiming at the removal of design flaws. The idea is that such sequences should be relevant to the removal of the flaw identified, i.e., contain only steps which directly or indirectly contribute to the desired goal.
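As a rough, hypothetical illustration of the Ant Colony search idea mentioned in the abstract (it does not model the thesis's unfolding-based representation or its causality and conflict relations), the sketch below uses pheromone-weighted selection to search for a well-scoring sequence of refactoring steps; the step names, costs, and fitness function are invented.

```python
# Toy Ant Colony Optimisation sketch for choosing a sequence of refactoring
# steps. Step names, gains, costs, and all parameters are invented.
import random

random.seed(0)
STEPS = ["extract_class", "pull_up_method", "move_method", "inline_class"]

# Hypothetical design-quality gain and effort cost of each refactoring step.
GAIN = {"extract_class": 5, "pull_up_method": 3, "move_method": 2, "inline_class": 1}
COST = {"extract_class": 4, "pull_up_method": 2, "move_method": 1, "inline_class": 1}

def fitness(sequence):
    """Reward design improvement, penalise effort; repeats give diminishing returns."""
    seen, score = {}, 0.0
    for s in sequence:
        seen[s] = seen.get(s, 0) + 1
        score += GAIN[s] / seen[s] - 0.5 * COST[s]
    return score

pheromone = {s: 1.0 for s in STEPS}
best_seq, best_fit = None, float("-inf")

for _ in range(200):                       # each iteration is one "ant"
    seq = random.choices(STEPS, weights=[pheromone[s] for s in STEPS], k=4)
    f = fitness(seq)
    if f > best_fit:
        best_seq, best_fit = seq, f
    for s in pheromone:                    # evaporate, then reinforce used steps
        pheromone[s] *= 0.95
    for s in seq:
        pheromone[s] += max(f, 0) / len(seq)

print("best sequence:", best_seq, "fitness:", round(best_fit, 2))
```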
19

A user-centred design framework for context-aware computing

Bradley, Nicholas Andrew January 2005 (has links)
Many exciting and promising application areas of mobile context-aware computing have emerged in recent years, such as tourist guides and navigation systems for visually impaired people. However, many researchers express grave concerns about the limited appreciation of human and social issues in design: usability issues remain unresolved particularly relating to mobile computer settings, and existing user-centred design approaches/frameworks are still in their infancy. This thesis proposes a framework to advance user-centred approaches to designing context-aware systems in order to help application developers (i) build richer descriptions or scenarios of mobile computer settings, and (ii) identify key human and social issues affecting the usability of their context-aware system. After a critical review of literature, a multidisciplinary model of context was developed in order to bring together theories, and proposed models, of context in Psychology, Linguistics, and Computer Science. This invaluable exercise illustrated the implications those theories have for context-aware computing. Three key perspectives of the multidisciplinary model were then used to investigate the issue of personalisation of context-aware services, focusing mainly on navigation services for visually impaired people. Firstly, the 'user's context' was investigated, where significant differences were found in the use of landmarks to navigate by people with a central vision loss, people with a peripheral vision loss, and registered blind people. Secondly, the 'application's context' involved designing context-aware services for transmission to participants within indoor and outdoor routes. Thirdly, the 'user-application's context', which brought together the first two perspectives, was investigated where it was found that certain groups were more effective at reaching landmarks when being given information that derived from people in the same visual impairment category. The multidisciplinary model, and the studies investigating its three key perspectives, were combined to form a user-centred framework for context-aware design. Key contributions included (i) richer modelling of user-interface interaction in mobile settings, and (ii) an augmentation to existing user-centred design approaches which includes not just meaningful activities of the user but also incidental and unpredictable activities that occur frequently in mobile settings.
20

Economies and diseconomies of scale in software engineering

Comstock, Craig January 2013 (has links)
A software development manager will often need to make decisions about the allocation of resources across a number of different development projects. These projects may vary in size, in importance, and in the potential value to the organisation of the software that is to be produced. Based upon these and other considerations, a manager might consider an increase in the resource allocation for a particular project. They might also consider changing the size of a planned software release by including or excluding a number of features. In doing this, the manager will need to consider economies, or diseconomies, of scale: if a project is given twice the resources, will the software be delivered in half the time? Will a particular project remain on schedule, despite a headcount reduction of 50%, if the size of the next deliverable is also reduced by 50%? If a software product would earn the company a certain amount if delivered in one month, and a lesser amount if delivered after that time, how much resource should be allocated to its development? Existing effort prediction models provide some guidance, but fail to agree about the extent, or even the existence, of economies and diseconomies of scale. The models are also limited to effort and scheduling predictions for a single project; there is no provision for the optimization of resource allocation, release sizing, and release scheduling across a portfolio of projects. This dissertation uses a new prediction model, derived from a large database of empirical results, to show the existence of an economy of scale with respect to project size, and a diseconomy of scale with respect to team size. It uses this model as a basis for the comparison and validation of the two leading prediction models, and shows that a failure to address the diseconomy of scale with respect to team size can lead to significant errors. Drawing upon the tools and techniques of financial portfolio management, the model is used as the basis of an "economic framework" for software development optimization, a set of tools for optimizing resource allocation and release sizing across a portfolio of projects, in terms of the net value obtained from the software produced. This framework is extended to address declining marginal value and to provide an indication of risk and variability across a project portfolio.
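A hypothetical worked example of the scale questions posed in the abstract, using a generic COCOMO-style effort formula with invented coefficients (an exponent below one standing in for the economy of scale in project size, plus a crude communication-overhead term for the team-size diseconomy); it is not the thesis's derived model or its empirical parameters.

```python
# Invented coefficients throughout; this only illustrates the shape of the
# questions ("does doubling the team halve the time?"), not the thesis's model.
def effort_person_months(size_kloc, a=3.0, b=0.9):
    """Nominal effort; with b < 1 this shows an economy of scale in project size."""
    return a * size_kloc ** b

def schedule_months(size_kloc, team_size, overhead=0.03):
    """Naive schedule: effort split across the team, plus a crude communication
    overhead that grows with team size (a stand-in for the team-size diseconomy)."""
    return effort_person_months(size_kloc) * (1 + overhead * (team_size - 1)) / team_size

# Does doubling the team halve the delivery time for a 50 KLOC project?
print(f"50 KLOC, 5 people:  {schedule_months(50, 5):.1f} months")
print(f"50 KLOC, 10 people: {schedule_months(50, 10):.1f} months")

# Does halving both the release size and the team keep the schedule unchanged?
print(f"25 KLOC, 5 people:  {schedule_months(25, 5):.1f} months")
```

With these invented numbers, doubling the team shortens the schedule by noticeably less than half, which is the kind of diseconomy the questions above are probing.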
