131

Software quality and governance in agile software development

Abbas, Noura January 2009
Looking at software engineering from a historical perspective, we can see how software development methodologies have evolved over the past 50 years. Using the right software development methodology with the right settings has always been a challenge. Therefore, there has always been a need for empirical evidence about what worked well and what did not, and about which factors affect the different variables of the development process. Probably the most noticeable change to software development methodology in the last 15 years has been the introduction of the word “agile”. As any area matures, there is a need to understand its components and relations, as well as a need for empirical evidence about how well agile methods work in real-life settings. In this thesis, we empirically investigate the impact of agile methods on different aspects of quality, including product quality, process quality and stakeholders’ satisfaction, as well as the different factors that affect these aspects. Quantitative and qualitative research methods were used, including semi-structured interviews and surveys. Quality was studied in two projects that used agile software development. The empirical study showed that both projects were successful, with multiple releases, improved product quality and stakeholders’ satisfaction. The data analysis produced a list of 13 refined grounded hypotheses, of which 5 were supported throughout the research. One project was studied in depth by collecting quantitative data about the process via a newly designed iteration monitor. The iteration monitor was used by the team over three iterations and helped identify issues and trends within the team in order to improve the process in the following iterations. Data about other organisations, collected via surveys, was used to generalise the obtained results. A variety of statistical analysis techniques were applied; these suggested that when agile methods have a good impact on quality they also have a good impact on productivity and satisfaction, and that when agile methods have a good impact on these aspects they also reduce cost. More importantly, the analysis clustered 58 agile practices into 15 factors, including incremental and iterative development, agile quality assurance, and communication. These factors can be used as a guide for agile process improvement. These results raised questions about agile project governance, and to answer them an agile projects governance survey was conducted. This survey collected 129 responses, and its statistically significant results suggested that retrospectives are more effective when applied properly, having more impact when the whole team participates and comments are recorded; that organisation size has a negative relationship with success; and that good practices go together, in that a team which does one aspect well tends to do all aspects well. Finally, the research results supported the hypotheses that agile software development can produce good-quality software, achieve stakeholders’ satisfaction, motivate teams, and ensure a quick and effective response to stakeholders’ requests, and that it goes in stages, matures, and improves over time.
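The abstract reports that statistical analysis clustered 58 agile practices into 15 factors but does not name the exact technique. Below is a minimal sketch of one plausible approach, exploratory factor analysis over survey responses; the data and grouping rule are hypothetical placeholders, not the thesis's material.

```python
# Minimal sketch of exploratory factor analysis, one way survey responses
# on agile practices could be clustered into factors. The data here is
# randomly generated placeholder material, not the thesis's survey data.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
# Hypothetical survey: 100 respondents rating 58 practices on a 1-5 scale.
responses = rng.integers(1, 6, size=(100, 58)).astype(float)

fa = FactorAnalysis(n_components=15, random_state=0)
fa.fit(responses)

# Group each practice under the factor on which it loads most strongly;
# the resulting groups are candidate practice clusters.
loadings = fa.components_.T              # shape: (58 practices, 15 factors)
assignments = np.abs(loadings).argmax(axis=1)
for factor in range(15):
    practices = np.where(assignments == factor)[0]
    print(f"factor {factor}: practices {practices.tolist()}")
```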
132

End-user data-centric interactions over linked data

Popov, Igor January 2013
The ability to build tools that support gathering and querying information from distributed sources on the Web rests on the availability of structured data. Linked Data, as a way of publishing and linking distributed structured data sources on the Web, provides an opportunity to create such tools. Currently, however, the ability to complete such tasks over Linked Data sources is limited to users with advanced technical skills, resulting in an online information space largely inaccessible to non-technical end users. This thesis explores the challenges of designing user interfaces that let end users, those without technical skills, use Linked Data to solve information tasks that require combining information from multiple sources. The thesis explores the design space of interfaces that support access to Linked Data on demand, suggests potential use cases and stakeholders, and proposes several direct-manipulation tools for end users with diverse needs and skills. User studies indicate that the tools built offer solutions to the various challenges in accessing Linked Data identified in this thesis.
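For readers unfamiliar with what such direct-manipulation tools abstract away, the sketch below shows the kind of raw SPARQL access to a Linked Data source that normally requires technical skill. The DBpedia endpoint and the query are illustrative only, not taken from the thesis.

```python
# Minimal sketch of querying a Linked Data source over SPARQL, the kind
# of task end-user tools typically hide behind a graphical interface.
# Endpoint and query are illustrative examples.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setQuery("""
    PREFIX dbo: <http://dbpedia.org/ontology/>
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?university ?label WHERE {
        ?university a dbo:University ;
                    rdfs:label ?label .
        FILTER (lang(?label) = "en")
    } LIMIT 5
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    print(binding["label"]["value"])
```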
133

Performance visualization of parallel programs

D'Paola, Oscar Naim January 1995
No description available.
134

Implementation and validation of model-based multi-threaded Java applications and Web services

Xue, Pengfei January 2008
In the software engineering world, many modelling notations and languages have been developed to aid application development. The technologies Java and Web services play an increasingly important role in web applications. However, because of issues of complexity, it is difficult to build multi-threaded Java applications and Web service applications, and even more difficult to model them. Furthermore, it is difficult to reconcile the directly coded application with the model-based application. Based on the formal modelling system RDT, the new work here covers: (i) a translator, RDTtoJava, used to automatically convert an RDT model into an executable multi-threaded Java application; (ii) a framework for developing an RDT model into a Java synchronous distributed application supported by JAX-RPC Web services; and (iii) a framework for developing an RDT model into a Java asynchronous distributed application supported by JMS Web services. Experience was gained by building distributed computing models and client/server models and by generating applications based on such models. This work is helpful for software developers and researchers in formal software development.
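The abstract does not show the generated code, so the following is a hedged illustration, written in Python for consistency with the other sketches in this listing rather than in Java, of the kind of channel-based skeleton a model-to-code translator might emit for two communicating processes. The model, channel, and behaviour are entirely hypothetical.

```python
# Hedged sketch (Python rather than Java) of the kind of channel-based
# skeleton a model-to-code translator might emit for two communicating
# processes. A bounded queue stands in roughly for a synchronous channel;
# all names and behaviour here are hypothetical.
import threading
import queue

channel = queue.Queue(maxsize=1)   # bounded channel approximating hand-off

def producer():
    for item in range(3):
        channel.put(item)          # blocks once the one-slot buffer is full
        print(f"producer: sent {item}")

def consumer():
    for _ in range(3):
        item = channel.get()
        print(f"consumer: received {item}")

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```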
135

The automated translation of integrated formal specifications into concurrent programs

Yang, Letu January 2008
The PROB model checker [LB03] provides tool support for an integrated formal specification approach, which combines the state-based B specification language [Abr96] with the event-based process algebra CSP [Hoa78]. The JCSP package [WM00b] presents a concurrent Java implementation of CSP/occam. In this thesis, we present a development strategy for implementing such a combined specification as a concurrent Java program. The combined semantics in PROB is flexible and ideal for model checking, but too abstract to be implemented directly in programming languages. Also, although the JCSP package gave us significant inspiration for implementing formal specifications in Java, we argue that it is not suitable for directly implementing the combined semantics in PROB. Therefore, we started by defining a restricted semantics derived from the original one in PROB. We then developed a new Java package, JCSProB, for implementing the restricted semantics in Java. The JCSProB package implements multi-way synchronisation with choice for combined B and CSP events, as well as a new multi-threading mechanism at the process level. A GUI sub-package is also provided for constructing GUI programs with JCSProB, allowing user interaction and runtime assertion checking. A set of translation rules relates the integrated formal models to Java and JCSProB, and we implement these rules in an automated translation tool that generates Java programs from such models. To demonstrate and exercise the tool, several B/CSP models, varying in both syntactic structure and behavioural properties, were translated by the tool. The models manifest the presence and absence of various safety, deadlock, and fairness properties; the generated Java code is shown to reproduce them faithfully. Run-time safety and fairness assertion checking is also demonstrated. We also experimented with composition and decomposition on several combined models, as well as on the Java programs generated from them. Composition techniques can help the user to develop large distributed systems and can significantly improve the scalability of developing combined models in PROB.
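Multi-way synchronisation means an event fires only when all participating processes engage in it, not just a sender/receiver pair. The sketch below conveys that flavour with Python's standard Barrier; it is an analogy only, since JCSProB's actual mechanism is Java-based and also supports choice, which this sketch omits.

```python
# Sketch of multi-way event synchronisation: all three processes must
# arrive before the shared event fires, loosely analogous to a combined
# B/CSP event on which several processes synchronise. Python's Barrier
# stands in for JCSProB's richer mechanism; choice is omitted.
import threading

N_PARTIES = 3
event_barrier = threading.Barrier(N_PARTIES)

def process(name: str) -> None:
    print(f"{name}: ready to engage in the shared event")
    event_barrier.wait()           # blocks until all N_PARTIES arrive
    print(f"{name}: shared event fired")

threads = [threading.Thread(target=process, args=(f"P{i}",))
           for i in range(N_PARTIES)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```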
136

A competency model for semi-automatic question generation in adaptive assessment

Sitthisak, Onjira January 2009
The concept of competency is increasingly important, since it conceptualises intended learning outcomes within the process of acquiring and updating knowledge. A competency model is critical to successfully managing assessment and achieving the goals of resource sharing, collaboration, and automation to support learning. Existing e-learning competency standards, such as the IMS Reusable Definition of Competency or Educational Objective (IMS RDCEO) specification and the HR-XML standard, are not able to accommodate complicated competencies, link competencies adequately, support comparisons of competency data between different communities, or support tracking of the knowledge state of the learner. Recently, the main goal of assessment has shifted away from content-based evaluation towards evaluation based on intended learning outcomes; the focus is on identifying learned capability rather than learned content. This change is associated with changes in the method of assessment. This thesis presents a system that demonstrates adaptive assessment and automatic generation of questions from a competency model, based on a sound pedagogical and technological approach. The system’s design and implementation involve an ontological database that represents the intended learning outcome to be assessed across a number of dimensions, including level of cognitive ability and subject-matter content. The system generates a list of the questions and tests that are possible from a given learning outcome, which may then be used to test for understanding, and so could determine the degree to which learners actually acquire the desired knowledge. Experiments were carried out to demonstrate and evaluate the generation of assessments and the sequencing of generated assessments from a competency data model, and to compare a variety of adaptive sequences. For each experiment, the methods and experimental results are described. The way in which the system has been designed and evaluated is discussed, along with its educational benefits.
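As a rough illustration of generating questions across the two dimensions mentioned, cognitive level and subject-matter content, the sketch below crosses Bloom-style levels with topics. The levels, templates, and topics are hypothetical placeholders, not the thesis's ontology or templates.

```python
# Minimal sketch of competency-driven question generation: crossing a
# cognitive-ability dimension with subject-matter content to enumerate
# candidate questions. Levels, templates, and topics are hypothetical.
from itertools import product

TEMPLATES = {
    "remember": "Define {topic}.",
    "understand": "Explain {topic} in your own words.",
    "apply": "Use {topic} to solve the following problem.",
    "analyse": "Compare {topic} with an alternative approach.",
}
topics = ["normalisation", "transactions"]

def generate_questions(levels, topics):
    # One candidate question per (cognitive level, topic) pair.
    for level, topic in product(levels, topics):
        yield level, TEMPLATES[level].format(topic=topic)

for level, question in generate_questions(TEMPLATES, topics):
    print(f"[{level}] {question}")
```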
137

Defect correction based domain decomposition methods for some nonlinear problems

Siahaan, Antony January 2011
Defect correction schemes, as a class of nonoverlapping domain decomposition methods, offer several advantages in the way they split a complex problem into several subdomain problems of lower complexity. The schemes need a nonlinear solver to take care of the residual at the interface. The adaptive-α solver can converge locally in the ∞-norm, where the sufficient condition requires a relatively small local neighbourhood and the problem must have a strongly diagonally dominant Jacobian matrix with a very small condition number. Yet its advantage can be highly significant for the computational cost, as it needs only a scalar as the approximation of the Jacobian matrix. The other nonlinear solvers employed for the schemes are a Newton-GMRES method, a Newton method with a finite difference Jacobian approximation, and nonlinear conjugate gradient solvers with the Fletcher-Reeves and Polak-Ribière search direction formulas. The schemes are applied to three nonlinear problems. The first problem is heat conduction in a multichip module, where the domain is assembled from many components of different conductivities and physical sizes. Here the implementations of the schemes satisfy the component meshing and gluing concept. A finite difference approximation of the residual of the governing equation turns out to be a better defect equation than the equality of normal derivatives. Of all the nonlinear solvers implemented in the defect correction scheme, the nonlinear conjugate gradient method with the Fletcher-Reeves search direction has the best performance. The second problem is a 2D single-phase fluid flow with heat transfer, where the PHOENICS CFD code is used to run the subdomain computation. The Newton method with a finite difference Jacobian is a reasonable interface solver for coupling these subdomain computations. The final problem is multiphase heat and moisture transfer in a porous textile. The PHOENICS code is again used to solve the system of partial differential equations governing the multiphase process in each subdomain, while the coupling of the subdomain solutions is handled by the defect correction schemes through some FORTRAN code. A scheme using a modified-α method fails to obtain decent solutions in both the single-layer and two-layer cases. On the other hand, the scheme using the above Newton method produces satisfying results for both cases, where it can lead initially distant interface data to a good convergent solution. However, it is found that in general the number of nonlinear iterations of the defect correction schemes increases with mesh refinement.
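To make the interface iteration concrete, the toy sketch below applies a nonoverlapping splitting with a Newton interface solver and a finite-difference (scalar) Jacobian to a 1D Poisson problem, using the finite-difference residual of the governing equation as the defect equation, as the abstract recommends. The thesis's problems and solvers are far richer; this is purely illustrative.

```python
# Toy nonoverlapping domain decomposition for -u'' = 2 on [0,1] with
# u(0) = u(1) = 0, split at x = 0.5. The unknown interface value g is
# driven to the solution by Newton iteration on the finite-difference
# residual of the governing equation at the interface node.
import numpy as np

n = 20                      # interior points per subdomain
h = 0.5 / (n + 1)           # mesh width on [0, 0.5] and [0.5, 1]

def solve_subdomain(left_bc, right_bc):
    """Solve -u'' = 2 on one subdomain with Dirichlet boundary data."""
    A = (np.diag(np.full(n, 2.0))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    b = np.full(n, 2.0)
    b[0] += left_bc / h**2
    b[-1] += right_bc / h**2
    return np.linalg.solve(A, b)

def interface_residual(g):
    """FD residual of -u'' - 2 = 0 at the interface node x = 0.5."""
    u_left = solve_subdomain(0.0, g)
    u_right = solve_subdomain(g, 0.0)
    return (-u_left[-1] + 2.0 * g - u_right[0]) / h**2 - 2.0

g, eps = 0.0, 1e-6                         # initial interface guess
for it in range(20):
    r = interface_residual(g)
    if abs(r) < 1e-10:
        break
    J = (interface_residual(g + eps) - r) / eps   # scalar FD Jacobian
    g -= r / J                                    # Newton update
print(f"interface value after {it + 1} iterations: {g:.6f} (exact 0.25)")
```

Because this toy problem is linear, the Newton iteration converges in a single step; the nonlinear problems treated in the thesis are where the choice of interface solver genuinely matters.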
138

Software for the collaborative editing of the Greek New Testament

Griffitts, Troy Andrew January 2018
This project was responsible for developing the Virtual Manuscript Room Collaborative Research Environment (VMR CRE), which supports the critical-editing workflow from raw data collection, through processing, to publication, within an open, online collaborative framework for the Institut für Neutestamentliche Textforschung (INTF) and their global partners while editing the Editio Critica Maior (ECM), the paramount critical edition of the Greek New Testament, which analyses over 5,600 Greek witnesses and includes a comprehensive apparatus of chosen manuscripts, weighted by quotations and early translations. Additionally, this project produced the first digital edition of the ECM. This case study of transitioning the workflow at the INTF to an online collaborative research environment seeks to convey successful methods and lessons learned by describing a professional software engineer’s foray into the world of academic digital humanities. It compares development roles and practices in the software industry with those in the academic environment, offers insights into how this software engineer found a software team therein, suggests how a fledgling online community can achieve critical mass, provides an outsider’s perspective on what a digital critical scholarly edition might be, and aims to offer useful software, datasets, and a thriving online community for manuscript researchers.
139

International reservations systems : their strategic and operational implications for the UK hotel industry

Pringle, Stuart M. January 1995
Nature and scope of work: This work presents details of the method and results of an investigation into the role and influence of international reservations systems within the UK hotel industry. The research comprised three questionnaire surveys of the use of computer reservations systems and distribution services by UK hotels. These are analysed to produce an indication of the general use of systems and the contribution they currently make to hotel groups and consortia. The work also included a study of developments in access methods and changes in buyer behaviour as observed by representatives of computer reservation and distribution system operators, travel agencies, hotel representation firms and intermediary companies. The impact of information technology developments on the travel agency industry, distribution system operators and intermediaries is considered. The work then indicates the potential implications of these developments for the strategic planning and operational management of hotels in light of prevailing attitudes to technology, preferred sales methods and buyer behaviour. A computer-based information and selection facility is developed. This provides a means of identifying the functions required of a distribution system in order to achieve specific business aims. It identifies the channels which meet the requirements while also providing details of the implications associated with the use of each. Contribution to knowledge: This research provides the first published account of the current and potential influence of information technology on the way in which the UK hotel sector deals with its market and on the structure of the industry itself. The work results from a comprehensive study of the role of a significant emerging technology within an important sector of the tourism, travel and leisure industry. It is complete in its own right but also forms a starting point for longitudinal research, since no previous work of this nature or scale has been undertaken in the area of interest. The guide developed as part of this work also lends itself to extensive future development as both the technology with which it is concerned and the technology upon which it is based continue to mature. The results of primary research indicate that there is scope for change in hotel sector sales and marketing practices as new methods of conducting business are adopted by hospitality industry service providers, agencies and the buying public. The work also suggests that global distribution systems are not the most suitable channel for all hotels, and that alternatives must be considered in the context of the particular requirements of each hotel business. The use of formal research methods provides those involved in this sector with an objective assessment of the implications of widespread adoption of computer-based reservation and distribution systems for individual businesses and for the industry as a whole. This addresses a requirement which was identified by the author and contributors in the course of the research. The subject area is complicated by the number of available channels through which businesses may distribute and receive information. This complexity is acknowledged throughout the work, which generates a distribution channel evaluation guide based on the research findings. The purpose of this device is to direct readers through the process of selecting the most appropriate channel to meet their specific business aims.
The guide is based on results from the various stages of primary research, which indicated the aspects of distribution system use about which hoteliers were unclear and also provided material about possible strategic uses and the operational implications experienced by users. This information was used to develop a staged method of identifying the type of system which would meet specific requirements and to indicate the implications associated with the use of a particular type of distribution system. This decision process is described and is then presented in the form of a hypertext document. The current version provides an elementary guide which can be used to assist qualitative evaluation in a complex subject area and indicates how this technology can be applied in its most basic form. Planned future work aims to develop the scope and function of the static reference document to provide access to product-provider information and to create a forum through which users can communicate with each other by e-mail. System suppliers can provide links to their own product information pages, which can be accessed by users seeking information and advice. These developments will result in a guide which is interactive and can be kept up to date by system suppliers. This extension of the guide’s role should enable it to provide material for the decision-support process for users wishing to conduct quantitative evaluation or comparison of distribution systems. This stage of development would require the use of a facility such as the World Wide Web (WWW) to enable users and suppliers to communicate with each other. The WWW offers ready support for hypertext, the use of which is considered important for this application because of its ease of use for inexperienced computer users, the wide availability of the WWW, and the suitability of an on-line system as a means of publishing material which is subject to continual change. However, it is considered likely that a static version of the guide could be made available for users who wish to avoid the cost and complication of obtaining access to the WWW. Although the use of hypertext is becoming more common, it is believed that this is the first use of this technology as a means of publishing research in this field.
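The staged selection process is described but not specified in detail, so the sketch below is a hypothetical, rule-based illustration of how a channel-selection guide might map stated business aims to candidate channels and their operational implications. All rules and channel names are invented placeholders, not the thesis's actual guide.

```python
# Hypothetical sketch of a staged channel-selection guide: business aims
# are matched against each channel's reach, and matching channels are
# returned with their operational implications. All content is invented.
CHANNELS = {
    "global_distribution_system": {
        "reaches": {"international_business_travel"},
        "implications": "transaction fees; requires a representation company",
    },
    "direct_website_booking": {
        "reaches": {"leisure_travel", "repeat_guests"},
        "implications": "low fees; hotel maintains its own content",
    },
    "consortium_reservation_service": {
        "reaches": {"leisure_travel", "international_business_travel"},
        "implications": "membership costs; shared branding",
    },
}

def recommend(business_aims):
    """Yield channels whose reach covers at least one stated aim."""
    for name, channel in CHANNELS.items():
        matched = channel["reaches"] & business_aims
        if matched:
            yield name, sorted(matched), channel["implications"]

for name, matched, implications in recommend({"leisure_travel"}):
    print(f"{name}: matches {matched} -- {implications}")
```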
140

Evaluating network science in archaeology : a Roman archaeology perspective

Brughmans, Tom January 2014
No description available.
