331

Evolusionêre spesifikasie-ontwikkeling [Evolutionary specification development]

Viljoen, Jacob Johannes 12 September 2012 (has links)
M.Sc. (Computer Science)
332

Tool support for systematic reviews in software engineering

Marshall, Christopher January 2016 (has links)
Background: Systematic reviews have become an established methodology in software engineering. However, they are labour intensive, error prone and time consuming. These and other challenges have led to the development of tools to support the process. However, there is limited evidence about their usefulness. Aim: To investigate the usefulness of tools to support systematic reviews in software engineering and develop an evaluation framework for an overall support tool. Method: A literature review, taking the form of a mapping study, was undertaken to identify and classify tools supporting systematic reviews in software engineering. Motivated by its results, a feature analysis was performed to independently compare and evaluate a selection of tools which aimed to support the whole systematic review process. An initial version of an evaluation framework was developed to carry out the feature analysis and later refined based on its results. To obtain a deeper understanding of the technology, a survey was undertaken to explore systematic review tools in other domains. Semi-structured interviews with researchers in healthcare and social science were carried out. Quantitative and qualitative data was collected, analysed and used to further refine the framework. Results: The literature review showed an encouraging growth of tools to support systematic reviews in software engineering, although many had received limited evaluation. The feature analysis provided new insight into the usefulness of tools, determined the strongest and weakest candidate and established the feasibility of an evaluation framework. The survey provided knowledge about tools used in other domains, which helped further refine the framework. Conclusions: Tools to support systematic reviews in software engineering are still immature. Their potential, however, remains high and it is anticipated that the need for tools within the community will increase. The evaluation framework presented aims to support the future development, assessment and selection of appropriate tools.
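A feature analysis of this kind is typically operationalised as a weighted scoring matrix. The sketch below is a hypothetical illustration of that step, not the framework from the thesis: the feature names, weights and scores are invented.

```python
# Hypothetical weighted feature analysis for systematic-review support tools.
# Feature names, weights and scores are illustrative only.

FEATURES = {                      # feature -> weight (relative importance)
    "protocol management": 3,
    "study selection": 3,
    "data extraction": 2,
    "quality assessment": 2,
    "reporting support": 1,
}

# Judgement scores per candidate tool on a 0-5 scale (0 = absent, 5 = fully supported).
TOOLS = {
    "Tool A": {"protocol management": 4, "study selection": 5, "data extraction": 3,
               "quality assessment": 2, "reporting support": 4},
    "Tool B": {"protocol management": 2, "study selection": 4, "data extraction": 4,
               "quality assessment": 3, "reporting support": 1},
}

def weighted_score(scores: dict[str, int]) -> float:
    """Sum of weight * score, normalised by the maximum attainable total."""
    total = sum(FEATURES[f] * scores.get(f, 0) for f in FEATURES)
    maximum = sum(w * 5 for w in FEATURES.values())
    return total / maximum

# Rank candidates from strongest to weakest.
for name, scores in sorted(TOOLS.items(), key=lambda t: -weighted_score(t[1])):
    print(f"{name}: {weighted_score(scores):.2f}")
```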
333

Programmatuurontwikkelingsmetodologieë met besondere verwysing na 'n ideale raamwerk saamgestel uit data- en prosesgeorienteërde benaderings [Software development methodologies with particular reference to an ideal framework composed of data- and process-oriented approaches]

Vermeulen, Susan Elizabeth 17 March 2015 (has links)
M.Com. (Informatics) / Please refer to full text to view abstract
334

Performance implications of using diverse redundancy for database replication

Stankovic, Vladimir January 2008 (has links)
Using diverse redundancy for database replication is the focus of this thesis. Traditionally, database replication solutions have been built on the fail-stop failure assumption, i.e. that crashes cause the majority of failures. However, recent findings refuted this common assumption, showing that many faults cause systematic non-crash failures. These findings demonstrate that the existing, non-diverse database replication solutions, which use the same database server products, are ineffective fault-tolerant mechanisms. At the same time, the findings motivated the use of diverse redundancy (where different database server products are used) as a promising way of improving dependability. It seems that a fault-tolerant server built with diverse database servers would deliver improvements in availability and failure rates compared with the individual database servers or their replicated, non-diverse configurations. Besides the potential for improving dependability, one would like to evaluate the performance implications of using diverse redundancy in the context of database replication. This is the focal point of the research. The work performed to that end can be summarised as follows:
- We conducted a substantial performance evaluation of database replication using diverse redundancy. We compared its performance with those of various non-diverse configurations as well as non-replicated databases. The experiments revealed systematic differences in the behaviour of diverse servers and point to the potential for performance improvement when diverse servers are used. Under particular workloads, diverse servers performed better than both non-diverse and non-replicated configurations.
- We devised a middleware-based database replication protocol, which provides dependability assurance and guarantees database consistency. It uses an eager, update-everywhere approach for replica control. Although we focus on the use of diverse database servers, the protocol can also be used with database servers from the same vendor. We provide the correctness criteria of the protocol. Different regimes of operation of the protocol are defined, which allow it to be dynamically optimised for either dependability or performance improvements. Additionally, it can be used in conjunction with high-performance replication solutions.
- We developed an experimental test harness for performance evaluation of different database replication solutions. It enabled us to evaluate the performance of the diverse database replication protocol, e.g. by comparing it against known replication solutions. We show that, as expected, the improved dependability exhibited by our replication protocol carries a performance overhead. Nevertheless, when optimised for performance improvement, our protocol shows good performance.
- In order to minimise the performance penalty introduced by replication, we propose a scheme whereby the database server processes are prioritised to deliver performance improvements in cases of low to modest resource utilisation by the database servers.
- We performed an uncertainty-explicit assessment of database server products. Using an integrated approach, in which both performance and reliability are considered, we rank different database server products to aid selection of the components for the fault-tolerant server built out of diverse databases.
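As a rough illustration of the replica-comparison idea (not the thesis's actual middleware protocol), the sketch below eagerly applies each statement to every replica and flags disagreements, which is where systematic non-crash failures would surface. Two SQLite in-memory connections stand in for genuinely diverse database server products.

```python
import sqlite3

# Two connections stand in for two *diverse* database server products.
# In a real diverse-redundancy setup these would be different vendors' engines.
replicas = {name: sqlite3.connect(":memory:") for name in ("engine_a", "engine_b")}

def execute_everywhere(sql: str, params: tuple = ()) -> dict:
    """Eagerly apply a statement to every replica and collect the results."""
    results = {}
    for name, conn in replicas.items():
        cur = conn.execute(sql, params)
        results[name] = cur.fetchall()
        conn.commit()
    return results

def check_agreement(results: dict) -> None:
    """A non-crash (wrong-result) failure shows up as a disagreement between replicas."""
    distinct = {tuple(map(tuple, rows)) for rows in results.values()}
    if len(distinct) > 1:
        raise RuntimeError(f"replicas disagree: {results}")

for conn in replicas.values():
    conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
    conn.execute("INSERT INTO accounts VALUES (1, 100.0)")
    conn.commit()

res = execute_everywhere("SELECT balance FROM accounts WHERE id = ?", (1,))
check_agreement(res)  # passes while both replicas return the same rows
```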
335

SVG 3D graphical presentation for Web-based applications

Lu, Jisheng January 2015 (has links)
Due to rapid developments in computer graphics and computer hardware, web-based applications are becoming more and more powerful, and the performance gap between web-based and desktop applications is narrowing. The Internet and the WWW have been widely used for delivering, processing and publishing 3D data, and there is increasing demand for more and easier access to 3D content on the web. The better the browser experience, the more potential revenue web-based content can generate for providers and others. The main focus of this thesis is the design, development and implementation of a new generic 3D modelling method based on Scalable Vector Graphics (SVG) for web-based applications. While the model is initialized using classical 3D graphics, the scene model is extended using SVG. A new algorithm to present 3D graphics with SVG is proposed. This includes the definition of a 3D scene in the framework; the integration of 3D objects, cameras, transformations, light models and textures in the scene; and the rendering of 3D objects on the web page, allowing the end-user to interactively manipulate objects on the page. A new 3D graphics library for 3D geometric transformation and projection in the SVG GL is designed and developed. A set of primitives in the SVG GL, including triangle, sphere, cylinder and cone, is designed and developed, along with a set of complex 3D models including extrusion, revolution, Bezier surfaces and point clouds. New Gouraud shading and Phong shading algorithms in the SVG GL are proposed, designed and developed; these can be used to generate smooth shading and create highlights for 3D models. New texture mapping algorithms for the SVG GL, oriented toward web-based 3D modelling applications, are proposed, designed and developed for different 3D objects such as triangles, planes, spheres, cylinders and cones. This constitutes a unique and significant contribution to the discipline of web-based 3D modelling, as well as to the process of 3D model popularization.
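As a loose illustration of the underlying idea (not the thesis's SVG GL library), the sketch below perspective-projects a 3D triangle and emits it as an SVG polygon; the camera model and helper names are assumptions made for the example.

```python
import math

def project(point, fov_deg=60.0, viewport=(400, 400), camera_z=5.0):
    """Perspective-project a 3D point onto a 2D viewport (assumed pinhole camera)."""
    x, y, z = point
    w, h = viewport
    f = (h / 2) / math.tan(math.radians(fov_deg) / 2)   # focal length in pixels
    depth = camera_z - z                                  # camera on +z axis looking at origin
    return (w / 2 + f * x / depth, h / 2 - f * y / depth)

def triangle_to_svg(vertices, fill="#88aaff"):
    """Render one projected triangle as an SVG <polygon> element."""
    pts = " ".join(f"{px:.1f},{py:.1f}" for px, py in (project(v) for v in vertices))
    return f'<polygon points="{pts}" fill="{fill}" stroke="black"/>'

triangle = [(-1.0, -1.0, 0.0), (1.0, -1.0, 0.0), (0.0, 1.0, 0.0)]
svg = (f'<svg xmlns="http://www.w3.org/2000/svg" width="400" height="400">'
       f'{triangle_to_svg(triangle)}</svg>')
print(svg)
```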
336

Analysis of motivational factors influencing acceptance of technologically-enhanced personal, academic and professional development portfolios

Ahmed, Ejaz January 2014 (has links)
This research investigates factors that influence students’ intentions to use electronic portfolios (e-portfolios). E-portfolios are important pedagogical tools and a substantial amount of literature supports their role in personal, academic and professional development. However, achieving students' acceptance of e-portfolios is still a challenge for higher education institutions. One approach to understanding acceptance of e-portfolios is through technology acceptance based theories and models. A theoretical framework based on the Decomposed Theory of Planned Behaviour (DTPB) has therefore been developed, which proposes Attitude towards Behaviour (AB), Subjective Norms (SN) and Perceived Behavioural Control (PBC), and their decomposed factors as determinants of students' Behavioural Intention (BI) to use e-portfolios. Based on a positivistic philosophical standpoint, the study used a deductive research approach to test proposed hypotheses. Data was collected from 204 participants via a cross-sectional survey method and Structural Equation Modeling (SEM) was chosen for data analysis using a two-step approach. First, composite reliability, convergent validity and discriminant validity of the measures were established. Next, the structural model was analysed, in which Goodness of Fit (GoF) indices were observed and hypotheses were analysed. The results demonstrated that the theoretical model attained an acceptable fit with the data. The proposed personal, social and control factors in the model were shown to have significant influences on e-portfolio acceptance. The results suggest that use of DTPB can be extended to predict e-portfolio acceptance behaviour.
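For readers unfamiliar with the notation, the core structural relation that such a DTPB model tests, and the composite-reliability statistic mentioned above, can be sketched as follows; the coefficient symbols are generic placeholders, not values reported in this research.

```latex
% Core structural relation of the DTPB model: behavioural intention (BI) regressed
% on attitude towards behaviour (AB), subjective norms (SN) and perceived
% behavioural control (PBC). Coefficients are generic placeholders.
\[
  BI = \beta_{1}\, AB + \beta_{2}\, SN + \beta_{3}\, PBC + \zeta
\]

% Composite reliability of a latent construct with standardised loadings \lambda_i
% and indicator error variances \theta_i (the usual Fornell-Larcker form).
\[
  CR = \frac{\bigl(\sum_i \lambda_i\bigr)^{2}}
            {\bigl(\sum_i \lambda_i\bigr)^{2} + \sum_i \theta_i}
\]
```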
337

A framework for trend mining with application to medical data

Somaraki, Vassiliki January 2013 (has links)
This thesis presents research work conducted in the field of knowledge discovery. It presents an integrated trend-mining framework and SOMA, the application of the trend-mining framework to diabetic retinopathy data. Trend mining is the process of identifying and analysing trends in the context of the variation of support of the association/classification rules that have been extracted from longitudinal datasets. The integrated framework covers all major processes from data preparation to the extraction of knowledge. At the pre-processing stage, data are cleaned, transformed if necessary, and sorted into time-stamped datasets using logic rules. At the next stage, the time-stamped datasets are passed through the main processing, in which the matrix-algorithm ARM technique is applied to identify frequent rules with acceptable confidence. Mathematical conditions are applied to classify the sequences of support values into trends. Afterwards, interestingness criteria are applied to obtain interesting knowledge, and a visualization technique is proposed that maps how objects move from one time stamp to the next. A validation and verification (external and internal validation) framework is described that aims to ensure that the results at the intermediate stages of the framework are correct and that the framework as a whole can yield results that demonstrate causality. To evaluate the thesis, SOMA was developed. The dataset is, in itself, also of interest, as it is very noisy (in common with other similar medical datasets) and does not feature a clear association between specific time stamps and subsets of the data. The Royal Liverpool University Hospital has been a major centre for retinopathy research since 1991. Retinopathy is a generic term used to describe damage to the retina of the eye, which can, in the long term, lead to visual loss. Diabetic retinopathy is used to evaluate the framework, to determine whether SOMA can extract knowledge that is already known to the medics. The results show that these datasets can be used to extract knowledge that can show causality between patients' characteristics, such as age at diagnosis, type of diabetes and duration of diabetes, and diabetic retinopathy.
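A minimal sketch of the trend-classification step, assuming a simple labelling of how a rule's support changes across consecutive time stamps; the labels and tolerance are invented for illustration, not the mathematical conditions used in the thesis.

```python
def classify_trend(supports, tolerance=0.01):
    """Label a sequence of support values (one per time stamp) as a trend.

    The labels and tolerance here are illustrative; the thesis applies its own
    mathematical conditions to classify support sequences.
    """
    deltas = [b - a for a, b in zip(supports, supports[1:])]
    if all(abs(d) <= tolerance for d in deltas):
        return "constant"
    if all(d >= -tolerance for d in deltas):
        return "increasing"
    if all(d <= tolerance for d in deltas):
        return "decreasing"
    return "fluctuating"

# Support of one association rule over five yearly time stamps
print(classify_trend([0.12, 0.15, 0.19, 0.22, 0.27]))  # -> "increasing"
print(classify_trend([0.30, 0.18, 0.21, 0.10, 0.25]))  # -> "fluctuating"
```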
338

Assessing the business value of software process improvement using CMMI® in South Africa

Cohen, Douglas James 10 March 2010 (has links)
The focus of software process improvement is on enhancing product quality and productivity to increase business competitiveness and profitability. The Capability Maturity Model Integration or CMMI® remains the dominant standard for software process improvement globally. The lack of software quality standards such as CMMI® is seen as one of the causes of the current uncompetitive state of the South African software industry and so in 2007, a pilot programme called “Bringing CMMI® to South Africa” was launched. This research focused on the experiences of the South African organisations participating in the South African CMMI® pilot study through a combination of semi-structured interviews and questionnaires. The aim was to assist future managerial decision making to assess the business value CMMI® can bring to South African software organisations. The research found that the adoption of CMMI® improved both the internal quality and efficiencies as well as opportunities for growth. The research also established that CMMI® cannot be regarded as a silver bullet solution and that while process improvements can cause short-term upheaval, there are longer-term tangible and intangible benefits. It is, however, key that the organisational aspects of the change be properly managed. A lack of awareness of quality standards or actual demand for CMMI® along with the relatively high implementation and support costs are further preventing its adoption in South Africa. The recommendations resulting from the research, including a model, are discussed and suggestions for future research are provided. Copyright / Dissertation (MBA)--University of Pretoria, 2010. / Gordon Institute of Business Science (GIBS) / unrestricted
339

Testing algorithmically complex software using model programs

Manolache, Liviu-Iulian 23 February 2007 (has links)
This dissertation examines, based on a case study, the feasibility of using model programs as a practical solution to the oracle problem in software testing. The case study pertains especially to testing algorithmically complex software and it evaluates the approach proposed in this dissertation against testing that is based on manual outcome prediction. In essence, the experiment entailed developing a model program for testing a medium-size industrial application that implements a complex scheduling algorithm. One of the most difficult tasks in software testing is to adjudicate on whether a program passed or failed a test. Because that usually requires "predicting" the correct program outcome, the problem of devising a mechanism for correctness checking (i.e., a "test oracle") is usually referred to as the "oracle problem". In practice, the most direct solution to the oracle problem is to pre-calculate manually the expected program outcomes. However, especially for algorithmically complex software, that is usually time consuming and error-prone. Although alternatives to the manual approach have been suggested in the testing literature, only a few formal experiments have been conducted to evaluate them. A potential alternative to manual outcome prediction, which is evaluated in this dissertation, is to write one or more model programs that conform to the same functional specification (or parts of that specification) as the primary program (i.e., the software to be delivered). Subjected to the same input, the programs should produce identical outputs. Disagreements indicate either the presence of software faults or specification defects. The absence of disagreements does not guarantee the correctness of the results since the programs may erroneously agree on outputs. However, if the test data is adequate and the implementations are diverse, it is unlikely that the programs will consistently fail and still reach agreement. This testing approach is based on a principle that is applied primarily in software fault-tolerance: "N-version diversity". In this dissertation, the approach is called "testing using M model programs" or, in short, "M-mp testing". The advantage of M-mp testing is that the programs, together, constitute an approximate, but continuously perfecting, test oracle. Human assistance is required only to analyse and arbitrate program disagreements. Consequently, the testing process can be automated to a very large degree. The main disadvantage of the approach is the extra effort required for constructing and maintaining the model programs. The case study that is presented in this dissertation provides prima facie evidence to suggest that the M-mp approach may be more cost-effective than testing based on manual outcome prediction. Of course, the validity of such a conclusion is dependent upon the specific context in which the experiment was carried out. However, there are good indications that the results of the experiment are generally applicable to testing algorithmically complex software. / Dissertation (MSc (Computer Science))--University of Pretoria, 2007. / Computer Science / unrestricted
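A minimal sketch of the M-mp idea, assuming one primary implementation and one independently written model program for the same (toy) scheduling specification; all names and the test driver are invented for illustration.

```python
import random

def primary_schedule(jobs):
    """Primary implementation: shortest-job-first ordering (the system under test)."""
    return sorted(jobs, key=lambda job: job["duration"])

def model_schedule(jobs):
    """Model program: an independently written implementation of the same specification."""
    remaining, order = list(jobs), []
    while remaining:
        shortest = min(remaining, key=lambda job: job["duration"])
        remaining.remove(shortest)
        order.append(shortest)
    return order

def mmp_test(n_cases=1000, seed=0):
    """Drive both programs with the same random inputs; disagreements need human arbitration."""
    rng = random.Random(seed)
    disagreements = []
    for _ in range(n_cases):
        jobs = [{"id": i, "duration": rng.randint(1, 50)} for i in range(rng.randint(1, 20))]
        if primary_schedule(jobs) != model_schedule(jobs):
            disagreements.append(jobs)   # a human arbitrates: software fault or spec defect?
    return disagreements

print(f"{len(mmp_test())} disagreements to arbitrate")
```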
340

Computational fluid dynamics based diagnostics and optimal design of hydraulic capsule pipelines

Asim, Taimoor January 2013 (has links)
Scarcity of fossil fuels and the rapid escalation of energy prices around the world are affecting the efficiency of established modes of cargo transport within the transportation industry. Extensive research is being carried out on improving the efficiency of existing modes of cargo transport, as well as on developing alternative means of transporting goods. One such alternative is to use the energy contained within fluid flowing in pipelines to transfer goods from one place to another. Although the concept of using fluid pipelines for transportation has been in practice for more than a millennium, detailed knowledge of the flow behaviour in such pipelines is still a subject of active research. This is because most of the studies conducted on transporting goods in pipelines are based on experimental measurements of global flow parameters, and only a rough approximation of the local flow behaviour within these pipelines has been reported. With the emergence of sophisticated analytical tools and the high performance computing facilities being installed throughout the globe, it is now possible to simulate the flow conditions within these pipelines and obtain a better understanding of the underlying flow phenomena. The present study focuses on the use of advanced modelling tools to simulate the flow within Hydraulic Capsule Pipelines (HCPs) in order to quantify the flow behaviour within such pipelines. Hydraulic Capsule Pipeline refers to the transport of goods in hollow containers, typically of spherical or cylindrical shape, termed capsules, carried along the pipeline by water. A novel modelling technique has been employed to carry out the investigations under various geometric and flow conditions within HCPs. Both qualitative and quantitative flow diagnostics have been carried out on the flow of both spherical and cylindrical capsules in a horizontal HCP for on-shore applications. A train of capsules consisting of a single to multiple capsules per unit length of the pipeline has been modelled for practical flow velocities within HCPs. It has been observed that the flow behaviour within an HCP depends on a number of fluid and geometric parameters, and the pressure drop in such pipelines cannot be predicted from established methods. Development of a predictive tool for such applications is one of the aims achieved in this study. Furthermore, investigations have been conducted on vertical pipelines, which are very important for off-shore applications of HCPs. The energy requirements for vertical HCPs are significantly higher than for horizontal HCPs. It has been shown that a minimum average flow velocity is required to transport a capsule in a vertical HCP, depending upon the geometric and physical properties of the capsules. The concentric propagation, along the centreline of the pipe, of heavy-density capsules in vertical HCPs marks a significant variation from horizontal HCPs transporting heavy-density capsules. Bends are an integral part of pipeline networks, and in order to design any pipeline it is essential to consider their effects on the overall energy requirements. In order to accurately design both horizontal and vertical HCPs, the flow behaviour and energy requirements of varying geometric configurations have been analysed. A novel modelling technique has been incorporated in order to accurately predict the velocity, trajectory and orientation of the capsules in pipe bends. Optimisation of HCPs plays a crucial role in the worldwide commercial acceptability of such pipelines. Based on the Least-Cost Principle, an optimisation methodology has been developed for single-stage HCPs for both on-shore and off-shore applications. The input to the optimisation model is the solid throughput required from the system, and the outputs are the optimal diameter of the HCP and the pumping requirements for the capsule transporting system. The optimisation model presented in this study is both robust and user-friendly. A complete flow diagnostics and design methodology, including optimisation, for Hydraulic Capsule Pipelines has been presented in this study. The advanced computational techniques incorporated in this study have made it possible to map and analyse the flow structure within HCPs. Detailed analysis of even the smallest-scale flow variations in HCPs has led to a better understanding of the flow behaviour.
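As a loose illustration of a least-cost sizing exercise (the cost model, coefficients and pressure-drop correlation below are invented stand-ins, not the optimisation model developed in the thesis), one could sweep candidate pipe diameters and pick the one that minimises total annual cost for a required flow rate:

```python
import math

WATER_DENSITY = 1000.0        # kg/m^3
FRICTION_FACTOR = 0.02        # assumed Darcy friction factor (illustrative only)
PIPE_LENGTH = 5000.0          # m
ELECTRICITY_COST = 0.15       # currency per kWh (assumed)
HOURS_PER_YEAR = 8000.0

def annual_cost(diameter, flow_rate):
    """Capital + pumping cost per year for one candidate diameter (toy cost model)."""
    area = math.pi * diameter**2 / 4
    velocity = flow_rate / area
    # Generic Darcy-Weisbach pressure drop; a stand-in for HCP-specific correlations.
    dp = FRICTION_FACTOR * (PIPE_LENGTH / diameter) * WATER_DENSITY * velocity**2 / 2
    pumping_kw = dp * flow_rate / 1000.0
    capital = 150.0 * diameter * PIPE_LENGTH          # assumed cost per metre of pipe
    return capital / 20.0 + pumping_kw * HOURS_PER_YEAR * ELECTRICITY_COST

def optimal_diameter(flow_rate, candidates):
    """Least-cost principle: pick the diameter with the lowest total annual cost."""
    return min(candidates, key=lambda d: annual_cost(d, flow_rate))

diameters = [0.1 + 0.05 * i for i in range(10)]       # candidate diameters, 0.10 m .. 0.55 m
print(f"optimal diameter: {optimal_diameter(0.05, diameters):.2f} m")
```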
