11

Regression test selection by exclusion

Ngah, Amir January 2012 (has links)
This thesis addresses research in the area of regression testing. Software systems change and evolve over time, and each time a system is changed, regression tests have to be run to validate these changes. An important issue in regression testing is how to minimise the number of existing test cases of the original program that must be re-run on the modified program. One technique for tackling this issue is regression test selection. The aim of this research is to significantly reduce the number of test cases that need to be run after changes have been made. Specifically, this thesis focuses on developing a model for regression test selection using the decomposition slicing technique. Decomposition slicing provides a technique that is capable of identifying the unchanged parts of the system. The model of regression test selection based on decomposition slicing and exclusion of test cases was developed in this thesis. The model is called Regression Test Selection by Exclusion (ReTSE) and has four main phases: Program Analysis, Comparison, Exclusion and Optimisation. The validity of the ReTSE model is explored through the application of a number of case studies. The case studies tackle all types of modification, such as changed, deleted and added statements, and cover both single modifications and combinations of modifications. The application of the proposed model has shown that significant reductions in the number of test cases can be achieved. The evaluation of the model based on an existing framework, and a comparison with another model, have also shown promising results. The case studies have limited themselves to relatively small programs; the next step is to apply the model to larger systems with more complex changes to ascertain whether it scales up. While some parts of the model have been automated, tools will be required for the rest when carrying out the larger case studies.
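As a minimal sketch of the exclusion idea described in this abstract, the snippet below drops every test whose covered components are all unchanged. The coverage map, component names, and the `select_tests` helper are illustrative assumptions, not part of the ReTSE model; in the thesis, decomposition slicing is what identifies which parts of the program a change actually touches.

```python
# Minimal sketch of exclusion-based test selection (illustrative only;
# the coverage map and component names are hypothetical, not ReTSE itself).

def select_tests(coverage, changed):
    """Return the tests that must be re-run.

    coverage: dict mapping test name -> set of components it exercises
    changed:  set of components modified since the last run
    """
    selected = []
    for test, components in coverage.items():
        # A test is excluded only if every component it touches is unchanged.
        if components & changed:
            selected.append(test)
    return selected

if __name__ == "__main__":
    coverage = {
        "test_login": {"auth", "session"},
        "test_report": {"report", "format"},
        "test_export": {"report", "io"},
    }
    changed = {"report"}
    print(select_tests(coverage, changed))  # ['test_report', 'test_export']
```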
12

Effect of Dispersion on SS-WDM Systems

Wongpaibool, Virach 23 September 1998 (has links)
The purpose of this thesis is to investigate the effect of dispersion on a spectrum-sliced WDM (SS-WDM) system, specifically a system employing a single-mode optical fiber. The system performance is expressed in terms of the receiver sensitivity, defined as the average number of photons per bit N_p required for a given probability of bit error P_e. The receiver sensitivity is expressed in terms of two normalized parameters: the ratio of the optical bandwidth per channel to the bit rate, m = B_0/R_b = B_0 T, and the transmission distance normalized by the dispersion distance, z/L_D. The former represents the effect of the excess beat noise caused by the signal fluctuation; the latter represents the effect of dispersion. The excess beat noise can be reduced by increasing the value of m (increasing the optical bandwidth B_0 for a given bit rate R_b). However, a large m implies that the degradation due to dispersion is severe in a system employing a single-mode fiber. Therefore, there should be an optimum m balancing the two effects. The theoretical results obtained from our analysis confirm this prediction. It is also shown that the optimum m (m_opt) decreases as the normalized distance increases, which suggests that dispersion strongly affects the system performance: the increase in excess beat noise is traded for the decrease in the dispersion effect. Additionally, the maximum transmission distance is relatively short compared to that of a laser-based system. This suggests that SS-WDM systems with single-mode fibers are suitable for short-haul systems, such as high-speed local-access networks where the operating bit rate is high but the transmission distance is relatively short. / Master of Science
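As a quick numeric illustration of the two normalized parameters, the snippet below computes m and z/L_D for assumed system values, taking the conventional dispersion length L_D = T^2/|beta2|. The thesis may define the normalization differently, and none of these numbers are its results.

```python
# Worked example of the two normalized parameters from the abstract
# (all numbers are illustrative assumptions, not results from the thesis).

B0 = 50e9          # optical bandwidth per channel, Hz (assumed)
Rb = 2.5e9         # bit rate, bit/s (assumed)
T = 1.0 / Rb       # bit period, s

m = B0 * T         # m = B0/Rb: larger m -> less excess beat noise
print(f"m = {m:.1f}")           # m = 20.0

beta2 = 21.7e-27   # |beta2| for standard SMF near 1550 nm, s^2/m (typical)
LD = T**2 / beta2  # conventional dispersion length, m
z = 100e3          # transmission distance, m (assumed)
print(f"z/LD = {z / LD:.3f}")   # ~0.014 at 100 km
```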
13

Detecting Test Clones with Static Analysis

Jain, Divam January 2013 (has links)
Large-scale software systems often have correspondingly complicated test suites, which are difficult for developers to construct and maintain. As systems evolve, engineers must update their test suite along with changes in the source code. Tests created by duplicating and modifying previously existing tests (clones) can complicate this task. Several testing technologies have been proposed to mitigate cloning in tests, including parametrized unit tests and test theories. However, detecting opportunities to improve existing test suites is labour intensive. This thesis presents a novel technique for detecting similar tests based on type hierarchies and method calls in test code. Using this technique, we can track variable history and detect test clones based on test assertion similarity. The thesis further includes results from our empirical study of 10 benchmark systems using this technique, which suggest that test clone detection by our technique will aid test de-duplication efforts in industrial systems.
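A rough sketch of assertion-similarity clone detection appears below: it flags pairs of test functions whose assertion calls overlap heavily. This deliberately omits the type-hierarchy and variable-history analysis the thesis technique actually uses; the threshold and helper names are assumptions.

```python
# Sketch: flag similar tests by Jaccard overlap of their assertion calls
# (a simplification of the thesis technique, for illustration only).
import ast
from itertools import combinations

def assertion_signatures(source):
    """Map each test function to the set of assert-call strings it makes."""
    sigs = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef) and node.name.startswith("test"):
            calls = set()
            for sub in ast.walk(node):
                if (isinstance(sub, ast.Call)
                        and isinstance(sub.func, ast.Attribute)
                        and sub.func.attr.startswith("assert")):
                    calls.add(ast.unparse(sub))  # Python 3.9+
            sigs[node.name] = calls
    return sigs

def clone_candidates(sigs, threshold=0.8):
    """Yield pairs of tests whose assertion sets overlap heavily."""
    for (a, sa), (b, sb) in combinations(sigs.items(), 2):
        if sa and sb:
            jaccard = len(sa & sb) / len(sa | sb)
            if jaccard >= threshold:
                yield a, b, jaccard

if __name__ == "__main__":
    src = '''
class T:
    def test_a(self):
        self.assertEqual(f(1), 2)
        self.assertTrue(g())
    def test_b(self):
        self.assertEqual(f(1), 2)
        self.assertTrue(g())
'''
    for a, b, j in clone_candidates(assertion_signatures(src)):
        print(a, b, round(j, 2))   # test_a test_b 1.0
```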
14

Information flow control for Java: a comprehensive approach based on path conditions in dependence graphs

Hammer, Christian January 2009 (has links)
Also published as: Karlsruhe, Univ., dissertation, 2009 / Produced on demand
16

Impact of biofilm formation and sublethal injury of Listeria monocytogenes on transfer to delicatessen meats

Keskinen, Lindsey Ann. January 2006 (has links)
Thesis (Ph. D.)--Michigan State University. Dept. of Food Science and Human Nutrition, 2006. / Title from PDF t.p. (viewed on June 19, 2009). Includes bibliographical references (p. 209-220). Also issued in print.
17

Pfadbedingungen in Abhängigkeitsgraphen und ihre Anwendung in der Softwaresicherheitstechnik [Path conditions in dependence graphs and their application in software security engineering]

Robschink, Torsten. Unknown Date (has links) (PDF)
Dissertation, Universität Passau, 2005. / Publication year on the main title page: 2004.
18

Code Decomposition: A New Hope

Garg, Nupur 01 June 2017 (has links)
Code decomposition (also known as functional decomposition) is the process of breaking a larger problem into smaller subproblems so that each function implements only a single task. Although code decomposition is integral to computer science, it is often overlooked in introductory computer science education due to the challenges of teaching it given limited resources. Earthworm is a tool that generates unique suggestions on how to improve the decomposition of provided Python source code. Given a program as input, Earthworm presents the user with a list of suggestions to improve the functional decomposition of the program. Each suggestion includes the lines of code that can be refactored into a new function, the arguments that must be passed to this function and the variables returned from the function. The tool is intended to be used in introductory computer science courses to help students learn more about decomposition. Earthworm generates suggestions by converting Python source code into a control flow graph. Static analysis is performed on the control flow graph to direct the generation of suggestions based on code slices.
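The sketch below illustrates the kind of information one such suggestion carries: the parameters an extracted function would need and the values it would return. The read/write sets here are made up by hand, whereas Earthworm derives them from the control flow graph of the actual source.

```python
# Toy illustration of the content of one Earthworm-style suggestion
# (variable names and read/write sets are hypothetical; the real tool
# computes them from a control flow graph of the Python source).

def suggest_extraction(block_reads, block_writes, defined_before, used_after):
    """Given read/write sets for a candidate block of lines, compute what
    an extracted function would take as arguments and return."""
    # Parameters: values the block reads that already exist before it.
    params = block_reads & defined_before
    # Returns: values the block writes that later code still needs.
    returns = block_writes & used_after
    return params, returns

# Hypothetical block: some lines read `data` and `limit`, write `total`.
params, returns = suggest_extraction(
    block_reads={"data", "limit"},
    block_writes={"total"},
    defined_before={"data", "limit", "path"},
    used_after={"total"},
)
print(f"def helper({', '.join(sorted(params))}): ... return {', '.join(sorted(returns))}")
# def helper(data, limit): ... return total
```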
19

Dual Execution And Its Applications

Kim, Dohyeong 08 May 2020 (has links)
Execution comparison techniques compare multiple executions from the same program or highly similar programs to identify state differences, including control flow differences and variable value differences. Execution comparison has been used to debug sequential, concurrent, and regression failures by reasoning about the causality between execution differences and input differences, thread scheduling differences, and syntactic differences among program versions, respectively. However, execution comparison techniques have several limitations. First, executions may have benign differences, which are not related to the behavior differences that the user can observe. Second, huge storage space is required to record two independent executions. Third, the techniques can only compare executions from the same or similar programs.

In this dissertation, we present an execution comparison technique that (1) removes benign differences, (2) requires less space, and (3) can compare two different programs implementing similar algorithms. We also show that the execution comparison technique can be used to identify and extract a functional component out of a binary. First, we present a dual execution engine that executes multiple executions at the same time and introduces only the desired differences; the engine also compares the executions on-the-fly and stores only the differences. Second, we present a technique to compare two programs written by two different programmers. In particular, we show that this technique can compare a buggy program from a student with the correct one from the instructor and can reason about the errors.
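A much-simplified picture of lock-step comparison: run two versions as generators that report their state at matching points, and keep only the differences. The dissertation's engine works on binaries and additionally suppresses benign differences; this Python sketch is only an analogy, with made-up program versions.

```python
# Analogy for on-the-fly execution comparison: two versions yield
# (point, state) pairs at matching points; only differences are stored.

def v1(x):
    y = x * 2
    yield ("after_mul", {"y": y})
    z = y + 1
    yield ("after_add", {"z": z})

def v2(x):
    y = x * 2
    yield ("after_mul", {"y": y})
    z = y + 2          # seeded difference between the versions
    yield ("after_add", {"z": z})

def compare(gen_a, gen_b):
    diffs = []
    for (pt_a, st_a), (pt_b, st_b) in zip(gen_a, gen_b):
        if pt_a == pt_b and st_a != st_b:
            diffs.append((pt_a, st_a, st_b))   # store differences only
    return diffs

print(compare(v1(3), v2(3)))   # [('after_add', {'z': 7}, {'z': 8})]
```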
20

Slice-n-Dice Algorithm Implementation in JPF

Noonan, Eric S. 01 July 2014 (has links) (PDF)
This work evaluates the effectiveness of a new verification algorithm called slice-n-dice. To evaluate its effectiveness, a vector clock partial order reduction (POR) was implemented as a baseline. The first paper contained in this work was published in ACM SIGSOFT Software Engineering Notes and discusses the implementation of the vector clock POR; its results show the vector clock POR performing better than the POR in Java Pathfinder by at least a factor of two. The second paper discusses the implementation of slice-n-dice and compares it against other verification techniques. The results show that slice-n-dice performs better than the other verification methods in terms of states explored and runtime when there is no error in the program or when little thread interaction is needed for the error to manifest.
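For readers unfamiliar with the baseline, a vector clock is the small data structure that lets a POR decide which events are causally ordered. The minimal sketch below is generic, not the JPF implementation from the paper.

```python
# Minimal vector clock, the data structure behind a vector clock POR
# (a generic sketch, not the thesis's JPF implementation).

class VectorClock:
    def __init__(self, n, tid):
        self.clock = [0] * n   # one counter per thread
        self.tid = tid         # index of the owning thread

    def tick(self):
        """Advance this thread's own component on a local event."""
        self.clock[self.tid] += 1

    def merge(self, other):
        """On synchronization, take the component-wise maximum, then tick."""
        self.clock = [max(a, b) for a, b in zip(self.clock, other.clock)]
        self.tick()

    def happens_before(self, other):
        """True if every component is <= and at least one is strictly <."""
        return (all(a <= b for a, b in zip(self.clock, other.clock))
                and self.clock != other.clock)
```

Two events where neither clock happens-before the other are concurrent, and only interleavings of concurrent events need to be explored.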
