11

Regression test selection by exclusion

Ngah, Amir January 2012 (has links)
This thesis addresses research in the area of regression testing. Software systems change and evolve over time, and each time a system is changed, regression tests have to be run to validate the changes. An important issue in regression testing is how to reuse the existing test cases of the original program for the modified program while minimising the number that must be rerun. One technique for tackling this issue is regression test selection. The aim of this research is to significantly reduce the number of test cases that need to be run after changes have been made. Specifically, this thesis focuses on developing a model for regression test selection using the decomposition slicing technique. Decomposition slicing provides a technique that is capable of identifying the unchanged parts of the system. The model of regression test selection based on decomposition slicing and exclusion of test cases developed in this thesis is called Regression Test Selection by Exclusion (ReTSE) and has four main phases: Program Analysis, Comparison, Exclusion and Optimisation. The validity of the ReTSE model is explored through a number of case studies. The case studies tackle all types of modification (changed, deleted and added statements), covering both single modifications and combinations of modification types. Applying the proposed model has shown that significant reductions in the number of test cases can be achieved. Evaluating the model against an existing framework, and comparing it with another model, has also shown promising results. The case studies are limited to relatively small programs; the next step is to apply the model to larger systems with more complex changes to ascertain whether it scales up. While some parts of the model have been automated, tools will be required for the rest when carrying out the larger case studies.
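The exclusion idea above can be sketched in a few lines. A minimal illustration, assuming hypothetical names (select_tests and slice_of are placeholders, not the thesis's API; the real model operates on decomposition slices of actual programs): a test is excluded when its slice shares no statements with the change set.

```python
# Hedged sketch of regression test selection by exclusion (ReTSE-style).
# All names here are illustrative placeholders, not the thesis's actual API.

def select_tests(test_cases, changed_stmts, slice_of):
    """Exclude tests whose slice is disjoint from the set of changed statements.

    test_cases:    iterable of test identifiers
    changed_stmts: set of statement ids modified in the new version
    slice_of:      maps a test id to the set of statement ids it exercises
                   (e.g., derived from a decomposition slice of the program)
    """
    selected = []
    for test in test_cases:
        # A test can be safely excluded only if its slice contains
        # none of the changed statements.
        if slice_of(test) & changed_stmts:
            selected.append(test)
    return selected

# Example: only t2 touches a changed statement, so t1 and t3 are excluded.
slices = {"t1": {1, 2}, "t2": {2, 5, 9}, "t3": {3, 4}}
print(select_tests(["t1", "t2", "t3"], {5, 9}, slices.__getitem__))  # ['t2']
```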
12

Effect of Dispersion on SS-WDM Systems

Wongpaibool, Virach 23 September 1998 (has links)
The purpose of this thesis is to investigate the effect of dispersion on a spectrum-sliced WDM (SS-WDM) system, specifically a system employing a single-mode optical fiber. The system performance is expressed in terms of the receiver sensitivity, defined as the average number of photons per bit N_p required for a given probability of bit error P_e. The receiver sensitivity is expressed in terms of two normalized parameters: the ratio of the optical bandwidth per channel to the bit rate, m = B_0/R_b = B_0*T, and the transmission distance normalized by the dispersion distance, z/L_D. The former represents the effect of the excess beat noise caused by the signal fluctuation; the latter represents the effect of dispersion. The excess beat noise can be reduced by increasing the value of m (increasing the optical bandwidth B_0 for a given bit rate R_b). However, a large m implies that the degradation due to dispersion is severe in a system employing a single-mode fiber. Therefore, there should be an optimum m balancing the two effects. The theoretical results obtained from our analysis confirm this prediction. It is also shown that the optimum m (m_opt) decreases with an increase in the normalized distance, which suggests that dispersion strongly affects the system performance: an increase in the excess beat noise is traded for a decrease in the dispersion effect. Additionally, the maximum transmission distance is relatively short compared to that of a laser-based system. This suggests that SS-WDM systems with single-mode fibers are suitable for short-haul systems, such as high-speed local-access networks where the operating bit rate is high but the transmission distance is relatively short. / Master of Science
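For readers unfamiliar with the two normalized parameters, a small numerical sketch follows. It assumes the conventional dispersion length L_D = T^2/|beta2| from standard fiber-optics texts; the thesis's exact definition may differ, and all numbers are illustrative rather than taken from the thesis.

```python
# Hedged sketch: the two normalized parameters from the abstract.
# L_D = T^2 / |beta2| is the conventional dispersion length; the exact
# definition used in the thesis may differ. All values are illustrative.

B0 = 100e9          # optical bandwidth per channel, Hz (illustrative)
Rb = 2.5e9          # bit rate, b/s (illustrative)
T = 1.0 / Rb        # bit period, s
beta2 = -21e-27     # GVD parameter near 1550 nm, s^2/m (typical SMF value)
z = 50e3            # transmission distance, m

m = B0 * T                   # optical-bandwidth-to-bit-rate ratio
L_D = T**2 / abs(beta2)      # dispersion length, m
print(f"m = {m:.0f}, z/L_D = {z / L_D:.4f}")
```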
13

Detecting Test Clones with Static Analysis

Jain, Divam January 2013 (has links)
Large-scale software systems often have correspondingly complicated test suites, which are difficult for developers to construct and maintain. As systems evolve, engineers must update their test suite along with changes in the source code. Tests created by duplicating and modifying previously existing tests (clones) can complicate this task. Several testing technologies have been proposed to mitigate cloning in tests, including parametrized unit tests and test theories. However, detecting opportunities to improve existing test suites is labour-intensive. This thesis presents a novel technique for detecting similar tests based on type hierarchies and method calls in test code. Using this technique, we can track variable history and detect test clones based on test assertion similarity. The thesis further includes results from our empirical study of 10 benchmark systems using this technique, which suggest that test clone detection by our technique will aid test de-duplication efforts in industrial systems.
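As a toy companion to the abstract, the following sketch compares two tests by the similarity of their assertions using Python's ast module. It is a deliberate simplification (assertion signature sets plus Jaccard similarity) and not the thesis's technique, which also exploits type hierarchies and variable history.

```python
# Hedged sketch: compare two tests by the similarity of their assertions.
# A simplification for illustration, not the thesis's static analysis.
import ast

def assertion_signatures(src):
    """Collect (method, arg-count) pairs for assert* calls in test source."""
    sigs = set()
    for node in ast.walk(ast.parse(src)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Attribute)
                and node.func.attr.startswith("assert")):
            sigs.add((node.func.attr, len(node.args)))
    return sigs

def similarity(src_a, src_b):
    """Jaccard similarity of the two tests' assertion signatures."""
    a, b = assertion_signatures(src_a), assertion_signatures(src_b)
    return len(a & b) / len(a | b) if a | b else 0.0

t1 = "def test_add(self):\n    self.assertEqual(add(1, 2), 3)"
t2 = "def test_add_neg(self):\n    self.assertEqual(add(-1, 1), 0)"
print(similarity(t1, t2))  # 1.0 -- identical assertion shape, likely clones
```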
14

Information flow control for Java: a comprehensive approach based on path conditions in dependence graphs

Hammer, Christian January 2009 (has links)
Also published as: Karlsruhe, Univ., Diss., 2009. / Produced on demand
16

Impact of biofilm formation and sublethal injury of Listeria monocytogenes on transfer to delicatessen meats

Keskinen, Lindsey Ann. January 2006 (has links)
Thesis (Ph. D.)--Michigan State University. Dept. of Food Science and Human Nutrition, 2006. / Title from PDF t.p. (viewed on June 19, 2009) Includes bibliographical references (p. 209-220). Also issued in print.
17

Pfadbedingungen in Abhängigkeitsgraphen und ihre Anwendung in der Softwaresicherheitstechnik [Path conditions in dependence graphs and their application in software security engineering]

Robschink, Torsten. Unknown Date (has links) (PDF)
Universität Passau, Diss., 2005. / Year of publication on the main title page: 2004.
18

Code Decomposition: A New Hope

Garg, Nupur 01 June 2017 (has links)
Code decomposition (also known as functional decomposition) is the process of breaking a larger problem into smaller subproblems so that each function implements only a single task. Although code decomposition is integral to computer science, it is often overlooked in introductory computer science education due to the challenges of teaching it given limited resources. Earthworm is a tool that generates unique suggestions on how to improve the decomposition of provided Python source code. Given a program as input, Earthworm presents the user with a list of suggestions to improve the functional decomposition of the program. Each suggestion includes the lines of code that can be refactored into a new function, the arguments that must be passed to this function and the variables returned from the function. The tool is intended to be used in introductory computer science courses to help students learn more about decomposition. Earthworm generates suggestions by converting Python source code into a control flow graph. Static analysis is performed on the control flow graph to direct the generation of suggestions based on code slices.
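A simplified sketch of the suggestion computation the abstract describes: given a candidate range of lines, infer which variables must become arguments and which must be returned. This toy version uses a flat def/use scan (suggest_extraction is an illustrative name, not Earthworm's API); Earthworm itself works on a control flow graph and code slices.

```python
# Hedged sketch: given a candidate block of lines, infer which variables
# must become parameters and which must be returned. A toy def/use
# analysis for illustration -- Earthworm itself analyzes a full CFG.
import ast

def suggest_extraction(src, start, end):
    """Suggest (params, returns) if lines [start, end] were extracted."""
    defs_before, defs_in = set(), set()
    used_in, used_after = set(), set()
    for node in ast.walk(ast.parse(src)):
        if not isinstance(node, ast.Name):
            continue
        line = node.lineno
        if isinstance(node.ctx, ast.Store):
            if line < start:
                defs_before.add(node.id)
            elif line <= end:
                defs_in.add(node.id)
        elif start <= line <= end:
            used_in.add(node.id)
        elif line > end:
            used_after.add(node.id)
    params = sorted(defs_before & used_in)   # defined before, read inside
    returns = sorted(defs_in & used_after)   # defined inside, read after
    return params, returns

src = """x = 10
y = x * 2
z = y + x
print(z)
"""
print(suggest_extraction(src, 2, 3))  # (['x'], ['z'])
```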
19

Dual Execution And Its Applications

Dohyeong Kim 08 May 2020 (has links)
Execution comparison techniques compare multiple executions from the same program or highly similar programs to identify state differences, including control-flow differences and variable value differences. Execution comparison has been used to debug sequential, concurrent, and regression failures by reasoning about the causality between execution differences and, respectively, input differences, thread-scheduling differences, and syntactic differences among program versions. However, execution comparison techniques have several limitations. First, executions may have benign differences, which are not related to the behavior differences that the user can observe. Second, huge storage space is required to record two independent executions. Third, the techniques can only compare executions from the same or similar programs.

In this dissertation, we present an execution comparison technique that (1) removes benign differences, (2) requires less space, and (3) can compare two different programs implementing similar algorithms. We also show that the execution comparison technique can be used to identify and extract a functional component out of a binary. First, we present a dual execution engine that executes multiple executions at the same time and introduces only the desired differences; the engine also compares the executions on-the-fly and stores only the differences. Second, we present a technique to compare two programs written by two different programmers. In particular, we show that this technique can compare a buggy program from a student with the correct one from the instructor and can reason about the errors.
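The engine's two key ideas, lockstep execution and difference-only storage, can be illustrated with a toy Python sketch. The generator-based traces below stand in for instrumented binary executions; all names are illustrative, not the dissertation's API.

```python
# Hedged sketch: run two trace generators in lockstep and record only
# the steps where their observable states diverge. A stand-in for the
# binary-level dual execution engine described above.
from itertools import zip_longest

def trace(program, inputs):
    """Yield (step, state) pairs; 'program' is a plain Python generator
    function standing in for an instrumented execution."""
    yield from enumerate(program(inputs))

def dual_execute(program_a, program_b, inputs):
    diffs = []
    for (i, sa), (j, sb) in zip_longest(trace(program_a, inputs),
                                        trace(program_b, inputs),
                                        fillvalue=(None, None)):
        if sa != sb:                  # store differences only
            diffs.append((i if i is not None else j, sa, sb))
    return diffs

def v1(x):
    for i in range(3):
        yield x + i

def v2(x):                            # "patched" version diverges at step 2
    for i in range(3):
        yield x + (2 * i if i == 2 else i)

print(dual_execute(v1, v2, 10))       # [(2, 12, 14)]
```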
20

Analysis, Design and Performance Evaluation of Optical Fiber Spectrum-Sliced WDM Systems

Arya, Vivek 10 July 1997 (has links)
This dissertation investigates the design and performance issues of a recently demonstrated technique, termed spectrum-slicing, for implementing wavelength-division multiplexing (WDM) in optical fiber systems. Conventional WDM systems employ laser diodes operating at discrete wavelengths as carriers for the different data channels that are to be multiplexed. Spectrum-slicing provides an attractive low-cost alternative to the use of multiple coherent lasers for such WDM applications by utilizing spectral slices of a broadband noise source for the different data channels. The principal broadband noise source considered is the amplified spontaneous emission (ASE) noise from an optical amplifier. Each slice of the spectrum is actually a burst of noise that is modulated individually for a high-capacity WDM system. The stochastic nature of the broadband source gives rise to excess intensity noise, which results in a power penalty at the receiver. One way to minimize this penalty, as proposed and analyzed for the first time in this work, is to use an optical preamplifier receiver. It is shown that when an optical preamplifier receiver is used, there exists an optimum filter bandwidth which optimizes the detection sensitivity (minimizes the average number of photons/bit) for a given error probability. Moreover, the evaluated detection sensitivity represents an order-of-magnitude (> 10 dB) improvement over conventional PIN receiver-based detection techniques for such spectrum-sliced communication systems. The optimum is a consequence of signal energy fluctuations dominating at low values of the signal time-bandwidth product (m), and the preamplifier ASE noise dominating at high values of m. Operation at the optimum bandwidth renders the channel error probability a strong function of the optical bandwidth, thus motivating the use of forward error correction (FEC) coding. System capacity (for BER = ) is shown to be 23 Gb/s without coding, and 75 Gb/s with a (255,239) Reed-Solomon code. The effect of non-rectangular spectra on receiver sensitivity is investigated for both OOK and FSK transmission, assuming the system (de)multiplexer filters to be Nth-order Butterworth bandpass. Although narrower filters are recommended for improving the power budget, it is shown that the system penalty due to filter shape may be kept < 1 dB by employing filters with N > 2. Moreover, spectrum-sliced FSK systems using optical preamplifier receivers are shown, for the first time, to perform better in a peak-optical-power-limited environment. Performance-optimized spectrum-sliced WDM systems have potential use in both local-loop and long-distance fiber communication systems which require low-cost WDM equipment for high-data-rate applications. / Ph. D.
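As a rough companion to the sensitivity and coding figures above, the following back-of-the-envelope sketch computes photons per bit from a received power level and the rate of the (255,239) Reed-Solomon code. The arithmetic is standard (photon energy hc/lambda); the power and bit-rate numbers are illustrative, not the dissertation's.

```python
# Hedged sketch: photons/bit from received power, and the rate overhead
# of the (255,239) Reed-Solomon code mentioned above. Numbers are
# illustrative, not taken from the dissertation.
h = 6.626e-34            # Planck constant, J*s
c = 3.0e8                # speed of light, m/s
wavelength = 1550e-9     # m
Rb = 2.5e9               # bit rate, b/s (illustrative)
P_dBm = -35              # average received power, dBm (illustrative)

photon_energy = h * c / wavelength            # ~1.28e-19 J per photon
P_watts = 1e-3 * 10 ** (P_dBm / 10)
Np = P_watts / (Rb * photon_energy)           # average photons per bit
rs_rate = 239 / 255                           # RS(255,239) code rate
print(f"Np ~ {Np:.0f} photons/bit, RS rate = {rs_rate:.3f}")
```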
