31

Patterned Versus Conventional Object-Oriented Analysis Methods: A Group Project Experiment

KUROKI, Hiroaki, YAMAMOTO, Shuichiro 20 December 1998
No description available.
32

An empirical study of SD signal delay versus temperature in a plenum grade coaxial cable

Kaur, Sukhdeep 14 February 2012
A high-resolution speedy delivery time domain reflectometer (SD/TDR) has been developed in the Electrical Engineering department at The University of Texas at Austin. The SD/TDR uses a novel non-sinusoidal signal that does not undergo dispersion during transmission in lossy media. The SD/TDR is used to estimate the length of transmission lines and to detect the location of faults in them. Time of flight (TOF) is one of the critical parameters of the SD/TDR and a function of several temperature-dependent factors. Given the TOF and length of a transmission line, the signal delay can be computed. This research presents an empirical study of the effect of temperature on the TOF in a plenum grade coaxial cable for temperatures ranging from -3 °C to 60 °C. We also study the effect of temperature on the characteristic impedance of the coaxial cable. Finally, an SD double exponential waveform is used to estimate TOF for calibrated short and open terminations.
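The TOF-to-delay relationship described above can be sketched as follows. This is a minimal illustration, not the thesis's method: the cable length and round-trip TOF values are invented for the example.

```python
# Sketch: deriving per-metre signal delay and velocity factor from a
# round-trip time of flight (TOF), as in TDR-style length estimation.
# The length and TOF values below are illustrative, not thesis data.

C = 299_792_458.0  # speed of light in vacuum, m/s

def signal_delay_per_metre(tof_s: float, length_m: float) -> float:
    """One-way delay per metre from a round-trip TOF over a known length."""
    one_way = tof_s / 2.0          # a TDR measures the round trip
    return one_way / length_m      # seconds per metre

def velocity_factor(tof_s: float, length_m: float) -> float:
    """Propagation velocity as a fraction of c."""
    v = length_m / (tof_s / 2.0)
    return v / C

tof = 100e-9      # 100 ns round trip (illustrative)
length = 10.0     # 10 m cable (illustrative)
print(signal_delay_per_metre(tof, length))   # 5 ns per metre
print(velocity_factor(tof, length))
```

Temperature dependence enters through the dielectric, which shifts the velocity factor and hence the per-metre delay; measuring that shift over -3 °C to 60 °C is the subject of the study.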
33

An empirical study on software quality : developer perception of quality, metrics, and visualizations

Wilson, Gary Lynn 09 December 2013
Software tends to decline in quality over time, causing development and maintenance costs to rise. However, by measuring, tracking, and controlling quality during the lifetime of a software product, its technical debt can be held in check, reducing total cost of ownership. The measurement of quality faces challenges due to disagreement over the meaning of software quality, the inability to directly measure quality factors, and the lack of measurement practice in the software industry. This report addresses these challenges through a literature survey, a metrics derivation process, and a survey of professional software developers. Definitions of software quality from the literature are presented and evaluated with responses from software professionals. A goal-question-metric (GQM) process is used to derive quality-targeted metrics tracing back to a set of seven code-quality subgoals, while a survey of software professionals shows that, despite agreement that metrics and metric visualizations would be useful for improving software quality, the techniques are underutilized in practice.
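The goal-question-metric derivation mentioned above can be illustrated with a small sketch; the subgoal, questions, and metric names below are invented placeholders, not the report's actual set of seven subgoals.

```python
# Sketch of a Goal-Question-Metric (GQM) tree: a quality goal spawns
# questions, and each question is answered by concrete metrics.
# All names here are illustrative placeholders.

gqm = {
    "goal": "Improve maintainability of the codebase",
    "questions": [
        {
            "question": "How complex are individual modules?",
            "metrics": ["cyclomatic complexity", "nesting depth"],
        },
        {
            "question": "How tangled are module dependencies?",
            "metrics": ["fan-in", "fan-out"],
        },
    ],
}

def metrics_for(goal: dict) -> list[str]:
    """Flatten a GQM tree into the list of metrics to collect."""
    return [m for q in goal["questions"] for m in q["metrics"]]

print(metrics_for(gqm))
```

The point of the structure is traceability: every collected metric can be traced back through a question to the quality subgoal that motivated it.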
34

Data Quality in Wide-Area Monitoring and Control Systems: PMU Data Latency, Completeness, and Design of Wide-Area Damping Systems

Zhu, Kun January 2013
The strain on modern electrical power system operation has led to an ever-increasing utilization of new Information Communication Technology (ICT) systems to enhance the reliability and efficiency of grid operation. Among these proposals, Phasor Measurement Unit (PMU)-based Wide-Area Monitoring and Control (WAMC) systems have been recognized as one of the enablers of the “Smart Grid”, particularly at the transmission level, due to their capability to improve the real-time situational awareness of the grid. These systems differ from conventional Supervisory Control And Data Acquisition (SCADA) systems in that they provide globally synchronized measurements at high resolutions. On the other hand, WAMC systems also impose several stringent requirements on the underlying ICT systems, including performance, security, and availability. As a result, the functionality of WAMC applications is heavily, but not exclusively, dependent on the capabilities of the underlying ICT systems. This tight coupling makes it difficult to fully exploit the benefits of synchrophasor technology without the proper design and configuration of ICT systems to support the WAMC applications.
In response to the above challenges, this thesis addresses the dependence of WAMC applications on the underlying ICT systems. Specifically, two WAMC data quality attributes, latency and completeness, are examined together with their effects on a typical WAMC application: PMU-based wide-area damping systems. The outcomes of this research include quantified results in the form of PMU communication delays and data-frame losses, and probability distributions that can model the PMU communication delays. Moreover, design requirements are determined for wide-area damping systems, and three different delay-robust designs for this WAMC application are validated against the above results. Finally, a virtual PMU is developed to perform power system and communication network co-simulations. The results reported in this thesis enable better predictions of the performance of the supporting ICT systems in terms of PMU data latency and completeness. These results can be further used to design and optimize WAMC applications and their underlying ICT systems in an integrated manner. The thesis also contributes a systematic approach to designing wide-area damping systems that accounts for PMU data latency and completeness. Finally, the developed virtual PMU, as part of a co-simulation platform, provides a means to investigate the dependence of WAMC applications on the capabilities of the underlying ICT systems in a cost-efficient manner.
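The two data-quality attributes studied, latency and completeness, can be computed from timestamped PMU frames along these lines; the frame timestamps and reporting rate below are invented for illustration, not thesis measurements.

```python
# Illustrative sketch: computing latency and completeness from timestamped
# PMU frames. A lost frame is represented as None in the receive log.
# All timestamps and the reporting rate are invented for the example.

def latency_stats(sent, received):
    """Per-frame communication delay for frames that arrived."""
    return [r - s for s, r in zip(sent, received) if r is not None]

def completeness(received):
    """Fraction of expected frames that actually arrived."""
    got = sum(1 for r in received if r is not None)
    return got / len(received)

sent = [0.00, 0.02, 0.04, 0.06]        # 50 frames/s reporting rate
received = [0.03, 0.05, None, 0.11]    # third frame lost in transit
print(latency_stats(sent, received))   # delays of the delivered frames
print(completeness(received))          # 0.75
```

Fitting a probability distribution to such per-frame delays, and feeding the result into the design of delay-robust damping controllers, is the kind of analysis the thesis performs at scale.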
35

An Empirical Study of Distributed Constraint Satisfaction Algorithms

Mohamed, Younis 20 September 2011
Many real-world problems are naturally distributed, whether spatially, cognitively, or otherwise. Distributed problems naturally lend themselves to solutions using multi-agent paradigms. Distributed Constraint Satisfaction Problems (DisCSPs) are a class of such distributed problems. In DisCSPs, variables and constraints are distributed among agents. Most distributed algorithms, although exponential in the worst case, can perform well in the average case. The main purpose of this research is to statistically assess the differences in empirical performance among major state-of-the-art DisCSP algorithms, including Multi-Sectioned Constraint Network (MSCN)-based algorithms, which have never been empirically compared against other DisCSP algorithms. In this thesis, we select a set of state-of-the-art DisCSP algorithms and compare them on randomly generated instances of binary DisCSPs with a wide range of characteristics. The distributed algorithms ADOPT, DSA, DPOP, and MSCN-based algorithms were selected based on a set of high-level criteria. We explore how these algorithms compare with each other on a range of DisCSPs with different parameters. Their performances are evaluated according to computation time (in the form of non-concurrent computational steps, or NCCCs) and communication load (in the form of the number of messages as well as the volume of messages). Statistical parametric tests are used to aid interpretation of the performance results. In addition, this thesis discusses privacy issues associated with these DisCSP algorithms.
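The logical-clock style of cost accounting behind non-concurrent metrics such as NCCCs can be sketched as follows. This is a simplified illustration under assumed semantics (each agent carries a counter, messages carry it, and a receiver advances to the later clock); the agents, message pattern, and check counts are invented, and the thesis's actual instrumentation may differ.

```python
# Sketch of non-concurrent cost accounting: agents track local work with a
# counter; when a message arrives, the receiver synchronises to the later
# of the two logical clocks, so concurrent work is not double-counted.
# Agent names and check counts are illustrative.

class Agent:
    def __init__(self, name: str):
        self.name = name
        self.nccc = 0            # non-concurrent cost so far
        self.messages_sent = 0   # communication-load counter

    def do_checks(self, n: int):
        self.nccc += n           # n local constraint checks

    def send(self, other: "Agent", checks_on_receive: int):
        self.messages_sent += 1
        # Receiver jumps to the later logical clock before its own work.
        other.nccc = max(other.nccc, self.nccc)
        other.do_checks(checks_on_receive)

a, b = Agent("A"), Agent("B")
a.do_checks(5)        # A: 5
a.send(b, 3)          # B: max(0, 5) + 3 = 8
b.do_checks(2)        # B: 10
b.send(a, 1)          # A: max(5, 10) + 1 = 11
print(a.nccc, b.nccc, a.messages_sent + b.messages_sent)
```

Counting messages (and their volume) alongside the logical clock is what lets algorithms like ADOPT, DSA, and DPOP be compared on both computation time and communication load.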
36

USING COMPLEXITY, COUPLING, AND COHESION METRICS AS EARLY INDICATORS OF VULNERABILITIES

Chowdhury, Istehad 28 September 2009
Software security failures are common, and the problem is growing. A vulnerability is a weakness in software that, when exploited, causes a security failure. It is difficult to detect vulnerabilities until they manifest themselves as security failures in the operational stage of the software, because security concerns are often not addressed, or not known, sufficiently early in the Software Development Life Cycle (SDLC). Complexity, coupling, and cohesion (CCC) related software metrics can be measured during the early phases of software development, such as design or coding. Although these metrics have been successfully employed to indicate software faults in general, the relationships between CCC metrics and vulnerabilities have not yet been extensively investigated. If empirical relationships can be discovered between CCC metrics and vulnerabilities, these metrics could help software developers take proactive action against potential vulnerabilities in software. In this thesis, we investigate whether CCC metrics can be utilized as early indicators of software vulnerabilities. We conduct an extensive case study on several releases of Mozilla Firefox to provide empirical evidence on how vulnerabilities are related to complexity, coupling, and cohesion. We mine the vulnerability databases, bug databases, and version archives of Mozilla Firefox to map vulnerabilities to software entities. We find that some of the CCC metrics are correlated with vulnerabilities at a statistically significant level. Since different metrics are available at different development phases, we further examine the correlations to determine which level of CCC metrics (design or code) is the better indicator of vulnerabilities. We also observe that the correlation patterns are stable across multiple releases. These observations imply that the metrics can be dependably used as early indicators of vulnerabilities in software.
We then present a framework to automatically predict vulnerabilities based on CCC metrics. To build vulnerability predictors, we consider four alternative data mining and statistical techniques (C4.5 Decision Tree, Random Forests, Logistic Regression, and Naïve Bayes) and compare their prediction performance. We are able to predict the majority of the vulnerability-prone files in Mozilla Firefox, with tolerable false positive rates. Moreover, predictors built from past releases can reliably predict the likelihood of vulnerabilities in future releases. The experimental results indicate that structural information from the non-security realm, such as complexity, coupling, and cohesion, is useful in vulnerability prediction. / Thesis (Master, Electrical & Computer Engineering) -- Queen's University, 2009-09-24
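As a rough illustration of metric-based prediction in this spirit, the sketch below scores a file's vulnerability-proneness with a logistic model. The weights, bias, and metric values are invented for the example; the thesis fits real models (C4.5, Random Forests, Logistic Regression, Naïve Bayes) to Mozilla Firefox data.

```python
# Minimal sketch of CCC-metric-based vulnerability scoring: a logistic
# model maps a file's complexity/coupling/cohesion metrics to a
# vulnerability-proneness probability. Weights are invented, not fitted.

import math

WEIGHTS = {"complexity": 0.08, "coupling": 0.05, "cohesion": -0.9}
BIAS = -2.0

def vuln_prone_score(metrics: dict) -> float:
    """Probability-like score in (0, 1); higher means more vuln-prone."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in metrics.items())
    return 1.0 / (1.0 + math.exp(-z))      # logistic link

simple_file = {"complexity": 4, "coupling": 2, "cohesion": 0.8}
tangled_file = {"complexity": 40, "coupling": 25, "cohesion": 0.1}
print(vuln_prone_score(simple_file) < 0.5)   # True: low risk
print(vuln_prone_score(tangled_file) > 0.5)  # True: flagged as vuln-prone
```

Note the sign convention: higher complexity and coupling raise the score, while higher cohesion lowers it, matching the usual reading of these metrics.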
37

A Bridge from Artificial Places: An Empirical Phenomenology of Mystical Reading in Rilke and Eliot

Campbell, Paul G Unknown Date
No description available.
38

Self-Modifying Experiences in Literary Reading: A Model for Reader Response

Fialho, Olivia da Costa Unknown Date
No description available.
39

Epiphanies of finitude: a phenomenological study of existential reading

Sopcak, Paul Unknown Date
No description available.
40

A Manifestation of Model-Code Duality: Facilitating the Representation of State Machines in the Umple Model-Oriented Programming Language

Badreldin, Omar 18 April 2012
This thesis presents research to build and evaluate an embedding of a textual form of state machines into high-level programming languages. The work entailed adding state machine syntax and code generation to the Umple model-oriented programming technology. The added concepts include states, transitions, actions, and composite states as found in the Unified Modeling Language (UML). This approach allows software developers to take advantage of modeling abstractions in their textual environments without sacrificing the added value of visual modeling. Our efforts in developing state machines in Umple followed a test-driven approach to ensure high quality and usability of the technology. We have also developed a syntax-directed editor for Umple, similar to those available for other high-level programming languages. We conducted a grounded theory study of Umple users and used the findings iteratively to guide our experimental development. Finally, we conducted a controlled experiment to evaluate the effectiveness of our approach. By enhancing the code to be almost as expressive as the model, we further support model-code duality: the notion that model and code are two faces of the same coin. Systems can be, and should be, equally well specified textually and diagrammatically. Such duality will benefit modelers and coders alike. Our work suggests that code enhanced with state machine modeling abstractions is semantically equivalent to visual state machine models. The flow of the thesis is as follows: the research hypothesis and questions are presented in “Chapter 1: Introduction”. The background is explored in “Chapter 2: Background”. “Chapter 3: Syntax and semantics of simple state machines” and “Chapter 4: Syntax and semantics of composite state machines” investigate simple and composite state machines in Umple, respectively.
“Chapter 5: Implementation of composite state machines” presents the approach we adopt for implementing composite state machines, which avoids an explosion in the amount of generated code. From this point on, the thesis presents empirical work. A grounded theory study is presented in “Chapter 6: A grounded theory study of Umple”, followed by a controlled experiment in “Chapter 7: Experimentation”. These two chapters constitute our validation and evaluation of the Umple research. Related and future work is presented in “Chapter 8: Related work”.
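As a rough analogue (not actual Umple syntax) of declaring states, transitions, and actions textually rather than diagrammatically, the sketch below encodes a state machine as data in a general-purpose language; the garage-door states and events are invented for the example.

```python
# Illustrative analogue of a textual state machine embedded in a host
# language: states, transitions, and entry actions declared as data.
# This is NOT Umple syntax; states and events are invented examples.

class StateMachine:
    def __init__(self, initial, transitions, actions=None):
        self.state = initial
        self.transitions = transitions   # (state, event) -> next state
        self.actions = actions or {}     # state -> entry action

    def fire(self, event):
        key = (self.state, event)
        if key in self.transitions:      # unhandled events are ignored
            self.state = self.transitions[key]
            if self.state in self.actions:
                self.actions[self.state]()   # run the entry action
        return self.state

door = StateMachine(
    initial="Closed",
    transitions={
        ("Closed", "buttonPressed"): "Opening",
        ("Opening", "reachedTop"): "Open",
        ("Open", "buttonPressed"): "Closing",
        ("Closing", "reachedBottom"): "Closed",
    },
)
print(door.fire("buttonPressed"))   # Opening
print(door.fire("reachedTop"))      # Open
print(door.fire("reachedTop"))      # Open (event ignored in this state)
```

A model-oriented language like Umple goes further by making such declarations first-class syntax and generating the executable code, which is what keeps the textual and diagrammatic views semantically equivalent.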
