381

Empirical Likelihood Inference for the Accelerated Failure Time Model via Kendall Estimating Equation

Lu, Yinghua 17 July 2010 (has links)
In this thesis, we study two methods for inference on the parameters of the accelerated failure time model with right-censored data. One is the Wald-type method, which involves parameter estimation. The other is the empirical likelihood method, which is based on the asymptotic distribution of the likelihood ratio. We employ a monotone, censored-data version of the Kendall estimating equation and construct confidence intervals from both methods. In simulation studies, we compare the empirical likelihood (EL) and Wald-type procedures in terms of coverage accuracy and average confidence interval length, and conclude that the empirical likelihood method performs better. We also compare the EL for Kendall's rank regression estimator with the EL for other well-known estimators and find that the EL for the Kendall estimator has advantages for small sample sizes. Finally, real clinical trial data are used for illustration.
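The thesis's own Kendall estimating equation and EL construction are not reproduced here. As a rough illustration of rank-based estimation in a right-censored AFT model, the sketch below evaluates a Gehan-type monotone rank estimating function (a close relative of the Kendall-type equation, used only as an illustrative stand-in) on simulated data and locates its root by grid search. The data, censoring mechanism, and weighting are assumptions, not the thesis's setup.

```python
import numpy as np

def gehan_score(b, x, logt, delta):
    """Monotone Gehan-type rank estimating function U(b) for a one-covariate
    AFT model with right censoring.  x: covariate, logt: observed log time,
    delta: 1 if the event was observed, 0 if censored."""
    e = logt - b * x                      # residuals at candidate slope b
    diff_x = x[:, None] - x[None, :]      # x_i - x_j for every pair
    smaller = e[:, None] <= e[None, :]    # e_i <= e_j (only meaningful when i is uncensored)
    return np.sum(delta[:, None] * diff_x * smaller)

# toy data: true slope 1.0, moderate censoring (illustrative only)
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
t = x + rng.gumbel(size=n)               # log failure times
c = rng.gumbel(loc=1.0, size=n)          # log censoring times
logt = np.minimum(t, c)
delta = (t <= c).astype(float)

# U(b) is monotone in b, so its root can be located with a coarse grid search
grid = np.linspace(-2, 4, 601)
scores = np.array([gehan_score(b, x, logt, delta) for b in grid])
b_hat = grid[np.argmin(np.abs(scores))]
print(f"rank-based slope estimate: {b_hat:.3f}")
```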
382

Empirical Likelihood Confidence Intervals for Generalized Lorenz Curve

Belinga-Hill, Nelly E. 28 November 2007 (has links)
Lorenz curves are extensively used in economics to analyze income inequality. In this thesis, we discuss confidence interval estimation methods for the generalized Lorenz curve. We first obtain normal approximation (NA) and empirical likelihood (EL) based confidence intervals for generalized Lorenz curves. Then we perform simulation studies to compare the coverage probabilities and lengths of the proposed EL-based confidence intervals with those of the NA-based confidence intervals for the generalized Lorenz curve. Simulation results show that the EL-based confidence intervals have better coverage probabilities and shorter lengths than the NA-based intervals at the 100p-th percentile when p is greater than 0.50. Finally, two real income examples are used to evaluate the applicability of these methods: the first is the 2001 income data from the Panel Study of Income Dynamics (PSID), and the second uses households' median income for the USA by county for the years 1999 and 2006.
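As a hedged illustration of the quantity being interval-estimated, the sketch below computes the empirical generalized Lorenz ordinate GL(p) (the sum of the smallest floor(np) ordered incomes divided by n, i.e. the sample mean times the ordinary Lorenz ordinate) and a simple percentile-bootstrap interval on simulated incomes. The bootstrap stands in for, and is not, the NA- or EL-based intervals studied in the thesis; the income sample is invented.

```python
import numpy as np

def generalized_lorenz(incomes, p):
    """Empirical generalized Lorenz ordinate GL(p): the sum of the smallest
    floor(n*p) ordered incomes divided by the sample size n."""
    y = np.sort(np.asarray(incomes, dtype=float))
    n = len(y)
    k = int(np.floor(n * p))
    return y[:k].sum() / n

# toy income sample (illustrative only) and a percentile bootstrap interval,
# used here as a simple stand-in for the NA/EL intervals of the thesis
rng = np.random.default_rng(1)
incomes = rng.lognormal(mean=10.5, sigma=0.8, size=500)
p = 0.6

boot = [generalized_lorenz(rng.choice(incomes, size=len(incomes), replace=True), p)
        for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"GL({p}) = {generalized_lorenz(incomes, p):.1f}, "
      f"95% bootstrap CI: ({lo:.1f}, {hi:.1f})")
```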
383

Data Quality in Wide-Area Monitoring and Control Systems: PMU Data Latency, Completeness, and Design of Wide-Area Damping Systems

Zhu, Kun January 2013 (has links)
The strain on modern electrical power system operation has led to an ever increasing utilization of new Information Communication Technology (ICT) systems to enhance the reliability and efficiency of grid operation. Among these proposals, Phasor Measurement Unit (PMU)-based Wide-Area Monitoring and Control (WAMC) systems have been recognized as one of the enablers of the “Smart Grid”, particularly at the transmission level, due to their capability to improve the real-time situational awareness of the grid. These systems differ from conventional Supervisory Control And Data Acquisition (SCADA) systems in that they provide globally synchronized measurements at high resolution. On the other hand, WAMC systems also impose several stringent requirements on the underlying ICT systems, including performance, security, and availability. As a result, the functionality of WAMC applications is heavily, but not exclusively, dependent on the capabilities of the underlying ICT systems. This tight coupling makes it difficult to fully exploit the benefits of synchrophasor technology without the proper design and configuration of ICT systems to support the WAMC applications.

In response to the above challenges, this thesis addresses the dependence of WAMC applications on the underlying ICT systems. Specifically, two WAMC system data quality attributes, latency and completeness, are examined together with their effects on a typical WAMC application: PMU-based wide-area damping systems. The outcomes of this research include quantified results in the form of PMU communication delays and data frame losses, and probability distributions that can model the PMU communication delays. Moreover, design requirements are determined for wide-area damping systems, and three different delay-robust designs for this WAMC application are validated based on the above results. Finally, a virtual PMU is developed to perform power system and communication network co-simulations. The results reported in this thesis offer a prospect for better predictions of the performance of the supporting ICT systems in terms of PMU data latency and completeness. These results can be further used to design and optimize the WAMC applications and their underlying ICT systems in an integrated manner.

This thesis also contributes a systematic approach to designing the wide-area damping system considering PMU data latency and completeness. Finally, the developed virtual PMU, as part of a co-simulation platform, provides a means to investigate the dependence of WAMC applications on the capabilities of the underlying ICT systems in a cost-efficient manner.
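To illustrate how PMU data latency and completeness can be quantified, the following sketch simulates frame delays and losses under a hypothetical gamma delay model with a hypothetical loss rate, then reports latency percentiles and the fraction of frames usable within an assumed controller latency budget. None of the distribution families, parameters, or budgets below are the fitted or measured values reported in the thesis.

```python
import numpy as np

# Illustrative sketch only: the delay model and all parameters are
# hypothetical stand-ins, not the thesis's measurement results.
rng = np.random.default_rng(2)

REPORT_RATE_HZ = 50          # typical PMU reporting rate
N_FRAMES = REPORT_RATE_HZ * 3600   # one hour of frames
LOSS_PROB = 0.001            # hypothetical frame-loss probability
FIXED_DELAY_MS = 20.0        # hypothetical propagation + processing floor

# random queueing component on top of the fixed floor (hypothetical gamma model)
delays_ms = FIXED_DELAY_MS + rng.gamma(shape=2.0, scale=5.0, size=N_FRAMES)
lost = rng.random(N_FRAMES) < LOSS_PROB

completeness = 1.0 - lost.mean()
p50, p95, p99 = np.percentile(delays_ms[~lost], [50, 95, 99])
print(f"completeness: {completeness:.4%}")
print(f"latency p50/p95/p99: {p50:.1f} / {p95:.1f} / {p99:.1f} ms")

# A damping controller with a hard latency budget (hypothetical 150 ms) would
# also treat frames arriving after the budget as effectively lost:
BUDGET_MS = 150.0
usable = (~lost) & (delays_ms <= BUDGET_MS)
print(f"frames usable within {BUDGET_MS:.0f} ms budget: {usable.mean():.4%}")
```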
384

EXPERIMENTAL VALIDATION OF A NOVEL STRUCTURAL HEALTH MONITORING STRATEGY FOR BOLTED PIPELINE JOINTS

Briand, Julie 18 August 2010 (has links)
The early detection of damage in in-service structural or mechanical systems is of vital importance. With early detection, the damage may be repaired before the integrity of the system is jeopardized, avoiding possible monetary losses, environmental impacts, injuries, and deaths. With this goal in mind, many structural health monitoring techniques have been developed that use a combination of sensors and algorithms to collect, process, and interpret data to detect damage in a structure. This thesis presents work completed in support of the experimental validation of a novel structural health monitoring technique developed with the aim of providing improved qualitative results compared with currently available methods.
385

Composition and Structure Dependence of the Photoelastic Response of Oxide Glass

Martin, Vincent 05 August 2011 (has links)
The isotropy of a glass can be broken by the application of a mechanical stress, giving rise to birefringence. Some lead-containing glass compositions are known to prevent this phenomenon; they are called zero-stress optic glasses. Mueller’s theory of photoelasticity attempts to explain the structural origin of the photoelastic response in glasses and crystals. Zwanziger’s empirical model is able to predict the photoelastic response of a glass based on its composition and the crystal structure of its constituents. Lead-, tin-, antimony-, zinc-, and cadmium-containing glasses were investigated in the binary silicate, borate, and phosphate systems. The stress-optic coefficient of these binary glasses was measured experimentally using the Sénarmont method or found in the literature. Solid-state Nuclear Magnetic Resonance spectroscopy and Mössbauer spectroscopy were mainly used to investigate the local environment of the cations. The photoelastic response of a glass and its structure were correlated, and the results were compared with the expectations arising from Mueller’s theory and Zwanziger’s empirical model. The theory and the model were both tested and their reliability discussed. Zero-stress optic glasses are of technological interest, but new environmental regulations forbid the use of lead in materials, including glass. From experimental results and the literature, a global strategy to design new zero-stress optic glasses was established. New lead-free zero-stress optic glasses were discovered with properties similar to those of lead-containing zero-stress optic glasses (high refractive index, transparency, no coloration). The study of the structural dependence of the photoelastic response of oxide glass helped identify new parameters influencing photoelasticity, such as covalency, polarizability, and natural deformation of the additive.
386

An Empirical Study of Distributed Constraint Satisfaction Algorithms

Mohamed, Younis 20 September 2011 (has links)
Many real-world problems are naturally distributed, whether spatially, cognitively, or otherwise. Distributed problems naturally lend themselves to solutions using multi-agent paradigms. Distributed Constraint Satisfaction Problems (DisCSPs) are a class of such distributed problems in which variables and constraints are distributed among agents. Most distributed algorithms, although exponential in the worst case, can perform well in the average case. The main purpose of this research is to statistically assess differences between the empirical performance of major state-of-the-art DisCSP algorithms, including Multi-Sectioned Constraint Network (MSCN) based algorithms, which have never been empirically compared against other DisCSP algorithms. In this thesis, we select a set of state-of-the-art DisCSP algorithms and compare them on randomly generated instances of binary DisCSPs with a wide range of characteristics. The distributed algorithms ADOPT, DSA, DPOP, and MSCN-based algorithms were selected based on a set of high-level criteria. We explore how these algorithms compare with each other on a range of DisCSPs with different parameters. Their performance is evaluated in terms of computation time (non-concurrent computational steps, or NCCCs) and communication load (the number as well as the volume of messages). Statistical parametric tests are used to aid interpretation of the performance results. In addition, this thesis discusses privacy issues associated with these DisCSP algorithms.
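For context on the kind of algorithm being compared, the sketch below is a simplified synchronous simulation of DSA (variant B) on a toy binary DisCSP, graph coloring with inequality constraints. The instance, probability parameter, and termination check are illustrative assumptions; it does not reproduce the thesis's experimental setup, the MSCN/ADOPT/DPOP comparisons, or the NCCC and message-count instrumentation.

```python
import random

def dsa(neighbors, domain, rounds=200, p=0.7, seed=3):
    """Simplified synchronous DSA (variant B) for graph coloring viewed as a
    binary DisCSP: one agent per variable, an inequality constraint with each
    neighbor.  Illustrative sketch only."""
    rng = random.Random(seed)
    value = {a: rng.choice(domain) for a in neighbors}        # initial assignment
    for _ in range(rounds):
        snapshot = dict(value)                                # neighbors' last-sent values
        for a in neighbors:
            conflicts = lambda v: sum(snapshot[b] == v for b in neighbors[a])
            cur = conflicts(snapshot[a])
            if cur == 0:
                continue                                      # locally satisfied, stay put
            best = min(domain, key=conflicts)
            # DSA-B: move with probability p if it does not make things worse
            if conflicts(best) <= cur and rng.random() < p:
                value[a] = best
        if all(value[a] != value[b] for a in neighbors for b in neighbors[a]):
            break                                             # all constraints satisfied
    return value

# toy problem: a 5-cycle coloured with 3 colours (hypothetical instance)
neighbors = {0: [1, 4], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 0]}
print(dsa(neighbors, domain=[0, 1, 2]))
```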
387

USING COMPLEXITY, COUPLING, AND COHESION METRICS AS EARLY INDICATORS OF VULNERABILITIES

Chowdhury, Istehad 28 September 2009 (has links)
Software security failures are common and the problem is growing. A vulnerability is a weakness in the software that, when exploited, causes a security failure. It is difficult to detect vulnerabilities until they manifest themselves as security failures in the operational stage of the software, because security concerns are often not addressed or known sufficiently early during the Software Development Life Cycle (SDLC). Complexity, coupling, and cohesion (CCC) related software metrics can be measured during the early phases of software development such as design or coding. Although these metrics have been successfully employed to indicate software faults in general, the relationships between CCC metrics and vulnerabilities have not been extensively investigated yet. If empirical relationships can be discovered between CCC metrics and vulnerabilities, these metrics could help software developers take proactive action against potential vulnerabilities in software. In this thesis, we investigate whether CCC metrics can be utilized as early indicators of software vulnerabilities. We conduct an extensive case study on several releases of Mozilla Firefox to provide empirical evidence on how vulnerabilities are related to complexity, coupling, and cohesion. We mine the vulnerability databases, bug databases, and version archives of Mozilla Firefox to map vulnerabilities to software entities. It is found that some of the CCC metrics are correlated with vulnerabilities at a statistically significant level. Since different metrics are available at different development phases, we further examine the correlations to determine which level (design or code) of CCC metrics is the better indicator of vulnerabilities. We also observe that the correlation patterns are stable across multiple releases. These observations imply that the metrics can be dependably used as early indicators of vulnerabilities in software. We then present a framework to automatically predict vulnerabilities based on CCC metrics. To build vulnerability predictors, we consider four alternative data mining and statistical techniques (C4.5 Decision Tree, Random Forests, Logistic Regression, and Naïve Bayes) and compare their prediction performance. We are able to predict the majority of the vulnerability-prone files in Mozilla Firefox, with tolerable false positive rates. Moreover, the predictors built from past releases can reliably predict the likelihood of having vulnerabilities in future releases. The experimental results indicate that structural information from the non-security realm, such as complexity, coupling, and cohesion, is useful in vulnerability prediction. / Thesis (Master, Electrical & Computer Engineering) -- Queen's University, 2009-09-24 17:31:36.581
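As a hedged sketch of the kind of prediction framework described (not the thesis's actual pipeline, metrics extraction, or Firefox data), the following example trains a Random Forest, one of the four techniques named in the abstract, on hypothetical per-file CCC metrics and reports cross-validated ROC AUC. The feature set, synthetic labels, and thresholds are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical stand-in data: each row is a file, columns are CCC metrics
# (e.g. cyclomatic complexity, fan-in, fan-out, lack of cohesion).  The real
# study mined Mozilla Firefox repositories; nothing here reproduces it.
rng = np.random.default_rng(4)
n_files = 1000
X = np.column_stack([
    rng.poisson(8, n_files),     # cyclomatic complexity
    rng.poisson(5, n_files),     # fan-in  (coupling)
    rng.poisson(5, n_files),     # fan-out (coupling)
    rng.uniform(0, 1, n_files),  # lack of cohesion
])
# synthetic label loosely tied to complexity and coupling, for illustration only
risk = 0.03 * X[:, 0] + 0.04 * X[:, 1] + 0.04 * X[:, 2] + 0.5 * X[:, 3]
y = (risk + rng.normal(0, 0.5, n_files) > 1.2).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"5-fold ROC AUC: {scores.mean():.3f} +/- {scores.std():.3f}")
```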
388

Empirical Studies of Performance Bugs and Performance Analysis Approaches for Software Systems

ZAMAN, SHAHED 30 April 2012 (has links)
Developing high-quality software is of eminent importance for keeping existing customers satisfied and remaining competitive. One of the most important software quality characteristics is performance, which defines how fast and/or efficiently a software system can perform its operations. While several studies have shown that field problems are often due to performance issues rather than feature bugs, prior research typically treats all bugs as similar when studying various aspects of software quality (e.g., predicting the time to fix a bug) or focuses on other types of bugs (e.g., security bugs). There is little work that studies performance bugs. In this thesis, we perform an empirical study to quantitatively and qualitatively examine performance bugs in the Mozilla Firefox and Google Chrome web browser projects, in order to find out whether performance bugs are really different from other bugs in practice and to understand the rationale behind those differences. In our quantitative study, we find that performance bugs in the Firefox project take longer to fix, are fixed by more experienced developers, and require changes to more lines of code. We also study performance bugs relative to security bugs, since security bugs have been extensively studied separately in the past. We find that, in the Firefox project, security bugs are re-opened and tossed more often, are fixed and triaged faster, are fixed by more experienced developers, and are assigned more developers. The Google Chrome project also shows quantitative differences between performance and non-performance bugs, and these differ from those observed in the Firefox project. Based on our quantitative results, we then examine the data from a qualitative point of view. As one of our most interesting observations, we find that end users are often frustrated with performance problems and often threaten to switch to competing software products. To better understand why some users are very frustrated (even threatening to switch products) even though most systems are well tested, we performed an additional study. In this final study, we explore a global perspective versus a user-centric perspective for analyzing performance data. We find that a user-centric perspective might reveal a small number of users experiencing considerably poor performance, while the global perspective might show good or unchanged performance across releases. The results of our studies show that performance bugs are different and should be studied separately in large-scale software systems to improve the quality assurance processes related to software performance. / Thesis (Master, Computing) -- Queen's University, 2012-04-30 01:28:22.623
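To illustrate the style of quantitative comparison described (fix times of performance versus other bugs), the sketch below runs a one-sided Mann-Whitney U test on simulated fix-time samples. The samples are hypothetical and do not reflect the mined Firefox or Chrome data or the thesis's findings.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical fix-time samples (in days) for performance vs. other bugs,
# standing in for data mined from bug trackers; values are simulated only.
rng = np.random.default_rng(5)
perf_fix_days = rng.lognormal(mean=3.2, sigma=0.9, size=300)
other_fix_days = rng.lognormal(mean=2.7, sigma=0.9, size=1500)

stat, pval = mannwhitneyu(perf_fix_days, other_fix_days, alternative="greater")
print(f"median fix time (perf vs. other): "
      f"{np.median(perf_fix_days):.1f} vs. {np.median(other_fix_days):.1f} days")
print(f"Mann-Whitney U one-sided p-value: {pval:.2e}")
```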
389

A Bridge from Artificial Places: An Empirical Phenomenology of Mystical Reading in Rilke and Eliot

Campbell, Paul G Unknown Date
No description available.
390

Self-Modifying Experiences in Literary Reading: A Model for Reader Response

Fialho, Olivia da Costa Unknown Date
No description available.
