31 |
A Recommendation System for Preconditioned Iterative Solvers. George, Thomas. December 2009 (has links)
Solving linear systems of equations is an integral part of most scientific simulations. In
recent years, there has been considerable interest in large-scale scientific simulation of
complex physical processes. Iterative solvers are usually preferred for solving linear systems
of such magnitude due to their lower computational requirements. Currently, computational
scientists have access to a multitude of iterative solver options available as "plug-and-
play" components in various problem solving environments. Choosing the right solver
configuration from the available choices is critical for ensuring convergence and achieving
good performance, especially for large complex matrices. However, identifying the
"best" preconditioned iterative solver and parameters is challenging even for an expert due
to issues such as the lack of a unified theoretical model, complexity of the solver configuration
space, and multiple selection criteria. Therefore, it is desirable to have principled
practitioner-centric strategies for identifying solver configuration(s) for solving large linear
systems.
The current dissertation presents a general practitioner-centric framework for (a) problem-independent retrospective analysis, and (b) problem-specific predictive modeling of
performance data. Our retrospective performance analysis methodology introduces new
metrics, such as the area under the performance-profile curve and a conditional variance-based fine-tuning score, that facilitate robust comparative performance evaluation as well as parameter
sensitivity analysis. We present results using this analysis approach on a number of popular
preconditioned iterative solvers available in packages such as PETSc, Trilinos, Hypre, ILUPACK, and WSMP. The predictive modeling of performance data is an integral part
of our multi-stage approach for solver recommendation. The key novelty of our approach
lies in our modular learning-based formulation that comprises three subproblems: (a)
solvability modeling, (b) performance modeling, and (c) performance optimization, which
provides the flexibility to effectively target challenges such as software failure and multiobjective
optimization. Our choice of a "solver trial" instance space represented in terms
of the characteristics of the corresponding "linear system", "solver configuration" and their
interactions, leads to a scalable and elegant formulation. Empirical evaluation of our approach
on performance datasets associated with fairly large groups of solver configurations
demonstrates that one can obtain high quality recommendations that are close to the ideal
choices.
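To make the area-under-performance-profile metric mentioned above concrete, the following is a minimal sketch that computes standard Dolan-Moré performance profiles and their normalized area for a matrix of solver runtimes. This is one plausible reading of the metric, not the dissertation's exact definition, and all numbers are illustrative.

    import numpy as np

    def performance_profile_auc(times, tau_max=16.0, n_points=200):
        # times: (n_problems, n_solvers) runtimes; np.inf marks a failed run
        best = np.min(times, axis=1, keepdims=True)    # best time per problem
        ratios = times / best                          # performance ratios
        taus = np.linspace(1.0, tau_max, n_points)     # ratio thresholds
        # rho_s(tau): fraction of problems solver s solves within tau * best
        profiles = np.array([(ratios <= t).mean(axis=0) for t in taus])
        # trapezoidal area under each profile, normalized to [0, 1]
        widths = np.diff(taus)[:, None]
        auc = ((profiles[1:] + profiles[:-1]) / 2.0 * widths).sum(axis=0)
        return auc / (tau_max - 1.0)

    # toy data: 3 linear systems, 2 solver configurations
    times = np.array([[1.0, 2.0],
                      [5.0, 4.0],
                      [3.0, np.inf]])  # configuration 2 failed on system 3
    print(performance_profile_auc(times))  # larger area = more robust solver

A solver that fails on a problem never "solves within tau times the best", so failures depress its profile at every threshold, which is what makes the area a robustness-aware summary.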
|
32 |
Patterned Versus Conventional Object-Oriented Analysis Methods: A Group Project Experiment. KUROKI, Hiroaki; YAMAMOTO, Shuichiro. 20 December 1998 (has links)
No description available.
|
33 |
An empirical study of SD signal delay versus temperature in a plenum grade coaxial cable. Kaur, Sukhdeep. 14 February 2012 (has links)
A high-resolution Speedy Delivery time domain reflectometer (SD/TDR) has been developed in the Electrical Engineering department at The University of Texas at Austin. The SD/TDR uses a novel non-sinusoidal signal that does not undergo dispersion during transmission in lossy media. The SD/TDR is used to estimate the length of transmission lines and to detect the location of faults in them. Time of flight (TOF) is one of the critical parameters of the SD/TDR and a function of several temperature-dependent factors. Given the TOF and length of a transmission line, the signal delay can be computed. This research presents an empirical study of the effect of temperature on the TOF in a plenum grade coaxial cable for temperatures ranging from -3 °C to 60 °C. We also study the effect of temperature on the characteristic impedance of the coaxial cable. Finally, an SD double exponential waveform is used to estimate TOF for calibrated short and open terminations.
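Since a reflectometer measures a round trip, cable length and per-meter signal delay follow directly from the TOF and an assumed velocity factor. A minimal sketch with illustrative numbers (not measurements from this study):

    C = 299_792_458.0  # speed of light in vacuum, m/s

    def length_from_tof(tof_s, velocity_factor):
        # round-trip reflectometry: the signal travels down the cable and back
        return C * velocity_factor * tof_s / 2.0

    def delay_per_meter(tof_s, length_m):
        # one-way signal delay per unit length from round-trip TOF
        return tof_s / (2.0 * length_m)

    tof = 256.4e-9  # round-trip TOF in seconds (illustrative value)
    vf = 0.78       # assumed velocity factor for a plenum coaxial cable
    print(length_from_tof(tof, vf))          # ~30 m
    print(delay_per_meter(tof, 30.0) * 1e9)  # ~4.3 ns per meter

A temperature sweep of the dielectric shifts the velocity factor, which is why the measured TOF (and hence the delay per meter) varies with temperature.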
|
34 |
An empirical study on software quality: developer perception of quality, metrics, and visualizations. Wilson, Gary Lynn. 09 December 2013 (has links)
Software tends to decline in quality over time, causing development and maintenance costs to rise. However, by measuring, tracking, and controlling quality during the lifetime of a software product, its technical debt can be held in check, reducing total cost of ownership. The measurement of quality faces challenges due to disagreement over the meaning of software quality, the inability to directly measure quality factors, and the lack of measurement practice in the software industry. This report addresses these challenges through a literature survey, a metrics derivation process, and a survey of professional software developers. Definitions of software quality from the literature are presented and evaluated with responses from software professionals. A goal-question-metric process is used to derive quality-targeted metrics tracing back to a set of seven code-quality subgoals, while the survey of software professionals shows that, despite agreement that metrics and metric visualizations would be useful for improving software quality, these techniques are underutilized in practice.
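The goal-question-metric derivation mentioned above can be sketched as a simple goal-to-questions-to-metrics structure. The subgoal, question, and metrics below are illustrative placeholders, not the report's actual seven code-quality subgoals:

    from dataclasses import dataclass, field

    @dataclass
    class Metric:
        name: str
        unit: str

    @dataclass
    class Question:
        text: str
        metrics: list = field(default_factory=list)

    @dataclass
    class Goal:
        purpose: str
        questions: list = field(default_factory=list)

    # illustrative subgoal; the report's actual subgoals are not reproduced here
    goal = Goal(
        purpose="Keep module complexity in check",
        questions=[Question("How complex are individual methods?",
                            [Metric("cyclomatic complexity", "paths"),
                             Metric("method length", "LOC")])],
    )
    for q in goal.questions:
        print(q.text, "->", [m.name for m in q.metrics])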
|
35 |
Data Quality in Wide-Area Monitoring and Control Systems: PMU Data Latency, Completeness, and Design of Wide-Area Damping Systems. Zhu, Kun. January 2013 (has links)
The strain on modern electrical power system operation has led to an ever increasing utilization of new Information Communication Technology (ICT) systems to enhance the reliability and efficiency of grid operation. Among these proposals, Phasor Measurement Unit (PMU)-based Wide-Area Monitoring and Control (WAMC) systems have been recognized as one of the enablers of “Smart Grid”, particularly at the transmission level, due to their capability to improve the real-time situational awareness of the grid. These systems differ from the conventional Supervisory Control And Data Acquisition (SCADA) systems in that they provide globally synchronized measurements at high resolutions. On the other hand, the WAMC systems also impose several stringent requirements on the underlying ICT systems, including performance, security, and availability, etc. As a result, the functionality of the WAMC applications is heavily, but not exclusively, dependent on the capabilities of the underlying ICT systems. This tight coupling makes it difficult to fully exploit the benefits of the synchrophasor technology without the proper design and configuration of ICT systems to support the WAMC applications.

In response to the above challenges, this thesis addresses the dependence of WAMC applications on the underlying ICT systems. Specifically, two of the WAMC system data quality attributes, latency and completeness, are examined together with their effects on a typical WAMC application, PMU-based wide-area damping systems. The outcomes of this research include quantified results in the form of PMU communication delays and data frame losses, and probability distributions that can model the PMU communication delays. Moreover, design requirements are determined for the wide-area damping systems, and three different delay-robust designs for this WAMC application are validated based on the above results. Finally, a virtual PMU is developed to perform power system and communication network co-simulations. The results reported by this thesis offer a prospect for better predictions of the performance of the supporting ICT systems in terms of PMU data latency and completeness. These results can be further used to design and optimize the WAMC applications and their underlying ICT systems in an integrated manner.
This thesis also contributes a systematic approach to design the wide-area damping system considering the PMU data latency and completeness. Finally, the developed virtual PMU, as part of a co-simulation platform, provides a means to investigate the dependence of WAMC applications on the capabilities of the underlying ICT systems in a cost-efficient manner.
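A minimal sketch of the two data quality attributes studied above, assuming log-normally distributed PMU communication delays (the thesis fits distributions to measured delays; the parameters and deadline here are synthetic):

    import numpy as np

    rng = np.random.default_rng(0)

    # synthetic PMU communication delays in ms (illustrative parameters)
    delays = rng.lognormal(mean=np.log(20.0), sigma=0.4, size=10_000)

    # latency model: fit a log-normal by matching log-moments
    mu, sigma = np.log(delays).mean(), np.log(delays).std()

    # completeness: fraction of frames usable within the application deadline
    deadline_ms = 50.0
    completeness = np.mean(delays <= deadline_ms)
    print(f"log-normal fit: mu={mu:.2f}, sigma={sigma:.2f}")
    print(f"frames within {deadline_ms} ms deadline: {completeness:.1%}")

A wide-area damping controller discards frames that miss its deadline, so the tail of the delay distribution, not just its mean, determines the effective completeness seen by the application.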
|
36 |
An Empirical Study of Distributed Constraint Satisfaction Algorithms. Mohamed, Younis. 20 September 2011 (has links)
Many real-world problems are naturally distributed, whether spatially, cognitively, or otherwise. Distributed problems naturally lend themselves to solutions using multi-agent paradigms. Distributed Constraint Satisfaction Problems (DisCSPs) are a class of such distributed problems. In DisCSPs, variables and constraints are distributed between agents. Most distributed algorithms, although exponential in the worst case, can perform well in the average case. The main purpose of this research is to statistically assess the differences between the empirical performances of major state-of-the-art DisCSP algorithms, including Multi-Sectioned Constraint Network (MSCN) based algorithms, which have never been empirically compared against other DisCSP algorithms. In this thesis, we select a set of state-of-the-art DisCSP algorithms and compare them on randomly generated instances of binary DisCSPs with a wide range of characteristics. The distributed algorithms ADOPT, DSA, DPOP, and the MSCN-based algorithms were selected based on a set of high-level criteria. We explore how these algorithms compare with each other on a range of DisCSPs with different parameters. Their performances are evaluated according to computation time (in the form of non-concurrent computational steps, or NCCCs) and communication load (in the form of the number of messages as well as the volume of messages). Statistical parametric tests are used to aid interpretation of the performance results. In addition, this thesis discusses privacy issues associated with these DisCSP algorithms.
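Randomly generated binary CSPs of the kind mentioned above are commonly drawn from the standard four-parameter random model; a minimal generator sketch follows (the parameter names and values are assumptions for illustration, not taken from the thesis):

    import itertools
    import random

    def random_binary_csp(n_vars, domain_size, density, tightness, seed=0):
        # density (p1): probability that a given variable pair is constrained
        # tightness (p2): fraction of value pairs forbidden per constraint
        rnd = random.Random(seed)
        values = range(domain_size)
        constraints = {}
        for i, j in itertools.combinations(range(n_vars), 2):
            if rnd.random() < density:
                pairs = list(itertools.product(values, values))
                forbidden = set(rnd.sample(pairs, int(tightness * len(pairs))))
                constraints[(i, j)] = forbidden  # disallowed value pairs
        return constraints

    csp = random_binary_csp(n_vars=10, domain_size=5,
                            density=0.3, tightness=0.4)
    print(len(csp), "binary constraints generated")

Sweeping density and tightness moves instances across the easy-hard-easy phase transition, which is what gives a benchmark suite its "wide range of characteristics"; in the distributed setting, the variables (and their constraints) are then partitioned among agents.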
|
37 |
Using Complexity, Coupling, and Cohesion Metrics as Early Indicators of Vulnerabilities. Chowdhury, Istehad. 28 September 2009 (links)
Software security failures are common and the problem is growing. A vulnerability is a weakness in the software that, when exploited, causes a security failure. It is difficult to detect vulnerabilities until they manifest themselves as security failures in the operational stage of the software, because security concerns are often not addressed or known sufficiently early during the Software Development Life Cycle (SDLC). Complexity, coupling, and cohesion (CCC) related software metrics can be measured during the early phases of software development such as design or coding. Although these metrics have been successfully employed to indicate software faults in general, the relationships between CCC metrics and vulnerabilities have not been extensively investigated yet. If empirical relationships can be discovered between CCC metrics and vulnerabilities, these metrics could aid software developers to take proactive actions against potential vulnerabilities in software.
In this thesis, we investigate whether CCC metrics can be utilized as early indicators of software vulnerabilities. We conduct an extensive case study on several releases of Mozilla Firefox to provide empirical evidence on how vulnerabilities are related to complexity, coupling, and cohesion. We mine the vulnerability databases, bug databases, and version archives of Mozilla Firefox to map vulnerabilities to software entities. It is found that some of the CCC metrics are correlated to vulnerabilities at a statistically significant level. Since different metrics are available at different development phases, we further examine the correlations to determine which level (design or code) of CCC metrics are better indicators of vulnerabilities. We also observe that the correlation patterns are stable across multiple releases. These observations imply that the metrics can be dependably used as early indicators of vulnerabilities in software.
We then present a framework to automatically predict vulnerabilities based on CCC metrics. To build vulnerability predictors, we consider four alternative data mining and statistical techniques – C4.5 Decision Tree, Random Forests, Logistic Regression, and Naïve Bayes – and compare their prediction performances. We are able to predict the majority of the vulnerability-prone files in Mozilla Firefox, with tolerable false positive rates. Moreover, predictors built from past releases can reliably predict the likelihood of vulnerabilities in future releases. The experimental results indicate that structural information from the non-security realm, such as complexity, coupling, and cohesion, is useful in vulnerability prediction. / Thesis (Master, Electrical & Computer Engineering) -- Queen's University, 2009.
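As a sketch of the prediction framework above, the following trains one of the four compared techniques (logistic regression) on synthetic stand-ins for per-file CCC metrics; the Mozilla Firefox dataset and the thesis's exact feature set are not reproduced here:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)

    # synthetic per-file metrics: complexity, coupling, lack-of-cohesion
    X = rng.gamma(shape=2.0, scale=5.0, size=(500, 3))
    # toy label: files with high combined CCC scores are vulnerability-prone
    y = (X.sum(axis=1) + rng.normal(0.0, 5.0, 500) > 35.0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print(classification_report(y_te, clf.predict(X_te)))

Training on one release and evaluating on the next (rather than a random split, as here) is what the thesis's cross-release experiments test, and is the stricter setting for claiming early-indicator value.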
|
38 |
A Bridge from Artificial Places: An Empirical Phenomenology of Mystical Reading in Rilke and Eliot. Campbell, Paul G. Unknown Date
No description available.
|
39 |
Self-Modifying Experiences in Literary Reading: A Model for Reader Response. Fialho, Olivia da Costa. Unknown Date
No description available.
|
40 |
Epiphanies of finitude: a phenomenological study of existential reading. Sopcak, Paul. Unknown Date
No description available.
|