291 | Predicting mutation score using source code and test suite metrics
Jalbert, Kevin (01 September 2012)
Mutation testing has traditionally been used to evaluate the effectiveness of test suites and provide confidence in the testing process. Mutation testing involves the creation of many versions of a program, each with a single syntactic fault. A test suite is evaluated against these program versions (i.e., mutants) in order to determine the percentage of mutants the test suite is able to identify (i.e., the mutation score). A major drawback of mutation testing is that even a small program may yield thousands of mutants, potentially making the process cost-prohibitive. To improve the performance and reduce the cost of mutation testing, we proposed a machine learning approach to predict mutation score based on a combination of source code and test suite metrics. We conducted an empirical evaluation of the effectiveness of our approach using eight open source software systems.
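As a rough illustration of the approach (with invented metrics and data, not the thesis's actual feature set or pipeline), a regression model can be trained to map per-unit source and test metrics to an observed mutation score:

```python
# Illustrative sketch: predict a class's mutation score from
# hypothetical source-code and test-suite metrics with an
# off-the-shelf regressor.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical feature matrix, one row per unit under test.
# Columns: lines of code, cyclomatic complexity, number of covering
# tests, statement coverage (plausible metrics, not the thesis's list).
X = np.array([
    [120, 14,  8, 0.85],
    [ 45,  3,  2, 0.40],
    [200, 22, 15, 0.95],
    [ 80,  9,  5, 0.70],
])
# Target: observed mutation score (mutants killed / total mutants).
y = np.array([0.78, 0.35, 0.91, 0.62])

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, y)
print(model.predict([[100, 10, 6, 0.75]]))  # predicted mutation score
```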
292 | Toward a Heuristic Model for Evaluating the Complexity of Computer Security Visualization Interface
Wang, Hsiu-Chung (05 December 2006)
Computer security visualization has gained much attention in the research community in the past few years. However, the advancement of security visualization research has been hampered by the lack of standardization in visualization design, centralized datasets, and evaluation methods. We propose a new heuristic model for evaluating the complexity of computer security visualizations. This complexity evaluation method is designed to evaluate the efficiency of performing visual search in security visualizations by measuring the critical memory-capacity load needed to perform such tasks. Our method is based on research in cognitive psychology along with characteristics found in a majority of security visualizations. The main goal of developing this complexity evaluation method is to guide computer security visualization design and to compare different visualization designs. Finally, we compare several well-known computer security visualization systems. The proposed method has the potential to be extended to other areas of information visualization.
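A purely hypothetical sketch of the kind of heuristic scoring this suggests (the inputs, weights, and functional form are assumptions for illustration, not the paper's model) is:

```python
# Hypothetical heuristic: score a visualization's visual-search
# complexity as a weighted count of the distinct items a viewer must
# hold in working memory. Weights are invented for illustration.
def complexity_score(n_elements, n_encodings, n_views,
                     w_elements=1.0, w_encodings=2.0, w_views=3.0):
    """Higher scores suggest a heavier memory-capacity load."""
    return (w_elements * n_elements
            + w_encodings * n_encodings
            + w_views * n_views)

# Compare two hypothetical designs: one dense view versus three
# coordinated views with fewer simultaneous encodings each.
print(complexity_score(n_elements=50, n_encodings=4, n_views=1))
print(complexity_score(n_elements=20, n_encodings=2, n_views=3))
```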
293 | Corporate finance and option theory: an extension model of Rao and Stevens (2007)
Li, Xiaoni (25 January 2010)
The stated objective is to contribute to the theory and practice of the financial valuation of the firm by studying the influence that government, shareholders, creditors, employees, and clients jointly exert on that valuation. The flexibility required in business life is modeled using the methodology of options, both ordinary and real. The main novelties lie in the following two areas: 1) an instrumental one, consisting of the way the options are used; 2) the introduction and quantification of the role that employees and clients play in the firm's creation of value. This approach aims to take a step forward in the evaluation of employee remuneration (which could allow improvement of the remuneration schemes currently in use), as well as in classifying the client base according to the added value that these two groups of stakeholders have contributed during the financial year. The research carried out opens up important avenues for future work, many of them outlined in the thesis, which will undoubtedly serve as a roadmap for completing the framework established here. The current state of development of our economic systems calls for new approaches, and the one proposed and used here is very promising.

This thesis focuses on the field of market valuation of relatively large firms and refers to markets with "normal" behavior, where classical assumptions such as rationality and dynamical stability apply. It attempts to investigate the firm's value creation so as to reveal the contribution of all possible stakeholders that might be involved in the formation of the market value of a firm. The literature review related to valuation models, especially the DCF model, has shown that the conceptual frame of Modigliani and Miller for determining the market value of a firm, as it is understood today, is too restricted, because only three types of stakeholders (shareholders, debtholders, and government) are considered. This work contributes an extended valuation model which incorporates other stakeholders beyond shareholders, debtholders, and government, such as employees and clients, so as to reflect their influence on the firm's market value. Based upon the work of Rao and Stevens (2007), which reflects the role of the three types of stakeholders with special emphasis on the role of government, real options theory is applied here to quantify the value created through a greater degree of loyalty and through capture policies for both employees and clients.

One fundamental option is proposed when building the model; it relates to a firm's portfolio of employees and clients, which carries options to improve returns by driving up the motivation of employees, the fidelity of clients, the capture of talent, and information campaigns aimed at clients and/or investors. By applying real options theory, this thesis finds an appropriate way to treat uncertainty and to integrate the associated risks into valuation models, while considering flexibility as an ingredient of value in managerial decisions, increasing the capability to take alternative actions. The approach in this thesis is mainly theoretical, with a possible scheme for empirical experiments suggested for future research. The results suggest that there exist communication vehicles by which information about an increase in the satisfaction of employees and customers can be truly transmitted to investors and thus converted into an increase in the firm's market value.
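As a sense of the option machinery the model builds on, the sketch below values a call with the standard Black-Scholes formula under a real-options reading; the extended Rao and Stevens (2007) model itself is not reproduced here, and the numbers are invented.

```python
# Standard Black-Scholes call value. Real-options reading of the
# inputs (hypothetical): S = present value of the cash flows an
# employee/client initiative may unlock, K = cost of the initiative,
# sigma = uncertainty of those cash flows.
from math import exp, log, sqrt
from statistics import NormalDist

def bs_call(S, K, T, r, sigma):
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf
    return S * N(d1) - K * exp(-r * T) * N(d2)

# A loyalty programme costing 8 that may unlock cash flows worth 10
# today, over a two-year horizon.
print(round(bs_call(S=10.0, K=8.0, T=2.0, r=0.03, sigma=0.4), 3))
```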
294 | Maintenance of the Quality Monitor Web-Application
Ponomarenko, Maksym (January 2013)
Applied Research in System Analysis (ARiSA) is a company specialized in the development of customer-specific quality models and applied research work. In order to improve the quality of its projects and to reduce maintenance costs, ARiSA developed Quality Monitor (QM), a web application for quality analysis. The QM application was originally developed as a basic program enabling customers to evaluate the quality of their source code. The business logic of the application was therefore simplified and certain limitations were imposed on it, which in turn led to a number of issues related to user experience, performance, and architecture design. These aspects are important both for the application as a product and for its future promotion, as well as for customers as end users.

The main application issues added to the maintenance list were: manual data upload; insufficient server resources to handle long-running and resource-consuming operations; no background processing or status reporting; simplistic presentation of analysis results and known usability issues; and weak integration between the analysis back-ends and the front-end. To address the known issues and improve on the existing limitations, a maintenance phase of the QM application was initiated. It was intended, first of all, to stabilize the current version and improve the user experience, and also to refactor and implement more efficient processing of data uploads in the background. In addition, extended functionality would fulfill customer needs and transform QM from a project into a product. The extended functionality includes automated data upload from different build processes, new data visualizations, and improvement of the current functionality according to customer comments.

The maintenance phase of the QM application has been successfully completed and the master thesis goals have been met. The current version is more stable and more responsive from a user-experience perspective. Data processing is more efficient, and it is now implemented as background analysis with automatic data import. The user interface has been updated with visualizations for client-side interaction and progress reporting. The solution has been evaluated and tested in close cooperation with QM application customers. This thesis describes the requirements analysis, the technology stack with the rationale for its choice, and the implementation, to show the maintenance results.
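The background-processing pattern described above might look like the following minimal sketch (an assumed shape for illustration, not QM's actual code): uploads are queued, a worker analyzes them off the request path, and a status map supports progress reporting.

```python
# Minimal background-processing sketch: queued uploads, a daemon
# worker, and a status map the UI could poll for progress.
import queue
import threading

jobs = queue.Queue()
status = {}  # upload_id -> "pending" | "done"

def worker():
    while True:
        upload_id, data = jobs.get()
        # ... run the (placeholder) quality analysis on `data` ...
        status[upload_id] = "done"
        jobs.task_done()

threading.Thread(target=worker, daemon=True).start()

# A build process submits an upload and returns immediately.
status["build-42"] = "pending"
jobs.put(("build-42", b"raw metrics payload"))
jobs.join()  # a real server would not block; shown here to observe the result
print(status["build-42"])  # -> "done"
```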
295 | Improved effort estimation of software projects based on metrics
Andersson, Veronika; Sjöstedt, Hanna (January 2005)
Saab Ericsson Space AB develops products for space at a predetermined price. Since the price is fixed, it is crucial to have a reliable prediction model to estimate the effort needed to develop the product. Software effort estimation is difficult in general, and at the software department this is a problem. By analyzing metrics collected from former projects, different prediction models are developed to estimate the number of person-hours a software project will require. Models for predicting the effort before a project begins are developed first; only a few variables are known at this stage of a project. The models developed are compared to a model currently used at the company. Linear regression models reduce the estimation error by nine percentage points, and nonlinear regression models improve the result even more. The model used today is also calibrated to improve its predictions, and a principal component regression model is developed as well. In addition, a model to improve the estimate during an ongoing project is developed. This is a new approach, and comparison with the first estimate is the only evaluation. The result is an improved prediction model: several models perform better than the one used today. In the discussion, positive and negative aspects of the models are debated, leading to the choice of a model recommended for future use.
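A minimal sketch of the linear-regression idea, with invented numbers rather than company data: effort in person-hours is fitted against the few metrics known before a project starts.

```python
# Least-squares effort model over hypothetical pre-project metrics:
# [estimated size (kLOC), number of requirements] per past project.
import numpy as np

X = np.array([[10, 40], [25, 90], [5, 20], [18, 70]], dtype=float)
y = np.array([1200, 3100, 600, 2300], dtype=float)  # person-hours

# Append an intercept column and solve the least-squares problem.
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

new_project = np.array([15, 60, 1], dtype=float)
print(f"predicted effort: {new_project @ coef:.0f} person-hours")
```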
296 | A Study of the Structural Similarity Image Quality Measure with Applications to Image Processing
Brunet, Dominique (02 August 2012)
Since its introduction in 2004, the Structural Similarity (SSIM) index has gained widespread popularity as an image quality assessment measure. SSIM is currently recognized to be one of the most powerful methods of assessing the visual closeness of images. That being said, the Mean Squared Error (MSE), which performs very poorly from a perceptual point of view, still remains the most common optimization criterion in image processing applications because of its relative simplicity along with a number of other properties that are deemed important. In this thesis, some necessary tools to assist in the design of SSIM-optimal algorithms are developed. This work combines theoretical developments with experimental research and practical algorithms.
The description of the mathematical properties of the SSIM index represents the principal theoretical achievement in this thesis. Indeed, it is demonstrated how the SSIM index can be transformed into a distance metric. Local convexity, quasi-convexity, symmetries and invariance properties are also proved. The study of the SSIM index is also generalized to a family of metrics called normalized (or M-relative) metrics.
Various analytical techniques for different kinds of SSIM-based optimization are then devised. For example, the best approximation according to the SSIM is described for orthogonal and redundant basis sets. SSIM-geodesic paths with arclength parameterization are also traced between images. Finally, formulas for SSIM-optimal point estimators are obtained.
On the experimental side of the research, the structural self-similarity of images is studied. This leads to the confirmation of the hypothesis that the main source of self-similarity of images lies in their regions of low variance.
On the practical side, an implementation of local statistical tests on the image residual is proposed for the assessment of denoised images. Also, heuristic estimations of the SSIM index and the MSE are developed.
The research performed in this thesis should lead to the development of state-of-the-art image denoising algorithms. A better comprehension of the mathematical properties of the SSIM index represents another step toward the replacement of the MSE with SSIM in image processing applications.
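For reference, the standard single-window SSIM formula, together with the root transform sqrt(1 - SSIM) studied as a distance, can be sketched as follows (usual default constants; illustrative code, not the thesis's implementation):

```python
# Single-window SSIM between two grayscale images, and the associated
# root distance sqrt(1 - SSIM).
import numpy as np

def ssim(x, y, c1=(0.01 * 255) ** 2, c2=(0.03 * 255) ** 2):
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return (((2 * mx * my + c1) * (2 * cov + c2))
            / ((mx**2 + my**2 + c1) * (vx + vy + c2)))

rng = np.random.default_rng(0)
img = rng.uniform(0, 255, (64, 64))
noisy = img + rng.normal(0, 10, img.shape)

s = ssim(img, noisy)
print(f"SSIM = {s:.3f}, distance sqrt(1 - SSIM) = {np.sqrt(1 - s):.3f}")
```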
297 | Bayesian Methods to Characterize Uncertainty in Predictive Modeling of the Effect of Urbanization on Aquatic Ecosystems
Kashuba, Roxolana Oresta (January 2010)
Urbanization causes myriad changes in watershed processes, ultimately disrupting the structure and function of stream ecosystems. Urban development introduces contaminants (human waste, pesticides, industrial chemicals). Impervious surfaces and artificial drainage systems speed the delivery of contaminants to streams, while bypassing soil filtration and local riparian processes that can mitigate the impacts of these contaminants, and disrupting the timing and volume of hydrologic patterns. Aquatic habitats where biota live are degraded by sedimentation, channel incision, floodplain disconnection, substrate alteration, and elimination of reach diversity. These compounding changes ultimately lead to alteration of invertebrate community structure and function.

Because the effects of urbanization on stream ecosystems are complex, multilayered, and interacting, modeling these effects presents many unique challenges, including: addressing and quantifying processes at multiple scales; representing major interrelated, simultaneously acting dynamics at the system level; incorporating uncertainty resulting from imperfect knowledge, imperfect data, and environmental variability; and integrating multiple sources of available information about the system into the modeling construct. These challenges can be addressed with a Bayesian modeling approach. Specifically, the use of multilevel hierarchical models and Bayesian network models allows the modeler to harness the hierarchical nature of the U.S. Geological Survey (USGS) Effect of Urbanization on Stream Ecosystems (EUSE) dataset to predict invertebrate response at both basin and regional levels, concisely represent and parameterize this system of complicated cause-and-effect relationships and uncertainties, calculate the full probabilistic function of all variables efficiently as the product of more manageable conditional probabilities, and include both expert knowledge and data.

Utilizing this Bayesian framework, this dissertation develops a series of statistically rigorous and ecologically interpretable models predicting the effect of urbanization on invertebrates, as well as a unique, systematic methodology that creates an informed expert prior and then updates this prior with available data using conjugate Dirichlet-multinomial distribution forms. The resulting models elucidate differences between regional responses to urbanization (particularly due to background agriculture and precipitation) and address the influences of multiple urban-induced stressors acting simultaneously from a new system-level perspective. These Bayesian modeling approaches quantify previously unexplained regional differences in biotic response to urbanization, capture multiple interacting environmental and ecological processes affected by urbanization, and ultimately link urbanization effects on stream biota to a management context, such that these models describe and quantify how changes in drivers lead to changes in a regulatory endpoint (the Biological Condition Gradient; BCG).
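The conjugate Dirichlet-multinomial update mentioned above reduces to adding observed counts to the prior's pseudo-counts; a toy sketch (with an invented prior and counts, not the EUSE analysis) is:

```python
# Conjugate update: Dirichlet prior + multinomial counts -> Dirichlet
# posterior, by adding counts to the prior pseudo-counts.
import numpy as np

alpha_prior = np.array([2.0, 5.0, 3.0])  # expert-elicited pseudo-counts
observed = np.array([4, 1, 7])           # hypothetical sites per category

alpha_post = alpha_prior + observed
posterior_mean = alpha_post / alpha_post.sum()
print(posterior_mean)  # updated category probabilities
```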
298 | Architectural contextualism in the twentieth century, with particular reference to the architects E. Fay Jones and John Carl Warnecke
Wolford, Jane N. (15 July 2005)
This is a study of the importance, elements, and techniques of architectural contextualism. Contextual architecture is here defined as architecture that creates relationships with its specific site or its broader physical or visual environment. This study posits a comprehensive definition of architectural contextualism on multiple levels: denotatively, connotatively, historically, philosophically, and in its aspects of critical regionalism. American architects adept at the practice of architectural contextualism during the mid-twentieth century offer principles and techniques; these architects include John Carl Warnecke, E. Fay Jones, and George White, among others. This research has yielded a systematic, comprehensive definition of contextualism and a set of metrics that can be used as a basis for design and as an aid in evaluating the degree to which a building or set of buildings and their landscape are contextually congruent.
299 | Indexing presentations using multiple media streams
Ruddarraju, Ravikrishna (15 August 2006)
This thesis presents novel techniques to index multiple media streams in a digitally captured presentation. These media streams are related by the common content in a presentation. We use relevance curves to represent these relationships. These relevance curves are generated by using a mix of text processing techniques and distance measures for sparse vocabularies. These techniques are used to automatically detect slide boundaries in a presentation. Accuracy of detecting these boundaries is evaluated as a function of word error rates.
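A toy sketch of the relevance idea (invented data, not the thesis's system): each transcript window is scored against each slide's text with cosine similarity over a sparse vocabulary, and a slide boundary is hypothesized where the best-matching slide changes.

```python
# Cosine similarity over sparse word-count vectors, used to match
# transcript windows to slides.
from collections import Counter
from math import sqrt

def cosine(a, b):
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    den = (sqrt(sum(v * v for v in a.values()))
           * sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

slides = [Counter("mutation testing cost".split()),
          Counter("machine learning prediction model".split())]
windows = ["testing cost is high", "we train a learning model"]

for t, text in enumerate(windows):
    scores = [cosine(Counter(text.split()), s) for s in slides]
    print(t, [round(s, 2) for s in scores], "-> slide", scores.index(max(scores)))
```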
300 | The Effect of Design Patterns on Object-Oriented Metrics and Software Error-Proneness
Aydinoz, Baris (01 September 2006)
This thesis study investigates the connection between design patterns, OO metrics, and software error-proneness. The literature on OO metrics, design patterns, and software error-proneness is reviewed. Different software projects and synthetic source code have been analyzed to verify this connection.
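As a generic illustration of the kind of measurement involved (not the thesis's tooling), one classic OO metric, weighted methods per class, simplified here to a plain method count, can be collected from source code as follows:

```python
# Count methods per class (a simplified WMC metric) from Python
# source using the standard-library AST.
import ast

source = """
class Order:
    def add_item(self, item): ...
    def total(self): ...
    def ship(self): ...
"""

tree = ast.parse(source)
for node in ast.walk(tree):
    if isinstance(node, ast.ClassDef):
        wmc = sum(isinstance(n, ast.FunctionDef) for n in node.body)
        print(node.name, "WMC =", wmc)
```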