71 |
Thematic quality assessment of learning at work within Higher Vocational Education
Thång, Per-Olof; Littke, Kerstin, 31 October 2014 (has links)
No description available.
|
72 |
An Analysis and Reasoning Framework for Project Data Software Repositories
Attarian, Ioanna Maria, January 2012 (has links)
As the requirements for software systems increase, their size, complexity and functionality increase as well. This has a direct impact on the complexity of numerous artifacts related to the system, such as specification, design, implementation, and testing models. Furthermore, as the software market becomes more and more competitive, the need for software products that are of high quality and require the least monetary, time, and human resources for their development and maintenance becomes evident. Therefore, it is important that project managers and software engineers are given the necessary tools to obtain a more holistic and accurate perspective of the status of their projects, so that they can identify early the potential risks, flaws, and quality issues that may arise during each stage of the software project life cycle. In this respect, practitioners and academics alike have recognized the significance of investigating new methods for supporting software management operations in large software projects. The main target of this M.A.Sc. thesis is the design of a framework comprising, first, a reference architecture for mining and analyzing software project data repositories according to specific objectives and analytic knowledge; second, techniques to model such analytic knowledge; and third, a reasoning methodology for verifying or refuting hypotheses related to analysis objectives. Such a framework could assist project managers, team leaders and development teams towards more accurate prediction of project traits, in support of quality analysis, risk assessment, cost estimation, and progress evaluation. More specifically, the framework utilizes goal models to specify analysis objectives, as well as possible ways by which these objectives can be achieved. Examples of such analysis objectives for a project could be to yield high code quality, achieve low production cost, or cope with tight delivery deadlines.
Such goal models are consequently transformed into collections of Markov Logic Network rules, which are then applied to the repository data in order to verify or refute, with a degree of probability, whether the particular project objectives can be met as the project evolves. The proposed framework has been applied, as a proof of concept, on a repository pertaining to three industrial projects with more than one hundred development tasks.
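The rule-weighting idea behind Markov Logic Networks can be illustrated in a few lines: each rule carries a weight, and the probability of an outcome grows with the total weight of the rules the observed facts satisfy. The sketch below is a hypothetical toy (the rule names, weights, and project facts are invented for illustration; a real MLN engine performs full joint inference, not this two-state shortcut):

```python
import math

# Hypothetical weighted rules: each maps a project fact to a weight.
# Positive weight: satisfying the rule supports the objective being met.
rules = [
    ("defect_density_low", 1.4),
    ("code_review_coverage_high", 0.9),
    ("schedule_slip", -1.1),
]

def objective_probability(facts):
    """MLN-style scoring: P(objective met) is proportional to
    exp(sum of weights of satisfied rules), normalized over the
    two outcomes 'met' vs 'not met' (a logistic form)."""
    score = sum(w for name, w in rules if facts.get(name, False))
    return math.exp(score) / (math.exp(score) + 1.0)

facts = {"defect_density_low": True, "code_review_coverage_high": True}
p = objective_probability(facts)  # satisfied weights: 1.4 + 0.9 = 2.3
```

With no evidence the score is neutral (probability 0.5); evidence of a schedule slip pushes it below 0.5, mirroring how the framework verifies or refutes objectives with a degree of probability as the project evolves.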
|
73 |
A Study of the Structural Similarity Image Quality Measure with Applications to Image Processing
Brunet, Dominique, 02 August 2012 (has links)
Since its introduction in 2004, the Structural Similarity (SSIM) index has gained widespread popularity as an image quality assessment measure. SSIM is currently recognized to be one of the most powerful methods of assessing the visual closeness of images. That being said, the Mean Squared Error (MSE), which performs very poorly from a perceptual point of view, still remains the most common optimization criterion in image processing applications because of its relative simplicity along with a number of other properties that are deemed important. In this thesis, some necessary tools to assist in the design of SSIM-optimal algorithms are developed. This work combines theoretical developments with experimental research and practical algorithms.
The description of the mathematical properties of the SSIM index represents the principal theoretical achievement in this thesis. Indeed, it is demonstrated how the SSIM index can be transformed into a distance metric. Local convexity, quasi-convexity, symmetries and invariance properties are also proved. The study of the SSIM index is also generalized to a family of metrics called normalized (or M-relative) metrics.
Various analytical techniques for different kinds of SSIM-based optimization are then devised. For example, the best approximation according to the SSIM is described for orthogonal and redundant basis sets. SSIM-geodesic paths with arclength parameterization are also traced between images. Finally, formulas for SSIM-optimal point estimators are obtained.
On the experimental side of the research, the structural self-similarity of images is studied. This leads to the confirmation of the hypothesis that the main source of self-similarity of images lies in their regions of low variance.
On the practical side, an implementation of local statistical tests on the image residual is proposed for the assessment of denoised images. Also, heuristic estimations of the SSIM index and the MSE are developed.
The research performed in this thesis should lead to the development of state-of-the-art image denoising algorithms. A better comprehension of the mathematical properties of the SSIM index represents another step toward the replacement of the MSE with SSIM in image processing applications.
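For reference, the SSIM index between two signals x and y is commonly computed as ((2 mu_x mu_y + C1)(2 sigma_xy + C2)) / ((mu_x^2 + mu_y^2 + C1)(sigma_x^2 + sigma_y^2 + C2)). A minimal global-statistics sketch in plain Python follows (practical implementations such as the original SSIM apply this locally in a sliding Gaussian window and average the resulting map; this simplified global version is for illustration only):

```python
def ssim(x, y, L=255, K1=0.01, K2=0.03):
    """Global SSIM between two equal-length grayscale pixel sequences."""
    n = len(x)
    C1, C2 = (K1 * L) ** 2, (K2 * L) ** 2       # stabilization constants
    mx, my = sum(x) / n, sum(y) / n              # means
    vx = sum((a - mx) ** 2 for a in x) / n       # variance of x
    vy = sum((b - my) ** 2 for b in y) / n       # variance of y
    cxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n  # covariance
    return ((2 * mx * my + C1) * (2 * cxy + C2)) / \
           ((mx ** 2 + my ** 2 + C1) * (vx + vy + C2))

patch = [52, 55, 61, 59, 79, 61, 76, 61]
assert abs(ssim(patch, patch) - 1.0) < 1e-12   # identical signals give SSIM 1
```

A uniform brightness shift lowers SSIM only through the luminance term, while the MSE penalizes it heavily, which is one intuition for why SSIM tracks perception better than the MSE.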
|
74 |
Created stormwater wetlands as wetland compensation and a floristic quality approach to wetland condition assessment in central Alberta
Forrest, Andrew SM, 11 1900 (has links)
In Alberta, almost all created wetlands accepted as compensation have been naturalized stormwater management facilities. Our investigation of 32 created and natural wetlands in central Alberta determined that created wetlands have steeper shoreline slopes, largely as a result of their primary function as stormwater retention ponds. The steeper slopes of created wetlands produced distinctly different vegetation zonation, with fewer, narrower wetland vegetation zones, which in turn was reflected in reduced species richness and abundance of wetland songbirds at created wetlands. This study also discusses the development of a Floristic Quality Assessment (FQA) approach, a standardized, quantitative approach to measuring wetland condition, for Alberta's Parkland and Boreal natural regions. I present plant survey data from the 32 wetlands as validation of the effectiveness of this approach. This study provides information on current wetland compensation practices and a potential wetland assessment tool, both directly relevant to the implementation of wetland compensation policies in Alberta. / Ecology
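Floristic Quality Assessment approaches of the kind described above typically compute a Floristic Quality Index as FQI = mean C times the square root of species richness, where each species carries a coefficient of conservatism C between 0 and 10. A small illustrative sketch follows (the species list and C-values are hypothetical, and regional FQA variants differ in details such as the treatment of non-native species):

```python
import math

# Hypothetical coefficients of conservatism (0-10) for species at one site
c_values = {"Carex aquatilis": 7, "Typha latifolia": 3, "Salix bebbiana": 5}

def fqi(c_by_species):
    """Floristic Quality Index: mean C times sqrt(species richness)."""
    n = len(c_by_species)
    mean_c = sum(c_by_species.values()) / n
    return mean_c * math.sqrt(n)

score = fqi(c_values)  # mean C = 5.0 over N = 3 species
```

Because the index rewards both conservative (high-C) species and richness, a site dominated by a few disturbance-tolerant species, as one might expect at a steep-sided stormwater pond, scores low.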
|
75 |
Proposta de metodologia para avaliação de redes de voz sobre IP / Proposal of Methodology for Evaluation of Voice over IP Networks
Silva, Vandersilvio da, January 2006 (has links)
Cost reduction in telephony through voice over IP has triggered a search for solutions that transform IP networks originally dedicated to data transport into networks that also carry voice. This dissertation presents a methodology to systematize the evaluation of networks for voice over IP traffic according to the possibilities available in the scenario under evaluation. It first gives an overview of voice over IP, presenting the protocols used, the factors that influence voice quality, and the methods for evaluating voice quality. It then reviews work related to quality evaluation of voice over IP applications, and finally describes the proposed methodology for systematizing the evaluation of networks carrying VoIP. / The use of voice over IP telephony started with solutions to adapt existing data networks to carry voice streams. Monitoring techniques, QoS mechanisms, and signaling protocols can be combined in such a design. Our goal is to present a methodology for evaluating and choosing the probing points and the voice quality evaluation techniques to be used in a network redesign. An overview of VoIP protocols and of the parameters that affect voice quality is presented, as well as some related work on evaluating voice quality from network parameters. The proposed methodology is then presented, with a case study showing how one can choose the right combination of probing points and voice quality measurement techniques.
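One standard method for estimating voice quality from network parameters, of the kind such a methodology can draw on, is the ITU-T G.107 E-model: network impairments (delay, loss, codec) degrade a rating factor R, which is then mapped to an estimated Mean Opinion Score (MOS). A sketch of just the R-to-MOS mapping (the computation of R itself from impairment factors is omitted; the default best-case narrowband R is about 93.2):

```python
def r_to_mos(r):
    """ITU-T G.107 E-model mapping from rating factor R to estimated MOS."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1 + 0.035 * r + 7e-6 * r * (r - 60) * (100 - r)

mos = r_to_mos(93.2)  # default E-model R, roughly "toll quality"
```

A monitoring probe that measures loss and delay can subtract the corresponding impairment factors from R and report the resulting MOS, which is one practical way to instrument the probing points the methodology selects.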
|
77 |
Avaliação e recomendação de colaborações em redes sociais acadêmicas / Evaluation and recommendation of collaborations on academic social networks
Lopes, Giseli Rabello, January 2012 (has links)
In the academic context, scientific research in technological areas is carried out through collaboration and cooperation among different researchers and research groups. Researchers work on the most varied subjects and in the most diverse research subareas. To analyze and expand such collaborations, it is often necessary to evaluate the level of cooperation of current partners, as well as to identify new partners for joint work. Such evaluation and identification are not trivial tasks. Approaches for evaluating and recommending collaborations are therefore of great value for improving cooperation and, consequently, research quality. Regarding the analysis of collaborations, the demand for quality evaluation criteria and associated evaluation methods is increasing and has been the focus of many studies in the last decade. This growth arises from the pursuit of academic excellence and from the need to support decision making by funding agencies when allocating resources. In this context, there is a tendency to employ bibliometric techniques, especially statistical methods applied to citations. With so much material being researched and published, we chose to explore another facet for defining quality indicators in the academic context, aiming at complementary results that ensure, through experimental validation, a better generation of indicators. Thus, this thesis draws on the current trend of studies in social network analysis, defining specific social metrics for such indicators. This work presents a function for evaluating the quality of research groups based on the internal collaborations among their member researchers. These collaborations are evaluated through analyses of academic bibliographic social networks based on social interaction metrics.
Regarding the identification or recommendation of collaborations, this thesis presents an approach that considers both the content and the structure of a network. Specifically, the content involves the correlation among researchers by research areas, while the structure includes the analysis of prior relationships among researchers. Most approaches that recommend collaborations focus on recommending experts in a given area or on a given topic; they do not consider the working area of the target user of the recommendation, as the approach presented in this thesis does. Moreover, in this work, information about the relationships among users, used to build an academic social network, is obtained implicitly from publication data gathered from digital libraries. Using such data, it is also possible to explore temporal aspects to weight these relationships and use them for collaboration recommendation; no previous work was found in this direction. The present approach includes the recommendation not only of new collaborations but also of the intensification of existing collaborations, which is not considered by other related work; its recommendation goals are thus broader. After proposing new techniques for evaluating and identifying partnerships, this thesis validates them through an experimental evaluation. Specifically, experiments with real data on co-authorship relations among researchers belonging to different research groups are presented to evaluate and demonstrate the validity and applicability of the thesis' propositions regarding quality evaluation and collaboration recommendation. / In technological fields, scientific research is performed through collaboration and cooperation of different researchers and research groups.
In order to analyze and expand such collaborations, it is necessary to evaluate the level of cooperation between current partners as well as to identify new partners. Such analysis and identification are not trivial tasks. Thus, approaches to evaluating and recommending collaborations are valuable to improve cooperation and, hence, research quality. Regarding the evaluation of collaborations, the demand for quality assessment criteria and associated evaluation methods is increasing; indeed, such evaluations have been the focus of many studies in the last decade. This growth arises from the pursuit of academic excellence and from the decision making of funding agencies. In this context, the trend is to employ bibliometric techniques, especially citation statistics. With so much material being researched and published, we explore another facet for defining quality indicators, aiming to obtain additional results that ensure, through experimental validation, a better generation of indicators. In this thesis, the current trend of studies in social network analysis is applied to the definition of such indicators. Specifically, we introduce a function for quality assessment of research groups based on internal collaborations among their member researchers. These collaborations are evaluated through analysis of bibliometric academic social networks based on metrics of social interaction. Regarding the recommendation of collaborations, this thesis presents an approach that considers both the content and structure of research networks. The content involves the correlation among researchers by research areas, whereas the structure includes the analysis of existing relationships among researchers. Most approaches that recommend collaborations focus on recommending experts in a certain area or topic; they do not consider the working area of the recommendation's target user, as we do in this thesis.
Moreover, the information about the researchers' relationships, employed for building an academic social network, is obtained implicitly from publication data available in digital libraries. We also expand previous analyses by considering temporal aspects to determine the relationship weights, which may be used for collaboration recommendation purposes; there were no previous studies in this direction. Our approach includes not only the recommendation of new collaborations, but also the recommendation of intensifying existing collaborations, which is not considered by other related work. After proposing new techniques for evaluating and identifying research collaborators, this thesis validates them through an experimental evaluation. Specifically, we evaluate and demonstrate the applicability of our techniques on real datasets of co-authorship relationships among researchers from different research groups.
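The hybrid content-plus-structure scoring the abstract describes can be sketched in miniature: a content term from shared research areas and a structural term from prior co-authorship ties, so that both new partners and candidates for intensification receive a score. The similarity measure, weights, and saturating structural term below are invented for illustration, not the thesis' actual formulation:

```python
def jaccard(a, b):
    """Set similarity: |intersection| / |union| (0 when both sets empty)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def collaboration_score(areas_u, areas_v, coauthored_papers, alpha=0.6):
    """Hybrid score: content similarity of research areas plus a
    structural term that grows (saturating) with prior joint papers."""
    content = jaccard(areas_u, areas_v)
    structure = coauthored_papers / (1 + coauthored_papers)
    return alpha * content + (1 - alpha) * structure

# New-collaboration candidate: overlapping areas, no prior joint papers
s_new = collaboration_score({"databases", "IR"}, {"IR", "networks"}, 0)
# Intensification candidate: same area overlap, existing collaboration
s_old = collaboration_score({"databases", "IR"}, {"IR", "networks"}, 3)
assert s_old > s_new  # prior ties raise the score
```

A time decay on `coauthored_papers` (weighting recent papers more) would add the temporal aspect the thesis explores for relationship weighting.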
|
78 |
Perceptual-Based Locally Adaptive Noise and Blur Detection
January 2016 (has links)
abstract: The quality of real-world visual content is typically impaired by many factors including image noise and blur. Detecting and analyzing these impairments are important steps for multiple computer vision tasks. This work focuses on perceptual-based locally adaptive noise and blur detection and their application to image restoration.
In the context of noise detection, this work proposes perceptual-based full-reference and no-reference objective image quality metrics by integrating perceptually weighted local noise into a probability summation model. Results are reported on both the LIVE and TID2008 databases. The proposed metrics consistently achieve good performance across noise types and across databases compared to many of the best recent quality metrics, and they predict with high accuracy the relative amount of perceived noise in images of different content.
In the context of blur detection, existing approaches are either computationally costly or cannot perform reliably when dealing with the spatially-varying nature of the defocus blur. In addition, many existing approaches do not take human perception into account. This work proposes a blur detection algorithm that is capable of detecting and quantifying the level of spatially-varying blur by integrating directional edge spread calculation, probability of blur detection and local probability summation. The proposed method generates a blur map indicating the relative amount of perceived local blurriness. In order to detect the flat/near flat regions that do not contribute to perceivable blur, a perceptual model based on the Just Noticeable Difference (JND) is further integrated in the proposed blur detection algorithm to generate perceptually significant blur maps. We compare our proposed method with six other state-of-the-art blur detection methods. Experimental results show that the proposed method performs the best both visually and quantitatively.
This work further investigates the application of the proposed blur detection methods to image deblurring. Two selective perceptual-based image deblurring frameworks are proposed to improve deblurring results and to reduce restoration artifacts. In addition, an edge-enhanced super-resolution algorithm is proposed and is shown to achieve better reconstruction in edge regions. / Dissertation/Thesis / Doctoral Dissertation Electrical Engineering 2016
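Both detection schemes above pool local detection probabilities through a probability summation model: under an independence assumption, the probability that an impairment is detected somewhere in a region is one minus the probability that it is detected nowhere. A minimal sketch of that pooling (the per-site probabilities are invented; the thesis' actual model derives and weights them perceptually, e.g. via JND-based gating of flat regions):

```python
def probability_summation(local_probs):
    """P(detect anywhere) = 1 - prod(1 - p_i), assuming independent sites."""
    miss = 1.0
    for p in local_probs:
        miss *= (1.0 - p)   # probability of missing at every site so far
    return 1.0 - miss

# Three weak local detections pool into a much stronger regional one
p_region = probability_summation([0.2, 0.3, 0.25])  # 1 - 0.8*0.7*0.75
```

This is why a spatially extended impairment that is barely visible at any single location can still be clearly detectable over a region.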
|
79 |
GPGPU based implementation of BLIINDS-II NR-IQA
January 2016 (has links)
abstract: The technological advances of the past few decades have made possible the creation and consumption of digital visual content at an explosive rate. Consequently, there is a need for efficient quality monitoring systems to ensure minimal degradation of images and videos during processing operations such as compression, transmission, and storage. Objective Image Quality Assessment (IQA) algorithms have been developed that predict quality scores matching well with human subjective quality assessment. However, much research remains to be done before IQA algorithms can be deployed in real-world systems; long runtimes for a single image frame are a major hurdle. Graphics Processing Units (GPUs), equipped with massive numbers of computational cores, provide an opportunity to accelerate IQA algorithms by performing computations in parallel. Indeed, General Purpose Graphics Processing Unit (GPGPU) techniques have been applied to a few IQA algorithms that fall under the Full Reference paradigm. We present a GPGPU implementation of the Blind Image Integrity Notator using DCT Statistics (BLIINDS-II), which falls under the No Reference IQA paradigm. We have achieved a speedup of over 30x over the previous CPU version of this algorithm. We test our implementation on various distorted images from the CSIQ database and present the observed performance trends. We achieve a very consistent runtime of around 9 milliseconds per distorted image, which makes possible the processing of over 100 images per second (100 fps). / Dissertation/Thesis / Masters Thesis Computer Science 2016
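The reported numbers are internally consistent and worth making explicit: 9 ms per image is about 111 images per second, and a 30x speedup implies the CPU version needed on the order of 270 ms per image. A quick back-of-the-envelope check:

```python
gpu_ms_per_image = 9.0   # reported GPGPU runtime per distorted image
speedup = 30.0           # reported speedup over the CPU version

throughput_fps = 1000.0 / gpu_ms_per_image     # images processed per second
cpu_ms_per_image = gpu_ms_per_image * speedup  # implied CPU runtime
```

At roughly 111 images per second, the claim of exceeding 100 fps holds with margin.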
|
80 |
Hardware Acceleration of Most Apparent Distortion Image Quality Assessment Algorithm on FPGA Using OpenCL
January 2017 (has links)
abstract: The information era has brought about many technological advancements in the past few decades, which have led to an exponential increase in the creation of digital images and videos. Virtually all digital images pass through some image processing algorithm for purposes such as compression, transmission, or storage. Data loss during this processing leaves us with degraded images, so quality assessment is required to ensure minimal degradation. Image Quality Assessment (IQA) has been researched and developed over the last several decades to predict quality scores in a manner that agrees with human judgments of quality. Modern IQA algorithms are quite effective at prediction accuracy, but their development has not focused on computational performance: the existing serial implementation requires a relatively large runtime, on the order of seconds for a single frame. Hardware acceleration using Field Programmable Gate Arrays (FPGAs) provides a reconfigurable computing fabric that can be tailored to a broad range of applications. Programming FPGAs has traditionally required expertise in hardware description languages (HDLs) or high-level synthesis (HLS) tools. OpenCL is an open standard for cross-platform, parallel programming of heterogeneous systems; together with the Altera OpenCL SDK, it enables developers to exploit an FPGA's potential without extensive hardware knowledge. Hence, this thesis focuses on accelerating the computationally intensive part of the Most Apparent Distortion (MAD) algorithm on an FPGA using OpenCL. The results are compared with a CPU implementation to evaluate the performance and efficiency gains. / Dissertation/Thesis / Masters Thesis Electrical Engineering 2017
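For context, the MAD index being accelerated combines a detection-based quality measure (dominant for near-threshold distortions) and an appearance-based measure (dominant for clearly visible distortions) through an adaptive geometric weighting. A sketch of that final combination stage follows; the two sub-indices would come from the heavy filtering stages that are offloaded to the FPGA, and the blending constants here are placeholders, not the published MAD parameters:

```python
def mad_combine(d_detect, d_appear, beta1=0.5, beta2=0.1):
    """Blend MAD's two sub-indices with an adaptive weight alpha that
    shifts emphasis between them as the detection-based term grows."""
    # Placeholder weighting function; published MAD uses fitted constants
    alpha = 1.0 / (1.0 + beta1 * d_detect ** beta2)
    return (d_detect ** alpha) * (d_appear ** (1.0 - alpha))

q = mad_combine(d_detect=2.0, d_appear=4.0)  # higher value = worse quality
```

The combination itself is cheap; it is the log-Gabor filtering and local statistics behind `d_detect` and `d_appear` that account for the seconds-long serial runtime and thus justify the FPGA offload.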
|