231

Einstein homogeneous Riemannian fibrations

Araujo, Fatima January 2008 (has links)
This thesis is dedicated to the study of the existence of homogeneous Einstein metrics on the total space of homogeneous fibrations whose fibers are totally geodesic manifolds. We obtain the Ricci curvature of an invariant metric with totally geodesic fibers and some necessary conditions for the existence of Einstein metrics with totally geodesic fibers in terms of Casimir operators. Some particular cases are studied, for instance, for normal base or fiber, symmetric fiber, and Einstein base or fiber, for which the Einstein equations are manageable. We investigate the existence of such Einstein metrics for invariant bisymmetric fibrations of maximal rank, i.e., when both the base and the fiber are symmetric spaces and the base is an isotropy irreducible space of maximal rank. In this way we find new Einstein metrics. For such spaces we describe explicitly the isotropy representation in terms of subsets of roots and compute the eigenvalues of the Casimir operators of the fiber along the horizontal direction. Results for compact simply connected 4-symmetric spaces of maximal rank follow from this. New invariant Einstein metrics are also found on Kowalski n-symmetric spaces.
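For orientation (a standard definition, not specific to this thesis): a Riemannian metric g is Einstein when its Ricci tensor is proportional to the metric itself,

```latex
% Einstein condition: Ricci curvature proportional to the metric
\operatorname{Ric}(g) = \lambda\, g, \qquad \lambda \in \mathbb{R},
```

so the Einstein equations referred to above amount to solving this condition within the family of invariant metrics with totally geodesic fibers.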
232

Représentation et détection des images et des surfaces déformables / Representation and detection of deformable images and surfaces

Barolet, Justine C. 08 1900 (has links)
The representation of a surface, its smoothing, and its use in identification, comparison, classification, and in the study of changes in volume, curvature, and topology are ubiquitous in the area of scanning. Among mathematical methods, we have retained diffeomorphic transformations of a reference pattern. There is considerable interest, both theoretical and numerical, in approximating an arbitrary diffeomorphism by diffeomorphisms generated by velocity fields. On the theoretical front the question is: "is the subgroup of diffeomorphisms generated by velocity fields dense in Micheletti's larger group endowed with the Courant metric?" In spite of some progress made here, the question remains open. The tracks followed then converged towards the subgroup of Lipschitzian diffeomorphisms of Azencott and Trouvé and its metric developed for imaging. It corresponds to a notion of geodesic between two diffeomorphisms in their subgroup. Optimization is then used to obtain a system of state and adjoint-state equations characterizing the optimal solution of the identification problem from observations. This approach is adapted to the identification of surfaces obtained from a scanner such as, for instance, the scan of a face. This problem is much more difficult than the imaging one. We introduce a curvilinear reference system and a faceted surface for numerical computations. We provide a formulation of the identification problem and of the computation of the change of volume with respect to a reference scan.
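As background for the velocity-field construction discussed above, the standard setup (a sketch, not the thesis's exact formulation) generates diffeomorphisms as flows of a time-dependent velocity field $v_t$, with a geodesic distance given by the minimal kinetic energy of connecting paths:

```latex
% Flow of a time-dependent velocity field, and the induced metric
\frac{\partial \varphi_t}{\partial t} = v_t \circ \varphi_t, \qquad \varphi_0 = \mathrm{id},
\qquad
d(\mathrm{id}, \varphi_1)^2 \;=\; \inf_{v}\int_0^1 \lVert v_t\rVert_V^{2}\,dt .
```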
233

Exploiting abstract syntax trees to locate software defects

Shippey, Thomas Joshua January 2015 (has links)
Context. Software defect prediction aims to reduce the large costs involved with faults in a software system. A wide range of traditional software metrics have been evaluated as potential defect indicators. These traditional metrics are derived from the source code or from the software development process. Studies have shown that no metric clearly outperforms another, and identifying defect-prone code using traditional metrics has reached a performance ceiling. Less traditional metrics have been studied, derived from the natural language of the source code. These newer, less traditional, and finer-grained metrics have shown promise within defect prediction. Aims. The aim of this dissertation is to study the relationship between short Java constructs and the faultiness of source code. To study this relationship, this dissertation introduces the concepts of a Java sequence and a Java code snippet. Sequences are created by using the Java abstract syntax tree: the ordering of the nodes within the abstract syntax tree creates the sequences, while small subsequences of a sequence are the code snippets. The dissertation tries to find a relationship between the code snippets and faulty and non-faulty code. It also looks at the evolution of the code snippets as a system matures, to discover whether code snippets significantly associated with faulty code change over time. Methods. To achieve the aims of the dissertation, two main techniques have been developed: finding defective code, and extracting Java sequences and code snippets. Finding defective code has been split into two areas: finding the defect fix points and the defect insertion points. To find the defect fix points, an implementation of the bug-linking algorithm, called S + e, has been developed. Two algorithms were developed to extract the sequences and the code snippets.
The code snippets are analysed using the binomial test to find which ones are significantly associated with faulty and non-faulty code. These techniques have been applied to five different Java datasets: ArgoUML, AspectJ, and three releases of Eclipse.JDT.core. Results. There are significant associations between some code snippets and faulty code. Frequently occurring fault-prone code snippets include those associated with identifiers, method calls, and variables. Some code snippets significantly associated with faults are always found in faulty code. There are 201 code snippets significantly associated with faults across all five of the systems. The technique is unable to find any significant associations between code snippets and non-faulty code. The relationship between code snippets and faults seems to change as the system evolves, with more snippets becoming fault-prone as Eclipse.JDT.core evolved over the three releases analysed. Conclusions. This dissertation has introduced the concept of code snippets into software engineering and defect prediction. The use of code snippets offers a promising approach to identifying potentially defective code. Unlike previous approaches, code snippets are based on a comprehensive analysis of low-level code features and potentially allow the full set of code defects to be identified. Initial research into the relationship between code snippets and faults has shown that some code constructs or features are significantly related to software faults. The significant associations between code snippets and faults have provided additional empirical evidence for some previously researched fault-prone constructs within defect prediction. The code snippets have shown that some constructs significantly associated with faults are located in all five systems; although this set is small, finding any defect indicators that transfer successfully from one system to another is rare.
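The binomial test mentioned above can be sketched as follows; the counts and baseline rate are hypothetical, and the exact test configuration used in the dissertation may differ:

```python
# Sketch of the snippet-association test, using an exact one-sided
# binomial test. The counts are made up for illustration; the
# dissertation's real data is not reproduced here.
from math import comb

def binomial_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): the one-sided p-value."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def snippet_fault_association(in_faulty, total, baseline_rate, alpha=0.05):
    """Is the snippet found in faulty code significantly more often than
    the system-wide baseline faulty-code rate would predict?"""
    pvalue = binomial_tail(in_faulty, total, baseline_rate)
    return pvalue, pvalue < alpha

# Hypothetical snippet: 40 of its 60 occurrences fall in faulty code,
# against a baseline faulty-code rate of 30%.
pvalue, significant = snippet_fault_association(40, 60, 0.30)
print(f"p = {pvalue:.3g}  significant = {significant}")
```

The same tail computation, with `alternative` flipped, would serve for testing association with non-faulty code.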
234

Visual Quality Metrics Resulting from Dynamic Corneal Tear Film Topography

Solem, Cameron Cole January 2017 (has links)
The visual quality effects of the dynamic behavior of the tear film have been determined through measurements acquired with a high-resolution Twyman-Green interferometer. The base shape of the eye has been removed to isolate the aberrations induced by the tear film. The measured tear film was then combined with a typical human eye model to simulate visual performance. Fourier theory has been implemented to calculate the incoherent point spread function, the modulation transfer function, and the subjective quality factor for this system. Analysis software has been developed for ease of automation over large data sets, and output movies have been made that display these visual quality metrics alongside the tear film. Post-processing software was written to identify and eliminate bad frames. As a whole, this software creates the potential for increased intuition about the connection between blinks, tear film dynamics, and visual quality.
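The Fourier-optics step described above (pupil function with a tear-film wavefront error, then incoherent PSF and MTF) can be sketched as follows; the grid size and the tilt aberration are illustrative stand-ins, not the thesis's interferometer data:

```python
# Minimal Fourier-optics sketch: from a pupil with a wavefront error
# (in waves) to the incoherent PSF and a normalized MTF.
import numpy as np

def psf_and_mtf(wavefront_waves, pupil_mask):
    """Incoherent PSF = |FT(pupil)|^2; MTF = normalized |FT(PSF)|."""
    pupil = pupil_mask * np.exp(2j * np.pi * wavefront_waves)
    field = np.fft.fftshift(np.fft.fft2(pupil))
    psf = np.abs(field) ** 2
    psf /= psf.sum()                  # normalize total energy to 1
    mtf = np.abs(np.fft.fft2(psf))    # OTF magnitude
    return psf, mtf / mtf.max()

n = 64
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
mask = (x**2 + y**2 <= 1.0).astype(float)   # circular pupil
tilt = 0.2 * x * mask                       # hypothetical 0.2-wave tilt
psf, mtf = psf_and_mtf(tilt, mask)
```

Running this per interferometer frame and stacking the results over time would give the metric-versus-tear-film movies described above.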
235

Development of an ROI Aware Full-Reference Objective Perceptual Quality Metric on Images over Fading Channel

GOGINENI, SRI LOHITH January 2016 (has links)
In spite of technological advances in wireless systems, transmitted data suffer from impairments through both lossy source coding and transmission over error-prone channels. Due to these errors, the quality of multimedia content is degraded. The major challenge for service providers in this scenario is to measure the perceptual impact of distortions in order to provide a certain Quality of Experience (QoE) to the end user. The general tendency of the Human Visual System (HVS) suggests that artifacts in the Region-of-Interest (ROI) are perceived to be more annoying than artifacts in the Background (BG). With this assumption, the thesis aims to measure image quality over the ROI and the BG independently. Visual Information Fidelity (VIF), a full-reference image quality assessment metric, is chosen for this purpose. Finally, the metrics measured over the ROI and BG are pooled to obtain an ROI-aware metric. The ROI-aware metric is used to predict the Mean Opinion Score (MOS) of an image. In this study, the ROI-aware quality metric is used to measure the quality of a set of distorted images generated using a wireless channel. Eventually, the MOS of the distorted images is estimated. Lastly, the predicted MOS is validated against the MOS obtained from subjective tests. Testing the proposed image quality assessment approach shows an improved prediction performance of the ROI-aware quality metric over traditional image quality metrics. It is also observed that the approach provides a consistent improvement over a wide variety of distortions. The obtained results suggest that impairments in the ROI are perceived to be more annoying than those in the BG.
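The pooling step can be sketched as below. The linear pooling form and the weight value are illustrative assumptions, not the thesis's exact model; the per-region scores would come from a full-reference VIF implementation, which is not reimplemented here.

```python
# Sketch of ROI-aware pooling: quality is measured over the ROI and the
# background separately, then combined with a weight reflecting the
# higher perceptual impact of ROI artifacts. Weight is a hypothetical
# choice, not a value from the thesis.
def roi_aware_score(vif_roi, vif_bg, roi_weight=0.7):
    """Pool per-region VIF scores into one ROI-aware quality number."""
    if not 0.0 <= roi_weight <= 1.0:
        raise ValueError("roi_weight must lie in [0, 1]")
    return roi_weight * vif_roi + (1.0 - roi_weight) * vif_bg

# A distortion concentrated in the ROI hurts the pooled score more than
# the same distortion confined to the background.
in_roi = roi_aware_score(vif_roi=0.55, vif_bg=0.95)
in_bg = roi_aware_score(vif_roi=0.95, vif_bg=0.55)
```

With `roi_weight` above 0.5, the pooled score tracks the HVS tendency the thesis assumes: ROI artifacts dominate the prediction.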
236

Towards efficient vehicle dynamics development: From subjective assessments to objective metrics, from physical to virtual testing

Gil Gómez, Gaspar January 2017 (has links)
Vehicle dynamics development is strongly based on subjective assessments (SA) of vehicle prototypes, which is expensive and time consuming. Consequently, in the age of computer-aided engineering (CAE), there is a drive towards reducing this dependency on physical testing. However, computers are known for their remarkable processing capacity, not for their feelings. Therefore, before SA can be computed, it is necessary to properly understand the correlation between SA and objective metrics (OM), which can be calculated by simulations, and to understand how this knowledge can enable a more efficient and effective development process. The approach of this research was firstly to identify key OM and SA in vehicle dynamics, based on the multicollinearity of OM and of SA, and on interviews with expert drivers. Secondly, linear regressions and artificial neural networks (ANN) were used to identify the ranges of preferred OM that lead to good SA ratings. This result is the basis for objective requirements, a must in effective vehicle dynamics development and verification. The main result of this doctoral thesis is a method capable of predicting SA from combinations of key OM. Firstly, this method generates a classification map of vehicles solely based on their OM, which allows for a qualitative prediction of the steering feel of a new vehicle based on its position, and that of its neighbours, in the map. This prediction is enhanced with descriptive word clouds, which summarize in a few words the comments of expert test drivers on each vehicle in the map. Then, a second superimposed ANN displays the evolution of SA ratings in the map and therefore allows one to forecast the SA rating for the new vehicle. Moreover, this method has been used to analyse the effect of the tolerances of OM requirements, as well as to verify the previously identified preferred ranges of OM.
This thesis focused on OM-SA correlations in summer conditions, but it also aimed to increase the effectiveness of vehicle dynamics development in general. For winter conditions, where objective testing is not yet mature, this research initiates the definition and identification of robust objective manoeuvres and OM. Experimental data were used together with CAE optimisations and ANOVA analysis to optimise the manoeuvres, which were verified in a second experiment. To improve the quality and efficiency of SA, Volvo's Moving Base Driving Simulator (MBDS) was validated for vehicle dynamics SA ratings. Furthermore, a tablet app to aid vehicle dynamics SA was developed and validated. Combined, this research encompasses a comprehensive method for a more effective and objective development process for vehicle dynamics. This has been done by increasing the understanding of OM, SA, and their relations, which enables more effective SA (key SA, MBDS, SA app), facilitates objective requirements and therefore CAE development, identifies key OM and their preferred ranges, and allows SA to be predicted solely based on OM. / QC 20170223 / iCOMSA
237

Gestion des connaissances et externalisation informatique. Apports managériaux et techniques pour l'amélioration du processus de transition : Cas de l’externalisation informatique dans un EPST / Knowledge Management and IT Outsourcing. Managerial and technical inputs to improve the transition process

Grim-Yefsah, Malika 23 November 2012 (has links)
This thesis deals with the issue of knowledge transfer during the transition process of an outsourced IT project in an EPST. In particular: how can the knowledge, experience, and routines related to the outsourced activities, accumulated over the duration of the project, be transferred efficiently from the outgoing team to the new incoming team?
We focus on the transition process because of its significance for outsourcing success, its complexity, its theoretical richness, and its limited current understanding. We chose to approach this problem through knowledge management. In the first part of this thesis, building on the Goal-Question-Metric paradigm, we propose an approach for defining quality metrics covering the given operational requirements. The metrics we define take tacit knowledge into account, using information from the structural analysis of the informal networks underlying the activities performed in the business process. These metrics make it possible to evaluate part of the quality of a business process while accounting for the tacit knowledge of the actors in the transition process. In the second part of this research, we developed a method, relying on the knowledge-capitalization approach and on theoretical mechanisms of knowledge transfer, together with a software tool to implement this knowledge-transfer process.
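Purely as an illustration of the kind of indicator such network analysis can yield (the thesis's actual GQM-derived metric definitions are not reproduced here): a simple redundancy measure over an informal network of activity holders, where an activity known to only one person is a single point of failure for the handover.

```python
# Hypothetical illustration: fraction of business-process activities whose
# tacit knowledge is held by more than one team member. Names and
# activities are invented for the example.
def knowledge_redundancy(activity_holders):
    """activity_holders: dict mapping activity -> set of people who hold
    its knowledge. Returns the fraction of activities with >1 holder."""
    redundant = sum(1 for people in activity_holders.values() if len(people) > 1)
    return redundant / len(activity_holders)

holders = {
    "deployment": {"alice", "bob"},
    "monitoring": {"alice"},          # single point of failure
    "billing-batch": {"carol", "bob"},
}
score = knowledge_redundancy(holders)
```

A low score flags activities whose knowledge must be deliberately transferred before the outgoing team leaves.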
238

Tónický verš a čeština / Tonic Verse and the Czech Language

Zindulková, Klára January 2014 (has links)
The chief subject of this work is a detailed description of the metrical structure of accentual verse and of its place in the system of versifications. Definitions of this metrical system in Czech verse theory are compared with international concepts of accentual verse. The category of isochrony is then described: isochrony is the property of language on which the principle of accentual verse is based. In the next part of the work I present an overview of the types of accentual verse in Polish, English, Russian, and German. Special attention is paid to the category of strict stress verse, its relation to accentual verse, and also to some Czech literary works. Further in the work I focus on the metrical analysis of texts by five Czech poets that are more or less based on the accentual principle. The most attention is paid to the drama Faëthon by O. Theer: I elaborate on the author's comments and on critical reviews by his contemporaries, comparing them with modern metrical descriptions. The last part is devoted to the problem of translating accentual verse into Czech, connecting this versification with "syllabotonic" translations of quantitative verse and with the oral character of accentual versification in general.
239

Měření kvality strojového překladu / Measures of Machine Translation Quality

Macháček, Matouš January 2014 (has links)
Title: Measures of Machine Translation Quality Author: Matouš Macháček Department: Institute of Formal and Applied Linguistics Supervisor: RNDr. Ondřej Bojar, Ph.D. Abstract: We explore both manual and automatic methods of machine translation evaluation. We propose a manual evaluation method in which annotators rank only translations of short segments instead of whole sentences. This results in easier and more efficient annotation. We have conducted an annotation experiment and evaluated a set of MT systems using this method. The obtained results are very close to the official WMT14 evaluation results. We also use the collected database of annotations to automatically evaluate new, unseen systems and to tune the parameters of a statistical machine translation system. The evaluation of unseen systems, however, does not work, and we analyze the reasons. To explore the automatic methods, we organized the Metrics Shared Task held during the Workshop on Statistical Machine Translation in 2013 and 2014. We report the results of the last shared task, discuss various metaevaluation methods, and analyze some of the participating metrics. Keywords: machine translation, evaluation, automatic metrics, annotation
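One simple way to turn collected rankings into system scores, in the spirit of the "expected wins" idea used in the WMT evaluations, is sketched below; the judgments are fabricated for illustration and the thesis's exact aggregation may differ:

```python
# Aggregate segment-level rank judgments into per-system scores as the
# fraction of pairwise comparisons each system wins (ties ignored).
from collections import defaultdict
from itertools import combinations

def expected_wins(rankings):
    """rankings: list of dicts {system: rank}, lower rank = better."""
    wins, comps = defaultdict(int), defaultdict(int)
    for judgment in rankings:
        for a, b in combinations(judgment, 2):
            if judgment[a] == judgment[b]:
                continue                      # tie: no comparison counted
            winner = a if judgment[a] < judgment[b] else b
            wins[winner] += 1
            comps[a] += 1
            comps[b] += 1
    return {s: wins[s] / comps[s] for s in comps}

# Two fabricated segment-level judgments over three systems.
judgments = [
    {"sysA": 1, "sysB": 2, "sysC": 3},
    {"sysA": 1, "sysB": 1, "sysC": 2},
]
scores = expected_wins(judgments)
```

Ranking short segments rather than whole sentences, as the thesis proposes, changes only what each judgment covers; the aggregation step stays the same.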
240

An analysis of the impact of data errors on backorder rates in the F404 engine system

Burson, Patrick A. R. 03 1900 (has links)
Approved for public release; distribution is unlimited. / In the management of the U.S. Naval inventory, data quality is of critical importance. Errors in major inventory databases contribute to increased operational costs, reduced revenue, and loss of confidence in the reliability of the supply system. Maintaining error-free databases is not a realistic objective. Data-quality efforts must be prioritized to ensure that limited resources are allocated to achieve the maximum benefit. This thesis proposes a methodology to assist the Naval Inventory Control Point in the prioritization of its data-quality efforts. By linking data errors to Naval inventory performance metrics, statistical testing is used to identify the errors that have the greatest adverse impact on inventory operations. By focusing remediation efforts on errors identified in this manner, the Navy can best use the limited resources devoted to improving data quality. Two inventory performance metrics are considered: Supply Material Availability (SMA), an established metric in Naval inventory management; and the Backorder Persistence Metric (BPM), which is developed in this thesis. Backorder persistence measures the duration of time for which the ratio of backorders to quarterly demand exceeds a threshold value. Both metrics can be used together to target remediation at reducing shortage costs and improving inventory system performance. / Lieutenant Commander, Supply Corps, United States Navy
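A minimal sketch of the Backorder Persistence Metric as described above, counting the quarters in which the backorder-to-demand ratio exceeds a threshold; the data and the 10% threshold are hypothetical, and the thesis's exact formulation may differ:

```python
# Backorder Persistence Metric sketch: number of quarters in which the
# backorder-to-demand ratio exceeds a threshold. Inputs are invented.
def backorder_persistence(backorders, quarterly_demand, threshold=0.10):
    """Count quarters where backorders / demand exceeds the threshold."""
    exceeded = 0
    for b, d in zip(backorders, quarterly_demand):
        if d > 0 and b / d > threshold:
            exceeded += 1
    return exceeded

# Hypothetical six quarters of data for one stocked item:
bpm = backorder_persistence(
    backorders=[2, 5, 9, 8, 1, 0],
    quarterly_demand=[40, 40, 50, 50, 45, 42],
)
```

Items with a high BPM would be candidates for the targeted data-quality remediation the thesis proposes, alongside their SMA values.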
