491

Reading Faces. Using Hard Multi-Task Metric Learning for Kernel Regression / Analyse de visages à l'aide d'une régularisation multi-tâches contrainte pour un apprentissage de métrique adaptée à un régresseur par noyaux

Nicolle, Jérémie 08 March 2016 (has links)
Recueillir et labelliser un ensemble important et pertinent de données pour apprendre des systèmes de prédiction d'informations à partir de visages est à la fois difficile et long. Par conséquent, les données disponibles sont souvent de taille limitée comparée à la difficulté des tâches. Cela rend le problème du sur-apprentissage particulièrement important dans de nombreuses applications d'apprentissage statistique liées au visage. Dans cette thèse, nous proposons une nouvelle méthode de régression de labels multi-dimensionnels, nommée Hard Multi-Task Metric Learning for Kernel Regression (H-MT-MLKR). Notre méthode a été développée en nous focalisant sur la réduction du phénomène de sur-apprentissage. La méthode Metric Learning for Kernel Regression, proposée par Kilian Q. Weinberger en 2007, vise à apprendre un sous-espace pour minimiser l'erreur quadratique d'un estimateur de Nadaraya-Watson sur la base d'apprentissage. Dans notre méthode, on étend la méthode MLKR pour une régression de labels multi-dimensionnels en ajoutant une nouvelle régularisation multi-tâches qui réduit les degrés de liberté du modèle appris ainsi que le sur-apprentissage. Nous évaluons notre méthode pour deux applications différentes, à savoir la localisation de points caractéristiques et la prédiction de l'intensité des Action Units. Nous présentons aussi un travail sur la prédiction des émotions en espace continu, basé lui aussi sur l'estimateur de Nadaraya-Watson. Deux des systèmes proposés nous ont permis de remporter deux premières places à des concours internationaux, à savoir l'Audio-Visual Emotion Challenge (AVEC'12) et le Facial Expression Recognition and Analysis challenge (FERA'15). / Collecting and labeling various and relevant data for training automatic facial information prediction systems is both hard and time-consuming. As a consequence, available data is often of limited size compared to the difficulty of the prediction tasks. This makes overfitting a particularly important issue in several face-related machine learning applications. In this PhD, we introduce a novel method for multi-dimensional label regression, namely Hard Multi-Task Metric Learning for Kernel Regression (H-MT-MLKR). Our proposed method has been designed with a particular focus on reducing overfitting. The Metric Learning for Kernel Regression (MLKR) method, proposed by Kilian Q. Weinberger in 2007, aims at learning a subspace for minimizing the quadratic training error of a Nadaraya-Watson estimator. In our method, we extend MLKR to multi-dimensional label regression by adding a novel multi-task regularization that reduces the degrees of freedom of the learned model along with potential overfitting. We evaluate our regression method on two different applications, namely landmark localization and Action Unit intensity prediction. We also present our work on automatic emotion prediction in a continuous space, which is likewise based on the Nadaraya-Watson estimator. Two of our frameworks won first place in international data science challenges, namely the Audio-Visual Emotion Challenge (AVEC'12) and the fully continuous Facial Expression Recognition and Analysis challenge (FERA'15).
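As a point of reference for the MLKR idea mentioned above, the sketch below shows a Nadaraya-Watson prediction computed in a learned linear subspace z = Ax. It is a minimal illustration written for this listing (the function names and toy data are ours), not code from the thesis, and it omits the multi-task regularization that H-MT-MLKR adds.

```python
import numpy as np

def nadaraya_watson(X_train, Y_train, x_query, A):
    """Nadaraya-Watson prediction with a Gaussian kernel computed in the
    learned subspace z = A @ x (the core idea behind MLKR)."""
    Z_train = X_train @ A.T                        # project training inputs
    z_query = A @ x_query                          # project the query point
    d2 = np.sum((Z_train - z_query) ** 2, axis=1)  # squared distances in the subspace
    w = np.exp(-d2)                                # Gaussian kernel weights
    w /= w.sum()
    return w @ Y_train                             # weighted average of (multi-dim) labels

# Toy usage: 2-D labels predicted from 5-D inputs through a 3-D subspace.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
Y = X[:, :2] + 0.1 * rng.normal(size=(50, 2))
A = rng.normal(size=(3, 5))            # in MLKR, A would be learned by minimizing
print(nadaraya_watson(X, Y, X[0], A))  # the leave-one-out quadratic training error
```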
492

On the Stability of Certain Riemannian Functionals

Maity, Soma January 2012 (has links) (PDF)
Given a compact smooth manifold Mn without boundary, n ≥ 3, the Lp-norm of the curvature tensor defines a Riemannian functional Rp on the space M1 of Riemannian metrics with unit volume. Considering the C2,α-topology on M1, Rp remains invariant under the action of the group of diffeomorphisms D of M, so Rp is defined on M1/D. Our first result is that Rp restricted to M1/D has strict local minima at Riemannian metrics with constant sectional curvature for certain values of p. Products of spherical space forms and products of compact hyperbolic manifolds are also critical points for Rp if the factors have the same dimension. We prove that these spaces are strict local minima for Rp restricted to M1/D. Compact locally symmetric isotropy irreducible metrics are critical points for Rp. We give a criterion for local minima of Rp restricted to the conformal class of a given irreducible symmetric metric. We also prove that metrics with constant bisectional curvature are strict local minima for Rp restricted to the space of Kähler metrics with unit volume, quotiented by D. Next, we consider the Riemannian functional given by the Lp-norm of the Ricci curvature. In [GV], M. J. Gursky and J. A. Viaclovsky studied the local properties of the moduli space of critical metrics for the functional Ric2; we generalize their results to any p > 0.
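For readers unfamiliar with the notation, the functionals in question can be written as follows; this is a standard formulation reconstructed from the abstract, not a quotation from the thesis.

```latex
% Lp-norm of the curvature tensor, restricted to unit-volume metrics
\mathcal{R}_p(g) \;=\; \int_M \lvert R(g) \rvert^p \, dv_g ,
\qquad g \in \mathcal{M}_1 \;=\; \{\, g \;:\; \operatorname{vol}(M,g) = 1 \,\}.

% The analogous Ricci functional; p = 2 is the case studied in [GV]
\operatorname{Ric}_p(g) \;=\; \int_M \lvert \operatorname{Ric}(g) \rvert^p \, dv_g .
```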
493

Trust & Security issues in Mobile banking and its effect on Customers

Bilal, Muhammad, Sankar, Ganesh January 2011 (has links)
Context: The invention of mobile phones has made human life easier. The purpose of this study is to identify security risks in mobile banking and to provide an authentication method for mobile banking transactions using a biometric mechanism. Objectives: Current mobile banking authentication is challenging and is identified as a major security risk. The literature review shows that customers distrust mobile banking due to security issues. The authors discuss security risks in current authentication methods in mobile banking. Methods: There are different methods and approaches to handle authentication in mobile banking. In this thesis, we propose a new approach to authentication in mobile banking. The strengths and weaknesses of existing authentication approaches are identified with the help of a literature review and interviews. The authors present a basic transaction model that includes the security risks. The literature review found that a fingerprint mechanism is a suitable method for authentication. The authors focus on the authentication method and present a biometric scanning device which can identify the customer's fingerprint, enabling the customer to access the mobile banking facility. Results: An authentication model is proposed through a design process. The proposed biometric design was validated by conducting a workshop. The analysis of the workshop's results showed that customers' trust in the security of mobile banking will be increased by a fingerprint mechanism. To promote mobile banking, it is necessary to improve customer trust in terms of security. Conclusions: The authors conclude that, by incorporating a biometric fingerprint mechanism, only authorized persons will be able to use mobile banking services. The literature review and interviews found that a fingerprint mechanism is more suitable than ordinary mechanisms such as login/password or SMS. / Using mobile phones for mobile banking, customers can push or pull details such as funds transfer, bill payment, share trade and check order, and also inquiries such as account balance, account statement, check status and transaction history. This means that the customer is interacting with the files, databases, etc. of the bank. The database at the server end is sensitive in terms of security. Customers distrust mobile devices for transferring money or making any transactions, because security is a major concern for them. The customers' main concern in using mobile devices for mobile banking is the authentication method used to ensure that the right person is accessing services such as transactions. The authors built a basic model for a mobile banking transaction and included all security risks in the transaction model. They then focused on the authentication method. From the literature review and interviews it was concluded that security can be improved by biometric methods. The authors examined different biometric mechanisms and concluded that a fingerprint mechanism is more suitable, as it requires less storage capacity in the database and identifies the uniqueness of customers. They suggest a possible solution by proposing a fingerprint-mechanism model and designing a biometric scanning device through which the customer can interact with the banking system using their fingerprint. The result of the workshop shows that a biometric fingerprint mechanism is more suitable and secure than other authentication methods for mobile banking.
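As a rough illustration of the kind of transaction flow the authors describe — authorize a mobile-banking transaction only after a fingerprint match — here is a minimal sketch. The class names, the hash-based template comparison and the sample data are our own simplifications, not the thesis design.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class Customer:
    account_id: str
    fingerprint_template: bytes   # enrolled template stored by the bank

def template_of(scan: bytes) -> bytes:
    # Placeholder for real fingerprint feature extraction; a hash stands in
    # for a template so the example stays self-contained.
    return hashlib.sha256(scan).digest()

def authenticate(customer: Customer, scan: bytes) -> bool:
    # Real matchers compare minutiae against a similarity threshold; exact
    # template equality is used here only to keep the sketch simple.
    return template_of(scan) == customer.fingerprint_template

def transfer(customer: Customer, scan: bytes, amount: float, to_account: str) -> str:
    if not authenticate(customer, scan):
        return "rejected: fingerprint mismatch"
    return f"transferred {amount:.2f} from {customer.account_id} to {to_account}"

enrolled_scan = b"ridge-pattern-of-alice"
alice = Customer("SE-1234", template_of(enrolled_scan))
print(transfer(alice, enrolled_scan, 250.0, "SE-9876"))    # accepted
print(transfer(alice, b"someone-else", 250.0, "SE-9876"))  # rejected
```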
494

Méthodes de résolution d’inclusions variationnelles sous hypothèses de stabilité / Methods for solving variational inclusions under stability assumptions

Burnet, Steeve 30 October 2012 (has links)
Dans cette thèse, nous nous intéressons à des inclusions de la forme 0 ∈ f(x) + F(x), où f est une application univoque et F est une application multivoque à graphe fermé. Ces dernières années, diverses méthodes de résolution d'inclusions de ce type ont été développées par les chercheurs et, après un bref rappel sur quelques notions d'analyse (univoque et multivoque), nous en présentons quelques-unes utilisant l'hypothèse de régularité métrique sur l'application multivoque. Dans la suite de notre travail, plutôt que d'utiliser cette hypothèse de régularité métrique, nous lui préférons des hypothèses directement liées à la solution, qui sont la semistabilité et l'hemistabilité. Notons que la semistabilité d'une solution x̅ de l'inclusion 0 ∈ G(x) est en fait équivalente à la sous-régularité métrique forte de l'application multivoque G en x̅ pour 0. Après avoir présenté des méthodes utilisant la semistabilité et l'hemistabilité, nous exposons les nouveaux résultats auxquels nous avons abouti, qui consistent essentiellement en des améliorations des méthodes présentées. Ce que nous entendons par améliorations se décline en deux points principaux : soit nous obtenons un meilleur taux de convergence, soit nous utilisons des hypothèses plus faibles qui nous permettent d'obtenir des taux de convergence similaires. / In this thesis, we focus on inclusions of the form 0 ∈ f(x) + F(x), where f is a single-valued function and F is a set-valued map with closed graph. In the last few years, various methods to solve such inclusions have been developed; after recalling some notions of (single-valued and set-valued) analysis, we present some of them that rely on a metric regularity assumption on the set-valued map. Then, instead of considering this metric regularity assumption, we prefer assumptions that are directly connected to the solution, namely semistability and hemistability. One can note that semistability of a solution x̅ of the inclusion 0 ∈ G(x) is actually equivalent to strong metric subregularity of the set-valued map G at x̅ for 0. After presenting some methods using semistability and hemistability, we present the new results we obtained, most of them being improvements of the presented methods. What we mean by improvement comes down to two main points: either we obtain a better convergence rate, or we use weaker assumptions that lead to a similar convergence rate.
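To make the objects in this abstract concrete, the block below recalls the inclusion, a Josephy-Newton-type partial linearization of it, and the usual definition of semistability (equivalently, strong metric subregularity, as the abstract states). These are standard formulations given for context, not statements copied from the thesis.

```latex
% Generalized equation to solve
0 \in f(x) + F(x), \qquad
f:\mathbb{R}^n \to \mathbb{R}^n \ \text{single-valued},\quad
F:\mathbb{R}^n \rightrightarrows \mathbb{R}^n \ \text{with closed graph}.

% Josephy--Newton-type iteration: linearize f, keep F as is
0 \in f(x_k) + Df(x_k)\,(x_{k+1}-x_k) + F(x_{k+1}).

% Semistability of a solution \bar{x} of 0 \in G(x)
% (equivalently, strong metric subregularity of G at \bar{x} for 0):
\exists\, c,\varepsilon > 0:\quad
\|x-\bar{x}\| \le c\, \operatorname{dist}\bigl(0, G(x)\bigr)
\quad \text{whenever } \|x-\bar{x}\| \le \varepsilon .
```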
495

Line element and variational methods for color difference metrics / Lignes géodésiques et méthodes différentielles pour les métriques de différence couleur

Pant, Dibakar Raj 17 February 2012 (has links)
Afin de pouvoir apparier de manière précise les couleurs, il est essentiel de prendre en compte la sensibilité visuelle à percevoir de petites différences de couleur. Les petites différences de couleur peuvent être mesurées par des ellipses qui décrivent les différences justes observables (just noticeable difference - JND). Ces ellipses décrivent la faculté du Système Visuel Humain à discriminer des couleurs très peu différentes. D'un point de vue mathématique, ces ellipses peuvent être modélisées par une fonction différentielle positive de forme quadratique, caractéristique de ce que l'on appelle communément une métrique Riemannienne. La métrique Riemannienne peut être considérée comme un outil utile pour évaluer l'adéquation, la robustesse et la précision d'un espace couleur ou d'une métrique couleur à décrire et mesurer correctement les différences de couleur telles qu'elles sont perçues par le Système Visuel Humain. L'une des particularités de cette métrique est qu'elle modélise la plus petite distance qui sépare deux couleurs dans un espace couleur par une ligne géodésique. Selon l'hypothèse de Schrödinger, les lignes géodésiques qui partent d'un point neutre d'une surface de luminosité constante décrivent des courbes de teinte constante. Les contours de chrominance (chroma) forment alors des courbes fermées à intervalles constants à partir de ce point neutre, situées à une distance constante des lignes géodésiques associées à ces teintes constantes. Cette hypothèse peut être utilisée pour tester la robustesse et la précision des formules mathématiques utilisées pour mesurer des différences couleur (color difference formulas) et pour prédire quelles valeurs peuvent prendre tel ou tel attribut perceptuel, ex. la teinte et la saturation (hue and chroma), ou telle distribution de stimulus couleur, dans n'importe quel espace couleur. Dans cette thèse, nous présentons une méthode qui permet de modéliser les éléments de ligne (lignes géodésiques) correspondant aux formules mathématiques Delta E*ab, Delta E*uv et OSA-UCS Delta EE utilisées pour mesurer des différences couleur, ainsi que les éléments de ligne correspondant à l'approximation infinitésimale du CIEDE2000. La pertinence de ces quatre formules mathématiques a été évaluée par comparaison, dans différents plans de représentation chromatique, des ellipses prédites et des ellipses expérimentalement obtenues par observation visuelle. Pour chacune de ces formules mathématiques, nous avons également testé l'hypothèse de Schrödinger, en calculant, à partir de la métrique Riemannienne, les lignes géodésiques de teinte et les contours de chroma associés, puis en comparant les courbes calculées dans l'espace couleur CIELAB avec celles obtenues dans le système Munsell. Les résultats que nous avons obtenus démontrent qu'aucune de ces formules mathématiques ne prédit précisément les différences de couleur telles qu'elles sont perçues par le Système Visuel Humain. Ils démontrent également que les deux dernières formules en date, OSA-UCS Delta EE et l'approximation infinitésimale du CIEDE2000, ne sont pas plus précises que les formules conventionnelles calculées à partir des espaces couleur CIELAB et CIELUV, quand on se réfère au système Munsell (Munsell color order system). / Visual sensitivity to small color difference is an important factor for precision color matching. Small color differences can be measured by the line element theory in terms of color distances between a color point and neighborhoods of points in a color space.
This theory gives a smooth, positive definite, symmetric metric tensor which describes thresholds of color differences by ellipsoids in three dimensions and ellipses in two dimensions. The metric tensor is also known as the Riemannian metric tensor. Regarding color differences, there are many color difference formulas and color spaces to predict the visual difference between two colors, but this is still challenging due to the nonexistence of a perfectly uniform color space. In such a case, the Riemannian metric tensor can be used as a tool to study the performance of various color spaces and color difference metrics for measuring perceptual color differences. It also gives the shortest length, or distance, between any two points in a color space; the curve of shortest length is called a geodesic. According to Schrödinger's hypothesis, geodesics starting from the neutral point of a surface of constant brightness correspond to the curves of constant hue. The chroma contours are closed curves at constant intervals from the origin, measured as the distance along the constant-hue geodesics. This hypothesis can be used to test the ability of color difference formulas to predict perceptual attributes (hue and chroma) and the distribution of color stimuli in any color space. In this research work, a method to formulate line element models of the color difference formulas ΔE*ab, ΔE*uv and OSA-UCS ΔEE, and of an infinitesimal approximation of CIEDE2000 (ΔE00), is presented. The Jacobian method is employed to transfer their Riemannian metric tensors into other color spaces. The coefficients of such metric tensors are used to compute ellipses in two dimensions. The performance of these four color difference formulas is evaluated by comparing computed ellipses with experimentally observed ellipses in different chromaticity diagrams. A method is also developed for comparing the similarity between a pair of ellipses; the technique works by calculating the ratio of the area of intersection to the area of union of the pair. Similarly, at a fixed value of lightness L*, hue geodesics originating from the achromatic point and their corresponding chroma contours for the above four formulas in the CIELAB color space are computed by solving the Euler-Lagrange equations in association with their Riemannian metrics. They are compared with the Munsell chromas and hue circles at the Munsell values 3, 5 and 7. The results show that none of the formulas matches the visual color difference data sets perfectly. However, the Riemannized ΔE00 and the ΔEE formulas measure the visual color differences better than the ΔE*ab and ΔE*uv formulas at the local level. It is interesting to note that the latest color difference formulas, such as the OSA-UCS ΔEE and the Riemannized ΔE00, do not predict hue geodesics and chroma contours better than the conventional CIELAB and CIELUV color difference formulas, and none of these formulas fits the Munsell data accurately.
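In standard notation (not quoted from the thesis), the line-element machinery used above looks as follows: a positive definite quadratic form defines the perceptual distance, the unit-threshold locus around a color center gives the JND ellipse, and the Jacobian rule transfers a metric tensor between color spaces.

```latex
% Two-dimensional line element: a positive definite quadratic form
ds^{2} \;=\; g_{11}\,dx^{2} + 2\,g_{12}\,dx\,dy + g_{22}\,dy^{2}
\;=\; d\mathbf{x}^{\mathsf T} G\, d\mathbf{x},
\qquad
G = \begin{pmatrix} g_{11} & g_{12}\\ g_{12} & g_{22} \end{pmatrix}.

% The JND ellipse around a color center is the unit-threshold locus
d\mathbf{x}^{\mathsf T} G\, d\mathbf{x} \;=\; 1 .

% Jacobian rule for transferring the metric tensor to new coordinates u = u(x)
G_{u} \;=\; J^{\mathsf T} G_{x}\, J,
\qquad J \;=\; \frac{\partial (x_{1},x_{2})}{\partial (u_{1},u_{2})}.
```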
496

[en] A BLUEPRINT-BASED APPROACH FOR PRIORITIZING AND RANKING CRITICAL CODE ANOMALIES / [pt] UMA ABORDAGEM BASEADA EM BLUEPRINTS PARA PRIORIZAÇÃO E CLASSIFICAÇÃO DE ANOMALIAS DE CÓDIGO CRÍTICAS

EVERTON TAVARES GUIMARAES 17 January 2017 (has links)
[pt] Sistemas de software estão evoluindo frequentemente devido a diversas requisições de mudanças. À medida que o software evolui, seu tamanho e complexidade aumentam e, consequentemente, sua arquitetura tende a se degradar. Sintomas de degradação arquitetural são muitas vezes uma consequência direta da inserção progressiva de anomalias de código. Uma anomalia de código é uma estrutura de implementação recorrente que possivelmente indica problemas mais severos no projeto arquitetural. Uma anomalia de código é considerada crítica quando está relacionada a problemas estruturais na arquitetura do software. Sua criticidade origina-se da sua influência negativa em uma ampla gama de requisitos não-funcionais. Por exemplo, a presença de anomalias de código críticas dificulta a manutenibilidade do software, ex. uma grande refatoração pode ser necessária para remover um problema arquitetural. Diversas abordagens têm sido propostas para a detecção de anomalias em sistemas de software, mas nenhuma delas suporta eficientemente a priorização e classificação de anomalias de código críticas de acordo com seu impacto na arquitetura. O presente trabalho investiga como a priorização e classificação dessas anomalias críticas de código pode ser melhorada com o uso de blueprints arquiteturais. Blueprints arquiteturais são providos pelo arquiteto de software desde estágios iniciais de desenvolvimento do sistema. Blueprints são modelos de projeto informais normalmente definidos para capturar e comunicar as principais decisões de projeto arquitetural. Embora blueprints normalmente sejam incompletos e inconsistentes com respeito à implementação do sistema, eles podem contribuir para o processo de priorização e classificação de anomalias de código críticas. Com o intuito de alcançar nossos objetivos de pesquisa, um conjunto de estudos empíricos foi realizado. O trabalho também propõe e avalia um conjunto de heurísticas para auxiliar desenvolvedores na priorização e classificação de anomalias de código em 3 sistemas de software. Os resultados mostraram uma acurácia média de mais de 60 por cento na priorização e classificação de anomalias de código associadas com problemas arquiteturais nesses sistemas. / [en] Software systems are often evolving due to many changing requirements. As the software evolves, it grows in size and complexity and, consequently, its architecture design tends to degrade. Architecture degradation symptoms are often a direct consequence of the progressive insertion of code anomalies in the software implementation. A code anomaly is a recurring implementation structure that possibly indicates deeper architectural design problems. A code anomaly is considered critical when it is related to a structural problem in the software architecture. Its criticality stems from its negative influence on a wide range of non-functional requirements. For instance, the presence of critical code anomalies hinders software maintainability, i.e., these critical anomalies require wide refactoring in order to remove an architectural problem. Symptoms of architecture degradation often have to be observed in the source code due to the lack of an explicit, formal representation of the software architecture in a project. Many approaches have been proposed for detecting code anomalies in software systems, but none of them efficiently supports the prioritization and ranking of critical code anomalies according to their architectural impact.
Our work investigates how the prioritization and ranking of such critical code anomalies could be improved by using blueprints. Architecture blueprints are usually provided by software architects from the early stages of system development. Blueprints are informal design models usually defined to capture and communicate key architectural design decisions. Even though blueprints are often incomplete and inconsistent with respect to the underlying implementation, we aim to study whether their use can contribute to improving the processes of prioritizing and ranking critical code anomalies. To address these research goals, a set of empirical studies has been performed. We also proposed and evaluated a set of heuristics to support developers when prioritizing and ranking code anomalies in 3 software systems. The results showed an average accuracy higher than 60 percent when prioritizing and ranking code anomalies associated with architectural problems in these systems.
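The abstract does not spell out the proposed heuristics, so the sketch below is only a hypothetical illustration of the general idea — rank a detected code anomaly higher when the component it lives in is emphasized in the architecture blueprint — with all names, weights and data invented for the example.

```python
from dataclasses import dataclass

@dataclass
class CodeAnomaly:
    kind: str          # e.g. "God Class", "Feature Envy"
    element: str       # code element where the anomaly was detected
    component: str     # architectural component the element maps to

# Hypothetical blueprint information: components the architect marked as
# central, plus which components each one communicates with.
BLUEPRINT_KEY_COMPONENTS = {"PaymentService", "AccountManager"}
BLUEPRINT_DEPENDENCIES = {
    "PaymentService": {"AccountManager", "AuditLog"},
    "AccountManager": {"CustomerRepository"},
}

def architecture_score(anomaly: CodeAnomaly) -> int:
    """Toy prioritization heuristic: anomalies in blueprint-critical or
    highly connected components are more likely to be architecturally relevant."""
    score = 0
    if anomaly.component in BLUEPRINT_KEY_COMPONENTS:
        score += 2
    score += len(BLUEPRINT_DEPENDENCIES.get(anomaly.component, ()))
    return score

anomalies = [
    CodeAnomaly("God Class", "PaymentProcessor", "PaymentService"),
    CodeAnomaly("Long Method", "ReportFormatter", "Reporting"),
]
for a in sorted(anomalies, key=architecture_score, reverse=True):
    print(architecture_score(a), a.kind, a.element)
```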
497

Rigidity And Regularity Of Holomorphic Mappings

Balakumar, G P 07 1900 (has links) (PDF)
We deal with two themes that are illustrative of the rigidity and regularity of holomorphic mappings. The first concerns the regularity of continuous CR mappings between smooth pseudoconvex, finite type hypersurfaces, a well-studied subject since it is linked with the problem of studying the boundary behaviour of proper holomorphic mappings between domains bounded by such hypersurfaces. More specifically, we study the regularity of Lipschitz CR mappings from an h-extendible (or semi-regular) hypersurface in Cn. Under various assumptions on the target hypersurface, it is shown that such mappings must be smooth. A rigidity result for proper holomorphic mappings from strongly pseudoconvex domains is also proved. The second theme is the classification, up to biholomorphic equivalence, of model domains with abelian automorphism group in C3. It is shown that every model domain, i.e., a hyperbolic rigid polynomial domain in C3 of finite type, with abelian automorphism group is equivalent to a domain that is balanced with respect to some weight.
498

Analýza platformy SAP NetWeaver Business Intelligence 2004s / Analysis of SAP NetWeaver Business Intelligence 2004s

Michna, Petr January 2008 (has links)
This diploma thesis deals with the analysis and evaluation of one of the most important Business Intelligence platforms on the BI market -- SAP NetWeaver Business Intelligence 2004s. The main objective of the analysis is to provide the reader with a comprehensive theoretical and practical overview of the platform, based among other things on the implementation of a SAP NetWeaver BI sample application. The sample application not only allows a better understanding of the platform but also provides significant input to the final evaluation phase. This evaluation is based on a system of metrics, which is also defined as part of this thesis. Besides a quantitative interpretation of the analyzed platform's quality, the evaluation result is included in a quality comparison (in some areas) with the Oracle and open-source Pentaho BI platforms, whose evaluation and comparison were carried out in the diploma thesis [VÁLEK, 2008]. The thesis concludes with an overview of the strengths and weaknesses of SAP NetWeaver Business Intelligence 2004s.
499

Model metrik ve strojírenské firmě a jeho pilotní implementace pomocí nástrojů Business Intelligence / Metrics Model in an Engineering Company and Its Initial Implementation in Business Intelligence Tools

Lukeš, Ondřej January 2009 (has links)
This thesis deals with the possibilities of strategic performance measurement in a Czech industrial firm and the implementation of selected metrics using Business Intelligence tools. Specifically, the firm is Linde Pohony s.r.o., which produces axles and half-axles for forklifts. The thesis is based on contemporary information about strategic performance management and its tools. Its main aim is to demonstrate the possibilities of using Business Intelligence to measure key performance metrics; Microsoft SQL Server 2005, with its Analysis Services and Integration Services, is used to demonstrate this. First, it was necessary to create a model of metrics. The first part of the thesis deals with several methods of measuring a firm's performance, and a suitable method is then chosen. The following chapters discuss how Business Intelligence tools can contribute to industrial firms. The actual implementation of a few metrics is then described, together with an outline of the firm's software architecture. The main contribution of this thesis lies in demonstrating the power of Business Intelligence tools when used together with the Balanced Scorecard methodology in a small or medium-sized Czech industrial firm. Linde Pohony s.r.o. can also profit from this work, as many administrative tasks become easier and faster and management gains a much better view of the firm. Keywords: Business Intelligence, industrial firm, Balanced Scorecard, Six Sigma, Theory of Constraints, strategic performance management, metric
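As a hypothetical illustration of the kind of Balanced-Scorecard metric such a BI layer might compute (the KPI, target value and data below are invented, not taken from the thesis or from Linde Pohony s.r.o.):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Order:
    order_id: str
    promised: date
    delivered: date

def on_time_delivery_rate(orders: list[Order]) -> float:
    """Share of orders delivered no later than promised -- a typical
    customer-perspective KPI in a Balanced Scorecard."""
    on_time = sum(o.delivered <= o.promised for o in orders)
    return on_time / len(orders)

orders = [
    Order("A-001", date(2009, 3, 1), date(2009, 2, 27)),
    Order("A-002", date(2009, 3, 5), date(2009, 3, 9)),
    Order("A-003", date(2009, 3, 8), date(2009, 3, 8)),
]
rate = on_time_delivery_rate(orders)
target = 0.95  # invented target value
print(f"on-time delivery: {rate:.0%} (target {target:.0%})")
```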
500

Modelování podnikových procesů / Modeling business processes

Skala, Jakub January 2008 (has links)
The diploma thesis deals with the modeling of business processes, with a focus on metrics at process interfaces. The aim of the thesis is to design an architecture of metrics that makes it possible to construct type metrics for process interfaces. First, the reader is introduced to the basic terms related to process modeling, process management and the connection of strategy with metrics. The next part describes the created metrics architecture, which is oriented toward supporting the process management of the organization. The set of type metrics was chosen with regard to the possibility of placing them on process interfaces; a set created in this way should help connect strategy with processes when implemented in an organization. The last part of the thesis describes the practical application of the knowledge obtained in the PARMA project, which deals with the design and implementation of process management at the Bureau of the Municipality of Prague 10. The thesis covers two of its processes, to which the conclusions from the theoretical part are applied.
