21

Aplikace matematických znalostí při výuce biologie / Application of mathematical knowledge in teaching biology

STUDENÁ, Lucie January 2018
The thesis deals with applications of mathematical knowledge in teaching biology and is divided into four chapters, each dedicated to a different application: 1. application of conditional probability in medical diagnostics, 2. application of the exponential function in population ecology, 3. application of logic functions in the mathematical modelling of a neuron, and 4. application of the binomial theorem and the binomial distribution in genetics. Each application contains solved problems, a worksheet for students and a solution for each worksheet. Two applications (1 and 2) were tested in teaching, and students filled in questionnaires to assess the lessons; the results of these questionnaires are analysed at the end of those chapters. The thesis can be used for teaching or self-study.
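As an illustration of the first application, here is a minimal Python sketch of conditional probability (Bayes' rule) in medical diagnostics; the prevalence and test-accuracy figures are invented for the example and are not taken from the thesis:

```python
# Conditional probability in medical diagnostics: P(disease | positive test)
# via Bayes' rule. All numbers below are illustrative assumptions.
prevalence = 0.01        # P(disease) in the tested population
sensitivity = 0.95       # P(positive | disease)
specificity = 0.90       # P(negative | no disease)

# Total probability of a positive result, over diseased and healthy cases.
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive

print(f"P(disease | positive) = {p_disease_given_positive:.3f}")  # ~0.088
```

Even with a fairly accurate test, the low prevalence makes the posterior probability of disease surprisingly small, which is exactly the kind of counter-intuitive result such worksheets exploit.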
22

Techniques for automated and interactive note sequence morphing of mainstream electronic music

Wooller, René William January 2007
Note sequence morphing is the combination of two note sequences to create a ‘hybrid transition’, or ‘morph’. The morph is a ‘hybrid’ in the sense that it exhibits properties of both sequences. The morph is also a ‘transition’, in that it can segue between them. An automated and interactive approach allows manipulation in real time by users who may control the relative influence of source or target and the transition length. The techniques developed through this research were designed particularly for popular genres of predominantly instrumental electronic music which I will refer to collectively as Mainstream Electronic Music (MEM). The research has potential for application within contexts such as computer games, multimedia, live electronic music, interactive installations and accessible music or “music therapy”. Musical themes in computer games and multimedia can morph adaptively in response to parameters in real time. Morphing can be used by electronic music producers as an alternative to mixing in live performance. Interactive installations and accessible music devices can utilise morphing algorithms to enable expressive control over the music through simple interface components. I have developed a software application called LEMorpheus which consists of software infrastructure for morphing and three alternative note sequence morphing algorithms: parametric morphing, probabilistic morphing and evolutionary morphing. Parametric morphing involves converting the source and target into continuous envelopes, interpolating between them, and converting the interpolated envelopes back into note sequences. Probabilistic morphing involves converting the source and target into probability matrices and seeding them with recent output to generate the next note. Evolutionary morphing involves iteratively mutating the source into multiple candidates and selecting those judged more similar to the target, until the target is reached. I formally evaluated the probabilistic morphing algorithm by gathering qualitative feedback from participants in a live electronic music situation, benchmarked against a live, professional DJ. The probabilistic algorithm was competitive, being favoured particularly for long morphs. The evolutionary morphing algorithm was formally evaluated using an online questionnaire, benchmarked against a human composer/producer. For particular samples, the morphing algorithm was competitive and occasionally seen as innovative; however, the morphs created by the human composer typically received more positive feedback, due to their coherent, large-scale structural changes, as opposed to the forced continuity of the morphing software.
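The probabilistic morphing idea (blending source and target note-transition statistics and sampling the next note from the blend) can be sketched in a few lines of Python. This is an illustrative reconstruction, not LEMorpheus's actual code; the pitch sequences, the fixed morph index and the first-order Markov model are assumptions made for the example:

```python
import numpy as np

def transition_matrix(seq, n_pitches):
    """First-order Markov transition counts from a note sequence, row-normalised."""
    m = np.ones((n_pitches, n_pitches)) * 1e-6   # tiny prior avoids zero rows
    for a, b in zip(seq[:-1], seq[1:]):
        m[a, b] += 1.0
    return m / m.sum(axis=1, keepdims=True)

def probabilistic_morph(source, target, length, morph_index, n_pitches=12, seed=0):
    """Sample a morph from an interpolated transition matrix.
    morph_index in [0, 1]: 0 -> pure source statistics, 1 -> pure target."""
    rng = np.random.default_rng(seed)
    ms = transition_matrix(source, n_pitches)
    mt = transition_matrix(target, n_pitches)
    m = (1 - morph_index) * ms + morph_index * mt  # convex blend stays stochastic
    note, out = source[-1], []
    for _ in range(length):
        note = rng.choice(n_pitches, p=m[note])    # seed next note on recent output
        out.append(int(note))
    return out

# Example: a morph halfway between two short pitch-class sequences.
print(probabilistic_morph([0, 4, 7, 4, 0, 4], [2, 5, 9, 5, 2, 5], 8, 0.5))
```

In a real-time setting the morph index would typically ramp from 0 to 1 over the transition length rather than stay fixed as it does here.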
23

Non-parametric synthesis of volumetric textures from a 2D sample / Méthodes non-paramétriques pour la synthèse de textures volumiques à partir d’un exemple 2D

Urs, Radu Dragos 29 March 2013
This thesis deals with the synthesis of anisotropic volumetric textures from a single 2D observation. We present variants of non-parametric, multi-scale algorithms whose main specificity lies in the fact that the 3D synthesis process relies on the sampling of a single 2D input sample, ensuring consistency across the different views of the 3D texture. Two types of approaches are investigated, both multi-scale and based on a Markovian hypothesis. The first category brings together a set of algorithms based on fixed-neighbourhood search, adapted from existing algorithms for texture synthesis from multiple 2D sources. The principle is that, starting from a random initialisation, the 3D texture is modified, voxel by voxel, in a deterministic manner, ensuring that the local grey-level configurations on the orthogonal slices containing the voxel are similar to configurations of the input image. The second category is an original probabilistic approach which aims at reproducing, in the textured volume, the interactions between pixels learned from the input image. The learning is done by non-parametric Parzen windowing, and the optimisation is handled voxel by voxel by a deterministic ICM-type algorithm. Several variants are proposed regarding the strategies used for the simultaneous handling of the orthogonal slices containing the voxel.
These synthesis methods are first applied to a set of structured textures of varied regularity and anisotropy. A comparative study and a sensitivity analysis are carried out, highlighting the strengths and weaknesses of the different algorithms. Finally, the methods are applied to the simulation of volumetric textures of carbon composite materials, from nanometre-scale images obtained by transmission electron microscopy. The proposed experimental benchmark makes it possible to evaluate the performance of the different methods quantitatively and objectively.
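A toy Python sketch of the fixed-neighbourhood search principle follows. It is not one of the thesis's algorithms: for brevity it averages the best exemplar matches of the three orthogonal slices, whereas the thesis compares several strategies for handling the slices simultaneously, and it omits the multi-scale scheme:

```python
import numpy as np

def best_match(patch, exemplar, k):
    """Return the exemplar pixel whose k x k neighbourhood is closest (L2) to patch."""
    h, w = exemplar.shape
    r, best, best_d = k // 2, 0.0, np.inf
    for i in range(r, h - r):
        for j in range(r, w - r):
            d = np.sum((exemplar[i-r:i+r+1, j-r:j+r+1] - patch) ** 2)
            if d < best_d:
                best, best_d = exemplar[i, j], d
    return best

def synthesize_volume(exemplar, size, k=3, n_pass=2, seed=0):
    """Toy fixed-neighbourhood search: each voxel is revisited deterministically so
    that its neighbourhoods on the three orthogonal slices resemble the exemplar."""
    rng = np.random.default_rng(seed)
    vol = rng.choice(exemplar.ravel(), size=(size, size, size))  # random init
    r = k // 2
    for _ in range(n_pass):
        for x in range(r, size - r):
            for y in range(r, size - r):
                for z in range(r, size - r):
                    # neighbourhoods in the three orthogonal slices through (x, y, z)
                    slices = [vol[x, y-r:y+r+1, z-r:z+r+1],
                              vol[x-r:x+r+1, y, z-r:z+r+1],
                              vol[x-r:x+r+1, y-r:y+r+1, z]]
                    # simplification: average the three best exemplar matches
                    vol[x, y, z] = np.mean([best_match(s, exemplar, k) for s in slices])
    return vol

ex = np.random.default_rng(1).random((16, 16))  # stand-in for a 2D exemplar slice
vol = synthesize_volume(ex, size=8)
```

The brute-force neighbourhood search makes this quadratic in the exemplar size; practical implementations accelerate it with approximate nearest-neighbour structures.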
24

Méthodes non-paramétriques pour l'apprentissage et la détection de dissimilarité statistique multivariée / Nonparametric methods for learning and detecting multivariate statistical dissimilarity

Lhéritier, Alix 23 November 2015
In this thesis, we study problems related to learning and detecting multivariate statistical dissimilarity, which are of paramount importance for the many statistical learning methods now used in an increasing number of fields. The thesis makes three contributions. The first introduces a notion of multivariate nonparametric effect size, shedding light on the nature of the dissimilarity detected between two datasets. Our two-step method first decomposes a dissimilarity measure (the Jensen-Shannon divergence) in order to localise the dissimilarity in the data embedding space, and then aggregates points of high discrepancy and spatial proximity into clusters. The second contribution presents the first sequential nonparametric two-sample test: instead of being given two sets of observations of fixed size, the test treats observations one at a time and can be stopped as soon as sufficiently strong evidence has been found, yielding a more flexible procedure while keeping guaranteed type I error control. Additionally, under certain conditions, the probability of a type II error vanishes as the number of observations tends to infinity. The third contribution is a sequential change-detection test based on two sliding windows on which a two-sample test is performed, with type I error guarantees. Our test has a controlled memory footprint and, as opposed to state-of-the-art methods that also provide type I error control, constant time complexity per observation, which makes it suitable for streaming data.
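A minimal Python sketch of the two-sliding-window change-detection scheme of the third contribution follows. The Kolmogorov-Smirnov test is used here as a stand-in for the thesis's sequential nonparametric two-sample test; unlike the thesis's method, this naive repeated testing has neither the type I error guarantee nor constant time per observation:

```python
import numpy as np
from scipy.stats import ks_2samp

def sliding_window_detector(stream, w=200, alpha=1e-3):
    """Compare the two most recent windows of width w with a two-sample test at
    each step; report the first index where the p-value drops below alpha."""
    buf = []
    for t, x in enumerate(stream):
        buf.append(float(x))
        if len(buf) >= 2 * w:
            buf = buf[-2 * w:]                 # controlled memory footprint
            ref, cur = buf[:w], buf[w:]
            if ks_2samp(ref, cur).pvalue < alpha:
                return t                       # first detection time
    return None                                # no change detected

rng = np.random.default_rng(1)
stream = np.concatenate([rng.normal(0, 1, 1000), rng.normal(1, 1, 1000)])
print(sliding_window_detector(stream))         # detects the shift soon after t = 1000
```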
25

以特徵向量法解條件分配相容性問題 / Solving compatibility issues of conditional distributions by eigenvector approach

顧仲航, Ku, Chung Hang Unknown Date
Given two conditional probability matrices A and B of two random variables, the issues of compatibility include: (a) how to determine whether they are compatible and, if so, how to check the uniqueness of the joint distribution or find all possible joint distributions; (b) if they are incompatible, how to measure how far they are from compatibility and find the most nearly compatible joint distribution. Several approaches to these problems exist, such as the ratio matrix method (Arnold and Press, 1989), the IBD matrix method (Song et al., 2010) and the mathematical programming method (Arnold et al., 2002). Inspired by these methods, the thesis develops an eigenvector approach to the compatibility issues. When A and B are compatible, it is observed that the marginal distributions are eigenvectors of AB′ and B′A corresponding to the eigenvalue 1. When checking compatibility via the marginal distributions, the eigenvector approach therefore only needs to examine marginal distributions that are eigenvectors of AB′ and B′A, which significantly reduces the workload. Using Perron's theorem and the concept of the IBD matrix, part (a) of the compatibility issues can be handled by the eigenvector approach. When A and B are incompatible, a simple way to measure the degree of incompatibility can also be derived from the eigenvector approach. In order to compare the most nearly compatible joint distributions given by different measures, the thesis proposes the deviation of the conditional distributions plus the deviation of the marginal distributions as the standard for assessing the most nearly compatible joint distribution. The eigenvector approach not only yields a formula for the most nearly compatible distribution but, in the examples examined, also provides a better joint distribution under this standard than the other measures; part (b) of the compatibility issues can thus also be handled by the eigenvector approach. Finally, the eigenvector approach is applied to two-person zero-sum finite games. In operations research, the strategies adopted by the two players are assumed to be independent; however, this assumption may not be appropriate, since both players can base their decisions on the information provided by the payoff table. If the strategies of both players are written as two conditional probability matrices, the game problem is converted into a compatibility problem. The concept of generalised compatibility can then be used to analyse game solutions and to discuss the solutions and the best strategies of both players under various measures.
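The eigenvector property is easy to verify numerically. Below is a minimal sketch, assuming the conventions A[x, y] = P(X = x | Y = y) and B[x, y] = P(Y = y | X = x); the thesis's indexing may differ:

```python
import numpy as np

rng = np.random.default_rng(0)
P = rng.random((3, 4)); P /= P.sum()      # a random joint distribution P(X, Y)
pX, pY = P.sum(axis=1), P.sum(axis=0)     # marginal distributions

A = P / P.sum(axis=0, keepdims=True)      # columns of A: P(X | Y=y)
B = P / P.sum(axis=1, keepdims=True)      # rows of B:    P(Y | X=x)

# When A and B are compatible, pX is an eigenvector of AB' and pY an
# eigenvector of B'A, both for the eigenvalue 1.
print(np.allclose(A @ B.T @ pX, pX))      # True
print(np.allclose(B.T @ A @ pY, pY))      # True
```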
26

Évolution des méthodes de gestion des risques dans les banques sous la réglementation de Bâle III : une étude sur les stress tests macro-prudentiels en Europe / Evolution of risk management methods in banks under Basel III regulation: a study on macroprudential stress tests in Europe

Dhima, Julien 11 October 2019
Our thesis explains, with some theoretical elements, the imperfections of the EBA/ECB macro-prudential stress tests, and proposes a new methodology for their application together with two complementary specific stress tests. We show that macro-prudential stress tests may be irrelevant when the two basic assumptions of the Gordy-Vasicek core model used to assess banks' regulatory capital for credit risk under the internal ratings-based (IRB) approach (an asymptotically granular credit portfolio, and a single source of systematic risk given by the macroeconomic environment) are not respected. First, there exist concentrated portfolios for which macro stress tests are not sufficient to measure potential losses, and are even ineffective when these portfolios involve non-cyclical counterparties. Second, systematic risk can come from several sources, and the current one-factor model prevents the proper propagation of "macro" shocks. We propose a specific credit stress test which makes it possible to capture the specific credit risk of a concentrated portfolio, and a specific liquidity stress test which makes it possible to measure the impact of liquidity shocks on the bank's solvency. We also propose a multifactorial generalisation of the IRB regulatory capital valuation model, which allows macro stress test shocks to be applied to each sectoral portfolio, stressing in a clear, precise and transparent way the systematic risk factors affecting it. This methodology propagates the shocks properly to the conditional probability of default of the counterparties in these portfolios and therefore yields a better evaluation of the bank's capital charge.
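For reference, the conditional probability of default in the one-factor Gordy-Vasicek model follows the standard formula PD(z) = Φ((Φ⁻¹(PD) − √ρ·z) / √(1 − ρ)). The Python sketch below implements it, together with a hedged multifactor variant in the spirit of the proposed generalisation; the thesis's exact formulation may differ:

```python
import numpy as np
from scipy.stats import norm

def conditional_pd_one_factor(pd, rho, z):
    """Standard Gordy-Vasicek conditional PD given a single systematic factor z
    (z < 0 corresponds to an adverse macro scenario)."""
    return norm.cdf((norm.ppf(pd) - np.sqrt(rho) * z) / np.sqrt(1 - rho))

def conditional_pd_multifactor(pd, loadings, z):
    """Illustrative multifactor variant: the single factor is replaced by a
    weighted sum of sector factors. Assumption for the sketch only; requires
    sum(loadings**2) < 1 so the idiosyncratic variance stays positive."""
    loadings, z = np.asarray(loadings), np.asarray(z)
    systematic = loadings @ z
    idio_var = 1.0 - loadings @ loadings
    return norm.cdf((norm.ppf(pd) - systematic) / np.sqrt(idio_var))

# A 1% unconditional PD under an adverse one-factor shock z = -3:
print(conditional_pd_one_factor(0.01, rho=0.20, z=-3.0))          # ~0.135
# A sectoral portfolio loaded on two systematic factors, both stressed:
print(conditional_pd_multifactor(0.01, [0.3, 0.2], [-3.0, -2.0]))
```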
