261

Desigualdade regional de renda e migrações : mobilidade intergeracional educacional e intrageracional de renda no Brasil / Regional income inequality and migration: intergenerational educational mobility and intragenerational income mobility in Brazil

Netto Junior, José Luis da Silva January 2008 (has links)
This thesis analyzes the relationship between educational variables and income inequality in Brazil and its repercussions for intergenerational educational mobility and intragenerational income mobility. The specific objective is to examine how intergenerational educational mobility and intragenerational income mobility differ across regions and how they differ between migrants and non-migrants. The results suggest that income inequality and human capital inequality have a positive, non-linear relationship. In areas where the indicator of human capital inequality is higher, the influence of parents in the lowest educational strata is large compared with regions where educational inequality is lower. In general, in the poorer regions and states, less qualified parents have a greater influence on the educational trajectory of their children. At the same time, the region whose states show the highest indicators of educational inequality also presents the lowest income mobility among the regions analyzed: income mobility is higher in the Center and Southeast regions and lower in the Northeast. Migrant parents with low schooling have less influence on their children's education than their counterparts in the areas of origin. Finally, migrants show higher income mobility than the population of their areas of origin, which suggests positive selectivity among migrants.
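The abstract reports findings rather than methods, but intergenerational educational mobility of the kind discussed above is commonly summarized with a parent-child transition matrix. The sketch below is a hypothetical illustration of that summary (not the author's procedure), using made-up schooling categories in Python:

```python
import pandas as pd

# Made-up sample: schooling category of parent and child (0 = none, 1 = primary,
# 2 = secondary, 3 = higher); a real analysis would use household survey microdata.
data = pd.DataFrame({
    "parent": [0, 0, 1, 1, 2, 2, 3, 3, 0, 1, 2, 3, 0, 1, 2, 3],
    "child":  [0, 1, 1, 2, 2, 3, 3, 3, 1, 2, 3, 3, 0, 1, 2, 2],
})

# Row-normalised transition matrix: P(child attains column level | parent at row level).
# A heavy diagonal in the low-schooling rows indicates low upward mobility.
transition = pd.crosstab(data["parent"], data["child"], normalize="index")
print(transition.round(2))
```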
262

Eigen-birds : Exploring avian morphospace with image analytic tools

Thuné, Mikael January 2012 (has links)
The plumage colour and patterns of birds have interested biologists for a long time. Why are some bird species all black while others have a multitude of colours? Does it have anything to do with sexual selection, predator avoidance or social signalling? Many questions such as these have been asked and as many hypotheses about the functional role of the plumage have been formed. The problem, however, has been to prove any of these. To test these hypotheses you need to analyse the bird plumages, and today such analyses are still rather subjective, meaning the results could vary depending on the individual performing the analysis. Another problem that stems from this subjectiveness is that it is difficult to make quantitative measurements of the plumage colours. Quantitative measurements would be very useful since they could be related to other statistical data like speciation rates, sexual selection and ecological data. This thesis aims to assist biologists with the analysis and measurement of bird plumages by developing a MATLAB toolbox for this purpose. The result is a well-structured and user-friendly toolbox that contains functions for segmenting, resizing, filtering and warping, all used to prepare the images for analysis. It also contains functions for the actual analysis, such as basic statistical measurements, principal component analysis and eigenvector projection.
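The toolbox itself is written in MATLAB and is not reproduced here; as a rough illustration of the eigenvector-projection step that gives the thesis its name, here is a minimal Python/NumPy sketch on random stand-in images (the function name and parameters are hypothetical):

```python
import numpy as np

def eigen_projection(images, n_components=10):
    """Project flattened, equally-sized images onto their top principal components.

    images: (n_samples, n_pixels) array of pre-segmented, aligned plumage images.
    Returns the mean image, the top eigenvectors ("eigen-birds"), the scores and
    the fraction of variance each component explains.
    """
    mean = images.mean(axis=0)
    centered = images - mean
    # SVD of the centred data gives the principal axes without forming the full covariance matrix
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:n_components]          # each row is an "eigen-bird"
    scores = centered @ components.T        # coordinates of each image in morphospace
    explained = (s[:n_components] ** 2) / np.sum(s ** 2)
    return mean, components, scores, explained

# Example with random stand-ins for real plumage images (50 birds, 64x64 pixels)
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_images = rng.random((50, 64 * 64))
    mean, comps, scores, var = eigen_projection(fake_images, n_components=5)
    print(scores.shape, var.round(3))
```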
263

Metody konstrukce výnosové křivky státních dluhopisů na českém dluhopisovém trhu / Methods for construction of zero-coupon yield curve from the Czech coupon bond market

Hladíková, Hana January 2008 (has links)
The zero-coupon yield curve is one of the most fundamental tools in finance and is essential for pricing various fixed-income securities. Zero-coupon rates are not observable in the market for the full range of maturities, so an estimation methodology is required to derive zero-coupon yield curves from observable data. When approximating empirical data to create yield curves, it is necessary to choose suitable mathematical functions. We discuss the following methods: methods based on cubic spline functions, methods employing a linear combination of Fourier or exponential basis functions, and the parametric model of Nelson and Siegel. The mathematical apparatus currently employed for this kind of approximation is outlined. To find the parameters of the models we use least-squares minimization of the differences between computed and observed prices. The theoretical background is applied to the estimation of zero-coupon yield curves derived from the Czech coupon bond market. The application of proper smoothing functions and bond weights is crucial for selecting the method that performs best according to the given criteria. The best performance is obtained for B-spline models with smoothing.
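As a small illustration of one of the parametric models mentioned, the Nelson-Siegel form can be fitted by least squares as sketched below. The maturities and rates are invented, and the thesis itself minimises errors in bond prices rather than in yields:

```python
import numpy as np
from scipy.optimize import least_squares

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Nelson-Siegel zero-coupon yield for maturity tau (in years)."""
    x = tau / lam
    loading = (1 - np.exp(-x)) / x
    return beta0 + beta1 * loading + beta2 * (loading - np.exp(-x))

# Synthetic "observed" zero rates at a few maturities (illustrative values only)
maturities = np.array([0.5, 1, 2, 3, 5, 7, 10, 15])
observed = np.array([0.020, 0.022, 0.025, 0.027, 0.030, 0.032, 0.034, 0.035])

def residuals(params):
    return nelson_siegel(maturities, *params) - observed

# Keep the decay parameter lambda positive via simple bounds
fit = least_squares(residuals, x0=[0.03, -0.01, 0.01, 2.0],
                    bounds=([-1, -1, -1, 0.05], [1, 1, 1, 30]))
print("beta0, beta1, beta2, lambda:", fit.x.round(4))
```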
264

Extending covariance structure analysis for multivariate and functional data

Sheppard, Therese January 2010 (has links)
For multivariate data, when testing homogeneity of covariance matrices arising from two or more groups, Bartlett's (1937) modified likelihood ratio test statistic is appropriate under the null hypothesis of equal covariance matrices, where the null distribution of the test statistic is based on the restrictive assumption of normality. Zhang and Boos (1992) provide a pooled bootstrap approach when the data cannot be assumed to be normally distributed. We give three alternative bootstrap techniques for testing homogeneity of covariance matrices when it is inappropriate to pool the data into a single population, as in the pooled bootstrap procedure, and the data are not normally distributed. We further show that our alternative bootstrap methodology can be extended to testing Flury's (1988) hierarchy of covariance structure models. Where deviations from normality exist, we show, by simulation, that the normal theory log-likelihood ratio test statistic is less viable than our bootstrap methodology. For functional data, Ramsay and Silverman (2005) and Lee et al. (2002) together provide four computational techniques for functional principal component analysis (PCA) followed by covariance structure estimation. When individual profiles are smoothed using least-squares cubic B-splines or regression splines, we find that the ensuing covariance matrix estimate suffers from loss of dimensionality. We show that ridge regression can be used to resolve this problem, but only for the discretisation and numerical quadrature approaches to estimation, and that choice of a suitable ridge parameter is not arbitrary. We further show the unsuitability of regression splines when deciding on the optimal degree of smoothing to apply to individual profiles. To gain insight into smoothing parameter choice for functional data, we compare kernel and spline approaches to smoothing individual profiles in a nonparametric regression context. Our simulation results justify a kernel approach using a new criterion based on predicted squared error. We also show by simulation that, when taking account of correlation, a kernel approach using a generalized cross-validatory criterion performs well. These data-based methods for selecting the smoothing parameter are illustrated prior to a functional PCA on a real data set.
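The three alternative bootstrap procedures proposed in the thesis are not spelled out in the abstract; the sketch below only illustrates the ingredients it builds on, namely Bartlett's modified statistic and a pooled bootstrap in the spirit of Zhang and Boos (1992), applied to hypothetical group data:

```python
import numpy as np

def bartlett_stat(groups):
    """Bartlett's modified likelihood ratio statistic for equality of covariance matrices."""
    k = len(groups)
    ns = np.array([g.shape[0] for g in groups])
    covs = [np.cov(g, rowvar=False) for g in groups]
    pooled = sum((n - 1) * c for n, c in zip(ns, covs)) / (ns.sum() - k)
    stat = (ns.sum() - k) * np.linalg.slogdet(pooled)[1]
    for n, c in zip(ns, covs):
        stat -= (n - 1) * np.linalg.slogdet(c)[1]
    return stat

def pooled_bootstrap_pvalue(groups, n_boot=999, seed=0):
    """Pooled bootstrap p-value in the spirit of Zhang and Boos (1992):
    each group is centred, the centred data are pooled, and bootstrap samples
    of the original group sizes are drawn from the pool under the null."""
    rng = np.random.default_rng(seed)
    observed = bartlett_stat(groups)
    centred = np.vstack([g - g.mean(axis=0) for g in groups])
    ns = [g.shape[0] for g in groups]
    count = 0
    for _ in range(n_boot):
        boot_groups = [centred[rng.integers(0, len(centred), n)] for n in ns]
        if bartlett_stat(boot_groups) >= observed:
            count += 1
    return (count + 1) / (n_boot + 1)

# Example with three hypothetical groups of 4-dimensional data
rng = np.random.default_rng(1)
groups = [rng.normal(size=(40, 4)), rng.normal(size=(50, 4)), rng.normal(scale=1.3, size=(45, 4))]
print("bootstrap p-value:", pooled_bootstrap_pvalue(groups, n_boot=200))
```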
265

Compression multimodale du signal et de l’image en utilisant un seul codeur / Multimodal compression of digital signal and image data using a unique encoder

Zeybek, Emre 24 March 2011 (has links)
The objective of this thesis is to study and analyze a new compression strategy whose principle is to compress data from several modalities jointly, using a single encoder. This approach is called "Multimodal Compression". In this framework, an image and an audio signal can be compressed together by an image encoder alone (e.g. a standard codec), without the need to integrate an audio codec. The basic idea developed in this thesis is to insert the samples of a signal in place of certain pixels of the "carrier" image while preserving the quality of the information after the encoding and decoding process. This technique should not be confused with watermarking or steganography, since the aim is not to hide one piece of information inside another. The two main objectives of Multimodal Compression are, on the one hand, to improve compression performance in terms of rate-distortion and, on the other, to optimize the use of the hardware resources of a given embedded system (e.g. by accelerating encoding/decoding time). Throughout this work we study and analyze variants of Multimodal Compression whose core consists in designing mixing functions applied before encoding and separation functions applied after decoding. The approach is validated on standard images and signals as well as on specific data such as biomedical images and signals. The work concludes with an extension of the Multimodal Compression strategy to video.
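The abstract does not give the actual mixing and separation functions; as a toy illustration of the general idea (inserting audio samples in place of carrier pixels before a standard image codec), here is a hypothetical Python sketch for 8-bit grayscale images:

```python
import numpy as np

def mix(image, audio):
    """Toy mixing function: overwrite the last rows of an 8-bit grayscale image
    with quantised audio samples, so a single image codec can compress both.
    This only illustrates the idea; the thesis's actual mixing functions are
    more elaborate and explicitly preserve carrier quality."""
    img = image.copy()
    h, w = img.shape
    n_rows = int(np.ceil(audio.size / w))
    if n_rows >= h:
        raise ValueError("audio too long for this carrier image")
    samples = np.zeros(n_rows * w, dtype=np.uint8)
    samples[:audio.size] = np.round((audio + 1) * 127.5).astype(np.uint8)  # [-1, 1] -> 8-bit
    img[h - n_rows:, :] = samples.reshape(n_rows, w)
    return img, n_rows

def separate(mixed, n_rows, n_samples):
    """Inverse of mix(): recover the (lossy) audio samples and the cropped carrier image."""
    h, w = mixed.shape
    samples = mixed[h - n_rows:, :].reshape(-1)[:n_samples]
    audio = samples.astype(np.float64) / 127.5 - 1
    return mixed[:h - n_rows, :], audio

# Toy usage with random data (a real use would feed `mixed` to a standard image codec)
rng = np.random.default_rng(0)
image = rng.integers(0, 256, (128, 128), dtype=np.uint8)
audio = np.sin(np.linspace(0, 40 * np.pi, 2000))
mixed, rows = mix(image, audio)
carrier, recovered = separate(mixed, rows, audio.size)
print(mixed.shape, carrier.shape, np.abs(recovered - audio).max())
```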
266

Machine learning methods for seasonal allergic rhinitis studies

Feng, Zijie January 2021 (has links)
Seasonal allergic rhinitis (SAR) is a disease caused by allergens, with both environmental and genetic factors involved. Some researchers have studied SAR with traditional genetic methodologies. As technology has developed, a new technique called single-cell RNA sequencing (scRNA-seq) has emerged, which can generate high-dimensional data. We apply two machine learning (ML) algorithms, random forest (RF) and partial least squares discriminant analysis (PLS-DA), for cell-source classification and gene selection based on SAR scRNA-seq time-series data from three allergic patients and four healthy controls, denoised by single-cell variational inference (scVI). We additionally propose a new fitting method, consisting of the bootstrap and cubic smoothing splines, to fit the averaged gene expression per cell from different populations. In summary, we find that both RF and PLS-DA provide high classification accuracy, with RF preferable given its stable performance and strong gene-selection ability. Based on our analysis, 10 genes have discriminatory power to classify cells of allergic patients and healthy controls at any time point. Although no literature was found showing direct connections between these 10 genes and SAR, the potential associations are indirectly supported by some studies. This suggests the possibility of alerting allergic patients before a disease outbreak based on their genetic information. Meanwhile, our experimental results indicate that ML algorithms may discover relationships between genes and SAR that traditional techniques miss, which needs to be analyzed genetically in the future.
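As an illustration of the random forest part of such a pipeline (not the thesis's exact setup, which also uses PLS-DA and scVI denoising), the sketch below classifies cells and ranks genes by importance with scikit-learn; the expression matrix is a random stand-in for real data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical stand-in for a denoised expression matrix: 1000 cells x 200 genes,
# labelled 0 = healthy control, 1 = allergic patient.
rng = np.random.default_rng(0)
X = rng.random((1000, 200))
y = rng.integers(0, 2, 1000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(X_train, y_train)
print("accuracy:", rf.score(X_test, y_test))

# Rank genes by impurity-based importance as a simple gene-selection step
top10 = np.argsort(rf.feature_importances_)[::-1][:10]
print("top 10 gene indices:", top10)
```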
267

Application des méthodes de partitionnement de données fonctionnelles aux trajectoires de voiture / Application of functional data clustering methods to car trajectories

Paul, Alexandre 08 1900 (has links)
Classification and clustering of longitudinal functional data have made considerable progress in recent years, and several methods with promising results have been proposed. This Master's thesis compares the behaviour of clustering algorithms on a dataset describing car trajectories at an intersection in Montreal. The motivation is that manual classification of these data is costly and time-consuming, and we show that adequate clustering and prediction results can be obtained with several algorithms. Among the methods used, distclust takes a K-means approach with a notion of distance between functional curves, and mclust performs clustering with Gaussian mixture densities. Since these two approaches are general-purpose rather than designed specifically for functional classification, we also apply four functional clustering methods: fitfclust, funmbclust, funclust and funHDDC. We show that the clustering and prediction results obtained with these functional approaches are comparable to those of the distance-based ones. The functional methods are preferable because they allow objective selection criteria such as the AIC and the BIC, so a pre-established partition is not needed to validate the quality of the algorithms and the data can speak for themselves. Finally, they provide detailed estimates of the functional structure of the curves, for example on the impact of dimension reduction through a multivariate functional principal component analysis.
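The packages compared in the thesis (distclust, mclust, fitfclust, funmbclust, funclust, funHDDC) are not reproduced here; as a rough illustration of the distance-based idea, the Python sketch below clusters trajectory curves sampled on a common time grid with ordinary K-means, using made-up data:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 50)

# Made-up trajectories evaluated on a common time grid: two rough groups of curves
group_a = [np.sin(2 * np.pi * t) + rng.normal(0, 0.1, t.size) for _ in range(30)]
group_b = [np.cos(2 * np.pi * t) + rng.normal(0, 0.1, t.size) for _ in range(30)]
curves = np.vstack(group_a + group_b)

# With curves sampled on a common grid, Euclidean distance between rows approximates
# the L2 distance between the underlying functions, so ordinary K-means applies.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(curves)
print("cluster sizes:", np.bincount(km.labels_))
```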
268

Modul pro generování "atomů" pro přeparametrizovanou reprezentaci signálu / Software module generating "atoms" for purposes of overcomplete signal representation

Špiřík, Jan January 2010 (has links)
The aim of this master's thesis is to generate new "atoms" for overcomplete signal representation in the Frames toolbox for MATLAB. It first describes the principle of overcomplete systems and so-called frames, introduces the basic classification of frames and the conditions for their construction, and outlines the basic principle of finding sparse solutions in overcomplete systems. The main part deals with the construction of individual functions for generating "atoms", such as Gabor functions, B-splines, Bézier curves and Daubechies wavelets. Finally, an example shows the use of these functions for signal reconstruction in comparison with the Fourier and wavelet transforms.
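The module itself is written for MATLAB; purely as an illustration of what generating one family of atoms involves, here is a hypothetical Python sketch of a small Gabor dictionary (all parameter values invented):

```python
import numpy as np

def gabor_atom(n, center, freq, width, phase=0.0):
    """Generate a discrete real-valued Gabor atom: a Gaussian-windowed sinusoid.

    n      -- atom length in samples
    center -- time position of the Gaussian window (in samples)
    freq   -- normalised frequency in cycles per sample (0 to 0.5)
    width  -- standard deviation of the Gaussian window (in samples)
    """
    t = np.arange(n)
    window = np.exp(-0.5 * ((t - center) / width) ** 2)
    atom = window * np.cos(2 * np.pi * freq * (t - center) + phase)
    norm = np.linalg.norm(atom)
    return atom / norm if norm > 0 else atom

# A small dictionary: atoms at several positions, frequencies and widths.
# 32 positions x 3 frequencies x 3 widths = 288 atoms for 256 samples,
# i.e. more atoms than samples, which makes the dictionary overcomplete.
n = 256
dictionary = np.column_stack([
    gabor_atom(n, c, f, w)
    for c in range(0, n, 8)
    for f in (0.05, 0.1, 0.2)
    for w in (8, 16, 32)
])
print(dictionary.shape)
```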
269

Intuitivní kreslení na platformě Android / Intuitive Drawing on the Android Platform

Appl, Martin January 2012 (has links)
This master's thesis deals with the design and implementation of a finger-painting application for mobile devices running the Android operating system. The main focus is on a well-designed, intuitive and friendly user interface. The problems solved include spline interpolation of touch points, zoom and pinch gestures implemented with transformation matrices, an extensive history for reversing actions, and a few basic tools.
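The application itself is an Android project and its code is not shown here; as an illustration of smoothing sampled touch points with spline interpolation, here is a small Python sketch with made-up screen coordinates:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Made-up touch samples (x, y) in screen coordinates, in the order they were received
points = np.array([[10, 300], [40, 260], [90, 250], [150, 290], [200, 360], [230, 420]],
                  dtype=float)

# Parametrise by cumulative chord length so the stroke has no preferred axis
d = np.concatenate([[0.0], np.cumsum(np.linalg.norm(np.diff(points, axis=0), axis=1))])
spline_x = CubicSpline(d, points[:, 0])
spline_y = CubicSpline(d, points[:, 1])

# Densely resample the smooth stroke for rendering
s = np.linspace(0, d[-1], 100)
stroke = np.column_stack([spline_x(s), spline_y(s)])
print(stroke.shape)
```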
270

Procedural Worlds : A proposition for a tool to assist in creation of landscapes by procedural means in Unreal Engine 5

Sjögren, Viktor, Malteskog, William January 2023 (has links)
This thesis explores the possibilities of creating landscapes through procedural means within the game engine Unreal Engine 5. The aim is to provide a flexible procedural landscape tool that does not limit the user and that is compatible with existing systems in the engine. The research questions focus on comparison with other work regarding landscape generation and the generation of procedural roads. The process was carried out through an extensive implementation that adds modules which both build upon and add to the engine's source code. The implementation was divided into five major components: noise generation for terrain, biotope interpolation, asset distribution, road generation, and a user interface. Perlin noise combined with fractal Brownian motion was a vital part of generating terrain with varying features. For interpolation, a modified version of low-pass Gaussian filtering was implemented to blend biotope edges together. Asset distribution and road generation were implemented using pseudo-randomness combined with heuristics. The user interface was built to tie everything together for testing. The results show potential for assisting procedural landscape creation with a large amount of freedom in customization. There are, however, flaws in some respects; notably, the interpolation method suffers from clear visual artefacts. Whether the tool meets professional standards remains to be fully proven objectively, as the testing in this thesis work was limited.
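The tool itself is implemented against Unreal Engine 5 and is not shown here; as a rough stand-in for the Perlin-noise fBm step, the Python sketch below builds a heightmap from octaves of smoothed value noise (a simpler noise used only to illustrate fractal Brownian motion):

```python
import numpy as np

def value_noise_2d(shape, res, rng):
    """Smooth 2D value noise: random values at lattice points, bilinearly
    interpolated with a smoothstep fade (a simple stand-in for Perlin noise)."""
    lattice = rng.random((res + 1, res + 1))
    ys, xs = np.meshgrid(np.linspace(0, res, shape[0], endpoint=False),
                         np.linspace(0, res, shape[1], endpoint=False), indexing="ij")
    y0, x0 = ys.astype(int), xs.astype(int)
    fade = lambda t: t * t * (3 - 2 * t)          # smoothstep
    ty, tx = fade(ys - y0), fade(xs - x0)
    top = lattice[y0, x0] * (1 - tx) + lattice[y0, x0 + 1] * tx
    bot = lattice[y0 + 1, x0] * (1 - tx) + lattice[y0 + 1, x0 + 1] * tx
    return top * (1 - ty) + bot * ty

def fbm(shape=(256, 256), octaves=5, lacunarity=2, gain=0.5, seed=0):
    """Fractal Brownian motion: sum octaves of noise with increasing frequency
    and decreasing amplitude to get terrain-like detail at several scales."""
    rng = np.random.default_rng(seed)
    height = np.zeros(shape)
    amplitude, res = 1.0, 4
    for _ in range(octaves):
        height += amplitude * value_noise_2d(shape, res, rng)
        amplitude *= gain
        res = int(res * lacunarity)
    return height

heightmap = fbm()
print(heightmap.shape, heightmap.min().round(2), heightmap.max().round(2))
```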
