91

Shluková analýza pro funkcionální data / Cluster analysis for functional data

Zemanová, Barbora January 2012 (has links)
In this work we deal with cluster analysis for functional data. Functional data consist of a set of subjects, each characterized by repeated measurements of a variable. Based on these measurements we want to split the subjects into groups (clusters), such that subjects within a cluster are similar to each other and differ from subjects in the other clusters. The first approach we use is dimension reduction followed by K-means clustering. The second approach uses a finite mixture of normal linear mixed models, whose parameters we estimate by maximum likelihood using the EM algorithm. Throughout the work we apply all described procedures to real meteorological data.
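As an illustration of the first approach (dimension reduction followed by K-means), the sketch below projects curves sampled on a common grid onto a few principal component scores and clusters the scores. The toy data, the three retained components, and the two clusters are assumptions made for the example, not details taken from the thesis.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Illustrative sketch with toy data (not the thesis' data or code).
rng = np.random.default_rng(0)

# Toy functional data: 60 subjects, each observed at 50 common time points,
# drawn from two groups that differ by a vertical shift.
t = np.linspace(0, 1, 50)
curves = np.vstack([
    np.sin(2 * np.pi * t) + rng.normal(0, 0.2, t.size) + shift
    for shift in rng.choice([0.0, 1.5], size=60)
])

# Step 1: reduce dimension by projecting the sampled curves onto a few
# principal component scores (a stand-in for a basis/FPCA expansion).
scores = PCA(n_components=3).fit_transform(curves)

# Step 2: cluster the low-dimensional scores with K-means.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print(labels)
```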
92

Curve Estimation and Signal Discrimination in Spatial Problems

Rau, Christian, rau@maths.anu.edu.au January 2003 (has links)
In many instances arising prominently, but not exclusively, in imaging problems, it is important to condense the salient information so as to obtain a low-dimensional approximant of the data. This thesis is concerned with two basic situations that call for such a dimension reduction. The first is the statistical recovery of smooth edges in regression and density surfaces. The edges are understood to be contiguous curves, although they are allowed to meander almost arbitrarily through the plane and may even split at a finite number of points to yield an edge graph. A novel locally parametric nonparametric method is proposed that enjoys the benefit of being relatively easy to implement via a `tracking' approach. These topics are discussed in Chapters 2 and 3, with pertinent background material given in the Appendix. In Chapter 4 we construct concomitant confidence bands for this estimator, which have asymptotically correct coverage probability. The construction can be likened to only a few existing approaches and may thus be considered our main contribution.
Chapter 5 discusses numerical issues pertaining to the edge and confidence band estimators of Chapters 2-4. Connections are drawn to popular topics surrounding edge detection that originated in the fields of computer vision and signal processing. These connections are exploited to obtain greater robustness of the likelihood estimator, such as in the presence of sharp corners.
Chapter 6 addresses a dimension reduction problem for spatial data where the ultimate objective of the analysis is the discrimination of these data into one of a few pre-specified groups. In the dimension reduction step, an instrumental role is played by the recently developed methodology of functional data analysis. Relatively standard non-linear image processing techniques, as well as wavelet shrinkage, are used prior to this step. A case study on remotely sensed navigation radar data exemplifies the methodology of Chapter 6.
93

Degradation modeling and monitoring of engineering systems using functional data analysis

Zhou, Rensheng 08 November 2012 (has links)
In this thesis, we develop several novel degradation models based on techniques from functional data analysis. These models are suitable for characterizing different types of sensor-based degradation signals, whether they are censored at a fixed time point or truncated at the failure threshold, and they can easily be extended to accommodate the effects of environmental conditions on degradation processes. Unlike many existing degradation models that rely on a historical sample of complete degradation signals, our modeling framework is well suited for modeling complete as well as incomplete (sparse and fragmented) degradation signals. We utilize these models to predict, and continuously update in real time, the residual life distributions of partially degraded components. We assess and compare the performance of our proposed models and existing benchmark models using simulated signals and real-world data sets. The results indicate that our models provide a better characterization of the degradation signals and a more accurate prediction of a system's lifetime under different signal scenarios. Another major advantage of our models is their robustness to model mis-specification, which is especially important for applications with incomplete (sparse or fragmented) degradation signals.
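The sketch below only illustrates the general idea of residual-life prediction from a partially observed degradation signal: a trend is fitted to the observed portion and extrapolated to an assumed failure threshold. The data, the quadratic trend, and the threshold value are hypothetical stand-ins, not the thesis' functional-data-based models.

```python
import numpy as np

# Illustrative sketch with hypothetical data (not the thesis' models).
# Toy degradation signal for one partially degraded unit.
t_obs = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
signal = np.array([0.05, 0.18, 0.33, 0.52, 0.74])   # monotone wear measure
threshold = 2.0                                      # assumed failure threshold

# Fit a simple quadratic trend (a stand-in for an FPCA-based degradation model).
coef = np.polyfit(t_obs, signal, deg=2)

# Residual life: first future time at which the fitted trend crosses the threshold.
t_grid = np.linspace(t_obs[-1], 500, 5000)
pred = np.polyval(coef, t_grid)
crossing = t_grid[np.argmax(pred >= threshold)] if np.any(pred >= threshold) else np.inf
print(f"predicted failure time ~ {crossing:.1f}, residual life ~ {crossing - t_obs[-1]:.1f}")
```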
94

Robotic Hand Evaluation Based on Task Specific Kinematic Requirements

Neninger, Carlos Rafael 01 January 2011 (has links)
With the rise of autonomous and robotic systems in field applications, the need for dexterous, highly adaptable end effectors has become a major research topic. Controlling robotic hands with a high number of independent actuators is recognized as a complex, high-dimensional problem that leads to exponentially complex algorithms. However, recent studies have shown that human hand motion exhibits very high joint correlation, which translates into a set of predefined postures, or synergies: the hand produces a motion through a complementary contribution of multiple joints. These correlations place the variables onto a common dimensional space, effectively reducing the number of independent variables. In this thesis, we analyze the motion of the hand during a set of object grasps using multivariate Principal Component Analysis (mPCA) to extract both the principal variables and their correlation during grasping. We introduce the use of Functional PCA (fPCA), applied primarily to the principal components, to study the dynamic requirements of the motion. The goal is to define a set of synergies common and specific to all motions. We expand the analysis by classifying the object grasps, or tasks, using their functional components, or harmonics, over the entire motion. A set of groups is described based on this classification, confirming empirical findings. Lastly, we evaluate the motions generated from the analysis by applying them to robotic hands: the results from the mPCA and fPCA procedures are used to map the principal components from each motion onto underactuated robotic designs. We produce a viable routine that indicates how the mapping is performed and, finally, implement the generated motion on a real hand. The resulting robotic motion was evaluated on how well it mimics the human motion.
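A minimal sketch of the synergy-extraction step, assuming flattened joint-angle trajectories and toy data generated from two hidden patterns; it shows how PCA recovers a low-dimensional grasp representation, not the thesis' exact mPCA/fPCA pipeline, and all names and sizes are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA

# Illustrative sketch with toy data (not the thesis' experiments).
rng = np.random.default_rng(1)

# Toy grasp data: 40 grasp trials x 15 joints x 30 time steps, generated from two
# underlying "synergy" patterns plus noise, then flattened to one row per trial.
n_trials, n_joints, n_steps = 40, 15, 30
synergies = rng.normal(size=(2, n_joints * n_steps))      # hidden joint patterns
weights = rng.normal(size=(n_trials, 2))                  # per-trial activations
trials = weights @ synergies + rng.normal(0, 0.1, size=(n_trials, n_joints * n_steps))

# PCA across trials: the leading components recover the synergies, and a couple of
# them explain most of the variance (the low-dimensional control idea).
pca = PCA(n_components=4).fit(trials)
print(pca.explained_variance_ratio_)

# Reconstruct one grasp from its synergy scores (low-dimensional representation).
reconstruction = pca.inverse_transform(pca.transform(trials[:1])).reshape(n_joints, n_steps)
```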
95

Robust Multichannel Functional-Data-Analysis Methods for Data Recovery in Complex Systems

Sun, Jian 01 December 2011 (has links)
In recent years, Condition Monitoring (CM), which can be performed via several sensor channels, has been recognized as an effective paradigm for failure prevention in operational equipment and processes. However, the complexity caused by asynchronous data collection with different and/or time-varying sampling/transmission rates has long been a hindrance to the effective use of multichannel data in constructing empirical models. The problem becomes more challenging when sensor readings are incomplete, and traditional sensor data recovery techniques are often inapplicable in asynchronous CM environments, let alone with sparse datasets. The proposed Functional Principal Component Analysis (FPCA) methodologies, e.g., a nonparametric FPC model and a semi-parametric functional regression model, provide new sensor data recovery techniques that improve the reliability and robustness of multichannel CM systems. Based on the FPCA results obtained from historical asynchronous data, the deviation of each sensor signal from its smoothed trajectory can be described by a set of unit-specific model parameters. Furthermore, the relationships among these sensor signals can be identified and used to construct regression models for the correlated signals. For real-time or online implementation, these models, with their parameters adjusted by real-time CM data, become powerful tools for dealing with asynchronous CM data while recovering lost data when needed. To improve robustness and predictability when dealing with asynchronous data, which may have skewed probability distributions, robust methods were developed based on Functional Data Analysis (FDA) and Local Quantile Regression (LQR) models. Case studies examining turbofan aircraft engines and an experimental two-tank flow-control loop demonstrate the effectiveness and adaptability of the proposed sensor data recovery techniques. The proposed methods may also find applications in other domains, such as nuclear power plants, wind turbines, railway systems, and economic fields, which face asynchronous sampling and/or missing data problems.
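A minimal sketch of FPCA-based recovery of a sparsely observed signal, assuming a historical set of complete trajectories on a common grid: the functional principal components are learned from history, and the scores of a new, sparsely sampled unit are obtained by least squares. The toy signals, the three retained components, and the observation indices are assumptions made for the example.

```python
import numpy as np

# Illustrative sketch with toy data (not the thesis' CM datasets).
rng = np.random.default_rng(2)
t = np.linspace(0, 1, 100)

# Historical complete sensor trajectories used to learn the FPC basis.
hist = np.vstack([a * np.sin(2 * np.pi * t) + b * t + rng.normal(0, 0.05, t.size)
                  for a, b in rng.normal(1.0, 0.3, size=(50, 2))])
mean = hist.mean(axis=0)
_, _, Vh = np.linalg.svd(hist - mean, full_matrices=False)
fpcs = Vh[:3]                      # first three functional principal components

# A new unit observed only at a few asynchronous time indices (sparse signal).
obs_idx = np.array([5, 30, 62, 90])
y_obs = 1.2 * np.sin(2 * np.pi * t[obs_idx]) + 0.8 * t[obs_idx]

# Recover the full trajectory: least-squares fit of the FPC scores on the
# observed points, then reconstruction on the complete grid.
A = fpcs[:, obs_idx].T
scores, *_ = np.linalg.lstsq(A, y_obs - mean[obs_idx], rcond=None)
recovered = mean + scores @ fpcs
```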
96

一種基於函數型資料主成分分析的曲線對齊方式 / A Curve Alignment Method Based on Functional PCA

林昱航, Lin, Yu-Hang Unknown Date (has links)
Functional data analysis deals with a collection of curves, usually defined over a time interval; common examples include height records of a population during its growth period or climate statistics. A key feature of functional data is that the curves often share a common trend, while individual curves reflect that trend with differences in timing and intensity. In this thesis, a procedure proposed by Kneip and Ramsay that combines curve alignment and functional principal component analysis is studied. In the alignment step, warping functions are used to resolve timing differences across observations, and principal component analysis is then used to explore the main features of the data. In functional principal component analysis, if the data curves are roughly linear combinations of k basis curves, then the data curves are expected to be explained well by the principal component curves. The goal of this study is to examine whether this property still holds when the curves need to be aligned. It is found that, if the aligned data curves can be approximated well by k basis curves, then applying Kneip and Ramsay's procedure to the unaligned curves gives k principal components that explain the aligned curves well. The expected common trend can also serve as a basis for classifying various types of curve data. Finally, several approaches for selecting the number of principal components are proposed and compared.
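A minimal sketch of the align-then-decompose idea, where general warping functions are replaced by a crude peak-shift alignment; the toy curves and the shift-based warping are assumptions made for the example, not the Kneip-Ramsay procedure itself.

```python
import numpy as np
from sklearn.decomposition import PCA

# Illustrative sketch with toy data (not the thesis' procedure or data).
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 200)

# Toy curves sharing one common peak, with subject-specific time shifts and
# amplitudes (phase and amplitude variation).
shifts = rng.uniform(-0.1, 0.1, size=30)
amps = rng.uniform(0.8, 1.2, size=30)
curves = np.vstack([a * np.exp(-((t - 0.5 - s) ** 2) / 0.005) for a, s in zip(amps, shifts)])

# Crude alignment step: shift each curve so that its peak lands at the average
# peak location (a stand-in for general warping functions).
target = int(np.mean([c.argmax() for c in curves]))
aligned = np.vstack([np.roll(c, target - c.argmax()) for c in curves])

# PCA on the aligned curves: after removing phase variation, a few components
# should explain most of the remaining (amplitude) variation.
print(PCA(n_components=2).fit(aligned).explained_variance_ratio_)
```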
97

Traitement statistique du signal : applications en biologie et économie / Statistical signal processing : Applications in biology and economics

Hamie, Ali 28 January 2016 (has links)
In this thesis, we develop mathematical tools to process a range of biological and economic signals. First, we propose the Dynalet transform for relaxation signals without internal symmetry, as an alternative to the Fourier and wavelet transforms; the applicability of this new approximation is illustrated on real data. Next, we correct the baseline of spectrometric biological signals using a penalized expectile regression, which, in the proposed applications, outperforms quantile regression. To remove white noise, we then adapt to spectrometric signals a new denoising approach combining wavelets, soft thresholding and PLS components. Finally, since such signals can be regarded as functional data, we develop a functional local likelihood for the supervised classification of curves, and we estimate the regression operator for a strictly positive scalar response by minimizing the mean squared relative error. The asymptotic distributions of this estimator are established, and its efficiency is illustrated on simulated data and on spectroscopic and economic data sets.
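A minimal sketch of baseline correction by a penalized asymmetric (expectile-type) least-squares smoother, roughly in the spirit of the penalized expectile regression described above; the helper name, the toy spectrum, the roughness penalty lam, and the asymmetry level tau are assumptions made for the example, not the thesis' implementation.

```python
import numpy as np

def expectile_baseline(y, lam=1e5, tau=0.05, n_iter=20):
    """Illustrative penalized asymmetric least-squares (expectile-type) smoother.

    tau < 0.5 pulls the fit below most of the signal, so peaks sit above the
    estimated baseline; lam penalizes roughness via second differences.
    """
    n = y.size
    D = np.diff(np.eye(n), n=2, axis=0)          # second-difference operator
    P = lam * D.T @ D
    w = np.full(n, 0.5)
    for _ in range(n_iter):
        z = np.linalg.solve(np.diag(w) + P, w * y)
        w = np.where(y > z, tau, 1.0 - tau)      # asymmetric expectile weights
    return z

# Toy spectrum: smooth drifting baseline plus two narrow peaks plus noise.
x = np.linspace(0, 1, 400)
spectrum = 2 + x + np.exp(-((x - 0.3) ** 2) / 2e-4) + np.exp(-((x - 0.7) ** 2) / 2e-4)
spectrum += np.random.default_rng(4).normal(0, 0.02, x.size)

corrected = spectrum - expectile_baseline(spectrum)
```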
98

Funkcionální datové struktury a algoritmy / Functional Data Structures and Algorithms

Straka, Milan January 2013 (has links)
Title: Functional Data Structures and Algorithms Author: Milan Straka Institute: Computer Science Institute of Charles University Supervisor of the doctoral thesis: doc. Mgr. Zdeněk Dvořák, Ph.D, Computer Science Institute of Charles University Abstract: Functional programming is a well established programming paradigm and is becoming increasingly popular, even in industrial and commercial applications. Data structures used in functional languages are principally persistent, that is, they preserve previous versions of themselves when modified. The goal of this work is to broaden the theory of persistent data structures and devise efficient implementations of data structures to be used in functional languages. Arrays are without any question the most frequently used data structure. Despite being conceptually very simple, no persistent array with constant time access operation exists. We describe a simplified implementation of a fully persistent array with asymptotically optimal amortized complexity Θ(log log n) and especially a nearly optimal worst-case implementation. Additionally, we show how to effectively perform a garbage collection on a persistent array. The most efficient data structures are not necessarily based on asymptotically best structures. On that account, we also focus on data structure...
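To illustrate the notion of persistence discussed above, the sketch below implements a persistent array by path copying on a balanced binary tree, giving O(log n) reads and updates with structure sharing between versions; it is only an assumption-laden illustration of persistence and does not implement the Θ(log log n) structure developed in the thesis.

```python
# Illustrative sketch: a minimal persistent array via path copying.
# Updates return a new version while older versions remain valid and share nodes.

class Node:
    __slots__ = ("left", "right", "value")
    def __init__(self, left=None, right=None, value=None):
        self.left, self.right, self.value = left, right, value

def build(values, lo=0, hi=None):
    """Build a balanced binary tree over values[lo:hi]; leaves hold the elements."""
    if hi is None:
        hi = len(values)
    if hi - lo == 1:
        return Node(value=values[lo])
    mid = (lo + hi) // 2
    return Node(build(values, lo, mid), build(values, mid, hi))

def get(node, i, lo, hi):
    """Read element i by walking down the tree (O(log n))."""
    while hi - lo > 1:
        mid = (lo + hi) // 2
        node, lo, hi = (node.left, lo, mid) if i < mid else (node.right, mid, hi)
    return node.value

def set_(node, i, x, lo, hi):
    """Return a new version with element i set to x, copying only the root-to-leaf path."""
    if hi - lo == 1:
        return Node(value=x)
    mid = (lo + hi) // 2
    if i < mid:
        return Node(set_(node.left, i, x, lo, mid), node.right)
    return Node(node.left, set_(node.right, i, x, mid, hi))

v0 = build(list(range(8)))
v1 = set_(v0, 3, 99, 0, 8)                   # new version; v0 is unchanged (persistence)
print(get(v0, 3, 0, 8), get(v1, 3, 0, 8))    # prints: 3 99
```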
99

Analyse de données fonctionnelles en télédétection hyperspectrale : application à l'étude des paysages agri-forestiers / Functional data analysis in hyperspectral remote sensing : application to the study of agri-forest landscape

Zullo, Anthony 19 September 2016 (has links)
In hyperspectral imaging, each pixel is associated with a spectrum derived from the reflectance observed at d measurement points (i.e., wavelengths). We often face a situation where the sample size n is relatively small compared to the number d of variables. This phenomenon, called the "curse of dimensionality", is well known in multivariate statistics: the more d increases relative to n, the more the performance of standard statistical methodologies degrades. Reflectance spectra incorporate in their spectral dimension a continuum that gives them a functional nature: a hyperspectrum can be modeled as a univariate function of wavelength, and its representation produces a curve. The use of functional methods on such data makes it possible to take into account functional aspects such as continuity and the ordering of spectral bands, and to overcome the strong correlations arising from the fineness of the discretization grid. The main aim of this thesis is to assess the relevance of the functional approach for statistical analysis in hyperspectral remote sensing. We focus on the nonparametric functional regression model, including supervised classification. First, the functional approach is compared with multivariate methods usually employed in remote sensing; it outperforms them in difficult situations where a small training sample is combined with relatively homogeneous (that is, hard to discriminate) classes. Second, an alternative to the functional approach for overcoming the curse of dimensionality is developed using a parsimonious model, which, through the selection of a small number of measurement points, reduces the dimensionality of the problem while increasing the interpretability of the results. Third, we consider the almost systematic practical situation where the functional data are contaminated. We show that, for a fixed sample size, the finer the discretization, the better the prediction; in other words, the larger d is compared to n, the better the proposed functional statistical method performs.
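A minimal sketch of nonparametric functional classification of spectra, assuming curves discretized on a common wavelength grid: a kernel-weighted vote based on an L2-type semi-metric between curves. The helper name, the toy spectra, the Gaussian-type kernel, and the bandwidth h are assumptions made for the example, not the thesis' method or data.

```python
import numpy as np

def functional_kernel_classify(X_train, y_train, x_new, h):
    """Illustrative functional kernel classifier: kernel-weighted class vote with an
    L2-type distance between discretized curves (hypothetical helper, toy setting)."""
    d = np.sqrt(((X_train - x_new) ** 2).mean(axis=1))    # curve-to-curve semi-metric
    w = np.exp(-(d / h) ** 2)                             # Gaussian-type kernel weights
    classes = np.unique(y_train)
    scores = [w[y_train == c].sum() for c in classes]
    return classes[int(np.argmax(scores))]

rng = np.random.default_rng(5)
wl = np.linspace(400, 2500, 150)                          # toy wavelength grid (nm)
# Two toy spectral classes that differ slightly in shape.
class0 = 0.3 + 0.1 * np.sin(wl / 300) + rng.normal(0, 0.02, (40, wl.size))
class1 = 0.3 + 0.1 * np.sin(wl / 280) + rng.normal(0, 0.02, (40, wl.size))
X = np.vstack([class0, class1])
y = np.array([0] * 40 + [1] * 40)

new_pixel = 0.3 + 0.1 * np.sin(wl / 300) + rng.normal(0, 0.02, wl.size)
print(functional_kernel_classify(X, y, new_pixel, h=0.05))   # expected: 0
```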
100

Aspects théoriques et pratiques dans l'estimation non paramétrique de la densité conditionnelle pour des données fonctionnelles / Theoretical and practical aspects in non parametric estimation of the conditional density with functional data

Madani, Fethi 11 May 2012 (has links)
In this thesis, we consider the nonparametric estimation of the conditional density of a real-valued response variable given a functional explanatory variable. In the first part, we estimate this model with the double-kernel method, focusing on the choice of the smoothing parameters. We construct a data-driven method that selects the bandwidths (global and then local) optimally and automatically, and we establish the asymptotic optimality of this selection rule when the observations are independent and identically distributed (i.i.d.). The selection rule is based on classical cross-validation ideas, and the performance of the approach is also illustrated by simulations on finite samples, in which the two types of bandwidth choice (local and global) are compared. In the second part, in the same topological context, we adopt a functional version of the local polynomial (local linear) method to estimate the conditional density. Under general conditions, we establish asymptotic properties of this estimator, such as almost-complete convergence and convergence in quadratic mean in the i.i.d. case, and we extend these results to α-mixing observations, for which we prove the almost-complete convergence (with rates) of the proposed estimator. As an application, the conditional density estimator is used to estimate the conditional mode and to derive asymptotic properties of the resulting estimator, and its quadratic error is established by giving its exact asymptotic expansion (leading bias and variance terms). Finally, the applicability of these theoretical results in the functional framework is illustrated on (1) simulated data and (2) real data.
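A minimal sketch of a double-kernel conditional density estimator for a real response and a functional covariate: one kernel weights curves by a semi-metric distance to the new curve, the other smooths along the response axis, and the estimated conditional mode is read off a grid. The helper names, the toy data, the Gaussian kernels, and the fixed bandwidths (no cross-validation step) are assumptions made for the example, not the thesis' estimator.

```python
import numpy as np

# Illustrative sketch with toy data and hypothetical helpers.
def gauss(u):
    return np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)

def cond_density(x_new, y_grid, X, Y, h_x, h_y):
    """Double-kernel estimate of f(y | x) for a functional covariate: one kernel on
    curve distances, one on the response axis, evaluated on a grid of y values."""
    d = np.sqrt(((X - x_new) ** 2).mean(axis=1))          # semi-metric between curves
    w = gauss(d / h_x)
    w = w / w.sum()
    return np.array([np.sum(w * gauss((y - Y) / h_y)) / h_y for y in y_grid])

rng = np.random.default_rng(6)
t = np.linspace(0, 1, 100)
X = np.vstack([a * np.sin(2 * np.pi * t) for a in rng.uniform(0.5, 1.5, 200)])
Y = X.max(axis=1) + rng.normal(0, 0.1, 200)               # response tied to the curve

y_grid = np.linspace(0, 2, 50)
f_hat = cond_density(X[0], y_grid, X, Y, h_x=0.1, h_y=0.1)
print(y_grid[np.argmax(f_hat)])                           # conditional mode estimate
```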
