31 |
Statistical validation and calibration of computer models / Liu, Xuyuan, 21 January 2011 (has links)
This thesis deals with modeling, validation and calibration problems in computer model experiments. Computer models are mathematical representations of real systems developed for understanding and investigating those systems. Before a computer model
is used, it often needs to be validated by comparing the computer outputs with physical observations and calibrated by adjusting internal model parameters to improve the agreement between the computer outputs and the physical observations.
As computer models become more powerful and popular, the complexity of input and output data raises new computational challenges and stimulates the development of novel statistical modeling methods.
One challenge is dealing with computer models that have random inputs (random effects). Such computer models are very common in engineering applications. For example, in a thermal experiment at Sandia National Laboratories (Dowding et al. 2008), the volumetric heat capacity and thermal conductivity are random input variables. If input variables are randomly sampled from particular distributions with unknown parameters, the existing methods in the literature are not directly applicable, because the joint likelihood requires integration over the random-variable distribution and this integration cannot always be expressed in closed form. In this research, we propose a new approach that combines the nonlinear mixed effects model and the Gaussian process model (kriging model). Different model formulations are also studied, using the thermal problem, to gain a better understanding of validation and calibration activities.
Another challenge comes from computer models with functional outputs. While many methods have been developed for modeling computer experiments with a single response, the literature on modeling computer experiments with functional responses is sparse. Dimension reduction techniques can be used to manage the complexity of functional responses; however, they generally involve two steps: models are first fit at each individual input setting to reduce the dimensionality of the functional data, and the estimated parameters of these models are then treated as new responses and modeled for prediction. Alternatively, pointwise models are first constructed at each time point, and functional curves are then fit to the parameter estimates obtained from the fitted models. In this research, we first propose a functional regression model that relates functional responses to both design and time variables in a single step. Second, we propose a functional kriging model that performs variable selection by imposing a penalty function. We show that the proposed model performs better than dimension-reduction-based approaches and than the kriging model without regularization. In addition, non-asymptotic theoretical bounds on the estimation error are presented.
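For context on the kriging component mentioned above, the following is a minimal Gaussian process (kriging) surrogate sketch using scikit-learn; the design points, kernel choice, and test function are invented for illustration, and the thesis's combined nonlinear mixed effects formulation is not implemented here.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Toy computer-experiment data: a scalar output y observed at design points x.
rng = np.random.default_rng(0)
x_design = rng.uniform(0.0, 1.0, size=(30, 2))        # two input variables
y_design = np.sin(4 * x_design[:, 0]) + 0.5 * x_design[:, 1] ** 2

# Kriging surrogate: constant scale times a squared-exponential (RBF) kernel.
kernel = ConstantKernel(1.0) * RBF(length_scale=[0.2, 0.2])
gp = GaussianProcessRegressor(kernel=kernel, alpha=1e-6, normalize_y=True)
gp.fit(x_design, y_design)

# Predict the output surface (with uncertainty) at untried input settings.
x_new = rng.uniform(0.0, 1.0, size=(5, 2))
y_mean, y_std = gp.predict(x_new, return_std=True)
print(np.column_stack([y_mean, y_std]))
```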
|
32 |
FPCA Based Human-like Trajectory Generating / Dai, Wei, 01 January 2013 (has links)
This thesis presents a new method for generating human-like upper limb and hand motion. The work is based on Functional Principal Component Analysis (FPCA) and Quadratic Programming. The human-like motion generation problem is formulated as minimizing the difference between the dynamic profile of the optimal trajectory and those of known trajectory types. Statistical analysis is applied to pre-captured human motion records so that the problem can be treated in a low-dimensional space. A novel hybrid PCA/FPCA motion recognition method is proposed and applied to human grasping data to demonstrate its advantage in human motion recognition. A human grasping hierarchy is also proposed in the course of the study. The proposed method for generating human-like upper limb and hand motion explores the ability to learn motion kernels from human demonstration, and issues in acquiring motion kernels are discussed. The trajectory planning method applies different weights to the extracted motion kernels to approximate the kinematic constraints of the task. Multiple means of evaluation illustrate the quality of the generated optimal human-like trajectory compared to real human motion records.
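As a rough illustration of the FPCA step, the sketch below treats trajectories sampled on a common time grid, so functional PCA reduces to ordinary PCA on the discretized curves; the simulated joint-angle data, the number of components, and the use of "motion kernels" for the principal components are assumptions made for this example, not the thesis's actual pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA

# Simulated joint-angle trajectories: 40 demonstrations sampled at 100 time points.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 100)
trajectories = np.array([
    np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal() * np.cos(np.pi * t)
    for _ in range(40)
])

# FPCA on a common time grid reduces to ordinary PCA on the discretized curves;
# the principal component curves play the role of "motion kernels" here.
fpca = PCA(n_components=3)
scores = fpca.fit_transform(trajectories)       # low-dimensional representation
kernels = fpca.components_                      # basis curves ("motion kernels")

# A trajectory is approximated as the mean curve plus a weighted sum of kernels.
weights = scores[0]
reconstruction = fpca.mean_ + weights @ kernels
print(np.max(np.abs(reconstruction - trajectories[0])))   # reconstruction error
```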
|
33 |
Function-on-Function Regression with Public Health Applications / Meyer, Mark John, 06 June 2014 (has links)
Medical research currently involves the collection of large and complex data sets. One such type is functional data, where the unit of measurement is a curve measured over a grid. Functional data come in a variety of forms depending on the nature of the research. Novel methodologies are required to accommodate this growing volume of functional data, alongside new testing procedures that provide valid inferences. In this dissertation, I propose three novel methods to accommodate a variety of questions involving functional data of multiple forms: (1) a function-on-function regression for Gaussian data; (2) a historical functional linear model for repeated measures; and (3) a generalized functional outcome regression for ordinal data. For each method, I discuss the shortcomings of the existing literature and demonstrate how my method fills those gaps. The abilities of each method are demonstrated via simulation and data application.
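As a minimal sketch of the setting of method (1), a linear function-on-function regression for Gaussian data, the code below discretizes the integral model y_i(t) = ∫ x_i(s) β(s,t) ds + ε and estimates the coefficient surface by ridge-penalized least squares; the grids, penalty, and data-generating surface are invented, and this is not the estimator developed in the dissertation.

```python
import numpy as np

# Discretized function-on-function regression on simulated curves:
# y_i(t) = integral of x_i(s) * beta(s, t) ds + noise, curves on dense grids.
rng = np.random.default_rng(2)
n, S, T = 60, 50, 40
s = np.linspace(0, 1, S)
t = np.linspace(0, 1, T)
X = rng.standard_normal((n, S)).cumsum(axis=1) / np.sqrt(S)    # predictor curves
beta_true = np.outer(np.sin(np.pi * s), np.cos(np.pi * t))     # coefficient surface
Y = X @ beta_true * (1.0 / S) + 0.05 * rng.standard_normal((n, T))

# Ridge-penalized least squares estimate of beta(s, t); the factor S undoes the
# 1/S quadrature weight used when generating Y.
lam = 1e-2
beta_hat = np.linalg.solve(X.T @ X / n + lam * np.eye(S), X.T @ Y / n) * S

# Correlation between the estimated and true coefficient surfaces.
print(np.round(np.corrcoef(beta_hat.ravel(), beta_true.ravel())[0, 1], 3))
```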
|
34 |
On Bayesian Analyses of Functional Regression, Correlated Functional Data and Non-homogeneous Computer Models / Montagna, Silvia, January 2013 (has links)
Current frontiers in complex stochastic modeling of high-dimensional processes include a major emphasis on so-called functional data: problems in which the data are snapshots of curves and surfaces representing fundamentally important scientific quantities. This thesis explores new Bayesian methodologies for functional data analysis.
The first part of the thesis places emphasis on the role of factor models in functional data analysis. Data reduction becomes mandatory when dealing with such high-dimensional data, all the more so when data are available on a large number of individuals. In Chapter 2 we present a novel Bayesian framework which employs a latent factor construction to represent each variable by a low-dimensional summary. Further, we explore the important issue of modeling and analyzing the relationship of functional data with other covariate and outcome variables simultaneously measured on the same subjects.
The second part of the thesis is concerned with the analysis of circadian data. The focus is on the identification of circadian genes, that is, genes whose expression levels appear to be rhythmic through time with a period of approximately 24 hours. While addressing this goal, most of the current literature does not account for the potential dependence across genes. In Chapter 4, we propose a Bayesian approach which employs latent factors to accommodate dependence and verify patterns and relationships between genes, while representing the true gene expression trajectories in the Fourier domain, which allows for inference on the period, phase, and amplitude of the signal.
The third part of the thesis is concerned with the statistical analysis of computer models (simulators). The heavy computational demand of these input-output maps calls for statistical techniques that quickly estimate the output surface at untried inputs given a few preliminary runs of the simulator at a set of design points. In this regard, we propose a Bayesian methodology based on a non-stationary Gaussian process. Relying on a model-based assessment of uncertainty, we envision a sequential design technique that helps choose the input points where the simulator should be run to minimize the uncertainty in posterior surface estimation in an optimal way. The proposed non-stationary approach adapts well to output surfaces of unconstrained shape. / Dissertation
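As a rough, non-Bayesian illustration of the Fourier-domain idea in the second part, the sketch below fits a single 24-hour harmonic to simulated expression measurements for one gene and recovers amplitude and phase by least squares; the sampling times, noise level, and true phase are invented for the example, and the thesis's latent factor model is not reproduced.

```python
import numpy as np

# Simplified harmonic regression for one gene: fit a 24-hour sinusoid to
# simulated expression measurements and read off amplitude and phase.
rng = np.random.default_rng(3)
hours = np.arange(0, 48, 2.0)                       # samples every 2 h over 2 days
expr = 5.0 + 1.5 * np.cos(2 * np.pi * (hours - 7.0) / 24.0) \
       + 0.3 * rng.standard_normal(hours.size)

# Design matrix with intercept, cosine and sine terms at a 24-hour period.
Xf = np.column_stack([np.ones_like(hours),
                      np.cos(2 * np.pi * hours / 24.0),
                      np.sin(2 * np.pi * hours / 24.0)])
coef, *_ = np.linalg.lstsq(Xf, expr, rcond=None)
amplitude = np.hypot(coef[1], coef[2])
phase_hours = (np.arctan2(coef[2], coef[1]) * 24.0 / (2 * np.pi)) % 24.0
print(round(amplitude, 2), round(phase_hours, 1))   # should recover ~1.5 and ~7 h
```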
|
35 |
Statistical Models and Algorithms for Studying Hand and Finger Kinematics and their Neural Mechanisms / Castellanos, Lucia, 01 August 2013 (links)
The primate hand, a biomechanical structure with over twenty kinematic degrees of freedom, has an elaborate anatomical architecture. Although the hand requires complex, coordinated neural control, it endows its owner with an astonishing range of dexterous finger movements. Despite a century of research, however, the neural mechanisms that enable finger and grasping movements in primates are largely unknown. In this thesis, we investigate statistical models of finger movement that can provide insights into the mechanics of the hand, and that can have applications in neural-motor prostheses, enabling people with limb loss to regain natural function of the hands.
There are many challenges associated with (1) understanding and modeling the kinematics of fingers, and (2) mapping intracortical neural recordings into motor commands that can be used to control a Brain-Machine Interface. These challenges include potential nonlinearities, confounded sources of variation in experimental datasets, and the high number of kinematic degrees of freedom. In this work we analyze kinematic and neural datasets from repeated-trial experiments of hand motion, with the following contributions: (1) we identify static, nonlinear, low-dimensional representations of grasping finger motion, with accompanying evidence that these nonlinear representations are better than linear representations at predicting the type of object being grasped over the course of a reach-to-grasp movement, and that these nonlinear (versus linear) representations are better encoded in the firing of some neurons recorded from the primary motor cortex of rhesus monkeys; (2) a functional alignment of grasping trajectories, based on total kinetic energy, as a strategy to account for temporal variation and to exploit the repeated-trial experiment structure; (3) an interpretable model for extracting dynamic synergies of finger motion, based on Gaussian processes, that decomposes and reduces the dimensionality of variance in the dataset; we derive efficient algorithms for parameter estimation, show accurate reconstruction of grasping trajectories, and illustrate the interpretation of the model parameters; (4) sound evidence of single-neuron decoding of interpretable grasping events, plus insights into the amount of grasping information extractable from a single neuron; and (5) the Laplace Gaussian Filter (LGF), a deterministic approximation to the posterior mean that is more accurate than Monte Carlo approximations at the same computational cost and that, in an off-line decoding task, is more accurate than the standard Population Vector Algorithm.
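For reference, contribution (5) compares against the standard Population Vector Algorithm; the sketch below shows that baseline decoder on simulated cosine-tuned neurons (preferred directions, tuning depths, and noise are all invented), not the Laplace Gaussian Filter itself.

```python
import numpy as np

# Minimal Population Vector Algorithm (PVA) sketch: decode a 2-D movement
# direction from the firing rates of neurons with known preferred directions.
rng = np.random.default_rng(4)
n_neurons = 50
preferred = rng.uniform(0, 2 * np.pi, n_neurons)    # preferred directions (rad)
true_dir = np.pi / 3                                 # actual movement direction

# Cosine-tuned firing rates: baseline + modulation * cos(theta - preferred).
rates = 10 + 8 * np.cos(true_dir - preferred) + rng.standard_normal(n_neurons)

# Population vector: sum of unit preferred-direction vectors weighted by the
# mean-centered rates.
weights = rates - rates.mean()
pop_vec = np.array([np.sum(weights * np.cos(preferred)),
                    np.sum(weights * np.sin(preferred))])
decoded_dir = np.arctan2(pop_vec[1], pop_vec[0]) % (2 * np.pi)
print(round(np.degrees(true_dir), 1), round(np.degrees(decoded_dir), 1))
```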
|
36 |
Evaluation of functional data models for database design and use / Kulkarni, Krishnarao Gururao, January 1983 (has links)
The problems of design, operation, and maintenance of databases using the three most popular database management systems (Hierarchical, CODASYL/DBTG, and Relational) are well known. Users wishing to use these systems have to make conscious and often complex mappings between real-world structures and the data structuring options (data models) provided by these systems. In addition, much of the semantics associated with the data either does not get expressed at all or gets embedded procedurally in application programs in an ad-hoc way. In recent years, a large number of data models (called semantic data models) have been proposed with the aim of simplifying database design and use. However, the lack of usable implementations of these proposals has so far inhibited the widespread use of these concepts. The present work reports on an effort to evaluate and extend one such semantic model by means of an implementation. It is based on the functional data model proposed earlier by Shipman (SHIP81). We call this the 'Extended Functional Data Model' (EFDM). EFDM, like Shipman's proposals, is a marriage of three of the advanced modelling concepts found in both database and artificial intelligence research: the concept of entity to represent an object in the real world, the concept of type hierarchy among entity types, and the concept of derived data for modelling procedural knowledge. The functional notation of the model lends itself to high-level data manipulation languages, in which data selection is expressed simply as function application. Further, the functional approach makes it possible to incorporate general-purpose computation facilities in the data languages without having to embed them in procedural languages. In addition to providing the usual database facilities, the implementation also provides a mechanism to specify multiple user views of the database.
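To make the three modelling concepts concrete, here is a toy sketch in Python (not Shipman's DAPLEX notation or the EFDM implementation itself): entity types, a subtype hierarchy, and a derived function, with data selection expressed as function application; all class and function names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Person:                      # entity type representing a real-world object
    name: str

@dataclass
class Student(Person):             # subtype in the type hierarchy
    courses: List["Course"] = field(default_factory=list)

@dataclass
class Course:
    title: str
    credits: int

# A derived function: procedural knowledge kept with the schema, so queries can
# be expressed simply as function application.
def total_credits(s: Student) -> int:
    return sum(c.credits for c in s.courses)

db = [Student("Ada", [Course("Databases", 10), Course("Statistics", 20)]),
      Student("Alan", [Course("Logic", 15)])]

# "Data selection as function application": filter students by a derived value.
print([s.name for s in db if total_credits(s) >= 20])
```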
|
37 |
Análise de dados funcionais aplicada à engenharia da qualidade / Functional data analysis applied to quality engineering / Pedott, Alexandre Homsi, January 2015 (has links)
The spread of systems that acquire data on the quality and performance of products and manufacturing processes has given rise to new types of data. Functional data are data sets that form a profile or curve: in a profile, the quality characteristic is a function of one or more explanatory (independent) variables. Functional data analysis is a recent research topic practiced in several areas of knowledge; in industry, functional data arise in quality control. The absence of methods appropriate for functional data can lead to the use of inefficient methods and reduce the performance and quality of a product or process. Analyzing functional data with multivariate methods may be inadequate because of the high dimensionality and the variance and covariance structures of the data. The theoretical development of methods for functional data analysis in Quality Engineering lags behind the potential for practical applications. This work identifies functional data that are currently handled with inefficient methods. Current methods for the quality control of such data are adapted to specific situations, depending on the type of functional data and the monitoring phase. This work presents proposals for functional data analysis methods applicable to relevant research questions in Quality Engineering, such as: (i) the use of analysis of variance in experiments with functional data; (ii) control charts for profile monitoring; and (iii) the analysis and selection of supplier profiles in innovative projects.
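As a simplified illustration of profile monitoring (item ii), the sketch below builds pointwise control limits from simulated in-control profiles and flags a shifted profile via a summed squared-deviation statistic; the data, the statistic, and the empirical limit are assumptions for the example rather than the methods proposed in the thesis.

```python
import numpy as np

# Simplified profile monitoring: estimate a pointwise mean and spread from
# in-control (Phase I) profiles, then flag new profiles whose summed squared
# standardized deviation exceeds an empirical control limit. Illustrative only.
rng = np.random.default_rng(5)
t = np.linspace(0, 1, 50)
phase1 = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal((100, t.size))

mean_profile = phase1.mean(axis=0)
sd_profile = phase1.std(axis=0, ddof=1)

def deviation_statistic(profile):
    # Sum of squared standardized pointwise deviations from the mean profile.
    return np.sum(((profile - mean_profile) / sd_profile) ** 2)

# Empirical 99.7% control limit from the Phase I statistics.
limit = np.quantile([deviation_statistic(p) for p in phase1], 0.997)

new_profile = np.sin(2 * np.pi * t) + 0.3 * (t > 0.5) \
              + 0.1 * rng.standard_normal(t.size)       # shifted over second half
print(deviation_statistic(new_profile) > limit)         # True -> out of control
```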
|
38 |
Estimação da estrutura a termo da taxa de juros com abordagem de dados funcionais / Estimation of the term structure of interest rates with a functional data approach / Ruas, Marcelo Castiel, January 2014 (has links)
This work studies methods that take the functional nature of the term structure of interest rates (the yield curve) into account to produce out-of-sample forecasts. Nonparametric functional data analysis (NP-FDA) models and functional time series (FTS) models are estimated. The former is based on a regression estimator proposed by Ferraty and Vieu (2006), which uses kernel functions to assign local weights to the functional variables. The latter is based on the work of Hays, Shen and Huang (2012), who estimate the yield curve with a dynamic factor model whose factors are obtained by functional principal component analysis. The forecasting ability of the models is tested on the US yield curve at horizons of 1, 3, 6 and 12 months, and the results are compared with benchmark models such as Diebold and Li (2006) and the random walk. The NP-FDA estimates, the main focus of this work, did not perform very well, succeeding only for very short maturities and horizons. The FTS estimates, in general, performed better than the chosen benchmark methods.
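A minimal sketch of the NP-FDA idea, assuming simulated yield-curve data: a Nadaraya-Watson style estimator that weights past curves with a Gaussian kernel of their L2 distance to a query curve. The maturity grid, bandwidth, and response variable are invented, and no claim is made about the semi-metric or bandwidth selection used in the thesis.

```python
import numpy as np

# Nonparametric functional kernel regression on simulated yield curves:
# a scalar response is predicted by weighting training curves according to
# their L2 distance from the query curve. All data and settings are invented.
rng = np.random.default_rng(6)
maturities = np.array([3, 6, 12, 24, 36, 60, 120], dtype=float)   # months
curves = 3 + 0.02 * maturities + 0.2 * rng.standard_normal((200, maturities.size))
target = 0.8 * curves[:, -1] + 0.1 * rng.standard_normal(200)     # illustrative response

def l2_distance(f, g):
    # L2 distance between two discretized curves via trapezoidal quadrature.
    return np.sqrt(np.trapz((f - g) ** 2, maturities))

def kernel_regression(new_curve, bandwidth=3.0):
    # Gaussian kernel weights; the bandwidth is an arbitrary tuning choice here.
    d = np.array([l2_distance(new_curve, c) for c in curves])
    w = np.exp(-(d / bandwidth) ** 2)
    return np.sum(w * target) / np.sum(w)

print(round(kernel_regression(curves[0]), 3), round(target[0], 3))
```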
|
39 |
Supervised and Ensemble Classification of Multivariate Functional Data: Applications to Lupus Diagnosis / January 2018 (has links)
This dissertation investigates the classification of systemic lupus erythematosus (SLE) in the presence of non-SLE alternatives, while developing novel curve classification methodologies with wide-ranging applications. Functional data representations of plasma thermogram measurements and the corresponding derivative curves provide predictors yet to be investigated for SLE identification. Functional nonparametric classifiers form a methodological basis, which is used herein to develop (a) the family of ESFuNC segment-wise curve classification algorithms and (b) per-pixel ensembles based on logistic regression and the fused LASSO. The proposed methods achieve test-set accuracy rates as high as 94.3%, while returning information about regions of the temperature domain that are critical for population discrimination. The analyses suggest that derivative-based information contributes significantly to improved classification performance relative to recently published studies on SLE plasma thermograms. / Dissertation/Thesis / Doctoral Dissertation Applied Mathematics 2018
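As an illustrative baseline in the same spirit (not the ESFuNC algorithm or the fused-LASSO ensemble), the sketch below classifies simulated two-peak "thermogram" curves with a regularized logistic regression after appending numerical derivatives, echoing the finding that derivative information helps; the curve shapes, class shift, and temperature grid are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Curve classification on simulated thermograms: stack each curve with its
# numerical derivative and fit a regularized logistic classifier.
rng = np.random.default_rng(7)
temps = np.linspace(45, 90, 90)                      # temperature grid (deg C)

def thermogram(shift):
    # Two Gaussian melting peaks; the class differs by a small peak shift.
    peak1 = np.exp(-0.5 * ((temps - 63 - shift) / 2.5) ** 2)
    peak2 = 0.7 * np.exp(-0.5 * ((temps - 70 + shift) / 3.0) ** 2)
    return peak1 + peak2 + 0.02 * rng.standard_normal(temps.size)

X_curves = np.array([thermogram(0.0) for _ in range(80)] +
                    [thermogram(1.5) for _ in range(80)])
y = np.array([0] * 80 + [1] * 80)                    # 0 = non-SLE-like, 1 = SLE-like

# Append first derivatives, since derivative curves carry extra information.
X = np.hstack([X_curves, np.gradient(X_curves, axis=1)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(penalty="l2", C=1.0, max_iter=2000).fit(X_tr, y_tr)
print(round(clf.score(X_te, y_te), 3))               # held-out accuracy
```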
|