  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
291

Comparaison d'estimateurs de la variance du TMLE / Comparison of variance estimators for the TMLE

Boulanger, Laurence 09 1900 (has links)
No description available.
292

Implementação no software estatístico R de modelos de regressão normal com parametrização geral / Normal regression models with general parametrization in software R

Perette, André Casagrandi 23 August 2019 (has links)
This work develops a package for the statistical software R implementing estimators for univariate normal regression models with general parametrization, a special case of the model defined in Patriota and Lemonte (2011). This class encompasses a wide range of well-known models, such as nonlinear and heteroscedastic regression models. Corrections to the maximum likelihood estimators and to the likelihood ratio statistic are implemented; such corrections are effective when the sample size is small. For the maximum likelihood estimator, the second-order bias correction is considered, while for the likelihood ratio statistic the correction developed in Skovgaard (2001) is applied. All package functionalities are described in detail in this work. To assess the quality of the developed algorithm, Monte Carlo simulations were carried out for different scenarios, evaluating convergence rates, estimation errors, and the efficiency of the bias and Skovgaard corrections. / This work aims to develop a package in the R language implementing normal regression models with general parametrization, proposed in Patriota and Lemonte (2011). This model unifies important models, such as nonlinear heteroscedastic models. Corrections are implemented for the MLEs and likelihood ratio statistics; these corrections are effective in small samples. The algorithm considers the second-order bias correction of the MLEs presented in Patriota and Lemonte (2009) and Skovgaard's correction for likelihood ratio statistics defined in Skovgaard (2001). In addition, a simulation study is carried out under different scenarios, in which the convergence ratio, relative squared error, and the efficiency of the bias correction and Skovgaard's correction are evaluated.
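The second-order bias correction mentioned above can be illustrated, outside the author's R package, in the simplest normal case: the maximum likelihood estimator of the variance has bias −σ²/n, and subtracting a plug-in estimate of that bias yields the familiar n/(n−1) correction. A minimal Python sketch (all values are illustrative, not taken from the package):

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma2, reps = 10, 4.0, 20000

mle = np.empty(reps)
for i in range(reps):
    x = rng.normal(0.0, np.sqrt(sigma2), n)
    mle[i] = np.mean((x - x.mean()) ** 2)  # variance MLE, biased by -sigma2/n

# The analytic bias of the variance MLE is -sigma^2/n; removing a plug-in
# estimate of that bias gives the corrected estimator n/(n-1) * MLE.
corrected = mle * n / (n - 1)

bias_mle = mle.mean() - sigma2          # Monte Carlo estimate, near -0.4
bias_corr = corrected.mean() - sigma2   # near 0 after correction
```

The same idea, applied to the full parameter vector of the general model, is what the second-order correction of Patriota and Lemonte delivers in closed form.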
293

General conditional linear models with time-dependent coefficients under censoring and truncation

Teodorescu, Bianca 19 December 2008 (has links)
In survival analysis, interest often lies in the relationship between the survival function and a certain number of covariates. It usually happens that for some individuals we cannot observe the event of interest, due to the presence of right censoring and/or left truncation. A typical example is given by a retrospective medical study, in which one is interested in the time interval between birth and death due to a certain disease. Patients who die of the disease at an early age will rarely have entered the study before death and are therefore left truncated. On the other hand, for patients who are alive at the end of the study, only a lower bound of the true survival time is known, and these patients are hence right censored.

In the case of censored and/or truncated responses, many models in the literature describe the relationship between the survival function and the covariates (the proportional hazards or Cox model, the log-logistic model, the accelerated failure time model, the additive risks model, etc.). In these models, the regression coefficients are usually supposed to be constant over time. In practice, the structure of the data might however be more complex, and it might therefore be better to consider coefficients that can vary over time. In the previous example, certain covariates (e.g. age at diagnosis, type of surgery, extension of tumor) can have a relatively high impact on early-age survival but a lower influence at higher age. This motivated a number of authors to extend the Cox model to allow for time-dependent coefficients, or to consider other types of time-dependent coefficient models such as the additive hazards model. In practice it is of great use to have at hand a method to check the validity of the above-mentioned models.

First, we consider a very general model, which includes as special cases the above-mentioned models (Cox model, additive model, log-logistic model, linear transformation models, etc.) with time-dependent coefficients, and study the parameter estimation by means of a least squares approach. The response is allowed to be subject to right censoring and/or left truncation. Secondly, we propose an omnibus goodness-of-fit test of whether the general time-dependent model considered above fits the data. A bootstrap version, to approximate the critical values of the test, is also proposed. In this dissertation, for each proposed method, the finite-sample performance is evaluated in a simulation study and then applied to a real data set.
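The bootstrap approximation of the test's critical values follows a standard parametric-bootstrap recipe: recompute the statistic on samples redrawn under the hypothesized model. A hedged Python sketch with a placeholder statistic — the sup-distance to a standard normal CDF stands in for the thesis's omnibus statistic, which would instead compare the data against the fitted time-dependent coefficient model:

```python
import math
import numpy as np

rng = np.random.default_rng(1)

def gof_statistic(sample):
    # Placeholder omnibus statistic: sup-distance between the empirical CDF
    # and a hypothesized standard normal CDF.
    x = np.sort(sample)
    ecdf = np.arange(1, x.size + 1) / x.size
    model_cdf = np.array([0.5 * (1.0 + math.erf(v / math.sqrt(2.0))) for v in x])
    return np.max(np.abs(ecdf - model_cdf))

data = rng.normal(size=200)
t_obs = gof_statistic(data)

# Bootstrap the null distribution: redraw samples under the hypothesized
# model and recompute the statistic to approximate its critical value.
B = 500
t_boot = np.array([gof_statistic(rng.normal(size=200)) for _ in range(B)])
critical_value = np.quantile(t_boot, 0.95)
p_value = np.mean(t_boot >= t_obs)
```

Rejecting when `t_obs` exceeds `critical_value` gives a level-0.05 test whose critical value needs no closed-form null distribution, which is exactly what makes the bootstrap attractive for the general model.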
294

Tail Estimation for Large Insurance Claims, an Extreme Value Approach.

Nilsson, Mattias January 2010 (has links)
In this thesis, extreme value theory is used to estimate the probability that large insurance claims exceed a certain threshold. The expected claim size, given that the claim has exceeded a certain limit, is also estimated. Two different models are used for this purpose. The first model is based on maximum domain of attraction conditions; a Pareto distribution is used in the other model. Different graphical tools are used to check the validity of both models. Länsförsäkring Kronoberg provided us with insurance data to perform the study. The conclusions drawn are that both models seem to be valid and that the results from the two models are essentially equal.
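The Pareto tail model can be sketched with a peaks-over-threshold calculation using the Hill estimator of the tail index; the simulated claims, threshold, and limit below are hypothetical stand-ins for the Länsförsäkring data, and this is one simple variant rather than the thesis's exact procedure:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical claim sizes with a Pareto tail of index 2.5.
claims = (1.0 + rng.pareto(2.5, size=5000)) * 10.0

u = np.quantile(claims, 0.90)      # tail threshold
tail = claims[claims > u]

# Hill estimator of the tail index alpha under a Pareto tail model.
alpha_hat = 1.0 / np.mean(np.log(tail / u))

# Exceedance probability P(X > x) for x > u: P(X > u) * (x / u)^(-alpha).
p_u = np.mean(claims > u)
x = 3.0 * u
tail_prob = p_u * (x / u) ** (-alpha_hat)

# Expected claim size given exceedance of x (Pareto mean excess), for alpha > 1.
expected_given_exceed = x * alpha_hat / (alpha_hat - 1.0)
```

The maximum-domain-of-attraction model in the thesis replaces the fixed Pareto form with a generalized Pareto fit to the threshold excesses, but the two exceedance formulas have the same structure.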
295

Optimal distributed detection and estimation in static and mobile wireless sensor networks

Sun, Xusheng 27 June 2012 (has links)
This dissertation develops optimal algorithms for distributed detection and estimation in static and mobile sensor networks. In distributed detection or estimation scenarios in clustered wireless sensor networks, sensor motes observe their local environment, make decisions or quantize these observations into local estimates of finite length, and send/relay them to a Cluster-Head (CH). For event detection tasks that are subject to both measurement errors and communication errors, we develop an algorithm that combines a Maximum a Posteriori (MAP) approach for local and global decisions with low-complexity channel codes and processing algorithms. For event estimation tasks that are subject to measurement errors, quantization errors and communication errors, we develop an algorithm that uses dithered quantization and channel compensation to ensure that each mote's local estimate received by the CH is unbiased, and then lets the CH fuse these estimates into a global one using a Best Linear Unbiased Estimator (BLUE). We then determine the minimum energy required for the network to produce an estimate with a prescribed error variance and show how this energy must be allocated among the motes in the network. In mobile wireless sensor networks, the mobility model governing each node will affect the detection accuracy at the CH and the energy consumption needed to achieve this level of accuracy. Correlated Random Walks (CRWs) have been proposed as mobility models that account for time dependency, geographical restrictions and nonzero drift. Hence, the solution to the continuous-time, 1-D, finite-state-space CRW is provided and its statistical behavior is studied both analytically and numerically. The impact of sensor motion on the network's performance is also studied.
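For independent unbiased scalar estimates, the BLUE fusion step at the cluster head reduces to inverse-variance weighting. A minimal sketch (the variances below are illustrative, not from the dissertation):

```python
import numpy as np

rng = np.random.default_rng(3)
theta = 5.0  # unknown scalar the cluster is estimating

# Each mote reports an unbiased local estimate with its own error variance
# (unbiasedness is what the dithered quantization and channel compensation
# steps are designed to guarantee).
variances = np.array([0.5, 1.0, 2.0, 4.0])
estimates = theta + rng.normal(0.0, np.sqrt(variances))

# BLUE for independent unbiased estimates: inverse-variance weights.
weights = (1.0 / variances) / np.sum(1.0 / variances)
theta_blue = np.sum(weights * estimates)

# Variance of the fused estimate: 1 / sum(1/sigma_i^2),
# always at least as good as the best single mote.
var_blue = 1.0 / np.sum(1.0 / variances)
```

The energy-allocation result in the dissertation then asks which per-mote variances (and hence transmit energies) minimize total energy subject to `var_blue` meeting a prescribed target.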
296

A Foot Placement Strategy for Robust Bipedal Gait Control

Wight, Derek L. 09 May 2008 (has links)
This thesis introduces a new measure of balance for bipedal robotics called the foot placement estimator (FPE). To develop this measure, stability is first defined for a simple biped. A proof of the stability of a simple biped, in a controls sense, is shown to exist using classical methods for nonlinear systems. With the addition of a contact model, an analytical solution is provided to define the bounds of the region of stability. This provides the basis for the FPE, which estimates where the biped must step in order to be stable. By using the FPE in combination with a state machine, complete gait cycles are created without any precalculated trajectories, including gait initiation and termination. The bipedal model is then advanced to include more realistic mechanical and environmental models, and the FPE approach is verified in a dynamic simulation. From these results, a 5-link, point-foot robot is designed and constructed to provide the final validation that the FPE can be used to provide closed-loop gait control. In addition, this approach is shown to demonstrate significant robustness to external disturbances. Finally, the FPE is shown in experimental results to be a remarkably accurate estimate of where humans place their feet for walking and jumping, and for stepping in response to an external disturbance.
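The FPE itself depends on the thesis's contact model and stability analysis, but a closely related quantity from the linear inverted pendulum literature, the capture point x + v·sqrt(l/g), conveys the flavor of estimating where to step so the biped comes to rest. Treat this as an assumption-laden stand-in, not the thesis's estimator:

```python
import math

def capture_point(x, v, leg_length, g=9.81):
    """Capture-point-style foot placement for a linear inverted pendulum:
    stepping to x + v * sqrt(l / g) brings the pendulum to rest over the foot."""
    return x + v * math.sqrt(leg_length / g)

# Center of mass 0.1 m ahead of the stance foot, moving forward at 0.8 m/s,
# with a 0.9 m effective leg length (all numbers illustrative).
target = capture_point(0.1, 0.8, leg_length=0.9)
```

The FPE refines this kind of estimate with an energy-based condition derived from the contact model, which is what allows it to bound the region of stability analytically.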
297

Likelihood ratio tests of separable or double separable covariance structure, and the empirical null distribution

Gottfridsson, Anneli January 2011 (has links)
The focus in this thesis is on the calculation of an empirical null distribution for likelihood ratio tests of either a separable or a double separable covariance matrix structure versus an unstructured covariance matrix. These calculations have been performed for various dimensions and sample sizes, and are compared with the asymptotic χ2-distribution that is commonly used as an approximate null distribution. Tests of separable structures are of particular interest when data are collected such that more than one relation between the components of the observation is suspected. For instance, if there are both a spatial and a temporal aspect, a hypothesis of two covariance matrices, one for each aspect, is reasonable.
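The empirical-null recipe is not specific to separability. As a hedged illustration, the same Monte Carlo comparison can be run for the simpler hypothesis of a diagonal covariance matrix, where the LRT statistic has a closed form and an asymptotic χ² reference with p(p−1)/2 degrees of freedom:

```python
import numpy as np
from numpy.linalg import det

rng = np.random.default_rng(4)
p, n, reps = 3, 15, 2000

# Empirical null of the LRT for H0: Sigma diagonal vs. unstructured
# (a simpler structured hypothesis than separability, same recipe).
stats = np.empty(reps)
for i in range(reps):
    x = rng.normal(size=(n, p))           # data generated under H0
    r = np.corrcoef(x, rowvar=False)      # sample correlation matrix
    stats[i] = -n * np.log(det(r))        # -2 log likelihood ratio

df = p * (p - 1) // 2                      # asymptotic chi2 degrees of freedom
# The chi2(3) 95% quantile is about 7.81; for n this small the empirical
# 95% quantile sits noticeably higher, which is precisely the discrepancy
# the thesis quantifies for separable structures.
empirical_q95 = np.quantile(stats, 0.95)
```

For separable or double separable structures the statistic requires an iterative (flip-flop) MLE rather than a closed form, but the simulation loop is otherwise identical.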
299

p-Refinement Techniques for Vector Finite Elements in Electromagnetics

Park, Gi-Ho 25 August 2005 (has links)
The vector finite element method has gained great attention since it overcomes the deficiencies incurred by scalar basis functions for the vector Helmholtz equation. Most implementations of vector FEM have been non-adaptive: a mesh of the domain is generated entirely in advance and used with a constant-degree polynomial basis to assign the degrees of freedom. To reduce the dependence on the user's expertise in analyzing problems with complicated boundary structures and material characteristics, and to speed up the FEM tool, demand for adaptive FEM is growing. For efficient adaptive FEM, error estimators play an important role in assigning additional degrees of freedom. In this study, hierarchical vector basis functions and four error estimators for p-refinement are investigated for electromagnetic applications.
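The estimator-driven assignment of degrees of freedom can be caricatured as a marking loop: compute a per-element error estimate, then raise the polynomial degree only where the estimate is large. The random estimator and 25% marking fraction below are placeholders, not the four estimators studied here:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical per-element error estimates (stand-ins for a real estimator).
error_est = rng.random(20)
degrees = np.full(20, 1)  # current polynomial degree on each element

# Greedy p-refinement: raise the degree on the worst-offending quarter of
# elements instead of refining uniformly everywhere.
threshold = np.quantile(error_est, 0.75)
marked = error_est >= threshold
degrees[marked] += 1
```

Hierarchical basis functions make this cheap in practice: raising an element's degree adds new basis functions without discarding the existing ones.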
300

Theoretical Results and Applications Related to Dimension Reduction

Chen, Jie 01 November 2007 (has links)
To overcome the curse of dimensionality, dimension reduction is important and necessary for understanding the underlying phenomena in a variety of fields. Dimension reduction is the transformation of high-dimensional data into a meaningful representation in a low-dimensional space. It can be further classified into feature selection and feature extraction. This thesis is composed of four projects: the first two focus on feature selection, and the last two concentrate on feature extraction. The first project presents several efficient methods for the sparse representation of a multiple measurement vector (MMV); some theoretical properties of the algorithms are also discussed. The second project studies the NP-hardness of computing penalized likelihood estimators, including penalized least squares estimators, penalized least absolute deviation regression, and penalized support vector machines. The third project focuses on the application of manifold learning to the analysis and prediction of 24-hour electricity price curves. The last project proposes a new Hessian-regularized nonlinear time-series model for time-series prediction.
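Feature extraction in the linear case is classically PCA, which manifold learning generalizes to curved low-dimensional structure. A minimal SVD-based sketch on synthetic data (all sizes and names illustrative, unrelated to the thesis's electricity-price data):

```python
import numpy as np

rng = np.random.default_rng(6)

# 100 samples in 10 dimensions whose variation lies mostly in 2 directions.
latent = rng.normal(size=(100, 2))
mixing = rng.normal(size=(2, 10))
X = latent @ mixing + 0.01 * rng.normal(size=(100, 10))

# PCA via SVD of the centered data: keep the top-k right singular vectors.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
Z = Xc @ Vt[:k].T                      # low-dimensional representation
explained = (s[:k] ** 2).sum() / (s ** 2).sum()  # variance captured by k components
```

When the data lie on a nonlinear manifold rather than a subspace, this projection fails to unfold it, which motivates the manifold-learning methods used for the price curves.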
