151

Nova generalização para a classe Beta-G de distribuições de probabilidade / A new generalization of the Beta-G class of probability distributions

SOUZA, Glaucia Tadu de 04 March 2016 (has links)
We propose a new generator of continuous distributions with three extra parameters, called the Beta ( (1 − G), (1 − )G + ) class, which generalizes the Beta-G class. Some special cases are presented. The new density function can be expressed as a difference of linear combinations of exponentiated densities based on the same baseline distribution. Various structural properties of the new class, which hold for any baseline model, are derived, including explicit expressions for the nth-order moments, the moment generating function, the characteristic function, the nth-order central moments, the general coefficient, the mean deviations, the residual life function, the reversed life function and the order statistics. We discuss estimation of the model parameters by maximum likelihood and provide an application to a real data set.
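The symbols of the three extra parameters in the class name above were lost in metadata extraction, but the Beta-G construction that the thesis generalizes is standard (Eugene et al., 2002): the baseline CDF G is plugged into a regularized incomplete beta function. A minimal sketch of that baseline construction, with an exponential G chosen purely for illustration and not taken from the thesis:

```python
import numpy as np
from scipy import stats, special

def beta_g_cdf(x, a, b, baseline_cdf):
    """Beta-G CDF: F(x) = I_{G(x)}(a, b), the regularized incomplete beta evaluated at G(x)."""
    return special.betainc(a, b, baseline_cdf(x))

def beta_g_pdf(x, a, b, baseline_cdf, baseline_pdf):
    """Beta-G density: f(x) = g(x) * G(x)^(a-1) * (1 - G(x))^(b-1) / B(a, b)."""
    G = baseline_cdf(x)
    return baseline_pdf(x) * G**(a - 1) * (1 - G)**(b - 1) / special.beta(a, b)

# Illustrative baseline: exponential with rate 1
x = np.linspace(0.01, 5, 100)
pdf = beta_g_pdf(x, a=2.0, b=1.5,
                 baseline_cdf=stats.expon.cdf,
                 baseline_pdf=stats.expon.pdf)
```

Judging from the class name, the proposed generator applies the beta construction to transformed functions of G involving the extra parameters; that extension is not reproduced here.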
152

Estimação de máxima verossimilhança para processo de nascimento puro espaço-temporal com dados parcialmente observados / Maximum likelihood estimation for a space-time pure birth process with missing data

Goto, Daniela Bento Fonsechi 09 October 2008 (has links)
Advisor: Nancy Lopes Garcia / Master's dissertation - Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica / Abstract: The goal of this work is to study maximum likelihood estimation for a spatial pure birth process under two different sampling schemes: a) permanent observation over a fixed time interval [0, T]; b) observation of the process only after a fixed time T. Under scheme b) the birth times are unknown, so we have a missing-data problem. The likelihood function for the nonhomogeneous pure birth process on a compact set can be written through the projection method described by Garcia and Kurtz (2008), as the projection of the likelihood function. The fact that the projected likelihood can be interpreted as an expectation suggests that Monte Carlo methods can be used to compute estimators. Almost-sure and in-distribution convergence results are obtained for the approximants to the maximum likelihood estimator. Simulation studies show that the approximants are appropriate. / Master's degree in Statistics (Inferência em Processos Estocásticos)
153

Extensões de distribuições com aplicação à análise de sobrevivência / Extensions of distributions with application to survival analysis

Yolanda Magaly Gómez Olmos 09 February 2017 (has links)
The main focus of this thesis is the study of generalizations of some positive distributions widely used in lifetime analysis, such as the exponential, Lindley, Rayleigh and segmented exponential. Comparisons of the proposed extensions with alternative extensions in the literature, such as the generalized exponential distribution, are reported. Also of interest is the study of properties of the proposed distributions such as moments, kurtosis and asymmetry coefficients, quantile functions and the risk function. Parameter estimation is approached via maximum likelihood (using the EM algorithm when available), with the method of moments providing initial parameter estimates. Results of simulation studies comparing the performance of these estimators for small and moderate sample sizes are reported. Further comparisons are reported for real data applications, where the proposed models show satisfactory performance. Finally, one of the proposed models is considered in the context of cure rate.
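As a worked illustration of the maximum-likelihood step for one of the baseline lifetime models named above, here is a minimal sketch of fitting the Lindley distribution by numerically maximizing its log-likelihood. The simulated data and the choice of optimizer are assumptions for illustration; the thesis's extended models and its EM algorithm are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def lindley_negloglik(theta, x):
    # Lindley density: f(x; theta) = theta^2 / (1 + theta) * (1 + x) * exp(-theta * x)
    n = len(x)
    return -(2 * n * np.log(theta) - n * np.log(1 + theta)
             + np.sum(np.log(1 + x)) - theta * np.sum(x))

rng = np.random.default_rng(1)
lifetimes = rng.exponential(scale=2.0, size=200)   # stand-in positive lifetime data
fit = minimize_scalar(lindley_negloglik, bounds=(1e-6, 50.0),
                      args=(lifetimes,), method="bounded")
print("ML estimate of theta:", fit.x)
```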
154

Modeling synthetic aperture radar image data

Matthew Pianto, Donald 31 January 2008 (has links)
In this thesis we study maximum likelihood (ML) estimation of the roughness parameter of the G⁰_A distribution for speckled images (Frery et al., 1997). We find that, when a certain condition on the sample moments is satisfied, the likelihood function is monotone and the ML estimates are infinite, implying a flat region. We implement four bias-corrected estimators in an attempt to obtain finite ML estimates. Three of the estimators are taken from the literature on monotone likelihood (Firth, 1993; Jeffreys, 1946) and one, based on resampling, is proposed by the author. We carry out Monte Carlo numerical experiments to compare the four estimators and find that there is no clear favourite, except when a parameter (given prior to estimation) takes a specific value. We also apply the estimators to real synthetic aperture radar data. The results of this analysis show that the estimators should be compared on their ability to correctly classify regions as rough, flat or intermediate, rather than on their biases and mean squared errors.
155

Grid-Based RFID Indoor Localization Using Tag Read Count and Received Signal Strength Measurements

Jeevarathnam, Nanda Gopal 26 October 2017 (has links)
Passive ultra-high frequency (UHF) radio frequency identification (RFID) systems have gained immense popularity in recent years for their wide-scale industrial applications in inventory tracking and management. In this study, we explore the potential of passive RFID systems for indoor localization by developing a grid-based experimental framework using two standard and easily measurable performance metrics: received signal strength indicator (RSSI) and tag read count (TRC). We create scenarios imitating real-life challenges, such as placing metal objects and other RFID tags in two different read fields (symmetric and asymmetric), to analyze their impact on localization accuracy. We study the prediction potential of RSSI and TRC both independently and jointly. Finally, we demonstrate that both signal metrics can be used for localization with sufficient accuracy, while the best performance is obtained when the two metrics are used together as inputs to an artificial neural network, especially for the more challenging scenarios. Experimental results show an average error as low as 0.286 (where the distance between consecutive grid points is defined as unity), which satisfies the grid-based localization benchmark of less than 0.5.
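A minimal sketch of the fused predictor described above: a small neural-network regressor mapping per-antenna RSSI and TRC features to grid coordinates. The feature layout, network size and synthetic data are illustrative assumptions, not the study's actual configuration or results.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_tags, n_antennas = 500, 4
rssi = rng.uniform(-70, -30, size=(n_tags, n_antennas))   # RSSI in dBm (synthetic)
trc = rng.integers(0, 50, size=(n_tags, n_antennas))      # tag read counts per window (synthetic)
X = np.hstack([rssi, trc])                                 # fused RSSI + TRC feature vector
y = rng.uniform(0, 5, size=(n_tags, 2))                    # (row, col) grid position to predict

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000,
                     random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
grid_error = np.linalg.norm(pred - y_te, axis=1).mean()    # mean error in grid units
print("mean localization error (grid units):", grid_error)
```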
156

Seismic Vulnerability Assessment of a Shallow Two-Story Underground RC Box Structure

Huh, Jungwon, Tran, Quang, Haldar, Achintya, Park, Innjoon, Ahn, Jin-Hee 18 July 2017 (has links)
Tunnels, culverts, and subway stations are the main parts of an integrated infrastructure system. Most of them are constructed by the cut-and-cover method at shallow depths (mainly less than 30 m) in soil deposits, where large-scale seismic ground deformation can occur because of the lower stiffness and strength of the soil. Therefore, transverse racking deformation (one of the major forms of seismic ground deformation) due to soil shear should be included in the seismic design of underground structures, using cost- and time-efficient methods that achieve robustness of design and are easily understood by engineers. This paper aims to develop a simplified but comprehensive approach to vulnerability assessment, in the form of fragility curves, for a shallow two-story reinforced concrete underground box structure constructed in highly weathered soil. In addition, a comparison of results for different numbers of earthquakes per peak ground acceleration (PGA) level is conducted to determine an effective and appropriate number for cost- and time-efficient analysis. The ground response acceleration method for buried structures (GRAMBS) is used to analyze the behavior of the structure subjected to transverse seismic loading under quasi-static conditions. Furthermore, the damage states that indicate the exceedance level of the structural strength capacity are described by the results of nonlinear static analyses (so-called pushover analyses). The Latin hypercube sampling technique is employed to consider the uncertainties associated with the material properties and concrete cover owing to the variation in construction conditions. Finally, a large number of artificial ground shakings satisfying the design spectrum are generated in order to develop the seismic fragility curves based on the defined damage states. It is worth noting that a number of ground motions per PGA equal to or larger than 20 is reasonable for producing satisfactory fragility curves.
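For readers unfamiliar with the final step, a common way to turn Monte Carlo results of the kind described above into a fragility curve is to fit a lognormal CDF to the fraction of ground motions exceeding a damage state at each PGA level. A minimal sketch under that standard assumption; the exceedance counts below are made up for illustration and are not results from the paper.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

pga = np.array([0.1, 0.2, 0.3, 0.4, 0.6, 0.8])      # PGA levels in g (illustrative)
n_motions = np.full_like(pga, 20, dtype=int)          # >= 20 artificial motions per PGA level
n_exceed = np.array([0, 2, 6, 11, 17, 19])            # motions exceeding a damage state (illustrative)

def neg_loglik(params):
    # Lognormal fragility: P(exceed | PGA) = Phi((ln PGA - ln theta) / beta)
    log_theta, log_beta = params                      # optimize on log scale to keep both positive
    p = norm.cdf((np.log(pga) - log_theta) / np.exp(log_beta))
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(n_exceed * np.log(p) + (n_motions - n_exceed) * np.log(1 - p))

fit = minimize(neg_loglik, x0=[np.log(0.4), np.log(0.5)], method="Nelder-Mead")
theta_hat, beta_hat = np.exp(fit.x)
print("fragility median (g):", theta_hat, "dispersion:", beta_hat)
```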
157

Methods for Viral Population Analysis

Artyomenko, Alexander 08 August 2017 (has links)
The ability of Next-Generation Sequencing (NGS) to produce massive quantities of genomic data inexpensively has made it possible to study the structure of viral populations from an infected host at unprecedented resolution. As a result of high rates of mutation and recombination, an RNA virus exists as a heterogeneous "swarm". Virologists and computational epidemiologists widely use NGS data to study viral populations. However, discerning rare variants is muddled by the presence of errors introduced by the sequencing technology. We develop and implement a time- and cost-efficient strategy for NGS of multiple viral samples, along with computational methods to analyze large quantities of NGS data and to handle sequencing errors. In particular, we present: (i) a combinatorial pooling strategy for massive NGS of viral samples; (ii) kGEM and 2SNV, methods for viral population haplotyping; (iii) ShotMCF, a Multicommodity Flow (MCF) based method for frequency estimation of viral haplotypes; and (iv) QUASIM, an agent-based simulator of viral evolution taking into account viral variants and immune response.
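The combinatorial pooling in item (i) follows the general group-testing principle of giving each sample a unique signature of pools, so far fewer sequencing libraries than samples are needed. The sketch below illustrates only that generic principle with made-up pool counts; it is not the thesis's actual pool design or decoding method.

```python
from itertools import combinations

def pool_assignment(n_samples, n_pools, pools_per_sample):
    """Assign each sample a distinct combination (signature) of pools."""
    signatures = list(combinations(range(n_pools), pools_per_sample))
    if n_samples > len(signatures):
        raise ValueError("not enough distinct signatures for this design")
    return {s: signatures[s] for s in range(n_samples)}

# 20 samples encoded with only 6 pools, 3 pools per sample (C(6,3) = 20 signatures)
design = pool_assignment(n_samples=20, n_pools=6, pools_per_sample=3)
print(design[0], design[19])
```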
158

Sensory Integration During Goal Directed Reaches: The Effects of Manipulating Target Availability

Khanafer, Sajida January 2012 (has links)
When using visual and proprioceptive information to plan a reach, it has been proposed that the brain combines these cues to estimate the object's and/or limb's location. Specifically, according to the maximum-likelihood estimation (MLE) model, more reliable sensory inputs are assigned a greater weight (Ernst & Banks, 2002). In this research we examined whether the brain is able to adjust which sensory cue it weights the most. Specifically, we asked whether the brain changes how it weights sensory information when the availability of a visual cue is manipulated. Twenty-four healthy subjects reached to visual (V), proprioceptive (P), or visual + proprioceptive (VP) targets under different visual delay conditions (e.g., on V and VP trials the visual target was available for the entire reach, was removed at the go-signal, or was removed 1, 2 or 5 seconds before the go-signal). Subjects completed 5 blocks of trials, with 90 trials per block. For 12 subjects the visual delay was kept constant within a block of trials, while for the other 12 subjects different visual delays were intermixed within a block. To establish which sensory cue subjects weighted the most, we compared endpoint positions achieved on V and P reaches with those on VP reaches. Results indicated that all subjects weighted sensory cues in accordance with the MLE model across all delay conditions and that these weights were similar regardless of the visual delay. Moreover, while errors increased with longer visual delays, there was no change in reaching variance. Thus, manipulating the visual environment was not enough to change subjects' weighting strategy.
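For concreteness, the MLE cue-combination rule cited above (Ernst & Banks, 2002) weights each cue by its inverse variance, so the combined estimate is more reliable than either cue alone. A small worked sketch with illustrative numbers, not data from this study:

```python
import numpy as np

var_v, var_p = 1.0, 4.0          # visual and proprioceptive variances (illustrative)
x_v, x_p = 10.2, 11.0            # single-trial estimates of target position (illustrative)

w_v = (1 / var_v) / (1 / var_v + 1 / var_p)   # weight on vision, proportional to its reliability
w_p = 1 - w_v                                  # weight on proprioception
x_vp = w_v * x_v + w_p * x_p                   # combined (VP) estimate
var_vp = (var_v * var_p) / (var_v + var_p)     # predicted VP variance, below both single-cue variances

print(f"w_v = {w_v:.2f}, combined estimate = {x_vp:.2f}, variance = {var_vp:.2f}")
```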
159

Investigation of island geometry variations in bit patterned media storage systems

Shi, Yuanjing January 2011 (has links)
Bit-Patterned Media (BPM) has been recognised as one of the candidate technologies to achieve an areal density beyond 1 Tb/in² by fabricating single-domain islands out of continuous magnetic media. Although much attention has been focused on the fabrication of BPM, existing lithography techniques have difficulty producing uniform islands over large areas cost-effectively; the resulting fabricated islands often vary in position and size. The primary purpose of the research documented in this thesis is to investigate the effect of island geometry variations on the data recovery process from perpendicular patterned media, with head and media configurations optimised to achieve an areal density of 1 Tb/in². To achieve this aim, a read channel model has been implemented as a platform to evaluate read channel performance numerically; it can also be altered to investigate new read channel designs. The simulated results demonstrate that island geometry variations have a detrimental effect on read channel performance. They show that a BPM system can be tolerant to island position variations, but that more attention needs to be paid to the effect of island size variations on read channel performance. A new read channel design, revolving around a modified trellis for the Viterbi detector, has been proposed in order to combat the effect of island geometry variations. The modified trellis for island position variations results in extra states and branches compared with the standard trellis, while the modified trellis for island size variations results in only extra branches. The novel read channel designs demonstrate improved performance in the presence of island geometry variations, even with increasing amounts of island position and size variation. There are two ways to obtain the read channel performance in terms of bit-error-rate (BER): a) running a numerical Monte-Carlo simulation to count the number of bits in error at the output of the read channel model, and b) using an analytical approach to calculate the BER by approximating the noise with a known distribution. Both approaches give very similar results, which indicates that, as long as the distribution of the noise present in the read channel model is predictable, the analytical approach can evaluate BER performance more efficiently, especially when the BER is low. However, the Monte-Carlo simulation remains useful for understanding the correlation of the errors. The novel trellises proposed in this work will contribute to the commercial development of BPM in two ways: a) improving the data recovery process in BPM systems, and b) allowing a tolerance of 10% size variation for existing fabrication techniques.
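As a toy illustration of route a) above, the sketch below estimates BER by Monte Carlo: random data bits are passed through a noisy channel model and errors are counted at the detector output. A simple threshold detector over additive Gaussian noise stands in for the full patterned-media read channel and its Viterbi detector, which are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(7)
n_bits = 1_000_000
bits = rng.integers(0, 2, n_bits)
signal = 2.0 * bits - 1.0                   # bipolar readback levels for 0/1 islands
noise_sigma = 0.5
received = signal + rng.normal(0.0, noise_sigma, n_bits)   # AWGN stand-in for channel noise
detected = (received > 0).astype(int)       # simple threshold detector
ber = np.mean(detected != bits)             # fraction of bits in error
print(f"estimated BER: {ber:.2e}")
```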
160

DSGE Model Estimation and Labor Market Dynamics

Mickelsson, Glenn January 2016 (has links)
Essay 1: Estimation of DSGE Models with Uninformative Priors
DSGE models are typically estimated using Bayesian methods, but because prior information may be lacking, a number of papers have developed methods for estimation with less informative priors (diffuse priors). This paper takes this development one step further and suggests a method that allows full information maximum likelihood (FIML) estimation of a medium-sized DSGE model. FIML estimation is equivalent to placing uninformative priors on all parameters. Inference is performed using stochastic simulation techniques. The results reveal that all parameters are identifiable and several parameter estimates differ from previous estimates that were based on more informative priors. These differences are analyzed.

Essay 2: A DSGE Model with Labor Hoarding Applied to the US Labor Market
In the US, some relatively stable patterns can be observed with respect to employment, production and productivity. An increase in production is followed by an increase in employment with lags of one or two quarters. Productivity leads both production and employment, especially employment. I show that it is possible to replicate this empirical pattern in a model with only one demand-side shock and labor hoarding. I assume that firms have organizational capital that depreciates if workers are utilized to a high degree in current production. When demand increases, firms can increase utilization, but over time, they have to hire more workers and reduce utilization to restore organizational capital. The risk shock turns out to be very dominant and explains virtually all of the dynamics.

Essay 3: Demand Shocks and Labor Hoarding: Matching Micro Data
In Swedish firm-level data, output is more volatile than employment, and in response to demand shocks, employment follows output with a one- to two-year lag. To explain these observations, we use a model with labor hoarding in which firms can change production by changing the utilization rate of their employees. Matching the impulse response functions, we find that labor hoarding in combination with increasing returns to scale in production and a very high price stickiness can explain the empirical pattern very well. Increasing returns to scale implies a larger percentage change in output than in employment. Price stickiness amplifies volatility in output because the price has a dampening effect on demand changes. Both of these explain the delayed reaction in employment in response to output changes.
