  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
171

Calibração linear assimétrica / Asymmetric Linear Calibration

Cléber da Costa Figueiredo 27 February 2009 (has links)
This thesis addresses theoretical and applied aspects of parameter estimation in the linear calibration model with errors following the skew-normal (Azzalini, 1985) and skew-t-normal (Gómez, Venegas and Bolfarine, 2007) distributions. Under an asymmetric model it is not necessary to transform the variables in order to obtain symmetric errors. Estimation of the parameters and of the variances of their estimators is studied from both the frequentist and the Bayesian viewpoints, through EM-type algorithms and Gibbs samplers, respectively. The main point of the frequentist approach is a reparameterization that avoids the singularity of the Fisher information matrix under the skew-normal calibration model in a neighborhood of lambda = 0; notably, this reparameterization leaves the parameter of interest unchanged. The main contribution of the Bayesian approach is a pair of goodness-of-fit measures that take the asymmetry of the data set into account: the ADIC (Asymmetric Deviance Information Criterion) and the EDIC (Evident Deviance Information Criterion), natural extensions of the ordinary DIC of Spiegelhalter et al. (2002), which is suitable only for symmetric models.
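The skew-normal density of Azzalini (1985) underlying this calibration model has a simple closed form, f(x; lambda) = 2 phi(x) Phi(lambda x). A minimal sketch (my own illustration, not the thesis's EM code) verifying this identity against SciPy's built-in parameterization:

```python
import numpy as np
from scipy.stats import norm, skewnorm

def skew_normal_pdf(x, lam):
    """Azzalini (1985) skew-normal density: 2 * phi(x) * Phi(lam * x).
    lam = 0 recovers the standard normal density."""
    return 2.0 * norm.pdf(x) * norm.cdf(lam * x)

x = np.linspace(-4.0, 4.0, 9)
lam = 2.5
# SciPy's skewnorm uses the same shape parameterization, so the two agree.
assert np.allclose(skew_normal_pdf(x, lam), skewnorm.pdf(x, lam))
# At lam = 0 the density is symmetric (standard normal).
assert np.allclose(skew_normal_pdf(x, 0.0), norm.pdf(x))
```

The singularity discussed in the abstract arises precisely because this density is so flat in lambda near zero; the identity above makes that dependence explicit.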
172

Modelos de regressão lineares mistos sob a classe de distribuições normal-potência / Linear mixed regression models under the power-normal class of distributions

Roger Jesus Tovar Falon 27 November 2017 (has links)
This work presents extensions of the alpha-power models to settings in which the observations are censored or limited. We first propose a new asymmetric model that extends the skew-t (Azzalini and Capitanio, 2003) and power-t (Zhao and Kim, 2016) models and includes the Student's t-distribution as a particular case. This new model can fit data with degrees of asymmetry and kurtosis even higher than the skew-t and power-t models allow. We then extend the power-t model to data exhibiting censoring, a high degree of asymmetry, and heavy tails; this model generalizes the Student's t linear regression model for censored data of Arellano-Valle et al. (2012). The work also introduces the power-normal linear mixed model for asymmetric data. Here statistical inference is performed from a classical perspective using the maximum likelihood method, with Gauss-Hermite numerical integration to approximate the integrals involved in the likelihood function. Finally, the linear model with random intercepts for doubly censored data is studied, under the assumption that the errors and the random effects follow power-normal and skew-normal distributions. For all the models studied, simulation studies were carried out to assess their goodness of fit and limitations, and all proposed methods are illustrated with real data.
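The Gauss-Hermite step mentioned in the abstract can be illustrated in isolation. This sketch (my own illustration, not the author's code) approximates E[f(X)] for X ~ N(0, 1) with `numpy.polynomial.hermite.hermgauss`, the same device used in mixed models to integrate the random effects out of the likelihood:

```python
import numpy as np

def gauss_hermite_expectation(f, n_nodes=20):
    """Approximate E[f(X)] for X ~ N(0, 1) by Gauss-Hermite quadrature.
    The change of variable x = sqrt(2) * t maps the N(0, 1) integral
    onto the Hermite weight function exp(-t^2)."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    return np.sum(weights * f(np.sqrt(2.0) * nodes)) / np.sqrt(np.pi)

# Sanity checks against known moments of the standard normal:
# the rule is exact for polynomials up to degree 2 * n_nodes - 1.
assert abs(gauss_hermite_expectation(lambda x: x**2) - 1.0) < 1e-10
assert abs(gauss_hermite_expectation(lambda x: x**4) - 3.0) < 1e-10
```

In a mixed-model likelihood, `f` would be the conditional density of the response given the random intercept; the quadrature replaces an intractable integral with a short weighted sum.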
173

The Effect of Cigarette Smoke on the Microbiota or the Normal Flora of the Nasal Cavity

Osazee, Osarueme J., Idemudia, Monday I., Veeranki, Sreenivas P., Cao, Yan, Zheng, Shimin 08 April 2015 (has links)
Introduction. The adverse health effects of cigarette smoking are well established, including the direct effects of nicotine on human endothelial cells and fibroblasts. However, the direct effects of nicotine in the nasal cavity remain uninvestigated. The study aim was to ascertain the direct effect of chemical components in cigarette smoke on the microbial flora, or microbiota, of the nasal cavity. Methods. We enrolled 40 participants from the Benson Idahosa University health clinics. Information on demographic characteristics, including age, sex, and smoking duration, was obtained at baseline. We collected nasal swabs from 20 smokers and 20 non-smokers. The wool tip of each swab stick was immediately stored in a McCartney bottle containing 9 ml of normal saline and transported to the laboratory, under aseptic conditions, within 45 minutes of collection for microbiological analysis. Microorganisms were enumerated using the streak and pour plate methods on Nutrient agar, Blood agar, Chocolate agar, MacConkey agar, and Potato Dextrose agar (PDA). A two-sample t-test was used to determine differences in microbial counts and isolates between smokers and non-smokers. Results. Total bacterial counts ranged from 177-307 × 10² cfu/ml for smokers and 223-551 × 10² cfu/ml for non-smokers (p = 0.046 on Nutrient agar; p = 0.011 on PDA). The bacterial isolates were identified as Staphylococcus aureus, Escherichia coli, Klebsiella pneumoniae, Proteus vulgaris, and Streptococcus pneumoniae; the fungal isolates were identified as Aspergillus niger, Rhizopus stolonifer, and Trichoderma viride. The mean count on Nutrient agar was significantly lower in smokers (12.10) than in non-smokers (19.35), while the mean count on PDA was significantly higher in smokers (5.75) than in non-smokers (2.35). Higher percentages of gram-negative rods, including Klebsiella pneumoniae (20%), Escherichia coli (20%), and Proteus vulgaris (20%), were identified in smokers, in contrast to the higher percentage of gram-positive cocci, including Staphylococcus aureus and Streptococcus pneumoniae, identified in non-smokers. The most commonly isolated fungus was Aspergillus niger (45%) in smokers and Rhizopus stolonifer (60%) in non-smokers. Conclusions. Study findings demonstrated increased bacterial counts and isolates in the nasal cavity of smokers compared with non-smokers. Future studies are warranted to understand the mechanistic role of nicotine in influencing the microbiota of the nasal cavity.
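The comparison of colony counts above rests on a two-sample t-test. A minimal sketch with hypothetical counts (the study's raw data are not reproduced here; these values merely mirror the reported direction of the Nutrient-agar result, lower means in smokers):

```python
from scipy import stats

# Hypothetical nutrient-agar counts (NOT the study's data), chosen to
# mirror the reported group means of roughly 12 vs. 19.
smokers = [10, 12, 11, 13, 12, 14, 11, 12, 13, 13]
nonsmokers = [18, 20, 19, 21, 18, 20, 19, 22, 19, 18]

# Two-sided two-sample t-test (equal variances assumed by default).
t_stat, p_value = stats.ttest_ind(smokers, nonsmokers)
assert t_stat < 0            # smokers' mean is lower
assert p_value < 0.05        # groups clearly separated
```

With the study's actual counts, the same call would yield the reported p-values (0.046 and 0.011).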
174

A type of 'inverseness' of certain distributions and the inverse normal distribution

Tlakula, Stanley Nkhensani January 1978 (has links)
Thesis (M. Sc. (Mathematical Statistics)) -- University of the North, 1978 / Refer to the document
175

Turbulence, Magnetics, and Closure Equations

Pratt, Jane 24 June 2003 (has links)
When a ferromagnet is heated, it loses its magnetism. Stars and planets, including the Earth, have magnetic fields, yet the center of the Earth is known to be very hot. Therefore, to sustain the large magnetic field of a planet, we cannot appeal to simple ferromagnetism like that of a bar magnet; we must look to the movement of electric charges within the Earth's molten core to generate the magnetic field. This sustainment of a magnetic field against ohmic dissipation by turbulent flow is referred to as the turbulent dynamo effect. Theoretical research into the mechanisms that create the dynamo has been actively pursued for several decades, culminating recently in massive computer simulations of the Earth's core. Most of these studies have employed the equations of magnetohydrodynamics (MHD), a nonlinear theory of electrically conducting fluids. The EDQNM (Eddy-Damped Quasi-Normal Markovian) closure is a statistical model designed so that the turbulence equations derived from Navier-Stokes dynamics can be closed while satisfying the realizability condition of positivity of the kinetic energy spectrum. In the case of MHD turbulence, realizability requires more work. We proved in earlier work that equations analogous to those expected of the EDQNM closure for MHD without mean fields satisfy the appropriate realizability conditions (Turner and Pratt 1999). In this work, we discuss the requirements needed to make the MHD equations realizable with mean fields, extending the neutral-fluid results of Turner [1]. Finally, we discuss direct numerical simulations and the correspondence of the statistical theories with simulation results.
176

Normal mode for chamber ensemble and electronics

Neuman, Israel 01 May 2010 (has links)
Normal Mode is a composition for chamber ensemble and electronics that makes reference to the microtonality employed in Turkish music. In this composition I have attempted to expand the timbral palette of standard Western instruments through electronic sounds constructed by digital sound synthesis. The microtonal frequencies used in this synthesis process were derived from the Turkish tonal system, while the ensemble material was conceived within a Western-influenced serial pitch organization. These two distinct influences invite a dynamic discourse between the ensemble and the electronics. As a new instrument developed specifically for this composition, the electronics initially attracts more attention; over time a new equilibrium is established and the electronics part is integrated into the ensemble. The electronics part of Normal Mode was created in the object-oriented programming environment Max/MSP and is realized in performance with the same software. Five chapters of this thesis discuss the compositional process of the electronics part and the system of organization that guided it, describing how this system was incorporated into the programming of the Max/MSP patchers that generated the composition's sound library and that perform the electronics part in real time, as well as the relationships between the ensemble and the electronics. The sixth chapter presents the composition Normal Mode itself. The Max/MSP patchers that perform the electronics part are included in the supplement of this thesis.
177

Effect of Intensity Increment on P300 Amplitude

Skinner, Tim 01 January 2004 (has links)
The purpose of this study was to determine the effects of task difficulty on the amplitude and latency of the P300 by altering the intensity of the oddball stimulus. A P300 was obtained from 22 adult subjects with normal hearing, ranging in age from 21 to 34 years (mean = 24 years). The "frequent stimulus" was a 1000 Hz or 4000 Hz tone burst, gated with a rise and fall time of 10 msec and a 20 msec plateau, presented at 75 dB nHL. The "oddball stimulus" was a tone burst of the same frequency (1000 Hz or 4000 Hz) presented at 77, 79, or 81 dB nHL. A four-channel recording was made with linked reference electrodes and the following montages: Cz-A1+A2, Pz-A1+A2, and Fz-A1+A2. The fourth channel was used to monitor eye-blink activity. The investigation tested the null hypothesis that changing the intensity of the oddball stimuli would not result in a significant change in either the amplitude or the latency of the P300. Analyses of Variance (ANOVA) indicated that P300 latency and amplitude did not differ significantly by run, stimulus frequency, intensity of the oddball, or montage. Thus the null hypothesis was supported.
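The null result above comes from ANOVAs over several factors. A one-way sketch over the intensity factor alone, with hypothetical latency values (illustrative only; the study used a fuller factorial design and its raw data are not reproduced here):

```python
from scipy import stats

# Hypothetical P300 latencies (ms) at the three oddball intensities
# (77, 79, 81 dB nHL); illustrative values, not the study's data,
# chosen so the group means barely differ, as in the reported result.
lat_77 = [310, 325, 300, 315, 320]
lat_79 = [312, 318, 305, 322, 308]
lat_81 = [309, 321, 303, 317, 314]

# One-way ANOVA across intensity conditions.
f_stat, p_value = stats.f_oneway(lat_77, lat_79, lat_81)
assert f_stat >= 0.0
assert p_value > 0.05   # no significant intensity effect, as reported
```

A full replication would cross intensity with run, frequency, and montage (e.g. via a repeated-measures ANOVA), but the one-way call shows the shape of the test.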
178

La Quinta Normal de Agricultura: un nodo chileno en la red global de conocimientos: el caso de estudio del jardín botánico / La Quinta Normal de Agricultura: a Chilean node in the global knowledge network: the case study of the botanical garden

López Atala, Matilde January 2017 (has links)
Seminar report submitted for the Licentiate degree in History / This paper deals with the Quinta Normal de Agricultura as a case study within the Global Knowledge Network of the late nineteenth and early twentieth centuries. On the basis of newspapers, publications, and reports to the Minister of Public Instruction, I propose two case studies in their relationship with the botanical garden. Through them I demonstrate that the Quinta Normal de Agricultura node functioned as a complex with an internal network of institutions, buildings, and societies whose common goal was to disseminate, reinvent, and produce knowledge.
179

Contrôle de processus industriels complexes et instables par le biais des techniques statistiques et automatiques / Control of complex and unstable industrial processes by means of statistical and automatic techniques

Babus, Florina 15 December 2008 (has links) (PDF)
The general context of the work developed in this thesis is the control of complex industrial processes. This work proposes new techniques for improving the statistical process control of non-Gaussian processes: a control chart with variable parameters and a theoretical control chart for the Rayleigh distribution. A model integrating tools from APC (Automatic Process Control) and SPC (Statistical Process Control) is proposed and then analyzed through two real process models.
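For a Rayleigh-based chart of the kind mentioned above, one common construction places probability limits at the tails of the assumed distribution rather than at symmetric 3-sigma bounds. A hedged sketch (the parameter value and the specific limit construction are my illustration, not taken from the thesis):

```python
from scipy.stats import rayleigh

def rayleigh_control_limits(scale, alpha=0.0027):
    """Probability control limits for a statistic that follows a
    Rayleigh(scale) distribution.  alpha = 0.0027 mimics the false-alarm
    rate of a conventional 3-sigma Shewhart chart."""
    lcl = rayleigh.ppf(alpha / 2.0, scale=scale)
    ucl = rayleigh.ppf(1.0 - alpha / 2.0, scale=scale)
    return lcl, ucl

lcl, ucl = rayleigh_control_limits(scale=2.0)
assert 0.0 < lcl < ucl
# Rayleigh limits are asymmetric about the median, unlike Gaussian limits:
med = rayleigh.median(scale=2.0)
assert (ucl - med) > (med - lcl)
```

This asymmetry is exactly why a chart built on Gaussian assumptions misbehaves for such processes, motivating the dedicated Rayleigh chart.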
180

The Elucidation of Stationary Phase Treatment Effects in Enantiomeric Separations

Putnam, Joel Garrett 01 May 2011 (has links)
Acid/base modifiers are sometimes used as mobile-phase additives in elutions on columns packed with the amylose tris(3,5-dimethylphenylcarbamate) stationary phase to separate enantiomers. When the modifiers are removed from the mobile phase, the stationary phase is affected in ways that are not understood, owing to the lack of systematic studies, making the scale-up of preparative separations difficult to predict. Once a column has been exposed to these modifiers, the selectivity of certain pairs of enantiomers may change, for better or worse. Numerous pairs of molecules affected by this phenomenon are listed in the literature. Five such pairs were chosen, the selectivity of which changes after an acidic or basic solution has been percolated through the column. The selectivity of ketoprofen and of the 4-chlorophenylalanine methyl and ethyl esters improves after a solution of ethanesulfonic acid is percolated through the column; the selectivity of propranolol HCl and Troger's base increases after a solution of diisopropylethylamine is percolated through the column. The selectivity of the 4-chlorophenylalanine ethyl ester, propranolol, and Troger's base enantiomers is inversely affected by percolation of the opposite acid/base solution. This residual change in certain enantiomeric separations has been named the Memory Effect. In contrast, trans-stilbene oxide (TSO) was used as a standard to determine the column's stability, because no Memory Effect is observed for this separation (the retention, enantioselectivity, and resolution remain constant). Karl Fischer titrations showed that only slight changes in the mobile phase's water content occurred, and that the ratio of water to polymer repeat units is important. Analytical studies of the stationary phase suggest that slow protonation/deprotonation of water bound to the carbamate moiety may be responsible for the Memory Effect. It has been shown that the Memory Effect can be minimized by percolating a sufficiently concentrated solution of the appropriate acid or base through the column. Thus, columns that were unreliable for method development because of the Memory Effect can now be used, and the scale-up of separations can be predicted and successfully performed. Finally, a test was devised to determine whether a column is under the influence of the Memory Effect.
