1 |
Statistical method in a comparative study in which the standard treatment is superior to others. Ikeda, Mitsuru, Shimamoto, Kazuhiro, Ishigaki, Takeo, Yamauchi, Kazunobu, 池田, 充, 山内, 一信 11 1900 (has links)
No description available.
|
2 |
A Viterbi Decoder Using SystemC for Area-Efficient VLSI Implementation. Sozen, Serkan 01 September 2006 (has links) (PDF)
In this thesis, the VLSI implementation of a Viterbi decoder is studied using SystemC, a design and simulation platform. The decoder architecture is optimized for VLSI implementation and, as a result, two novel area-efficient structures for reconfigurable Viterbi decoders are proposed.
The traditional and SystemC design cycles are compared to show the advantages of SystemC; the C++ platforms supporting SystemC are listed, and installation issues and examples are discussed.
The Viterbi decoder is widely used to estimate the message encoded by a convolutional encoder. In implementations reported in the literature, special structures called trellises are formed to reduce complexity and area.
In this thesis, two new area-efficient reconfigurable Viterbi decoder approaches are proposed, based on rearranging the states of the trellis structures to eliminate switching and memory-addressing complexity.
The first proposed architecture reduces switching and memory-addressing complexity: the states are reorganized so that the trellis structures are realized by reusing the same structures in subsequent stages. As a result, the area is minimized and power consumption is reduced; since the addressing complexity is lower, the speed is also expected to increase.
The second area-efficient Viterbi decoder is an improved version of the first and allows the constraint length, code rate, transition probabilities, trace-back depth and generator polynomials to be configured.
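To make the decoding principle behind the trellis concrete, the following is a minimal Python sketch of hard-decision Viterbi decoding for a rate-1/2, constraint-length-3 convolutional code with the common (7, 5) octal generators. It is purely illustrative: it is not the thesis's VLSI architecture, and the parameters and helper names are assumptions for the example.

    def conv_encode(msg, K=3, gens=(0b111, 0b101)):
        # Rate-1/2 convolutional encoder; appends a zero tail so the
        # trellis terminates in the all-zero state.
        s, out = 0, []
        for u in msg + [0] * (K - 1):
            window = (s << 1) | u                     # K most recent input bits
            out += [bin(window & g).count("1") & 1 for g in gens]
            s = window & ((1 << (K - 1)) - 1)
        return out

    def viterbi_decode(bits, K=3, gens=(0b111, 0b101)):
        # Hard-decision Viterbi decoding with Hamming-distance branch metrics.
        n_states = 1 << (K - 1)
        INF = float("inf")
        metric = [0.0] + [INF] * (n_states - 1)       # encoder starts in state 0
        paths = [[]] + [None] * (n_states - 1)
        for i in range(0, len(bits), 2):
            r = bits[i:i + 2]
            new_metric = [INF] * n_states
            new_paths = [None] * n_states
            for s in range(n_states):
                if paths[s] is None:
                    continue
                for u in (0, 1):
                    window = (s << 1) | u
                    out = [bin(window & g).count("1") & 1 for g in gens]
                    nxt = window & (n_states - 1)
                    m = metric[s] + sum(a != b for a, b in zip(out, r))
                    if m < new_metric[nxt]:           # keep the survivor path
                        new_metric[nxt], new_paths[nxt] = m, paths[s] + [u]
            metric, paths = new_metric, new_paths
        return paths[0]                               # zero tail: end in state 0

    msg = [1, 0, 1, 1, 0]
    coded = conv_encode(msg)
    coded[3] ^= 1                                     # inject one channel error
    print(viterbi_decode(coded)[:len(msg)] == msg)    # True: error corrected

In hardware, it is exactly these per-stage state-metric updates that the thesis reorganizes so that the same physical structures can be reused across trellis stages.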
|
3 |
Statistická analýza rozdělení extrémních hodnot pro cenzorovaná data / Statistical Analysis of Extreme Value Distributions for Censored Data. Chabičovský, Martin January 2011 (has links)
The thesis deals with extreme value distributions and censored samples. The theoretical part describes the maximum likelihood method, the types of censored samples, and introduces the extreme value distributions. Likelihood equations are derived for censored samples from the exponential, Weibull, lognormal, Gumbel and generalized extreme value distributions. For these distributions, asymptotic interval estimates are also derived, and simulation studies are carried out on the dependence of the parameter estimates on the percentage of censoring.
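As a minimal illustration of such likelihood equations: for a sample right-censored at a fixed time c, with recorded times t_i and d uncensored failures, the exponential log-likelihood is l(lambda) = d log(lambda) - lambda * sum(t_i), maximized in closed form at lambda_hat = d / sum(t_i). The Python sketch below is an illustration under these assumptions, not the thesis's code; the asymptotic interval comes from the observed Fisher information d / lambda^2.

    import numpy as np

    rng = np.random.default_rng(0)

    # Simulate an exponential sample subject to fixed right-censoring at time c.
    lam_true, n, c = 0.5, 500, 3.0
    x = rng.exponential(1 / lam_true, n)
    t = np.minimum(x, c)                  # recorded times
    observed = (x <= c).astype(int)       # 1 = failure observed, 0 = censored

    # l(lam) = d*log(lam) - lam*sum(t) is maximized at lam_hat = d / sum(t).
    d = observed.sum()
    lam_hat = d / t.sum()
    se = lam_hat / np.sqrt(d)             # from Fisher information d / lam^2
    print(f"lam_hat = {lam_hat:.3f} +/- {1.96 * se:.3f} (95% asymptotic interval)")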
|
4 |
Dalitz Plot Analysis of η'→ηπ+π-. Taylor, Simon January 2020 (has links)
Chiral Perturbation Theory (ChPT) is a tool for studying the strong interaction at low energies. The perturbation theory is developed around the limit where the light quarks u, d, s are approximated to be massless. In this approximation isospin symmetry, one of the main features of the strong interaction, is fulfilled automatically. The light quark masses and isospin violation can be studied with the η'→πππ and η'→ηππ decay channels by analyzing the kinematic distribution using so-called Dalitz plots. A Dalitz plot analysis of the η'→ηπ+π- decay mode has been conducted by the BESIII collaboration. The unbinned maximum likelihood method is used to fit the parameters that describe the Dalitz plot distribution; in this fit a polynomial expansion of the matrix element squared is used. However, in order to study the light quark masses, it is better to use a parameterization that includes a description of the final-state interaction based on a dispersion relation. Hence it is desirable to represent the Dalitz plot as a two-dimensional histogram with acceptance-corrected data as input to extract the subtraction constants. The goal of this thesis is therefore to make a consistency check between the unbinned and binned representations of the data. Monte Carlo data for the η'→ηπ+π- decay channel are generated following the BESIII analysis. An unbinned maximum likelihood fit is performed to find the Dalitz plot parameters, repeating the BESIII analysis method. The Monte Carlo data are then used for a binned maximum likelihood fit and a χ2 fit. Finally, the binned, acceptance-corrected experimental data from BESIII are used to fit the Dalitz plot parameters with the same statistical methods. The results based on the binned maximum likelihood and χ2 methods are consistent with the fit using the unbinned maximum likelihood method applied in the original BESIII publication.
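The unbinned-versus-binned consistency check can be illustrated on a one-dimensional toy density. This Python sketch is an invented stand-in for the Dalitz plot parameterization, not the BESIII matrix element; the linear density, sample size and binning are assumptions for the example.

    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(1)

    def pdf(x, a):
        # Toy normalized density on [0, 1]: f(x) = 1 + a*(x - 0.5), |a| < 2.
        return 1.0 + a * (x - 0.5)

    # Toy "Dalitz variable" sample via accept-reject, true slope a = 0.8.
    a_true, n_gen = 0.8, 50000
    cand = rng.uniform(0.0, 1.0, 4 * n_gen)
    keep = rng.uniform(0.0, 1.0 + a_true / 2, cand.size) < pdf(cand, a_true)
    x = cand[keep][:n_gen]

    # Unbinned maximum likelihood fit.
    fit_u = minimize_scalar(lambda a: -np.sum(np.log(pdf(x, a))),
                            bounds=(-1.9, 1.9), method="bounded")

    # Binned (Poisson) maximum likelihood fit on the histogrammed data.
    n_k, edges = np.histogram(x, bins=40, range=(0.0, 1.0))
    centers, width = 0.5 * (edges[:-1] + edges[1:]), edges[1] - edges[0]

    def nll_binned(a):
        mu = x.size * width * pdf(centers, a)   # expected counts per bin
        return np.sum(mu - n_k * np.log(mu))    # Poisson NLL up to a constant

    fit_b = minimize_scalar(nll_binned, bounds=(-1.9, 1.9), method="bounded")
    print(f"unbinned a = {fit_u.x:.3f}, binned a = {fit_b.x:.3f}")  # both near 0.8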
|
5 |
A systems engineering approach to metallurgical accounting of integrated smelter complexes. Mtotywa, Busisiwe Percelia, Lyman, G. J. 12 1900 (has links)
Thesis (PhD)--Stellenbosch University, 2008. / ENGLISH ABSTRACT: The growing need to improve accounting accuracy and precision, and to standardise generally accepted measurement methods in the mining and processing industries, has led a number of organisations to join forces under the AMIRA International umbrella with the purpose of fulfilling these objectives. As part of this venture, Anglo Platinum undertook a project on material balancing around its largest smelter, the Waterval Smelter.
The primary objective of the project was to perform a statistical material balance
around the Waterval Smelter using the Maximum Likelihood method with respect
to platinum, rhodium, nickel, sulphur and chromium(III) oxide.
Pt, Rh and Ni were selected for their significant contribution to the company’s profit
margin, whilst S was included because of its environmental importance. Cr2O3 was included because of the difficulties its presence poses in the smelting of PGMs.
The objective was achieved by performing a series of statistical computations.
These included: quantification of total and analytical uncertainties, detection of
outliers, estimation and modelling of daily and monthly measurement uncertainties,
parameter estimation and data reconciliation. Comparisons were made between
the Maximum Likelihood and Least Squares methods.
Total uncertainties associated with the daily grades were determined by use of
variographic studies. The estimated Pt standard deviations were within 10%
relative to the respective average grades with a few exceptions. The total
uncertainties were split into their respective components by determining analytical variances from analytical replicates. The results indicated that the sampling
components of the total uncertainty were generally larger as compared to their
analytical counterparts. WCM, the platinum-rich Waterval Smelter product, carries an uncertainty worth ~R2 103 000 in its daily Pt grade. This estimate shows that the quality of measurements not only affects the accuracy of metal accounting, but can also have considerable financial implications if not quantified and managed.
The daily uncertainties were estimated using Kriging and bootstrapped to obtain
estimates of the monthly uncertainties. Distributions were fitted by MLE using the distribution-fitting tool of the JMP 6.0 programme, and goodness-of-fit tests were performed. The data were fitted with normal and beta distributions, and there was
a notable decrease in the skewness from the daily to the monthly data.
The reconciliation of the data was performed using the Maximum Likelihood method and compared with the widely used Least Squares method. The Maximum Likelihood and
Least Squares adjustments were performed on simulated data in order to conduct
a test of accuracy and to determine the extent of error reduction after the
reconciliation exercise. The test showed that the two methods had comparable
accuracies and error reduction capabilities. However, it was shown that modelling
of uncertainties with the unbounded normal distribution does lead to the estimation
of adjustments so large that negative adjusted values are the result. The benefit of
modelling the uncertainties with a bounded distribution, which is the beta
distribution in this case, is that the possibility of obtaining negative adjusted values is eliminated. ML-adjusted values (beta) will always be non-negative, and therefore feasible. In a further comparison of the ML (bounded model) and the LS methods in
the material balancing of the Waterval smelter complex, it was found that for all
those streams whose uncertainties were modelled with a beta distribution, i.e.
those whose distribution possessed some degree of skewness, the ML
adjustments were significantly smaller than their LS counterparts.
It is therefore concluded that the Maximum Likelihood (bounded models) is a
rigorous alternative to the LS method for data reconciliation, with the following benefits:
-- Better estimates, because the nature of the data (its distribution) is not assumed but determined through distribution fitting and parameter estimation
-- Adjusted values can never be negative due to the bounded nature of the
distribution
The novel contributions made in this thesis are as follows:
-- The Maximum Likelihood method was employed for the first time in the material balancing of non-normally distributed data and compared with the well-known Least Squares method
-- Geostatistical methods were integrated with data reconciliation, in an original combination, to quantify and predict measurement uncertainties
-- For the first time, measurement uncertainties were modelled with a distribution that is non-normal and bounded in nature, leading to smaller adjustments
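As a minimal sketch of the reconciliation step being compared, the Python block below shows the classical Least Squares adjustment for a single mass-balance node under Gaussian errors; the stream values and standard deviations are invented for illustration, and the thesis's ML variant instead maximizes a likelihood built from the fitted (beta or normal) distributions subject to the same balance constraints.

    import numpy as np

    # Measured stream masses for one node: feed = concentrate + tailings.
    y = np.array([100.0, 18.0, 84.0])      # measurements (balance error: -2)
    sd = np.array([2.0, 0.5, 2.5])         # measurement standard deviations
    V = np.diag(sd ** 2)
    A = np.array([[1.0, -1.0, -1.0]])      # balance constraint A @ x = 0

    # Generalized least-squares reconciliation: x_hat minimizes
    # (x - y)' V^{-1} (x - y) subject to A x = 0, with closed form
    # x_hat = y - V A' (A V A')^{-1} A y.
    lam = np.linalg.solve(A @ V @ A.T, A @ y)
    x_hat = y - V @ A.T @ lam
    print(x_hat, A @ x_hat)                # adjusted values now balance exactly

Under the ML approach with bounded (beta) uncertainty models, the quadratic objective is replaced by the negative log-likelihood of the fitted distributions, which is what prevents the negative adjusted values the unbounded normal model can produce.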
|
6 |
Factors Affecting Industry Profitability: The Case of Taiwan's Midstream Petrochemical Industry (影響產業獲利率因素之探討--以台灣中游石化業為例). 趙國卿 Unknown Date (has links)
Building on an oligopoly model for a small open economy and using empirical data on Taiwan's midstream petrochemical industry from 1989 to 1996 (ROC years 78-85), the Full Information Maximum Likelihood Method is applied to estimate two simultaneous equations for industry profitability and industry concentration. The results show that industry concentration has a positive but insignificant effect on industry profitability; the weighted exchange rate has a negative but likewise insignificant effect; tariffs and capacity utilisation have significantly positive effects; and the import ratio and export transport costs have significantly negative effects. Conversely, industry profitability, the import ratio and import transport costs have significantly positive effects on industry concentration, while market size has a significantly negative effect.
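As a sketch of what Full Information Maximum Likelihood estimation of such a two-equation simultaneous system involves, the Python toy model below maximizes the concentrated Gaussian likelihood for the structural coefficients; the variables and coefficients are invented stand-ins for the profitability and concentration equations, not the study's actual specification.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    T = 400
    # Structural system:  y1 = g1*y2 + b1*x1 + u1   (profitability)
    #                     y2 = g2*y1 + b2*x2 + u2   (concentration)
    x1, x2 = rng.normal(size=(2, T))
    u = rng.normal(scale=[0.5, 0.3], size=(T, 2))
    g1, g2, b1, b2 = 0.4, 0.6, 1.0, -0.8
    G = np.array([[1.0, -g1], [-g2, 1.0]])
    Y = np.linalg.solve(G, (np.column_stack([b1 * x1, b2 * x2]) + u).T).T

    def neg_loglik(theta):
        g1, g2, b1, b2 = theta
        G = np.array([[1.0, -g1], [-g2, 1.0]])
        U = Y @ G.T - np.column_stack([b1 * x1, b2 * x2])  # structural residuals
        S = U.T @ U / T
        # Concentrated FIML log-likelihood: T*log|det G| - (T/2)*log det S + const
        return -(T * np.log(abs(np.linalg.det(G)))
                 - 0.5 * T * np.log(np.linalg.det(S)))

    res = minimize(neg_loglik, x0=[0.0, 0.0, 0.5, -0.5], method="Nelder-Mead")
    print(res.x)   # FIML estimates of (g1, g2, b1, b2)

The log|det G| term is what distinguishes FIML from equation-by-equation least squares: it accounts for the simultaneity between the two endogenous variables.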
|
7 |
Estimation of Pareto distribution functions from samples contaminated by measurement errors. Lwando Orbet Kondlo January 2010 (has links)
The intention is to draw more specific connections between certain deconvolution methods and also to demonstrate the application of the statistical theory of estimation in the presence of measurement error. A parametric methodology for deconvolution when the underlying distribution is of the Pareto form is developed. Maximum likelihood estimation (MLE) of the parameters of the convolved distributions is considered. Standard errors of the estimated parameters are calculated from the inverse Fisher's information matrix and a jackknife method. Probability-probability (P-P) plots and Kolmogorov-Smirnov (K-S) goodness-of-fit tests are used to evaluate the fit of the posited distribution. A bootstrapping method is used to calculate the critical values of the K-S test statistic, which are not available.
|
9 |
Teoria de resposta ao item aplicada no ENEM / Item Response Theory applied to the ENEM. Costa, Sidney Tadeu Santiago 03 March 2017 (links)
With the score obtained in the Exame Nacional do Ensino Médio - ENEM, students can apply for places at several public higher-education institutions and for government programmes, for example the Universidade para Todos programme (Prouni) and the Fundo de Financiamento Estudantil (Fies). The ENEM scores the objective questions with a methodology called Item Response Theory (Teoria de Resposta ao Item - TRI), which differs in several respects from Classical Test Theory (Teoria Clássica dos Testes - TCT). Under the TCT, the main factor determining a subject's result is the number of correct answers, whereas under the TRI, beyond the number of correct answers, it is essential to analyse which answers are correct. The objective of this work is to explain what the TRI is and how this methodology is applied in large-scale assessments.
A historical overview is given of the logistic models used by the TRI, together with the justification for each parameter in the main equation of the model. To determine the parameters of the TRI model and to calculate each candidate's final score, an optimization procedure called the Maximum Likelihood Method (Método da Máxima Verossimilhança - MMV) is used.
The computational tools used in this work were the software R, with packages developed for applying the TRI, and the Visual Basic programming language for writing functions, known as macros, in spreadsheets.
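To make the scoring difference concrete, here is a minimal Python sketch of maximum likelihood ability estimation under a three-parameter logistic (3PL) model of the kind used in ENEM scoring; the item parameters and response patterns are hypothetical, and in practice the item parameters are themselves estimated from large calibration samples.

    import numpy as np
    from scipy.optimize import minimize_scalar

    def p_3pl(theta, a, b, c):
        # 3PL item response function: discrimination a, difficulty b, guessing c.
        return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

    def ability_mle(resp, a, b, c):
        # ML estimate of ability theta for one examinee, item parameters known.
        def nll(theta):
            p = p_3pl(theta, a, b, c)
            return -np.sum(resp * np.log(p) + (1 - resp) * np.log(1 - p))
        return minimize_scalar(nll, bounds=(-4, 4), method="bounded").x

    # Hypothetical parameters for five items, ordered from easy to hard.
    a = np.array([1.2, 0.8, 1.5, 1.0, 2.0])
    b = np.array([-1.0, 0.0, 0.5, 1.0, 1.5])
    c = np.array([0.20, 0.20, 0.25, 0.20, 0.20])

    easy_right = np.array([1, 1, 1, 0, 0])   # three correct: the easy items
    hard_right = np.array([0, 0, 1, 1, 1])   # three correct: the hard items
    print(ability_mle(easy_right, a, b, c), ability_mle(hard_right, a, b, c))
    # Same number of correct answers, different abilities: the pattern matters.

This pattern-dependence is exactly why, under the TRI, two candidates with the same raw score can receive different final grades.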
|
10 |
Estimation of Pareto distribution functions from samples contaminated by measurement errors. Kondlo, Lwando Orbet January 2010 (has links)
Magister Scientiae - MSc / The intention is to draw more specific connections between certain deconvolution methods and also to demonstrate the application of the statistical theory of estimation in the presence of measurement error. A parametric methodology for deconvolution when the underlying distribution is of the Pareto form is developed. Maximum likelihood estimation (MLE) of the parameters of the convolved distributions is considered. Standard errors of the estimated parameters are calculated from the inverse Fisher's information matrix and a jackknife method. Probability-probability (P-P) plots and Kolmogorov-Smirnov (K-S) goodness-of-fit tests are used to evaluate the fit of the posited distribution. A bootstrapping method is used to calculate the critical values of the K-S test statistic, which are not available. / South Africa
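A minimal numerical sketch of the deconvolution MLE idea follows (Python; illustrative assumptions: the Pareto scale and the Gaussian error standard deviation are taken as known, so only the shape parameter is estimated; this is not the thesis's implementation).

    import numpy as np
    from scipy import integrate, optimize, stats

    rng = np.random.default_rng(0)

    # Observed sample: Pareto(alpha, x_m) values plus N(0, sigma^2) error.
    alpha_true, x_m, sigma, n = 2.5, 1.0, 0.2, 300
    w = stats.pareto.rvs(alpha_true, scale=x_m, size=n, random_state=rng) \
        + rng.normal(0.0, sigma, n)

    def conv_pdf(w_i, alpha):
        # Density of W = X + E: convolution of the Pareto pdf with the error pdf.
        f = lambda x: (stats.pareto.pdf(x, alpha, scale=x_m)
                       * stats.norm.pdf(w_i - x, 0.0, sigma))
        return integrate.quad(f, x_m, np.inf)[0]

    def nll(alpha):
        return -sum(np.log(conv_pdf(w_i, alpha)) for w_i in w)

    fit = optimize.minimize_scalar(nll, bounds=(0.5, 6.0), method="bounded")
    print(fit.x)   # deconvolution MLE of the Pareto shape parameter

In the full problem all parameters are estimated jointly and the numerical convolution integral is the computational bottleneck, which is one reason standard errors are obtained from the inverse Fisher information and jackknife rather than from repeated refitting.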
|