1 |
The Exploration of Effect of Model Misspecification and Development of an Adequacy-Test for Substitution Model in Phylogenetics. Chen, Wei Jr, 06 November 2012 (has links)
It is possible that the maximum likelihood method can give an inconsistent result when the DNA sequences are generated under a tree topology in the Felsenstein zone and analyzed with a misspecified model. Therefore, it is important to select a good substitution model. This thesis first explores the effects of different degrees and types of model misspecification on the maximum likelihood estimates. The results are presented for tree selection and branch length estimates based on simulated data sets. Next, two Pearson's goodness-of-fit tests are developed based on binning of site patterns. These two tests are used for testing the adequacy of substitution models, and their performance is studied on both simulated data sets and empirical data.
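As a rough illustration of the kind of Pearson goodness-of-fit test described in this abstract, the sketch below (Python) computes the chi-square statistic for binned site patterns; the bin counts, bin probabilities and degrees-of-freedom choice are hypothetical, and the thesis's actual binning scheme and fitted substitution model are not reproduced.

```python
import numpy as np
from scipy.stats import chi2

def pearson_gof(observed_counts, expected_probs):
    """Pearson chi-square goodness-of-fit statistic for binned site patterns.

    observed_counts: counts of sites falling in each bin of site patterns
    expected_probs:  bin probabilities implied by the fitted substitution model
    """
    observed = np.asarray(observed_counts, dtype=float)
    expected = np.asarray(expected_probs, dtype=float) * observed.sum()
    stat = np.sum((observed - expected) ** 2 / expected)
    # degrees of freedom: bins minus one (minus any estimated parameters, not handled here)
    df = len(observed) - 1
    p_value = chi2.sf(stat, df)
    return stat, p_value

# toy example with five hypothetical bins
obs = [120, 95, 40, 30, 15]
exp_p = [0.40, 0.30, 0.15, 0.10, 0.05]
print(pearson_gof(obs, exp_p))
```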
|
2 |
Various statistical test of pseudorandom number generator. Haque, Mohammad Shafiqul, January 2010 (has links)
This thesis is related to various statistical tests of pseudorandom number generators. In this thesis I have tried to discuss some aspects of selecting and testing pseudorandom number generators. The outputs of such generators may be used in many cryptographic applications, such as the generation of key material. After the statistical tests I have tried to compare the test values of every generator and have discussed which one produces good sequences and which one is a good generator.
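The abstract does not name the individual tests, but a minimal sketch of one standard statistical test for pseudorandom generators, the monobit frequency test, might look like the following; the generator under test and the sequence length are placeholders.

```python
import math
import random

def monobit_frequency_test(bits):
    """NIST-style monobit frequency test: are 0s and 1s roughly balanced?

    bits: sequence of 0/1 integers produced by the generator under test
    Returns a p-value; very small values suggest the sequence is not random.
    """
    n = len(bits)
    s = sum(1 if b == 1 else -1 for b in bits)   # +1 for each 1, -1 for each 0
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

# toy usage with Python's built-in generator as a stand-in for a PRNG under test
random.seed(42)
sequence = [random.getrandbits(1) for _ in range(10000)]
print(monobit_frequency_test(sequence))
```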
|
3 |
Statistical distributions for service times. Adedigba, Adebolanle Iyabo, 20 September 2005 (has links)
Queueing models have been used extensively in the design of call centres. In particular, a queueing model will be used to describe a help desk, which is a form of a call centre. The design of the queueing model involves modelling the arrival and service processes of the system.

Conventionally, the arrival process is assumed to be Poisson and service times are assumed to be exponentially distributed, but it has been proposed that in practice these are seldom the case. Past research reveals that the log-normal distribution can be used to model the service times in call centres. Also, services may involve stages/tasks before completion. This motivates the use of a phase-type distribution to model the underlying stages of service.

This research work focuses on developing statistical models for the overall service times and the service times by job type in a particular help desk. The assumption of exponential service times was investigated and a log-normal distribution was fitted to the service times of this help desk. Each stage of the service in this help desk was modelled as a phase in the phase-type distribution.

Results from the analysis carried out in this work confirmed the irrelevance of the assumption of exponential service times for this help desk, and it was apparent that log-normal distributions provided a reasonable fit to the service times. A phase-type distribution with three phases fitted the overall service times and the service times of administrative and miscellaneous jobs very well. For the service times of e-mail and network jobs, a phase-type distribution with two phases served as a good model.

Finally, log-normal models of service times in this help desk were approximated using an order-three phase-type distribution.
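A minimal sketch of the distribution-fitting step described above, assuming synthetic service times in place of the help-desk data; the phase-type fitting itself (typically done with an EM algorithm) is not shown, and all parameter values are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# stand-in for observed help-desk service times (minutes); the real data set is not shown
service_times = rng.lognormal(mean=1.5, sigma=0.8, size=500)

# maximum-likelihood fits of the two candidate distributions
loc_e, scale_e = stats.expon.fit(service_times, floc=0)
shape_l, loc_l, scale_l = stats.lognorm.fit(service_times, floc=0)

# Kolmogorov-Smirnov statistics as a rough measure of fit quality
print('exponential:', stats.kstest(service_times, 'expon', args=(loc_e, scale_e)))
print('log-normal: ', stats.kstest(service_times, 'lognorm', args=(shape_l, loc_l, scale_l)))
```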
|
4 |
Tremor quantification and parameter extraction. Bejugam, Santosh, January 2011 (has links)
Tremor is a neurodegenerative disease causing involuntary muscle movements in human limbs. There are many types of tremor, caused by damage to the nerve cells surrounding the thalamus of the front brain chamber. It is hard to distinguish or classify the tremors, as there are many reasons behind the formation of a specific category, so every tremor type is named after its frequency type. Proper medication for a cure by a physician is possible only when the disease is identified.

Because of the argument given in the above paragraph, there is a need for a device or a technique to analyze the tremor and to extract the parameters associated with the signal. These extracted parameters can be used to classify the tremor for onward identification of the disease. Various diagnostic and treatment-monitoring equipment is available for many neuromuscular diseases. This thesis is concerned with tremor analysis for the purpose of recognizing certain other neurological disorders. A recording and analysis system for human tremor is developed.

The analysis was performed based on frequency and amplitude parameters of the tremor. The Fast Fourier Transform (FFT) and higher-order spectra were used to extract frequency parameters (e.g., peak amplitude, fundamental frequency of tremor, etc.). In order to diagnose subjects' condition, classification was implemented with tests of statistical significance (t-test).
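A small sketch of the FFT-based parameter extraction described above, using a synthetic tremor signal in place of real recordings; the sampling rate, tremor frequency and noise level are illustrative only, and the higher-order spectra and t-test classification steps are not shown.

```python
import numpy as np

def tremor_spectrum(signal, fs):
    """Return the fundamental tremor frequency and its peak amplitude via FFT.

    signal: 1-D array of tremor recordings (e.g., accelerometer output)
    fs:     sampling rate in Hz
    """
    signal = np.asarray(signal, dtype=float) - np.mean(signal)   # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    peak = np.argmax(spectrum[1:]) + 1                           # skip the zero-frequency bin
    return freqs[peak], spectrum[peak]

# toy usage: a synthetic 5 Hz tremor with additive noise
fs = 100.0
t = np.arange(0, 10, 1 / fs)
sig = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.default_rng(1).normal(size=t.size)
print(tremor_spectrum(sig, fs))
```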
|
6 |
Analysis and simulation of temporal and spatial variations of suspended particulates in the urban area, Kaohsiung. Huang, Yao-Tien, 12 June 2005 (has links)
Although the fraction of station-days on which the Pollutant Standard Index (PSI) exceeded 100 (also referred to as episodes) in Kaohsiung City showed a declining trend, from about 10.3% in 1995 to about 5.5% in 2002, the percentage of particulate PM10 events showed an increasing trend, from 1.0% in 2002 to 2.9% in 2004. This study first statistically summarized the trends of PM10 concentrations using box plots for four air-quality monitoring stations in Kaohsiung during the period 1997 to 2004, together with the t-test and F-test. The Comprehensive Air Quality Model with extensions (CAMx model) was then applied to analyze the sources and causes of the PM10 events.
The monthly averages of PM10 concentrations at the four air-quality monitoring stations were 72.9–81.7 µg/m3 during the period 1997 to 2004, highest at Hsiung-Kong and lowest at Nan-Chie. The long-term trend analyses show slight declines in yearly-averaged PM10 concentrations (1.05% at Nan-Chie, 1.38% at Tzuo-Yin, 1.51% at Chien-Chin, and 1.91% at Hsiung-Kong).
During 1997 to 2004, PM10 episodes occurred most frequently, and the number of PM10 episodes decreased from south to north (i.e., Hsiung-Kong > Chien-Chin > Tzuo-Yin > Nan-Chie). The statistical tests, using the t-test for the mean and the F-test for the variance at the 95% confidence level, show that the probability that the hourly PM10 concentrations differ insignificantly among the four stations is only about 42%. That is, the spatial difference in pollutant concentrations among the four air-quality monitoring stations in Kaohsiung is rather significant.
The CAMx simulations show that the contribution to ambient PM10 in Kaohsiung from stationary sources is about 38.9% (NOx: 24.7%; SO2: 14.2%), from mobile sources about 8.8% (NOx: 7.4%; SO2: 1.4%), and from fugitive emissions about 0.9% (SO2: 0.9%). The contribution to ambient PM10 from emissions in Kaohsiung Harbor is about 3.5%.
Keywords: Particulate matter, Trend analysis, Statistical test, CAMx model.
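A minimal sketch of the t-test and F-test comparison described above, using synthetic hourly PM10 series in place of the Kaohsiung monitoring records; the means, spreads and sample sizes are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# stand-ins for hourly PM10 concentrations (ug/m3) at two monitoring stations
pm10_a = rng.normal(loc=80, scale=25, size=720)
pm10_b = rng.normal(loc=73, scale=20, size=720)

# two-sample t-test for equal means
t_stat, t_p = stats.ttest_ind(pm10_a, pm10_b, equal_var=False)

# F-test for equal variances (two-sided, based on the ratio of sample variances)
f_stat = np.var(pm10_a, ddof=1) / np.var(pm10_b, ddof=1)
dfn, dfd = len(pm10_a) - 1, len(pm10_b) - 1
f_p = 2 * min(stats.f.sf(f_stat, dfn, dfd), stats.f.cdf(f_stat, dfn, dfd))

print(f't-test: t={t_stat:.2f}, p={t_p:.3g}')
print(f'F-test: F={f_stat:.2f}, p={f_p:.3g}')
```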
|
7 |
Meteorologically adjusted long-term trend analysis of primary air pollutants and statistical testing during high pollution events in Kaohsiung Area. Liao, Kun-Chuan, 04 July 2008 (has links)
The trends of PM10, O3, NOx and NMHC concentrations were analyzed with the Holland model (without meteorological adjustment) and the MM-Regression model (with meteorological adjustment), based on data from eight EPA air quality stations in Kaohsiung from 1997 to 2006. The aim of this study was to evaluate the influence of meteorological factors on the pollutant (PM10 and O3) trends.
The trends of PM10 concentrations in Kaohsiung City analyzed without meteorological adjustment were 7.18% at Tzuo-Yin, 3.20% at Chien-Chin and 9.72% at Nan-Chie. After eliminating the meteorological factors, the gradual trends were 1.91% at Tzuo-Yin, 2.92% at Chien-Chin and 2.02% at Nan-Chie. The trends of O3 concentrations without meteorological adjustment were 11.42% at Tzuo-Yin, 20.92% at Hsiung-Kong, 42.08% at Chien-Chin and 13.69% at Nan-Chie. The trends of PM10 concentrations in Kaohsiung County analyzed without meteorological adjustment were 14.96% at Lin-yuan and 3.24% at Jen-wu. After eliminating the meteorological factors, the trend was 3.15% at Jen-wu but -2.53% at Lin-yuan. Meteorological factors were a primary influence on PM10 concentrations in recent years. The trends of O3 in Kaohsiung County without meteorological adjustment were 18.89% at Da-liao, 4.40% at Jen-wu, 35.16% at Lin-yuan and 29.98% at Mei-nung. After eliminating the meteorological factors, the trends were 1.99% at Da-liao, 2.23% at Jen-wu, 1.16% at Lin-yuan and -1.16% at Mei-nung. The results show that the influence of meteorological factors on O3 trends was more pronounced in Kaohsiung County than in Kaohsiung City.
The concentrations of PM10 show no significant difference (similarity of 64.8–92.3%) among stations in Kaohsiung City. For the concentrations of O3, the similarity (78–100%) was extensive in Kaohsiung City because O3 diffuses easily. O3 episodes, like PM10 episodes, show no significant difference among stations in Kaohsiung City. As mentioned above, the results show that the contributions to ambient PM10 were site-specific, while the contributions to ambient O3 were extensively uniform.
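The MM-Regression model itself is not specified in this abstract. As a rough sketch of the general idea of meteorological adjustment, the example below regresses a synthetic PM10 series on meteorological covariates and then estimates the long-term trend of the adjusted residuals; all data, covariates and coefficients are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n_days = 365 * 5
time = np.arange(n_days) / 365.0                      # time in years
wind = rng.normal(3.0, 1.0, n_days)                   # wind speed (m/s)
temp = rng.normal(25.0, 4.0, n_days)                  # temperature (deg C)
# synthetic PM10 with a -2 ug/m3-per-year trend plus meteorological effects and noise
pm10 = 90 - 2.0 * time - 6.0 * wind + 0.5 * temp + rng.normal(0, 12, n_days)

# step 1: remove the meteorological signal with ordinary least squares
X_met = np.column_stack([np.ones(n_days), wind, temp])
beta_met, *_ = np.linalg.lstsq(X_met, pm10, rcond=None)
residual = pm10 - X_met @ beta_met

# step 2: estimate the long-term trend of the meteorologically adjusted series
X_tr = np.column_stack([np.ones(n_days), time])
beta_tr, *_ = np.linalg.lstsq(X_tr, residual, rcond=None)
print(f'adjusted trend: {beta_tr[1]:.2f} ug/m3 per year')
```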
|
9 |
Certification de l'intégrité d'images numériques et de l'authenticité / Certification of authenticity and integrity of digital images. Nguyen, Hoai Phuong, 07 February 2019 (has links)
With the advent of personal computing and the Internet, numerous images and videos circulate all over the world. The falsification of these media has become an unavoidable reality, especially in the field of cybercrime. These modifications may be relatively harmless (retouching a person's appearance to remove skin imperfections), disturbing (making the defects of an object disappear), or have serious social repercussions (a montage showing an improbable meeting of political figures). This project falls within the field of digital forensics. The goal is to certify whether digital images are genuine or falsified. Certification can be viewed as verifying the conformity of the image under test with respect to a reference in one's possession. This certification must be as reliable as possible, because digital proof of falsification can only be established if the detection method employed yields very few erroneous results. An image is composed of distinct regions corresponding to different portions of the scene (people, objects, landscapes, etc.). Searching for a falsification consists of verifying whether a suspect region is "physically consistent" with other regions of the image. A reliable way to define this consistency is to rely on the "physical fingerprints" generated by the acquisition process. The first novel aspect of this project is the distinction between the notions of conformity and integrity. A medium is said to be conformant if it respects the physical acquisition model. If some of the model's parameters take unauthorized values, the medium is declared non-conformant. The integrity check goes further: it uses the preceding test to verify whether two distinct regions conform to a common model. In other words, unlike the conformity check, which considers the medium as a whole, the integrity check examines the image region by region to verify whether two regions are mutually consistent, that is, whether the difference between the parameters characterizing these two regions is consistent with the physical reality of the acquisition process. The other novel aspect of the project is the construction of tools that allow the error probabilities of the falsification detector to be computed analytically, so as to provide a quantitative decision criterion. No current method or tool satisfies these constraints.
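The thesis's detector is built on a physical acquisition model with analytically computed error probabilities; none of that is reproduced here. As a loose toy illustration of the conformity/integrity idea (checking whether two image regions are mutually consistent), the sketch below compares crude noise-variance estimates of two synthetic regions with an F-test; the filter, degrees of freedom and threshold are simplifications, not the thesis's method.

```python
import numpy as np
from scipy import stats

def region_noise_variance(region):
    """Rough noise-variance estimate: variance of a simple high-pass residual."""
    region = np.asarray(region, dtype=float)
    residual = (region[1:-1, 1:-1] * 4
                - region[:-2, 1:-1] - region[2:, 1:-1]
                - region[1:-1, :-2] - region[1:-1, 2:])
    return residual.var(ddof=1), residual.size

def integrity_check(region_a, region_b, alpha=0.01):
    """Toy mutual-consistency test: are the two regions' noise levels compatible?"""
    var_a, n_a = region_noise_variance(region_a)
    var_b, n_b = region_noise_variance(region_b)
    f = var_a / var_b if var_a > var_b else var_b / var_a
    p = min(1.0, 2 * stats.f.sf(f, n_a - 1, n_b - 1))   # crude two-sided p-value
    return p < alpha, p   # True means the regions look mutually inconsistent

# toy usage with synthetic regions standing in for image patches
rng = np.random.default_rng(0)
clean = rng.normal(128, 2, (64, 64))
tampered = rng.normal(128, 8, (64, 64))
print(integrity_check(clean, tampered))
```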
|
10 |
A Study of Deploying Monitor-Oriented System Simulation Models to Improve the Efficiency of Statistical Process Control. Su, Yung-Chi, 06 August 2011 (has links)
Statistical process control has a long history of development and appears in many manufacturing environments. However, applications of statistical process control are generally limited to control charts; deeper capabilities of control charts, such as process capability control and variation detection and evaluation, are rarely exploited, so statistical process control techniques are often relegated to a minor role. Meanwhile, statistical process control can detect variations in the production process, but it cannot integrate production resource capacity. Although process control of manufacturing processes can achieve real-time control, the treatment of production resources is quite inadequate for future demand forecasting and capacity analysis.
Therefore, this study combines statistical process control with system simulation technology for innovative management. Through process observation and sample collection, simulation technology is used to assess process feasibility and applicability under resource constraints and resource allocation while considering the variation monitored by statistical process control, and quality improvement tools together with causal feedback maps, a system dynamics tool, are used to support decision-making on dynamic resource capability.
The research results show:
1. With valid input parameters, the simulation model can effectively simulate the actual production processes and produce effective output.
2. Appropriate statistical validation of the data improves the reliability of the samples used as an important reference for the system simulation methods.
3. Using the simulation technology, online process control, production resource allocation and capacity prediction can be monitored.
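A minimal sketch of coupling a simulated production process with a control chart, in the spirit of the study described above; the process parameters, subgroup sizes and injected shift are invented for illustration, and the system dynamics components are not shown.

```python
import numpy as np

rng = np.random.default_rng(5)

# step 1: simulate the production process
n_subgroups, subgroup_size = 50, 5
data = rng.normal(loc=10.0, scale=0.2, size=(n_subgroups, subgroup_size))
data[40:] += 0.35                      # inject a late process shift to be detected

# step 2: build an X-bar control chart from the in-control portion
xbar = data.mean(axis=1)
baseline = xbar[:30]
center = baseline.mean()
sigma_xbar = baseline.std(ddof=1)
ucl, lcl = center + 3 * sigma_xbar, center - 3 * sigma_xbar

# step 3: flag out-of-control subgroups
out_of_control = np.where((xbar > ucl) | (xbar < lcl))[0]
print(f'center={center:.3f}, UCL={ucl:.3f}, LCL={lcl:.3f}')
print('out-of-control subgroups:', out_of_control)
```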
|