1 |
Space, time and the sensibility of the virtual. Palmer, Mark William, January 1999 (has links)
No description available.
|
2 |
Measurement and minimization of field inhomogeneities in high resolution NMR. Mattila, S. (Sampo), 04 September 2001 (has links)
Abstract
In this work, the homogeneity of both the B0 and B1 fields was studied. Homogeneity of the B0 and B1 fields is a basic assumption of high-resolution liquid-state NMR. Although some inhomogeneity of both fields is always present, the spectral purging techniques developed here allow spectrometers to be operated without giving any thought to the field inhomogeneities or to the actions necessary to minimize their adverse effects. Although the effect of B0 inhomogeneity can occasionally be seen, the B1 field in a modern probe head is often assumed to be sufficiently homogeneous for any practical purpose. With the method used in this study, the B1 field strength along one axis, typically the z-axis, can be easily mapped. From a single experiment, one can obtain reliable and valuable information about the B1 field distribution, e.g. the homogeneity of the coil. From this information, the degree of artifact suppression required for successful NMR experiments can be determined. Since normal pulse-length calibration also requires the acquisition of several 1-D spectra, the required experimentation time is not increased.
Although the maximum amount of signal in an NMR experiment is obtained by acquiring from the maximum number of resonating spins, the results presented show that a significantly more homogeneous B1 field along the active sample volume is achieved by rejecting the signal originating from the outer parts of the coil. Because the total amount of signal from the outer parts of the RF coil is not very high, only a small loss of signal is associated with spatially selective acquisition. The rejected signal, however, is a significant source of artifacts, and if no precautions were taken, these artifacts would severely degrade the quality of the acquired data. If the sample concentration can be increased, it is advantageous to dissolve the available sample in as small an amount of solvent as possible and to place it in the most B1-homogeneous part of the probe-head RF coil. With the same number of nuclear spins concentrated into a smaller volume, the sensitivity of an NMR experiment can be increased manifold.
As an application of a spatially selective data acquisition, a versatile method capable of producing a map of the B0 field strength and its variation along the sample volume is presented.
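The benefit of rejecting signal from the ends of the coil can be illustrated with a toy nutation simulation. The quadratic B1 profile, the coil extent and the rejection window below are hypothetical choices for illustration, not values from this work:

```python
import math

def b1(z):
    # Hypothetical quadratic B1 profile along the coil axis (z in [-1, 1]),
    # normalised so that B1 = 1 at the coil centre.
    return 1.0 - 0.3 * z * z

def nutation_signal(multiple_of_90, z_max, steps=2001):
    # Detected signal for a pulse of (multiple_of_90 * 90) degrees,
    # averaged over sample positions |z| <= z_max.
    zs = [-z_max + 2 * z_max * i / (steps - 1) for i in range(steps)]
    return sum(math.sin(multiple_of_90 * math.pi / 2 * b1(z)) for z in zs) / steps

# A common B1-homogeneity figure of merit: signal after a 450-degree pulse
# relative to a 90-degree pulse (1.0 for a perfectly homogeneous field).
r_full = nutation_signal(5, z_max=1.0) / nutation_signal(1, z_max=1.0)
r_inner = nutation_signal(5, z_max=0.4) / nutation_signal(1, z_max=0.4)
print(r_inner > r_full)  # restricting acquisition to the coil centre helps
```

In this toy model, restricting the detected region to the central part of the coil raises the 450/90 signal ratio markedly, mirroring the improvement in effective B1 homogeneity described above.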
|
3 |
The Study of CTP:Glycerol 3-Phosphate Cytidylyltransferase (TarD) from Staphylococcus aureus. Badurina, David, 03 1900 (has links)
The CTP:glycerol 3-phosphate cytidylyltransferase (TarD) from Staphylococcus aureus catalyzes the formation of the nucleotide-activated form of glycerol 3-phosphate (CDP-glycerol) used in the construction of teichoic acid, a structure shown to be essential in Bacillus subtilis 168. The CTP:glycerol 3-phosphate cytidylyltransferase from B. subtilis 168 (TagD), involved in teichoic acid biosynthesis, has high sequence identity (69%) and similarity (86%) to TarD, and its characterization has been well documented. In these studies, TagD was shown to carry out the CTP:glycerol 3-phosphate cytidylyltransferase reaction via a random mechanism with negative cooperativity in the binding of substrates but not in catalysis. The work described here illustrates that the kinetic reaction mechanism of TarD is vastly different from that of TagD in spite of their high sequence similarity. Recombinant TarD was over-expressed in Escherichia coli and purified to homogeneity. Steady-state bi-substrate experiments were performed using a high-performance liquid chromatography assay to deduce the kinetic mechanism of TarD. In this analysis, data were globally best fit to the model describing the formation of a ternary complex of substrates (CTP and glycerol 3-phosphate) and enzyme before catalysis. This examination yielded Km values for CTP and glycerol 3-phosphate of 21 ± 4.1 µM and 36 ± 5.8 µM respectively, while kcat was measured to be 2.6 ± 0.2 s⁻¹. From the pattern observed in product inhibition studies, a classic ordered Bi Bi reaction mechanism was inferred in which glycerol 3-phosphate is the first substrate to bind, followed by CTP, and the release of CDP-glycerol precedes the release of pyrophosphate. A Keq of 16 ± 15 was calculated from the kinetic parameters of the reverse reaction, where the data were also fit to the equation describing the formation of a ternary complex before catalysis. The equilibrium constant was also determined experimentally to be 6. To illustrate the biological role of TarD with respect to TagD, the integration plasmid pSWEET was used to introduce a copy of tarD, under xylose control, into the chromosome of a B. subtilis 168 strain possessing a temperature-sensitive mutation (tag-12) mapped to tagD. Successful complementation of the temperature-sensitive mutant by tarD at the restrictive temperature indicated that, despite their distinct kinetic mechanisms, TarD and TagD have similar roles in vivo. / Thesis / Master of Science (MSc)
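The ternary-complex (sequential) model referred to above corresponds to the standard bi-substrate rate law, sketched below using the reported Km and kcat values. The CTP inhibition constant (ki_ctp) and the enzyme concentration are illustrative placeholders, not values from the thesis:

```python
def sequential_rate(ctp_uM, g3p_uM, kcat=2.6, km_ctp=21.0, km_g3p=36.0,
                    ki_ctp=21.0, e_total_uM=0.01):
    # Ternary-complex (sequential) bi-substrate rate law:
    #   v = kcat * E * [A][B] / (KiA*KB + KB*[A] + KA*[B] + [A][B])
    # kcat (s^-1) and the Km values (uM) are those reported for TarD;
    # KiA (dissociation constant of CTP) and E are illustrative guesses.
    a, b = ctp_uM, g3p_uM
    denom = ki_ctp * km_g3p + km_g3p * a + km_ctp * b + a * b
    return kcat * e_total_uM * a * b / denom  # rate in uM/s

v_sat = sequential_rate(1e6, 1e6)    # both substrates saturating
v_half = sequential_rate(21.0, 1e6)  # CTP at its Km, G3P saturating
print(v_sat, v_half)
```

A quick sanity check on the rate law: with the co-substrate saturating, a substrate held at its Km gives half-maximal velocity.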
|
4 |
Variability in surface atmospheric circulation over Europe from early instrumental records. Slonosky, Victoria C., January 1999 (has links)
The variability of atmospheric circulation is reconstructed over the last two centuries from surface pressure observations extending into the 18th and 19th centuries at 51 locations across Europe. Daily observations from London and Paris exist for 1697-1706; these are analyzed and compared to modern data. The monthly pressure data have been rigorously checked to ensure compatibility with modern observational standards. The pressure series have undergone relative homogeneity tests using a technique developed to deal specifically with these data, and the results were compared to those obtained using well-established homogeneity methods. The method developed here was shown to be the most appropriate, particularly for the earlier data. Empirical orthogonal function (EOF) analysis was used to test the stability of circulation patterns over different periods. The three leading modes of variation were identified: EOF 1 describes the overall covariance of pressure; EOF 2 the strength of the zonal flow over Europe; and EOF 3 the degree of cyclonicity or anti-cyclonicity in the eastern North Atlantic. The ability of the sparser network of 20 stations available from the early 19th century to adequately recover the patterns and variability of the full network is demonstrated. Time series of the North Atlantic Oscillation, the mid-latitude westerly winds, and an index representing the strength of the westerly air flow between London and Paris have been constructed; these extend back to the 18th century, as well as covering the period 1697-1706 for Paris and London. Correlations between eight temperature series from western and central Europe and the circulation indices demonstrate the importance of the atmospheric circulation in determining European temperatures. Running correlations calculated over windows of 25 years reveal striking non-stationarity in circulation-climate relationships.
Spectral analyses of the circulation indices suggest a shift from high-frequency oscillatory behaviour in the 19th century to lower-frequency behaviour in the 20th century.
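The idea behind the relative homogeneity testing described above can be sketched generically: difference the candidate station against a reference to remove the common climate signal, then scan for the change-point with the largest mean shift. This is a standard SNHT-style check on a synthetic series, not the specific technique developed in the thesis:

```python
def relative_homogeneity_shift(candidate, reference):
    # Difference series removes the shared climate signal, leaving
    # station-specific inhomogeneities (e.g. instrument relocations).
    q = [c - r for c, r in zip(candidate, reference)]
    n = len(q)
    mean_all = sum(q) / n
    var = sum((x - mean_all) ** 2 for x in q) / n or 1.0
    best_k, best_stat = None, 0.0
    for k in range(1, n):  # candidate break after index k-1
        m1 = sum(q[:k]) / k
        m2 = sum(q[k:]) / (n - k)
        # SNHT-style statistic: weighted squared departures of segment means
        stat = (k * (m1 - mean_all) ** 2 + (n - k) * (m2 - mean_all) ** 2) / var
        if stat > best_stat:
            best_stat, best_k = stat, k
    return best_k, best_stat

# Synthetic example: a 2-unit jump introduced halfway through the record.
ref = [0.0] * 100
cand = [0.0] * 50 + [2.0] * 50
print(relative_homogeneity_shift(cand, ref))  # detects the break at k = 50
```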
|
5 |
Homogenizace práškových směsí s využitím vysokorychlostního mísiče / Homogenization of powder blends using a high-speed mixer. Perďochová, Kristína, January 2018 (has links)
Charles University, Faculty of Pharmacy in Hradec Králové. Department: Department of Pharmaceutical Technology. Supervisor: doc. PharmDr. Zdeňka Šklubalová, Ph.D. Consultant: PharmDr. Karel Palát, CSc. Student: Kristína Perďochová. Title of Thesis: Homogenization of powder blends using a high-speed mixer. This work focuses on the use of a high-speed mixer for the powder mixing process in pharmaceutical technology. A reduced experimental plan was used to study the influence of three factors (mixing speed, mixing time and weight of the mixture) on the overall homogeneity of a mixture containing acetylsalicylic acid and microcrystalline cellulose. Individual samples were compressed into tablets. The homogeneity of the mixture was evaluated using near-infrared spectroscopy (NIR), observing the standard deviations of the measured spectra. Among the studied mixtures, the best results were achieved with 150 g of mixture, a rotational speed of 900 rpm and a mixing time of 30 seconds.
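The homogeneity criterion used here, the standard deviation of repeated NIR spectra, can be sketched as follows. The absorbance values and the number of wavelengths are made-up illustrative numbers, not data from the thesis:

```python
import math

def spectral_sd(spectra):
    # Standard deviation across repeated spectra at each wavelength,
    # then averaged: low values indicate a well-homogenised blend.
    n_points = len(spectra[0])
    n = len(spectra)
    sds = []
    for j in range(n_points):
        vals = [s[j] for s in spectra]
        mean = sum(vals) / n
        sds.append(math.sqrt(sum((v - mean) ** 2 for v in vals) / (n - 1)))
    return sum(sds) / n_points

# Hypothetical absorbance readings from tablets of a well-mixed
# vs a poorly mixed blend (three tablets, four wavelengths each).
well_mixed = [[0.50, 0.61, 0.44, 0.52],
              [0.51, 0.60, 0.45, 0.52],
              [0.50, 0.62, 0.44, 0.53]]
poorly_mixed = [[0.40, 0.55, 0.35, 0.45],
                [0.60, 0.70, 0.55, 0.60],
                [0.45, 0.50, 0.40, 0.48]]
print(spectral_sd(well_mixed) < spectral_sd(poorly_mixed))  # True
```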
|
6 |
Boundary Conditions of Several Variables Relative to the Robustness of Analysis of Variance Under Violation of the Assumption of Homogeneity of Variances. Grizzle, Grady M., 12 1900 (has links)
The purpose of this study is to determine boundary conditions associated with the number of treatment groups (K), the common treatment group sample size (n), and an index of the extent to which the assumption of equality of treatment population variances is violated (Q), with regard to user confidence in applying the one-way analysis of variance F-test for determining equality of treatment population means. The study concludes that the analysis of variance F-test is robust when the number of treatment groups is less than seven and the extreme ratio of variances is less than 1:5, but when the violation of the assumption is more severe or the number of treatment groups is seven or more, serious discrepancies between actual and nominal significance levels occur. It was also concluded that for seven treatment groups, confidence in the application of the analysis of variance should be limited to values of Q and n such that n is greater than or equal to 10 ln(Q/2). For nine treatment groups, it was concluded that confidence be limited to those values of Q and n such that n is greater than or equal to -2/3 + 12 ln(Q/2). No definitive boundary could be developed for analyses with five treatment groups.
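The boundary conditions above can be turned into a small helper. Reading the abstract's "ln (1/2)Q" as ln(Q/2) is an assumption about the intended notation:

```python
import math

def min_sample_size(k, q):
    # Smallest common group size n for which the one-way ANOVA F-test
    # remains trustworthy, per the boundaries stated in the abstract,
    # given K treatment groups and an extreme variance ratio index Q.
    if k == 7:
        return math.ceil(10 * math.log(q / 2))
    if k == 9:
        return math.ceil(-2 / 3 + 12 * math.log(q / 2))
    raise ValueError("boundaries were derived only for K = 7 and K = 9")

print(min_sample_size(7, 10))  # 10 * ln(5) ~ 16.1, so n >= 17
print(min_sample_size(9, 10))  # -2/3 + 12 * ln(5) ~ 18.6, so n >= 19
```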
|
7 |
Hypothesis Testing in Finite Mixture Models. Li, Pengfei, 11 December 2007 (has links)
Mixture models provide a natural framework for
unobserved heterogeneity in a population.
They are widely applied in astronomy, biology,
engineering, finance, genetics, medicine, social sciences,
and other areas.
An important first step for using mixture models is the test
of homogeneity. Before one tries to fit a mixture model,
it might be of value to know whether the data arise from a
homogeneous or heterogeneous population. If the data are
homogeneous, it is not even necessary to go into mixture modeling.
The rejection of the homogeneous model may also have scientific implications.
For example, in classical statistical genetics,
it is often suspected that only a subgroup of patients have a
disease gene which is linked to the marker. Detecting
the existence of this subgroup amounts to the rejection of
a homogeneous null model in favour of a two-component
mixture model. This problem has attracted intensive
research recently. This thesis makes substantial contributions
in this area of research.
Due to partial loss of identifiability, classic inference methods
such as the likelihood ratio test (LRT) lose their usual elegant
statistical properties. The limiting distribution of the LRT
often involves complex Gaussian processes,
which can be hard to implement in data analysis.
The modified likelihood ratio test (MLRT) is found to be a nice
alternative to the LRT. It restores identifiability by introducing
a penalty to the log-likelihood function.
Under some mild conditions,
the limiting distribution of the MLRT is
1/2\chi^2_0+1/2\chi^2_1,
where \chi^2_{0} is a point mass at 0.
This limiting distribution is convenient to use in real data analysis.
The choice of the penalty functions in the MLRT is very flexible.
A good choice of the penalty enhances the power of the MLRT.
In this thesis, we first introduce a new class of penalty functions,
with which the MLRT enjoys a significantly improved power for testing
homogeneity.
The main contribution of this thesis is to propose a new class of
methods for testing homogeneity. Most existing methods in the
literature for testing homogeneity are, explicitly or implicitly,
derived under the condition of finite Fisher information and a
compactness assumption on the space of the mixing parameters. The
finite Fisher information condition precludes their use for many
important mixture models, such as mixtures of geometric
distributions, mixtures of exponential distributions and, more
generally, mixture models in scale distribution families. The
compactness assumption often forces practitioners to set artificial
bounds for the parameters of interest and makes the resulting
limiting distribution dependent on these bounds. Consequently,
developing a method without such restrictions is a dream of many
researchers. As it will be seen, the proposed EM-test in this thesis
is free of these shortcomings.
The EM-test combines the merits of the classic LRT and score test.
The properties of the EM-test are particularly easy to investigate
under single parameter mixture models.
It has a simple limiting distribution
0.5\chi^2_0+0.5\chi^2_1, the same as the MLRT.
This result is applicable to mixture models without requiring
the restrictive regularity conditions described earlier.
The normal mixture model is a very popular model in applications.
However it does not satisfy the strong identifiability condition,
which imposes substantial technical difficulties in the study of the
asymptotic properties. Most existing methods do not directly apply
to the normal mixture models, so the asymptotic properties have to
be developed separately. We investigate the use of the EM-test to
normal mixture models and its limiting distributions are derived.
For the homogeneity test in the presence of the structural
parameter, the limiting distribution is a simple function of the
0.5\chi^2_0+0.5\chi^2_1 and \chi^2_1 distributions. The test
with this limiting distribution is still very convenient to
implement. For normal mixtures in both mean and variance parameters,
the limiting distribution of the EM-test is found to be \chi^2_2.
Mixture models are also widely used in the analysis of the
directional data. The von Mises distribution is often regarded as
the circular normal model. Interestingly, it satisfies the strong
identifiability condition and the parameter space of the mean
direction is compact. However, the theoretical results for
single-parameter mixture models cannot be directly applied to von
Mises mixture models. Because of this, we also study the application of
the EM-test to von Mises mixture models in the presence of the
structural parameter. The limiting distribution of the EM-test is
also found to be 0.5\chi^2_0+0.5\chi^2_1.
Extensive simulation results are obtained to examine the precision
of the approximation of the limiting distributions to the finite
sample distributions of the EM-test. The type I errors with the
critical values determined by the limiting distributions are found
to be close to nominal values. In particular, we also propose
several precision enhancing methods, which are found to work well.
Real data examples are used to illustrate the use of the EM-test.
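The 1/2\chi^2_0 + 1/2\chi^2_1 limiting distribution is indeed simple to use in practice: the point mass at zero halves the usual \chi^2_1 tail probability. A minimal sketch:

```python
import math

def mixture_pvalue(stat):
    # P-value under the 0.5*chi2_0 + 0.5*chi2_1 limiting distribution.
    # The chi2_0 component is a point mass at 0, so for stat > 0 the
    # tail probability is half the chi2_1 tail. Uses the identity
    # P(chi2_1 > t) = erfc(sqrt(t/2)).
    if stat <= 0:
        return 1.0
    return 0.5 * math.erfc(math.sqrt(stat / 2))

# The 5% critical value is the 90th percentile of chi2_1 (about 2.706),
# not the usual 95th percentile (3.841).
print(mixture_pvalue(2.706))
```

This also makes clear why using the plain \chi^2_1 critical value of 3.841 would be conservative for a homogeneity test with this limiting law.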
|
9 |
Exploiting tightly-coupled cores. Bates, Daniel, January 2014 (has links)
As we move steadily through the multicore era, and the number of processing cores on each chip continues to rise, parallel computation becomes increasingly important. However, parallelising an application is often difficult because of dependencies between different regions of code which require cores to communicate. Communication is usually slow compared to computation, and so restricts the opportunities for profitable parallelisation. In this work, I explore the opportunities provided when communication between cores has a very low latency and low energy cost. I observe that there are many different ways in which multiple cores can be used to execute a program, allowing more parallelism to be exploited in more situations, and also providing energy savings in some cases. Individual cores can be made very simple and efficient because they do not need to exploit parallelism internally. The communication patterns between cores can be updated frequently to reflect the parallelism available at the time, allowing better utilisation than specialised hardware which is used infrequently. In this dissertation I introduce Loki: a homogeneous, tiled architecture made up of many simple, tightly-coupled cores. I demonstrate the benefits in both performance and energy consumption which can be achieved with this arrangement and observe that it is also likely to have lower design and validation costs and be easier to optimise. I then determine exactly where the performance bottlenecks of the design are, and where the energy is consumed, and look into some more-advanced optimisations which can make parallelism even more profitable.
|
10 |
Génération automatique de questionnaires à choix multiples pédagogiques : évaluation de l'homogénéité des options / Automatic generation of educational multiple-choice questions: evaluation of option homogeneity. Pho, Van-Minh, 24 September 2015 (has links)
Recent years have seen a revival of Intelligent Tutoring Systems. In order to make these systems widely usable by teachers and learners, they have to provide means to assist teachers in their task of exercise generation. Among these exercises, multiple-choice tests are very common. However, writing Multiple-Choice Questions (MCQs) that correctly assess a learner's level is a complex task. Guidelines have been developed for writing MCQs manually, but an automatic evaluation of MCQ quality would be a useful tool for teachers. We are interested in the automatic evaluation of distractor (wrong answer choice) quality. To this end, we studied the characteristics of relevant distractors drawn from multiple-choice test writing guidelines. This study led us to assume that homogeneity between the distractors and the answer is an important criterion for validating distractors. Homogeneity is both syntactic and semantic. We validated this definition of homogeneity through an analysis of an MCQ corpus, and from this analysis we proposed methods for the automatic recognition of syntactic and semantic homogeneity. We then focused on the semantic homogeneity of distractors. To estimate it automatically, we proposed a learning-to-rank model combining different semantic homogeneity measures. Evaluation of the model showed that our method is more effective than existing work at estimating the semantic homogeneity of distractors.
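As a deliberately simplified stand-in for the semantic homogeneity measures combined in the ranking model (the actual features in the thesis are richer), a distractor can be scored by its lexical-overlap similarity with the correct answer. The question and answer choices below are invented examples:

```python
import math
from collections import Counter

def cosine_overlap(a, b):
    # Bag-of-words cosine similarity: a crude proxy for the semantic
    # homogeneity between a distractor and the correct answer.
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

answer = "the mitochondrion produces ATP"
distractors = [
    "the ribosome produces ATP",       # homogeneous with the answer
    "Paris is the capital of France",  # clearly heterogeneous
]
ranked = sorted(distractors, key=lambda d: cosine_overlap(answer, d),
                reverse=True)
print(ranked[0])
```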
|