81

Data analysis and results of the upgraded CRESST dark matter search

McGowan, Richard January 2010 (has links)
CRESST has an established analysis procedure to evaluate the energy of the events it detects, in an attempt to detect WIMP dark matter. It was shown that unless eight classes of contaminant event were removed prior to this analysis, the output energy spectrum would be significantly biased. For both scientific and practical reasons, the removal process should be blind, and a series of cuts were developed to flag these events automatically, without removing any true events. An event simulation package was developed to optimise these cuts. It was shown that noise fluctuations could also reduce CRESST’s sensitivity, so a noise-dependent acceptance region was introduced to resolve this. The upgraded CRESST experiment included a new electronics system to provide heating and bias currents for 66 detectors. This system was integrated into the CRESST set-up, and it was shown that the electronics contributed no extra noise to the detectors. Data with an exposure of 50 kg days were analysed using the cuts and the noise-dependent acceptance. The cuts were successful, with no contaminant event retained and a live time reduction of just 2.3%. The data were used to set an upper limit on the WIMP-nucleon cross section for elastic scattering with a minimum of 6.3 × 10^(−7) pb at a WIMP mass of 61 GeV. This is a factor of 2.5 better than the previous best CRESST limit.
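For orientation only, a minimal sketch of how an event-count upper limit translates into a cross-section upper limit in a counting experiment. The zero-background Poisson treatment and all numbers below are illustrative assumptions; they are not the limit-setting procedure or the inputs used in the thesis.

```python
from scipy import stats

def poisson_upper_limit(n_obs, cl=0.90):
    """Upper limit on a Poisson mean given n_obs observed events,
    using the chi-squared / Poisson-CDF relation."""
    return 0.5 * stats.chi2.ppf(cl, 2 * (n_obs + 1))

# Illustrative numbers only (not taken from the thesis).
n_observed = 0              # events surviving all cuts in the acceptance region
exposure_kg_days = 50.0     # detector exposure
rate_per_pb = 1.2e5         # hypothetical expected events per kg·day per pb
                            # of WIMP-nucleon cross section at a fixed WIMP mass

n_up = poisson_upper_limit(n_observed)            # ~2.30 events at 90% CL
sigma_up = n_up / (rate_per_pb * exposure_kg_days)
print(f"90% CL cross-section upper limit: {sigma_up:.2e} pb")
```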
82

Determinacy-related Consequences on Limit Superiors

Walker, Daniel 05 1900 (has links)
Laczkovich proved from ZF that, given a countable sequence of Borel sets on a perfect Polish space, if the limit superior along every subsequence was uncountable, then there was a particular subsequence whose intersection actually contained a perfect subset. Komjath later expanded the result to hold for analytic sets. In this paper, by adding AD and sometimes V=L(R) to our assumptions, we will extend the result further. This generalization will include the increasing of the length of the sequence to certain uncountable regular cardinals as well as removing any descriptive requirements on the sets.
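For reference, a standard statement of the limit superior of a sequence of sets (the points lying in infinitely many of the sets), together with its restriction to a subsequence indexed by an infinite set I ⊆ ω, which is the object whose uncountability is assumed in the results above:

```latex
\limsup_{n\to\infty} A_n \;=\; \bigcap_{n\in\omega}\,\bigcup_{k\ge n} A_k ,
\qquad
\limsup_{n\in I} A_n \;=\; \bigcap_{n\in I}\,\bigcup_{k\in I,\;k\ge n} A_k .
```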
83

Inverse Limit Spaces

Williams, Stephen Boyd 12 1900 (has links)
Inverse systems, inverse limit spaces, and bonding maps are defined. An investigation is made of the properties that an inverse limit space inherits, depending on the conditions placed on the factor spaces and bonding maps. Conditions necessary to ensure that the inverse limit space is compact, connected, locally connected, and semi-locally connected are examined. A mapping from one inverse system to another is defined, and the nature of the function between the respective inverse limits, induced by this mapping, is investigated. Certain restrictions guarantee that the induced function is continuous, onto, monotone, periodic, or open. It is also shown that any compact metric space is the continuous image of the Cantor set. Finally, any compact Hausdorff space is characterized as the inverse limit of an inverse system of polyhedra.
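For reference, the standard definition assumed throughout (not quoted from the thesis): given factor spaces X_i and bonding maps f_i : X_{i+1} → X_i, the inverse limit is the subspace of the product consisting of all compatible threads,

```latex
\varprojlim\,\{X_i, f_i\} \;=\; \Bigl\{ (x_i)_{i\ge 1} \in \prod_{i=1}^{\infty} X_i \;:\; x_i = f_i(x_{i+1}) \text{ for every } i \Bigr\},
```

with the subspace topology inherited from the product topology.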
84

Limita ve středoškolské matematice / The limit in high school mathematics

Podobník, Ľuboš January 2011 (has links)
No description available.
85

Shluky volatility a dynamika poptávky a nabídky / Volatility bursts and order book dynamics

Plačková, Jana January 2011 (has links)
Title: Volatility bursts and order book dynamics Author: Jana Plačková Department: Department of Probability and Mathematical Statistics Supervisor: Dr. Jan M. Swart Supervisor's e-mail address: swart@utia.cas.cz Abstract: The presented work studies the dynamics of supply and demand through the electronic order book. We describe and define the basic rules of the order book and its dynamics. We also define limit and market orders, describe the differences between them, and explain how they influence the evolution of the ask price, the bid price and the spread. The next part is dedicated to the description and definition of volatility and its basic models, followed by a brief overview of volatility clustering and its modeling by economists and physicists. In the last part we introduce a simple model of the order book in which we observe the ask price, the bid price and the spread; we study the empirical distribution of the spread and try to find its probability distribution, and volatility clustering is observed through the relative returns of the spread. We also suggest some possible improvements of the model. Keywords: volatility clustering, order book, limit orders, market orders
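A minimal sketch of the kind of order-book bookkeeping described above, using a simple price-level dictionary representation. The class, its names and its matching rule are illustrative assumptions; the model studied in the thesis is not reproduced here.

```python
class SimpleOrderBook:
    """Toy limit order book tracking best bid, best ask and the spread."""

    def __init__(self):
        self.bids = {}   # price -> total resting buy volume
        self.asks = {}   # price -> total resting sell volume

    def limit_order(self, side, price, volume):
        book = self.bids if side == "buy" else self.asks
        book[price] = book.get(price, 0) + volume

    def market_order(self, side, volume):
        # A market buy consumes the cheapest asks; a market sell the highest bids.
        book = self.asks if side == "buy" else self.bids
        prices = sorted(book) if side == "buy" else sorted(book, reverse=True)
        for p in prices:
            if volume <= 0:
                break
            traded = min(volume, book[p])
            book[p] -= traded
            volume -= traded
            if book[p] == 0:
                del book[p]

    def best_bid(self):
        return max(self.bids) if self.bids else None

    def best_ask(self):
        return min(self.asks) if self.asks else None

    def spread(self):
        bid, ask = self.best_bid(), self.best_ask()
        return ask - bid if bid is not None and ask is not None else None

book = SimpleOrderBook()
book.limit_order("buy", 99.5, 10)
book.limit_order("sell", 100.5, 5)
book.market_order("buy", 2)          # eats into the best ask
print(book.best_bid(), book.best_ask(), book.spread())
```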
86

Hodnocení úspěšnosti účastníků praktické výuky letního a zimního přežití / Evaluating the success of participants in practical summer and winter survival training.

Šika, Vojtěch January 2016 (has links)
Title: Evaluating the success of participants in practical summer and winter survival training. Objectives: The aim of this thesis was to evaluate and interpret data collected over a period of several years from observations made by experts in the field of survival. Based on the winter and summer survival courses, the goal was to observe whether there are differences between the successful men and women who cope best with stress. The observed participants were generally students at Charles University. Methods: Data were collected during the survival courses by observing instructors, who in no way influenced the behavior of the students in any of the activities associated with survival. Behavior was evaluated on the basis of six criteria established by the survival instructors. The students also completed the five-factor NEOPS questionnaire, which was used to obtain a personality profile of each student. Results: Eysenck's coordinate model of temperament was used to interpret the results. Individual personality profiles were placed in this model according to two fundamental dimensions, neuroticism and extroversion. Over 72% of the students participating in the survival seminar belong to the sanguine temperament, i.e. persons who are emotionally stable and open-minded. Along...
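Eysenck's coordinate model places a person in one of four classical temperament quadrants according to the extroversion and neuroticism scores. A minimal sketch of that mapping follows; the midpoint thresholding is an illustrative assumption, not the scoring used in the thesis.

```python
def eysenck_quadrant(extroversion, neuroticism, midpoint=0.0):
    """Map two centred scores onto Eysenck's four classical temperaments.
    Scores above `midpoint` count as high; the midpoint is an illustrative
    assumption, not a calibrated cut-off."""
    extroverted = extroversion > midpoint
    stable = neuroticism <= midpoint
    if extroverted and stable:
        return "sanguine"        # stable extrovert
    if extroverted and not stable:
        return "choleric"        # unstable extrovert
    if not extroverted and stable:
        return "phlegmatic"      # stable introvert
    return "melancholic"         # unstable introvert

print(eysenck_quadrant(extroversion=1.2, neuroticism=-0.4))  # -> sanguine
```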
87

Estimation of an upper tolerance limit for small samples containing observations below the limit of quantitation

Yan, Donglin January 1900 (has links)
Master of Science / Department of Statistics / Christopher I. Vahl / Chemicals and drugs applied to animals used in meat production often have the potential to cause adverse effects on human consumers. To ensure safety, a withdrawal period, i.e. the minimum time allowed between application of the drug and entry of the animal into the food supply, must be determined for each drug used on food-producing animals. The withdrawal period is based on an upper tolerance limit at a given time point. It is not uncommon for the concentration of the drug in some tissue samples to be measured at a level below the limit of quantitation (LOQ). Because the tissue concentration cannot be determined with enough precision, these observations are often treated as left censored, with the censoring value equal to the limit of quantitation. Several methods are commonly used in practice to deal with this situation. The simplest methods either exclude observations below the limit of quantitation or replace those values with zero, the LOQ, or ½ LOQ. Previous studies have shown that these methods result in biased estimation of the population mean and population variance. Alternatively, one could incorporate censoring into the likelihood and compute the maximum likelihood estimates (MLEs) of the population mean and variance assuming a normal or lognormal distribution. These estimates are also biased, but it has been shown that they are asymptotically unbiased. However, it is not yet clear how these various methods affect estimation of the upper tolerance limit, especially when the sample size is small, e.g. less than 35. In this report, we examine through simulation the effects of substituting the LOQ or ½ LOQ for censored values, as well as using the MLEs of the mean and variance, in the construction of an upper tolerance limit for a normal population. Additionally, we propose a modified substitution method in which observations below the LOQ are replaced by functions of the order statistics of the non-censored observations under an assumption of symmetry. Its performance relative to the above methods is also evaluated in the simulation study. Finally, the results of this study are applied to an environmental study.
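A minimal sketch of the two ingredients discussed above: maximum likelihood estimation of a normal mean and standard deviation from left-censored data, and a one-sided upper tolerance limit built from the estimates. The coverage and confidence levels, the simulated numbers, and the plug-in of censored-data MLEs into the standard tolerance factor are illustrative assumptions, not the exact procedures compared in the report.

```python
import numpy as np
from scipy import stats, optimize

def censored_normal_mle(observed, n_censored, loq):
    """MLE of (mu, sigma) for a normal sample in which `n_censored`
    observations fell below the limit of quantitation `loq`."""
    def neg_loglik(theta):
        mu, log_sigma = theta
        sigma = np.exp(log_sigma)
        ll = stats.norm.logpdf(observed, mu, sigma).sum()
        ll += n_censored * stats.norm.logcdf(loq, mu, sigma)
        return -ll
    start = np.array([np.mean(observed), np.log(np.std(observed) + 1e-6)])
    res = optimize.minimize(neg_loglik, start, method="Nelder-Mead")
    return res.x[0], np.exp(res.x[1])

def upper_tolerance_limit(mu, sigma, n, coverage=0.95, confidence=0.95):
    """One-sided upper tolerance limit mu + k*sigma for a normal population,
    with the tolerance factor k taken from the noncentral t distribution."""
    delta = stats.norm.ppf(coverage) * np.sqrt(n)
    k = stats.nct.ppf(confidence, df=n - 1, nc=delta) / np.sqrt(n)
    return mu + k * sigma

# Small simulated example (illustrative numbers only).
rng = np.random.default_rng(1)
sample = rng.normal(loc=2.0, scale=0.8, size=20)
loq = 1.5
observed = sample[sample >= loq]
mu_hat, sigma_hat = censored_normal_mle(observed, np.sum(sample < loq), loq)
print(upper_tolerance_limit(mu_hat, sigma_hat, n=len(sample)))
```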
88

Construction and Approximation of Stable Lévy Motion with Values in Skorohod Space

Saidani, Becem 12 August 2019 (has links)
Under an appropriate regular variation condition, the affinely normalized partial sums of a sequence of independent and identically distributed random variables converge weakly to a non-Gaussian stable random variable. A functional version of this result is known to be true as well, the limit process being a stable Lévy process. In this thesis, we develop an explicit construction of the α-stable Lévy motion with values in D([0, 1]), considering the cases α < 1 and α > 1 separately. The case α < 1 is the simplest, since we can work with the uniform topology of the sup-norm on D([0, 1]) and the construction follows more or less from classical techniques. The case α > 1 required more work. In particular, we encountered two problems: one was related to the construction of a modification of this process (for all time) which is right-continuous and has left limits with respect to the J1 topology. This problem was solved by using the Itô-Nisio theorem. The other problem was more difficult, and we only managed to solve it by developing a criterion for tightness of probability measures on the space of càdlàg functions on [0, T] with values in D([0, 1]), equipped with a generalization of Skorohod's J1 topology. In parallel with the construction of the infinite-dimensional process Z, we focus on the functional extension of Roueff and Soulier [29]. This part of the thesis was completed using the method of point processes, which gave the convergence of the truncated sum. The case α > 1 again required more work due to the presence of centering; for this case, we developed an ad hoc result regarding the continuity of addition for functions on [0, T] with values in D([0, 1]), tailored to our problem.
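A minimal real-valued sketch of the approximation underlying the construction: partial sums of i.i.d. α-stable increments, rescaled so that the discrete path approximates a standard α-stable Lévy motion on [0, 1]. This illustrates only the one-dimensional picture; the D([0, 1])-valued processes studied in the thesis are not captured here, and the parameter choices are arbitrary.

```python
import numpy as np
from scipy import stats

# Discrete-time approximation of a real-valued alpha-stable Levy motion on [0, 1]:
# i.i.d. standard alpha-stable increments on a grid of mesh 1/n, rescaled by n^(-1/alpha).
alpha, n = 1.5, 2000                      # 1 < alpha < 2, the harder case in the thesis
increments = stats.levy_stable.rvs(alpha, beta=0.0, size=n, random_state=42)
times = np.arange(1, n + 1) / n
path = np.cumsum(increments) * n ** (-1.0 / alpha)   # Z_{k/n}, k = 1, ..., n

# The occasional very large increment produces the jumps characteristic of the
# cadlag (right-continuous with left limits) sample paths discussed above.
print(times[np.argmax(np.abs(increments))], np.max(np.abs(increments)))
```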
89

Empirical likelihood method for segmented linear regression

Unknown Date (has links)
For a segmented regression system with an unknown change-point over two domains of a predictor, a new empirical likelihood ratio test statistic is proposed to test the null hypothesis of no change. The proposed method is non-parametric and relaxes the assumption on the error distribution. Under the null hypothesis of no change, the proposed test statistic is shown empirically to be Gumbel distributed, with robust location and scale parameters, under various parameter settings and error distributions. Under the alternative hypothesis with a change-point, comparisons with two other methods (Chen's SIC method and Muggeo's SEG method) show that the proposed method performs better when the slope change is small. A power analysis is conducted to illustrate the performance of the test. The proposed method is also applied to analyze two real datasets: the plasma osmolality dataset and the gasoline price dataset. / by Zhihua Liu. / Thesis (Ph.D.)--Florida Atlantic University, 2011.
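A sketch of the basic segmented-regression scan that change-point tests of this kind are built on: fit a broken-stick model at each candidate change point and keep the best fit. The ordinary least-squares comparison below is only a generic stand-in; it is not the empirical likelihood ratio statistic proposed in the dissertation, and all names and data are illustrative.

```python
import numpy as np

def broken_stick_rss(x, y, tau):
    """Residual sum of squares for the model y ~ b0 + b1*x + b2*(x - tau)_+ ."""
    design = np.column_stack([np.ones_like(x), x, np.clip(x - tau, 0.0, None)])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    resid = y - design @ coef
    return float(resid @ resid)

def scan_change_point(x, y):
    """Grid-search the interior x values for the best-fitting change point."""
    candidates = np.unique(x)[2:-2]          # keep a few points on each side
    rss = [broken_stick_rss(x, y, tau) for tau in candidates]
    return candidates[int(np.argmin(rss))], min(rss)

# Illustrative data with a mild slope change at x = 6 (numbers are made up).
rng = np.random.default_rng(3)
x = np.linspace(0, 10, 80)
y = 1.0 + 0.5 * x + 0.4 * np.clip(x - 6.0, 0.0, None) + rng.normal(0, 0.3, x.size)
tau_hat, _ = scan_change_point(x, y)
print(f"estimated change point: {tau_hat:.2f}")
```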
90

The enumeration of lattice paths and walks

Unknown Date (has links)
A well-known, long-standing problem in combinatorics and statistical mechanics is to find the generating function for self-avoiding walks (SAW) on a two-dimensional lattice, enumerated by perimeter. A SAW is a sequence of moves on a square lattice which does not visit the same point more than once. It has been considered by more than one hundred researchers in the past one hundred years, including George Polya, Tony Guttmann, Laszlo Lovasz, Donald Knuth, Richard Stanley, Doron Zeilberger, Mireille Bousquet-Mélou, Thomas Prellberg, Neal Madras, Gordon Slade, Agnes Dittel, E.J. Janse van Rensburg, Harry Kesten, Stuart G. Whittington, Lincoln Chayes, Iwan Jensen, Arthur T. Benjamin, and many others. More than three hundred papers and a few volumes of books have been published in this area. A SAW is interesting for simulations because its properties cannot be calculated analytically, and calculating the number of self-avoiding walks is a common computational problem. A recently proposed model called prudent self-avoiding walks (PSAW) was first introduced to the mathematics community in an unpublished manuscript of Préa, who called them exterior walks. A prudent walk is a connected path on the square lattice such that, at each step, the extension of that step along its current trajectory will never intersect any previously occupied vertex. A lattice path is composed of connected horizontal and vertical line segments, each passing between adjacent lattice points. We will discuss some enumerative problems in self-avoiding walks, lattice paths and walks with several step vectors. Many open problems are posed. / by Shanzhen Gao. / Thesis (Ph.D.)--Florida Atlantic University, 2011.
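A minimal sketch of the most direct way to enumerate self-avoiding walks on the square lattice: recursive backtracking over the four step directions, never revisiting a vertex. This brute-force count (4, 12, 36, 100, 284, 780 for n = 1, ..., 6) is only illustrative and does not reflect the more sophisticated enumeration techniques discussed in the dissertation.

```python
def count_saws(n):
    """Number of n-step self-avoiding walks on the square lattice Z^2,
    starting from the origin, counted by brute-force backtracking."""
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    visited = {(0, 0)}

    def extend(x, y, remaining):
        if remaining == 0:
            return 1
        total = 0
        for dx, dy in steps:
            nxt = (x + dx, y + dy)
            if nxt not in visited:          # self-avoidance constraint
                visited.add(nxt)
                total += extend(nxt[0], nxt[1], remaining - 1)
                visited.remove(nxt)
        return total

    return extend(0, 0, n)

print([count_saws(n) for n in range(1, 7)])   # [4, 12, 36, 100, 284, 780]
```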
