  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Multivariate analysis of flow cytometry data

Collins, Gary Stephen January 2000 (has links)
No description available.
2

UTeach summer masters statistics course : a journey from traditional to Bayesian analysis

Fitzpatrick, Daniel Lee 05 January 2011 (has links)
This paper outlines key parts of the Statistics course offered through the UTeach Summer Master’s Program as taught by Dr. Martha K. Smith. The paper begins by introducing the normal probability density function, which is derived using calculus techniques and Euclidean geometry. Probability is discussed at great length in Smith’s course, and the importance of understanding probability in statistical analysis is demonstrated through a reference to a study on how medical doctors confuse false positives in breast cancer testing. The frequentist perspective concludes with a proof that the normal probability density function tends to zero in the tails. The shift from traditional to Bayesian inference begins with a brief introduction to the terminology involved, as well as an example with patient testing. The pros and cons of Bayesian inference are discussed, and a proof is shown using the normal probability density function to find a Bayes estimate for µ. It is argued that a Statistics course moving from traditional to Bayesian analysis, such as that offered by the UTeach Summer Master’s Program and Smith, would supplement the traditional Statistics course offered at most universities. Such a course would be relevant to mathematics majors, mathematics educators, professionals in the medical industry, and anyone seeking new ways to understand data sets.
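The base-rate confusion mentioned above can be made concrete with a short Bayes' rule calculation. The screening numbers below (1% prevalence, 90% sensitivity, 9% false-positive rate) are illustrative assumptions, not figures from the study the abstract cites:

```python
# Bayes' rule for a screening test. All rates are assumed for
# illustration; they are not taken from the cited study.
prevalence = 0.01          # P(disease)
sensitivity = 0.90         # P(positive | disease)
false_positive_rate = 0.09 # P(positive | no disease)

# P(positive) by the law of total probability.
p_positive = (sensitivity * prevalence
              + false_positive_rate * (1 - prevalence))

# P(disease | positive) by Bayes' rule: despite a "90% accurate" test,
# most positives are false because the disease is rare.
posterior = sensitivity * prevalence / p_positive
print(f"P(disease | positive test) = {posterior:.3f}")  # roughly 0.092
```

Even with these generous assumptions, fewer than one positive test in ten indicates disease, which is the kind of result the cited study found doctors misjudge.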
3

Optimal Subsampling of Finite Mixture Distribution

Neupane, Binod Prasad 05 1900 (has links)
A mixture distribution is a compounding of statistical distributions, which arises when sampling from heterogeneous populations with a different probability density function in each component. A finite mixture has a finite number of components. In the past decade the extent and the potential of the applications of finite mixture models have widened considerably.

The objective of this project is to add functionality to the package 'mixdist', developed by Du and Macdonald (Du 2002) and Gao (2004) in the R environment (R Development Core Team 2004), for estimating the parameters of a finite mixture distribution from data grouped in bins together with conditional data. Mixed data together with conditional data provide better estimates of the parameters than mixed data alone. Our main objective is to obtain the optimal sample size for each bin of the mixed data from which to collect conditional data, given approximate values of the parameters and the distributional form of the mixture for the given data. We have also replaced the optimizer nlm with optim in the function mix so that limits can be placed on the parameters.

Our purpose is to provide easily available tools for modeling fish growth using mixture distributions; however, the package has a number of applications in other areas as well. / Thesis / Master of Science (MSc)
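The grouped-data setting the project works with can be sketched as follows: for a normal mixture with assumed component parameters (in practice these would come from mixdist's fit, not be fixed by hand), the expected proportion of observations in each bin is a weighted sum of component CDF differences. All parameter values below are hypothetical:

```python
import math

def norm_cdf(x, mu, sigma):
    """Normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def mixture_bin_probs(edges, pis, mus, sigmas):
    """Expected proportion of observations in each bin for a normal
    mixture: sum over components of (mixing proportion) times the
    component probability mass falling in the bin."""
    probs = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        p = sum(pi * (norm_cdf(hi, m, s) - norm_cdf(lo, m, s))
                for pi, m, s in zip(pis, mus, sigmas))
        probs.append(p)
    return probs

# Toy fish-length example: two age groups with overlapping lengths.
edges = [0, 10, 20, 30, 40]
probs = mixture_bin_probs(edges, pis=[0.6, 0.4], mus=[12, 25], sigmas=[3, 4])
```

These expected bin proportions are the quantities a grouped-data likelihood compares against the observed bin counts.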
4

Sea spike modeling

Kuo, Chin-Chuan 12 1900 (has links)
Approved for public release; distribution is unlimited / In this thesis a clutter voltage model for scattering from the sea surface is developed. A model for scattering from a whitecap and a wave-breaking occurrence model are combined to simulate the backscattered signal from one radar resolution cell. The simulations performed obtained the probability density function of sea clutter under different assumptions about wind velocities and wave-breaking conditions. The model incorporates measured quantities such as the mean clutter voltage and the correlation time as parameters, and the probability density function depends on these parameters. The probability density functions obtained do not conform to any familiar simple density function. / http://archive.org/details/seaspikemodeling00kuoc / Lieutenant, Taiwan Navy
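As a rough illustration of the Monte Carlo approach (a generic random-phase scatterer sum, not the thesis's whitecap and wave-breaking occurrence model), the envelope of a coherent sum of many scatterers can be sampled and its mean estimated from the samples:

```python
import math
import random

random.seed(0)

def clutter_sample(n_scatterers=50):
    """One backscattered-voltage envelope sample: the magnitude of a
    coherent sum of unit scatterers with independent random phases,
    normalized so each quadrature component has unit variance."""
    re = im = 0.0
    for _ in range(n_scatterers):
        phase = 2.0 * math.pi * random.random()
        re += math.cos(phase)
        im += math.sin(phase)
    return math.hypot(re, im) / math.sqrt(n_scatterers / 2.0)

# Estimate the envelope statistics by simulation; for many scatterers
# the envelope is approximately Rayleigh, with mean sqrt(pi/2) ~ 1.25.
samples = [clutter_sample() for _ in range(5000)]
mean_envelope = sum(samples) / len(samples)
```

A histogram of `samples` is the simulated probability density function; the thesis's point is that with realistic whitecap and wave-breaking models the resulting density is not one of these familiar forms.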
5

Future projections of daily precipitation and its extremes in simulations of 21st century climate change

Yin, Lei 15 April 2014 (has links)
The current generation of climate models in the Coupled Model Intercomparison Project Phase 5 (CMIP5) is used to assess future changes in daily precipitation and its extremes. The simple average of all the models, i.e. the multi-model ensemble mean (MMEM), has been widely used due to its simplicity and better performance than most individual models. Weighting techniques have also been proposed to deal with systematic biases within the models. However, both methods are designed to reduce uncertainties in the study of the climate mean state, and they induce problems when climate extremes are of interest. We utilize a Bayesian weighting method to investigate the rainfall mean state and perform a probability-density-function-based assessment of daily rainfall extremes. Satellite measurements are used to evaluate the short historical period. The weighting method can only be applied at regional rather than hemispheric scales, and thus three tropical regions are studied: the Amazon, the Congo, and Southeast Asia. The method, based on the Gamma distribution for daily precipitation, is demonstrated to perform much better than the MMEM with respect to extreme events. Use of the Kolmogorov-Smirnov statistic for the distribution assessment indicates the method is most applicable in the three tropical wet land regions mentioned above, consistent with previous studies showing the Gamma distribution is more suitable for daily rainfall in wet regions. Both methods provide consistent results. The three regions display significant changes at the end of the 21st century. The Amazon will be drier, while the Congo will not see large changes in mean rainfall; however, both the Amazon and the Congo will have larger rainfall variability, implying more droughts and floods.
The Amazon will have 7.5% more little-rain days (defined as < 0.5 mm/d) and a 4.5 mm/d larger 95th percentile for 2092-2099, and the Congo will have 2.5% more little-rain days and a 1 mm/d larger 95th percentile. Southeast Asia will be drier in the western part and wetter in the eastern part, which is consistent with the different changes in the 5th percentile; it will also experience heavier rainfall events, with much larger increases in the 95th percentile. The future changes, especially the increase in rainfall extremes, are very likely associated with a strengthening of the hydrological cycle.
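The Gamma-based treatment of daily rainfall described above can be sketched with synthetic data: a method-of-moments Gamma fit, an empirical 95th percentile, and a Kolmogorov-Smirnov statistic against the generating distribution. All numbers are illustrative, not CMIP5 output:

```python
import math
import random
import statistics

random.seed(1)
# Synthetic "daily rainfall" amounts in mm/d, drawn from Gamma(shape=2,
# scale=4) as a stand-in for one model's wet-day precipitation.
rain = sorted(random.gammavariate(2.0, 4.0) for _ in range(3000))

# Method-of-moments Gamma fit: shape k = mean^2/var, scale theta = var/mean.
mean, var = statistics.fmean(rain), statistics.variance(rain)
k_hat, theta_hat = mean ** 2 / var, var / mean

# Empirical 95th percentile, the kind of extreme compared between
# historical and end-of-century periods.
p95 = rain[int(0.95 * len(rain))]

# Kolmogorov-Smirnov statistic against the true Gamma(2, 4) CDF, which
# has the closed form 1 - exp(-x/theta) * (1 + x/theta) for shape 2.
def gamma2_cdf(x, theta=4.0):
    return 1.0 - math.exp(-x / theta) * (1.0 + x / theta)

n = len(rain)
ks = max(max(abs(i / n - gamma2_cdf(x)), abs((i + 1) / n - gamma2_cdf(x)))
         for i, x in enumerate(rain))
```

A small KS statistic indicates the Gamma form describes the sample well; in the study, large KS values outside the wet tropical regions are what limit where the method applies.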
6

Characteristics of the Probability Density Function of Fluctuating Velocity in the Logarithmic Velocity-Profile Region of a Flat-Plate Turbulent Boundary Layer (2nd Report: On the Reynolds Number Dependence)

TSUJI, Yoshiyuki, MIYACHI, Kei, NAKAMURA, Ikuo 03 1900 (has links)
No description available.
7

An introductory survey of probability density function control

Ren, M., Zhang, Qichun, Zhang, J. 03 October 2019 (has links)
Yes / Probability density function (PDF) control investigates controller design approaches in which the random variables of a stochastic process are adjusted to follow desirable distributions; in other words, the shape of the system PDF can be regulated by controller design. Unlike existing stochastic optimization and control methods, the central problem of PDF control is to establish the evolution of the PDF expressions of the system variables. Once the relationship between the control input and the output PDF is formulated, the control objective can be described as obtaining the control input signals that adjust the system output PDFs to follow pre-specified target PDFs. Motivated by the development of data-driven control and state-of-the-art PDF-based applications, this paper summarizes recent research results on PDF control, categorizing the controller design approaches into three groups: (1) system-model-based direct-evolution PDF control; (2) model-based distribution-transformation PDF control; and (3) data-based PDF control. In addition, minimum entropy control, PDF-based filter design, fault diagnosis and probabilistic decoupling design are briefly introduced as theoretical extensions. / De Montfort University - DMU HEIF’18 project, Natural Science Foundation of Shanxi Province [grant number 201701D221112], National Natural Science Foundation of China [grant numbers 61503271 and 61603136]
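The stated control objective, choosing an input so that the output PDF tracks a pre-specified target PDF, can be illustrated with a deliberately simple toy: the output is assumed Gaussian with mean equal to the control input u, and u is tuned by gradient descent to minimize the integrated squared error to a Gaussian target. This sketches the objective only, not any of the surveyed design methods:

```python
import math

def gauss_pdf(x, mu, sigma=1.0):
    """Gaussian density; the assumed (toy) input-to-output-PDF map is
    simply u -> N(u, 1)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

grid = [i * 0.1 for i in range(-100, 101)]          # support for integration
target = [gauss_pdf(x, 3.0) for x in grid]          # pre-specified target PDF

def loss(u):
    """Integrated squared error between output PDF and target PDF."""
    return sum((gauss_pdf(x, u) - t) ** 2 for x, t in zip(grid, target)) * 0.1

# Tune the control input u by finite-difference gradient descent; the
# output distribution's mean is driven toward the target's mean of 3.
u, lr, h = 0.0, 2.0, 1e-4
for _ in range(200):
    u -= lr * (loss(u + h) - loss(u - h)) / (2 * h)
```

Real PDF control replaces the assumed map u -> N(u, 1) with the modeled evolution of the system's output PDF, which is exactly the hard part the survey categorizes approaches to.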
8

A Novel Data-based Stochastic Distribution Control for Non-Gaussian Stochastic Systems

Zhang, Qichun, Wang, H. 06 April 2021 (has links)
Yes / This note presents a novel data-based approach to the non-Gaussian stochastic distribution control problem. As motivation, the drawbacks of existing methods are summarised, for example the neural-network weight training required for unknown stochastic distributions. To overcome these disadvantages, a new transformation for the dynamic probability density function is given by kernel density estimation using interpolation. Based upon this transformation, a representative model is developed and the stochastic distribution control problem is transformed into an optimisation problem. Then, data-based direct optimisation and identification-based indirect optimisation are proposed. In addition, the convergence of the presented algorithms is analysed, and their effectiveness is evaluated by numerical examples. In summary, the contributions of this note are as follows: 1) a new data-based probability density function transformation is given; 2) optimisation algorithms are given based on the presented model; and 3) a new research framework is demonstrated as a potential extension of existing stochastic distribution control.
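A minimal sketch of the kernel density estimation step (the KDE part only, not the authors' interpolation-based transformation or the optimisation built on it) might look like:

```python
import math

def gaussian_kde(samples, bandwidth):
    """Build a pdf estimate from samples by kernel density estimation
    with a Gaussian kernel: the estimate is the average of one kernel
    bump centred at each sample."""
    n = len(samples)
    norm = 1.0 / (n * bandwidth * math.sqrt(2.0 * math.pi))
    def pdf(x):
        return norm * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                          for s in samples)
    return pdf

# Non-Gaussian (bimodal) data: KDE recovers a density with two modes,
# which no single Gaussian model could represent.
data = [-2.1, -1.9, -2.0, 1.8, 2.2, 2.0, 1.9]
pdf = gaussian_kde(data, bandwidth=0.5)

# The estimate should integrate to 1 (Riemann sum over [-6, 6]).
area = sum(pdf(-6.0 + 0.01 * i) * 0.01 for i in range(1201))
```

The bandwidth is a free smoothing parameter here; the note's data-based setting would choose it, and the interpolation grid, from the measured data.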
9

Advanced Image Processing Using Histogram Equalization and Android Application Implementation

Gaddam, Purna Chandra Srinivas Kumar, Sunkara, Prathik January 2016 (has links)
Nowadays, the conditions under which an image is taken may lead to near-zero visibility for the human eye, usually due to lack of clarity caused by atmospheric effects such as haze, fog and other daylight effects. Useful information captured under such scenarios should be enhanced and made clear enough to recognize objects and other details. Many image processing algorithms have been implemented to deal with such issues caused by low light or by haze affecting the imaging device; these algorithms also provide nonlinear contrast enhancement to some extent. We took pre-existing algorithms such as SMQT (Successive Mean Quantization Transform), the V transform and histogram equalization to improve the visual quality of digital pictures of large-range scenes under irregular lighting conditions. These algorithms were applied in two different methods and tested on images affected by low light and colour change, and succeeded in producing enhanced images, with improvements in colour, contrast and accuracy for low-light images. The histogram equalization technique is implemented by interpreting the histogram of the image as a probability density function: the cumulative distribution function is applied to the image to obtain accumulated histogram values, and the pixel values are then changed based on their probability and spread over the histogram. Of these algorithms we chose histogram equalization; MATLAB code was taken as a reference and modified for implementation as an API (Application Program Interface) in Java, and we confirmed that the application works properly with reduced execution time.
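The histogram equalization procedure described here, normalize the histogram into a pdf, accumulate it into a CDF, then remap each pixel through the CDF, can be shown in a few lines (8-bit grayscale assumed; the thesis's implementation is in MATLAB and Java):

```python
def equalize(pixels, levels=256):
    """Histogram equalization: treat the normalized histogram as a pdf,
    accumulate it into a CDF, and remap each pixel through the CDF so
    intensities spread over the full range."""
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    # Cumulative distribution function of pixel intensities.
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total / n)
    # Map each pixel to its CDF value scaled to the intensity range.
    return [round(cdf[p] * (levels - 1)) for p in pixels]

# A low-contrast "image": values bunched in 100..103 get stretched
# across the full 0..255 range.
flat = equalize([100, 100, 101, 102, 103, 103, 103, 103])
```

On a real image the same mapping is applied per channel or to the luminance channel, which is where the colour-handling choices mentioned in the abstract come in.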
10

Action potentials in the peripheral auditory nervous system : a novel PDE distribution model

Gasper, Rebecca Elizabeth 01 July 2014 (has links)
Auditory physiology is nearly unique in the human body because of its small-diameter neurons. When considering a single node on one neuron, the number of channels is very small, so ion fluxes exhibit randomness. Hodgkin and Huxley, in 1952, set forth a system of Ordinary Differential Equations (ODEs) to track the flow of ions in a squid motor neuron, based on a circuit analogy for electric current. This formalism for modeling is still in use today and is useful because its coefficients can be directly measured. To measure the auditory properties of Firing Efficiency (FE) and Post Stimulus Time (PST), we can simply measure the depolarization, or "upstroke," of a node. Hence, we reduce the four-dimensional squid neuron model to a two-dimensional system of ODEs. The stochastic variable m for sodium activation is allowed a random walk in addition to its normal evolution, and the results are drastic. The diffusion coefficient, for spreading, is inversely proportional to the number of channels; for 130 ion channels, D is closer to 1/3 than 0 and cannot be called negligible. A system of Partial Differential Equations (PDEs) is derived in these pages to model the distribution of states of the node with respect to the (nondimensionalized) voltage v and the sodium activation gate m. Initial conditions describe a distribution of (v,m) states; in most experiments, this would be a curve with mode at the resting state. Boundary conditions are Robin (natural) boundary conditions, which give conservation of the population. Evolution of the PDE has a drift term for the mean change of state and a diffusion term for the random change of state. The phase plane is broken into fired and resting regions, which form basins of attraction for fired and resting-state fixed points. If a stimulus causes ions to flow from the resting region into the fired region, this rate of flux is approximately the firing rate, analogous to clinically measuring when the voltage crosses a threshold.
This gives a PST histogram. The FE is an integral of the population over the fired region at a measured stop time after the stimulus (since, in the reduced model, when neurons fire they do not repolarize). This dissertation also includes useful generalizations and methodology for turning other ODEs into PDEs. Within the HH modeling, parameters can be switched for other systems of the body, and may present a similar firing and non-firing separatrix (as in Chapter 3). For any system of ODEs, an advection model can show a distribution of initial conditions or the evolution of a given initial probability density over a state space (Chapter 4); a system of Stochastic Differential Equations can be modeled with an advection-diffusion equation (Chapter 5). As computers increase in speed and as the ability of software to create adaptive meshes and step sizes improves, modeling with a PDE becomes more and more efficient over its ODE counterpart.
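The Chapter 5 correspondence between a system of Stochastic Differential Equations and its advection-diffusion (Fokker-Planck) density can be checked empirically with a toy Ornstein-Uhlenbeck process, whose stationary density is Gaussian with variance σ²/(2θ). The parameters below are arbitrary, not taken from the auditory model:

```python
import random
import statistics

random.seed(2)

def euler_maruyama(x0, drift, diffusion, dt, steps):
    """Simulate one path of dX = drift(X) dt + diffusion dW by the
    Euler-Maruyama scheme; an ensemble of such paths approximates the
    density evolved by the corresponding advection-diffusion PDE."""
    x = x0
    for _ in range(steps):
        x += drift(x) * dt + diffusion * random.gauss(0.0, 1.0) * dt ** 0.5
    return x

# Ornstein-Uhlenbeck toy: dX = -X dt + 0.5 dW. The Fokker-Planck
# stationary density is Gaussian with variance 0.5^2 / 2 = 0.125.
ensemble = [euler_maruyama(0.0, lambda x: -x, 0.5, 0.01, 1000)
            for _ in range(2000)]
var = statistics.pvariance(ensemble)
```

The ensemble variance matching σ²/(2θ) is the simplest instance of the dissertation's general point: a histogram over many SDE paths and a direct PDE solve of the advection-diffusion equation describe the same distribution.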
