1 |
UTeach summer master's statistics course: a journey from traditional to Bayesian analysis. Fitzpatrick, Daniel Lee. 05 January 2011.
This paper outlines some of the key parts of the statistics course offered through the UTeach Summer Master's Program as taught by Dr. Martha K. Smith. The paper begins by introducing the normal probability density function, which is derived using calculus techniques and Euclidean geometry. Probability is discussed at length in Smith's course, and the importance of understanding probability in statistical analysis is demonstrated through a reference to a study on how medical doctors misinterpret false positives in breast cancer testing. The frequentist portion concludes with a proof that the limit of the normal probability density function is zero.
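The false-positive confusion referenced above comes down to a single application of Bayes' theorem. The sketch below uses assumed round numbers (1% prevalence, 90% sensitivity, 9% false-positive rate), not the figures from the cited study:

```python
# Illustrative Bayes' theorem calculation for a screening test.
# The prevalence, sensitivity, and false-positive rate below are
# assumed round numbers, not figures from the study cited above.

def posterior_positive(prevalence, sensitivity, false_positive_rate):
    """P(disease | positive test) via Bayes' theorem."""
    p_positive = (sensitivity * prevalence
                  + false_positive_rate * (1.0 - prevalence))
    return sensitivity * prevalence / p_positive

# With 1% prevalence, 90% sensitivity, and a 9% false-positive rate,
# a positive result implies only about a 9% chance of disease.
p = posterior_positive(0.01, 0.90, 0.09)
print(round(p, 3))  # → 0.092
```

The surprisingly low posterior is exactly the effect the cited study found physicians misjudging.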
The shift from traditional to Bayesian inference begins with a brief introduction to the terminology involved, along with an example of patient testing. The pros and cons of Bayesian inference are discussed, and a Bayes estimate for µ is derived using the normal probability density function.
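The Bayes estimate for µ mentioned above has a closed form in the conjugate normal-normal case; the sketch below assumes a known data variance and uses invented numbers, and is not drawn from Smith's course notes:

```python
# A minimal sketch of the conjugate normal-normal Bayes estimate for
# the mean mu: with a N(mu0, tau0^2) prior and N(mu, sigma^2) data,
# the posterior mean is a precision-weighted average of the prior mean
# and the sample mean. All numbers here are made up for illustration.

def bayes_estimate_mu(data, mu0, tau0_sq, sigma_sq):
    n = len(data)
    xbar = sum(data) / n
    prior_precision = 1.0 / tau0_sq
    data_precision = n / sigma_sq
    return ((prior_precision * mu0 + data_precision * xbar)
            / (prior_precision + data_precision))

# Prior belief mu ~ N(0, 1); three observations with known variance 4.
print(bayes_estimate_mu([2.0, 3.0, 4.0], mu0=0.0, tau0_sq=1.0, sigma_sq=4.0))
```

As the sample grows, the data precision dominates and the estimate approaches the sample mean, which is the frequentist answer.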
It is argued that a statistics course moving from traditional to Bayesian analysis, such as the one offered by the UTeach Summer Master's Program and Smith, would supplement the traditional statistics course offered at most universities. Such a course would be relevant to mathematics majors, mathematics educators, professionals in the medical industry, and anyone seeking new ways to understand data sets.
|
2 |
Sea spike modeling. Kuo, Chin-Chuan. 12 1900.
Approved for public release; distribution is unlimited / In this thesis a clutter voltage model for scattering from the sea surface is developed. A model for scattering from a whitecap and a wave-breaking occurrence model are combined to simulate the backscattered signal from one radar resolution cell. The simulations obtained the probability density function of sea clutter under different assumptions about wind velocities and wave-breaking conditions. The model incorporates measured quantities, such as the mean clutter voltage and the correlation time, as parameters, and the probability density function depends on these parameters. The probability density functions obtained do not conform to any familiar simple density function. / http://archive.org/details/seaspikemodeling00kuoc / Lieutenant, Taiwan Navy
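A Monte Carlo sketch can show how such a simulation yields an empirical clutter pdf. The compound model below (a Rayleigh background voltage with occasional spike events) is a generic stand-in, not the thesis's whitecap model, and every parameter is an assumed illustrative value:

```python
import math
import random

# Generic compound-clutter sketch (not the thesis model): a Rayleigh
# background voltage plus occasional whitecap "spikes", with spike_prob
# standing in for a wave-breaking occurrence model. A normalised
# histogram of the samples approximates the clutter-amplitude pdf.

random.seed(1)

def clutter_sample(mean_voltage=1.0, spike_prob=0.05, spike_gain=5.0):
    sigma = mean_voltage / math.sqrt(math.pi / 2)   # Rayleigh scale from mean
    u = 1.0 - random.random()                       # uniform on (0, 1]
    v = sigma * math.sqrt(-2.0 * math.log(u))       # Rayleigh draw
    if random.random() < spike_prob:                # wave-breaking event
        v *= spike_gain
    return v

samples = [clutter_sample() for _ in range(100_000)]

# Empirical pdf: histogram counts normalised by sample size and bin width.
bins = 50
width = max(samples) / bins
hist = [0] * bins
for v in samples:
    hist[min(int(v / width), bins - 1)] += 1
pdf = [h / (len(samples) * width) for h in hist]
print(sum(v * width for v in pdf))   # integrates to ~1
```

Raising `spike_prob` or `spike_gain` fattens the tail of the estimated pdf, which is the qualitative signature of sea spikes.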
|
3 |
Future projections of daily precipitation and its extremes in simulations of 21st century climate change. Yin, Lei. 15 April 2014.
The current generation of climate models in the Coupled Model Intercomparison Project Phase 5 (CMIP5) is used to assess future changes in daily precipitation and its extremes. The simple average of all the models, i.e. the multi-model ensemble mean (MMEM), has been widely used because of its simplicity and its better performance than most individual models. Weighting techniques have also been proposed to deal with systematic biases within the models. However, both methods are designed to reduce uncertainties in the study of the climate mean state, and they cause problems when climate extremes are of interest.
We utilize a Bayesian weighting method to investigate the rainfall mean state and perform a probability-density-function-based assessment of daily rainfall extremes. Satellite measurements are used to evaluate the short historical period. The weighting method can only be applied at regional rather than hemispheric scales, so three tropical regions are studied: the Amazon, the Congo, and Southeast Asia. The method, based on the Gamma distribution for daily precipitation, is shown to perform much better than the MMEM with respect to extreme events. An assessment of the distributions using the Kolmogorov-Smirnov statistic indicates that the method is most applicable in the three tropical wet land regions mentioned above, consistent with previous studies showing that the Gamma distribution is more suitable for daily rainfall in wet regions. Both methods provide consistent results.
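The distribution-based assessment described above can be sketched in miniature: fit a Gamma distribution to daily rainfall by the method of moments, then compare the fitted and empirical CDFs with the Kolmogorov-Smirnov statistic. The rainfall values below are invented for illustration, and the moment fit stands in for whatever estimator the study actually used:

```python
import math

# Fit a Gamma distribution to daily rainfall (method of moments) and
# score the fit with the Kolmogorov-Smirnov statistic. Rainfall values
# are illustrative, not data from the study.

def gamma_cdf(x, shape, scale):
    """Regularized lower incomplete gamma P(shape, x/scale) via series."""
    if x <= 0:
        return 0.0
    a, z = shape, x / scale
    term = 1.0 / a
    total = term
    n = 1
    while term > 1e-12 * total:
        term *= z / (a + n)
        total += term
        n += 1
    return math.exp(a * math.log(z) - z - math.lgamma(a)) * total

def fit_gamma_moments(data):
    n = len(data)
    mean = sum(data) / n
    var = sum((x - mean) ** 2 for x in data) / n
    return mean * mean / var, var / mean          # shape, scale

def ks_statistic(data, shape, scale):
    xs = sorted(data)
    n = len(xs)
    return max(max(abs((i + 1) / n - gamma_cdf(x, shape, scale)),
                   abs(i / n - gamma_cdf(x, shape, scale)))
               for i, x in enumerate(xs))

rain = [0.2, 1.1, 3.4, 0.8, 7.5, 2.2, 0.5, 12.0, 4.1, 1.9]   # mm/day
shape, scale = fit_gamma_moments(rain)
print(shape, scale, ks_statistic(rain, shape, scale))
```

A small KS statistic supports using the Gamma form for the wet-region rainfall, as the study argues.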
The three regions display significant changes at the end of the 21st century. The Amazon will be drier, while the Congo will not see large changes in mean rainfall. However, both the Amazon and the Congo will have larger rainfall variability, implying more droughts and floods. For 2092-2099 the Amazon will have 7.5% more little-rain days (defined as > 0.5 mm/d) and a 95th percentile larger by 4.5 mm/d, and the Congo will have 2.5% more little-rain days and a 95th percentile larger by 1 mm/d. Southeast Asia will be drier in the western part and wetter in the eastern part, which is consistent with the differing changes in the 5th percentile. It will also experience heavier rainfall events, with much larger increases in the 95th percentile. The future changes, especially the increase in rainfall extremes, are very likely associated with a strengthening of the hydrological cycle.
|
4 |
Characteristics of the probability density function of fluctuating velocity in the log-law region of a flat-plate turbulent boundary layer (2nd report: on the Reynolds-number dependence). Tsuji, Yoshiyuki; Miyachi, Kei; Nakamura, Ikuo. 03 1900.
No description available.
|
5 |
An introductory survey of probability density function control. Ren, M.; Zhang, Qichun; Zhang, J. 03 October 2019.
Probability density function (PDF) control investigates controller design approaches in which the random variables of a stochastic process are adjusted to follow a desirable distribution; in other words, the shape of the system PDF can be regulated by controller design. Unlike existing stochastic optimization and control methods, the central problem of PDF control is to establish the evolution of the PDF expressions of the system variables. Once the relationship between the control input and the output PDF is formulated, the control objective can be described as obtaining the control input signals that adjust the system output PDFs to follow pre-specified target PDFs. Motivated by the development of data-driven control and state-of-the-art PDF-based applications, this paper summarizes recent research results in PDF control, where the controller design approaches can be categorized into three groups: (1) system-model-based direct evolution PDF control; (2) model-based distribution-transformation PDF control; and (3) data-based PDF control. In addition, minimum entropy control, PDF-based filter design, fault diagnosis, and probabilistic decoupling design are briefly introduced as extended applications in a theoretical sense. / De Montfort University - DMU HEIF’18 project, Natural Science Foundation of Shanxi Province [grant number 201701D221112], National Natural Science Foundation of China [grant numbers 61503271 and 61603136]
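The objective described above (pick the input that makes the output pdf track a target pdf) can be shown on a deliberately tiny example. The static system below and all its numbers are assumed for illustration and are not taken from the survey:

```python
import math

# Toy PDF-control objective: for the static stochastic system
# y = u + w with w ~ N(0, 1), the output pdf is N(u, 1). We search
# for the control input u that makes the output pdf track a
# pre-specified target pdf N(2, 1), by minimising the integrated
# squared error between the two densities on a grid.

def normal_pdf(x, mu, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

grid = [i * 0.05 - 5.0 for i in range(301)]          # x in [-5, 10]

def tracking_error(u, target_mu=2.0):
    return sum((normal_pdf(x, u) - normal_pdf(x, target_mu)) ** 2
               for x in grid)

# Coarse search over candidate control inputs.
candidates = [i / 10 for i in range(-30, 51)]
best_u = min(candidates, key=tracking_error)
print(best_u)   # the chosen input moves the output pdf onto the target
```

In a dynamic system the same idea applies, but the map from input to output pdf must first be established, which is exactly the "evolution of the PDF expressions" problem the survey highlights.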
|
6 |
A Novel Data-based Stochastic Distribution Control for Non-Gaussian Stochastic Systems. Zhang, Qichun; Wang, H. 06 April 2021.
This note presents a novel data-based approach to the non-Gaussian stochastic distribution control problem. As motivation, the drawbacks of existing methods are summarised, for example the neural-network weight training required for unknown stochastic distributions. To overcome these disadvantages, a new transformation for the dynamic probability density function is given by kernel density estimation using interpolation. Based upon this transformation, a representative model is developed and the stochastic distribution control problem is transformed into an optimisation problem. Then, data-based direct optimisation and identification-based indirect optimisation are proposed. In addition, the convergence of the presented algorithms is analysed and their effectiveness is evaluated by numerical examples. In summary, the contributions of this note are as follows: 1) a new data-based probability density function transformation is given; 2) optimisation algorithms are given based on the presented model; and 3) a new research framework is demonstrated as a potential extension to existing stochastic distribution control.
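The kernel density estimation step mentioned above can be sketched as follows. This is a plain Gaussian KDE on invented data; the note additionally interpolates the estimate, a step omitted here:

```python
import math

# Minimal Gaussian kernel density estimate: a data-based construction
# of a pdf from samples. Bandwidth follows Silverman's rule of thumb;
# the sample values are invented for illustration.

def kde(samples, h=None):
    n = len(samples)
    if h is None:                        # Silverman's rule-of-thumb bandwidth
        mean = sum(samples) / n
        std = math.sqrt(sum((s - mean) ** 2 for s in samples) / n)
        h = 1.06 * std * n ** -0.2
    def density(x):
        return sum(math.exp(-0.5 * ((x - s) / h) ** 2)
                   for s in samples) / (n * h * math.sqrt(2 * math.pi))
    return density

data = [0.1, 0.3, 0.35, 0.8, 1.2, 1.25, 1.3, 2.0]
f = kde(data)
area = sum(f(i * 0.01 - 3.0) * 0.01 for i in range(800))   # x in [-3, 5]
print(area)   # numerically close to 1
```

Once a density like `f` is available from data alone, the control problem can be posed on the estimated pdf rather than on an assumed Gaussian form, which is the note's starting point.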
|
7 |
Advanced Image Processing Using Histogram Equalization and Android Application Implementation. Gaddam, Purna Chandra Srinivas Kumar; Sunkara, Prathik. January 2016.
Nowadays, the conditions under which an image is taken may lead to near-zero visibility for the human eye, usually because of a lack of clarity caused by atmospheric effects such as haze, fog, and other daylight effects. Useful information captured under those scenarios should be enhanced and made clear enough to recognize objects and other details. Many image processing algorithms have been developed to deal with such issues caused by low light or by haze affecting the imaging device; these algorithms also provide nonlinear contrast enhancement to some extent. We took pre-existing algorithms such as SMQT (Successive Mean Quantization Transform), the V transform, and histogram equalization to improve the visual quality of digital pictures with large-range scenes and irregular lighting conditions. These algorithms were implemented in two different methods and tested on images with low light and color change, and they succeeded in producing enhanced images, improving color and contrast and giving accurate results for low-light images. The histogram equalization technique is implemented by interpreting the histogram of the image as a probability density function: the cumulative distribution function is applied to the image to obtain accumulated histogram values, and the pixel values are then changed based on their probability and spread over the histogram. Of these algorithms we chose histogram equalization; MATLAB code was taken as a reference and modified for implementation as an API (Application Program Interface) in Java, and we confirm that the application works properly with reduced execution time.
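The histogram-to-CDF remapping described above can be sketched compactly (in Python rather than the authors' MATLAB/Java). The 4-pixel "image" is illustrative:

```python
# Histogram equalization as described above: read the image histogram
# as a probability density, build its cumulative distribution function,
# and remap pixels so intensities spread over the full range.

def equalize(pixels, levels=256):
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, running = [], 0
    for count in hist:
        running += count
        cdf.append(running)
    cdf_min = next(c for c in cdf if c > 0)
    if n == cdf_min:                     # constant image: nothing to spread
        return list(pixels)
    return [round((cdf[p] - cdf_min) * (levels - 1) / (n - cdf_min))
            for p in pixels]

# A low-contrast patch concentrated around mid-grey...
flat = [100, 101, 102, 103]
print(equalize(flat))   # → [0, 85, 170, 255]: stretched across 0..255
```

This is the nonlinear contrast enhancement the abstract refers to: pixel ranks, not raw intensities, determine the output levels.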
|
8 |
Action potentials in the peripheral auditory nervous system: a novel PDE distribution model. Gasper, Rebecca Elizabeth. 01 July 2014.
Auditory physiology is nearly unique in the human body because of its small-diameter neurons. When considering a single node on one neuron, the number of channels is very small, so ion fluxes exhibit randomness.
Hodgkin and Huxley, in 1952, set forth a system of Ordinary Differential Equations (ODEs) to track the flow of ions in a squid motor neuron, based on a circuit analogy for electric current. This formalism for modeling is still in use today and is useful because coefficients can be directly measured.
To measure the auditory properties of Firing Efficiency (FE) and Post Stimulus Time (PST), we can simply measure the depolarization, or "upstroke," of a node. Hence, we reduce the four-dimensional squid neuron model to a two-dimensional system of ODEs. The stochastic variable m for sodium activation is allowed a random walk in addition to its normal evolution, and the results are drastic. The diffusion coefficient D, which governs the spreading, is inversely proportional to the number of channels; for 130 ion channels, D is closer to 1/3 than to 0 and cannot be called negligible.
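A random walk added to a gating variable of this kind can be simulated directly with an Euler-Maruyama step. The sketch below is a generic illustration with invented rate constants, not the dissertation's fitted equations; only the scaling of D with channel count (about 1/3 at 130 channels) follows the text:

```python
import math
import random

# Euler-Maruyama sketch of a noisy gating variable: m relaxes toward a
# steady state m_inf while a diffusion term, whose coefficient D shrinks
# as the channel count grows, adds a random walk. Rate constants are
# illustrative, not the dissertation's values.

random.seed(2)

def simulate_m(n_channels=130, m_inf=0.6, tau=1.0, dt=1e-3, steps=5000):
    D = 130.0 / (3.0 * n_channels)       # ~1/3 for 130 channels, ~0 for many
    m = 0.0
    for _ in range(steps):
        drift = (m_inf - m) / tau
        m += drift * dt + math.sqrt(2.0 * D * dt) * random.gauss(0.0, 1.0)
        m = min(1.0, max(0.0, m))        # gate fraction stays in [0, 1]
    return m

print(simulate_m())                      # noisy: hovers around m_inf
print(simulate_m(n_channels=10**6))      # nearly deterministic
```

The contrast between the two runs is the point of the reduced model: at 130 channels the noise term is comparable to the drift and cannot be dropped.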
A system of Partial Differential Equations (PDEs) is derived in these pages to model the distribution of states of the node with respect to the (nondimensionalized) voltage v and the sodium activation gate m. Initial conditions describe a distribution of (v,m) states; in most experiments, this would be a curve with its mode at the resting state. Boundary conditions are Robin (natural) boundary conditions, which give conservation of the population. The evolution of the PDE has a drift term for the mean change of state and a diffusion term for the random change of state.
The phase plane is broken into fired and resting regions, which form basins of attraction for fired and resting-state fixed points. If a stimulus causes ions to flow from the resting region into the fired region, this rate of flux is approximately the firing rate, analogous to clinically measuring when the voltage crosses a threshold. This gives a PST histogram. The FE is an integral of the population over the fired region at a measured stop time after the stimulus (since, in the reduced model, when neurons fire they do not repolarize).
This dissertation also includes useful generalizations and a methodology for turning other ODEs into PDEs. Within the HH modeling framework, parameters can be switched for other systems of the body, which may present a similar firing/non-firing separatrix (as in Chapter 3). For any system of ODEs, an advection model can show a distribution of initial conditions or the evolution of a given initial probability density over a state space (Chapter 4); a system of Stochastic Differential Equations can be modeled with an advection-diffusion equation (Chapter 5). As computers increase in speed and as the ability of software to create adaptive meshes and step sizes improves, modeling with a PDE becomes more and more efficient relative to its ODE counterpart.
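The SDE-to-advection-diffusion correspondence mentioned for Chapter 5 can be sketched in one dimension. The example below evolves a density under an assumed Ornstein-Uhlenbeck drift b(x) = -x with constant diffusion D; the grid sizes and D are arbitrary illustrative choices, not the dissertation's auditory model:

```python
import math

# 1-D advection-diffusion (Fokker-Planck) illustration: an explicit
# finite-difference step evolves an initial density under drift -x and
# diffusion D. Zero flux at the ends conserves total probability.

D, dx, dt = 0.5, 0.05, 5e-4
xs = [i * dx - 3.0 for i in range(121)]              # x in [-3, 3]
p = [math.exp(-0.5 * ((x - 1.0) / 0.2) ** 2) for x in xs]
total = sum(p) * dx
p = [v / total for v in p]                            # normalise initial pdf

def step(p):
    flux = [0.0] * (len(p) + 1)                       # flux[i]: between cells i-1, i
    for i in range(1, len(p)):
        x_mid = 0.5 * (xs[i - 1] + xs[i])
        p_mid = 0.5 * (p[i - 1] + p[i])
        adv = -x_mid * p_mid                          # drift term b(x) = -x
        dif = -D * (p[i] - p[i - 1]) / dx             # diffusion term
        flux[i] = adv + dif
    return [p[i] - dt * (flux[i + 1] - flux[i]) / dx for i in range(len(p))]

for _ in range(4000):                                 # evolve to t = 2
    p = step(p)
print(sum(p) * dx)   # mass stays ~1 under the zero-flux boundaries
```

The conserved mass mirrors the population-conservation property the boundary conditions provide in the dissertation's two-dimensional model.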
|
9 |
Nonlinear stochastic dynamics and chaos by numerical path integration. Mo, Eirik. January 2008.
The numerical path integration method for solving stochastic differential equations is extended to solve systems with up to six spatial dimensions, angular variables, and high nonlinearity, including systems that result in discontinuities in the response probability density function. Novel methods to stabilize the numerical method and increase computation speed are presented and discussed, including the use of the fast Fourier transform (FFT) and some new spline interpolation methods. Some sufficient criteria for the path integration theory to be applicable are also presented. The development of complex numerical code is made possible through automatic code generation by scripting. The resulting code is applied to chaotic dynamical systems by adding a Gaussian noise term to the deterministic equation. Various methods and approximations for computing the largest Lyapunov exponent of these systems are presented, illustrated, and compared. Finally, it is shown that the location and size of the additive noise term affect the results: for specific systems, additive noise can make a non-chaotic system chaotic and a chaotic system non-chaotic.
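A back-of-the-envelope version of the noise experiment can be run on a 1-D map rather than the thesis's continuous systems: add Gaussian noise to the logistic map and estimate the largest Lyapunov exponent as the orbit average of log|f'(x)|. The map, parameters, and clipping are all assumptions of this sketch:

```python
import math
import random

# Estimate the largest Lyapunov exponent of the (noisy) logistic map
# x -> r*x*(1-x) + noise as the average of log|f'(x)| along the orbit.
# A positive exponent indicates chaos; a negative one, a stable cycle.

random.seed(3)

def lyapunov(r, noise_std, steps=100_000, burn=1_000):
    x, acc, count = 0.4, 0.0, 0
    for i in range(steps):
        x = r * x * (1.0 - x) + random.gauss(0.0, noise_std)
        x = min(0.999, max(0.001, x))    # keep the noisy orbit in (0, 1)
        if i >= burn:
            acc += math.log(abs(r * (1.0 - 2.0 * x)))
            count += 1
    return acc / count

print(lyapunov(4.0, 0.0))    # chaotic regime: positive exponent (~ln 2)
print(lyapunov(3.2, 0.0))    # stable 2-cycle: negative exponent
print(lyapunov(3.2, 0.05))   # the same map with additive noise
```

Comparing the last two runs shows the thesis's point in miniature: the estimated exponent of a given system changes once additive noise enters the dynamics.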
|