251.
Comparison of Bayes' and minimum variance unbiased estimators of reliability in the extreme value life testing model. Godbold, James Homer, January 1970 (has links)
The purpose of this study is to consider two different types of estimators of reliability, using the extreme value distribution as the life-testing model. First, the minimum variance unbiased estimator is derived. Then the Bayes' estimators for the uniform, exponential, and inverted gamma prior distributions are obtained, and these results are extended to a whole class of exponential failure models. Each of the Bayes' estimators is compared with the minimum variance unbiased estimator in a Monte Carlo simulation, where it is shown that the Bayes' estimator has smaller squared error loss in each case.
The problem of obtaining estimators with respect to an exponential-type loss function is also considered, and the difficulties of such an approach are demonstrated. / Master of Science
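The Monte Carlo comparison described above can be sketched for a simple stand-in: an exponential failure model with mean life θ (closely related to the extreme value life-testing model via a transformation of the data). The mission time, sample size, and inverted gamma prior parameters below are illustrative assumptions chosen so that the prior is centered at the true mean life, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

theta_true, t0 = 2.0, 1.0           # true mean life and mission time (illustrative)
n, reps = 10, 5000                  # sample size and Monte Carlo replications
a, b = 3.0, 4.0                     # inverted gamma prior parameters (assumption);
                                    # prior mean b/(a-1) = 2 matches theta_true
r_true = np.exp(-t0 / theta_true)   # true reliability at time t0

mvue_sq_err, bayes_sq_err = [], []
for _ in range(reps):
    s = rng.exponential(theta_true, size=n).sum()    # total time on test
    # Minimum variance unbiased estimator of exp(-t0/theta)
    r_mvue = (1.0 - t0 / s) ** (n - 1) if s > t0 else 0.0
    # Bayes estimator: posterior mean of exp(-t0/theta) under the IG(a, b) prior,
    # since the posterior of theta given s is IG(a + n, b + s)
    r_bayes = ((b + s) / (b + s + t0)) ** (a + n)
    mvue_sq_err.append((r_mvue - r_true) ** 2)
    bayes_sq_err.append((r_bayes - r_true) ** 2)

mse_mvue = np.mean(mvue_sq_err)
mse_bayes = np.mean(bayes_sq_err)
```

With the prior centered at the truth, the Bayes estimator's shrinkage yields the smaller average squared error loss, mirroring the simulation result reported in the abstract.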
252.
Optimization Techniques for Multi-object Detection and Tracking on Live-cell Fluorescence Microscopy Images and Their Applications. Wang, Mengfan, 24 July 2024 (has links)
Fluorescence microscopy is a pivotal imaging technique for visualizing biological processes and has been extensively utilized in live-cell morphology analysis. Despite its utility, the associated object detection and tracking tasks still face challenges due to large data scales, poor data quality, and insufficient annotations, leading to a reliance on adaptive thresholding. Current adaptive thresholding approaches have two significant limitations: first, they cannot handle the heteroscedasticity of image data well and produce biased outputs; second, they treat the frames of time-series imaging data independently and produce inconsistent detections over time. We introduce two novel optimization techniques to address these limitations and enhance detection and tracking results in live-cell imaging. The first, ConvexVST, is a convex optimization approach that transforms heteroscedastic data into homoscedastic data, making them more tractable for subsequent analysis. The second, Joint Thresholding, is a graph-based approach that finds optimal adaptive thresholds while maintaining temporal consistency. Our methods demonstrate superior performance across various object detection and tracking tasks. Specifically, when applied to microglia imaging data, they enable the acquisition of more complete cell morphology and more accurate detection of microglia tips. Furthermore, by integrating these techniques with existing frameworks, we propose an advanced pipeline for embryonic cell detection and tracking in light-sheet microscopy images that substantially outperforms state-of-the-art peer methods and sets a new benchmark in the field. / Doctor of Philosophy / Fluorescence microscopy is an important imaging tool for observing biological processes and is widely used to study live-cell structures and activities.
However, detecting and tracking objects in these images can be difficult because of the large amount of data, poor image quality, and lack of accurate annotations. This leads to a reliance on basic image segmentation approaches, which try to distinguish foreground from background by setting intensity thresholds. These methods have two main problems: they do not handle varying noise in image data well, resulting in inaccurate outputs, and they analyze each frame in a sequence of images independently, causing inconsistencies over time. To solve these issues, we developed two new techniques to improve detection performance in live-cell imaging. The first, ConvexVST, makes the noise levels in image data more uniform, simplifying the subsequent analysis. The second, Joint Thresholding, finds the best intensity thresholds while maintaining consistency across frames over time. Our methods have shown significant improvements in detecting and tracking objects. For example, when applied to images of microglia (a type of brain cell), they provide more complete cell shapes and more accurate detection of cell structures. Additionally, by combining these techniques with existing frameworks, we create an advanced pipeline for detecting and tracking embryonic cells that outperforms current leading methods.
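ConvexVST itself obtains its transform by solving a convex program over the measured noise distribution; as a rough illustration of the goal alone — mapping heteroscedastic intensities to roughly constant-variance values — the classical Anscombe transform for Poisson shot noise can serve as a stand-in (the intensity levels below are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

def anscombe(x):
    """Classical variance-stabilizing transform for Poisson counts."""
    return 2.0 * np.sqrt(x + 3.0 / 8.0)

# Simulated pixels at two very different intensity levels: the raw variance
# scales with the mean (heteroscedastic), the transformed variance is ~1.
dim_pixels = rng.poisson(lam=20.0, size=200_000)
bright_pixels = rng.poisson(lam=100.0, size=200_000)

raw_vars = (dim_pixels.var(), bright_pixels.var())                       # ~20 vs ~100
stab_vars = (anscombe(dim_pixels).var(), anscombe(bright_pixels).var())  # both ~1
```

After stabilization, a single noise model (and hence a single thresholding rule) applies across dim and bright regions alike, which is what makes the subsequent detection steps tractable.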
253.
Graphical assessment of the prediction capability of response surface designs. Giovannitti-Jensen, Ann, January 1987 (has links)
A response surface analysis is concerned with the exploration of a system in order to determine the behavior of the response of the system as levels of certain factors which influence the response are changed. It is often of particular interest to predict the response in some region of the allowable factor values and to find the optimal operating conditions of the system.
In an experiment to search for the optimum response of a surface, it is advantageous to predict the response with equal, or nearly equal, precision at all combinations of variable levels that lie the same distance from the center of the experimental region. Measures of the quality of prediction at locations on the surface of a hypersphere are presented in this thesis. These measures are used to form a graphical method for assessing the overall prediction capability of an experimental design throughout the region of interest.
Rotatable designs give equal variances of predicted values corresponding to locations on the same sphere. In this case, the center of the sphere coincides with the center of the rotatable design. However, there is a need for a method to quantify the prediction capability on spheres for non-rotatable designs. The spherical variance is a measure of the average prediction variance at locations on the surface of a sphere. The spherical variance obtained with a design provides an assessment of how well the response is being estimated on the average at locations which are the same distance from the region center. This thesis introduces two measures which describe the dispersion in the variances of the predicted responses at all locations on the surface of a sphere. These prediction variance dispersion (PVD) measures are used to evaluate the ability of a design to estimate the response with consistent precision at locations which are the same distance from the region center. The PVD measures are used in conjunction with the spherical variance to assess the prediction capability of a design.
A plot of the spherical variance and the maximum and minimum prediction variances for locations on a sphere against the radius of the sphere gives a comprehensive picture of the behavior of the prediction variances throughout a region, and, hence, of the quality of the predicted responses, obtained with a particular design. Such plots are used to investigate and compare the prediction capabilities of certain response surface designs currently available to the researcher. The plots are also used to investigate the robustness of a design under adverse experimental conditions and to determine the effects of taking an additional experimental run on the quality of the predicted responses. / Ph. D.
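The quantities behind such a plot — the spherical variance together with the maximum and minimum prediction variances on each sphere — can be sketched numerically. The 2² factorial with a first-order model below is an illustrative choice, not a design from the thesis; because it is rotatable for this model, the maximum and minimum coincide on every sphere.

```python
import numpy as np

# 2^2 factorial design, first-order model y = b0 + b1*x1 + b2*x2
design = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
X = np.column_stack([np.ones(len(design)), design])   # model matrix
XtX_inv = np.linalg.inv(X.T @ X)
N = len(design)

def pvd_on_sphere(radius, n_dirs=360):
    """Scaled prediction variance N * x'(X'X)^{-1} x over points on a circle."""
    angles = np.linspace(0.0, 2.0 * np.pi, n_dirs, endpoint=False)
    pts = radius * np.column_stack([np.cos(angles), np.sin(angles)])
    Xp = np.column_stack([np.ones(n_dirs), pts])
    v = N * np.einsum("ij,jk,ik->i", Xp, XtX_inv, Xp)
    # spherical variance (average), plus the dispersion extremes
    return v.mean(), v.min(), v.max()

sph_var, v_min, v_max = pvd_on_sphere(radius=1.0)   # here 1 + r^2 = 2 exactly
```

Sweeping `radius` and plotting the three returned values against it reproduces the kind of variance dispersion graph the thesis proposes; for a non-rotatable design the gap between `v_max` and `v_min` would open up.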
254.
Design and analysis for a two level factorial experiment in the presence of dispersion effects. Mays, Darcy P., 10 October 2005 (has links)
Standard response surface methodology experimental designs for estimating location models involve the assumption of homogeneous variance throughout the design region. However, with heterogeneity of variance these standard designs are not optimal.
Using the D- and Q-optimality criteria, this dissertation proposes a two-stage experimental design procedure that gives more efficient designs than the standard designs when heterogeneous variance exists. Several multiple-variable location models, with and without interactions, are considered. For each, the first stage estimates the heterogeneous variance structure, while the second stage augments the first stage to produce a D- or Q-optimal design for fitting the location model under the estimated variance structure. However, a potential instability of the variance estimates in the first stage can lower the efficiency of the two-stage procedure. This problem can be addressed, and the efficiency of the procedure enhanced, if certain mild assumptions concerning the variance structure are made and formulated as a prior distribution to produce a Bayes estimator.
With homogeneous variance, designs are analyzed using ordinary least squares. However, with heterogeneous variance the correct analysis is to use weighted least squares. This dissertation also examines the effects that analysis by weighted least squares can have and compares this procedure to the proposed two-stage procedure. / Ph. D.
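The contrast between the two analyses can be sketched on simulated data. The single-factor model and the variance structure below are illustrative assumptions, and the true inverse variances are used as weights; the point is only that, under heterogeneous variance, weighted least squares gives a less variable slope estimate than ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(2)

x = np.linspace(-1.0, 1.0, 20)
sigma = 0.2 + 1.8 * np.abs(x)           # heterogeneous std. deviations (assumed known)
X = np.column_stack([np.ones_like(x), x])
beta_true = np.array([1.0, 2.0])
W = np.diag(1.0 / sigma**2)             # WLS weights = inverse variances

ols_slopes, wls_slopes = [], []
for _ in range(2000):
    y = X @ beta_true + rng.normal(0.0, sigma)
    b_ols = np.linalg.solve(X.T @ X, X.T @ y)          # ordinary least squares
    b_wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)  # weighted least squares
    ols_slopes.append(b_ols[1])
    wls_slopes.append(b_wls[1])

# Both are unbiased; WLS is the efficient (lower-variance) analysis here
var_ols = np.var(ols_slopes)
var_wls = np.var(wls_slopes)
```

In practice the weights must themselves be estimated, which is exactly where the first-stage variance estimation of the two-stage procedure, and its potential instability, enters.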
255.
A graphical approach for evaluating the potential impact of bias due to model misspecification in response surface designs. Vining, G. Geoffrey, January 1988 (has links)
The basic purpose of response surface analysis is to generate a relatively simple model to serve as an adequate approximation for a more complex phenomenon. This model then may be used for other purposes, for example prediction or optimization. Since the proposed model is only an approximation, the analyst almost always faces the potential of bias due to model misspecification. The ultimate impact of this bias depends upon the choice both of the experimental design and of the region for conducting the experiment.
This dissertation proposes a graphical approach for evaluating the impact of bias upon response surface designs. Essentially, it extends the work of Giovannitti-Jensen (1987) and Giovannitti-Jensen and Myers (1988), who developed a graphical technique for displaying a design's prediction variance capabilities. This dissertation extends the concept: (1) to the prediction bias due to model misspecification; (2) to the prediction bias due to the presence of a single outlier; and (3) to a mean squared error of prediction. Several common first- and second-order response surface designs are evaluated through this approach. / Ph. D.
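How misspecification produces location-dependent prediction bias can be seen in a deterministic toy case (the design, true surface, and coefficients are illustrative assumptions): a first-order fit to a noiseless second-order surface reproduces the design points exactly, yet is biased at the center of the region.

```python
import numpy as np

# 2^2 factorial design; the true surface is second-order (illustrative values)
design = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)

def true_surface(x):
    x1, x2 = x[..., 0], x[..., 1]
    return 1.0 + x1 + x2 + 0.5 * (x1**2 + x2**2)   # quadratic truth

# Noiseless responses, fitted with the (misspecified) first-order model
y = true_surface(design)
X = np.column_stack([np.ones(len(design)), design])
b = np.linalg.solve(X.T @ X, X.T @ y)   # least-squares fit of b0 + b1*x1 + b2*x2

# The fit passes through every design point (zero residuals), so the
# misspecification is invisible there...
residual_max = np.abs(X @ b - y).max()

# ...but at the region center the quadratic terms alias into the intercept:
center = np.array([0.0, 0.0])
pred_center = b[0]
bias_center = pred_center - true_surface(center)   # pure model bias
```

Because the bias varies with location while the variance profile does not track it, plotting both — as the dissertation proposes via a mean squared error of prediction — is what reveals the trade-off.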
256.
Investor sentiment and the mean-variance relationship: European evidence. Wang, Wenzhao, 09 March 2020 (has links)
This paper investigates the impact of investor sentiment on the mean-variance relationship in 14 European stock markets. Applying three approaches to defining investors' neutrality and determining high- and low-sentiment periods, we find that individual investors' increased presence and trading during high-sentiment periods undermine the risk-return tradeoff. More importantly, we report that investors' optimism (pessimism) is determined more by their normal sentiment state, represented by the all-period average sentiment level, than by the neutrality value set in sentiment surveys.
257.
The mean-variance relation and the role of institutional investor sentiment. Wang, Wenzhao, 09 March 2020 (has links)
This paper investigates the role of institutional investor sentiment in the mean–variance relation. We find that market returns are negatively (positively) related to the market's conditional volatility over bullish (bearish) periods. The evidence indicates that institutional investors are sentiment traders as well.
258.
The mean–variance relation: A 24-hour story. Wang, Wenzhao, 07 October 2021 (has links)
This paper investigates the mean-variance relation during different time periods within trading days. We reveal that there is a positive mean-variance relation when the stock market is closed (i.e., overnight), but the positive relation is distorted when the market is open (i.e., intraday). The evidence offers a new explanation for the weak risk-return tradeoff in stock markets.
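The mean-variance relation tested in these papers is, in its simplest form, a regression of returns on conditional variance. The sketch below runs that regression on simulated returns with a built-in positive tradeoff; the tradeoff coefficient, volatility range, and sample size are illustrative assumptions, and the conditional volatility is taken as known rather than estimated from a volatility model.

```python
import numpy as np

rng = np.random.default_rng(3)

n = 50_000
lam_true = 3.0                               # assumed risk-return tradeoff coefficient
sigma = rng.uniform(0.01, 0.05, size=n)      # conditional volatility (taken as known)
returns = lam_true * sigma**2 + sigma * rng.standard_normal(n)

# Regress returns on the conditional variance: the slope estimates the tradeoff
var_c = sigma**2
X = np.column_stack([np.ones(n), var_c])
lam_hat = np.linalg.lstsq(X, returns, rcond=None)[0][1]
```

Splitting the sample — by sentiment regime or by overnight versus intraday periods, as in the papers above — amounts to running this regression separately on each subsample and comparing the estimated slopes.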
259.
VAR(p) models. Chukunyere, Amenan Christiane, 06 April 2024 (has links)
This thesis studies a family of methods for modelling several time series jointly. We use these methods to predict the behavior of five US time series and to highlight the dynamic links that may exist among them. To do this, we use the order-p vector autoregressive models proposed by Sims (1980), which are a multivariate generalization of the Box-Jenkins models. First, we define a set of statistical concepts and tools needed to understand the notions used later in the thesis. We then present the models and the Box-Jenkins method, which is applied to each of the five series in order to obtain univariate models. Finally, we present the VAR(p) models and fit them to the vector series whose components are the five aforementioned series. We discuss the added value of the multivariate analysis compared to the five univariate analyses.
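The VAR(p) fitting step can be sketched for p = 1: each equation of the system is estimated by ordinary least squares on the lagged vector. The coefficient matrix and series length below are illustrative assumptions, not the five US series studied in the thesis.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate a stationary bivariate VAR(1): x_t = A x_{t-1} + e_t
A = np.array([[0.5, 0.2],
              [0.1, 0.4]])    # illustrative coefficients (eigenvalues 0.6 and 0.3)
n = 5000
x = np.zeros((n, 2))
for t in range(1, n):
    x[t] = A @ x[t - 1] + rng.standard_normal(2)

# Multivariate least squares: regress x_t on x_{t-1}, one equation per component.
# With rows as observations, Y = Z A' + E, so lstsq returns A transposed.
Y, Z = x[1:], x[:-1]
A_hat = np.linalg.lstsq(Z, Y, rcond=None)[0].T
```

The off-diagonal entries of `A_hat` are what carry the "dynamic links" between series that the thesis looks for: a nonzero entry means lagged values of one series help predict another, which is precisely the added value of the joint model over five separate univariate fits.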
260.
Měření výkonnosti klasifikačního systému DRG v České republice / Measuring the DRG classification system performance in the Czech Republic. Nový, Petr, January 2016 (has links)
No description available.