1. Isotone Optimization in R: Pool-Adjacent-Violators Algorithm (PAVA) and Active Set Methods. Mair, Patrick; Hornik, Kurt; de Leeuw, Jan. 21 October 2009 (PDF)
In this paper we give a general framework for isotone optimization. First we discuss a generalized version of the pool-adjacent-violators algorithm (PAVA) to minimize a separable convex function with simple chain constraints. Beyond general convex functions, we extend existing PAVA implementations in terms of observation weights, approaches for tie handling, and responses from repeated measurement designs. Since isotone optimization problems can be formulated as convex programming problems with linear constraints, we then develop a primal active set method to solve such problems. This methodology is applied to specific loss functions relevant in statistics. Both approaches are implemented in the R package isotone. (authors' abstract)
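The paper's reference implementation is the R package isotone; as a minimal illustration of the pool-adjacent-violators idea with observation weights (a sketch, not the package's code), the weighted least-squares case can be written as:

```python
def pava(y, w=None):
    """Weighted pool-adjacent-violators for isotonic least-squares regression.

    Minimizes sum_i w_i * (x_i - y_i)**2 subject to x_1 <= ... <= x_n.
    """
    n = len(y)
    w = [1.0] * n if w is None else list(w)
    # Each block stores [weighted mean, total weight, number of pooled points].
    blocks = []
    for yi, wi in zip(y, w):
        blocks.append([yi, wi, 1])
        # Pool the last two blocks while they violate monotonicity.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, c2 = blocks.pop()
            m1, w1, c1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(w1 * m1 + w2 * m2) / wt, wt, c1 + c2])
    # Expand the blocks back into a full-length solution vector.
    x = []
    for m, _, c in blocks:
        x.extend([m] * c)
    return x

print(pava([1, 3, 2, 4]))  # -> [1, 2.5, 2.5, 4]
```

The violating pair (3, 2) is pooled into its weighted mean 2.5, which is the characteristic PAVA step; general convex losses replace the weighted mean with the block's minimizer.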
2. Regression approach to software reliability models. Mostafa, Abdelelah M. 1 June 2006
Many software reliability growth models have been analyzed for measuring the growth of software reliability. In this dissertation, regression methods are explored to study software reliability models. First, two parametric linear models are proposed and analyzed: the simple linear regression and a transformed linear regression corresponding to a power law process. Some software failure data sets do not follow the linear pattern. Analysis of popular real-life data showed that these contain outliers and leverage values. Linear regression methods based on least squares are sensitive to outliers and leverage values. Even though the parametric regression methods give good results in terms of error measurement criteria, these results may not be accurate due to violation of the parametric assumptions. To overcome these difficulties, nonparametric regression methods based on ranks are proposed as alternative techniques to build software reliability models. In particular, monotone regression and rank regression methods are used to evaluate the predictive capability of the models. These models are applied to real-life data sets from various projects as well as to diverse simulated data sets. Both the monotone and the rank regression methods are robust procedures that are less sensitive to outliers and leverage values. In particular, the regression approach explains predictive properties of the mean time to failure for modeling the patterns of software failure times. In order to decide on model preference and to assess the predictive accuracy of the mean-time-between-failures estimates for the defined data sets, the following error-measurement criteria are used: the mean square error, mean absolute value difference, mean magnitude of relative error, mean magnitude of error relative to the estimate, median of the absolute residuals, and a measure of dispersion. The methods proposed in this dissertation, when applied to real software failure data, give less error in terms of all the measurement criteria compared to other popular methods from the literature. Experimental results show that the regression approach offers a very promising technique in software reliability growth modeling and prediction.
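The error-measurement criteria listed in the abstract follow standard definitions; a small sketch (the exact formulas used in the dissertation are not reproduced here, so treat the denominators as assumptions) could compute them as:

```python
import statistics

def error_criteria(actual, predicted):
    """Common predictive-accuracy criteria for comparing reliability models.

    Assumes actual and predicted values are positive (nonzero denominators).
    """
    resid = [a - p for a, p in zip(actual, predicted)]
    n = len(resid)
    mse = sum(r * r for r in resid) / n                       # mean square error
    mad = sum(abs(r) for r in resid) / n                      # mean absolute difference
    mmre = sum(abs(r) / a for r, a in zip(resid, actual)) / n # relative error (to actual)
    mer = sum(abs(r) / p for r, p in zip(resid, predicted)) / n  # relative to the estimate
    med_ar = statistics.median(abs(r) for r in resid)         # median absolute residual
    return {"MSE": mse, "MAD": mad, "MMRE": mmre, "MER": mer, "MedAR": med_ar}

print(error_criteria([10.0, 20.0], [8.0, 22.0]))
```

Reporting several criteria at once, as the abstract does, guards against a model that happens to optimize one loss while performing poorly on robust measures such as the median absolute residual.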
3. Linear Subspace and Manifold Learning via Extrinsic Geometry. St. Thomas, Brian Stephen. January 2015
In the last few decades, data analysis techniques have had to expand to handle large sets of data with complicated structure. This includes identifying low-dimensional structure in high-dimensional data, analyzing shape and image data, and learning from or classifying large corpora of text documents. Common Bayesian and machine learning techniques rely on the unique geometry of these data types; however, departing from Euclidean geometry can result in both theoretical and practical complications. Bayesian nonparametric approaches can be particularly challenging in these areas. This dissertation proposes a novel approach to these challenges by working with convenient embeddings of the manifold-valued parameters of interest, commonly making use of an extrinsic distance or measure on the manifold. Carefully selected extrinsic distances are shown to reduce the computational cost and to increase the accuracy of inference. The embeddings are also used to yield straightforward derivations for nonparametric techniques. The methods developed are applied to subspace learning in dimension reduction problems, planar shapes, shape-constrained regression, and text analysis. (Dissertation)
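The contrast between intrinsic and extrinsic distances is easiest to see on the unit sphere, where the intrinsic distance is the great-circle arc while the extrinsic (chordal) distance is just the Euclidean distance in the ambient embedding. A toy illustration (not taken from the dissertation, which works with more general manifolds):

```python
import math

def geodesic_dist(u, v):
    """Intrinsic (great-circle) distance between unit vectors on the sphere."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(u, v))))
    return math.acos(dot)

def extrinsic_dist(u, v):
    """Extrinsic (chordal) distance: Euclidean distance in the ambient space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

north = (0.0, 0.0, 1.0)
equator = (1.0, 0.0, 0.0)
print(geodesic_dist(north, equator))   # pi/2, about 1.5708
print(extrinsic_dist(north, equator))  # sqrt(2), about 1.4142
```

The extrinsic distance needs no trigonometry or exponential/log maps, which is the computational saving the abstract alludes to, at the cost of slightly understating distances between far-apart points.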