361 |
A-optimal designs for weighted polynomial regressionSu, Yang-Chan 05 July 2005 (has links)
This paper is concerned with the problem of constructing
A-optimal designs for polynomial regression with an analytic weight
function on the interval [m-a,m+a]. It is
shown that the structure of the optimal design depends only on a and
the weight function as a approaches 0. Moreover, if the weight
function is an analytic function of a, then a scaled version of the
optimal support points and weights is an analytic function of a at
$a=0$. We make use of a Taylor expansion, whose coefficients can be
determined recursively, for calculating the A-optimal designs.
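As a hedged illustration of the A-optimality criterion discussed above (not the paper's recursive Taylor construction), the following sketch evaluates the criterion tr(M(&xi;)^-1) for a candidate design; the weight function, interval, and design points are invented examples:

```python
import numpy as np

def a_criterion(support, weights, w_fun, degree):
    # Information matrix M = sum_i w_i * lambda(x_i) * f(x_i) f(x_i)^T
    # with regression functions f(x) = (1, x, ..., x^degree)^T.
    M = np.zeros((degree + 1, degree + 1))
    for x, w in zip(support, weights):
        f = np.array([x ** k for k in range(degree + 1)], dtype=float)
        M += w * w_fun(x) * np.outer(f, f)
    # A-criterion: trace of the inverse information matrix (to be minimized).
    return np.trace(np.linalg.inv(M))

# Hypothetical example: quadratic regression on [m - a, m + a]
# with m = 0, a = 0.5, and weight function exp(-x^2).
m, a = 0.0, 0.5
design_points = np.array([m - a, m, m + a])
design_weights = np.array([0.25, 0.5, 0.25])
value = a_criterion(design_points, design_weights, lambda x: np.exp(-x ** 2), 2)
```

An A-optimal design would minimize this value over all choices of support points and weights on the interval.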
|
362 |
Ds-optimal designs for weighted polynomial regressionMao, Chiang-Yuan 21 June 2007 (has links)
This paper is devoted to studying the problem of constructing Ds-optimal designs for d-th degree polynomial regression with an analytic weight function
on the interval [m-a,m+a], m,a in R. It is demonstrated that the structure of the optimal design depends only on d, a and the weight function as a approaches 0. Moreover, the Taylor polynomials of the scaled versions of the optimal support points and weights can be computed via a recursive formula.
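For orientation (standard definitions, not quoted from the thesis), the information matrix and the Ds-criterion for weighted polynomial regression can be written as:

```latex
M(\xi) = \int_{m-a}^{m+a} \lambda(x)\, f(x) f(x)^{\top} \, d\xi(x),
\qquad f(x) = (1, x, \dots, x^{d})^{\top},
\qquad
\xi^{*} \in \arg\max_{\xi} \; \frac{\det M(\xi)}{\det M_{22}(\xi)},
```

where lambda is the weight function and M_22 is the block of M corresponding to the parameters not of interest, so that the criterion concentrates the information on the s parameters of interest.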
|
363 |
Controlling High Quality Manufacturing Processes: A Robustness Study Of The Lower-sided Tbe Ewma ProcedurePehlivan, Canan 01 September 2008 (has links) (PDF)
In quality control applications, Time-Between-Events (TBE) type observations may be monitored by using Exponentially Weighted Moving Average (EWMA) control charts. A widely accepted model for TBE processes is the exponential
distribution, and hence TBE EWMA charts are designed under this assumption. Nevertheless, practical applications do not always conform to the theory, and it is common that the observations do not fit the exponential model. Therefore, control charts that are robust to departures from the assumed distribution are desirable in practice. In this thesis, the robustness of lower-sided TBE EWMA charts to the assumption of exponentially distributed observations has been investigated. Weibull and lognormal distributions are considered in order to represent departures from the assumed exponential model, and a Markov chain approach is utilized for evaluating the performance of the chart. By analyzing the performance results, design settings are suggested in order to achieve robust lower-sided TBE EWMA charts.
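As a hedged sketch of the monitoring scheme discussed above (illustrative parameter values, not the thesis's recommended designs), a lower-sided EWMA recursion on TBE data looks like:

```python
import numpy as np

def lower_tbe_ewma_run_length(obs, lam=0.1, z0=1.0, lcl=0.3):
    # EWMA recursion Z_t = (1 - lam) * Z_{t-1} + lam * X_t on
    # time-between-events observations; a lower-sided chart signals
    # when Z_t falls to or below the lower control limit, i.e. when
    # events arrive too fast (process deterioration).
    z = z0
    for t, x in enumerate(obs, start=1):
        z = (1 - lam) * z + lam * x
        if z <= lcl:
            return t  # run length: time of first signal
    return None  # no signal within the observed run

rng = np.random.default_rng(7)
in_control = rng.exponential(scale=1.0, size=10_000)      # mean TBE = 1
out_of_control = rng.exponential(scale=0.2, size=10_000)  # rate increased 5x
rl_shift = lower_tbe_ewma_run_length(out_of_control)
rl_ic = lower_tbe_ewma_run_length(in_control)
```

On such a shift the statistic drifts toward the new mean and crosses the limit quickly, while the in-control sequence typically runs much longer before a false alarm; comparing such run lengths (via the Markov chain approach rather than simulation) is how chart performance is evaluated.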
|
364 |
Investigation Of The Spatial Relationship Of Municipal Solid Waste Generation In Turkey With Socio-economic, Demographic And Climatic FactorsKeser, Saniye 01 February 2010 (has links) (PDF)
This thesis investigates the significant factors affecting municipal solid waste (MSW) generation in Turkey. For this purpose, both spatial and non-spatial techniques are utilized. The non-spatial technique is ordinary least squares (OLS) regression, while the spatial techniques employed are simultaneous spatial autoregression (SAR) and geographically weighted regression (GWR). The independent variables include socio-economic, demographic and climatic indicators. The results show that nearer provinces tend to have similar solid waste generation rates. Moreover, it is shown that the effects of the independent variables vary among provinces. Educational status and unemployment are demonstrated to be significant factors of waste generation in Turkey.
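As a hedged sketch of the GWR idea mentioned above (the synthetic data, Gaussian kernel, and bandwidth are my assumptions; the thesis's variable set is not reproduced), coefficients are re-fitted at each location with distance-decayed weights:

```python
import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    # At each site i, solve a weighted least-squares problem where
    # observation j gets weight exp(-0.5 * (d_ij / bandwidth)^2),
    # so the estimated coefficients may vary over space.
    n = X.shape[0]
    Xa = np.hstack([np.ones((n, 1)), X])  # add an intercept column
    betas = np.empty((n, Xa.shape[1]))
    for i in range(n):
        d = np.linalg.norm(coords - coords[i], axis=1)
        w = np.exp(-0.5 * (d / bandwidth) ** 2)
        Xw = Xa * w[:, None]
        betas[i] = np.linalg.solve(Xa.T @ Xw, Xa.T @ (w * y))
    return betas

# Synthetic example: the true slope changes from west to east.
rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(80, 2))
x1 = rng.normal(size=80)
true_slope = 1.0 + 0.3 * coords[:, 0]  # spatially varying effect
y = true_slope * x1 + rng.normal(scale=0.1, size=80)
betas = gwr_coefficients(coords, x1[:, None], y, bandwidth=2.0)
```

Unlike OLS, which returns one coefficient vector for the whole country, each row of `betas` describes the local effect at one site, which is how spatially varying effects across provinces can be detected.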
|
365 |
A Novel Refinement Method For Automatic Image Annotation SystemsDemircioglu, Ersan 01 June 2011 (has links) (PDF)
Image annotation can be defined as the process of assigning a set of content-related words to an image. An automatic image annotation system constructs relationships between words and low-level visual descriptors extracted from images, and uses these relationships to annotate a newly seen image. The high demand for image annotation increases the need for automatic image annotation systems. However, the performance of current annotation methods is far from practical usage. The most common problem of current methods is the gap between semantic words and low-level visual descriptors. Because of this semantic gap, the annotation results of these methods contain irrelevant, noisy words. To give more relevant results, refinement methods should be applied to classical image annotation outputs.
In this work, we present a novel refinement approach for the image annotation problem. The proposed system attacks the semantic gap problem by using relationships between words, obtained from the dataset. Establishing these relationships is the most crucial problem of the refinement process. In this study, we suggest a probabilistic and fuzzy approach for modelling the relationships among the words in the vocabulary, which is then employed to generate candidate annotations based on the output of the image annotator. Candidate annotations are represented by a set of relational graphs. Finally, one of the generated candidate annotations is selected as the refined annotation result by a clique optimization technique applied to the candidate annotation graph.
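A hedged, much-simplified sketch of the word-relationship idea (plain co-occurrence estimates standing in for the thesis's probabilistic/fuzzy model and clique optimization; the words and data are invented):

```python
from collections import Counter
from itertools import combinations

def cooccurrence_probs(annotations):
    # Estimate P(w2 | w1) from training annotations (lists of words):
    # how often w2 appears in an image's annotation given that w1 does.
    pair, single = Counter(), Counter()
    for words in annotations:
        unique = sorted(set(words))
        single.update(unique)
        for a, b in combinations(unique, 2):
            pair[(a, b)] += 1
            pair[(b, a)] += 1
    return {k: v / single[k[0]] for k, v in pair.items()}

def rescore(candidate, probs):
    # Score a candidate annotation by average pairwise relatedness,
    # penalizing sets that mix semantically unrelated words.
    pairs = list(combinations(candidate, 2))
    if not pairs:
        return 0.0
    return sum(probs.get((a, b), 0.0) + probs.get((b, a), 0.0)
               for a, b in pairs) / (2 * len(pairs))

train = [["sky", "cloud"], ["sky", "cloud", "plane"], ["car", "road"]]
probs = cooccurrence_probs(train)
```

A refinement step would then prefer the candidate annotation with the highest internal coherence, e.g. `rescore(["sky", "cloud"], probs)` over `rescore(["sky", "car"], probs)`, dropping noisy words that the visual annotator proposed but the vocabulary model contradicts.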
|
366 |
TEMPORARY THRESHOLD SHIFTS IN FINGERTIP VIBRATORY SENSATION FROM HAND-TRANSMITTED VIBRATION AND REPETITIVE SHOCKMAEDA, SETSUO 05 1900 (has links)
No description available.
|
367 |
Employing Multiple Kernel Support Vector Machines for Counterfeit Banknote RecognitionSu, Wen-pin 29 July 2008 (has links)
Finding an efficient method to detect counterfeit banknotes is imperative. In this study, we propose a multiple kernel weighted support vector machine for counterfeit banknote recognition. A variation of SVM that optimizes the false alarm rate, called FARSVM, is proposed, which minimizes the false negative rate and false positive rate. Each banknote is divided into m × n partitions, and each partition comes with its own kernels. The optimal weight of each kernel matrix in the combination is obtained through the semidefinite programming (SDP) learning method. The amount of time and space required by the original SDP formulation is very demanding. We focus on this framework and adopt two strategies to reduce the time and space requirements. The first strategy is to assume the non-negativity of the kernel weights, and the second strategy is to set the sum of the weights equal to 1. Experimental results show that regions with zero kernel weights are easy to imitate with today's digital imaging technology, while regions with nonzero kernel weights are difficult to imitate. In addition, these results show that the proposed approach outperforms single kernel SVM and standard SVM with SDP on Taiwanese banknotes.
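As a hedged illustration of combining per-partition kernels under the two strategies named above (non-negative weights summing to one; the SDP weight learning itself is not reproduced here, fixed illustrative weights stand in):

```python
import numpy as np

def rbf_kernel(A, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||a_i - a_j||^2).
    d2 = ((A[:, None, :] - A[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def combined_kernel(partition_features, mu):
    # K = sum_k mu_k K_k with mu_k >= 0 and sum(mu) = 1, one base
    # kernel per banknote partition. Under these constraints K remains
    # positive semidefinite, so it can be fed to any kernel SVM solver.
    mu = np.asarray(mu, dtype=float)
    assert np.all(mu >= 0) and np.isclose(mu.sum(), 1.0)
    return sum(m * rbf_kernel(P) for m, P in zip(mu, partition_features))

# Invented example: 5 banknotes with features from 3 partitions.
rng = np.random.default_rng(1)
parts = [rng.normal(size=(5, 4)) for _ in range(3)]
K = combined_kernel(parts, mu=[0.5, 0.3, 0.2])
```

A partition whose learned weight is driven to zero contributes nothing to K, which matches the interpretation above: zero-weight regions carry no discriminative (hard-to-imitate) information.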
|
368 |
On Some Properties of Interior Methods for OptimizationSporre, Göran January 2003 (has links)
<p>This thesis consists of four independent papers concerning different aspects of interior methods for optimization. Three of the papers focus on theoretical aspects while the fourth one concerns some computational experiments.</p><p>The systems of equations solved within an interior method applied to a convex quadratic program can be viewed as weighted linear least-squares problems. In the first paper, it is shown that the sequence of solutions to such problems is uniformly bounded. Further, boundedness of the solutions to weighted linear least-squares problems for more general classes of weight matrices than the one in the convex quadratic programming application is obtained as a byproduct.</p><p>In many linesearch interior methods for nonconvex nonlinear programming, the iterates can "falsely" converge to the boundary of the region defined by the inequality constraints in such a way that the search directions do not converge to zero, but the step lengths do. In the second paper, it is shown that the multiplier search directions then diverge. Furthermore, the direction of divergence is characterized in terms of the gradients of the equality constraints along with the asymptotically active inequality constraints.</p><p>The third paper gives a modification of the analytic center problem for the set of optimal solutions in linear semidefinite programming. 
Unlike the normal analytic center problem, the solution of the modified problem is the limit point of the central path, without any strict complementarity assumption. For the strict complementarity case, the modified problem is shown to coincide with the normal analytic center problem, which is known to give a correct characterization of the limit point of the central path in that case.</p><p>The final paper describes some computational experiments concerning possibilities of reusing previous information when solving systems of equations arising in interior methods for linear programming.</p><p><b>Keywords:</b> Interior method, primal-dual interior method, linear programming, quadratic programming, nonlinear programming, semidefinite programming, weighted least-squares problems, central path.</p><p><b>Mathematics Subject Classification (2000):</b> Primary 90C51, 90C22, 65F20, 90C26, 90C05; Secondary 65K05, 90C20, 90C25, 90C30.</p>
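For orientation (a standard formulation, not quoted from the thesis), the weighted linear least-squares problems referred to in the first paper have the form:

```latex
\min_{x} \; \bigl\| W^{1/2} (A x - b) \bigr\|_2^2
\quad\Longleftrightarrow\quad
A^{\top} W A \, x = A^{\top} W b,
```

with W a positive diagonal weight matrix; the boundedness result concerns the solutions x(W) as W ranges over the weight matrices generated along the interior iterations, where individual weights may tend to zero or infinity.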
|
369 |
Algorithmic Trading : Hidden Markov Models on Foreign Exchange DataIdvall, Patrik, Jonsson, Conny January 2008 (has links)
<p>In this master's thesis, hidden Markov models (HMM) are evaluated as a tool for forecasting movements in a currency cross. With an ever increasing electronic market making way for more automated trading, so called algorithmic trading, there is constantly a need for new trading strategies trying to find alpha, the excess return, in the market.</p><p>HMMs are based on the well-known theory of Markov chains, but the states are assumed hidden, governing some observable output. HMMs have mainly been used for speech recognition and communication systems, but have lately also been applied to financial time series with encouraging results. Both discrete and continuous versions of the model are tested, as well as single- and multivariate input data.</p><p>In addition to the basic framework, two extensions are implemented in the belief that they will further improve the prediction capabilities of the HMM. The first is a Gaussian mixture model (GMM), where each state is assigned a set of single Gaussians that are weighted together to replicate the density function of the stochastic process. This opens up for modeling non-normal distributions, which is often assumed for foreign exchange data. The second is an exponentially weighted expectation maximization (EWEM) algorithm, which takes time attenuation into consideration when re-estimating the parameters of the model. This allows old trends to be kept in mind while more recent patterns are at the same time given more attention.</p><p>Empirical results show that the HMM using continuous emission probabilities can, for some model settings, generate acceptable returns with Sharpe ratios well over one, whilst the discrete version in general performs poorly. The GMM therefore seems to be a highly needed complement to the HMM for functionality. The EWEM, however, does not improve results as one might have expected. 
Our general impression is that the predictor using HMMs that we have developed and tested is too unstable to be taken in as a trading tool on foreign exchange data, with too many factors influencing the results. More research and development is called for.</p>
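A hedged sketch of the time-attenuation idea behind the EWEM extension (shown here for a plain exponentially weighted mean/variance estimate rather than the full HMM re-estimation; the decay constant is invented):

```python
import numpy as np

def ew_mean_var(x, alpha=0.02):
    # Observation at time t gets weight (1 - alpha)^(T - t), so the
    # most recent data dominate re-estimation while older observations
    # are attenuated rather than discarded outright.
    T = len(x)
    w = (1 - alpha) ** np.arange(T - 1, -1, -1)
    w /= w.sum()
    mu = np.sum(w * x)
    var = np.sum(w * (x - mu) ** 2)
    return mu, var

# Regime change: old level 0, recent level 5.
x = np.concatenate([np.zeros(100), np.full(100, 5.0)])
ew_mu, ew_var = ew_mean_var(x, alpha=0.02)
```

Here the plain sample mean is 2.5, while the exponentially weighted mean sits much closer to the recent level of 5, illustrating how EWEM lets recent market patterns outweigh stale ones during parameter updates.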
|
370 |
Interpolation of non-smooth functions on anisotropic finite element meshesApel, Th. 30 October 1998 (has links) (PDF)
In this paper, several modifications of the quasi-interpolation operator
of Scott and Zhang (Math. Comp. 54(1990)190, 483--493) are discussed.
The modified operators are defined for non-smooth functions and are suited
for application on anisotropic meshes. The anisotropy of the elements
is reflected in the local stability and approximation error estimates.
As an application, an example is considered where anisotropic finite element
meshes are appropriate, namely the Poisson problem in domains with edges.
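For orientation (a typical shape of such results, not an exact quotation from the paper), anisotropic local estimates distinguish the element sizes h_{1,T}, ..., h_{d,T} in the different coordinate directions, e.g.

```latex
\| u - Z_h u \|_{L^2(T)} \;\le\; C \sum_{i=1}^{d} h_{i,T}\,
\Bigl\| \frac{\partial u}{\partial x_i} \Bigr\|_{L^2(S_T)},
```

where S_T is a neighbourhood of the element T; the point of such estimates is that the constant does not deteriorate with the aspect ratio, which is what makes stretched elements usable near edge singularities.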
|