1 |
Coded modulation schemes for wireless channels / Ng, Soon Xin, January 2002
No description available.
|
2 |
Radial Basis Collocation Method for Singularly Perturbed Partial Differential Equations / Li, Fang-wen, 21 June 2004
In this thesis, we integrate the particular solutions of singularly perturbed partial differential equations into the radial basis collocation method to solve two kinds of boundary layer problems.
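As background, a minimal sketch of the radial basis collocation idea for a model 1D boundary layer problem is given below; the Gaussian basis, node counts, and shape parameter are assumptions for illustration, and the particular-solution enhancement developed in the thesis is not reproduced.

```python
# Minimal sketch (assumed, not the thesis code) of radial basis collocation
# for the model singularly perturbed problem
#     -eps * u''(x) + u'(x) = 1  on (0, 1),   u(0) = u(1) = 0,
# whose solution has a boundary layer near x = 1.
import numpy as np

eps = 0.1                                   # perturbation parameter
c = 40.0                                    # Gaussian shape parameter (assumed)
centers = np.linspace(0.0, 1.0, 31)         # RBF centers
colloc = centers.copy()                     # collocation points (same nodes)

def phi(r):                                 # Gaussian basis, r = x - x_j
    return np.exp(-(c * r) ** 2)

def dphi(r):                                # first derivative with respect to x
    return -2.0 * c ** 2 * r * np.exp(-(c * r) ** 2)

def d2phi(r):                               # second derivative with respect to x
    return (4.0 * c ** 4 * r ** 2 - 2.0 * c ** 2) * np.exp(-(c * r) ** 2)

R = colloc[:, None] - centers[None, :]      # signed differences x_i - x_j

# Interior rows enforce the differential operator; the first and last rows
# enforce the Dirichlet boundary conditions.
A = -eps * d2phi(R) + dphi(R)
b = np.ones_like(colloc)
A[0, :], A[-1, :] = phi(R[0, :]), phi(R[-1, :])
b[0], b[-1] = 0.0, 0.0

coef = np.linalg.lstsq(A, b, rcond=None)[0]

# Compare against the closed-form solution of this model problem.
x = np.linspace(0.0, 1.0, 201)
u_rbf = phi(x[:, None] - centers[None, :]) @ coef
u_exact = x - (np.exp((x - 1.0) / eps) - np.exp(-1.0 / eps)) / (1.0 - np.exp(-1.0 / eps))
print("max pointwise difference:", np.abs(u_rbf - u_exact).max())
```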
|
3 |
Refined error estimates for matrix-valued radial basis functions / Fuselier, Edward J., January 2006
Thesis (Ph. D.)--Texas A&M University, 2006. / "May 2006." "Major subject: Mathematics." Vita. Includes bibliographical references (p. 67-70). Also available online.
|
4 |
L^p Bernstein Inequalities and Radial Basis Function Approximation / Ward, John P., August 2010
In approximation theory, three classical types of results are direct theorems, Bernstein inequalities, and inverse theorems. In this paper, we include results about radial basis function (RBF) approximation from all three classes. Bernstein inequalities are a recent development in the theory of RBF approximation, and on R^d, only L^2 results are known for RBFs with algebraically decaying Fourier transforms (e.g., the Sobolev splines and thin-plate splines). We will therefore extend what is known by establishing L^p Bernstein inequalities for RBF networks on R^d. These inequalities involve bounding a Bessel-potential norm of an RBF network by its corresponding L^p norm in terms of the separation radius associated with the network. While Bernstein inequalities have a variety of applications in approximation theory, they are most commonly used to prove inverse theorems. Therefore, using the L^p Bernstein inequalities for RBF approximants, we will establish the corresponding inverse theorems. The direct theorems of this paper relate to approximation in L^p(R^d) by RBFs which are perturbations of Green's functions. Results of this type are known for certain compact domains, and results have recently been derived for approximation in L^p(R^d) by RBFs that are Green's functions. Therefore, we will prove that known results for approximation in L^p(R^d) hold for a larger class of RBFs. We will then show how this result can be used to derive rates for approximation by Wendland functions.
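As orientation, such a Bernstein inequality has the schematic shape below; the constant, the admissible smoothness orders, and the precise Bessel-potential space depend on the RBF and are not quoted from this work.

```latex
% Schematic shape of an L^p Bernstein inequality for an RBF network;
% C, the admissible orders k, and the exact smoothness space depend on the
% RBF and are not taken from this dissertation.
\[
  \| s \|_{H^{k}_{p}(\mathbb{R}^{d})} \;\le\; C\, q^{-k}\, \| s \|_{L^{p}(\mathbb{R}^{d})},
  \qquad
  s \in \operatorname{span}\{\Phi(\cdot - x_j) : x_j \in X\},
\]
```

Here q denotes the separation radius of the node set X, so the inequality trades smoothness of the network against the density of its centers.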
|
5 |
Construction of Compact 3D Objects by Radial Basis Functions and Progressive Compression / Huang, Wei-Chiuan, 02 February 2006
The representation of 3D computer graphics objects has been studied for a long time. Most 3D object models are obtained by 3D scanning systems. Such data are not only very large but also highly redundant, and they consume a great deal of time and resources. For this reason, how to represent an object efficiently is always an important issue. The purpose of this study is to represent objects by implicit functions. Unlike a polygon-mesh representation, an implicit function represents an object very compactly, because a single mathematical form of the object can be obtained from different data forms. The implicit functions used are radial basis functions, and a BSP tree is used to partition the object and reduce the amount of computation. We also use progressive-mesh compression to decrease storage and computing time. In addition, the object can be rendered according to the sampling points on each implicit surface.
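A minimal sketch of the underlying fitting step is shown below, assuming the common approach of adding off-surface constraint points along the normals and solving for RBF weights plus a linear polynomial; the biharmonic kernel, the offset, and the helper names are illustrative assumptions, and the BSP partitioning and progressive compression described in the thesis are not reproduced.

```python
# Minimal sketch (assumed, not the thesis implementation) of fitting an RBF
# implicit surface s(x) = sum_i w_i * |x - c_i| + p(x) to scattered surface
# samples, with off-surface constraint points created along the normals.
import numpy as np

def fit_rbf_implicit(points, normals, offset=0.05):
    """points: (n, 3) surface samples, normals: (n, 3) unit normals."""
    # Off-surface points carry nonzero target values so s = 0 only on the surface.
    centers = np.vstack([points,
                         points + offset * normals,
                         points - offset * normals])
    values = np.concatenate([np.zeros(len(points)),
                             np.full(len(points),  offset),
                             np.full(len(points), -offset)])
    # Biharmonic kernel phi(r) = r plus a linear polynomial term (1, x, y, z).
    r = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    P = np.hstack([np.ones((len(centers), 1)), centers])
    A = np.block([[r, P], [P.T, np.zeros((4, 4))]])
    rhs = np.concatenate([values, np.zeros(4)])
    sol = np.linalg.lstsq(A, rhs, rcond=None)[0]
    return centers, sol[:len(centers)], sol[len(centers):]

def evaluate(x, centers, w, poly):
    r = np.linalg.norm(x[None, :] - centers, axis=-1)
    return r @ w + poly[0] + x @ poly[1:]

# Tiny usage example on a coarsely sampled unit sphere.
rng = np.random.default_rng(0)
pts = rng.normal(size=(200, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)      # points on the sphere
centers, w, poly = fit_rbf_implicit(pts, pts)          # normals equal positions
print(evaluate(np.array([0.0, 0.0, 1.0]), centers, w, poly))  # near zero on surface
print(evaluate(np.array([0.0, 0.0, 0.0]), centers, w, poly))  # typically nonzero inside
```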
|
6 |
Refined error estimates for matrix-valued radial basis functions / Fuselier, Edward J., Jr., 17 September 2007
Radial basis functions (RBFs) are probably best known for their applications to scattered data problems. Until the 1990s, RBF theory only involved functions that were scalar-valued. Matrix-valued RBFs were subsequently introduced by Narcowich and Ward in 1994, when they constructed divergence-free vector-valued functions that interpolate data at scattered points. In 2002, Lowitzsch gave the first error estimates for divergence-free interpolants. However, these estimates are only valid when the target function resides in the native space of the RBF. In this paper we develop Sobolev-type error estimates for cases where the target function is less smooth than functions in the native space. In the process of doing this, we give an alternate characterization of the native space, derive improved stability estimates for the interpolation matrix, and give divergence-free interpolation and approximation results for band-limited functions. Furthermore, we introduce a new class of matrix-valued RBFs that can be used to produce curl-free interpolants.
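For background, the matrix-valued kernel constructions commonly used in this literature can be built from a scalar RBF phi as shown below; this is stated as context under that assumption, not as a quotation of the dissertation's definitions.

```latex
% Background sketch: matrix-valued kernels with divergence-free and curl-free
% columns built from a scalar RBF phi; not quoted from the dissertation.
\[
  \Phi_{\mathrm{div}}(x)  = \bigl(-\Delta I + \nabla\nabla^{\mathsf T}\bigr)\phi(x),
  \qquad
  \Phi_{\mathrm{curl}}(x) = -\nabla\nabla^{\mathsf T}\phi(x).
\]
```

Interpolants of the form s(x) = sum_j Phi(x - x_j) c_j then inherit the divergence-free or curl-free property from the columns of the kernel.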
|
7 |
Neural and genetic modelling, control and real-time finite simulation of flexible manipulators / Shaheed, Mohammad Hasan, January 2000
No description available.
|
8 |
Radial Bases and Ill-Posed Problems / Chen, Ho-Pu, 15 August 2006
RBFs are useful in scientific computing. In this thesis, we are interested in the positions of collocation points and RBF centers that cause the matrix for RBF interpolation to become singular or ill-conditioned. We explore the best bases by minimizing the error function in the supremum norm and in the root mean square sense. We also use radial basis functions to interpolate shifted data and find the best basis in a certain sense.
In the second part, we solve ill-posed problems by the radial basis collocation method with different radial basis functions and various numbers of bases. If the solution is not unique, then the numerical solutions differ for different bases. To construct all the solutions, we can choose one approximate solution and add linear combinations of the difference functions obtained from the various bases. If the solution does not exist, we show that the numerical solution always fails to satisfy the original equation.
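A minimal sketch of the ill-conditioning issue is shown below (an assumed illustration, not an experiment from the thesis): the Gaussian interpolation matrix on equally spaced points becomes rapidly worse conditioned as the shape parameter decreases and the basis functions flatten.

```python
# Assumed illustration (not taken from the thesis): conditioning of the
# Gaussian RBF interpolation matrix A_ij = exp(-(c * |x_i - x_j|)^2)
# deteriorates as the shape parameter c decreases.
import numpy as np

x = np.linspace(0.0, 1.0, 20)                 # collocation points = centers
r = np.abs(x[:, None] - x[None, :])           # pairwise distances

for c in (10.0, 5.0, 2.0, 1.0):
    A = np.exp(-(c * r) ** 2)
    print(f"shape parameter c = {c:>4}: cond(A) = {np.linalg.cond(A):.3e}")
```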
|
9 |
Radial parts of invariant differential operators on Grassmann manifolds / Kurgalina, Olga S., January 2004
Thesis (Ph.D.)--Tufts University, 2004. / Adviser: Fulton B. Gonzalez. Submitted to the Dept. of Mathematics. Includes bibliographical references (leaves 72-73). Access restricted to members of the Tufts University community. Also available via the World Wide Web.
|
10 |
Study on Additive Generalized Radial Basis Function Networks / Liao, Shih-hui, 18 June 2009
In this thesis, we propose a new class of learning models, namely the additive generalized radial basis function networks (AGRBFNs), for general nonlinear regression problems. This class of learning machines combines the generalized radial basis function networks (GRBFNs) commonly used in general machine learning problems with the additive models (AMs) frequently encountered in semiparametric regression problems. In statistical regression theory, the AM is a good compromise between the linear model and the nonparametric model. To obtain a more general network structure capable of addressing more general data sets, the AMs are embedded in the output layer of the GRBFNs to form the AGRBFNs. Simple weight-update rules based on incremental gradient descent are derived. Several illustrative examples are provided to compare the performance of the classical GRBFNs and the proposed AGRBFNs. Simulation results show that, upon proper selection of the hidden nodes and the bandwidth of the kernel smoother used in the additive output layer, AGRBFNs can give better fits than the classical GRBFNs. Furthermore, for a given learning problem, AGRBFNs usually need fewer hidden nodes than GRBFNs for the same level of accuracy.
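A minimal sketch of the classical GRBFN baseline, trained with incremental gradient descent on the output weights, is given below; the Gaussian widths, center placement, and learning rate are assumptions for illustration, and the additive output layer that defines the AGRBFN is not reproduced since its exact form is not stated in the abstract.

```python
# Minimal sketch of a classical Gaussian RBF network trained by incremental
# (sample-by-sample) gradient descent on the output weights. This is the
# GRBFN baseline only, not the AGRBFN proposed in the thesis.
import numpy as np

rng = np.random.default_rng(1)

# Toy 1D regression data: y = sin(2*pi*x) + noise.
x = rng.uniform(0.0, 1.0, size=200)
y = np.sin(2.0 * np.pi * x) + 0.1 * rng.normal(size=x.size)

centers = np.linspace(0.0, 1.0, 10)      # hidden-node centers (fixed here)
width = 0.1                              # common Gaussian width (assumed)
w = np.zeros(centers.size)               # output-layer weights
bias = 0.0
lr = 0.05                                # learning rate (assumed)

def hidden(xi):
    """Hidden-layer activations for a single input sample."""
    return np.exp(-((xi - centers) ** 2) / (2.0 * width ** 2))

for epoch in range(200):
    for xi, yi in zip(x, y):
        h = hidden(xi)
        err = (w @ h + bias) - yi        # prediction error for this sample
        w -= lr * err * h                # incremental gradient step
        bias -= lr * err

pred = np.array([w @ hidden(xi) + bias for xi in x])
print("training RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```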
|