  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
221

Renormalized integrals and a path integral formula for the heat kernel on a manifold

Bär, Christian January 2012 (has links)
We introduce renormalized integrals which generalize conventional measure theoretic integrals. One approximates the integration domain by measure spaces and defines the integral as the limit of integrals over the approximating spaces. This concept is implicitly present in many mathematical contexts such as Cauchy's principal value, the determinant of operators on a Hilbert space and the Fourier transform of an L^p function. We use renormalized integrals to define a path integral on manifolds by approximation via geodesic polygons. The main part of the paper is dedicated to the proof of a path integral formula for the heat kernel of any self-adjoint generalized Laplace operator acting on sections of a vector bundle over a compact Riemannian manifold.
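As a concrete instance of the limiting construction described above (this worked example is ours, not taken from the paper), Cauchy's principal value is precisely an integral defined as a limit over approximating domains that excise the singularity:

```latex
\operatorname{p.v.}\int_{-1}^{1}\frac{dx}{x}
  \;=\; \lim_{\varepsilon\to 0^{+}} \int_{\varepsilon\le |x|\le 1}\frac{dx}{x}
  \;=\; 0
```

The ordinary Lebesgue integral of 1/x over [-1,1] does not exist, yet the limit over the symmetric approximating domains does, which is exactly the shape of a renormalized integral.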
222

Méthode d'éléments spectraux avec joints pour des géométries axisymétriques

Satouri, Jamil 09 November 2010 (has links) (PDF)
In this thesis we are interested in three-dimensional Laplace and Stokes problems in axisymmetric domains. Using expansions in Fourier coefficients, these problems are reduced, without approximation, to a countable family of two-dimensional problems. The domains considered have geometric singularities and are decomposed in a way that is not necessarily conforming. The nonconformities on the interfaces between subdomains are handled by the mortar method. The basic discretization method is the spectral method. We then prove optimal approximation results, close to those obtained for conforming approximation with continuity constraints on the interfaces. This proves once again the efficiency of the mortar method.
223

Normally solvable nonlinear boundary value problems

Alsaedy, Ammar, Tarkhanov, Nikolai January 2013 (has links)
We study a boundary value problem for an overdetermined elliptic system of nonlinear first order differential equations with linear boundary operators. Such a problem is solvable only for a small set of data, and so we pass to its variational formulation, which consists in minimising the discrepancy. The Euler-Lagrange equations for the variational problem are far-reaching analogues of the classical Laplace equation. Within the framework of Euler-Lagrange equations we specify an operator on the boundary whose zero set consists precisely of those boundary data for which the initial problem is solvable. The construction of such an operator has much in common with that of the familiar Dirichlet-to-Neumann operator. In the case of linear problems we establish complete results.
224

Efficient Kernel Methods for Statistical Detection

Su, Wanhua 20 March 2008 (has links)
This research is motivated by a drug discovery problem -- the AIDS anti-viral database from the National Cancer Institute. The objective of the study is to develop effective statistical methods to model the relationship between the chemical structure of a compound and its activity against the HIV-1 virus. The resulting structure-activity model can then be used to predict the activity of new compounds and thus helps identify active chemical compounds that can serve as drug candidates. Since active compounds are generally rare in a compound library, we recognize the drug discovery problem as an application of the so-called statistical detection problem. In a typical statistical detection problem, we have data {Xi, Yi}, where Xi is the predictor vector of the ith observation and Yi ∈ {0,1} is its class label. The objective of a statistical detection problem is to identify class-1 observations, which are extremely rare. Besides drug discovery, other applications of statistical detection include direct marketing and fraud detection. We propose a computationally efficient detection method called LAGO, which stands for "locally adjusted GO estimator". The original idea is inspired by an ancient game known today as "GO". The construction of LAGO consists of two steps. In the first step, we estimate the density of class 1 with an adaptive-bandwidth kernel density estimator. The kernel functions are located at, and only at, the class-1 observations. The bandwidth of the kernel function centered at a given class-1 observation is calculated as the average distance between that observation and its K nearest class-0 neighbors. In the second step, we adjust the density estimated in the first step locally according to the density of class 0. It can be shown that the amount of adjustment in the second step is approximately inversely proportional to the bandwidth calculated in the first step.
Application to the NCI data demonstrates that LAGO is superior to methods such as K nearest neighbors and support vector machines. One drawback of the existing LAGO is that it only provides a point estimate of a test point's probability of being class 1, ignoring the uncertainty of the model. In the second part of this thesis, we present a Bayesian framework for LAGO, referred to as BLAGO. This Bayesian approach enables quantification of uncertainty. Non-informative priors are adopted. The posterior distribution is calculated over a grid of (K, alpha) pairs by integrating out beta0 and beta1 using the Laplace approximation, where K and alpha are the two parameters used to construct the LAGO score. The parameters beta0 and beta1 are the coefficients of the logistic transformation that converts the LAGO score to the probability scale. BLAGO provides proper probabilistic predictions with support on (0,1) and captures the uncertainty of the predictions as well. By avoiding Markov chain Monte Carlo algorithms and using the Laplace approximation, BLAGO is computationally very efficient. Without the need for cross-validation, BLAGO is even more computationally efficient than LAGO.
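The two-step construction described in the abstract can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the Gaussian kernel, the Euclidean metric, and the treatment of the step-2 adjustment as a simple bandwidth-dependent factor are all our simplifying assumptions.

```python
import math

def lago_score(x, class1, class0, K=5, alpha=1.0):
    """Score a test point x: higher means more likely class 1.

    Sketch of the two-step LAGO construction:
      1) adaptive-bandwidth kernel density estimate of class 1, with the
         bandwidth at each class-1 point set to the average distance to
         its K nearest class-0 neighbours;
      2) local adjustment by the class-0 density, folded here into a
         bandwidth-dependent factor (a simplifying assumption of ours).
    """
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    score = 0.0
    for xi in class1:
        # Step 1: adaptive bandwidth from the K nearest class-0 neighbours.
        d = sorted(dist(xi, xj) for xj in class0)[:K]
        b = alpha * sum(d) / len(d)
        # Gaussian kernel centred at the class-1 observation.
        k = math.exp(-dist(x, xi) ** 2 / (2 * b * b)) / (b * math.sqrt(2 * math.pi))
        # Step 2 (simplified): adjustment taken as multiplication by b.
        score += b * k
    return score
```

A point close to the lone class-1 observation then scores higher than a distant one, which is the qualitative behaviour the detection problem requires.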
225

On the Classification of the R-separable webs for the Laplace equation in E^3

Chanachowicz, Mark 16 April 2008 (has links)
In the first two Chapters I outline the theory and background of separation of variables as an ansatz for solving fundamental partial differential equations (PDEs) in Mathematical Physics. Two fundamental approaches will be highlighted, and more modern approaches discussed. In Chapter 3 I calculate the general trace-free conformal Killing tensor defined in Euclidean space, expressed as a sum of symmetric tensor products of conformal Killing vectors. In Chapter 4 I determine the subcases with rotational symmetry and recover known examples pertaining to classical rotational coordinates. In Chapter 5 I obtain the induced action of the conformal group on the space of trace-free conformal Killing tensors. In Chapter 6 I use the invariants of trace-free conformal Killing tensors under the action of the conformal group to characterize, up to equivalence, the symmetric R-separable webs in E^3 that permit conformal separation of variables of the fundamental PDEs in Mathematical Physics. In Chapter 7 the asymmetric R-separable metrics are obtained via a study of the separability conditions for the conformally invariant Laplace equation.
228

An introduction to Gerber-Shiu analysis

Huynh, Mirabelle January 2011 (has links)
A valuable analytical tool for understanding the event of ruin is the Gerber-Shiu discounted penalty function. It acts as a unified means of identifying ruin-related quantities which may help insurers understand their vulnerability to ruin. This thesis provides an introduction to the basic concepts and common techniques used in Gerber-Shiu analysis. Chapter 1 introduces the insurer's surplus process in the ordinary Sparre Andersen model. Defective renewal equations, the Dickson-Hipp transform, and Lundberg's fundamental equation are reviewed. Chapter 2 introduces the classical Gerber-Shiu discounted penalty function. Two framework equations are derived by conditioning on the first drop in surplus below its initial value, and by conditioning on the time and amount of the first claim. A detailed discussion is provided for each of these conditioning arguments. The classical Poisson model (where interclaim times are exponentially distributed) is then considered, as is the case where claim sizes are exponentially distributed. Chapter 3 introduces the Gerber-Shiu function in the delayed renewal model, which allows the time until the first claim to be distributed differently from subsequent interclaim times. We determine a functional relationship between the Gerber-Shiu function in the ordinary Sparre Andersen model and the Gerber-Shiu function in the delayed model for a class of first interclaim time densities which includes the equilibrium density for the stationary renewal model, and the exponential density. To conclude, Chapter 4 introduces a generalized Gerber-Shiu function in which the penalty function includes two additional random variables: the minimum surplus level before ruin, and the surplus immediately after the claim preceding the claim that causes ruin. This generalized Gerber-Shiu function allows for the study of random variables which otherwise could not be studied using the classical definition of the function.
Additionally, it is assumed that the size of a claim is dependent on the interclaim time that precedes it. As in Chapter 2, a detailed discussion of each of the two conditioning arguments is provided. Using the uniqueness property of Laplace transforms, the forms of the joint defective discounted densities of interest are determined. The classical Poisson model and the exponential claim size assumption are also revisited.
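For reference, the classical Gerber-Shiu discounted penalty function discussed above takes the standard form (notation ours):

```latex
m_{\delta}(u) \;=\; \mathbb{E}\!\left[ e^{-\delta T}\,
  w\bigl(U(T^{-}),\,\lvert U(T)\rvert\bigr)\,
  \mathbf{1}\{T<\infty\} \;\Big|\; U(0)=u \right]
```

where $U(t)$ is the surplus process, $T$ the time of ruin, $\delta \ge 0$ the discount (force of interest) rate, and $w$ a penalty function of the surplus immediately before ruin and the deficit at ruin.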
229

Study of Singular Capillary Surfaces and Development of the Cluster Newton Method

Aoki, Yasunori January 2012 (has links)
In this thesis, we explore two important aspects of the study of differential equations: the analytical and the computational. We first consider a partial differential equation model for a static liquid surface (capillary surface). We prove through mathematical analysis that the solution of this mathematical model (the Laplace-Young equation) in a cusp domain can be bounded or unbounded depending on the boundary conditions. By utilizing the knowledge we have obtained about the singular behaviour of the solution through mathematical analysis, we then construct a numerical methodology to accurately approximate unbounded solutions of the Laplace-Young equation. Using this accurate numerical methodology, we explore some remaining open problems on singular solutions of the Laplace-Young equation. Lastly, we consider ordinary differential equation models used in the pharmaceutical industry and develop a numerical method for estimating model parameters from incomplete experimental data. With our numerical method, the parameter estimation can be done significantly faster and more robustly than with conventional methods.
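The parameter-estimation step can be illustrated with a toy version of the problem: fitting the rate constant of a one-compartment decay model x'(t) = -k x(t) to observations by least squares. This sketch is ours and deliberately generic; it is not the Cluster Newton method developed in the thesis, only the kind of fitting problem such a method addresses.

```python
import math

def simulate(k, x0, ts):
    # Closed-form solution of the one-compartment model x'(t) = -k x(t).
    return [x0 * math.exp(-k * t) for t in ts]

def fit_rate(ts, data, x0, k_grid):
    # Least-squares estimate of k over a grid of candidate values.
    def sse(k):
        return sum((m - d) ** 2 for m, d in zip(simulate(k, x0, ts), data))
    return min(k_grid, key=sse)
```

With noise-free synthetic data the grid search recovers the generating rate exactly; real pharmacokinetic data are incomplete and noisy, which is what motivates faster and more robust estimators.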
230

Fundamental Studies of Capillary Forces in Porous Media

Alvarellos, Jose 18 March 2004 (has links)
The contact angle defined by Young's equation depends on the ratio between solid and liquid surface energies. Young's contact angle is constant for a given system and cannot explain the stability of fluid droplets in capillary tubes. Within this framework, large variations in contact angle are explained by assuming surface roughness, heterogeneity or contamination. This research explores the static and dynamic behavior of fluid droplets within capillary tubes and the variations in contact angle among interacting menisci. Various cases are considered, including wetting and non-wetting fluids, droplets in inclined capillary tubes or subjected to a pressure difference, within one-dimensional and three-dimensional capillary systems, and under static or dynamic conditions (either harmonic fluid pressure or tube oscillation). The research approach is based on complementary analytical modeling (total energy formulation) and experimental techniques (microscopic observations). The evolution of meniscus curvatures and droplet displacements is studied in all cases. Analytical and experimental results show that droplets can be stable within capillary tubes even under the influence of an external force, that the resulting contact angles are not constant, and that variations from Young's contact angle are extensively justified by menisci interaction. Menisci introduce stiffness; therefore two immiscible Newtonian fluids behave as a Maxwellian fluid, and droplets can exhibit resonance or relaxation spectral features.
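Young's equation referenced above relates the equilibrium contact angle to the three interfacial energies, gamma_sv = gamma_sl + gamma_lv cos(theta). A minimal numerical form (our illustration; the energy values used below are hypothetical, not from the thesis) is:

```python
import math

def young_contact_angle(gamma_sv, gamma_sl, gamma_lv):
    """Contact angle theta (degrees) from Young's equation:
       gamma_sv = gamma_sl + gamma_lv * cos(theta)."""
    c = (gamma_sv - gamma_sl) / gamma_lv
    if not -1.0 <= c <= 1.0:
        # No equilibrium angle: complete wetting or complete dewetting.
        raise ValueError("Young's equation has no solution for these energies")
    return math.degrees(math.acos(c))
```

When gamma_sv equals gamma_sl the angle is 90 degrees, and when the energy difference equals gamma_lv the droplet spreads completely (theta = 0); the abstract's point is that observed angles between interacting menisci deviate from this single fixed value.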
