About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
61

Bias of the maximum likelihood estimator of the generalized Rayleigh distribution

Ling, Xiao 29 August 2011 (has links)
We derive analytic expressions for the biases, to O(n^(-1)), of the maximum likelihood estimators of the parameters of the generalized Rayleigh distribution family. Using these expressions to bias-correct the estimators is found to be extremely effective in terms of bias reduction, and generally results in a small reduction in relative mean squared error. In general, the analytic bias-corrected estimators are also found to be superior to the alternative of bias-correction via the bootstrap. / Graduate
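The contrast the abstract draws — analytic O(1/n) bias correction versus the bootstrap — can be illustrated with a minimal sketch. For brevity this uses the ordinary one-parameter Rayleigh distribution rather than the two-parameter generalized family treated in the thesis, and it implements only the bootstrap side of the comparison; the function names and sample sizes are illustrative, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def rayleigh_mle(x):
    # Closed-form MLE of the Rayleigh scale: sigma_hat = sqrt(sum(x_i^2) / (2n)).
    # sigma_hat^2 is unbiased for sigma^2, but sigma_hat itself is biased to O(1/n).
    x = np.asarray(x, dtype=float)
    return float(np.sqrt(np.sum(x**2) / (2 * len(x))))

def bootstrap_bias_corrected(x, estimator, n_boot=2000, rng=rng):
    # Bootstrap alternative to an analytic correction:
    # theta_bc = 2 * theta_hat - mean(theta_hat over resamples).
    x = np.asarray(x, dtype=float)
    theta_hat = estimator(x)
    boot = [estimator(rng.choice(x, size=len(x), replace=True))
            for _ in range(n_boot)]
    return 2 * theta_hat - float(np.mean(boot))

sigma_true = 2.0
sample = rng.rayleigh(scale=sigma_true, size=25)  # small n, so the O(1/n) bias matters
print(rayleigh_mle(sample), bootstrap_bias_corrected(sample, rayleigh_mle))
```

An analytic correction, as in the thesis, would instead subtract a closed-form expression for the O(1/n) bias term, avoiding the resampling loop entirely.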
62

Extensions of Weyl metrics.

Morgan, Francis Hamilton. January 1977 (has links) (PDF)
Thesis (M.Sc.) -- University of Adelaide, Dept. of Mathematical Physics, 1977.
63

Die Gauss-Bonnet-Formel in konform-euklidischen Räumen [The Gauss-Bonnet formula in conformally Euclidean spaces]

Raab, Werner. January 1972 (has links)
Habilitationsschrift, Bonn, 1971; extra t.p. inserted. / Includes bibliographical references (p. [92-93]).
64

Set-theoretic consistency results and topological theorems concerning the normal Moore space conjecture and related problems

Tall, Franklin D. January 1969 (has links)
Thesis (Ph. D.)--University of Wisconsin--Madison, 1969. / Typescript. Vita. Description based on print version record. Includes bibliographical references.
65

Charakterisierungen von Saturationsklassen in L¹(Eₙ) [Characterizations of saturation classes in L¹(Eₙ)]

Trebels, Walter. January 1900 (has links)
Diss.--Technische Hochschule, Aachen. / Vita. Bibliography: p. 89-93.
66

Advances in kernel methods : towards general-purpose and scalable models

Samo, Yves-Laurent Kom January 2017 (has links)
A wide range of statistical and machine learning problems involve learning one or more latent functions, or properties thereof, from data. Examples include regression, classification, principal component analysis, optimisation, learning intensity functions of point processes, and reinforcement learning, to name but a few. For all these problems, positive semi-definite kernels (or simply kernels) provide a powerful tool for postulating flexible nonparametric hypothesis spaces over functions. Despite recent work on such kernel methods, parametric alternatives such as deep neural networks have been at the core of most artificial intelligence breakthroughs in recent years. In this thesis, both theoretical and methodological foundations are presented for constructing fully automated, scalable, and general-purpose kernel machines that perform well over a wide range of input dimensions and sample sizes. This thesis aims to bridge the gap between kernel methods and deep learning, and to propose methods that, unlike deep learning, perform well on both small- and large-scale problems. In Part I we give a gentle introduction to kernel methods, review recent work, identify remaining gaps, and outline our contributions. In Part II we develop flexible and scalable Bayesian kernel methods to address gaps in methods capable of dealing with the special case of datasets exhibiting locally homogeneous patterns. We begin with two motivating applications. First, in Chapter 2, we consider inferring the intensity function of an inhomogeneous point process. This application illustrates that, by carefully adding some mild asymmetry to the dependency structure of a Bayesian kernel method, one may often considerably scale up inference while improving flexibility and accuracy.
In Chapter 3 we propose a scalable scheme for online forecasting of time series and fully-online learning of the related model parameters, under a kernel-based generative model that is provably sufficiently flexible. This application illustrates that, for one-dimensional input spaces, restricting the degree of differentiability of the latent function of interest may considerably speed up inference without resorting to approximations and without any adverse effect on flexibility or accuracy. Chapter 4 generalizes these approaches and proposes a novel class of stochastic processes, which we refer to as string Gaussian processes (string GPs), that, when used as functional priors in a Bayesian nonparametric framework, allow for inference in linear time complexity and linear memory requirement, without resorting to approximations. More importantly, the corresponding inference scheme, which we derive in Chapter 5, also allows flexible learning of locally homogeneous patterns and automated learning of model complexity, that is, automated learning of whether there are local patterns in the data in the first place, to what extent local patterns are present, and where they are located. In Part III we provide a broader discussion covering all types of patterns (homogeneous, locally homogeneous, or heterogeneous) and both Bayesian and frequentist kernel methods. In Chapter 6 we begin by discussing what properties a family of kernels should possess to enable fully automated kernel methods applicable to any type of dataset. In this chapter we introduce a novel mathematical formalism for the notion of 'general-purpose' families of kernels, and we argue that existing families of kernels are not general-purpose. In Chapter 7 we derive weak sufficient conditions for families of kernels to be general-purpose, and we exhibit such tractable families enjoying a suitable parametrisation, which we refer to as generalized spectral kernels (GSKs).
In Chapter 8 we provide a scalable inference scheme for automated kernel learning using general-purpose families of kernels. The proposed inference scheme scales linearly with the sample size and enables automated learning of nonstationarity and model complexity from the data, in virtually any kernel method. Finally, we conclude with a discussion in Chapter 9 where we show that deep learning can be regarded as a particular type of kernel learning method, and we discuss possible extensions in Chapter 10.
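The exact Gaussian-process regression that the thesis seeks to scale can be sketched in a few lines. This is the textbook O(n³) baseline — posterior mean under a squared-exponential kernel — not the string-GP scheme of the thesis; function names and hyperparameter values are illustrative.

```python
import numpy as np

def rbf_kernel(xa, xb, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) kernel: a positive semi-definite kernel on 1-D inputs.
    d2 = (xa[:, None] - xb[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-2):
    # Exact GP regression posterior mean: K_* (K + noise * I)^{-1} y.
    # Solving this n x n linear system is the cubic cost that linear-time
    # schemes such as string GPs are designed to avoid.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_star = rbf_kernel(x_test, x_train)
    return K_star @ np.linalg.solve(K, y_train)

x = np.linspace(0, 2 * np.pi, 40)
y = np.sin(x)
x_new = np.array([np.pi / 2])
print(gp_posterior_mean(x, y, x_new))  # close to sin(pi/2) = 1
```

The kernel choice here is the kind of fixed, stationary family that Part III argues is not general-purpose: its hyperparameters control a single global lengthscale, so it cannot adapt to locally homogeneous or nonstationary patterns.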
67

Extension theorems on L-topological spaces and L-fuzzy vector spaces

Pinchuck, Andrew January 2002 (has links)
A non-trivial example of an L-topological space, the fuzzy real line, is examined. Various L-topological properties and their relationships are developed. Extension theorems on the L-fuzzy real line, as well as extension theorems on more general L-topological spaces, follow. Finally, a theory of L-fuzzy vector spaces leads up to a fuzzy version of the Hahn-Banach theorem.
68

Characterization of subspaces of rank two grassmann vectors of order two

Lim, Marion Josephine Sui Sim January 1967 (has links)
Let U be an n-dimensional vector space over an algebraically closed field. Let [formula omitted] denote the [formula omitted] space spanned by all Grassmann products [formula omitted]. Subsets of vectors of [formula omitted], denoted by [formula omitted] and [formula omitted], are defined as follows: [formula omitted]. A vector which is in [formula omitted] or is zero is called pure or decomposable. Each vector in [formula omitted] is said to have rank one. Similarly, each vector in [formula omitted] has rank two. A subspace H of [formula omitted] is called a rank two subspace if [formula omitted] is contained in [formula omitted]. In this thesis we are concerned with investigating rank two subspaces. The main results are as follows: If dim [formula omitted] such that every nonzero vector [formula omitted] is independent in U. The rank two subspaces of dimension less than four are also characterized. / Science, Faculty of / Mathematics, Department of / Graduate
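The rank notion in this abstract can be made concrete numerically. A 2-vector over U = F⁴ can be identified with an antisymmetric 4×4 matrix; it is decomposable (rank one in the thesis's sense) exactly when that matrix has matrix rank 2, while a sum of two independent products has matrix rank 4. The sketch below, with illustrative names and working over the reals rather than an algebraically closed field, checks both cases.

```python
import numpy as np

def wedge(u, v):
    # Represent the 2-vector u ∧ v as the antisymmetric matrix u v^T - v u^T.
    u, v = np.asarray(u, float), np.asarray(v, float)
    return np.outer(u, v) - np.outer(v, u)

def grassmann_rank(omega):
    # Antisymmetric matrices have even matrix rank: matrix rank 2 corresponds
    # to a decomposable (pure) 2-vector, matrix rank 4 to "rank two" above.
    return np.linalg.matrix_rank(omega) // 2

e = np.eye(4)
pure = wedge(e[0], e[1])                       # e1 ∧ e2: decomposable
mixed = wedge(e[0], e[1]) + wedge(e[2], e[3])  # e1 ∧ e2 + e3 ∧ e4: not decomposable
print(grassmann_rank(pure), grassmann_rank(mixed))  # 1 and 2
```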
69

On the Generalized Dirichlet Problem

Haines, Paul Douglas 08 1900 (has links)
<p> In this thesis, we shall solve the classical Dirichlet problem for a ball in n-dimensional Euclidean space, and then point out that the classical Dirichlet problem is not always solvable. Following Wiener and Brelot, we then introduce a generalized Dirichlet problem for any bounded region in n-dimensional Euclidean space and establish necessary and sufficient conditions for its solution. We show that the solution of the generalized Dirichlet problem coincides with the solution of the classical Dirichlet problem whenever the latter exists. Finally, we characterize those regions for which the classical Dirichlet problem is solvable by considering the boundary behaviour of those functions for which the generalized problem is solvable.</p> / Thesis / Master of Science (MSc)
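The classical solution for the ball that the thesis starts from has a closed form in two dimensions: on the unit disk, the harmonic extension of boundary data f is given by convolution with the Poisson kernel. A minimal numerical sketch (illustrative names; quadrature resolution chosen arbitrarily) checks it against boundary data f(φ) = cos φ, whose harmonic extension is u(r, θ) = r cos θ.

```python
import numpy as np

def poisson_kernel(r, theta):
    # Poisson kernel for the unit disk:
    # P_r(theta) = (1 - r^2) / (1 - 2 r cos(theta) + r^2).
    return (1 - r**2) / (1 - 2 * r * np.cos(theta) + r**2)

def dirichlet_disk(f, r, theta, n_quad=2000):
    # Classical solution u(r, theta) = (1/2pi) ∫ P_r(theta - phi) f(phi) dphi,
    # evaluated by equally spaced quadrature on the boundary circle.
    phi = np.linspace(0, 2 * np.pi, n_quad, endpoint=False)
    return float(np.mean(poisson_kernel(r, theta - phi) * f(phi)))

# Boundary data f(phi) = cos(phi); the harmonic extension is u = r cos(theta).
u = dirichlet_disk(np.cos, r=0.5, theta=0.0)
print(u)  # close to 0.5
```

The generalized (Perron-Wiener-Brelot) problem the thesis develops exists precisely because this integral representation can fail to attain the boundary values continuously on irregular domains.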
70

Application of Generalized Grids to Turbomachinery CFD Simulations

Singh, Rajkeshar 13 December 2002 (has links)
A generalized-grid-based technique was developed for handling the relative motion of grids in CFD simulations involving rotating machinery. In the present method, the relative motion between grid blocks is handled by splitting the cell faces ("cellaces") at the interface and updating the grid data structure appropriately. The resulting grid has cells and cellaces with an arbitrary number of nodes, stored in a cellace-based data structure. The methodology is developed for cells with any number of nodes; however, the present work supports only tetrahedral elements at the interface of the rotating grid blocks at the beginning of the simulation. The present approach can also handle multiple objects in the domain of interest rotating in arbitrary directions. The approach was tested by rotating a generalized grid for a single un-ducted SR7 propeller with eight blades designed with 41 degrees of sweep at the tip, and for two counter-rotating SR7 propellers. After every rotation the new grid was checked for negative volumes, folded cellaces, proper connectivity of the nodes forming the cellaces, and gaps. Preliminary work has been conducted to couple the grid-generation strategy to a generalized-grid-based flow solver.
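The negative-volume check mentioned in the abstract reduces, in two dimensions, to the sign of a cell's shoelace area: a rigid rotation of a grid block must preserve it, while a folded or inverted cell flips its sign. The sketch below is an illustrative 2-D analogue, not the thesis's 3-D cellace data structure; all names are hypothetical.

```python
import numpy as np

def signed_area(poly):
    # Shoelace formula over the polygon's vertices (counter-clockwise order
    # gives a positive value); a negative value flags a folded/inverted cell.
    x, y = poly[:, 0], poly[:, 1]
    return 0.5 * np.sum(x * np.roll(y, -1) - y * np.roll(x, -1))

def rotate(poly, angle, center=(0.0, 0.0)):
    # Rigid rotation of a grid block about 'center' by 'angle' radians.
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([[c, -s], [s, c]])
    return (poly - center) @ R.T + center

square = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
rotated = rotate(square, np.pi / 7)
print(signed_area(square), signed_area(rotated))  # both 1.0: rotation preserves volume
print(signed_area(square[::-1]))                  # negative: reversed winding, a "negative volume"
```

In the thesis's 3-D setting the analogous check computes signed cell volumes from the cellace-based connectivity after every interface update.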
