About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations, provided by the Networked Digital Library of Theses and Dissertations (NDLTD). Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
11

Invariantes de curvas em grassmannianas divisíveis e equações diferenciais ordinárias / Invariants of curves in divisible grassmannians and ordinary differential equations

Peixoto, Cíntia Rodrigues de Araújo 16 August 2018 (has links)
Advisor: Carlos Eduardo Durán Fernández / Doctoral thesis — Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica / Issue date: 2010 / Abstract: In this work we study the geometry of curves of n-subspaces in R^{kn}, where k is any natural number, using the approach introduced by J. C. Álvarez and C. Durán. To this end, we generalize the fundamental endomorphism and describe it as an equivariant embedding of the (k-1)-jets of curves in the Grassmannian into the Lie algebra of Gl(kn). We describe the geometry of the curves by analyzing the linear invariants obtained from the fundamental endomorphism, compared with the invariants obtained from the systems of ordinary differential equations of order k associated with the curve. As a consequence, we also obtain a solution to the congruence problem for curves in the Grassmannian and treat some special cases of curves. / Doctorate / Differential Geometry / Doctor of Mathematics
12

A geometria de curvas fanning e de suas reduções simpléticas / The geometry of fanning curves and of their symplectic reductions

Vitório, Henrique de Barros Correia 16 August 2018 (has links)
Advisors: Carlos Eduardo Durán Fernandez, Marcos Benevenutto Jardim / Doctoral thesis — Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica / Issue date: 2010 / Abstract: The present thesis continues the recent work of J. C. Álvarez and C. E. Durán on the geometric invariants of a generic class of curves in the Grassmann manifolds, called "fanning curves". More precisely, we look at how such curves of Lagrangian planes behave under a symplectic reduction, and establish the existence of two new invariants which play a fundamental role in that context, most notably in the way they generalize the well-known O'Neill formulas for isometric submersions. / Doctorate / Mathematics / Doctor of Mathematics
13

[pt] ANÁLISE EM GRASSMANNIANAS E O TEOREMA DE JOHNSON-LINDENSTRAUSS / [en] GRASSMANNIAN ANALYSIS AND THE JOHNSON-LINDENSTRAUSS THEOREM

11 November 2021 (has links)
[en] Let V be a set of n points in a Euclidean space X of dimension d. The Johnson-Lindenstrauss theorem states that there is a projection from X to Y, another Euclidean space of much smaller dimension k, with the property that the distances between images of points of V differ by no more than a multiplicative factor c arbitrarily close to 1. The theorem gives a relation among d, k, and c, indicating the possibility of dramatic dimension reduction with very faithful representations of V. The proof makes use of Grassmannians, the manifolds consisting of the subspaces of dimension k in X. Charts are constructed, together with a measure that is homogeneous with respect to the natural action of the orthogonal group on the Grassmannian. The result follows by estimating, via Gaussians, certain integrals of a strongly geometric character.
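The dimension reduction described in this abstract can be illustrated with a short NumPy sketch. A random Gaussian matrix scaled by 1/sqrt(k) is one standard realization of the projection (the thesis constructs its projections via the Grassmannian measure; the Gaussian matrix here is an illustrative stand-in, and all names and sizes are chosen for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

d, k, n = 1000, 200, 20          # ambient dimension, target dimension, number of points
V = rng.normal(size=(n, d))      # the point set V, as rows in R^d

# Random Gaussian map scaled by 1/sqrt(k); with high probability it preserves
# every pairwise distance up to a multiplicative factor close to 1.
P = rng.normal(size=(d, k)) / np.sqrt(k)
W = V @ P                        # images of the points in R^k

# Ratio of projected distance to original distance, over all pairs.
ratios = []
for i in range(n):
    for j in range(i + 1, n):
        orig = np.linalg.norm(V[i] - V[j])
        proj = np.linalg.norm(W[i] - W[j])
        ratios.append(proj / orig)

print(min(ratios), max(ratios))  # both ends concentrate near 1
```

Note that k = 200 is far below d = 1000 yet the distortion stays small; the theorem's relation among d, k, and c quantifies how small k may be taken for a given tolerance and number of points.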
14

Algorithms in data mining using matrix and tensor methods

Savas, Berkant January 2008 (has links)
In many fields of science, engineering, and economics, large amounts of data are stored, and these data must be analyzed in order to extract information for various purposes. Data mining is a general concept covering the different tools for this kind of analysis, and the development of mathematical models and efficient algorithms is of key importance. In this thesis we discuss algorithms for the reduced rank regression problem and algorithms for computing the best multilinear rank approximation of tensors. The first two papers deal with the reduced rank regression problem, which is encountered in the field of state-space subspace system identification. More specifically, the problem is \[ \min_{\operatorname{rank}(X) = k} \det (B - XA)(B - XA)^{\mathsf{T}}, \] where $A$ and $B$ are given matrices and we want to find the $X$ of rank at most $k$ that minimizes the determinant. This problem is not properly stated, since it involves the implicit assumption on $A$ and $B$ that $(B - XA)(B - XA)^{\mathsf{T}}$ is never singular. This deficiency of the determinant criterion is fixed by generalizing the minimization criterion to rank reduction and volume minimization of the objective matrix, where the volume of a matrix is defined as the product of its nonzero singular values. We give an algorithm that solves the generalized problem and identify the properties of the input and output signals that cause a singular objective matrix. Classification problems occur in many applications; the task is to determine the label or class of an unknown object. The third paper concerns classification of handwritten digits in the context of tensors, or multidimensional data arrays. Tensor and multilinear algebra attracts more and more attention because of the multidimensional structure of the data collected in various applications. Two classification algorithms are given, based on the higher order singular value decomposition (HOSVD).
The main algorithm performs a data reduction of 98–99% using the HOSVD prior to the construction of the class models. The models are computed as a set of orthonormal bases spanning the dominant subspaces of the different classes, and an unknown digit is expressed as a linear combination of the basis vectors. The resulting algorithm achieves a 5% classification error with a fairly low amount of computation. The remaining two papers discuss computational methods for the best multilinear rank approximation problem \[ \min_{\mathcal{B}} \| \mathcal{A} - \mathcal{B} \|, \] where $\mathcal{A}$ is a given tensor and we seek the best low multilinear rank approximation tensor $\mathcal{B}$. This is a generalization of the best low rank matrix approximation problem. It is well known that for matrices the solution is given by truncating the singular values in the singular value decomposition (SVD) of the matrix, but for tensors in general the truncated HOSVD does not give an optimal approximation. For example, a third order tensor $\mathcal{B} \in \mathbb{R}^{I \times J \times K}$ with $\operatorname{rank}(\mathcal{B}) = (r_1, r_2, r_3)$ can be written as the product \[ \mathcal{B} = (X, Y, Z) \cdot \mathcal{C}, \qquad b_{ijk} = \sum_{\lambda,\mu,\nu} x_{i\lambda} y_{j\mu} z_{k\nu} c_{\lambda\mu\nu}, \] where $\mathcal{C} \in \mathbb{R}^{r_1 \times r_2 \times r_3}$ and $X \in \mathbb{R}^{I \times r_1}$, $Y \in \mathbb{R}^{J \times r_2}$, and $Z \in \mathbb{R}^{K \times r_3}$ are matrices of full column rank. Since it is no restriction to assume that $X$, $Y$, and $Z$ have orthonormal columns, and due to these orthonormality constraints, the approximation problem can be considered as a nonlinear optimization problem defined on a product of Grassmann manifolds. We introduce novel techniques for multilinear algebraic manipulations that provide the means for both theoretical analysis and algorithmic implementation. These techniques are used to solve the approximation problem using Newton and quasi-Newton methods specifically adapted to operate on products of Grassmann manifolds. The presented algorithms are suited for small, large, and sparse problems and, when applied to difficult problems, they clearly outperform alternating least squares methods, which are standard in the field.
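The truncated HOSVD mentioned in this abstract can be sketched in a few lines of NumPy: compute the dominant left singular vectors of each mode unfolding, then project the tensor onto those bases to obtain the core. The function names are illustrative, not from the thesis; and, as the abstract stresses, this truncation is optimal only in special cases — the example below uses a tensor of exact multilinear rank (2, 2, 2), where recovery is exact, whereas general tensors require the Grassmann-manifold optimization the thesis develops:

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization: move `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_multiply(T, M, mode):
    """Multiply tensor T by matrix M along `mode` (contract M's columns)."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

def truncated_hosvd(T, ranks):
    """Truncated HOSVD: dominant left singular vectors of each unfolding,
    plus the core obtained by projecting T onto those bases."""
    factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    core = T
    for mode, U in enumerate(factors):
        core = mode_multiply(core, U.T, mode)
    return core, factors

def multilinear_product(core, factors):
    """Reassemble B = (X, Y, Z) . C from the core and factor matrices."""
    B = core
    for mode, U in enumerate(factors):
        B = mode_multiply(B, U, mode)
    return B

rng = np.random.default_rng(1)
# Build a 6 x 7 x 8 tensor of exact multilinear rank (2, 2, 2).
C0 = rng.normal(size=(2, 2, 2))
X, Y, Z = (rng.normal(size=(s, 2)) for s in (6, 7, 8))
A = multilinear_product(C0, [X, Y, Z])

core, Us = truncated_hosvd(A, (2, 2, 2))
B = multilinear_product(core, Us)
print(np.linalg.norm(A - B))   # essentially zero: exact rank is recovered
```

For a tensor whose mode unfoldings have exactly the target ranks, the projections lose nothing; perturbing $\mathcal{A}$ with noise makes the truncation merely a good starting point, which is the situation where the Newton and quasi-Newton iterations on the Grassmann product are applied.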
