The interdisciplinary field of machine learning studies algorithms whose behavior depends on data sets. Such data are often represented as matrices, and a variety of mathematical methods, such as matrix decomposition, have been developed to extract information from this structure. The Laplacian matrix, for example, is commonly used to reconstruct networks, and the eigenpairs of this matrix are used in matrix decomposition. Moreover, concepts such as SVD matrix factorization are closely connected to manifold learning, a subfield of machine learning that assumes the observed data lie on a low-dimensional manifold embedded in a higher-dimensional space. Since many data sets have a naturally higher-dimensional structure, tensor methods are being developed to handle big data more efficiently. This thesis builds on these ideas by exploring how matrix methods can be extended to data presented as tensors rather than as ordinary matrices or vectors.
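The two matrix tools named in the abstract, SVD factorization and Laplacian eigenpairs, can be sketched in a few lines of NumPy. This is a minimal illustration under assumed inputs: the data matrix `X` and the small path graph below are hypothetical examples, not data from the thesis.

```python
import numpy as np

# Hypothetical 3x3 data matrix standing in for an observed data set.
X = np.array([[2.0, 0.0, 1.0],
              [0.0, 3.0, 1.0],
              [1.0, 1.0, 2.0]])

# SVD factorization: X = U @ diag(s) @ Vt, with singular values in s.
U, s, Vt = np.linalg.svd(X)

# Truncating to the top k singular values yields the best rank-k
# approximation of X, the idea underlying many dimension-reduction methods.
k = 2
X_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Graph Laplacian L = D - A for a path graph on 3 nodes (hypothetical graph).
A = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
L = np.diag(A.sum(axis=1)) - A

# Eigenpairs of the Laplacian; the smallest eigenvalue of a connected
# graph's Laplacian is 0, and the remaining spectrum encodes structure.
eigvals, eigvecs = np.linalg.eigh(L)
```

The same truncation idea is what tensor decompositions generalize: instead of a single matrix factorization, the data array is factored along each of its modes.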
Identifier | oai:union.ndltd.org:ETSU/oai:dc.etsu.edu:etd-5460 |
Date | 01 August 2021 |
Creators | Sanders, Scott |
Publisher | Digital Commons @ East Tennessee State University |
Source Sets | East Tennessee State University |
Language | English |
Detected Language | English |
Type | text |
Format | application/pdf |
Source | Electronic Theses and Dissertations |
Rights | Copyright by the authors. |