Algorithms for Scalable On-line Machine Learning on Regression Tasks
Schoenke, Jan H. — 25 April 2019
In an era of ever-increasing data volume and traffic, processing data as a stream is key to building flexible and scalable data processing engines. On-line machine learning provides powerful algorithms for extracting predictive models from such data streams, even when the modeled relation is time-variant in nature. Modeling real-valued data in on-line regression tasks is especially important, as it connects to modeling and system identification tasks in engineering domains and bridges to other fields of machine learning such as classification and reinforcement learning. This thesis therefore considers the problem of on-line regression on time-variant data streams and introduces a new multi-resolution perspective for tackling it.
The proposed incremental learning system, called AS-MRA, comprises a new interpolation scheme for symmetric simplicial input segmentations, a layered approximation structure of sequential local refinement layers, and a learning architecture for efficiently training the layer structure. A key concept for making these components work together is a differential parameter encoding between subsequent refinement layers, which makes it possible to decompose the target function into independent additive components, each represented as an individual refinement layer. The whole AS-MRA approach is designed to form a smooth approximation whose computational demands scale linearly with the input dimension, while its overall expressiveness, and therefore its potential storage demands, scale exponentially with the input dimension.
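The core idea of layered refinement with a differential parameter encoding can be illustrated in miniature. The sketch below is not the AS-MRA itself (the thesis uses simplicial segmentations and smooth interpolation, and the class names here are invented for illustration); it shows only the residual principle the abstract describes: each layer stores an additive correction to the output of all coarser layers, here with piecewise-constant cells on a 1-D input, updated incrementally from a stream.

```python
# Illustrative sketch only: hypothetical names (RefinementLayer,
# LayeredRegressor), piecewise-constant 1-D cells instead of the
# thesis's simplicial interpolation scheme.

class RefinementLayer:
    """One resolution level: a grid of local parameters over [0, 1)."""

    def __init__(self, cells):
        self.cells = cells
        self.values = [0.0] * cells   # one local parameter per cell
        self.counts = [0] * cells

    def _cell(self, x):
        return min(int(x * self.cells), self.cells - 1)

    def predict(self, x):
        return self.values[self._cell(x)]

    def update(self, x, residual):
        # incremental mean of the residual observed in this cell
        i = self._cell(x)
        self.counts[i] += 1
        self.values[i] += (residual - self.values[i]) / self.counts[i]


class LayeredRegressor:
    """Stack of refinement layers with differential (residual) encoding."""

    def __init__(self, depths=(2, 4, 8)):
        self.layers = [RefinementLayer(d) for d in depths]

    def predict(self, x):
        # the target is decomposed into independent additive components,
        # one per refinement layer
        return sum(layer.predict(x) for layer in self.layers)

    def learn(self, x, y):
        partial = 0.0
        for layer in self.layers:
            # each layer fits only what the coarser layers left unexplained
            layer.update(x, y - partial)
            partial += layer.predict(x)
```

Because each layer encodes only the residual of the coarser layers, a layer can be added or pruned without retraining the others, which is what lets tolerance parameters adaptively shape the layer structure during learning.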
The AS-MRA requires no mandatory design parameters, but it lets the user state tolerance parameters for the expected prediction performance, which automatically and adaptively shape the resulting layer structure during the learning process. Other optional design parameters allow restricting resource consumption with respect to computational and memory demands. The effect of these parameters, and the learning behavior of the AS-MRA as such, are investigated on various learning issues and compared to related on-line learning approaches. The merits and contributions of the AS-MRA are shown experimentally and linked to general considerations about the relation between its key concepts and fundamental results in machine learning.