  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Predicting User Mobility using Deep Learning Methods

Pamuluri, Harihara Reddy January 2020 (has links)
Context: This thesis aims to predict user mobility using deep learning algorithms, which can increase the quality of service for users and reduce the cost of paging for telecom carriers. Objectives: This study first investigates which deep learning algorithms are suitable for predicting user mobility; an experiment is then performed with the chosen algorithms, trained both as a global model and as individual models, and their performance is evaluated. Methods: First, a literature review is used to find suitable deep learning algorithms; based on its findings, an experiment is performed to evaluate the chosen algorithms. Results: The literature review shows that the RNN, the LSTM, and variants of the LSTM are suitable deep learning algorithms. The models are evaluated with the accuracy metric. The experiment showed that the individual model performs better than the global model at predicting user mobility. Conclusions: From the experimental results, it can be concluded that the individual model is the technique of choice for predicting user mobility.
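The RNN/LSTM models the thesis evaluates require a deep learning framework; as a minimal stand-in, a first-order transition-count predictor (hypothetical cell names, not the thesis's actual models) illustrates the global-versus-individual modelling choice the abstract describes:

```python
from collections import Counter, defaultdict

def fit_transitions(trajectories):
    # Count cell-to-cell transitions across the given trajectories.
    counts = defaultdict(Counter)
    for traj in trajectories:
        for prev, curr in zip(traj, traj[1:]):
            counts[prev][curr] += 1
    return counts

def predict_next(counts, cell):
    # Predict the most frequent successor of `cell`, or None if unseen.
    if cell not in counts:
        return None
    return counts[cell].most_common(1)[0][0]

# "Global" model: fit on all users' trajectories pooled together.
# "Individual" model: fit one transition table per user.
```

An individual model captures a user's own habits; a global model generalizes across users but can blur per-user patterns, which matches the abstract's finding.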
22

Forecasting conflict using RNNs

Hellman, Simon January 2021 (has links)
The rise of machine learning has opened the subject to new kinds of applications. This Master's thesis implements and evaluates an LSTM-based algorithm on the conflict forecasting problem. Data is structured in country-month pairs, with information about conflict, economy, demography, democracy, and unrest. The goal is to forecast the probability of at least one conflict event in a country based on a window of historic information. Results show that the model is not as good as a Random Forest. There are also indications of a lack of data, with the network having difficulty performing consistently and with learning curves not flattening. Naive models perform surprisingly well. The conclusion is that the problem needs some restructuring in order to improve performance compared to naive approaches. To help this endeavour, possible paths for future work have been identified.
23

Forecasting Large-scale Time Series Data

Hartmann, Claudio 03 December 2018 (has links)
The forecasting of time series data is an integral component of management, planning, and decision making in many domains. The prediction of electricity demand and supply in the energy domain, or of sales figures in market research, are just two of the many application scenarios that require thorough predictions. Many of these domains have in common that they are influenced by the Big Data trend, which also affects time series forecasting. Data sets consist of thousands of temporally fine-grained time series that have to be predicted in reasonable time. The time series may suffer from noisy behavior and missing values, which makes modeling them especially hard; nonetheless, accurate predictions are required. Furthermore, data sets from different domains exhibit various characteristics, so forecast techniques have to be flexible and adaptable to them. Long-established forecast techniques like ARIMA and Exponential Smoothing do not fulfill these new requirements. Most of the traditional models represent only one individual time series. This makes the prediction of thousands of time series very time-consuming, as an equally large number of models has to be created. Furthermore, these models do not incorporate additional data sources and are, therefore, not capable of compensating for missing measurements or the noisy behavior of individual time series. In this thesis, we introduce CSAR (Cross-Sectional AutoRegression Model), a new forecast technique designed to address the new requirements of forecasting large-scale time series data. It is based on the novel concept of cross-sectional forecasting, which assumes that time series from the same domain follow a similar behavior, and represents many time series with one common model. CSAR combines this new approach with the modeling concept of ARIMA to make the model adaptable to the various properties of data sets from different domains.
Furthermore, we introduce auto.CSAR, which helps to configure the model and to choose the right model components for a specific data set and forecast task. With CSAR, we present a new forecast technique suited for the prediction of large-scale time series data. By representing many time series with one model, large data sets can be predicted in a short time. Furthermore, using data from many time series in one model helps to compensate for missing values and the noisy behavior of individual series. The evaluation on three real-world data sets shows that CSAR outperforms long-established forecast techniques in both accuracy and execution time. Finally, with auto.CSAR, we provide a way to apply CSAR to new data sets without requiring the user to have extensive knowledge of our new forecast technique and its configuration.
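The cross-sectional idea of representing many similar series with one shared model can be sketched with a toy pooled AR(1) fit; this illustrates only the pooling concept, not the actual CSAR implementation:

```python
def fit_pooled_ar1(series_list):
    # Cross-sectional pooling: collect (x_{t-1}, x_t) pairs from ALL
    # series and fit a single least-squares AR(1) coefficient that is
    # shared across the whole domain.
    num = den = 0.0
    for series in series_list:
        for prev, curr in zip(series, series[1:]):
            num += prev * curr
            den += prev * prev
    return num / den

def forecast_next(series, phi):
    # One-step-ahead forecast for any series using the shared coefficient.
    return phi * series[-1]
```

Because every series contributes observations to the same fit, short or gappy series still get a usable model, which is the intuition behind compensating for missing values.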
24

Inflation Index for the House and Content Portfolio : A Model to Calculate the Future Claim Costs for Trygg-Hansa

Eklund, Nadine January 2023 (has links)
Trygg-Hansa is a Swedish insurance company that specializes in business insurance, home insurance, vehicle insurance, and personal insurance. This work focuses on Trygg-Hansa's House and Content portfolio, which insures customers' homes, both the building itself and its contents. In the event of damage or accidents, the company compensates customers financially, but due to rising inflation, these expenses have become increasingly large. Today, Trygg-Hansa has a model for predicting the future cost of compensating damages within the House and Content portfolio, but sees a great need to develop it further. The goal of this work is to find a better model for predicting future costs and to create an inflation index. This index can serve as a basis for the pricing department, as it can be used to adjust customers' premiums to maintain a profitable business. The data was collected from the company's systems, and nine data sets were created, one for each type of damage. The models used to predict the future claim costs were the Autoregressive Integrated Moving Average (ARIMA), the Seasonal Autoregressive Integrated Moving Average (SARIMA), and Exponential Smoothing (ES). Each claim type was predicted two years ahead, and thereafter the Laspeyres Price Index was calculated. This was done for all three models, and the results of the models were then compared. The models were trained on the years 2013-2021, while the years 2021-2023 were used to evaluate them. All types of damage had rising costs between 2013-2021, but from the beginning of 2021 onwards, the trends changed to decreasing trends for almost all types of damage. This affected the results of the models, as they were only trained on rising trends; therefore, the forecast evaluation (Root Mean Squared Error and Mean Absolute Percentage Error) was not useful. The ARIMA and SARIMA models showed almost no trends in the predicted data.
This may be due to data that is too complex, with too much volatility and unclear trends for the implemented models. The Exponential Smoothing model follows the historical data both trend-wise and with a likely seasonal pattern for all nine types of damage and for the historical LPI. The forecasts made by the ES and SARIMA models also show similar seasonal patterns. Furthermore, the ES model has the best model fit according to the Box-Jenkins diagnostics. The model may need to be corrected in a year, when the declining trend has been included in the training data, by setting more weight on the new data for the year 2021.
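The Laspeyres Price Index used above weights current-period prices by base-period quantities; a minimal sketch with illustrative numbers (not Trygg-Hansa data):

```python
def laspeyres_index(base_prices, current_prices, base_quantities):
    # LPI = sum(p_t * q_0) / sum(p_0 * q_0) * 100:
    # current-period prices weighted by base-period quantities, so only
    # price changes (not consumption changes) move the index.
    numerator = sum(p * q for p, q in zip(current_prices, base_quantities))
    denominator = sum(p * q for p, q in zip(base_prices, base_quantities))
    return 100 * numerator / denominator
```

With two claim types priced at 10 and 20 in the base period, rising to 11 and 22, the index is 110, i.e. a 10% cost increase at fixed base-period volumes.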
25

Post-Correction of time series forecasts

Kalinauskas, Arunas, Slepov, Dennis January 2023 (has links)
Time series forecasting is an important problem with various applications in different domains. Improving forecast performance has been the center of investigation in the last decades. Several studies show that old statistical methods, such as ARIMA, are still state-of-the-art in many domains and applications. However, one of the main limitations of these methods is their low performance on longer horizons in multi-step forecasting scenarios. We attack this problem from an entirely new perspective. We propose a new universal post-correction approach that can be applied to fix the problematic forecasts of any forecasting model, including ARIMA. The idea is intuitive: we take the last window of observations plus the given forecast, search for similar shapes in history, and use the future shape of the nearest neighbor to post-correct the forecast. To ensure that post-correction is adequate, we train a supervised decision model on the successfulness of post-corrections on the training set. Our experiments on several datasets show that the proposed post-correction method effectively improves forecasts for 30 steps ahead and beyond.
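A minimal sketch of the described post-correction idea, assuming a plain-Python series, squared-Euclidean distance, a single nearest neighbor, and a simple averaging correction (the supervised decision model for when to apply the correction is omitted):

```python
def post_correct(history, forecast, window=3):
    # Shape to match: the last `window` observations plus the model forecast.
    query = history[-window:] + forecast
    m = len(query)
    best_dist, best_start = float("inf"), None
    # Slide over history, leaving room for a "future" of len(forecast).
    for start in range(len(history) - m - len(forecast) + 1):
        segment = history[start:start + m]
        dist = sum((a - b) ** 2 for a, b in zip(segment, query))
        if dist < best_dist:
            best_dist, best_start = dist, start
    if best_start is None:
        return forecast  # history too short; keep the original forecast
    # Use the nearest neighbor's future shape to correct the forecast.
    future = history[best_start + m : best_start + m + len(forecast)]
    return [(f + n) / 2 for f, n in zip(forecast, future)]
```

On a perfectly periodic history this pulls the forecast toward the pattern's true continuation; in practice the decision model would gate whether the correction is applied at all.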
26

Time Series Forecasting using Temporal Regularized Matrix Factorization and Its Application to Traffic Speed Datasets

Zeng, Jianfeng 30 September 2021 (has links)
No description available.
27

Predicting the Load of the Certification Center : Master's Thesis

Левин, А. Д., Levin, A. D. January 2018 (has links)
The thesis describes methods for predicting the load of the certification center, i.e. the number of applications submitted to it.
28

IMBALANCED TIME SERIES FORECASTING AND NEURAL TIME SERIES CLASSIFICATION

Chen, Xiaoqian 01 August 2023 (has links) (PDF)
This dissertation will focus on the forecasting and classification of time series. Specifically, the forecasting problem will focus on imbalanced time series (ITS), which contain a mix of low-probability extreme observations and high-probability normal observations. Two approaches are proposed to improve the forecasting of ITS. In the first approach, proposed in chapter 2, an ITS will be modelled as a composition of normal and extreme observations; the input predictor variables and the associated forecast output will be combined into moving blocks, and the blocks will be categorized as extreme-event (EE) or normal-event (NE) blocks. Imbalance will be decreased by oversampling the minority EE blocks and undersampling the majority NE blocks using modifications of block bootstrapping and the synthetic minority oversampling technique (SMOTE). Convolutional neural networks (CNNs) and long short-term memory networks (LSTMs) will be selected for forecast modelling. In the second approach, described in chapter 3, which focuses on improving the forecasting accuracy of LSTM models, a training strategy called Circular-Shift Circular Epoch Training (CSET) is proposed to preserve the natural ordering of observations in epochs during training, without any attempt to balance the extreme and normal observations. The strategy will be universal because it can be applied to train LSTMs to forecast events in normal or imbalanced time series in exactly the same manner. The CSET strategy will be formulated for both univariate and multivariate time series forecasting. The classification problem will focus on the classification of event-related potential (ERP) neural time series by exploiting information offered by the cone of influence (COI) of the continuous wavelet transform (CWT). The COI is a boundary superimposed on the wavelet scalogram to delineate the coefficients that are accurate from those that are inaccurate due to edge effects.
The features derived from the inaccurate coefficients are, therefore, unreliable. It is hypothesized that classifier performance would improve if the unreliable features, which are outside the COI, were zeroed out, and would improve even further if those features were cropped out completely. Two multidomain CNN models will be introduced to fuse the multichannel Z-scalograms and the V-scalograms. In the first multidomain model, referred to as the Z-CuboidNet, the input to the CNN will be generated by fusing the Z-scalograms of the multichannel ERPs into a frequency-time-spatial cuboid. In the second multidomain model, referred to as the V-MatrixNet, the CNN input will be formed by fusing the frequency-time vectors of the V-scalograms of the multichannel ERPs into a frequency-time-spatial matrix.
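A toy sketch of the moving-block categorization step described above, assuming a simple absolute-value threshold for "extreme" and naive duplication as a stand-in for the dissertation's block-bootstrapping/SMOTE variants:

```python
def label_blocks(series, block_len, threshold):
    # Split the series into overlapping moving blocks and tag each as
    # extreme-event (EE) if any observation exceeds the threshold,
    # otherwise normal-event (NE).
    blocks = []
    for start in range(len(series) - block_len + 1):
        block = series[start:start + block_len]
        label = "EE" if any(abs(x) > threshold for x in block) else "NE"
        blocks.append((block, label))
    return blocks

def oversample_minority(blocks, factor):
    # Naive duplication of the minority EE blocks; the dissertation's
    # actual methods are modified block bootstrapping and SMOTE.
    ee = [b for b in blocks if b[1] == "EE"]
    return blocks + ee * (factor - 1)
```

Balancing at the block level, rather than per observation, keeps each extreme value embedded in its temporal context.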
29

Time Series Forecasting on Database Storage

Patel, Pranav January 2024 (has links)
Time series forecasting has become vital in various industries, ranging from weather forecasting to business forecasting. There is a need to research database storage solutions for companies in order to optimize resource allocation, enhance the decision-making process, and enable predictive data storage maintenance. With the introduction of Artificial Intelligence and its branch Machine Learning, time series forecasting has become more powerful and efficient. This project attempts to validate the possibility of using time series forecasting on database storage data to make business predictions. Currently, the predictive capability of database storage is an area that is not fully explored, despite the growing necessity of databases, and most database optimization is left to manual effort, which is ultimately slower and more error-prone. As such, this research investigates the possibilities of time series forecasting in database storage. The project uses machine learning and time series forecasting to predict the future trend of database storage, giving information on how the trend of the data will change. Examining the pattern of database storage fluctuations will give the respective owners an overview of their storage and, in turn, let them make decisions on optimizing the database to prevent critical problems ahead of time. Three distinct approaches - employing a traditional linear model for forecasting, utilizing a Convolutional Neural Network (CNN) to detect local changes in time series data, and leveraging a Recurrent Neural Network (RNN) to capture long-term temporal dependencies - are implemented to assess which of these techniques is better suited for the provided dataset. Furthermore, two settings (single-step and multi-step) have been tested in order to measure the change in accuracy from a small prediction step to a large one. The research indicates that the models currently cannot be used in practice.
This is because the mean absolute error is very large. The main purpose of the project was to establish which of the three techniques is the best for the particular dataset provided by the company. In general, all approaches (Linear, CNN, RNN) performed better in the single-step setting. In the multi-step setting, the linear model suffered the greatest accuracy drop, with the CNN and RNN performing slightly better. The findings also indicated that the model with local change detection (CNN) performs best for the provided dataset in both single- and multi-step settings, as evidenced by its minimal Mean Absolute Error (MAE). This is because the dataset is comprised of local data and the models are only trained to check for normal changes. If the research had also checked for seasonality or sequential patterns, then it is possible that an LSTM may have had a better outcome due to its capability of capturing those dependencies. The accuracy of single-step forecasting using the CNN is good (MAE = 0.25) but must be further explored and improved.
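The single-step versus multi-step gap is easy to see with a recursive forecaster, where each prediction is fed back as input so errors compound with the horizon (a generic sketch, not the project's actual models):

```python
def multi_step_forecast(series, steps, model):
    # Recursive multi-step forecasting: each prediction is appended to
    # the input window, so any error propagates into later steps.
    window = list(series)
    out = []
    for _ in range(steps):
        nxt = model(window)
        out.append(nxt)
        window.append(nxt)
    return out

def mae(actual, predicted):
    # Mean Absolute Error, the metric used to compare the approaches.
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)
```

Even a naive persistence model (`lambda w: w[-1]`) shows the effect: its single-step error stays bounded, while its multi-step error grows with any trend in the data.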
30

Web-based Benchmarks for Forecasting Systems: The ECAST Platform

Ulbricht, Robert, Hartmann, Claudio, Hahmann, Martin, Donker, Hilko, Lehner, Wolfgang 10 January 2023 (has links)
The role of precise forecasts in the energy domain has changed dramatically. New supply forecasting methods are developed to better address this challenge, but meaningful benchmarks are rare and time-intensive. We propose the ECAST online platform in order to solve that problem. The system's capability is demonstrated on a real-world use case by comparing the performance of different prediction tools.
