  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Co-movement in Market Liquidity Measures / 市場流動性指標之共動性

劉鴻耀, Liu, Hung-Yao Unknown Date (has links)
Abstract: Liquidity has undoubtedly been one of the most popular research topics in academia for decades. However intuitively clear the concept may seem, scholars and practitioners have always found it both difficult and ambiguous to define and measure. Moreover, studies and methods concerning commonality in liquidity continue to be proposed, most of which attempt to explain what lies beneath the commonality by offering industry-wide or market-wide explanations. This paper instead adopts the exact multivariate model-based structural decomposition methodology developed by Casals, Jerez and Sotoca (2002) to analyze the co-movement in market liquidity measures in a different manner. Beyond decomposing three well-known market liquidity measures of the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX), namely share volume, dollar volume and turnover rate, into trend, cycle, seasonal and irregular components, we conduct bivariate analyses to extract common components, visualize them, and compare them. The evidence suggests not only that these three liquidity proxies co-move strongly with one another, but also that dollar volume co-moves slightly more closely with share volume than with turnover rate. Where this co-movement in market liquidity measures originates remains an open question that requires further work beyond the scope of this study.
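
As a rough illustration of the structural-decomposition idea (not the Casals-Jerez-Sotoca procedure itself), the sketch below uses statsmodels' unobserved-components model to split two synthetic liquidity series into trend, cycle, seasonal and irregular parts and then correlates the smoothed trends; the data and parameters are placeholders, not TAIEX figures.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.structural import UnobservedComponents

rng = np.random.default_rng(0)
n = 240  # 20 years of monthly observations (synthetic stand-ins for the liquidity proxies)
t = np.arange(n)
share_volume = 10 + 0.02 * t + np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.3, n)
dollar_volume = 12 + 0.02 * t + np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.3, n)

def decompose(y):
    """Fit a structural model with trend, cycle, seasonal and irregular parts."""
    mod = UnobservedComponents(y, level='local linear trend',
                               cycle=True, stochastic_cycle=True, seasonal=12)
    res = mod.fit(disp=False)
    return pd.DataFrame({
        'trend': res.level.smoothed,
        'cycle': res.cycle.smoothed,
        'seasonal': res.seasonal.smoothed,
    })

sv, dv = decompose(share_volume), decompose(dollar_volume)
# Co-movement of the extracted trend components
print('trend correlation:', np.corrcoef(sv['trend'], dv['trend'])[0, 1])
```
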
2

Quantitative approach to short-term financial planning / Finanční plánování v podniku

Voráček, Lukáš January 2011 (has links)
The aim of this study is to demonstrate the legitimacy of employing quantitative methods in day-to-day business practice. The task is approached as a case study of a real-life financial planning process. I work with the financial data of POS Media Czech Republic (a media company providing point-of-sale advertising solutions). My intention is to simulate the projection of a pro forma income statement with the use of quantitative methods. More specifically, I apply time series prediction techniques to forecast POS Media's sales. The goal is, first, to demonstrate that quantitative techniques can be handled even with a limited statistical background and, second, to discuss the relevance of the obtained results. In the methodological part of the thesis I deal with the theoretical aspects of financial planning and describe various methods of sales forecasting (qualitative vs. quantitative), with special emphasis on time series prediction methods. In the application part I provide a short description of POS Media and its business, use time series decomposition techniques to predict POS Media's sales in 2012, and then outline the rest of the pro forma income statement.
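
A minimal sketch of the decomposition-based sales forecast described above (illustrative only; the sales figures and cost ratio are invented, not POS Media's data):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Hypothetical monthly sales for 2009-2011 (36 observations); real data would come from accounting records.
idx = pd.date_range('2009-01-31', periods=36, freq='M')
sales = pd.Series(100 + 2.0 * np.arange(36) + 15 * np.sin(2 * np.pi * np.arange(36) / 12), index=idx)

decomp = seasonal_decompose(sales, model='additive', period=12)

# Extrapolate the trend linearly and re-apply the average seasonal pattern for calendar 2012.
pos = np.arange(len(sales))
mask = decomp.trend.notna().values
slope, intercept = np.polyfit(pos[mask], decomp.trend[mask].values, 1)
pos_2012 = np.arange(len(sales), len(sales) + 12)
forecast_2012 = intercept + slope * pos_2012 + decomp.seasonal[:12].values

# First line of a pro forma income statement: projected revenue,
# with costs assumed proportional to sales (placeholder ratio).
revenue_2012 = forecast_2012.sum()
cost_ratio = 0.70
print(f"projected revenue: {revenue_2012:,.0f}")
print(f"projected gross profit: {revenue_2012 * (1 - cost_ratio):,.0f}")
```
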
3

Komponentenzerlegung des Regelleistungsbedarfs mit Methoden der Zeitreihenanalyse / Component decomposition of control reserve demand using time series analysis methods

Wenzel, Anne 29 October 2010 (has links)
In this thesis, the minute-resolution data on control reserve demand (the sum of secondary control reserve and minute reserve) of one control area for the months April through December 2009 were subjected to a time series analysis and decomposed into components according to the classical component model. These are the trend component, determined by a moving average with a window length of one hour; two periodic components, with period lengths of one hour and of one day; and the residual component, which was modeled with an ARIMA(2,1,5) process. In the future, the resulting model of control reserve demand should be improved by adding a seasonal (annual) component; this was not possible within the scope of the thesis because no data spanning several years were available. It can also be examined to what extent forecasts are feasible with the component model; for this purpose the trend component should be chosen differently, since the approach chosen here follows the data too closely. The second part of the thesis consisted of identifying substantive components, i.e., possible relationships between control reserve demand and various conceivable causes. The load profile and wind power feed-in were examined as potential causes. There was a slight positive correlation between the load time series and the control reserve demand series, and a small negative correlation between the wind power feed-in series and the control reserve demand series. Contents: Introduction; 1 Initial situation and technical conditions; 2 Mathematical foundations; 3 Analysis of the control reserve data; 4 Summary and outlook
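
An illustrative sketch of the decomposition described above, assuming minute-resolution demand data in a pandas Series (synthetic here): an hourly moving-average trend, intra-hour and intra-day periodic components estimated as group means, and an ARIMA(2,1,5) fit to the remainder.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic minute-resolution control reserve demand for three days
# (a placeholder for the 2009 control-area data).
idx = pd.date_range('2009-04-01', periods=60 * 24 * 3, freq='min')
rng = np.random.default_rng(1)
demand = pd.Series(
    500
    + 50 * np.sin(2 * np.pi * idx.hour / 24)     # daily pattern
    + 20 * np.sin(2 * np.pi * idx.minute / 60)   # intra-hour pattern
    + rng.normal(0, 10, len(idx)),
    index=idx,
)

# Trend: centred moving average over one hour (60 minutes).
trend = demand.rolling(window=60, center=True).mean()
detrended = demand - trend

# Periodic components: average pattern within the hour and within the day.
hourly_component = detrended.groupby(idx.minute).transform('mean')
daily_component = (detrended - hourly_component).groupby(idx.hour * 60 + idx.minute).transform('mean')

# Residual component modelled as ARIMA(2,1,5).
residual = (detrended - hourly_component - daily_component).dropna()
arima_fit = ARIMA(residual, order=(2, 1, 5)).fit()
print(arima_fit.summary().tables[0])
```
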
4

Time Series Decomposition using Automatic Learning Techniques for Predictive Models

Silva, Jesús, Hernández Palma, Hugo, Niebles Núñez, William, Ovallos-Gazabon, David, Varela, Noel 07 January 2020 (has links)
This paper proposes an innovative way to address real cases of production prediction. The approach consists of decomposing the original time series into sub-series according to a group of factors, in order to build a predictive model from the partial predictive models of the sub-series. The models are adjusted by means of a set of statistical techniques and Automatic Learning (machine learning). The method was compared with an intuitive method consisting of direct prediction of the time series. The results show that this approach achieves better predictive performance than direct prediction, indicating that a decomposition-based method is more appropriate for this problem than modelling the undecomposed series.
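
A schematic sketch of the sub-series idea (the factor, here a "product line" column, is invented for illustration): split production by the factor, fit a simple partial model per sub-series, and aggregate the partial forecasts.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical monthly production broken down by a factor (here: product line).
rng = np.random.default_rng(2)
months = np.tile(np.arange(36), 3)
lines = np.repeat(['A', 'B', 'C'], 36)
production = 50 + 1.5 * months + rng.normal(0, 5, len(months)) + (lines == 'B') * 20
df = pd.DataFrame({'month': months, 'line': lines, 'production': production})

horizon = np.arange(36, 42).reshape(-1, 1)  # next six months
partial_forecasts = {}
for line, sub in df.groupby('line'):
    # Partial predictive model for this sub-series (a trivial trend regression stands in
    # for the statistical / machine-learning models tuned in the paper).
    model = LinearRegression().fit(sub[['month']].values, sub['production'].values)
    partial_forecasts[line] = model.predict(horizon)

# The overall forecast is the aggregation of the partial forecasts.
total_forecast = np.sum(list(partial_forecasts.values()), axis=0)
print(total_forecast.round(1))
```
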
5

Metody analýzy sezónnosti demografických jevů / Methods of analysis of seasonality in demography

Myšáková, Gabriela January 2011 (has links)
Abstract: The thesis presents statistical methods suited for the analysis of seasonality in time series. Three statistical methods are described in detail: time series decomposition, the X12-ARIMA method, and cointegration of time series. Further methods applicable to similar analyses are briefly discussed as well. The three main methods are then applied to monthly demographic data on natality, nuptiality and mortality for the Czech Republic and selected European countries. Seasonality is found in all three demographic events; using time series decomposition and the X12-ARIMA method, the series are split into separate components whose behaviour is tracked through graphical and verbal interpretation. Cluster analysis is applied to the nuptiality and mortality time series of the European countries in order to reveal similarities and dissimilarities among particular countries. All methods are applied to the data using appropriate procedures in the SAS statistical software.
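
As one illustration of the methods listed above, the sketch below extracts seasonal indices via classical decomposition and runs an Engle-Granger cointegration test between two synthetic monthly birth series (stand-ins for two countries' natality data) using statsmodels; all numbers are placeholders.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import coint
from statsmodels.tsa.seasonal import seasonal_decompose

# Synthetic monthly birth counts for two countries sharing a common stochastic trend.
rng = np.random.default_rng(3)
n = 180
common_trend = np.cumsum(rng.normal(0, 1, n))
season = 10 * np.sin(2 * np.pi * np.arange(n) / 12)
births_cz = 800 + 5 * common_trend + season + rng.normal(0, 3, n)
births_other = 600 + 4 * common_trend + season + rng.normal(0, 3, n)

# Seasonal pattern of one series via classical decomposition.
idx = pd.date_range('2000-01-31', periods=n, freq='M')
decomp = seasonal_decompose(pd.Series(births_cz, index=idx), model='additive', period=12)
print("seasonal indices (Jan-Dec):", decomp.seasonal[:12].round(1).values)

# Engle-Granger cointegration test between the two countries' series.
t_stat, p_value, _ = coint(births_cz, births_other)
print(f"cointegration t-stat = {t_stat:.2f}, p-value = {p_value:.3f}")
```
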
6

Implementation of Hierarchical and K-Means Clustering Techniques on the Trend and Seasonality Components of Temperature Profile Data

Ogedegbe, Emmanuel 01 December 2023 (has links) (PDF)
In this study, time series decomposition techniques are used in conjunction with two well-known clustering algorithms, K-means and hierarchical clustering, and applied to climate data; their implementation and a comparison of the results are then examined. The main objective is to identify similar climate trends and to group geographical areas with similar environmental conditions. Climate data from specific places are collected and analyzed as part of the project, and each time series is split into trend, seasonality, and residual components. To categorize growing regions according to their climatic tendencies, the decomposed series are then subjected to K-means clustering and to hierarchical clustering with dynamic time warping. The resulting clusters are evaluated to understand how the climates of different regions compare with one another, and how regions cluster when grouped by the overall trend of the temperature profile over the full growing season as opposed to the seasonality component of the various locations.
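
A compact sketch of this pipeline (synthetic temperature profiles, K-means on the trend components, and hierarchical clustering on a small dynamic-time-warping distance; the site data and cluster counts are assumptions):

```python
import numpy as np
from statsmodels.tsa.seasonal import seasonal_decompose
from sklearn.cluster import KMeans
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(4)
n_sites, n_days = 8, 180  # hypothetical growing-season temperature profiles
t = np.arange(n_days)
profiles = np.array([
    15 + (0.05 if i < 4 else 0.02) * t + 3 * np.sin(2 * np.pi * t / 30) + rng.normal(0, 0.5, n_days)
    for i in range(n_sites)
])

# Trend component of each site's profile (a 30-day seasonal cycle is assumed for the decomposition).
trends = np.array([seasonal_decompose(p, model='additive', period=30).trend for p in profiles])
trends = trends[:, ~np.isnan(trends).any(axis=0)]  # drop edge positions left NaN by the moving average

# K-means on the trend components.
km_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(trends)

def dtw(a, b):
    """Plain O(n*m) dynamic time warping distance."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Hierarchical clustering on a DTW distance matrix over the full profiles.
dist = np.array([[dtw(profiles[i], profiles[j]) for j in range(n_sites)] for i in range(n_sites)])
hc_labels = fcluster(linkage(squareform(dist, checks=False), method='average'), t=2, criterion='maxclust')
print("K-means labels:     ", km_labels)
print("hierarchical labels:", hc_labels)
```
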
7

Ameliorating Environmental Effects on Hyperspectral Images for Improved Phenotyping in Greenhouse and Field Conditions

Dongdong Ma (9224231) 14 August 2020 (has links)
Hyperspectral imaging has become one of the most popular technologies in plant phenotyping because it can efficiently and accurately predict numerous plant physiological features such as plant biomass, leaf moisture content, and chlorophyll content. Various hyperspectral imaging systems have been deployed in both greenhouse and field phenotyping activities. However, hyperspectral imaging quality is severely affected by continuously changing environmental conditions such as cloud cover, temperature and wind speed, which induce noise in plant spectral data. Eliminating these environmental effects to improve imaging quality is critically important. In this thesis, two approaches were taken to address the imaging noise issue in the greenhouse and in the field separately. First, a computational simulation model was built to simulate greenhouse microclimate changes (such as the temperature and radiation distributions) through a 24-hour cycle in a research greenhouse. The simulated results were used to optimize the movement of an automated conveyor in the greenhouse: the plants were shuffled by the conveyor system with optimized frequency and distance to provide uniform growing conditions, such as temperature and lighting intensity, for each individual plant. The results showed that the variance of the plants' phenotyping feature measurements decreased significantly (by up to 83% in plant canopy size) in this conveyor greenhouse. Second, the environmental effects (i.e., sun radiation) on aerial hyperspectral images in field plant phenotyping were investigated and modeled. An artificial neural network (ANN) method was proposed to model the relationship between the image variation and environmental changes. Before the 2019 field test, a gantry system was designed and constructed to repeatedly collect time-series hyperspectral images of corn plants at 2.5-minute intervals under varying environmental conditions, including sun radiation, solar zenith angle, diurnal time, humidity, temperature and wind speed. Over 8,000 hyperspectral images of corn (Zea mays L.) were collected with synchronized environmental data throughout the 2019 growing season. The models trained with the proposed ANN method were able to accurately predict the variations in imaging results (i.e., 82.3% for NDVI) caused by the changing environments. Thus, the ANN method can be used by remote sensing professionals to adjust or correct raw imaging data for changing environments to improve plant characterization.
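
A small sketch of the ANN idea described above, assuming the environmental covariates and an NDVI-deviation target are already available as arrays (all feature names, coefficients and values here are placeholders, not the thesis data):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.metrics import r2_score

# Hypothetical environmental measurements synchronized with each image:
# [sun radiation, solar zenith angle, diurnal time, humidity, temperature, wind speed]
rng = np.random.default_rng(5)
X = rng.uniform(0, 1, size=(2000, 6))
# Hypothetical NDVI deviation from a reference acquisition, driven mostly by radiation and zenith angle.
y = 0.05 * X[:, 0] - 0.03 * X[:, 1] + 0.01 * X[:, 2] + rng.normal(0, 0.005, 2000)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=3000, random_state=0),
)
model.fit(X_train, y_train)
print("R^2 on held-out images:", round(r2_score(y_test, model.predict(X_test)), 3))

# The fitted model can then be used to subtract the environment-induced component
# from new NDVI readings, leaving a corrected plant signal.
```
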
8

EDIFES 0.4: Scalable Data Analytics for Commercial Building Virtual Energy Audits

Pickering, Ethan M. 13 September 2016 (has links)
No description available.
9

Predictability of Nonstationary Time Series using Wavelet and Empirical Mode Decomposition Based ARMA Models

Lanka, Karthikeyan January 2013 (has links) (PDF)
The idea behind time series forecasting techniques is that the past carries information about the future, so the question of how that information is encoded in the past, and how it can be interpreted and used to extrapolate future events, constitutes the crux of time series analysis and forecasting. Several classes of methods have been developed, including qualitative techniques (e.g., the Delphi method), causal techniques (e.g., least squares regression) and quantitative techniques (e.g., smoothing methods and time series models); all share the concept of establishing a model, either theoretically or mathematically, from past observations and estimating the future from it. Of these, time series methods such as the autoregressive moving average (ARMA) process have gained popularity because of their simplicity of implementation and the accuracy of their forecasts. However, these models were formulated based on certain properties that a time series is assumed to possess. Classical decomposition techniques were developed to supplement the requirements of time series models: they describe a time series in terms of simple patterns, namely trend, cyclical and seasonal patterns, together with noise. Decomposing a time series into component patterns, modeling each component with a forecasting process and finally combining the component forecasts to obtain the overall prediction has yielded superior performance over standard forecasting techniques. All these methods rely on the basic principle of moving average computation, but classical decomposition methods have the disadvantages of imposing a fixed number of components on any time series and of producing data-independent decompositions. Moreover, during moving average computation the edges of the time series may not be modeled properly, which affects long-range forecasting. These issues call for more efficient and advanced decomposition techniques such as wavelets and Empirical Mode Decomposition (EMD), two of the most innovative concepts in time series analysis, both focused on processing nonlinear and nonstationary time series. Hence, this research was undertaken to ascertain the predictability of nonstationary time series using wavelet- and EMD-based ARMA models. Wavelets were developed from the concepts of Fourier analysis and the windowed Fourier transform. Accordingly, the thesis first presents the motivation for wavelets and the advantages they provide. Wavelets were originally defined for continuous signals; to match real-world requirements, wavelet analysis was later defined in the discrete setting, which is called the Discrete Wavelet Transform (DWT). The present thesis uses the DWT for time series decomposition. A detailed discussion of the theory behind time series decomposition is presented, followed by the mathematical viewpoint of decomposition using the DWT, including the decomposition algorithm. EMD belongs to the same class as wavelets as far as time series decomposition is concerned; it arises from the observation that most time series in nature contain multiple frequencies, so that different scales exist simultaneously.
Compared with standard Fourier analysis and wavelet algorithms, EMD is more adaptive in processing various nonstationary time series. The method decomposes a complicated time series into a small, finite number of empirical modes (Intrinsic Mode Functions, IMFs), each of which contains information about the original series. The EMD decomposition algorithm is presented after the conceptual discussion, followed by the proposed forecasting algorithm coupling EMD with an ARMA model, which also takes into account the number of time steps ahead for which forecasts are required. To test the wavelet- and EMD-based methodologies for predicting nonstationary time series, streamflow data from the USA and rainfall data from India are used: four nonstationary streamflow sites (USGS data) with monthly total volumes and two nonstationary gridded rainfall sites (IMD) with monthly total rainfall. Predictability is assessed in two scenarios, six-months-ahead and twelve-months-ahead forecasting, and the Normalized Root Mean Square Error (NRMSE) and the Nash-Sutcliffe Efficiency Index (Ef) are used to evaluate performance. The results indicate that the wavelet-based analyses reproduce the variations well in the six-months-ahead forecasts, staying in harmony with the observed values at most sites. Although both methods capture the minima of the time series effectively for six- and twelve-months-ahead predictions, the wavelet-based method yields better forecasts than the EMD-based method for twelve-months-ahead predictions. It is therefore inferred that the wavelet-based method has better prediction capabilities than the EMD-based method, despite some limitations of the time series models and of the manner in which the decomposition is carried out. Finally, the study concludes that the wavelet-based time series algorithm could be used to model events such as droughts with reasonable accuracy, and modifications are suggested that could extend the scope of applicability to other areas of hydrology.
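
A simplified sketch of the wavelet-plus-ARMA idea (not the thesis code): decompose a synthetic monthly series with PyWavelets, forecast each reconstructed component with a low-order ARIMA, and sum the component forecasts. The package choice, wavelet, level and model orders are all assumptions for illustration.

```python
import numpy as np
import pywt
from statsmodels.tsa.arima.model import ARIMA

# Synthetic nonstationary monthly series (placeholder for the streamflow/rainfall records).
rng = np.random.default_rng(6)
n = 240
x = np.cumsum(rng.normal(0, 1, n)) + 5 * np.sin(2 * np.pi * np.arange(n) / 12)

# Discrete wavelet decomposition into approximation and detail coefficients.
wavelet, level = 'db4', 3
coeffs = pywt.wavedec(x, wavelet, level=level)

def component(coeffs, keep):
    """Reconstruct the series keeping only one coefficient band."""
    parts = [c if i == keep else np.zeros_like(c) for i, c in enumerate(coeffs)]
    return pywt.waverec(parts, wavelet)[:len(x)]

components = [component(coeffs, i) for i in range(len(coeffs))]

# Forecast each component with a simple ARMA-type model and sum the forecasts.
horizon = 12  # twelve months ahead
forecast = np.zeros(horizon)
for comp in components:
    # The approximation band may be nonstationary, so stationarity is not enforced here.
    fit = ARIMA(comp, order=(2, 0, 2), enforce_stationarity=False).fit()
    forecast += fit.forecast(steps=horizon)

# In a real test, NRMSE against held-out observations would be computed here.
print(forecast.round(2))
```
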
