51

Analýza účetních výkazů firmy pomocí časových řad / Analysis of Company Financial Statements Using Time Series

Zachovalová, Pavla January 2011
This master's thesis analyses the financial situation of SUDOP BRNO, spol. s r.o. The theoretical part explains time series, regression analysis and financial analysis. The practical part focuses on the development of the company's economic indicators and predicts their future trends; the company's competitive position is also evaluated. The final part offers suggestions for improving the company's situation.
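As a minimal sketch of the kind of time-series trend extrapolation the thesis describes, a linear trend can be fitted to an indicator and extended one period ahead. The figures and the 2011 target year below are invented for the example, not taken from the thesis:

```python
import numpy as np

# Hypothetical yearly values of an economic indicator (e.g. revenue in mil. CZK);
# these figures are invented for illustration.
years = np.array([2005, 2006, 2007, 2008, 2009, 2010], dtype=float)
revenue = np.array([38.0, 41.5, 44.1, 47.9, 50.2, 53.8])

# Fit a first-order (linear) trend line, as in simple regression analysis.
slope, intercept = np.polyfit(years, revenue, deg=1)

# Extrapolate the trend one year ahead -- a prediction of future development.
forecast_2011 = slope * 2011 + intercept
```

In practice the thesis would choose the regression function (linear, logistic, etc.) per indicator; a straight line is only the simplest case.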
52

Kritické faktory implementace informačního systému kategorie ERP ve společnosti METAL STEEL INDUSTRY, spol. s r.o. / Critical Factors of Implementation of ERP System in the Company METAL STEEL INDUSTRY, spol. s r.o.

Ľubek, Matej January 2016
This master's thesis deals with the implementation of an ERP-category enterprise information system in the company METAL STEEL INDUSTRY, spol. s r.o. The first part describes the basic theoretical knowledge and information relevant to the issue. The aim of the work is to detect and identify the critical factors and potential risks associated with the selection, implementation and subsequent operation of the enterprise information system. The output of the work is a time and economic evaluation of the project, together with proposals to reduce the risks and recommendations for the company.
53

Wie wirken Unternehmensberichte auf den Aktienkurs? - Eine statistische Untersuchung mittels Event Coincidence Analysis und Superposed Epoch Analysis / How Do Business Reports Affect the Stock Price? A Statistical Investigation Using Event Coincidence Analysis and Superposed Epoch Analysis

Rimatzki, Florian 01 November 2016
Several times a year, companies publish business reports to account openly for their business activities. This thesis examines the effect of those reports on the stock prices of businesses in the German automotive industry. Statistical methods such as Event Coincidence Analysis and Superposed Epoch Analysis are used to examine possible negative and positive stock-price reactions before and after the disclosure of a report. The results suggest that a negative business report has a stronger influence on the daily abnormal rate of return than a positive one. Furthermore, the thesis confirms Roeder's hypothesis that the information in a business report is processed not only on the day of publication but also on the day after.
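Superposed epoch analysis itself is straightforward to sketch: the return series is cut into windows centred on the event (publication) dates, and the windows are averaged lag by lag to expose a composite response. The data below are simulated, assuming a day-after dip following negative reports; nothing here reproduces the thesis's actual sample:

```python
import numpy as np

def superposed_epoch(series, event_idx, window=3):
    """Average a series in windows centred on event dates (superposed epoch analysis).

    series: 1-D array of daily abnormal returns (simulated here).
    event_idx: indices of report-publication days.
    window: number of days before/after each event to include.
    """
    epochs = [series[i - window:i + window + 1]
              for i in event_idx
              if i - window >= 0 and i + window < len(series)]
    return np.mean(epochs, axis=0)  # composite response at lags -window..+window

# Simulated abnormal-return series with a dip on the day after "negative report" days.
rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, 200)
events = [50, 100, 150]
for e in events:
    returns[e + 1] -= 0.05  # day-after reaction, as in Roeder's hypothesis

composite = superposed_epoch(returns, events, window=3)
```

In the composite, index `window + 1` corresponds to lag +1 (the day after publication), where the simulated dip shows up clearly against the noise.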
54

A Bridge between Short-Range and Seasonal Forecasts: Data-Based First Passage Time Prediction in Temperatures

Wulffen, Anja von 25 January 2013
Current conventional weather forecasts are based on high-dimensional numerical models. They are usually only skillful up to a maximum lead time of around 7 days due to the chaotic nature of the climate dynamics and the related exponential growth of model and data-initialisation errors. Even the fully detailed medium-range predictions made, for instance, at the European Centre for Medium-Range Weather Forecasts do not exceed lead times of 14 days, while longer-range predictions are limited to time-averaged forecast outputs only. Many sectors would profit significantly from accurate forecasts on seasonal time scales without needing the wealth of detail a full dynamical model can deliver. In this thesis, we aim to study the potential of a much cheaper data-based statistical approach to provide predictions of comparable or even better skill up to seasonal lead times, using the time until the next occurrence of frost as an exemplary forecast target. To this end, we first analyse the properties of the temperature anomaly time series obtained from measured data by subtracting a sinusoidal seasonal cycle, as well as the distribution properties of the first passage times to frost. The possibility of generating additional temperature anomaly data with the same properties by using very simple autoregressive model processes, to potentially reduce the statistical fluctuations in our analysis, is investigated and ultimately rejected. In a next step, we study the potential for predictability using only conditional first passage time distributions derived from the temperature anomaly time series and confirm a significant dependence of the distributions on the initial conditions.
After this preliminary analysis, we issue data-based out-of-sample forecasts for three different prediction targets: the specific date of first frost, the probability of observing frost before summer for forecasts issued in spring, and the full probability distribution of the first passage times to frost. We then study the possibility of improving the forecast quality, first by enhancing the stationarity of the temperature anomaly time series and then by adding as an additional input variable the state of the North Atlantic Oscillation on the date the predictions are issued. We are able to obtain significant forecast skill up to seasonal lead times when comparing our results to an unskilled reference forecast. A first comparison between the data-based forecasts and corresponding predictions gathered from a dynamical weather model, necessarily using a lead time of only up to 15 days, shows that our simple statistical schemes are only outperformed (and then only slightly) if further statistical post-processing is applied to the model output.
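The thesis's core quantity, the first passage time to frost, can be sketched on invented data as follows; the AR(1) anomaly term mirrors the simple autoregressive processes the thesis investigates (and ultimately rejects) for data augmentation, and every number here is an assumption for illustration:

```python
import numpy as np

def first_passage_to_frost(temps, threshold=0.0):
    """Return the first index (in days) at which the temperature drops to or
    below the threshold, or None if frost never occurs in the series."""
    below = np.nonzero(temps <= threshold)[0]
    return int(below[0]) if below.size else None

# Invented autumn temperature series: a cooling trend toward frost plus an
# AR(1) anomaly, standing in for the measured-minus-seasonal-cycle anomalies.
rng = np.random.default_rng(1)
n_days = 120
anomaly = np.zeros(n_days)
for t in range(1, n_days):
    anomaly[t] = 0.8 * anomaly[t - 1] + rng.normal(0.0, 1.5)
temps = 12.0 - 0.15 * np.arange(n_days) + anomaly

fpt = first_passage_to_frost(temps)
```

Repeating this over many anomaly realisations conditioned on today's state yields exactly the conditional first-passage-time distributions the forecasts are built from.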
55

Podnikatelský záměr / Business Plan

Marková, Adéla January 2013
The master's thesis develops a business plan, specifically a proposal for extending production in the small engineering company Unicode M&D Ltd. The first part deals with the theoretical background needed for drafting the plan. The second part analyses the company's business environment. The last part presents the specific proposal for extending production in this company.
56

Vorhersagbarkeit ökonomischer Zeitreihen auf verschiedenen zeitlichen Skalen / Predictability of Economic Time Series on Different Time Scales

Mettke, Philipp 24 November 2015
This thesis examines three decomposition techniques and their usability for economic and financial time series. The stock index DAX30 and the exchange rate from the British pound to the US dollar serve as representative economic time series; in addition, autoregressive and conditionally heteroscedastic simulations are analysed as benchmark processes for the real data. The discrete wavelet transform (DWT) uses wave-like functions to adapt to the behaviour of a time series on different time scales. The second method, singular spectrum analysis (SSA), is applied to extract influential reconstructed modes. As a third algorithm, empirical mode decomposition (EMD) yields intrinsic mode functions, which reflect the short- and long-term fluctuations of the time series. Some problems arise in the decomposition process, such as bleeding in the DWT method or mode mixing across multiple EMD mode functions. Conclusions about the predictability of the time series are drawn from entropy and recurrence analysis, and the cyclic behaviour of the decompositions is examined via the coefficient of variation of the instantaneous frequency. The results show rising predictability, especially on higher decomposition levels. The instantaneous-frequency measure yields low values for regular oscillatory cycles, while irregular behaviour results in a high variation coefficient. Singular spectrum analysis shows frequency-stable cycles in the reconstructed modes but represents the influences of the original time series worse than the other two methods, which in contrast show very little frequency stability in the extracted details.
Table of contents (translated from German): 1. Introduction; 2. Data basis (selection and peculiarities of economic time series; simulation studies using AR and GARCH processes); 3. Decomposition using modern time-series analysis techniques (discrete wavelet transform; singular spectrum analysis; empirical mode decomposition); 4. Assessing predictability (entropies as a measure of short-term predictability; recurrence analysis; frequency stability of the decomposition); 5. Execution and interpretation of the results (visual interpretation of the decompositions; assessment via characteristics); 6. Conclusion
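Of the three techniques, singular spectrum analysis is the easiest to sketch from scratch: embed the series in a trajectory matrix, decompose it by SVD, and reconstruct from the leading components by diagonal averaging. The series below is synthetic (trend plus cycle plus noise), not the DAX30 or exchange-rate data used in the thesis:

```python
import numpy as np

def ssa_reconstruct(x, window, n_components):
    """Basic singular spectrum analysis: embed, decompose, reconstruct.

    Returns the series rebuilt from the leading n_components eigentriples."""
    n = len(x)
    k = n - window + 1
    # Trajectory (Hankel) matrix: lagged copies of the series as columns.
    traj = np.column_stack([x[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    approx = (u[:, :n_components] * s[:n_components]) @ vt[:n_components]
    # Diagonal averaging (Hankelisation) maps the matrix back to a series.
    recon = np.zeros(n)
    counts = np.zeros(n)
    for i in range(window):
        for j in range(k):
            recon[i + j] += approx[i, j]
            counts[i + j] += 1
    return recon / counts

# Synthetic "economic" series: linear trend + 20-step cycle + noise.
t = np.arange(200)
series = 0.05 * t + np.sin(2 * np.pi * t / 20) \
    + np.random.default_rng(2).normal(0.0, 0.3, 200)
smooth = ssa_reconstruct(series, window=40, n_components=3)
```

With a few leading components the trend and dominant cycle are recovered while most of the noise remains in the residual, which is what makes the reconstructed modes candidates for prediction.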
57

Classification and repeatability studies of transient electromagnetic measurements with respect to the development of CO2-monitoring techniques

Bär, Matthias 09 February 2021
The mitigation of greenhouse gases like CO2 is a challenging task for our society. One strategy for curbing the constant emission of CO2 is carbon capture and storage, in which CO2 is sequestered in subsurface reservoirs. However, these reservoirs carry the risk of leakage, and appropriate geophysical monitoring methods are needed. A crucial aspect of monitoring is the assignment of measured data to particular events; especially when changes in the measured data are small, suitable statistical methods are required. In this thesis, a new statistical workflow based on cluster analysis is proposed to detect similar transient electromagnetic signals. The similarity criteria dynamic time warping, the autoregressive distance, and the normalized root-mean-square distance are investigated and evaluated against the classic Euclidean norm. The optimal number of clusters is determined using the gap statistic and visualized with multidimensional scaling; silhouette values are used to validate the clustering results. The statistical workflow is applied to a synthetic data set, a long-term monitoring data set, and a repeat measurement at a pilot CO2-sequestration site in Brooks, Alberta.
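Dynamic time warping, the first of the similarity criteria listed, can be sketched in a few lines: a cost matrix accumulates the cheapest alignment of one signal onto the other. The decaying curves below are invented stand-ins for transient electromagnetic responses, not data from the thesis:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D signals."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # Cheapest way to reach (i, j): match, insertion, or deletion.
            cost[i, j] = d + min(cost[i - 1, j],
                                 cost[i, j - 1],
                                 cost[i - 1, j - 1])
    return cost[n, m]

# Invented decaying transients: two nearly identical signals and one with a
# clearly different decay rate.
t = np.linspace(0.0, 1.0, 50)
sig_a = np.exp(-5.0 * t)
sig_b = np.exp(-5.2 * t)   # nearly the same decay as sig_a
sig_c = np.exp(-2.0 * t)   # clearly slower decay

d_ab = dtw_distance(sig_a, sig_b)
d_ac = dtw_distance(sig_a, sig_c)
```

Pairwise distances like these form the dissimilarity matrix that a clustering algorithm (and the gap statistic, silhouette values, and multidimensional scaling) can then operate on.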
