  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Mining IT Product Life Cycle from Massive Newsgroup Articles

Chou, Cheng-Chi 22 July 2003 (has links)
The product life cycle (PLC) is a useful managerial tool: marketing strategies must change as a product moves through its life cycle, and managers who understand the cycle concept are better placed to forecast future sales activity and plan marketing strategies. In practice, however, PLC estimates often go wrong because the necessary data are hard to access and decision-making information is lacking. This thesis therefore applies a customer behavior model to analyze the relationship between the frequency and duration of product discussions, and calculates the PLC pattern to explore the product's current position in customers' minds. The PLC curve is then constructed from the information obtained in this analysis. Moreover, data mining and information retrieval techniques are employed to diagnose variations in discussion frequency and in the content of discussion articles, extracting the distinctive events that influenced the PLC curve. The thesis closes by describing its main contributions.
22

Moving-Average approximations of random epsilon-correlated processes

Kandler, Anne, Richter, Matthias, vom Scheidt, Jürgen, Starkloff, Hans-Jörg, Wunderlich, Ralf 31 August 2004 (has links) (PDF)
The paper considers approximations of time-continuous epsilon-correlated random processes by interpolation of time-discrete Moving-Average processes. These approximations are helpful for Monte-Carlo simulations of the response of systems containing random parameters described by epsilon-correlated processes. The paper focuses on the approximation of stationary epsilon-correlated processes with a prescribed correlation function. Numerical results are presented.
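As a rough illustration of the idea (not the authors' construction), the following Python sketch simulates a time-discrete Moving-Average process, whose correlation function vanishes beyond the lag spanned by its weights, which is the defining feature of an epsilon-correlated process. The MA weights are made up for the example.

```python
import numpy as np

def ma_process(n_steps, weights, rng):
    """Simulate a time-discrete Moving-Average process
    x_k = sum_j weights[j] * e_{k-j} with iid standard normal noise.
    Its correlation function vanishes beyond lag q = len(weights) - 1,
    mimicking an epsilon-correlated process with epsilon = q * dt."""
    q = len(weights) - 1
    e = rng.standard_normal(n_steps + q)
    # "valid" convolution yields exactly n_steps output samples
    return np.convolve(e, weights, mode="valid")

def autocov(x, lag):
    """Empirical autocovariance at a given lag."""
    if lag == 0:
        return x.var()
    return np.mean((x[:-lag] - x.mean()) * (x[lag:] - x.mean()))

rng = np.random.default_rng(0)
weights = np.array([0.5, 0.3, 0.2])   # hypothetical MA(2) weights
x = ma_process(200_000, weights, rng)

# Beyond lag q = 2 the autocovariance is (up to sampling error) zero.
print(round(autocov(x, 5), 3))  # ~ 0
```

Interpolating such a process between grid points would then give a time-continuous approximation suitable for Monte-Carlo runs, as the abstract describes.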
23

Lösung parabolischer Differentialgleichungen mit zufälligen Randbedingungen mittels FEM / Solution of parabolic differential equations with random boundary conditions using FEM

Kandler, Anne, vom Scheidt, Jürgen, Unger, Roman 31 August 2004 (has links) (PDF)
In this work, stochastic characteristics of the solution of parabolic differential equations with random Neumann boundary conditions are derived using the finite element method. Particular attention is paid to the computation of the correlation and variance functions. By applying FEM techniques, the stochastic initial-boundary value problem is approximated by a system of ordinary differential equations with stochastic inhomogeneous terms. Modeling the stochastic input parameters as epsilon-correlated fields permits expansions of the solution characteristics with respect to the correlation length. Numerical examples compare analytical results with simulation results.
24

Pricing and competition in the Swedish retail market for electricity

Lu, Yuhao January 2015 (has links)
Sweden has, together with Norway, Finland and Denmark, created a multi-national electricity market called NordPool. In this market, producers and retailers of electricity buy and sell electricity, and the retailers then offer it to end consumers such as households and industries. Previous studies have shown that pricing in the NordPool market functions quite well, but to my knowledge no study has examined whether pricing in the Swedish retail market to consumers functions equally well. If the market is well functioning, with competition and low transaction costs when switching electricity retailer, we would expect a homogeneous good such as electricity to be sold at approximately the same price, and price changes to be highly correlated. The aim of this study is therefore to test whether the price of Vattenfall, the largest energy firm in the Swedish market, is highly correlated with the prices of other firms in the Swedish retail market for electricity. Descriptive statistics indicate that the price offered by Vattenfall is quite similar to the prices of other firms in the market, and regression analysis shows that the correlation between the price of Vattenfall and that of other firms is as high as 0.98.
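A minimal sketch of the kind of correlation computation the study describes, using made-up price series; the thesis's actual data and regression specification are not reproduced here:

```python
import numpy as np

# Hypothetical monthly retail prices (öre/kWh) for two firms.
vattenfall = np.array([85.1, 87.3, 90.2, 88.0, 84.5, 82.9, 86.4])
competitor = np.array([84.7, 86.9, 89.8, 87.9, 84.1, 82.4, 86.0])

# Pearson correlation of the two price series, analogous in spirit to
# the regression-based estimate reported in the study (~0.98 there).
r = np.corrcoef(vattenfall, competitor)[0, 1]
print(round(r, 2))
```

A high correlation of price movements between firms is consistent with, though not proof of, a competitive market for a homogeneous good.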
25

Kompiuterių tinklo srautų anomalijų aptikimo metodai / Detection of network traffic anomalies

Krakauskas, Vytautas 03 June 2006 (has links)
This paper describes various network monitoring technologies and anomaly detection methods. NetFlow was chosen for the anomaly detection system being developed, in which anomalies are detected using a deviation value. After evaluating the quality of the developed system, new enhancements were suggested and implemented. Flow data distribution was proposed to achieve a more precise representation of NetFlow data, enabling more precise use of network monitoring information for anomaly detection. Arithmetic-average calculations were replaced with the more flexible Exponentially Weighted Moving Average algorithm, and a deviation weight was introduced to reduce false alarms. Results from an experiment with real-life data showed that the proposed changes increased the precision of the NetFlow-based anomaly detection system.
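The deviation-based detection described above might be sketched as follows; the EWMA smoothing factor, the deviation multiplier `k`, and the warm-up length are illustrative guesses, not the thesis's tuned values:

```python
def ewma_anomaly_flags(series, alpha=0.3, k=3.0, warmup=3):
    """Flag points whose deviation from an exponentially weighted
    moving average exceeds k times the EWMA of past absolute
    deviations. A short warm-up suppresses flags before the
    running statistics have stabilized."""
    mean = series[0]
    dev = 0.0
    flags = []
    for i, x in enumerate(series):
        d = abs(x - mean)
        flags.append(i >= warmup and d > k * dev)
        # Update the running EWMA of the level and of the deviation.
        mean = alpha * x + (1 - alpha) * mean
        dev = alpha * d + (1 - alpha) * dev
    return flags

traffic = [100, 102, 98, 101, 99, 500, 100, 103]  # flows/s, spike at index 5
print([i for i, f in enumerate(ewma_anomaly_flags(traffic)) if f])  # → [5]
```

Compared with a plain arithmetic average, the EWMA adapts to slow drifts in the baseline, so only the abrupt spike is flagged.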
26

Application of Block Sieve Bootstrap to Change-Point detection in time series

Zaman, Saad 30 August 2010 (has links)
Since the introduction of the CUSUM statistic by E.S. Page (1951), detection of a change or structural break in time series has gained significant interest, as its applications span various disciplines including economics, industrial applications, and environmental data sets. However, many of the early proposed statistics, such as CUSUM or MOSUM, lose their effectiveness when applied to time series data: either the size or the power of the test statistic becomes distorted, especially for higher-order autoregressive moving average processes. We use the test statistic from Gombay and Serban (2009) for detecting a change in the mean of an autoregressive process and show how applying the sieve bootstrap to the time series data can improve the performance of the test. The effectiveness of the proposed method is illustrated by applying it to economic data sets.
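For orientation, a basic max-type CUSUM statistic for a change in mean can be sketched as below. Note that it treats the observations as independent, which is exactly the deficiency the sieve bootstrap is brought in to correct for autoregressive data; the sample series are made up.

```python
import math

def cusum_statistic(x):
    """Max-type CUSUM statistic for a change in mean:
    T = max_k |S_k| / (sigma_hat * sqrt(n)),
    where S_k is the cumulative sum of (x_i - mean(x)).
    Serial dependence is ignored here."""
    n = len(x)
    mean = sum(x) / n
    sigma = math.sqrt(sum((v - mean) ** 2 for v in x) / n)
    s, s_max = 0.0, 0.0
    for v in x:
        s += v - mean
        s_max = max(s_max, abs(s))
    return s_max / (sigma * math.sqrt(n))

no_change = [0.1, -0.2, 0.05, 0.3, -0.1, 0.15, -0.25, 0.0]
with_change = [0.0, 0.1, -0.1, 0.05, 5.0, 5.2, 4.9, 5.1]  # mean shifts mid-series
print(cusum_statistic(no_change) < cusum_statistic(with_change))  # → True
```

With serially correlated data, critical values for T computed under independence are distorted; resampling residuals of a fitted AR sieve restores valid inference, which is the route the thesis follows.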
27

ARMA Identification of Graphical Models

Avventi, Enrico, Lindquist, Anders, Wahlberg, Bo January 2013 (has links)
Consider a Gaussian stationary stochastic vector process with the property that designated pairs of components are conditionally independent given the rest of the components. Such processes can be represented on a graph where the components are nodes and the lack of a connecting link between two nodes signifies conditional independence. This leads to a sparsity pattern in the inverse of the matrix-valued spectral density. Such graphical models find applications in speech, bioinformatics, image processing, econometrics and many other fields, where the problem of fitting an autoregressive (AR) model to such a process has been considered. In this paper we take this problem one step further, namely to fit an autoregressive moving-average (ARMA) model to the same data. We develop a theoretical framework and an optimization procedure which also sheds further light on previous approaches and results. This procedure is then applied to the identification problem of estimating the ARMA parameters as well as the topology of the graph from statistical data.
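The conditional-independence-as-sparsity idea has a simple static analogue that can be checked numerically: for a Gaussian vector, a zero entry in the precision (inverse covariance) matrix encodes conditional independence of the corresponding pair given the rest, just as the zero pattern in the inverse spectral density does for the process. A toy three-node chain, with made-up values:

```python
import numpy as np

# Precision matrix of a 3-node chain 1 - 2 - 3: nodes 1 and 3 are not
# directly linked, so K[0, 2] = 0 (conditional independence given node 2).
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
Sigma = np.linalg.inv(K)

# Marginally, components 1 and 3 ARE correlated ...
print(abs(Sigma[0, 2]) > 1e-9)  # → True
# ... but the zero in the precision matrix encodes their
# conditional independence given component 2.
print(K[0, 2] == 0.0)           # → True
```

The paper's setting replaces the covariance matrix by the matrix-valued spectral density and fits ARMA dynamics, but the sparsity-reads-as-graph principle is the same.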
28

SHECARE: Shared Haptic Environment on the Cloud for Arm Rehabilitation Exercises

Hoda, Mohamad January 2016 (has links)
It is well known that home exercise can be as effective as exercise at a rehabilitation center. Unfortunately, passive devices such as dumbbells, elastic bands, stress balls and tubing, which have been widely used for home-based arm rehabilitation, do not provide therapists with the information needed to monitor the patient's progress, identify any impairment, and suggest treatments. Moreover, the lack of interactivity of these devices turns rehabilitation exercises into a boring, unpleasant task. In this thesis, we introduce a family of home-based post-stroke rehabilitation systems aimed at solving the aforementioned problems, which we call "Shared Haptic Environment on the Cloud for Arm Rehabilitation Exercises (SHECARE)". The systems combine recent rehabilitation approaches with efficient yet affordable skeleton-tracking input technologies and a multimodal interactive computer environment. In addition, the systems provide real-time feedback to stroke patients, summarize the feedback after each session, and predict overall recovery progress. Moreover, they embody a new style of home-based rehabilitation that motivates patients by engaging the whole family and friends in the rehabilitation process, and they allow therapists to remotely assess patients' progress and adjust the training strategy accordingly. Two mathematical models are presented in this thesis: the first finds the relationship between upper extremity kinematics and the associated forces/strength; the second evaluates the medical condition of stroke patients and predicts their recovery progress from their performance history. Objective assessments (clinical tests) and subjective assessments (usability studies) have shown the feasibility of the proposed systems for rehabilitation of stroke patients with upper-limb motor dysfunction.
29

Assessing a quantitative approach to tactical asset allocation

Royston, Guy Andrew 04 August 2012 (has links)
The purpose of this paper is to determine whether the adoption of a simple trend-following quantitative method improves risk-adjusted returns across various asset classes in a South African market setting. A simple moving-average timing model is tested on the South African equity and bond markets, on data going back to 1925, within a tactical asset allocation framework. The timing solution, when applied to the JSE All Share Index, the RSA Government Bond Index, and an equally weighted portfolio, improved returns while reducing risk. Testing the model in sample by decade highlighted periods of inferior return performance, providing evidence supporting prior research (Faber, 2007) that the timing model acts as a risk-reduction technique with limited to no impact on return. / Dissertation (MBA)--University of Pretoria, 2012. / Gordon Institute of Business Science (GIBS) / unrestricted
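A Faber (2007)-style moving-average timing rule of the kind tested in the study can be sketched in a few lines; the window and price series below are illustrative, not the study's data:

```python
def sma_timing(prices, window=10):
    """Long (1) when the current price is above its trailing simple
    moving average, in cash (0) otherwise -- a Faber (2007)-style
    timing rule (there applied with a 10-month SMA)."""
    positions = []
    for i in range(len(prices)):
        if i < window:
            positions.append(0)  # not enough history yet: stay out
        else:
            sma = sum(prices[i - window:i]) / window
            positions.append(1 if prices[i] > sma else 0)
    return positions

prices = [100, 101, 103, 102, 104, 106, 105, 107, 108, 110, 112, 90]
print(sma_timing(prices, window=10))  # → [0,0,0,0,0,0,0,0,0,0,1,0]
```

The rule stays invested while the trend is up and steps aside after the sharp drop, which is the risk-reduction behavior the study attributes to the timing model.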
30

End-to-End Available Bandwidth Estimation and Monitoring

Guerrero Santander, Cesar Dario 20 February 2009 (has links)
Available Bandwidth Estimation Techniques and Tools (ABETTs) have recently been envisioned as a supporting mechanism in areas such as compliance with service level agreements, network management, traffic engineering and real-time resource provisioning, flow and congestion control, construction of overlay networks, fast detection of failures and network attacks, and admission control. However, it is unknown whether current ABETTs can run efficiently in any type of network, under different network conditions, and whether they can provide accurate available bandwidth estimates at the timescales these applications need. This dissertation investigates techniques and tools able to provide accurate, low-overhead, reliable, and fast available bandwidth estimations. First, it shows how the network can be sampled to obtain information about the available bandwidth. All current estimation tools use either the probe gap model or the probe rate model as their sampling technique. Since the latter introduces a large amount of additional traffic to the network, the probe gap model is the sampling method used in this work. Then, both an analytical and an experimental approach are used to perform an extensive performance evaluation of current available bandwidth estimation tools over a flexible and controlled testbed. The results of the evaluation highlight accuracy, overhead, convergence time, and reliability issues of current tools that limit their use by some of the envisioned applications: single estimations are affected by the bursty nature of the cross traffic and by errors generated by the network infrastructure. A hidden Markov model approach to end-to-end available bandwidth estimation and monitoring is investigated to address these issues. This approach builds a model that incorporates the dynamics of the available bandwidth, and every sample that generates an estimation is adjusted by the model. This adjustment makes it possible to obtain acceptable estimation accuracy with a small number of samples and in a short period of time. Finally, the new approach is implemented in a tool called Traceband. The tool, written in ANSI C, is evaluated and compared with Pathload and Spruce, the best estimation tools belonging to the probe rate model and the probe gap model, respectively. The evaluation is performed using Poisson, bursty, and self-similar synthetic cross traffic, as well as real traffic from a network path at the University of South Florida. Results show that Traceband provides more estimations per unit time with accuracy comparable to Pathload and Spruce, while introducing minimal probing traffic. Traceband also includes an optional moving-average technique that smooths out the estimations and improves accuracy even further.
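The probe gap model underlying Spruce (and this work's sampling) can be illustrated with the standard dispersion formula; the link capacity and gap values below are made up:

```python
def probe_gap_estimate(g_in, g_out, capacity):
    """Probe gap model: the widening of a packet pair's gap at the
    receiver measures cross traffic that arrived between the probes,
    giving available bandwidth A = C * (1 - (g_out - g_in) / g_in)."""
    return capacity * (1.0 - (g_out - g_in) / g_in)

# One pair on a 100 Mb/s bottleneck: 1.0 ms input gap widened to 1.4 ms.
print(round(probe_gap_estimate(1.0, 1.4, 100.0), 1))  # → 60.0 (Mb/s)

# Averaging several pairs tames the burstiness of cross traffic,
# in the spirit of Traceband's optional moving-average smoothing.
pairs = [(1.0, 1.4), (1.0, 1.2), (1.0, 1.5), (1.0, 1.3)]
estimates = [probe_gap_estimate(gi, go, 100.0) for gi, go in pairs]
print(round(sum(estimates) / len(estimates), 1))  # → 65.0
```

Because each pair yields an estimate from a single gap sample, individual readings are noisy; modeling their dynamics (the dissertation's hidden Markov model step) or smoothing them is what makes the estimates usable.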
