101

Différents procédés statistiques pour détecter la non-stationnarité dans les séries de précipitation / Different statistical procedures for detecting non-stationarity in precipitation series

Charette, Kevin 04 1900 (has links)
The goal of this master's thesis is to determine whether the summer convective precipitation simulated by the Canadian Regional Climate Model (CRCM) is stationary over time. To answer this question, we propose both a frequentist and a Bayesian statistical methodology. For the frequentist approach, we use standard quality-control charts as well as the CUSUM to determine whether the mean has increased over the years. For the Bayesian approach, we compare posterior distributions of the precipitation over time: the posterior density of a given period is modelled and compared with the posterior density of a later, more distant period. The comparison uses statistics based on the Hellinger distance, the J-divergence and the L2 norm. Throughout the thesis, the ARL (average run length) is used to calibrate and to compare each of our tools, so a large part of the work is devoted to studying the ARL. Once the tools are calibrated, simulations are used to compare them. Finally, the CRCM data are analysed to determine whether or not they are stationary.
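For readers unfamiliar with the frequentist tool mentioned above, the following is a minimal sketch of a one-sided CUSUM for an upward mean shift, calibrated by a Monte Carlo estimate of the in-control ARL. The function names, thresholds and normal-data assumption are illustrative only, not the thesis's actual implementation.

```python
import numpy as np

def cusum_upper(x, mu0, k, h):
    """One-sided CUSUM for an upward shift in the mean.
    Returns the index of the first alarm (None if no alarm) and the CUSUM path."""
    s, path = 0.0, []
    for i, xi in enumerate(x):
        s = max(0.0, s + (xi - mu0) - k)   # accumulate evidence of a mean above mu0 + k
        path.append(s)
        if s > h:
            return i, np.array(path)
    return None, np.array(path)

def in_control_arl(mu0, sigma, k, h, n_rep=200, n_max=2000, seed=0):
    """Monte Carlo estimate of the in-control ARL, used to calibrate the threshold h."""
    rng = np.random.default_rng(seed)
    runs = []
    for _ in range(n_rep):
        alarm, _ = cusum_upper(rng.normal(mu0, sigma, n_max), mu0, k, h)
        runs.append(n_max if alarm is None else alarm + 1)
    return float(np.mean(runs))

# Calibration step: check the in-control ARL implied by a candidate (k, h) pair.
print(in_control_arl(mu0=0.0, sigma=1.0, k=0.5, h=4.0))
```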
102

GLR Control Charts for Monitoring Correlated Binary Processes

Wang, Ning 27 December 2013 (has links)
When monitoring a binary process proportion p, it is usually assumed that the binary observations are independent. However, it is very common that the observations are correlated, with ρ being the correlation between two successive observations. The first part of this research investigates the problem of monitoring p when the binary observations follow a first-order two-state Markov chain model with ρ remaining unchanged. A Markov Binary GLR (MBGLR) chart with an upper bound on the estimate of p is proposed to monitor a continuous stream of autocorrelated binary observations, treating each observation as a sample of size n=1. The MBGLR chart with a large upper bound has good overall performance over a wide range of shifts. The MBGLR chart is optimized using the extra number of defectives (END) over a range of upper bounds for the MLE of p. The numerical results show that the optimized MBGLR chart has a smaller END than the optimized Markov binary CUSUM chart. The second part of this research develops a CUSUM-pρ chart and a GLR-pρ chart to monitor p and ρ simultaneously. The CUSUM-pρ chart with two tuning parameters is designed to detect shifts in p and ρ when the shifted values are known. We apply two CUSUM-pρ charts as a chart combination to detect increases in p together with increases or decreases in ρ. The GLR-pρ chart, with an upper bound on the estimate of p and with upper and lower bounds on the estimate of ρ, works well when the shifts are unknown; we find that it has better overall performance. The last part of this research investigates the problem of monitoring p, with ρ remaining at its target value, when the correlated binary observations are aggregated into samples of size n>1. We assume that the samples are independent and that there is correlation between the observations within a sample. Several GLR and CUSUM charts are proposed to monitor p, and their performance is compared. The simulation results show that the MBNGLR chart has overall better performance than the other charts. / Ph. D.
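To make the setting concrete, here is a small sketch (not the dissertation's code) that simulates a correlated binary stream from a first-order two-state Markov chain with marginal proportion p and lag-one correlation rho, and runs a simplified Bernoulli CUSUM on it; the CUSUM here deliberately ignores the Markov dependence that the MBGLR and Markov binary CUSUM charts account for.

```python
import numpy as np

def markov_binary(n, p, rho, seed=0):
    """Correlated Bernoulli(p) stream: a two-state Markov chain whose
    stationary proportion is p and whose lag-one correlation is rho."""
    rng = np.random.default_rng(seed)
    p11 = p + (1 - p) * rho          # P(X_t = 1 | X_{t-1} = 1)
    p01 = p * (1 - rho)              # P(X_t = 1 | X_{t-1} = 0)
    x = np.empty(n, dtype=int)
    x[0] = rng.random() < p
    for t in range(1, n):
        x[t] = rng.random() < (p11 if x[t - 1] == 1 else p01)
    return x

def bernoulli_cusum(x, p0, p1, h):
    """Simplified CUSUM of Bernoulli log-likelihood ratios (p0 -> p1),
    ignoring the Markov dependence in the stream."""
    llr = np.where(x == 1, np.log(p1 / p0), np.log((1 - p1) / (1 - p0)))
    s = 0.0
    for t, inc in enumerate(llr):
        s = max(0.0, s + inc)
        if s > h:
            return t                  # first alarm time
    return None

x = markov_binary(5000, p=0.05, rho=0.3)
print(bernoulli_cusum(x, p0=0.05, p1=0.10, h=3.0))
```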
103

Applying Goodness-Of-Fit Techniques In Testing Time Series Gaussianity And Linearity

Jahan, Nusrat 05 August 2006 (has links)
In this study, we present two new frequency-domain tests for testing the Gaussianity and linearity of a sixth-order stationary univariate time series. Both are two-stage tests. The first stage is a test for the Gaussianity of the series. Under Gaussianity, the estimated normalized bispectrum has an asymptotic chi-square distribution with two degrees of freedom. If Gaussianity is rejected, the test proceeds to the second stage, which tests for linearity. Under linearity with non-Gaussian errors, the estimated normalized bispectrum has an asymptotic non-central chi-square distribution with two degrees of freedom and a constant noncentrality parameter; if the process is nonlinear, the noncentrality parameter is nonconstant. At each stage, empirical distribution function (EDF) goodness-of-fit (GOF) techniques are applied to the estimated normalized bispectrum by comparing the empirical CDF with the appropriate null asymptotic distribution. The two specific methods investigated are the Anderson-Darling and Cramér-von Mises tests. Under Gaussianity, the distribution is completely specified and application is straightforward. However, if Gaussianity is rejected, the proposed application of the EDF tests involves a transformation to normality. The performance of the tests, and a comparison of the EDF tests with existing time- and frequency-domain tests, is investigated under a variety of circumstances through simulation. For illustration, the tests are applied to a number of data sets popular in the time series literature.
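As an illustration of the EDF goodness-of-fit idea in the first stage, the following assumes SciPy's Cramér-von Mises routine and synthetic chi-square(2) values standing in for the estimated normalized bispectrum; it is not the authors' implementation.

```python
import numpy as np
from scipy import stats

# Hypothetical stand-in for the estimated normalized bispectrum values; under
# Gaussianity these are asymptotically chi-square with two degrees of freedom.
rng = np.random.default_rng(1)
bispec_stats = rng.chisquare(df=2, size=200)

# Stage 1: Cramér-von Mises EDF test against the chi-square(2) null.
cvm = stats.cramervonmises(bispec_stats, "chi2", args=(2,))
print(cvm.statistic, cvm.pvalue)

# The abstract mentions a transformation to normality when the null is not
# fully specified; one common route is the probability integral transform
# followed by the inverse normal CDF, after which normality EDF tests apply.
z = stats.norm.ppf(stats.chi2.cdf(bispec_stats, df=2).clip(1e-12, 1 - 1e-12))
print(stats.anderson(z, dist="norm").statistic)
```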
104

Intrångsdetektering på CAN bus data : En studie för likvärdig jämförelse av metoder / Intrusion detection on CAN bus data: a study for a like-for-like comparison of methods

Hedman, Pontus, Skepetzis, Vasilios January 2020 (has links)
Hacker attacks carried out on modern vehicles highlight a need for fast threat detection in this environment, especially given the industry trend of modern vehicles effectively becoming IoT devices. There are known attacks, some carried out remotely, in which a perpetrator can stop a vehicle in operation or disable its brakes. This study examines the detection of attacks carried out on a real car by studying CAN bus messages. Two models, CUSUM from the field of change point detection and Random Forests from the field of machine learning, are applied to real data and then compared against each other on simulated data. A new hypothesis definition is introduced which allows the evaluation method Conditional Expected Delay to be used for Random Forests, so that the results can be compared with the evaluation results from CUSUM; Conditional Expected Delay has not previously been studied for a machine learning method. Both methods are also evaluated with ROC curves. Altogether, the two methods can be compared with each other using each other's established evaluation methods. The study thus provides a method and hypothesis for bridging the two fields of change point detection and machine learning, so that the two can be evaluated under jointly motivated parameter values.
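A toy sketch of the comparison idea follows: a mean-shift surrogate stands in for the CAN signal, a CUSUM's conditional expected delay is estimated as the average detection lag among runs that alarm after the true change point, and a scikit-learn Random Forest trained on labelled windows provides a detection delay to set beside it. All data, window sizes and thresholds are hypothetical, not the study's pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)

def stream(n=400, change=200, shift=1.0):
    """Toy CAN-signal surrogate: unit-variance noise whose mean shifts at `change`."""
    x = rng.normal(0, 1, n)
    x[change:] += shift
    return x

def cusum_delay(x, change, k=0.5, h=5.0):
    """Detection lag of a one-sided CUSUM, counted from the true change point."""
    s = 0.0
    for t, xi in enumerate(x):
        s = max(0.0, s + xi - k)
        if s > h and t >= change:
            return t - change
    return None

def windows(x, w=20):
    return np.array([x[i:i + w] for i in range(len(x) - w)])

# Random Forest trained offline on labelled windows (pre- vs post-attack).
X_train = windows(stream())
y_train = (np.arange(len(X_train)) + 20 > 200).astype(int)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# Conditional expected delay for the CUSUM: average lag over runs that alarm.
delays = [d for d in (cusum_delay(stream(), 200) for _ in range(50)) if d is not None]
print("CUSUM conditional expected delay ~", np.mean(delays))

# Detection delay of the Random Forest on a fresh stream.
pred = clf.predict(windows(stream()))
print("Random Forest delay ~", int(np.argmax(pred[200:] == 1)))
```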
105

Monitoring energy performance in local authority buildings

Stuart, Graeme January 2011 (has links)
Energy management has been an important function of organisations since the oil crisis of the mid-1970s led to hugely increased energy costs. Although the financial costs of energy are still important, the growing recognition of the environmental costs of fossil-fuel energy is becoming more important. Legislation is also a key driver. The UK has set an ambitious greenhouse gas (GHG) reduction target of 80% of 1990 levels by 2050 in response to a strong international commitment to reduce GHG emissions globally. This work is concerned with the management of energy consumption in buildings through the analysis of energy consumption data. Buildings are a key source of emissions, with a wide range of energy-consuming equipment, such as photocopiers, refrigerators, boilers, air-conditioning plant and lighting, delivering services to the building occupants. Energy wastage can be identified through an understanding of consumption patterns and, in particular, of changes in these patterns over time. Changes in consumption patterns may have any number of causes: a fault in heating controls, a boiler or lighting replacement scheme, or a change in working practice entirely unrelated to energy management. Standard data analysis techniques such as degree-day modelling and CUSUM provide a means to measure and monitor consumption patterns. These techniques were designed for use with monthly billing data, whereas modern energy metering systems automatically generate data at half-hourly or better resolution; standard techniques are not designed to capture the detailed information contained in this comparatively high-resolution data, and the introduction of automated metering also introduces the need for automated analysis. This work assumes that consumption patterns are generally consistent in the short term but will inevitably change, and that understanding these changes is critical to energy management. A novel statistical method is developed which builds automated event detection into a consumption modelling algorithm. Leicester City Council has provided half-hourly data from over 300 buildings covering up to seven years of consumption (a total of nearly 50 million meter readings). Automatic event detection pinpoints and quantifies over 5,000 statistically significant events in the Leicester dataset, and it is shown that the total impact of these events is a decrease in overall consumption. Viewing consumption patterns in this way allows for a new, event-oriented approach to energy management in which large datasets are automatically and rapidly analysed to produce summary meta-data describing their salient features. These event-oriented meta-data can be used to navigate the raw data event by event and are highly complementary to strategic energy management.
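The standard techniques mentioned above can be sketched briefly (a simplified illustration, not the thesis's event-detection algorithm): regress consumption on heating degree days, then track the cumulative sum of the residuals, whose persistent drift away from zero signals a change in the consumption pattern. The monthly figures below are invented.

```python
import numpy as np

# Hypothetical monthly data: heating degree days and metered energy use (kWh).
hdd = np.array([320, 280, 240, 150,  80,  20,  10,  15,  60, 160, 250, 310], float)
kwh = np.array([9800, 8900, 8100, 5900, 4100, 2600, 2400, 2500, 3700, 6200, 8300, 9600], float)

# Degree-day model: kWh = base load + slope * HDD (ordinary least squares).
slope, base = np.polyfit(hdd, kwh, 1)
expected = base + slope * hdd

# CUSUM of residuals: a persistent drift away from zero suggests a change
# in the consumption pattern (e.g. a boiler fault or a lighting retrofit).
residuals = kwh - expected
cusum = np.cumsum(residuals)
print(np.round(cusum, 1))
```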
106

股價指數報酬率厚尾程度之研究 / A study of the degree of heavy-tailedness of stock index returns

李佳晏 Unknown Date (has links)
Many observed time series exhibit leptokurtic (peaked, heavy-tailed) behaviour. Adopting the assumption that the series follow a Paretian distribution, this thesis estimates the maximal moment exponent of stock index returns for several countries, at different data frequencies, in order to gauge the degree of heavy-tailedness. The empirical results show that moments of order four and above mostly exist for each country's index returns at every frequency, and that their behaviour does not change with the sampling frequency; one can therefore infer that outlier activity in the historical distributions of these index returns is not severe. Next, the Sample Split Prediction Test is used to test whether, within a given sample period, the left and right tails of each country's index returns are equally heavy, and whether the heaviness of the left (or right) tail is stable across periods. Within a single sample period, the left and right tails are found to be roughly equally heavy; across periods, however, the heaviness of the left (or right) tail differs significantly around events such as the October 1987 US stock market crash, the 1990-1991 Gulf War, and the 1997 Asian financial crisis. Finally, the Cusum of Squares test is applied to test whether the unconditional variance of a time series is constant over the observed sample period. The results show that the unconditional variance of each country's index returns is not constant. Combining the Cusum of Squares plots with the cross-period Sample Split Prediction Test results, one can conclude that when a long time series may contain structural changes, these two tests can indicate the time points at which such changes may have occurred.
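For illustration (a simplified sketch, assuming the Inclán-Tiao centred form of the statistic rather than the thesis's exact formulation), the Cusum of Squares compares the cumulative sum of squared returns with the fraction of the sample accumulated so far:

```python
import numpy as np

def cusum_of_squares(returns):
    """Centred cumulative sum of squares, in the spirit of Inclan-Tiao:
    D_k = sum_{t<=k} r_t^2 / sum_t r_t^2 - k/T.  Large |D_k| suggests a
    change in the unconditional variance around observation k."""
    r2 = np.asarray(returns, float) ** 2
    T = len(r2)
    return np.cumsum(r2) / r2.sum() - np.arange(1, T + 1) / T

# Toy return series whose variance doubles halfway through.
rng = np.random.default_rng(3)
r = np.concatenate([rng.normal(0, 1, 500), rng.normal(0, 2, 500)])
d = cusum_of_squares(r)
print(int(np.argmax(np.abs(d))))   # location of the largest deviation
```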
107

Three Essays on Estimation and Testing of Nonparametric Models

Ma, Guangyi 2012 August 1900 (has links)
In this dissertation, I focus on the development and application of nonparametric methods in econometrics. First, a constrained nonparametric regression method is developed to estimate a function and its derivatives subject to shape restrictions implied by economic theory. The constrained estimators can be viewed as a set of empirical-likelihood-based reweighted local polynomial estimators. They are shown to be weakly consistent and to have the same first-order asymptotic distribution as the unconstrained estimators. When the shape restrictions are correctly specified, the constrained estimators can achieve a large degree of finite-sample bias reduction and thus outperform the unconstrained estimators. The constrained nonparametric regression method is applied to the estimation of the daily option pricing function and the state-price density function. Second, a modified Cumulative Sum of Squares (CUSQ) test is proposed to test for structural changes in the unconditional volatility in a time-varying coefficient model. The proposed test is based on nonparametric residuals from local linear estimation of the time-varying coefficients. Asymptotic theory is provided to show that the new CUSQ test has a standard null distribution and diverges at the standard rate under the alternatives. Compared with a test based on least squares residuals, the new test enjoys correct size and good power properties, because estimating the model nonparametrically circumvents the size distortion caused by potential structural changes in the mean. Empirical results from both simulation experiments and real data applications are presented to demonstrate the test's size and power properties. Third, an empirical test of the Purchasing Power Parity (PPP) hypothesis is conducted in a functional-coefficient cointegration model, which is consistent with equilibrium models of exchange rate determination in the presence of transaction costs in international trade. Supporting evidence for PPP is found in the recent floating exchange rate era. The cointegration relation between the nominal exchange rate and price levels varies conditional on real exchange rate volatility; the cointegration coefficients are more stable and numerically close to the value implied by PPP theory when real exchange rate volatility is relatively low.
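A rough sketch of the first step behind the second essay's test, assuming a simple time-varying-coefficient model y_t = beta(t/T)·x_t + e_t and an Epanechnikov kernel (not the paper's implementation): estimate the coefficient path by local linear kernel-weighted least squares and keep the nonparametric residuals that would feed the CUSQ statistic.

```python
import numpy as np

def local_linear_beta(y, x, h=0.1):
    """Local linear estimate of beta(u) in y_t = beta(t/T) * x_t + e_t,
    using Epanechnikov-kernel-weighted least squares in rescaled time u = t/T."""
    T = len(y)
    u = np.arange(1, T + 1) / T
    beta_hat = np.empty(T)
    for i, u0 in enumerate(u):
        z = (u - u0) / h
        w = np.where(np.abs(z) <= 1, 0.75 * (1 - z**2), 0.0)   # Epanechnikov kernel
        X = np.column_stack([x, x * (u - u0)])                 # local linear regressors
        sw = np.sqrt(w)
        coef, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)
        beta_hat[i] = coef[0]
    return beta_hat

rng = np.random.default_rng(4)
T = 300
x = rng.normal(size=T)
beta = 1.0 + 0.5 * np.sin(2 * np.pi * np.arange(T) / T)   # smoothly varying coefficient
y = beta * x + rng.normal(scale=0.5, size=T)

bhat = local_linear_beta(y, x)
resid = y - bhat * x            # nonparametric residuals that would feed the CUSQ test
print(np.round(resid[:5], 3))
```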
108

Gráficos de controle para o monitoramento de processos autorregressivo de valores inteiros com inflação ou deflação de zeros / Control charts for monitoring of autoregressive processes of integer values with inflation or deflation of zeros

Fernandes, Fidel Henrique 20 February 2018 (has links)
A time series is a collection of observations measured sequentially over time; together with statistical process control, time series have received considerable attention in recent years. This work studies the performance of CUSUM and Shewhart control charts in detecting shifts in the process mean under zero inflation or zero deflation, using the first-order zero-modified geometric integer-valued autoregressive model [ZMGINAR(1)]. Sensitivity is assessed through simulation of the average run length (the average number of samples until the control limit is exceeded). In addition, new estimators are proposed, and their behaviour is examined in a Monte Carlo study through mean squared error (MSE) and bias across different scenarios; the proposed estimators proved more effective. In the simulations covering zero-inflated and zero-deflated scenarios, the CUSUM chart was more efficient under zero deflation, while the Shewhart chart was more efficient under zero inflation in certain cases. Two applications were considered, one with zero inflation and the other with zero deflation; as in the simulations, the CUSUM chart performed better under zero deflation and the Shewhart chart under zero inflation. The main contribution of this work is the explicit modelling of zero deflation in control charts; moreover, unlike other models, the model used has a known marginal distribution, which is an advantage for implementation and for constructing new estimators. The parameters are estimated by several methods: maximum likelihood, Yule-Walker, and a probability-based estimator.
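As a loose illustration only, the sketch below draws i.i.d. zero-modified geometric counts, ignoring the INAR(1) dependence of the ZMGINAR(1) model, and shows how a 3-sigma Shewhart limit and a simple count CUSUM would react to an upward shift in the mean; all parameter values are hypothetical.

```python
import numpy as np

def zero_modified_geometric(n, pi0, p, rng):
    """P(X=0) = pi0; conditional on X>0, X is zero-truncated geometric with
    parameter p (a simplifying stand-in for the ZMGINAR(1) marginal)."""
    nonzero = rng.geometric(p, size=n)                 # support {1, 2, ...}
    return np.where(rng.random(n) < pi0, 0, nonzero)

rng = np.random.default_rng(5)
in_control  = zero_modified_geometric(300, pi0=0.4, p=0.40, rng=rng)
out_control = zero_modified_geometric(300, pi0=0.4, p=0.25, rng=rng)   # higher mean
x = np.concatenate([in_control, out_control])

mu0, sd0 = in_control.mean(), in_control.std()

# Shewhart-type individual chart: flag observations above a 3-sigma limit.
shewhart_alarms = np.where(x > mu0 + 3 * sd0)[0]

# Simple count CUSUM with reference value 0.5*sd0 and threshold 5*sd0.
s, cusum_alarm = 0.0, None
for t, xt in enumerate(x):
    s = max(0.0, s + (xt - mu0) - 0.5 * sd0)
    if s > 5 * sd0 and cusum_alarm is None:
        cusum_alarm = t
print(shewhart_alarms[:5], cusum_alarm)
```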
109

Generation and Detection of Adversarial Attacks for Reinforcement Learning Policies

Drotz, Axel, Hector, Markus January 2021 (has links)
In this project we investigate the susceptibility of reinforcement learning (RL) algorithms to adversarial attacks. Adversarial attacks have been proven to be very effective at reducing the performance of deep learning classifiers and have recently also been shown to reduce the performance of RL agents. The goal of this project is to evaluate adversarial attacks on agents trained using deep reinforcement learning (DRL), as well as to investigate how to detect these types of attacks. We first use DRL to solve two environments from OpenAI's Gym module, namely CartPole and LunarLander, using DQN and DDPG. We then evaluate the performance of the attacks, and finally we train neural networks to detect attacks. The attacks were successful at reducing performance in both the LunarLander and CartPole environments. The attack detector was very successful at detecting attacks in the CartPole environment, but did not perform quite as well on LunarLander. We hypothesize that environments with continuous action spaces may pose a greater difficulty for attack detectors when identifying potential adversarial attacks. / Bachelor's degree project in electrical engineering 2021, KTH, Stockholm
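A toy NumPy sketch of the attack idea, a fast-gradient-sign perturbation of the observation for a hypothetical linear softmax policy; this is purely illustrative and not the project's DQN/DDPG setup.

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Hypothetical linear policy: action logits = W @ observation.
rng = np.random.default_rng(6)
W = rng.normal(size=(2, 4))          # 2 actions, 4-dimensional observation
obs = rng.normal(size=4)

probs = softmax(W @ obs)
a = int(np.argmax(probs))            # action the clean policy would take

# FGSM-style perturbation: push the observation in the direction that
# lowers log pi(a | obs), i.e. against the gradient of the chosen action.
grad_logits = -probs.copy()
grad_logits[a] += 1.0                 # d log pi(a) / d logits = one_hot(a) - probs
grad_obs = W.T @ grad_logits          # chain rule back to the observation
epsilon = 0.1
adv_obs = obs - epsilon * np.sign(grad_obs)

print(a, int(np.argmax(softmax(W @ adv_obs))))   # clean vs perturbed action choice
```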
110

Estimation of Ship Properties for Energy Efficient Automation

Nilsson, Lucas January 2016 (has links)
One method to increase the efficiency, robustness and accuracy of automatic control is to introduce mathematical models of the system in question. With these models it is possible to predict the behavior of the system, which enables control based on the predictions. The problem is that if the models do not describe the dynamics of the system well enough, this approach can fail to improve performance. To address this problem, one idea is to estimate the dynamics of the system during operation, using methods from system identification, signal processing and sensor fusion. In this thesis, the possibility of estimating a ship's dynamics during operation has been investigated. The mathematical model describing the dynamics of the ship is a grey-box model based on physical and mechanical relations. The model's properties are therefore described by physical quantities such as mass, moment of inertia and center of gravity, all of which are unknown; estimating the model therefore means estimating these physical properties. For a systematic approach, a simulation environment with a 4-degrees-of-freedom ship model was first developed and used for validation of system identification methods. A model of a podded propulsion system has also been derived and validated. The methods for estimating the properties of the ship were analyzed using data collected from the simulations. For system identification and estimation of ship properties, the influence of measurement noise and the potential for detecting a change in dynamics were analyzed. This was done through Monte Carlo simulations of the estimation method with different noise realizations, to analyze how the measurement noise affects the variance and bias of the estimates. The results show that variance and bias vary considerably between the parameters and that even a small change in dynamics is visible in some parameter estimates when only ten minutes of data have been used. A method based on cumulative summation (CUSUM) of residuals has been proposed and validated, to analyze whether such a method could yield fast and effective detection of system deviations. The results show that the method is rather effective, with robust detection of changes in the dynamics after about four minutes of data collection. Finally, the methods have been validated on data collected on a real ship to analyze their potential under actual circumstances. The results show that this particular data set is not suitable for this kind of application, and several additional problems that can impair the results are identified.
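A compact sketch of the idea of estimating system parameters during operation: recursive least squares with a forgetting factor on a hypothetical first-order model y_t = a·y_{t-1} + b·u_t + e_t, in which a change in the dynamics shows up as a drift in the parameter estimates. This is not the thesis's 4-degrees-of-freedom ship model.

```python
import numpy as np

def rls(y, u, lam=0.99):
    """Recursive least squares for y_t = a*y_{t-1} + b*u_t + e_t with
    forgetting factor lam; returns the trajectory of the estimates [a_t, b_t]."""
    theta = np.zeros(2)
    P = np.eye(2) * 100.0
    history = []
    for t in range(1, len(y)):
        phi = np.array([y[t - 1], u[t]])
        k = P @ phi / (lam + phi @ P @ phi)
        theta = theta + k * (y[t] - phi @ theta)
        P = (P - np.outer(k, phi) @ P) / lam
        history.append(theta.copy())
    return np.array(history)

# Simulated data whose 'a' parameter changes halfway through the record.
rng = np.random.default_rng(7)
n = 600
u = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    a = 0.8 if t < n // 2 else 0.6
    y[t] = a * y[t - 1] + 0.5 * u[t] + 0.05 * rng.normal()

traj = rls(y, u)
print(np.round(traj[250], 3), np.round(traj[-1], 3))   # estimates before/after the change
```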
