61

Représentations pour la détection d’anomalies : Application aux données vibratoires des moteurs d’avions / Representations for anomaly detection: Application to aircraft engines’ vibration data

Abdel Sayed, Mina 03 July 2018 (has links)
Vibration measurements are among the most relevant data for detecting anomalies in engines. Vibrations are recorded on a test bench during acceleration and deceleration phases to ensure the reliability of every engine at the end of the production line. These temporal signals are converted into spectrograms so that experts can analyse the data visually and detect any unusual signature; vibratory sources correspond to lines on the spectrograms. In this thesis, we developed a decision-support tool that automatically analyses spectrograms and detects any type of unusual signature, which does not necessarily originate from damage to the engine. First, we built a database of spectrograms with annotated zones. It is important to note that data containing unusual signatures are scarce and that these signatures vary considerably in shape, intensity and position. Consequently, to detect them, we characterise the normal behaviour of the spectrograms, in the spirit of novelty detection, by representing spectrogram patches on dictionaries such as curvelets and non-negative matrix factorization (NMF), and by estimating the distribution of every point of the spectrogram from normal data, with or without conditioning on its neighbourhood. Atypical points are detected by comparing test data to the normality model estimated on normal training data; the unusual signatures composed of these points can then be identified.
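
As a rough illustration of the patch-dictionary approach this abstract describes, the following Python sketch scores spectrogram patches by their reconstruction error under an NMF dictionary learned on normal data. The synthetic spectrogram, patch size, component count, and 3-sigma threshold are illustrative assumptions, not the thesis's settings:

```python
import numpy as np
from sklearn.decomposition import NMF

def extract_patches(spec, size=8):
    """Cut a non-negative (freq x time) spectrogram into flattened patches."""
    F, T = spec.shape
    return np.array([spec[i:i + size, j:j + size].ravel()
                     for i in range(0, F - size + 1, size)
                     for j in range(0, T - size + 1, size)])

rng = np.random.default_rng(0)
normal = rng.gamma(2.0, 1.0, size=(64, 256))   # stand-in for a healthy spectrogram
test = normal.copy()
test[20:28, 100:140] += 15.0                   # injected atypical signature

# Learn a dictionary of normal patch behaviour, then score patches by how
# poorly that dictionary reconstructs them: atypical patches leave a large
# residual because no combination of normal atoms explains them.
train = extract_patches(normal)
model = NMF(n_components=20, init="nndsvda", max_iter=500)
train_res = np.linalg.norm(train - model.fit_transform(train) @ model.components_, axis=1)

test_p = extract_patches(test)
test_res = np.linalg.norm(test_p - model.transform(test_p) @ model.components_, axis=1)
flagged = test_res > train_res.mean() + 3 * train_res.std()   # assumed 3-sigma rule
print(f"{flagged.sum()} of {flagged.size} patches flagged")
```

The same residual scoring carries over to any other dictionary (e.g. curvelets); only the decomposition step changes.
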
62

Neural membrane mutual coupling characterisation using entropy-based iterative learning identification

Tang, X., Zhang, Qichun, Dai, X., Zou, Y. 17 November 2020 (has links)
Yes / This paper investigates the interaction phenomena of coupled axons, where the mutual coupling factor is presented as a pairwise description. Based on the Hodgkin-Huxley model and the coupling factor matrix, the membrane potentials of the coupled myelinated/unmyelinated axons are quantified, which implies that the neural coupling can be characterised by the presented coupling factor. An equivalent electric circuit is also supplied to illustrate the physical meaning of this extended model. To estimate the coupling factor, a data-based iterative learning identification algorithm is presented in which the Rényi entropy of the estimation error is minimised. The convergence of the presented algorithm is analysed and the learning rate is designed. Numerical simulation results indicate the correctness and effectiveness of the presented model and algorithm. Furthermore, the statistical description of the neural coupling, the approximation using ordinary differential equations, and the measurement and conduction of nerve signals are discussed as advanced topics. The novelties can be summarised as follows: 1) the Hodgkin-Huxley model has been extended to consider the mutual interaction between neural axon membranes, 2) an iterative learning approach has been developed for factor identification using an entropy criterion, and 3) a theoretical framework has been established for this class of system identification problems with convergence analysis. / This work was supported in part by the National Natural Science Foundation of China (NSFC) under Grant 51807010, and in part by the Natural Science Foundation of Hunan under Grant 1541 and Grant 1734. / Research Development Fund Publication Prize Award winner, Nov 2020.
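
The entropy-minimising identification loop can be sketched on a deliberately simplified model. The Python snippet below fits a single coupling-style parameter of a hypothetical linear toy system by descending a Parzen (kernel) estimate of the Rényi quadratic entropy of the estimation error; the paper itself works with the extended Hodgkin-Huxley dynamics and a designed learning rate, neither of which is reproduced here:

```python
import numpy as np

def renyi_quadratic_entropy(e, sigma=0.5):
    """Parzen estimate of Renyi's quadratic entropy H2 = -log V(e), where the
    information potential V is a double sum of Gaussian kernels over error pairs."""
    d = e[:, None] - e[None, :]
    V = np.mean(np.exp(-d**2 / (4 * sigma**2))) / np.sqrt(4 * np.pi * sigma**2)
    return -np.log(V)

# Hypothetical toy system: output depends linearly on the unknown coupling
# factor theta (a crude stand-in for the coupled-membrane response).
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 200)
theta_true = 0.7
y = theta_true * x + 0.05 * rng.standard_normal(200)

theta, lr, eps = 0.0, 0.5, 1e-4
for k in range(100):
    # numerical gradient of the error entropy w.r.t. theta; the error
    # distribution is tightest (lowest entropy) at the true factor
    g = (renyi_quadratic_entropy(y - (theta + eps) * x)
         - renyi_quadratic_entropy(y - (theta - eps) * x)) / (2 * eps)
    theta -= lr * g
print(f"estimated coupling factor: {theta:.3f} (true {theta_true})")
```
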
63

Dynamic Myocardial SPECT Imaging Using Single-Pinhole Collimator Detectors: Distance-Driven Forward and Back-Projection, and KDE-Based Image Reconstruction Methods

Ihsani, Alvin January 2015 (has links)
SPECT (Single Photon Emission Computed Tomography) is the modality of choice for myocardial perfusion imaging due to its high sensitivity and specificity and the lower cost of equipment and radiotracers compared to PET. Dynamic SPECT imaging provides new possibilities for myocardial perfusion imaging by encoding more information in the reconstructed images in the form of time-activity functions. The recent introduction of small solid-state SPECT cameras using multiple pinhole collimators, such as the GE Discovery NM 530c, offers the ability to obtain accurate myocardial perfusion information with markedly decreased acquisition times and opens the possibility of obtaining quantitative dynamic perfusion information. This research targets two aspects of dynamic SPECT imaging with the intent of contributing to the improvement of projection and reconstruction methods. First, we propose an adaptation of distance-driven projection to SPECT imaging systems using single-pinhole collimator detectors. The proposed distance-driven projection approach accounts for the finite size of the pinhole, the possibly coarse discretization of the detector and object spaces, and the tilt of the detector surface. We evaluate the projection method in terms of resolution and signal-to-noise ratio (SNR). We also propose two maximum a posteriori (MAP) iterative image reconstruction methods employing kernel density estimators. The proposed reconstruction methods cluster time-activity functions (or intensity values) by their spatial proximity and similarity, determined by spatial and range scaling parameters respectively. The results of our experiments support our belief that the proposed reconstruction methods are especially effective when performing reconstructions from low-count measurements. / Thesis / Doctor of Philosophy (PhD)
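
The "spatial proximity and similarity" clustering behind the proposed priors can be illustrated with a bilateral, mean-shift-style update in which separate spatial and range scales weight the neighbours. This is an analogy under assumed parameters, not the thesis's MAP objective or its SPECT system model:

```python
import numpy as np

def kde_cluster_step(img, h_s=2.0, h_r=0.2):
    """One mean-shift-style update: each pixel moves toward the kernel-weighted
    average of its neighbourhood, with spatial closeness scaled by h_s and
    intensity similarity by h_r (the 'spatial' and 'range' scales)."""
    H, W = img.shape
    out = np.empty_like(img)
    r = int(3 * h_s)
    for i in range(H):
        for j in range(W):
            i0, i1 = max(0, i - r), min(H, i + r + 1)
            j0, j1 = max(0, j - r), min(W, j + r + 1)
            patch = img[i0:i1, j0:j1]
            yy, xx = np.mgrid[i0:i1, j0:j1]
            w = (np.exp(-((yy - i)**2 + (xx - j)**2) / (2 * h_s**2))
                 * np.exp(-(patch - img[i, j])**2 / (2 * h_r**2)))
            out[i, j] = np.sum(w * patch) / np.sum(w)
    return out

# Denoise a noisy disc phantom while keeping its edge: only neighbours with
# similar intensity contribute, so the boundary is not blurred across.
rng = np.random.default_rng(2)
gy, gx = np.mgrid[0:64, 0:64]
phantom = (((gy - 32)**2 + (gx - 32)**2) < 15**2).astype(float)
noisy = phantom + 0.15 * rng.standard_normal(phantom.shape)
smoothed = kde_cluster_step(noisy)
```
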
64

Putting the Wild Back into Wilderness: GIS Analysis of the Daniel Boone National Forest for Potential Red Wolf Reintroduction

Jacobs, Teri A. January 2009 (has links)
No description available.
65

Validation and Inferential Methods for Distributional Form and Shape

Mayorov, Kirill January 2017 (has links)
This thesis investigates some problems related to the form and shape of statistical distributions with the main focus on goodness of fit and bump hunting. A bump is a distinctive characteristic of distributional shape. A search for bumps, or bump hunting, in a probability density function (PDF) has long been an important topic in statistical research. We introduce a new definition of a bump which relies on the notion of the curvature of a planar curve. We then propose a new method for bump hunting which is based on a kernel density estimator of the unknown PDF. The method gives not only the number of bumps but also the location of their centers and base points. In quantitative risk applications, the selection of distributions that properly capture upper tail behavior is essential for accurate modeling. We study tests of distributional form, or goodness-of-fit (GoF) tests, that assess simple hypotheses, i.e., when the parameters of the hypothesized distribution are completely specified. From theoretical and practical perspectives, we analyze the limiting properties of a family of weighted Cramér-von Mises GoF statistics W² with weight function ψ(t) = 1/(1-t)^β (for β ≤ 2), which focus on the upper tail. We demonstrate that W² has no limiting distribution. For this reason, we provide a normalization of W² that leads to a non-degenerate limiting distribution. Further, we study W² for composite hypotheses, i.e., when distributional parameters must be estimated from a sample at hand. When the hypothesized distribution is heavy-tailed, we examine the finite sample properties of W² under the Chen-Balakrishnan transformation that reduces the original GoF test (the direct test) to a test for normality (the indirect test). In particular, we compare the statistical level and power of the pairs of direct and indirect tests. We observe that decisions made by the direct and indirect tests agree well, and in many cases they become independent as sample size grows. / Thesis / Doctor of Philosophy (PhD)
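
For the simple-hypothesis case, the weighted statistic can be computed directly from probability-integral-transformed data. A minimal numerical sketch follows (midpoint-rule integration; the normalization the thesis derives for the degenerate limit is omitted, and the grid size and β value are arbitrary choices for illustration):

```python
import numpy as np

def weighted_cvm(u, beta=1.0, grid=100_000):
    """Weighted Cramer-von Mises statistic
       W^2 = n * integral_0^1 (F_n(t) - t)^2 * (1 - t)^(-beta) dt
    for PIT-transformed data u, which is uniform under the simple null."""
    u = np.sort(u)
    n = u.size
    t = (np.arange(grid) + 0.5) / grid            # midpoint rule on (0, 1)
    Fn = np.searchsorted(u, t, side="right") / n  # empirical CDF at t
    return n * np.mean((Fn - t)**2 * (1 - t)**(-beta))

# Example: test an exponential(1) null. Under H0, u = F(x) = 1 - exp(-x) is
# uniform on (0, 1); the upper-tail weight (beta toward 2) emphasises large x.
rng = np.random.default_rng(3)
x = rng.exponential(1.0, size=500)
u = 1.0 - np.exp(-x)
print(weighted_cvm(u, beta=1.5))
```
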
66

Nonparametric estimation of the off-pulse interval(s) of a pulsar light curve / Willem Daniël Schutte

Schutte, Willem Daniël January 2014 (has links)
The main objective of this thesis is the development of a nonparametric sequential estimation technique for the off-pulse interval(s) of a source function originating from a pulsar. It is important to identify the off-pulse interval of each pulsar accurately, since the properties of the off-pulse emissions are further researched by astrophysicists in an attempt to detect potential emissions from the associated pulsar wind nebula (PWN). The identification technique currently used in the literature is subjective in nature, since it is based on the visual inspection of the histogram estimate of the pulsar light curve. The developed nonparametric estimation technique is not only objective in nature, but also accurate in the estimation of the off-pulse interval of a pulsar, as evident from the simulation study and the application of the developed technique to observed pulsar data. The first two chapters of this thesis are devoted to a literature study that provides background information on the pulsar environment and gamma-ray astronomy, together with an explanation of the on-pulse and off-pulse interval of a pulsar and the importance thereof for the present study. This is followed by a discussion on some fundamental circular statistical ideas, as well as an overview of kernel density estimation techniques. These two statistical topics are then united in order to illustrate kernel density estimation techniques applied to circular data, since this concept is the starting point of the developed nonparametric sequential estimation technique. Once the basic theoretical background of the pulsar environment and circular kernel density estimation has been established, the new sequential off-pulse interval estimator is formulated. The estimation technique will be referred to as 'SOPIE'. A number of tuning parameters form part of SOPIE, and therefore the performed simulation study not only serves as an evaluation of the performance of SOPIE, but also as a mechanism to establish which tuning parameter configurations consistently perform better than some other configurations. In conclusion, the optimal parameter configurations are utilised in the application of SOPIE to pulsar data. For several pulsars, the sequential off-pulse interval estimators are compared to the off-pulse intervals published in research papers, which were identified with the subjective "eye-ball" technique. It is found that the sequential off-pulse interval estimators are closely related to the off-pulse intervals identified with subjective visual inspection, with the benefit that the estimated intervals are objectively obtained with a nonparametric estimation technique. / PhD (Statistics), North-West University, Potchefstroom Campus, 2014
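
The starting point of SOPIE, a circular kernel density estimate of the light curve, can be sketched as follows. The von Mises kernel concentration, the toy sample, and the crude below-uniform rule for the off-pulse region are illustrative stand-ins; the sequential procedure itself replaces that rule with formal tests:

```python
import numpy as np
from scipy.stats import vonmises

def circular_kde(theta_grid, phases, kappa=50.0):
    """Von Mises kernel density estimate on the circle: average a wrapped
    kernel centred at each observed pulse phase (phases in [0, 2*pi))."""
    return np.mean([vonmises.pdf(theta_grid, kappa, loc=p) for p in phases], axis=0)

# Toy light curve: an on-pulse peak plus a uniform background.
rng = np.random.default_rng(4)
on = rng.vonmises(mu=np.pi, kappa=8.0, size=700) % (2 * np.pi)
bg = rng.uniform(0, 2 * np.pi, size=300)
phases = np.concatenate([on, bg])

grid = np.linspace(0, 2 * np.pi, 512, endpoint=False)
dens = circular_kde(grid, phases)
# Crude off-pulse guess: phases where the estimated density falls below the
# uniform level 1/(2*pi); SOPIE refines this sequentially.
off = dens < 1 / (2 * np.pi)
```
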
67

A multi-wavelength study of a sample of galaxy clusters / Susan Wilson

Wilson, Susan January 2012 (has links)
In this dissertation we aim to perform a multi-wavelength analysis of galaxy clusters. We discuss various methods for clustering in order to determine physical parameters of galaxy clusters required for this type of study. A selection of galaxy clusters was chosen from 4 papers (Popesso et al. 2007b, Yoon et al. 2008, Loubser et al. 2008, Brownstein & Moffat 2006) and restricted by redshift and galactic latitude to yield a sample of 40 galaxy clusters with 0.0 < z < 0.15. Data mining using the Virtual Observatory (VO) and a literature survey provided background information about each of the galaxy clusters in our sample with respect to optical, radio and X-ray data. Using the Kaye's Mixture Model (KMM) and the Gaussian Mixture Model (GMM), we determine the most likely cluster member candidates for each source in our sample. We compare the results obtained to SIMBAD's method of hierarchy. We show that the GMM provides a very robust method to determine member candidates, but in order to ensure that the right candidates are chosen we apply a select choice of outlier tests to our sources. We determine a method based on a combination of the GMM, the QQ plot and the Rosner test that provides a robust and consistent method for determining galaxy cluster members. Comparison between the calculated physical parameters (velocity dispersion, radius, mass and temperature) and values obtained from the literature shows that the majority of our galaxy clusters agree within the 3σ range. Inconsistencies are thought to be due to dynamically active clusters that have substructure or are undergoing mergers, making galaxy member identification difficult. Six correlations between different physical parameters in the optical and X-ray wavelengths were consistent with published results. Comparing the velocity dispersion with the X-ray temperature, we found a relation of σ ∝ T^0.43 as compared to σ ∝ T^0.5 obtained from Bird et al. (1995). The X-ray luminosity-temperature and X-ray luminosity-velocity dispersion relations gave the results L_X ∝ T^2.44 and L_X ∝ σ^2.40, which lie within the uncertainty of the results given by Rozgacheva & Kuvshinova (2010). These results all suggest that our method for determining galaxy cluster members is efficient and that application to higher-redshift sources can be considered. Further studies on galaxy clusters with substructure must be performed in order to improve this method. In future work, the physical parameters obtained here will be further compared to X-ray and radio properties in order to determine a link between bent radio sources and the galaxy cluster environment. / MSc (Space Physics), North-West University, Potchefstroom Campus, 2013
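
The GMM membership step can be illustrated on a one-dimensional toy velocity sample: fit a narrow "cluster" component plus a broad "field" component and keep galaxies with a high posterior probability of belonging to the narrow one. The velocities, sample sizes, and 0.9 cut below are assumptions for illustration, not values from the dissertation:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Toy field: cluster galaxies concentrated in recession velocity, plus interlopers.
rng = np.random.default_rng(5)
members = rng.normal(45_000, 800, size=120)        # cz in km/s
interlopers = rng.uniform(30_000, 60_000, size=40)
cz = np.concatenate([members, interlopers])[:, None]

# Two-component GMM: the narrower component is taken as the cluster, and
# galaxies are assigned by posterior responsibility.
gmm = GaussianMixture(n_components=2, random_state=0).fit(cz)
cluster = np.argmin(gmm.covariances_.ravel())
prob = gmm.predict_proba(cz)[:, cluster]
candidates = cz[prob > 0.9].ravel()

# Velocity dispersion of the selected members (the dissertation follows this
# with QQ-plot and Rosner outlier checks before deriving mass and radius).
print(f"{candidates.size} members, sigma_v = {candidates.std(ddof=1):.0f} km/s")
```
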
70

Three essays on the econometric analysis of high-frequency data

Malec, Peter 27 June 2013 (has links)
In three essays, this thesis deals with the econometric analysis of financial market data sampled at intraday frequencies. Chapter 1 presents a novel approach to modeling serially dependent positive-valued variables that realize a nontrivial proportion of zero outcomes, a typical phenomenon in financial high-frequency time series. We introduce a flexible point-mass mixture distribution, a tailor-made semiparametric specification test and a new type of multiplicative error model (MEM). Chapter 2 addresses the problem that fixed symmetric kernel density estimators exhibit low precision for positive-valued variables with a large probability mass near zero, which is common in high-frequency data. We show that gamma kernel estimators are superior, while their relative performance depends on the specific density and kernel shape. We suggest a refined gamma kernel and a data-driven method for choosing the appropriate type of gamma kernel estimator. Chapter 3 turns to the debate about the merits of high-frequency data in large-scale portfolio allocation. We consider the problem of constructing global minimum variance portfolios based on the constituents of the S&P 500. We show that forecasts based on high-frequency data can yield a significantly lower portfolio volatility than approaches using daily returns, implying noticeable utility gains for a risk-averse investor.
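
The boundary issue motivating Chapter 2 can be illustrated with a gamma kernel estimator in the style of Chen (2000), whose kernels are supported on [0, ∞) so no density mass leaks below zero. The shape rule x/b + 1 and the bandwidth are one common variant, assumed here for illustration rather than the thesis's refined kernel:

```python
import numpy as np
from scipy.stats import gamma

def gamma_kde(x_grid, data, b=0.05):
    """Gamma kernel density estimate: at each evaluation point x, average
    gamma(shape = x/b + 1, scale = b) pdfs over the observations. Near the
    boundary the kernel's shape adapts instead of spilling below zero."""
    shapes = x_grid / b + 1.0
    return np.array([gamma.pdf(data, a, scale=b).mean() for a in shapes])

# Positive-valued sample with heavy mass near zero, as is typical for
# high-frequency quantities such as trading volumes or durations.
rng = np.random.default_rng(6)
data = rng.exponential(0.2, size=1000)
x = np.linspace(0, 1.5, 200)
dens = gamma_kde(x, data)
```

A fixed symmetric (e.g. Gaussian) kernel on the same sample would smear density across the boundary at zero, which is exactly the low-precision behaviour the chapter documents.
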
