81

Real time perfusion and oxygenation monitoring in an implantable optical sensor

Subramanian, Hariharan 12 April 2006 (has links)
Simultaneous blood perfusion and oxygenation monitoring is crucial for patients undergoing a transplant procedure. It becomes especially important during the surgical recovery period, when an uncorrected loss of perfusion or reduction in oxygen saturation can result in patient death. Pulse oximeters are standard monitoring devices used to obtain the perfusion level and oxygen saturation from the optical absorption properties of hemoglobin. However, under varying perfusion due to hemorrhage, blood clot, or acute blockage, the oxygenation readings from traditional pulse oximeters are erroneous because of a sudden drop in signal strength. The long-term goal of the project is to devise an implantable optical sensor that performs better than traditional pulse oximeters under changing perfusion and functions as a local warning of sudden loss of blood perfusion and oxygenation. In this work, an optical sensor based on a pulse oximeter with an additional source at an 810 nm wavelength has been developed for in situ monitoring of transplant organs. An algorithm has been designed to separate the perfusion and oxygenation signals from the composite signal obtained from the three-source pulse-oximetry-based sensor. The algorithm uses the 810 nm reference signal and an adaptive filtering routine to separate the two signals, which occur at the same frequency. The algorithm is first applied to model data, and its effectiveness is then tested on in vitro and in vivo data sets to quantify its ability to separate the signals of interest. The entire process runs in real time in conjunction with an autocorrelation-based time-domain technique. This technique uses digital filtering and autocorrelation to extract peak-height information and generate an amplitude measurement, and it has been shown to perform better than the traditional fast Fourier transform (FFT) for semi-periodic signals, such as those derived from heart monitoring. In particular, it is shown that the two approaches produce comparable results for periodic in vitro perfusion signals. However, when applied to semi-periodic simulated perfusion signals and to in vivo data generated from an optical perfusion sensor, the autocorrelation approach (standard error, SE = 0.03) clearly outperforms the FFT-based analysis (SE = 0.62).
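A minimal Python sketch of the autocorrelation-based amplitude extraction described in this abstract. The band-pass cutoffs, filter order, and peak-picking logic here are illustrative assumptions, not the thesis's actual design:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def autocorr_amplitude(x, fs, f_lo=0.5, f_hi=5.0):
    """Estimate the pulsatile amplitude and rate of a semi-periodic signal
    via band-pass filtering and autocorrelation (sketch only)."""
    b, a = butter(2, [f_lo / (fs / 2), f_hi / (fs / 2)], btype="band")
    y = filtfilt(b, a, x)                  # isolate the cardiac frequency band
    y = y - y.mean()
    r = np.correlate(y, y, mode="full")[len(y) - 1:]  # one-sided autocorrelation
    r /= r[0]                              # normalize so r[0] == 1
    # first local maximum after lag 0 marks the dominant (heart-rate) period
    peaks = np.where((r[1:-1] > r[:-2]) & (r[1:-1] > r[2:]))[0] + 1
    if len(peaks) == 0:
        raise ValueError("no periodicity found in the analysis window")
    lag = peaks[0]
    # RMS over whole periods, scaled to a peak-height estimate for a sinusoid
    amp = np.sqrt(2) * y[:lag * (len(y) // lag)].std()
    return amp, fs / lag                   # (amplitude, pulse rate in Hz)
```

Unlike an FFT bin magnitude, the autocorrelation peak is tolerant of slow period drift, which is one plausible reason the time-domain approach fares better on semi-periodic physiological signals.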
82

Software testing testbed for MPEG-4 video traffic over IEEE 802.11b wireless LANs [electronic resource] / by Praveen Chiranjeevi Ikkurthy.

Ikkurthy, Praveen Chiranjeevi. January 2003 (has links)
Title from PDF of title page. / Document formatted into pages; contains 65 pages. / Thesis (M.S.C.S.)--University of South Florida, 2003. / Includes bibliographical references. / Text (Electronic thesis) in PDF format. / ABSTRACT: Several traffic characterization studies have been performed on wireless LANs with the main objective of building good, accurate models of the errors in the wireless channel. These models have been extended to capture the effect of errors on higher-layer protocols, mainly at the data link layer. However, no prior work has studied the application-level characteristics of MPEG-4 video traffic over 802.11b wireless networks. This thesis performs a traffic characterization study of MPEG-4 video traffic over IEEE 802.11b wireless LANs with the main goal of building a tool for software testing. Using two freely available tools to send and receive real-time streams and to collect and analyze traces, MPEG-4 encoded video frames are sent over an 11 Mbps 802.11b wireless LAN to characterize the errors in the channel and their effect on the quality of the movie. The results of this traffic characterization were modeled using the ARTA (Auto-Regressive-To-Anything) software. The modeled characteristics were then used to build a tool that generates synthetic traffic emulating a real wireless network scenario. The tool emulates the error-length and error-free-length characteristics of the wireless network for MPEG-4 video traffic using the corresponding models generated by ARTA. Software developers can use the tool to test their MPEG-4 streaming media applications without needing the real infrastructure, and the tool can be retrained and extended to support testing of other streaming media applications. / System requirements: World Wide Web browser and PDF reader. / Mode of access: World Wide Web.
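A hedged Python sketch of the kind of synthetic trace generator the abstract describes. The geometric run-length distributions and the parameter names (`mean_error_len`, `mean_good_len`) are placeholders; the thesis instead draws these lengths from ARTA-fitted models that preserve the measured autocorrelation:

```python
import numpy as np

rng = np.random.default_rng(42)

def synthetic_loss_trace(n_packets, mean_error_len=3.0, mean_good_len=200.0):
    """Generate a per-packet loss trace by alternating error bursts and
    error-free runs (True = packet corrupted)."""
    trace = np.empty(n_packets, dtype=bool)
    i, in_error = 0, False
    while i < n_packets:
        mean = mean_error_len if in_error else mean_good_len
        run = rng.geometric(1.0 / mean)       # sample the next run length
        trace[i:i + run] = in_error           # numpy truncates at the end safely
        i += run
        in_error = not in_error
    return trace

# e.g. drop "corrupted" frames before feeding a stream to the decoder under test
trace = synthetic_loss_trace(100_000)
print(f"simulated loss rate: {trace.mean():.4%}")
```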
83

Violations of the basic assumptions of the linear regression model

Γρηγοριάδου, Μαρία 05 February 2015 (has links)
A statistical model is a formalization of stochastic relationships between variables in the form of mathematical equations, intended to describe a system (a phenomenon or an event) as accurately as possible. Almost every system contains variable quantities that change, and an interesting question is the study of the effects that these variables exert (or appear to exert) on others. This study is the subject of regression analysis, a widely used statistical technique for detecting and modeling relationships and dependencies between variables. When the relationships between the variables are linear, the so-called linear regression models arise. Statistical regression models rest on certain basic assumptions, which must be checked before the model is analyzed. In practice, however, these assumptions are often violated; with real-world data, violation is so frequent that it is the rule rather than the exception. This thesis addresses the important problem that arises when some of the basic assumptions governing the linear regression model are violated. Its aims are: (a) to analyze the causes of each violation and its consequences for the model; (b) to record the main methods for detecting violations in the model; (c) to find ways of dealing with these "problematic situations". The results show that combining established theoretical knowledge of the subject with modern methods and ideas can substantially reduce the adverse effects that the violations inflict on the model, while allowing a satisfactory amount of information to be salvaged.
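As a concrete illustration of the diagnostic checks such a thesis surveys, here is a short Python sketch that fits an ordinary least squares model and runs three standard assumption tests (Breusch-Pagan for heteroscedasticity, Durbin-Watson for autocorrelation, Shapiro-Wilk for residual normality). The simulated data are illustrative only:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan
from statsmodels.stats.stattools import durbin_watson
from scipy.stats import shapiro

rng = np.random.default_rng(0)
X = sm.add_constant(rng.normal(size=(200, 2)))   # design matrix with intercept
y = X @ [1.0, 2.0, -0.5] + rng.normal(size=200)  # data satisfying the assumptions

res = sm.OLS(y, X).fit()
lm_stat, lm_pval, _, _ = het_breuschpagan(res.resid, X)  # homoscedasticity check
dw = durbin_watson(res.resid)                            # close to 2 if uncorrelated
_, norm_pval = shapiro(res.resid)                        # normality of residuals
print(f"Breusch-Pagan p={lm_pval:.3f}  Durbin-Watson={dw:.2f}  Shapiro p={norm_pval:.3f}")
```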
84

Linear time series models and autocorrelation

Γαζή, Σταυρούλα 07 July 2015 (has links)
The purpose of this master's thesis is twofold: to study the simple/generalized multiple regression model when one of the Gauss-Markov conditions is violated, specifically when Cov{ε_i, ε_j} ≠ 0, ∀ i ≠ j, and to analyze time series. First, the simple and multiple linear regression models are reviewed briefly, together with the properties and estimation of the regression coefficients. The properties of the error terms (mean, variance, correlation coefficients, etc.) are described when their covariance assumption is violated. Finally, the Durbin-Watson test for autocorrelation of the error terms is described, along with a variety of corrective measures aimed at eliminating it. The second part begins with basic concepts of time series theory. Various stationary time series are then analyzed: starting from white noise, the moving average (MA) series, the autoregressive (AR) series, and the ARMA series are presented, as well as the general non-stationary case of ARIMA series, and the first stages of analyzing a time series are briefly outlined for each case. This work was based on two important books by distinguished scholars: Georgios K. Christou, Introduction to Econometrics, and John Neter, Michael H. Kutner, Christopher J. Nachtsheim, and William Wasserman, Applied Linear Regression Models.
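A small Python sketch of the first steps described here: simulate an AR(1) series, compute the Durbin-Watson statistic from its definition, and fit the corresponding model with statsmodels. The coefficient value and series length are illustrative:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)

# simulate an AR(1) series: x_t = 0.7 * x_{t-1} + white noise
n, phi = 500, 0.7
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

# Durbin-Watson statistic from its definition; d is roughly 2 * (1 - r1),
# so d well below 2 signals positive autocorrelation
d = np.sum(np.diff(x) ** 2) / np.sum(x ** 2)
print(f"Durbin-Watson d = {d:.2f}")

# AR(1) is ARIMA(1, 0, 0); the estimated coefficient should be near 0.7
fit = ARIMA(x, order=(1, 0, 0)).fit()
print(fit.params)
```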
85

Critical behavior for the model of random spatial permutations

Kerl, John R. January 2010 (has links)
We examine a phase transition in a model of random spatial permutations which originates in a study of the interacting Bose gas. Permutations are weighted according to point positions; the low-temperature onset of the appearance of arbitrarily long cycles is connected to the phase transition of Bose-Einstein condensates. In our simplified model, point positions are held fixed on the fully occupied cubic lattice and interactions are expressed as Ewens-type weights on the cycle lengths of permutations. The critical temperature of the transition to long cycles depends on an interaction-strength parameter α. For weak interactions, the shift in critical temperature is expected to be linear in α with constant of linearity c. Using Markov chain Monte Carlo methods and finite-size scaling, we find c = 0.618 ± 0.086, matching a similar analytical result of Ueltschi and Betz. We also examine the mean longest cycle length as a fraction of the number of sites in long cycles, recovering an earlier result of Shepp and Lloyd for non-spatial permutations. The plan of this paper is as follows. We begin with a non-technical discussion of the historical context of the project, along with a mention of alternative approaches; relevant previous works are cited, thus annotating the bibliography. The random-cycle approach to the BEC problem requires a model of spatial permutations. This model is of probabilistic interest in its own right, and it is developed mathematically, without reference to the Bose gas. Our Markov chain Monte Carlo algorithms for sampling from the random-cycle distribution (the swap-only, swap-and-reverse, band-update, and worm algorithms) are presented, compared, and contrasted. Finite-size scaling techniques are used to obtain information about infinite-volume quantities from finite-volume computational data.
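A minimal Python sketch of the swap-only update in the non-interacting case, where the weight involves only squared jump lengths. The lattice size, temperature value, and (T/4) energy normalization are illustrative assumptions, and the Ewens cycle weights of the thesis are omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(7)
L = 8                                          # lattice side; N = L**3 sites
sites = np.array([(i, j, k) for i in range(L)
                  for j in range(L) for k in range(L)], dtype=float)
N = len(sites)
T = 6.0                                        # dimensionless temperature

def dist2(a, b):
    """Squared distance between sites a and b with periodic boundaries."""
    d = np.abs(sites[a] - sites[b])
    d = np.minimum(d, L - d)
    return float((d * d).sum())

def swap_only_sweep(perm, n_moves=20_000):
    """Metropolis swap-only moves: exchange the images of two sites and
    accept with probability min(1, exp(-(T/4) * (new - old energy)))."""
    for _ in range(n_moves):
        i, j = rng.integers(N, size=2)
        if i == j:
            continue
        old = dist2(i, perm[i]) + dist2(j, perm[j])
        new = dist2(i, perm[j]) + dist2(j, perm[i])
        if rng.random() < np.exp(-(T / 4.0) * (new - old)):
            perm[i], perm[j] = perm[j], perm[i]
    return perm

def longest_cycle(perm):
    """Length of the longest cycle of the permutation."""
    seen, best = np.zeros(N, dtype=bool), 0
    for s in range(N):
        length, t = 0, s
        while not seen[t]:
            seen[t], t, length = True, perm[t], length + 1
        best = max(best, length)
    return best

perm = swap_only_sweep(np.arange(N))           # start from the identity
print(longest_cycle(perm), "of", N, "sites in the longest cycle")
```

Tracking the longest-cycle fraction across temperatures, for several lattice sizes, is the raw material for the finite-size scaling analysis the abstract mentions.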
86

GARCH models based on Brownian Inverse Gaussian innovation processes / Gideon Griebenow

Griebenow, Gideon January 2006 (has links)
In classic GARCH models for financial returns the innovations are usually assumed to be normally distributed. However, it is generally accepted that a non-normal innovation distribution is needed in order to account for the heavier tails often encountered in financial returns. Since the structure of the normal inverse Gaussian (NIG) distribution makes it an attractive alternative innovation distribution for this purpose, we extend the normal GARCH model by assuming that the innovations are NIG-distributed. We use the normal variance mixture interpretation of the NIG distribution to show that a NIG innovation may be interpreted as a normal innovation coupled with a multiplicative random impact factor adjustment of the ordinary GARCH volatility. We relate this new volatility estimate to realised volatility and suggest that the random impact factors are due to a news noise process influencing the underlying returns process. This GARCH model with NIG-distributed innovations leads to more accurate parameter estimates than the normal GARCH model. In order to obtain even more accurate parameter estimates, and since we expect an information gain if we use more data, we further extend the model to cater for high, low and close data, as well as full intraday data, instead of only daily returns. This is achieved by introducing the Brownian inverse Gaussian (BIG) process, which follows naturally from the unit inverse Gaussian distribution and standard Brownian motion. Fitting these models to empirical data, we find that the accuracy of the model fit increases as we move from the models assuming normally distributed innovations and allowing for only daily data to those assuming underlying BIG processes and allowing for full intraday data. However, we do encounter one problematic result, namely that there is empirical evidence of time dependence in the random impact factors. This means that the news noise processes, which we assumed to be independent over time, are indeed time dependent, as can actually be expected. In order to cater for this time dependence, we extend the model still further by allowing for autocorrelation in the random impact factors. The increased complexity that this extension introduces means that we can no longer rely on standard Maximum Likelihood methods, but have to turn to Simulated Maximum Likelihood methods, in conjunction with Efficient Importance Sampling and the Control Variate variance reduction technique, in order to obtain an approximation to the likelihood function and the parameter estimates. We find that this time dependent model assuming an underlying BIG process and catering for full intraday data fits generated data and empirical data very well, as long as enough intraday data is available. / Thesis (Ph.D. (Risk Analysis))--North-West University, Potchefstroom Campus, 2006.
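A hedged Python sketch of the normal variance mixture construction described above: an inverse Gaussian "random impact factor" V_t multiplies the ordinary GARCH variance, so that sqrt(h_t) * sqrt(V_t) * z_t behaves as an NIG-type innovation. Parameter values are illustrative, the innovations are not standardized to unit variance here, and the thesis's exact parametrization may differ:

```python
import numpy as np

rng = np.random.default_rng(3)

def nig_innovations(n, alpha=2.0, beta=0.0, delta=1.0, mu=0.0):
    """NIG draws via the normal variance(-mean) mixture:
    V ~ InverseGaussian(delta, gamma), then X | V ~ N(mu + beta*V, V).
    numpy's rng.wald(mean, scale) is IG with mean delta/gamma, shape delta**2."""
    gamma = np.sqrt(alpha**2 - beta**2)
    V = rng.wald(delta / gamma, delta**2, size=n)   # random impact factors
    return mu + beta * V + np.sqrt(V) * rng.normal(size=n), V

def nig_garch_path(n, omega=0.05, a=0.08, b=0.90):
    """GARCH(1,1) recursion driven by NIG innovations."""
    z, V = nig_innovations(n)
    r, h = np.zeros(n), np.zeros(n)
    h[0] = omega / (1 - a - b)                      # unconditional variance
    for t in range(1, n):
        h[t] = omega + a * r[t - 1]**2 + b * h[t - 1]
        r[t] = np.sqrt(h[t]) * z[t]                 # return with NIG shock
    return r, h, V

r, h, V = nig_garch_path(2_000)
print(f"excess kurtosis of returns: {3 * ((r**4).mean() / (r**2).mean()**2 / 3 - 1):.2f}")
```

The heavier tails relative to a normal GARCH come entirely from the dispersion of V; making V autocorrelated, as the thesis ultimately does, is what breaks standard maximum likelihood and motivates the simulated-likelihood machinery.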
87

On the Robustness of the Rank-Based CUSUM Chart against Autocorrelation

Hackl, Peter, Maderbacher, Michael January 1999 (has links) (PDF)
Even a modest positive autocorrelation results in a considerable increase in the number of false alarms produced when applying a CUSUM chart. Knowledge of the process to be controlled allows for suitable adaptation of the CUSUM procedure. If one has to suspect the normality assumption, nonparametric control procedures such as the rank-based CUSUM chart are a practical alternative. The paper reports the results of a simulation study on the robustness (in terms of the sensitivity of the ARL) of the rank-based CUSUM chart against serial correlation of the control variable. The results indicate that the rank-based CUSUM chart is less affected by correlation than the observation-based chart: it shows a smaller increase in the number of false alarms and a larger decrease in the ARL in the out-of-control case than the observation-based chart. (author's abstract) / Series: Forschungsberichte / Institut für Statistik
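A simulation sketch in Python of the comparison the paper reports: in-control run lengths of an observation-based versus a sequential-rank-based CUSUM under AR(1) autocorrelation. The reference value k, threshold h, the normal-scores rank transform, and the sample sizes are illustrative assumptions, not the paper's design:

```python
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(11)

def cusum_alarm_time(scores, k=0.5, h=3.0):
    """One-sided CUSUM; returns the first alarm time (or len(scores))."""
    s = 0.0
    for t, v in enumerate(scores, start=1):
        s = max(0.0, s + v - k)
        if s > h:
            return t
    return len(scores)

def sequential_normal_scores(x):
    """Rank each new observation among all seen so far and map the
    relative rank to a normal score (one simple rank-based variant)."""
    ranks = np.array([rankdata(x[:t + 1])[-1] / (t + 2)
                      for t in range(len(x))])
    return norm.ppf(ranks)

def mean_in_control_rl(phi, rank_based, n_rep=100, n=1_000):
    """Average in-control run length when the variable is AR(1)."""
    rls = []
    for _ in range(n_rep):
        e = rng.normal(size=n)
        x = np.empty(n)
        x[0] = e[0]
        for t in range(1, n):
            x[t] = phi * x[t - 1] + e[t]   # AR(1) control variable
        scores = sequential_normal_scores(x) if rank_based else x
        rls.append(cusum_alarm_time(scores))
    return np.mean(rls)

# positive phi shortens the in-control ARL (more false alarms);
# the rank-based chart should degrade less than the observation-based one
for phi in (0.0, 0.25):
    print(phi, mean_in_control_rl(phi, True), mean_in_control_rl(phi, False))
```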
88

Internal leakage diagnosis in valve controlled actuation systems and electrohydrostatic actuation systems

Alozie, Chinenye 16 May 2014 (has links)
Diagnosis of faults associated with hydraulic actuators is essential to avoid accidents or loss of system functionality. This thesis focuses on internal leakage fault diagnosis in valve-controlled hydraulic actuation systems (VCA) as well as electrohydrostatic actuation systems (EHA). For the VCA, the hydraulic actuator is driven in closed loop to track a pseudorandom input signal, whereas for the EHA, the actuator is driven in open loop to track a sinusoidal input. Motivated by the goal of developing a method that does not rely on a model of the system or the type of fault, signal processing techniques based on the ratio of metric lengths of pressure signals, the autocorrelation of a pressure signal, the cross-correlation between chamber pressure signals, and the cross-correlation between the control signal and piston displacement are employed for internal leakage diagnosis. For the VCA, autocorrelation of pressure signals performed well at lower lags (less than 4) and at a window size of 200 data points; both cross-correlation between pressure signals and cross-correlation between the control signal and piston displacement performed well at higher lags (greater than 8) and at a window size of 100 data points; the ratio of metric lengths of pressure signals was more effective at higher lag ratios (more than 16:3). All methods were sensitive to the lowest simulated leakage of 0.047 L/min, though with different levels of success: the ratio of metric lengths produced 84% sensitivity, autocorrelation 19%, cross-correlation between pressure signals 25%, and cross-correlation between piston displacement and the control signal 20%. For the EHA, all methods were capable of identifying a small leakage of 0.98 L/min: the ratio of metric lengths produced 6.7% sensitivity, autocorrelation 2.59%, cross-correlation between pressure signals 9.4%, and cross-correlation between piston displacement and the control signal 31.9%. The low leakage detection achieved without requiring a model of the actuator or the leakage type makes these methods very attractive for industrial implementation.
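A Python sketch of two of the windowed features described above. The cross-correlation follows the window sizes quoted in the text; the "metric length" definition used here (cumulative absolute first differences) is an assumption, since the abstract does not spell it out:

```python
import numpy as np

def windowed_xcorr(p1, p2, lag, window=100):
    """Normalized cross-correlation between two chamber pressure signals
    at a fixed lag, computed over consecutive sliding windows."""
    vals = []
    for start in range(0, len(p1) - window - lag, window):
        a = p1[start:start + window] - p1[start:start + window].mean()
        b = p2[start + lag:start + lag + window] - p2[start + lag:start + lag + window].mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        vals.append((a * b).sum() / denom if denom > 0 else 0.0)
    return np.array(vals)

def metric_length_ratio(p1, p2, window=100):
    """Per-window ratio of 'metric lengths' of the two pressure signals;
    here metric length = sum of absolute first differences (assumed)."""
    n = min(len(p1), len(p2)) // window * window
    l1 = np.abs(np.diff(p1[:n].reshape(-1, window), axis=1)).sum(axis=1)
    l2 = np.abs(np.diff(p2[:n].reshape(-1, window), axis=1)).sum(axis=1)
    return l1 / l2
```

A drift of either feature away from its no-leak baseline, persisting across windows, is the kind of model-free indicator the thesis evaluates.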
90

Spatial Methods in Econometrics. An Application to R&D Spillovers.

Gumprecht, Daniela January 2005 (has links) (PDF)
In this paper I give a brief and general overview of the characteristics of spatial data, why it is useful to use such data, and how to use the information included in spatial data. The first question to be answered is: how can spatial dependency and spatial autocorrelation be detected in data? Such effects can, for instance, be found by calculating Moran's I, a measure of spatial autocorrelation; Moran's I is also the basis for a test for spatial autocorrelation (Moran's test). Once some spatial structure has been found, special models and estimation techniques can be used. Two well-known spatial processes, the SAR (spatial autoregressive) and the SMA (spatial moving average) processes, are used to model spatial effects. For the estimation of spatial regression models there are mainly two possibilities: the first is called spatial filtering, where the spatial effect is filtered out and standard techniques are used; the second is spatial two-stage least squares estimation. Finally, some results of a spatial analysis of R&D spillovers data (for a panel dataset with 22 countries and 20 years) are shown. (author's abstract) / Series: Research Report Series / Department of Statistics and Mathematics
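A minimal Python sketch of the Moran's I statistic mentioned above; the 4-region contiguity matrix is a toy assumption:

```python
import numpy as np

def morans_I(x, W):
    """Moran's I for values x under spatial weight matrix W:
    I = (n / sum(W)) * sum_ij W[i,j] z_i z_j / sum_i z_i**2, z = x - mean(x)."""
    z = x - x.mean()
    n = len(x)
    return n * (z @ W @ z) / (W.sum() * (z @ z))

# toy example: 4 regions on a line, rook-contiguity weights (assumed layout)
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.array([1.0, 2.0, 3.0, 4.0])   # smoothly increasing values
print(morans_I(x, W))                 # above E[I] = -1/(n-1), indicating clustering
```

Values of I well above the null expectation -1/(n-1) signal positive spatial autocorrelation, the precondition for turning to SAR or SMA specifications.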
