1 |
An experimental study of some Z-transform discrete filters / Flakas, Gerald Kenneth, January 1968
Thesis (M.S.)--University of Wisconsin--Madison, 1968. / Includes bibliographical references.
|
2 |
The z-transform as a general tool in approximation / Pendleton, Freeman Luke, 1928-, January 1960
No description available.
|
3 |
Functions annihilable by sampling / Ho, Joseph Ping-Liong, January 1961
Call number: LD2668 .T4 1961 H62
|
4 |
The Z-transform method for the calculation of molecular weight distributions in polymerization / Chen, Paul Yuan, January 1968
Thesis (Eng. Sc. D.)--Columbia University, 1968. / Typescript. / Includes bibliographical references (leaves 177-179).
|
5 |
Z-Transformation auf Basis des Residuensatzes / Z-Transformation based on residue theorem / Geitner, Gert-Helge, 17 June 2013
The Z-transformation is an important tool for discontinuous (sampled-data) control. A seldom-considered transformation route uses residue computation. All relevant features, such as multiple poles, processes with feedthrough, complex poles, or time shifts of less than the sampling period (the so-called modified Z-transformation), can be included. No truncation criterion or search for a closed-form sum is necessary. The approach is applicable to manual as well as computer-aided computation. Three detailed examples demonstrate the use of the formulas.
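As an illustration of the residue route this abstract describes, the classical identity for passing from a rational Laplace transform X(s) with poles s_k to the Z-domain at sampling period T reads (a standard result, stated here for reference, not taken from the thesis):

\[
X(z) = \sum_{k} \operatorname{Res}_{s=s_k}\!\left[\, X(s)\,\frac{z}{z - e^{sT}} \,\right]
\]

Every pole of X(s), multiple or complex alike, contributes one residue term, which is why no series truncation criterion or closed-form sum search is needed.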
|
6 |
Control and estimation for large-scale systems having spatial symmetry / Wall, Joseph Edward, January 1978
Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1978. / Vita. / Includes bibliographical references.
|
7 |
A Comparison of Meta-Analytic Approaches to the Analysis of Reliability Estimates / Mason, Denise Corinne, 10 July 2003
In the last few years, several studies have attempted to meta-analyze reliability estimates. The initial study to outline a methodology for meta-analyzing reliability coefficients was published by Vacha-Haase in 1998. Vacha-Haase used a very basic meta-analytic model to find a mean effect size (reliability) across studies. There are two main reasons for meta-analyzing reliability coefficients. First, recent research has shown that many studies fail to report the appropriate reliability for the measure and population of the actual study (Vacha-Haase, Ness, Nilsson and Reetz, 1999; Whittington, 1998; Yin and Fan, 2000). Second, very little research has been published describing the way reliabilities for the same measure vary according to moderators such as time, form length, population differences in trait variability and others. Vacha-Haase (1998) proposed meta-analysis as a method by which the impact of moderators may become better understood.
Although other researchers have followed the Vacha-Haase example and meta-analyzed the reliabilities for several measures, little has been written about the best methodology to use for such analysis. Reliabilities are much larger on average than are validities, and thus tend to show greater skew in their sampling distributions. This study took a closer look at the methodology with which reliability can be meta-analyzed. Specifically, a Monte Carlo study was run so that population characteristics were known. This provided a unique ability to test how well each of three methods estimates the true population characteristics. The three methods studied were the Vacha-Haase method as outlined in her 1998 article, the well-known Hunter and Schmidt "bare bones method" (1990) and the random-effects version of Hedges' method as described by Lipsey and Wilson (2001). The methods differ both in how they estimate the random-effects variance component (or in one case, whether the random-effects variance component is estimated at all) and in how they treat moderator variables. Results showed which of these methods is best applied to reliability meta-analysis. A combination of the Hunter and Schmidt (1999) method and weighted least squares regression is proposed.
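As a rough sketch of the Hunter and Schmidt "bare bones" step named above, the following Python fragment computes a sample-size-weighted mean effect and a residual variance after subtracting sampling error; the variable names are illustrative, and the sampling-error formula shown is the one for correlations, whose suitability for reliabilities is exactly the kind of question this study probes:

import numpy as np

def bare_bones_meta(r, n):
    # Sample-size-weighted "bare bones" meta-analysis (illustrative sketch).
    r, n = np.asarray(r, float), np.asarray(n, float)
    r_bar = np.sum(n * r) / np.sum(n)                       # weighted mean effect
    var_obs = np.sum(n * (r - r_bar) ** 2) / np.sum(n)      # weighted observed variance
    var_err = (1.0 - r_bar ** 2) ** 2 / (np.mean(n) - 1.0)  # sampling-error variance (correlation formula)
    var_rho = max(var_obs - var_err, 0.0)                   # residual (random-effects) variance
    return r_bar, var_obs, var_rho

print(bare_bones_meta([0.82, 0.88, 0.79, 0.91], [120, 85, 200, 60]))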
|
8 |
Interval Estimation for the Correlation Coefficient / Jung, Aekyung, 11 August 2011
The correlation coefficient (CC) is a standard measure of the linear association between two random variables and plays a significant role in much quantitative research. For a bivariate normal distribution there are many types of interval estimation for the CC, such as z-transformation-based and maximum-likelihood-based methods. However, when the underlying bivariate distribution is unknown, the construction of confidence intervals for the CC is still not well developed. In this thesis, we discuss various interval estimation methods for the CC. We propose a generalized confidence interval and three empirical-likelihood-based non-parametric intervals for the CC. We also conduct extensive simulation studies to compare the new intervals with existing intervals in terms of coverage probability and interval length. Finally, two real examples are used to demonstrate the application of the proposed methods.
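For the bivariate normal case mentioned above, the z-transformation interval is the standard construction; a minimal Python sketch, assuming n > 3 and approximate normality of atanh(r):

import numpy as np
from scipy import stats

def fisher_z_ci(r, n, level=0.95):
    # Interval for a correlation via Fisher's z-transformation:
    # z = atanh(r) is approximately normal with standard error 1/sqrt(n - 3).
    z = np.arctanh(r)
    se = 1.0 / np.sqrt(n - 3)
    zcrit = stats.norm.ppf(0.5 + level / 2.0)
    return np.tanh(z - zcrit * se), np.tanh(z + zcrit * se)

print(fisher_z_ci(0.6, 50))  # roughly (0.39, 0.75)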
|
9 |
A comparison of meta-analytic approaches to the analysis of reliability estimates [electronic resource] / Mason, Denise Corinne, January 2003
Includes vita. / Thesis (Ph.D.)--University of South Florida, 2003. / Document contains 114 pages. / Includes bibliographical references. / Text (electronic thesis) in PDF format. / Abstract identical to entry 7 above.
|
10 |
Evaluation of applying Crum-based transformation in solving two point boundary value problems / Jogiat, Aasif, January 2016
A dissertation submitted to the Faculty of Engineering and the Built Environment, University of the
Witwatersrand, in fulfillment of the requirements for the degree of Master of Science in Engineering, Johannesburg, 2016 / The aim of this research project is to evaluate the application of the Crum-based transformation
in solving engineering systems modelled as two-point boundary value problems. The boundary
value problems were subjected to the various combinations of Dirichlet, Non-Dirichlet and Affine
boundary conditions. The engineering systems modelled were in the fields of electrostatics, heat conduction and longitudinal vibrations. Other methods, such as the Z-transform and iterative methods, are also discussed. An attractive property of the Crum-based transformation is that it can be applied to cases where the eigenparameters (functions of the eigenvalues) generated in the discrete case are negative; it was therefore chosen for further exploration in this dissertation. An
alternative matrix method was proposed and used instead of the algebraic method in the Crum-
based transformation. The matrix method was tested against the algebraic method using three unit
intervals. The analysis revealed that, as the number of unit intervals increases, there is a general increase in the accuracy of the approximated continuous-case eigenvalues generated for the discrete case. The other observed general trend was that the accuracy of the approximated continuous-case eigenvalues decreases as one ascends the continuous-case eigenvalue spectrum. Three cases:
(Affine, Dirichlet), (Affine, Non-Dirichlet) and (Affine, Affine) generated negative eigenparameters.
The approximated continuous-case eigenvalues, derived from the negative eigenparameters, were
shown not to represent true physical natural frequencies since the discrete eigenvalues, derived from
negative eigenparameters, do not satisfy the condition for purely oscillatory behaviour. The research
has also shown that the Crum-based transformation method was useful in approximating the shifted
eigenvalues of the continuous case, in cases where the generated eigenparameters were negative:
since, as the number of unit intervals increases, the post-transformed approximated eigenvalues
improved in accuracy. The accuracy was also found to be better in the post-transformed case than
in the pre-transformed case. Furthermore, the approximated non-shifted and shifted continuous-
case eigenvalues (except the approximated continuous-case eigenvalues generated from negative
eigenparameters) satisfied the condition for purely oscillatory behaviour. / MT2017
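The two convergence trends reported in this abstract (accuracy improving with more unit intervals and degrading higher up the spectrum) can be reproduced, independently of the Crum machinery, with the textbook finite-difference discretisation of -y'' = lambda*y on (0, 1) under Dirichlet conditions; a minimal Python sketch, not taken from the dissertation:

import numpy as np

def dirichlet_eigs(n):
    # Eigenvalues of the 3-point finite-difference approximation to
    # -y'' = lambda*y on (0, 1) with y(0) = y(1) = 0, using n unit intervals.
    h = 1.0 / n
    A = (2.0 * np.eye(n - 1) - np.eye(n - 1, k=1) - np.eye(n - 1, k=-1)) / h**2
    return np.sort(np.linalg.eigvalsh(A))

exact = np.array([(k * np.pi) ** 2 for k in (1, 2, 3)])  # continuous-case eigenvalues
for n in (4, 8, 16):
    print(n, np.round(dirichlet_eigs(n)[:3], 2), "exact:", np.round(exact, 2))
# Accuracy improves as n grows and is worse for the higher eigenvalues.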
|