When estimating the total of some quantity, sampling at carefully selected points will frequently be preferable to a method which involves randomization. The estimation of the total stand of timber on a given area, or of the amount of energy released in a given time and space, are examples of problems where specified sampling points should result in a reduction in the error of estimate. Problems such as these lead naturally to numerical integration methods. In the case of single integrals, the Newton-Cotes formulae can be applied directly to experimentally determined ordinates at equally spaced abscissae and are of great practical importance. Gauss' formulae yield maximum efficiency with respect to controlling the polynomial error and are appropriate when an analytical expression for the curve in question is available but defies exact integration, or when for some other reason the statistical error is of minor importance. Tchebichef's formulae give maximum efficiency with respect to controlling the statistical or observational error.
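The contrast between the first two families of rules can be sketched numerically. The following is a minimal illustration (function names and the test integrand are my own, not the dissertation's): a composite Simpson rule, the simplest useful Newton-Cotes formula on equally spaced ordinates, against an n-point Gauss-Legendre rule, which is exact for polynomials of degree 2n - 1.

```python
import numpy as np

def simpson(f, a, b, n):
    # Composite Simpson's rule (closed Newton-Cotes) on n equal
    # subintervals; n must be even. Uses equally spaced ordinates,
    # so it applies directly to tabulated experimental data.
    x = np.linspace(a, b, n + 1)
    h = (b - a) / n
    w = np.ones(n + 1)
    w[1:-1:2] = 4.0
    w[2:-1:2] = 2.0
    return h / 3.0 * np.dot(w, f(x))

def gauss_legendre(f, a, b, n):
    # n-point Gauss-Legendre rule mapped from [-1, 1] to [a, b];
    # exact for polynomials of degree 2n - 1, but the abscissae are
    # irrational, so the integrand must be evaluable anywhere.
    t, w = np.polynomial.legendre.leggauss(n)
    x = 0.5 * (b - a) * t + 0.5 * (a + b)
    return 0.5 * (b - a) * np.dot(w, f(x))

# Integrate exp(x) on [0, 1]; exact value is e - 1.
exact = np.e - 1
err_simpson = abs(simpson(np.exp, 0.0, 1.0, 8) - exact)
err_gauss = abs(gauss_legendre(np.exp, 0.0, 1.0, 4) - exact)
```

With roughly comparable numbers of ordinates, the Gauss rule's error is orders of magnitude smaller, which is the "polynomial error" efficiency the abstract refers to; Tchebichef's equal-weight formulae instead minimize the variance contributed by observational error in the ordinates.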
The basic elements in the development of numerical integration formulae like Newton-Cotes, Gauss' and Tchebichef's, can be extended to developing formulae for the approximate evaluation of multiple integrals.
In the case of double integrals, an eight-point and a thirteen-point formula of fifth degree accuracy, and a twelve-point and a twenty-one-point formula of seventh degree accuracy, have been developed for integrating over a rectangle; similar formulae have been developed for integrating over areas bounded by a parabola and a straight line, or by two parabolas.
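The dissertation's specific eight- and thirteen-point formulae are not reproduced here; as a point of comparison, a 3 x 3 product Gauss rule (a standard construction, sketched below with my own names) attains fifth degree accuracy on a rectangle using nine points, which is the baseline the non-product formulae improve on or trade against.

```python
import numpy as np

def gauss_product_2d(f, ax, bx, ay, by, n=3):
    # Product Gauss-Legendre rule on the rectangle [ax,bx] x [ay,by].
    # With n = 3 it uses 9 points and is exact for polynomials of
    # degree <= 5 in each variable.
    t, w = np.polynomial.legendre.leggauss(n)
    x = 0.5 * (bx - ax) * t + 0.5 * (ax + bx)
    y = 0.5 * (by - ay) * t + 0.5 * (ay + by)
    s = sum(wi * wj * f(xi, yj)
            for wi, xi in zip(w, x)
            for wj, yj in zip(w, y))
    return 0.25 * (bx - ax) * (by - ay) * s

# Integrate x^2 * y^4 over the unit square; exact value is 1/15,
# and a fifth-degree rule reproduces it exactly.
val = gauss_product_2d(lambda x, y: x**2 * y**4, 0.0, 1.0, 0.0, 1.0)
```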
Formulae for the numerical evaluation of triple integrals taken over a rectangular parallelepiped are developed, including a twenty-one-point formula with fifth degree accuracy. It is shown that comparable formulae can be developed for integrating functions of more than three variables, and a 2n + 1 point formula with third degree accuracy for integrating a function of n variables over a rectangular n-space is obtained.
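One classical way to build a 2n + 1 point rule of third degree on the cube [-1, 1]^n (a sketch of the general construction, not necessarily the dissertation's particular formula) takes the centroid plus the 2n face centres +/- e_i and matches the moments of 1 and x_i^2; all other monomials of degree <= 3 vanish by symmetry at these points.

```python
import numpy as np

def degree3_rule(f, n):
    # 2n + 1 point rule on [-1, 1]^n: centroid plus face centres.
    # Moment matching gives face weight 2^n / 6 and centroid weight
    # 2^n (1 - n/3); note the centroid weight is <= 0 for n >= 3,
    # which is permissible but worth flagging.
    V = 2.0 ** n
    w_face = V / 6.0
    w_center = V * (1.0 - n / 3.0)
    total = w_center * f(np.zeros(n))
    for i in range(n):
        for s in (1.0, -1.0):
            x = np.zeros(n)
            x[i] = s
            total += w_face * f(x)
    return total

# Integrate sum(x_i^2) over [-1, 1]^3: exact value 3 * (2/3) * 4 = 8,
# reproduced exactly since the integrand has degree 2.
val3 = degree3_rule(lambda x: float(np.sum(x**2)), 3)
```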
In many problems involving statistical estimation, the dominant source of inaccuracy will be the error of observation. The magnitude of this error can be estimated by subjecting the observations to an orthogonal transformation which isolates the trends and leaves the residual variance free from their effects. This treatment is most easily carried out in terms of orthogonal polynomials, and it is shown that this type of analysis can be extended to functions of several variables.
Ph. D.
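The idea can be illustrated as follows. The classical treatment uses discrete orthogonal (Tchebichef/Gram) polynomials over equally spaced points; this sketch substitutes a least-squares fit in NumPy's Legendre basis, which yields the same fitted trend and residuals for a given degree. The data here are simulated purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.arange(20, dtype=float)
# Simulated observations: a linear trend plus observational error.
y = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=x.size)

# Remove the trend by fitting in an orthogonal polynomial basis;
# the residuals are then free of the fitted trend components, so
# the residual variance estimates the observational-error variance.
fit = np.polynomial.legendre.Legendre.fit(x, y, deg=1)
resid = y - fit(x)
s2 = resid @ resid / (x.size - 2)  # unbiased under the linear-trend model
```

The divisor x.size - 2 reflects the two degrees of freedom absorbed by the fitted constant and linear components, exactly the bookkeeping that the orthogonal transformation makes explicit.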
Identifer | oai:union.ndltd.org:VTETD/oai:vtechworks.lib.vt.edu:10919/54174 |
Date | January 1949 |
Creators | Tyler, George William |
Contributors | Statistics |
Publisher | Virginia Polytechnic Institute |
Source Sets | Virginia Tech Theses and Dissertations |
Language | en_US |
Detected Language | English |
Type | Dissertation, Text |
Format | 1 v. (various pagings), application/pdf |
Rights | In Copyright, http://rightsstatements.org/vocab/InC/1.0/ |
Relation | OCLC# 20624640 |