About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
181

Applications of forecasting and optimisation in the Australian national electricity market

Baloi, C. A. Unknown Date (has links)
No description available.
182

New solutions of the Yang-Baxter equation associated with quantised orthosymplectic lie superalgebras

Mehta, Maithili Unknown Date (has links)
No description available.
183

Cubature rules from a generalized Taylor perspective

Hanna, George T. January 2009 (has links) (PDF)
The accuracy and efficiency of computing multiple integrals is a very important problem that arises in many scientific, financial and engineering applications. The research conducted in this thesis is designed to build on past work and to develop and analyze new numerical methods to evaluate double integrals efficiently. The fundamental aim is to develop and assess techniques for (numerically) evaluating double integrals with high accuracy. The general approach presented in this thesis involves the development of new multivariate approximations from a generalized Taylor perspective in terms of Appell-type polynomials and the study of their use in multi-dimensional integration. The expectation is that the new methods will provide polynomial and polynomial-like approximations that can be applied in a straightforward manner with better accuracy. That is, we aim to devise and investigate new multiple integration formulae as well as provide information on a priori error bounds. A further major contribution of the work builds on research conducted in the field of Grüss-type inequalities and leads to a new approximation of the one- and two-dimensional finite Fourier transform. The approximations are in terms of the complex exponential mean, with estimates of the error of approximation for different classes of functions of bounded variation defined on finite intervals. It is believed that this work will also have an impact in the area of numerical multidimensional integral evaluation for other integral operators.
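The thesis develops Appell-polynomial cubature rules, which are not reproduced here. As a minimal point of reference for the problem being attacked, the sketch below evaluates a double integral with a plain tensor-product trapezoidal rule and checks it against a closed-form value; the integrand, region and step counts are illustrative choices, not taken from the thesis.

```python
import numpy as np

def trapezoid_weights(n, h):
    """Weights of the composite trapezoidal rule on n subintervals of width h."""
    w = np.full(n + 1, h)
    w[0] = w[-1] = h / 2.0
    return w

def tensor_trapezoid(f, a, b, c, d, nx=64, ny=64):
    """Approximate the double integral of f over [a, b] x [c, d]
    with a tensor-product trapezoidal rule."""
    x = np.linspace(a, b, nx + 1)
    y = np.linspace(c, d, ny + 1)
    wx = trapezoid_weights(nx, (b - a) / nx)
    wy = trapezoid_weights(ny, (d - c) / ny)
    X, Y = np.meshgrid(x, y, indexing="ij")
    # Quadratic form w_x^T F w_y gives the cubature sum.
    return wx @ f(X, Y) @ wy

# Example: the integral of exp(x + y) over the unit square is (e - 1)^2,
# which gives a handy accuracy check.
approx = tensor_trapezoid(lambda x, y: np.exp(x + y), 0.0, 1.0, 0.0, 1.0)
exact = (np.e - 1.0) ** 2
print(approx, exact, abs(approx - exact))
```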
184

Forensic Applications of Bayesian Inference to Glass Evidence

Curran, James Michael January 1996 (has links)
The role of the scientist in the courtroom has come under more scrutiny this century than ever before. As a consequence, scientists must constantly look for ways to improve the validity of the evidence they deliver. It is here that the professional statistician can provide assistance. The use of statistics in the courtroom and in forensic science is not new, but until recently has not been common either. Statistics can provide objectivity to subjective assessments and strengthen a case for the prosecution or the defence, but only if it is used correctly. The aim of this thesis is to enhance and replace the existing technology used in the statistical analysis and presentation of trace evidence, i.e. all non-genetic evidence (hairs, fibres, glass, paint, etc.), and of transfer problems.
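As an illustration of the kind of Bayesian evaluation of glass evidence the abstract alludes to (not code, data or parameter values from the thesis), the sketch below computes a simplified two-level normal likelihood ratio for refractive-index measurements on control and recovered fragments, comparing a common-source hypothesis with an independent-sources hypothesis. All numbers are invented for the example.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

def glass_lr(xbar, ybar, m, n, sigma, mu, tau):
    """Likelihood ratio comparing Hp: control and recovered fragments share a
    source mean, vs Hd: they come from independent sources.

    xbar, ybar : mean refractive index of control / recovered measurements
    m, n       : number of measurements in each set
    sigma      : within-source (measurement) standard deviation
    mu, tau    : mean and SD of source means in the relevant glass population
    """
    v_x = sigma**2 / m
    v_y = sigma**2 / n
    # Hp: both sample means are noisy observations of one source mean
    # theta ~ N(mu, tau^2), so (xbar, ybar) is bivariate normal with
    # covariance tau^2 between the two components.
    cov_p = np.array([[tau**2 + v_x, tau**2],
                      [tau**2,       tau**2 + v_y]])
    num = multivariate_normal.pdf([xbar, ybar], mean=[mu, mu], cov=cov_p)
    # Hd: the two source means are independent draws from N(mu, tau^2).
    den = (norm.pdf(xbar, mu, np.sqrt(tau**2 + v_x)) *
           norm.pdf(ybar, mu, np.sqrt(tau**2 + v_y)))
    return num / den

# Illustrative (made-up) values: means close relative to sigma give LR > 1.
print(glass_lr(xbar=1.51907, ybar=1.51909, m=10, n=5,
               sigma=4e-5, mu=1.5182, tau=4e-3))
```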
185

Effects of serial correlation on linear models

Triggs, Christopher M. January 1975 (has links)
Given a linear regression model y = Xβ + e, where e has a multivariate normal distribution N(0, Σ), the consequences of the erroneous assumption that e is distributed as N(0, I) are considered. For a general linear hypothesis concerning the parameters β in a general model, the distribution of the statistic used to test the hypothesis, derived under the erroneous assumption, is studied. Particular linear hypotheses concerning particular linear models are investigated so as to describe the effects of various patterns of serial correlation on the test statistics arising from these hypotheses. Attention is especially paid to the models of one- and two-way analysis of variance.
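A quick way to see the kind of effect the thesis studies analytically is by simulation. The sketch below (my illustration, not the thesis's derivations) estimates the actual size of the nominal 5% t-test for a zero slope when the errors follow an AR(1) process but the test is built on the i.i.d. normal assumption; the sample size, correlation values and trend regressor are arbitrary choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def ar1_errors(n, rho, rng):
    """AR(1) errors e_t = rho * e_{t-1} + innovation, started in stationarity."""
    e = np.empty(n)
    e[0] = rng.normal(scale=1.0 / np.sqrt(1.0 - rho**2))
    for t in range(1, n):
        e[t] = rho * e[t - 1] + rng.normal()
    return e

def rejection_rate(rho, n=50, reps=5000, alpha=0.05):
    """Empirical size of the usual t-test for a zero slope when the true slope
    is zero but the errors are serially correlated (the N(0, I) assumption fails)."""
    x = np.arange(n, dtype=float)
    X = np.column_stack([np.ones(n), x])
    XtX_inv = np.linalg.inv(X.T @ X)
    crit = stats.t.ppf(1 - alpha / 2, df=n - 2)
    rejections = 0
    for _ in range(reps):
        y = ar1_errors(n, rho, rng)            # true model: y = 0 + e
        beta = XtX_inv @ X.T @ y
        resid = y - X @ beta
        s2 = resid @ resid / (n - 2)
        se_slope = np.sqrt(s2 * XtX_inv[1, 1])
        if abs(beta[1] / se_slope) > crit:
            rejections += 1
    return rejections / reps

for rho in (0.0, 0.3, 0.6):
    print(rho, rejection_rate(rho))   # nominal 5% level, inflated for rho > 0
```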
186

Understanding linear algebra concepts through the embodied, symbolic and formal worlds of mathematical thinking

Stewart, Sepideh January 2008 (has links)
Linear algebra is one of the first advanced mathematics courses that students encounter at university level. The transfer from a primarily procedural or algorithmic school approach to an abstract and formal presentation of concepts through concrete definitions seems to create difficulty for many students who are barely coping with the procedural aspects of the subject. This research proposes applying APOS theory, in conjunction with Tall’s three worlds of embodied, symbolic and formal mathematics, to create a framework for examining the learning of a variety of linear algebra concepts by groups of first and second year university students. The aim is to investigate the difficulties in understanding some linear algebra concepts and to propose potential paths for preventing them. As part of this research project several case studies were conducted in which groups of first and second year students were exposed to teaching and learning of some introductory linear algebra concepts based on the framework and expressed their thinking through their involvement in tests, interviews and concept maps. The results suggest that the students had limited understanding of the concepts: they struggled to recognise the concepts in different registers, and their lack of ability to link the major concepts became apparent. However, the results also revealed that those with more representational diversity had more overall understanding of the concepts. In particular, the embodied introduction of a concept proved a valuable adjunct to their thinking. Since difficulties with learning linear algebra by average students are universally acknowledged, it is anticipated that this study may provide suggestions with the potential for widespread positive consequences for learning.
187

Significance testing in automatic interaction detection (A.I.D.)

Worsley, Keith John January 1978 (has links)
Automatic Interaction Detection (A.I.D.) is the name of a computer program, first used in the social sciences, to find the interaction between a set of predictor variables and a single dependent variable. The program proceeds in stages, and at each stage the categories of a predictor variable induce a split of the dependent variable into two groups, so that the between-groups sum of squares (BSS) is a maximum. In this way, the optimum split defines the interaction between predictor and dependent variable, and the criterion BSS is taken as a measure of the explanatory power of the split. One of the strengths of A.I.D. is that this interaction is established without any reference to a specific model, and for this reason it is widely used in practice. However, this strength is also its weakness; with no model there is no measure of its significance. Barnard (1974) has said: “… nowadays with more and more apparently sophisticated computer programs for social science, failure to take account of possible sampling fluctuations is leading to a glut of unsound analyses … I have in mind procedures such as A.I.D., the automatic interaction detector, which guarantees to get significance out of any data whatsoever. Methods of this kind require validation …” The aim of this thesis is to supply part of that validation by investigating the null distribution of the optimum BSS for a single predictor at a single stage of A.I.D., so that the significance of any particular split can be judged. The problem of the overall significance of a complete A.I.D. analysis, combining many stages, still remains to be solved. In Chapter 1 the A.I.D. method is described in more detail and an example is presented to illustrate its use. A null hypothesis that the dependent variable observations have independent and identical normal distributions is proposed as a model for no interaction. In Chapters 2 and 3 the null distributions of the optimum BSS for a single predictor are derived and tables of percentage points are given. In Chapter 4 the normal assumption is dropped and non-parametric A.I.D. criteria, based on ranks, are proposed. Tables of percentage points, found by direct enumeration and by Monte Carlo methods, are given. In Chapter 5 the example presented in Chapter 1 is used to illustrate the application of the theory and tables in Chapters 2, 3 and 4, and some final conclusions are drawn.
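A single A.I.D.-style stage is easy to state in code. The sketch below is an illustrative reconstruction, not Worsley's program or his exact derivations: it finds the optimum between-groups sum of squares over binary splits of a monotonic predictor and approximates its null distribution by Monte Carlo under i.i.d. normal observations, which is the "no interaction" model proposed in the abstract.

```python
import numpy as np

def optimum_bss(y, x):
    """Maximum between-groups sum of squares over all binary splits of the
    form {x <= c} vs {x > c} (the simplest, monotonic-predictor case)."""
    order = np.argsort(x)
    y_sorted = y[order]
    n = len(y)
    total_mean = y.mean()
    cum = np.cumsum(y_sorted)
    best = 0.0
    for k in range(1, n):                       # k observations in the left group
        if x[order[k - 1]] == x[order[k]]:
            continue                            # cannot split between tied x values
        mean_l = cum[k - 1] / k
        mean_r = (cum[-1] - cum[k - 1]) / (n - k)
        bss = k * (mean_l - total_mean) ** 2 + (n - k) * (mean_r - total_mean) ** 2
        best = max(best, bss)
    return best

def null_distribution(n, reps=2000, seed=1):
    """Monte Carlo null distribution of the optimum BSS when the dependent
    variable observations are i.i.d. N(0, 1)."""
    rng = np.random.default_rng(seed)
    x = np.arange(n, dtype=float)
    return np.array([optimum_bss(rng.normal(size=n), x) for _ in range(reps)])

sims = null_distribution(n=40)
print(np.quantile(sims, [0.90, 0.95, 0.99]))    # rough percentage points
```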
188

Automatic structures

Rubin, Sasha January 2004 (has links)
This thesis investigates structures that are presentable by finite automata working synchronously on tuples of finite words. The emphasis is on understanding the expressiveness and limitations of automata in this setting. In particular, the thesis studies the classification of classes of automatic structures, the complexity of the isomorphism problem, and the relationship between definability and recognisability.
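The basic device in this abstract, an automaton reading a tuple of finite words synchronously after padding the shorter ones, can be illustrated with a standard textbook example rather than a construction from the thesis: the relation m ≤ n on unary representations is recognised by a two-state automaton over the convolution of the two words, which is what makes (ℕ, ≤) an automatic structure. The sketch below simulates that automaton; the padding symbol and function names are my own choices.

```python
PAD = "#"  # padding symbol appended to the shorter word

def convolution(u, v):
    """Zip two words into one word over pairs of symbols, padding the shorter."""
    length = max(len(u), len(v))
    u = u.ljust(length, PAD)
    v = v.ljust(length, PAD)
    return list(zip(u, v))

def leq_unary(u, v):
    """Simulate a DFA over the convolution alphabet that accepts exactly the
    pairs (1^m, 1^n) with m <= n, i.e. the language (1,1)* followed by (#,1)*."""
    state = "BOTH"                       # still reading 1s on both tracks
    for a, b in convolution(u, v):
        if state == "BOTH" and (a, b) == ("1", "1"):
            continue
        if (a, b) == (PAD, "1"):         # first word has ended, second continues
            state = "RIGHT_ONLY"
            continue
        return False                     # any other pair, e.g. ('1', '#'), rejects
    return True

assert leq_unary("11", "111")       # 2 <= 3
assert not leq_unary("111", "1")    # 3 <= 1 fails
assert leq_unary("", "")            # 0 <= 0
```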
189

New methods for analysing generalised linear models with applications to epidemiology

Holden, Jennifer Kay January 2001 (has links)
Whole document restricted; see the access instructions for details of how to access the print copy. The aim of capture-recapture methods in epidemiology is to accurately estimate the total number of people who have a specific disease. These methods were first developed by ecologists interested in estimating the total population size of various animal species. Capture-recapture methods have a relatively short history, at least in terms of application to epidemiological data sets. If applied correctly they can be of great benefit, and are invaluable for planning and resource allocation. The aim of this thesis is to enhance the existing methods used in epidemiological capture-recapture problems. This research explores new methods for analysing generalised linear models, within the capture-recapture framework, with applications to epidemiology. In particular, we critically examine two New Zealand data sets. We compare two small-sample adjustments for capture-recapture methods, and find that the Evans and Bonett adjustment can be a useful tool for sparse data. We employ stratified capture-recapture analyses to alleviate problems with heterogeneity and reporting patterns. In addition, we consider a type of cost-benefit analysis for the reporting sources. Two proposed methods of internal validity analysis are scrutinised. We find that one of these is counter-intuitive and of no use; however, the other method may be of some use in at least indicating the direction of any bias in the capture-recapture estimates. We use simulation to explore the effects of errors on patient records, and find that even relatively small percentages of errors can affect estimates dramatically. We conclude with a study of the optimal number of sources to use in epidemiological capture-recapture analyses. We argue that using three sources is not necessarily optimal, and that using four sources is also entirely manageable. This thesis outlines a strategy for analysing epidemiological data sets using capture-recapture methods, and includes aspects of model fitting and selection, cost-benefit analysis, diagnostic checking through simulations of the effects of record errors, and the effects of collapsing lists, as well as a critical check of the capture-recapture assumptions. This investigation demonstrates the potential of capture-recapture methods to provide accurate estimates of the size of various disease populations.
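For readers unfamiliar with the two-source case that the capture-recapture literature starts from, the sketch below implements Chapman's small-sample adjustment of the Lincoln-Petersen estimator with an approximate confidence interval. This is a generic adjustment, not the Evans and Bonett adjustment examined in the thesis; it assumes the two sources report independently, and the counts are invented for illustration.

```python
import math

def chapman_estimate(n1, n2, m):
    """Chapman's small-sample adjustment of the two-source (Lincoln-Petersen)
    capture-recapture estimator, with an approximate 95% confidence interval.

    n1 : cases reported by source 1
    n2 : cases reported by source 2
    m  : cases reported by both sources
    """
    n_hat = (n1 + 1) * (n2 + 1) / (m + 1) - 1
    var = ((n1 + 1) * (n2 + 1) * (n1 - m) * (n2 - m)) / ((m + 1) ** 2 * (m + 2))
    half_width = 1.96 * math.sqrt(var)
    return n_hat, (n_hat - half_width, n_hat + half_width)

# Made-up counts: 120 cases on a disease register, 90 in hospital discharge
# records, 60 appearing on both lists.
estimate, ci = chapman_estimate(n1=120, n2=90, m=60)
print(round(estimate), [round(c) for c in ci])
```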
190

Hyperbolic Geometry and Reflection Groups

Marshall, T. H. (Timothy Hamilton) January 1994 (has links)
The n-dimensional pseudospheres are the surfaces in ℝⁿ⁺¹ given by the equations x₁² + x₂² + ⋯ + xₖ² − xₖ₊₁² − ⋯ − xₙ₊₁² = 1 (1 ≤ k ≤ n+1). The cases k = 1 and k = n+1 give, respectively, a pair of hyperboloids and the ordinary n-sphere. In the first chapter we consider the pseudospheres as surfaces in E^{n+1,k}, where E^{m,k} = ℝᵏ × (iℝ)^{m−k}, and investigate their geometry in terms of the linear algebra of these spaces. The main objects of investigation are finite sequences of hyperplanes in a pseudosphere. To each such sequence we associate a square symmetric matrix, the Gram matrix, which gives information about angle and incidence properties of the hyperplanes. We find when a given matrix is the Gram matrix of some sequence of hyperplanes, and when a sequence is determined up to isometry by its Gram matrix. We also consider subspaces of pseudospheres and projections onto them. This leads to an n-dimensional cosine rule for spherical and hyperbolic simplices. In the second chapter we derive integral formulae for the volume of an n-dimensional spherical or hyperbolic simplex, both in terms of its dihedral angles and its edge lengths. For the regular simplex with common edge length γ we then derive power series for the volume, both in u = sin γ/2 and in γ itself, and discuss some of the properties of the coefficients. In obtaining these series we encounter an interesting family of entire functions, Rₙ(p) (n a nonnegative integer and p ∈ ℂ). We derive a functional equation relating Rₙ(p) and Rₙ₋₁(p). Finally we classify, up to isometry, all tetrahedra with one or more vertices truncated, for which the dihedral angles along the edges formed by the truncations are all π/2, and the remaining dihedral angles are all sub-multiples of π. We show how to find the volumes of these polyhedra, and find presentations and small generating sets for the orientation-preserving subgroups of their reflection groups. For particular families of these groups, we find low-index torsion-free subgroups, and construct associated manifolds and manifolds with boundary. In particular, we find a sequence of manifolds with totally geodesic boundary of genus g ≥ 2, which we conjecture to be of least volume among such manifolds.
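The Gram-matrix idea in the first chapter can be illustrated in the hyperbolic case (a sketch of my own, restricted to the hyperboloid model and not code from the thesis): hyperplanes correspond to spacelike unit normals, and the matrix of their Lorentzian inner products records dihedral angles (entries −cos θ for intersecting hyperplanes, with a suitable choice of orientations; entries of absolute value ≥ 1 indicate parallel or disjoint hyperplanes).

```python
import numpy as np

def lorentz_dot(u, v):
    """Inner product of signature (n, 1): spacelike coordinates first, time last."""
    return u[:-1] @ v[:-1] - u[-1] * v[-1]

def gram_matrix(normals):
    """Gram matrix of a sequence of spacelike unit hyperplane normals."""
    k = len(normals)
    G = np.empty((k, k))
    for i in range(k):
        for j in range(k):
            G[i, j] = lorentz_dot(normals[i], normals[j])
    return G

# Two geodesics in the hyperboloid model of H^2, chosen so that they meet at
# the point (0, 0, 1) with angle pi/4; the off-diagonal Gram entry is -cos(pi/4).
n1 = np.array([1.0, 0.0, 0.0])
n2 = np.array([-np.cos(np.pi / 4), np.sin(np.pi / 4), 0.0])
print(gram_matrix([n1, n2]))
```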
