411. Bayesian inference on astrophysical binary inspirals based on gravitational-wave measurements. Röver, Christian (January 2007)
Gravitational waves are predicted by general relativity theory. Their existence could be confirmed by astronomical observations, but to date they have not been measured directly. A measurement would not only confirm general relativity but also allow for interesting astronomical observations. Great effort is currently being expended to facilitate gravitational radiation measurement, most notably through earth-bound interferometers (such as LIGO and Virgo), and the planned space-based LISA interferometer. Earth-bound interferometers have recently taken up operation, so that a detection might be made at any time, while the space-borne LISA interferometer is scheduled to be launched within the next decade.

Among the most promising signals for a detection are the waves emitted by the inspiral of a binary system of stars or black holes. The observable gravitational-wave signature of such an event is determined by properties of the inspiralling system, which may in turn be inferred from the observed data. A Bayesian inference framework for the estimation of parameters of binary inspiral events as measured by ground- and space-based interferometers is described here. Furthermore, appropriate computational methods are developed that are necessary for its application in practice. Starting with a simplified model considering only 5 parameters and data from a single earth-bound interferometer, the model is subsequently refined by extending it to 9 parameters, measurements from several interferometers, and more accurate signal waveform approximations. A realistic joint prior density for the 9 parameters is set up. For the LISA application the model is generalised so that the noise spectrum is treated as unknown as well and can be inferred along with the signal parameters.

Inference through the posterior distribution is facilitated by the implementation of Markov chain Monte Carlo (MCMC) methods. The posterior distribution exhibits many local modes, and there is only a small "attraction region" around the global mode(s), making it hard, if not impossible, for basic MCMC algorithms to find the relevant region in parameter space. This problem is solved by introducing a parallel tempering algorithm. Closer investigation of its internal functionality yields some insight into a proper setup of this algorithm, which in turn also enables the efficient implementation for the LISA problem with its vastly enlarged parameter space. Parallel programming was used to implement this computationally expensive MCMC algorithm, so that the code can be run efficiently on a computer cluster. In this thesis, a Bayesian approach to gravitational-wave astronomy is shown to be feasible and promising.
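The parallel tempering scheme is the computational core of this abstract. The sketch below is not the thesis's sampler (which works on the full inspiral parameter space); it is a minimal Python illustration of the mechanism on a generic log-posterior, with all names, the temperature ladder, and the toy bimodal target chosen for illustration.

```python
import numpy as np

def parallel_tempering(log_post, x0, temps, n_iter=5000, step=0.1, rng=None):
    """Random-walk Metropolis chains at temps[0]=1 < temps[1] < ...
    Chain i targets log_post(x) / temps[i]; occasional swaps between
    adjacent chains let states found by the hot, free-roaming chains
    migrate down to the cold chain, which samples the actual posterior."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(temps)
    xs = [np.asarray(x0, dtype=float).copy() for _ in range(n)]
    lps = [log_post(x) for x in xs]
    cold = []
    for _ in range(n_iter):
        for i in range(n):  # within-chain Metropolis update
            prop = xs[i] + step * np.sqrt(temps[i]) * rng.standard_normal(xs[i].shape)
            lp = log_post(prop)
            if np.log(rng.uniform()) < (lp - lps[i]) / temps[i]:
                xs[i], lps[i] = prop, lp
        j = int(rng.integers(n - 1))  # propose swapping chains j and j+1
        log_alpha = (lps[j] - lps[j + 1]) * (1.0 / temps[j + 1] - 1.0 / temps[j])
        if np.log(rng.uniform()) < log_alpha:
            xs[j], xs[j + 1] = xs[j + 1], xs[j]
            lps[j], lps[j + 1] = lps[j + 1], lps[j]
        cold.append(xs[0].copy())
    return np.array(cold)

# Toy bimodal target: a basic MCMC chain tends to get stuck in one mode,
# while the tempered ladder visits both.
log_post = lambda x: np.logaddexp(-0.5 * np.sum((x - 4) ** 2),
                                  -0.5 * np.sum((x + 4) ** 2))
samples = parallel_tempering(log_post, x0=np.zeros(2), temps=[1, 4, 16, 64])
```

The swap acceptance ratio is the standard one for exchanging states between two tempered chains; only adjacent temperatures are swapped, since exchanges between distant temperatures are rarely accepted.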
412. Critical Sets in Latin Squares and Associated Structures. Bean, Richard Winston (date unknown)
A critical set in a Latin square of order n is a set of entries in an n×n array which can be embedded in precisely one Latin square of order n, with the property that if any entry of the critical set is deleted, the remaining set can be embedded in more than one Latin square of order n. The number of critical sets grows super-exponentially as the order of the Latin square increases. It is difficult to find patterns in Latin squares of small order (order 5 or less) which can be generalised in the process of creating new theorems. Thus, I have written many algorithms to find critical sets with various properties in Latin squares of order greater than 5, and to deal with other related structures. Some algorithms used in the body of the thesis are presented in Chapter 3; results which arise from the computational studies, and observations of the patterns and subsequent results, are presented in Chapters 4, 5, 6, 7 and 8.

The cardinality of the largest critical set in any Latin square of order n is denoted by lcs(n). In 1978 Curran and van Rees proved that lcs(n) <= n² - n. In Chapter 4, it is shown that lcs(n) <= n² - 3n + 3. Chapter 5 provides new bounds on the maximum number of intercalates in Latin squares of orders m×2^α (m odd, α>=2) and m×2^α+1 (m odd, α>=2 and α≠3), and a new lower bound on lcs(4m). It also discusses critical sets in intercalate-rich Latin squares of orders 11 and 14. In Chapter 6 a construction is given which verifies the existence of a critical set of size n²/4 + 1 when n is even and n>=6. The construction is based on the discovery of a critical set of size 17 for a Latin square of order 8. In Chapter 7 the representation of Steiner trades of volume less than or equal to nine is examined. Computational results are used to identify those trades for which the associated partial Latin square can be decomposed into six disjoint Latin interchanges. Chapter 8 focuses on critical sets in Latin squares of order at most six, where extensive computational routines are used to identify all the critical sets of different sizes in these Latin squares.
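The two-part definition in the opening sentence can be checked directly by counting completions, which is the kind of routine underlying the computational searches the abstract mentions. Below is a naive backtracking sketch in Python (illustrative, not the thesis's code; real searches at order greater than 5 need far stronger pruning).

```python
def count_completions(square, n, limit=2):
    """Count Latin-square completions of a partial n x n array
    (0 marks an empty cell, symbols are 1..n); stop once `limit` is hit."""
    for r in range(n):
        for c in range(n):
            if square[r][c] == 0:
                total = 0
                for s in range(1, n + 1):
                    if all(square[r][j] != s for j in range(n)) and \
                       all(square[i][c] != s for i in range(n)):
                        square[r][c] = s
                        total += count_completions(square, n, limit - total)
                        square[r][c] = 0
                        if total >= limit:
                            return total
                return total
    return 1  # no empty cell left: the square itself is the one completion

def is_critical_set(square, n):
    """Check both halves of the definition: the partial square embeds in
    exactly one Latin square, and deleting any entry breaks uniqueness."""
    if count_completions(square, n) != 1:
        return False
    for r in range(n):
        for c in range(n):
            if square[r][c] != 0:
                s, square[r][c] = square[r][c], 0
                unique = count_completions(square, n) == 1
                square[r][c] = s
                if unique:       # deletion must allow > 1 completion
                    return False
    return True

print(is_critical_set([[1, 0], [0, 0]], 2))  # True: smallest critical set for n = 2
```

The usage line shows the smallest case: for order 2, the single entry in the top-left corner forces the whole square, and removing it leaves both order-2 Latin squares possible.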
413. Regulated rewriting in formal language theory. Taha, Mohamed A. M. S (2008)
Thesis (MSc (Mathematical Sciences))--University of Stellenbosch, 2008.

Context-free grammars are well-studied and well-behaved in terms of decidability, but many real-world problems cannot be described with context-free grammars. Grammars with regulated rewriting are grammars with mechanisms to regulate the applications of rules, so that certain derivations are avoided. Thus, with context-free rules and regulated rewriting mechanisms, one can often generate languages that are not context-free.

In this thesis we study grammars with regulated rewriting mechanisms. We consider problems in which context-free grammars are insufficient and in which more descriptive grammars are required. We compare bag context grammars with other well-known classes of grammars with regulated rewriting mechanisms. We also discuss the relation between bag context grammars and recognizing devices such as counter automata and Petri net automata. We show that regular bag context grammars can generate any recursively enumerable language. We reformulate the pumping lemma for random permitting context languages with context-free rules, as introduced by Ewert and Van der Walt, by using the concept of a string homomorphism. We conclude the thesis with decidability and complexity properties of grammars with regulated rewriting.
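As a concrete taste of regulated rewriting, the sketch below uses a matrix grammar, one of the classic regulated devices that bag context grammars are compared against, rather than a bag context grammar itself. Every individual rule is context-free, but coupling the rules into matrices that must be applied as a whole yields the non-context-free language {a^n b^n c^n : n >= 1}. The Python code and names are illustrative.

```python
from collections import deque

def apply_rule(form, lhs, rhs):
    """Rewrite the first occurrence of nonterminal `lhs`, or fail."""
    i = form.find(lhs)
    return None if i < 0 else form[:i] + rhs + form[i + 1:]

def apply_matrix(form, matrix):
    """A matrix is applied as a whole: every rule in it, in order."""
    for lhs, rhs in matrix:
        form = apply_rule(form, lhs, rhs)
        if form is None:
            return None
    return form

def language(max_len=12):
    """Matrix grammar: S -> ABC, then m1 = (A->aA, B->bB, C->cC) and
    m2 = (A->a, B->b, C->c).  Applying each matrix in full keeps the
    three letter counts equal, so only a^n b^n c^n strings survive."""
    m1 = [("A", "aA"), ("B", "bB"), ("C", "cC")]
    m2 = [("A", "a"), ("B", "b"), ("C", "c")]
    words, queue = set(), deque(["ABC"])      # S -> ABC applied once
    while queue:
        form = queue.popleft()
        if form.islower():                    # terminal string derived
            words.add(form)
            continue
        for m in (m1, m2):
            nxt = apply_matrix(form, m)
            if nxt is not None and len(nxt) <= max_len:
                queue.append(nxt)
    return sorted(words, key=len)

print(language())  # ['abc', 'aabbcc', 'aaabbbccc', 'aaaabbbbcccc']
```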
414. Formal specification and verification of safety interlock systems: A comparative case study. Seotsanyana, Motlatsi (2007)
Thesis (MSc (Mathematical Sciences))--University of Stellenbosch, 2007.

The ever-increasing reliance of society on computer systems has led to a need for highly reliable systems. There are a number of areas where computer systems perform critical functions, and the development of such systems requires a higher level of attention than any other type of system. The appropriate approach in this situation is known as formal methods: the use of mathematical techniques for the specification, development and verification of software and hardware systems. The two main goals of this thesis are:

1. The design of mathematical models as a basis for the implementation of error-free software for the safety interlock system at iThemba LABS (http://www.tlabs.ac.za/).

2. The comparison of formal-method techniques, addressing the lack of much-needed empirical studies in the field of formal methods.

Mathematical models are developed using the model checkers Spin, Uppaal and SMV, and the theorem prover PVS. The criteria used for the selection of the tools were the popularity of the tools, the support available for them, the representation of properties, the representativeness of the verification techniques, and ease of use.

The procedure for comparing these methods is divided into two phases. Phase one involves the time logging of activities followed by a novice modeler to model check and theorem prove software systems. The results show that it takes more time to learn and use a theorem prover than a model checker. Phase two compares the performance of the tools in terms of the time taken to verify a property, the memory used, and the number of states and transitions generated. In spite of the differences between models, the results are in favor of SMV, and this may be attributed to the nature of the safety interlock system, as it involves a lot of hard-wired lines.
415. Reducing communication in distributed model checking. Fourie, Jean Francois (2009)
Thesis (MSc (Mathematical Sciences. Computer Science))--University of Stellenbosch, 2009.

Model checkers are programs that automatically verify, without human assistance, that certain user-specified properties hold in concurrent software systems. Since these programs often have expensive time and memory requirements, an active area of research is the development of distributed model checkers that run on clusters. Of particular interest is how the communication between the machines can be reduced to speed up their running time.

In this thesis the design decisions involved in an on-the-fly distributed model checker are identified and discussed. Furthermore, the implementation of such a program is described. The central idea behind the algorithm is the generation and distribution of data throughout the nodes of the cluster. We introduce several techniques to reduce the communication among the nodes, and study their effectiveness by means of a set of models.
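The abstract does not name the thesis's specific techniques, so the sketch below illustrates the general setting rather than its algorithm: reachable states are partitioned over cluster nodes by a hash function, successors owned by another node become messages, and a per-node cache of already-sent states suppresses duplicate messages, one of the simplest ways to cut communication. Everything runs in a single Python process for illustration.

```python
from collections import deque

def distributed_reachability(init, successors, n_nodes):
    """Single-process simulation of partitioned state-space exploration."""
    owner = lambda s: hash(s) % n_nodes        # which node owns a state
    visited = [set() for _ in range(n_nodes)]
    inbox = [deque() for _ in range(n_nodes)]
    sent = [set() for _ in range(n_nodes)]     # duplicate-message filter
    messages = 0
    inbox[owner(init)].append(init)
    while any(inbox):
        for me in range(n_nodes):
            while inbox[me]:
                s = inbox[me].popleft()
                if s in visited[me]:
                    continue
                visited[me].add(s)
                for t in successors(s):
                    if owner(t) == me:
                        inbox[me].append(t)    # stays local, no traffic
                    elif t not in sent[me]:    # would be a network message
                        sent[me].add(t)
                        inbox[owner(t)].append(t)
                        messages += 1
    return sum(len(v) for v in visited), messages

# Toy system: two counters incremented modulo 8 -> 64 reachable states.
succ = lambda s: [((s[0] + 1) % 8, s[1]), (s[0], (s[1] + 1) % 8)]
print(distributed_reachability((0, 0), succ, n_nodes=4))
```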
416. Background subtraction algorithms for a video based system. Profitt, Barton (2009)
Thesis (MScEng (Mathematical Sciences))--University of Stellenbosch, 2009.

Reliably classifying parts of an image sequence as foreground or background is an important part of many computer vision systems, such as video surveillance, tracking and robotics. It can also be important in applications where bandwidth is the limiting factor, such as video conferencing.

Independent foreground motion is an attractive source of information for this task, and with static cameras, background subtraction is a particularly popular type of approach. The idea behind background subtraction is to compare the current image with a reference image of the background, and from there decide on a pixel-by-pixel basis what is foreground and what is background by observing the changes in the pixel sequence. The problem is to obtain a useful reference image, especially when large parts of the background are occluded by moving or stationary foreground objects, i.e. some parts of the background are never seen.

In this thesis four algorithms are reviewed that segment an image sequence into foreground and background components with varying degrees of success, which can be measured in terms of speed, comparative accuracy and/or memory requirements. These measures can then be used effectively to decide the application scope of the individual algorithms.
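The four algorithms reviewed are not named in the abstract; the sketch below shows one of the simplest members of the family as a point of reference: a running-average background model with per-pixel thresholding. The Python code and parameter names (`alpha`, `thresh`) are illustrative, not the thesis's implementation.

```python
import numpy as np

def background_subtraction(frames, alpha=0.05, thresh=30):
    """Keep a running-average reference image of the background and flag,
    pixel by pixel, wherever the current frame differs from it by more
    than `thresh` grey levels.  Yields one boolean mask per frame."""
    frames = iter(frames)
    background = next(frames).astype(float)   # first frame seeds the model
    for frame in frames:
        f = frame.astype(float)
        mask = np.abs(f - background) > thresh   # True = foreground pixel
        # Adapt the reference only at background pixels, so stationary
        # foreground objects do not bleed into the background model.
        background[~mask] = (1 - alpha) * background[~mask] + alpha * f[~mask]
        yield mask
```

The trade-offs the thesis measures show up directly here: a larger `alpha` adapts faster to lighting changes but absorbs slow-moving objects, and the model costs one float image of memory per camera.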
417. A study of image compression techniques, with specific focus on weighted finite automata. Muller, Rikus (2005)
Thesis (MSc (Mathematical Sciences))--University of Stellenbosch, 2005.

Image compression using weighted finite automata (WFA) is studied and implemented in Matlab. Other more prominent image compression techniques, namely JPEG, vector quantization, EZW wavelet image compression and fractal image compression, are also presented. The performance of WFA image compression is then compared to that of some of the abovementioned techniques.
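The abstract does not describe the representation itself, so for orientation: in the standard WFA image model of Culik and Kari, an image is encoded by one matrix per quadrant, and the average intensity of the sub-square with quadrant address w1..wk is the product I · A[w1] ··· A[wk] · F of an initial vector, the address's matrices, and a final vector. A minimal decoding sketch in Python (illustrative, not the thesis's Matlab code), using the classic two-state WFA for a linear ramp:

```python
import numpy as np

def wfa_decode(initial, A, final, depth):
    """Render a 2^depth x 2^depth image from a weighted finite automaton:
    the grey value of the sub-square with quadrant address w1..wk is
    initial . A[w1] ... A[wk] . final (its average intensity)."""
    size = 2 ** depth
    img = np.zeros((size, size))

    def render(vec, x, y, d):
        if d == 0:
            img[y, x] = vec @ final
            return
        half = 2 ** (d - 1)
        # quadrants 0..3 with (dx, dy) offsets within the current square
        for q, (dx, dy) in enumerate([(0, 0), (1, 0), (0, 1), (1, 1)]):
            render(vec @ A[q], x + dx * half, y + dy * half, d - 1)

    render(np.asarray(initial, dtype=float), 0, 0, depth)
    return img

# Two states: state 0 is the ramp f(x, y) = (x + y) / 2, state 1 is the
# constant-1 image.  Restricting the ramp to a quadrant gives half the
# ramp plus a constant, which the four matrices encode.
A = [np.array([[0.5, (dx + dy) / 4.0],
               [0.0, 1.0]]) for dx, dy in [(0, 0), (1, 0), (0, 1), (1, 1)]]
final = np.array([0.5, 1.0])       # average intensity of each state's image
ramp = wfa_decode([1.0, 0.0], A, final, depth=6)   # smooth 64 x 64 gradient
```

Compression comes from the inference direction: instead of storing pixels, one searches for a small automaton whose decoded image approximates the original.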
418. Link failure recovery among dynamic routes in telecommunication networks. Stapelberg, Dieter (2009)
Thesis (MSc (Mathematical Sciences. Computer Science))--University of Stellenbosch, 2009.

Since 2002 data traffic has overtaken voice traffic in volume [1]. Telecom/network operators still generate most of their income carrying voice traffic. There is, however, a huge revenue potential in delivering reliable, guaranteed data services. Network survivability and recovery from network failures are integral to network reliability. Due to the nature of the Internet, recovery from link failures needs to be distributed and dynamic in order to be scalable.

Link failure recovery schemes are evaluated in terms of the survivability of the network, the optimal use of network resources, scalability, and the recovery time of such schemes. The need for improved recovery time is highlighted by real-time data traffic such as VoIP and video services carried over the Internet.

The goal of this thesis is to examine existing link failure recovery schemes and evaluate the need for their extension, and to evaluate the performance of the proposed link failure recovery schemes.
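The abstract states goals rather than mechanisms, but a precomputation common to many fast recovery schemes is easy to show: for each link on the primary route, find the best route that survives that link's failure, so traffic can be switched over immediately on failure detection. A Python sketch under that generic reading (not the thesis's proposed schemes):

```python
import heapq

def shortest_path(graph, src, dst, banned=frozenset()):
    """Dijkstra on {node: {neighbour: cost}}, skipping links in `banned`."""
    dist, prev, heap = {src: 0.0}, {}, [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:                          # reconstruct the path
            path = [dst]
            while path[-1] != src:
                path.append(prev[path[-1]])
            return path[::-1]
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u].items():
            if (u, v) in banned or (v, u) in banned:
                continue
            if d + w < dist.get(v, float("inf")):
                dist[v], prev[v] = d + w, u
                heapq.heappush(heap, (d + w, v))
    return None                               # dst unreachable

def backup_paths(graph, src, dst):
    """For every link on the primary path, precompute the recovery path
    to use if that single link fails (None where no recovery exists)."""
    primary = shortest_path(graph, src, dst)
    links = list(zip(primary, primary[1:]))
    return primary, {l: shortest_path(graph, src, dst, banned={l}) for l in links}
```

Precomputing the backup routes moves the expensive path search out of the failure-recovery window, which is exactly the recovery-time criterion the abstract highlights.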
419. Fast generation of digitally reconstructed radiographs for use in 2D-3D image registration. Carstens, Jacobus Everhardus (2008)
Thesis (MSc (Mathematical Sciences))--Stellenbosch University, 2008.

A novel implementation exploiting modern hardware is explored and found to be a significant improvement over current methods. A 50-times performance increase in the computation time of DRRs is achieved over the conventional ray casting approach, and image registration is performed in under a minute.
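The accelerated method itself is not described in the abstract, so for orientation here is the conventional baseline it is measured against: a digitally reconstructed radiograph (DRR) simulates an X-ray by line-integrating CT attenuation along rays and applying the Beer-Lambert law. The Python sketch below assumes the simplest parallel-beam geometry, where each ray is a straight run along one volume axis; a perspective X-ray source requires stepping along diverging rays with trilinear interpolation, which is what makes DRR generation expensive.

```python
import numpy as np

def drr_parallel_beam(volume, axis=0):
    """DRR for a parallel-beam geometry: line-integrate attenuation along
    one axis of the CT volume, then apply the Beer-Lambert law
    I = I0 * exp(-integral of mu ds), with I0 = 1."""
    line_integrals = volume.sum(axis=axis)   # each ray is one run of voxels
    return np.exp(-line_integrals)

# Toy usage: a random "attenuation volume" standing in for CT data.
volume = np.random.rand(64, 64, 64) * 0.01
image = drr_parallel_beam(volume, axis=0)    # 64 x 64 simulated radiograph
```

In 2D-3D registration this rendering sits in the inner optimisation loop (one DRR per candidate pose), so the reported 50-times speed-up translates directly into the sub-minute registration time.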
420. Learning to see in the Pietist Orphanage: geometry, philanthropy and the science of perfection, 1695-1730. Whitmer, Kelly Joan
This is a dissertation about the Halle method, or the visual pedagogies of the Pietist Orphanage as they were developed in the German university town of Halle from 1695 until 1730. A "Pietist" was someone who was affiliated with an evangelical reform movement first initiated by Philipp Jakob Spener in the 1670s. A long and deeply entrenched historiographical tradition has portrayed the Halle proponents of this movement, especially their leader August Hermann Francke, as zealous, yet practical, Lutheran reformers who were forced to directly confront the ideals of early Enlightenment in conjunction with the state-building mandate of Brandenburg-Prussia. This has led to a persistent tendency to see Halle Pietists as "others" who cultivated their collective identity in opposition to so-called Enlightenment intellectuals, like Christian Wolff, at the same time as they exerted a marked influence on these same persons. As a result of this dichotomous portrayal over the years, the impact of the Halle method on educational reform, and on the meanings eighteenth-century Europeans attached to philanthropy more generally, has been misunderstood.

I argue that the Pietist Orphanage holds the key to remedying several problems that have impeded our ability to understand the significance of Pietist pedagogy and philanthropy. This was a site specifically designed to introduce children to the conciliatory knowledge-making strategies of the first Berlin Academy of Science members and their associates. These strategies championed the status of the heart as an assimilatory juncture point and were refined in the schools of the Pietist Orphanage, which itself functioned as a visual showplace that viewers could observe in order to edify and improve themselves. It was the material expression of Halle Pietists' commitment to a "third way" and marked their attempt to assimilate experience and cognition, theology and philosophy, absolutism and voluntarism.

The dissertation examines several personalities who had a direct bearing on this conciliatory project: namely E. W. von Tschirnhaus, Johann Christoph Sturm, Leonhard Christoph Sturm, Gottfried Wilhelm Leibniz and Christian Wolff. It also examines how the method was applied in the Halle Orphanage schools and extended elsewhere.