1

Analysis of magnetoencephalographic data as a nonlinear dynamical system

Woon, Wei Lee January 2002 (has links)
This thesis presents the results of an investigation into the merits of analysing magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG encompasses both the methods for measuring minute magnetic flux variations at the scalp, resulting from neuro-electric activity in the neocortex, and the techniques required to process and extract useful information from these measurements. As a result of its unique mode of action, directly measuring neuronal activity via the resulting magnetic field fluctuations, MEG possesses a number of useful qualities which could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, hindered in its progress by a variety of factors. Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands, the so-called alpha, beta, delta and related bands commonly referred to in both academic and lay publications. Other efforts have centred upon generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which produce the observed time series are linear, despite a variety of reasons to suspect that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions. A crucial concept underpinning this project is the belief that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract cognitive brain state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way: as a linear sum of a large number of frequency generators.
One of the main objectives of this thesis is to show that a much more effective and powerful analysis of MEG can be achieved if one assumes the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in MEG recordings. Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratio that is obtained. As the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are necessarily extremely sensitive. The unfortunate side effect of this is that even commonplace phenomena such as the Earth's geomagnetic field can easily swamp signals of interest. This problem is commonly addressed by averaging over a large number of recordings. However, this has a number of notable drawbacks. In particular, it is difficult to synchronise high-frequency activity which might be of interest, and often these signals will be cancelled out by the averaging process. Other problems that have been encountered are the high cost and low portability of state-of-the-art multichannel machines. As a result, the use of MEG has hitherto been restricted to large institutions able to afford the high costs associated with the procurement and maintenance of these machines. In this project, we seek to address these issues by working almost exclusively with single-channel, unaveraged MEG data. We demonstrate the applicability of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks to the analysis of MEG data.
It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in a variety of research areas, from financial time series modelling to the analysis of sunspot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
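The state-space view described above, in which the recorded signal is treated as a scalar observation of an unobservable underlying state, is conventionally operationalised by time-delay embedding. The sketch below is a generic illustration of that standard technique, not code from the thesis; the choice of embedding dimension and lag is purely illustrative.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Embed a scalar time series x into `dim`-dimensional delay vectors
    with lag `tau` (Takens-style state-space reconstruction)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("series too short for this (dim, tau)")
    # Row i is [x[i], x[i+tau], ..., x[i+(dim-1)*tau]]
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Example: a noisy sine stands in for a single-channel recording
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
emb = delay_embed(series, dim=3, tau=5)
print(emb.shape)  # (1990, 3)
```

Each row of the result is one reconstructed state vector; nonlinear measures (correlation dimension, Lyapunov exponents) are then computed on this point cloud rather than on the raw series.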
2

Pragmatic algorithms for implementing geostatistics with large datasets

Ingram, Benjamin R. January 2008 (has links)
With the ability to collect and store increasingly large datasets on modern computers comes the need to process the data in a way that is useful to a geostatistician or application scientist. Although the storage requirements scale only linearly with the number of observations in the dataset, the computational complexity of likelihood-based geostatistics scales quadratically in memory and cubically in runtime. Various methods have been proposed, and are extensively used, in an attempt to overcome these complexity issues. This thesis introduces a number of principled techniques for treating large datasets, with an emphasis on three main areas: reduced-complexity covariance matrices, sparsity in the covariance matrix, and parallel algorithms for distributed computation. These techniques are presented individually, but it is also shown how they can be combined to further improve computational efficiency.
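The quadratic and cubic scalings mentioned above come from building and factorising the n-by-n covariance matrix in the Gaussian-process log likelihood. The sketch below illustrates where those costs arise; the squared-exponential covariance and all hyperparameter values are assumptions for illustration, not the models studied in the thesis.

```python
import numpy as np

def gp_log_likelihood(X, y, lengthscale=1.0, variance=1.0, noise=0.1):
    """Gaussian-process log marginal likelihood.
    Building K costs O(n^2) memory; the Cholesky factorisation costs
    O(n^3) time, which is the bottleneck for large n."""
    n = len(y)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # n x n squared distances
    K = variance * np.exp(-0.5 * d2 / lengthscale**2) + noise * np.eye(n)
    L = np.linalg.cholesky(K)                              # O(n^3)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))    # K^{-1} y via triangular solves
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()
            - 0.5 * n * np.log(2 * np.pi))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
y = rng.normal(size=50)
print(gp_log_likelihood(X, y))
```

Low-rank, sparse and distributed approximations of K, the three directions the abstract names, all aim to avoid forming or factorising this dense matrix directly.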
3

Non-linear hierarchical visualisation

Sun, Yi January 2002 (has links)
This thesis applies a hierarchical latent trait model system to a large quantity of data. The motivation for it was the lack of viable approaches for analysing high-throughput screening datasets, which may include thousands of high-dimensional data points. We believe that a latent trait model (LTM) with a non-linear mapping from the latent space to the data space is a preferred choice for visualising a complex high-dimensional data set. As a type of latent variable model, the latent trait model can deal with either continuous or discrete data, which makes it particularly useful in this domain. In addition, with the aid of differential geometry, we can visualise the distribution of the data through magnification factor and curvature plots. Rather than obtaining all the useful information from a single plot, a hierarchical LTM arranges a set of LTMs and their corresponding plots in a tree structure. We model the whole data set with an LTM at the top level, which is broken down into clusters at deeper levels of the hierarchy. In this manner, refined visualisation plots can be displayed at deeper levels and sub-clusters may be found. The hierarchy of LTMs is trained using the expectation-maximisation (EM) algorithm to maximise its likelihood with respect to the data sample. Training proceeds interactively in a recursive, top-down fashion: the user subjectively identifies interesting regions on the visualisation plot that they would like to model in greater detail. At each stage of hierarchical LTM construction, the EM algorithm alternates between the E-step and the M-step. Another problem that can occur when visualising a large data set is that there may be significant overlap between data clusters, making it very difficult for the user to judge where the centres of regions of interest should be placed. We address this problem by employing the minimum message length technique, which can help the user to decide the optimal structure of the model.
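The E-step/M-step alternation the abstract refers to can be illustrated on a much simpler model than a latent trait hierarchy. The sketch below runs EM on a flat one-dimensional Gaussian mixture; it is a generic illustration of the algorithm's structure, not the thesis's model, and the initialisation scheme is an assumption chosen for simplicity.

```python
import numpy as np

def em_gaussian_mixture(x, k, iters=50):
    """Minimal EM for a 1-D Gaussian mixture: the E-step computes
    responsibilities, the M-step re-estimates weights, means, variances."""
    n = len(x)
    w = np.full(k, 1.0 / k)
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)  # spread initial means over the data
    var = np.full(k, x.var())
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: maximise the expected complete-data log likelihood
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3, 0.5, 300), rng.normal(3, 0.5, 300)])
w, mu, var = em_gaussian_mixture(x, k=2)
print(np.sort(mu))
```

In the hierarchical setting the same alternation runs at each level of the tree, with deeper models fitted to the responsibilities inherited from their parent.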
4

Modelling nonlinear stochastic dynamics in financial time series

Lesch, Ragnar H. January 2000 (has links)
For analysing financial time series, two main opposing viewpoints exist: either capital markets are completely stochastic, and therefore prices follow a random walk, or they are deterministic and consequently predictable. For each of these views a great variety of tools exists with which one can attempt to confirm the corresponding hypothesis. Unfortunately, these methods are not well suited to data characterised in part by both paradigms. This thesis investigates both approaches in order to model the behaviour of financial time series. In the deterministic framework, methods are used to characterise the dimensionality of embedded financial data. The stochastic approach includes an estimation of the unconditional and conditional return distributions using parametric, non-parametric and semi-parametric density estimation techniques. Finally, it is shown how elements from these two approaches can be combined to achieve a more realistic model for financial time series.
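Of the density estimation techniques mentioned above, the non-parametric route is easy to illustrate. The sketch below fits a Gaussian kernel density estimate to log returns of a synthetic price series; the data, the Silverman bandwidth rule and all constants are illustrative assumptions, not the thesis's estimators or datasets.

```python
import numpy as np

def gaussian_kde(samples, h=None):
    """Return a Gaussian kernel density estimate as a callable.
    Bandwidth defaults to Silverman's rule of thumb."""
    samples = np.asarray(samples, dtype=float)
    if h is None:
        h = 1.06 * samples.std() * len(samples) ** (-1 / 5)
    def pdf(x):
        x = np.atleast_1d(x)
        z = (x[:, None] - samples[None, :]) / h
        return np.exp(-0.5 * z**2).sum(axis=1) / (len(samples) * h * np.sqrt(2 * np.pi))
    return pdf

# Synthetic 'prices' (illustrative only), converted to log returns
rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 1000)))
returns = np.diff(np.log(prices))
pdf = gaussian_kde(returns)
print(pdf(0.0))  # estimated density near zero return
```

On real returns, a plot of this estimate against a fitted Gaussian typically exposes the heavy tails that motivate semi-parametric refinements.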
5

Investigating viscous fluid flow in an internal mixer using computational fluid dynamics

Harries, Alun M. January 2000 (has links)
This thesis presents an effective methodology for generating simulations that can be used to increase understanding of viscous fluid processing equipment and to aid in its development, design and optimisation. The Hampden RAPRA Torque Rheometer internal batch twin-rotor mixer has been simulated with a view to establishing model accuracies, limitations, practicalities and uses. As this research progressed, via several 'snap-shot' analyses of rotor configurations using the commercial code Polyflow, it became evident that the model was of some worth and that its predictions were in good agreement with the validation experiments; however, several major restrictions were identified. These included poor element form, high man-hour requirements for the construction of each geometry, and the absence of the transient term in these models. All, or at least some, of these limitations apply to the numerous attempts by other researchers to model internal mixers, and it was clear that there was no generally accepted methodology providing a practical three-dimensional model that had been adequately validated. This research, unlike others, presents a fully complex three-dimensional, transient, non-isothermal, generalised non-Newtonian simulation with wall slip, which overcomes these limitations using unmatched gridding and sliding-mesh technology adapted from CFX codes. This method yields good element form and, since only one geometry has to be constructed to represent the entire rotor cycle, is extremely beneficial for detailed flow field analysis when used in conjunction with user-defined programmes and automatic geometry parameterisation (AGP), and improves accuracy for investigating equipment design and operating conditions.
Model validation has been identified as an area which has been neglected by other researchers in this field, especially for time dependent geometries, and has been rigorously pursued in terms of qualitative and quantitative velocity vector analysis of the isothermal, full fill mixing of generalised non-Newtonian fluids, as well as torque comparison, with a relatively high degree of success. This indicates that CFD models of this type can be accurate and perhaps have not been validated to this extent previously because of the inherent difficulties arising from most real processes.
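A "generalised non-Newtonian" simulation of the kind described replaces a constant viscosity with a shear-rate-dependent constitutive law. The sketch below evaluates the Carreau model, one common choice for shear-thinning polymer melts; the abstract does not state which law or parameters the thesis used, so both are assumptions for illustration only.

```python
import numpy as np

def carreau_viscosity(shear_rate, eta0=1.0e4, eta_inf=1.0, lam=10.0, n=0.4):
    """Carreau model: shear-thinning viscosity interpolating between the
    zero-shear plateau eta0 and the infinite-shear limit eta_inf.
    All parameter values here are illustrative, not fitted to any material."""
    g = np.asarray(shear_rate, dtype=float)
    return eta_inf + (eta0 - eta_inf) * (1.0 + (lam * g) ** 2) ** ((n - 1) / 2)

# Viscosity across six decades of shear rate
rates = np.logspace(-3, 3, 7)
print(carreau_viscosity(rates))
```

In a CFD code this function is evaluated per element per time step from the local shear rate, which is one reason transient non-Newtonian runs are so much costlier than Newtonian ones.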
6

Digital image watermarking

Bounkong, Stephane January 2004 (has links)
In recent years, interest in digital watermarking has grown significantly. Indeed, the use of digital watermarking techniques is seen as a promising means to protect the intellectual property rights of digital data and to ensure its authentication. Thus, a significant research effort has been devoted to the study of practical watermarking systems, in particular for digital images. In this thesis, a practical and principled approach to the problem is adopted. Several aspects of practical watermarking schemes are investigated. First, a power-constraint formulation of the problem is presented. Then, a new analysis of quantisation effects on the information rate of digital watermarking schemes is proposed and compared to other approaches suggested in the literature. Subsequently, a new information embedding technique, based on quantisation, is put forward and its performance evaluated. Finally, the influence of image data representation on the performance of practical schemes is studied, along with a new representation based on independent component analysis.
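Quantisation-based embedding, the family the abstract's new technique belongs to, is usually introduced via quantisation index modulation (QIM). The sketch below is a textbook-style scalar QIM, not the scheme proposed in the thesis; the step size and the Gaussian host signal are illustrative assumptions.

```python
import numpy as np

def qim_embed(x, bits, delta=1.0):
    """Quantisation index modulation: embed one bit per sample by snapping
    each sample to one of two interleaved scalar quantisers (step `delta`,
    offset `delta/2`). A textbook-style sketch, not the thesis's exact scheme."""
    x = np.asarray(x, dtype=float)
    b = np.asarray(bits)
    offset = b * delta / 2.0
    return np.round((x - offset) / delta) * delta + offset

def qim_decode(y, delta=1.0):
    """Recover bits by testing which of the two quantisers each sample is nearest to."""
    y = np.asarray(y, dtype=float)
    d0 = np.abs(y - np.round(y / delta) * delta)
    d1 = np.abs(y - (np.round((y - delta / 2) / delta) * delta + delta / 2))
    return (d1 < d0).astype(int)

rng = np.random.default_rng(0)
host = rng.normal(0, 5, 64)          # stand-in for image coefficients
bits = rng.integers(0, 2, 64)
marked = qim_embed(host, bits, delta=0.8)
print((qim_decode(marked, delta=0.8) == bits).all())  # True
```

Decoding stays exact as long as any perturbation of the marked samples is smaller than delta/4, which is precisely the power-versus-robustness trade-off a power-constraint formulation makes explicit.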
7

Exploiting uncertainty in nonlinear stochastic control problem

Herzallah, Randa January 2003 (has links)
This work introduces a novel inversion-based neurocontroller for solving control problems involving uncertain nonlinear systems, which can also compensate for multi-valued systems. The approach uses recent developments in neural networks, especially in the context of modelling statistical distributions, which are applied to forward and inverse plant models. Provided that certain conditions are met, an estimate of the intrinsic uncertainty of a neural network's outputs can be obtained from the statistical properties of the network. More generally, multi-component distributions can be modelled by the mixture density network. Based on importance sampling from these distributions, a novel robust inverse control approach is obtained. This importance sampling provides a structured and principled way to constrain the complexity of the search space for the ideal control law. The developed methodology circumvents the dynamic programming problem by using the predicted neural network uncertainty to localise the possible control solutions to consider. Convergence of the output error for the proposed control method is verified using a Lyapunov function. Several simulation examples are provided to demonstrate the efficiency of the developed control method. The manner in which such a method extends to nonlinear multi-variable systems with different delays between the input-output pairs is considered and demonstrated through simulation examples.
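The idea of sampling candidate controls from a predicted distribution rather than searching the whole control space can be shown on a toy multi-valued plant. In the sketch below, a hand-specified Gaussian mixture stands in for the distribution a trained mixture density network would output; the plant, the target, and all mixture parameters are hypothetical illustrations, not the thesis's models.

```python
import numpy as np

def sample_mixture(weights, means, stds, n, rng):
    """Draw n samples from a 1-D Gaussian mixture (stand-in for the
    distribution an MDN would place over candidate control actions)."""
    comp = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[comp], stds[comp])

# Toy multi-valued plant: y = u**2, so target y* = 4 has two valid controls, +/-2.
forward = lambda u: u ** 2
target = 4.0

rng = np.random.default_rng(0)
# Hypothetical MDN output: two components near the two inverse solutions
weights = np.array([0.5, 0.5])
means = np.array([-2.1, 1.9])
stds = np.array([0.3, 0.3])

candidates = sample_mixture(weights, means, stds, 200, rng)
best = candidates[np.argmin((forward(candidates) - target) ** 2)]
print(best)  # near one of the two valid controls, +2 or -2
```

Because the samples are concentrated around the plausible inverse solutions, only a small candidate set needs to be evaluated against the forward model, which is the sense in which the predicted uncertainty localises the search.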
8

A communicative model for stakeholder consultation: towards a framework for action inquiry in Tourism IT

Alford, Philip January 2007 (has links)
This thesis focuses on an under-researched area of tourism: the multi-stakeholder, inter-organisational, business-to-business Tourism IT domain, which exhibits a marked rate of failure. A critical review of B2B case studies reveals that this failure is in large part due to the primacy afforded to technical problem-solving approaches over human-centred ones. The main purpose of the research is therefore stated as: "how do we ensure that, as technological solutions are implemented within this domain, due consideration is given to human-centred issues?" In order to tackle this research problem, an interdisciplinary approach is taken and a communicative model for stakeholder consultation is developed. At the centre of the model lies an innovative method for deconstructing and reconstructing stakeholder discourse. A Co-operative Inquiry research methodology was used, and a significant number of stakeholders were engaged in an Open Space event sponsored by two major Tourism IT companies who wanted to investigate the issues and opportunities connected with travel distribution and technology. This was followed up with face-to-face interviews and live discussions over the internet. In addition, stakeholder discourse was captured via the Travelmole tourism discussion site. The discourse between stakeholders was reconstructed and the normative and objective claims analysed in depth. The presentation of these reconstructions in textual, tabular and diagrammatic formats captures the complexity of stakeholder interactions, revealing that although IT is an important tool, what really lies at the core of multi-stakeholder projects are the normative positions to which participants subscribe. The model provided a practical means for critiquing stakeholder discourse: helping to identify stakeholders both involved in and affected by the issue; juxtaposing the 'is' against the 'ought'; and enabling critical reflection on the coercive use of power.
The review of the tourism literature revealed that these issues are as important in general B2B tourism partnerships as in Tourism IT, and in this respect the model provides a practical tool for critique and for enabling the formation of a shared normative infrastructure on which multi-stakeholder projects can proceed. In addition, while borrowing from Management Science, this thesis also makes a contribution to it, specifically in the area of boundary critique, through the way in which Habermas' ideal speech criteria are practically implemented.
9

An exploration of the chasm in the protection of classified information in South African government departments

Mahlatsi, Lehlohonolo Wonderboy 08 1900 (has links)
The chasm in the protection of classified information in South African government departments indicates that all the departments have at their disposal information that is to some extent sensitive in nature and obviously requires security measures. This study shows that government officials who in their official duties come into contact with classified information are either vulnerable or are implementing the security controls incorrectly. It is also clear that, in the absence of a comprehensive statutory framework, the handling of government departments' classified information has resulted in an unstable and inconsistent classification and declassification environment. The statutory framework would, among other things, address the rising threat of espionage and antagonistic activities, the selling of information and the protection of critical records in government, without hindering the constitutional right of citizens to access information. This would create a system of valuable information and clarify which information requires security measures with respect to the protection of classified information. / Criminology and Security Science / M. Tech. (Forensic Investigation)
10

Comparative study of the feasibility of information systems handling classified information up to the CONFIDENTIAL level, in the information-technological and economic areas.

HULIČOVÁ, Hana January 2015 (has links)
This thesis deals with the design of a comparative study of the feasibility of an information system handling classified information up to the Confidential level in the information-technological and economic areas (i.e. economic and financial analysis).
