1 |
Residential management strategies in formal and informal settlements: a case study in Trinidad and Tobago. Brown, David Frederick, January 1997.
No description available.
|
2 |
Mis-specification tests for neural regression models: applications in business and finance. Holt, William Travis, January 1999.
No description available.
|
3 |
Optimal and adaptive radial basis function neural networks. Shahsavand, Akbar, January 2000.
The optimisation and adaptation of single hidden layer feed-forward neural networks employing radial basis activation functions (RBFNs) was investigated. Previous work on RBFNs has mainly focused on problems with large data sets. The training algorithms developed with large data sets prove unreliable for problems with a small number of observations, a situation frequently encountered in process engineering. The primary objective of this study was the development of efficient and reliable learning algorithms for the training of RBFNs with small and noisy data sets. It was demonstrated that regularisation is essential in order to filter out the noise and prevent over-fitting. The selection of the appropriate level of regularisation, lambda*, with small data sets presents a major challenge. The leave-one-out cross validation technique was considered as a potential means for automatic selection of lambda*. The computational burden of selecting lambda* was significantly reduced by a novel application of the generalised singular value decomposition. The exact solution of the multivariate linear regularisation problem can be represented as a single hidden layer neural network, the Regularisation Network, with one neurone for each distinct exemplar. A new formula was developed for automatic selection of the regularisation level for a Regularisation Network with given non-linearities. It was shown that the performance of a Regularisation Network is critically dependent on the non-linear parameters of the activation function employed; a point which has received surprisingly little attention. It was demonstrated that a measure of the effective degrees of freedom df(lambda*,alpha) of a Regularisation Network can be used to select the appropriate width of the local radial basis functions, alpha, based on the data alone. The one-to-one correspondence between the number of exemplars and the number of hidden neurones of a Regularisation Network may prove computationally prohibitive. The remedy is to use a network with a smaller number of neurones, the Generalised Radial Basis Function Network (GRBFN). The training of a GRBFN ultimately settles down to a large-scale non-linear optimisation problem. A novel sequential back-fit algorithm was developed for training the GRBFNs, which enabled the optimisation to proceed one neurone at a time. The new algorithm was tested with very promising results and its application to a simple chemical engineering process was demonstrated. In some applications the overall response is composed of sharp localised features superimposed on a gently varying global background. Existing multivariate regression techniques as well as conventional neural networks are aimed at filtering the noise and recovering the overall response. An initial attempt was made at developing an Adaptive GRBFN to separate the local and global features. An efficient algorithm was developed simply by insisting that all the activation functions which are responsible for capturing the global trend should lie in the null space of the differential operator generating the activation function of the kernel based neurones. It was demonstrated that the proposed algorithm performs extremely well in the absence of strong global input interactions.
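As an illustration of the machinery this abstract describes, the following Python sketch fits a Regularisation Network (one Gaussian neurone per exemplar) with a ridge penalty and selects lambda* by leave-one-out cross validation via the hat matrix. This is not the author's code: the function names, the width value and the toy data set are assumptions for illustration only.

    import numpy as np

    def gaussian_design(X, centres, width):
        # Design matrix of Gaussian radial basis functions, one column per centre.
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * width ** 2))

    def loo_mse(Phi, y, lam):
        # Closed-form leave-one-out residuals via the hat matrix
        # H = Phi (Phi'Phi + lam I)^(-1) Phi'. Note trace(H) is the effective
        # degrees of freedom df(lam, alpha) mentioned in the abstract.
        H = Phi @ np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T)
        resid = (y - H @ y) / (1.0 - np.diag(H))
        return np.mean(resid ** 2)

    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(25, 1))              # small, noisy data set
    y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(25)

    Phi = gaussian_design(X, X, width=0.3)            # one neurone per exemplar
    lams = 10.0 ** np.arange(-6.0, 2.0)
    lam_star = min(lams, key=lambda lam: loo_mse(Phi, y, lam))
    w = np.linalg.solve(Phi.T @ Phi + lam_star * np.eye(len(X)), Phi.T @ y)

In the same spirit, the width alpha could be chosen by scanning candidate widths and monitoring the effective degrees of freedom trace(H), as the abstract suggests.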
|
4 |
Maximum Entropy Regularisation Applied to Ultrasonic Image Reconstruction. Battle, David John, January 1999.
Image reconstruction, in common with many other inverse problems, is often mathematically ill-posed in the sense that solutions are neither stable nor unique. Ultrasonic image reconstruction is particularly notorious in this regard, with narrow transducer bandwidths and limited, sometimes sparsely sampled, apertures posing formidable difficulties for conventional signal processing. To overcome these difficulties, some form of regularisation is mandatory, whereby the ill-posed problem is restated as a closely related, well-posed problem, and then solved uniquely. This thesis explores the application of maximum entropy (MaxEnt) regularisation to the problem of reconstructing complex-valued imagery from sparsely sampled coherent ultrasonic field data, with particular emphasis on three-dimensional problems in the non-destructive evaluation (NDE) of materials. MaxEnt has not previously been applied to this class of problem, and yet in comparison with many other approaches to image reconstruction, it emerges as the clear leader in terms of resolution and overall image quality. To account for this performance, it is argued that the default image model used with MaxEnt is particularly meaningful in cases of ultrasonic scattering by objects embedded in homogeneous media. To establish physical and mathematical insights into the forward problem, linear equations describing scattering from both penetrable and impenetrable objects are first derived using the Born and physical optics approximations respectively. These equations are then expressed as a shift-invariant computational model that explicitly incorporates sparse sampling. To validate this model, time-domain scattering responses are computed and compared with analytical solutions for a simple canonical test case drawn from the field of NDE. The responses computed via the numerical model are shown to accurately reproduce the analytical responses. To solve inverse scattering problems via MaxEnt, the robust Cambridge algorithm is generalised to the complex domain and extended to handle broadband (multiple-frequency) data. Two versions of the augmented algorithm are then compared with a range of other algorithms, including several linearly regularised algorithms and lastly, due to its acknowledged status as a competitor with MaxEnt in radio-astronomy, the non-linear CLEAN algorithm. These comparisons are made through simulated 3-D imaging experiments under conditions of both complete and sparse aperture sampling with low and high levels of additive Gaussian noise. As required in any investigation of inverse problems, the experimental confirmation of algorithmic performance is emphasised, and two common imaging geometries relevant to NDE are selected for this purpose. In monostatic synthetic aperture imaging experiments involving side-drilled holes in an aluminium plate and test objects immersed in H2O, MaxEnt image reconstruction is demonstrated to be robust against grating-lobe and side-lobe formation, in addition to temporal bandwidth restriction. This enables efficient reconstruction of 2-D and 3-D images from small numbers of discrete samples in the spatial and frequency domains. The thesis concludes with a description of the design and testing of a novel polyvinylidene fluoride (PVDF) bistatic array transducer that offers advantages over conventional point-sampled arrays in terms of construction simplicity and signal-to-noise ratio.
This ultra-sparse orthogonal array is the only one of its kind yet demonstrated, and was made possible by MaxEnt signal processing.
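The Cambridge MaxEnt algorithm used in the thesis is not reproduced here, but the shape of the optimisation problem can be sketched. The toy Python example below minimises a least-squares data misfit minus alpha times the Skilling-Gull entropy S(f) = sum(f - m - f log(f/m)) by plain projected gradient descent, on a simulated sparse-aperture deconvolution problem. All sizes, the operator, the step size and the starting model are illustrative assumptions; real MaxEnt solvers use far more careful update schemes.

    import numpy as np

    def maxent_reconstruct(A, d, m=1.0, alpha=0.05, iters=2000, step=1e-3):
        # Minimise 0.5*||A f - d||^2 - alpha * S(f), where the Skilling-Gull
        # entropy S(f) = sum(f - m - f*log(f/m)) has gradient dS/df = -log(f/m).
        f = np.full(A.shape[1], m, dtype=float)       # start at the default model m
        for _ in range(iters):
            grad = A.T @ (A @ f - d) + alpha * np.log(f / m)
            f = np.maximum(f - step * grad, 1e-12)    # crude positivity safeguard
        return f

    rng = np.random.default_rng(1)
    n = 64
    kernel = np.exp(-0.5 * (np.arange(-5, 6) / 2.0) ** 2)    # band-limiting blur
    C = np.array([np.convolve(row, kernel, mode="same") for row in np.eye(n)])
    A = C[rng.choice(n, size=24, replace=False)]             # sparse aperture sampling
    f_true = np.zeros(n)
    f_true[[10, 30, 31, 50]] = [4.0, 2.0, 3.0, 5.0]          # point-like scatterers
    d = A @ f_true + 0.01 * rng.standard_normal(A.shape[0])
    f_hat = maxent_reconstruct(A, d)

The entropy term pulls the solution towards the flat default model m except where the data demand otherwise, which is one intuition for the resistance to side-lobe artefacts reported in the abstract.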
|
5 |
Elektrická impedanční tomografie měkkých tkání: Řešení přímé a obrácené úlohy / Electrical impedance tomography of soft tissue: Forward and inverse modelling. Pšenka, Marek, January 2017.
The diploma thesis builds the necessary apparatus to formulate and solve the inverse problem of Electrical Impedance Tomography (EIT), including strategies to remedy the ill-conditioning of the problem. The problem itself lies in determining the structure of a body of interest by driving a set of electrical currents through electrodes connected to its surface. The aim of the thesis is to investigate the possible utility of this method in medical applications, namely scanning for malignancies in the female breast, by studying the interaction of tissue with the electromagnetic field and by preparing a set of corresponding numerical experiments. An approximate characterisation of the method's sensitivity with respect to noise is derived based on the most basic set of such numerical experiments, which were prepared with a complete software solution called prs4D developed by the author and his advisor; some aspects of its implementation are included in the thesis.
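The remedy for ill-conditioning mentioned in this abstract can be illustrated with a one-step linearised reconstruction in Python. The Jacobian J below (mapping conductivity perturbations to boundary-voltage changes) is synthetic and the sizes are invented; the sketch shows only a Tikhonov-regularised Gauss-Newton update, not the prs4D forward solver.

    import numpy as np

    rng = np.random.default_rng(2)
    n_meas, n_elem = 104, 400                      # assumed: 16-electrode protocol, coarse mesh
    J = rng.standard_normal((n_meas, n_elem))
    J *= np.exp(-np.linspace(0.0, 6.0, n_elem))    # decaying sensitivity mimics depth loss

    d_sigma_true = np.zeros(n_elem)
    d_sigma_true[150:160] = 0.5                    # small conductive inclusion
    r = J @ d_sigma_true + 1e-3 * rng.standard_normal(n_meas)   # noisy voltage residual

    # The unregularised normal equations are singular (rank(J) <= n_meas < n_elem);
    # adding the Tikhonov term lam*I makes the update well-posed.
    lam = 1e-2
    d_sigma = np.linalg.solve(J.T @ J + lam * np.eye(n_elem), J.T @ r)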
|
6 |
La régularisation des favelas par l'électricité : un service entre État, marché et citoyenneté / The regularisation of favelas through electricity: a service between State, market and citizenship. Pilo, Francesca, 6 February 2015.
With the country's hosting of a number of major international events having refocused attention on security issues, the government of the state of Rio de Janeiro introduced a new public security policy at the end of 2008 to regain territorial control over many of the city's favelas through the use of Pacifying Police Units (UPP). This programme has led to a partial revamp of the public authorities' favela integration project. Since the 1990s, development had mainly involved improving infrastructure and access roads and, to a lesser extent, land and urban regularisation. Now, however, the authorities plan to promote 'integration through regularisation' of market and administrative relationships, involving various stakeholders from both the public and private spheres. This thesis examines the integration of these favelas from a relatively unexplored perspective: that of regularisation through the electricity network, the aim of which is to transform 'illegal users' into new 'registered customers', connected to the distribution company by a meter.
In particular, we will highlight the link between the public and private approaches being used in projects to regularise the electricity service in two favelas, Santa Marta and Cantagalo. To this end, our analysis will focus on studying regularisation of the electricity service through its tools - socio-technical (installing meters and rehabilitating the network), commercial (billing collection methods) and consumption-control tools - and examining the ways in which customers have taken ownership of these. Research shows that regularising the electricity service tends to reshape the favelados' relationship with the state and the market; however, this has a number of limitations: it is difficult to build contractual customer relationships based on trust; activities to control consumption advocate bringing behaviours 'up to standard' rather than supporting use; and service regularisation tends to reproduce socio-economic inequalities rather than rise above them, while these inequalities also gradually lose their political character. Thus, the aim of this thesis is to help improve understanding of the methods being used to integrate the favelas given the growing neo-liberalisation of urban policy.
|
7 |
Estimation of long-range dependence. Vivero, Oskar, January 2010.
A set of observations from a random process which exhibit correlations that decay slower than an exponential rate is regarded as long-range dependent. This phenomenon has stimulated great interest in the scientific community as it appears in a wide range of areas of knowledge. For example, this property has been observed in data pertaining to electronics, econometrics, hydrology and biomedical signals. There exist several estimation methods for finding model parameters that help explain the set of observations exhibiting long-range dependence. Among these methods, maximum likelihood is attractive, given its desirable statistical properties such as asymptotic consistency and efficiency. However, its computational complexity makes the implementation of maximum likelihood prohibitive. This thesis presents a group of computationally efficient estimators based on the maximum likelihood framework. The thesis consists of two main parts. The first part is devoted to developing a computationally efficient alternative to the maximum likelihood estimate. This alternative is based on the circulant embedding concept and it is shown to maintain the desirable statistical properties of maximum likelihood. Interesting results are obtained by analysing the circulant embedding estimate. In particular, this thesis shows that the maximum likelihood based methods are ill-conditioned; the estimators' performance will deteriorate significantly when the set of observations is corrupted by errors. The second part of this thesis focuses on developing computationally efficient estimators with improved performance under the presence of errors in the observations.
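The computational trick at the heart of circulant embedding can be sketched in Python. The Toeplitz covariance of fractional Gaussian noise is embedded in a circulant matrix whose eigenvalues come from a single FFT, so quadratic forms and log-determinants cost O(n log n) rather than O(n^3). The approximate likelihood below is a crude Whittle-type stand-in, not the thesis's estimator, and the grid search and placeholder data are purely illustrative.

    import numpy as np

    def fgn_acov(k, H):
        # Autocovariance of fractional Gaussian noise with Hurst parameter H.
        k = np.abs(k).astype(float)
        return 0.5 * ((k + 1) ** (2 * H) - 2 * k ** (2 * H) + np.abs(k - 1) ** (2 * H))

    def circulant_eigs(n, H):
        # Embed the n x n Toeplitz covariance in a circulant of size 2n - 2;
        # its eigenvalues are the FFT of the (symmetric) first row.
        g = fgn_acov(np.arange(n), H)
        row = np.concatenate([g, g[-2:0:-1]])
        return np.fft.fft(row).real

    def approx_neg_loglik(x, H):
        # Whittle-type approximation: use the circulant eigenvalues in place of
        # the exact Toeplitz covariance in both terms of the Gaussian
        # log-likelihood. Every step here is O(n log n).
        n = len(x)
        lam = np.maximum(circulant_eigs(n, H), 1e-12)
        X = np.fft.fft(np.concatenate([x, np.zeros(n - 2)]))
        quad = np.sum(np.abs(X) ** 2 / lam) / len(lam)
        return 0.5 * (np.sum(np.log(lam)) * n / len(lam) + quad)

    rng = np.random.default_rng(3)
    x = rng.standard_normal(512)                   # placeholder series; real data goes here
    Hs = np.linspace(0.55, 0.95, 9)
    H_hat = min(Hs, key=lambda H: approx_neg_loglik(x, H))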
|
8 |
Mathematical imaging tools in cancer research: from mitosis analysis to sparse regularisation. Grah, Joana Sarah, January 2018.
This dissertation deals with customised image analysis tools in cancer research. In the field of biomedical sciences, mathematical imaging has become crucial in order to account for advancements in technical equipment and data storage by sound mathematical methods that can process and analyse imaging data in an automated way. This thesis contributes to the development of such mathematically sound imaging models in four ways: (i) automated cell segmentation and tracking. In cancer drug development, time-lapse light microscopy experiments are conducted for performance validation. The aim is to monitor the behaviour of cells in cultures that have previously been treated with chemotherapy drugs, since atypical duration and outcome of mitosis, the process of cell division, can be an indicator of successfully working drugs. As an imaging modality we focus on phase contrast microscopy, hence avoiding phototoxicity and influence on cell behaviour. As a drawback, the common halo- and shade-off effects impede image analysis. We present a novel workflow uniting both automated mitotic cell detection with the Hough transform and subsequent cell tracking by a tailor-made level-set method in order to obtain statistics on length of mitosis and cell fates. The proposed image analysis pipeline is deployed in a MATLAB software package called MitosisAnalyser. For the detection of mitotic cells we use the circular Hough transform. This concept is investigated further in the framework of image regularisation in the general context of imaging inverse problems, in which circular objects should be enhanced by (ii) exploiting sparsity of first-order derivatives in combination with the linear circular Hough transform operation. Furthermore, (iii) we present a new unified higher-order derivative-type regularisation functional enforcing sparsity of a vector field related to an image to be reconstructed using curl, divergence and shear operators. The model is able to interpolate between well-known regularisers such as total generalised variation and infimal convolution total variation. Finally, (iv) we demonstrate how we can learn sparsity-promoting parametrised regularisers via quotient minimisation, which can be motivated by generalised eigenproblems. Learning approaches have recently become very popular in the field of inverse problems. However, the majority aims at fitting models to favourable training data, whereas we incorporate knowledge about both fit and misfit data. We present results resembling behaviour of well-established derivative-based sparse regularisers, introduce novel families of non-derivative-based regularisers and extend this framework to classification problems.
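As an illustration of the circular Hough transform used here for mitotic cell detection, the short Python sketch below lets every edge pixel vote for candidate circle centres at a fixed radius and reads off the strongest centre. The MitosisAnalyser package itself is MATLAB; this stand-alone example and its parameters are hypothetical.

    import numpy as np

    def circular_hough(edges, radius, n_angles=72):
        # Every edge pixel votes for all centres lying `radius` away from it;
        # genuine circle centres accumulate the most votes.
        h, w = edges.shape
        acc = np.zeros((h, w))
        ys, xs = np.nonzero(edges)
        for t in np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False):
            cy = np.round(ys - radius * np.sin(t)).astype(int)
            cx = np.round(xs - radius * np.cos(t)).astype(int)
            ok = (cy >= 0) & (cy < h) & (cx >= 0) & (cx < w)
            np.add.at(acc, (cy[ok], cx[ok]), 1)
        return acc

    # Synthetic test: a thin ring standing in for the bright halo around a
    # rounded mitotic cell in phase contrast microscopy.
    yy, xx = np.mgrid[0:64, 0:64]
    ring = np.abs(np.hypot(yy - 32, xx - 40) - 10) < 1.0
    acc = circular_hough(ring, radius=10)
    cy, cx = np.unravel_index(np.argmax(acc), acc.shape)   # expected near (32, 40)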
|
9 |
A General-Purpose GPU Reservoir Computer. Keith, Tūreiti, January 2013.
The reservoir computer comprises a reservoir of possibly non-linear, possibly chaotic dynamics. By perturbing and taking outputs from this reservoir, its dynamics may be harnessed to compute complex problems at “the edge of chaos”. One of the first forms of reservoir computer, the Echo State Network (ESN), is a form of artificial neural network that builds its reservoir from a large and sparsely connected recurrent neural network (RNN). The ESN was initially introduced as an innovative solution for training RNNs, a task which, up until that point, had been notoriously difficult. The innovation of the ESN is that, rather than training the RNN weights, only the output layer is trained. If this output layer is assumed to be linear, then linear regression may be used.
This work presents an effort to implement the Echo State Network, and an offline linear regression training method based on Tikhonov regularisation. This implementation targeted the general purpose graphics processing unit (GPU or GPGPU). The behaviour of the implementation was examined by comparing it with a central processing unit (CPU) implementation, and by assessing its performance against several studied learning problems. These assessments were performed using all 4 cores of the Intel i7-980 CPU and an Nvidia GTX480. When compared with a CPU implementation, the GPU ESN implementation demonstrated a speed-up starting from a reservoir size of between 512 and 1,024. A maximum speed-up of approximately 6 was observed at the largest reservoir size tested (2,048). The Tikhonov regularisation (TR) implementation was also compared with a CPU implementation. Unlike the ESN execution, the GPU TR implementation was largely slower than the CPU implementation. Speed-ups were observed at the largest reservoir and state history sizes, the largest of which was 2.6813. The learning behaviour of the GPU ESN was tested on three problems, a sinusoid, a Mackey-Glass time-series, and a multiple superimposed oscillator (MSO). The normalised root-mean squared errors of the predictors were compared. The best observed sinusoid predictor outperformed the best MSO predictor by 4 orders of magnitude. In turn, the best observed MSO predictor outperformed the best Mackey-Glass predictor by 2 orders of magnitude.
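A minimal CPU sketch of the pipeline just described, in Python rather than the thesis's GPU code, may help fix ideas: a sparse random reservoir is driven by an input signal, the state history is collected after a washout period, and the linear readout is trained offline with Tikhonov (ridge) regularisation. The reservoir size, sparsity, spectral radius and sinusoid task are illustrative choices, not the thesis's settings.

    import numpy as np

    rng = np.random.default_rng(4)
    n_res, washout, lam = 200, 100, 1e-6

    # Sparse random reservoir, rescaled to spectral radius 0.9 (echo state property).
    W = rng.standard_normal((n_res, n_res)) * (rng.random((n_res, n_res)) < 0.05)
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
    W_in = rng.uniform(-0.5, 0.5, (n_res, 1))

    # Drive the reservoir and collect states for a one-step-ahead prediction task.
    u = np.sin(0.2 * np.arange(2000))[:, None]
    x = np.zeros(n_res)
    states = []
    for t in range(len(u) - 1):
        x = np.tanh(W @ x + W_in @ u[t])
        states.append(x.copy())
    S = np.array(states[washout:])        # state history, shape (T, n_res)
    Y = u[washout + 1:]                   # targets: the next input sample

    # Offline Tikhonov-regularised readout: W_out = (S'S + lam I)^(-1) S'Y.
    W_out = np.linalg.solve(S.T @ S + lam * np.eye(n_res), S.T @ Y)
    nrmse = np.sqrt(np.mean((S @ W_out - Y) ** 2)) / np.std(Y)

The ridge solve above is the offline training step that the thesis maps onto the GPU; its cost is dominated by forming S'S, which is why the abstract reports speed-ups only at the largest state history sizes.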
|