11

Information Entropy and Ecological Energetics : Predicting and Analysing Structure and Energy Flow in Ecological Networks applying the Concept of MaxEnt

Brinck, Katharina January 2014 (has links)
Ecological networks are complex systems forming hierarchical structures in which energy and matter are transferred between the network’s compartments. Predicting energy flows in food webs usually involves complex, parameter-rich models. In this thesis, the application of the principle of maximum entropy (MaxEnt) to obtain least biased probability distributions based on prior knowledge is proposed as an alternative for predicting the most likely energy flows in food webs from the network topology alone. This approach not only simplifies the characterisation of food web flow patterns based on little empirical knowledge but can also be used to investigate the role of bottom-up and top-down controlling forces in ecosystems, which emerge from the complex interactions at the level of species and individuals. The integrative measure of “flow extent”, incorporating both bottom-up and top-down controlling forces on ecosystems, is proposed as a principle behind ecosystem evolution and evaluated against empirical data on food web structure. The method of predicting energy flow with MaxEnt is shown to be very flexible and applicable to many different settings and types of questions in ecology, making it a powerful tool for modelling energy transfer in ecosystems. Further research must show to what extent the most likely flow patterns are realised in real-world ecosystems. The concept of flow extent maximisation as a selection principle during ecosystem evolution can enhance the understanding of emergent phenomena in complex ecosystems and may help to draw a link between thermodynamics and ecology.
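As a concrete, hedged illustration of the MaxEnt step this thesis builds on, the Python sketch below derives the least biased distribution over a handful of flow states from a single mean-value constraint; the state values and mean-flow target are invented for the example and are not data from the thesis.

```python
import numpy as np
from scipy.optimize import brentq

# Hypothetical energy-flow states (arbitrary units) and an assumed mean flow;
# both are invented for illustration.
states = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
mean_flow = 2.2

def maxent_distribution(x, mu):
    """Least biased (maximum entropy) pmf over states x with E[X] = mu.

    The solution has the exponential-family form p_i ~ exp(-lam * x_i);
    we root-find the Lagrange multiplier lam matching the mean constraint.
    """
    def mean_gap(lam):
        w = np.exp(-lam * x)
        p = w / w.sum()
        return p @ x - mu

    lam = brentq(mean_gap, -50.0, 50.0)
    w = np.exp(-lam * x)
    return w / w.sum()

p = maxent_distribution(states, mean_flow)
print(p, -(p * np.log(p)).sum())  # distribution and its entropy
```

With richer constraints (for instance, one per trophic link implied by the network topology), the same root-finding generalises to a vector of multipliers, which is the sense in which flow patterns can be predicted from topology alone.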
12

Maximum Entropy Regularisation Applied to Ultrasonic Image Reconstruction

Battle, David John January 1999 (has links)
Image reconstruction, in common with many other inverse problems, is often mathematically ill-posed in the sense that solutions are neither stable nor unique. Ultrasonic image reconstruction is particularly notorious in this regard, with narrow transducer bandwidths and limited, sometimes sparsely sampled, apertures posing formidable difficulties for conventional signal processing. To overcome these difficulties, some form of regularisation is mandatory, whereby the ill-posed problem is restated as a closely related, well-posed problem, and then solved uniquely. This thesis explores the application of maximum entropy (MaxEnt) regularisation to the problem of reconstructing complex-valued imagery from sparsely sampled coherent ultrasonic field data, with particular emphasis on three-dimensional problems in the non-destructive evaluation (NDE) of materials. MaxEnt has not previously been applied to this class of problem, yet in comparison with many other approaches to image reconstruction it emerges as the clear leader in terms of resolution and overall image quality. To account for this performance, it is argued that the default image model used with MaxEnt is particularly meaningful in cases of ultrasonic scattering by objects embedded in homogeneous media. To establish physical and mathematical insights into the forward problem, linear equations describing scattering from both penetrable and impenetrable objects are first derived using the Born and physical optics approximations respectively. These equations are then expressed as a shift-invariant computational model that explicitly incorporates sparse sampling. To validate this model, time-domain scattering responses are computed and compared with analytical solutions for a simple canonical test case drawn from the field of NDE. The responses computed via the numerical model are shown to accurately reproduce the analytical responses. To solve inverse scattering problems via MaxEnt, the robust Cambridge algorithm is generalised to the complex domain and extended to handle broadband (multiple-frequency) data. Two versions of the augmented algorithm are then compared with a range of other algorithms, including several linearly regularised algorithms and, due to its acknowledged status as a competitor with MaxEnt in radio astronomy, the non-linear CLEAN algorithm. These comparisons are made through simulated 3-D imaging experiments under conditions of both complete and sparse aperture sampling with low and high levels of additive Gaussian noise. As required in any investigation of inverse problems, the experimental confirmation of algorithmic performance is emphasised, and two common imaging geometries relevant to NDE are selected for this purpose. In monostatic synthetic aperture imaging experiments involving side-drilled holes in an aluminium plate and test objects immersed in H₂O, MaxEnt image reconstruction is demonstrated to be robust against grating-lobe and side-lobe formation, in addition to temporal bandwidth restriction. This enables efficient reconstruction of 2-D and 3-D images from small numbers of discrete samples in the spatial and frequency domains. The thesis concludes with a description of the design and testing of a novel polyvinylidene fluoride (PVDF) bistatic array transducer that offers advantages over conventional point-sampled arrays in terms of construction simplicity and signal-to-noise ratio. This ultra-sparse orthogonal array is the only one of its kind yet demonstrated, and was made possible by MaxEnt signal processing.
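As a hedged illustration of MaxEnt regularisation in general (not the Cambridge algorithm, and real-valued rather than the complex-valued case treated in the thesis), the sketch below trades the data misfit χ² against a Skilling-style entropy relative to a default model m; the toy forward matrix, sizes, and weight α are all invented for the example.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy forward model: a small non-negative blur matrix A, a positive "true"
# image, and noisy data. All values are invented for illustration.
n = 16
A = np.abs(rng.normal(size=(n, n))) / n
f_true = np.ones(n)
f_true[5:8] = 5.0
sigma = 0.01
d = A @ f_true + sigma * rng.normal(size=n)

m = np.ones(n)   # default (prior) image model
alpha = 1.0      # regularisation weight; in practice tuned so chi^2 ~ n

def objective(u):
    f = np.exp(u)  # log-parameterisation enforces positivity
    chi2 = np.sum((A @ f - d) ** 2) / sigma ** 2
    S = np.sum(f - m - f * np.log(f / m))  # entropy relative to m (S <= 0)
    return 0.5 * chi2 - alpha * S

res = minimize(objective, np.zeros(n), method="L-BFGS-B")
f_maxent = np.exp(res.x)  # MaxEnt-regularised reconstruction
```

The default model m is what the abstract argues is particularly meaningful for scattering in homogeneous media: wherever the data say nothing, the reconstruction falls back to m.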
13

Aerosol Characterization and Analytical Modeling of Concentric Pneumatic and Flow Focusing Nebulizers for Sample Introduction

Kashani, Arash 17 February 2011 (has links)
A concentric pneumatic nebulizer (CPN) and a custom-designed flow focusing nebulizer (FFN) are characterized. As will be shown, the classical Nukiyama-Tanasawa and Rizk-Lefebvre models lead to erroneous size predictions for the concentric nebulizer under typical operating conditions due to its specific design, geometry, dimensions, and different flow regimes. The models are then modified to improve their agreement with the experimental results. The size predictions of the modified models, together with the spray velocity characterization, are used to determine the overall nebulizer efficiency and are also employed as input to a new Maximum Entropy Principle (MEP) based model that predicts the joint size-velocity distribution analytically. The new MEP model is exploited to study the local variation of the size-velocity distribution, in contrast to the classical models in which MEP is applied globally to the entire spray cross section. As will be demonstrated, the velocity distribution of the classical MEP models shows poor agreement with experiments for the cases under study. Modifications to the original MEP modeling are proposed to overcome this deficiency. In addition, the new joint size-velocity distribution agrees better with our general understanding of the drag law and yields realistic results.
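A hedged sketch of the global MEP construction that the thesis modifies (not the local variant it proposes): on a discretised size-velocity grid, the least biased joint distribution under normalised mass and momentum constraints is exponential in the constraint functions, and the multipliers follow from root-finding. The grid ranges and constraint targets are invented for illustration.

```python
import numpy as np
from scipy.optimize import fsolve

# Discretised diameter-velocity grid (microns, m/s); ranges are illustrative.
D, U = np.meshgrid(np.linspace(5, 100, 40), np.linspace(1, 50, 40))

g1 = (D / D.max()) ** 3                   # normalised mass constraint function
g2 = g1 * (U / U.max())                   # normalised momentum constraint function
mass_target, momentum_target = 0.5, 0.4   # assumed constraint values

def pdf(lams):
    w = np.exp(-lams[0] * g1 - lams[1] * g2)
    return w / w.sum()

def constraint_gaps(lams):
    p = pdf(lams)
    return [(p * g1).sum() - mass_target,
            (p * g2).sum() - momentum_target]

lams = fsolve(constraint_gaps, x0=[0.0, 0.0])
p_joint = pdf(lams)   # MEP joint size-velocity pmf on the grid
```

Applying this construction per spray location rather than over the whole cross section is, in spirit, the local refinement the abstract describes.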
14

Bayesian Analysis of Intratumoural Oxygen Data

Tang, Herbert Hoi Chi January 2009 (has links)
There is now ample evidence to support the notion that a lack of oxygen (hypoxia) within the tumour adversely affects the outcome of radiotherapy and whether a patient is able to remain disease free. Thus, there is increasing interest in accurately determining oxygen concentration levels within a tumour. Hypoxic regions arise naturally in cancerous tumours because of their abnormal vasculature, and it is believed that oxygen is necessary in order for radiation to be effective in killing cancer cells. One method of measuring oxygen concentration within a tumour is the Eppendorf polarographic needle electrode, a method favored by many clinical researchers because it is the only device that is inserted directly into the tumour and reports its findings in terms of oxygen partial pressure (PO2). Unfortunately, there are often anomalous readings in the Eppendorf measurements (negative and extremely high values) and there is little consensus as to how best to interpret the data. In this thesis, Bayesian methods are applied to estimate two measures commonly used in the current literature to quantify oxygen content within a tumour: the median PO2, and the Hypoxic Proportion (HP5), the percentage of readings less than 5 mmHg. The results show that Bayesian parameter estimation reproduces the standard estimate for HP5 while providing an additional piece of information, the error bar, which quantifies how uncertain we believe our estimate to be. Furthermore, using the principle of Maximum Entropy, we estimate the true median PO2 of the distribution instead of simply relying on the sample median, a value which may or may not be an accurate indication of the actual median PO2 inside the tumour. The advantage of the Bayesian method is that it exploits probability theory and presents its results in the form of probability density functions. These probability density functions provide more information about the desired quantity than the single number produced in the current literature and allow us to make more accurate and informative statements about the measure of hypoxia that we are trying to estimate.
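As a minimal illustration of the "error bar" the abstract mentions (not the thesis's full analysis, which also uses Maximum Entropy for the median PO2), a Beta-Binomial model on synthetic readings yields a full posterior, and hence a credible interval, for HP5:

```python
import numpy as np
from scipy import stats

# Synthetic PO2 readings (mmHg), invented for illustration only.
rng = np.random.default_rng(1)
po2 = rng.gamma(shape=2.0, scale=8.0, size=60)

n = po2.size
k = int((po2 < 5.0).sum())        # readings below 5 mmHg

# Beta-Binomial model with a uniform Beta(1, 1) prior on the proportion:
# the posterior is Beta(k + 1, n - k + 1).
posterior = stats.beta(k + 1, n - k + 1)

hp5 = k / n                               # standard point estimate
lo, hi = posterior.ppf([0.025, 0.975])    # 95% credible "error bar"
print(f"HP5 = {hp5:.3f}, 95% interval: ({lo:.3f}, {hi:.3f})")
```

The posterior density, rather than the single number k/n, is what permits the more informative statements about hypoxia described above.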
16

Evaluation of Maximum Entropy Moment Closure for Solution to Radiative Heat Transfer Equation

Fan, Doreen 22 November 2012 (has links)
The maximum entropy moment closure for the two-moment approximation of the radiative transfer equation is presented. The resulting moment equations, known as the M1 model, are solved using a finite-volume method with adaptive mesh refinement (AMR) and two Riemann-solver-based flux functions: a Roe-type solver and a Harten-Lax-van Leer (HLL) solver. Three different boundary schemes are also presented and discussed. When compared to the discrete ordinates method (DOM) in several representative one- and two-dimensional radiation transport problems, the results indicate that while the M1 model cannot accurately resolve multi-directional radiation transport occurring in low-absorption media, it does provide reasonably accurate solutions, both qualitatively and quantitatively, relative to the DOM predictions in most of the test cases involving either absorbing-emitting or scattering media. The results also show that the M1 model is computationally less expensive than DOM for more realistic radiation transport problems involving scattering and complex geometries.
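For context, the M1 closure amounts to a maximum-entropy Eddington factor χ(f) that supplies the radiative pressure from the first two moments, closing the system. A minimal sketch with illustrative nondimensional values:

```python
import numpy as np

def eddington_factor(f):
    """Maximum-entropy (M1) Eddington factor as a function of the flux
    ratio f = |F| / (cE): chi = 1/3 at f = 0 (isotropic radiation) and
    chi = 1 at f = 1 (free streaming)."""
    f = np.asarray(f, dtype=float)
    return (3.0 + 4.0 * f**2) / (5.0 + 2.0 * np.sqrt(4.0 - 3.0 * f**2))

# Closure: the second moment (pressure) follows from E and F alone.
E, F, c = 1.0, 0.4, 1.0      # illustrative nondimensional values
P = eddington_factor(abs(F) / (c * E)) * E
```

Because χ depends non-linearly on f, the resulting moment system is hyperbolic with non-trivial wave speeds, which is why Riemann-solver-based fluxes such as the Roe-type and HLL solvers are natural choices.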
18

Scaling conditional random fields for natural language processing

Cohn, Trevor A Unknown Date (has links) (PDF)
This thesis deals with the use of Conditional Random Fields (CRFs; Lafferty et al. (2001)) for Natural Language Processing (NLP). CRFs are probabilistic models for sequence labelling which are particularly well suited to NLP. They have many compelling advantages over other popular models such as Hidden Markov Models and Maximum Entropy Markov Models (Rabiner, 1990; McCallum et al., 2001), and have been applied to a number of NLP tasks with considerable success (e.g., Sha and Pereira (2003) and Smith et al. (2005)). Despite their apparent success, CRFs suffer from two main failings. Firstly, they often over-fit the training sample. This is a consequence of their considerable expressive power, and can be limited by a prior over the model parameters (Sha and Pereira, 2003; Peng and McCallum, 2004). Their second failing is that the standard methods for CRF training are often very slow, sometimes requiring weeks of processing time. This efficiency problem is largely ignored in the current literature, although in practice the cost of training prevents the application of CRFs to many new, more complex tasks, and also prevents the use of densely connected graphs, which would allow for much richer feature sets. (For complete abstract open document)
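As a rough illustration of where the training cost comes from: every likelihood evaluation must compute the partition function by a forward recursion over all label sequences, for every training sentence, inside an outer optimisation loop. A minimal linear-chain sketch (with random scores standing in for the feature-weight dot products of a real CRF):

```python
import numpy as np
from scipy.special import logsumexp

def crf_log_likelihood(emissions, transitions, labels):
    """Log-likelihood of one label sequence under a linear-chain CRF.

    emissions:   (T, K) per-position label scores
    transitions: (K, K) label-to-label scores
    labels:      length-T list of gold label indices
    """
    T, K = emissions.shape
    # Unnormalised score of the gold path.
    score = emissions[0, labels[0]]
    for t in range(1, T):
        score += transitions[labels[t - 1], labels[t]] + emissions[t, labels[t]]
    # Forward recursion for the log partition function.
    alpha = emissions[0]
    for t in range(1, T):
        alpha = emissions[t] + logsumexp(alpha[:, None] + transitions, axis=0)
    return score - logsumexp(alpha)

rng = np.random.default_rng(0)
em = rng.normal(size=(6, 3))    # 6 positions, 3 labels
tr = rng.normal(size=(3, 3))
print(crf_log_likelihood(em, tr, [0, 1, 1, 2, 0, 1]))
```

The recursion is O(T·K²) per sequence; with large label sets, rich features, and many iterations of a quasi-Newton optimiser, the weeks-long training times mentioned above become plausible.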
20

Generative models of similarity-based classification

Cazzanti, Luca. January 2007 (has links)
Thesis (Ph.D.), University of Washington, 2007. Vita. Includes bibliographical references (p. 101-107).
