31

Design of a system to support policy formulation for sustainable biofuel production

Singh, Minerva January 2010
The increased demand for biofuels is expected to put additional strain on available agricultural resources while at the same time causing environmental degradation. Hence, new energy policies need to be formulated and implemented in order to meet global energy needs while reducing the impact of biofuel farming and production. This research focuses on providing a decision support system which can aid the formulation of policies for sustainable biofuel production. The system seeks to address policy formulation that requires reconciliation of the qualitative aspects of decision making (such as stakeholders’ viewpoints) with quantitative data, which may often be imprecise. To allow this, the system is based on fuzzy logic and Multi-Criteria Decision Making (MCDM) in the form of the Analytical Hierarchy Process (AHP). Using these concepts, three software functionalities, “Options vs. Fuzzy Criteria Matrix”, “Analytical Hierarchy Process” and “Fuzzy AHP”, were developed. These were added within the framework of pre-existing base software, Compendium (developed by the Open University, UK). A number of case-study-based models have been investigated using the software. These models made use of data from the Philippines and India in order to pinpoint suitable land and crop options for these countries. The models based on AHP and Fuzzy AHP were very successful in identifying suitable crop options for India by capturing both stakeholder viewpoints and quantitative data. The software functionalities are very effective in scenario planning and in the selection of policies that would be beneficial in achieving a desired future scenario. The models further revealed that the newly developed software correctly identified many of the important issues in a consistent manner.
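As context for the AHP-based functionality described above, the following minimal Python sketch (illustrative only, not code from the thesis; the three-option comparison matrix is invented) shows the standard eigenvector weighting and consistency check that underlie an Analytical Hierarchy Process step.

```python
import numpy as np

def ahp_weights(pairwise):
    """Return (weights, consistency_ratio) for a reciprocal pairwise-comparison matrix."""
    n = pairwise.shape[0]
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)                    # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                                # normalise weights to sum to 1
    ci = (eigvals[k].real - n) / (n - 1)           # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)   # Saaty's random index
    return w, ci / ri

# Hypothetical comparison of three crop options against a single criterion.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
weights, cr = ahp_weights(A)
print(weights, cr)   # CR < 0.1 is conventionally taken as acceptably consistent
```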
32

Modelling the role of nitric oxide in cerebral autoregulation

Catherall, Mark January 2014
Malfunction of the system which regulates blood flow in the brain is a major cause of stroke and dementia, costing many lives and many billions of pounds each year in the UK alone. This regulatory system, known as cerebral autoregulation, has been the subject of much experimental and mathematical investigation, yet our understanding of it is still quite limited. One area in which our understanding is particularly lacking is the role of nitric oxide, understood to be a potent vasodilator. The interactions of nitric oxide with the better-understood myogenic response remain unmodelled and poorly understood. In this thesis we present a novel model of the arteriolar control mechanism, comprising a mixture of well-established and new models of individual processes, brought together for the first time. We show that this model is capable of reproducing experimentally observed behaviour very closely, and go on to investigate its stability in the context of the vasculature of the whole brain. In conclusion we find that nitric oxide, although it plays a central role in determining equilibrium vessel radius, is unimportant to the dynamics of the system and its responses to variation in arterial blood pressure. We also find that the stability of the system is very sensitive to the dynamics of Ca2+ within the muscle cell, and that self-sustaining Ca2+ waves are not necessary to cause whole-vessel radius oscillations consistent with vasomotion.
33

Morphometric analysis of brain structures in MRI

González Ballester, Miguel Ángel January 1999
Medical computer vision is a novel research discipline based on the application of computer vision methods to data sets acquired via medical imaging techniques. This work focuses on magnetic resonance imaging (MRI) data sets, particularly in studies of schizophrenia and multiple sclerosis. Research on these diseases is challenged by the lack of appropriate morphometric tools to accurately quantify lesion growth, assess the effectiveness of a drug treatment, or investigate anatomical information believed to be evidence of schizophrenia. Thus, most hypotheses involving these conditions remain unproven. This thesis contributes towards the development of such morphometric techniques. A framework combining several tools is established, allowing for compensation of bias fields, boundary detection by modelling partial volume effects (PVE), and a combined statistical and geometrical segmentation method. Most importantly, it also allows for the computation of confidence bounds in the location of the object being segmented by bounding PVE voxels. Bounds obtained in such fashion encompass a significant percentage of the volume of the object (typically 20-60%). A statistical model of the intensities contained in PVE voxels is used to provide insight into the contents of PVE voxels and further narrow confidence bounds. This not only permits a reduction by an order of magnitude in the width of the confidence intervals, but also establishes a statistical mechanism to obtain probability distributions on shape descriptors (e.g. volume), instead of just a raw magnitude or a set of confidence bounds. A challenging clinical study is performed using these tools: to investigate differences in asymmetry of the temporal horns in schizophrenia. This study is of high clinical relevance. The results show that our tools are sufficiently accurate for studies of this kind, thus providing clinicians, for the first time, with the means to corroborate unproven hypotheses or reliably assess patient evolution.
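To illustrate the kind of partial volume reasoning described above, the sketch below uses an assumed, simplified linear mixture model (not the thesis's full statistical formulation): a boundary voxel's intensity is treated as a mixture of two tissue means, and the mixing fraction is estimated per voxel.

```python
import numpy as np

mu_a, mu_b, sigma = 100.0, 40.0, 5.0    # hypothetical tissue means and noise level

def mixing_fraction(intensity):
    """Maximum-likelihood fraction of tissue A under the linear PVE model with Gaussian noise."""
    alpha = (intensity - mu_b) / (mu_a - mu_b)
    return float(np.clip(alpha, 0.0, 1.0))

print(mixing_fraction(70.0))   # ~0.5: voxel roughly half tissue A, half tissue B
print(mixing_fraction(95.0))   # ~0.92: voxel mostly tissue A
```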
34

Novelty detection with extreme value theory in vital-sign monitoring

Hugueny, Samuel Y. January 2013
Every year in the UK, tens of thousands of hospital patients suffer adverse events, such as unplanned transfers to Intensive Therapy Units or unexpected cardiac arrests. Studies have shown that in a large majority of cases, significant physiological abnormalities can be observed within the 24-hour period preceding such events. Such warning signs may go unnoticed if they occur between observations by the nursing staff, or are simply not identified as such. Timely detection of these warning signs and appropriate escalation schemes have been shown to improve both patient outcomes and the use of hospital resources, most notably by reducing patients’ length of stay. Automated real-time early-warning systems appear to be cost-efficient answers to the need for continuous vital-sign monitoring. Traditionally, a limitation of such systems has been their sensitivity to noisy and artefactual measurements, resulting in false-alert rates that made them unusable in practice, or earned them the mistrust of clinical staff. Tarassenko et al. (2005) and Hann (2008) proposed a novelty detection approach to the problem of continuous vital-sign monitoring which, in a clinical trial, was shown to yield clinically acceptable false-alert rates. In this approach, an observation is compared to a data fusion model, and its “normality” assessed by comparing a chosen statistic to a pre-set threshold. The method, while informed by large amounts of training data, has a number of heuristic aspects. This thesis proposes a principled approach to multivariate novelty detection in stochastic time series, where novelty scores have a probabilistic interpretation and are explicitly linked to the starting assumptions made. Our approach stems from the observation that novelty detection using complex multivariate, multimodal generative models is generally an ill-defined problem when attempted in the data space. In situations where “novel” is equivalent to “improbable with respect to a probability distribution”, formulating the problem in a univariate probability space allows us to use classical results of univariate statistical theory. Specifically, we propose a multivariate extension to extreme value theory and, more generally, order statistics, suitable for performing novelty detection in time series generated from a multivariate, possibly multimodal, model. All the methods introduced in this thesis are applied to a vital-sign monitoring problem and compared to the existing method of choice. We show that it is possible to outperform the existing method while retaining a probabilistic interpretation. In addition to their application to novelty detection for vital-sign monitoring, the contributions in this thesis to extreme value theory and order statistics are also valid in the broader context of data modelling, and may be useful for analysing data from other complex systems.
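The following sketch illustrates the general idea of moving novelty detection from the data space to a univariate probability space (a Monte Carlo stand-in with an invented two-dimensional Gaussian model; the thesis derives closed-form extreme value and order-statistics results rather than the empirical estimate used here).

```python
import numpy as np
from scipy.stats import multivariate_normal

model = multivariate_normal(mean=[0.0, 0.0], cov=np.eye(2))   # stand-in generative model

# Empirical distribution of the *minimum* model density over windows of n observations.
n = 50
minima = np.array([model.pdf(model.rvs(size=n)).min() for _ in range(2000)])

def novelty_p_value(x):
    """P(minimum density over n samples <= density of x); small values flag x as extreme."""
    return float(np.mean(minima <= model.pdf(x)))

print(novelty_p_value([0.2, -0.1]))   # typical observation: p-value close to 1
print(novelty_p_value([4.0, 4.0]))    # outlying observation: p-value near 0
```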
35

Use of inertial sensors to measure upper limb motion : application in stroke rehabilitation

Shublaq, Nour January 2010
Stroke is the largest cause of severe complex disability in adults, and is caused when the blood supply to the brain is interrupted, either by a clot or a burst blood vessel. It is characterised by deficiencies in movement and balance, changes in sensation, impaired motor control and muscle tone, and bone deformity. Clinically applied stroke management relies heavily on the observational opinion of healthcare workers. Despite the proven validity of a few clinical outcome measures, they remain subjective and inconsistent, and suffer from a lack of standardisation. Motion capture of the upper limb has also been used in specialised laboratories to obtain accurate and objective information, and to monitor progress in rehabilitation. However, it is unsuitable in environments that are accessible to stroke patients (for example patients’ homes or stroke clubs), owing to its high cost and its special set-up and calibration requirements. The aim of this research project was to validate and assess the sensitivity of a relatively low-cost, wearable, compact and easy-to-use monitoring system, which uses inertial sensors to obtain a detailed analysis of forearm motion during simple functional exercises typically used in rehabilitation. Forearm linear and rotational motion were characterised for certain movements on four healthy subjects and a stroke patient using a motion capture system. This provided accuracy and sensitivity specifications for the wearable monitoring system. With basic signal pre-processing, the wearable system was found to report reliably on acceleration, angular velocity and orientation, with varying degrees of confidence. Integration drift errors in the estimation of linear velocity remained unresolved. These errors were not straightforward to eliminate because the position of the sensor accelerometer relative to gravity varies over time. The cyclic nature of rehabilitation exercises was exploited to improve the reliability of velocity estimation using model-based Kalman filtering and least-squares optimisation techniques. Both signal processing methods resulted in an encouraging reduction of the integration drift in velocity. Improved sensor information could provide a visual display of the movement, or determine kinematic quantities relevant to exercise performance. Hence, the system could potentially be used to objectively inform patients and physiotherapists about progress, increasing patient motivation and improving consistency in the assessment and reporting of outcomes.
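A minimal illustration of the integration-drift problem described above, with an assumed constant accelerometer bias and a simple cycle-based de-trending correction (the thesis itself uses model-based Kalman filtering and least-squares optimisation rather than the linear de-trend shown here):

```python
import numpy as np

dt = 0.01
t = np.arange(0.0, 4.0, dt)                     # two 2-second exercise cycles
true_acc = 0.5 * np.sin(np.pi * t)              # cyclic forearm acceleration (m/s^2)
measured_acc = true_acc + 0.05                  # constant accelerometer bias

vel = np.cumsum(measured_acc) * dt              # naive integration: drifts over time
true_vel = np.cumsum(true_acc) * dt

# Exploit the cyclic motion: velocity is assumed to return to zero at cycle boundaries,
# so the drift implied by those zero-velocity instants is removed by interpolation.
cycle_ends = [int(2.0 / dt) - 1, len(t) - 1]
drift = np.interp(np.arange(len(t)),
                  [0] + cycle_ends,
                  [0.0, vel[cycle_ends[0]], vel[cycle_ends[1]]])
corrected = vel - drift

print(np.abs(vel - true_vel).max(), np.abs(corrected - true_vel).max())  # error shrinks
```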
36

Morphodynamics of sand mounds in shallow flows

Garcia-Hermosa, M. Isabel January 2008
Large-scale bed features are often encountered in coastal waters, and include sandbanks and spoil heaps. The morphodynamic development of such features involves complicated nonlinear interactions between the flow hydrodynamics, sediment transport, and bed profile. Numerical modelling of the morphodynamic evolution and migration of large-scale bed features is necessary in order to understand their long-term behaviour in response to changing environmental conditions. This thesis describes detailed measurements of the morphodynamics of sand mounds in unidirectional and oscillatory (tidal) flows, undertaken at the U.K. Coastal Research Facility (UKCRF). High-quality data were collected, including water velocities, water levels and overhead images. The parameters tested are: mound shape (circular or elliptical in plan; Gaussian, cosine or triangular in cross-section); underlying bed condition (fixed or mobile); and initial crest height (submerged, surface-touching or surface-piercing). Peak flow velocities are about 0.5 m/s, the sand median grain size is 0.454 mm, and transport occurs mostly as bedload. In analysing the data, the bed contours are determined by digitising the shoreline at different water levels. From these plots, the volume, height and centroid position of the mound are calculated. A large-scale fit method based on a Gaussian function has been used to separate small-scale ripples from the large-scale bed structure during the evolution of an isolated sand mound or spoil heap. The bed profile after the ripples are removed is comparable to typical predictions by shallow-flow numerical solvers. The UKCRF experiments investigated the morphodynamic response of a bed mound to hydrodynamic forcing: shape changes, migration rates, volume decay and sediment transport rates. The measured migration rate and decay of a submerged sand mound in the UKCRF are found to be in satisfactory agreement with results from various theoretical models, such as the analytical solution derived by De Vriend. Numerical predictions of mound evolution by a commercial code, PISCES, are also presented for a fully submerged sand mound; the predicted bed evolution is reasonably similar to that observed in the UKCRF. The data obtained from the research reported in this thesis provide insight into the behaviour of sand mounds in steady and unsteady flows at laboratory scale, and should also be useful for benchmark (validation) purposes to numerical modellers of large-scale morphodynamics.
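The large-scale fit idea can be sketched as follows (synthetic profile and invented parameters, not the UKCRF analysis code): a Gaussian is fitted to the measured bed profile and subtracted, leaving the small-scale ripples as the residual.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amplitude, centre, width):
    return amplitude * np.exp(-((x - centre) ** 2) / (2.0 * width ** 2))

x = np.linspace(0.0, 10.0, 200)                      # along-flume distance (m)
mound = gaussian(x, 0.15, 5.0, 1.2)                  # synthetic 0.15 m high mound
ripples = 0.01 * np.sin(2 * np.pi * x / 0.3)         # synthetic small-scale bedforms
bed = mound + ripples

popt, _ = curve_fit(gaussian, x, bed, p0=[0.1, 5.0, 1.0])
large_scale = gaussian(x, *popt)                     # smooth large-scale structure
residual_ripples = bed - large_scale                 # small-scale ripples
print(popt)   # recovered amplitude, centre and width of the mound
```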
37

Droplet interface bilayers for the study of membrane proteins

Hwang, William January 2008
Aqueous droplets submerged in an oil-lipid mixture become enclosed by a lipid monolayer. The droplets can be connected to form robust networks of droplet interface bilayers (DIBs) with functions such as a biobattery and a light sensor. The discovery and characterization of an engineered nanopore with diode-like properties is enabling the construction of DIB networks capable of biochemical computing. Moreover, DIB networks might be used as model systems for the study of membrane-based biological phenomena. We develop and experimentally validate an electrical modeling approach for DIB networks. Electrical circuit simulations will be important in guiding the development of increasingly complex DIB networks. In cell membranes, the lipid compositions of the inner and outer leaflets differ. Therefore, a robust model system that enables single-channel electrical recording with asymmetric bilayers would be very useful. Towards this end, we incorporate lipid vesicles of different compositions into aqueous droplets and immerse them in an oil bath to form asymmetric DIBs (a-DIBs). Both α-helical and β-barrel membrane proteins insert readily into a-DIBs, and their activity can be measured by single-channel electrical recording. We show that the gating behavior of outer membrane protein G (OmpG) from Escherichia coli differs depending on the side of insertion in an asymmetric DIB with a positively charged leaflet opposing a negatively charged leaflet. The a-DIB system provides a general platform for studying the effects of bilayer leaflet composition on the behavior of ion channels and pores. Even with the small volumes (~100 nL) that can be used to form DIBs, the separation between two adjacent bilayers in a DIB network is typically still hundreds of microns. In contrast, dual-membrane spanning proteins require the bilayer separation to be much smaller; for example, the bilayer separation for gap junctions must be less than 5 nm. We designed a double bilayer system that consists of two monolayer-coated aqueous spheres brought into contact with each side of a water film submerged in an oil-lipid solution. The spheres could be brought close enough together such that they physically deflected without rupturing the double bilayer. Future work on quantifying the bilayer separation and studying dual-membrane spanning proteins with the double bilayer platform is planned.
38

Feature detection in mammographic image analysis

Linguraru, Marius George January 2004
In modern society, cancer has become one of the most terrifying diseases because of its high and increasing death rate. The disease's deep impact demands extensive research to detect and eradicate it in all its forms. Breast cancer is one of the most common forms of cancer, and approximately one in nine women in the Western world will develop it over the course of their lives. Screening programmes have been shown to reduce the mortality rate, but they introduce an enormous amount of information that must be processed by radiologists on a daily basis. Computer Aided Diagnosis (CAD) systems aim to assist clinicians in their decision-making process, by acting as a second opinion and helping improve the detection and classification ratios by spotting very difficult and subtle cases. Although the field of cancer detection is rapidly developing and crosses over imaging modalities, X-ray mammography remains the principal tool to detect the first signs of breast cancer in population screening. The advantages and disadvantages of other imaging modalities for breast cancer detection are discussed along with the improvements and difficulties encountered in screening programmes. Remarkable achievements to date in breast CAD are equally presented. This thesis introduces original results for the detection of features from mammographic image analysis to improve the effectiveness of early cancer screening programmes. The detection of early signs of breast cancer is vital in managing such a fast developing disease with poor survival rates. Some of the earliest signs of cancer in the breast are the clusters of microcalcifications. The proposed method is based on image filtering comprising partial differential equations (PDE) for image enhancement. Subsequently, microcalcifications are segmented using characteristics of the human visual system, based on the superior qualities of the human eye to depict localised changes of intensity and appearance in an image. Parameters are set according to the image characteristics, which makes the method fully automated. The detection of breast masses in temporal mammographic pairs is also investigated as part of the development of a complete breast cancer detection tool. The design of this latter algorithm is based on the detection sequence used by radiologists in clinical routine. To support the classification of masses into benign or malignant, novel tumour features are introduced. Image normalisation is another key concept discussed in this thesis along with its benefits for cancer detection.
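As a hedged illustration of a PDE-based enhancement step, the sketch below uses Perona-Malik anisotropic diffusion purely as a well-known representative example; the thesis's specific filtering equations and its human-visual-system segmentation stage are not reproduced here.

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, dt=0.2):
    """Smooth homogeneous regions while preserving edges (e.g. candidate calcifications)."""
    img = img.astype(float)
    for _ in range(n_iter):
        # Finite differences to the four neighbours
        n = np.roll(img, -1, axis=0) - img
        s = np.roll(img, 1, axis=0) - img
        e = np.roll(img, -1, axis=1) - img
        w = np.roll(img, 1, axis=1) - img
        # Edge-stopping conductances: small where gradients are large
        cn, cs = np.exp(-(n / kappa) ** 2), np.exp(-(s / kappa) ** 2)
        ce, cw = np.exp(-(e / kappa) ** 2), np.exp(-(w / kappa) ** 2)
        img = img + dt * (cn * n + cs * s + ce * e + cw * w)
    return img

smoothed = anisotropic_diffusion(np.random.rand(64, 64) * 255)   # toy input image
print(smoothed.shape)
```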
39

Human layout estimation using structured output learning

Mittal, Arpit January 2012
In this thesis, we investigate the problem of human layout estimation in unconstrained still images. This involves predicting the spatial configuration of body parts. We start our investigation with pictorial structure models and propose an efficient method of model fitting using skin regions. To detect the skin, we learn a colour model locally from the image by detecting the facial region. The resulting skin detections are also used for hand localisation. Our next contribution is a comprehensive dataset of 2D hand images. We collected this dataset from publicly available image sources, and annotated images with hand bounding boxes. The bounding boxes are not axis aligned, but are rather oriented with respect to the wrist. Our dataset is quite exhaustive as it includes images of different hand shapes and layout configurations. Using our dataset, we train a hand detector that is robust to background clutter and lighting variations. Our hand detector is implemented as a two-stage system. The first stage involves proposing hand hypotheses using complementary image features, which are then evaluated by the second stage classifier. This improves both precision and recall and results in a state-of-the-art hand detection method. In addition we develop a new method of non-maximum suppression based on super-pixels. We also contribute an efficient training algorithm for structured output ranking. In our algorithm, we reduce the time complexity of an expensive training component from quadratic to linear. This algorithm has a broad applicability and we use it for solving human layout estimation and taxonomic multiclass classification problems. For human layout, we use different body part detectors to propose part candidates. These candidates are then combined and scored using our ranking algorithm. By applying this bottom-up approach, we achieve accurate human layout estimation despite variations in viewpoint and layout configuration. In the multiclass classification problem, we define the misclassification error using a class taxonomy. The problem then reduces to a structured output ranking problem and we use our ranking method to optimise it. This allows inclusion of semantic knowledge about the classes and results in a more meaningful classification system. Lastly, we substantiate our ranking algorithm with theoretical proofs and derive the generalisation bounds for it. These bounds prove that the training error reduces to the lowest possible error asymptotically.
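A toy example of the ranking objective referred to above (an assumed pairwise hinge formulation for illustration only, not the thesis's algorithm): positive part candidates should score higher than negatives, and evaluating the loss over all pairs is the quadratic-cost step that the thesis's training algorithm reduces to linear time.

```python
import numpy as np

scores_pos = np.array([2.1, 1.4, 0.9])        # scores of correct part candidates
scores_neg = np.array([1.8, 0.3, -0.5, 0.1])  # scores of incorrect candidates

loss = 0.0
for sp in scores_pos:                         # O(|pos| * |neg|) pairs in the naive form
    for sn in scores_neg:
        loss += max(0.0, 1.0 - (sp - sn))     # hinge: want sp >= sn + margin
print(loss / (len(scores_pos) * len(scores_neg)))
```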
40

Model-based ultrasonic temperature estimation for monitoring HIFU therapy

Ye, Guoliang January 2008
High Intensity Focused Ultrasound (HIFU) is a new cancer thermal therapy method which has recently achieved encouraging results in clinics. However, the lack of temperature monitoring makes it hard to apply widely, safely and efficiently. Conventional ultrasonic temperature estimation based on echo strain suffers from artifacts caused by signal distortion over time, leading to poor estimation and visualisation of the 2D temperature map. This thesis presents a novel model-based stochastic framework for ultrasonic temperature estimation, which combines the temperature information from ultrasound images with a theoretical model of heat diffusion. Consequently the temperature estimation is more consistent over time and its visualisation is improved. There are three main contributions in this thesis, relating to: improving the conventional echo strain method for estimating temperature, developing and applying approximate heat models to model temperature, and finally combining the estimation and the models. First, for the echo-strain-based temperature estimation, a robust displacement estimator is introduced to remove displacement outliers caused by signal distortion over time due to the thermo-acoustic lens effect. To convert echo strain to temperature more accurately, an experimental method is designed to model their relationship using polynomials. Experimental results on a gelatine phantom show that the accuracy of the temperature estimation is of the order of 0.1 °C. This is better than previously reported results of 0.5 °C in a rubber phantom. Second, in the temperature modelling, heat models are derived approximately as Gaussian functions, which are mathematically simple. Simulated results demonstrate that the approximate heat models are reasonable. The simulated temperature result is analytical and hence computed in much less than 1 second, whereas a conventional finite element simulation requires about 25 minutes under the same conditions. Finally, combining the estimation and the heat models is the main contribution of this thesis. A 2D spatially adaptive Kalman filter, with the predictive step defined by the shape model from the heat models, is applied to the temperature map estimated from ultrasound images. It is shown that use of the temperature shape model enables more reliable temperature estimation in the presence of distorted or blurred strain measurements, which are typically found in practice. The experimental results on in vitro bovine liver show that the visualisation of the temperature map over time is more consistent and the iso-temperature contours are clearly visualised.
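For reference, the classical point-source solution of the heat diffusion equation is itself a Gaussian that spreads over time, which is consistent with the Gaussian heat models described above (this is the standard textbook form, not necessarily the exact parameterisation used in the thesis):

```latex
% 2-D heat-kernel solution for an instantaneous point source of strength Q,
% with diffusivity \kappa and radial distance r from the focus:
\frac{\partial T}{\partial t} = \kappa \nabla^{2} T
\qquad\Longrightarrow\qquad
T(r, t) = \frac{Q}{4 \pi \kappa t}\,\exp\!\left(-\frac{r^{2}}{4 \kappa t}\right)
```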
