41. Accurate telemonitoring of Parkinson's disease symptom severity using nonlinear speech signal processing and statistical machine learning. Tsanas, Athanasios, January 2012.
This study focuses on the development of an objective, automated method to extract clinically useful information from sustained vowel phonations in the context of Parkinson’s disease (PD). The aim is twofold: (a) differentiate PD subjects from healthy controls, and (b) replicate the Unified Parkinson’s Disease Rating Scale (UPDRS) metric, which provides a clinical impression of PD symptom severity. This metric spans the range 0 to 176, where 0 denotes a healthy person and 176 total disability. Currently, UPDRS assessment requires the physical presence of the subject in the clinic, is subjective, relying on the clinical rater’s expertise, and is logistically costly for national health systems. Hence, the practical frequency of symptom tracking is typically confined to once every several months, hindering recruitment for large-scale clinical trials and under-representing the true time scale of PD fluctuations. We develop a comprehensive framework to analyze speech signals by: (1) extracting novel, distinctive signal features, (2) using robust feature selection techniques to obtain a parsimonious subset of those features, and (3a) differentiating PD subjects from healthy controls, or (3b) determining UPDRS using powerful statistical machine learning tools. Towards this aim, we also investigate 10 existing fundamental frequency (F_0) estimation algorithms to determine the most useful for this application, and propose a novel ensemble F_0 estimation algorithm which leads to a 10% improvement in accuracy over the best individual approach. Moreover, we propose novel feature selection schemes which are shown to be very competitive against widely used, more complex schemes. We demonstrate that we can successfully differentiate PD subjects from healthy controls with 98.5% overall accuracy, and also provide rapid, objective, and remote replication of UPDRS assessment with clinically useful accuracy (approximately 2 UPDRS points from the clinicians’ estimates), using only simple, self-administered, and non-invasive speech tests. The findings of this study strongly support the use of speech signal analysis as an objective basis for practical clinical decision support tools in the context of PD assessment.
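A minimal sketch of the ensemble idea behind the F_0 estimator: several per-frame pitch tracks are fused into one so that an octave error from any single algorithm is suppressed. The median/weighted fusion below is an illustrative assumption, not the thesis's actual combination scheme, and all names are hypothetical.

```python
import numpy as np

def ensemble_f0(estimates, weights=None):
    """Fuse per-frame F0 estimates from several algorithms.

    estimates : (n_algorithms, n_frames) array of F0 values in Hz
                (0 or NaN where an algorithm reports an unvoiced frame).
    weights   : optional per-algorithm reliability weights.
    """
    est = np.asarray(estimates, dtype=float)
    est[est <= 0] = np.nan                    # treat unvoiced flags as missing
    if weights is None:
        return np.nanmedian(est, axis=0)      # robust to outlier algorithms
    w = np.asarray(weights, dtype=float)[:, None]
    mask = ~np.isnan(est)
    num = np.sum(np.where(mask, est * w, 0.0), axis=0)
    den = np.sum(np.where(mask, w, 0.0), axis=0)
    return num / den

# Example: three hypothetical algorithms disagreeing on one frame
tracks = [[120.0, 121.0, 119.5], [120.5, 240.0, 119.0], [119.8, 120.7, 120.1]]
print(ensemble_f0(tracks))   # median suppresses the octave error (240 Hz)
```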
42. Inexpensive uncertainty analysis for CFD applications. Ghate, Devendra, January 2014.
The work presented in this thesis aims to provide tools for use during the design process that make maximum use of the increasing availability of accurate engine blade measurement data in high-fidelity fluid mechanics simulations, at a reasonable computational expense. A new method of uncertainty propagation for geometric error is proposed for fluid mechanics codes using adjoint error correction. The Inexpensive Monte Carlo (IMC) method targets small uncertainties and provides the complete probability distribution of the objective function at a significantly reduced computational cost. A brief literature survey of existing methods is followed by the formulation of IMC. An example algebraic model is used to demonstrate the IMC method, which is then extended to fluid mechanics applications using Principal Component Analysis (PCA) for reduced-order modelling. Implementation details for the IMC method are discussed using an example airfoil code. Finally, the IMC method is implemented and validated for the industrial fluid mechanics code HYDRA. A consistent methodology has been developed for the automatic generation of the linear and adjoint codes by selective use of the automatic differentiation (AD) technique. The method has the advantage of keeping the linear and adjoint codes in sync with changes in the underlying nonlinear fluid mechanics solver. The use of various consistency checks is demonstrated to ease the development and maintenance of the linear and adjoint codes. The use of AD has been extended to the calculation of the complete Hessian using a forward-on-forward approach. The complete mathematical formulation for Hessian calculation using the linear and adjoint solutions is outlined for fluid mechanics solvers, and an efficient implementation of the Hessian calculation is demonstrated using the airfoil code. A new application of Independent Component Analysis (ICA) is proposed for manufacturing uncertainty source identification. The mathematical formulation is outlined, followed by an example application of ICA to artificially generated uncertainty for the NACA0012 airfoil.
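The core of the IMC idea can be illustrated in a few lines: once a nominal solve and an adjoint-supplied gradient are available, each Monte Carlo sample of a small geometric perturbation costs only a dot product rather than a full nonlinear solve. The algebraic model and all names below are illustrative stand-ins, not taken from the thesis.

```python
import numpy as np

def objective(alpha):
    """Stand-in algebraic model for an expensive CFD objective."""
    return np.sum(alpha**2) + np.sin(alpha).sum()

def gradient(alpha):
    """Gradient as an adjoint solver would supply it."""
    return 2 * alpha + np.cos(alpha)

alpha0 = np.array([0.3, -0.1, 0.7])           # nominal geometry parameters
J0, g = objective(alpha0), gradient(alpha0)   # one nonlinear + one adjoint solve

rng = np.random.default_rng(0)
samples = rng.normal(scale=1e-3, size=(100_000, alpha0.size))  # small errors

J_imc = J0 + samples @ g                      # linearised, near-free samples
J_full = np.array([objective(alpha0 + d) for d in samples[:1000]])

print(np.allclose(J_imc[:1000], J_full, atol=1e-4))  # linearisation holds
print(J_imc.mean(), J_imc.std())              # full output distribution
```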
43. A laser based straightness monitor for a prototype automated linear collider tunnel surveying system. Moss, Gregory Richard, January 2013.
For precise measurement of new TeV-scale physics and precision studies of the Higgs boson, a new lepton collider is required. To enable meaningful analysis, a centre-of-mass energy of 500 GeV and a luminosity of 10^34 cm^-2 s^-1 are needed. The planned 31 km long International Linear Collider is capable of meeting these targets, requiring a final emittance of 10 micro-radians horizontally and 35 nm·rad vertically. To achieve these demanding emittance values, the accelerator components in the main linacs must be aligned against an accurately mapped network of reference markers along the entire tunnel. An automated system could map this tunnel network quickly, accurately, safely and repeatedly; the Linear Collider Alignment and Survey (LiCAS) Rapid Tunnel Reference Surveyor (RTRS) is a working prototype of such a system. The LiCAS RTRS is a train of measurement units that accurately locate regularly spaced retro-reflector markers using Frequency Scanning Interferometry (FSI). The unit locations with respect to each other are precisely reconstructed using a Laser Straightness Monitor (LSM) and tilt sensor system, along with a system of internal FSI lines. The design, commissioning, practical usage, calibration, and reconstruction performance of the LSM are addressed in this work. The commissioned RTRS is described and the properties of the LSM components are investigated in detail. A method of finding the position of laser beam spots on the LSM cameras is developed, along with a process for combining individual spot positions into a more robust measurement compatible with the data from other sub-systems. Laser beam propagation along the LSM is modelled, and a robust method of reconstructing CCD beam spot position measurements into positions and orientations of the LSM units is described. A method of calibrating LSM units using an external witness system is presented, along with a way of using the overdetermined nature of the LSM to improve calibration constant errors by including data taken from unwitnessed runs. The reconstruction uncertainty of the LSM system, inclusive of both statistical and systematic effects, is found to be 5.8 microns × 5.3 microns in lateral translations and 27.6 microradians × 34.1 microradians in rotations perpendicular to the beam, with an uncertainty of 51.1 microradians in rotations around the beam coming from the tilt-sensor arrangement.
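A hedged sketch of the first step in the spot-finding chain: locating a laser beam spot on a CCD frame by background subtraction and an intensity-weighted centroid. The thesis develops a more robust method that combines spot positions across sub-systems; this only illustrates the baseline centroid computation, with illustrative parameters.

```python
import numpy as np

def spot_centroid(image, background_percentile=50):
    """Locate a laser beam spot on a CCD frame via an intensity-weighted
    centroid after subtracting a percentile-based background pedestal."""
    img = image.astype(float)
    img -= np.percentile(img, background_percentile)   # remove pedestal
    img[img < 0] = 0.0
    total = img.sum()
    ys, xs = np.indices(img.shape)                     # pixel coordinates
    return (xs * img).sum() / total, (ys * img).sum() / total

# Synthetic Gaussian spot centred at (12.3, 7.8) on a 32 x 24 frame
ys, xs = np.indices((24, 32))
frame = np.exp(-((xs - 12.3)**2 + (ys - 7.8)**2) / (2 * 2.0**2))
print(spot_centroid(frame))   # close to (12.3, 7.8)
```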
44. Cavitation-enhanced delivery of therapeutics to solid tumors. Rifai, Bassel, January 2011.
Poor drug penetration through tumor tissue has emerged as a fundamental obstacle to cancer therapy. The solid tumor microenvironment presents several physiological abnormalities which reduce the uptake of intravenously administered therapeutics, including leaky, irregularly spaced blood vessels and a pressure gradient which resists transport of therapeutics from the bloodstream into the tumor. Because of these factors, a systemically administered anti-cancer agent is unlikely to reach 100% of cancer cells at therapeutic dosages, the coverage required for curative treatment. The goal of this project is to use high-intensity focused ultrasound (HIFU) to enhance drug delivery via phenomena associated with acoustic cavitation. ‘Cavitation’ is the formation, oscillation, and collapse of bubbles in a sound field, and can be broadly divided into two types: ‘inertial’ and ‘stable’. Inertial cavitation involves violent bubble collapse and is associated with phenomena such as heating, fluid jetting, and broadband noise emission. Stable cavitation occurs at lower pressure amplitudes and can generate liquid microstreaming in the bubble vicinity. It is the combination of fluid jetting and microstreaming that this work attempts to explore, control, and apply to the drug delivery problem in solid tumors. First, the potential for cavitation to enhance the convective transport of a model therapeutic into obstructed vasculature is evaluated in a cell-free in vitro tumor model. Transport is quantified using post-treatment image analysis of the distribution of a dye-labeled macromolecule, while cavitation activity is quantified by analyzing passively recorded acoustic emissions. The introduction of exogenous cavitation nuclei into the acoustic field is found to dramatically enhance both cavitation activity and convective transport. The strong correlation between inertial cavitation activity and drug delivery in this study suggests both a mechanism of action and the clinical potential for non-invasive treatment monitoring. Next, a flexible and efficient method is developed to numerically simulate the microstreaming fields instigated by cavitating microbubbles. The technique is applied to quantifying the convective transport of a scalar quantity in the vicinity of acoustically cavitating microbubbles of various initial radii subject to a range of sonication parameters, yielding insight into treatment parameter choice. Finally, in vitro and in vivo models are used to explore the effect of HIFU on the delivery and expression of a biologically active adenovirus. The role of cavitation in improving the distribution of adenovirus in porous media is established, as well as the critical role of certain sonication parameters in sustaining cavitation activity in vivo. It is shown that following intratumoral or intravenous co-injection of ultrasound contrast agents and adenovirus, both the distribution and expression of viral transgenes are enhanced in the presence of inertial cavitation. This ultrasound-based drug delivery system has the potential to be applied in conjunction with a broad range of macromolecular therapeutics to augment their bioavailability for cancer treatment. To reach this objective, further developmental work is recommended, directed towards improving therapeutic transducer design, using transducer arrays for treatment monitoring and mapping, and continuing the development of functionalized monodisperse cavitation nuclei.
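The passive monitoring idea, classifying cavitation activity from recorded acoustic emissions, reduces to a spectral computation: harmonics of the drive frequency indicate stable bubble oscillation, while elevated broadband noise between the harmonics indicates inertial collapse. The metric below is a simplified sketch; the bandwidths and signal parameters are assumptions, not values from the thesis.

```python
import numpy as np

def inertial_cavitation_level(signal, fs, f_drive):
    """Crude passive-cavitation-detection metric: RMS broadband energy in
    the received spectrum after masking the drive frequency harmonics."""
    spec = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    mask = np.ones_like(freqs, dtype=bool)
    for k in range(1, int(freqs[-1] // f_drive) + 1):
        mask &= np.abs(freqs - k * f_drive) > 0.1 * f_drive  # drop harmonics
    return np.sqrt(np.mean(spec[mask]**2))

fs, f0 = 20e6, 1e6                            # 20 MHz sampling, 1 MHz HIFU
t = np.arange(4096) / fs
quiet = np.sin(2 * np.pi * f0 * t)            # harmonic content only
noisy = quiet + 0.5 * np.random.default_rng(1).normal(size=t.size)
print(inertial_cavitation_level(quiet, fs, f0) <
      inertial_cavitation_level(noisy, fs, f0))   # True: broadband detected
```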
45. Efficient numerical methods for ultrasound elastography. Squires, Timothy Richard, January 2012.
In this thesis, two algorithms are introduced for use in ultrasound elastography, a technique developed in the last 20 years by which anomalous regions in soft tissue are located and diagnosed without the need for biopsy. Because of this, the relatively low cost of ultrasound imaging, and the high level of accuracy of the methods, ultrasound elastography has shown great potential for the diagnosis of cancer in soft tissues. The algorithms introduced in this thesis represent an advance in this field. The first algorithm is a two-step iteration procedure consisting of two minimization problems, displacement estimation and elastic parameter calculation, that together allow diagnosis of any anomalous regions within soft tissue. The algorithm improves on existing methods in several ways: a weighting factor is introduced for each point in the tissue, dependent on the confidence in the accuracy of the data at that point; an exponential substitution is made for the elasticity modulus; an adjoint method is used for efficient calculation of the gradient vector; and a total variation regularization technique is used. Most importantly, an adaptive mesh refinement strategy is introduced that allows highly efficient calculation of the elasticity distribution of the tissue, using a number of degrees of freedom several orders of magnitude lower than methods that use a uniform mesh refinement strategy. Results are presented that show the algorithm is robust even in the presence of significant noise and that it can locate a tumour of 4 mm in diameter within a 5 cm square region of tissue. The algorithm is also extended into three dimensions, and results are presented showing that it can calculate a three-dimensional elasticity distribution efficiently; this extension into 3D is a significant advance in the field. The second algorithm is a one-step algorithm that seeks to combine the two problems of elasticity distribution and displacement calculation into one. As in the two-step algorithm, a weighting factor, exponential substitution for the elasticity parameter, adjoint method for calculation of the gradient vector, total variation regularization and adaptive mesh refinement strategy are incorporated. Results are presented showing that this original approach can locate tumours of varying sizes and shapes in the presence of varying levels of added artificial noise, and that it can determine the presence of a tumour in images taken from breast tissue in vivo.
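A schematic of the two-step iteration on a toy 1D problem, showing the exponential substitution E = exp(theta) that keeps the recovered modulus positive. The forward model below stands in for the FEM solve, and the "measured" displacements stand in for the ultrasound displacement estimate; all values and the plain gradient-descent update are illustrative assumptions, not the thesis's implementation.

```python
import numpy as np

def forward_model(theta):
    """Toy 'tissue': displacement of a 1D elastic bar under unit load,
    u_i ~ 1 / E_i with E = exp(theta). A stand-in for the FEM solver."""
    return 1.0 / np.exp(theta)

theta = np.zeros(50)                          # initial guess: E = 1 everywhere
E_true = np.ones(50); E_true[20:30] = 5.0     # stiff 'tumour' inclusion
u_meas = 1.0 / E_true                         # step 1: displacement estimate

for _ in range(200):                          # step 2: parameter update
    u = forward_model(theta)
    residual = u - u_meas
    grad = residual * (-u)                    # du/dtheta = -exp(-theta) = -u
    theta -= 2.0 * grad                       # descent (adjoint-based in 2D/3D)

print(np.exp(theta)[18:32].round(2))          # recovers the stiff inclusion
```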
46. DifFUZZY: a novel clustering algorithm for systems biology. Cominetti Allende, Ornella Cecilia, January 2012.
Current studies of the highly complex pathobiology and molecular signatures of human disease require the analysis of large sets of high-throughput data, from clinical to gene expression experiments, containing a wide range of information types. A number of computational techniques are used to analyse such high-dimensional bioinformatics data. In this thesis we focus on the development of a novel soft clustering technique, DifFUZZY, a fuzzy clustering algorithm applicable to a larger class of problems than other soft clustering approaches. This method is better at handling datasets that contain clusters that are curved, elongated, or of different dispersion. We show how DifFUZZY outperforms a number of frequently used clustering algorithms on several examples of synthetic and real datasets. Furthermore, a quality measure based on the diffusion distance developed for DifFUZZY is presented, which is employed to automate the choice of its main parameter. We later apply DifFUZZY and other techniques to data from a clinical study of children from The Gambia with different types of severe malaria. The first step was to identify the most informative features in the dataset which allowed us to separate the different groups of patients. This led us to reproduce the World Health Organisation classification for severe malaria syndromes and to obtain a reduced dataset for further analysis. In order to validate these features as relevant for malaria across the continent, and not only in The Gambia, we used a larger dataset of children from different sites in Sub-Saharan Africa. With the use of a novel network visualisation algorithm, we identified pathobiological clusters from which we made, and subsequently verified, clinical hypotheses. We finish by presenting conclusions and future directions, including image segmentation and the clustering of time-series data. We also suggest how we could bridge data modelling with bioinformatics by embedding microarray data into cell models. Towards this end we take as a case study a multiscale model of the intestinal crypt using a cell-vertex model.
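A sketch of the diffusion-distance idea underlying soft clustering of this kind: points are connected by Gaussian affinities, distances are measured after t steps of a random walk, and soft memberships then follow the curved geometry of the clusters. This illustrates the principle only, not the published DifFUZZY algorithm; sigma, t, and the use of a single core point per cluster are assumptions.

```python
import numpy as np

def fuzzy_memberships(X, cores, sigma=1.0, t=8):
    """Soft cluster memberships from diffusion distances to per-cluster
    core points (one representative index per cluster)."""
    d2 = ((X[:, None, :] - X[None, :, :])**2).sum(-1)
    W = np.exp(-d2 / (2 * sigma**2))              # Gaussian affinities
    P = W / W.sum(axis=1, keepdims=True)          # Markov transition matrix
    Pt = np.linalg.matrix_power(P, t)             # diffuse for t steps
    # diffusion distance from point i to core c: ||Pt[i] - Pt[c]||
    D = np.stack([np.linalg.norm(Pt - Pt[c], axis=1) for c in cores], axis=1)
    U = 1.0 / (D + 1e-9)                          # closer => higher membership
    return U / U.sum(axis=1, keepdims=True)       # rows sum to one

# Two elongated, curved point clouds; memberships follow their geometry
rng = np.random.default_rng(2)
a = np.c_[np.linspace(0, 4, 30), np.sin(np.linspace(0, 4, 30))]
b = np.c_[np.linspace(0, 4, 30), 3 + np.cos(np.linspace(0, 4, 30))]
X = np.vstack([a, b]) + 0.05 * rng.normal(size=(60, 2))
print(fuzzy_memberships(X, cores=[0, 30]).round(2)[:3])
```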
47. Large-scale layered systems and synthetic biology: model reduction and decomposition. Prescott, Thomas Paul, January 2014.
This thesis is concerned with large-scale systems of Ordinary Differential Equations that model Biomolecular Reaction Networks (BRNs) in Systems and Synthetic Biology. It addresses the strategies of model reduction and decomposition used to overcome the challenges posed by the high dimension and stiffness typical of these models. A number of developments of these strategies are identified, and their implementation on various BRN models is demonstrated. The goal of model reduction is to construct a simplified ODE system that closely approximates a large-scale system. The error estimation problem seeks to quantify the approximation error; this is an example of the trajectory comparison problem. The first part of this thesis applies semi-definite programming (SDP) and dissipativity theory to this problem, producing a single a priori upper bound on the difference between two models in the presence of parameter uncertainty and for a range of initial conditions, for which exhaustive simulation is impractical. The second part of this thesis is concerned with the BRN decomposition problem of expressing a network as an interconnection of subnetworks. A novel framework, called layered decomposition, is introduced and compared with established modular techniques. Fundamental properties of layered decompositions are investigated, providing basic criteria for choosing an appropriate layered decomposition. Further aspects of the layering framework are considered: we illustrate the relationship between decomposition and scale separation by constructing singularly perturbed BRN models using layered decomposition, and we reveal the inter-layer signal propagation structure by decomposing the steady-state response to parametric perturbations. Finally, we consider the large-scale SDP problem, where standard SDP techniques fail to certify a system’s dissipativity. We describe the framework of Structured Storage Functions (SSF), defined where systems admit a cascaded decomposition, and demonstrate a significant resulting speed-up of large-scale dissipativity problems, with applications to the trajectory comparison technique discussed above.
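A toy instance of the kind of dissipativity certificate computed by SDP: finding a quadratic storage function V(x) = x'Px for a stable linear system. The thesis's contribution is making such problems tractable at scale via Structured Storage Functions; the two-state system and the cvxpy formulation below are purely illustrative.

```python
import numpy as np
import cvxpy as cp

# Lyapunov/storage-function certificate for dx/dt = A x:
# find symmetric P with P >> 0 and A'P + PA << 0.
A = np.array([[-1.0, 2.0],
              [0.0, -3.0]])      # Hurwitz (eigenvalues -1, -3)
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),
               A.T @ P + P @ A << -eps * np.eye(n)]
prob = cp.Problem(cp.Minimize(cp.trace(P)), constraints)
prob.solve()

print(prob.status, np.round(P.value, 3))   # 'optimal' and a valid P
```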
48. Assessment of collateral blood flow in the brain using magnetic resonance imaging. Okell, Thomas William, January 2011.
Collateral blood flow is the compensatory flow of blood to tissue through secondary channels when the primary channel is compromised. It is of vital importance in cerebrovascular disease, where collateral flow can maintain large regions of brain tissue which would otherwise have suffered ischaemic damage. Traditional X-ray based techniques for visualising collateral flow are invasive and carry risks to the patient. In this thesis, novel magnetic resonance imaging techniques for performing vessel-selective labelling of brain feeding arteries are explored and developed to reveal the source and extent of collateral flow in the brain non-invasively and without the use of contrast agents. Vessel-encoded pseudo-continuous arterial spin labelling (VEPCASL) allows the selective labelling of blood water in different combinations of brain feeding arteries that can be combined in post-processing to yield vascular territory maps. The mechanism of VEPCASL was elucidated and optimised through simulations of the Bloch equations and phantom experiments, including its sensitivity to sequence parameters, blood velocity and off-resonance effects. An implementation of the VEPCASL pulse sequence using an echo-planar imaging (EPI) readout was applied in healthy volunteers to enable optimisation of the post-labelling delay and the choice of labelling plane position. Improvements to the signal-to-noise ratio (SNR) and motion sensitivity were made through the addition of background suppression pulses and a partial-Fourier scheme. Experiments using a three-dimensional gradient and spin echo (3D-GRASE) readout were somewhat compromised by significant blurring in the slice direction, but showed potential for future work, with a high SNR and reduced dropout artefacts. The VEPCASL preparation was also applied to a dynamic 2D angiographic readout, allowing direct visualisation of collateral blood flow in the brain as well as a morphological and functional assessment of the major cerebral arteries. The application of a balanced steady-state free precession (bSSFP) readout significantly increased the acquisition efficiency, allowing the generation of dynamic 3D vessel-selective angiograms. A theoretical model of the dynamic angiographic signal was also derived, allowing quantification of blood flow through specified vessels and providing a significant advantage over qualitative X-ray based methods. Finally, these methods were applied to a number of patient groups, including those with vertebro-basilar disease, carotid stenosis and arteriovenous malformation. These preliminary studies demonstrate that useful clinical information regarding collateral blood flow can be obtained with these techniques.
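The post-processing step of vessel-encoded ASL can be sketched as a linear decode: each acquisition cycle tags a different sign combination of feeding arteries, and per-artery contributions are recovered voxel-wise by least squares. The 4x3 encoding matrix below is a simple illustrative example, not the thesis's actual encoding scheme.

```python
import numpy as np

# rows = acquisitions, columns = [static tissue, artery 1, artery 2];
# +1 means blood from that artery arrives unlabelled, -1 means tagged.
E = np.array([[1,  1,  1],    # control: no arteries tagged
              [1, -1, -1],    # tag both arteries
              [1, -1,  1],    # tag artery 1 only
              [1,  1, -1]])   # tag artery 2 only

rng = np.random.default_rng(3)
static, a1, a2 = 100.0, 4.0, 2.5           # per-voxel signal components
y = E @ np.array([static, a1, a2]) + 0.1 * rng.normal(size=4)

x, *_ = np.linalg.lstsq(E, y, rcond=None)  # voxel-wise decode
print(x.round(2))                          # recovers [static, a1, a2]
```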
49. Analysis of 3D echocardiography. Chykeyuk, Kiryl, January 2014.
Heart disease is the major cause of death in the developed world. Due to its fast, portable, low-cost and harmless way of imaging the heart, echocardiography has become the most frequent tool for the diagnosis of cardiac function in clinical routine. However, visual assessment of heart function from echocardiography is challenging, highly operator-dependent, and subject to intra- and inter-observer errors. The development of automated methods for echocardiography analysis is therefore important for accurate assessment of cardiac function. In this thesis we develop new ways to model echocardiography data using Bayesian machine learning methods, addressing three problems: (i) wall motion analysis in 2D stress echocardiography, (ii) segmentation of the myocardium in 3D echocardiography, and (iii) standard view extraction from 3D echocardiography. Firstly, we propose and compare four discriminative methods for feature extraction and wall motion classification of 2D stress echocardiography (images of the heart taken at rest and after exercise or pharmacological stress). The four methods are based on: (i) Support Vector Machines; (ii) Relevance Vector Machines; (iii) the Lasso algorithm and Regularised Least Squares; and (iv) Elastic Net regularisation and Regularised Least Squares. Although all the methods are shown to have superior performance to the state of the art, one conclusion is that good segmentation of the myocardium in echocardiography is key to accurate assessment of cardiac wall motion. We investigate the application of one of the most promising current machine learning techniques, Random Decision Forests, to segmenting the myocardium from 3D echocardiograms, and demonstrate that more reliable, ultrasound-specific descriptors are needed in order to achieve the best results. Specifically, we introduce two sets of new features to improve the segmentation results: (i) LoCo and GloCo features, with a local and a global shape constraint on coupled endo- and epicardial boundaries, and (ii) FA features, which use the Feature Asymmetry measure to highlight step-like edges in echocardiographic images. We also reinforce traditional features, such as Haar and Rectangular features, by aligning the 3D echocardiograms. For this we develop a new registration technique based on aligning the centre lines of the left ventricles, and show that with alignment, performance is boosted by approximately 15%. Finally, a novel approach to detecting planes in 3D images using regression voting is proposed. To the best of our knowledge, we are the first to use a one-step regression approach for the task of plane detection in 3D images. We investigate its application to standard view extraction from 3D echocardiography, to facilitate efficient clinical inspection of cardiac abnormalities and diseases. We further develop a new method, called the Class-Specific Regression Forest, where class label information is incorporated into the training phase to reinforce learning from classes that are semantically relevant to the problem. During testing, the votes from irrelevant classes are excluded from voting to maximise the confidence of the output predictors. We demonstrate that the Class-Specific Regression Forest outperforms the classic Regression Random Forest and produces results comparable to manual annotations.
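A minimal sketch of regression voting for plane detection, reduced to one dimension: each image patch regresses the offset from its own position to the target plane, and the pooled votes give a robust detection. scikit-learn's generic random forest stands in for the thesis's Class-Specific Regression Forest; the geometry and features are synthetic and illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(4)
n_patches, plane_true = 500, 42.0
positions = rng.uniform(0, 100, n_patches)
features = np.c_[positions, rng.normal(size=(n_patches, 3))]  # appearance
offsets = plane_true - positions              # training target: offset to plane

forest = RandomForestRegressor(n_estimators=50, random_state=0)
forest.fit(features, offsets)

# At test time every patch casts a vote for the plane location
test_pos = rng.uniform(0, 100, 200)
test_feat = np.c_[test_pos, rng.normal(size=(200, 3))]
votes = test_pos + forest.predict(test_feat)
print(np.median(votes))                       # ~42: the detected plane
```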
50. Modelling embankment breaching due to overflow. van Damme, Myron, January 2014.
Correct modelling of embankment breach formation is essential for an accurate assessment of the associated flood risk. Modelling breach formation due to overflow requires a thorough understanding of the geotechnical processes in unsaturated soils as well as of erosion processes under supercritical flow conditions. This thesis describes a 1D slope stability analysis for unsaturated soils whose moisture content changes with time, which shows that sediment-laden gravity flows play an important role in the erosion behaviour of embankments. The thesis also describes a practical, fast breach model based on a simplified description of the physical processes, which can be used in modelling and decision support frameworks for flooding. To predict the breach hydrograph in the case of failure due to overflow, the rapid model distinguishes between breach formation due to headcut erosion and due to surface erosion; it also predicts the breach hydrograph in the case of failure due to piping. The assumptions underlying breach flow modelling are reviewed, resulting in a new set of breadth-integrated Navier-Stokes equations that account for wall shear stresses and a variable breadth geometry. The vertical 2D flow field described by these equations can be used to calculate accurately the stresses on the embankment during the early stages of breach formation. Pressure-correction methods are given for solving the 2D Navier-Stokes equations for a variable breadth, and good agreement is found when validating the flow model against analytical solutions.
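A hedged sketch of a 1D infinite-slope stability check for unsaturated soil, using the extended Mohr-Coulomb criterion in which matric suction contributes apparent cohesion through the angle phi_b: as the embankment wets up and suction is lost, the factor of safety falls. This is a generic textbook formulation rather than the thesis's analysis, and all parameter values are illustrative.

```python
import numpy as np

def factor_of_safety(c_eff, phi_eff, phi_b, suction, gamma, z, beta):
    """Infinite-slope factor of safety for unsaturated soil:
    strength = c' + sigma_n tan(phi') + (ua - uw) tan(phi_b),
    with pore air pressure taken as atmospheric (ua = 0)."""
    beta = np.radians(beta)
    sigma_n = gamma * z * np.cos(beta)**2          # normal stress on plane
    tau = gamma * z * np.sin(beta) * np.cos(beta)  # driving shear stress
    strength = (c_eff + sigma_n * np.tan(np.radians(phi_eff))
                + suction * np.tan(np.radians(phi_b)))
    return strength / tau

# Stability drops as suction is lost during wetting (units: kPa, kN/m^3, m, deg)
for suction in (20.0, 10.0, 0.0):
    print(suction, round(factor_of_safety(
        c_eff=2.0, phi_eff=30.0, phi_b=15.0, suction=suction,
        gamma=18.0, z=1.5, beta=35.0), 2))   # FS falls below 1 when saturated
```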