Development of statistical shape and intensity models of eroded scapulae to improve shoulder arthroplasty

Sharif Ahmadian, Azita. 22 December 2021
Reverse total shoulder arthroplasty (RTSA) is an effective surgical alternative to conventional total shoulder arthroplasty for patients with severe rotator cuff tears and glenoid erosion. To help optimize RTSA design, it is necessary to gain insight into the geometry of glenoid erosions and to consider their unique morphology across the entire bone. One of the most powerful tools for systematically quantifying and visualizing the variation of bone geometry throughout a population is Statistical Shape Modeling (SSM); this method assesses variation in the full shape of a bone, rather than in discrete anatomical features, which makes it very useful for identifying abnormalities, planning surgeries, and improving implant designs. Many scapula SSMs have recently been presented in the literature; however, each was created using normal, healthy bones. Creating a scapula SSM derived exclusively from patients exhibiting complex glenoid bone erosion is therefore both critical and significantly challenging. In addition, several studies have quantified scapular bone properties in patients with complex glenoid erosion, but because of their discrete nature these analyses cannot serve as the basis for Finite Element Modeling (FEM). Thus, a need exists to systematically quantify the variation of bone properties in a glenoid erosion patient population using a method that captures variation across the entire bone. This can be achieved using Statistical Intensity Modeling (SIM), which can then generate scapula FEMs with realistic bone properties for the evaluation of orthopaedic implants. Using an SIM enables researchers to generate models with bone properties that represent a specific, known portion of the population variation, which makes the findings more generalizable.
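At its core, an SSM of this kind is a principal component analysis over corresponded point clouds. The following is a minimal sketch, not the thesis's implementation; the function names, array layout, and the plain SVD-based PCA are illustrative assumptions:

```python
import numpy as np

def build_ssm(shapes):
    """Build a point-distribution SSM from corresponded shapes.

    shapes: array of shape (n_bones, n_points, 3); point k on every
    bone must refer to the same anatomical location (correspondence).
    """
    n_bones = shapes.shape[0]
    X = shapes.reshape(n_bones, -1)        # flatten each bone to one row
    mean_shape = X.mean(axis=0)
    Xc = X - mean_shape
    # SVD of the centered data matrix yields the principal modes
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    eigvals = s ** 2 / (n_bones - 1)       # variance captured by each mode
    return mean_shape, Vt, eigvals         # rows of Vt are unit mode vectors

def synthesize(mean_shape, modes, eigvals, b):
    """Generate a shape from mode weights b (in standard deviations)."""
    x = mean_shape + (b * np.sqrt(eigvals[: len(b)])) @ modes[: len(b)]
    return x.reshape(-1, 3)
```

Sampling `b` within a few standard deviations then produces plausible new scapula geometries from the learned population variation.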
Accordingly, the main purpose of this research is to develop an SSM and an SIM: the SSM mathematically quantifies, in a systematic manner, the variation in the complex geometry of scapulae with severe glenoid erosion, while the SIM determines the main modes of variation in bone property distribution, which could be used in future FEM studies. To draw meaningful statistical conclusions from the dataset, corresponding parts of the scapula must be compared and related. To achieve this correspondence, 3D triangulated mesh models of 61 scapulae were created from pre-operative CT scans of patients treated with RTSA, and a Non-Rigid (NR) registration method was then used to morph one atlas point cloud to the shapes of all other bones. However, the more complex the shape, the more difficult it is to maintain good correspondence. To overcome this challenge, we adapted and optimized an NR Iterative Closest Point (ICP) method and applied it to the 61 eroded scapulae, which results in each bone shape having an identical mesh structure (i.e., the same number and anatomical location of points). To assess the quality of the proposed algorithm, the resulting correspondence error was evaluated by comparing the positions of ground-truth points with the corresponding point locations produced by the algorithm. The average correspondence error across all anatomical landmarks and two observers was 2.74 mm, with inter- and intra-observer reliabilities of ±0.31 and ±0.06 mm. Moreover, the Root-Mean-Square (RMS) and Hausdorff errors of geometric registration between the original and deformed models were calculated to be 0.25±0.04 mm and 0.76±0.14 mm, respectively. After registration, Principal Component Analysis (PCA) was applied to the deformed models as a group to describe independent modes of variation in the dataset. The robustness of the SSM was also evaluated using three standard metrics: compactness, generality, and specificity.
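The RMS and Hausdorff errors reported above are generic point-cloud distances; a minimal sketch of computing them is shown below. The `registration_errors` helper and the use of SciPy k-d trees are assumptions for illustration, not the thesis's code:

```python
import numpy as np
from scipy.spatial import cKDTree

def registration_errors(original_pts, deformed_pts):
    """Symmetric RMS and Hausdorff distances between two point clouds.

    Both inputs are (n, 3) arrays; distances are measured to the
    nearest neighbor in the opposite cloud, in both directions.
    """
    d_fwd, _ = cKDTree(original_pts).query(deformed_pts)  # deformed -> original
    d_bwd, _ = cKDTree(deformed_pts).query(original_pts)  # original -> deformed
    all_d = np.concatenate([d_fwd, d_bwd])
    rms = np.sqrt(np.mean(all_d ** 2))
    hausdorff = max(d_fwd.max(), d_bwd.max())
    return rms, hausdorff
```

The symmetric (two-way) form is used because a one-way distance can miss regions of one surface that the other fails to cover.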
Regarding compactness, the first 9 principal modes of variation accounted for 95% of the variability, while the model’s generality error and the specificity calculated over 10,000 instances were 2.6 mm and 2.99 mm, respectively. The SIM results showed that the first mode of variation accounts for overall changes in intensity across the entire bone, while the second mode represents localized changes in glenoid vault bone quality. The third mode showed changes in intensity at the posterior and inferior glenoid rim associated with posteroinferior rim erosion, which suggests avoiding fixation in this region and preferentially placing screws in the anterosuperior region of the glenoid to improve implant fixation. / Graduate
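The compactness figure quoted above (9 modes for 95% variability) reduces to a cumulative sum over the PCA eigenvalues. A minimal sketch, with the function name assumed for illustration:

```python
import numpy as np

def compactness(eigvals, threshold=0.95):
    """Number of leading principal modes needed to explain `threshold`
    of the total variance; eigvals must be sorted in descending order."""
    ratios = np.cumsum(eigvals) / np.sum(eigvals)
    # first index where the cumulative ratio reaches the threshold
    return int(np.searchsorted(ratios, threshold) + 1)
```

Generality and specificity require the model itself (leave-one-out reconstruction error and the surface distance of random synthetic instances to the training set, respectively), so they are not reproduced in this short sketch.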

A tale of two applications: closed-loop quality control for 3D printing, and multiple imputation and the bootstrap for the analysis of big data with missingness

Wenbin Zhu (12226001). 20 April 2022
1. A Closed-Loop Machine Learning and Compensation Framework for Geometric Accuracy Control of 3D Printed Products

Additive manufacturing (AM) systems enable direct printing of three-dimensional (3D) physical products from computer-aided design (CAD) models. Despite the many advantages that AM systems have over traditional manufacturing, one significant limitation impeding their wide adoption is geometric inaccuracy: shape deviations between the printed product and the nominal CAD model. Machine learning for shape deviations can enable geometric accuracy control of 3D printed products via the generation of compensation plans, which are modifications of CAD models, informed by the machine learning algorithm, that reduce deviations in expectation. However, existing machine learning and compensation frameworks cannot accommodate deviations of fully 3D shapes with different geometries. The feasibility of existing frameworks for geometric accuracy control is further limited by resource constraints in AM systems that prevent the printing of multiple copies of new shapes.

We present a closed-loop machine learning and compensation framework that can improve geometric accuracy control of 3D shapes in AM systems. Our framework is based on a Bayesian extreme learning machine (BELM) architecture that leverages data and deviation models from previously printed products to transfer deviation models, and more accurately capture deviation patterns, for new 3D products. The closed-loop nature of compensation under our framework, in which past compensated products that do not adequately meet dimensional specifications are fed back into the BELMs to re-learn the deviation model, enables the identification of effective compensation plans and satisfies resource constraints by printing only one new shape at a time.

The power and cost-effectiveness of our framework are demonstrated with two validation experiments involving different geometries for a Markforged Metal X AM machine printing 17-4 PH stainless steel products. As demonstrated in our case studies, our framework can reduce shape inaccuracies by 30% to 60% (depending on a shape's geometric complexity) in at most two iterations, with three training shapes and one or two test shapes per geometry involved across the iterations. We also perform an additional validation experiment using a third geometry to establish the capabilities of our framework for prospective shape deviation prediction of 3D shapes that have never been printed before. This third experiment indicates that choosing one suitable class of past products for prospective prediction and model transfer, instead of including all past printed products with different geometries, can be sufficient for obtaining deviation models with good predictive performance. Ultimately, our closed-loop machine learning and compensation framework provides an important step towards accurate and cost-efficient deviation modeling and compensation for fully 3D printed products using a minimal number of printed training and test shapes, and can thereby advance AM as a high-quality manufacturing paradigm.

2. Multiple Imputation and the Bootstrap for the Analysis of Big Data with Missingness

Inference can be a challenging task for Big Data. Two significant issues are that Big Data frequently exhibit complicated missing data patterns, and that the complex statistical models and machine learning algorithms typically used to analyze Big Data do not have convenient quantification of uncertainties for estimators. These two difficulties have previously been addressed using multiple imputation and the bootstrap, respectively. However, it is not clear how multiple imputation and bootstrap procedures can be effectively combined to perform statistical inferences on Big Data with missing values. We investigate a practical framework for the combination of multiple imputation and bootstrap methods. Our framework is based on two principles: the distribution of multiple imputation and bootstrap calculations across parallel computational cores, and the quantification of the sources of variability involved in bootstrap procedures that use subsampling techniques via random effects or hierarchical models. This framework effectively extends the scope of existing methods for multiple imputation and the bootstrap to a broad range of Big Data settings. We perform simulation studies for linear and logistic regression across Big Data settings with different rates of missingness to characterize the frequentist properties and computational efficiencies of combinations of multiple imputation and the bootstrap. We further illustrate how effective combinations of multiple imputation and the bootstrap for Big Data analyses can be identified in practice by means of both the simulation studies and a case study on COVID infection status data. Ultimately, our investigation demonstrates how the flexible combination of multiple imputation and the bootstrap under our framework can enable valid statistical inferences in an effective manner for Big Data with missingness.
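One simple way to combine the two procedures is bootstrap-then-impute: resample the data, impute each replicate several times, pool the completed-data estimates, and form a percentile interval over the replicates. The sketch below estimates a mean with a deliberately simple normal imputation model; the function name and imputation model are illustrative assumptions, not the dissertation's framework, which additionally distributes the work across parallel cores and models subsampling variability:

```python
import numpy as np

def mi_boot_mean(x, n_boot=200, n_imp=5, seed=0):
    """Bootstrap-then-impute estimate of a mean with missing values.

    For each bootstrap replicate, draw n_imp stochastic imputations
    (normal draws matched to the observed moments of the replicate),
    average the n_imp completed-data means, then form a 95% percentile
    interval over the replicates.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, float)
    n = len(x)
    stats = []
    for _ in range(n_boot):
        xb = x[rng.integers(0, n, size=n)]      # bootstrap resample
        obs = xb[~np.isnan(xb)]
        ests = []
        for _ in range(n_imp):
            xc = xb.copy()
            miss = np.isnan(xc)
            xc[miss] = rng.normal(obs.mean(), obs.std(), size=miss.sum())
            ests.append(xc.mean())
        stats.append(np.mean(ests))             # pool over imputations
    stats = np.array(stats)
    return stats.mean(), np.percentile(stats, [2.5, 97.5])
```

Because each replicate is independent, the outer loop is trivially parallelizable across cores, which is the distribution principle the framework above emphasizes.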
