61

Defect Prediction using Exception Handling Method Call Structures

Sawadpong, Puntitra 09 May 2015 (has links)
The main purpose of exception handling mechanisms is to improve software robustness by handling exceptions when they occur. However, empirical evidence indicates that improper implementation of exception handling code can be a source of faults in software systems, and there is still limited empirical knowledge about the relationship between exception handling code and defects. In this dissertation, we present three case studies investigating the defect density of exception handling code. In every system under study, the defect density of exception handling code was significantly higher than that of both the overall source code and the normal code. The ability to predict the location of faults can direct quality enhancement efforts to the modules most likely to contain them; this information can be used to guide test plans, narrow the test space, and improve software quality. We hypothesize that complicated exception handling structure is a predictive factor associated with defects. To the best of our knowledge, no prior study has addressed the relationship between the attributes of exception handling method call structures and defect occurrence, or used those attributes for fault prediction. We extract exception-based software metrics from the structural attributes of exception handling call graphs and, to determine whether there are patterns of relationship between these metrics and fault-proneness, propose a defect prediction model using exception handling call structures. We apply the J48 algorithm, the Java implementation of the C4.5 algorithm, to build the exception defect prediction models. In two of the three systems under study, the results reveal logical patterns of relationship between most class-level exception metrics and fault-proneness, and the accuracy of our prediction models is comparable to the results of defect prediction studies in the literature. We observed, however, that our approach has somewhat lower predictive accuracy when a system has a low average number of defects per class.
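The J48/C4.5 classifier named in the abstract grows a decision tree by splitting on the attribute with the highest gain ratio. A minimal sketch of that criterion in Python (the exception metrics shown in the usage example, such as catch-block depth, are hypothetical illustrations, not the dissertation's actual features):

```python
import math

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def gain_ratio(rows, labels, feature):
    """C4.5's split criterion: information gain divided by the split's own entropy."""
    n = len(rows)
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[feature], []).append(y)
    conditional = sum(len(g) / n * entropy(g) for g in groups.values())
    gain = entropy(labels) - conditional
    split_info = entropy([row[feature] for row in rows])
    return gain / split_info if split_info > 0 else 0.0
```

For example, on classes described as `("deep", "few")` versus `("shallow", "many")` exception call structures, the feature whose gain ratio is highest would become the root split of the J48 tree.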
62

Development of a deep learning-based patient-specific target contour prediction model for markerless tumor positioning

Zhou, Dejun 23 March 2023 (has links)
Kyoto University / New-system doctoral program / Doctor of Human Health Sciences / Kō No. 24542 / Jinkenhaku No. 113 / Shinsei||Jinken||8 (University Library) / Graduate School of Medicine, Human Health Sciences, Kyoto University / (Chief examiner) Prof. Megumi Nakao; Prof. Naozo Sugimoto; Prof. Tomohiro Kuroda / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Human Health Sciences / Kyoto University / DFAM
63

Development and internal validation of a clinical prediction model for acute adjacent vertebral fracture after vertebral augmentation: the AVA score

Hijikata, Yasukazu 23 May 2022 (has links)
Kyoto University / New-system doctoral program / Doctor of Public Health / Kō No. 24094 / Shaihaku No. 125 / Shinsei||Shai||12 (University Library) / Graduate School of Medicine, School of Public Health, Kyoto University / (Chief examiner) Prof. Toshiya Sato; Prof. Takeo Nakayama; Prof. Shuichi Matsuda / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Public Health / Kyoto University / DFAM
64

Development of a Predictive Model for Frailty Utilizing Electronic Health Records

Poronsky, Kye 28 June 2022 (has links)
Frailty is a multifaceted geriatric syndrome associated with age-related declines in functional reserves, resulting in increased risks of in-hospital death, readmission, and discharge to a nursing home. These risks highlight the need for providers to be able to assess a patient's frailty level quickly and accurately. Previous studies have shown that bedside clinician assessment is neither a reliable nor a valid way to determine frailty, so a more reliable, valid, and concise method is needed. We developed a prediction model using discharge ICD-9/ICD-10 diagnostic codes and other demographic variables to predict Reported Edmonton Frail Scale scores. Participants were drawn from the Baystate Frailty Study, a prospective cohort study of patients older than 65 years admitted to a single academic medical center between 2014 and 2016. Three predictive models were fit using the LASSO approach; the adjusted R-squared increased across the three models, indicating increasing predictive ability. In this study of 762 hospitalized patients over the age of 65, we found that a frailty prediction model including ICD codes alone had poor predictive ability (adjusted R-squared = 0.10). Predictive ability improved more than two-fold after adding demographic information, a comorbidity score, and interaction terms (adjusted R-squared = 0.26). This study provides additional insight into the development of an automated frailty assessment, something currently missing from clinical care.
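LASSO fits like those described are commonly computed by coordinate descent with soft-thresholding: each coefficient is updated in closed form against the partial residual while the others are held fixed, and the penalty shrinks weak coefficients exactly to zero. A minimal sketch under that assumption (toy data, not the Baystate cohort; the penalty parameterization is one common convention and the value is illustrative):

```python
def soft_threshold(z, g):
    """Soft-thresholding operator: the closed-form LASSO update for one coefficient."""
    if z > g:
        return z - g
    if z < -g:
        return z + g
    return 0.0

def lasso_cd(X, y, lam, n_iter=100):
    """Coordinate-descent LASSO on plain Python lists.

    Cycles through features, updating each coefficient against the residual
    with that feature's own contribution removed.
    """
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # residual excluding feature j's contribution
            r = [y[i] - sum(beta[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            beta[j] = soft_threshold(rho, lam) / z
    return beta
```

With a larger penalty, more coefficients (here, stand-ins for ICD code indicators) are zeroed out, which is what makes LASSO attractive for selecting a concise subset of thousands of diagnostic codes.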
65

Predicting Length of Stay and Non-Home Discharge: A Novel Approach to Reduce Wasted Resources after Cardiac Surgery

Pattakos, Gregory January 2011 (has links)
No description available.
66

Factors Predictive of Adverse Postoperative Events Following Tonsillectomy

Subramanyam, Rajeev January 2013 (has links)
No description available.
67

A Biomechanical Comparison of Locking Compression Plate Constructs with Plugs/Screws in Osteoporotic Bone Model

Desai, Krishna P. 22 April 2010 (has links)
No description available.
68

Statistical Methods for Variability Management in High-Performance Computing

Xu, Li 15 July 2021 (has links)
High-performance computing (HPC) variability management is an important topic in computer science. Research topics include experimental designs for efficient data collection, surrogate models for predicting performance variability, and system configuration optimization. Due to the complex architecture of HPC systems, a comprehensive study of HPC variability needs large-scale datasets, and experimental design techniques are useful for improved data collection. Surrogate models, which can be obtained from mathematical and statistical models, are essential to understanding the variability as a function of system parameters. After predicting the variability, optimization tools are needed for future system designs. This dissertation focuses on HPC input/output (I/O) variability through three main chapters. After the general introduction in Chapter 1, Chapter 2 focuses on prediction models for the scalar description of I/O variability. A comprehensive comparison study is conducted, and major surrogate models for computer experiments are investigated. In addition, a tool is developed for system configuration optimization based on the chosen surrogate model. Chapter 3 conducts a detailed study of the multimodal phenomena in the I/O throughput distribution and proposes an uncertainty estimation method for the optimal number of runs in future experiments. Mixture models are used to identify the number of modes of the throughput distribution at different configurations. This chapter also addresses the uncertainty in parameter estimation and derives a formula for sample size calculation. The developed method is then applied to HPC variability data. Chapter 4 focuses on the prediction of functional outcomes with both qualitative and quantitative factors. Instead of a scalar description of I/O variability, the distribution of I/O throughput provides a comprehensive description of I/O variability.
We develop a modified Gaussian process for functional prediction and apply the developed method to the large-scale HPC I/O variability data. Chapter 5 contains some general conclusions and areas for future work. / Doctor of Philosophy / This dissertation focuses on three projects that are all related to statistical methods in performance variability management in high-performance computing (HPC). HPC systems are computer systems that create high performance by aggregating a large number of computing units. The performance of HPC is measured by the throughput of a benchmark called the IOzone Filesystem Benchmark. The performance variability is the variation among throughputs when the system configuration is fixed. Variability management involves studying the relationship between performance variability and the system configuration. In Chapter 2, we use several existing prediction models to predict the standard deviation of throughputs given different system configurations and compare the accuracy of predictions. We also conduct HPC system optimization using the chosen prediction model as the objective function. In Chapter 3, we use the mixture model to determine the number of modes in the distribution of throughput under different system configurations. In addition, we develop a model to determine the number of additional runs for future benchmark experiments. In Chapter 4, we develop a statistical model that can predict the throughput distributions given the system configurations. We also compare the prediction of summary statistics of the throughput distributions with existing prediction models.
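Using a mixture model to determine the number of throughput modes, as Chapter 3 describes, typically amounts to fitting Gaussian mixtures with increasing component counts and keeping the count with the lowest information criterion such as BIC. A minimal one-dimensional EM sketch under that assumption (the dissertation's actual estimation procedure may differ):

```python
import math

def normal_pdf(x, mu, var):
    """Density of N(mu, var) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def gmm_em_1d(data, k, n_iter=200):
    """Fit a k-component 1-D Gaussian mixture by EM; return (log-likelihood, params)."""
    lo, hi = min(data), max(data)
    mus = [lo + (j + 0.5) * (hi - lo) / k for j in range(k)]  # spread initial means
    vars_ = [((hi - lo) / k) ** 2 + 1e-6] * k
    ws = [1.0 / k] * k
    n = len(data)
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            p = [ws[j] * normal_pdf(x, mus[j], vars_[j]) for j in range(k)]
            s = sum(p) or 1e-300
            resp.append([pj / s for pj in p])
        # M-step: re-estimate weights, means, and variances
        for j in range(k):
            nj = sum(r[j] for r in resp)
            mus[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            vars_[j] = sum(r[j] * (x - mus[j]) ** 2 for r, x in zip(resp, data)) / nj + 1e-6
            ws[j] = nj / n
    ll = sum(math.log(sum(ws[j] * normal_pdf(x, mus[j], vars_[j]) for j in range(k)) + 1e-300)
             for x in data)
    return ll, (ws, mus, vars_)

def bic(ll, k, n):
    """BIC for a 1-D k-component mixture: 3k - 1 free parameters."""
    return (3 * k - 1) * math.log(n) - 2 * ll
```

Fitting k = 1, 2, 3 to a throughput sample and keeping the k with the lowest BIC gives an estimate of the number of modes at that configuration.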
69

Analysis and Modeling of the Mechanical Durability of Proton Exchange Membranes Using Pressure-Loaded Blister Tests

Grohs, Jacob R. 29 May 2009 (has links)
Environmental fluctuations in operating fuel cells impose significant biaxial stresses in the constrained proton exchange membranes (PEM). The PEM's ability to withstand cyclic environment-induced stresses plays an important role in membrane integrity and consequently, fuel cell durability. In this thesis, pressure loaded blister tests are used to study the mechanical durability of Gore-Select® series 57 over a range of times, temperatures, and loading histories. Ramped pressure tests are used with a linear viscoelastic analog to Hencky's classical solution for a pressurized circular membrane to estimate biaxial burst strength values. Biaxial strength master curves are constructed using traditional time-temperature superposition principle techniques and the associated temperature shift factors show good agreement when compared with shifts obtained from other modes of testing on the material. Investigating a more rigorous blister stress analysis becomes nontrivial due to the substantial deflections and thinning of the membrane. To further improve the analysis, the digital image correlation (DIC) technique is used to measure full-field displacements under ramped and constant pressure loading. The measured displacements are then used to validate the constitutive model and methods of the finite element analysis (FEA). With confidence in the FEA, stress histories of constant pressure tests are used to develop linear damage accumulation and residual strength based lifetime prediction models. Robust models, validated by successfully predicting fatigue failures, suggest the ability to predict failures under any given stress history whether mechanically or environmentally induced - a critical step in the effort to predict fuel cell failures caused by membrane mechanical failure. / Master of Science
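The linear damage accumulation mentioned in the abstract is typically a Miner's-rule sum: damage from each stress level accrues as the ratio of applied cycles to cycles-to-failure, and failure is predicted once the sum reaches one. A minimal sketch under that assumption (the cycles-to-failure values would come from the biaxial strength master curves; the numbers below are hypothetical):

```python
def miner_damage(cycle_blocks):
    """Linear (Miner's rule) damage accumulation: D = sum of n_i / N_i,
    where n_i cycles were applied at a stress level whose cycles-to-failure is N_i."""
    return sum(n_applied / n_to_failure for n_applied, n_to_failure in cycle_blocks)

def failure_predicted(cycle_blocks):
    """Failure is predicted once accumulated damage reaches 1."""
    return miner_damage(cycle_blocks) >= 1.0

# Hypothetical loading history: (applied cycles, cycles-to-failure at that stress)
history = [(500, 1000), (250, 500)]
```

Because damage depends only on the stress history, the same bookkeeping applies whether the cycles are mechanically applied (blister pressure) or environmentally induced (humidity swings), which is the generalization the abstract points to.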
70

Bond of nanoinclusions reinforced concrete with old concrete: strength, reinforcing mechanisms and prediction model

Wang, X., Dong, S., Ashour, Ashraf, Han, B. 16 February 2021 (has links)
Yes / This paper investigated the bond strength of eight types of nanoinclusion-reinforced concrete with old concrete through splitting tensile tests. The reinforcing mechanisms of the bond due to nanoinclusions were also explored by means of scanning electron microscopy and energy-dispersive spectrometry. A prediction model for the bond strength between nanoinclusion-reinforced concrete and an old concrete substrate was developed and calibrated against the experimental results obtained. The experimental results indicated that the bond strength between nanoinclusion-reinforced concrete and old concrete can reach 2.85 MPa, which is 0.8 MPa (39.0%) higher than that between new concrete without nanoinclusions and old concrete. The reinforcing mechanisms can be attributed to the enrichment of nanoinclusions at the new-to-old concrete interface, which compacts the interfacial microstructures and connects the hydration products in the micropores of the old concrete with those in the bulk new concrete. In addition, the prediction model, proposed on the basis of these reinforcing mechanisms, accurately describes the relationship between nanoinclusion content and the bond strength of nanoinclusion-reinforced concrete with old concrete.
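As a quick arithmetic check (illustrative only), the two reported improvement figures are internally consistent: a 0.8 MPa gain over the implied plain-concrete baseline of about 2.05 MPa is 39.0%.

```python
# Reported figures from the abstract: best bond strength with nanoinclusions,
# and the absolute gain over new concrete without nanoinclusions.
new_bond = 2.85                         # MPa
gain_abs = 0.80                         # MPa
old_bond = new_bond - gain_abs          # implied baseline: about 2.05 MPa
gain_pct = 100 * gain_abs / old_bond    # relative improvement
assert round(gain_pct, 1) == 39.0       # matches the reported 39.0%
```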
