1 |
Finite element models for impedance plethysmography. Tymchyshyn, Sophia. January 1972 (has links)
No description available.
|
2 |
Computational approaches in compressed sensing. Woolway, Matthew. 01 September 2014 (has links)
A dissertation submitted to the Faculty of Science, University of the Witwatersrand, Johannesburg, in fulfilment of the requirements for the degree of Master of Science. Johannesburg, 2014. / This thesis provides a summary of computational approaches to solving the Compressed Sensing problem. The problem of solving systems of linear equations has long been investigated in the academic literature; Compressed Sensing is a relatively new application of it, one that changes the way in which signals are acquired and processed. Under the assumption of sparse signals, Compressed Sensing can recover signals sampled at rates well below the Shannon/Nyquist rate. The primary goal of this thesis is to describe the major algorithms currently used for the Compressed Sensing problem, giving the reader up-to-date knowledge of current approaches and their implementation on central processing units (CPUs) and graphics processing units (GPUs), with attention to computational concerns such as running time, storage requirements and parallelisability.
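As a concrete illustration of the kind of recovery algorithm such a survey covers, the following is a minimal sketch of Orthogonal Matching Pursuit in Python/NumPy; it is not taken from the dissertation, and the problem sizes and tolerances are illustrative assumptions.

```python
# Minimal sketch: Orthogonal Matching Pursuit for recovering a k-sparse x from y = A x.
# Problem sizes below are illustrative, not from the dissertation.
import numpy as np

def omp(A, y, k):
    """Greedy recovery of a k-sparse signal x from measurements y = A x."""
    m, n = A.shape
    residual = y.copy()
    support = []                                   # indices of selected columns
    for _ in range(k):
        # pick the column most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # least-squares fit on the selected support, then update the residual
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(n)
    x_hat[support] = coef
    return x_hat

# toy example: a 5-sparse signal of length 256 recovered from 64 random measurements
rng = np.random.default_rng(0)
n, m, k = 256, 64, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x
print(np.max(np.abs(omp(A, y, k) - x)))            # near zero if recovery succeeds
```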
|
3 |
Finite element solution of axisymmetric scalar fields. Konrad, Adalbert. January 1971 (has links)
No description available.
|
4 |
Shamir's secret sharing scheme using floating point arithmetic. Unknown Date (has links)
Implementing Shamir's secret sharing scheme with floating point arithmetic would yield a faster, more efficient scheme, owing to the speed at which GPUs perform floating point operations. Without a finite field, however, the properties of a perfect secret sharing scheme are not immediately attainable. The goal of this thesis is to analyze whether Shamir's scheme implemented in floating point arithmetic can achieve the properties of a perfect secret sharing scheme, and to propose improvements toward attaining them. Experiments indicate that property 2 of a perfect secret sharing scheme, "any k-1 or fewer participants obtain no information regarding the shared secret", is compromised when Shamir's scheme is implemented with floating point arithmetic. The experimental results also suggest possible adjustments, one being to select the randomly generated points from a smaller interval in one of the proposed schemes of this thesis. Further experiments indicate improvement under the scheme outlined, and possible attacks are run to test the desirable properties of the different schemes, reinforcing the improvements observed in the earlier experiments. / by Timothy Finemore. / Thesis (M.S.)--Florida Atlantic University, 2012. / Includes bibliography. / Electronic reproduction. Boca Raton, Fla., 2012. Mode of access: World Wide Web.
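For orientation, the following is a minimal sketch of Shamir's (k, n) threshold scheme evaluated in ordinary floating point rather than over a finite field, which is the setting the thesis analyzes; the coefficient interval and threshold values are illustrative assumptions, not the thesis's proposed scheme.

```python
# Minimal sketch: Shamir's (k, n) scheme in floating point.  Shares are points on a
# random degree-(k-1) polynomial with p(0) = secret; reconstruction is Lagrange
# interpolation at x = 0.  The coefficient interval is an illustrative assumption.
import numpy as np

def make_shares(secret, k, n, rng):
    # random polynomial p with p(0) = secret
    coeffs = np.concatenate(([secret], rng.uniform(-1.0, 1.0, size=k - 1)))
    xs = np.arange(1.0, n + 1.0)
    ys = np.polyval(coeffs[::-1], xs)        # p(x_i) for each participant
    return list(zip(xs, ys))

def reconstruct(shares):
    # Lagrange interpolation evaluated at x = 0 recovers p(0) = secret
    secret = 0.0
    for i, (xi, yi) in enumerate(shares):
        li = 1.0
        for j, (xj, _) in enumerate(shares):
            if i != j:
                li *= (0.0 - xj) / (xi - xj)
        secret += yi * li
    return secret

rng = np.random.default_rng(1)
shares = make_shares(42.0, k=3, n=5, rng=rng)
print(reconstruct(shares[:3]))               # any 3 of 5 shares recover ~42, up to rounding error
```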
|
5 |
Variable selection and structural discovery in joint models of longitudinal and survival data. He, Zangdong. January 2014 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Joint models of longitudinal and survival outcomes have been used with increasing frequency in clinical investigations. Correct specification of fixed and random effects, as well as their functional forms, is essential for practical data analysis. However, no existing methods have been developed to meet this need in a joint model setting. In this dissertation, I describe a penalized likelihood-based method with adaptive least absolute shrinkage and selection operator (ALASSO) penalty functions for model selection. By reparameterizing variance components through a Cholesky decomposition, I introduce a penalty function of group shrinkage; the penalized likelihood is approximated by Gaussian quadrature and optimized by an EM algorithm. The functional forms of the independent effects are determined through a procedure for structural discovery. Specifically, I first construct the model with penalized cubic B-splines and then decompose the B-spline into linear and nonlinear elements by spectral decomposition. The decomposition represents the model in a mixed-effects model format, and I then use the mixed-effects variable selection method to perform structural discovery. Simulation studies show excellent performance. A clinical application is described to illustrate the use of the proposed methods, and the analytical results demonstrate the usefulness of the methods.
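As background on the penalty the dissertation builds on, the following is a minimal sketch of the adaptive LASSO for a plain linear model (not the joint-model, Cholesky-reparameterized procedure described above); the weight exponent gamma and the penalty alpha are illustrative assumptions.

```python
# Minimal sketch: adaptive LASSO (ALASSO) for a linear model.  Weights from an
# initial fit turn the L1 penalty into a weighted penalty, solvable by an ordinary
# LASSO after column rescaling.  gamma and alpha are illustrative choices.
import numpy as np
from sklearn.linear_model import LinearRegression, Lasso

rng = np.random.default_rng(2)
n, p = 200, 10
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, -1.5, 0, 0, 1.0, 0, 0, 0, 0, 0])
y = X @ beta_true + rng.standard_normal(n)

# step 1: initial unpenalized estimate gives the adaptive weights w_j = 1 / |b_j|^gamma
b_init = LinearRegression().fit(X, y).coef_
gamma = 1.0
w = 1.0 / (np.abs(b_init) ** gamma + 1e-8)

# step 2: ordinary LASSO on rescaled columns, then undo the scaling
lasso = Lasso(alpha=0.05).fit(X / w, y)
beta_alasso = lasso.coef_ / w
print(np.round(beta_alasso, 2))    # truly null coefficients are shrunk to zero (ideally)
```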
|
6 |
Statistical analysis of clinical trial data using Monte Carlo methods. Han, Baoguang. 11 July 2014 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / In medical research, data analysis often requires complex statistical methods for which no closed-form solutions are available. Under such circumstances, Monte Carlo (MC) methods have found many applications. In this dissertation, we proposed several novel statistical models in which MC methods are utilized. In the first part, we focused on semicompeting risks data, in which a non-terminal event is subject to dependent censoring by a terminal event. Based on an illness-death multistate survival model, we proposed flexible random effects models. Further, we extended our model to the joint modeling setting, in which semicompeting risks data and repeated marker data are analyzed simultaneously. Since the proposed methods involve high-dimensional integrations, Bayesian Markov chain Monte Carlo (MCMC) methods were utilized for estimation. The use of Bayesian methods also facilitates the prediction of individual patient outcomes. The proposed methods were demonstrated in both simulation and case studies.
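For readers unfamiliar with the MCMC machinery referred to, the following is a minimal random-walk Metropolis sketch for a toy one-dimensional target; it is unrelated to the dissertation's specific joint models, and the target density and step size are illustrative assumptions.

```python
# Minimal sketch: random-walk Metropolis sampling from a density known only up to
# a constant.  The standard-normal target and the step size are illustrative.
import numpy as np

def log_target(theta):
    return -0.5 * theta ** 2               # unnormalised log-density of N(0, 1)

def metropolis(n_iter=5000, step=1.0, seed=3):
    rng = np.random.default_rng(seed)
    theta = 0.0
    draws = np.empty(n_iter)
    for t in range(n_iter):
        proposal = theta + step * rng.standard_normal()
        # accept with probability min(1, target(proposal) / target(current))
        if np.log(rng.uniform()) < log_target(proposal) - log_target(theta):
            theta = proposal
        draws[t] = theta
    return draws

draws = metropolis()
print(draws.mean(), draws.std())           # should be close to 0 and 1
```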
For the second part, we focused on the re-randomization test, a nonparametric method that makes inferences based solely on the randomization procedure used in the clinical trial. With this type of inference, a Monte Carlo method is often used to generate the null distribution of the treatment difference. However, an issue was recently discovered when subjects in a clinical trial were randomized, with unbalanced treatment allocation to two treatments, according to the minimization algorithm, a randomization procedure frequently used in practice: the null distribution of the re-randomization test statistic was found not to be centered at zero, which compromised the power of the test. In this dissertation, we investigated the properties of the re-randomization test and proposed a weighted re-randomization method to overcome this issue. The proposed method was demonstrated through extensive simulation studies.
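The following is a minimal sketch of a Monte Carlo re-randomization test under simple 1:1 re-allocation; it does not implement the minimization algorithm or the weighted test proposed in the dissertation, and the sample sizes and effect size are illustrative assumptions.

```python
# Minimal sketch: Monte Carlo re-randomization test.  The observed treatment
# difference is compared against differences from repeated re-allocations of the
# treatment labels.  Simple random 1:1 re-allocation stands in for minimization.
import numpy as np

def rerandomization_test(y, treat, n_rerand=10000, seed=4):
    rng = np.random.default_rng(seed)
    observed = y[treat == 1].mean() - y[treat == 0].mean()
    null = np.empty(n_rerand)
    for b in range(n_rerand):
        perm = rng.permutation(treat)                # re-randomize the labels
        null[b] = y[perm == 1].mean() - y[perm == 0].mean()
    p_value = np.mean(np.abs(null) >= np.abs(observed))
    return observed, p_value

rng = np.random.default_rng(5)
treat = rng.permutation(np.repeat([0, 1], 50))       # 1:1 allocation, 100 subjects
y = 0.4 * treat + rng.standard_normal(100)           # small true treatment effect
print(rerandomization_test(y, treat))
```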
|
7 |
Joint models for longitudinal and survival data. Yang, Lili. 11 July 2014 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Epidemiologic and clinical studies routinely collect longitudinal measures of multiple outcomes. These longitudinal outcomes can be used to establish the temporal order of relevant biological processes and their association with the onset of clinical symptoms. In the first
part of this thesis, we proposed to use bivariate change point models for two longitudinal outcomes, with a focus on estimating the correlation between the two change points. We adopted a Bayesian approach for parameter estimation and inference. In the second part, we considered the situation in which a time-to-event outcome is also collected along with multiple longitudinal biomarkers measured until the occurrence of the event or censoring. Joint models for longitudinal and time-to-event data can be used to estimate the association between the characteristics of the longitudinal measures over time and survival time. We developed a maximum-likelihood method to jointly model multiple longitudinal biomarkers and a time-to-event outcome. In addition, we focused on predicting conditional survival probabilities and evaluating the predictive accuracy of multiple longitudinal biomarkers in the joint modeling framework. We assessed the performance of the proposed methods in
simulation studies and applied the new methods to data sets from two cohort studies. / National Institutes of Health (NIH) Grants R01 AG019181, R24 MH080827, P30 AG10133, R01 AG09956.
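As a small illustration of the conditional survival probabilities such predictions rest on, the following sketch computes P(T > t+s | T > t) = S(t+s)/S(t) from a stand-in Weibull survival function; the shape and scale values are illustrative assumptions, not output from a fitted joint model.

```python
# Minimal sketch: conditional survival probability from a survival function S(t).
# A Weibull S(t) with illustrative shape/scale stands in for a fitted model.
import numpy as np

def weibull_survival(t, shape=1.5, scale=10.0):
    return np.exp(-(t / scale) ** shape)

def conditional_survival(t, s, survival=weibull_survival):
    # P(T > t + s | T > t) = S(t + s) / S(t)
    return survival(t + s) / survival(t)

print(conditional_survival(t=5.0, s=2.0))   # probability of two more time units given survival to t = 5
```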
|