61
Pyruvate sensitizes pancreatic tumors to hypoxia-activated prodrug TH-302
Wojtkowiak, Jonathan W., Cornnell, Heather C., Matsumoto, Shingo, Saito, Keita, Takakusagi, Yoichi, Dutta, Prasanta, Kim, Munju, Zhang, Xiaomeng, Leos, Rafael, Bailey, Kate M., Martinez, Gary, Lloyd, Mark C., Weber, Craig, Mitchell, James B., Lynch, Ronald M., Baker, Amanda F., Gatenby, Robert A., Rejniak, Katarzyna A., Hart, Charles, Krishna, Murali C., Gillies, Robert J. 20 May 2016
BACKGROUND: Hypoxic niches in solid tumors harbor therapy-resistant cells. Hypoxia-activated prodrugs (HAPs) have been designed to overcome this resistance and, to date, have begun to show clinical efficacy. However, the clinical activity of HAPs could be improved. In this study, we sought to identify non-pharmacological methods to acutely exacerbate tumor hypoxia and thereby increase TH-302 activity in pancreatic ductal adenocarcinoma (PDAC) tumor models. RESULTS: Three human PDAC cell lines with varying sensitivity to TH-302 (Hs766t > MiaPaCa-2 > SU.86.86) were used to establish PDAC xenograft models. PDAC cells were metabolically profiled in vitro and in vivo using the Seahorse XF system and hyperpolarized 13C pyruvate MRI, respectively, in addition to quantitative immunohistochemistry. The effect of exogenous pyruvate on tumor oxygenation was determined using electron paramagnetic resonance (EPR) oxygen imaging. Hs766t and MiaPaCa-2 cells exhibited a glycolytic phenotype compared to the TH-302-resistant line SU.86.86. Consistent with this observation, Hs766t and MiaPaCa-2 xenografts showed higher lactate/pyruvate ratios in hyperpolarized pyruvate MRI studies in vivo. Correspondingly, the response to exogenous pyruvate both in vitro (Seahorse oxygen consumption) and in vivo (EPR oxygen imaging) was greatest in the Hs766t and MiaPaCa-2 models, possibly due to a higher mitochondrial reserve capacity. Changes in oxygen consumption and in vivo hypoxic status in response to pyruvate were limited in the SU.86.86 model. Combination therapy of pyruvate plus TH-302 in vivo significantly decreased tumor growth and increased survival in the MiaPaCa-2 model and improved survival in Hs766t tumors. CONCLUSIONS: Using metabolic profiling, functional imaging, and computational modeling, we show improved TH-302 activity when tumor hypoxia is transiently increased metabolically with exogenous pyruvate. This work also identified a set of biomarkers that may be used clinically to predict which tumors will be most responsive to pyruvate + TH-302 combination therapy. The results support the concept that acute increases in tumor hypoxia can improve the clinical efficacy of HAPs and can positively impact the future treatment of PDAC and other cancers.
62
Essays in computational economics
Pugh, David January 2014
My PhD research has focused on computational modeling and simulation methods used in both theoretical and applied economics. My first chapter provides an interactive review, in Python, of finite-difference methods for solving systems of ordinary differential equations (ODEs) commonly encountered in economic applications. The methods surveyed in this chapter, as well as the accompanying code and IPython lab notebooks, should be of interest to any researcher wishing to apply finite-difference ODE methods to economic problems. My second chapter is an empirical analysis of the evolution of the distribution of bank size in the U.S. This paper assesses the statistical support for Zipf's Law (i.e., a power law, or Pareto, distribution with a scaling exponent of α = 2) as a model for the upper tail of the distribution of U.S. bank size. Using detailed balance sheet data for all FDIC-regulated banks for the years 1992 through 2011, I find significant departures from Zipf's Law for most measures of bank size in most years. Although Zipf's Law can be statistically rejected, a power law distribution with α of roughly 1.9 statistically outperforms other plausible heavy-tailed alternative distributions. In my final chapter, which is based on joint work with Dr. David Comerford, I apply computational methods to model the relationship between per capita income and city size. A well-known result from the urban economics literature is that a monopolistically competitive market structure combined with internal increasing returns to scale (IRS) can generate log-linear relations between income and population. I extend this theoretical framework to allow for a variable elasticity of substitution between factors of production, in a manner similar to Zhelobodko et al. (2012). Using data on Metropolitan Statistical Areas (MSAs) in the U.S., I find evidence supporting what Zhelobodko et al. (2012) call "increasing relative love for variety (RLV)." Increasing RLV generates procompetitive effects as market size increases, which means that IRS, whilst important for small to medium-sized cities, are exhausted as cities become large. This has important policy implications: it suggests that interventions creating scale for small populations are potentially much more valuable than further investments to increase market size in the largest population centers.
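As an illustration of the kind of method the first chapter surveys, here is a minimal sketch (not the thesis code) of a forward-difference (Euler) scheme applied to the Solow growth ODE, a standard example of the economic ODEs in question; all parameter values are illustrative assumptions.

```python
import numpy as np

def solow_rhs(k, s=0.2, alpha=0.33, n=0.01, g=0.02, delta=0.05):
    """Right-hand side of k'(t) = s*k**alpha - (n + g + delta)*k."""
    return s * k**alpha - (n + g + delta) * k

def euler(f, k0, t_grid):
    """Forward-difference scheme: k[i+1] = k[i] + h * f(k[i])."""
    k = np.empty_like(t_grid)
    k[0] = k0
    for i in range(len(t_grid) - 1):
        h = t_grid[i + 1] - t_grid[i]
        k[i + 1] = k[i] + h * f(k[i])
    return k

t = np.linspace(0.0, 100.0, 1001)
path = euler(solow_rhs, k0=1.0, t_grid=t)
# the path should approach the steady state (s/(n+g+delta))**(1/(1-alpha)) ~ 3.9
print(path[-1])
```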
63
Analyzing multicellular interactions: A hybrid computational and biological pattern recognition approach
White, Douglas 27 May 2016
Pluripotent embryonic stem cells (ESCs) can differentiate into all somatic cell types, making them a useful platform for studying a variety of cellular phenomena. Furthermore, ESCs can be induced to form aggregates called embryoid bodies (EBs), which recapitulate the dynamics of development and morphogenesis. However, many different factors, such as gradients of soluble morphogens, direct cell-to-cell signaling, and cell-matrix interactions, have all been implicated in directing ESC differentiation. Though the effects of individual factors have often been investigated independently, the inherent difficulty of assaying combinatorial effects has made it hard to ascertain the concerted effects of different environmental parameters, particularly given the spatial and temporal dynamics associated with such cues. Dynamic computational models of ESC differentiation can provide powerful insight into how different cues function in combination, both spatially and temporally. By combining particle-based diffusion models, cellular agent-based approaches, and physical models of morphogenesis, a multi-scale, rules-based modeling framework can reveal how each component contributes to differentiation. I propose to investigate the regulatory cues that govern complex morphogenic behavior in 3D ESC systems via a computational rules-based modeling approach. The objective of this study is to examine how spatial patterns of ESC differentiation arise as a function of the microenvironment. The central hypothesis is that spatial control of soluble morphogens and cell-cell signaling will allow enhanced control over the patterns and efficiency of stem cell differentiation in embryoid bodies.
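To make the flavor of such a rules-based framework concrete, here is a minimal sketch (not the thesis framework) of a lattice model in which differentiated cells secrete a diffusing morphogen and cells differentiate once the combined soluble and neighbor-contact cue crosses a threshold; the grid size, rates, and thresholds are illustrative assumptions.

```python
import numpy as np

N = 50
morphogen = np.zeros((N, N))
state = np.zeros((N, N), dtype=int)              # 0 = pluripotent, 1 = differentiated
state[N//2 - 2:N//2 + 2, N//2 - 2:N//2 + 2] = 1  # seed a differentiated cluster
D, secretion, decay, threshold = 0.2, 0.05, 0.01, 0.5

def laplacian(c):
    """Five-point stencil on a periodic lattice."""
    return (np.roll(c, 1, 0) + np.roll(c, -1, 0) +
            np.roll(c, 1, 1) + np.roll(c, -1, 1) - 4 * c)

for step in range(200):
    # rule 1: differentiated cells secrete a morphogen, which diffuses and decays
    morphogen += D * laplacian(morphogen) + secretion * state - decay * morphogen
    # rule 2: direct cell-cell signaling from differentiated 4-neighbors
    neighbors = (np.roll(state, 1, 0) + np.roll(state, -1, 0) +
                 np.roll(state, 1, 1) + np.roll(state, -1, 1))
    # rule 3: differentiate when the combined cue exceeds a threshold
    state[(morphogen + 0.1 * neighbors) > threshold] = 1

print("fraction differentiated:", state.mean())
```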
64
3D thermal-electrochemical lithium-ion battery computational modeling
Gerver, Rachel Ellen August 2009
This thesis presents a modeling framework for simulating three-dimensional effects in lithium-ion batteries. Such effects are particularly important for understanding the performance of large-scale batteries used under high-power conditions, such as in hybrid electric vehicle applications. While 1D approximations may be sufficient for the smaller batteries used in cell phones and laptops, they are severely limited when scaled up to larger batteries, where significant 3D gradients can develop in concentration, current, temperature, and voltage. Understanding these 3D effects is critical for designing lithium-ion batteries for improved safety and long-term durability, as well as for conducting effective design optimization studies. The model couples an electrochemical battery model with a thermal model to understand how thermal effects influence electrochemical behavior and to determine temperature distributions throughout the battery. Several example results are presented, including thermal influences on current distribution, design optimization of current collector thickness and current collector tab placement, and investigation of lithium plating risk in three dimensions.
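The essential feedback loop in such a coupled model can be sketched in lumped (0D) form: an assumed Arrhenius law for internal resistance, ohmic heat generation, and a single-node thermal balance whose temperature feeds back into the electrochemistry. This is a minimal sketch with illustrative parameters, not the 3D model of the thesis.

```python
import math

T, T_amb = 298.15, 298.15        # cell and ambient temperature, K
capacity_ah, soc = 2.0, 1.0      # nominal capacity (Ah) and state of charge
current = 4.0                    # constant 2C discharge, A
m_cp, h_a = 80.0, 0.5            # lumped heat capacity J/K, convective loss W/K
r_ref, e_act, r_gas = 0.02, 2.0e4, 8.314
dt = 1.0                         # time step, s

for step in range(900):          # 15 minutes of discharge
    # electrochemical side: resistance drops as the cell warms (Arrhenius)
    r_int = r_ref * math.exp(e_act / r_gas * (1.0 / T - 1.0 / 298.15))
    q_gen = current**2 * r_int   # irreversible (ohmic) heat, W
    # thermal side: single-node energy balance with convective cooling
    T += dt * (q_gen - h_a * (T - T_amb)) / m_cp
    soc -= current * dt / (3600.0 * capacity_ah)

print(f"SOC = {soc:.2f}, cell temperature = {T - 273.15:.1f} C")
```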
65
Gradience in grammar: experimental and computational aspects of degrees of grammaticality
Keller, Frank January 2001
This thesis deals with gradience in grammar, i.e., with the fact that some linguistic structures are not fully acceptable or unacceptable, but receive gradient linguistic judgments. The importance of gradient data for linguistic theory has been recognized at least since Chomsky's Logical Structure of Linguistic Theory. However, systematic empirical studies of gradience are largely absent, and none of the major theoretical frameworks is designed to account for gradient data. The present thesis addresses both issues. In the experimental part of the thesis (Chapters 3-5), we present a set of magnitude estimation experiments investigating gradience in grammar. The experiments deal with unaccusativity/unergativity, extraction, binding, word order, and gapping. They cover all major modules of syntactic theory and draw on data from three languages (English, German, and Greek). In the theoretical part of the thesis (Chapters 6 and 7), we use these experimental results to motivate a model of gradience in grammar. This model is a variant of Optimality Theory, and explains gradience in terms of the competition of ranked, violable linguistic constraints. The experimental studies deliver two main results. First, they demonstrate that an experimental investigation of gradient phenomena can advance linguistic theory by uncovering acceptability distinctions that have gone unnoticed in the theoretical literature. An experimental approach can also settle data disputes that result from the informal data collection techniques typically employed in theoretical linguistics, which are not well suited to investigating gradient linguistic data. Second, we identify a set of general properties of gradient data that appear to be valid for a wide range of syntactic phenomena and across languages: (a) linguistic constraints are ranked, in the sense that some constraint violations lead to a greater degree of unacceptability than others; (b) constraint violations are cumulative, i.e., the degree of unacceptability of a structure increases with the number of constraints it violates; (c) two constraint types can be distinguished experimentally: soft constraints lead to mild unacceptability when violated, while hard constraint violations trigger serious unacceptability; (d) the hard/soft distinction can be diagnosed by testing for effects of linguistic context: context effects occur only for soft constraints, while hard constraints are immune to contextual variation; and (e) the soft/hard distinction is crosslinguistically stable. In the theoretical part of the thesis, we develop a model of gradient grammaticality that borrows central concepts from Optimality Theory, a competition-based grammatical framework. We propose an extension, Linear Optimality Theory, motivated by our experimental results on constraint ranking and the cumulativity of violations. The core assumption of the model is that the relative grammaticality of a structure is determined by the weighted sum of the violations it incurs. We show that the parameters of the model (the constraint weights) can be estimated using the least-squares method, a standard model fitting algorithm. Furthermore, we prove that standard Optimality Theory is a special case of Linear Optimality Theory. To test the validity of Linear Optimality Theory, we use it to model data from the experimental part of the thesis, including data on extraction, gapping, and word order. For all data sets, a high model fit is obtained, and the model's predictions are shown to generalize to unseen data. On a theoretical level, the modeling results show that certain properties of gradient data (the hard/soft distinction, context effects, and crosslinguistic effects) do not have to be stipulated, but follow from core assumptions of Linear Optimality Theory.
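The core of Linear Optimality Theory lends itself to a compact sketch: the degree of unacceptability of a structure is the weighted sum of its constraint violations, and the weights are estimated by least squares. The constraint profiles and judgment scores below are invented for illustration; the thesis fits real magnitude-estimation data.

```python
import numpy as np

# rows = structures, columns = violation counts for three hypothetical constraints
violations = np.array([
    [0, 0, 0],   # fully acceptable baseline
    [1, 0, 0],   # violates soft constraint C1
    [0, 1, 0],   # violates soft constraint C2
    [0, 0, 1],   # violates hard constraint C3
    [1, 1, 0],   # cumulativity: two soft violations together
])
judged_unacceptability = np.array([0.0, 0.8, 1.1, 3.0, 1.9])

# least-squares estimate of the constraint weights
weights, *_ = np.linalg.lstsq(violations, judged_unacceptability, rcond=None)
print("estimated weights:", weights.round(2))   # soft weights small, hard weight large

# predicted relative grammaticality of an unseen structure (C1 + C3 violated)
print("predicted unacceptability:", np.array([1, 0, 1]) @ weights)
```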
66
The Design and Validation of a Novel Computational Simulation of the Leg for the Investigation of Injury, Disease, and Surgical Treatment
Iaquinto, Joseph 05 May 2010
Computational modeling of joints and their function, a developing field, is becoming a significant health and wellness tool of our modern age. Building on prior research focused on the lower extremity, a 3D computational model of the foot and ankle was created to explore the potential of these computational methods. Patient-specific anatomy was rendered in the digital domain by isolating CT-scanned tissue using MIMICS™, SolidWorks™, and COSMOSMotion™, all commercially available software. The kinematics of the joints are driven solely by anatomically modeled soft tissue applied to articulating joint geometry. Soft tissues are based on realistic measurements of anatomical dimension and behavior. By restricting all model constraints to true-to-life anatomical approximations and recreating their behavior, the model uses inverse kinematics to predict the motion of the foot under various loading conditions. Extensive validation of the model's function was performed, including stability of the arch (due to ligament deficiency) and joint behavior (due to disease and repair). These simulations were compared to a multitude of studies, which confirmed the accuracy of soft-tissue strain, joint alignment, joint contact force, and plantar load distribution. This demonstrated the capability of the simulation technique both to qualitatively recreate trends seen experimentally and clinically and to quantitatively predict a variety of tissue and joint measures. The modeling technique gains further strength by combining measurements that are typically made separately (experimental vs. clinical) to build a more holistic model of foot behavior, with the potential to support additional conclusions about complications associated with repair techniques. This model was built to provide an example of how patient-specific bony geometry can be used as a research or surgical tool when considering a disease state or repair technique. The technique also allows for the repeated use of the same anatomy, which is not possible experimentally or clinically. These qualities, along with the accuracy demonstrated in validation, establish the integrity of the technique and demonstrate its strengths.
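The soft-tissue elements that drive such a model can be illustrated with a tension-only ligament spring: it pulls its bony attachment points together when stretched past its rest length and goes slack otherwise. This is a minimal sketch with assumed stiffness and rest length, not the dissertation's SolidWorks/COSMOSMotion implementation.

```python
import numpy as np

def ligament_force(p_origin, p_insertion, rest_length, stiffness):
    """Linear tension-only spring between two attachment points (SI units)."""
    d = p_insertion - p_origin
    length = np.linalg.norm(d)
    strain = (length - rest_length) / rest_length
    if strain <= 0.0:                # a slack ligament carries no load
        return np.zeros(3), 0.0
    magnitude = stiffness * (length - rest_length)
    return magnitude * d / length, strain   # force on the origin point, plus strain

force, strain = ligament_force(np.array([0.0, 0.0, 0.0]),
                               np.array([0.0, 0.042, 0.0]),
                               rest_length=0.040, stiffness=2.0e5)
print(force, strain)   # ~400 N of tension at 5% strain
```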
67
COMPUTATIONAL MODELING OF MULTISENSORY PROCESSING USING A NETWORK OF SPIKING NEURONS
Lim, Hun Ki 04 May 2011
Multisensory processing in the brain underlies a wide variety of perceptual phenomena, but little is known about the mechanisms by which multisensory neurons are generated and by which these neurons integrate sensory information from environmental events. This lack of knowledge is due to the difficulty of manipulating and testing the characteristics of multisensory processing in biological experiments. By using a computational model of multisensory processing, this research seeks to provide insight into its underlying mechanisms. From a computational perspective, modeling a brain function involves not only the computational model itself but also the conceptual definition of the function, the analysis of correspondence between the model and the brain, and the generation of new biologically plausible insights and hypotheses. In this research, multisensory processing is conceptually defined as the effect of multisensory convergence on the generation of multisensory neurons and their integrated response products, i.e., multisensory integration. The computational model is thus the implementation of multisensory convergence and the simulation of the neural processing acting upon it. The most important step in the modeling is the analysis of how well the model represents its target, the brain function; this is closely tied to model validation. One intuitive and powerful way to validate the model is to analyze its results with methods standard in neuroscience. In addition, statistical and graph-theoretical analyses can confirm the similarity between the model and the brain. This research takes both approaches to provide analyses from many different perspectives. Finally, the model and its simulations provide insight into multisensory processing and generate plausible hypotheses, which will need to be confirmed by real experimentation.
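A minimal sketch of the kind of spiking unit such a network is built from: a leaky integrate-and-fire neuron receiving converging "visual" and "auditory" currents, showing how two individually subthreshold inputs can together drive spiking. All parameters are illustrative assumptions, not those of the dissertation's network.

```python
def lif_spike_count(i_input, dt=1e-4, t_end=0.5, tau=0.02,
                    v_rest=-70e-3, v_thresh=-54e-3, r_m=1e8):
    """Count spikes of a leaky integrate-and-fire neuron under constant input."""
    v, spikes = v_rest, 0
    for _ in range(int(t_end / dt)):
        v += dt / tau * (-(v - v_rest) + r_m * i_input)  # membrane dynamics
        if v >= v_thresh:
            spikes += 1
            v = v_rest                                   # reset after a spike
    return spikes

i_vis, i_aud = 0.12e-9, 0.12e-9   # each input alone stays below threshold
for label, i in [("visual", i_vis), ("auditory", i_aud),
                 ("multisensory", i_vis + i_aud)]:
    print(label, lif_spike_count(i))  # convergence yields spikes neither cue evokes alone
```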
68
Patient-Specific Modeling Of Adult Acquired Flatfoot Deformity Before And After Surgery
Spratley, Edward Meade 05 December 2013
The use of computational modeling is an increasingly commonplace technique for the investigation of biomechanics in intact and pathological musculoskeletal systems. Moreover, given the robust and repeatable nature of computer simulation and the prevalence of software techniques for accurate 3-D reconstruction of tissues, the predictive power of these models has increased dramatically. However, there are no patient-specific kinematic models whose function is dictated solely by physiologic soft-tissue constraints and articular shape and contact, without idealized joint approximations. Further, very few models have attempted to predict surgical effects combined with postoperative validation of those predictions. Given this, it is not surprising that the area of foot/ankle modeling has been especially underserved. We therefore chose to investigate the pre- and postoperative kinematics of Adult Acquired Flatfoot Deformity (AAFD) across a cohort of clinically diagnosed sufferers. AAFD was chosen because it is a chronic and degenerative disease wherein degradation of the soft-tissue supporters of the medial arch eventually causes gross malalignment in the mid- and hindfoot, along with significant pain and dysfunction. Also, while planar radiographs are still used to diagnose and stage the disease, it is widely acknowledged that these 2-D measures fail to fully describe the 3-D nature of AAFD. Thus, a population of six patient-specific rigid-body computational models was developed using the commercially available software packages Mimics® and SolidWorks® to investigate foot function in patients with diagnosed Stage IIb AAFD. Each model was created from patient-specific sub-millimeter MRI scans and loaded with body weight, individualized muscle forces, and ligament forces in single-leg stance. The predicted model kinematics were validated pre- and postoperatively using clinically utilized radiographic angle and distance measures as well as plantar force distributions. The models were then further exploited to predict additional biomechanical parameters, such as articular contact force and soft-tissue strain, as well as the effects of hypothetical surgical interventions. Kinematic simulations demonstrated that the models were able to accurately predict foot/ankle motion in agreement with their respective patients. Additionally, the changes in joint contact force and ligament strain observed across surgical states further elucidate the complex biomechanical underpinnings of foot and ankle function.
69
Regression Wavelet Analysis for Progressive-Lossy-to-Lossless Coding of Remote-Sensing Data
Amrani, Naoufal, Serra-Sagrista, Joan, Hernandez-Cabronero, Miguel, Marcellin, Michael
Regression Wavelet Analysis (RWA) is a novel wavelet-based scheme for coding hyperspectral images that employs multiple regression analysis to exploit the relationships among spectral wavelet-transformed components. The scheme is based on a pyramidal prediction, using different regression models, to increase the statistical independence in the wavelet domain. For lossless coding, RWA has proven superior to other spectral transforms such as PCA and to the best and most recent coding standard in remote sensing, CCSDS-123.0. In this paper we show that RWA also allows progressive lossy-to-lossless (PLL) coding and that it attains rate-distortion performance superior to that obtained with state-of-the-art schemes. To take into account the predictive significance of the spectral components, we propose a Prediction Weighting scheme for JPEG2000 that captures the contribution of each transformed component to the prediction process.
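The idea behind RWA can be sketched in a few lines: after a spectral wavelet transform, each detail component is predicted by regression from the approximation components, so only the (smaller) residuals need to be coded losslessly. The sketch below uses a hand-rolled one-level Haar transform and synthetic correlated spectra; it illustrates the principle, not the published codec.

```python
import numpy as np

rng = np.random.default_rng(1)
pixels, bands = 1000, 8
cube = np.cumsum(rng.normal(size=(pixels, bands)), axis=1)  # spectrally correlated data

# one-level Haar wavelet transform along the spectral axis
approx = (cube[:, 0::2] + cube[:, 1::2]) / np.sqrt(2.0)
detail = (cube[:, 0::2] - cube[:, 1::2]) / np.sqrt(2.0)

# multiple regression: predict every detail band from all approximation bands
X = np.hstack([approx, np.ones((pixels, 1))])
coef, *_ = np.linalg.lstsq(X, detail, rcond=None)
residual = detail - X @ coef

print("detail variance:  ", detail.var())
print("residual variance:", residual.var())  # smaller residuals are cheaper to code
```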
70
Computational Models of Nuclear Proliferation
Frankenstein, William 01 May 2016
This thesis utilizes social influence theory and computational tools to examine the disparate impacts of positive and negative ties on nuclear weapons proliferation. The thesis is broadly organized into two sections: a simulation section, which focuses on government stakeholders, and a large-scale data analysis section, which focuses on the public and domestic actors as stakeholders. The simulation section demonstrates that the nonproliferation norm is an emergent behavior of political alliance and hostility networks, and that alliances play a role in present-day nuclear proliferation. This model is robust and captures second-order effects of extended hostility and alliance relations. The large-scale data analysis section demonstrates the role that context plays in sentiment evaluation and highlights how Twitter collection can provide useful input to policy processes. It first presents the results of an on-campus study in which context was shown to affect users' sentiment assessments. Then, in an analysis of a Twitter dataset of over 7.5 million messages, it assesses the role of 'noise' and biases in online data collection. In a deep dive into the Iranian nuclear agreement, we demonstrate that the Middle East is not facing a nuclear arms race and show that there is a structural hole in online discussion surrounding nuclear proliferation. Combining both approaches gives policy analysts a complete and generalizable set of computational tools to assess and analyze the disparate roles of stakeholders in nuclear proliferation.
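To give a feel for the simulation side, here is a toy sketch of one social-influence update rule under which a state's proliferation pressure rises with hostile ties to nuclear states and falls with alliance ties to them (extended deterrence). The network, coefficients, and threshold are all invented for illustration and are not the thesis model.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20
alliance = rng.random((n, n)) < 0.15           # positive ties
hostility = rng.random((n, n)) < 0.10          # negative ties
alliance = alliance | alliance.T               # make both networks undirected
hostility = hostility | hostility.T
np.fill_diagonal(alliance, False)
np.fill_diagonal(hostility, False)

nuclear = np.zeros(n, dtype=bool)
nuclear[:2] = True                             # two initial nuclear states

for step in range(50):
    # pressure from hostile nuclear neighbors, offset by nuclear-armed allies
    pressure = (0.4 * hostility @ nuclear.astype(float)
                - 0.6 * alliance @ nuclear.astype(float))
    nuclear |= pressure > 0.3                  # proliferate past a threshold

print("nuclear states after 50 steps:", int(nuclear.sum()))
```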