91 |
Impaired signaling in senescing T cells: investigation of the role of reactive oxygen species using microfluidic platforms and computational modeling. Rivet, Catherine-Aurélie, 21 June 2012
The goal of cancer immunotherapy is to boost the immune system's ability to detect tumor antigens and mount an effective anti-tumor immune response. Currently, adoptive T cell transfer therapy (ACT), the administration of ex vivo expanded autologous tumor-specific T cells, is one of the most promising immunotherapies under development; however, its efficacy has so far been limited, with a mere 10% complete remission rate in the most successful clinical trials. The prolonged ex vivo culture process is a potential cause of this ineffectiveness, because the infused cells may reach replicative senescence and immunosenescence before transfer to the patient. The objective of this thesis is to offer two approaches toward improving treatment efficacy. First, we generated a 'senescence metric' from identified biomarkers that can be used in the clinic to predict the age and responsiveness of ex vivo expanded T cells. Second, we sought to understand at the molecular level the changes that occur during ex vivo expansion in order to devise improved ACT protocols; in particular, we focused on the shift towards a pro-oxidizing environment and its potential effects on calcium signaling. The combined development and application of microfluidic technologies and computational models in this thesis facilitated our investigations of the phenotypic and signaling changes occurring in T cells during the progression towards immunosenescence. Our findings of altered T cell properties over long-term culture provide insight for the design of future cancer immunotherapy protocols.
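The 'senescence metric' idea can be sketched as a composite score over a biomarker panel: z-score each marker across the culture time course and average the z-scores, so one number tracks expansion age. The marker names, values, and equal weighting below are purely hypothetical placeholders, not the biomarkers or weights identified in the thesis:

```python
import statistics

# Hypothetical composite senescence score: per-marker z-scores, averaged.
# Marker names and values are invented for illustration only.
def senescence_score(samples):
    """samples: list of dicts {marker: value}; returns one score per sample."""
    markers = samples[0].keys()
    stats = {m: (statistics.mean([s[m] for s in samples]),
                 statistics.stdev([s[m] for s in samples])) for m in markers}
    scores = []
    for s in samples:
        z = [(s[m] - stats[m][0]) / stats[m][1] for m in markers]
        scores.append(sum(z) / len(z))       # equal weighting (an assumption)
    return scores

# usage: three time points of a made-up two-marker panel
timecourse = [
    {"CD28_loss": 0.1, "ROS_level": 1.0},   # early culture
    {"CD28_loss": 0.4, "ROS_level": 1.8},
    {"CD28_loss": 0.9, "ROS_level": 3.1},   # late culture
]
scores = senescence_score(timecourse)
```

Because both (made-up) markers rise with culture time, the score increases monotonically across the time course.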
|
92 |
Cartesian Linguistics: From Historical Antecedents to Computational Modeling. Behme, Christina, 07 June 2011
This thesis situates Chomsky’s Cartesian Linguistics within his linguistic work and the resulting debates between rationalists and empiricists. I focus on the following key aspects: (i) the historic connection of Cartesian Linguistics to previous linguistic theorizing, (ii) the development of Chomsky’s own theorizing, (iii) the empirical work addressing the problem of language acquisition and (iv) the problem of computational modeling of language learning. Chomsky claims that his view is situated within a rationalist Cartesian tradition and that only rationalists will be able to account fully for all aspects of human language. My thesis challenges both claims.
I found only remote connections between Cartesian and Chomskyan commitments. Chomsky holds that (i) language is species-specific, (ii) language is domain-specific, and (iii) language acquisition depends on innate knowledge. Descartes accepted (i), but argued that language is an indicator of domain-general intelligence. Innate resources play a different role in language acquisition for Chomsky than for Descartes.
Chomsky revived linguistics during the 1950s by promising to make it a rigorous part of the biological sciences. However, his work has not resulted in a better understanding of language acquisition and use. Key concepts like ‘innateness’, ‘Universal Grammar’ and ‘Language Acquisition Device’ remain in need of precise definition, and the Poverty of the Stimulus Argument does not rule out data-driven domain-general language acquisition.
Empirical work in developmental psychology has demonstrated that children acquire and practice many language-related cognitive abilities long before they produce their first words. Chomsky’s dictum that language learning is uniform across the species and invariably follows genetically determined stages remains empirically unconfirmed. Computational modeling has accounted for some internal structure of language acquisition mechanisms and simulates the specific conditions under which children learn language. Contemporary models use samples of child-directed-speech as input and have replicated numerous aspects of human performance.
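The kind of data-driven model the paragraph refers to can be illustrated, in a deliberately toy form, by a bigram learner. The three "child-directed" utterances are invented placeholders; real models in this literature train on corpora of child-directed speech such as CHILDES, and use far richer learning mechanisms:

```python
import math
from collections import defaultdict

# Toy bigram model over invented child-directed utterances, used to compare
# the probability of an attested word order against a scrambled one.
def train_bigrams(corpus):
    counts = defaultdict(lambda: defaultdict(int))
    for sent in corpus:
        words = ["<s>"] + sent.split() + ["</s>"]
        for a, b in zip(words, words[1:]):
            counts[a][b] += 1
    return counts

def score(counts, sent, alpha=0.1, vocab=50):
    """Add-alpha-smoothed log-probability of a sentence under the model."""
    words = ["<s>"] + sent.split() + ["</s>"]
    logp = 0.0
    for a, b in zip(words, words[1:]):
        total = sum(counts[a].values())
        logp += math.log((counts[a][b] + alpha) / (total + alpha * vocab))
    return logp

corpus = ["the dog barks", "the cat sleeps", "the dog sleeps"]
model = train_bigrams(corpus)
attested = score(model, "the cat barks")    # novel but English-like order
scrambled = score(model, "barks the cat")   # same words, odd order
```

Even this minimal learner prefers the conventional word order for a sentence it has never seen, which is the flavor of result the replication studies report at much larger scale.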
Given my findings, I suggest that Chomskyan linguistics is not Cartesian in substance or in spirit. Descartes was wary of those “who take no account of experience and think that truth will spring from their brains like Minerva from the head of Jupiter” (CSM I, p. 21). His science relied on sense experience (empiricism) and deduction (rationalism), and a truly Cartesian Linguistics would revive this part of the Cartesian tradition.
|
93 |
Computational Modeling of Complex Reactions Kinetics in Biosensors. Gaidamauskaitė, Evelina, 22 November 2011
Biosensors are analytical devices made up of a biological entity, usually an enzyme, that recognizes a specific analyte (substrate), and a transducer that translates the biorecognition event into a signal. Creating new types of biosensors requires corresponding experimental studies, and computational experiments can reliably replace expensive physical ones. However, the multi-step character of the chemical reaction scheme must be considered and modeled accordingly. In this thesis such reaction schemes were studied in great detail. Original mathematical models were developed for optical peroxidase-based and amperometric laccase-based biosensors. The deterministic nature of model construction allows model building to be automated; based on this principle, a flexible tool for the computational modeling of practical multi-step biosensors was developed. To optimize the numerical solution of the reaction-diffusion equations, common finite difference schemes were compared. The comparison shows that the implicit and Hopscotch schemes reach a required relative error fastest; for problems where speed matters more than accuracy, the simplest explicit scheme should be used. Applying the new tool, computational modeling of the multi-step biosensors was carried out. The modeling of the laccase biosensor explained and confirmed the experimentally observed synergistic effect of the mediator on the biosensor response. The computational modeling of the peroxidase biosensor showed that a wide... [to full text]
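The explicit scheme the comparison refers to can be sketched for a single-substrate case. The snippet below applies a forward-time, centered-space (FTCS) step to a 1-D reaction-diffusion equation with a simplified Michaelis-Menten term; the thesis's multi-step schemes couple several species, and all parameter values here are illustrative, not taken from the work:

```python
import numpy as np

# Explicit (FTCS) step for dS/dt = D*d2S/dx2 - Vmax*S/(Km + S).
# Parameters are illustrative only (not from the thesis).
D, Vmax, Km = 3e-10, 1e-4, 1e-4   # m^2/s, mol/(m^3 s), mol/m^3 (assumed)
L, N = 1e-4, 101                  # membrane thickness (m), grid points
dx = L / (N - 1)
dt = 0.4 * dx**2 / D              # satisfies the stability bound dt <= dx^2/(2D)

def ftcs_step(S):
    """Advance the substrate profile one time step; boundaries held fixed."""
    S_new = S.copy()
    lap = (S[2:] - 2 * S[1:-1] + S[:-2]) / dx**2
    S_new[1:-1] = S[1:-1] + dt * (D * lap - Vmax * S[1:-1] / (Km + S[1:-1]))
    return S_new

# usage: zero substrate inside the membrane, bulk concentration held at x = L
S = np.zeros(N)
S[-1] = 1e-3                      # bulk substrate concentration, mol/m^3
for _ in range(1000):
    S = ftcs_step(S)
```

The stability bound on `dt` is exactly why the explicit scheme is fast per step but can lose overall to implicit schemes when high accuracy forces fine grids.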
|
95 |
Dosimetric verification of radiation therapy including intensity modulated treatments, using an amorphous-silicon electronic portal imaging device. Chytyk-Praznik, Krista, January 2009
Radiation therapy is continuously increasing in complexity due to technological innovation in delivery techniques, necessitating thorough dosimetric verification. Comparing accurately predicted portal dose images to measured images obtained during patient treatment can determine if a particular treatment was delivered correctly. The goal of this thesis was to create a method to predict portal dose images that was versatile and accurate enough to use in a clinical setting. All measured images in this work were obtained with an amorphous silicon electronic portal imaging device (a-Si EPID), but the technique is applicable to any planar imager. A detailed, physics-motivated fluence model was developed to characterize fluence exiting the linear accelerator head. The model was further refined using results from Monte Carlo simulations and schematics of the linear accelerator. The fluence incident on the EPID was converted to a portal dose image through a superposition of Monte Carlo-generated, monoenergetic dose kernels specific to the a-Si EPID. Predictions of clinical IMRT fields with no patient present agreed with measured portal dose images within 3% and 3 mm. The dose kernels were applied ignoring the geometrically divergent nature of incident fluence on the EPID. A computational investigation into this parallel dose kernel assumption determined its validity under clinically relevant situations. Introducing a patient or phantom into the beam required the portal image prediction algorithm to account for patient scatter and attenuation. Primary fluence was calculated by attenuating raylines cast through the patient CT dataset, while scatter fluence was determined through the superposition of pre-calculated scatter fluence kernels. Total dose in the EPID was calculated by convolving the total predicted incident fluence with the EPID-specific dose kernels. The algorithm was tested on water slabs with square fields, agreeing with measurement within 3% and 3 mm. 
The method was then applied to five prostate and six head-and-neck IMRT treatment courses (~1900 clinical images). Deviations between the predicted and measured images were quantified. The portal dose image prediction model developed in this thesis work has been shown to be accurate, and it was demonstrated to be able to verify patients’ delivered radiation treatments.
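The kernel-superposition step described above reduces to a discrete 2-D convolution of the incident fluence with a dose kernel. The Gaussian kernel and square field below are illustrative stand-ins for the Monte Carlo-generated, EPID-specific kernels and clinical fields of the thesis, and the parallel (non-divergent) kernel assumption is applied:

```python
import numpy as np

# Dose as a superposition of a pencil-beam kernel over the fluence map.
# The Gaussian kernel is a placeholder for the EPID-specific kernels.
def predict_portal_dose(fluence, kernel):
    """2-D convolution of fluence with a dose kernel ('same'-size output)."""
    fy, fx = fluence.shape
    ky, kx = kernel.shape
    py, px = ky // 2, kx // 2
    padded = np.pad(fluence, ((py, py), (px, px)))
    dose = np.zeros_like(fluence, dtype=float)
    for dy in range(ky):
        for dx in range(kx):
            dose += kernel[dy, dx] * padded[dy:dy + fy, dx:dx + fx]
    return dose

# usage: an open square field (arbitrary units) blurred by kernel scatter
y, x = np.mgrid[-7:8, -7:8]
kernel = np.exp(-(x**2 + y**2) / 8.0)
kernel /= kernel.sum()                      # kernel deposits unit dose
fluence = np.zeros((64, 64))
fluence[16:48, 16:48] = 1.0                 # open square field
dose = predict_portal_dose(fluence, kernel)
```

Because the kernel is normalized, total dose is conserved and the field center receives the full fluence value, while the penumbra at the field edge is blurred by the kernel width.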
|
96 |
Dendritic and axonal ion channels supporting neuronal integration: from pyramidal neurons to peripheral nociceptors. Petersson, Marcus, January 2012
The nervous system, including the brain, is a complex network of billions of neurons. Ion channels mediate the electrical signals that neurons use to integrate input and produce appropriate output, and could thus be thought of as key instruments in the neuronal orchestra. In the field of neuroscience we are not only curious about how our brains work, but also strive to characterize and develop treatments for neural disorders, in which the neuronal harmony is distorted. By modulating ion channel activity (pharmacologically or otherwise) it might be possible to restore neuronal harmony in patients with various types of neural (including channelopathic) disorders. However, this strategy is impeded by gaps in our understanding of ion channels and neurons, so more research is required. Thus, the aim of this thesis is to improve the understanding of how specific ion channel types contribute to shaping neuronal dynamics, in particular neuronal integration, excitability and memory. For this purpose I have used computational modeling, an approach that has emerged as an excellent tool for understanding dynamically complex neurophysiological phenomena. In the first of two projects leading to this thesis, I studied how neurons in the brain, and in particular their dendritic structures, are able to integrate synaptic inputs arriving at low frequencies, in a behaviorally relevant range of ~8 Hz. Based on recent experimental data on synaptic transient receptor potential channels (TRPC), metabotropic glutamate receptor (mGluR) dynamics and glutamate decay times, I developed a novel model of the ion channel current I_TRPC, whose importance is clear but largely neglected due to an insufficient understanding of its activation mechanisms.
We found that I_TRPC, which is activated both synaptically (via mGluR) and intrinsically (via Ca2+) and has a long decay time constant (τ_decay), is better suited than the classical rapidly decaying currents (I_AMPA and I_NMDA) to support low-frequency temporal summation. We further concluded that τ_decay varies with stimulus duration and frequency, depends linearly on the maximal glutamate concentration, and might require a paired-pulse protocol to be properly assessed. In a follow-up study I investigated small-amplitude (a few mV), long-lasting (a few seconds) depolarizations in pyramidal neurons of the hippocampal cortex, a brain region important for memory and spatial navigation. In addition to confirming a previous hypothesis that these depolarizations involve an interplay of I_TRPC and voltage-gated calcium channels, I showed that they are generated in distal dendrites, are intrinsically stable to weak excitatory and inhibitory synaptic input, and require spatial and temporal summation to occur. I further concluded that the existence of multiple stable states cannot be ruled out, and that, in spite of their small somatic amplitudes, these depolarizations may strongly modulate the probability of action potential generation. In the second project I studied the axonal mechanisms of unmyelinated peripheral (cutaneous) pain-sensing neurons (C-fiber nociceptors), which are involved in chronic pain. To my knowledge, the C-fiber model we developed for this purpose is unique in at least three ways: it is multicompartmental, it is tuned from human microneurography (in vivo) data, and it includes several biologically realistic ion channels, Na+/K+ concentration dynamics, a Na-K pump, morphology and temperature dependence.
Based on simulations aimed at elucidating the mechanisms underlying two clinically relevant phenomena, activity-dependent slowing (ADS) and recovery cycles (RC), we found unexpected support for the involvement of intracellular Na+ in ADS and of extracellular K+ in RC. We also found that the two major Na+ channels (NaV1.7 and NaV1.8) have opposite effects on RC. Furthermore, I showed that the differences between mechano-sensitive and mechano-insensitive C-fiber types might reside in differing ion channel densities. To conclude, the work of this thesis provides key insights into neuronal mechanisms with relevance for memory, pain and neural disorders, and at the same time demonstrates the advantage of using computational modeling as a tool for understanding and discovering fundamental properties of central and peripheral neurons.
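The claim that a slowly decaying current supports ~8 Hz temporal summation better than a fast AMPA-like current can be checked with a minimal leaky-integrator sketch. All constants below are illustrative, not fitted parameters from the study, and the conductance is a crude instantaneous-jump approximation:

```python
# Minimal leaky-integrator sketch: a slowly decaying synaptic conductance
# (TRPC-like, tau ~ hundreds of ms) summates across an 8 Hz pulse train
# where a fast AMPA-like conductance does not. Units are arbitrary.
def peak_depolarization(tau_syn, rate_hz=8.0, n_pulses=5,
                        tau_m=0.03, dt=1e-4, w=2.0):
    """Return peak membrane deviation (arbitrary units) for a pulse train."""
    isi = 1.0 / rate_hz
    steps = int((n_pulses * isi + 5 * tau_syn) / dt)
    pulse_steps = {round(k * isi / dt) for k in range(n_pulses)}
    g = v = v_peak = 0.0
    for i in range(steps):
        if i in pulse_steps:
            g += w                          # instantaneous conductance jump
        g -= dt * g / tau_syn               # synaptic decay
        v += dt * (-v / tau_m + g)          # leaky membrane integration
        v_peak = max(v_peak, v)
    return v_peak

fast = peak_depolarization(tau_syn=0.005)   # AMPA-like, ~5 ms decay
slow = peak_depolarization(tau_syn=0.400)   # TRPC-like, ~400 ms decay
```

With a 125 ms inter-spike interval, the 5 ms conductance vanishes between pulses while the 400 ms conductance accumulates, so the slow case reaches a far larger peak depolarization.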
|
98 |
Computational Analysis of Asymmetric Environments of Soluble Epidermal Growth Factor and Application to Single Cell Polarization and Fate Control. Verneau, Julien, January 2011
Stem and progenitor cells have the ability to regulate fate decisions through asymmetric cell divisions. The coordinated choice of cell division symmetry in space and time contributes to the physiological development of tissues and organs. Conversely, deregulation of these decisions can lead to the uncontrolled proliferation of cells, as observed in cancer. Understanding the mechanisms of cell fate choices is necessary for the design of biomimetic culture systems and the production of therapeutic cell populations in the context of regenerative medicine.
Environmental signals can guide the fate decision process at the single-cell level, but the exact nature of these signals remains to be discovered. Gradients of factors are important during development, and several methods have been developed to recreate gradients and/or pulses of factors in vitro. In the context of asymmetric cell division, however, the effect of the soluble factor environment on the polarization of cell surface receptors and intracellular proteins has not been thoroughly investigated.
We developed a finite-element model of a single cell in culture in which epidermal growth factor (EGF) was delivered through a micropipette onto a single cell surface. A two-dimensional approach initially allowed for the development of a set of metrics to evaluate the polarization potential with respect to different delivery strategies. We further analyzed a three-dimensional model in which conditions consistent with single cell polarization were identified. The benefits of finite-element modeling were illustrated through the demonstration of complex geometry effects resulting from the culture chamber and neighboring cells.
Finally, physiological effects of in vitro polarization were analyzed at the single-cell level in HeLa and primary cells, demonstrating the potential of soluble factor signaling for directed fate control. Long-term phenotypic effects were studied using live-cell imaging, which revealed the degree of heterogeneity of in vitro culture systems and the challenges ahead for the production of therapeutic cell populations.
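The intuition behind the delivery-strategy metrics can be sketched analytically: in free solution, a continuous point source (the micropipette tip) gives a steady-state profile C(r) = Q/(4πDr), so the near-pole/far-pole concentration ratio across a cell depends only on geometry. The numbers below are illustrative; the thesis uses finite elements precisely because chamber walls and neighboring cells break this idealization:

```python
import math

# Idealized steady-state concentration from a continuous point source in an
# infinite medium, used to estimate a cross-cell polarization ratio.
# Q, D and the distances are illustrative values, not from the thesis.
def point_source_conc(Q, D, r):
    """Steady-state concentration at distance r from a continuous source."""
    return Q / (4.0 * math.pi * D * r)

def polarization_ratio(Q, D, tip_to_cell, cell_diameter):
    """Near-pole / far-pole concentration ratio across a single cell."""
    near = point_source_conc(Q, D, tip_to_cell)
    far = point_source_conc(Q, D, tip_to_cell + cell_diameter)
    return near / far

# usage: tip 10 um from a 20 um cell gives a threefold difference in EGF
# concentration between the near and far poles (the ratio is purely
# geometric, since Q and D cancel)
ratio = polarization_ratio(Q=1e-18, D=1.5e-10, tip_to_cell=10e-6,
                           cell_diameter=20e-6)
```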
|
99 |
First principles approach to understanding stability and phase transitions of metal A(II)B(IV) hexafluorides. Pueschel, Charles A., 24 November 2015
No description available.
|
100 |
Command and control in the context of digitization: a study based on computational modeling. Bertol, Frederico Licks, January 2018
This work discusses the impacts of digitization on military command and control systems. The central hypothesis is that the intensive deployment of digital technologies is associated with a greater risk of informational overload in those systems. This applies especially to military forces that have adopted doctrines with a technocratic bias, such as network-centric warfare. In the first chapter, we discuss the context of the research topic, briefly retracing the process of digitization and defining some key concepts. In the second chapter, in the form of an article, we present the computational model developed to simulate the operation of a command and control system under informational overload.
The article also contains a critical review of command and control approaches, with emphasis on the literature on network-centric warfare. The third and last chapter offers conclusions regarding the use of computational modeling as a research method and the current state of the debate on network-centric warfare.
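A minimal sketch of the overload hypothesis, assuming a single decision node modeled as a discrete-time queue (this toy model is far simpler than the computational model of the thesis, and the rates are illustrative only):

```python
import random

# Toy queueing sketch of informational overload in a command-and-control
# node: reports arrive at a fixed rate and a decision-maker processes them
# at a fixed capacity. When arrivals exceed capacity, the backlog (and thus
# decision latency) grows without bound.
def simulate_backlog(arrival_rate, service_rate, steps=10_000, seed=1):
    """Return the backlog trajectory of a single-server message queue."""
    rng = random.Random(seed)
    backlog, trajectory = 0, []
    for _ in range(steps):
        if rng.random() < arrival_rate:     # a report arrives this tick
            backlog += 1
        if backlog and rng.random() < service_rate:
            backlog -= 1                    # one report is processed
        trajectory.append(backlog)
    return trajectory

stable = simulate_backlog(arrival_rate=0.3, service_rate=0.5)
overloaded = simulate_backlog(arrival_rate=0.7, service_rate=0.5)
```

The stable case hovers near an empty queue, while the overloaded case accumulates an ever-growing backlog, which is the qualitative failure mode the digitization hypothesis points to.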
|