  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
71

Neural network modelling for shear strength of concrete members reinforced with FRP bars

Bashir, Rizwan, Ashour, Ashraf 10 April 2012 (has links)
Yes / This paper investigates the feasibility of using artificial neural networks (NNs) to predict the shear capacity of concrete members reinforced longitudinally with fibre reinforced polymer (FRP) bars, without any shear reinforcement. An experimental database of 138 test specimens that failed in shear was created and used to train and test NNs, as well as to assess the accuracy of three existing shear design methods. The created NN predicted the shear capacity of FRP reinforced concrete members to a high level of accuracy. The Garson index was employed to identify the relative importance of the influencing parameters on the shear capacity, based on the trained NN's weights. A parametric analysis was also conducted using the trained NN to establish the trend of the main influencing variables on the shear capacity. Many of the assumptions made by the shear design methods are reproduced by the developed NN; however, a few are inconsistent with the NN predictions.
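The Garson index mentioned above has a compact form: each input's contribution through a hidden unit is the product of the absolute input-to-hidden and hidden-to-output weights, normalised within each hidden unit and then summed. A minimal sketch with invented toy weights, not the trained network from the paper:

```python
import numpy as np

def garson_importance(w_ih, w_ho):
    """Relative importance of each input via Garson's algorithm.

    w_ih : (n_inputs, n_hidden) input-to-hidden weights
    w_ho : (n_hidden,) hidden-to-output weights (single output unit)
    """
    # contribution of input i routed through hidden unit j
    contrib = np.abs(w_ih) * np.abs(w_ho)             # (n_inputs, n_hidden)
    # normalise within each hidden unit so its contributions sum to 1
    contrib = contrib / contrib.sum(axis=0, keepdims=True)
    # sum over hidden units and rescale to relative importances
    imp = contrib.sum(axis=1)
    return imp / imp.sum()

# toy network: 3 inputs, 2 hidden units (weights are invented)
w_ih = np.array([[2.0, 0.1],
                 [0.5, 0.4],
                 [0.1, 0.1]])
w_ho = np.array([1.0, 0.5])
importance = garson_importance(w_ih, w_ho)  # sums to 1; input 0 dominates here
```

A known quirk of Garson's algorithm is that the per-unit normalisation cancels the magnitudes of the output weights; it is nonetheless the standard formulation.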
72

Computational modelling of the human heart and multiscale simulation of its electrophysiological activity aimed at the treatment of cardiac arrhythmias related to ischaemia and Infarction

López Pérez, Alejandro Daniel 02 September 2019 (has links)
Cardiovascular diseases represent the main cause of morbidity and mortality worldwide, causing around 18 million deaths every year. Among these diseases, the most common is ischaemic heart disease, usually referred to as myocardial infarction (MI). After surviving an MI, a considerable number of patients develop life-threatening ventricular tachycardias (VT) during the chronic stage of the MI, that is, weeks, months or even years after the initial acute phase. This particular type of VT is typically sustained by reentry through slow conducting channels (CC), which are filaments of surviving myocardium that cross the non-conducting fibrotic infarct scar. When anti-arrhythmic drugs are unable to prevent recurrent VT episodes, radiofrequency ablation (RFA), a minimally invasive procedure performed by catheterization in the electrophysiology (EP) laboratory, is commonly used to permanently interrupt the electrical conduction through the CCs responsible for the VT. However, besides being invasive, risky and time-consuming, in cases of VT related to chronic MI up to 50% of patients continue suffering from recurrent VT episodes after the RFA procedure. Therefore, there is a need to develop novel pre-procedural strategies to improve RFA planning and, thereby, increase this relatively low success rate. 
First, we conducted an exhaustive review of the literature on existing 3D cardiac models in order to gain a deep knowledge of their main features and the methods used for their construction, with special focus on models oriented to simulation of cardiac EP. Then, using a clinical dataset of a chronically infarcted patient with a history of infarct-related VT, we designed and implemented a number of strategies and methodologies to (1) build patient-specific 3D computational models of infarcted ventricles that can be used to perform simulations of cardiac EP at the organ level, including the infarct scar and the surrounding region known as the border zone (BZ); (2) construct 3D torso models that enable computation of the simulated ECG; and (3) carry out pre-procedural personalized in-silico EP studies, trying to replicate the actual EP studies conducted in the EP laboratory prior to ablation. The goal of these methodologies is to locate the CCs in the 3D ventricular model in order to help define the optimal ablation targets for the RFA procedure. Lastly, as a proof of concept, we performed a retrospective simulation case study in which we were able to induce an infarct-related reentrant VT using different modelling configurations for the BZ. We validated our results by reproducing, with reasonable accuracy, the patient's ECG during VT, as well as in sinus rhythm from the endocardial activation maps invasively recorded via electroanatomical mapping systems in the latter case. This allowed us to find the location and analyse the features of the CC responsible for the clinical VT. Importantly, such an in-silico EP study could have been conducted prior to the RFA procedure, since our approach is based entirely on non-invasive clinical data acquired before the real intervention. 
These results confirm the feasibility of performing useful pre-procedural personalized in-silico EP studies, as well as the potential of the proposed approach to become a helpful tool for RFA planning in cases of infarct-related reentrant VT in the future. Nevertheless, the developed methodology requires further improvements and validation by means of simulation studies including large cohorts of patients. / During the course of this doctoral thesis, the author Alejandro Daniel López Pérez was financially supported by the Ministerio de Economía, Industria y Competitividad of Spain through the programme Ayudas para contratos predoctorales para la formación de doctores, grant number BES-2013-064089. / López Pérez, AD. (2019). Computational modelling of the human heart and multiscale simulation of its electrophysiological activity aimed at the treatment of cardiac arrhythmias related to ischaemia and Infarction [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/124973
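The reentry mechanism at the centre of this thesis — a wave circulating through surviving tissue around a non-conducting scar — can be illustrated with a deliberately simple excitable-medium toy: a ring of cells with excited and refractory phases, where an initially refractory cell produces the unidirectional block that starts the reentrant circuit. This is only a conceptual sketch, not the multiscale model developed in the thesis:

```python
def step(states, excited=3):
    """One synchronous update of a ring of excitable cells.

    Each cell is 0 (resting) or a countdown: `excited` while firing, then
    decreasing values while refractory. A resting cell fires when a
    neighbour is currently in the excited phase."""
    n = len(states)
    new = []
    for i in range(n):
        if states[i] > 0:
            new.append(states[i] - 1)   # progress through refractoriness
        else:
            left, right = states[i - 1], states[(i + 1) % n]
            new.append(excited if excited in (left, right) else 0)
    return new

# a ring standing in for the conducting channel around the scar:
# one excited cell next to a refractory cell -> unidirectional block
ring = [0] * 12
ring[0] = 3   # excited cell starts the wave
ring[11] = 1  # refractory cell blocks retrograde conduction
for _ in range(40):
    ring = step(ring)
# the wave is still circulating: sustained reentry
```

Because the refractory tail (3 cells) is shorter than the ring (12 cells), the wavefront always finds excitable tissue ahead of it and the circuit never extinguishes, which is the essence of scar-related reentrant VT.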
73

Computational modelling of the neural systems involved in schizophrenia

Thurnham, A. J. January 2008 (has links)
The aim of this thesis is to improve our understanding of the neural systems involved in schizophrenia by suggesting possible avenues for future computational modelling, in an attempt to make sense of the vast number of studies relating to the symptoms and cognitive deficits associated with the disorder. This multidisciplinary research has covered three different levels of analysis: abnormalities in the microscopic brain structure, dopamine dysfunction at a neurochemical level, and interactions between cortical and subcortical brain areas, connected by cortico-basal ganglia circuit loops; and has culminated in the production of five models that provide useful clarification in this difficult field. My thesis comprises three major modelling themes. Firstly, in Chapter 3 I looked at an existing neural network model addressing the Neurodevelopmental Hypothesis of Schizophrenia by Hoffman and McGlashan (1997). However, it soon became clear that such models were overly simplistic and brittle when it came to replication. While they focused on hallucinations and connectivity in the frontal lobes, they ignored other symptoms and the evidence of reductions in volume of the temporal lobes in schizophrenia. No mention was made of the considerable evidence of dysfunction of the dopamine system and associated areas, such as the basal ganglia. This led to my second line of reasoning: dopamine dysfunction. Initially I helped create a novel model of dopamine neuron firing based on the Computational Substrate for Incentive Salience by McClure, Daw and Montague (2003), incorporating temporal difference (TD) reward prediction errors (Chapter 5). I adapted this model in Chapter 6 to address the ongoing debate as to whether or not dopamine encodes uncertainty in the delay period between presentation of a conditioned stimulus and receipt of a reward, as demonstrated by sustained activation seen in single dopamine neuron recordings (Fiorillo, Tobler & Schultz 2003). 
An answer to this question could result in a better understanding of the nature of dopamine signaling, with implications for the psychopathology of cognitive disorders, like schizophrenia, for which dopamine is commonly regarded as having a primary role. Computational modelling enabled me to suggest that while sustained activation is common in single trials, there is the possibility that it increases with increasing probability, in which case dopamine may not be encoding uncertainty in this manner. Importantly, these predictions can be tested and verified by experimental data. My third modelling theme arose as a result of the limitations to using TD alone to account for a reinforcement learning account of action control in the brain. In Chapter 8 I introduce a dual weighted artificial neural network, originally designed by Hinton and Plaut (1987) to address the problem of catastrophic forgetting in multilayer artificial neural networks. I suggest an alternative use for a model with fast and slow weights to address the problem of arbitration between two systems of control. This novel approach is capable of combining the benefits of model free and model based learning in one simple model, without need for a homunculus and may have important implications in addressing how both goal directed and stimulus response learning may coexist. Modelling cortical-subcortical loops offers the potential of incorporating both the symptoms and cognitive deficits associated with schizophrenia by taking into account the interactions between midbrain/striatum and cortical areas.
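The TD reward-prediction-error signal underlying these dopamine models can be sketched on a toy conditioning chain: a conditioned stimulus at the start, a reward at the end, and TD(0) updates that move value (and hence the prediction error) back towards the stimulus. Parameter values here are illustrative, not those of the published models:

```python
import numpy as np

def td_learning(rewards, alpha=0.1, gamma=0.95, n_trials=500):
    """TD(0) on a chain of states visited in order on every trial.

    Returns the learned state values and the prediction errors (deltas)
    recorded on the final trial."""
    n = len(rewards)
    V = np.zeros(n + 1)                    # V[n] is the terminal state
    for _ in range(n_trials):
        deltas = []
        for s in range(n):
            delta = rewards[s] + gamma * V[s + 1] - V[s]   # TD error
            V[s] += alpha * delta
            deltas.append(delta)
    return V[:n], np.array(deltas)

# conditioned stimulus at state 0, reward only at the final state
V, deltas = td_learning([0.0, 0.0, 0.0, 1.0])
# after learning, V[0] approaches gamma**3 and the errors shrink towards 0
```

This captures the textbook result that, once the task is learned, the phasic prediction error at reward delivery disappears, having effectively transferred to the earliest reliable predictor.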
74

Towards systems pharmacology models of druggable targets and disease mechanisms

Knight-Schrijver, Vincent January 2019 (has links)
The development of essential medicines is being slowed by a lack of efficiency in drug development, as ninety per cent of drugs fail at some stage during clinical evaluation. This attrition in drug development is seen not because of a reduction in pharmaceutical research expenditure, nor is it caused by a declining understanding of biology; if anything, both are increasing. Instead, drugs are failing because we are unable to effectively predict how they will work before they are given to patients. This is due to limitations of the current methods used to evaluate a drug's toxicity and efficacy prior to its development. Quite simply, these methods do not account for the full complexity of biology in humans. Systems pharmacology models are a likely candidate for increasing the efficiency of drug discovery, as they seek to comprehensively model the fundamental biology of disease mechanisms in a quantitative manner. They are computational models, designed and hailed as a strategy for making well-informed and cost-effective decisions on drug viability and target druggability, and therefore attempt to reduce this time-consuming and costly attrition. Using text mining and text classification, I present a landscape of systems pharmacology models in the literature, one that has grown from humble roots through step-wise increases in our understanding of biology. Furthermore, I develop a case for the capability of systems pharmacology models in making predictions by constructing a model of interleukin-6 signalling for rheumatoid arthritis. This model shows that druggable target selection is not necessarily an intuitive task, as it results in an emergent but unanswered hypothesis for safety concerns in a monoclonal antibody. Finally, I show that predictive classification models can also be used to explore gene expression data in a novel workflow by attempting to predict patient response classes to an influenza vaccine.
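As an illustration of the kind of text classification used to map a modelling literature, here is a minimal multinomial naive Bayes classifier over bag-of-words counts. The documents and labels are toy examples; the thesis does not specify this particular classifier:

```python
from collections import Counter
import math

def train_nb(docs, labels):
    """Multinomial naive Bayes with Laplace smoothing over whitespace tokens."""
    classes = set(labels)
    prior = {c: labels.count(c) / len(labels) for c in classes}
    counts = {c: Counter() for c in classes}
    vocab = set()
    for doc, lab in zip(docs, labels):
        toks = doc.lower().split()
        counts[lab].update(toks)
        vocab.update(toks)
    return prior, counts, vocab

def predict_nb(model, doc):
    """Return the class with the highest smoothed log-posterior."""
    prior, counts, vocab = model
    best, best_lp = None, -math.inf
    for c in prior:
        total = sum(counts[c].values())
        lp = math.log(prior[c])
        for tok in doc.lower().split():
            lp += math.log((counts[c][tok] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = c, lp
    return best

# toy corpus: mechanistic-model abstracts vs clinical-statistics abstracts
docs = ["ode model of receptor signalling pathway",
        "clinical trial cohort outcome statistics",
        "kinetic model of drug target binding",
        "patient survival statistics regression"]
labels = ["model", "clinical", "model", "clinical"]
model = train_nb(docs, labels)
```

With a real corpus the same pipeline scales to thousands of abstracts; only the tokenisation and corpus loading change.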
75

A comparative analysis of Purkinje cells across species combining modelling, machine learning and information theory

Kidd, Kirsty January 2017 (has links)
There have been a number of computational modelling studies that aim to replicate the cerebellar Purkinje cell, though these typically use the morphology of rodent cells. While many species, including rodents, display intricate dendritic branching, it is not a universal feature among Purkinje cells. This study uses morphological reconstructions of 24 Purkinje cells from seven species to explore the changes that occur to the cell through evolution and examine whether this has an effect on the processing capacity of the cell. This is achieved by combining several modes of study in order to gain a comprehensive overview of the variations between the cells in both morphology and behaviour. Passive and active computational models of the cells were created, using the same electrophysiological parameters and ion channels for all models, to characterise the voltage attenuation and electrophysiological behaviour of the cells. These results and several measures of branching and size were then used to look for clusters in the data set using machine learning techniques. They were also used to visualise the differences within each species group. Information theory methods were also employed to compare the estimated information transfer from input to output across each cell. Along with a literature review into what is known about Purkinje cells and the cerebellum across the phylogenetic tree, these results show that while there are some obvious differences in morphology, the variation within species groups in electrophysiological behaviour is often as high as between them. This suggests that morphological changes may occur in order to conserve behaviour in the face of other changes to the cerebellum.
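The clustering of cells by morphological measures described above can be sketched with plain k-means on a toy feature matrix. The feature values below are invented for illustration; the actual study used many more measures and 24 reconstructed cells:

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Plain k-means: cluster feature vectors into k groups."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(n_iter):
        # distance of every point to every centre, then nearest-centre labels
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centre to the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centres[j] = X[labels == j].mean(axis=0)
    return labels, centres

# toy morphological features: [total dendritic length (mm), branch points]
X = np.array([[1.0, 400.0], [1.1, 420.0], [0.9, 390.0],   # elaborate cells
              [0.2, 40.0], [0.25, 55.0], [0.18, 35.0]])   # sparse cells
labels, centres = kmeans(X, k=2)
```

In practice the features would be standardised first, since k-means is scale-sensitive; the toy data are separated clearly enough that it converges either way.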
76

Computational Modelling Of Heat Transfer In Reheat Furnaces

Harish, J 12 1900 (has links)
Furnaces that heat metal parts (blooms) prior to hot-working processes such as rolling or forging are called pre-forming reheat furnaces. In these furnaces, the fundamental idea is to heat the blooms to a prescribed temperature without very large temperature gradients in them. This is to ensure correct performance of the metal parts subsequent to reheating. Due to the elevated temperature in the furnace chamber, radiation is the dominant mode of heat transfer from the furnace to the bloom. In addition, there is convection heat transfer from the hot gases to the bloom. The heat transfer within the bloom is by conduction. In order to design a new furnace or to improve the performance of existing ones, the heat transfer analysis has to be done accurately. Given the complex geometry and large number of parameters encountered in the furnace, an analytical solution is difficult, and hence one has to resort to numerical modelling. In the present work, a numerical technique for modelling the steady-state and transient heat transfer in a reheat furnace is developed. The work mainly involves the development of a radiation heat transfer analysis code for a reheat furnace, since a major part of the heat transfer in the furnace chamber is due to radiation from the roof and combustion gases. The code is modified from an existing finite volume method (FVM) based radiation heat transfer solver. The existing solver is a general-purpose radiation heat transfer solver for enclosures and incorporates the following features: surface-to-surface radiation, gray absorbing-emitting medium in the enclosure, multiple reflections off the bounding walls, shadowing effects due to obstructions in the enclosure, diffuse reflection and enclosures with irregular geometry. 
As a part of the present work, it has now been extended to include the following features that characterise radiation heat transfer in the furnace chamber: a combination of specular and diffuse reflection, as is the case with most real surfaces, and participating non-gray media, as the combustion gases in the furnace chamber exhibit highly spectral radiative characteristics. Transient 2D conduction heat transfer within the metal part is then modelled using a FVM-based code. Radiation heat flux from the radiation model and convection heat flux calculated using existing correlations act as boundary conditions for the conduction model. A global iteration involving the radiation model and the conduction model is carried out for the overall solution. For the study, two types of reheat furnaces were chosen: the pusher-type furnace and the walking beam furnace. The difference in the heating process of the two furnaces implies that they have to be modelled differently. In the pusher-type furnace, the heating of the blooms is only from the hot roof and the gas. In the walking beam furnace, the heating is also from the hearth and the blooms adjacent to any given bloom. The model can predict the bloom residence time for any particular combination of furnace conditions and load dimensions. The effects of variations of emissivities of the load, thickness of the load and the residence time of billet in the furnaces were studied.
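The conduction half of the global iteration described above can be sketched as an explicit 1D finite-volume model of the bloom, with a combined radiative-convective flux boundary condition on the heated face. Property values, geometry, and the fixed furnace temperature below are illustrative; the actual work uses 2D conduction coupled to a full FVM radiation solution:

```python
import numpy as np

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2 K^4

def heat_bloom(T0, T_furnace, T_gas, eps, h, L=0.2, n=20, dt=0.05,
               t_end=600.0, k=40.0, rho=7850.0, cp=500.0):
    """Explicit 1D finite-volume conduction in a steel bloom.

    The top face receives radiative flux from the furnace roof and
    convective flux from the hot gas; the bottom face is adiabatic."""
    dx = L / n
    alpha = k / (rho * cp)                 # thermal diffusivity
    T = np.full(n, T0, dtype=float)
    for _ in range(int(t_end / dt)):
        q_rad = eps * SIGMA * (T_furnace**4 - T[0]**4)   # roof -> surface
        q_conv = h * (T_gas - T[0])                      # gas -> surface
        Tn = T.copy()
        Tn[0] += dt * ((q_rad + q_conv) / (rho * cp * dx)
                       + alpha * (T[1] - T[0]) / dx**2)
        Tn[1:-1] += dt * alpha * (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
        Tn[-1] += dt * alpha * (T[-2] - T[-1]) / dx**2   # adiabatic bottom
        T = Tn
    return T

# cold bloom pushed into a hot chamber (illustrative conditions)
T = heat_bloom(T0=300.0, T_furnace=1500.0, T_gas=1400.0, eps=0.8, h=50.0)
```

The time step satisfies the explicit stability limit dt < dx²/(2α); in the full model, this conduction solve and the radiation solve exchange surface temperatures and fluxes until the global iteration converges.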
77

Modelagem e avaliação comparativa dos métodos Luus-Jaakola e R2W aplicados na estimativa de parâmetros cinéticos de adsorção / Modeling and comparative evaluation of Luus-Jaakola and R2W methods applied in estimating kinetic parameters of adsorption

Melicia Aline Cortat Ribeiro 18 June 2012 (has links)
Fundação Carlos Chagas Filho de Amparo a Pesquisa do Estado do Rio de Janeiro / Inverse techniques have been used to determine important parameters involved in the design and performance of many industrial processes. The application of stochastic methods has grown in recent years, demonstrating their potential in the study and analysis of different systems in engineering applications. Stochastic routines are able to search for the optimum over a wide range of the variable domain, making it possible to determine the parameters of interest simultaneously. In this work, the stochastic methods Luus-Jaakola (LJ) and Random Restricted Window (R2W) were adopted to obtain the optimal adsorption kinetic parameters of a batch chromatography system, with the aim of determining which method provides the better fit between the computer simulations and the experimental data. The model was solved using the 4th-order Runge-Kutta method for the solution of the ordinary differential equations.
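The Luus-Jaakola method itself is a simple random search with a contracting search window. A sketch on a toy first-order adsorption-kinetics fitting problem — the data are synthetic and the model is a closed-form stand-in, not the thesis's chromatography ODE system integrated with Runge-Kutta:

```python
import math
import random

def luus_jaakola(f, lo, hi, n_outer=80, n_inner=100, contraction=0.9, seed=1):
    """Luus-Jaakola random search: sample candidates in a window centred on
    the best point found so far, contracting the window each outer pass."""
    random.seed(seed)
    dim = len(lo)
    x = [(l + h) / 2 for l, h in zip(lo, hi)]   # start at the domain centre
    r = [h - l for l, h in zip(lo, hi)]         # initial window widths
    fx = f(x)
    for _ in range(n_outer):
        for _ in range(n_inner):
            cand = [min(max(x[i] + r[i] * (random.random() - 0.5), lo[i]),
                        hi[i]) for i in range(dim)]
            fc = f(cand)
            if fc < fx:                         # keep improvements only
                x, fx = cand, fc
        r = [ri * contraction for ri in r]      # shrink the search window
    return x, fx

# synthetic first-order adsorption kinetics q(t) = q_e * (1 - exp(-k1 t)),
# generated with q_e = 2.0, k1 = 0.3 (illustrative values)
t_data = [1, 2, 4, 8, 16]
q_data = [2.0 * (1 - math.exp(-0.3 * t)) for t in t_data]
sse = lambda p: sum((p[0] * (1 - math.exp(-p[1] * t)) - q) ** 2
                    for t, q in zip(t_data, q_data))
best, err = luus_jaakola(sse, lo=[0.1, 0.01], hi=[10.0, 2.0])
```

In the thesis's setting, `f` would instead integrate the batch-chromatography ODEs (via 4th-order Runge-Kutta) for each candidate parameter set and return the misfit against the experimental data.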
78

La musique arabe et les nouvelles technologies : caractérisation, esthétique et modélisation informatique / Arabic music and new technologies : characterization, aesthetics and computational modelling

Belhassen, Raed 11 December 2017 (has links)
This thesis focuses on Arabic musical composition and new technologies, with the idiosyncratic characteristics inherent in the musical language as its central element. The encounter with new technologies raises several underlying questions. The proposed approach consists of a formal framework with a set of idiomatic structures, viewed from a pluridisciplinary angle that is musicological and ethnomusicological, but also historical-acoustical and theoretical-empirical at the same time. The musicological model obtained underlines: the importance of primary melodic cells; the acoustic quantification of interval systems; aspects of monody and heterophony at the individual and collective levels; the instability of degrees; and aspects related to musical interpretation. Original approaches are proposed for the study of ornamentation mechanisms, and three categories are defined and named. This musicological model is then confronted with new technologies, and the consequences of this encounter are underlined. The notions of simulation and emulation highlight the centrality of timbre, and the use of audio-visual samples allows the implicit domain of experimentation to be identified. A comparison with the experimental model of electronic music in general, and computer music in particular, is then made. Finally, a computational modelling experiment in the Csound environment is proposed, tracing the idiosyncratic elements identified. Electroacoustic compositions are cited and one piece is analysed.
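The acoustic quantification of interval systems mentioned above is usually expressed in cents, with Arabic maqam scales commonly approximated in 24-EDO quarter-tone steps of 50 cents. A small sketch; the 24-EDO rendering of maqam Rast below is a common idealisation, not the thesis's measured values:

```python
import math

def cents(ratio):
    """Size of a frequency ratio in cents (1200 cents per octave)."""
    return 1200 * math.log2(ratio)

def quarter_tone_freq(base_hz, steps):
    """Frequency `steps` 24-EDO quarter-tone steps (50 cents each) above base."""
    return base_hz * 2 ** (steps / 24)

# maqam Rast from the tonic, in quarter-tone steps (24-EDO idealisation):
# tonic, +200c, +350c (half-flat third), +500c, +700c, +900c,
# +1050c (half-flat seventh), octave
rast_steps = [0, 4, 7, 10, 14, 18, 21, 24]
rast_cents = [50 * s for s in rast_steps]
e_half_flat = quarter_tone_freq(261.63, 7)   # ~350 cents above C4
```

The same two functions are enough to drive oscillator frequencies in a Csound orchestra, which is how microtonal scale degrees outside 12-EDO are typically specified there.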
79

Combined experimental and computational investigation into inter-subject variability in cardiac electrophysiology

Britton, Oliver Jonathan January 2015 (has links)
The underlying causes of variability in the electrical activity of hearts from individuals of the same species are not well understood. Understanding this variability is important to enable prediction of the response of individual hearts to diseases and therapies. Current experimental and computational methods for investigating the behaviour of the heart do not incorporate biological variation between individuals. In experimental studies, results are averaged together to control errors and determine the average behaviour of the studied organism. In computational studies, averaged experimental data are usually used to develop models; these models therefore represent a 'typical' organism, with all information on variability within the species lost. In this thesis we develop a methodology for modelling variability between individuals of the same species in cardiac cellular electrophysiology, motivated by the inability of traditional computational modelling approaches to capture experimental variability. A first study is conducted using traditional modelling approaches to investigate potentially pro-arrhythmic abnormalities in rabbit Purkinje fibres. A comparison with experimental recordings highlights their wide variability and the inability of existing computer modelling approaches to capture it. This leads to the development of a novel methodology that integrates the variability observed in experimental data with computational modelling and simulation, by building experimentally-calibrated populations of computational models that collectively span the variability seen in experimental data. We apply this methodology to construct a population of rabbit Purkinje cell models. We show that our population of models can quantitatively predict the range of responses, not just the average response, to application of the potassium channel blocking drug dofetilide. 
This demonstrates an important potential application of our methodology, for predicting pro-arrhythmic drug effects in safety pharmacology. We then analyse a data set of experimental recordings from human ventricular tissue preparations, and use this data to develop a population of human ventricular cell models. We apply this population to study how variability between individuals alters the susceptibility of cardiac cells to developing drug-induced repolarisation abnormalities. These abnormalities can increase the chance of fatal arrhythmias, but the mechanisms that determine individual susceptibility are not well-understood.
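The experimentally-calibrated population-of-models approach can be sketched in a few lines: sample candidate parameter sets, simulate each, and keep only those whose biomarkers fall inside the experimentally observed range. The "simulation" below is a toy algebraic surrogate standing in for a real ionic cell model, with invented parameter names and ranges:

```python
import random

def simulate_apd(g_k, g_ca):
    """Toy surrogate for action potential duration (ms): more Ca current
    lengthens the AP, more K current shortens it. Stands in for running a
    full ionic cell model to steady state."""
    return 250.0 * (1 + 0.6 * (g_ca - 1.0)) / (1 + 0.8 * (g_k - 1.0))

def build_population(n_samples, apd_lo, apd_hi, seed=42):
    """Sample conductance scaling factors and keep only the parameter sets
    whose output falls inside the experimental range (calibration step)."""
    random.seed(seed)
    accepted = []
    for _ in range(n_samples):
        g_k = random.uniform(0.5, 2.0)
        g_ca = random.uniform(0.5, 2.0)
        apd = simulate_apd(g_k, g_ca)
        if apd_lo <= apd <= apd_hi:
            accepted.append((g_k, g_ca, apd))
    return accepted

# calibrate against an assumed experimental APD range of 200-320 ms
population = build_population(2000, apd_lo=200.0, apd_hi=320.0)
```

Drug effects are then simulated on every accepted model, so the output is a distribution of responses spanning the calibrated variability rather than a single averaged prediction.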
