111

Simultaneous characterization of object temperature and radiative properties through multispectral infrared thermography

Toullier, Thibaud 06 November 2019
Recent technological improvements in low-cost infrared cameras have created new opportunities for the long-term monitoring of infrastructures. Accurate in-situ measurement of surface temperatures, however, is hampered by a lack of knowledge of the radiative properties of the scene. Using multi-sensor instrumentation, the measurement model can be refined to obtain a more accurate estimate of the temperature. Conversely, it is shown that freely available online climatic data can compensate for missing sensors. Bayesian methods for the joint estimation of emissivity and temperature are then developed and compared with methods from the literature. A simulator of diffuse radiative exchanges in 3D scenes was implemented to test these different methods on numerical data; the software uses hardware (GPGPU) acceleration to reduce computation time. The numerical results point toward an advanced use of multispectral infrared thermography for structural monitoring: the joint estimation yields a temperature estimate by infrared thermography with a known uncertainty.
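The joint estimation described above can be illustrated with a minimal sketch: temperature and a spectrally flat emissivity are sampled jointly from the posterior of a Planck-law measurement model with a random-walk Metropolis algorithm. The bands, priors, graybody assumption, and noise level below are invented for illustration, not the instrument model of the thesis; the point is that the posterior spread directly provides the "known uncertainty" the abstract mentions.

```python
import numpy as np

# Physical constants for Planck's law
H = 6.626e-34    # Planck constant (J s)
C = 2.998e8      # speed of light (m/s)
KB = 1.381e-23   # Boltzmann constant (J/K)

def planck(lam, T):
    """Blackbody spectral radiance at wavelength lam (m) and temperature T (K)."""
    return 2 * H * C**2 / lam**5 / (np.exp(H * C / (lam * KB * T)) - 1)

# Hypothetical graybody measurement: radiance = emissivity * blackbody + noise
lams = np.array([8e-6, 9e-6, 10e-6, 12e-6])   # four LWIR bands (assumed)
true_T, true_eps = 300.0, 0.85
rng = np.random.default_rng(0)
sigma = 0.01 * planck(lams, true_T)           # assumed per-band noise level
y = true_eps * planck(lams, true_T) + sigma * rng.standard_normal(lams.size)

def log_post(T, eps):
    """Log-posterior with flat priors: T in (250, 350) K, eps in (0, 1]."""
    if not (250.0 < T < 350.0 and 0.0 < eps <= 1.0):
        return -np.inf
    resid = (y - eps * planck(lams, T)) / sigma
    return -0.5 * np.sum(resid**2)

# Random-walk Metropolis over (T, eps)
theta = np.array([290.0, 0.7])
lp = log_post(*theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, [1.0, 0.01])
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
post = np.array(samples[5000:])               # discard burn-in
print("T   mean ± sd:", post[:, 0].mean(), post[:, 0].std())
print("eps mean ± sd:", post[:, 1].mean(), post[:, 1].std())
```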
112

DEVELOPMENT OF DROPWISE ADDITIVE MANUFACTURING WITH NON-BROWNIAN SUSPENSIONS: APPLICATIONS OF COMPUTER VISION AND BAYESIAN MODELING TO PROCESS DESIGN, MONITORING AND CONTROL

Andrew J. Radcliffe 24 July 2020
In the past two decades, the pharmaceutical industry has been engaged in the modernization of its drug development and manufacturing strategies, spurred onward by changing market pressures, regulatory encouragement, and technological advancement. Concomitant with these changes has been a shift toward new modalities of manufacturing in support of patient-centric medicine and on-demand production. Achieving these objectives requires manufacturing platforms that are both flexible and scalable, hence the interest in the development of small-scale, continuous processes for synthesis, purification, and drug product production. Traditionally, the downstream steps begin with a crystalline drug powder – the effluent of the final purification steps – and convert this to tablets or capsules through a series of batch unit operations reliant on powder processing. As an alternative, additive manufacturing technologies provide the means to circumvent difficulties associated with dry powder rheology, while being inherently capable of flexible production.

Through the combination of physical knowledge, experimental work, and data-driven methods, a framework was developed for ink formulation and process operation in drop-on-demand manufacturing with non-Brownian suspensions. Motivated by the challenges at hand, the application of novel computational image analysis techniques yielded insight into the effects of non-Brownian particles and fluid properties on rheology. Furthermore, the extraction of modal and statistical information provided insight into the stochastic events which appear to play a notable role in drop formation from such suspensions. These computer vision algorithms can readily be applied by other researchers interested in the physics of drop coalescence and breakup in order to further modeling efforts.

Returning to the realm of process development to deal with the challenges of monitoring and quality control raised by suspension-based manufacturing, these machine vision algorithms were combined with Bayesian modeling to enact a probabilistic control strategy at the level of each dosage unit, using real-time image data acquired by an online process image sensor. Drawing upon a large historical database spanning a wide range of conditions, a hierarchical modeling approach was used to incorporate the various sources of uncertainty inherent to the manufacturing process and monitoring technology, thereby providing more reliable predictions for future data at in-sample and out-of-sample conditions.

This thesis thus contributes advances in three closely linked areas: additive manufacturing of solid oral drug products, computer vision methods for event recognition in drop formation, and Bayesian hierarchical modeling to predict the probability that each dosage unit produced is within specifications.
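To make the final contribution concrete, here is a minimal sketch of a predictive in-spec probability under a toy two-level (hierarchical) normal model of drop mass: batch means vary around a process mean, and individual dosage units vary within batches. All numbers and the model itself are illustrative assumptions, not the dissertation's fitted model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy hierarchical model (illustrative only): batch-level mean drop mass is
# drawn around a process-level mean, and each dosage unit varies around its
# batch mean.
mu0, tau = 50.0, 1.5            # process mean (mg) and batch-to-batch sd (assumed)
sigma_unit = 0.8                # within-batch, unit-to-unit sd (assumed)
spec_lo, spec_hi = 47.5, 52.5   # hypothetical specification limits (mg)

def prob_in_spec(n_batches=2000, units_per_batch=100):
    """Monte Carlo estimate of the predictive probability that a future
    dosage unit falls inside the specification limits."""
    batch_means = rng.normal(mu0, tau, size=n_batches)
    units = rng.normal(batch_means[:, None], sigma_unit,
                       size=(n_batches, units_per_batch))
    return np.mean((units > spec_lo) & (units < spec_hi))

print(f"P(unit within spec) ~ {prob_in_spec():.3f}")
```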
113

Replication and Knowledge Production in Empirical Software Engineering Research

Krein, Jonathan L 01 December 2014
Although replication is considered an indispensable part of the scientific method in software engineering, few replication studies are published each year. This low rate, however, is not surprising given that replication theory in software engineering is immature. Not only are replication taxonomies varied and difficult to reconcile, but opinions on the role of replication contradict one another. In general, we have no clear sense of how to build knowledge via replication, particularly given the practical realities of our research field. Consequently, most replications in software engineering yield little useful information. In particular, the vast majority of external replications (i.e., replications performed by researchers unaffiliated with the original study) not only fail to reproduce the original results, but defy explanation. The net effect is that, as a research field, we consistently fail to produce usable (i.e., transferable) knowledge, and thus our research results have little if any impact on industry. In this dissertation, we dissect the problem of replication into four primary concerns: 1) the rate and explicitness of replication; 2) the theoretical foundations of replication; 3) the tractability of methods for context analysis; and 4) the effectiveness of inter-study communication. We address each of the four concerns via a two-part research strategy involving both a theoretical and a practical component. The theoretical component consists of a grounded theory study in which we integrate and then apply external replication theory to problems of replication in empirical software engineering. The theoretical component makes three key contributions to the literature: first, it clarifies the role of replication with respect to the overall process of science; second, it presents a flexible framework for reconciling disparate replication terminology; and third, it informs a broad range of practical replication concerns. The practical component involves a series of replication studies, through which we explore a variety of replication concepts and empirical methods, ultimately culminating in the development of a tractable method for context analysis (TCA). TCA enables the quantitative evaluation of context variables in greater detail, with greater statistical power, and with considerably smaller datasets than previously possible. As we show (via a complex, real-world example), the method ultimately enables the empirically and statistically grounded reconciliation and generalization of otherwise contradictory results across dissimilar replications, a problem that had previously remained unsolved in software engineering.
114

Modelling and prediction of bacterial attachment to polymers

Epa, V.C., Hook, A.L., Chang, Chien-Yi, Yang, J., Langer, R., Anderson, D.G., Williams, P., Davies, M.C., Alexander, M.R., Winkler, D.A. 04 December 2013
Infection of implanted and indwelling medical devices by pathogenic bacteria during surgery causes substantial morbidity and mortality worldwide. Attempts to ameliorate this important medical issue have included the development of antimicrobial surfaces, "no touch" surgical procedures, and materials with inherently low pathogen attachment. The search for new materials is increasingly carried out by high-throughput methods, so efficient methods for extracting knowledge from the resulting large datasets are essential. Here, data from a large polymer microarray exposed to three clinical pathogens are used to derive robust and predictive machine-learning models of pathogen attachment. The models predict pathogen attachment for the polymer library quantitatively, successfully predict pathogen attachment for a second-generation library, and identify polymer surface chemistries that enhance or diminish pathogen attachment. / Funding: CSIRO Advanced Materials Transformational Capability Platform; Newton Turner Award for Exceptional Senior Scientists; Wellcome Trust (Grant 085245); NIH (Grant R01 DE016516).
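A generic sketch of the workflow the abstract describes (descriptor matrix in, attachment prediction out) might look as follows. The descriptors, data, and choice of learner here are placeholders standing in for the study's actual surface-chemistry descriptors and model, which are described in the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(2)

# Placeholder data: rows = polymer spots, columns = surface-chemistry
# descriptors (random here; the study derives them from surface analysis).
n_polymers, n_descriptors = 400, 30
X = rng.standard_normal((n_polymers, n_descriptors))
# Synthetic "attachment" signal with a known dependence on two descriptors
y = X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.standard_normal(n_polymers)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("held-out R^2:", r2_score(y_te, model.predict(X_te)))
print("top descriptor importances:", model.feature_importances_[:5])
```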
115

EFFICIENT CONFIDENCE SETS FOR DISEASE GENE LOCATIONS

Sinha, Ritwik 19 March 2007
No description available.
116

Gaussian Process Methods for Estimating Radio Channel Characteristics

Ottosson, Anton, Karlstrand, Viktor January 2020
Gaussian processes (GPs) as a Bayesian regression method have been around for some time. Since they have proven advantageous for sparse and noisy data, we explore the potential of Gaussian process regression (GPR) as a tool for estimating radio channel characteristics. Specifically, we consider the estimation of a time-varying continuous transfer function from discrete samples. We introduce the basic theory of GPR and employ both GPR and its deep-learning counterpart, deep Gaussian process regression (DGPR), for estimation. We find that both perform well, even with few samples. Additionally, we relate the channel coherence bandwidth to a GPR hyperparameter called the length-scale. The results show a tendency toward proportionality, suggesting that our approach offers an alternative way to approximate the coherence bandwidth. / Bachelor's thesis in electrical engineering, 2020, KTH, Stockholm.
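A minimal sketch of the core idea, using scikit-learn: fit a GP with an RBF kernel to sparse, noisy samples of a synthetic transfer-function magnitude and read off the fitted length-scale, the hyperparameter the thesis finds to scale roughly in proportion to the coherence bandwidth. The frequency axis and signal below are invented stand-ins for measured channel data.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)

# Synthetic transfer-function magnitude over frequency (MHz): a smooth
# surrogate standing in for measured channel data.
f = np.linspace(0.0, 20.0, 400)
H_true = np.sin(2 * np.pi * f / 5.0) + 0.5 * np.cos(2 * np.pi * f / 2.0)
idx = rng.choice(f.size, size=60, replace=False)        # sparse samples
X, y = f[idx, None], H_true[idx] + 0.1 * rng.standard_normal(60)

kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# The fitted RBF length-scale is the hyperparameter related to the
# coherence bandwidth in the thesis.
print("fitted kernel:", gpr.kernel_)
mean, std = gpr.predict(f[:, None], return_std=True)    # estimate + uncertainty
```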
117

Mathematical modelling of metabolism and acidity in cancer

McGillen, Jessica Buono January 2014
Human cancers exhibit the common phenotype of elevated glycolytic metabolism, which causes acidification of the tissue microenvironment and may facilitate tumour invasion. In this thesis, we use mathematical models to address a series of open problems underlying the glycolytic tumour phenotype and its attendant acidity. We first explore tissue-scale consequences of metabolically derived acid. Incorporating more biological detail into a canonical model of acidity at the tumour-host interface, we extend the range of tumour behaviours captured by the modelling framework. We then carry out an asymptotic travelling wave analysis to express invasive tumour properties in terms of fundamental parameters, and find that interstitial gaps between an advancing tumour and retreating healthy tissue, characteristic of aggressive invasion and comprising a controversial feature of the original model, are less significant under our generalised formulation. Subsequently, we evaluate a potential role of lactate, historically assumed to be a passive byproduct of glycolytic metabolism, in a perfusion-dependent metabolic symbiosis that was recently proposed as a beneficial tumour behaviour. Upon developing a minimal model of dual glucose-lactate consumption in vivo and employing a multidimensional sensitivity analysis, we find that symbiosis may not be straightforwardly beneficial for our model tumour. Moreover, new in vitro experiments, carried out by an experimental collaborator, place U87 glioblastoma tumours in a weakly symbiotic parameter regime despite their clinical malignancy. These results suggest that intratumoural metabolic cooperation is unlikely to be an important function of lactate. Finally, we examine the complex pH regulation system that governs expulsion of metabolically derived acid loads across tumour cell membranes. This system differs from the healthy system by expression of only a few key proteins, yet its dynamics are non-intuitive in the crowded and poorly perfused in vivo environment. We systematically develop a model of tumour pH regulation, beginning with a single-cell scenario and progressing to a spheroid, within a Bayesian framework that incorporates information from in vitro data contributed by a second experimental collaborator. We predict that a net effect of pH regulation is a straightforward transmembrane pH gradient, but also that existing treatments are unable to disrupt the system strongly enough to cause tumour cell death. Taken together, our models help to elucidate previously unresolved features of glycolytic tumour metabolism, and illustrate the utility of a combined mathematical, statistical, and experimental approach for testing biological hypotheses. Opportunities for further investigation are discussed.
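For reference, the canonical model of acidity at the tumour-host interface referred to above is presumably the Gatenby-Gawlinski reaction-diffusion system; one common dimensional form is sketched below. The thesis generalises this formulation, so its exact equations may differ.

```latex
% Acid-mediated invasion (Gatenby--Gawlinski form): N_1 = healthy tissue,
% N_2 = tumour, L = excess acid. A sketch of one common dimensional form.
\begin{align}
\frac{\partial N_1}{\partial t} &= r_1 N_1\left(1 - \frac{N_1}{K_1}\right) - d_1 L N_1,\\
\frac{\partial N_2}{\partial t} &= r_2 N_2\left(1 - \frac{N_2}{K_2}\right)
  + D_2\,\nabla\cdot\left[\left(1 - \frac{N_1}{K_1}\right)\nabla N_2\right],\\
\frac{\partial L}{\partial t} &= r_3 N_2 - d_3 L + D_3\,\nabla^2 L.
\end{align}
```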
118

Methodology for the use of longitudinal quantitative biomarkers in medical decision making: application to PSA in prostate cancer

Subtil, Fabien 04 June 2010
For early diagnosis or prognosis of an event from repeated measurements of a biomarker over time, it is first necessary to define a criterion derived from the longitudinal profile of the marker. A method was developed for robust modelling of the marker measurements, in order to compute the candidate criteria for each patient and compare their diagnostic or prognostic accuracies. Using a continuous criterion as a diagnostic test then requires the specification of a threshold above or below which the test is considered positive; a Bayesian method was developed to estimate this threshold and its credible interval. The approach was applied to the diagnosis of local persistence of prostate cancer cells after ultrasound treatment. The diagnosis relies on serial measurements of prostate-specific antigen (PSA), whose nadir (combined with several thresholds) was found to be the best diagnostic criterion. This makes it possible to trigger a biopsy only when it is likely to be positive.
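The idea of a threshold with an attached uncertainty interval can be sketched as follows. The thesis develops a proper Bayesian estimator of the threshold and its credible interval; the stand-in below instead picks the PSA-nadir cutoff that maximizes Youden's index and quantifies its uncertainty by bootstrap resampling, on invented data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical PSA nadir values (ng/mL): persistence cases tend to have
# higher nadirs than cured patients (values invented for illustration).
nadir_cured = rng.lognormal(mean=-1.0, sigma=0.6, size=120)
nadir_persist = rng.lognormal(mean=0.3, sigma=0.6, size=60)

def best_threshold(neg, pos):
    """Threshold maximizing Youden's J = sensitivity + specificity - 1."""
    cand = np.sort(np.concatenate([neg, pos]))
    sens = np.array([(pos >= c).mean() for c in cand])
    spec = np.array([(neg < c).mean() for c in cand])
    return cand[np.argmax(sens + spec - 1)]

# Resampling-based interval for the threshold (a bootstrap stand-in for
# the thesis' Bayesian credible interval).
draws = [best_threshold(rng.choice(nadir_cured, nadir_cured.size),
                        rng.choice(nadir_persist, nadir_persist.size))
         for _ in range(2000)]
lo, mid, hi = np.percentile(draws, [2.5, 50, 97.5])
print(f"threshold ~ {mid:.2f} ng/mL, 95% interval [{lo:.2f}, {hi:.2f}]")
```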
119

Adaptive multiple importance sampling and approximate Bayesian computation with applications in population genetics

Sedki, Mohammed Amechtoh 31 October 2012
This thesis consists of two parts that can be read independently. The first part concerns the Adaptive Multiple Importance Sampling (AMIS) algorithm of Cornuet et al. (2012), which provides a significant improvement in stability and effective sample size thanks to its recycling procedure. These numerical properties are particularly well suited to the Bayesian paradigm in population genetics, where the models involve a large number of parameters. However, the consistency of the AMIS estimator has remained largely open. In this work, we propose a novel adaptive multiple importance sampling scheme, a slight modification of the Cornuet et al. (2012) proposal that preserves the above-mentioned improvements, and, using limit theorems on triangular arrays of conditionally independent random variables, we prove a consistency result for the final particle system returned by the new scheme. Simulations show that its numerical performance matches that of the original scheme. The second part of the thesis lies within the ABC (Approximate Bayesian Computation) paradigm, in which evaluation of the likelihood is abandoned and only simulation from the model is assumed possible. ABC has been used successfully in population-genetics models to bypass the calculation of the likelihood: such algorithms approximate the posterior by comparing the observed dataset to datasets simulated from the model. Although parallelization is easily achieved, the computation time needed to reach a suitable approximation quality of the posterior distribution remains long. To alleviate this issue, we propose a sequential algorithm adapted from Del Moral et al. (2012) that runs twice as fast as traditional ABC algorithms; its parameters are calibrated to minimize the number of simulations from the model, and its acceptance levels are chosen by a self-calibrated mechanism. We apply the algorithm to infer the parameters of a real, complex evolutionary scenario in population genetics, and show that, for the same approximation quality, it requires half as many simulations as the commonly used ABC rejection method.
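For readers unfamiliar with ABC, the plain rejection version that the proposed sequential algorithm improves upon can be sketched in a few lines, here on a toy model where the "simulator" is just a normal sampler; real population-genetics simulators would replace the simulate function.

```python
import numpy as np

rng = np.random.default_rng(5)

# Observed data summarized by a statistic; a toy normal-mean problem
# stands in for a population-genetics model.
obs = rng.normal(2.0, 1.0, size=100)
s_obs = obs.mean()                      # summary statistic

def simulate(theta, n=100):
    """Stand-in for simulating a dataset under the model: the likelihood
    is never evaluated, only sampled from."""
    return rng.normal(theta, 1.0, size=n).mean()

# Plain ABC rejection: draw theta from the prior, keep it if the simulated
# summary lands within epsilon of the observed one. Sequential schemes like
# Del Moral et al. (2012) instead shrink epsilon adaptively over rounds.
n_draws, epsilon = 50000, 0.05
theta = rng.uniform(-5, 5, size=n_draws)           # flat prior (assumed)
s_sim = np.array([simulate(t) for t in theta])
accepted = theta[np.abs(s_sim - s_obs) < epsilon]
print(f"acceptance rate {accepted.size / n_draws:.3%}, "
      f"posterior mean ~ {accepted.mean():.3f}")
```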
120

Statistical inference in methods of functional magnetic resonance imaging analysis

Cabella, Brenno Caetano Troca 11 April 2008
In the present work, concepts of statistical inference are used to apply and compare different methods for analyzing functional magnetic resonance imaging signals. The central idea is to obtain the probability distribution of the random variable of interest for each method studied, under different values of the signal-to-noise ratio (SNR). This is achieved through numerical simulations of the hemodynamic response function (HRF) with additive Gaussian noise. The procedure allows us to assess the sensitivity and specificity of the methods by constructing receiver operating characteristic (ROC) curves for different values of the SNR. Under specific experimental conditions, we apply classical methods of analysis (Student's t test and correlation), information measures (the Kullback-Leibler distance and its generalized form), and a Bayesian method (the independent pixel method). In particular, we show that the Kullback-Leibler distance D (or relative entropy) and its generalized form are useful measures for signal analysis within the information-theoretic setting. These entropies are used as measures of the "distance" between the probability functions p1 and p2 of the signal levels related to stimulus and rest. To prevent divergent values of D, we introduce a small parameter d into the definitions of p1 and p2. We extend the analysis with an original study of the generalized Kullback-Leibler distance Dq (where q is the Tsallis parameter); in this case, an appropriate choice in the range 0 < q < 1 ensures that Dq is finite. We obtain the probability densities f(D) and f(Dq) of the sample means of the variables D and Dq, computed over the N epochs of the entire experiment. For small values of N (N < 30), we show that f(D) and f(Dq) are very well approximated by Gamma distributions (chi^2 < 0.0009). We then study the Bayesian independent pixel method, treating the posterior probability as a random variable and obtaining its distribution for various SNRs and prior probabilities. The simulation results indicate that correlation and the independent pixel method outperform the other methods employed (for SNR > -20 dB). However, the t test and the entropic methods share the advantage of not requiring a model of the HRF when analyzing real data. Finally, for the different methods, we obtain functional maps from real data series of an asymptomatic volunteer submitted to an event-related motor stimulus, which show activation in the primary and secondary motor areas of the brain. We emphasize that the procedure adopted in this study can, in principle, be applied to other methods and under different experimental conditions.
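The regularized Kullback-Leibler distance and its Tsallis generalization described above are straightforward to compute from histogram estimates. A sketch follows, using one common convention for Dq (the exact normalization in the thesis may differ) and invented stimulus/rest signal levels.

```python
import numpy as np

def regularized_hist(x, bins, d=1e-6):
    """Histogram-based probability estimate with a small offset d added to
    every bin, as in the text, so that D never diverges."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts.astype(float) + d
    return p / p.sum()

def kl(p1, p2):
    """Kullback-Leibler distance D = sum p1 log(p1 / p2)."""
    return np.sum(p1 * np.log(p1 / p2))

def tsallis_kl(p1, p2, q=0.5):
    """Generalized (Tsallis) Kullback-Leibler distance D_q, in one common
    convention: finite for 0 < q < 1 and recovering D as q -> 1."""
    return (1.0 - np.sum(p1**q * p2**(1.0 - q))) / (1.0 - q)

# Toy "stimulus" vs "rest" signal levels for one voxel
rng = np.random.default_rng(6)
stim = rng.normal(1.0, 1.0, size=500)   # HRF level plus gaussian noise
rest = rng.normal(0.0, 1.0, size=500)
bins = np.linspace(-4, 5, 40)
p1, p2 = regularized_hist(stim, bins), regularized_hist(rest, bins)
print("D   =", kl(p1, p2))
print("D_q =", tsallis_kl(p1, p2, q=0.5))
```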
