  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
181

Abdominal aortic aneurysm inception and evolution - A computational model

Grytsan, Andrii January 2016
Abdominal aortic aneurysm (AAA) is characterized by a bulge in the abdominal aorta. AAA development is mostly asymptomatic, but the bulge may suddenly rupture, an event associated with a high mortality rate. Unfortunately, no medication can prevent an AAA from expanding or rupturing. Patients with a detected AAA are therefore monitored until an indication for treatment is reached, such as a maximum AAA diameter of 55 mm or an expansion rate of 1 cm/year. Models of AAA development may help to understand the disease progression and to inform decision-making on a patient-specific basis. AAA growth and remodeling (G&R) models are rather complex, and sound clinical validation is required before this challenge is undertaken. In Paper A, an existing thick-walled model of growth and remodeling of a one-layered AAA slice was extended to a two-layered model, which better reflects the layered structure of the vessel wall. A parameter study investigated how the mechanical properties and G&R parameters of such a model influence aneurysm growth. In Paper B, the model from Paper A was extended to an organ-level model of AAA growth and incorporated into a Fluid-Solid-Growth (FSG) framework; a patient-specific geometry of the abdominal aorta illustrates the model's capabilities. In Paper C, the evolution of patient-specific biomechanical characteristics of the AAA was investigated in four patients with five to eight Computed Tomography-Angiography (CT-A) scans at different time points; several non-trivial statistical correlations were found between the analyzed parameters. In Paper D, the effect of different growth kinematics on AAA growth was investigated: transversely isotropic in-thickness growth was the most suitable assumption, while fully isotropic growth and transversely isotropic in-plane growth produced unrealistic results. In addition, modeling the tissue volume change improved the wall-thickness prediction, but still overestimated thinning of the wall during aneurysm expansion.
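The three growth-kinematics assumptions compared in Paper D can be sketched as growth deformation-gradient tensors for a wall patch with unit normal n. The following is an illustrative reconstruction, not the thesis implementation; the factor g and all helper names are assumptions for the example. All three variants share the same volume change, det(F_g) = g, which is what makes them comparable.

```python
# Sketch of three growth-kinematics assumptions as growth tensors F_g.
# g is the prescribed volume growth factor, n the wall's unit normal.

def outer(a, b):
    return [[a[i] * b[j] for j in range(3)] for i in range(3)]

def add(A, B):
    return [[A[i][j] + B[i][j] for j in range(3)] for i in range(3)]

def scale(A, s):
    return [[s * A[i][j] for j in range(3)] for i in range(3)]

I = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]

def det3(A):
    return (A[0][0] * (A[1][1] * A[2][2] - A[1][2] * A[2][1])
            - A[0][1] * (A[1][0] * A[2][2] - A[1][2] * A[2][0])
            + A[0][2] * (A[1][0] * A[2][1] - A[1][1] * A[2][0]))

def growth_tensor(kind, g, n):
    """Growth tensor F_g; all three variants satisfy det(F_g) = g."""
    N = outer(n, n)                       # n (x) n, projects onto the normal
    P = add(I, scale(N, -1.0))            # in-plane projector I - n (x) n
    if kind == "isotropic":               # equal growth in all directions
        return scale(I, g ** (1.0 / 3.0))
    if kind == "in_plane":                # growth only tangent to the wall
        return add(scale(P, g ** 0.5), N)
    if kind == "in_thickness":            # growth only along the normal
        return add(P, scale(N, g))
    raise ValueError(kind)
```

With n along the z-axis and g = 2, the in-thickness variant yields diag(1, 1, 2): the wall thickens without in-plane stretch, which is the assumption Paper D found most realistic.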
182

Development of a probabilistic classification model for mapping snow cover in Hydro-Québec watersheds using passive microwave data

Teasdale, Mylène 09 1900
Every day, decisions must be made about the amount of hydroelectricity produced in Quebec. These decisions rest on forecasts of water inflow into the watersheds, produced with hydrological models. The models take several factors into account, notably the presence or absence of snow on the ground. This information is critical during the spring melt for anticipating future inflows, since between 30 and 40% of the flood volume can come from melting of the snow cover. Forecasters therefore need to monitor the snow cover daily in order to adjust their predictions to the melt. Methods for mapping snow on the ground are currently used at the Institut de recherche d'Hydro-Québec (IREQ), but they have some shortcomings. The main goal of this master's thesis is to use passive microwave remote-sensing data (the vertically polarized brightness temperature gradient, GTV) with a statistical approach to produce snow/no-snow maps and to quantify the classification uncertainty. To do so, the GTV was used to compute a daily probability of snow via Gaussian mixture models under Bayesian statistics. These probabilities were then modeled with linear regression on the logits, and snow cover maps were produced. The models' results were validated qualitatively and quantitatively, and their integration at Hydro-Québec was discussed.
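The Bayesian snow probability described above can be sketched with a minimal two-component Gaussian mixture: the GTV observation is assumed to follow one normal distribution under "snow" and another under "no snow", and Bayes' rule gives the daily probability. All parameter values here are invented for illustration; they are not the thesis's fitted values.

```python
from math import exp, pi, sqrt

def normal_pdf(x, mu, sigma):
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2.0 * pi))

def snow_probability(gtv, prior_snow=0.5,
                     mu_snow=-0.4, sigma_snow=0.15,
                     mu_bare=0.1, sigma_bare=0.2):
    """P(snow | GTV) from a two-component Gaussian mixture (toy values)."""
    like_snow = prior_snow * normal_pdf(gtv, mu_snow, sigma_snow)
    like_bare = (1.0 - prior_snow) * normal_pdf(gtv, mu_bare, sigma_bare)
    return like_snow / (like_snow + like_bare)
```

A subsequent logit-regression step, as in the thesis, would model log(p / (1 - p)) of these probabilities against covariates before mapping.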
183

Table tennis event detection and classification

Oldham, Kevin M. January 2015
It is well understood that multiple video cameras and computer vision (CV) technology can be used in sport for match officiating, statistics and player performance analysis. A review of the literature reveals a number of existing solutions, both commercial and theoretical, within this domain. However, these solutions are expensive and often complex in their installation. The hypothesis for this research states that by considering only changes in ball motion, automatic event classification is achievable with low-cost monocular video recording devices, without the need for 3-dimensional (3D) positional ball data and representation. The focus of this research is a rigorous empirical study of low-cost, single consumer-grade video camera solutions applied to table tennis, confirming that monocular CV-detected ball location data contains sufficient information to enable key match-play events to be recognised and measured. In total, a library of 276 event-based video sequences, using a range of recording hardware, was produced for this research. The research has four key considerations: i) an investigation into an effective recording environment with minimum configuration and calibration, ii) the selection and optimisation of a CV algorithm to detect the ball from the resulting single-source video data, iii) validation of the accuracy of the 2-dimensional (2D) CV data for motion change detection, and iv) the data requirements and processing techniques necessary to automatically detect changes in ball motion and match those to match-play events. Throughout the thesis, table tennis has been chosen as the example sport for observational and experimental analysis since it offers a number of specific CV challenges due to the relatively high ball speed (in excess of 100 km/h) and small ball size (40 mm in diameter). Furthermore, the inherent rules of table tennis show potential for a monocular event classification vision system.
As the initial stage, a proposed optimum location and configuration of the single camera is defined. Next, the selection of a CV algorithm is critical in obtaining usable ball motion data. It is shown in this research that segmentation processes vary in their ball detection capabilities and location outputs, which ultimately affects the ability of automated event detection and decision-making solutions. Therefore, a comparison of CV algorithms is necessary to establish confidence in the accuracy of the derived ball location. As part of the research, a CV software environment has been developed to allow robust, repeatable and direct comparisons between different CV algorithms. An event-based method of evaluating the success of a CV algorithm is proposed: algorithms are compared against the novel Efficacy Metric Set (EMS), producing a measurable Relative Efficacy Index (REI). Within the context of this low-cost, single-camera ball trajectory and event investigation, the experimental results show that the Horn-Schunck optical flow algorithm, with a REI of 163.5, is the most successful method when compared to a discrete selection of CV detection and extraction techniques gathered from the literature review. Furthermore, evidence from the REI also suggests switching to the Canny edge detector (REI of 186.4) for segmentation of the ball when in close proximity to the net. In addition to, and in support of, the data generated by the CV software environment, a novel method is presented for producing simultaneous data from 3D marker-based recordings, reduced to 2D and compared directly to the CV output to establish comparative time-resolved data for the ball location. It is proposed here that a continuous scale factor, based on the known dimensions of the ball, is incorporated at every frame. Using this method, comparison results show a mean accuracy of 3.01 mm when applied to a selection of nineteen video sequences and events.
This tolerance is within 10% of the diameter of the ball and attributable to the limits of image resolution. Further experimental results demonstrate the ability to identify a number of match-play events from a monocular image sequence using a combination of the suggested optimum algorithm and ball motion analysis methods. The results show a promising application of 2D CV processing to match-play event classification, with an overall success rate of 95.9%. The majority of failures occur when the ball, during returns and services, is partially occluded by either the player or the racket, an inherent problem of using a monocular recording device. Finally, the thesis proposes further research and extensions for developing and implementing monocular CV processing of motion-based event analysis and classification in a wider range of applications.
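The core idea above — classifying events from changes in 2D ball motion alone — can be sketched very simply: a table bounce reverses the vertical velocity component, while a racket return reverses the horizontal one. This is a hedged illustration in the spirit of the thesis, not its algorithm; the threshold and the two-event taxonomy are invented for the example.

```python
# Toy event detector over a 2D ball track (one (x, y) centre per frame,
# image coordinates with y increasing downwards).

def detect_events(track, min_dv=1.0):
    """Return (frame_index, event) pairs from velocity sign changes."""
    events = []
    for i in range(2, len(track)):
        vx_prev = track[i - 1][0] - track[i - 2][0]
        vy_prev = track[i - 1][1] - track[i - 2][1]
        vx = track[i][0] - track[i - 1][0]
        vy = track[i][1] - track[i - 1][1]
        # downward-then-upward motion suggests a table bounce
        if vy_prev > min_dv and vy < -min_dv:
            events.append((i, "bounce"))
        # a horizontal reversal suggests a racket return
        elif vx_prev * vx < 0 and abs(vx - vx_prev) > min_dv:
            events.append((i, "return"))
    return events
```

A real pipeline would first need the CV segmentation step discussed above to produce the track, and would handle the occlusion failures the thesis reports.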
184

Mixture of beta mixed models: a Bayesian approach

Zerbeto, Ana Paula 14 December 2018
Mixture models are very effective for analyzing data composed of different subpopulations with unknown allocations, or data exhibiting asymmetry, multimodality or kurtosis. This thesis proposes to combine the beta probability distribution and mixed-model techniques with the mixture-model methodology, so that the resulting models suit data that take values in a known, restricted interval and that also have a grouped or hierarchical structure. Linear beta mixed mixture models, with constant and with varying dispersion, were specified, as well as a nonlinear one. A Bayesian approach using Markov Chain Monte Carlo (MCMC) methods was adopted. Simulation studies were designed to evaluate the inferential results of these models with respect to the accuracy of point estimation of the parameters, the performance of information criteria in selecting the number of mixture components, and the identifiability diagnostics obtained with the data cloning algorithm. The performance of the models was very promising, mainly because of the good accuracy of the point estimates and the absence of evidence of lack of identifiability. Three real data sets from the areas of health, marketing and education were studied with the proposed techniques. Both the simulation studies and the applications to real data gave very satisfactory results, showing the usefulness of the developed models for the stated objectives as well as their potential for wider application. The presented methodology can also be applied and extended to other mixture models.
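The mixture-of-betas density underlying the models above can be sketched in the mean/precision parameterisation common in beta regression (mean mu in (0,1), precision phi > 0). This is a minimal illustration; the component weights and parameters below are made up, and the thesis's mixed (random-effects) structure and MCMC estimation are not shown.

```python
from math import exp, lgamma, log

def beta_pdf(y, mu, phi):
    """Beta density with mean mu and precision phi (a = mu*phi, b = (1-mu)*phi)."""
    a, b = mu * phi, (1.0 - mu) * phi
    return exp(lgamma(a + b) - lgamma(a) - lgamma(b)
               + (a - 1.0) * log(y) + (b - 1.0) * log(1.0 - y))

def mixture_pdf(y, weights, mus, phis):
    """Finite mixture of beta components; weights must sum to 1."""
    return sum(w * beta_pdf(y, m, p) for w, m, p in zip(weights, mus, phis))
```

In a regression setting, each mu (and possibly each phi) would be linked to covariates, e.g. through a logit link, before the mixture likelihood enters the MCMC sampler.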
185

Model-based clustering and model selection for binned data

Wu, Jingwen 28 January 2014
This thesis studies Gaussian mixture model-based clustering approaches and model selection criteria for clustering binned data. Fourteen binned-EM algorithms and fourteen bin-EM-CEM algorithms are developed for fourteen parsimonious Gaussian mixture models. These new algorithms combine the computation-time advantage of binning the data with the simplified parameter estimation of parsimonious Gaussian mixture models. The complexities of the binned-EM and bin-EM-CEM algorithms are calculated and compared to the complexities of the EM and CEM algorithms, respectively. To select the right model, one that fits the data well and satisfies the clustering precision requirements within a reasonable computation time, the AIC, BIC, ICL, NEC and AWE criteria are extended to binned-data clustering with the proposed binned-EM and bin-EM-CEM algorithms. The advantages of the different proposed methods are illustrated through experimental studies.
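The key move in binned-EM, as described above, is that the E-step works with bin probabilities (differences of the component CDFs over each bin) instead of individual observations. A minimal univariate two-component sketch follows; the bin edges, counts and parameters in the test are illustrative, not from the thesis, and the real algorithms are multivariate with fourteen parsimonious covariance structures.

```python
from math import erf, sqrt

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def bin_prob(lo, hi, mu, sigma):
    """Probability mass a Gaussian component places in bin [lo, hi]."""
    return normal_cdf(hi, mu, sigma) - normal_cdf(lo, mu, sigma)

def e_step(edges, counts, weights, mus, sigmas):
    """Expected per-bin counts attributed to each component (the E-step)."""
    out = []
    bins = zip(edges[:-1], edges[1:])
    for (lo, hi), n in zip(bins, counts):
        probs = [w * bin_prob(lo, hi, m, s)
                 for w, m, s in zip(weights, mus, sigmas)]
        total = sum(probs)
        out.append([n * p / total for p in probs])
    return out
```

The M-step would then re-estimate weights, means and variances from these expected counts, iterating to convergence; bin-EM-CEM would instead assign each bin entirely to its most probable component.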
186

Two-phase flows in gas-evolving electrochemical applications

Wetind, Ruben January 2001
No description available.
188

Alternative regression models to the beta distribution under a Bayesian approach

Paz, Rosineide Fernando da 25 August 2017
The beta distribution is a bounded-domain distribution that has dominated the modeling of random variables taking values between 0 and 1. Bounded-domain variables arise in many situations, such as rates, proportions and indices. Motivated by an analysis of electoral vote percentages, in which a distribution with support on the positive real numbers was used although a distribution with bounded support would be more suitable, we focus on alternatives to the beta distribution, with emphasis on regression models. In this work, we first present the Simplex mixture model as a flexible model for bounded random variables, and then extend it to regression models through the inclusion of covariates. Parameter estimation is discussed for both models under Bayesian inference. We apply the models to simulated data sets to investigate the performance of the estimators; the results were satisfactory in all cases investigated. Finally, we introduce a parameterization of the L-Logistic distribution for use in regression models and extend it to a mixture of mixed models. Funding: Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES).
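The Simplex distribution named above (an alternative to the beta on (0,1)) has a closed-form density in the standard dispersion-model form, with mean mu and dispersion sigma2. A minimal sketch follows, assuming the usual Barndorff-Nielsen/Jørgensen parameterisation; the parameter values in the example are illustrative only.

```python
from math import exp, pi, sqrt

def simplex_pdf(y, mu, sigma2):
    """Simplex density on (0,1) with mean mu and dispersion sigma2."""
    # unit deviance d(y; mu) of the Simplex dispersion model
    d = (y - mu) ** 2 / (y * (1.0 - y) * mu ** 2 * (1.0 - mu) ** 2)
    norm = sqrt(2.0 * pi * sigma2 * (y * (1.0 - y)) ** 3)
    return exp(-d / (2.0 * sigma2)) / norm
```

A Simplex mixture model, as in the thesis, would weight several such densities with different (mu, sigma2) pairs, with mu linked to covariates in the regression extension.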
189

Multipath impact on the performance of satellite navigation systems: contribution to the enhancement of location accuracy through Bayesian modeling

Nahimana, Donnay Fleury 19 February 2009
Most GNSS-based transport applications are employed in dense urban areas, and one reason for poor position accuracy there is the presence of obstacles (buildings and trees). Many solutions have been developed to decrease the impact of multipath on the accuracy and availability of GNSS systems. Integrating supplementary sensors into the localization system is one solution used to compensate for the lack of GNSS data. Such systems offer good accuracy but increase complexity and cost, which makes them inappropriate for equipping a large fleet of vehicles. This thesis proposes an algorithmic approach to enhance position accuracy in urban environments. The study is based on GNSS signals only, plus knowledge of the close reception environment from a 3D model of the navigation area. The method acts at the signal-filtering step of the process, using Sequential Monte Carlo techniques known as particle filters. Since the position error in urban areas is related to the satellite reception state (blocked, direct or reflected), information about the receiver environment is taken into account. A pseudorange error model is also proposed to fit the satellite reception conditions. In a first step, the reception state of each satellite is assumed to be known in the particle filter; a Markov chain, defined for a known vehicle trajectory, is used to determine the successive reception states of each signal. The states are then estimated using a Dirichlet distribution.
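The thesis's central idea, a particle filter whose measurement model depends on the satellite reception state, can be sketched with a toy scalar filter. Everything here is an assumption for illustration: the motion model, the bias and noise values, and the reduction to 1D positions (the real filter works on pseudoranges); a blocked satellite would simply contribute no measurement update.

```python
import random
from math import exp, pi, sqrt

random.seed(0)

ERROR_MODEL = {                 # (bias, std dev) of the measurement error
    "direct":    (0.0, 1.0),
    "reflected": (5.0, 4.0),    # multipath adds a positive bias
}

def gauss_pdf(x, mu, sigma):
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2.0 * pi))

def pf_step(particles, measurement, state):
    """One predict/weight/resample cycle for scalar positions."""
    bias, sigma = ERROR_MODEL[state]
    moved = [p + random.gauss(0.0, 0.5) for p in particles]       # predict
    # weight each particle by the state-dependent likelihood
    weights = [gauss_pdf(measurement - bias, p, sigma) for p in moved]
    total = sum(weights)
    weights = [w / total for w in weights]
    # resampling concentrates particles in high-likelihood regions
    return random.choices(moved, weights=weights, k=len(moved))
```

Feeding a reflected measurement of 15.0 through `pf_step(parts, 15.0, "reflected")` would correctly keep the estimate near 10, because the 5.0 multipath bias is subtracted before weighting — which is the point of conditioning the error model on the reception state.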
190

Evaluation of a health plan operator: a medical healthcare cooperative

Lumertz, José Antônio January 2011
Globalization, greater or lesser financial and political stability, and Brazil's current degree of economic development, which earned the country an investment-grade rating from Standard & Poor's on April 30, 2008, combined with the income level the population has been reaching, are turning Brazil into an attractive market for foreign capital. The regulation of the private health sector, begun with law 9656/98 and with the creation of the Agência Nacional de Saúde Suplementar (ANS) in 2000, has drawn first the curiosity and now the attention of major investors to this market segment. Medical and dental cooperatives hold an important share of this market, approximately 40% combined. Regulation has been driving a degree of concentration in the sector through the acquisition of portfolios or even entire companies, with deals reaching billions of Brazilian reais. Such transactions, however, run up against peculiarities of medical and dental cooperative societies that are at odds with this ease of dealmaking.
