  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Från serum till plasma : Jämförelse av paratyroideahormon samt kortisol på Cobas 8000® / From serum to plasma: a comparison of parathyroid hormone and cortisol on the Cobas 8000®

Mellberg, Maline January 2022 (has links)
Parathyroid hormone (PTH) and cortisol are two analytes currently analysed in serum by electrochemiluminescence on the Cobas 8000® instrument at the Department of Clinical Chemistry, Kalmar County Hospital. A transition to analysing them in plasma is planned. The aim of this project was to compare the values obtained in serum and plasma for both analytes and to examine the correlation between them. In parallel, the repeatability of the methods and the possible influence of unforeseeable factors on the results were evaluated in a total-imprecision study. PTH is a peptide hormone that regulates calcium homeostasis in the body; dysfunction in PTH production can lead to osteoporosis, cardiac arrhythmias, muscle cramps and mental disturbances. Cortisol is a vital steroid hormone that affects metabolism and secures the energy supply of the central nervous system in situations of high stress. Both analyses are performed routinely as well as acutely, and a transition to plasma would above all save time and shorten turnaround times. The comparison of serum and plasma was made with 15 patient samples, for which one serum tube and one plasma tube had been drawn at the same sampling occasion. This yielded a coefficient of determination (r²) of 0.9988 for PTH and 0.9993 for cortisol. Total imprecision was estimated by analysing two serum controls for PTH and cortisol in six replicates per day over five days. The coefficient of variation (CV) for the two PTH serum controls was 3.9% for IM2 and 3.1% for IM3; for the cortisol serum controls, the CV was 2.1% for IM1 and 1.3% for IM2. A CV below 5% was targeted for the methods. The conclusion was that the correlation between serum and plasma was very good, and that the methods showed good stability and repeatability. / Parathyroid hormone (PTH) and cortisol are hormones involved in the regulation of important mechanisms in the human body. PTH regulates calcium homeostasis, and dysfunction in PTH production and release can lead to osteoporosis and arrhythmia. Cortisol is involved in metabolism and assures energy supply to the central nervous system in times of physiological stress. The concentrations of these analytes are determined clinically by electrochemiluminescence on the immunochemical assay platform Cobas 8000®. At the clinical chemistry laboratory in Kalmar the assays are performed on serum samples, but changing to plasma assays would be more efficient and less time-consuming. The aim of this project was to compare measured concentrations of PTH and cortisol in samples obtained from serum and plasma, respectively, and to determine the correlation between the results. The analytical precision was also evaluated to assess the influence of unpredictable events on the results. The comparison between serum and plasma was performed by analysing 15 patient samples; each serum and plasma sample was taken at the same time and from the same patient. The coefficient of determination was 0.9988 for PTH and 0.9993 for cortisol. To evaluate analytical precision, two different serum controls for each analyte were assayed in six replicates per day for five days, giving 30 replicates per control in total. The coefficient of variation (CV) for the PTH controls was 3.9% for IM2 and 3.1% for IM3; the CV for cortisol was 2.1% for IM1 and 1.3% for IM2. The conclusion of this project was that there was a strong correlation between serum and plasma, and that both assays had good precision and repeatability.
2

Laboratórios via sistema tradicional e espectroscopia de reflectância: avaliação da qualidade analítica dos atributos do solo / Laboratories in the traditional system and reflectance spectroscopy: evaluation of analytical quality of soil attributes

Bedin, Luis Gustavo 26 August 2016 (has links)
Soil analysis is considered an essential tool for liming recommendation, fertilisation and soil management. However, with the growing demand for food and the need for a sustainable increase in agricultural productivity, it is essential to keep improving the quality, cost and turnaround time of these analyses. In this respect, remote sensing techniques, at laboratory, field, airborne and orbital scales, offer advantages, especially for the assessment of large areas. The quality of laboratory determinations is fundamental for soil management recommendations, which raises the question of the degree of analytical variability between different laboratories and between quantifications via reflectance spectroscopy. The objective was to evaluate the uncertainties associated with soil analysis determinations and how they can affect spectral prediction models (350-2,500 nm), in order to understand the advantages and limitations of the methodologies and thus allow better-informed soil management decisions. Soil samples under extensive sugarcane cultivation were collected in 29 municipalities in the state of São Paulo. For sampling, 48 profiles of approximately 1.5 m depth were opened, and roughly 10 kg of soil was taken from each profile at the depths 0-0.2 and 0.8-1.0 m, totalling 96 primary samples. The chemical attributes analysed were: pH, organic matter (OM), resin-extracted phosphorus (P), exchangeable potassium (K+), exchangeable calcium (Ca2+), exchangeable magnesium (Mg2+), exchangeable aluminium (Al3+), potential acidity (H + Al), sum of exchangeable bases (SB), cation exchange capacity (CEC), base saturation (V%) and Al3+ saturation (m%). As for particle-size determinations, the sand, silt and clay fractions were analysed. Four spectroradiometers (350-2,500 nm) were used to obtain the reflectance spectra. The variation in liming recommendations from different laboratories was also evaluated; laboratories were assessed using imprecision and inaccuracy indices. The determinations with the largest errors, in descending order and considering the average of all laboratories, were m%, Al3+, Mg2+ and P. These errors significantly influenced the calibration of the sensor-based prediction models, and the analytical uncertainties could often change the liming recommendation: for one of the laboratories studied, the recommendation error exceeded 1 t ha-1. The prediction models calibrated with the data of the laboratory with the fewest errors reached R² above 0.7 and RPD above 1.8 for OM, Al, CEC, H+Al, sand, silt and clay. The methodology employed made it possible to quantify the acceptable level of uncertainty in laboratory determinations and to evaluate how laboratory analytical errors influenced the sensor predictions. Reflectance spectroscopy proves to be an efficient complementary alternative to traditional soil analysis methods. / Soil analysis is an essential tool for liming recommendation, fertilisation and soil management. Considering the increasing demand for food and the need for a sustainable increase in agricultural productivity, it is essential to promote the quality of soil analysis while reducing the cost and time required to obtain results. In this sense, remote sensing techniques, at laboratory, field, aerial and orbital levels, have advantages, especially for the assessment of large areas. The quality of laboratory measurements is critical for soil management recommendations, which makes it important to question the degree of analytical variability between different laboratories and between measurements via reflectance spectroscopy. This study aimed to evaluate the uncertainties related to traditional soil analysis and how they can affect spectral prediction models (350-2500 nm), in order to understand the advantages and limitations of both methodologies and support proper decision-making for soil management. Soil samples under extensive sugarcane cultivation were collected from 29 municipalities in the state of São Paulo. For soil sampling, 48 soil profiles were opened to a depth of approximately 1.5 m, and 10 kg of soil was collected at the depths 0-0.2 and 0.8-1.0 m, resulting in 96 primary samples. The chemical attributes considered were: pH, organic matter (OM), phosphorus (P), exchangeable potassium (K+), exchangeable calcium (Ca2+), exchangeable magnesium (Mg2+), exchangeable aluminium (Al3+), potential acidity (H + Al), total exchangeable bases (SB), cation exchange capacity (CEC), base saturation (V%) and Al3+ saturation (m%). Regarding particle-size measurements, the sand, silt and clay fractions were analysed. Four spectroradiometers (350-2500 nm) were used to obtain the reflectance spectra. Variations in liming recommendations between laboratories were also evaluated; laboratories were assessed using imprecision and inaccuracy indices. The soil attributes with the highest errors in the traditional analysis, based on the average of all laboratories, were, in descending order, m%, Al3+, Mg2+ and P. These errors significantly influenced the calibration of the sensor prediction models, and the analytical uncertainties can often influence liming recommendations: one of the laboratories presented recommendation errors greater than 1 t ha-1. The prediction models calibrated with the data of the laboratory with the fewest errors presented R² greater than 0.7 and RPD greater than 1.8 for OM, Al3+, CEC, H + Al, sand, silt and clay. The methodology allowed the quantification of the acceptable level of uncertainty in laboratory measurements and the evaluation of how laboratory analytical errors influenced the sensor predictions. Reflectance spectroscopy is an efficient complementary alternative to traditional soil analysis methods.
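The two figures of merit used here for the spectral models, R² and RPD (ratio of performance to deviation, the standard deviation of the reference values divided by the prediction RMSE), can be computed as below. The clay values are invented, purely to show the calculation:

```python
import math
import statistics

def rmse(y_ref, y_pred):
    """Root mean squared error of predictions against laboratory reference values."""
    return math.sqrt(sum((r - p) ** 2 for r, p in zip(y_ref, y_pred)) / len(y_ref))

def rpd(y_ref, y_pred):
    """Ratio of performance to deviation: SD of reference values over RMSE."""
    return statistics.stdev(y_ref) / rmse(y_ref, y_pred)

# hypothetical clay contents (g/kg): lab reference vs. spectral prediction
clay_ref  = [120, 250, 310, 420, 540]
clay_pred = [135, 240, 300, 445, 525]

print(f"RMSE = {rmse(clay_ref, clay_pred):.1f} g/kg, RPD = {rpd(clay_ref, clay_pred):.2f}")
```

By the thresholds reported above, a model is acceptable when R² > 0.7 and `rpd(...) > 1.8`.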
4

Modèle d'estimation de l'imprécision des mesures géométriques de données géographiques / A model to estimate the imprecision of geometric measurements computed from geographic data.

Girres, Jean-François 04 December 2012 (has links)
Many GIS applications rely on length or area measurements computed from the geometry of the objects of a geographic database (route calculations or population density maps, for example). However, no information on the imprecision of these measurements is currently communicated to the user. Indeed, most of the proposed indicators of geometric accuracy concern positioning errors of objects, not measurement errors, although the latter are very frequent. In this context, this thesis seeks to develop methods for estimating the imprecision of geometric length and area measurements, in order to inform the user in a decision-support setting. To this end, we propose a model for estimating the impacts of representation rules (cartographic projection, disregard of terrain, polygonal approximation of curves) and production processes (digitising error and cartographic generalisation) on geometric length and area measurements, as a function of the characteristics of the vector data being evaluated and of the terrain these data describe. Methods for acquiring knowledge about the evaluated data are also proposed, to help the user parameterise the model. Combining the impacts to produce a global estimate of measurement imprecision remains a complex problem, and we propose first approaches to bounding this cumulative error. The proposed model is implemented in the EstIM (Estimation de l'Imprécision des Mesures) prototype. / Many GIS applications are based on length and area measurements computed from the geometry of the objects of a geographic database (such as route planning or maps of population density, for example). However, no information concerning the imprecision of these measurements is currently communicated to the final user. Indeed, most indicators of geometric quality focus on positioning errors, not on measurement errors, which are nevertheless very frequent. In this context, this thesis seeks to develop methods for estimating the imprecision of geometric measurements of length and area, in order to inform a user for decision support. To achieve this objective, we propose a model to estimate the impacts of representation rules (cartographic projection, terrain, polygonal approximation of curves) and production processes (digitising error, cartographic generalisation) on geometric measurements of length and area, according to the characteristics and the spatial context of the evaluated objects. Methods for acquiring knowledge about the evaluated data are also proposed to facilitate the parameterisation of the model by the user. The combination of impacts to produce a global estimation of the imprecision of measurement is a complex problem, and we propose approaches to bound the cumulative error. The proposed model is implemented in the EstIM prototype (Estimation de l'Imprécision des Mesures).
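One of the representation rules named in this abstract, polygonal approximation of curves, produces a systematic (not random) shortening of measured lengths. A toy computation on a unit circle, not taken from the thesis, shows the bias and how it shrinks as vertices are added:

```python
import math

def polyline_length(radius, n_segments):
    """Perimeter of the regular n-segment polyline inscribed in a circle:
    each chord has length 2*r*sin(pi/n)."""
    return n_segments * 2 * radius * math.sin(math.pi / n_segments)

true_length = 2 * math.pi  # circumference of the unit circle

for n in (8, 32, 128):
    err = 100 * (true_length - polyline_length(1.0, n)) / true_length
    print(f"{n:4d} segments: {err:.3f}% underestimation")
```

The approximation always underestimates the true length, which is exactly the kind of systematic measurement error, distinct from positioning error, that the thesis models.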
5

Estimating measurement uncertainty in the medical laboratory

Placido, Rui January 2016 (has links)
Medical laboratory accreditation is covered by ISO 15189:2012 - Medical Laboratories — Requirements for Quality and Competence. In Portugal, accreditation processes are held under the auspices of the Portuguese Accreditation Institute (IPAC), which applies the Portuguese edition (NP EN ISO 15189:2014). Accordingly, medical laboratory accreditation processes now require an estimate of the measurement uncertainty (MU) associated with reported results. The Guide to the Expression of Uncertainty in Measurement (GUM) describes the calculation of MU but does not contemplate the specific aspects of medical laboratory testing; several models have been advocated, yet without a final consensus. Given the lack of studies on MU in Portugal, especially on its application in the medical laboratory, the objective of this thesis is to arrive at a model that fulfils IPAC's accreditation regulations with regard to this specific requirement. The study was based on the implementation of two formulae (MU-A and MU-B), using Quality Management System (QMS) data from an ISO 15189-accredited laboratory. Covering the laboratory's two Cobas® 6000–c501 (Roche®) analysers (C1 and C2), the work focused on three analytes: creatinine, glucose and total cholesterol. The MU-B model formula, combining the standard uncertainties of the method's imprecision, of the calibrator's assigned value and of the pre-analytical variation, was considered the one best fitting the laboratory's objectives and the study's purposes, representing well the dispersion of values reasonably attributable to the final result for the measurand. Expanded uncertainties were: creatinine - C1 = 9.60%, C2 = 5.80%; glucose - C1 = 8.32%, C2 = 8.34%; cholesterol - C1 = 4.00%, C2 = 3.54%. ...[cont.].
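The MU-B combination described above follows the GUM's root-sum-of-squares rule for independent standard uncertainties, expanded with a coverage factor. A minimal sketch with hypothetical relative uncertainties (in %), not the laboratory's actual figures:

```python
import math

def expanded_uncertainty(u_imprecision, u_calibrator, u_preanalytical, k=2.0):
    """Combine independent standard uncertainties in quadrature and expand
    with coverage factor k (k = 2 gives approximately 95% coverage)."""
    u_combined = math.sqrt(u_imprecision ** 2 + u_calibrator ** 2 + u_preanalytical ** 2)
    return k * u_combined

# hypothetical inputs for one analyte, all as relative standard uncertainties in %
U = expanded_uncertainty(u_imprecision=3.0, u_calibrator=1.0, u_preanalytical=1.0)
print(f"Expanded uncertainty U = {U:.2f}%")
```

The reported figures (e.g. creatinine C1 = 9.60%) are expanded uncertainties of this form; the three input terms used here are assumptions for illustration.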
6

Analyse formelle de concepts et fusion d'informations : application à l'estimation et au contrôle d'incertitude des indicateurs agri-environnementaux / Formal concept analysis and information fusion : application on the uncertainty estimation of environmental indicator

Assaghir, Zainab 12 November 2010 (has links)
Information fusion consists in summarising several pieces of information coming from different sources into a single piece of information that is exploitable and useful to the user. The fusion problem is delicate, especially when the delivered pieces of information are inconsistent and heterogeneous. Fusion results are often not usable for decision-making when they are imprecise, which is generally due to inconsistency between the pieces of information. Several fusion methods have been proposed for combining imperfect information; they apply the fusion operator to the set of all sources and take the result as it is. In this work, we propose a fusion method based on Formal Concept Analysis, in particular its extension to numerical data: pattern structures. This method associates each subset of sources with its fusion result. Once a fusion operator is chosen, a concept lattice is built; this lattice provides an interesting classification of the sources and their fusion results, and it keeps track of the origin of the information. When the global fusion result is imprecise, the method enables the user to identify the maximal subsets of sources that support a good decision. The method thus provides a structured view of the global fusion applied to all sources, together with partial fusion results labelled by subsets of sources. In this work, we considered numerical information represented in the framework of possibility theory and used three kinds of operators to build the concept lattice. An application in agriculture, where the expert's problem is to estimate pesticide characteristic values coming from several sources in order to compute environmental indicators, is detailed to evaluate the proposed fusion method. / Merging pieces of information into an interpretable and useful format is a tricky task even when an information fusion method is chosen. Fusion results may not be in a suitable form for decision analysis; this is generally because information sources are heterogeneous and provide inconsistent information, which may lead to imprecise results. Several fusion operators have been proposed for combining uncertain information; they apply the fusion operator to the set of all sources and provide the resulting information. In this work, we studied and proposed a method to combine information using Formal Concept Analysis, in particular pattern structures. This method allows us to associate any subset of sources with its information fusion result. Once a fusion operator is chosen, a concept lattice is built. The concept lattice gives an interesting classification of fusion results and keeps track of the information's origin. When the global fusion result is too imprecise, the method enables users to identify which maximal subsets of sources would support a more precise and useful result. Instead of providing a unique fusion result, the method yields a structured view of partial results labelled by subsets of sources. In this thesis, we studied numerical information represented in the framework of possibility theory and used three fusion operators to build the concept lattice. We applied the method in the context of agronomy, where experts have to estimate several characteristic values coming from several sources in order to compute environmental risk indicators.
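The idea of retrieving maximal subsets of sources that still agree can be illustrated with interval-valued estimates fused by intersection (a conjunctive operator). The sources and values below are invented, and this sketches only the subset search, not the pattern-structure lattice itself:

```python
from itertools import combinations

def intersect(intervals):
    """Conjunctive fusion of interval estimates: their intersection,
    or None when the sources are inconsistent."""
    lo = max(a for a, _ in intervals)
    hi = min(b for _, b in intervals)
    return (lo, hi) if lo <= hi else None

# hypothetical estimates of one pesticide characteristic from three sources
sources = {"s1": (1.0, 3.0), "s2": (2.0, 4.0), "s3": (3.5, 5.0)}

# subsets of sources whose fused result is non-empty...
consistent = [set(c)
              for r in range(1, len(sources) + 1)
              for c in combinations(sources, r)
              if intersect([sources[s] for s in c]) is not None]
# ...kept only when they cannot be extended with another source
maximal = [s for s in consistent if not any(s < t for t in consistent)]
print(maximal)
```

Here all three sources together are inconsistent, but {s1, s2} and {s2, s3} each support a precise fused interval, which is the kind of structured partial result the lattice exposes.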
7

The Impact of Imprecision in HCV Viral Load Test Results on Clinicians’ Therapeutic Management Decisions and on the Economic Value of the Test

Madej, Roberta M. 01 January 2013 (has links)
Clinical laboratory test results are integral to patient management. Important aspects of laboratory tests' contribution are how the test information is used and the role tests play in facilitating efficient and effective use of healthcare resources. Methods of measuring those contributions were examined using quantitative HCV RNA test results (HCV VL) in therapeutic management decisions as a model. Test precision is important in those decisions; therefore, clinical use was evaluated by studying the impact that knowledge of inherent assay imprecision had on clinicians' decisions. A survey describing a simulated patient at a decision point for HCV triple-combination therapy management was sent to 1491 hepatology clinicians. Participants saw HCV RNA results at five different levels and were asked to choose to continue therapy, discontinue therapy, or repeat the test. Test results were presented both with and without their 95% confidence intervals (CIs); three of the VLs had CIs that overlapped the therapeutic decision level. Participants saw both sets of results in random order. Demographics and practice preferences were also surveyed. One hundred and thirty-eight responses were received. Adherence to clinical guidelines was demonstrated in self-reported behaviors and in most decisions; however, participants chose to repeat the test up to 37% of the time. Knowledge of assay imprecision did not have a statistically significant effect on clinicians' decisions. To determine economic value, an analytic decision-tree model was developed. Transition probabilities, costs, and quality-of-life values were derived from published literature, and survey respondents' decisions were used as model inputs. Across all HCV VL levels, the calculated test value was approximately $2600, with up to $17,000 in treatment-related cost savings per patient at higher HCV VLs. The test value prevailed regardless of the presence or absence of CIs, and despite repeat testing.
The calculated value in cost savings per patient was up to 100 times the investment in HCV VL testing. Laboratory tests are investments in the efficient use of healthcare resources, and proper interpretation and use of their information is integral to that value. This type of analysis can inform institutional decisions and higher-level policy discussions.
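The survey's core situation, a viral-load result whose 95% CI overlaps the therapeutic decision level, reduces to a one-line check. The decision level and CI values below are hypothetical, chosen only to illustrate the overlap condition:

```python
def ci_straddles_threshold(ci_low, ci_high, threshold):
    """True when the 95% confidence interval contains the decision level,
    i.e. the result cannot be analytically distinguished from the cut-off."""
    return ci_low <= threshold <= ci_high

# hypothetical HCV RNA decision level of 1000 IU/mL
print(ci_straddles_threshold(880.0, 1150.0, 1000.0))   # CI overlaps the cut-off
print(ci_straddles_threshold(1500.0, 1900.0, 1000.0))  # result clearly above it
```

A result in the first situation is the one the study's respondents most often chose to repeat, even though repeating a result within known assay imprecision adds cost without adding information.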
8

Méthode non-additive intervalliste de super-résolution d'images, dans un contexte semi-aveugle / A non-additive interval-valued super-resolution image method, in a semi-blind context

Graba, Farès 17 April 2015 (has links)
Super-resolution is an image-processing technique that reconstructs a high-resolution image from one or several low-resolution images. It appeared in the 1980s as an attempt to artificially increase image resolution and thus to overcome, algorithmically, the physical limits of image sensors. Like many reconstruction techniques in image processing, super-resolution is known to be an ill-posed problem whose numerical resolution is ill-conditioned. This ill-conditioning makes the quality of the reconstructed high-resolution images very sensitive to the choice of the image acquisition model, and particularly to the modelling of the imager's point spread function (PSF). In the panorama of super-resolution methods that we draw up, we show that none of the methods proposed in the literature properly models the fact that an imager's PSF is, at best, imprecisely known: at best, the deviation between model and reality is modelled as a random variable, whereas this bias is systematic. We propose to model the imprecise knowledge of the PSF by a convex set of PSFs. Using such a model calls the usual resolution techniques into question, and we propose to adapt one of the most popular classical techniques, known as iterative back-projection, to this imprecise representation. The reconstructed super-resolved image is interval-valued, i.e. the value associated with each pixel is a real interval. This reconstruction proves robust to the modelling of the PSF as well as to other defects, and the width of the obtained intervals quantifies the reconstruction error.
/ Super-resolution is an image processing technique that involves reconstructing a high-resolution image from one or several low-resolution images. This technique appeared in the 1980s in an attempt to artificially increase image resolution and therefore to overcome, algorithmically, the physical limits of an imager. Like many reconstruction problems in image processing, super-resolution is known to be an ill-posed problem whose numerical resolution is ill-conditioned. This ill-conditioning makes high-resolution image reconstruction quality very sensitive to the choice of image acquisition model, particularly to the model of the imager's point spread function (PSF). In the panorama of super-resolution methods that we draw, we show that none of the methods proposed in the relevant literature allows properly modelling the fact that the imager's PSF is, at best, imprecisely known. At best the deviation between model and reality is considered a random variable, while it is not: the bias is systematic. We propose to model imprecise knowledge of the imager's PSF by a convex set of PSFs. The use of such a model challenges the classical inversion methods. We propose to adapt one of the most popular super-resolution methods, known under the name of "iterative back-projection", to this imprecise representation. The super-resolved image reconstructed by the proposed method is interval-valued, i.e. the value associated with each pixel is a real interval. This reconstruction turns out to be robust to the PSF model and to some other errors. It also turns out that the width of the obtained intervals quantifies the reconstruction error.
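The interval-valued output can be previewed with a far simpler construction than the thesis's interval back-projection: blur a 1-D signal with several PSFs sampled from a convex set and keep the per-pixel min/max envelope. Everything below is an illustrative toy under that assumption, not the actual algorithm:

```python
def convolve_same(signal, kernel):
    """Zero-padded 'same' convolution of a 1-D signal with a small kernel."""
    k = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = i + j - k
            if 0 <= idx < len(signal):
                acc += w * signal[idx]
        out.append(acc)
    return out

signal = [0, 0, 1, 0, 0]                                 # a single bright pixel
psf_set = [[0.2, 0.6, 0.2], [0.1, 0.8, 0.1], [0.3, 0.4, 0.3]]  # sampled convex set of PSFs

images = [convolve_same(signal, psf) for psf in psf_set]
# per-pixel envelope: the interval-valued image
interval_image = [(min(col), max(col)) for col in zip(*images)]
print(interval_image)
```

Pixels where the candidate PSFs disagree get wide intervals, which mirrors the thesis's observation that interval width quantifies the reconstruction error.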
9

Imprecisão na estimação orçamentária dos municípios brasileiros / Imprecision in budgeting estimating in Brazilian municipalities

Azevedo, Ricardo Rocha de 15 January 2014
A pesquisa analisou o grau de imprecisão orçamentária dos municípios brasileiros e sugeriu fatores que estariam associados à imprecisão. A importância da análise da precisão do orçamento é reconhecida por organismos internacionais como o Banco Mundial e a OCDE, que têm desenvolvido mecanismos de acompanhamento da qualidade do orçamento público. O orçamento público é o instrumento de estimação e alocação de recursos em ações que foram priorizadas pelos agentes da administração pública para concretizar sua plataforma de governo proposta na campanha. Assim, o orçamento sinaliza aos cidadãos as políticas públicas propostas na campanha, assim como as ações específicas que serão futuramente executadas. Além disso, o orçamento fornece importantes informações sobre o nível de endividamento e a proporção de investimentos do município. A imprecisão na estimação de receitas e despesas no orçamento distorce a alocação planejada, colocando em risco a execução do plano, e também reduz a capacidade do próprio governo de planejar as suas ações. A falta de incentivos para buscar a precisão, dada a baixa cobrança pelos órgãos de controle externo e pelos mecanismos de controle social, pode levar a erros e à baixa atenção ao processo orçamentário nos municípios. A literatura anterior tem concentrado esforços em estudar a transparência, a participação popular e técnicas de previsão das receitas, mas pouco tem tratado o processo de alocação de recursos. Os resultados da pesquisa mostram que (i) o controle legislativo tem alguma associação com a diminuição da imprecisão do orçamento em municípios nos quais o Prefeito não tem a maioria da Câmara; (ii) o controle externo não possui relação com a imprecisão. / The research examined the degree of budget imprecision in Brazilian municipalities and suggested factors that may be associated with it.
The importance of analyzing budget accuracy is recognized by international bodies such as the World Bank and the OECD, which have developed mechanisms to monitor the quality of public budgets. The public budget is the instrument for estimating and allocating resources to actions that have been prioritized by public administrators to implement the platform of government proposed in their campaign. Thus, the budget signals to citizens the public policies proposed in the campaign, as well as the specific actions that will be implemented in the future. In addition, the budget provides important information about the municipality's level of debt and proportion of investments. Imprecision in estimating revenues and expenses in the budget distorts the planned allocation, endangering the implementation of the plan, and also reduces the government's own ability to plan its actions. The lack of incentives to seek accuracy, given the weak oversight by external control bodies and by mechanisms of social control, can lead to errors and to low attention to the budgetary process in municipalities. The previous literature has focused on studying transparency, popular participation, and revenue forecasting techniques, but has paid little attention to the process of resource allocation. The results show that (i) legislative control has some association with a decrease in budget imprecision in municipalities where the mayor does not hold a majority of the Council; (ii) external control has no relationship with imprecision.
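The thesis's exact measure is not reproduced here, but a degree of budget imprecision of the kind studied is commonly quantified as the absolute percentage deviation between the estimated and the executed amount. A minimal sketch with hypothetical figures:

```python
def budget_imprecision(estimated, executed):
    """Absolute percentage deviation between the budgeted (estimated)
    amount and the actually executed amount."""
    return abs(executed - estimated) / estimated * 100.0

# Hypothetical revenue figures (estimated, executed) for three municipalities
figures = {"A": (100.0, 92.0), "B": (250.0, 251.0), "C": (80.0, 104.0)}
deviations = {name: budget_imprecision(est, exe)
              for name, (est, exe) in figures.items()}
# Municipality "C" deviates most: 30% versus 8% ("A") and 0.4% ("B")
```

Under this metric both over- and under-execution count as imprecision, which matches the idea that any large gap between plan and execution distorts the intended allocation.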
10

Development and Evaluation of Nonparametric Mixed Effects Models

Baverel, Paul January 2011
A nonparametric population approach is now accessible to a broader community of modelers given its recent implementation in the popular NONMEM application, previously limited in scope to standard parametric approaches for the analysis of pharmacokinetic and pharmacodynamic data. The aim of this thesis was to assess the relative merits and downsides of nonparametric models in a nonlinear mixed effects framework, in comparison with a set of parametric models developed in NONMEM based on real datasets and applied to simple experimental settings, and to develop new diagnostic tools adapted to nonparametric models. Nonparametric models as implemented in NONMEM VI showed better overall simulation properties and predictive performance than standard parametric models, with significantly less bias and imprecision in the outcomes of numerical predictive checks (NPC) from 25 real data designs. This evaluation was carried out through a simulation study comparing the relative predictive performance of nonparametric and parametric models across three different validation procedures assessed by NPC. The usefulness of a nonparametric estimation step in diagnosing distributional assumptions about parameters was then demonstrated through the development and application of two bootstrapping techniques aimed at estimating the imprecision of nonparametric parameter distributions. Finally, a novel covariate modeling approach intended for nonparametric models was developed, with good statistical properties for the identification of predictive covariates. In conclusion, by relaxing the classical normality assumption on the distribution of model parameters, and given the set of diagnostic tools developed, the nonparametric approach in NONMEM constitutes an attractive alternative to the routinely used parametric approach and an improvement for efficient data analysis.
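The bootstrapping idea used above to estimate the imprecision of nonparametric parameter distributions can be illustrated, outside NONMEM, by a generic nonparametric bootstrap of a summary statistic. All names and settings below are illustrative, not the thesis's implementation:

```python
import numpy as np

def bootstrap_se(samples, stat=np.mean, n_boot=2000, seed=42):
    """Nonparametric bootstrap: resample the data with replacement and
    report the statistic on the original sample together with its
    bootstrap standard error (a measure of imprecision)."""
    rng = np.random.default_rng(seed)
    samples = np.asarray(samples)
    reps = np.array([stat(rng.choice(samples, size=samples.size, replace=True))
                     for _ in range(n_boot)])
    return stat(samples), reps.std(ddof=1)
```

Because resampling makes no distributional assumption about the data, the same recipe applies whether the underlying estimate is parametric or, as in the thesis, a nonparametric distribution of support points.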
