111 |
Modeling the Point Spread Function Using Principal Component Analysis / Ragozzine, Brett A. 29 December 2008
No description available.
|
112 |
Multivariate Applications of Bayesian Model Averaging / Noble, Robert Bruce. 04 January 2001
The standard methodology when building statistical models has been to use one of several algorithms to systematically search the model space for a good model. If the number of variables is small, then all possible models or best-subset procedures may be used, but for data sets with a large number of variables a stepwise procedure is usually implemented. The stepwise procedure was designed for computational efficiency and is not guaranteed to find the best model with respect to any optimality criterion. While the model selected may not be the best in the model space, it is commonly almost as good as the best model. Often several models exist that are competitive with the best model in terms of the selection criterion, but classical model building dictates that a single model be chosen to the exclusion of all others. An alternative is Bayesian model averaging (BMA), which uses the information from all models, weighted by how well each is supported by the data.
Using BMA allows a variance component due to the uncertainty of the model selection process to be estimated. The variance of any statistic of interest is conditional on the model selected so if there is model uncertainty then variance estimates should reflect this. BMA methodology can also be used for variable assessment since the probability that a given variable is active is readily obtained from the individual model posterior probabilities.
The multivariate methods considered in this research are principal components analysis (PCA), canonical variate analysis (CVA), and canonical correlation analysis (CCA). Each method is viewed as a particular multivariate extension of univariate multiple regression. The marginal likelihood of a univariate multiple regression model has been approximated using the Bayesian information criterion (BIC), so the marginal likelihoods for these multivariate extensions also make use of this approximation.
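As an illustration of the idea only (not the dissertation's code), the sketch below computes BIC-based BMA model weights and posterior variable-inclusion probabilities for a small linear-regression example; the simulated data, the coefficient values, and the exhaustive enumeration of subsets are assumptions made for this example.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 5
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(size=n)  # only x0 and x2 are active

def bic_linear(y, X_sub):
    """BIC of an OLS fit with an intercept (Gaussian likelihood, constants dropped)."""
    n = len(y)
    Z = np.column_stack([np.ones(n), X_sub]) if X_sub.size else np.ones((n, 1))
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    rss = np.sum((y - Z @ beta) ** 2)
    k = Z.shape[1] + 1  # regression coefficients plus the error variance
    return n * np.log(rss / n) + k * np.log(n)

# Enumerate every subset of predictors (feasible here because p is small).
models = [m for r in range(p + 1) for m in itertools.combinations(range(p), r)]
bics = np.array([bic_linear(y, X[:, list(m)]) for m in models])

# BIC approximates -2 log marginal likelihood, so approximate posterior model
# weights follow from exponentiating and normalizing.
weights = np.exp(-0.5 * (bics - bics.min()))
weights /= weights.sum()

# Posterior inclusion probability of each variable: total weight of models containing it.
inclusion = np.array([sum(w for w, m in zip(weights, models) if j in m) for j in range(p)])
print(np.round(inclusion, 3))
```

In this setup the inclusion probabilities for the two active predictors should be near one, while the inert predictors receive small probabilities, which is the kind of variable assessment the abstract describes.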
One of the main criticisms of multivariate techniques in general is that they are difficult to interpret. To aid interpretation, BMA methodology is used to assess the contribution of each variable to the methods investigated. A second issue addressed is the graphical display of results. The goal is to convey the germane elements of an analysis when BMA is used, in order to obtain a clearer picture of what conclusions should be drawn.
Finally, the model uncertainty variance component can be estimated using BMA. The variance due to model uncertainty is ignored when the standard model building tenets are used giving overly optimistic variance estimates. Even though the model attained via standard techniques may be adequate, in general, it would be difficult to argue that the chosen model is in fact the correct model. It seems more appropriate to incorporate the information from all plausible models that are well supported by the data to make decisions and to use variance estimates that account for the uncertainty in the model estimation as well as model selection. / Ph. D.
|
113 |
Human optokinetic nystagmus: a stochastic analysis / Waddington, Jonathan. January 2012
Optokinetic nystagmus (OKN) is a fundamental gaze-stabilising response in which eye movements attempt to compensate for the retinal slip caused by self-motion. The OKN response consists of a slow following movement made in the direction of stimulus motion, interrupted by fast eye movements that are primarily made in the opposite direction. The timing and amplitude of these slow phases (SP) and quick phases (QP) are notably variable, but this variability is poorly understood. In this study I performed principal component analysis on OKN parameters in order to investigate how the eigenvectors and eigenvalues of the underlying components contribute to the correlation between OKN parameters over time. I found three categories of principal components that could explain the variance within each cycle of OKN, and only parameters from within a single cycle contributed highly to any given component. Differences found in the correlation matrices of OKN parameters appear to reflect changes in the eigenvalues of components, while eigenvectors remain predominantly similar across participants and trials. I have developed a linear, stochastic model of OKN based on these results and demonstrated that OKN can be described as a first-order Markov process, with three sources of noise affecting SP velocity, QP triggering, and QP amplitude. I have used this model to make some important predictions about the optokinetic reflex: the transient response of SP velocity, the existence of signal-dependent noise in the system, the target position of QPs, and the threshold at which QPs are generated. Finally, I investigate whether the significant variability within OKN may represent adaptive control of explicit and implicit parameters.
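As a rough illustration only (not the thesis model; every parameter value below is invented), a first-order slow-phase/quick-phase process with the three noise sources named above could be simulated as follows:

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 0.001                      # s
T = int(10 / dt)                # 10 s of simulated OKN
stim_vel = 20.0                 # deg/s retinal-slip drive (invented value)

pos = np.zeros(T)
vel = np.zeros(T)
tau = 0.1                       # slow-phase velocity time constant (invented)
qp_threshold = 8.0              # deg, nominal position at which a quick phase fires (invented)

for t in range(1, T):
    # Slow phase: first-order approach to stimulus velocity plus velocity noise.
    vel[t] = vel[t-1] + dt / tau * (stim_vel - vel[t-1]) + rng.normal(0, 0.5)
    pos[t] = pos[t-1] + vel[t] * dt
    # Quick phase: triggered around a noisy position threshold, with noisy amplitude.
    if pos[t] > qp_threshold + rng.normal(0, 1.0):
        pos[t] -= abs(rng.normal(9.0, 1.5))   # quick-phase amplitude noise

print("simulated eye position range (deg):", round(float(pos.min()), 2), round(float(pos.max()), 2))
```

The state at each step depends only on the previous step plus noise, which is what describing OKN as a first-order Markov process amounts to.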
|
114 |
Developing Criteria for Extracting Principal Components and Assessing Multiple Significance Tests in Knowledge Discovery Applications / Keeling, Kellie Bliss. 08 1900
With advances in computer technology, organizations are able to store large amounts of data in data warehouses. There are two fundamental issues researchers must address: the dimensionality of the data and the interpretation of multiple statistical tests. The first issue addressed by this research is the determination of the number of components to retain in principal components analysis. This research establishes regression, asymptotic theory, and neural network approaches for estimating the mean and 95th percentile eigenvalues needed to implement Horn's parallel analysis procedure for retaining components. Certain methods perform better for specific combinations of sample size and number of variables. The adjusted normal order statistic estimator (ANOSE), an asymptotic procedure, performs the best overall. Future research is warranted on combining methods to increase accuracy. The second issue involves interpreting multiple statistical tests. This study uses simulation to show that Parker and Rothenberg's technique of modeling p-values with a mixture of beta densities is viable for p-values from central and non-central t distributions. The simulation study shows that the final estimates obtained in the proposed mixture approach reliably recover the true proportions of the distributions associated with the null and non-null hypotheses. Modeling the density of p-values allows for better control of the true experimentwise error rate and is used to provide insight into grouping hypothesis tests for clustering purposes. Future research will expand the simulation to include p-values generated from additional distributions. The techniques presented are applied to data from Lake Texoma, where the size of the database and the number of hypotheses of interest call for nontraditional data mining techniques. The issue is to determine whether information technology can be used to monitor chlorophyll levels in the lake as chloride is removed upstream. A relationship established between chlorophyll and energy reflectance, which can be measured by satellites, enables more comprehensive and frequent monitoring. The results have both economic and political ramifications.
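For orientation, the sketch below shows a plain simulation-based version of Horn's parallel analysis (not the regression, asymptotic, or neural-network estimators developed in the thesis): components are retained when their observed eigenvalues exceed the 95th percentile of eigenvalues from random data of the same shape. The data matrix here is simulated with three underlying factors.

```python
import numpy as np

def parallel_analysis(X, n_sims=200, percentile=95, seed=0):
    """Retain components whose sample eigenvalues exceed the chosen
    percentile of eigenvalues obtained from uncorrelated random data."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    # Eigenvalues of the observed correlation matrix, largest first.
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    sim = np.empty((n_sims, p))
    for i in range(n_sims):
        R = rng.normal(size=(n, p))                      # pure-noise data set
        sim[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(R, rowvar=False)))[::-1]
    threshold = np.percentile(sim, percentile, axis=0)   # e.g. 95th percentile eigenvalues
    return int(np.sum(obs > threshold)), obs, threshold

# Example: 3 true factors embedded in 10 observed variables.
rng = np.random.default_rng(1)
F = rng.normal(size=(500, 3))
X = F @ rng.normal(size=(3, 10)) + 0.5 * rng.normal(size=(500, 10))
k, obs, thr = parallel_analysis(X)
print("components retained:", k)
```

The thesis's contribution is replacing the brute-force simulation of the random-data eigenvalues with estimated mean and 95th percentile values, but the retention rule is the same comparison shown here.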
|
115 |
Risk Measurement of Mortgage-Backed Security Portfolios via Principal Components and Regression Analyses / Motyka, Matt. 29 April 2003
Risk measurement of mortgage-backed security portfolios is an involved task for analysts and portfolio managers of such investments. A strong predictive econometric model that can account for the future variability of these securities would be a very useful tool for anyone in this financial market sector, because it is difficult to evaluate the risk of mortgage cash flows and prepayment options at the same time. This project presents two linear regression methods that attempt to explain the risk within these portfolios. The first study involves a principal components analysis on absolute changes in market data to form new sets of uncorrelated variables based on the variability of the original data. These principal components then serve as the predictor variables in a principal components regression, where the response variables are the day-to-day changes in the net asset values of three agency mortgage-backed security mutual funds. The independence of the principal components allows an analyst to reduce the number of observable variables needed to capture the risk of these portfolios of fixed-income instruments. The second idea revolves around a simple ordinary least squares regression of the three mortgage funds on the changes in the original daily, weekly, and monthly variables. While the correlation among such predictor variables may be very high, the simplicity of using observable market variables is a clear advantage. The goal of either method was to capture the largest amount of variance in the mortgage-backed portfolios through these econometric models. The main purpose was to reduce the residual variance to less than 10 percent, that is, to explain at least 90 percent of the original fund variances. The remaining risk could then be attributed to nonlinear dependence of the changes in these net asset values on the explanatory variables. The primary cause of this nonlinearity is the prepayment put option inherent in these securities.
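A minimal sketch of a principal components regression of this kind, using simulated market-variable changes and a simulated fund NAV series in place of the actual data (the counts, factor structure, and noise levels below are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, k = 500, 12, 3      # 500 days, 12 market variables, keep 3 components

# Simulated daily changes in market variables and in a fund's net asset value.
dX = rng.normal(size=(n, p))
dy = dX[:, :4].sum(axis=1) * 0.1 + rng.normal(scale=0.05, size=n)

# Principal components of the standardized predictor changes.
Z = (dX - dX.mean(axis=0)) / dX.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:k].T                      # first k principal component scores

# Ordinary least squares of NAV changes on the uncorrelated component scores.
A = np.column_stack([np.ones(n), scores])
beta, *_ = np.linalg.lstsq(A, dy, rcond=None)
resid = dy - A @ beta
r2 = 1 - resid.var() / dy.var()
print(f"variance explained by {k} components: {r2:.1%}")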
|
116 |
Avaliação de laranjeiras doces quanto à qualidade de frutos, períodos de maturação e resistência a Guignardia citricarpa [Evaluation of sweet orange trees for fruit quality, maturation periods, and resistance to Guignardia citricarpa] / Sousa, Patrícia Ferreira Cunha. January 2009
Advisor: Antonio de Goes / Committee: Eduardo Sanches Stuchi, Kátia Cristina Kupper, Gener Tadeu Pereira, Marcel Bellato Spósito / Abstract: Despite their commercial importance, the number of orange varieties grown in Brazil is very limited. Citrus germplasm banks hold a large number of sweet orange genotypes to be explored and evaluated for botanical, genetic, and agronomic traits, with the aim of increasing the genetic variability and the agronomic quality of the cultivars. As part of this work, 58 sweet orange genotypes were evaluated on 9 physical characters relevant to the fresh-fruit market (fruit diameter, perimeter, height, and weight; peel, albedo, and pulp thickness; and number of seeds) and 7 characters relevant to industrial quality (total titratable acidity, total soluble solids, ratio, fruit weight, juice yield, ascorbic acid, and technological index = kg of soluble solids/40.8 kg). Multivariate analysis indicated variability among the genotypes for both the fresh-market physical characters and industrial quality. Two principal components with eigenvalues > 1 accounted for 66.03% of the total variance of the physical characters. The variables with the greatest discriminating power in the first principal component were fruit diameter, perimeter, weight, and height; the scores of this component were designated MI-CP1 (fresh market), and the genotypes with the highest values were the most suitable for the fresh-fruit market. In the second principal component, the most discriminating variables were endocarp thickness and juice yield; its scores were designated S-CP2, physical characters well suited to industrial quality. On the scores of the two principal components (MI-CP1 and S-CP2), genotype 22-'Lanelate' stood out, followed by 43-Telde, 39-Rotuna, 44-Torregrossa, 46-Tua Mamede, and 17-Grada. Regarding the evaluations of industrial quality (INDUST-CP1), the following stood out: ... (Complete abstract: click electronic access below) / Doutor
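As a generic illustration of the analysis described (PCA on a genotype-by-trait matrix, retention of components with eigenvalues > 1, and ranking of genotypes by component scores), the sketch below uses simulated trait values rather than the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(3)
n_genotypes, n_traits = 58, 9
X = rng.normal(size=(n_genotypes, n_traits))     # stand-in for measured fruit traits

# PCA on the correlation matrix (traits standardized).
Z = (X - X.mean(axis=0)) / X.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

keep = eigvals > 1.0                              # eigenvalue-greater-than-1 rule used in the abstract
print("components retained:", int(keep.sum()),
      "| variance explained:", round(float(eigvals[keep].sum() / eigvals.sum() * 100), 2), "%")

# Scores on the first component (e.g. a fresh-market index); the loadings show
# which traits dominate that component.
scores = Z @ eigvecs[:, 0]
loadings = eigvecs[:, 0] * np.sqrt(eigvals[0])
print("most discriminating traits on PC1 (indices):", np.argsort(np.abs(loadings))[::-1][:4])
print("top-ranked genotypes on PC1 (indices):", np.argsort(scores)[::-1][:5])
```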
|
117 |
Differentiation between causes of optic disc swelling using retinal layer shape features / Miller, John William. 01 May 2018
The optic disc is the region of the retina where the optic nerve exits the back of the eye. A number of conditions can cause the optic disc to swell. Papilledema, optic disc swelling caused by raised intracranial pressure (ICP), and nonarteritic anterior ischemic optic neuropathy (NAION), swelling caused by reduced blood flow to the back of the eye, are two such conditions. Rapid, accurate diagnosis of the cause of disc swelling is important, as with papilledema the underlying cause of raised ICP could potentially be life-threatening and may require immediate intervention.
The current clinical standard for diagnosing and assessing papilledema is a subjective measure based on qualitative inferences drawn from fundus images. Even with the expert training required to properly perform the assessment, measurements and results can vary significantly between clinicians. As such, the need for a rapid, accurate diagnostic tool for optic disc swelling is clear.
Shape analysis of the structures of the retina has emerged as a promising quantitative tool for distinguishing between causes of optic disc swelling. Optic disc swelling can cause the retinal surfaces to distort, taking on shapes that differ from their normal arrangement. Recent work has examined how changes in the shape of one of these surfaces, Bruch's membrane (BM), vary between different types of optic disc swelling and carry clinically relevant information.
The inner limiting membrane (ILM), the most anterior retinal surface and furthest from BM, can take on shapes that are distinct from the more posterior layers when the optic disc becomes swollen. These unique shape characteristics have yet to be explored for their potential clinical utility. This thesis develops new shape models of the ILM.
The ultimate goal of this work is to develop noninvasive, automated diagnostic tools for clinical use. To that end, a necessary first step in establishing clinical relevance is demonstrating the utility of retinal shape information in a machine learning classifier. Retinal layer shape information and regional volume measurements acquired from spectral-domain optical coherence tomography scans from 78 patients (39 papilledema, 39 NAION) were used to train random forest classifiers to distinguish between cases of papilledema and NAION.
On average, the classifiers were able to correctly distinguish between papilledema and NAION 85.7±2.0% of the time, confirming the usefulness of retinal layer shapes for determining the cause of optic disc swelling. The results of this experiment are encouraging for future studies that will include more patients and attempt to differentiate between additional causes of optic disc edema.
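A minimal sketch of this style of classification experiment, assuming simulated shape/volume features in place of the OCT-derived measurements (class sizes match the study, but the feature count, class separation, and 10-fold cross-validation setup are assumptions made for illustration), using scikit-learn:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n_per_class, n_features = 39, 20          # 39 papilledema + 39 NAION, 20 shape/volume features

# Simulated feature matrices with a modest separation between the two classes.
X = np.vstack([rng.normal(0.0, 1.0, size=(n_per_class, n_features)),
               rng.normal(0.8, 1.0, size=(n_per_class, n_features))])
y = np.array([0] * n_per_class + [1] * n_per_class)   # 0 = papilledema, 1 = NAION

clf = RandomForestClassifier(n_estimators=500, random_state=0)
scores = cross_val_score(clf, X, y, cv=10)            # 10-fold cross-validated accuracy
print(f"mean accuracy: {scores.mean():.1%} +/- {scores.std():.1%}")
```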
|
118 |
The Religiosity of Vietnamese Americans / Le, Jennifer Linh. May 2011
Religion is a deeply important tradition in many people's lives, especially for those forced to leave their homes and loved ones abruptly and resettle in a foreign land. Religion provides not only spiritual guidance but also social networks, comfort, and moral standards, among many other things. I chose to study the beliefs and practices of Vietnamese American Buddhists and Catholics as well as the relationship between those two groups in the U.S. The Vietnamese present an interesting case because of their collective status as a well-publicized immigrant, formerly refugee, population that is now well established in this country. With my research, I was able to test five hypotheses. I wanted to determine the degree of transnationality, tension between the religious groups, conversion, and ancestor worship. Secondarily, I assessed regional differences. In order to test my hypotheses, I conducted 60 quantitative surveys, sampling from the Houston and Minneapolis-St. Paul Vietnamese communities.
Transnationality, or ties to the homeland, was more prevalent among Buddhists than Catholics, as I had hypothesized. There was a minute degree of tension present, generally among older members of the first-generation cohort. Traditional Vietnamese ancestor worship was not more prevalent among Buddhists than among Catholics. I was unable to sample enough religious converts to test my conversion hypothesis. In terms of differences across regions, all variables other than national identity and one indicator of transnationality were statistically insignificant. These data help fill a nearly 30-year gap in the research in this area and focus specifically on the Vietnamese population, which many studies have been unable to do.
In addition to my quantitative study, I also conducted qualitative fieldwork at four primary research and three secondary research sites in the Minneapolis-St. Paul and Houston metropolitan areas. Twenty-five to thirty hours were spent at each primary location observing the members, volunteers, dress, interactions, normative and deviant behaviors during services, socialization, languages spoken, attentiveness, racial diversity, and additional activities provided by the religious organization to the membership. This fieldwork gave me a better understanding of this community in a religious context.
|
119 |
STATISTICS IN THE BILLERA-HOLMES-VOGTMANN TREESPACE / Weyenberg, Grady S. 01 January 2015
This dissertation is an effort to adapt two classical non-parametric statistical techniques, kernel density estimation (KDE) and principal components analysis (PCA), to the Billera-Holmes-Vogtmann (BHV) metric space for phylogenetic trees. This adaptation gives a more general framework for developing and testing various hypotheses about apparent differences or similarities between sets of phylogenetic trees than currently exists.
For example, while the majority of gene histories found in a clade of organisms are expected to be generated by a common evolutionary process, numerous other coexisting processes (e.g. horizontal gene transfers, gene duplication and subsequent neofunctionalization) will cause some genes to exhibit a history quite distinct from the histories of the majority of genes. Such “outlying” gene trees are considered to be biologically interesting and identifying these genes has become an important problem in phylogenetics.
The R software package kdetrees, developed in Chapter 2, contains an implementation of the kernel density estimation method. The primary theoretical difficulty involved in this adaptation concerns the normalization of the kernel functions in the BHV metric space. This problem is addressed in Chapter 3. In both chapters, the software package is applied to both simulated and empirical datasets to demonstrate the properties of the method.
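The sketch below illustrates the general idea behind such a method (a leave-one-out kernel density score computed from a pairwise distance matrix, with low-density items flagged as candidate outliers); it is not the kdetrees implementation, and the distance matrix here comes from Euclidean toy data standing in for BHV geodesic distances between trees:

```python
import numpy as np

def kde_outlier_scores(D, bandwidth=1.0):
    """Leave-one-out kernel density score for each item, given a pairwise
    distance matrix D; low scores suggest outliers."""
    K = np.exp(-(D / bandwidth) ** 2 / 2.0)       # Gaussian kernel applied to distances
    np.fill_diagonal(K, 0.0)                      # exclude each item from its own estimate
    return K.sum(axis=1) / (len(D) - 1)

# Toy example: 20 "typical" items close together plus 2 far-away outliers,
# represented only through their pairwise distances.
rng = np.random.default_rng(5)
pts = np.vstack([rng.normal(0, 1, size=(20, 4)), rng.normal(6, 1, size=(2, 4))])
D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

scores = kde_outlier_scores(D, bandwidth=2.0)
print("lowest-density items (candidate outliers):", np.argsort(scores)[:2])
```

Because the scores depend only on pairwise distances, the same recipe applies whenever a metric between trees is available; the normalization issue mentioned above arises because kernel mass behaves differently in the BHV space than in Euclidean space.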
A few first theoretical steps in the adaptation of principal components analysis to the BHV space are presented in Chapter 4. It becomes necessary to generalize the notion of a set of perpendicular vectors in Euclidean space to the BHV metric space, but there is some ambiguity about how best to proceed. We show that convex hulls are one reasonable approach to the problem. The Nye PCA algorithm provides a method of projecting onto arbitrary convex hulls in BHV space, providing the core of a modified PCA-type method.
|
120 |
Μείωση θορύβου εικόνας απεικονιστικών τεχνικών πυρηνικής ιατρικής με ανάλυση κύριων συνιστωσών / Use of principal component analysis for noise reduction in scintigraphic images / Σμπιλίρη, Βασιλική Γ. 16 December 2008
The aim of this study is the development of a statistical denoising method that reduces noise in scintigraphic images while preserving image quality characteristics such as contrast and resolution. The method, based on principal component analysis (PCA), reduces the volume of image data while preserving a large amount of useful information, on the premise that a small number of independent image components contain useful information (signal), whereas a large number of independent components contain statistical noise. Therefore, by applying PCA and discarding the image components that correspond to noise, noise reduction can be achieved.
PCA is a multivariate correlation analysis technique that algebraically explains the variance-covariance structure of an observed data set with a few linear combinations of the original variables [28-30]. The motivation behind PCA is to find a direction, or a few directions, that explain as much of the variability as possible. Each direction is associated with a linear combination of the original variables: the first principal component is the linear combination corresponding to the direction of greatest variability, and the search for the second principal component is restricted to directions uncorrelated with the first.
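A generic sketch of PCA-based denoising (not the thesis implementation): image patches are collected into a data matrix, only the leading principal components, assumed to carry the signal, are retained, and the image is reconstructed. The patch size, the number of retained components, and the simulated Poisson-noise test image are assumptions made for this example.

```python
import numpy as np

def pca_denoise_patches(img, patch=8, keep=8):
    """Denoise an image by projecting non-overlapping patches onto the
    leading principal components of the patch ensemble."""
    h, w = (np.array(img.shape) // patch) * patch
    img = img[:h, :w]
    # Each row of P is one flattened patch.
    P = (img.reshape(h // patch, patch, w // patch, patch)
            .transpose(0, 2, 1, 3).reshape(-1, patch * patch)).astype(float)
    mean = P.mean(axis=0)
    U, s, Vt = np.linalg.svd(P - mean, full_matrices=False)
    # Zero out all but the first `keep` components (assumed to carry the signal).
    P_hat = (U[:, :keep] * s[:keep]) @ Vt[:keep] + mean
    return (P_hat.reshape(h // patch, w // patch, patch, patch)
                 .transpose(0, 2, 1, 3).reshape(h, w))

# Toy example: a smooth "activity" pattern corrupted with Poisson counting noise.
rng = np.random.default_rng(6)
yy, xx = np.mgrid[0:128, 0:128]
clean = 50 * np.exp(-((xx - 64) ** 2 + (yy - 64) ** 2) / (2 * 20 ** 2)) + 5
noisy = rng.poisson(clean).astype(float)
denoised = pca_denoise_patches(noisy, patch=8, keep=6)
print("residual std before/after:",
      round(float((noisy - clean).std()), 2), round(float((denoised - clean).std()), 2))
```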
To assess its performance, the proposed denoising method was compared to four conventional noise reduction methods, employing quantitative image quality characteristics (noise and spatial resolution characteristics). Specifically, linear smoothing filters (smooth 3x3 and smooth 5x5) and non-linear filters (median 3x3 and median 5x5) were used. Additionally, to demonstrate its applicability, the proposed method was applied to clinical planar scintigraphic images. / The term nuclear medicine describes the diagnostic and therapeutic procedures that require the introduction of radiopharmaceuticals into the body. Nuclear medicine imaging techniques exploit the fact that the radiation emitted by radioactive nuclides can pass through tissue and be detected externally, making it possible to study physiological and biochemical processes as they unfold in living organisms.
Nuclear medicine imaging is widely used in clinical practice. Compared with other imaging modalities, it has the advantage of providing anatomical and functional information simultaneously. Its drawback, however, is the very low signal-to-noise ratio (SNR) of nuclear medicine images relative to images from other modalities.
A nuclear medicine image corresponds to the distribution of radioactive material inside the patient's body. The value of each pixel is related to the number of gamma photons detected over a period of time. These values follow a statistical (Poisson) distribution, owing to the random nature of the decay of the administered radioactive material. The variance of a Poisson random variable equals its mean, so to reduce the effect of Poisson noise the number of detected photons must be increased. This can be achieved in three ways: first, by increasing the acquisition time, which entails an increased risk of patient motion; second, by increasing the dose of radioactive material given to the patient, which is obviously undesirable; and third, by using a gamma camera with multiple detectors or a very sensitive detector, which entails increased cost and complexity.
For this reason, digital image-processing techniques for noise reduction can contribute significantly to improving nuclear medicine images. Classical noise-reduction techniques use linear smoothing filters to replace the value of each pixel with an average computed from its neighborhood. These filters, however, have the drawback of reducing image contrast and resolution. Non-linear filters, such as the median filter, in many cases preserve the contrast of structures but also degrade image quality. One reason conventional techniques give unsatisfactory results is that they do not address the fact that the noise at each pixel depends on the signal intensity (signal-dependent noise). Adaptive noise-reduction filters have therefore been proposed; this class of filters uses statistical criteria to select the neighboring pixels used to compute the value of the central pixel.
In this thesis, a noise-reduction method based on principal components analysis (PCA), adapted to nuclear medicine images, was implemented. The method aims to reduce the Poisson-distributed quantum noise contained in nuclear medicine images. PCA is a statistical technique that examines the relationships among the variables of a data set and finds a subset of the most important variables. The new variables are expressed as linear combinations of the original variables and are ranked in order of importance according to the variance of the data that each one explains. The first principal component is the variable that explains the greatest amount of variance; the second principal component explains the next largest amount and is independent of the first, and so on. In effect, the set of originally correlated variables is transformed into a set of uncorrelated variables, from which the less important variables can be removed without substantial loss of information. The main use of PCA is to reduce the volume of a data set and arrive at an optimal description of it. In the case of nuclear medicine images, because of the statistical character of the noise, the useful information can be assumed to be contained in a small number of components, while the noise resides in a large number of unimportant components. Consequently, by applying PCA and removing the components that correspond to noise, significant noise reduction can be achieved.
A comparative evaluation between the proposed method and other noise-reduction methods for nuclear medicine images was also carried out. Specifically, the PCA-based method was compared with linear smoothing filters (smooth 3x3 and smooth 5x5) and non-linear filters (median 3x3 and median 5x5). All methods were applied to standard nuclear medicine images acquired with two phantoms, a phantom with small hot regions (hot-spots phantom) and a spatial-resolution phantom (bar phantom), at different acquisition times. Noise, contrast, contrast-to-noise ratio (CNR), and full width at half maximum (FWHM) were measured on the processed images. The comparison showed that the PCA-based method significantly reduces noise while simultaneously increasing the contrast-to-noise ratio. Finally, a pilot preference study between the noise-reduction methods was conducted by two nuclear medicine physicians on a sample of clinical static-acquisition images (bone, lung, thyroid, parathyroid, and kidney scans). This study showed that PCA significantly reduces noise while visually improving the anatomical structures of the images.
|