421 |
Aspects on Imaging and Endovascular Treatment of Aortic Dissection and Aneurysm. Eriksson, Mats-Ola (January 2013)
Aortic aneurysms and dissections are potentially life-threatening conditions. Endovascular aortic repair (EVAR) and thoracic endovascular aortic repair (TEVAR) have reduced perioperative mortality and morbidity and are now established methods for treating aortic disease. Adequate pre- and intraoperative imaging is important for optimal results in endovascular procedures. However, standard CT and angiography may not always provide the information required for treatment, and complementary techniques are warranted in selected cases. Several reports have shown TEVAR to be effective in acute complicated type B aortic dissections, but long-term clinical outcome and aortic remodelling are still not fully evaluated. Intravascular phased array imaging (IPAI) was used in patients undergoing EVAR and TEVAR for aortic aneurysm and dissection. The combined information from IPAI and fluoroscopy allowed exact positioning of the stent graft. The colour Doppler function facilitated detection of blood flow in relevant arteries during and after the procedures, as well as confirmation of ceased flow in excluded false lumens or aneurysms. Early and long-term clinical results after TEVAR for acute complicated type B aortic dissection were investigated in all patients treated between 1999 and 2009 at Uppsala University Hospital. Results were favourable regarding survival and permanent neurological complications. Long-term follow-up of aortic morphological changes in the same patient group showed an overall significant reduction of aortic and false lumen diameters and an increase in true lumen diameter. Total thrombosis of the false lumen occurred more often in patients with DeBakey IIIa aortic dissection than in IIIb. In conclusion, IPAI may be a complementary tool to traditional imaging modalities in EVAR and TEVAR in selected cases. Long-term clinical outcome after TEVAR in patients with acute complicated type B aortic dissection is excellent, with favourable aortic remodelling.
|
422 |
Multiple hypothesis testing and multiple outlier identification methods. Yin, Yaling (13 April 2010)
Traditional multiple hypothesis testing procedures, such as that of Benjamini and Hochberg, fix an error rate and determine the corresponding rejection region. In 2002 Storey proposed a fixed rejection region procedure and showed numerically that it can gain more power than the fixed error rate procedure of Benjamini and Hochberg while controlling the same false discovery rate (FDR). In this thesis it is proved that when the number of alternatives is small compared to the total number of hypotheses, Storey's method can be less powerful than that of Benjamini and Hochberg. Moreover, the two procedures are compared by setting them to produce the same FDR. The difference in power between Storey's procedure and that of Benjamini and Hochberg is near zero when the distance between the null and alternative distributions is large, but Benjamini and Hochberg's procedure becomes more powerful as the distance decreases. It is shown that modifying the Benjamini and Hochberg procedure to incorporate an estimate of the proportion of true null hypotheses, as proposed by Black, gives a procedure with superior power.
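For reference, the fixed-error-rate procedure discussed above works as follows: given m p-values and a target FDR level q, the Benjamini-Hochberg step-up rule rejects all hypotheses whose p-values lie at or below the largest ordered p-value p(k) satisfying p(k) <= kq/m. The sketch below is a minimal illustration (class and method names are ours, not from the thesis):

```java
import java.util.Arrays;

// Minimal sketch of the Benjamini-Hochberg step-up procedure: reject all
// hypotheses with p-values at or below the largest p_(k) <= k*q/m.
public class BenjaminiHochberg {

    /** Returns the rejection cutoff; hypotheses with p <= cutoff are rejected. */
    static double cutoff(double[] pValues, double q) {
        double[] sorted = pValues.clone();
        Arrays.sort(sorted);
        int m = sorted.length;
        double threshold = 0.0;               // 0.0 means "reject nothing"
        for (int k = 1; k <= m; k++) {
            if (sorted[k - 1] <= (double) k * q / m) {
                threshold = sorted[k - 1];    // keep the largest qualifying k
            }
        }
        return threshold;
    }

    public static void main(String[] args) {
        double[] p = {0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205};
        System.out.println("Reject all p <= " + cutoff(p, 0.05)); // prints 0.008
    }
}
```

Black's modification mentioned above corresponds to replacing q with q divided by an estimate of the proportion of true null hypotheses, which enlarges the rejection region when many hypotheses appear to be alternatives.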
Multiple hypothesis testing can also be applied to regression diagnostics. In this thesis, a Bayesian method is proposed to test multiple hypotheses, of which the i-th null and alternative hypotheses are that the i-th observation is not an outlier versus it is, for i=1,...,m. In the proposed Bayesian model, it is assumed that outliers have a mean shift, where the proportion of outliers and the mean shift follow a Beta prior distribution and a normal prior distribution, respectively. It is proved in the thesis that for the proposed model, when there exists more than one outlier, the marginal distributions of the deletion residual of the i-th observation under both the null and alternative hypotheses are doubly noncentral t distributions. The outlyingness of the i-th observation is measured by the marginal posterior probability that the i-th observation is an outlier given its deletion residual. An importance sampling method is proposed to calculate this probability. This method requires the computation of the density of the doubly noncentral F distribution, which is approximated using Patnaik's approximation. An algorithm is proposed in this thesis to examine the accuracy of Patnaik's approximation. The comparison of this algorithm's output with Patnaik's approximation shows that the latter can save massive computation time without losing much accuracy.
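The deletion residual referred to above is the externally studentized residual: the i-th raw residual scaled by an error-variance estimate computed with observation i left out. The sketch below computes it for simple linear regression (a minimal illustration only; the doubly noncentral distributions and the importance sampler of the thesis are well beyond a short example, and all names here are ours):

```java
// Sketch: deletion (externally studentized) residuals for simple linear
// regression y = b0 + b1*x + error, using the standard identity
// s_(i)^2 = (SSE - e_i^2 / (1 - h_i)) / (n - p - 1) with p = 2 parameters.
public class DeletionResiduals {

    static double[] deletionResiduals(double[] x, double[] y) {
        int n = x.length, p = 2;                        // intercept + slope
        double xBar = mean(x), yBar = mean(y);
        double sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sxx += (x[i] - xBar) * (x[i] - xBar);
            sxy += (x[i] - xBar) * (y[i] - yBar);
        }
        double b1 = sxy / sxx, b0 = yBar - b1 * xBar;   // least-squares fit
        double[] e = new double[n];
        double sse = 0;
        for (int i = 0; i < n; i++) {
            e[i] = y[i] - (b0 + b1 * x[i]);
            sse += e[i] * e[i];
        }
        double[] t = new double[n];
        for (int i = 0; i < n; i++) {
            double h = 1.0 / n + (x[i] - xBar) * (x[i] - xBar) / sxx; // leverage
            double s2i = (sse - e[i] * e[i] / (1 - h)) / (n - p - 1); // leave-one-out variance
            t[i] = e[i] / Math.sqrt(s2i * (1 - h));
        }
        return t;
    }

    static double mean(double[] v) {
        double s = 0;
        for (double d : v) s += d;
        return s / v.length;
    }
}
```

A large |t_i| flags observation i as a candidate outlier; the thesis replaces this ad hoc flagging with the marginal posterior probability described above.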
The proposed Bayesian multiple outlier identification procedure is applied to simulated data sets. Various simulation and prior parameters are used to study the sensitivity of the posteriors to the priors. The area under the ROC curve (AUC) is calculated for each combination of parameters. A factorial design analysis on AUC is carried out, with various simulation and prior parameters as factors. The resulting AUC values are high for the various selected parameters, indicating that the proposed method can identify the majority of outliers within tolerable error. The results of the factorial design show that the priors do not have much effect on the marginal posterior probability as long as the sample size is not too small.
The proposed Bayesian procedure is also applied to a real data set obtained by Kanduc et al. in 2008. The proteomes of thirty viruses examined by Kanduc et al. are found to share a high number of pentapeptide overlaps with the human proteome. In a linear regression analysis of the level of viral overlap with the human proteome against the length of the viral proteome, Kanduc et al. report that among the thirty viruses, human T-lymphotropic virus 1, Rubella virus, and hepatitis C virus present higher levels of overlap with the human proteome than predicted. The results obtained using the proposed procedure indicate that the four viruses with extremely large sizes (Human herpesvirus 4, Human herpesvirus 6, Variola virus, and Human herpesvirus 5) are more likely to be outliers than the three reported viruses. The results with the four extreme viruses deleted confirm the claim of Kanduc et al.
|
424 |
Radar Target Detection in Non-Gaussian Clutter. Doyuran, Ulku (1 September 2007)
In this study, novel methods for high-resolution radar target detection in non-Gaussian clutter environments are proposed. Two approaches are used: non-coherent detection, which thresholds the envelope-detected signal, and coherent detection, which performs clutter suppression, Doppler processing, and thresholding at the same time. The proposed non-coherent detectors, designed to operate in non-Gaussian and range-heterogeneous clutter, yield higher performance than conventional methods designed either for Gaussian clutter or for heterogeneous clutter. The proposed coherent detector exploits the information in all range cells and pulses and performs clutter reduction and thresholding simultaneously. The design is performed for clutter that is uncorrelated, partially correlated, and fully correlated among range cells. The performance analysis indicates the superiority of the designed methods over the classical ones in fully and partially correlated situations. In addition, by designing detectors for multiple targets and correcting the conventional methods, the target-masking problem of the classical detectors is alleviated.
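For context, the conventional non-coherent baseline in this setting is the cell-averaging CFAR detector, which estimates the local clutter level from reference cells around the cell under test and scales it into a threshold. The sketch below is a generic illustration of that classical method, not of the detectors proposed in the thesis; window sizes and the scale factor are illustrative assumptions:

```java
// Minimal sketch of classical cell-averaging CFAR on envelope-detected
// power samples: each cell under test (CUT) is compared against alpha
// times the mean of the surrounding reference cells, with guard cells
// immediately around the CUT excluded from the average.
public class CellAveragingCfar {

    static boolean[] detect(double[] power, int refCells, int guardCells, double alpha) {
        int n = power.length;
        boolean[] hit = new boolean[n];
        for (int cut = 0; cut < n; cut++) {
            double sum = 0;
            int count = 0;
            int halfWindow = guardCells + refCells;
            for (int j = cut - halfWindow; j <= cut + halfWindow; j++) {
                if (j < 0 || j >= n) continue;                 // clip at array edges
                if (Math.abs(j - cut) <= guardCells) continue; // skip CUT and guards
                sum += power[j];
                count++;
            }
            if (count > 0 && power[cut] > alpha * sum / count) {
                hit[cut] = true;
            }
        }
        return hit;
    }
}
```

Averaging over reference cells is exactly what degrades in range-heterogeneous or heavy-tailed (non-Gaussian) clutter, which is the regime the proposed detectors target.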
|
425 |
Derrida et Bergson : dialogue médiat sur la question de l'immédiat [Derrida and Bergson: a mediate dialogue on the question of the immediate]. Fradet, Pierre-Alexandre (08 1900)
Although studies of the relation between Derrida and Bergson are few and far between, nearly all share a common vision: that the divergences between the two philosophers can be attenuated, or even dissolved, by considering their more fundamental convergences. The following pages counterbalance this common interpretation. Without denying the points of contact between Derrida and Bergson, we show that an important disagreement remains between them on the possibility of intuition. While Derrida casts doubt on intuitionist doctrines, Bergson establishes intuition as a philosophical method. This thesis takes the motives behind this discord as its guiding thread. Put simply, its objective is to show that Bergsonian and Derridian thought, when placed in dialogue, reveal a partial disagreement that enables fruitful reflection on the possibility of intuition. More precisely, we pursue three aims: i/ to identify closely the object of the dispute between Derrida and Bergson, too little noted by commentators, and to show that it is articulated with a partial agreement; ii/ to clarify the various reasons that lead the one to attack intuition and the other to embrace the intuitive method; iii/ to establish that certain of Bergson's arguments, although enjoying a resurgence of interest in recent years, appear wanting when confronted with various objections.
|
426 |
The Psychological Impacts of False Positive Ovarian Cancer Screening: Assessment via Mixed and Trajectory Modeling. Wiggins, Amanda T (1 January 2013)
Ovarian cancer (OC) is the fifth most common cancer among women and has the highest mortality of any cancer of the female reproductive system. The majority (61%) of OC cases are diagnosed at a distant stage. Because diagnosis most commonly occurs at a late stage and the prognosis for advanced disease is poor, research on developing effective OC screening methods to facilitate early detection in high-risk, asymptomatic women is fundamental to reducing OC-specific mortality. Presently, no screening modality has proven efficacious in reducing OC mortality. However, transvaginal ultrasonography (TVS) has shown value in early detection of OC. TVS can yield false positive results, which occur when a woman receives an abnormal TVS screening result that is deemed benign following repeat testing (about 7% of the time). The purpose of this dissertation was to evaluate the impact of false positive TVS screening results on a variety of psychological and behavioral outcomes using mixed and trajectory statistical modeling. The three specific aims were to 1) compare psychological and behavioral outcomes between women receiving normal and false positive results, 2) identify characteristics of women receiving false positive results associated with increased OC-specific distress, and 3) characterize distress trajectories following receipt of false positive results.
Analyses included a subset of women participating in an experimental study conducted through the University of Kentucky Ovarian Cancer Screening Program. In total, 750 women completed longitudinal assessments: 375 with false positive and 375 with normal results. Mixed and group-based trajectory modeling were used to evaluate the specific aims.
Results suggest that women receiving a false positive TVS result experience increased OC-specific distress compared to women receiving normal results. Among those receiving false positives, less education, no history of an abnormal screening result, less optimism, and more social constraint were associated with increased OC-specific distress. Family history was associated with increased distress among women with monitoring informational coping styles. Three distinct trajectories characterized distress over the four-month study period. Although distress decreased over time, a notable proportion of women experienced sustained high levels of OC-specific distress.
|
427 |
"I didn’t see an iPod, but you did – so I’ll say I did, too": exploring source memory and subjective experiences accompanying memory conformityAzad, Tanjeem 08 February 2010 (has links)
Memory conformity effects occur when witnesses report misleading suggestions they learned about from another witness. Using a new paradigm, the present thesis investigated whether what subject-witnesses report about an event also reflects what they personally remember or know about that event. Subjects were tested in pairs, with each member of a pair shown a different version of a video using the MORI technique. Critical details (e.g., theft of an iPod) appeared in each of the following conditions: visible to only one member of each subject pair, visible to both members of the pair, and not visible to either member of the pair. Pairs subsequently completed a questionnaire together to remember details from the video. Subjects then individually completed a similar questionnaire. A source monitoring and subjective experiences test revealed that co-witness discussion does not necessarily lead people to experience illusory recollections of details they did not witness themselves.
|
428 |
Effects of Associative Processes on False Memory: Evidence from Converging Associates and Category Associates Procedures. Misirlisoy, Mine (1 July 2004)
The present study investigated the differential effects of test-induced priming on false memories evoked by the converging associates procedure (DRM lists) and the category associates procedure (category lists). The experiments manipulated the test order of the critical items relative to the list items from their corresponding lists. The significance of the study lies in its direct comparison of the false memories elicited by the two procedures within the same experimental setting. The results demonstrated that associative processes at test affected the proportion of false recollections elicited by DRM lists more than those elicited by category lists. The results are discussed in relation to gist-based theories of false memory and the activation/monitoring account.
|
429 |
Combining over- and under-approximating program analyses for automatic software testing. Csallner, Christoph (7 July 2008)
This dissertation attacks the well-known problem of path-imprecision in static program analysis. Our starting point is an existing static program analysis that over-approximates the execution paths of the analyzed program. We then make this over-approximating program analysis more precise for automatic testing in an object-oriented programming language. We achieve this by combining the over-approximating program analysis with usage-observing and under-approximating analyses. More specifically, we make the following contributions.
We present a technique to eliminate language-level unsound bug warnings produced by an execution-path-over-approximating analysis for object-oriented programs that is based on the weakest-precondition calculus. Our technique post-processes the results of the over-approximating analysis by solving the produced constraint systems and by generating and executing concrete test-cases that satisfy them. Only test-cases that confirm the results of the over-approximating static analysis are presented to the user. This technique has the important side benefit of making the results of a weakest-precondition-based static analysis easier for human consumers to understand. We show examples from our experiments that visually demonstrate the difference between hundreds of complicated constraints and a simple corresponding JUnit test-case.
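To make the contrast concrete, the following hypothetical example (ours, not taken from the dissertation's experiments) sketches what such a warning-confirming test looks like: a weakest-precondition analysis reports that accountId may be null on the path into trim(), and instead of the underlying constraint system the user sees one small JUnit 4 test that actually triggers the crash:

```java
import org.junit.Test;

// Hypothetical testee and generated test. The static analysis warns that
// accountId may be null; the generated test confirms the warning by
// executing the predicted path and observing the NullPointerException.
public class AccountTest {

    static int balanceLength(String accountId) {
        return accountId.trim().length();   // warning: accountId may be null
    }

    @Test(expected = NullPointerException.class)
    public void crashesOnNullAccountId() {
        balanceLength(null);
    }
}
```

If no generated input can drive execution down the warned path, the warning is language-level unsound and is suppressed rather than shown to the user.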
Besides eliminating language-level unsound bug warnings, we present an additional technique that also addresses user-level unsound bug warnings. This technique pre-processes the testee with a dynamic analysis that takes advantage of actual user data. It annotates the testee with the knowledge obtained from this pre-processing step and thereby provides guidance for the over-approximating analysis.
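As a hypothetical illustration (assuming JML-style annotations of the kind a dynamic pre-processing step could emit; the dissertation's exact annotation format may differ): if observed user data never passes a null account id to the method from the previous sketch, that knowledge can be recorded as a precondition the over-approximating analysis then treats as given, silencing the corresponding user-level unsound warning.

```java
// Hypothetical annotation derived from observed user data: actual runs
// never passed null, so the over-approximating analysis may assume the
// precondition below and drop its null-dereference warning for trim().
//@ requires accountId != null;
static int balanceLength(String accountId) {
    return accountId.trim().length();
}
```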
We also present an improvement to dynamic invariant detection for object-oriented programming languages. Previous approaches do not take behavioral subtyping into account and therefore may produce inconsistent results, which can throw off automated analyses such as the ones we are performing for bug-finding.
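A minimal hypothetical example of the behavioral-subtyping issue: an invariant detector that observes only Vehicle runs might infer the postcondition "speed() <= 80", yet every Sportscar is also a Vehicle, so the inferred supertype invariant is inconsistent with the hierarchy and can mislead a downstream bug-finding analysis.

```java
// Hypothetical classes: per-class invariant inference over Vehicle-only
// runs suggests speed() <= 80, but Sportscar instances, which are
// Vehicles by subtyping, return 200 and violate that inferred invariant.
class Vehicle {
    int speed() { return 80; }
}

class Sportscar extends Vehicle {
    @Override
    int speed() { return 200; }   // breaks the naively inferred bound
}
```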
Finally, we address the problem of unwanted dependencies between test-cases caused by global state. We present two techniques for efficiently re-initializing global state between test-case executions and discuss their trade-offs.
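One general way to re-initialize global (static) state between test executions is to reload the class under test through a fresh class loader so that its static initializers run again; the sketch below illustrates that idea (it is not necessarily either of the two techniques evaluated here, whose trade-off is reload cost versus isolation):

```java
import java.net.URL;
import java.net.URLClassLoader;

// Sketch: instantiate the testee through a fresh class loader so its
// static fields are re-initialized, isolating test-cases from global
// state left behind by earlier executions.
public class FreshStateRunner {

    static Object newInstanceWithFreshStatics(URL[] classpath, String className)
            throws Exception {
        // Parent null = bootstrap loader only, so the testee's class is
        // defined anew here instead of being shared with earlier tests.
        try (URLClassLoader loader = new URLClassLoader(classpath, null)) {
            Class<?> cls = Class.forName(className, true, loader);
            return cls.getDeclaredConstructor().newInstance();
        }
    }
}
```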
We have implemented the above techniques in the JCrasher, Check 'n' Crash, and DSD-Crasher tools and present initial experience in using them for automated bug finding in real-world Java programs.
|
430 |
Section 24 of the Criminal Code: navigating veracity and verisimilitude in verbatim theatre. Faulkner, Natalie (January 2007)
This research project comprises a stage play, Section 24 of the Criminal Code, and an accompanying exegesis, which focuses upon the experience of a woman accessing the criminal justice system after she is raped. The play follows the verbatim model and draws upon court transcripts, which are deconstructed to reveal the workings of Defence counsel 'storylines' and meta-narratives of gender, sexual availability and power. The exegesis investigates attitudes toward rape and rape victims perpetuated by Australian popular culture, and the way that myths about false rape complaints and 'deserving victims' continue to influence the reporting and conviction rates for rape. The thesis argues that recent reforms have yet to make an impact on the conviction rate or on the experience of women accessing the justice system, because of entrenched misogyny within the system itself. Several factors contribute to widespread ignorance of the reality of our own criminal justice system, and the thesis proposes that a work of verbatim theatre may redress the paucity of understanding that enables the dysfunction of the current system. The paper explores the different approaches taken by verbatim theatre practitioners and the appropriateness of the verbatim theatre model for communicating this particular (lived) experience. Questions of ownership over one's story and of representation in that story indicate the emancipatory potential of a work. Where practitioners have no personal connection to their subject matter and use material that is already in the public domain, they may feel a greater freedom to manipulate story and character for dramatic effect, or to suit an activist agenda for change. It is shown that a playwright with a personal connection to her material and subject must address issues of ownership, ethical representation, veracity and verisimilitude when creating a piece of verbatim theatre. Privileging the truth of the Complainant Woman's experience over the orthodoxies of the well-made play may contribute to a negative response to the work from male audiences. However, the thesis concludes that the subject of rape and its prosecution invokes a gendered response in itself, and ultimately questions the desirability of presenting a play that delivers a palatable story rather than an unpleasant truth.
|