  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
111

A Protein Sequence-Properties Evaluation Framework for Crystallization Screen Design

Dougall, David Stephen 04 January 2008 (has links)
The goal of the research was to develop a Protein-Specific Properties Evaluation (PSPE) framework that would aid in the statistical evaluation of variables for predicting ranges of and prior probability distributions for protein crystallization conditions. Development of such a framework is motivated by the rapid growth and evolution of the Protein Data Bank. Features of the developed framework include (1) it is an instantiation of the scientific method for framing and testing hypotheses in an informatics setting, (2) it makes use of hidden variables, and (3) a negative result is still useful. The hidden variables examined in this study are related to the estimated net charge (Q) of the proteins under consideration. Q is a function of the amino acid composition, the solution pH, and the assumed pKa values for the titratable amino acid residues. The protein's size clearly has a significant impact on the magnitude of Q. Therefore, two additional variables were introduced to mitigate this effect: the specific charge (Qbar) and the average surface charge density (sigma). The principal observation is that proteins appear to crystallize at low values of Qbar and sigma. One problem with this observation is that "low" is a relative term and the frame of reference requires careful examination. The results are sufficiently weak that no prospective predictions appear possible, although information of this type could be included with other weak predictors in a Bayesian predictor scheme. Additional work would be required to establish this; however, that work is beyond the scope of the dissertation. Although many statistically significant correlations among Q-related quantities were noted, no evidence could be developed to suggest they were anything other than those expected from the additional information introduced with the hidden variables. Thus, the principal conclusion of this PSPE analysis is that Qbar, sigma, and other Q-related variables are of limited value as prospective predictors of ranges of values of crystallization conditions. Although this is a negative result, it is still useful in that it allows attention to be directed toward more productive avenues.
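As a rough illustration of the quantities described above (not the dissertation's own code), the estimated net charge Q can be computed from a protein's amino acid composition with Henderson-Hasselbalch terms. The pKa values below are generic textbook approximations, and the normalizations used here for Qbar and sigma are assumptions made to keep the sketch runnable.

```python
# Sketch: estimated net charge Q of a protein from its amino acid composition,
# using generic pKa values (assumed; the dissertation's exact pKa set may differ).

PKA_POS = {"K": 10.5, "R": 12.5, "H": 6.0}           # contribute +1 when protonated
PKA_NEG = {"D": 3.9, "E": 4.1, "C": 8.3, "Y": 10.1}  # contribute -1 when deprotonated
PKA_NTERM, PKA_CTERM = 9.0, 3.1

def net_charge(sequence, ph):
    """Estimated net charge Q at the given pH (Henderson-Hasselbalch fractions)."""
    q = 1.0 / (1.0 + 10 ** (ph - PKA_NTERM))          # N-terminus
    q -= 1.0 / (1.0 + 10 ** (PKA_CTERM - ph))         # C-terminus
    for aa in sequence:
        if aa in PKA_POS:
            q += 1.0 / (1.0 + 10 ** (ph - PKA_POS[aa]))
        elif aa in PKA_NEG:
            q -= 1.0 / (1.0 + 10 ** (PKA_NEG[aa] - ph))
    return q

def specific_charge(sequence, ph):
    """Qbar: net charge normalized by chain length (one plausible normalization)."""
    return net_charge(sequence, ph) / len(sequence)

def surface_charge_density(sequence, ph):
    """Sigma: net charge per estimated surface area, assuming area scales as N**(2/3)."""
    return net_charge(sequence, ph) / (len(sequence) ** (2.0 / 3.0))

if __name__ == "__main__":
    seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"   # arbitrary example sequence
    for ph in (4.5, 7.0, 9.0):
        print(ph, round(net_charge(seq, ph), 2), round(specific_charge(seq, ph), 3),
              round(surface_charge_density(seq, ph), 3))
```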
112

PERCEPTIONS OF PERSONAL POWER AND THEIR RELATIONSHIP TO CLINICIANS' RESISTANCE TO THE INTRODUCTION OF COMPUTERIZED PHYSICIAN ORDER ENTRY

Bartos, Christa Elizabeth 25 September 2008 (has links)
The implementation of computerized provider order entry (CPOE) across the health care system has been slow in realization. In addition to the inherent financial burden, a significant cause for this delay is the high number of system failures resulting from clinicians' resistance. Changes in workflow and communication, time demands, system complexity, and changes to power structures have all been identified as consequences of CPOE systems that can cause resistance among clinicians. Of these, I believe that perceived changes in a person's power in the workplace can be more difficult to overcome than changes in the work routine. Perceptions of the power or control that clinicians have in the workplace and their attitudes toward CPOE are precursors to behavior, and if these perceptions and attitudes are negative, they can result in resistive behavior. Drawing on psycho-social theories of power, resistance, and organizational information technology (IT) implementation in business, I applied these concepts to healthcare IT implementation. Qualitative studies have looked at power and resistance, but no previous study has measured the degree or direction of power change, or confirmed that a relationship exists between power perceptions and CPOE attitudes. One reason for this is that no instruments existed to obtain these data. I developed the Semantic Differential Power Perception (SDPP) survey as an electronic survey to measure power perception and CPOE attitudes, and established reliability and validity of the instrument in a measurement study. The SDPP was used to collect data from 276 healthcare workers in two different hospitals before and after implementation of CPOE. I identified a significant correlation between power perceptions and attitudes toward CPOE. Examining the direction of change by healthcare position, I found that power perception values decreased for all positions and that attitudes toward CPOE varied based on use of the system. Understanding the relationship between power perceptions and CPOE attitudes is the first step in determining causative relationships. This understanding will enable system developers to modify implementation processes and training methods to enhance waning power and support positive power changes, thereby minimizing power-related resistance.
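A minimal sketch of the kind of correlation test implied above, not the SDPP analysis itself; the scores and sample below are hypothetical.

```python
# Sketch: testing for a correlation between power-perception scores and CPOE
# attitude scores from a paired survey; data and scale are hypothetical.
from scipy import stats

power_scores   = [5.2, 4.1, 6.0, 3.8, 4.9, 5.5, 4.4, 3.9]   # hypothetical SDPP-style means
cpoe_attitudes = [4.8, 3.5, 5.7, 3.2, 4.6, 5.1, 4.0, 3.4]

rho, p = stats.spearmanr(power_scores, cpoe_attitudes)       # rank correlation
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```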
113

A Bayesian Network Model for Spatio-Temporal Event Surveillance

Jiang, Xia 07 January 2009 (has links)
Event surveillance involves analyzing a region in order to detect patterns that are indicative of some event of interest. An example is the monitoring of information about emergency department visits to detect a disease outbreak. Spatial event surveillance involves analyzing spatial patterns of evidence that are indicative of the event of interest. A special case of spatial event surveillance is spatial cluster detection, which searches for subregions in which the count of an event of interest is higher than expected. Temporal event surveillance involves monitoring for emerging temporal patterns. Spatio-temporal event surveillance involves joint spatial and temporal monitoring. When the events observed are of direct interest, then analyzing counts of those events is generally the preferred approach. However, in event surveillance we often only observe events that are indirectly related to the events of interest. For example, during an influenza outbreak, we may only have information about the chief complaints of patients who visited emergency departments. In this situation, a better surveillance approach may be to model the relationships among the events of interest and those observed. I developed a high-level Bayesian network architecture that represents a class of spatial event surveillance models, which I call BayesNet-S. I also developed an architecture that represents a class of temporal event surveillance models called BayesNet-T. These Bayesian network architectures are combined into a single architecture that represents a class of spatio-temporal models called BayesNet-ST. Using these architectures, it is often possible to construct a temporal, spatial, or spatio-temporal model from an existing Bayesian network event-surveillance model that is non-spatial and non-temporal. My general hypothesis is that when an existing model is extended to incorporate space and time, event surveillance will be improved. PANDA-CDCA (PC) (Cooper et al., 2007) is a non-temporal, non-spatial disease outbreak detection system. I extended PC both spatially and temporally. My specific hypothesis is that each of the spatial and temporal extensions of PC will perform outbreak detection better than does PC, and that the combined use of the spatial and temporal extensions will perform better than either extension alone. The experimental results obtained in this research support this hypothesis.
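A toy sketch of the idea behind Bayesian outbreak detection from indirectly observed events, not the BayesNet-S/T/ST architectures themselves; the prior, rates, and counts are invented for illustration.

```python
# Toy sketch: update P(outbreak) from daily counts of a chief complaint using
# Bayes' rule with Poisson likelihoods. All numbers here are illustrative.
import math

PRIOR_OUTBREAK = 0.02          # assumed prior probability of an outbreak
LAMBDA_BASE = 10.0             # expected daily respiratory complaints, no outbreak
LAMBDA_OUTBREAK = 18.0         # expected daily complaints during an outbreak

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def posterior_outbreak(daily_counts):
    """P(outbreak | observed counts), assuming counts independent given the state."""
    like_out = math.prod(poisson_pmf(k, LAMBDA_OUTBREAK) for k in daily_counts)
    like_no = math.prod(poisson_pmf(k, LAMBDA_BASE) for k in daily_counts)
    num = like_out * PRIOR_OUTBREAK
    return num / (num + like_no * (1.0 - PRIOR_OUTBREAK))

print(posterior_outbreak([11, 9, 14, 17, 19]))   # rising counts push the posterior up
```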
114

A Bayesian Rule Generation Framework for 'Omic' Biomedical Data Analysis

Lustgarten, Jonathan Llyle 14 May 2009 (has links)
High-dimensional biomedical 'omic' datasets are accumulating rapidly from studies aimed at early detection and better management of human disease. These datasets pose tremendous challenges for analysis due to their large number of variables that represent measurements of biochemical molecules, such as proteins and mRNA, from bodily fluids or tissues extracted from a rather small cohort of samples. Machine learning methods have been applied to modeling these datasets, including rule learning methods, which have been successful in generating models that are easily interpretable by scientists. Rule learning methods have typically relied on a frequentist measure of certainty within IF-THEN (propositional) rules. In this dissertation, a Bayesian Rule Generation Framework (BRGF) is developed and tested that can produce rules with probabilities, thereby enabling a mathematically rigorous representation of uncertainty in rule models. The BRGF includes a novel Bayesian discretization method combined with one or more search strategies for building constrained Bayesian networks from data and converting them into probabilistic rules. Both global and local structures are built using different Bayesian network generation algorithms, and the rule models generated from the networks are tested on public and private 'omic' datasets. We show that using a specific type of structure (Bayesian decision graphs) in tandem with a specific type of search method (parallel greedy) allows us to achieve a statistically significant improvement in overall performance over current state-of-the-art rule learning methods. Not only does the BRGF boost average performance on 'omic' biomedical data to a statistically significant degree, but it also provides the ability to incorporate prior information in a mathematically rigorous fashion for modeling purposes.
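A minimal sketch of the final step the abstract describes, converting a (here hand-specified) conditional probability table into probabilistic IF-THEN rules; the variables and probabilities are made up and are not drawn from the BRGF.

```python
# Sketch: turning a tiny conditional probability table into probabilistic
# IF-THEN rules, to illustrate the rule-with-probability idea.

# P(disease = present | gene_A, gene_B), each gene discretized to low/high.
cpt = {
    ("low", "low"): 0.05,
    ("low", "high"): 0.35,
    ("high", "low"): 0.40,
    ("high", "high"): 0.85,
}

def cpt_to_rules(cpt, parents=("gene_A", "gene_B"), target="disease=present"):
    rules = []
    for values, prob in cpt.items():
        conditions = " AND ".join(f"{p}={v}" for p, v in zip(parents, values))
        rules.append(f"IF {conditions} THEN {target} (p={prob:.2f})")
    return rules

for rule in cpt_to_rules(cpt):
    print(rule)
```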
115

Engineering an EMR System in the Developing World: Necessity is the Mother of Invention

Douglas, Gerald Paul 14 May 2009 (has links)
While Electronic Medical Record (EMR) systems continue to improve the efficacy of healthcare delivery in the West, they have yet to be widely deployed in the developing world, where more than 90% of the global disease burden exists. The benefits afforded by an EMR notwithstanding, there is some skepticism regarding the feasibility of operationalizing an EMR system in a low-resource setting. This dissertation challenges these preconceptions and advances the understanding of the problems faced when implementing EMR systems to support healthcare delivery in a developing-world setting. Our methodology relies primarily on eight years of in-field experimentation and study. To facilitate a better understanding of the needs and challenges, we created a pilot system in a large government central hospital in Malawi, Africa. Learning from the pilot, we developed and operationalized a point-of-care EMR system for managing the care and treatment of patients receiving antiretroviral therapy, which we put forth as a demonstration of feasibility in a developing-world setting. The pilot identified many unique challenges of healthcare delivery in the developing world, and reinforced the need to engineer solutions from scratch rather than blindly transplant systems developed in and for the West. Three novel technologies were developed over the course of our study, the most significant of which is the touchscreen clinical workstation appliance. Each of the novel technologies and its contribution toward successful implementation is described in the context of both an engineering and a risk management framework. A small comparative study to address data quality concerns associated with a point-of-care approach concluded that there was no significant difference in the accuracy of data collected through the use of a prototype point-of-care system compared to that of data entered retrospectively from paper records. We conclude by noting that while feasibility has been demonstrated, the greatest challenge to sustainability is the lack of financial resources to monitor and support EMR systems once they are in place.
116

SPEECH TO CHART: SPEECH RECOGNITION AND NATURAL LANGUAGE PROCESSING FOR DENTAL CHARTING

Irwin, Regina (Jeannie) Yuhaniak 11 January 2010 (has links)
Typically, when using practice management systems (PMS), dentists perform data entry by utilizing an assistant as a transcriptionist. This prevents dentists from interacting directly with the PMSs. Speech recognition interfaces can provide a solution to this problem. Existing speech interfaces of PMSs are cumbersome and poorly designed. In dentistry, there is a desire and need for a usable natural language interface for clinical data entry. Objectives. (1) Evaluate the efficiency, effectiveness, and user satisfaction of the speech interfaces of four dental PMSs; (2) develop and evaluate a speech-to-chart prototype for charting naturally spoken dental exams. Methods. We evaluated the speech interfaces of four leading PMSs. We manually reviewed the capabilities of each system and then had 18 dental students chart 18 findings via speech in each of the systems. We measured time, errors, and user satisfaction. Next, we developed and evaluated a speech-to-chart prototype which contained the following components: a speech recognizer; a post-processor for error correction; an NLP application (ONYX); and a graphical chart generator. We evaluated the accuracy of the speech recognizer and the post-processor. We then performed a summative evaluation of the entire system. Our prototype charted 12 hard tissue exams. We compared the charted exams to reference standard exams charted by two dentists. Results. Of the four systems, only two allowed both hard tissue and periodontal charting via speech. All interfaces required using specific commands directly comparable to using a mouse. The average time to chart the nine hard tissue findings was 2:48 and the nine periodontal findings was 2:06. There was an average of 7.5 errors per exam. We created a speech-to-chart prototype that supports natural dictation with no structured commands. On manually transcribed exams, the system performed with an average 80% accuracy. The average time to chart a single hard tissue finding with the prototype was 7.3 seconds. An improved discourse processor will greatly enhance the prototype's accuracy. Conclusions. The speech interfaces of existing PMSs are cumbersome, require using specific speech commands, and make several errors per exam. We successfully created a speech-to-chart prototype that charts hard tissue findings from naturally spoken dental exams.
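A sketch of the pipeline shape described above (speech recognizer, post-processor, NLP application, chart generator); every component here is a toy stand-in, not ONYX or the actual prototype, and the exam phrasing is invented.

```python
# Sketch of a speech-to-chart pipeline: recognize -> post-process -> extract -> chart.
# All components are stand-ins for illustration only.
import re

def recognize(audio_path):
    # Stand-in for a speech recognizer; assume it returns a raw transcript.
    return "tooth number three oh MOD amalgam"

def post_process(transcript):
    # Toy error correction: normalize spoken numbers and common mis-recognitions.
    corrections = {"three": "3", "oh": "has"}
    return " ".join(corrections.get(w, w) for w in transcript.split())

def extract_findings(text):
    # Toy NLP step: pull (tooth, surfaces, material) triples with a regex.
    m = re.search(r"tooth number (\d+) has (\w+) (\w+)", text)
    return [{"tooth": int(m.group(1)), "surfaces": m.group(2),
             "material": m.group(3)}] if m else []

def chart(findings):
    for f in findings:
        print(f"Tooth {f['tooth']}: {f['material']} restoration, surfaces {f['surfaces']}")

chart(extract_findings(post_process(recognize("exam_001.wav"))))
```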
117

Foundational studies for measuring the impact, prevalence, and patterns of publicly sharing biomedical research data

Piwowar, Heather Alyce 18 May 2010 (has links)
Many initiatives encourage research investigators to share their raw research datasets in hopes of increasing research efficiency and quality. Despite these investments of time and money, we do not have a firm grasp on the prevalence or patterns of data sharing and reuse. Previous survey methods for understanding data sharing patterns provide insight into investigator attitudes, but do not facilitate direct measurement of data sharing behavior or its correlates. In this study, we evaluate and use bibliometric methods to understand the impact, prevalence, and patterns with which investigators publicly share their raw gene expression microarray datasets after study publication. To begin, we analyzed the citation history of 85 clinical trials published between 1999 and 2003. Almost half of the trials had shared their microarray data publicly on the internet. Publicly available data was significantly (p=0.006) associated with a 69% increase in citations, independently of journal impact factor, date of publication, and author country of origin. Digging deeper into data sharing patterns required methods for automatically identifying data creation and data sharing. We derived a full-text query to identify studies that generated gene expression microarray data. Issuing the query in PubMed Central, Highwire Press, and Google Scholar found 56% of the data-creation studies in our gold standard, with 90% precision. Next, we established that searching the ArrayExpress and Gene Expression Omnibus databases for PubMed article identifiers retrieved 77% of associated publicly accessible datasets. We used these methods to identify 11,603 publications that created gene expression microarray data. Authors of at least 25% of these publications deposited their data in the predominant public databases. We collected a wide set of variables about these studies and derived 15 factors that describe their authorship, funding, institution, publication, and domain environments. In second-order analysis, authors with a history of sharing and reusing shared gene expression microarray data were most likely to share their data, and those studying human subjects and cancer were least likely to share. We hope these methods and results will contribute to a deeper understanding of data sharing behavior and eventually more effective data sharing initiatives.
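A minimal sketch of the retrieval evaluation described above (precision and recall of a query against a hand-curated gold standard); the article identifiers are placeholders.

```python
# Sketch: precision/recall of a retrieval query against a gold standard set of
# data-creation studies. The identifiers below are fake placeholders.
gold_standard = {"101", "102", "103", "104", "105", "106", "107", "108"}
retrieved     = {"101", "103", "104", "105", "109"}          # hits returned by the query

true_positives = gold_standard & retrieved
precision = len(true_positives) / len(retrieved)
recall = len(true_positives) / len(gold_standard)
print(f"precision={precision:.2f} recall={recall:.2f}")
```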
118

DETECTING ADVERSE DRUG REACTIONS IN THE NURSING HOME SETTING USING A CLINICAL EVENT MONITOR

Handler, Steven Mark 18 May 2010 (has links)
Adverse drug reactions (ADRs) are the most clinically significant and costly medication-related problems in nursing homes (NH), and are associated with an estimated 93,000 deaths a year and as much as $4 billion of excess healthcare expenditures. Current ADR detection and management strategies that rely on pharmacist retrospective chart reviews (i.e., usual care) are inadequate. Active medication monitoring systems, such as clinical event monitors, are recommended by many safety organizations as an alternative to detect and manage ADRs. These systems have been shown to be less expensive, faster, and able to identify ADRs not normally detected by clinicians in the hospital setting. The main research goal of this dissertation is to review the rationale for the development and subsequent evaluation of an active medication monitoring system to automate the detection of ADRs in the NH setting. This dissertation includes three parts, each with its own emphasis and methodology, centered on the main topic of better understanding how to detect ADRs in the NH setting. The first paper describes a systematic review of pharmacy and laboratory signals used by clinical event monitors to detect ADRs in hospitalized adult patients. The second paper describes the development of a consensus list of agreed-upon laboratory, pharmacy, and Minimum Data Set signals that can be used by a clinical event monitor to detect potential ADRs. The third paper describes the implementation and pharmacist evaluation of a clinical event monitor using the signals developed by consensus. The findings in these papers will help us to better understand, design, and evaluate active medication monitoring systems to automate the detection of ADRs in the NH setting. Future research is needed to determine whether NH patients managed by physicians who receive active medication monitoring alerts have more ADRs detected and a faster ADR management response time, and whether such monitoring yields greater cost savings from a societal perspective, compared with usual care.
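A minimal sketch of how a rule-based clinical event monitor can pair pharmacy and laboratory signals to flag possible ADRs; the drug and laboratory pairs and thresholds below are illustrative assumptions, not the consensus signal list developed in the dissertation.

```python
# Sketch: fire an alert when a pharmacy order and a laboratory result together
# suggest a possible ADR. Signal pairs and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class Patient:
    meds: set
    labs: dict

# (drug, lab test, threshold, direction) signal pairs, chosen for illustration.
SIGNALS = [
    ("warfarin", "INR", 4.5, "above"),
    ("digoxin", "potassium", 3.0, "below"),
    ("lisinopril", "creatinine", 2.0, "above"),
]

def check_signals(patient):
    alerts = []
    for drug, lab, threshold, direction in SIGNALS:
        if drug in patient.meds and lab in patient.labs:
            value = patient.labs[lab]
            if (direction == "above" and value > threshold) or \
               (direction == "below" and value < threshold):
                alerts.append(f"Possible ADR: {drug} with {lab}={value}")
    return alerts

print(check_signals(Patient(meds={"warfarin"}, labs={"INR": 5.2})))
```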
119

Effect of a metacognitive intervention on cognitive heuristic use during diagnostic reasoning

Payne, Velma Lucille 16 May 2011 (has links)
Medical judgment and decision-making frequently occur under conditions of uncertainty. In order to reduce the complexity of diagnosis, physicians often rely on cognitive heuristics. Use of heuristics during clinical reasoning can be effective; however, when used inappropriately, the result can be flawed reasoning, medical errors, and patient harm. Many researchers have attempted to debias individuals from inappropriate heuristic use by designing interventions based on normative theories of decision-making. There have been few attempts to debias individuals using interventions based on descriptive decision-making theories. Objectives: (1) Assess use of Anchoring and Adjustment and Confirmation Bias during diagnostic reasoning; (2) investigate the impact of heuristic use on diagnostic accuracy; (3) determine the impact of a metacognitive intervention based on the Mental Model Theory designed to reduce biased judgment by inducing physicians to 'think about how they think'; and (4) test a novel technique using eye-tracking to determine heuristic use and diagnostic accuracy within mode of thinking as defined by the Dual Process Theory. Methods: Medical students and residents assessed clinical scenarios using a computer system, specified a diagnosis, and designated the data used to arrive at the diagnosis. During case analysis, subjects either verbalized their thoughts or wore eye-tracking equipment to capture eye movements and pupil size as they diagnosed cases. Diagnostic data specified by the subjects were used to measure heuristic use and assess the impact of heuristic use on diagnostic accuracy. Eye-tracking data were used to determine the frequency of heuristic use (Confirmation Bias only) and mode of thinking. Statistical models were used to determine the effect of the metacognitive intervention. Results: Use of cognitive heuristics during diagnostic reasoning was common for this subject population. Logistic regression showed case difficulty to be an important factor contributing to diagnostic error. The metacognitive intervention had no effect on heuristic use or diagnostic accuracy. Eye-tracking data reveal that this subject population infrequently assesses cases in the Intuitive mode of thinking, spends more time in the Analytical mode of thinking, and switches between the two modes frequently while reasoning through a case to arrive at a diagnosis.
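A toy sketch of the kind of logistic regression analysis mentioned above, predicting diagnostic error from case difficulty and observed heuristic use; the data are invented purely to make the example runnable.

```python
# Toy sketch: logistic regression of diagnostic error on case difficulty and
# whether confirmation bias was observed. All data below is fabricated.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: case difficulty (1 = easy .. 3 = hard), confirmation bias observed (0/1)
X = np.array([[1, 0], [1, 1], [2, 0], [2, 1], [3, 0], [3, 1], [3, 1], [1, 0]])
y = np.array([0, 0, 0, 1, 1, 1, 1, 0])        # 1 = incorrect diagnosis

model = LogisticRegression().fit(X, y)
print("coefficients:", model.coef_)            # sign/size of each predictor's effect
print("P(error | hard case, bias):", model.predict_proba([[3, 1]])[0, 1])
```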
120

Active and self-learning in a biological screening task

Peter, Jan-Thorsten January 2009 (has links)
The focus of this master's thesis is to analyze the effectiveness of self-learning and active learning for biological virtual screening. Virtual screening is used in the pharmaceutical industry to increase the effectiveness of biological screening. Self-learning is a technique where parts of the classified test data are reused for training the classifier again. Active learning gives the classifier the possibility to select certain parts of the test data to be used as additional training data. The experiments in the thesis show that both methods can be used to improve the precision of the virtual screening process, but active learning is more effective due to the additional information that is provided.
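A sketch contrasting the two loops the thesis compares, self-learning (pseudo-labeling the classifier's most confident predictions) and active learning (querying labels for the most uncertain items), using a generic classifier; this illustrates the ideas only and is not the thesis's experimental setup.

```python
# Sketch: self-learning vs. uncertainty-based active learning with a generic
# classifier and fabricated screening data.
import numpy as np
from sklearn.linear_model import LogisticRegression

def self_learning(clf, X_train, y_train, X_pool, threshold=0.95, rounds=3):
    """Add pool points the classifier is most confident about, with their pseudo-labels."""
    for _ in range(rounds):
        clf.fit(X_train, y_train)
        proba = clf.predict_proba(X_pool)
        confident = proba.max(axis=1) >= threshold
        if not confident.any():
            break
        X_train = np.vstack([X_train, X_pool[confident]])
        y_train = np.concatenate([y_train, clf.classes_[proba[confident].argmax(axis=1)]])
        X_pool = X_pool[~confident]
    return clf.fit(X_train, y_train)

def active_learning(clf, X_train, y_train, X_pool, y_pool, queries=5):
    """Query true labels for the pool points the classifier is least certain about."""
    for _ in range(queries):
        clf.fit(X_train, y_train)
        uncertainty = 1.0 - clf.predict_proba(X_pool).max(axis=1)
        i = int(uncertainty.argmax())                  # most uncertain compound
        X_train = np.vstack([X_train, X_pool[i:i + 1]])
        y_train = np.concatenate([y_train, y_pool[i:i + 1]])
        X_pool, y_pool = np.delete(X_pool, i, axis=0), np.delete(y_pool, i)
    return clf.fit(X_train, y_train)

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 4))                  # fake screening descriptors
y = np.tile([0, 1], 30)                       # fake active/inactive labels
clf1 = self_learning(LogisticRegression(), X[:10], y[:10], X[10:].copy())
clf2 = active_learning(LogisticRegression(), X[:10], y[:10], X[10:].copy(), y[10:].copy())
```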
