  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
541

Development of Graphical User Interfaces (GUI) software and database for radiation therapy applications

Adhikary, Kalyan. January 2005 (has links)
Thesis (M.S.)--Medical University of Ohio, 2005. / "In partial fulfillment of the requirements for the degree of Master of Science in Biomedical Sciences." Major advisor: E. Ishmael Parsai. Includes abstract. Document formatted into pages: 144 p. Title from title page of PDF document. Title at ETD Web site: Development of Graphical User Interfaces (GUI) software and databases for radiation therapy applications. Bibliography: page 65.
542

The temporal organisation of documents and versions : a user-centred investigation : a thesis submitted in partial fulfilment of the requirements for the degree of Doctor of Philosophy in the University of Canterbury /

JasonSmith, Michael. January 2006 (has links)
Thesis (Ph. D.)--University of Canterbury, 2006. / Typescript (photocopy). Includes bibliographical references (p. 166-184). Also available via the World Wide Web.
543

Asynchronous Backup and Initialization of a Database Server for Replicated Database Systems

Bhalla, Subhash, Madnick, Stuart E. 14 April 2003 (has links)
The possibility of a temporary disconnection of database service exists in many computing environments. It is a common need to permit a participating site to lag behind and later re-initialize to full recovery. It is also necessary that active transactions view a globally consistent system state for ongoing operations. We present an algorithm for on-the-fly backup and site initialization. The technique is non-blocking in the sense that failure and recovery procedures do not interfere with ordinary transactions. As a result, the system can tolerate disconnection of services and reconnection of disconnected services without incurring high overheads.
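The non-blocking catch-up idea described in this abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' algorithm: it assumes a shared append-only commit log, and all class and variable names are invented.

```python
# Minimal sketch: a recovering replica catches up from an append-only log
# of committed updates without blocking new transactions.
class Replica:
    def __init__(self):
        self.state = {}        # key -> value
        self.applied_upto = 0  # index into the global commit log

    def apply(self, log):
        """Apply any committed updates this replica has not yet seen."""
        for key, value in log[self.applied_upto:]:
            self.state[key] = value
        self.applied_upto = len(log)

commit_log = []   # shared, append-only log of committed writes
primary = Replica()
lagging = Replica()

# The primary processes transactions while the lagging site is disconnected.
for tx in [("x", 1), ("y", 2), ("x", 3)]:
    commit_log.append(tx)
    primary.apply(commit_log)

# On reconnection, the lagging site re-initializes on the fly: it replays the
# missed suffix of the log while the primary keeps accepting new writes.
commit_log.append(("z", 9))       # ongoing transaction during recovery
primary.apply(commit_log)
lagging.apply(commit_log)

assert lagging.state == primary.state == {"x": 3, "y": 2, "z": 9}
```

The point of the sketch is that recovery touches only the log suffix the lagging site missed, so ordinary transactions never wait on the recovering site.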
544

Design of a medical device database

Κατσαμπάνη, Ολυμπία 15 June 2010 (has links)
- / -
545

Development of an application for recording medical examination results compliant with the HL7 standard

Μπενέτου, Ξανθή 26 August 2010 (has links)
- / -
546

User redefinition of search goals through interaction with an information retrieval system

Hider, Philip Martin January 2004 (has links)
Search goals of users of information retrieval systems have commonly been assumed to be static and well-formed. However, a significant amount of goal redefinition is detected in the studies described. A pilot study examined user behaviour at a library OPAC, showing that search results would quite frequently induce users to reconsider and revise their search goals, sometimes following up with a new search based on this revision (labeled "strong" goal redefinition). The main analysis employed transaction logs from the OCLC FirstSearch service, investigating what factors, if any, might affect the amount of goal redefinition that takes place during a search session. To this end, ten hypotheses were proposed and considered. Within each search session, logged queries were coded according to their conceptual differences or similarities in order to construct indices of strong goal redefinition: a chronological content analysis was thus performed on the transaction logs. The indices of redefinition for search sessions on different FirstSearch databases were compared. It was found that different databases induced goal redefinition to different extents. Further analysis showed that the metadata displayed by a database appeared to affect the amount of goal redefinition, and that the presence of abstracts in results was a positive factor, as was the presence of descriptors and identifiers, perhaps because of the former's hyperlinked presentation on the FirstSearch interface. On the other hand, no evidence was found to indicate that abstract length, hit rate, or levels of precision and recall had much of an effect on goal redefinition. Of the two indices of redefinition that were produced, the "refined" index showed signs of greater precision. Implications of the findings are discussed. It is suggested that goal redefinition should be considered a positive result of system feedback, and that systems should readily allow users to follow up on redefined goals.
Abstracts and summaries of documents should be presented to the user as frequently as possible, and hyperlinks from key terms in the metadata should also be created to assist evolving searches. The importance of how system feedback is encountered by the user is emphasized in a new model of information retrieval, which embraces the nonconscious as part of the "cognitive viewpoint," allowing for nonconscious information wants to enter into a user's consciousness through cues encountered during the scanning of search results, triggering a redefinition of search goal. This thesis paves the way for a considerable amount of potentially important research, including: further testing and development of the index of goal redefinition; deeper transaction log analyses, perhaps using screen recorders, examining semantic content and contextualizing at the level of the query; and further identification and analysis of the factors affecting goal redefinition, across different types of information retrieval system.
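One plausible form for an index of strong goal redefinition, as described above, is the share of consecutive coded queries in a session whose concept codes differ. This is a hypothetical sketch, not the thesis's actual index; the concept codes are invented.

```python
# Hypothetical goal-redefinition index over a coded search session: each
# logged query carries a concept code, and the index is the fraction of
# consecutive query pairs whose codes differ (a "strong" conceptual shift).
def redefinition_index(concept_codes):
    if len(concept_codes) < 2:
        return 0.0  # a single query cannot exhibit redefinition
    shifts = sum(1 for a, b in zip(concept_codes, concept_codes[1:]) if a != b)
    return shifts / (len(concept_codes) - 1)

# Example session: two conceptual shifts across three transitions.
session = ["dog breeds", "dog breeds", "dog training", "veterinary careers"]
index = redefinition_index(session)
assert abs(index - 2 / 3) < 1e-9
```

A session whose queries all share one concept code scores 0.0, while a session that changes concept at every step scores 1.0.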
547

An Investigation of the Cost and Accuracy Tradeoffs of Supplanting AFDs with Bayes Network in Query Processing in the Presence of Incompleteness in Autonomous Databases

January 2011 (has links)
abstract: As the information available to lay users through autonomous data sources continues to increase, mediators become important to ensure that the wealth of information available is tapped effectively. A key challenge these information mediators must handle is the varying level of incompleteness in the underlying databases, in terms of missing attribute values. Existing approaches such as Query Processing over Incomplete Autonomous Databases (QPIAD) aim to mine and use Approximate Functional Dependencies (AFDs) to predict and retrieve relevant incomplete tuples. These approaches make independence assumptions about missing values, which critically hobbles their performance when tuples contain missing values for multiple correlated attributes. In this thesis, I present a principled probabilistic alternative that views an incomplete tuple as defining a distribution over the complete tuples it stands for. I learn this distribution in terms of Bayes networks. My approach involves mining/learning Bayes networks from a sample of the database, and using them to do both imputation (predicting a missing value) and query rewriting (retrieving relevant results with incompleteness on the query-constrained attributes, when the data sources are autonomous). I present empirical studies to demonstrate that (i) at higher levels of incompleteness, when multiple attribute values are missing, Bayes networks provide significantly higher classification accuracy, and (ii) the relevant possible answers retrieved by queries reformulated using Bayes networks provide higher precision and recall than AFDs while keeping query processing costs manageable. / Dissertation/Thesis / M.S. Computer Science 2011
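The core imputation idea, predicting a missing attribute from correlated observed ones rather than assuming independence, can be illustrated in miniature. This is not QPIAD's or the thesis's code; the sample tuples and attribute names are invented, and a single conditional frequency table stands in for a full Bayes network.

```python
from collections import Counter, defaultdict

# Sketch: impute a missing attribute value from a correlated observed one,
# using conditional frequencies learned from a sample of the database.
sample = [("sedan", "gasoline"), ("sedan", "gasoline"),
          ("truck", "diesel"), ("truck", "diesel"), ("sedan", "hybrid")]

# cond[body][fuel] counts co-occurrences: P(fuel | body) up to normalization.
cond = defaultdict(Counter)
for body, fuel in sample:
    cond[body][fuel] += 1

def impute_fuel(body):
    """Return the most probable fuel type given the observed body type."""
    return cond[body].most_common(1)[0][0]

assert impute_fuel("truck") == "diesel"
assert impute_fuel("sedan") == "gasoline"
```

A real Bayes network generalizes this to many attributes at once, which is what lets it handle tuples with several correlated missing values.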
548

DNSSEC -- authenticated denial of existence : understanding zone enumeration

Vasant, Sachin 22 January 2016 (has links)
Over the years, DNS has proved to be an integral part of the internet infrastructure. For our purposes, DNS is simply a large-scale distributed database that maps human-readable domain names to network-recognizable IP addresses. Unfortunately, authenticity of responses was not integral to the initial DNS design. This led to the possibility of very practical forgery of responses, as demonstrated by Kaminsky's cache poisoning attacks. DNSSEC is primarily designed as a security extension of DNS that guarantees the authenticity of DNS responses. To answer invalid queries in an authenticated manner, DNSSEC initially employed NSEC records. To its credit, NSEC allows nameservers to precompute signatures for such negative responses offline. As a result, NSEC is highly scalable while preserving the authenticity/correctness of responses. But in doing so, NSEC leaks the domain names in a nameserver's zone. This is called zone enumeration. To counter zone enumeration, NSEC3 was deployed. It is a hashed authenticated denial-of-existence mechanism, i.e., it reveals only the hashes of the names in a zone. NSEC3 still allows offline signatures and is scalable like NSEC. Unfortunately, hashes are vulnerable to dictionary attacks, a property exploited by conventional NSEC3 zone enumeration tools, e.g., nsec3walker. This leads us to investigate the possibility of constructing an authenticated denial-of-existence mechanism that still allows offline cryptography. To do so, we first define the security goals of a "secure" DNSSEC mechanism in terms of an Authenticated Database System (ADS), with additional privacy goals that we define. Any protocol that achieves these goals maintains the integrity of DNSSEC responses and prevents zone enumeration. We then show that any protocol that achieves such security goals can be used to construct weak signatures that prevent selective forgeries.
This construction, though a strong indication, does not by itself confirm the impossibility of generating proofs offline. To confirm that such proofs are not possible offline, we show zone-enumeration attacks on two large classes of proofs. The provers/responders in this case either repeat proofs non-negligibly often or select proofs as subsets from a pre-computed set of proof elements. The attackers we present use a dictionary of all elements that are likely to occur in the database/zone. The attackers prune this dictionary to obtain the set of all elements in the database (along with a few additional elements that are erroneously classified as being in the database). These attackers minimize the number of queries made to such responders and are loosely based on the paradigm of Probably Approximately Correct learning as introduced by Valiant.
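The dictionary attack on hashed denial of existence that motivates this work can be sketched in a few lines. This toy uses plain SHA-1 over the textual name with no salt or iterations; real NSEC3 hashes the DNS wire-format name with a per-zone salt and iteration count (RFC 5155), and the example zone and dictionary are invented.

```python
import hashlib

# Toy model of NSEC3-style hashing (real NSEC3 uses the wire-format name,
# a per-zone salt, and an iteration count; this sketch omits all three).
def toy_name_hash(name, salt=b"", iterations=0):
    h = hashlib.sha1(name.encode() + salt).digest()
    for _ in range(iterations):
        h = hashlib.sha1(h + salt).digest()
    return h.hex()

# What a walker collects from negative responses: hashes of the zone's names.
zone = ["www.example.com", "mail.example.com"]
leaked_hashes = {toy_name_hash(n) for n in zone}

# The attacker hashes a dictionary of likely names and keeps the matches.
dictionary = ["www.example.com", "ftp.example.com", "mail.example.com"]
recovered = [n for n in dictionary if toy_name_hash(n) in leaked_hashes]
assert recovered == ["www.example.com", "mail.example.com"]
```

Because the hash function is public and deterministic, any name in the attacker's dictionary that actually exists in the zone is recovered offline, which is exactly the zone-enumeration property the thesis sets out to rule out.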
549

Predictive Maintenance Framework for a Vehicular IoT Gateway Node Using Active Database Rules

Butylin, Sergei 13 December 2018 (has links)
This thesis describes a proposed design and implementation of a predictive maintenance engine developed to fulfill the requirements of the STO Company (Societe de transport de l'Outaouais) for maintaining vehicles in its fleet. Predictive maintenance is proven to be an effective approach and has become an industry standard in many fields. However, in the transportation industry it is still in the stages of development, due to the complexity of moving systems and the high dimensionality of the parameters involved. Because it is almost impossible to cover all use cases of the vehicle operational process using one particular approach to predictive maintenance, in our work we take a systematic approach to designing a predictive maintenance system in several steps. Each step is implemented at the corresponding development stage based on the data accumulated during the system's functioning cycle. This thesis delves into the process of designing the general infrastructural model of the fleet management system (FMS), while focusing on the edge gateway module located on the vehicle and its function of detecting maintenance events based on current vehicle status. Several approaches may be used to detect maintenance events, such as a machine learning approach or an expert system-based approach. While the final version of the fleet management system will use a hybrid approach, in this thesis we chose to focus on the second, expert-knowledge-based option; machine learning has been left for future implementation, since it requires extensive training data to be gathered prior to experimentation and operational use. Inspired by the IDEA methodology, which promotes mapping business rules to software classes and using the object-relational model for mapping objects to database entities, we take active database features as a base for developing a rule engine implementation.
However, in contrast to the IDEA methodology, which seeks to describe the specific system and its sub-modules and then build active rules based on the interactions between sub-systems, we are not aware of the functional structure of the vehicle due to its complexity. Instead, we develop a framework for creating specific active rules based on abstract classifications structured as ECA (event-condition-action) rules, with some extensions made for the specifics of vehicle maintenance. The thesis describes an attempt to implement such a framework, and particularly the rule engine module, using active database features, making it possible to encapsulate the active behaviour inside the database and decouple event detection from other functionalities. We provide the system with a set of example rules and then conduct a series of experiments analyzing the system for performance and correctness of event detection.
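The ECA abstraction underlying the framework can be sketched as follows. This is an illustrative toy, not the thesis's rule engine (which lives inside the database as active rules); the event name, threshold, and action are invented.

```python
# Toy ECA (event-condition-action) rule: on a matching event, evaluate the
# condition against the reading and, if it holds, execute the action.
class ECARule:
    def __init__(self, event, condition, action):
        self.event, self.condition, self.action = event, condition, action

    def fire(self, event, reading):
        if event == self.event and self.condition(reading):
            self.action(reading)

maintenance_events = []
overheat = ECARule(
    event="engine_temp_update",
    condition=lambda temp: temp > 110,   # hypothetical threshold, deg C
    action=lambda temp: maintenance_events.append(
        f"maintenance: coolant check ({temp} C)"),
)

overheat.fire("engine_temp_update", 95)    # condition false: nothing happens
overheat.fire("engine_temp_update", 118)   # condition true: event recorded
assert maintenance_events == ["maintenance: coolant check (118 C)"]
```

In the active-database setting the same pattern is expressed as triggers, so event detection runs where the telemetry rows are written rather than in application code.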
550

The Health System Costs of Potentially Inappropriate Prescribing in Ontario: A Population-based Study

Black, Cody 12 November 2018 (has links)
Potentially Inappropriate Prescribing (PIP) is common in Canada, yet little is known about its health system costs or which PIPs have the greatest cost impact. This thesis examined the health system costs of hospitalizations, emergency department (ED) visits, and medications arising from all PIP, and from distinct PIP criteria. PIPs were identified in a cohort of older adults in Ontario using a subset of the STOPP/START criteria applicable to health administrative databases, and all analyses compared participants with and without PIP. In study one, the costs of hospitalizations, ED visits, and newly prescribed medications were estimated using population attributable fractions. PIP was found to be responsible for a sizeable portion of all three cost categories, with hospitalization and ED visit costs most affected. Study two compared the incremental costs due to PIP among four distinct PIP criteria, selected for their differing frequencies and crude costs, to validate the use of such characteristics for priority-setting. Crude healthcare costs, the cost of the drug causing the PIP, and the frequency of the PIP were identified as likely key characteristics of high-impact PIP. Combined, these studies provide evidence on the overall burden of PIP while also identifying likely characteristics of high-impact PIP. They suggest that interventions at the health system level may be needed to address medication appropriateness, and they provide information that may help decision-makers identify which PIPs should be targeted for intervention, given that no health-system-level interventions for PIP are currently in place.
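The population attributable fraction used in study one has a standard form (Levin's formula). The sketch below shows that calculation with invented prevalence, relative-risk, and cost figures; it is not the thesis's analysis.

```python
# Population attributable fraction (Levin's formula): the fraction of an
# outcome's burden attributable to an exposure, given the exposure's
# prevalence p and the relative risk RR of the outcome under exposure.
def paf(prevalence, relative_risk):
    excess = prevalence * (relative_risk - 1)
    return excess / (1 + excess)

# Invented example: 40% exposure prevalence, RR of 1.5 for hospitalization.
fraction = paf(prevalence=0.4, relative_risk=1.5)
assert abs(fraction - 0.2 / 1.2) < 1e-9   # = 1/6 of the burden

# Applying the fraction to a total cost apportions the attributable share.
attributable_cost = fraction * 1_000_000  # hypothetical total cost
```

Multiplying the fraction by a total cost category (hospitalizations, ED visits, medications) yields the attributable cost for that category, which is how a PAF-based costing study apportions burden to the exposure.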
