321

A Publish-Subscribe System for Data Replication and Synchronization Among Integrated Person-Centric Information Systems

Qiu, Xiangbin 01 May 2010 (has links)
Synchronization of data across an integrated system of heterogeneous databases is a difficult but important task, especially in the context of integrating health care information throughout a region, state, or nation. This thesis describes the design and implementation of a data replication and synchronization tool, called the Sync Engine, which allows users to define custom data-sharing patterns and transformations for an integrated system of heterogeneous person-centric databases. This thesis also discusses the relationship between the Sync Engine's contributions and several relevant issues in the area of data integration and replication. The Sync Engine's design and implementation were validated by adapting it to CHARM, a real-world integrated system currently in use at the Utah Department of Health.
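The publish-subscribe pattern such a tool relies on can be sketched in a few lines. This is a minimal illustration with hypothetical class, topic, and database names, not the Sync Engine's actual API:

    from collections import defaultdict

    def deliver(target_db, record):
        # Stand-in for writing the transformed record to a target database.
        print(f"sync -> {target_db}: {record}")

    class PubSubSync:
        """Minimal publish-subscribe sketch: each subscriber registers a
        transformation that maps a published record into its own schema."""

        def __init__(self):
            self.subscribers = defaultdict(list)  # topic -> [(name, transform)]

        def subscribe(self, topic, name, transform):
            self.subscribers[topic].append((name, transform))

        def publish(self, topic, record):
            # Deliver the change to every subscriber, applying its custom
            # data-sharing transformation before hand-off.
            for name, transform in self.subscribers[topic]:
                deliver(name, transform(record))

    engine = PubSubSync()
    engine.subscribe("person.updated", "immunization_db",
                     lambda r: {"pid": r["id"], "name": r["name"].upper()})
    engine.publish("person.updated", {"id": 42, "name": "Jane Doe"})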
322

The Design and Development of a Comprehensive 49 Locus Y-STR Database for Major U.S. Populations

Berdos, Paulina Niki 01 January 2004 (has links)
The establishment of a U.S. National Y-STR reference database from a variety of geographically and ethnically diverse populations is essential to facilitate the generation of reliable estimates of Y-STR haplotype frequencies. Such multi-locus haplotype frequencies are required to provide a statistical estimate of the significance of a match. Y-STR loci, unlike traditional STR markers, are not independent of one another and are co-inherited as extended haplotypes of linked markers. The estimation of the frequency of occurrence of a particular haplotype therefore necessitates the use of a counting method, which means that the significance of many matches depends on the size of the database, in terms of both the number of samples and the number of included loci. A U.S. Y-STR Haplotype Reference Database has been created by the International Forensic Y-User Group and is maintained by the Max Planck Institute for Evolutionary Anthropology, Leipzig, Germany. However, this database has been limited to a set of 9 core Y-STRs, limiting its operational usefulness, particularly in light of the development of novel Y-STR multiplexes consisting of additional loci. A key component of our developmental strategy is to allow for the continuous updating of haplotype data using the same samples. This ensures that as new markers are developed, the same samples can be re-typed and a new extended haplotype developed, thus accommodating any laboratory needing haplotype data for any combination of Y-STR markers. The aid of geographically diverse crime laboratories was enlisted to obtain the necessary samples. In exchange for the samples, the crime laboratories benefit by obtaining a custom-built, no-cost local Y-STR database. Results on the development of a 49-locus Y-STR National Reference Database will be described, and information on the future establishment of web-based accessibility for the forensic community will also be provided.
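The counting method mentioned above is simple to state: the frequency of a haplotype is its observed count divided by the number of profiles in the database, so both more samples and more typed loci sharpen the estimate. A minimal sketch, using made-up three-locus haplotypes rather than real 49-locus data:

    from collections import Counter

    # Each haplotype is a tuple of allele calls across the typed Y-STR loci.
    database = [
        ("13", "29", "24"),   # hypothetical 3-locus haplotypes for illustration;
        ("13", "29", "24"),   # a real reference database would carry 49 loci
        ("14", "30", "25"),
        ("13", "30", "24"),
    ]

    counts = Counter(database)

    def haplotype_frequency(haplotype):
        # Counting method: observed occurrences / database size.
        return counts[haplotype] / len(database)

    print(haplotype_frequency(("13", "29", "24")))  # 0.5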
323

Project Information System - A Database Approach

Cheung, Paul Nai-Kwong 09 1900 (has links)
In many organizations, the working activities of individuals are accounted for on the basis of projects to which they are assigned. Usually all such information and the descriptions of the projects are recorded manually during the advancement of each project. This manual procedure is inefficient for maintaining the information needed for ongoing project control and for analyzing the allocation of personnel resources; hence there is a need for a computerized system. Such a system has been designed and partly implemented by employing the CDC DMS-170 Database Management System, using COBOL as the host language. / Thesis / Master of Science (MSc)
324

A relational algebra database system for a microcomputer /

Chiu, George Kwok-Wing. January 1982 (has links)
No description available.
325

An analysis of the Keetch-Byram Drought Index as a predictor of forest fire potential

Morris, John Andrew 11 August 2007 (has links)
Forestry management relies on combating the challenges posed by fire, and knowledge of fire risk helps in planning for and reacting to forest fires. One method used is the mapping of the Keetch-Byram Drought Index (KBDI). Though it was not specifically designed to be a fire risk assessment tool, it is used to highlight areas with a high potential for fire due to climatic factors. A statistical analysis is performed to determine the relevance of this drought index as a measure of fire potential. The analysis tests the relationship of KBDI to the number of fires and the area burned in the Mississippi Forestry Commission's southeast fire district. These analyses showed a weak relationship between total fires and KBDI; in many cases there was no relationship. This calls into question the use of KBDI to determine fire potential.
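For context, the KBDI tracks cumulative soil-moisture deficiency on a 0-800 scale (100 points per inch of deficiency) from daily maximum temperature, rainfall, and mean annual precipitation. A sketch of the daily update in the commonly published 1968 form follows; the constants and English units (°F, inches) are from that formulation, and operational implementations add rainfall carry-over rules omitted here:

    import math

    def kbdi_update(q, t_max_f, net_rain_in, annual_rain_in):
        """One daily Keetch-Byram Drought Index step
        (0 = saturated soil, 800 = maximum drought)."""
        # Rainfall beyond a 0.20-inch threshold (already netted out here)
        # reduces the index by 100 points per inch of net rain.
        q = max(0.0, q - 100.0 * net_rain_in)
        # Daily drought factor driven by max temperature and annual rainfall.
        dq = ((800.0 - q) * (0.968 * math.exp(0.0486 * t_max_f) - 8.30)
              / (1.0 + 10.88 * math.exp(-0.0441 * annual_rain_in))) * 1e-3
        return min(800.0, q + max(0.0, dq))

    # A hot, dry Mississippi summer day starting from a mid-range index.
    print(kbdi_update(350.0, t_max_f=95.0, net_rain_in=0.0, annual_rain_in=55.0))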
326

An Investigation of 3-D Anthropometric Shape Descriptors for Database Mining

Robinette, Kathleen M. January 2003 (has links)
No description available.
327

An analysis of the learning curve to achieve competency at colonoscopy using the JETS database

Ward, S.T., Mohammed, Mohammed A., Walt, R., Valori, R., Ismail, T., Dunckley, P. 27 January 2014 (has links)
No / Objective: The number of colonoscopies required to reach competency is not well established. The primary aim of this study was to determine the number of colonoscopies trainees need to perform to attain competency, defined by a caecal intubation rate (CIR) ≥90%. As competency depends on completion, we also investigated trainee factors that were associated with colonoscopy completion.
Design: The Joint Advisory Group on GI Endoscopy in the UK has developed a trainee e-portfolio from which colonoscopy data were retrieved. Inclusion criteria were all trainees who had performed a total of ≥20 colonoscopies and had performed ≤50 colonoscopies prior to submission of data to the e-portfolio. The primary outcome measure was colonoscopy completion. The number of colonoscopies required to achieve CIR ≥90% was calculated by the moving average method and learning curve cumulative summation (LC-Cusum) analysis. To identify factors associated with colonoscopy completion, a mixed effect logistic regression model was developed which allowed for nesting of patients within trainees and nesting of patients within hospitals, with various patient, trainee and training factors entered as fixed effects.
Results: 297 trainees undertook 36 730 colonoscopies. By moving average analysis, the cohort of trainees reached a CIR of 90% at 233 procedures. By LC-Cusum analysis, 41% of trainees were competent after 200 procedures. Of the trainee factors, the number of colonoscopies, intensity of training and previous flexible sigmoidoscopy experience were significant factors associated with colonoscopy completion.
Conclusions: This is the largest study to date investigating the number of procedures required to achieve competency in colonoscopy. The current training certification benchmark in the UK of 200 procedures does not appear to be an inappropriate minimum requirement. The LC-Cusum chart provides real-time feedback on individual learning curves for trainees. The association of training intensity and flexible sigmoidoscopy experience with colonoscopy completion could be exploited in training programmes.
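The LC-Cusum idea can be sketched briefly: each procedure adds a log-likelihood-ratio score, and crossing a threshold signals competence. The failure rates and threshold below are illustrative, not the study's actual parameters:

    import math

    def lc_cusum(outcomes, p0=0.20, p1=0.10, h=2.0):
        """Learning-curve CUSUM sketch. outcomes: 1 = incomplete colonoscopy,
        0 = completed. p0/p1 are illustrative unacceptable and acceptable
        failure rates, h an illustrative decision threshold."""
        s, scores = 0.0, []
        for x in outcomes:
            # Log-likelihood ratio: successes push the score up, failures down.
            w = x * math.log(p1 / p0) + (1 - x) * math.log((1 - p1) / (1 - p0))
            s = max(0.0, s + w)
            scores.append(s)
            if s >= h:  # evidence the trainee meets the target completion rate
                return len(scores), scores
        return None, scores  # competence not yet demonstrated

    n, _ = lc_cusum([0] * 30 + [1] + [0] * 40)
    print(f"competence signalled after procedure {n}")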
328

Comparing database management systems with SQLAlchemy : A quantitative study on database management systems

Fredstam, Marcus, Johansson, Gabriel January 2019 (has links)
Which database management system to use for a project is difficult to know in advance. Luckily, there are tools that help the developer apply the same database design to multiple database management systems without having to change the code. In this thesis, we investigate the strengths of SQLAlchemy, an SQL toolkit for Python. We compared SQLite, PostgreSQL and MySQL using SQLAlchemy, and also compared a pure MySQL implementation against the results from SQLAlchemy. We conclude that, for our database design, PostgreSQL was the best database management system and that, for the average SQL user, SQLAlchemy is an excellent substitute for writing regular SQL.
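The portability SQLAlchemy provides comes from its engine URL: the same declarative model runs unchanged against different backends. A minimal sketch; the table and column names are ours, not the thesis's schema:

    from sqlalchemy import Column, Integer, String, create_engine
    from sqlalchemy.orm import Session, declarative_base

    Base = declarative_base()

    class Flight(Base):
        __tablename__ = "flights"
        id = Column(Integer, primary_key=True)
        origin = Column(String(3))
        destination = Column(String(3))

    # Only the connection URL changes per backend; the model code does not.
    for url in ("sqlite:///:memory:",
                # "postgresql+psycopg2://user:pass@localhost/flights",
                # "mysql+pymysql://user:pass@localhost/flights",
                ):
        engine = create_engine(url)
        Base.metadata.create_all(engine)
        with Session(engine) as session:
            session.add(Flight(origin="ARN", destination="LHR"))
            session.commit()
            print(url, session.query(Flight).count())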
329

Medical Data Management on the cloud / Gestion de données médicales sur le cloud

Mohamad, Baraa 23 June 2015 (has links)
French abstract unavailable / Medical data management has become a real challenge due to the emergence of new imaging technologies providing high image resolutions. This thesis focuses in particular on the management of DICOM files. DICOM is one of the most important medical standards. DICOM files have a special data format in which one file may contain regular data, multimedia data and services. These files are extremely heterogeneous (the schema of a file cannot be predicted) and have large data sizes. The characteristics of DICOM files, together with the requirements of medical data management in general in terms of availability and accessibility, have led us to formulate our research question as follows: is it possible to build a system that (1) is highly available, (2) supports any medical images (different specialties, modalities and physicians' practices), (3) can store extremely huge and ever-increasing data, (4) provides expressive access and (5) is cost-effective? In order to answer this question we have built a hybrid (row-column) cloud-enabled storage system. The idea of this solution is to disperse DICOM attributes thoughtfully, depending on their characteristics, over both data layouts in a way that provides the best of the row-oriented and column-oriented storage models in one system, all while exploiting the features of the cloud that enable us to ensure the availability and portability of medical data. Storing data on such a hybrid data layout opens the door to a second research question: how can queries be processed efficiently over this hybrid data storage while enabling new and more efficient query plans? The originality of our proposal comes from the fact that there is currently no system that stores data in such a hybrid layout (i.e., an attribute resides either in a row-oriented database or in a column-oriented one, and a given query may interrogate both storage models at the same time) and studies query processing over it. The experimental prototypes implemented in this thesis show interesting results and open the door to multiple optimizations and research questions.
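The dispersal idea can be sketched as a router that sends each DICOM attribute to a row store or a column store based on its characteristics. All names here are hypothetical, and a real system would route on richer statistics than a fixed set:

    # Frequently co-accessed, mandatory attributes go to the row store;
    # sparse or analytic attributes go to the column store.
    ROW_ORIENTED = {"PatientID", "StudyDate", "Modality"}

    row_store = []      # list of records (row-oriented layout)
    column_store = {}   # attribute -> list of (record id, value) pairs

    def insert_dicom(attributes):
        # Split one heterogeneous DICOM record across both layouts,
        # keeping a shared record id so queries can rejoin the halves.
        row_part = {k: v for k, v in attributes.items() if k in ROW_ORIENTED}
        row_store.append(row_part)
        rid = len(row_store) - 1
        for k, v in attributes.items():
            if k not in ROW_ORIENTED:
                column_store.setdefault(k, []).append((rid, v))

    insert_dicom({"PatientID": "P001", "Modality": "MR",
                  "StudyDate": "20150623", "SliceThickness": 1.5})
    print(row_store, column_store)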
330

Performance Analysis of Relational Databases, Object-Oriented Databases and ORM Frameworks / Prestandaanalys av Relationsdatabaser, Objektorienterade Databaser och ORM-Ramverk

Nagy, Victor January 2014 (has links)
In the planning stage of web and software development, it is important to select the right tool for the job. When selecting the database to use, relational databases like MySQL are a popular choice. However, relational databases suffer from the object-relational impedance mismatch. In this work we explore the response time of relational and object-oriented databases and the overhead of ORM frameworks. This is done by creating a web application that receives data about flights and airports from a client and measures the response time of the databases and of the entire request. It was found that MySQL has the lowest response time, while the ORM framework Hibernate adds an overhead on some of the tests but otherwise performs similarly to MySQL. Db4o had the highest response time in a majority of the tests. In future work, this study could be extended with other tests or other types of databases.
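The kind of response-time measurement such a comparison rests on can be sketched with a simple timing wrapper. The query and schema below are placeholders, not the thesis's actual workload:

    import sqlite3
    import time

    def measure(run_query, repetitions=100):
        """Time a query callable over many repetitions and return the
        mean response time in milliseconds."""
        start = time.perf_counter()
        for _ in range(repetitions):
            run_query()
        return (time.perf_counter() - start) / repetitions * 1000.0

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE flights (id INTEGER PRIMARY KEY, origin TEXT)")
    conn.executemany("INSERT INTO flights (origin) VALUES (?)",
                     [("ARN",)] * 1000)

    ms = measure(lambda: conn.execute(
        "SELECT COUNT(*) FROM flights WHERE origin = 'ARN'").fetchone())
    print(f"mean response time: {ms:.3f} ms")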
