About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations, provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
781

Non-rigid image registration evaluation using common evaluation databases

Wei, Ying 01 December 2009 (has links)
Evaluating non-rigid image registration performance is a difficult problem since there is rarely a "gold standard" (i.e., ground truth) correspondence between two images. The Non-rigid Image Registration Evaluation Project (NIREP) was started to develop a standardized set of common databases, evaluation statistics, and a software tool for performance evaluation of non-rigid image registration algorithms. The goal of the work in this thesis is to build common image databases for rigorous testing of non-rigid image registration algorithms and to compare their performance using a diverse set of evaluation statistics on multiple well-documented image databases. These databases, along with the new evaluation statistics, have been and will continue to be released to the public research community. The performance of five non-rigid registration algorithms (Affine, AIR, Demons, SLE and SICLE) was evaluated using 22 images from two NIREP evaluation databases. Six evaluation statistics (Relative Overlap, Intensity Variance, Normalized ROI Overlap, alignment of calcarine sulci, Inverse Consistency Error and Transitivity Error) were used to evaluate and compare registration performance. This thesis provides complete and accurate reporting of the evaluation tests so that others can access these results and compare the registration algorithms relevant to their own use. Moreover, this work followed the recommendations of the Standards for Reporting of Diagnostic Accuracy (STARD) initiative to disclose all relevant information for each non-rigid registration validation test.
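Among the statistics listed, Relative Overlap is a region-agreement measure. Below is a minimal sketch, assuming the common intersection-over-union definition, which may differ from NIREP's exact formulation:

```python
import numpy as np

def relative_overlap(mask_a, mask_b):
    """Relative overlap (intersection over union) of two binary masks.

    A standard overlap statistic; NIREP's exact definition may differ.
    Returns 1.0 for identical masks, 0.0 for disjoint ones.
    """
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return np.logical_and(a, b).sum() / union

# Example: a region recovered by a registration algorithm, offset by one voxel
fixed = np.zeros((8, 8), dtype=bool); fixed[2:6, 2:6] = True
moved = np.zeros((8, 8), dtype=bool); moved[3:7, 3:7] = True
print(relative_overlap(fixed, moved))  # ~0.39 for this offset square
```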
782

Data extraction of digitized old newspaper content to streamline the search process for users with a genealogy perspective

Pettersson, Sandra January 2019 (has links)
This thesis presents the extraction of data from digitized old newspaper content and the implementation of a search function that simplifies the process for the user. It was developed as a master's degree project at Linköping University. The application allows the user to search for interesting content in a database of articles and can be used by genealogists, local historians, and novices alike. The database is filled with data from OCR-scanned newspapers, and the user can either search the database on their own or with the help of their family tree. The family tree is incorporated by reading the user's GEDCOM file and extracting useful information that is then used to produce better search results. Results are returned to the user in the form of digital articles. The work concludes that the information from GEDCOM files can be used to find new and interesting facts, and that the user should be allowed to influence how the data is reduced, in the form of article categorization and filtering.
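GEDCOM is a line-oriented plain-text format for family trees. Below is a hedged sketch of the kind of extraction described, pulling names and birth years to seed search queries, assuming standard GEDCOM tags; the thesis's actual pipeline is surely richer:

```python
import re

def extract_search_terms(gedcom_path):
    """Pull individual names and birth years from a GEDCOM file.

    Minimal sketch assuming standard GEDCOM tags (INDI, NAME, BIRT,
    DATE); real files carry many more record types.
    """
    people = []
    current = None
    pending_birth = False
    with open(gedcom_path, encoding="utf-8") as fh:
        for raw in fh:
            parts = raw.strip().split(" ", 2)
            if len(parts) < 2:
                continue
            level, tag = parts[0], parts[1]
            rest = parts[2] if len(parts) == 3 else ""
            if level == "0":
                # Individual records look like: 0 @I1@ INDI
                current = {"name": None, "birth_year": None} if rest == "INDI" else None
                if current is not None:
                    people.append(current)
                pending_birth = False
            elif current is not None and level == "1":
                if tag == "NAME":
                    # Surnames are wrapped in slashes: "John /Smith/"
                    current["name"] = rest.replace("/", "").strip()
                pending_birth = (tag == "BIRT")
            elif current is not None and level == "2" and pending_birth and tag == "DATE":
                match = re.search(r"\b(\d{4})\b", rest)
                if match:
                    current["birth_year"] = int(match.group(1))
    return people

# Each entry can then seed a database query, e.g. a name plus a year range.
```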
783

Database Design and Optimization for Telemetric Aquatic Species-Tracking Systems

Regmi, Bijay 01 May 2018 (has links)
Tracking an individual of a species has always been a challenge for scientists, especially when one must take care not to alter its natural movement pattern. When the number of individuals being tracked grows and water is added to the equation, the task becomes next to impossible. Thanks to technologies and tracking methods like telemetry, however, tracking a species without affecting its natural movement pattern has not only become a reality but is easily accessible to scientists. Underwater acoustic telemetry has become a standard tool for fisheries biologists studying the movement patterns of fish (Heupel). This project develops a minimalistic database designed to meet the needs of such telemetry systems. The database is optimized both for storing the large volume of data generated by the telemetry system and for the most common queries run against it.
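The abstract does not spell out the schema. A minimal sketch of what such a design might look like, with illustrative table and column names not taken from the thesis, could centre on a high-volume detections table indexed for per-tag, time-ordered queries:

```python
import sqlite3

# A minimal guess at a telemetry schema; names are illustrative only.
# The detections table is the high-volume one, so it gets a composite
# index for the common query: one tag's track, ordered by time.
conn = sqlite3.connect("telemetry.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS tag (
    tag_id    INTEGER PRIMARY KEY,
    species   TEXT NOT NULL,
    tagged_on TEXT NOT NULL            -- ISO-8601 date
);
CREATE TABLE IF NOT EXISTS receiver (
    receiver_id INTEGER PRIMARY KEY,
    latitude    REAL NOT NULL,
    longitude   REAL NOT NULL
);
CREATE TABLE IF NOT EXISTS detection (
    detection_id INTEGER PRIMARY KEY,
    tag_id       INTEGER NOT NULL REFERENCES tag(tag_id),
    receiver_id  INTEGER NOT NULL REFERENCES receiver(receiver_id),
    detected_at  TEXT NOT NULL         -- ISO-8601 timestamp
);
CREATE INDEX IF NOT EXISTS idx_detection_tag_time
    ON detection (tag_id, detected_at);
""")
conn.close()
```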
784

Conceptual Modeling of Data with Provenance

Archer, David William 01 January 2011 (has links)
Traditional database systems manage data, but often do not address its provenance. In the past, users were often implicitly familiar with data they used, how it was created (and hence how it might be appropriately used), and from which sources it came. Today, users may be physically and organizationally remote from the data they use, so this information may not be easily accessible to them. In recent years, several models have been proposed for recording provenance of data. Our work is motivated by opportunities to make provenance easy to manage and query. For example, current approaches model provenance as expressions that may be easily stored alongside data, but are difficult to parse and reconstruct for querying, and are difficult to query with available languages. We contribute a conceptual model for data and provenance, and evaluate how well it addresses these opportunities. We compare the expressive power of our model's language to that of other models. We also define a benchmark suite with which to study performance of our model, and use this suite to study key model aspects implemented on existing software platforms. We discover some salient performance bottlenecks in these implementations, and suggest future work to explore improvements. Finally, we show that our implementations can comprise a logical model that faithfully supports our conceptual model.
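As a hedged illustration of the gap the abstract describes, the sketch below stores provenance as a structured tree rather than an opaque expression string, so queries can traverse it directly. The type and field names are hypothetical, not the thesis's model:

```python
from dataclasses import dataclass, field

@dataclass
class Provenance:
    operation: str                                # e.g. "source", "join", "copy"
    inputs: list = field(default_factory=list)    # upstream Provenance nodes
    agent: str = ""                               # who or what produced the value

@dataclass
class Datum:
    value: object
    provenance: Provenance

def sources(prov):
    """Collect the original sources contributing to a value."""
    if prov.operation == "source":
        return [prov.agent]
    found = []
    for parent in prov.inputs:
        found.extend(sources(parent))
    return found

a = Datum(42, Provenance("source", agent="census_2010"))
b = Datum(7, Provenance("source", agent="survey_A"))
c = Datum(49, Provenance("join", inputs=[a.provenance, b.provenance]))
print(sources(c.provenance))  # ['census_2010', 'survey_A']
```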
785

Public Perception and Privacy Issues with DNA Regulations and Database in Alabama

Hall, Thea Denean 01 January 2016 (has links)
The Combined DNA Index System (CODIS) database is used in all 50 states for matching DNA evidence with criminal suspects. While each state administers CODIS, which ultimately feeds into a national database, little is understood about how citizens perceive the utility of such a database and how their perceptions and knowledge of DNA could impact state policy changes through voting. Research also suggests that the "CSI effect" may influence how citizens perceive the role of a national DNA database. Grounded in Gerbner's cultivation theory, the purpose of this study was to determine whether, in Alabama, there is a statistically significant relationship between the likelihood of providing DNA, the educational level and gender of study participants, and perceptions concerning support for expanded state participation in CODIS. Data were collected through an online survey administered to a random sample (n = 584) of Alabama residents that examined the relationships between the demographic variables of age, race or ethnicity, level of education, and the CSI effect, and support for increased participation in a standardized national DNA database. Findings indicate that there is no statistically significant relationship between the CSI effect and support for participation in CODIS. However, data analysis revealed that level of education (p=.05) and gender (p=
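Relationships of this kind are commonly tested with a chi-square test of independence on a contingency table. A minimal sketch with invented counts (not the study's data):

```python
from scipy.stats import chi2_contingency

# Illustrative contingency table only, NOT the study's data.
# Rows: education level (no degree / degree);
# columns: supports expanded CODIS participation (yes / no).
table = [[120, 180],
         [160, 124]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.4f}, dof={dof}")
# A p-value at or below the chosen alpha (e.g. 0.05) would indicate a
# statistically significant association between education and support.
```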
786

Murder-suicide in the United States: 1999-2009

Kramer, Katherine Willah Otermat 01 December 2011 (has links)
This dissertation examined murder-suicide in the United States through descriptive, time-series, and spatiotemporal analyses using a self-created, and herein validated, national database spanning the years 1999 through 2009. Chapter 2, "Establishment and validation of a national database for murder-suicide in the United States: 1999-2009," describes the methods and sources used in the creation of a national database of murder-suicide. The database was validated against less geographically and/or temporally expansive databases using capture-recapture methods in two ways: the number of events identified in a specified space and time was compared, and cases were matched using the perpetrator's name. Victim and perpetrator characteristics were then described and compared with previous studies. Chapter 3, "A time-series analysis of murder, suicide and murder-suicide in the United States, 1999-2007," utilized time-series analysis techniques to investigate the impact of time-varying covariates on murder, suicide and murder-suicide. Analyses were conducted in the United States at the national level from January 1999 to December 2007. Johansen's multivariable cointegration analysis showed that two-month time-lagged murder was positively associated with murder, suicide and murder-suicide. Two-month time-lagged suicide was negatively associated with murder, suicide and murder-suicide. Two-month time-lagged murder-suicide was not related to any of the three events. Chapter 4, "Spatiotemporal relationships among murder, suicide and murder-suicide in the United States: 1999-2008," examined spatial, temporal, and spatiotemporal relationships among murder, suicide and murder-suicide using a spatiotemporal scan statistic from SaTScan™. Thirty-five temporal and spatiotemporal clusters of murder, suicide, murder/murder-suicide, suicide/murder-suicide and murder/suicide/murder-suicide were identified. No purely spatial clusters, clusters of murder/suicide without murder-suicide, or purely murder-suicide clusters were identified. The murder-suicide database, which will be made public in 2012, will be a novel source of information for investigators interested in studying murder-suicide, with its inclusion of date, place, and perpetrator and victim characteristics. Its validation, along with the time-series and spatiotemporal analyses, provides greater understanding of murder-suicide both on its own and in comparison with murder and suicide.
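Capture-recapture validation of the sort described typically rests on the Lincoln-Petersen estimator. A minimal sketch under that assumption (the dissertation may use a bias-corrected variant such as Chapman's):

```python
def lincoln_petersen(n1, n2, m):
    """Estimate total population size from two overlapping samples.

    n1: events found in database 1
    n2: events found in database 2
    m:  events matched in both (e.g. by perpetrator name)
    Standard Lincoln-Petersen estimator; a bias-corrected variant such
    as Chapman's may be preferable for small samples.
    """
    if m == 0:
        raise ValueError("no matches between samples; estimator undefined")
    return n1 * n2 / m

# Illustrative numbers only, not from the dissertation:
print(lincoln_petersen(n1=450, n2=380, m=310))  # ~551.6 estimated events
```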
787

Low level structures in the implementation of the relational algebra

Otoo, Ekow J. January 1983 (has links)
No description available.
788

On formal specification of authorization policies and their transformations

Bai, Yun, University of Western Sydney, Nepean, School of Computing and Information Technology January 2000 (has links)
Most of today's information systems are quite complex and often involve multi-user resource sharing. In such a system, authorization policies are needed to ensure that information flows in the desired way and to prevent illegal access to system resources. Overall, authorization policies provide the ability to limit and control access to systems, applications and information. These policies need to be updated to capture the changing requirements of applications, systems and users, and such updates are implemented through the transformation of authorization policies. In this thesis, the author proposes a logic-based formal approach to specifying authorization policies and to reasoning about transformations, and sequences of transformations, of authorization policies, along with their application in object-oriented databases. The author defines the structure of the policy transformation and employs model-based semantics to perform the transformation under the principle of minimal change. The language is extended to consider sequences of authorization policy transformations, enabling it to handle more complex transformations and solve certain problems. The language is able to represent incomplete information and default authorizations, and allows denials to be expressed explicitly. The proposed language is used to specify a variety of well-known access control policies such as static separation of duty, dynamic separation of duty and the Chinese wall security policy. The authorization formalization is also applied to object-oriented databases. / Doctor of Philosophy (PhD)
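Two features the abstract names, default authorizations and explicit denials, can be pictured with the toy evaluator below, which applies denial-overrides-permit with a default of deny. It is an imperative sketch, not the thesis's model-based formalism:

```python
def authorized(permits, denials, subject, obj):
    """Explicit denial wins; otherwise explicit permit; default deny."""
    if (subject, obj) in denials:
        return False
    if (subject, obj) in permits:
        return True
    return False  # default authorization: deny

permits = {("alice", "payroll"), ("bob", "payroll")}
denials = {("bob", "payroll")}   # explicit denial overrides bob's permit
print(authorized(permits, denials, "alice", "payroll"))  # True
print(authorized(permits, denials, "bob", "payroll"))    # False
print(authorized(permits, denials, "carol", "payroll"))  # False (default)
```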
789

Logic programming based formal representations for authorization and security protocols

Wang, Shujing, University of Western Sydney, College of Health and Science, School of Computing and Mathematics January 2008 (has links)
Logic programming with answer set semantics has been considered an appealing rule-based formalism and has been applied in information security. In this thesis, we investigate the problems of authorization in distributed environments and of security protocol verification and update. Authorization decisions are required in large-scale distributed environments such as electronic commerce and remote resource sharing. We adopt the trust management approach, in which authorization is viewed as a "proof of compliance" problem. We develop an authorization language AL with non-monotonic features as the policy and credential specification language, which can express delegation with depth control, complex subject structures, both positive and negative authorizations, and separation-of-duty concepts. The theoretical foundation for language AL is the answer set semantics of logic programming. We transform AL into logic programs, and authorization decisions are based on the answer sets of those programs. We also explore tractable subclasses of language AL. We implement a fine-grained access control prototype system for XML resources, in which a simplified version of AL serves as the policy and credential specification language. We define XPolicy, the XML format of this simplified language, as a DTD for XML policy documents. The semantics of the policy is based on the semantics of language AL. The system is implemented in Java. We investigate the security protocol verification problem using a provable security approach. Based on logic programming with answer set semantics, we develop a unified framework for security protocol verification and update, which integrates protocol specification, verification and update. The update model is defined using forgetting techniques in logic programming. Through a case study protocol, we demonstrate an application of our approach. / Doctor of Philosophy (PhD)
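Delegation with depth control, one feature AL expresses, can be pictured with the toy breadth-first check below. This is an imperative approximation for illustration, not the answer set semantics the thesis uses:

```python
def can_access(grants, root, subject, max_depth):
    """True if subject holds the right via a delegation chain from
    root of length at most max_depth.

    grants: dict mapping each delegator to the set of its delegatees.
    Toy illustration of depth-controlled delegation, not AL itself.
    """
    frontier = {root}
    for _ in range(max_depth):
        if subject in frontier:
            return True
        frontier = {d for g in frontier for d in grants.get(g, set())}
    return subject in frontier

grants = {"admin": {"alice"}, "alice": {"bob"}, "bob": {"carol"}}
print(can_access(grants, "admin", "bob", max_depth=2))    # True: 2 hops
print(can_access(grants, "admin", "carol", max_depth=2))  # False: needs 3
```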
790

Implementation of a logic-based access control system with dynamic policy updates and temporal constraints

Crescini, Vino Fernando, University of Western Sydney, College of Health and Science, School of Computing and Mathematics January 2006 (has links)
As information systems evolve to cope with the ever-increasing demands of today's digital world, so does the need for more effective means of protecting information. In the early days of computing, information security started out as a branch of information technology. Over the years, several advances have been made, and information security is now considered a discipline in its own right. Its most fundamental function is to ensure that information flows to authorised entities while preventing unauthorised entities from accessing the protected information. In a typical information system, an access control system provides this function. Advances in the field have produced several access control models and implementations; however, as information technology evolves, the need for a better access control system increases. This dissertation proposes an effective yet flexible access control system: the Policy Updater access control system. Policy Updater is a fully implemented access control system that provides policy evaluations as well as dynamic policy updates. These functions are provided through the use of a logic-based language, L, to represent the underlying access control policies, constraints and policy update rules. The system performs authorisation query evaluations, as well as conditional and dynamic policy updates, by translating language L policies into normal logic programs in a form suitable for evaluation under the well-known stable model semantics. In this thesis, we show the underlying mechanisms that make up the Policy Updater system, including the theoretical foundations of its formal language, the system structure, a full discussion of implementation issues, and a performance analysis. Lastly, the thesis proposes a non-trivial extension of the Policy Updater system that supports temporal constraints. This is made possible by integrating the well-established temporal interval algebra into the extended authorisation language LT, which can also be translated into a normal logic program for evaluation. The formalisation of this extension, together with full implementation details, is included in this dissertation. / Doctor of Philosophy (PhD)
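The temporal constraints build on Allen's interval algebra. A minimal sketch classifying a few of its thirteen relations (simplified and illustrative, not the thesis's encoding):

```python
def allen_relation(a, b):
    """Classify interval a = (start, end) against b using a few of
    Allen's thirteen relations. Simplified: 'during' here also covers
    Allen's 'starts' and 'finishes' cases.
    """
    a_start, a_end = a
    b_start, b_end = b
    if a_end < b_start:
        return "before"
    if a_end == b_start:
        return "meets"
    if (a_start, a_end) == (b_start, b_end):
        return "equal"
    if a_start >= b_start and a_end <= b_end:
        return "during"
    return "overlaps-or-other"

# E.g. an access request window checked against the shift in which a
# temporal authorisation holds:
shift = (9, 17)
request = (10, 12)
print(allen_relation(request, shift))  # during
```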
