About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Abstractions and algorithms for active multidatabases

Obermeyer, Lincoln Lance, January 1999 (has links)
Thesis (Ph. D.)--University of Texas at Austin, 1999. / Vita. Includes bibliographical references (leaves 198-212). Available also in a digital version from Dissertation Abstracts.
22

Schema exportation and integration for achieving information sharing in a transnational setting

Patil, Manjiri Pandurang. January 2005 (has links)
Thesis (M.S.)--University of Florida, 2005. / Title from title page of source document. Document formatted into pages; contains 77 pages. Includes vita. Includes bibliographical references.
23

Designing and implementing a distributed database for a small multi-outlet business

Grech, Joseph. January 2009 (has links)
Thesis (M.S.S.I.S.)--Regis University, Denver, Colo., 2009. / Title from PDF title page (viewed on Jun. 26, 2010). Includes bibliographical references.
24

Function computing in vertically partitioned distributed databases

Shinde, Kaustubh Arun January 2006 (has links)
No description available.
25

Query processing in distributed database systems

Unnava, Vasundhara January 1992 (has links)
No description available.
26

Relational Computing Using HPC Resources: Services and Optimizations

Soundarapandian, Manikandan 15 September 2015 (has links)
Computational epidemiology involves processing, analysing and managing large volumes of data. Such massive datasets cannot be handled efficiently by traditional standalone database management systems, owing to their limited computational efficiency and bandwidth when scaling to large volumes of data. In this thesis, we address the management and processing of large volumes of data for modeling, simulation and analysis in epidemiological studies. Traditionally, compute-intensive tasks are processed using high performance computing resources and supercomputers, whereas data-intensive tasks are delegated to standalone databases and custom programs. The DiceX framework is a one-stop solution for distributed database management and processing; its main mission is to leverage supercomputing resources for data-intensive computing, in particular relational data processing. While standalone databases are always on and a user can submit queries at any time, supercomputing resources must be acquired and are available only for a limited time period; they are relinquished either upon completion of execution or at the expiration of the allocated time period. This reservation-based usage model poses critical challenges, including building and launching a distributed data engine on the supercomputer, saving the engine and resuming from the saved image, devising efficient optimization upgrades to the data engine, and enabling other applications to access the engine seamlessly. These challenges and requirements lead us to align our approach with the cloud computing paradigms of Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). In this thesis, we propose cloud-computing-like workflows that use supercomputing resources to manage and process relational data-intensive tasks. We propose and implement several services, including database freeze/migrate/resume, ad hoc resource addition, and table redistribution, which support the workflows defined. We also propose an optimization upgrade to the query planning module of Postgres-XC, the core relational data processing engine of the DiceX framework. Using knowledge of domain semantics, we devised a more robust data distribution strategy that forcibly pushes the most time-consuming SQL operations down to the Postgres-XC data nodes, bypassing the query planner's default shippability criteria without compromising correctness. Forcing query push-down reduces query processing time by roughly 40-60% for certain complex spatio-temporal queries on our epidemiology datasets. As part of this work, a generic broker service was also implemented; it acts as an interface to the DiceX framework, exposing RESTful APIs that applications can use to query and retrieve results regardless of programming language or environment. / Master of Science
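The forced push-down described above hinges on a domain-aware distribution strategy: co-partitioning related tables on a shared key so that joins and aggregations on that key never cross data nodes. Below is a minimal sketch of that idea, assuming a reachable Postgres-XC coordinator, the psycopg2 driver, and hypothetical person and infection_event tables; it illustrates the co-partitioning strategy only, not the DiceX services or its planner modification.

```python
# Sketch of domain-aware co-partitioning behind forced query push-down.
# Assumptions: a Postgres-XC coordinator reachable via COORD_DSN, and
# hypothetical epidemiology tables keyed by region_id. Not DiceX code.
import psycopg2

COORD_DSN = "host=coordinator.example.org dbname=epi user=epi"  # assumed

# DISTRIBUTE BY HASH is Postgres-XC DDL: each row is stored on the data
# node selected by hashing the named column. Hashing both tables on the
# same key co-locates matching rows on the same node.
DDL = [
    """CREATE TABLE person (
           person_id bigint,
           region_id int,
           age       int
       ) DISTRIBUTE BY HASH (region_id)""",
    """CREATE TABLE infection_event (
           person_id bigint,
           region_id int,
           day       int
       ) DISTRIBUTE BY HASH (region_id)""",
]

# The join and GROUP BY are keyed on the distribution column, so the whole
# query can run on the data nodes; only per-region partial results return
# to the coordinator.
QUERY = """
    SELECT region_id, count(*) AS cases
    FROM person JOIN infection_event USING (region_id, person_id)
    WHERE day BETWEEN %s AND %s
    GROUP BY region_id
"""

with psycopg2.connect(COORD_DSN) as conn:
    with conn.cursor() as cur:
        for stmt in DDL:
            cur.execute(stmt)
        cur.execute(QUERY, (0, 30))
        for region_id, cases in cur.fetchall():
            print(region_id, cases)
```

Because both tables hash on region_id, every pair of rows joined on that key is co-located, which is what makes the operation shippable to the data nodes in parallel.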
27

Selective data replication for distributed geographical data sets: a thesis submitted in partial fulfilment of the requirements for the degree of Master of Science, Department of Computer Science & Software Engineering, University of Canterbury, Christchurch, New Zealand

Gu, Xuan. January 1900 (has links)
Thesis (M. Sc.)--University of Canterbury, 2008. / Typescript (photocopy). "October 2008." Includes bibliographical references (p. [81]-85). Also available via the World Wide Web.
28

Reconfigurable multiprocessor operating system kernel for high performance computing

Mukherjee, Bodhisattwa 12 1900 (has links)
No description available.
29

Extracción de conocimiento en grandes bases de datos utilizando estrategias adaptativas

Hasperué, Waldo January 2014 (has links)
The general objective of this thesis is the development of an adaptive technique for knowledge extraction from large databases. When analysing enormous volumes of data, it is valuable to have techniques that first analyse the information and derive useful knowledge in the form of classification rules, and then adapt the acquired knowledge to changes that occur in the original data. The contribution of the thesis centres on the definition of an adaptive technique for extracting knowledge from large databases, based on a dynamic model capable of adapting to changes in the information, thus obtaining a data mining technique that generates useful knowledge and produces results of value to the end user. The results of this research can be applied in areas such as soil analysis, genetic analysis, biology, robotics, economics, medicine, fault detection in plants, and mobile systems communication. (from the back-cover text) / Doctoral thesis, Facultad de Informática (UNLP). Degree obtained: Doctor en Ciencias Informáticas. Thesis advisors: Armando De Giusti and Laura Lanzarini. The thesis, presented in 2012, was awarded the "Dr. Raúl Gallard" Prize in 2013.
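As a loose analogue of the adaptive workflow described above (not the thesis's technique, which builds its own dynamic model), the sketch below re-derives human-readable classification rules whenever new data arrives, using a sliding window and scikit-learn's decision tree; the window size, feature names and drift simulation are all assumptions.

```python
# Loose analogue of adapting extracted classification rules as the data
# changes: keep a sliding window of recent rows, refit a small decision
# tree, and re-export its rules. Illustrative only; not the thesis's
# adaptive technique.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
window_X, window_y = [], []

def ingest_batch(X, y, window=1000):
    """Append a batch and keep only the most recent `window` rows."""
    window_X.extend(X)
    window_y.extend(y)
    del window_X[:-window], window_y[:-window]

def refresh_rules():
    """Re-derive human-readable classification rules from the window."""
    tree = DecisionTreeClassifier(max_depth=3).fit(window_X, window_y)
    return export_text(tree, feature_names=["f0", "f1"])

# First batch: the class depends on feature f0.
X1 = rng.normal(size=(1000, 2))
ingest_batch(X1, (X1[:, 0] > 0).astype(int))
print(refresh_rules())   # rules split on f0

# Later batch after a concept shift: the class now depends on f1.
X2 = rng.normal(size=(1000, 2))
ingest_batch(X2, (X2[:, 1] > 0).astype(int))
print(refresh_rules())   # rules now split on f1
```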
30

Towards a unified methodology for the design and development of distributed control system software

Lau, Y. K. H. January 1991 (has links)
A unified approach to the design and development of distributed control software is presented. This method is the result of a 'tight' integration between a formal method for concurrent systems (CSP) and a structured method for distributed control systems (DARTS). The work presented in this thesis does not seek to extend the semantic model of CSP, nor to design a specific control algorithm; rather, it applies existing specification and verification techniques to enhance the formality of the well-established and case-proven structured counterparts, so that benefits are captured from both methods. As a methodology is the central aim, the suggested approach is a first step towards a complete unified software development environment that engineers can follow from organising design ideas through to system implementation with proven correctness. The thesis develops a set of parameterised CSP predicates for expressing concurrency and communication, together with a corresponding set of generic processes to reflect these specified behaviours. These generic processes are formal building blocks for generating system implementations at different levels of abstraction. Utilisation of DARTS criteria and the parameterised CSP objects frames the refinement strategies. Mappings of generic processes to pictorial representations are also suggested, enabling easy assimilation of the evolving designs. Applicability of the approach is demonstrated through the high-level software design of a high-performance robot control system, where its suitability is shown via requirement specifications, property verification and implementation of salient behaviours using the generic building blocks. Although verification often means rigorous mathematical reasoning, the thesis presents a proof assistant, the Causality Diagram Evaluation Tool, to automate the manipulation of CSP processes according to the defined algebraic laws. It proves valuable in reasoning about designs and implementations of the robot system; the analysis facility and the graphical interpretation of communication provided by the tool allow effective analysis and manipulation of early designs. The results derived from specifying essential design details, from transforming highly abstracted implementation models, and from investigating system behaviours through formal reasoning and simulation show that formal methods, in particular CSP, have a niche value in enhancing software reliability at the sub-system level, as well as providing a better underpinning to the structured method DARTS. The end product is a method for generating a correct and unambiguous document of the system concerned that is amenable to direct implementation.
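CSP's generic processes translate naturally into channel-based code. As a loose illustration of the "generic building block" idea (a sketch using Python threads and bounded queues as the communication medium, not the thesis's formal CSP objects or the DARTS method), the classic one-place buffer BUF = in?x -> out!x -> BUF can be parameterised by its channels and chained into a pipeline:

```python
# Loose rendering of the generic CSP buffer BUF = in?x -> out!x -> BUF as a
# parameterised building block, using threads and bounded queues in place
# of CSP channels. A sketch of the idea only, not the thesis's formal CSP
# objects or their verified refinements.
import queue
import threading

STOP = object()  # sentinel standing in for successful termination (SKIP)

def buf(inp: queue.Queue, out: queue.Queue) -> None:
    """Generic one-place buffer: repeatedly engage in in?x, then out!x."""
    while True:
        x = inp.get()   # in?x
        out.put(x)      # out!x
        if x is STOP:
            return

def pipeline(n: int, source: queue.Queue) -> queue.Queue:
    """Chain n buffer processes into an n-place pipeline."""
    left = source
    for _ in range(n):
        right = queue.Queue(maxsize=1)  # bounded, rendezvous-like channel
        threading.Thread(target=buf, args=(left, right), daemon=True).start()
        left = right
    return left

src = queue.Queue(maxsize=1)
sink = pipeline(3, src)
for item in [1, 2, 3, STOP]:
    src.put(item)
while (v := sink.get()) is not STOP:
    print(v)
```

A queue of size one approximates CSP's synchronous rendezvous but does not model it exactly; establishing that such implementations meet their specifications is precisely where the thesis's verification techniques apply.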
