1161 |
Object oriented databases : a natural part of object oriented software development?
Carlsson, Anders, January 2003
The technology of object oriented databases was introduced to system developers in the late 1980s. Despite this, it is rarely used today. This thesis introduces the concept of object oriented databases as a proposed solution to the problems that arise from the use of relational databases. The thesis points to the advantages of storing application objects in the database without disassembling them to fit a relational data model. Based on those advantages, and on the cost of introducing such a rarely used technology into a project, a guideline is given for when to use object oriented databases and when to use relational databases.
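The core idea above, storing an application object whole instead of decomposing it into relational rows, can be sketched with Python's standard-library `shelve` module standing in for an object database. This is an illustration of the concept only, not the thesis's technology; the `Order` class and its fields are hypothetical.

```python
import os
import shelve
import tempfile

class Order:
    """A hypothetical application object with nested structure."""
    def __init__(self, order_id, items):
        self.order_id = order_id
        self.items = items  # list of (product, quantity) tuples

def store_and_load(order):
    # An object database stores the object as-is; here shelve (backed
    # by pickle) plays that role, so no mapping of the object graph
    # onto relational tables is needed.
    path = os.path.join(tempfile.mkdtemp(), "objects.db")
    with shelve.open(path) as db:
        db[str(order.order_id)] = order
    with shelve.open(path) as db:
        return db[str(order.order_id)]

loaded = store_and_load(Order(1, [("keyboard", 2), ("mouse", 1)]))
```

With a relational store, the same round trip would require a schema for orders and order items plus join logic to reassemble the object.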
1162 |
En webbaserad kompetensmatris / Web-based Competence Matrix
Lindahl, Mathias; Murad, Assan, January 2017
Today it is common at larger IT companies for the composition of agile teams to change with the projects' competence needs. In addition to the companies' own employees, consultants with specialist expertise in various areas are often hired. To streamline project planning and competence development, and to locate competences in other teams at the company, a tool was needed to manage the competences within the company. This report is based on an assignment from the Swedish Transport Agency (Transportstyrelsen), whose IT unit is growing and where a large number of consultants are constantly present. The assignment was to produce a prototype of a web-based platform for managing the various teams' competences. The project resulted in a web application with a user interface built in Angular and data stored in a code-first SQL Server database. Communication between Angular and SQL Server is handled through a Web API.
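The data model behind such a competence matrix can be sketched in a few lines: members mapped to skills with proficiency levels, plus the two queries the abstract motivates (finding who holds a skill, and finding gaps). This is a minimal illustrative sketch, not the thesis's Angular/SQL Server implementation; all names and levels are invented.

```python
# A minimal in-memory competence matrix: members mapped to
# skill -> proficiency level (1-5). Names and levels are illustrative.
matrix = {
    "Alice": {"Angular": 4, "SQL": 3},
    "Bob":   {"C#": 5, "SQL": 4},
}

def find_by_skill(matrix, skill, min_level=1):
    """Return members holding `skill` at or above `min_level`."""
    return sorted(
        name for name, skills in matrix.items()
        if skills.get(skill, 0) >= min_level
    )

def team_gap(matrix, required):
    """Skills in `required` that no team member covers at all."""
    covered = {s for skills in matrix.values() for s in skills}
    return sorted(set(required) - covered)
```

In the real system this dictionary would be replaced by tables generated code-first in SQL Server, with the queries exposed through the Web API.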
1163 |
Main-memory database vs. traditional database
Rehn, Marcus; Sunesson, Emil, January 2013
There has been a surge of new databases in recent years, and applications today place higher demands on database performance than ever before. Main-memory databases, which store all of their data in primary memory, have entered the market quite recently and are now attracting interest from many directions. They provide a large performance increase for many kinds of applications. This work evaluates the difference in performance between two chosen candidates: VoltDB, representing main-memory databases, and MySQL, representing traditional databases. We have performed several tests on these two databases and point out differences in functionality, performance and design choices. We want to create a reference where anyone who considers changing from a traditional database to a main-memory database can find support for their decision: what are the advantages and disadvantages of using a main-memory database, and when should we switch from an old database to the newer technology?
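The in-memory versus disk-backed distinction at the heart of this comparison can be illustrated without either VoltDB or MySQL: SQLite (in the Python standard library) supports both an in-memory and a file-backed mode, so the same insert workload can be timed against each. This is a toy sketch of the measurement idea, not the thesis's actual benchmark.

```python
import os
import sqlite3
import tempfile
import time

def time_inserts(conn, n=1000):
    """Insert n rows and return the elapsed wall-clock seconds."""
    conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
    start = time.perf_counter()
    for i in range(n):
        conn.execute("INSERT INTO t (val) VALUES (?)", (f"row-{i}",))
    conn.commit()
    return time.perf_counter() - start

# Same engine, two storage modes: primary memory vs. a file on disk.
mem_conn = sqlite3.connect(":memory:")
disk_path = os.path.join(tempfile.mkdtemp(), "bench.db")
disk_conn = sqlite3.connect(disk_path)

mem_time = time_inserts(mem_conn)
disk_time = time_inserts(disk_conn)
```

A real comparison, as in the thesis, must also account for durability guarantees, concurrency and caching, which is why identical workloads can rank the two systems differently per operation type.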
1164 |
[en] A STUDY FOR SHARING LEARNING OBJECTS IN MULTIMEDIA DATABASE
KONSTANTIN KURIZKY, 30 June 2004

This work presents a proposal to use database technology for storing and managing learning objects in a database federation (a distributed database). The growth of e-learning has brought focus to the productivity of creating and managing the content of educational modules, which today comprises video, audio and other related data in addition to text. Instructors normally store this material with little concern for sharing. As a member of the PGL project (Partnership in Global Learning), a virtual organization for research, development and dissemination of learning through new technologies, TecBD, the database laboratory of PUC-Rio, has been researching a database approach to managing learning objects stored at interconnected sites that form a heterogeneous distributed database environment. This work aims to develop a prototype application with such learning objects by: 1) using commercially available database products; 2) adopting the current standards for defining learning objects; 3) considering learning objects stored at separate, autonomous sites. The adopted data model establishes a structure of composite learning objects, related both to atomic elements and to other composite elements. Different approaches, such as Web Services, Java/Servlets and web application servers, were studied for the problem of autonomy and geographic distribution. A prototype was built using IBM DB2 with its supplementary features, such as the extenders for audio, video, image and XML data and the federated system support. Browser-based exploration of the stored data was implemented with the IBM Net.Data layer which, although not mandatory, made the task simple and provided a solution well integrated with IBM DB2 and its extenders.
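The data model described above, composite learning objects related to atomic elements and to other composites, is an instance of the composite pattern. A minimal sketch (class and element names invented for illustration, not taken from the thesis):

```python
class Atomic:
    """A leaf learning element, e.g. a single video or text file."""
    def __init__(self, name):
        self.name = name

    def elements(self):
        return [self.name]

class Composite:
    """A learning object built from atomic and/or composite parts."""
    def __init__(self, name, parts):
        self.name = name
        self.parts = parts

    def elements(self):
        # Flatten the whole tree into its atomic constituents.
        out = []
        for part in self.parts:
            out.extend(part.elements())
        return out

# A composite lesson containing an atomic element and a nested composite.
lesson = Composite("lesson-1", [
    Atomic("intro.mp4"),
    Composite("exercises", [Atomic("quiz.xml"), Atomic("notes.txt")]),
])
```

In the prototype, each atomic element would correspond to a multimedia item handled by a DB2 extender, with the composition relationships stored as regular rows.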
1165 |
Natural Language Interfaces to Databases
Chandra, Yohan, 12 1900
Natural language interfaces to databases (NLIDBs) are systems that aim to bridge the gap between the languages used by humans and computers by automatically translating natural language sentences into database queries. This thesis proposes a novel, graph-based approach to NLIDB. The system starts by collecting as much information as possible from existing databases and sentences, and transforms this information into a knowledge base. Given a new question, the system uses this knowledge to analyze the sentence and translate it into its corresponding database query statement. The graph-based NLIDB system uses English as the natural language, a relational database model, and SQL as the formal query language. In experiments with natural language questions run against a large database of U.S. geography, the system performed well compared to the state of the art in the field.
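The translation step can be made concrete with a deliberately naive sketch: match keywords in the question against known table and column names and emit SQL. This is far simpler than the graph-based approach the thesis proposes, and the schema below is an invented miniature of a U.S. geography database, but it shows the input/output contract of an NLIDB.

```python
# Toy schema: table name -> column names. All names are illustrative.
SCHEMA = {
    "state": ["name", "population", "capital"],
    "river": ["name", "length", "state_name"],
}

def translate(question):
    """Naive keyword-matching NL-to-SQL translation."""
    q = question.lower()
    # Pick the first table whose name occurs in the question.
    table = next((t for t in SCHEMA if t in q), None)
    if table is None:
        return None  # no known table mentioned
    # Pick a mentioned column, falling back to selecting everything.
    column = next((c for c in SCHEMA[table] if c in q), "*")
    return f"SELECT {column} FROM {table}"

sql = translate("What is the population of each state?")
```

A graph-based system replaces this brittle substring matching with structured knowledge about how schema elements relate, which is what allows it to handle paraphrases and joins.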
1166 |
A comparison of latency for MongoDB and PostgreSQL with a focus on analysis of source code
Lindvall, Josefin; Sturesson, Adam, January 2021
The purpose of this paper is to clarify the differences in latency between PostgreSQL and MongoDB that follow from their differences in software architecture. This was achieved by benchmarking insert, read and update operations with the Yahoo! Cloud Serving Benchmark tool, and by source code analysis of both database management systems (DBMSs). The overall architecture was examined using Big O notation as a tool for assessing the complexity of the source code. The benchmarking results show that latency for insert and update operations was lower for MongoDB, while latency for reads was lower for PostgreSQL. The source code analysis shows that both DBMSs have a complexity of O(n), but that several differences in their software architecture affect latency; the most important is the parsing process, which is longer in PostgreSQL. The conclusion is that there are significant differences in both latency and source code, and that room exists for further research in the field. The biggest limitation of the experiment is factors such as background processes, which affected latency and could not be eliminated, resulting in low validity.
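The measurement method, timing each insert and read individually and then summarizing, can be sketched with a plain dictionary standing in for the database. This is not YCSB itself, just an illustration of the per-operation latency collection that such benchmarks perform.

```python
import statistics
import time

def run_workload(store, n_ops=1000):
    """Measure per-operation latency for inserts then reads,
    in the style of a YCSB load/run phase. `store` is any mapping."""
    latencies = {"insert": [], "read": []}
    for i in range(n_ops):
        t0 = time.perf_counter()
        store[f"key{i}"] = f"value{i}"      # insert operation
        latencies["insert"].append(time.perf_counter() - t0)
    for i in range(n_ops):
        t0 = time.perf_counter()
        _ = store[f"key{i}"]                # read operation
        latencies["read"].append(time.perf_counter() - t0)
    # Summarize each operation type; real benchmarks also report
    # percentiles, which are more robust against background noise.
    return {op: statistics.mean(vals) for op, vals in latencies.items()}

averages = run_workload({})
```

The background-process noise the paper names as its main validity threat shows up here too: individual timings jitter, which is why averaging (or percentiles) over many operations is essential.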
1167 |
Systém pro detekci rámce GPON / GPON Frame Detection System
Holík, Martin, January 2018
This diploma thesis deals with a GPON frame detection system. The theoretical part describes the sub-problems of database design, optical networks and database system administration. The practical part of the thesis focuses on the design of a system for detecting GPON frames and on a script for traffic analysis.
1168 |
A Comparison of Bibliographic Instruction Methods on CD-ROM Databases
Davis, Dorothy F. (Dorothy Frances), 05 1900
The purpose of this study was to compare four different methods of bibliographic instruction in order to determine which had the greatest effect on student learning.
1169 |
Sinkhole Hazard Assessment in Minnesota Using a Decision Tree Model
Gao, Yongli; Alexander, E. Calvin, 01 May 2008
An understanding of what influences sinkhole formation, and the ability to accurately predict sinkhole hazards, are critical to environmental management efforts in the karst lands of southeastern Minnesota. Based on the distribution of distances to the nearest sinkhole, sinkhole density, bedrock geology and depth to bedrock in southeastern Minnesota and northwestern Iowa, a decision tree model has been developed to construct maps of sinkhole probability in Minnesota. The decision tree was converted into cartographic models and implemented in ArcGIS to create a preliminary sinkhole probability map for Goodhue, Wabasha, Olmsted, Fillmore, and Mower Counties. The model quantifies bedrock geology, depth to bedrock, sinkhole density, and neighborhood effects in southeastern Minnesota, but excludes potential controlling factors such as structural control, topographic setting, human activity and land use. The sinkhole probability map needs to be verified and updated as more sinkholes are mapped and more information about sinkhole formation is obtained.
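A decision tree over variables like those named above (bedrock geology, depth to bedrock, sinkhole density) reduces to nested threshold tests per map cell. The sketch below is purely illustrative; the thresholds and the three-class output are invented, not the study's calibrated model.

```python
def sinkhole_class(depth_to_bedrock_m, sinkholes_per_km2, carbonate_bedrock):
    """Classify a map cell's sinkhole hazard with a hand-built decision
    tree. All thresholds and classes are hypothetical illustrations."""
    if not carbonate_bedrock:
        return "low"          # non-karst bedrock: little dissolution
    if depth_to_bedrock_m > 30:
        return "low"          # thick cover buffers surface collapse
    if sinkholes_per_km2 > 1.0:
        return "high"         # existing sinkholes cluster spatially
    return "moderate"
```

In a GIS implementation, this function would be applied cell by cell to raster layers of the input variables to produce the probability map.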
1170 |
Database Tuning using Evolutionary and Search Algorithms
Raneblad, Erica, January 2023
Achieving optimal database performance can be crucial for many businesses, and tuning a database's configuration parameters is a necessary step in this process. Many existing tuning methods involve complex machine learning algorithms and require large amounts of historical data from the system being tuned; training such models can be problematic when considerable computational resources and data storage are required. This paper investigates the possibility of using less complex search or evolutionary algorithms to tune database configuration parameters, and presents a framework that employs Hill Climbing and Particle Swarm Optimization. The performance of the algorithms is tested on a PostgreSQL database using read-only workloads. Particle Swarm Optimization displayed the largest improvement in query response time, improving it by 26.09% compared to the configuration parameters' default values. Given this improvement, evolutionary algorithms may be promising in the field of database tuning.
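Particle Swarm Optimization for this kind of tuning treats each candidate configuration as a particle's position and the measured query response time as the objective. The sketch below replaces the real PostgreSQL benchmark with a synthetic latency surface over two hypothetical parameters (loosely labeled after `shared_buffers` and `work_mem`); everything about the surface and its optimum is invented for illustration.

```python
import random

def simulated_latency(params):
    """Synthetic response-time surface standing in for a real benchmark.
    Hypothetical minimum at shared_buffers=512, work_mem=64."""
    sb, wm = params
    return (sb - 512) ** 2 / 1000 + (wm - 64) ** 2 / 10 + 5.0

def pso(objective, bounds, n_particles=20, n_iters=60, seed=1):
    """Minimal Particle Swarm Optimization over box-bounded parameters."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(*bounds[d]) for d in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                    # personal bests
    pbest_val = [objective(p) for p in pos]
    g = pbest[min(range(n_particles), key=lambda i: pbest_val[i])][:]
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + pull toward personal and global bests
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (g[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < objective(g):
                    g = pos[i][:]
    return g, objective(g)

best, best_latency = pso(simulated_latency, [(64, 4096), (1, 1024)])
default_latency = simulated_latency([128, 4])  # hypothetical defaults
```

In the paper's setting, `simulated_latency` would be replaced by actually running the read-only workload against PostgreSQL with the candidate configuration applied, which makes each objective evaluation expensive; that cost is exactly why lightweight optimizers are attractive here.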