51

Verifikace ukazatelových programů pomocí lesních automatů / Verification of Pointer Programs Based on Forest Automata

Hruška, Martin January 2015 (has links)
In this work, we focus on improving the forest-automata-based shape analysis implemented in the Forester tool. This approach represents shapes of the heap using forest automata. Forest automata are based on tree automata, and Forester currently has only a simple implementation of tree automata. Our first contribution is replacing this implementation with the general-purpose tree automata library VATA, which contains highly optimized implementations of automata operations. The version of Forester using the VATA library participated in the SV-COMP 2015 competition. We further extended the forest-automata-based verification method with two new techniques: a counterexample analysis and predicate abstraction. The first allows us to determine whether a found error is real or spurious. The results of the counterexample analysis are also used for creating new predicates, which are then used for the refinement of the predicate abstraction. We show that both of these techniques contribute to an improvement over the earlier approach.
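For readers unfamiliar with the underlying formalism, the following minimal bottom-up tree automaton in Python illustrates the kind of acceptance check that tree automata libraries such as VATA optimize. It is only a sketch: the cons/nil alphabet and the state names are invented for the example and do not reflect Forester's or VATA's actual data structures.

```python
from itertools import product

class TreeAutomaton:
    def __init__(self, rules, final_states):
        # rules: maps (symbol, tuple_of_child_states) -> set of reachable states
        self.rules = rules
        self.final_states = final_states

    def run(self, tree):
        """Return the set of states reachable at the root of `tree`.

        A tree is a pair (symbol, list_of_subtrees); leaves have an empty list.
        """
        symbol, children = tree
        child_sets = [self.run(child) for child in children]
        states = set()
        for combo in product(*child_sets):   # every combination of child states
            states |= self.rules.get((symbol, combo), set())
        return states

    def accepts(self, tree):
        return bool(self.run(tree) & self.final_states)

# States qdata/qlist loosely mirror how automata states can summarize heap
# shapes such as singly linked lists built from cons/nil cells.
aut = TreeAutomaton(
    rules={
        ("nil", ()): {"qlist"},
        ("data", ()): {"qdata"},
        ("cons", ("qdata", "qlist")): {"qlist"},
    },
    final_states={"qlist"},
)
tree = ("cons", [("data", []), ("cons", [("data", []), ("nil", [])])])
print(aut.accepts(tree))  # True
```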
52

Návrh řešení Business Intelligence / Proposal Solution of Business Intelligence

Drdla, Tomáš January 2016 (has links)
The aim of this thesis is to propose a Business Intelligence solution, consider its impact on decision-making, its implementation costs and its overall contribution to the company, and make a proposal that will help to change the currently unsatisfactory state of data management in the enterprise JáNěkdo.CZ s.r.o.
53

Integrace Linked Data / Linked Data Integration

Michelfeit, Jan January 2013 (has links)
Linked Data have emerged as a successful publication format which could mean to structured data what the Web meant to documents. The strength of Linked Data lies in its fitness for integration of data from multiple sources. Linked Data integration opens the door to new opportunities but also poses new challenges. New algorithms and tools need to be developed to cover all steps of data integration. This thesis examines established data integration processes and how they can be applied to Linked Data, with a focus on data fusion and conflict resolution. Novel algorithms for Linked Data fusion are proposed, and the task of supporting trust with provenance information and quality assessment of fused data is addressed. The proposed algorithms are implemented as part of the Linked Data integration framework ODCleanStore.
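As an illustration of what conflict resolution during data fusion can look like, the sketch below groups quads that describe the same subject and property and picks one value using per-source trust scores. The function, the example data and the trust model are invented for this illustration and are not the algorithms implemented in ODCleanStore.

```python
from collections import defaultdict

def fuse(quads, source_trust, resolve=max):
    """Fuse (subject, predicate, object, source) quads into single statements.

    source_trust maps a source URI to a score in [0, 1]; `resolve` picks one
    (trust, object) pair per conflicting group, and `max` simply keeps the
    value asserted by the most trusted source.
    """
    groups = defaultdict(list)
    for s, p, o, src in quads:
        groups[(s, p)].append((source_trust.get(src, 0.0), o))
    fused = []
    for (s, p), candidates in groups.items():
        trust, o = resolve(candidates)       # conflict resolution policy
        fused.append((s, p, o, trust))       # keep the score as a simple quality mark
    return fused

quads = [
    ("ex:Prague", "ex:population", "1309000", "ex:sourceA"),
    ("ex:Prague", "ex:population", "1275000", "ex:sourceB"),
]
print(fuse(quads, {"ex:sourceA": 0.9, "ex:sourceB": 0.6}))
# [('ex:Prague', 'ex:population', '1309000', 0.9)]
```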
54

Zobrazování datové kostky / Data Cube Visualization

Dittrich, Petr January 2009 (has links)
The topic of this master's thesis is the concept and implementation of the prototype application TOPZ, which demonstrates a data warehouse. The theoretical background of data storage is discussed first, and areas of possible improvement of the data warehouse are identified. The requirements specification and the concept of the demonstration application are described in the following part. Performance testing of the data warehouse is discussed in the last chapter.
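To recall the data-cube idea itself, the short pandas sketch below aggregates ("rolls up") a small fact table along chosen dimensions. The table, column names and values are invented for the example and are unrelated to the TOPZ prototype or its data.

```python
import pandas as pd

# A tiny fact table with two dimensions and one measure (all values invented).
facts = pd.DataFrame({
    "year":   [2008, 2008, 2009, 2009],
    "region": ["CZ", "SK", "CZ", "SK"],
    "sales":  [100, 80, 120, 90],
})

# One face of the cube: total sales by year and region.
cube_slice = pd.pivot_table(facts, values="sales",
                            index="year", columns="region", aggfunc="sum")
print(cube_slice)
```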
55

Data integration in large enterprises / Datová integrace ve velkých podnicích

Nagyová, Barbora January 2015 (has links)
Data Integration is currently an important and complex topic for many companies, because a good, working Data Integration solution can bring multiple advantages over competitors. Data Integration is usually executed as a project, which can easily end in failure. In order to decrease the risks and negative impact of a failed Data Integration project, good project management, Data Integration knowledge and the right technology need to be in place. This thesis provides a framework for setting up a good Data Integration solution. The framework is developed based on current theory, currently available Data Integration tools and the opinions of experts who have worked in the field for at least seven years and have proven their skills on a successful Data Integration project. This thesis does not guarantee the development of the right Data Integration solution, but it provides guidance on how to approach a Data Integration project in a large enterprise. The thesis is structured into seven chapters. The first chapter gives an overview of the thesis, including its scope, goals, assumptions and expected value. The second chapter describes Data Management and basic Data Integration theory in order to distinguish the two topics and explain the relationship between them. The third chapter focuses purely on Data Integration theory that should be known by everyone who participates in a Data Integration project. The fourth chapter analyses the features of current Data Integration solutions available on the market and provides an overview of the most common and necessary functionalities. Chapter five covers the practical part of the thesis, where the Data Integration framework is designed based on the findings from the previous chapters and interviews with experts in the field. Chapter six then applies the framework to a real, working (anonymized) Data Integration solution, highlights the gaps between the framework and the solution and provides guidance on how to deal with them. Chapter seven provides a summary, personal opinion and outlook.
56

Systém pro podporu výuky dynamických datových struktur / System for Support of Dynamic Data Structures Learning

Trávníček, Jiří Unknown Date (has links)
The main objective of this work is to design and implement an application that can be used as an aid for teaching programming essentials. In particular, attention is focused on the domain of dynamic data structures. The target application will be implemented using web technologies so that it can run in an ordinary web browser. First, a brief introduction recapitulates the data structures to be covered. The work then summarizes the usable technologies available within web browsers, with a focus on the particular technology (DHTML) that becomes the target platform. The most significant part of the work discusses the design of the final application. This rather theoretical part is followed by a description of the practical implementation. A short user manual is also included.
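As an example of the kind of structure such a teaching tool typically visualizes, here is a minimal singly linked list. It is a Python sketch for illustration only and is not taken from the DHTML application described in the thesis.

```python
class Node:
    def __init__(self, value, next_node=None):
        self.value = value
        self.next_node = next_node

class LinkedList:
    """A minimal singly linked list with prepend and traversal."""

    def __init__(self):
        self.head = None

    def push_front(self, value):
        # Redirect the head pointer to a freshly allocated node.
        self.head = Node(value, self.head)

    def __iter__(self):
        node = self.head
        while node is not None:
            yield node.value
            node = node.next_node

lst = LinkedList()
for value in (3, 2, 1):
    lst.push_front(value)
print(list(lst))  # [1, 2, 3]
```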
57

Rozvoj elektronizace samosprávy mikroregionu Zábřežsko ve vztahu k e-governmentu / Development of electronisation of local administration of Zábřežsko micro region in relation with e-government

Filipová, Lucie January 2017 (has links)
Thesis "Development of an electronisation of local administration of Zábřežsko micro region in relation with eGovernment"examines the issues of the development of electronisation of autonomy in relation to legislation and eGovernment. The topic of the public administration has been increasingly discussed in recent years not only in the context of increasing demands for processing speed and information, but also to facilitate transparency and sub-processes in public administration. Theoretical part describes the role of eGovernment in the public administration, the process of electronisation of public administration and individual services of eGovernment (Czech POINT, data boxes, basic registers, communication infrastructure of public administration), which are being in place with municipalities. The practical part evaluates the situation of software and hardware equipment, legislatively bound on a contract software and service of eGovernment on a sample of villages Zábřežsko. On the basis of the results of the questionnaire researches are determinated recommendations for sample of 20 micro region Zábřežsko.
58

Škálovatelná optimalizace celých programů / Scalable link-time optimization

Láska, Ladislav January 2017 (has links)
Both major open-source compilers, GCC and LLVM, have a mature link-time optimization framework usable on most current programs. They are, however, not free from performance issues, which prevent them from performing certain analyses and optimizations. We analyze the bottlenecks and identify multiple places for improvement, focusing on interprocedural points-to analysis. For this purpose, we design a new data structure derived from Bloom filters and use it to significantly improve the performance and memory consumption of link-time optimization.
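For context, the sketch below is a textbook Bloom filter in Python, shown only to recall the base structure that the thesis derives from; the derived variant used for points-to sets in the work itself is not reproduced here, and the hash scheme and sizes are arbitrary choices for the example.

```python
import hashlib

class BloomFilter:
    """A plain Bloom filter: fast approximate membership with false positives
    but no false negatives, which suits conservative compiler analyses."""

    def __init__(self, size_bits=1024, num_hashes=4):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8 + 1)

    def _positions(self, item):
        # Derive several bit positions from differently salted hashes of the item.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        return all(self.bits[pos // 8] & (1 << (pos % 8)) for pos in self._positions(item))

bf = BloomFilter()
bf.add("alloc_site_1")
print("alloc_site_1" in bf, "alloc_site_2" in bf)  # True False (with high probability)
```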
59

Návrh a aplikace modelu pro testování implementace nové části DWH na platformě Teradata / Proposal and application of model for testing DWH implementation on Teradata platform

Lysáček, Jakub January 2017 (has links)
The thesis focuses on the application of software testing theory in the data warehousing area. The main goal of the thesis is to introduce selected software testing theory and apply it analogically in a data warehousing environment. Part of the main goal is the introduction of an architectural model of the testing process, with subsequent focus on the part of testing that is problematic in the data warehousing area. A partial goal of the thesis is the validation of the problematic part of the model using a practical scenario. This partial goal is divided into two parts. The first part focuses on requirements gathering and prioritization. The second part demonstrates how project requirements, available resources and the specifics of the Teradata environment shape the testing process. The theoretical part summarizes selected software testing theory, which is later applied to data warehouse testing. The chapter introduces the phases of data warehouse testing and the specific goals of each phase. It also describes the FURPS model, which is used to classify software quality dimensions, and selected methods of requirements prioritization. An architectural model of the testing process and its entities is described in the second part of the theory. The theory then focuses on the problematic part of the model, which is requirements gathering and prioritization, and demonstrates the influence of time, quality requirements and available resources on the overall testing process. The practical part introduces a real-life scenario which demonstrates the application of the described theory, namely requirements gathering, prioritization of requirements and assignment of quality dimensions. The next part demonstrates the influence of available resources and requirements on the scope of testing. The outcome of the practical part is that requirements gathering and prioritization allow us to organize the scope of testing into logical and clear units, assign roles and their responsibilities, and react flexibly to changes in project requirements. The thesis also points out the risks associated with changes of scope and emphasizes the need for their evaluation.
60

Audit ochrany a zabezpečení datové základny / Data Security Audit

Vepřek, Petr January 2006 (has links)
The aim of this thesis is to cover selected topics related to the protection and security of an enterprise data base and to propose a methodology for auditing the data security system. The thesis clarifies the importance and benefits of active protection of the enterprise data base, characterizes the process of risk analysis, discusses the issues of backup, archiving and recovery of the data base, and specifies tools and systems for eliminating loss, damage and misuse of information. The last chapter proposes a methodology for conducting a data protection and security audit and briefly describes its application in practice.
