31

Aplikace pro zobrazení modelu bezdrátové sítě / Application for display wireless network model

Žoldoš, Petr January 2011 (has links)
The first step of this Master's thesis was to gain knowledge of the Adobe Flex SDK and the Google Maps API. This knowledge was used to develop an application that lets users create, generate and modify a graphical wireless network model. The position and characteristics of each unit are monitored either in a map interface or in building plans. Data are gathered from forms filled in by the current user, from an external file, or periodically from a connected database system. The theoretical part explains the technologies that were used. It describes the program development and the solutions that were chosen, along with examples of the source code. Screenshots of the graphical user interface are included, as well as a description of how it all works.
32

Heterogeneity-Aware Placement Strategies for Query Optimization

Karnagel, Tomas 23 May 2017 (has links)
Computing hardware is changing from systems with homogeneous CPUs to systems with heterogeneous computing units like GPUs, Many Integrated Cores, or FPGAs. This trend is caused by the scaling problems of homogeneous systems, where heat dissipation and energy consumption limit further growth in compute performance. Heterogeneous systems provide differently optimized computing hardware, which allows different operations to be computed on the most appropriate computing unit, resulting in faster execution and lower energy consumption. For database systems, this is a new opportunity to accelerate query processing, allowing faster and more interactive querying of large amounts of data. However, the current hardware trend is also a challenge, as most database systems do not support heterogeneous computing resources and it is not clear how to support these systems best. In the past, mainly single operators were ported to different computing units with great results, but a system-wide application was missing. To support heterogeneous systems efficiently, a systems approach to query processing and query optimization is needed.

In this thesis, we tackle the optimization challenge in detail. As a starting point, we evaluate three different approaches on isolated use cases to assess their advantages and limitations. First, we evaluate a fork-join approach of intra-operator parallelism, where the same operator is executed on multiple computing units at the same time, each execution with a different data partition. Second, we evaluate using one computing unit statically to accelerate one operator, which provides high code-optimization potential due to the static and pre-known usage of hardware and software. Third, we evaluate dynamically placing operators onto computing units, depending on the operator, the available computing hardware, and the given data sizes. We argue that the first and second approaches suffer from multiple overheads or high implementation costs. The third approach, dynamic placement, shows good performance while being highly extensible to different computing units and different operator implementations.

To automate this dynamic approach, we first propose general placement optimization for query processing. This general approach includes runtime estimation of operators on different computing units as well as two approaches for defining the actual operator placement according to the estimated runtimes. The two placement approaches are local optimization, which decides the placement locally at run-time, and global optimization, where the placement is decided at compile-time while allowing a global view for enhanced data sharing. The main limitation of the latter is its high dependency on cardinality estimation of intermediate results, as estimation errors for the cardinalities propagate to the operator runtime estimation and placement optimization. Therefore, we propose adaptive placement optimization, which makes the placement optimization fully independent of cardinality estimation, effectively eliminating the main source of inaccuracy for runtime estimation and placement optimization. Finally, we define an adaptive placement sequence, incorporating all our proposed placement-optimization techniques. We implement this sequence as a virtualization layer between the database system and the heterogeneous hardware. Our implementation is based on preexisting interfaces to the database system and the hardware, allowing non-intrusive integration into existing database systems. We evaluate our techniques using two different database systems and two different OLAP benchmarks, accelerating query processing through heterogeneous execution.
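As a minimal illustration of the local placement strategy described above (a sketch, not the thesis implementation), the following picks, for each operator, the computing unit with the lowest estimated runtime for the actual input size; the device names and the linear cost model are assumptions made purely for the example.

# Sketch of local (run-time) placement optimization: each operator is
# placed on the computing unit with the lowest estimated runtime for its
# actual input size. Devices and the linear cost model are illustrative
# assumptions, not measurements from the thesis.
COST_MODEL = {
    "CPU": {"overhead_ms": 0.05, "per_tuple_ms": 0.0001},
    "GPU": {"overhead_ms": 5.0,  "per_tuple_ms": 0.00001},  # launch/transfer overhead dominates small inputs
}

def estimate_runtime_ms(device: str, num_tuples: int) -> float:
    model = COST_MODEL[device]
    return model["overhead_ms"] + model["per_tuple_ms"] * num_tuples

def place_operator(num_tuples: int) -> str:
    """Local optimization: choose the device with the lowest runtime estimate."""
    return min(COST_MODEL, key=lambda d: estimate_runtime_ms(d, num_tuples))

for n in (1_000, 10_000_000):
    print(n, "tuples ->", place_operator(n))
# small inputs stay on the CPU; large inputs amortize the GPU overhead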
33

Scene Understanding For Real Time Processing Of Queries Over Big Data Streaming Video

Aved, Alexander 01 January 2013 (has links)
With heightened security concerns across the globe and the increasing need to monitor, preserve and protect infrastructure and public spaces to ensure proper operation, quality assurance and safety, numerous video cameras have been deployed. Accordingly, they also need to be monitored effectively and efficiently. However, relying on human operators to constantly monitor all the video streams is not scalable or cost effective. Humans can become subjective and fatigued, even exhibit bias, and it is difficult to maintain high levels of vigilance when capturing, searching and recognizing events that occur infrequently or in isolation. These limitations are addressed in the Live Video Database Management System (LVDBMS), a framework for managing and processing live motion imagery data. It enables rapid development of video surveillance software, much like traditional database applications are developed today. Video stream processing applications and ad hoc queries developed in this way are able to "reuse" advanced image processing techniques that have already been developed. This results in lower software development and maintenance costs. Furthermore, the LVDBMS can be intensively tested to ensure consistent quality across all associated video database applications. Its intrinsic privacy framework facilitates a formalized approach to the specification and enforcement of verifiable privacy policies. This is an important step towards enabling a general privacy certification for video surveillance systems by leveraging a standardized privacy specification language. With the potential to impact many important fields ranging from security and assembly-line monitoring to wildlife studies and the environment, the broader impact of this work is clear. The privacy framework protects the general public from abusive use of surveillance technology; success in addressing the "trust" issue will enable many new surveillance-related applications. Although this research focuses on video surveillance, the proposed framework has the potential to support many video-based analytical applications.
34

Challenges in implementation of new CAD software in the manufacturing industry: a case study : Ongoing Implementation of CADMATIC Electrical for Valmet Tissue Mills / Utmaningar vid implementering av ny CAD-programvara inom tillverkningsindustrin: en fallstudie : Pågående Implementation av CADMATIC Electrical för Valmet Tissue Mills

de Haan, Jonathan January 2023 (has links)
The rapid advancement of computer software in the digital revolution has created a diverse range of powerful tools for various industries. Even though the selection is broad, no software fits every individual or organization, because their criteria differ. Design tools are no exception. To stay competitive and foster further growth, Valmet Tissue Mills recognizes the need to embrace the latest technology. In their quest for an updated design software that incorporates a database, they have chosen CADMATIC Electrical, a Finnish CAD tool specifically designed for electrical and automation engineering. However, complications with implementing the tool during the COVID-19 lockdown have resulted in lasting challenges. Management has experienced resistance to change, caused by a lack of knowledge and a complicated system. In order to reduce this uncertainty, this study has investigated the opportunities and challenges of CADMATIC Electrical, as well as the availability of functions and how they work in the software. The thesis found that CADMATIC Electrical can increase collaboration, decrease human error, manage revisions, and more. However, it is important to consider the complexity and security of using a database-backed CAD tool. CADMATIC Electrical was also found to have the majority of the functions that the engineers desired.
35

Description Logics with Aggregates and Concrete Domains

Baader, Franz, Sattler, Ulrike 18 May 2022 (has links)
We show that extending description logics by simple aggregation functions as available in database systems may lead to undecidability of inference problems such as satisfiability and subsumption.
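To give a flavour of what such an extension looks like (the concrete syntax below is only an illustrative assumption, not the paper's): an aggregation function applied over a role and a concrete-domain feature can be constrained by a predicate, for example a concept describing orders whose item prices sum to more than 100:

% Illustrative only: a concept in a description logic extended with the
% concrete domain of the rationals and the aggregation function "sum";
% it aggregates the prices of all items of an order and compares the
% result with the threshold 100.
\[
  \mathsf{Order} \;\sqcap\; \exists\, \mathrm{sum}(\mathit{hasItem} \circ \mathit{price}).\,{>}_{100}
\]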
36

Maintenance of cube automatic summary tables

Lehner, Wolfgang, Sidle, Richard, Pirahesh, Hamid, Cochrane, Roberta 10 January 2023 (has links)
Materialized views (or Automatic Summary Tables—ASTs) are commonly used to improve the performance of aggregation queries by orders of magnitude. In contrast to regular tables, ASTs are synchronized by the database system. In this paper, we present techniques for maintaining cube ASTs. Our implementation is based on IBM DB2 UDB.
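As a language-neutral sketch of what maintaining such a summary table involves (a conceptual illustration, not the paper's DB2 UDB implementation): an aggregate table keyed on the grouping columns can be kept in sync by folding in only the delta of the base table.

# Conceptual sketch only (not the DB2 UDB implementation from the paper):
# a summary table holding SUM and COUNT per group is maintained
# incrementally by applying just the inserted/deleted base rows.
from collections import defaultdict

summary = defaultdict(lambda: [0.0, 0])   # group key -> [sum(amount), count(*)]

def apply_delta(delta_rows):
    """delta_rows: iterable of (group_key, amount, sign) with sign +1 (insert) or -1 (delete)."""
    for key, amount, sign in delta_rows:
        acc = summary[key]
        acc[0] += sign * amount
        acc[1] += sign
        if acc[1] == 0:               # the group vanished from the base table
            del summary[key]

apply_delta([("DE", 10.0, +1), ("DE", 5.0, +1), ("US", 7.5, +1)])
apply_delta([("US", 7.5, -1)])        # a deletion is propagated to the summary
print(dict(summary))                  # {'DE': [15.0, 2]}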
37

Employee Management System

Kanchev, Kancho January 2007 (has links)
This report presents the development of an information system for managing staff data within a small company or organization. The system, as developed, is called the Employee Management System. It consists of a functionally related GUI (application program) and database. The choice of programming tools is individual and particular.
38

Evaluation of relational algebra queries on probabilistic databases : tractability and approximation

Fink, Robert D. January 2014 (has links)
Query processing is a core task in probabilistic databases: Given a query and a database that encodes uncertainty in data by means of probability distributions, the problem is to compute possible query answers together with their respective probabilities of being correct. This thesis advances the state of the art in two aspects of query processing in probabilistic databases: complexity analysis and query evaluation techniques. A dichotomy is established for non-repeating, conjunctive relational algebra queries with negation that separates #P-hard queries from those with PTIME data complexity. A framework for computing probabilities of relational algebra queries is presented; the probability computation algorithm is based on decomposition methods and provides exact answers in the case of exhaustive decompositions, or anytime approximate answers with absolute or relative error guarantees in the case of partial decompositions. The framework is extended to queries with aggregation operators. An experimental evaluation of the proposed algorithms' implementations within the SPROUT query engine complements the theoretical results. The SPROUT² system uses this query engine to compute answers to queries on uncertain, tabular Web data.
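As a small illustration of the kind of probability computation involved (a brute-force inclusion-exclusion sketch under the tuple-independence assumption, not SPROUT's decomposition-based algorithm; all tuple identifiers and probabilities are made up for the example):

# An answer tuple is correct if at least one of its witnesses (sets of
# independent base tuples that derive it) is fully present. This sketch
# computes that probability by inclusion-exclusion over the witnesses,
# which is exponential in their number -- fine for an illustration,
# not for the #P-hard instances discussed in the thesis.
from itertools import chain, combinations

def answer_probability(witnesses, prob):
    """witnesses: list of frozensets of tuple ids; prob: tuple id -> marginal probability."""
    total = 0.0
    for r in range(1, len(witnesses) + 1):
        for subset in combinations(witnesses, r):
            union = frozenset(chain.from_iterable(subset))
            p = 1.0
            for t in union:
                p *= prob[t]
            total += (-1) ** (r + 1) * p
    return total

prob = {"r1": 0.9, "r2": 0.5, "s1": 0.4}
witnesses = [frozenset({"r1", "s1"}), frozenset({"r2", "s1"})]   # e.g. two join derivations
print(answer_probability(witnesses, prob))   # 0.36 + 0.20 - 0.18 = 0.38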
39

QPPT: Query Processing on Prefix Trees

Kissinger, Thomas, Schlegel, Benjamin, Habich, Dirk, Lehner, Wolfgang 28 May 2013 (has links) (PDF)
Modern database systems have to process huge amounts of data and, at the same time, should provide results with low latency. To achieve this, data is nowadays typically held completely in main memory, to benefit from its high bandwidth and low access latency, which could never be reached with disks. Current in-memory databases are usually column stores that exchange columns or vectors between operators and suffer from a high tuple-reconstruction overhead. In this paper, we present the indexed table-at-a-time processing model, which makes indexes first-class citizens of the database system. The processing model comprises the concepts of intermediate indexed tables and cooperative operators, which make indexes the common data exchange format between plan operators. To keep the intermediate index materialization costs low, we employ optimized prefix trees that offer balanced read/write performance. The indexed table-at-a-time processing model allows the efficient construction of composed operators like the multi-way-select-join-group. Such operators speed up the processing of complex OLAP queries, so that our approach outperforms state-of-the-art in-memory databases.
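A minimal sketch of the underlying idea (the structure below is a plain byte-wise trie on integer keys, not the paper's optimized prefix trees): an operator materializes its result into a prefix tree keyed on the column the next operator needs, so that operator can probe the index directly instead of reconstructing tuples.

# Sketch of an intermediate indexed table: a selection materializes its
# result into a prefix tree keyed on the join column, and the join
# operator probes it directly. The trie below is a simple byte-wise
# structure chosen only for illustration.
class PrefixTree:
    def __init__(self):
        self.root = {}

    def insert(self, key: int, payload) -> None:
        node = self.root
        for byte in key.to_bytes(4, "big"):      # fixed-length path over the key bytes
            node = node.setdefault(byte, {})
        node.setdefault("rows", []).append(payload)

    def lookup(self, key: int):
        node = self.root
        for byte in key.to_bytes(4, "big"):
            node = node.get(byte)
            if node is None:
                return []
        return node.get("rows", [])

index = PrefixTree()
for row_id, join_key in enumerate([42, 7, 42]):  # result of a selection, keyed on the join column
    index.insert(join_key, row_id)
print(index.lookup(42))   # [0, 2]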
40

Model řízení projektově orientovaných organizací sociální oblasti / Management model of project-oriented organizations of social sphere

PROKEŠOVÁ, Radka January 2012 (has links)
The thesis "Management model of project-oriented organizations of social sphere" deals with the proposal of a management model for project-oriented organizations of social sphere using monitoring and evaluation processes. The proposed management model was based on qualitative research conducted in the selected institution, which was the Institute of Social Work of the Faculty of Health and Social Studies. (Institute of Social Work ? ISW). The reason for selecting this institution was the variety, scope of activities undertaken and development phase of the institution. Currently, as an integral part of institutions focusing on practical social work with the client is considered not only strategic management but also project management which favorably influences the efficiency of services as well as provides the possibility of obtaining resources for their financing. The degree of success of these projects is positively influenced by the processes of monitoring and evaluation, both in the planning phase of the project and during its implementation. On the basis of the qualitative research was proposed a management model for the selected Institute of Social Work. The management model is based on a set of monitoring and evaluation indicators that were, using synergy effects, incorporated in the proposal of database systems of ISW. Based on the values of proposed indicators obtained from the database system we can not only monitor but also assess the performance of the Institute of Social Work. Some of the indicators can be used to design development strategies of the ISW. The applicability of the established management model was assessed by the experts from different fields of management, but similarly focused institutions, in the form of semi-structured interview. Based on the expert opinion, this management model is applicable to other similarly oriented organizations. The model, according to the experts, would bring improvements in management of project-oriented organizations of social sphere with similar focus as the ISW.
