About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
661

Energy-Efficient In-Memory Database Computing

Lehner, Wolfgang January 2013 (has links)
The efficient and flexible management of large datasets is one of the core requirements of modern business applications. Access to consistent, up-to-date information is the foundation of operational, tactical, and strategic decision making. Within the last few years, the database community has launched a large number of highly innovative research projects to push the envelope of modern database system architecture. In this paper, we outline requirements and influencing factors to identify some of the hot research topics in database management systems. We argue that, even after 30 years of active database research, the time is right to rethink some of the core architectural principles and to devise novel approaches that meet the requirements of the coming decades in data management. The sheer number of diverse and novel (e.g., scientific) application areas, the capabilities of modern hardware, and the need for large data centers to become more energy-efficient will drive database research in the years to come.
662

DJ: Bridging Java and Deductive Databases

Hall, Andrew Brian 07 July 2008 (has links)
Modern society is intrinsically dependent on the ability to manage data effectively. While relational databases have been the industry standard for the past quarter century, recent growth in data volume and complexity calls for novel data management solutions. These trends have revitalized interest in deductive databases and highlighted the need for column-oriented data storage. However, programming technologies for enterprise computing were designed around the relational data management model (i.e., row-oriented storage), so developers cannot easily incorporate emerging data management solutions into enterprise systems. To address this problem, this thesis presents Deductive Java (DJ), a system that enables enterprise programmers to use a column-oriented deductive database in their Java applications. DJ does so without requiring the programmer to become proficient in deductive databases and their non-standardized, vendor-specific APIs. The design of DJ incorporates three novel features: (1) tailoring orthogonal persistence technology to the needs of a deductive database with column-oriented storage; (2) using Java interfaces as the primary mapping construct, thereby simplifying method call interception; and (3) providing facilities for deploying lightweight business rules. DJ was developed in partnership with LogicBlox Inc., an Atlanta-based technology startup. / Master of Science
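The interception idea behind DJ's second feature can be sketched outside Java as well. The toy Python below is not DJ's API, and every name in it is hypothetical; it only illustrates the pattern the abstract describes, namely that calls made through a declarative query interface are intercepted and rewritten into scans over a column-oriented store (DJ itself does this with Java interfaces and dynamic proxies).

```python
# Illustrative sketch only: all names are invented, not DJ's API.

class ColumnStore:
    """A toy column-oriented store: one list per attribute."""
    def __init__(self):
        self.columns = {}  # column name -> list of values

    def insert(self, **row):
        for name, value in row.items():
            self.columns.setdefault(name, []).append(value)

    def scan(self, column, predicate):
        # Evaluate the predicate over a single column, returning row ids.
        return [i for i, v in enumerate(self.columns.get(column, []))
                if predicate(v)]

class QueryProxy:
    """Intercepts attribute access the way DJ intercepts interface
    method calls, rewriting `find_by_<column>` into a column scan."""
    def __init__(self, store):
        self._store = store

    def __getattr__(self, name):
        if name.startswith("find_by_"):
            column = name[len("find_by_"):]
            return lambda value: self._store.scan(column,
                                                  lambda v: v == value)
        raise AttributeError(name)

store = ColumnStore()
store.insert(name="ada", dept="eng")
store.insert(name="bob", dept="ops")
employees = QueryProxy(store)
print(employees.find_by_dept("eng"))  # ids of rows matching the predicate
```

The point of routing everything through a proxy, as DJ does with Java interfaces, is that the application never touches the storage layout: swapping the row-oriented store for a column-oriented one changes only the scan implementation.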
663

Assessing Query Execution Time and Implementational Complexity in Different Databases for Time Series Data

Jama Mohamud, Nuh, Söderström Broström, Mikael January 2024 (has links)
Traditional database management systems are designed for general-purpose data handling and do not work efficiently with time series data, whose characteristics include high volume, rapid ingestion rates, and a focus on temporal relationships. Which solution is best, however, is not a trivial question. This thesis therefore analyzes four Database Management Systems (DBMSs) to determine their suitability for managing time series data, with a specific focus on Internet of Things (IoT) applications: PostgreSQL, TimescaleDB, ClickHouse, and InfluxDB. It evaluates query performance across varying dataset sizes and time ranges, as well as the implementational complexity of each DBMS. The benchmarking results indicate that InfluxDB consistently delivers the best performance, though at the cost of higher implementational complexity and time consumption. ClickHouse emerges as a strong alternative, with the second-best performance and the simplest implementation. The thesis also identifies potential biases in the benchmarking tools and suggests that TimescaleDB's performance may have been affected by configuration errors. The findings provide significant insight into the performance and implementation challenges of the selected DBMSs. Despite limitations in fully addressing the research questions, the thesis offers a valuable overview of the examined DBMSs in terms of performance and implementational complexity; these results should be weighed alongside additional research when selecting a DBMS for time series data.
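The measurement pattern behind such a benchmark can be sketched briefly. The snippet below is an assumption-laden stand-in, not the thesis's harness: it times a time-range aggregation against an in-memory SQLite database, whereas the study ran PostgreSQL, TimescaleDB, ClickHouse, and InfluxDB. Only the timing pattern (repeat the query, take the median wall-clock time) is the point.

```python
# Hypothetical query-timing sketch; SQLite stands in for a real backend.
import sqlite3
import time

def time_query(conn, sql, params=(), repeats=5):
    """Run `sql` several times and return the median wall-clock time."""
    timings = []
    for _ in range(repeats):
        start = time.perf_counter()
        conn.execute(sql, params).fetchall()
        timings.append(time.perf_counter() - start)
    timings.sort()
    return timings[len(timings) // 2]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (ts INTEGER, sensor TEXT, value REAL)")
conn.executemany("INSERT INTO readings VALUES (?, ?, ?)",
                 [(t, "s1", t * 0.1) for t in range(10_000)])

# Time a typical time-range aggregation over the series.
median = time_query(conn,
                    "SELECT AVG(value) FROM readings WHERE ts BETWEEN ? AND ?",
                    (1_000, 2_000))
print(f"median query time: {median * 1e3:.3f} ms")
```

Taking a median over several runs, rather than a single measurement, damps warm-up and caching effects, which is one source of the benchmarking-tool bias the thesis warns about.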
664

Coping with evolution in information systems: a database perspective

Lawrence, Gregory 25 August 2009 (has links)
Business organisations today face the complex problem of dealing with evolution in their software information systems. This concerns the accommodation and facilitation of change, in terms of both changing user requirements and changing technological requirements. The study adopts an approach that uses the software development life-cycle as a vehicle for studying the problem of evolution, covering the stages of requirements analysis, system specification, design, implementation, and finally operation and maintenance. The problem of evolution requires proactive as well as reactive solutions for any given application domain. Measuring evolvability in conceptual models and specifying changing requirements are considered. However, even "best designs" are limited in dealing with unanticipated evolution, and require implementation-phase paradigms that can carry out an evolution correctly (semantic integrity), efficiently (minimal disruption of services), and consistently (all affected parts are consistent following the change). These issues are also discussed. / Computing / M. Sc. (Information Systems)
665

Irrigation in the Rhône basin: geographic information system about freshwater resources and water uses

Richard-Schott, Florence 06 December 2010 (has links)
Over the last thirty years of the twentieth century, irrigation in the French basin of the Rhône river underwent substantial change. The implementation of a Geographic Information System on the Rhône basin (SIR) demonstrates the existence of four main irrigation systems, individualized within several "irrigation regions." These reveal contrasting dynamics, putting into question the idea that the expansion of irrigation had been continuous and homogeneous, even though the total irrigated surface area actually increased. These spatial dynamics can be accounted for by the deep transformations of a modernised practice that relies on ever more water-efficient techniques. This is in fact the second lesson of the study: the general increase in irrigated surface areas did not lead to an increase in water demand. On the contrary, water demand has tended to diminish, on the order of 30% over thirty years. Driven by water managers, cultivators' use of water resources is increasingly reasoned, so that in the long run irrigation should certainly not be considered a general threat to environmental balance. The thesis is accompanied by a geographic information management system and an atlas in electronic form.
666

A Study of Database Marketing: The Case of the Financial Industry

周紋祺, Chou, Wen-Chi Unknown Date (has links)
As marketing has evolved from mass marketing through segment marketing, we have now entered the era of one-to-one marketing. With the glut of media advertising, its actual effectiveness is increasingly questioned: marketing is no longer a matter of producing one good commercial to capture every consumer. Given changes in social structure, rapid technological progress, and an ever-wider range of products for consumers to choose from, competitive pressure is greater than ever, and attracting consumers "effectively" has become a key task for today's enterprises. Introducing database marketing allows a company, through systematic analysis, to target customer segments more precisely, contributing to profits both by reducing marketing costs and by raising sales, and thereby easing firms' sales difficulties.

This is an exploratory study in six chapters. Starting from the issues surrounding database marketing, it combines a literature review with practical experience to produce a research framework, which then serves as the interview guide for the case studies, in order to understand how the financial industry currently handles data collection, the selection and management of database hardware and software, the degree to which data-analysis methods are applied, the execution of marketing plans, and attitudes and practices in after-sales customer management. After empirical validation with four banks (Sunny Bank, Chinatrust Commercial Bank, E.SUN Bank, and HSBC), the framework proved highly applicable, balancing theory and practice. Moreover, past literature on overall planning models for database marketing is somewhat lacking in its dimensions; this study therefore synthesizes many views from the literature with actual practice to fill that gap. On the marketing side, the spirit of traditional marketing theory is converted into executable actions, with step-by-step procedures for both prospective-customer development and existing-customer penetration, making it easier for practitioners to implement database marketing.

The analysis in Chapter 4 shows that the domestic adoption of database marketing is not yet mature. In the financial industry, Chinatrust Commercial Bank, with its large base of credit-card customer data and its active investment in this area, currently leads in development, and the performance gains from database marketing have convinced peer institutions of the necessity of adopting it. Judging from current practice, the original database-marketing advantage of foreign banks is gradually being overtaken by domestic banks, while domestic banks still lag behind foreign banks in after-sales customer management.

Although the theoretical framework developed here is highly practical, its development was inevitably constrained by objective factors such as sampling and depth. For follow-up work, future researchers are advised to study individual issues in the framework in depth, to compare the results of database marketing for the same financial product across different banks, or to study a single company before and after adoption, in order to truly verify the actual difference that introducing database marketing makes.
667

Evaluation of Queries on Linked Distributed XML Data

Behrends, Erik 18 December 2006 (has links)
No description available.
669

Development of a didactic game to put into practice the activities that contribute to process improvement: structural masonry elevation

Mesquita, Victor Felix de 06 June 2014 (has links)
Construction companies seek improvements to their management systems. To do so, they need to apply lean concepts, methods, and techniques to identify and subsequently eliminate or reduce activities that do not add value to the final product. In this context, there are activities that contribute to the improvement of construction processes: best practices that seek to eliminate factors causing interruptions in the processes taking place on construction sites. The objective of this research was to develop a didactic game that helps managers put into practice the activities that contribute to improving the structural masonry construction process, so as to stimulate continuity of production flow on construction sites and the elimination of waste. The methodology was divided into two phases. The first was the literature review; the second was divided into five stages of field research covering the procedures required to develop and validate the tool proposed here. The following tasks were carried out: an exploratory study; compilation of a list of best practices; development, validation, and application of a checklist; and development of the didactic game and its validation through application in group dynamics, first with researchers and later with the managers of three of the six construction sites studied. The compilation of the best-practices list showed that neglecting most of these activities could result in making-do and, consequently, in rework. The checklist application showed that managers use best practices in their daily work. The application of the game confirmed the information observed in the field and gave managers an overview of the process. From this emerged the main contribution of this study: the conclusion that the game is a teaching tool that can be used to train professionals to identify strengths and weaknesses in the management activities of the structural masonry elevation process. It is thus expected that managers will see the tool as a practical instrument for identifying best practices in the construction process, and that in the future the tool can be applied to other processes.
670

In-Memory Database Management System

Pehal, Petr January 2013 (has links)
The focus of this thesis is a proprietary database interface for managing tables in main memory. The thesis begins with a short introduction to databases, then presents the concept of in-memory database systems and discusses the main advantages and disadvantages of this approach. The theoretical introduction ends with a brief overview of existing systems. Next, basic information about the RIS energy-management system is presented together with the system's in-memory database interface. The thesis then specifies and designs the required modifications and extensions of that interface, followed by implementation details and test results. In conclusion, the results are summarized and future development is discussed.
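Since the interface described here is proprietary, only the general shape of "managing tables in main memory" can be illustrated. The sketch below is a generic, hypothetical stand-in (none of these names come from the thesis): a registry of named tables whose rows live entirely in process memory, with the create/insert/select/drop operations such an interface typically exposes.

```python
# Generic, hypothetical in-memory table interface; not the RIS API.
class InMemoryDB:
    def __init__(self):
        self.tables = {}  # table name -> list of dict rows

    def create_table(self, name):
        if name in self.tables:
            raise ValueError(f"table {name!r} already exists")
        self.tables[name] = []

    def insert(self, name, row):
        # Copy the row so later caller-side mutation cannot corrupt the table.
        self.tables[name].append(dict(row))

    def select(self, name, where=lambda r: True):
        return [r for r in self.tables[name] if where(r)]

    def drop_table(self, name):
        del self.tables[name]

db = InMemoryDB()
db.create_table("meters")
db.insert("meters", {"id": 1, "kwh": 12.5})
db.insert("meters", {"id": 2, "kwh": 40.0})
heavy = db.select("meters", where=lambda r: r["kwh"] > 20)
print(heavy)  # rows consuming more than 20 kWh
```

Keeping all rows in ordinary in-process data structures is what gives an in-memory system its speed, and also why durability and memory footprint are the trade-offs such theses weigh.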
