251.
Understanding cryptic schemata in large extract-transform-load systems / Albrecht, Alexander; Naumann, Felix. January 2012.
Extract-Transform-Load (ETL) tools are used for the creation, maintenance, and evolution of data warehouses, data marts, and operational data stores. ETL workflows populate those systems with data from various data sources by specifying and executing a DAG of transformations. Over time, hundreds of individual workflows evolve as new sources and new requirements are integrated into the system. The maintenance and evolution of large-scale ETL systems requires much time and manual effort. A key problem is understanding the meaning of unfamiliar attribute labels in source and target databases and in ETL transformations. Hard-to-understand attribute labels lead to frustration and to time lost developing and understanding ETL workflows.
We present a schema decryption technique to support ETL developers in understanding cryptic schemata of sources, targets, and ETL transformations. For a given ETL system, our recommender-like approach leverages the large number of mapped attribute labels in existing ETL workflows to produce good and meaningful decryptions. In this way we are able to decrypt attribute labels consisting of a number of unfamiliar few-letter abbreviations, such as UNP_PEN_INT, which we can decrypt to UNPAID_PENALTY_INTEREST. We evaluate our schema decryption approach on three real-world repositories of ETL workflows and show that our approach is able to suggest high-quality decryptions for cryptic attribute labels in a given schema.
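The decryption idea can be sketched as a small recommender that expands each abbreviated token to the full word seen most often in already-mapped labels. Everything below (the abbreviation table, the word frequencies, the function name) is invented for illustration and is not the paper's actual algorithm.

```python
# Hypothetical sketch of recommender-style schema decryption:
# expand few-letter abbreviations in attribute labels using a
# dictionary mined from mapped label pairs in existing ETL workflows.

ABBREVIATIONS = {
    "UNP": ["UNPAID"],
    "PEN": ["PENALTY", "PENDING"],
    "INT": ["INTEREST", "INTEGER"],
}

# Frequency of full words observed in mapped target labels, standing in
# for the statistics a recommender would learn from the repository.
WORD_FREQ = {"UNPAID": 12, "PENALTY": 9, "PENDING": 2, "INTEREST": 7, "INTEGER": 1}

def decrypt_label(label):
    """Expand each '_'-separated token to its most frequent candidate."""
    parts = []
    for token in label.split("_"):
        candidates = ABBREVIATIONS.get(token, [token])
        parts.append(max(candidates, key=lambda w: WORD_FREQ.get(w, 0)))
    return "_".join(parts)

print(decrypt_label("UNP_PEN_INT"))  # UNPAID_PENALTY_INTEREST
```

Tokens with no known expansion are kept verbatim, so the sketch degrades gracefully on unfamiliar labels.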
252.
Improving Business Processes using Process-oriented Data Warehouse / Shahzad, Muhammad Khurram. January 2012.
The Business Process Management (BPM) lifecycle consists of four phases: design and analysis, configuration, enactment, and evaluation, also known as performance analysis and improvement. Performance analysis and improvement of business processes, one of the core phases of the BPM lifecycle, is moving to the top of the agenda for many enterprises. An emerging approach is to use business intelligence techniques that extend the analytical capabilities of business process management systems by implementing a process-oriented data warehouse and mining techniques. However, little work has been done on developing core methods and tools for performance analysis and improvement of business processes. In particular, there are no adequate methods or clearly defined steps to guide process managers in analyzing and improving processes using a process warehouse (PW). In the absence of such methods and guidelines, important steps may be overlooked and credible improvement steps cannot be taken. This research addresses these limitations by developing a method for performance analysis and improvement of business processes. The key feature of the method is that it employs business orientation in the design and utilization of a PW. The method is composed of three steps: building a goal structure, integrating the goal structure with the PW, and analyzing and improving business processes. In the first step, a set of top-level performance goals is identified for the process of interest. The identified goals are then decomposed into a goal structure aligned with the functional decomposition of the process. The second step describes a technique for integrating the generated goal structure with the PW.
The third step describes a performance estimation model, a decision model, and a step-by-step approach to utilizing the PW for analysis and improvement of business processes. To facilitate the use of the proposed method, a prototype was developed that offers a graphical user interface for defining the goal structure, integrating goals with the PW, and goal-based navigation of the PW. To evaluate the proposed method, we first developed an evaluation framework and then applied it. The framework consists of three components, each representing a type of evaluation: methodological-structure evaluation, performance-based evaluation, and perception-based evaluation. The results show partial support for the methodological structure, while the performance and perception evaluations of the proposed method are promising.
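The goal-structure step can be illustrated with a toy model in which leaf goals carry satisfaction values measured from the process warehouse and parent goals aggregate their children. The class, goal names, numbers, and the simple averaging rule are assumptions for illustration, not the method's actual estimation model.

```python
# Toy goal structure: leaves hold measured satisfaction in [0, 1],
# parents aggregate their children (here: plain average, an assumption).

class Goal:
    def __init__(self, name, children=None, measure=None):
        self.name = name
        self.children = children or []
        self.measure = measure  # satisfaction derived from PW data

    def satisfaction(self):
        """Leaf goals report their measured value; parents average children."""
        if not self.children:
            return self.measure
        return sum(c.satisfaction() for c in self.children) / len(self.children)

cycle_time = Goal("Reduce cycle time", measure=0.6)
rework = Goal("Reduce rework", measure=0.8)
top = Goal("Improve order handling", children=[cycle_time, rework])
print(round(top.satisfaction(), 2))  # 0.7
```

Rolling measurements up such a tree is what lets a process manager see which top-level goal an underperforming process function affects.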
253.
Qualitätsgetriebene Datenproduktionssteuerung in Echtzeit-Data-Warehouse-Systemen (Quality-driven data production control in real-time data warehouse systems) / Thiele, Maik. 10 August 2010.
Where data warehouse systems were formerly used mainly for data analysis in support of management decisions, they have since evolved into the central platform for a company's integrated information supply. This includes, in particular, embedding the data warehouse in operational processes, which require both very current data and fast query processing. At the same time, classical data warehouse applications persist that need high-quality, refined data. The users of a data warehouse system thus have different and partly conflicting requirements regarding data freshness, query latency, and data stability. This dissertation develops methods and techniques that address and resolve this conflict. The overarching goal was to design a real-time data warehouse architecture that covers the full breadth of information supply, from historical to current data.
First, a scheduling procedure for continuous update streams was developed. It takes the conflicting requirements of the data warehouse users into account and provably produces optimal schedules. Next, scheduling was examined in the context of multi-stage data production processes, analyzing in particular under which conditions scheduling in data production processes pays off.
To support the analysis of complex data warehouse processes, a visualization of how data states evolve across the production processes was proposed. It provides a tool for exploring data production processes for optimization potential.
A real-time data warehouse subject to operational data changes causes inconsistencies in report production. Therefore, a decoupled data layer optimized for report production was developed. An aggregation concept to speed up query processing was developed as well, and special query techniques guarantee the completeness of report queries.
Two data warehouse case studies from large companies were presented and their specific challenges analyzed. The concepts developed in this dissertation were examined for their benefit and applicability in these practical scenarios.
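The core trade-off, spending a time slot on applying updates (freshness) versus answering queries (latency), can be sketched as follows. This greedy toy only illustrates the conflict; it is not the dissertation's provably optimal scheduling procedure.

```python
# Toy scheduler for a real-time data warehouse: each job class is weighted
# by a user preference between data freshness and query latency, and jobs
# of the heavier class run first (FIFO within a class).

def schedule(pending_updates, pending_queries, freshness_weight):
    """Return the execution order; freshness_weight in [0, 1] favors updates."""
    jobs = [("update", u, freshness_weight) for u in pending_updates]
    jobs += [("query", q, 1.0 - freshness_weight) for q in pending_queries]
    # Stable sort keeps arrival order inside each class.
    return [name for _kind, name, _w in sorted(jobs, key=lambda j: -j[2])]

# A freshness-oriented user (weight 0.8) sees updates applied before queries.
print(schedule(["u1", "u2"], ["q1"], 0.8))  # ['u1', 'u2', 'q1']
```

Lowering the weight flips the order, which is exactly the conflict between operational users wanting current data and analysts wanting fast answers.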
254.
Data warehouse schema design / Lechtenbörger, Jens. January 2001.
Dissertation, University of Münster (Westphalia), 2001.
255.
E-handelns påverkan på materialhanteringsprocessen (The impact of e-commerce on the material handling process): a case study at Södra Wood Interiör in Kallinge / Vikenadler, Jacob; Mogensen, Emma; Bosson, Linnéa. January 2018.
Course: In-depth subject studies in logistics, 2FE25E, spring 2018. Supervisor: Petra Andersson. Examiner: Hana Hulthén.
Background: E-commerce has grown substantially in recent years. E-commerce in the construction industry, on the other hand, has not grown as much, but it is heading upwards; companies in the construction industry should therefore prepare for this change. Södra Wood Interiör in Kallinge is currently planning and preparing for it, where a flow of both retail orders and bulk orders must be combined in a profitable and efficient manner. Purpose: The purpose of this study is to identify improvements at SWI's main warehouse, in the company's current material handling process, in the face of a continued increase in e-commerce in the construction industry. Method: The study is qualitative and was performed as a case study, processed with a hermeneutic perspective and a deductive approach. Conclusion: An increased flow of e-commerce changes the quantity and size of orders. If SWI chooses to follow the e-commerce trend and enable such solutions for its customers, it will see an increased flow of so-called retail orders, which means that SWI has to combine a large flow of retail and bulk orders in the same warehouse. In the current situation, such a combination would likely make the material handling process less efficient. The analysis presents various improvements proposed to let SWI meet the inefficiency that follows from the increase in retail orders: a separate warehouse section that handles retail orders within SWI's existing warehouse, and an additional central warehouse located elsewhere.
256.
SB-Index: um índice espacial baseado em bitmap para data warehouse geográfico (SB-Index: a bitmap-based spatial index for geographic data warehouses) / Siqueira, Thiago Luís Lopes. 26 August 2009.
Geographic Data Warehouses (GDW) have become one of the main technologies used in decision-making processes and spatial analysis, since they integrate Data Warehouses, On-Line Analytical Processing, and Geographic Information Systems. As a result, a GDW enables spatial analyses together with agile and flexible multidimensional analytical queries over huge volumes of data. On the other hand, query performance in a GDW is a challenge: data related to ad-hoc spatial query windows must be retrieved while the high cost of star joins is avoided. Clearly, mechanisms that provide efficient query processing, such as index structures, are essential. This master's thesis introduces a novel index for GDW, the SB-index, based on the Bitmap Join Index and the Minimum Bounding Rectangle. The SB-index inherits the legacy techniques of the Bitmap Index and introduces them into GDW, and it supports predefined spatial attribute hierarchies. The SB-index was validated through experimental performance tests. Comparisons among the SB-index, the star join aided by an R-tree, and the star join aided by GiST indicated that the SB-index significantly reduces the elapsed time of query processing, by 76% up to 96%, for queries defined over the spatial predicates of intersection, enclosure, and containment and applied to roll-up and drill-down operations. In addition, the impact of an increase in data volume on performance was analyzed; the increase did not impair the performance of the SB-index. This thesis also investigates experimentally how spatial data redundancy affects query response time and storage requirements in a GDW.
Redundant and non-redundant GDW schemas were compared, concluding that redundancy causes substantial performance losses. Then, aiming to improve query performance, the SB-index was evaluated on the redundant GDW schema; the results showed that it reduces the elapsed time of query processing by 25% up to 99%. Finally, a specific enhancement of the SB-index was developed to deal with spatial data redundancy; with this enhancement, the minimum observed performance gain became 80%.
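The SB-index idea can be sketched as a sequence of (MBR, bitmap) pairs: a query window first filters the minimum bounding rectangles, then ORs the bitmaps of the surviving candidates to identify fact rows without a star join. All data and values below are invented toy examples, not the thesis's implementation.

```python
# Toy SB-index: each entry pairs a spatial member's MBR with a bitmap
# over 8 fact rows. Queries filter on MBRs, then merge candidate bitmaps.

def intersects(a, b):
    """MBRs as (xmin, ymin, xmax, ymax)."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

sb_index = [
    ((0, 0, 2, 2), 0b10010001),
    ((5, 5, 7, 7), 0b01000100),
    ((1, 1, 3, 3), 0b00101010),
]

def query(window):
    result = 0
    for mbr, bitmap in sb_index:
        if intersects(mbr, window):   # filter step on MBRs
            result |= bitmap          # merge fact rows of candidates
    return result

print(bin(query((2, 2, 4, 4))))  # 0b10111011: rows of the two touching MBRs
```

A real SB-index would follow the filter step with a refinement step on exact geometries; the sketch stops at the bitmap merge that replaces the star join.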
257.
Proposta de um sistema de apoio à decisão para controle e gerenciamento agrícola em usinas de açúcar e álcool (Proposal of a decision support system for agricultural control and management in sugar and ethanol plants) / Renato Tavares. 04 July 2008.
Alongside the growing evolution of computing, two factors have also begun to receive greater attention: knowledge and information. This evolution makes information available to everyone, contributing to the acquisition of knowledge and to subsequent decision making, and it highlights the great importance of databases. This work proposes a Decision Support System (DSS) that uses advanced data storage methodologies over powerful databases (a Data Warehouse) for agricultural control and management in sugar and ethanol plants. A structured, extensible environment was developed, designed for the analysis of non-volatile data, logically and physically transformed, originating from diverse applications, aligned with the structure of the company, updated and kept for a long period of time, expressed in business terms, and summarized for fast analysis.
With the deployment of these new technologies, the company will be able to obtain information at the managerial and strategic level to support its decision-making processes, which was not possible before with the company's existing information systems.
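A minimal sketch of the kind of summarized, business-aligned analysis described above, over an invented star-schema fragment (table and column names are assumptions; sqlite3 stands in for the real warehouse DBMS):

```python
# Toy star schema: a fact table of harvested loads joined to a farm
# dimension and summarized per farm, as the DSS would present it.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_farm (farm_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_harvest (farm_id INTEGER, tons REAL);
    INSERT INTO dim_farm VALUES (1, 'Farm A'), (2, 'Farm B');
    INSERT INTO fact_harvest VALUES (1, 10.0), (1, 5.5), (2, 7.0);
""")
rows = con.execute("""
    SELECT d.name, SUM(f.tons)
    FROM fact_harvest f JOIN dim_farm d ON f.farm_id = d.farm_id
    GROUP BY d.name ORDER BY d.name
""").fetchall()
print(rows)  # [('Farm A', 15.5), ('Farm B', 7.0)]
```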
258.
Informationsförvaltning inom en stor organisation: en fallstudie på Trafikverket (Information management within a large organization: a case study at Trafikverket) / Glad, Therese. January 2016.
This study aims to investigate how a large organization can work with information management by examining the organization's existing information management and investigating possible proposals for its future information management.
Furthermore, the study investigates how an organization can establish clear governance, collaboration, management, and a distribution of responsibilities and roles around information management. The study is qualitative, with data collected through document studies and interviews. It follows an abductive research approach and is a normative case study, as its goal is to provide guidance and propose measures for the case the collaboration partner asked me to study. The case is a typical one, because the results may interest more parties than the study's collaboration partner, such as other large organizations with a similar information environment. To gather theory for the study, I conducted literature reviews on topics relevant to its purpose: information management, Business Intelligence, the Data Warehouse and its architecture, and the Business Intelligence Competency Center. The study contributes practical knowledge, as it answers practical problems: the collaboration partner had struggled with a non-functioning information management, and this study contributes suggestions for future information management. The suggestions involve a centralized Data Warehouse and the development of a function that handles information management and disseminates its governance throughout the organization.
259.
Vývoj a implementace BI nadstavby nad systémem MS Dynamics NAV (Development and implementation of a BI solution on the MS Dynamics NAV system) / Votruba, Tomáš. January 2013.
The dissertation describes the process of creating a Business Intelligence solution on top of the ERP system MS Dynamics NAV; the successful development and deployment of a prototype version is the main goal of the work. In accordance with the sub-objectives, the work is divided into a theoretical and a practical part. The theoretical part starts with a survey of work on similar themes and then continues with a description of the current state of the Czech market for BI solutions on MS Dynamics NAV, the basic principles of Business Intelligence development, and the Balanced Scorecard method for setting benchmarks. The practical part describes the development of the Business Intelligence solution, starting with identifying a potential customer, a wholesale distribution company, and setting the requirements for BI. It continues through the design of a data warehouse in MS SQL Server, the creation of a data pump connected to the MS Dynamics NAV database, the filling of the data warehouse, and the creation of OLAP cubes, and ends with an example of output reports from the working solution using MS Excel 2013.
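The data-pump step can be sketched as a small extract-transform-load pass over ERP entries, aggregating per month and item the way an OLAP cube would expose them. The record layout is a loose, invented imitation of an item ledger; the field names are assumptions, not the actual Dynamics NAV schema.

```python
# Toy data pump: extract posted entries, derive a YYYY-MM key (transform),
# and load aggregated quantities into a cube-like structure.

source_rows = [
    {"posting_date": "2013-01-15", "item_no": "A-100", "quantity": 4},
    {"posting_date": "2013-01-20", "item_no": "A-100", "quantity": 6},
    {"posting_date": "2013-02-02", "item_no": "B-200", "quantity": 3},
]

def pump(rows):
    """Aggregate quantities per (month, item), as a cube dimension pair."""
    cube = {}
    for r in rows:
        key = (r["posting_date"][:7], r["item_no"])  # derive month key
        cube[key] = cube.get(key, 0) + r["quantity"]
    return cube

print(pump(source_rows))
# {('2013-01', 'A-100'): 10, ('2013-02', 'B-200'): 3}
```

In the real solution this role is played by the ETL layer between the Dynamics NAV database and the MS SQL Server warehouse; the sketch only shows the shape of the transformation.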
260.
Restrukturalizace skladu ve velkoobchodním podniku (Warehouse reengineering in a specific wholesale company) / Kučera, Ondřej. January 2013.
This thesis conducts an analysis and reengineering of the work methodology and warehouse layout in a specific wholesale company. The thesis is divided into phases. In the first phase, the suitability of the existing warehouse is reviewed, evaluating the hypothesis that it is dimensionally insufficient, has an incorrect layout, and has insufficient intake and expedition areas. In the second phase, possible reengineering variants are created and, given the conditions of uncertainty, analyzed with the Monte Carlo method.
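The Monte Carlo comparison of reengineering variants under uncertainty can be sketched like this, with invented capacities and a normally distributed daily demand; none of the numbers come from the thesis.

```python
# Toy Monte Carlo: estimate expected daily overflow (demand beyond
# capacity) for each invented reengineering variant.
import random

VARIANTS = {"extend layout": 120, "new intake area": 150}

def expected_overflow(capacity, trials=10_000, seed=42):
    rng = random.Random(seed)  # fixed seed for a reproducible estimate
    total = 0.0
    for _ in range(trials):
        demand = rng.gauss(mu=100, sigma=25)  # uncertain daily pallet inflow
        total += max(0.0, demand - capacity)
    return total / trials

for name, capacity in VARIANTS.items():
    print(name, round(expected_overflow(capacity), 2))
```

Ranking variants by such an estimate (against their cost) is the general shape of a Monte Carlo analysis under uncertainty; the thesis's actual model of the warehouse is more detailed.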