231
Bordures : de la sélection de vues dans un cube de données au calcul parallèle de fréquents maximaux / Borders: from the selection of views in a data cube to the parallel computation of maximal frequent itemsets
Tofan, Radu-Ionel, 28 September 2010 (has links)
The materialization of views is an effective query optimization technique. In this thesis we propose a new, "user-oriented" view of solutions to the problem of selecting which views to materialize in a data warehouse: the user fixes the maximum acceptable response time. Within this vision we propose algorithms that are competitive with "system-oriented" algorithms, in which resources such as memory are treated as the hard constraint. The user-oriented approach is studied in a dynamic query-optimization context: we analyse the stability of the system with respect to changes in the query workload and to insertions and deletions of data. The key concept behind our view-selection algorithms is the border. This concept has been widely studied in data mining in the context of maximal frequent itemset computation, for which many sequential algorithms have been proposed. We propose a new, easily parallelizable sequential algorithm, MineWithRounds, which differs from earlier proposals by offering a theoretical speed-up guarantee on shared-memory multiprocessor machines.
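To make the border concept concrete, here is a minimal Python sketch (not the MineWithRounds algorithm of the thesis; the transactions and threshold are toy values chosen for illustration) that enumerates frequent itemsets and keeps only the maximal ones, i.e. the positive border:

```python
from itertools import combinations

# Toy transactions and threshold; both are illustrative assumptions,
# not data from the thesis.
transactions = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c", "d"}]
min_support = 2
items = sorted(set.union(*transactions))

def support(itemset):
    """Number of transactions containing every item of `itemset`."""
    return sum(itemset <= t for t in transactions)

# Enumerate all frequent itemsets (naive level-wise search; fine for toy data).
frequent = [frozenset(c)
            for k in range(1, len(items) + 1)
            for c in combinations(items, k)
            if support(set(c)) >= min_support]

# The positive border: frequent itemsets with no frequent proper superset.
border = [f for f in frequent if not any(f < g for g in frequent)]

print(sorted(map(sorted, border)))  # [['a', 'b'], ['a', 'c'], ['b', 'c']]
```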
232
Pilotní implementace Business Intelligence v obchodní firmě / Pilot implementation of Business Intelligence in a retail company
Savka, Ján, January 2017 (has links)
The thesis focuses on the implementation of Business Intelligence in a small retail company. Its aim is to design a pilot Business Intelligence solution to support management activities in Lintea, s. r. o., a company that has been part of the Slovak lingerie and underwear retail market since 2004 and whose business performance measurement is currently not effectively supported. The main benefit of the proposed implementation is an increase in the quality of management activities, achieved by delivering previously inaccessible or laboriously accessible information in an appropriate form; the proposed solution substantially simplifies, accelerates and improves management and decision-making activities in the company. The thesis has a theoretical and a practical part. The theoretical part has two chapters, devoted to performance management and to BI. The chapter on performance management presents the wider context of using BI in companies and introduces the Balanced Scorecard method. The chapter on Business Intelligence defines BI, explains its principles and components and describes the basic analytical method of BI, dimensional modelling, which is then applied in the practical part; it ends with a description of the current situation on the BI market. The practical part starts with an introduction of the company Lintea, s. r. o., followed by a description of its current state and a proposal of business performance measures based on the Balanced Scorecard method. The second chapter of the practical part contains the proposed Business Intelligence solution itself. The individual steps of the Business Intelligence design process are: analysis of prerequisites and requirements, analysis of data sources, dimensional modelling, ETL design, design of the multidimensional data structures and, finally, design of the presentation layer.
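As a rough illustration of the dimensional modelling applied in the practical part (the tables, columns and figures below are invented for the sketch and are not Lintea's actual model), a star schema can be queried by joining a fact table to its dimensions and aggregating a measure:

```python
import pandas as pd

# Minimal star schema: one fact table plus two dimensions (illustrative data).
dim_product = pd.DataFrame({
    "product_id": [1, 2, 3],
    "category":   ["bras", "briefs", "nightwear"],
})
dim_date = pd.DataFrame({
    "date_id": [20240101, 20240102],
    "month":   ["2024-01", "2024-01"],
})
fact_sales = pd.DataFrame({
    "date_id":    [20240101, 20240101, 20240102],
    "product_id": [1, 2, 1],
    "revenue":    [120.0, 80.0, 95.0],
})

# A typical "slice": revenue by product category for a given month.
cube = (fact_sales
        .merge(dim_product, on="product_id")
        .merge(dim_date, on="date_id"))
report = (cube[cube["month"] == "2024-01"]
          .groupby("category", as_index=False)["revenue"].sum())
print(report)
```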
233
Implementace BI ve stavebnictví / Implementation of Business Intelligence in building industry
Melichar, Jan, January 2008 (has links)
This diploma thesis focuses on the strategic performance management and Business Intelligence domains. The main objectives are to define the strategic goals of a building enterprise with the help of the Balanced Scorecard (BSC) concept and to assign specific metrics to these goals. A further objective is to design a Business Intelligence (BI) implementation, which means building a data warehouse on top of company data, multidimensional cubes and user-defined reports. The initial theoretical principles are described in the first part of the work, which covers the main issues of strategic performance management, the BSC concept and the BI domain. In the practical part, the strategic goals and specific metrics of the building enterprise are defined. The output of this chapter is an overall strategy map containing the strategic goals with assigned metrics, together with comments describing the mutual relationships between these goals. The next chapter deals with building the data warehouse on top of company data, the multidimensional cubes and the user-defined reports, including the interpretation of the measured values. The contribution of the thesis consists of an enterprise management model based on the BSC concept, which helps specify strategic goals, and the design of a BI implementation that should simplify the monitoring of these goals with the help of the specified metrics. A further contribution for the management of a building-industry enterprise is an overview of the main BI technologies and of the ways and means of their practical application.
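To show how a metric assigned to a strategic goal might be evaluated in such a model, here is a generic sketch; the goals, metrics and targets are hypothetical and are not those defined in the thesis:

```python
from dataclasses import dataclass

@dataclass
class Metric:
    goal: str        # strategic goal the metric is assigned to
    name: str
    actual: float
    target: float

    def status(self) -> str:
        """Simple traffic-light evaluation against the target."""
        ratio = self.actual / self.target
        if ratio >= 1.0:
            return "green"
        return "yellow" if ratio >= 0.9 else "red"

# Hypothetical examples in the spirit of a Balanced Scorecard strategy map.
metrics = [
    Metric("Increase profitability", "Gross margin (%)", 31.0, 30.0),
    Metric("Improve delivery", "On-time completion (%)", 82.0, 95.0),
]
for m in metrics:
    print(f"{m.goal}: {m.name} = {m.actual} (target {m.target}) -> {m.status()}")
```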
234
Interconnection between BPM and BI products / Propojení produktů BPM a BI
Zikmund, Martin, January 2008 (has links)
Interconnection between the various types of IT systems used in an enterprise is crucial these days. Most companies use many different kinds of applications in their daily operations, which creates a need to share data across those applications so that all employees can make the right decisions based on correct information. In my diploma thesis I deal with the interconnection of two systems: Business Process Management (BPM) and Business Intelligence (BI). Both belong to the group of key IT systems with a strong influence on ongoing business and on correct decision making at all levels, from operational to strategic. The thesis contains a theoretical as well as a practical part of the solution for interconnecting BI and BPM systems. The first part presents and describes the basic concepts and technologies used in the integration of BI and BPM. It begins with a short introduction to BPM, BI and SOA, followed by an analysis of three major ways of interconnecting BI and BPM systems. The last part of the theoretical section presents two products: IBM FileNet P8 as a representative of BPM systems and IBM Cognos 8 BI as a representative of BI systems. The second part deals with a practical example of real integration between BI and BPM systems. It starts with a simple description of the scenario (business case), followed by a detailed depiction of two different kinds of BI and BPM integration. The work ends with an analysis of benefits, advantages and further possibilities.
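One common interconnection pattern is a BPM process step pulling a figure from the BI layer through a service interface. The sketch below only illustrates that pattern with a hypothetical REST endpoint and function names; it does not use the actual IBM FileNet P8 or Cognos 8 BI APIs described in the thesis:

```python
import requests  # third-party HTTP client

# Hypothetical BI service endpoint; a real Cognos/FileNet integration would
# use the products' own SDKs or web services instead.
BI_SERVICE_URL = "http://bi.example.local/api/kpi"

def fetch_kpi(name: str) -> float:
    """Ask the BI layer for a KPI value that a BPM task needs for routing."""
    response = requests.get(BI_SERVICE_URL, params={"name": name}, timeout=10)
    response.raise_for_status()
    return float(response.json()["value"])

def route_claim(claim_amount: float) -> str:
    """BPM-side decision: escalate when the amount exceeds a BI-computed limit."""
    limit = fetch_kpi("auto_approval_limit")
    return "manual_review" if claim_amount > limit else "auto_approve"
```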
235
Implementace SugarCRM a BI řešení s použitím opensource nástrojů / Implementing SugarCRM and BI solution using opensource tools
Ullrich, Jan, January 2009 (has links)
This diploma thesis focuses on the implementation of customer relationship management and Business Intelligence solutions in a small company using open source technologies. The main objective is the implementation of CRM and Business Intelligence and an evaluation of the usability of these solutions. The thesis describes the basic elements of both solutions; the Balanced Scorecard is used for setting up the metrics. The whole Business Intelligence solution is then designed, including the creation of the data model and OLAP cubes, using open source technologies. The theoretical background is described in the first part of the thesis, where the basic terms of strategic management with the Balanced Scorecard, CRM and Business Intelligence are defined. The thesis demonstrates the use and creation of the complete Business Intelligence solution, and the difficulties encountered during the implementation of these products are also evaluated. The designed solution includes the data warehouse and the reports.
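For a rough idea of what an OLAP cube built over CRM data looks like when presented, the following sketch uses invented opportunity records (not the SugarCRM data model from the thesis) to produce a small cross-tab with totals:

```python
import pandas as pd

# Illustrative CRM opportunity records (not the actual SugarCRM data model).
opportunities = pd.DataFrame({
    "sales_rep": ["Novak", "Novak", "Svoboda", "Svoboda", "Svoboda"],
    "quarter":   ["2009-Q1", "2009-Q2", "2009-Q1", "2009-Q1", "2009-Q2"],
    "amount":    [12000, 8000, 5000, 7000, 9000],
})

# A two-dimensional "cube" view: sales rep x quarter with row/column totals,
# roughly what an OLAP front end would show for this slice.
cube = pd.pivot_table(opportunities,
                      values="amount",
                      index="sales_rep",
                      columns="quarter",
                      aggfunc="sum",
                      margins=True,
                      margins_name="Total")
print(cube)
```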
236
Implementace Business Intelligence ve firmě Haguess, a. s. / Implementation of Business Intelligence in the company Haguess, a. s.
Bendák, Martin, January 2010 (has links)
The subject of this thesis is a pilot project implementing a Business Intelligence (BI) solution in the company Haguess, a. s. The project is concerned with the analysis of data stored in a database management system (DBMS) that serves as the data source for the web application Customer Support Center (CSC). Haguess primarily uses CSC as a helpdesk for its clients and partners, but also for internal purposes; the main use of the CSC application is to support the information systems delivered by Haguess. There were two motives for the choice of this subject: BI software tools had not previously been used by Haguess, and the company management was keen to obtain data analyses from the CSC application. My goal was to create a practically useful solution that would encourage Haguess to use BI software tools for other purposes as well. The thesis has two main goals. The first is the realisation of the pilot BI solution, outlining the possibilities for analysing data from the CSC application with BI software tools; this involved multidimensional analysis and BI solution design, i.e. the design of data pipelines, OLAP (On-Line Analytical Processing) cubes and user tools in MS Excel. The second goal was to select suitable software tools for a future, more comprehensive realisation and operation of the BI solution. The first objective was achieved by analysing the CSC application data model, defining user requests for output analyses and comparing them with the data model analysis; this comparison determined the basic subjects of the output analyses, which were the starting point for the implementation of the BI solution. The second objective was achieved as follows: based on the implementation results, basic criteria for BI software tool features were determined with the possible future comprehensive realisation and operation of a BI solution in mind, the BI software tools available on the market were surveyed, and the most suitable tool was selected by comparing the available options against these criteria. The primary outcome of this thesis is a practically usable BI solution.
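The tool-selection step described above amounts to a weighted-criteria comparison; the sketch below illustrates the idea with made-up criteria, weights and scores that are not the thesis's actual evaluation:

```python
# Weighted scoring of candidate BI tools against selection criteria.
# All names, weights and scores below are illustrative placeholders.
criteria_weights = {"licence cost": 0.3, "ease of use": 0.3,
                    "OLAP support": 0.2, "Excel integration": 0.2}

candidates = {
    "Tool A": {"licence cost": 4, "ease of use": 3, "OLAP support": 5, "Excel integration": 4},
    "Tool B": {"licence cost": 2, "ease of use": 5, "OLAP support": 4, "Excel integration": 5},
}

def weighted_score(scores: dict) -> float:
    """Sum of criterion scores multiplied by the criterion weights."""
    return sum(criteria_weights[c] * s for c, s in scores.items())

# Rank the candidates from best to worst overall score.
ranking = sorted(candidates.items(), key=lambda kv: weighted_score(kv[1]), reverse=True)
for name, scores in ranking:
    print(f"{name}: {weighted_score(scores):.2f}")
```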
237
Implementace finanční a majetkové analýzy municipalit pomocí open-source BI nástrojů / Implementation of the Financial and Property Analysis of Municipalities Using Open Source BI Tools
Černý, Ondřej, January 2011 (has links)
The objective of this thesis is a complete implementation of the Financial and Property Analysis of Municipalities (FAMA) methodology using open-source Business Intelligence (BI) tools. The FAMA methodology was developed at the Institute of Public Administration and Regional Development at the University of Economics, Prague, and monitors a wide range of aspects of municipal management. The main objective of this work is to create an application that allows users to analyse the management of municipalities clearly, using the indicators of this methodology, with open source BI tools. The theoretical part consists of two halves: the first is devoted to the analysis of municipal finances, describes the FAMA methodology itself and also covers the methods used by the Ministry of Finance to evaluate municipalities; the second introduces the principles and components used in BI, some of which are applied in the actual implementation. The practical part initially deals with the selection of suitable open-source BI tools, which are subsequently used to create the application for analysing the management of municipalities. The implementation itself is divided into several parts. First, an initial study is performed, based on an analysis of the source data and user requirements. Based on this analysis, the data warehouse is designed. Subsequently, an ETL project is created to process the financial reports of municipalities and store them in the data warehouse. After the data warehouse is populated, several OLAP cubes are created for multidimensional data analysis, and finally the presentation layer of the application is introduced and suitable graphical outputs for data presentation are designed. The main contribution of this thesis is the actual implementation of the FAMA methodology using the selected tools. The solution includes all indicators of the methodology and covers the financial data of all municipalities of the Czech Republic in the years 2001 to 2012; within the scope of this work, a complete, ready-to-use solution was produced.
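To give a flavour of the kind of indicator computed over the warehoused financial reports, here is a sketch with simplified, hypothetical formulas; these are not the actual FAMA indicator definitions:

```python
from dataclasses import dataclass

@dataclass
class MunicipalReport:
    """Simplified yearly figures for one municipality (illustrative fields only)."""
    municipality: str
    year: int
    total_revenue: float
    own_revenue: float
    total_debt: float
    population: int

def self_financing_ratio(r: MunicipalReport) -> float:
    """Share of revenue the municipality raises itself (hypothetical indicator)."""
    return r.own_revenue / r.total_revenue

def debt_per_capita(r: MunicipalReport) -> float:
    """Outstanding debt divided by population (hypothetical indicator)."""
    return r.total_debt / r.population

report = MunicipalReport("Example Town", 2010, 52_000_000, 31_000_000, 18_000_000, 6500)
print(f"{report.municipality} {report.year}: "
      f"self-financing {self_financing_ratio(report):.1%}, "
      f"debt per capita {debt_per_capita(report):,.0f} CZK")
```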
238
Vytvoření BI procesu dávky pro sdružení SOLUS v retailové bance / Creation of BI batch process for SOLUS in the retail bank
Fara, Miroslav, January 2012 (has links)
The thesis focuses on the batch processing of client data of a retail bank for the SOLUS association. The main goal is to create a batch process that generates files (a monthly report) about the clients of one of the banks operating on the Czech market, according to requirements pre-defined by SOLUS. This is achieved by analysing the requirements, designing a pilot solution, and creating and implementing the batch process for SOLUS. The main contribution of the thesis is a working BI process that stores the data in the required format in the data warehouse. By implementing this process, the retail bank can use the data from the register that have been saved there by the other member organizations participating in SOLUS. Analysing the data from the registry can give the retail bank detailed information about clients applying for its products and facilitates decision making during the approval process. The batch report generation for SOLUS was created using MS SQL Server 2008 R2 Management Studio, MS SQL Server 2008 R2 Business Intelligence Development Studio, and MS Excel 2010. The project analyses data from the internal databases of the retail bank. In order to comply with the rules for information security and data sensitivity, only a limited amount of information from the internal databases could be used for the purposes of this thesis, but it is sufficient to clearly demonstrate the functioning of the whole process.
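The batch pattern itself can be sketched independently of the MS SQL Server tooling used in the thesis. In the sketch below, the record layout, field names, reporting rule and file naming are hypothetical stand-ins, since the real SOLUS format is not reproduced here:

```python
import csv
from datetime import date

# Illustrative client records as they might come out of the warehouse query;
# the real process reads them from the retail bank's data warehouse instead.
clients = [
    {"client_id": "C001", "contract_id": "K123", "days_past_due": 0,  "balance": 0.0},
    {"client_id": "C002", "contract_id": "K456", "days_past_due": 95, "balance": 15200.0},
]

def export_monthly_report(records, run_date: date, out_dir: str = ".") -> str:
    """Write one monthly batch file; only overdue contracts are reported here."""
    path = f"{out_dir}/solus_report_{run_date:%Y%m}.csv"  # hypothetical naming scheme
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["client_id", "contract_id",
                                               "days_past_due", "balance"])
        writer.writeheader()
        for rec in records:
            if rec["days_past_due"] > 90:   # illustrative reporting rule
                writer.writerow(rec)
    return path

print(export_monthly_report(clients, date.today()))
```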
239
Association rule mining as a support for OLAP / Dolování asociačních pravidel jako podpora pro OLAP
Chudán, David, January 2010 (has links)
The aim of this work is to identify the possibilities of using two analytical methods in a complementary way: OLAP analysis and data mining, represented here by GUHA association rule mining. Using both methods, within the proposed scenarios, on a single dataset is expected to have a synergistic effect, surpassing the knowledge acquired by either method alone; this is the main contribution of the work. Another contribution is the original use of GUHA association rules, where the mining is performed on aggregated data. In their expressive capabilities, GUHA association rules surpass the classic association rules referred to in the literature. Experiments on real data demonstrate the discovery of unusual trends that would be very difficult to find using standard OLAP analysis, i.e. time-consuming manual browsing of an OLAP cube. On the other hand, using association rules alone means losing the general overview of the data, so the two methods complement each other very well. Part of the solution is also the use of the LMCL scripting language, which automates selected parts of the data mining process. The proposed recommender system would shield users from the association rules themselves, enabling ordinary analysts unfamiliar with association rules to exploit their possibilities. The thesis combines quantitative and qualitative research. Quantitative research is represented by the experiments on a real dataset, the proposal of the recommender system and the implementation of selected parts of the association rule mining process in the LISp-Miner Control Language. Qualitative research is represented by structured interviews with selected experts from the fields of data mining and business intelligence, who confirm the meaningfulness of the proposed methods.
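GUHA evaluates a rule candidate on a four-fold (4ft) contingency table rather than only on support and confidence; a minimal sketch of the commonly used founded-implication quantifier follows, with toy frequencies rather than results from the thesis:

```python
from dataclasses import dataclass

@dataclass
class FourFoldTable:
    """4ft contingency table of an association rule antecedent => succedent."""
    a: int  # objects satisfying both antecedent and succedent
    b: int  # antecedent only
    c: int  # succedent only
    d: int  # neither

def founded_implication(t: FourFoldTable, p: float, base: int) -> bool:
    """Founded implication quantifier: confidence a/(a+b) >= p and a >= base."""
    return t.a >= base and t.a / (t.a + t.b) >= p

# Toy example: 120 objects satisfy both parts, 20 only the antecedent.
table = FourFoldTable(a=120, b=20, c=45, d=815)
print(founded_implication(table, p=0.85, base=100))  # True: 120/140 ~= 0.857
```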
240
Využití systému SAS při tvorbě datových skladů a optimalizaci ETL procesů / Using the SAS System for Building Data Warehouses and Optimalization of ETL Processes
Pešička, Michal, January 2008 (has links)
This diploma thesis deals with the usability of the SAS system and its components for building and running a data warehouse and a complete Business Intelligence solution. It begins by introducing the meaning and benefits of adopting Business Intelligence and its place in an organization, focusing in particular on the running BI project in the insurance company Kooperativa, a.s. The main goal of the thesis is to examine the ETL processes of the data warehouse, their specifics, characteristics and the regular tasks solved across the data layers, to measure their performance and to assess the feasibility of ETL optimization. This optimization can be considered from two points of view: the creation and maintenance of the ETL source code, and tuning for faster data processing. The log files, which are the main source for performance monitoring, are processed by a macro program tailored specifically to this purpose. The results obtained are analysed and, on that basis, the spots that need attention are outlined. The last part compares several alternatives to the data transformation processes typically solved by ETL tasks. The results can serve as hints for designing and tuning other, similar ETL processes.
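As a rough illustration of log-based performance measurement: SAS writes notes such as real time and cpu time after each step, and these can be harvested to compare step durations. The sketch below is a generic Python illustration over a made-up log excerpt, not the macro program developed in the thesis:

```python
import re

# Made-up excerpt in the spirit of a SAS log; the real macro program in the
# thesis parses actual logs produced by the ETL jobs.
sas_log = """\
NOTE: DATA statement used (Total process time):
      real time           12.47 seconds
      cpu time            8.03 seconds
NOTE: PROCEDURE SORT used (Total process time):
      real time           95.10 seconds
      cpu time            41.62 seconds
"""

step_pattern = re.compile(r"NOTE: (?P<step>.+?) used \(Total process time\):")
time_pattern = re.compile(r"real time\s+(?P<seconds>[\d.]+) seconds")

timings = []
current_step = None
for line in sas_log.splitlines():
    step_match = step_pattern.search(line)
    if step_match:
        current_step = step_match.group("step")
        continue
    time_match = time_pattern.search(line)
    if time_match and current_step:
        timings.append((current_step, float(time_match.group("seconds"))))
        current_step = None

# Longest-running steps first: candidates for ETL tuning.
for step, seconds in sorted(timings, key=lambda t: t[1], reverse=True):
    print(f"{step}: {seconds:.2f} s")
```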