  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
71

Implementace Business Intelligence v malé firmě a praktické ověření v personální agentuře / Implementation of Business Intelligence in a small company and its practical verification in a personnel placement agency

Dörfl, Jakub January 2006 (has links)
The aim of this thesis is the design and implementation of Business Intelligence in a small company, specifically a personnel agency providing "temporary help" services. The requirements for the BI application are formulated on the basis of the Balanced Scorecard methodology. The theoretical part describes the basic principles and tools of the BSC and BI concepts, including the definition of their mutual relationship and the specific demands arising from their use in a small company. The practical part verifies the findings of the theoretical part in a personnel agency. Within the BSC, concrete strategic goals with measures were defined and strategic actions were determined. The selected strategic goals are monitored by the developed BI application in the MS SQL 2000 and MS Office XP environment.
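At the core of the BSC-to-BI link described in this abstract is a measure tracked against a target for each strategic goal. The snippet below is only a hedged illustration of that idea in Python, not the thesis's implementation (which is built on MS SQL 2000 and MS Office XP); the objective, KPI, target, and threshold values are invented.

```python
from dataclasses import dataclass

@dataclass
class ScorecardMeasure:
    """One Balanced Scorecard measure tied to a strategic objective and
    evaluated against its target by the BI layer."""
    objective: str
    kpi: str
    target: float
    actual: float

    def status(self) -> str:
        # Simple traffic-light rule; the thresholds are illustrative only.
        ratio = self.actual / self.target if self.target else 0.0
        if ratio >= 1.0:
            return "on target"
        return "watch" if ratio >= 0.9 else "off target"

# Hypothetical measure for a temporary-help staffing agency.
m = ScorecardMeasure(
    objective="Grow temporary-help placements",
    kpi="Placements per month",
    target=120,
    actual=104,
)
print(f"{m.kpi}: {m.actual}/{m.target} -> {m.status()}")
```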
72

Analisando os dados do programa de melhoramento genético da raça nelore com data warehousing e data mining. / Analyzing the data of the Nelore breed genetic improvement program with data warehousing and data mining.

Marques, Valmir Ferreira 28 October 2002 (has links)
The database of the Nelore Breed Genetic Improvement Program has been growing considerably, so the creation of an environment to support the analysis of the Program's data is of fundamental importance. The technologies used to build such an analytical environment are the Data Warehousing and Data Mining processes. In this work, a Data Warehouse and OLAP queries were constructed to provide multidimensional views of the data. In addition to the analyses performed with these queries, a Visual Data Mining tool was also used. The analytical environment developed gives the Program's researchers and breeders greater power to analyze their data. The whole process of developing this environment is presented here.
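For readers unfamiliar with the kind of multidimensional view this abstract mentions, the sketch below shows the idea in miniature: a fact table with two dimensions rolled up into an OLAP-style cross tab. It is a hypothetical pandas example, not code from the thesis, and the column names (year, herd, breeding_value) are invented for illustration.

```python
import pandas as pd

# Hypothetical fact table: one row per animal evaluation, with two
# dimensions (year, herd) and one measure (estimated breeding value).
facts = pd.DataFrame({
    "year": [2000, 2000, 2001, 2001, 2002, 2002],
    "herd": ["A", "B", "A", "B", "A", "B"],
    "breeding_value": [1.2, 0.8, 1.5, 1.1, 1.7, 1.3],
})

# An OLAP-style view: average measure per (year, herd) cell, with an
# "All" row/column playing the role of the roll-up totals.
cube = pd.pivot_table(
    facts,
    values="breeding_value",
    index="year",
    columns="herd",
    aggfunc="mean",
    margins=True,
    margins_name="All",
)
print(cube)
```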
74

A Spatio-Temporal Model for the Evaluation of Education Quality in Peru

Alperin, Juan Pablo 28 January 2008 (has links)
The role of information and communication technologies in the development of modern societies has continuously increased over the past several decades. In particular, recent unprecedented growth in use of the Internet in many developing countries has been accompanied by greater information access and use. Along with this increased use, there have been significant advances in the development of technologies that can support the management and decision-making functions of decentralized government. However, the amount of data available to administrators and planners is increasing at a faster rate than their ability to use these resources effectively. A key issue in this context is the storage and retrieval of spatial and temporal data. With static data, a planner or analyst is limited to studying cross-sectional snapshots and has little capability to understand trends or assess the impacts of policies. Education, which is a vital part of the human experience and one of the most important aspects of development, is a spatio-temporal process that demands the capacities to store and analyze spatial distributions and temporal sequences simultaneously. Local planners must not only be able to identify problem areas, but also know if a problem is recent or on-going. They must also be able to identify factors which are causing problems for remediation and, most importantly, to assess the impact of remedial interventions. Internet-based tools that allow for fast and easy on-line exploration of spatio-temporal data will better equip planners for doing all of the above. This thesis presents a spatio-temporal on-line data model using the concept or paradigm of space-time. The thesis demonstrates how such a model can be of use in the development of customized software that addresses the evaluation of early childhood education quality in Peru.
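The space-time idea this abstract relies on can be reduced to a very small sketch: every observation carries both a location key and a time stamp, so one store answers cross-sectional questions (a snapshot of all districts in one year) and longitudinal ones (one district followed through time). This is a hypothetical Python illustration, not the thesis's actual data model; the district names and the indicator are invented.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """One space-time cell: an education-quality indicator measured for
    one district in one year."""
    district: str
    year: int
    indicator: float  # e.g. share of schools meeting a quality standard

data = [
    Observation("Cusco", 2005, 0.41), Observation("Cusco", 2006, 0.47),
    Observation("Lima", 2005, 0.63), Observation("Lima", 2006, 0.61),
]

def snapshot(obs, year):
    """Cross-sectional view: all districts at one point in time."""
    return {o.district: o.indicator for o in obs if o.year == year}

def trajectory(obs, district):
    """Temporal view: one district followed through time, which is what
    lets a planner tell a recent problem from an on-going one."""
    return sorted((o.year, o.indicator) for o in obs if o.district == district)

print(snapshot(data, 2006))       # spatial distribution in 2006
print(trajectory(data, "Cusco"))  # trend for one district
```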
76

A Count-Based Partition Approach to the Design of the Range-Based Bitmap Indexes for Data Warehouses

Lin, Chien-Hsiu 29 July 2004 (has links)
Data warehouses contain data consolidated from several operational databases and provide historical, summarized data, which is more appropriate for analysis than detailed individual records. On-Line Analytical Processing (OLAP) provides advanced analysis tools to extract information from data stored in a data warehouse. Fast response time is essential for on-line decision support, and a bitmap index can reach this goal in read-mostly environments. When data has high cardinality, we prefer the Range-Based Index (RBI), which divides the attribute values into several partitions and uses one bitmap vector to represent each range. With RBI, however, the number of records assigned to different ranges can be highly unbalanced, resulting in very different disk-access costs for different queries. Wu et al. proposed an algorithm for RBI, DBEC, which takes the data distribution into consideration, but the DBEC strategy cannot guarantee a partition result with the given number of bitmap vectors, PN. Moreover, records with the same value may be partitioned into different bitmap vectors, which costs extra disk I/O time. Therefore, we propose the IPDF, CP, and CP* strategies for constructing dynamic range-based indexes for the case where data has high cardinality and is not uniformly distributed. The IPDF strategy decides each partition according to the probability density function (p.d.f.) of the data. The CP strategy sorts the data and partitions it into PN groups, placing every w consecutive records into one group. The CP* strategy improves on CP by adjusting the cutting points so that records with the same value are assigned to the same partition. We can also take the history of users' queries into consideration: based on a greedy approach, we propose the GreedyExt strategy for answering exact-match queries and the GreedyRange strategy for answering range queries. Both strategies choose the set of queries from which to construct the bitmap vectors so that the average query response time is reduced. Moreover, since a bitmap index consists of a set of bitmap vectors whose total size can be much larger than the capacity of the disk, we propose the FZ strategy, which compresses each bitmap vector to reduce storage space while still supporting efficient bitwise operations without decompression. Finally, our performance analysis shows that the CP* strategy can outperform the CP strategy in terms of the number of disk accesses. Our simulations show that the ranges produced by the IPDF and CP* strategies are more uniform than those produced by the DBEC strategy, that the GreedyExt and GreedyRange strategies provide fast response times in most situations, and that the FZ strategy reduces storage space more than the WAH strategy.
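The CP/CP* idea above (sort the attribute values, cut them into at most PN groups of roughly equal record counts, and keep equal values inside one partition) can be pictured with a short sketch. The code below is not taken from the thesis; it is a minimal Python illustration under those assumptions, with illustrative function names (count_based_partition, build_range_bitmaps) invented here.

```python
from bisect import bisect_left

def count_based_partition(values, pn):
    """Count-based partitioning in the spirit of CP/CP*: sort the values and
    cut them into at most `pn` ranges holding roughly the same number of
    records. Using the value at each cut position as an inclusive upper bound
    keeps records with equal values inside the same partition."""
    ordered = sorted(values)
    n = len(ordered)
    cuts = [ordered[((i + 1) * n) // pn - 1] for i in range(pn)]
    return sorted(set(cuts))          # drop duplicate cutting points

def build_range_bitmaps(values, cuts):
    """One bitmap vector per range; bit i is set when record i falls into
    that range (cuts are inclusive upper bounds)."""
    bitmaps = [[0] * len(values) for _ in cuts]
    for i, v in enumerate(values):
        r = min(bisect_left(cuts, v), len(cuts) - 1)
        bitmaps[r][i] = 1
    return bitmaps

# A skewed, high-cardinality attribute: equal-width ranges would be badly
# unbalanced, while count-based ranges stay even.
data = [1, 2, 2, 3, 50, 51, 52, 53, 54, 900]
cuts = count_based_partition(data, pn=3)
bitmaps = build_range_bitmaps(data, cuts)
print(cuts)                    # inclusive upper bound of each range
for b in bitmaps:
    print(b)                   # one bitmap vector per range
```

With skewed data like this sample, the cutting points adapt to the distribution, so each bitmap vector covers roughly the same number of records, which is what keeps the per-query disk-access cost balanced.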
77

Designing Portfolio Analysis System for E-Learning

Tsai, Min-Fang 06 August 2001 (has links)
In recent years, owing to the flourishing growth of the Internet, protocols are approaching standardization and applications are maturing by the day. Moreover, with the aid of online educational resources and multimedia tools, web-based instruction systems can build an Internet learning environment that differs from the traditional one. Teaching on the Internet not only overcomes the limits of time and space, but also lets students choose when and where they learn. Furthermore, all learning activities can be recorded automatically in Web logs without interfering with students as they browse learning material, take part in interactive discussions, or learn in teams. Because the Web logs also contain a huge amount of information with no educational meaning, teachers cannot use them directly to observe student learning behavior or check on learning progress, and many Web-log analysis tools neither let teachers see the educational meaning in the logs nor support learning portfolio analysis. This research aims to make the analysis of student Web logs easy: OLAP and graphical data analysis techniques are used to design a visual learning portfolio analysis system. The system lets teachers analyze learning activities from different perspectives and through different graphical views in order to understand the extent of student participation in learning activities. In addition, evidence of student participation is surfaced, which helps teachers evaluate learning outcomes effectively.
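The core of the analysis described above can be pictured as a simple roll-up from raw Web-log records to per-student activity counts, which the OLAP and charting layers then slice and visualize. The following is a hypothetical Python sketch, not the system's implementation; the log fields and activity names are invented.

```python
from collections import Counter

# Hypothetical cleaned Web-log records: (student, activity) pairs left
# over after filtering out requests with no educational meaning.
log = [
    ("alice", "forum"), ("alice", "material"), ("alice", "forum"),
    ("bob", "material"), ("bob", "material"),
    ("carol", "forum"), ("carol", "teamwork"),
]

# Portfolio view: how often each student took part in each activity.
portfolio = Counter(log)
for (student, activity), hits in sorted(portfolio.items()):
    print(f"{student:<6} {activity:<9} {hits}")
```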
78

Veiklos analizės IS smulkiojo verslo įmonėms / A business intelligence system for the small-business market

Dapkevičius, Vytenis 18 January 2006 (has links)
A business intelligence system for the small-business market.
79

OLAP sistemų analizė / The analysis of OLAP systems

Rukavičius, Valdas 25 May 2005 (has links)
The amount of information being exchanged is constantly growing, and all of this gathered data is excellent material for analysis, yet it is only partially used. The most common problem is how to organize the data into the form needed for business analysis. Usually these functions are provided by decision support systems, but their functions and tools are not flexible and cannot give answers in real time. OLAP systems provide the ability to analyse data across all of its dimensions online. OLAP technology includes tools and measures for transforming and storing information, for creating and executing queries, and for the graphical user interface. As OLAP systems become more affordable to companies, new issues arise: the main problems are how to choose the best OLAP products available in the market and how to design and implement an OLAP system according to business requirements. The aims of this work are to identify the most important criteria, to compare different OLAP systems at the product and data-structure levels, and to design and implement an OLAP system for a telecommunications company according to the results of the analysis. The analysis of OLAP systems was divided into an analysis of the OLAP products available in the market and an analysis of an individual OLAP system. The analysis showed that the choice of OLAP products depends on the company's software environment and analysis capacity, and that each OLAP system must be designed according to specific analysis needs. According to the analysis results... [to full text]
80

An evaluative case report of the group decision manager: a look at the communication and coordination issues facing online group facilitation

Bell, Daniel M., January 1998 (has links)
Thesis (Ph. D.)--University of Missouri-Columbia, 1998. / Typescript. Vita. Includes bibliographical references (leaves 110-112). Also available on the Internet.
