51

Design of a performance evaluation tool for multimedia databases with special reference to Oracle

Stakemire, Tonia January 2004 (has links)
Increased production and use of multimedia data has led to the development of more advanced Database Management Systems (DBMSs), such as Object Relational Database Management Systems (ORDBMSs). These advanced databases are necessitated by the structural complexity of multimedia data and the functionality it requires. Unfortunately, no suitable benchmarks exist with which to test the performance of databases when handling multimedia data. This thesis describes the design of a benchmark to measure the performance of basic functionality found in multimedia databases. The benchmark, called MORD (Multimedia Object Relational Databases), targets Oracle, a well-known commercial ORDBMS that can handle multimedia data. Although MORD targets Oracle, it can easily be applied to other Multimedia Database Management Systems (MMDBMSs) as a result of a design that stressed portability and simplicity. MORD consists of a database schema, test data, and code to simulate representative queries on multimedia databases. A number of experiments are described that validate MORD, confirming that its design is correct and its objectives are met. A by-product of these experiments is an initial understanding of the performance of multimedia databases. The experiments show that with multimedia data the buffer cache should be at least large enough to hold the largest dataset, that a larger block size improves performance, and that turning off logging and caching during bulk loading improves performance. MORD can be used to compare different ORDBMSs or to assist in the configuration of a specific database.
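The kind of measurement a benchmark like MORD performs can be sketched as a simple timing harness. The workload function below is a hypothetical stand-in; the real benchmark issues representative SQL queries against Oracle rather than calling a Python function.

```python
import time
from statistics import mean

def run_benchmark(query, runs=5):
    """Time a callable repeatedly and return the mean elapsed seconds.

    `query` stands in for one representative multimedia operation
    (e.g. fetching a stored image by key); repeating the run smooths
    out transient effects such as a cold buffer cache.
    """
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        query()
        timings.append(time.perf_counter() - start)
    return mean(timings)

# Hypothetical workload: simulate fetching a large binary object.
def fetch_image():
    blob = bytes(1024 * 1024)  # 1 MiB placeholder "image"
    return len(blob)

avg = run_benchmark(fetch_image)
print(f"mean elapsed: {avg:.6f} s")
```

In a real harness the first (cold-cache) run would typically be reported separately from the warm runs, since the thesis's findings concern exactly that buffer-cache effect.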
52

Word processing: What features need to be learned first to be productive fast?

Chapman, Deena Jacques 01 January 1991 (has links)
No description available.
53

Query processing optimization for distributed relational database systems: an implementation of a heuristic based algorithm

Stoler, Moshe January 1987 (has links)
The first step of the program is to input the statistical information concerning the relations of the database. This information is stored in the log file and the file matrix data structures. Next, the query itself is read and stored in an array called the query matrix. The program examines the various fields of this matrix and decides which relations in the database are necessary to answer the query. For these relations it determines those attributes which should be eliminated and those which should be preserved for further processing. The key attributes are identified and are projected along with the other attributes. After the initial projection is completed, the sizes of the new temporary relations are evaluated and stored in the appropriate fields of the file matrix structure. The program then examines that part of the query which contains the various restrictions on the attributes. The values of the attributes are sorted, and those values which do not match the restrictions are eliminated from the log file. Again, the sizes of the new relations are estimated according to the method described by Egyhazy et al. [6]. A second projection is performed to eliminate attributes which were required by the selection phase but are not part of the final answer to the query. The remaining relations are those which need to be joined to form a relation with the required information. In order to decide which relations to join, a special table, the join matrix, is created. This table contains pairs of relations which have common attributes and common values and are therefore joinable. The LP algorithm is used to determine the least expensive join out of all the possible joins. This process is repeated until all of the relations are joined to form a single relation which answers the query. As in the case of projection and selection, the size of the temporary relations after each join is estimated.
As a last step, we remove the key attributes which helped in joining the files but are not part of the answer to the query. / Master of Engineering
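The join-selection loop described above can be sketched as a greedy procedure over a join matrix. The cost model (product of the input cardinalities) and the intermediate-size estimate (maximum of the inputs) below are placeholders, not the LP algorithm or the estimation method of Egyhazy et al. used in the thesis.

```python
def cheapest_join_order(sizes, joinable_pairs):
    """Greedy join ordering: repeatedly perform the least expensive join.

    sizes: dict of relation name -> estimated cardinality.
    joinable_pairs: pairs of relations sharing common attributes/values
    (the "join matrix").  Cost of a join is approximated as |a| * |b|,
    and each intermediate result's size as max(|a|, |b|) -- both are
    stand-ins for the thesis's estimates.
    """
    sizes = dict(sizes)
    pairs = sorted({tuple(sorted(p)) for p in joinable_pairs})
    order = []
    while len(sizes) > 1 and pairs:
        # Pick the least expensive join among all joinable pairs.
        a, b = min(pairs, key=lambda p: sizes[p[0]] * sizes[p[1]])
        order.append((a, b))
        merged = f"({a}+{b})"
        sizes[merged] = max(sizes.pop(a), sizes.pop(b))
        # Anything joinable with a or b is now joinable with the merged
        # relation; drop pairs whose members no longer exist.
        updated = set()
        for x, y in pairs:
            if x in sizes and y in sizes:
                updated.add((x, y))
            else:
                other = y if x in (a, b) else x if y in (a, b) else None
                if other is not None and other in sizes and other != merged:
                    updated.add(tuple(sorted((merged, other))))
        pairs = sorted(updated)
    return order

order = cheapest_join_order({"R": 100, "S": 10, "T": 50},
                            [("R", "S"), ("S", "T")])
print(order)  # joins S and T first, since 10 * 50 < 100 * 10
```

The loop terminates once all relations have been merged into one, mirroring the repeated least-expensive-join step described in the abstract.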
54

Effect of certain parameters on response time of an Oracle database

Aihe, David Osemeahon 01 April 2001 (has links)
No description available.
55

A hypertext graph theory reference system

Islam, Mustafa R. January 1993 (has links)
The G-Net system is being developed by the members of the G-Net research group under the supervision of Dr. K. Jay Bagga. The principal objective of the G-Net system is to provide an integrated tool for dealing with various aspects of graph theory. The G-Net system is divided into two parts: GETS (Graph theory Experiments Tool Set), which will provide a set of tools to experiment with graph theory, and HYGRES (HYpertext Graph theory Reference Service), the second subcomponent of the G-Net system, which will aid graph theory study and research. In this research a hypertext application is built to present graph theory concepts, graph models, and algorithms. In other words, HYGRES (Guide Version) provides the hypertext facilities for organizing a graph theory database in a very natural and interactive way. A hypertext application development tool, called Guide, is used to implement this version of HYGRES. This project integrates the existing version of GETS so that it can also provide important services to HYGRES. The motivation behind this project is to study the initial criteria for developing a hypertext system, which can be used for future development of a stand-alone version of the G-Net system. / Department of Computer Science
56

Analysis of multiple software releases of AFATDS using design metrics

Bhargava, Manjari January 1991 (has links)
Developing high-quality software the first time greatly depends upon the ability to judge the potential quality of the software early in the life cycle. The Software Engineering Research Center design metrics research team at Ball State University has developed a metrics approach for analyzing software designs. Given a design, these metrics highlight stress points and determine overall design quality. The purpose of this study is to analyze multiple software releases of the Advanced Field Artillery Tactical Data System (AFATDS) using design metrics. The focus is on examining the transformations of design metrics at each of three releases of AFATDS to determine the relationship of design metrics to the complexity and quality of a maturing system. The software selected as a test case for this research is the Human Interface code from Concept Evaluation Phase releases 2, 3, and 4 of AFATDS. To automate the metric collection process, a metric tool called the Design Metric Analyzer was developed. Analysis of the design metrics data indicated that the standard deviation and mean for the metric were higher for release 2, relatively lower for release 3, and again higher for release 4. This suggests a decrease in complexity and an improvement in the quality of the software from release 2 to release 3, and an increase in complexity in release 4. Dialog with project personnel regarding design metrics confirmed most of these observations. / Department of Computer Science
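The release-to-release comparison described above amounts to computing the mean and standard deviation of a design metric across modules in each release. The values below are invented for illustration only; they are not AFATDS data.

```python
from statistics import mean, stdev

# Hypothetical per-module values of one design metric, three releases.
releases = {
    "release 2": [12, 30, 7, 44, 19],
    "release 3": [10, 14, 9, 12, 11],
    "release 4": [25, 41, 8, 37, 22],
}

# Summarise each release as (mean, standard deviation).
summary = {
    name: (round(mean(vals), 2), round(stdev(vals), 2))
    for name, vals in releases.items()
}
for name, (m, s) in summary.items():
    print(f"{name}: mean={m}, stdev={s}")
```

With data shaped like the study's findings, both statistics dip at release 3 and rise again at release 4, which is the pattern the abstract interprets as a quality improvement followed by renewed complexity.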
57

A hypertext application and system for G-net and the complementary relationship between graph theory and hypertext

Sawant, Vivek Manohar January 1993 (has links)
Many areas of computer science use graph theory and thus benefit from research in graph theory. Some of the important activities involved in graph theory work are the study of concepts, algorithm development, and theorem proving. These can be facilitated by providing computerized tools for graph drawing, algorithm animation and accessing graph theory information bases. Project G-Net is aimed at developing a set of such tools.Project G-Net has chosen to provide the tools in hypertext form based on the analysis of users' requirements. The project is presently developing a hypertext application and a hypertext system for providing the above set of tools. In the process of this development various issues pertaining to hypertext authoring, hypertext usability and application of graph theory to hypertext are being explored.The focus of this thesis is in proving that hypertext approach is most appropriate for realizing the goals of the G-Net project. The author was involved in the research that went into analysis of requirements, design of hypertext application and system, and the investigation of the complementary relationship between graph theory and hypertext. / Department of Computer Science
58

PDF shopping system with the lightweight currency protocol

Wang, Yingzhuo 01 January 2005 (has links)
This project is a web application for two types of bookstores: an E-Bookstore and a PDF-Bookstore. Both are document sellers; however, the E-Bookstore is not a currency user. The PDF-Bookstore sells PDF documents and issues a lightweight currency called Scart. Customers can sell their PDF documents to earn Scart currency and buy PDF documents by paying with Scart.
59

Sensitivity analysis on a simulated helpdesk system with respect to input distributions with special reference to the circumference method

Roux, Johanna Wileria 01 January 2002 (has links)
Simulation analysis makes use of statistical distributions to specify the parameters of input data. It is well known that fitting a distribution to empirical data is more of an art than a science (Banks J., 1998, p. 74) because of the difficulty of constructing a 'good' histogram. The most difficult step is choosing an appropriate interval width. Too small a width will produce a ragged histogram, whereas too large a width will produce one that is overaggregated and block-like. De Beer and Swanepoel (1999) have developed 'Simple and effective number-of-bins circumference selectors' for creating histograms for the purpose of fitting distributions. When using simulation software such as Arena, one can generally fit distributions to input data using a built-in function in the software. If input distributions could be compared regarding their effect on the outcomes of a simulation model, one could assess whether input distributions generated by Arena could be accepted unconditionally or whether one should pay special attention to the input distributions used in the simulation model. In this study a simulation model of a computer helpdesk system is constructed to test the effect of input distributions. Distributions fitted with the 'circumference technique' are compared with those from the simulation package, Arena, and those calculated by the statistical package 'Statistica', and then compared with empirical distributions. In the helpdesk system, calls from employees experiencing problems with any computer hardware or software are logged, redirected when necessary, attended to, resolved and then closed. Queue statistics of the simulation model using input distributions suggested by Arena as opposed to input distributions deduced from the other methods are compared, and a conclusion is reached as to how important or unimportant it is for this specific model to select appropriate input distributions. / Business Management / M. Com. (Quantitative Management)
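The sensitivity of a histogram to its interval width can be illustrated with a minimal sketch. Sturges' rule stands in here for the circumference-based number-of-bins selectors of De Beer and Swanepoel, which are not reproduced; the simulated call-handling times are invented data.

```python
import math
import random

def histogram(data, bins):
    """Bucket `data` into `bins` equal-width intervals; return counts."""
    lo, hi = min(data), max(data)
    width = (hi - lo) / bins or 1.0  # guard against a zero-range sample
    counts = [0] * bins
    for x in data:
        i = min(int((x - lo) / width), bins - 1)
        counts[i] += 1
    return counts

def sturges_bins(n):
    """Sturges' rule, 1 + ceil(log2(n)) bins -- a simple stand-in for
    the circumference selectors discussed in the thesis."""
    return 1 + math.ceil(math.log2(n))

random.seed(42)
sample = [random.gauss(300, 60) for _ in range(500)]  # call-handling times

# Too few bins is block-like; far too many leaves ragged, empty bins.
for bins in (3, sturges_bins(len(sample)), 200):
    counts = histogram(sample, bins)
    print(f"{bins:3d} bins -> {counts.count(0)} empty intervals")
```

Running the loop shows the trade-off the abstract describes: the 3-bin histogram hides all shape, while the 200-bin histogram scatters the 500 observations so thinly that many intervals are empty.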
60

The effect of enterprise resource planning systems on the financial statement audit of a higher education institution

14 July 2015 (has links)
M.Com. (Computer Auditing) / This study investigates the effects of the implementation and upgrade of financial Enterprise Resource Planning (hereafter ERP) systems, particularly the Oracle system, on financial reporting and audit. It also determines whether the independent external auditors play a vital role in the process of implementing internal controls during the implementation and upgrade of the Oracle system at a higher education institution (hereafter HEI). With ever-evolving information technology, it is of utmost importance that the necessary controls be implemented. A sample of 18 Oracle system users from the HEI finance expenditure department and the HEI's independent external auditors is surveyed, and the results of the survey are used to provide advice to organisational management on measures that should be implemented to ensure smooth systems implementation and post-implementation results. The empirical study indicates that the HEI had adequate measures and controls in place to ensure that the ERP implementation ran smoothly and threats were avoided, resulting in a successful implementation for competitive advantage in the HEI.
