921

An appraisal of secure, wireless grid-enabled data warehousing

Seelo, Gaolathe January 2007 (has links)
In most research, appropriate collections of data play a significant role in aiding decision-making processes. This is more critical if the data is being accessed across organisational barriers. Further, for the data to be mined and analysed efficiently, to aid decision-making processes, it must be harnessed in a suitably structured fashion. There is, for example, a need to perform diverse data analyses and interpretation of structured (non-personal) HIV/AIDS patient-data from various quarters in South Africa. Although this data does exist, to some extent, it is autonomously owned and stored in disparate data stores, and not readily available to all interested parties. In order to put this data to meaningful use, it is imperative to integrate and store this data in a manner in which it can be better utilized by all those involved in the ontological field. This implies integration of (and hence, interoperability between), and appropriate accessibility to, the information systems of the autonomous organizations providing data and data-processing. This is a typical problem-scenario for a Virtual Inter-Organisational Information System (VIOIS), proposed in this study. The VIOIS envisaged is a hypothetical, secure, Wireless Grid-enabled Data Warehouse (WGDW) that enables IOIS interaction, such as the storage and processing of HIV/AIDS patient-data to be utilized for HIV/AIDS-specific research. The proposed WGDW offers a methodical approach for arriving at such a collaborative (HIV/AIDS research) integrated system. The proposed WGDW is a virtual community that consists mainly of data-providers, service-providers and information-consumers. The WGDW basis resulted from a systematic literature survey that covered a variety of technologies and standards that support data storage, data management, computation and connectivity between virtual community members in Grid computing contexts. A Grid computing paradigm is proposed for data storage, data management and computation in the WGDW. Informational or analytical processing will be enabled through data warehousing, while connectivity will be attained wirelessly (addressing the paucity of connectivity infrastructure in rural parts of developing countries, like South Africa).
922

Continuous auditing technologies and models

Blundell, Adrian Wesley January 2007 (has links)
Continuous auditing is not a totally new concept, but it has not been widely implemented, and has existed mostly as a point of debate amongst the auditing fraternity. This may soon change, as continuous auditing has become a topic of great interest, especially in the last decade. This may be due to a combination of reasons. In the last decade, much of the confidence in auditors’ reports was lost due to corporate governance scandals. This also brought about a greater desire for faster, more reliable reporting on which to base decisions. This desire has been transposed into regulations such as the Sarbanes-Oxley Act in the United States, which encourages real-time auditing activities, which would benefit from continuous auditing. A second possible contributing factor to the heightened interest in continuous auditing is that much of the requisite technology has matured to a point where it can be successfully used to implement continuous auditing. It is these technologies which form the focus of this research. It is, therefore, the primary objective of this research to investigate and identify the essential technologies, and to identify and define their roles within a continuous auditing solution. To explore this area, three models of continuous auditing are compared according to the roles of the technologies within them. The roots of some auditing technologies which can be adapted to the paradigm of continuous auditing are explored, as well as new technologies, such as XML-based reporting languages. In order to fully explore these technologies, the concepts of data integrity and data quality are first defined and discussed, and some security measures which contribute to integrity are identified. An obstacle to implementing a continuous model is that, even with the newly available technologies, the multitude of systems used in organisations produce data in a plethora of formats. In performing an audit, the continuous auditing system needs first to gather this data and then to be able to compare “apples with apples”. Therefore, the technologies which can be used to acquire and standardise the data are identified.
923

The adaptive adjustment of digital data receivers using pre-detection filter

Ameen, Siddeeq Yousif January 1990 (has links)
This thesis is concerned with the adaptive adjustment of digital data receivers employed in synchronous serial data transmission systems that use quadrature amplitude modulation. The receiver employs a pre-detection filter that forms the first part of a decision feedback equalizer or else is used ahead of a near-maximum-likelihood detector. The filter is ideally adjusted such that the sampled impulse response of the channel and filter is minimum phase.
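The minimum-phase condition mentioned above can be stated concretely: the z-transform of the sampled impulse response of the channel and filter should have all of its zeros strictly inside the unit circle. The sketch below is a generic illustration of that property check for hypothetical tap values, not the adaptive adjustment algorithm developed in the thesis.

```python
import numpy as np

def is_minimum_phase(h, tol=1e-9):
    """True if all zeros of the z-transform of the sampled impulse
    response h lie strictly inside the unit circle."""
    zeros = np.roots(h)          # roots of h[0]*z**(n-1) + ... + h[n-1]
    return bool(np.all(np.abs(zeros) < 1.0 - tol))

# Hypothetical 3-tap responses: h1 is minimum phase; h2 is its time reverse,
# whose zeros are the reciprocals of h1's zeros, and so is not.
h1 = [1.0, 0.5, 0.2]
h2 = [0.2, 0.5, 1.0]
print(is_minimum_phase(h1))  # True
print(is_minimum_phase(h2))  # False
```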
924

Towards a Privacy Preserving Framework for Publishing Longitudinal Data

Sehatkar, Morvarid January 2014 (has links)
Recent advances in information technology have enabled public organizations and corporations to collect and store huge amounts of individuals' data in data repositories. Such data are powerful sources of information about an individual's life such as interests, activities, and finances. Corporations can employ data mining and knowledge discovery techniques to extract useful knowledge and interesting patterns from large repositories of individuals' data. The extracted knowledge can be exploited to improve strategic decision making, enhance business performance, and improve services. However, person-specific data often contain sensitive information about individuals, and publishing such data poses potential privacy risks. To deal with these privacy issues, data must be anonymized so that no sensitive information about individuals can be disclosed from published data, while distortion is minimized to ensure the usefulness of data in practice. In this thesis, we address privacy concerns in publishing longitudinal data. A data set is longitudinal if it contains information about the same observation or event for individuals, collected at several points in time. For instance, the data set of multiple visits of patients of a hospital over a period of time is longitudinal. Due to temporal correlations among the events of each record, the potential background knowledge of adversaries about an individual in the context of longitudinal data has specific characteristics. None of the previous anonymization techniques can effectively protect longitudinal data against an adversary with such knowledge. In this thesis we identify the potential privacy threats on longitudinal data and propose a novel framework of anonymization algorithms that protects individuals' privacy against both identity disclosure and attribute disclosure, and preserves data utility. In particular, we propose two privacy models: (K,C)^P-privacy and (K,C)-privacy, and for each of these models we propose efficient algorithms for anonymizing longitudinal data. An extensive experimental study demonstrates that our proposed framework can effectively and efficiently anonymize longitudinal data.
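The abstract does not spell out the (K,C)^P-privacy and (K,C)-privacy models, so the sketch below only illustrates the general flavour of such guarantees with a plain k-anonymity-style check over hypothetical longitudinal records; the record layout and threshold are assumptions made purely for illustration.

```python
from collections import Counter

def satisfies_k(records, k):
    """Every distinct quasi-identifier sequence must be shared by at least
    k records -- a plain k-anonymity-style check, NOT the (K,C)-privacy
    models proposed in the thesis."""
    counts = Counter(records)
    return all(c >= k for c in counts.values())

# Hypothetical longitudinal records: per-patient tuples of visit diagnoses.
records = [
    ("flu", "asthma"),
    ("flu", "asthma"),
    ("flu", "diabetes"),
    ("flu", "diabetes"),
]
print(satisfies_k(records, k=2))  # True
print(satisfies_k(records, k=3))  # False
```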
925

The Multimodal Interaction through the Design of Data Glove

Han, Bote January 2015 (has links)
In this thesis, we propose and present a multimodal interaction system that provides a natural way for human-computer interaction. The core idea of this system is to help users interact with the machine naturally by recognizing various gestures from a wearable device. To achieve this goal, we have implemented a system comprising both a hardware solution and gesture recognition approaches. For the hardware solution, we designed and implemented a data-glove-based interaction device with multiple kinds of sensors to detect finger formations, touch commands and hand postures. We also modified and implemented two gesture recognition approaches, one based on a support vector machine (SVM) and one on a lookup table. The detailed design and information are presented in this thesis. In the end, the system supports over 30 kinds of touch commands, 18 kinds of finger formations, and 10 kinds of hand postures, as well as combinations of finger formation and hand posture, with a recognition rate of 86.67% and accurate touch command detection. We also evaluated the system in terms of subjective user experience.
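As a rough illustration of the SVM-based recognition step described above, the following sketch trains a standard RBF-kernel classifier on synthetic glove-sensor feature vectors. The sensor layout, feature extraction and gesture labels are not given in the abstract, so everything here is an assumption made for illustration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for glove data: each row is a vector of flex/touch
# sensor readings, each label one of five assumed finger-formation classes.
rng = np.random.default_rng(0)
n_classes, n_per_class, n_sensors = 5, 40, 12
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(n_per_class, n_sensors))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma="scale")  # standard RBF-kernel SVM
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```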
926

CAMAC-based nuclear data acquisition system

Le Patourel, David Andrew January 1972 (has links)
This paper describes a data acquisition system developed for an intermediate energy nuclear scattering experiment. Equipment standards of CAMAC and NIM were used together with a 12K minicomputer and an industry-compatible magnetic tape drive. The higher-level languages BASIC and FORTRAN were equipped with subroutines that allow input/output communication with the CAMAC data acquisition system. The BASIC-CAMAC system proved to be the most useful to the experimenter, taking second place to the FORTRAN-CAMAC system only when data handling speed was of paramount importance. / Faculty of Science / Department of Physics and Astronomy / Graduate
927

Simulation and preliminary study of low resolution PCM and DPCM picture transmission systems.

Allan, Donald Peter January 1971 (has links)
A monochromatic image can be represented by a real function f(x,y,t) of two spatial variables and time. Picture transmission research is directed at finding methods of transmitting a facsimile of f(x,y,t) at minimum cost which still satisfies some quality criteria. The ensemble of source functions, and the measures of cost and quality, are defined by the application. For many applications, the subjective quality is an important measure of performance. Generally, it is difficult if not impossible to find a tractable analytic relationship between the source and system parameters and the subjective quality. Thus, for picture processing research, equipment must be available for simulating proposed systems so that pictures can be produced and estimates made of the subjective quality. This thesis describes a system which was devised for studying low resolution picture transmission systems. The system employs an image dissector to input picture data to a PDP-9 computer. The data can be stored on magnetic tape, processed and then displayed on a picture output device which employs a Tektronix 561 oscilloscope. A set of general purpose programs was developed for manipulating picture data using this hardware configuration. Three test pictures were read in and particular programs were written to calculate some picture statistics. The statistics include signal histograms, auto-covariance functions and some run-length statistics for the two-level material. These statistics were then used as design information for some simulation programs. A program was written to simulate Pulse Code Modulation (PCM) and Differential Pulse Code Modulation (DPCM) systems in order to evaluate the subjective effect of channel errors on pictures transmitted by PCM and DPCM. An attempt was also made to develop a simple two-dimensional filter program to reduce the effect of channel noise. Since the channel noise in the PCM and DPCM systems produced two-dimensional noise that was very different in appearance, two filtering routines were necessary. Subjective rating tests were conducted to evaluate the filtered and unfiltered PCM and DPCM pictures. For the unfiltered systems, these tests indicate that for good channels the DPCM system is better than the PCM system, while for very bad channels the pictures produced by the PCM system are subjectively better than those of a DPCM system using the same number of bits. It was also found that the proposed PCM filter improved the noisy pictures, while the proposed DPCM filter produced no improvement and even lowered the quality of some of the pictures. / Faculty of Applied Science / Department of Electrical and Computer Engineering / Graduate
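For readers unfamiliar with DPCM, the sketch below shows the basic principle the simulations rely on: transmitting quantized prediction errors rather than raw sample values. It is a one-dimensional toy with a uniform quantizer and a previous-sample predictor, not a reconstruction of the thesis's PDP-9 simulation programs.

```python
import numpy as np

def dpcm_encode(x, step):
    """First-order DPCM: quantize the difference between each sample and
    the previous *reconstructed* sample, so encoder and decoder agree."""
    codes, pred = [], 0.0
    for s in x:
        q = int(round((s - pred) / step))  # quantized prediction error
        codes.append(q)
        pred += q * step                   # track the decoder's reconstruction
    return codes

def dpcm_decode(codes, step):
    out, pred = [], 0.0
    for q in codes:
        pred += q * step
        out.append(pred)
    return np.array(out)

# Toy scan line: prediction errors stay small wherever the line is smooth,
# which is what lets DPCM spend fewer bits than straight PCM.
line = [10.0, 11.0, 12.5, 12.0, 30.0, 31.0]
codes = dpcm_encode(line, step=1.0)
print(codes)                        # small values except at the step edge
print(dpcm_decode(codes, step=1.0))
```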
928

A television camera to computer interface

Yusuf, Tundey January 1970 (has links)
This is an instrumentation thesis. The interface system discussed is a link between an ordinary TV camera and a computer for storage of visual data. The same system can also be used as a link between the computer and a display monitor. Because of its wide bandwidth, a video signal cannot be sampled at the Nyquist rate and presented to a computer. Previous interface systems overcame the problem by scanning slowly on an element-by-element basis using a special scanner and then presenting the samples to the computer. After processing, the data would be read out at the same slow rate and displayed on a special display monitor. The interface described in this thesis will accept material obtained from an ordinary TV camera scanning at the standard rate. By using a "stroboscope" sampling technique, the samples are presented to the computer slowly enough for it to process. After processing, the data is displayed in a similar manner on a normally scanned monitor for evaluation. Basically, the interface operates as follows: a TV camera video signal is sampled at a rate slow enough for computer acceptance. The camera scans the same picture several hundred times until all the points representing the picture have been sampled and stored; the sampling is controlled such that each point is sampled only once. Because of the sampling method, consecutive samples in the computer do not correspond to adjacent points on the picture being stored. It may therefore be necessary to programme the computer to arrange the samples such that adjacent data in the computer represent consecutive picture points before processing. After processing, the samples may be rearranged and read out for display in the same order they were stored. The horizontal resolution of the picture being stored can be varied quite easily in steps. For example, a system designed to have a maximum of 480 points/line will also have the ability to provide such lower resolutions as 60, 120 and 240 points/line. This variation is made possible by the design of the hardware. By software, the vertical resolution can be varied between an upper limit of 525 lines/picture and such near submultiples of this as 263 and 131 lines/picture. The thesis is discussed in relation to the PDP-9 computer on which most of the work described was done. However, the system interfaces readily with other computers. / Faculty of Applied Science / Department of Electrical and Computer Engineering / Graduate
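The "stroboscope" sampling idea can be pictured with a small index-permutation sketch: over repeated scans of the same still picture, a different subset of pixel positions is captured on each pass, and the stored samples are later rearranged into raster order. The sampling schedule below is hypothetical; the thesis hardware determines the actual acquisition order.

```python
import numpy as np

points_per_line, lines, samples_per_scan = 8, 4, 4
n_points = points_per_line * lines
n_scans = n_points // samples_per_scan

# Assumed schedule: on scan k, sample every n_scans-th pixel starting at
# offset k, so after n_scans passes each pixel has been sampled exactly once.
acq_order = np.concatenate([np.arange(k, n_points, n_scans)
                            for k in range(n_scans)])

picture = np.arange(n_points)      # stand-in for the true raster pixels
stored = picture[acq_order]        # what the computer receives, scan by scan

# Rearrange the stored samples back into raster order before processing.
raster = np.empty_like(picture)
raster[acq_order] = stored
assert np.array_equal(raster, picture)
print(raster.reshape(lines, points_per_line))
```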
929

Dátová architektúra a dátové sklady v bankovníctve / Data architecture and data marts in banks

Juhás, Michal January 2008 (has links)
In this diploma thesis I analyze data architecture and data marts in the banking domain. First I analyze different theoretical approaches to building a data warehouse. Subsequently I focus on operational and analytical data marts -- I analyze their business value, their differences and their location within the warehouse. In the third chapter I analyze the reasons that led to implementing an Asset Liability Management (ALM) data mart. In the following chapter I analyze the life-cycle of this data mart, not only from the technical point of view but also from the project management perspective. The main benefit of this thesis is the analysis of these aspects of the ALM data mart's life-cycle.
930

Koncept zavedení Data Governance / Data governance implementation concept

Ullrichová, Jana January 2016 (has links)
This master's thesis discusses a concept for the implementation of data governance. The theoretical part of the thesis deals with data governance: it explains why data are important for a company, and describes definitions of data governance, its history, its components, its principles and processes, and how it fits into a company. The theoretical part is supplemented with examples of data governance failures and with banking-specific considerations. The main goal of this thesis is to create a concept for implementing data governance and to apply it in a real company; that is what the practical part consists of.
