  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
241

Low-power sampled-data dual-slope ADC /

Malachira, Bopanna Kariappa, January 2007 (has links)
Thesis (M.S.)--University of Texas at Dallas, 2007. / Includes vita. Includes bibliographical references (leaves 52-53).
242

Optimistic protocols for fault-tolerance in distributed systems /

Damani, Om Prakash, January 1999 (has links)
Thesis (Ph. D.)--University of Texas at Austin, 1999. / Vita. Includes bibliographical references (leaves 117-124). Available also in a digital version from Dissertation Abstracts.
243

The impact on information systems controls within an organisation when making use of an EDI VAN

Rorbye, Trevor Wayne 08 May 2014 (has links)
M.Com. (Computer Auditing) / The implementation of EDI in South African business has only begun in the recent past. The main reason is that the substantial benefits, in terms of faster processing of business transactions, reduced processing costs and the formation of strategic alliances with key business partners, are only now being accepted by management. The other reason is that large, commercially operated Value Added Networks (VANs) have only existed in this country for the last two years. The primary objectives of this short dissertation can be summarised as follows: a) to provide a brief overview of the developments currently taking place in South Africa in the Electronic Data Interchange (EDI) and Value Added Network (VAN) environments; b) to highlight how EDI is currently being implemented in South Africa; c) to develop a simple framework of key information systems controls which an auditor should consider when evaluating the information systems at a client; and d) to apply this controls framework to the EDI and VAN environments in order to derive lists of the information systems controls which should be reviewed by the auditor when a client makes use of an EDI VAN.
244

Computer-controlled microscope for the automatic classification of white blood cells.

Gabert, Howard Frederick January 1973 (has links)
A microscope was interfaced to a PDP-9 computer in order to develop techniques suitable for on-line classification of white blood cells. The computer visual-input system is composed of an image dissector optically coupled to a microscope used in a transmitted-light mode. The position of the slide under the microscope and the fine focus control are controllable from the computer. A technique of auto-focusing was developed to efficiently focus the microscope under computer control. This algorithm is described in detail, followed by a discussion of the physical factors that affect the performance of this technique. A method of locating or isolating the leukocytes (white blood cells) is described next. A constant, referred to as the "contrast ratio", is used to extract the threshold for the nuclei of the leukocytes based on the average background intensity. Finally, contour tracing and curvature function extraction are used as a means of testing the system. A specific test is conducted to compare the system's efficiency with that of the "manual" techniques used by a technician. The system described here is not only suitable for automatic leukocyte classification, but could also be used for many other routine tests requiring the examination of microscopic cells. / Applied Science, Faculty of / Electrical and Computer Engineering, Department of / Graduate
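A minimal sketch of the contrast-ratio thresholding idea described in this abstract is shown below. The particular contrast-ratio value, and the use of the image border as the background estimate, are illustrative assumptions; the thesis itself does not state the exact formula here.

```python
import numpy as np

def segment_nuclei(image, contrast_ratio=0.6):
    """Isolate leukocyte nuclei by thresholding against the average
    background intensity, as the abstract describes.

    `contrast_ratio` and the border-based background estimate are
    illustrative assumptions, not values taken from the thesis.
    """
    # Estimate the background from the image border, which in a
    # transmitted-light field is mostly empty slide.
    border = np.concatenate([image[0, :], image[-1, :],
                             image[:, 0], image[:, -1]])
    background = border.mean()

    # Nuclei absorb more light than the background, so pixels darker
    # than contrast_ratio * background are taken as nucleus.
    threshold = contrast_ratio * background
    return image < threshold
```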
245

On automating the accessioning process in a clinical pathology laboratory

Bridges, Herbert James January 1968 (has links)
In the clinical pathology laboratory, one of the most exacting and time-consuming operations is accessioning. This term encompasses the operations concerned with the separation of test specimens into subsamples for multiple testing and the preparation of records to retain the identity of the subsamples as they progress through the laboratory. This thesis contains a description of an automated accessioning unit that, when interfaced to a DEC PDP-9 general-purpose digital computer, is capable of performing the accessioning function. The computer maintains the necessary records while the mechanical unit performs the sample-splitting operation. Initial sample identification is performed by the card-stub reader in a modified IBM 1084 sampler-reader. Samples are separated by a fixed splitter in which each output is valve controlled. The subsamples are pumped to multiple tray-loading stations. The trays are identified via a photo-transistor read head that senses a code built into each tray. The computer also controls the tray rotation and the filler arm independently at each station. All mechanical operations attempted by the computer are verified by appropriate sensors. A thirty-two channel scanning digital multiplexer was designed to monitor these feedback signals. Software concepts were developed to operate the device within the Keyboard Monitor Software environment of the PDP-9. The software was designed to permit shared execution of the accessioning unit's controlling program and any other program system desired, with minimum restrictions on that system. / Applied Science, Faculty of / Electrical and Computer Engineering, Department of / Graduate
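The verify-every-operation scheme described here can be sketched as a command followed by a poll of the corresponding feedback channel on the scanning multiplexer. The channel assignments, timeout, and function names below are hypothetical; they only illustrate the pattern, not the actual PDP-9 software.

```python
import time

# Hypothetical channel assignments on the 32-channel scanning
# multiplexer; the real mapping is hardware-specific.
TRAY_IN_POSITION = 5
FILLER_ARM_DOWN = 12

def read_channel(channel):
    """Placeholder for reading one digital feedback line from the
    scanning multiplexer (hardware I/O in the real system)."""
    raise NotImplementedError

def verified_operation(command, sensor_channel, timeout_s=2.0):
    """Issue a mechanical command and confirm it via its feedback
    sensor, mirroring the scheme the abstract describes."""
    command()  # e.g. rotate tray, lower filler arm
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if read_channel(sensor_channel):
            return True
        time.sleep(0.01)
    raise RuntimeError("mechanical operation not confirmed by sensor")
```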
246

Economics of integrating computers and communications systems in Canada

Touchie, Rodger Don January 1969 (has links)
The success of the computer utility is extremely dependent upon the efficient interaction of electronic data processing equipment with high-speed communication circuits. It is the purpose of this thesis to identify and examine some of the problem areas of a new and rapidly expanding industry. Emphasis is placed on the Canadian environment. However, the obvious influences of related activities in the United States are not to be neglected; thus, the study includes numerous references to the American situation. This is not a report on computer capabilities, nor a technical analysis of the computer-communications interface. Rather, this study deals with the overlapping concerns of the computer and communications industries, and the implications of these mutual interests. Major issues are the economic considerations, government involvement and the examination of social effects. The paper consists of four main sections. The first of these is a general introduction, including a brief summary of data communication terminology. The second section concentrates upon the present Canadian situation and serves to describe the current state of teleprocessing in this country. It also involves a survey of the present services provided by the common carriers and the available interface equipment which developers of a computer utility might employ. One of the traits of common carrier services is regulation, and this implies government involvement. The pertinent jurisdictions of both federal and provincial government bodies are described. It is felt that, within this realm, extensive reference to the actions of the Federal Communications Commission in the United States is warranted. Its pertinence to the Canadian scene is brought directly into Section Two, and this is supplemented by Appendix B, which describes some of the relevant occurrences in the United States as stimulated by the FCC. One prime undertaking serves to tie these economic, political and social factors together: the thesis includes a review of progress towards a Canadian telecommunications satellite and what it might mean for cross-country computer interactions in the future. Section Three looks at some of the management problems which are a part of data communication systems development. The generalities which are characteristic of any development recipe are applied to a specific case study. A study of this type leads to opinions and conclusions being drawn by the author, and these are outlined in Section Four of the thesis. / Business, Sauder School of / Graduate
247

A digital data acquisition system for astronomical spectra

Isherwood, Barclay Clifford January 1971 (has links)
Certain restrictions associated with conventional photographic techniques have led to the use of a low light level television tube (isocon tube) as an alternative detector for astronomical spectra. An on-line digital data acquisition system has been developed to process and analyze data from the television camera. The digital system consists of a high speed analog-to-digital converter, a digital computer with magnetic tape transport and oscilloscope display, and some digital control logic. Spectra displayed on the photocathode of the isocon tube are digitized by the A/D converter and stored on magnetic tape via the computer processor. Data acquisition time for one frame of data (1360 samples) is less than 1.5 seconds. Results obtained with the system indicate that it performs as initially intended and is able to resolve spectral features of the order of 0.1 mm. / Science, Faculty of / Earth, Ocean and Atmospheric Sciences, Department of / Graduate
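The acquisition loop described here, with 1360 samples per frame buffered in memory and then written to tape, can be sketched as below. The 1360-sample figure comes from the abstract; the function names and the text-based storage format are assumptions made only for illustration.

```python
SAMPLES_PER_FRAME = 1360  # frame size stated in the abstract

def read_adc_sample():
    """Placeholder for one high-speed A/D conversion (hardware I/O
    in the real isocon-camera system)."""
    raise NotImplementedError

def acquire_frame():
    """Digitize one frame of the isocon output as a list of samples,
    buffered in memory before being written out."""
    return [read_adc_sample() for _ in range(SAMPLES_PER_FRAME)]

def acquire_run(n_frames, storage):
    """Acquire n_frames and append each to the storage file, the role
    the magnetic tape transport plays in the original system."""
    for _ in range(n_frames):
        frame = acquire_frame()
        storage.write(",".join(str(s) for s in frame) + "\n")
```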
248

Proposed study of the reaction, ⁷Li(³He,dα)⁴He, with a time-of-flight scattering chamber.

Mint, Edward Theodore January 1970 (has links)
The reaction ⁷Li(³He,dα)⁴He is proposed as a search for an asymmetry about the direction of motion of ⁶Li in the breakup of the 4.57 MeV excited state of this nucleus as an intermediate state. This follows the discovery in 1967 by Reimann et al. of such an asymmetry about the direction of motion of ⁵Li in the breakup of the ground state of that nucleus, as an intermediate state in the reaction ⁶Li(³He,pα)⁴He. The purpose of these experiments is to attempt some understanding of the three-body reaction mechanisms involved, and the manner in which the various particles are correlated in the intermediate state. The three-body kinematics of the reaction ⁷Li(³He,dα)⁴He were thoroughly investigated, and because of particle identification problems, a charged-particle time-of-flight technique was proposed to distinguish the emitted deuterons from alpha particles. A 23-inch scattering chamber was designed and constructed for this and other work, and subsequently tested using the reaction ⁷Li(p,α)⁴He. / Science, Faculty of / Physics and Astronomy, Department of / Graduate
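The time-of-flight separation rests on the fact that, at equal kinetic energy, the heavier alpha particle travels more slowly than the deuteron. A minimal non-relativistic sketch of that arithmetic follows; the 0.30 m flight path and 5 MeV energy are illustrative values, not parameters taken from the thesis.

```python
import math

C = 2.998e8             # speed of light, m/s
MASS_MEV = {            # rest masses in MeV/c^2
    "deuteron": 1875.6,
    "alpha": 3727.4,
}

def flight_time_ns(particle, kinetic_mev, path_m):
    """Non-relativistic time of flight over path_m for a particle of
    the given kinetic energy; adequate at the few-MeV energies here."""
    m = MASS_MEV[particle]
    beta = math.sqrt(2.0 * kinetic_mev / m)   # v/c from E = m*v^2/2
    return path_m / (beta * C) * 1e9

# At equal kinetic energy the alpha arrives several nanoseconds later
# than the deuteron, which is what the timing measurement exploits.
for p in ("deuteron", "alpha"):
    print(p, round(flight_time_ns(p, 5.0, 0.30), 1), "ns")
```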
249

Reliable group communication in distributed systems

Navaratnam, Srivallipuranandan January 1987 (has links)
This work describes the design and implementation details of a reliable group communication mechanism. The mechanism guarantees that messages will be received by all the operational members of the group or by none of them (atomicity). In addition, the sequence of messages will be the same at each of the recipients (order). The message ordering property can be used to simplify distributed database systems and distributed processing algorithms. The proposed mechanism continues to operate despite process, host and communication link failures (survivability). Survivability is essential in fault-tolerant applications. / Science, Faculty of / Computer Science, Department of / Graduate
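The ordering property guaranteed here means every operational member delivers the same messages in the same sequence. A small illustrative sketch of receiver-side ordered delivery is given below; it only shows the ordering idea, not the thesis's actual protocol or its atomicity and failure-handling mechanisms.

```python
import heapq

class OrderedDelivery:
    """Receiver-side buffer that delivers messages in a single global
    sequence order, the 'order' property the abstract guarantees.
    This is an illustrative sketch, not the protocol from the thesis."""

    def __init__(self):
        self.next_seq = 0
        self.pending = []          # min-heap of (seq, message)

    def receive(self, seq, message):
        heapq.heappush(self.pending, (seq, message))
        delivered = []
        # Deliver only while the next expected sequence number is
        # present, so every member sees the same delivery order
        # regardless of network arrival order.
        while self.pending and self.pending[0][0] == self.next_seq:
            delivered.append(heapq.heappop(self.pending)[1])
            self.next_seq += 1
        return delivered
```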
250

A methodology for database management of time-variant encodings and/or missing information

Threlfall, William John January 1988 (has links)
The problem presented is how to handle encoded data for which the encodings or decodings change with respect to time, and which contains codes indicating that certain data is unknown, invalid, or not applicable with respect to certain entities during certain time periods. It is desirable to build a database management system that is capable of knowing about and being able to handle the changes in encodings and the missing information codes by embedding such knowledge in the data definition structure, in order to remove the necessity of having applications programmers and users constantly worrying about how the data is encoded. The experimental database management language DEFINE is utilized to achieve the desired result, and a database structure is created for a real-life example of data which contains many examples of time-variant encodings and missing information. / Science, Faculty of / Computer Science, Department of / Graduate
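The core idea, resolving a stored code against the encoding that was in force at a given date while recognising missing-information codes, can be sketched as below. The specific codes, dates, and meanings are invented for the example; in the thesis this knowledge is embedded in the DEFINE data definition structure rather than in application code.

```python
from datetime import date

# Illustrative encoding history: the meaning of code 3 changes in 1985.
# Sentinel codes flag missing information. All values are invented.
ENCODINGS = {
    3: [(date(1970, 1, 1), date(1984, 12, 31), "single"),
        (date(1985, 1, 1), date.max,           "never married")],
}
MISSING = {8: "not applicable", 9: "unknown"}

def decode(code, as_of):
    """Return the meaning of `code` as it was defined on date `as_of`,
    or the missing-information label if the code is a sentinel."""
    if code in MISSING:
        return MISSING[code]
    for start, end, meaning in ENCODINGS.get(code, []):
        if start <= as_of <= end:
            return meaning
    return "invalid code for this date"
```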
