331 |
The analysis and recommended design of a high-resolution digital data acquisition system for the in situ measurement of various physical and chemical parameters of sea water / Albin, Arthur Grayson, 14 March 1968 (has links)
The lack of a single, complete paper on the
analysis of a digital data acquisition system for in situ oceanographic
measurements brought about the present work. In order to compile
an accurate technical paper on such a complex system, it was necessary
to design, construct, and calibrate a system for the in situ
measurement of various physical and chemical properties of sea
water.
A unique digital encoding circuit was developed which utilizes
the basic principles of null balance, resistance ratio bridge and
potentiometer measurements. The circuit is a uniquely loaded binary
ladder resistance network. Variation of the loading allows resistance
ratio and potentiometric measurements to be made over different
ranges with a constant high degree of resolution.
The developed system has the capability of measuring any
parameter which can be converted into a varying voltage, current,
or resistance. The range over which the measurement can be made
is adjustable, and the resolution equals ±0.006% of the full range
value. The resolution is constant over the full range, as a result of
the analog-to-digital converter linearity characteristics. / Graduation date: 1968
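As a rough illustration (not from the thesis itself), the quoted ±0.006% of full range is close to the quantization limit of a 13-bit binary ladder, since half of one part in 2^13 is about 0.006%. A minimal sketch of that relationship:

```python
# Illustrative only: relate the bit count of a binary ladder converter to its
# resolution expressed as a percentage of full range (+/- half of one LSB).
def resolution_percent(bits: int) -> float:
    """Quantization uncertainty (+/- half LSB) as % of the full-scale range."""
    return 100.0 / (2 ** (bits + 1))

for bits in (12, 13, 14):
    print(f"{bits}-bit ladder: +/- {resolution_percent(bits):.4f}% of full range")
# 13 bits gives roughly +/- 0.0061%, consistent with the +/- 0.006% quoted above.
```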
|
332 |
Test case prioritization / Malishevsky, Alexey Grigorievich, 19 June 2003 (has links)
Regression testing is an expensive software engineering activity intended to provide
confidence that modifications to a software system have not introduced faults.
Test case prioritization techniques help to reduce regression testing cost by ordering
test cases in a way that better achieves testing objectives. In this thesis, we are interested
in prioritizing to maximize a test suite's rate of fault detection, measured by the
APFD metric, with the goal of detecting regression faults as early as possible during testing.
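For reference, the rate-of-fault-detection metric referred to above (APFD) is conventionally computed from the positions at which each fault is first detected in the prioritized order; a minimal sketch, not the thesis code:

```python
def apfd(first_detect_positions, n_tests):
    """APFD = 1 - sum(TF_i)/(n*m) + 1/(2n), where TF_i is the 1-based
    position of the first test that detects fault i, n is the number of
    tests, and m is the number of faults."""
    m = len(first_detect_positions)
    return 1.0 - sum(first_detect_positions) / (n_tests * m) + 1.0 / (2 * n_tests)

# Example: 10 tests, 4 faults first detected by tests 1, 2, 2, and 5.
print(apfd([1, 2, 2, 5], 10))  # higher is better (faults found earlier)
```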
In previous work, several prioritization techniques using low-level code coverage
information had been developed. These techniques try to maximize APFD over
a sequence of software releases rather than targeting a particular release, and their
effectiveness was evaluated empirically.
We present a larger set of prioritization techniques that use information at arbitrary
granularity levels and incorporate modification information, targeting prioritization
at a particular software release. Our empirical studies show significant
improvements in the rate of fault detection over randomly ordered test suites.
Previous work on prioritization assumed uniform test costs and fault severities,
which might not be realistic in many practical cases. We present a new cost-cognizant
metric, APFD_c, and prioritization techniques, together with approaches
for measuring and estimating these costs. Our empirical studies evaluate prioritization
in a cost-cognizant environment.
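A sketch of one published cost-cognizant formulation, in which each detected fault's severity is weighted by the test cost remaining from its detecting test onward; the exact definition used in the thesis may differ:

```python
def apfd_c(test_costs, fault_severities, first_detect_positions):
    """Cost-cognizant APFD_c (one published formulation): each fault's
    severity is weighted by the cost of the tests from the detecting test
    onward, minus half the detecting test's own cost, normalized by total
    test cost times total fault severity."""
    total_cost = sum(test_costs)
    total_sev = sum(fault_severities)
    num = 0.0
    for sev, tf in zip(fault_severities, first_detect_positions):
        remaining = sum(test_costs[tf - 1:])      # costs of tests TF..n (TF is 1-based)
        num += sev * (remaining - 0.5 * test_costs[tf - 1])
    return num / (total_cost * total_sev)

# Example: 4 tests with unequal costs, 2 faults of different severity.
print(apfd_c([2.0, 1.0, 3.0, 1.0], [5.0, 1.0], [1, 3]))
```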
Prioritization techniques have been developed independently with little consideration
of their similarities. We present a general prioritization framework that allows
us to express existing prioritization techniques by a framework algorithm using
parameters and specific functions.
Previous research assumed that prioritization was always beneficial if it improves
the APFD metric. We introduce a prioritization cost-benefit model that more
accurately captures relevant cost and benefit factors, and allows practitioners to assess
whether it is economical to employ prioritization.
Prioritization effectiveness varies across programs, versions, and test suites. We
empirically investigate several of these factors on substantial software systems and
present a classification-tree-based predictor that can help select the most appropriate
prioritization technique in advance.
Together, these results improve our understanding of test case prioritization and
of the processes by which it is performed. / Graduation date: 2004
|
333 |
Reasoning about many-to-many requirement relationships in spreadsheet grids / Beckwith, Laura A., 18 November 2002 (has links)
Traditionally, research into end-user programming has focused on how to
make programming more accessible to end users. However, few researchers
have considered providing end users with devices to help improve the
reliability of the programs they create. To help improve the reliability of
spreadsheets created by end users, we are working to allow users to
communicate the purpose and other underlying information about their
spreadsheets using a form of requirement specifications we call "guards."
Guards were initially designed for individual cells but, for large spreadsheets,
with replicated/shared formulas across groups of rows or columns, guards can
only be practical if users can enter them across these groups of rows or
columns. The problem is, this introduces many-to-many relationships, at the
intersection of rows and columns with guards. It is not clear how the system
should reason and communicate about many-to-many relationships in a way
that will make sense to end users. In this thesis, we present the human-centric
design rationale for our approach to how the system should reason about such
many-to-many relationships. The design decisions are presented with their
reasons gleaned from two design-time models--Cognitive Dimensions and
Attention Investment--and from the users themselves in a small think-aloud
study. / Graduation date: 2003
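A toy sketch (not the thesis system) of why row- and column-level guards create many-to-many relationships: a cell at an intersection is governed by every guard on its row and every guard on its column at once.

```python
# Illustrative sketch only: a guard attached to a whole row or column applies
# at every cell in that row/column, so a cell at an intersection is governed
# by several guards simultaneously (a many-to-many link between guards and cells).
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Guard:
    description: str
    check: Callable[[float], bool]

row_guards = {"Totals": Guard("must be non-negative", lambda v: v >= 0)}
col_guards = {"Percent": Guard("must be at most 100", lambda v: v <= 100)}

def guards_for(row: str, col: str) -> List[Guard]:
    """All guards governing one cell: the union of its row and column guards."""
    found = []
    if row in row_guards:
        found.append(row_guards[row])
    if col in col_guards:
        found.append(col_guards[col])
    return found

# The cell at ("Totals", "Percent") must satisfy both guards simultaneously.
value = 120
for g in guards_for("Totals", "Percent"):
    print(g.description, "->", g.check(value))
```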
|
334 |
A performance study of multithreading / Kwak, Hantak, 07 December 1998 (has links)
As the performance gap between processor and memory grows, memory latency
will be a major bottleneck in achieving high processor utilization. Multithreading has
emerged as one of the most promising and exciting techniques used to tolerate memory
latency by exploiting thread-level parallelism. The question however remains as to how
effective multithreading is on tolerating memory latency. Due to the current availability
of powerful microprocessors, high-speed networks and software infrastructure systems,
a cost-effective parallel machine is often realized using a network of workstations.
Therefore, we examine the possibility and the effectiveness of using multithreading in a
networked computing environment. Also, we propose the Multithreaded Virtual Processor
model as a means of integrating multithreaded programming paradigm and modern
superscalar processor with support for fast context switching and thread scheduling. In
order to validate our idea, a simulator was developed using a POSIX-compliant Pthreads
package and a generic superscalar simulator called SimpleScalar, glued together with
support for multithreading. The simulator is a powerful workbench that enables us to
study how future superscalar design and thread management should be modified to better
support multithreading. Our studies with MVP show that, in general, the performance
improvement comes not only from tolerating memory latency, but also from
data sharing among threads. / Graduation date: 1999
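A textbook back-of-the-envelope model of latency tolerance (not the MVP simulator itself) helps make the trade-off concrete: with enough ready threads, one thread's memory stall is hidden behind the execution of the others.

```python
# Simple analytical model of multithreaded processor utilization. All numbers
# below are placeholders for illustration, not measurements from the thesis.
def utilization(n_threads: int, run_len: float, switch_cost: float, latency: float) -> float:
    """Fraction of cycles spent doing useful work.

    run_len:     cycles a thread runs before issuing a long-latency memory operation
    switch_cost: cycles needed to switch to another ready thread
    latency:     cycles until the stalled thread becomes ready again
    """
    if n_threads * (run_len + switch_cost) >= run_len + latency:
        # Saturated region: there is always another ready thread to run.
        return run_len / (run_len + switch_cost)
    # Linear region: the memory latency is only partially hidden.
    return n_threads * run_len / (run_len + latency)

for n in (1, 2, 4, 8):
    print(n, "threads ->", round(utilization(n, run_len=40, switch_cost=4, latency=200), 2))
```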
|
335 |
Guidelines for implementing real-time process control using the PC / Stanford, Clayton R., 06 December 1996 (has links)
The application of the personal computer in the area of real-time process control
is investigated. Background information is provided regarding factory automation and
process control. The current use of the PC in the factory for data acquisition is
presented along with an explanation of the advantages and disadvantages associated
with extending the use of the PC to real-time process control. The use of interrupt-driven
and polled I/O to obtain real-time response is investigated and contrasted with
the use of a real-time operating system. A unique compilation of information provides
guidelines for selecting an implementation method for real-time control. Experimental
work is performed to evaluate the access time and latency periods for the hard drive,
video monitor, and I/O devices operating in a DOS environment. The execution speeds
of C and assembly language programs are investigated. A method to estimate the
performance of a real-time control system using polled or interrupt-driven I/O is
developed. / Graduation date: 1997
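As an illustration only (the figures below are placeholders, not the thesis measurements), rough worst-case response latencies for polled and interrupt-driven I/O can be estimated as follows:

```python
# Illustrative sketch of worst-case response-time estimation for polled
# versus interrupt-driven I/O; all timing values are hypothetical.
def polled_worst_case(loop_period_us: float, service_us: float) -> float:
    """An event can arrive just after a poll, so it waits a full loop period
    before being noticed, then takes service_us to handle."""
    return loop_period_us + service_us

def interrupt_worst_case(int_latency_us: float, isr_us: float,
                         max_masked_us: float) -> float:
    """Hardware interrupt latency plus ISR execution time plus the longest
    window during which interrupts are disabled elsewhere in the system."""
    return int_latency_us + isr_us + max_masked_us

print("polled    :", polled_worst_case(loop_period_us=500, service_us=50), "us")
print("interrupt :", interrupt_worst_case(int_latency_us=10, isr_us=50,
                                          max_masked_us=200), "us")
```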
|
336 |
Base inclinations in natural and synthetic DNAs / Chou, Ping-Jung, 03 November 1993 (has links)
A sophisticated computer program is developed to analyze flow linear
dichroism data on nucleic acids for individual base inclinations. Measured
absorption and linear dichroism data for synthetic AT and GC polymers and
natural DNAs are analyzed. The reliability of the program is tested on data for
the synthetic polymers, and the results are similar to earlier, more
straightforward analyses. For the first time, specific base inclinations are
derived for all bases individually from the linear dichroism data for natural
deoxyribonucleic acids. For B-form DNA in aqueous solution at moderate salt
concentrations, the inclinations from perpendicular are as follows: d(A)=16.1 ±
0.5; d(T)=25.0 ± 0.9; d(G)=18.0 ± 0.6; d(C)=25.1 ± 0.8 deg. Our results
indicate that the bases in synthetic and natural DNAs are not perpendicular to
the helix axis, even in the B form.
The mathematical bases and numerical analyses are presented in detail
since both are the keys for successful spectral decompositions in this study,
and could be applied to nonlinear optimization problems encountered in other
types of biochemistry and biophysics measurements. The interplay between
computer programming and scientific measurements cannot be
overemphasized in modern research. / Graduation date: 1994
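A minimal sketch (not the thesis program) of the kind of spectral decomposition described: a measured spectrum is expressed as a combination of component spectra and the amplitudes are fitted by least squares; the thesis couples this to a nonlinear fit for the inclination parameters, which is omitted here.

```python
# Toy spectral decomposition: recover component amplitudes from a noisy
# measured spectrum built from two hypothetical Gaussian bands.
import numpy as np

wavelengths = np.linspace(220, 300, 81)                 # nm, illustrative grid
def band(center, width):
    return np.exp(-((wavelengths - center) / width) ** 2)

components = np.column_stack([band(260, 12), band(275, 15)])   # two toy component spectra
true_amplitudes = np.array([0.7, 0.3])
noise = 0.01 * np.random.default_rng(0).normal(size=wavelengths.size)
measured = components @ true_amplitudes + noise

amplitudes, *_ = np.linalg.lstsq(components, measured, rcond=None)
print("fitted amplitudes:", np.round(amplitudes, 3))    # close to [0.7, 0.3]
```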
|
337 |
A measure of the investment climate in South Africa / Tongai Foto, January 2009 (has links)
Investor confidence is a concept many investors are constantly trying to gauge. In practice, however, such concepts are not easy to measure. This study attempts to capture the total sum of investor perception in South Africa by examining market behaviour. Data from the JSE/FTSE (1995-2009) will be used to determine an Equity Risk Premium. Bond Yield Spreads will also be calculated from data provided by I-NET BRIDGE. An amalgamation of these components will produce the proposed Investment Confidence Index. Similar indices currently on the South African market are based on subjective surveys and might therefore be biased. The proposed index, which is a first in SA, will prove invaluable to practitioners in the financial sector.
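As an illustration only (the abstract does not give the exact weighting), one simple way to combine an equity risk premium series and a bond yield spread series into a single confidence score:

```python
# Hypothetical data and a naive amalgamation: standardize each component and
# average them. Wider bond spreads signal lower confidence, hence the minus sign.
import statistics

equity_returns = [0.12, 0.08, -0.05, 0.15, 0.10]   # hypothetical annual equity returns
risk_free      = [0.07, 0.07, 0.08, 0.06, 0.07]    # hypothetical risk-free rates
corp_yields    = [0.11, 0.12, 0.14, 0.10, 0.11]    # hypothetical corporate bond yields
govt_yields    = [0.08, 0.09, 0.10, 0.08, 0.08]    # hypothetical government bond yields

erp    = [r - rf for r, rf in zip(equity_returns, risk_free)]   # equity risk premium
spread = [c - g for c, g in zip(corp_yields, govt_yields)]      # bond yield spread

def zscores(xs):
    mu, sd = statistics.mean(xs), statistics.pstdev(xs)
    return [(x - mu) / sd for x in xs]

index = [(e - s) / 2 for e, s in zip(zscores(erp), zscores(spread))]
print([round(v, 2) for v in index])
```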
|
338 |
A microcomputer software package for simulation of non-ideal aqueous electrolyte systems at equilibrium / Sinquefield, Scott A., 22 May 1991 (has links)
The non-ideal aqueous electrolyte simulator (NAELS) is composed of
three major parts: a Newton-Raphson non-linear optimization program written
by Weare et al. (1987); an activity coefficient subroutine for non-ideal
electrolyte systems based on Pitzer's model; and an extensive, user expandable
database. It is robust, stable, and requires neither thermodynamic data nor
initial guesses as input. NAELS provides very good estimates of equilibrium
speciation and solubility in concentrated electrolyte systems. NAELS was
assembled as a technical utility package for use on IBM-compatible
microcomputers. / Graduation date: 1992
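A toy sketch (not NAELS) of the Newton-Raphson step at the heart of such a solver, applied to a single solubility equilibrium with ideal (unit) activity coefficients; the real program solves coupled multicomponent equilibria with Pitzer activity coefficients.

```python
def solve_solubility(ksp: float, tol: float = 1e-15) -> float:
    """Find s such that s*s = Ksp (a 1:1 salt such as AgCl) by Newton-Raphson."""
    s = 1.0                            # deliberately poor starting guess
    for _ in range(100):
        f = s * s - ksp                # residual of the equilibrium condition
        step = f / (2 * s)             # Newton step: f / f'
        s -= step
        if abs(step) < tol:
            break
    return s

print(solve_solubility(1.8e-10))       # about 1.34e-5 mol/L, roughly AgCl at 25 C
```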
|
339 |
Incorporating domain-specific information into the compilation process / Guyer, Samuel Zev. January 2003 (has links)
Thesis (Ph. D.)--University of Texas at Austin, 2003. / Vita. Includes bibliographical references. Available also from UMI Company.
|
340 |
Purchase selection of a major computer system in Hong Kong : considerations and marketing implications / Nip, Lai-ming. January 1988 (has links)
Thesis (M.B.A.)--University of Hong Kong, 1988.
|