291.
GDI (Goal Directed Interface): An intelligent, iconic, object-oriented interface for office systems. Griggs, Kenneth Andrew. January 1989
This dissertation presents the GDI (Goal Directed Interface) approach to the user interface for office systems. The primary objectives of the approach are to create an interface that (1) requires little user training and (2) attempts higher-level task activities (e.g., 'schedule a meeting') that have been excluded from computerization in the past. The GDI technique (1) postulates a simple model of the office environment consisting of persons, things, and processes, together with a decomposable goal set, (2) represents knowledge in the office environment through rules, frames, scripts, and object-oriented programming techniques, (3) creates an iconic visual representation of persons, things, and processes that closely mimics the user's 'mental model' of the office world, (4) requires that the user's own 'person icon' be present for all interactions so that actions appear to take place in a user-controllable context (the user's icon is, literally, in the interface), (5) provides a 'selection window' through which the user communicates his/her goal by grouping relevant icons, (6) uses a rule-based expert system to examine an icon configuration and, through its expertise, derive a user goal (despite ambiguous or faulty icon placements), and (7) attempts to complete the user goal through the use of scripts and multiple expert systems.
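To make the goal-derivation step concrete, the following minimal sketch shows how a small rule base might map a grouping of icons in the selection window to a candidate goal. All icon types, rules, and names here are hypothetical illustrations, not the dissertation's actual engine.

```java
import java.util.List;
import java.util.Set;

// Hypothetical sketch of GDI-style goal derivation: each rule maps a pattern of
// icon types grouped in the selection window to a candidate user goal.
public class GoalDeriver {

    enum IconType { PERSON, DOCUMENT, CALENDAR, ROOM, PRINTER }

    // Each rule pairs a required icon pattern with the goal it suggests.
    record Rule(Set<IconType> pattern, String goal) {}

    private static final List<Rule> RULES = List.of(
        new Rule(Set.of(IconType.PERSON, IconType.CALENDAR, IconType.ROOM), "schedule a meeting"),
        new Rule(Set.of(IconType.PERSON, IconType.DOCUMENT), "send a document"),
        new Rule(Set.of(IconType.DOCUMENT, IconType.PRINTER), "print a document"));

    // Pick the rule whose pattern best overlaps the selected icons, tolerating
    // extra or missing icons (the "ambiguous or faulty placements" in the abstract).
    static String deriveGoal(Set<IconType> selected) {
        Rule best = null;
        int bestScore = 0;
        for (Rule r : RULES) {
            int score = (int) r.pattern().stream().filter(selected::contains).count();
            if (score > bestScore) { bestScore = score; best = r; }
        }
        return best == null ? "unknown goal" : best.goal();
    }

    public static void main(String[] args) {
        // The user grouped their own person icon with a calendar and a room.
        System.out.println(deriveGoal(Set.of(IconType.PERSON, IconType.CALENDAR, IconType.ROOM)));
    }
}
```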
292.
A flexible, policy-aware middleware system. Walker, Scott Mervyn. January 2006
Middleware augments operating systems and network infrastructure to assist in the creation of distributed applications in a heterogeneous environment. Current middleware systems exhibit some or all of the following five main problems:

1. Decisions must be made early in the design process.
2. Applications are inflexible to dynamic changes in their distribution.
3. Application development is complex and error-prone.
4. Existing systems force an unnatural encoding of application-level semantics.
5. Approaches to the specification of distribution policy are limited.

This thesis defines a taxonomy of existing middleware systems and describes their limitations. The requirements that must be met by a third-generation middleware system are defined and implemented by a system called the RAFDA Run-Time (RRT). The RRT allows control over the extent to which inter-address-space communication is exposed to programmers, aiding the creation, maintenance and evolution of distributed applications. The RRT permits the introduction of distribution into applications quickly and with minimal programmer effort, allowing for rapid application prototyping. Programmers can conceal or expose the distributed nature of applications as required (see the sketch below). The RRT allows instances of arbitrary application classes to be exposed to remote access as Web Services, provides control over the parameter-passing semantics applied to remote method calls, and permits the creation of flexible distribution policies. The design of the RRT is described and evaluated qualitatively in the context of a case study based around the implementation of a peer-to-peer overlay network. A prototype implementation of the RRT is examined and evaluated quantitatively. Programmers determine the trade-off between flexibility and simplicity offered by the RRT on a per-application basis by concealing or exposing inter-address-space communication. The RRT is a middleware system that adapts to the needs of applications, rather than forcing distributed applications to adapt to the needs of the middleware system.
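As an illustration of concealing or exposing distribution, the sketch below (hypothetical names throughout; it does not reproduce the RAFDA API) shows client code written against an ordinary interface, where a deployment-time policy could substitute a remote proxy without changing the caller.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative only: the idea is that the decision to satisfy calls locally or
// across address spaces (e.g., via a Web Service stub) is a policy choice made
// late, not an early design commitment baked into client code.
interface AddressBook {
    String lookup(String name);
}

// Plain local implementation.
class LocalAddressBook implements AddressBook {
    private final Map<String, String> entries = new HashMap<>();
    public void add(String name, String number) { entries.put(name, number); }
    public String lookup(String name) { return entries.getOrDefault(name, "unknown"); }
}

// Hypothetical remote proxy: same interface, but each call crosses address spaces.
class RemoteAddressBookProxy implements AddressBook {
    private final String endpoint;
    RemoteAddressBookProxy(String endpoint) { this.endpoint = endpoint; }
    public String lookup(String name) {
        // A real middleware would marshal the call here (e.g., as a SOAP request).
        throw new UnsupportedOperationException("remote call to " + endpoint);
    }
}

public class DistributionDemo {
    public static void main(String[] args) {
        LocalAddressBook local = new LocalAddressBook();
        local.add("Alice", "555-0100");
        AddressBook book = local; // policy could swap in a RemoteAddressBookProxy
        System.out.println(book.lookup("Alice")); // caller is unchanged either way
    }
}
```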
293.
The doubly-linked list protocol family for distributed shared memory multiprocessor systems. 劉宗國 (Lau, Chung-kwok, Albert). January 1996
Published or final version. Electrical and Electronic Engineering. Master of Philosophy.
294.
Document distribution algorithms for distributed web servers. 伍頌斌 (Ng, Chung-pun). January 2002
Published or final version. Computer Science and Information Systems. Master of Philosophy.
295.
A local network for laboratory automation and data collection. Preckshot, George Garrell. January 1982
This dissertation describes LABNET, a loosely-coupled network of small computers for laboratory automation and data collection. The network comprises two parts: RAPNET, the local-network operating-system-like software, and Real-time MICRODARE, an interactive language for programming automation and data-collection tasks. RAPNET provides the framework upon which application-level programs like MICRODARE execute. In addition to the usual file services and other miscellaneous system services normally supplied by a single-CPU operating system, RAPNET provides link-level message facilities, program control, and a virtual channel system. The pseudo-link provides a means for coordinated application-level program intercommunication; pseudo-links connect programs running in different CPUs or in the same CPU, and to the application-level program a pseudo-link looks just like a file or device (see the sketch below). Real-time MICRODARE supplies an interactive programming capability that uses the facilities of RAPNET to let a programmer build distributed program systems for automation, simulation, and data collection. MICRODARE consists of an interactive BASIC-like job-control language and a compiled fast-task language. The job-control language permits time- and event-dependent scheduling of automation and data-collection program segments. The fast-task language performs simulation, signal-processing, data-collection, and control tasks at close-to-assembly-language speeds.
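A minimal sketch of the pseudo-link idea, assuming a bounded message channel as a stand-in for RAPNET's link-level facilities (all names are illustrative, and Java is used purely for exposition):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Hypothetical sketch: two tasks communicate through a channel they treat like
// an ordinary file or device, regardless of whether they share a CPU.
public class PseudoLinkDemo {

    static class PseudoLink {
        private final BlockingQueue<String> channel = new ArrayBlockingQueue<>(16);
        void write(String record) throws InterruptedException { channel.put(record); }
        String read() throws InterruptedException { return channel.take(); }
    }

    public static void main(String[] args) throws InterruptedException {
        PseudoLink link = new PseudoLink();

        // A data-collection task writes samples as if to a device...
        Thread collector = new Thread(() -> {
            try {
                for (int i = 0; i < 3; i++) link.write("sample " + i);
            } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });
        collector.start();

        // ...while an automation task reads them as if from a file.
        for (int i = 0; i < 3; i++) System.out.println(link.read());
        collector.join();
    }
}
```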
296.
A case study of a flexible distributed processing system in copier development (prototype, driver, protocol). Nguyen, Thuyen Dinh, 1959-. January 1986
No description available.
297.
Job Embeddedness as a Predictor of Voluntary Turnover: Validation of a New Instrument. Besich, John S. 12 1900
Voluntary turnover has become a problem for many organizations in today's society. The cost of this turnover reaches beyond the organization; it also affects the employees themselves. For this reason, a plethora of research has been conducted by both academicians and practitioners on the causes and consequences of voluntary turnover. The purpose of this study is to test the validity and generalizability of the job embeddedness model of voluntary turnover in the information technology (IT) industry, a field that has been plagued with high turnover rates in recent years. In this study, the job embeddedness model (Mitchell et al., 2001) is applied to a population sample consisting of health care information technology employees.
298.
A method for finding common attributes in heterogeneous DoD databases. Zobair, Hamza A. 06 1900
Approved for public release; distribution is unlimited.

Traditional database development has been done for a specific, self-contained purpose, with no plan to share or merge the data with other databases in the future. As these systems have matured, users have realized that a requirement exists to share their data. Finding common attributes among databases is a time-consuming task, but one that is necessary as more and more corporations and agencies consolidate operations. Within DoD, the requirement to consolidate systems has arisen because the various data systems used by DoD agencies and our allies need to communicate with each other for well-coordinated operations. One alternative for achieving the desired interconnectivity is to specify the requirement for interoperability in new systems. A more practical, less costly process is to merge existing systems and consolidate the common components. This paper proposes a process for consolidating portions of the data dictionaries of two existing databases. The proposed method uses commercial-off-the-shelf software to find common attributes between multiple databases and represents an improvement in accuracy and time over previous methods.
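A minimal sketch of the attribute-matching step, assuming a simple edit-distance heuristic over attribute names; the thesis itself uses commercial-off-the-shelf software, not this code:

```java
import java.util.List;

public class AttributeMatcher {

    // Classic Levenshtein edit distance between two attribute names.
    static int editDistance(String a, String b) {
        int[][] d = new int[a.length() + 1][b.length() + 1];
        for (int i = 0; i <= a.length(); i++) d[i][0] = i;
        for (int j = 0; j <= b.length(); j++) d[0][j] = j;
        for (int i = 1; i <= a.length(); i++)
            for (int j = 1; j <= b.length(); j++)
                d[i][j] = Math.min(Math.min(d[i - 1][j] + 1, d[i][j - 1] + 1),
                        d[i - 1][j - 1] + (a.charAt(i - 1) == b.charAt(j - 1) ? 0 : 1));
        return d[a.length()][b.length()];
    }

    public static void main(String[] args) {
        // Hypothetical attribute names drawn from two data dictionaries.
        List<String> dbA = List.of("SSN", "LAST_NAME", "UNIT_ID");
        List<String> dbB = List.of("SOC_SEC_NO", "LASTNAME", "UNITID");
        for (String a : dbA)
            for (String b : dbB)
                if (editDistance(a.toLowerCase(), b.toLowerCase()) <= 2)
                    System.out.println(a + " likely matches " + b);
    }
}
```

Real matching tools combine several such signals (name similarity, data types, value distributions); the single heuristic above is only the simplest instance.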
299.
Effective use of Java Data Objects in developing database applications: advantages and disadvantages. Zilidis, Paschalis. 06 1900
Approved for public release; distribution is unlimited.

Currently, the most common approach in developing database applications is to use an object-oriented language for the frontend module and a relational database for the backend datastore. The major disadvantage of this approach is the well-known "impedance mismatch", in which some form of mapping is required to connect the objects in the frontend with the relational tuples in the backend. Java Data Objects (JDO) is a recently proposed Java API that eliminates the impedance mismatch. Using the JDO API, programmers deal strictly with objects: JDO hides the details of the backend datastore by providing an object-oriented view of it, and it automatically handles the mapping between the objects and the underlying data in the relational database, which is hidden from the programmer. This thesis investigates the effectiveness of JDO; part of the analysis develops a database application using JDO. Although JDO provides the benefits of object orientation in the design and implementation of databases, it is not immune from problems and limitations. The thesis therefore also analyzes the advantages and disadvantages of using JDO and discusses the areas requiring improvement in future releases.

Major, Hellenic Air Force
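The following minimal sketch shows the JDO programming model the abstract describes, assuming a JDO implementation (such as DataNucleus) configured under the hypothetical unit name "myUnit"; the class and field names are illustrative, and only the javax.jdo calls are standard API:

```java
import javax.jdo.JDOHelper;
import javax.jdo.PersistenceManager;
import javax.jdo.PersistenceManagerFactory;
import javax.jdo.Transaction;
import javax.jdo.annotations.PersistenceCapable;

@PersistenceCapable
class Employee {
    String name;
    double salary;
    Employee() {} // JDO implementations typically require a no-arg constructor
    Employee(String name, double salary) { this.name = name; this.salary = salary; }
}

public class JdoDemo {
    public static void main(String[] args) {
        PersistenceManagerFactory pmf =
                JDOHelper.getPersistenceManagerFactory("myUnit"); // named config, assumed
        PersistenceManager pm = pmf.getPersistenceManager();
        Transaction tx = pm.currentTransaction();
        try {
            tx.begin();
            pm.makePersistent(new Employee("Zilidis", 55000)); // no SQL, no mapping code
            tx.commit();
        } finally {
            if (tx.isActive()) tx.rollback();
            pm.close();
        }
    }
}
```

Retrieval works the same way: pm.newQuery(Employee.class, "salary > 50000") returns matching objects with no SQL in the application code, which is exactly the elimination of the impedance mismatch discussed above.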
300.
An Inquiry into the Inevitability of Prediction Error in Investment Portfolio Models. Valentine, Jerome Lynn. 12 1900
Many mathematical programming models of the selection of investment portfolios assume that the best portfolio at any given level of risk is the portfolio having the highest level of return. The expected level of return is defined as a linear combination of the expected returns of the individual investments contained within the portfolio, and risk is defined in terms of variance of return. This study uses Monte Carlo simulation to establish that if the estimates of the future returns on potential investments are unbiased, the steady-state return on the portfolio is overestimated by the procedure used in the standard models. Under reasonable assumptions concerning the parameters of the estimates of the various returns, this bias is quite sizeable, with the steady-state predicted return often overestimating the steady-state actual return by more than ten percentage points. In addition, it is shown that when the variances of the alternative potential investments are not all equal, a limitation on the variance of the portfolio will reduce the magnitude of the bias. In many reasonable cases, constraining the portfolio variance reduces the bias by a magnitude greater than the amount by which it reduces the predicted portfolio return, causing the steady-state actual return to rise. This implies that return cannot automatically be assumed to be a monotonic function of risk.
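The selection bias the study identifies can be reproduced with a few lines of simulation. The sketch below, under assumed parameters (20 candidate portfolios, a 5% true return for all of them, and unbiased estimates with a 3% standard error), shows the predicted return of the chosen portfolio overshooting the actual return:

```java
import java.util.Random;

// Worked illustration: picking the maximum of several unbiased estimates yields
// a biased (over-optimistic) prediction, even though each estimate is unbiased.
public class SelectionBiasDemo {
    public static void main(String[] args) {
        Random rng = new Random(42);
        int trials = 100_000, candidates = 20;
        double trueReturn = 0.05, stdErr = 0.03, sumPredicted = 0;

        for (int t = 0; t < trials; t++) {
            double best = Double.NEGATIVE_INFINITY;
            for (int c = 0; c < candidates; c++) {
                double estimate = trueReturn + stdErr * rng.nextGaussian(); // unbiased noise
                best = Math.max(best, estimate); // standard models keep the max
            }
            sumPredicted += best; // the chosen portfolio's *predicted* return
        }
        System.out.printf("predicted %.2f%% vs actual %.2f%%%n",
                100 * sumPredicted / trials, 100 * trueReturn);
    }
}
```

With these assumed parameters the predicted return comes out near 10.6% while every portfolio actually returns 5%; the gap is pure estimation noise amplified by choosing the maximum, which is the mechanism behind the bias the dissertation quantifies.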