1. Conversion of a graphics package to sequential PASCAL
   Snyder, Daniel Thomas. January 2010.
   Typescript, etc. / Digitized by Kansas Correctional Industries
2. A test methodology for reliability assessment of collaborative tools
   Powers, Brenda Joy. September 2004. (PDF)
   Thesis (M.S. in Software Engineering)--Naval Postgraduate School, Sept. 2004. / Thesis Advisor(s): Mantak Shing, Neil Rowe. Includes bibliographical references (p. 51-52). Also available online.
3. Developing the portability indices and the portable code generator
   Mundy, Gregory E. 2004.
   Thesis (M.S.)--West Virginia University, 2004. / Title from document title page. Document formatted into pages; contains ix, 117 p. : ill. (some col.). Includes abstract. Includes bibliographical references (p. 114-117).
4. A test methodology for reliability assessment of collaborative tools
   Powers, Brenda Joy. September 2004.
   Approved for public release; distribution is unlimited.

Over the past ten years, military operations, as now evident in Iraq, have involved both joint-allied and coalition forces. The evolving joint- and coalition-warfare environment presents coordination challenges. Collaborative tools can ease the difficulties in meeting these challenges by enabling highly interactive work to be performed by individuals who are not necessarily geographically co-located. Collaborative tools will revolutionize the manner in which distributed warfighters interact and inform each other of mission-planning progress and situation assessment. These systems allow warfighters to integrate tactical information with key combat-support logistics data in both joint- and coalition-warfare environments. Countless collaboration tools and knowledge-management systems exist today. Unfortunately, industry has developed these tools and systems for use primarily within exclusive communities of interest, services, or agencies. The end result is a proliferation of tools that have not been designed to operate under all network conditions. Since network conditions are not standardized in the joint- and coalition-warfare environment, it is necessary to determine whether a collaborative tool can perform under limited-bandwidth and high-latency conditions. Currently, there are neither evaluation criteria nor methodologies for evaluating collaborative tools with respect to performance reliability. This thesis proposes a test methodology for evaluating the performance reliability of collaborative tools, and demonstrates its effectiveness with a case study: a performance evaluation of the InfoWorkSpace collaborative tool.
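The core of such an evaluation can be pictured as a small condition-sweep harness. The sketch below is illustrative only — the function names, message size, jitter model, and responsiveness threshold are assumptions of this sketch, not details from the thesis. It injects latency and serialization delay into a simulated message exchange and reports the fraction of exchanges that meet the threshold under each network condition:

```python
import random

def simulated_rtt(payload_bytes: int, latency_s: float, bandwidth_bps: float) -> float:
    """Simulated round trip for one message: propagation out and back,
    serialization delay, plus a little random jitter."""
    transmit_s = payload_bytes * 8 / bandwidth_bps
    jitter_s = random.uniform(0.0, 0.05)
    return 2 * latency_s + transmit_s + jitter_s

def reliability_score(latency_s: float, bandwidth_bps: float,
                      trials: int = 1000, threshold_s: float = 0.5) -> float:
    """Fraction of 1 KiB exchanges meeting the responsiveness threshold."""
    ok = sum(simulated_rtt(1024, latency_s, bandwidth_bps) <= threshold_s
             for _ in range(trials))
    return ok / trials

# Sweep a small condition matrix: LAN-like to satellite-like latency,
# dial-up to broadband bandwidth.
for latency in (0.01, 0.15, 0.30):
    for bw in (56_000, 1_500_000):
        score = reliability_score(latency, bw)
        print(f"latency={latency*1000:>3.0f}ms  bw={bw/1000:>5.0f}kbps"
              f"  -> {score:.0%} of exchanges within threshold")
```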
5. Mix-and-match compatibility and asymmetric costs
   Monroe, Hunter K. January 1993.

This thesis analyzes how the ability of consumers to buy components of a system from different firms affects prices, profit margins, R&D effort, and welfare. It also examines firms' incentives to make their products compatible, that is, to allow consumers to mix and match different brands of components into systems.

Chapter 1 reviews the economic literature on product compatibility, with motivating material drawn from the personal computer industry. Three strands of the literature study compatibility using definitions based on the ability of consumers to mix and match components, to capture externalities arising from networks, and to switch brands costlessly. The mix-and-match literature has found that compatibility raises prices compared with those under incompatibility in a variety of settings. In practice, however, compatible computers appear to be less expensive than incompatible computers, and computer buyers have promoted standardization.

Chapter 2 develops models of mix-and-match compatibility which make predictions that are the opposite of the literature's. If many Bertrand competitors draw their component costs, qualities, or characteristics from independent random distributions, then expected prices and profit margins are lower under compatibility than under incompatibility, while expected consumer surplus is higher. In addition, the chapter examines the incentives of firms to form coalitions around competing standards. It is found that a subset of firms may become compatible with each other to attract customers away from other firms, creating excess incentives for firms to become compatible from the perspective of industry profits. However, compatibility raises welfare if it is costless and components are homogeneous, because incompatibility is a restriction on the technology for combining components into systems.

Chapter 3 shows that shifting from incompatibility to compatibility has an ambiguous impact on R&D effort to reduce costs. In an industry with sufficiently many firms that faces elastic demand, compatibility lowers prices and raises output, and therefore leads to greater R&D incentives. If effort lowers costs without changing the shape of the cost distribution function, compatibility induces firms to choose R&D effort levels that are closer together than under incompatibility.

Chapter 4 relaxes the assumption that consumers combine components in fixed proportions. With variable coefficients, compatibility does not necessarily raise the profits of duopolists. For instance, compatibility prevents a dominant firm from setting the price of either component above its competitor's cost. On the other hand, when two "mirror-image" firms each have the lowest cost in one component and demand is symmetric across components, the firms prefer compatibility, as they did in the fixed-coefficients case. When sufficiently many firms draw their costs from discrete random distributions, this ambiguity disappears, and expected profits are higher under incompatibility. Variable coefficients also allow analysis of quantity competition, by eliminating the problem of unmatched components when there are asymmetric quantity choices. In this case, firms with mirror-image costs prefer compatibility to incompatibility because they can specialize in their low-cost component. However, when each firm has the same cost across components, firms are indifferent between the two regimes.
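Chapter 2's headline result lends itself to a quick numerical check. The toy Monte Carlo below is an assumption-laden illustration, not the thesis's model: five Bertrand firms draw independent uniform costs for each of two components, and the standard Bertrand outcome with asymmetric costs (price driven down to the second-lowest relevant cost) is applied in each regime. Under this parameterization the expected system price does come out lower under compatibility, matching the chapter's prediction:

```python
import random
import statistics

def one_trial(n_firms: int = 5) -> tuple[float, float]:
    costs = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(n_firms)]
    # Compatibility: consumers mix and match, so Bertrand competition drives
    # each component's price down to the second-lowest cost for that component.
    price_compatible = (sorted(a for a, _ in costs)[1] +
                        sorted(b for _, b in costs)[1])
    # Incompatibility: whole systems compete, so the system price falls
    # to the second-lowest bundled (sum-of-components) cost.
    price_incompatible = sorted(a + b for a, b in costs)[1]
    return price_compatible, price_incompatible

trials = [one_trial() for _ in range(100_000)]
print(f"E[price | compatible]   = {statistics.mean(t[0] for t in trials):.3f}")
print(f"E[price | incompatible] = {statistics.mean(t[1] for t in trials):.3f}")
```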
6. KANDIDATS: the porting of an image processing system
   Lallement, Linda J. January 2010.
   Typescript, etc. / Digitized by Kansas Correctional Industries
7. The implementation of Concurrent Pascal on the NCR8200
   Mounday, Donald. January 2010.
   Typescript, etc. / Digitized by Kansas Correctional Industries
8. Multi-process structuring of X.25 software
   Deering, Stephen Edward. January 1982.

Modern communication protocols present the software designer with problems of asynchrony, real-time response, high throughput, robust exception handling, and multi-level interfacing. An operating system which provides lightweight processes and inexpensive inter-process communication offers solutions to all of these problems. This thesis examines the use of the multi-process structuring facilities of one such operating system, Verex, to implement the protocols defined by CCITT Recommendation X.25. The success of the multi-process design is confirmed by a working implementation that has linked a Verex system to the Datapac public network for over a year.

The processes which make up the Verex X.25 software are organized into layers according to the layered definition of X.25. Within the layers, some processes take the form of finite-state machines which execute the state transitions specified in the protocol definition. Matching the structure of the software to the structure of the specification results in software which is easy to program, easy to understand, and likely to be correct. Multi-process structuring can be applied with similar benefits to protocols other than X.25 and systems other than Verex.

Faculty of Science / Department of Computer Science / Graduate
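The structuring idea — one lightweight process per protocol machine, blocking on an input queue and stepping a state-transition table — can be sketched outside Verex. The following Python rendering is illustrative only: the states, events, and frame names are a drastic simplification of the X.25 link level invented for this sketch, not Deering's code, with threads standing in for lightweight processes and queues for inter-process communication:

```python
import queue
import threading

# States of a drastically simplified X.25 link-level machine.
DISCONNECTED, CONNECTING, CONNECTED = "DISCONNECTED", "CONNECTING", "CONNECTED"

# Transition table: (state, event) -> (next state, frame to emit or None).
TRANSITIONS = {
    (DISCONNECTED, "connect_req"): (CONNECTING, "SABM"),
    (CONNECTING,   "UA"):          (CONNECTED,  None),
    (CONNECTED,    "data_req"):    (CONNECTED,  "I-frame"),
    (CONNECTED,    "DISC"):        (DISCONNECTED, "UA"),
}

def link_layer(events: queue.Queue, wire: queue.Queue) -> None:
    """One 'process': blocks on its event queue and steps the FSM,
    emitting frames to the layer below via another queue."""
    state = DISCONNECTED
    while True:
        event = events.get()
        if event == "shutdown":
            return
        state, frame = TRANSITIONS.get((state, event), (state, None))
        if frame is not None:
            wire.put(frame)   # message passing to the next layer down

events, wire = queue.Queue(), queue.Queue()
threading.Thread(target=link_layer, args=(events, wire), daemon=True).start()
for ev in ("connect_req", "UA", "data_req", "shutdown"):
    events.put(ev)
print(wire.get(), wire.get())   # -> SABM I-frame
```

Because the code mirrors the specification's state table line for line, checking it against the protocol definition is straightforward — which is precisely the property the thesis claims for this structuring.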
9. Japanese and Chinese management information systems and the question of transferability
   Fukuda, K. John. January 1982.
   Published or final version / Management Studies / Doctoral / Doctor of Philosophy
10. A methodology for global schema design
    Mannino, Michael Victor. January 1983.

A global schema is an integrated view of heterogeneous databases used to support data sharing among independent, existing databases. Global schema design complexities arise from the volume of details, design choices, potential conflicts, and interdependencies among design choices. The methodology described provides a framework for efficient management of these critical dimensions in generating and evaluating alternative designs.

The methodology contains three major steps. First, differences due to the varying local data models are resolved by converting each local schema to an equivalent schema in a unifying data model. Second, the entity types of the local schemas in the unifying model are grouped into clusters called common areas. All the entity types in a common area can possibly be merged via generalization. For each common area, semantic information is defined that drives the merging process. Third, each common area is integrated into the global schema by applying a set of generalization operators. Mapping rules are then defined to resolve differences in the representations of equivalent attributes.

The integration of the local schemas is based on equivalence assertions. Four types of attribute equivalences are defined: two attributes may be locally or globally equivalent, and they can be key or non-key. Strategies for handling each of these cases are proposed and evaluated.

The global schema design methodology includes several algorithms which may assist a designer. One algorithm analyzes a set of equivalence assertions for consistency and completeness, including resolution of transitively implied assertions. A second algorithm performs an interactive merge of a common area by presenting the possible generalization actions to the designer. It supports the theme that many generalization structures can be possible, and the appropriate structure often depends on designer preferences and application requirements.

The methodology is evaluated for several cases involving real databases. The cases demonstrate the utility of the methodology in managing the details, considering many alternatives, and resolving conflicts. In addition, these cases demonstrate the need for a set of computer-aided tools; for even a relatively small case, the number of details and design choices can overwhelm a designer.
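The transitive-closure part of the first algorithm is, at its core, union-find over attribute names: asserted equivalences merge classes, and any two attributes in the same class are (directly or transitively) equivalent. A minimal sketch of that core — the attribute names and class layout here are invented for illustration, and the dissertation's consistency and completeness checks are not reproduced:

```python
from collections import defaultdict

class EquivalenceAssertions:
    """Closes attribute-equivalence assertions under transitivity."""

    def __init__(self) -> None:
        self.parent: dict[str, str] = {}

    def _find(self, x: str) -> str:
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def assert_equivalent(self, a: str, b: str) -> None:
        self.parent[self._find(a)] = self._find(b)

    def classes(self) -> dict[str, set[str]]:
        groups: defaultdict[str, set[str]] = defaultdict(set)
        for attr in self.parent:
            groups[self._find(attr)].add(attr)
        return dict(groups)

eq = EquivalenceAssertions()
eq.assert_equivalent("db1.emp.ssn", "db2.staff.social_sec_no")
eq.assert_equivalent("db2.staff.social_sec_no", "db3.person.ssn")
# Transitively implied: db1.emp.ssn is equivalent to db3.person.ssn.
for members in eq.classes().values():
    print(sorted(members))
```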
