About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
31

Emulation framework for testing higher level control methodology

Ennulat, Harold W. January 1992 (has links)
Thesis (M.S.)--Virginia Polytechnic Institute and State University, 1992. / Vita. Abstract. Includes bibliographical references (leaves 99-104). Also available via the Internet.
32

Verification of program properties: from testing to semi-proving

Zhou, Zhiquan., 周智泉. January 2003 (has links)
Published or final version / Computer Science and Information Systems / Doctoral / Doctor of Philosophy
33

Room allocation analysis

Hassett, Thomas Cyril January 1978 (has links)
Published or final version / Industrial Engineering / Master / Master of Science in Engineering
34

Computer aided process organization in software design.

KARIMI, JAHANGIR. January 1983 (has links)
Recent studies point to major problems in today's software systems. Problems with cost, reliability, maintainability, and responsiveness to user requirements have their origin in the early phases of the system development effort. Although increasing awareness of poor design practice has stimulated several research efforts toward making the "detailed design" process more "systematic," there is a significant need for a computer-aided methodology to help designers cope with the complex design process. A framework is established for organizing activities in support of one important aspect of "detailed design": the organization of processes into appropriate process groups and program modules. A computer-aided methodology is presented for analyzing a variety of inter-process relationships in the determination of effective modularizations. The proposed methodology extends current software engineering practice through partial automation of an important software engineering problem, the effective structuring of processes according to multiple design criteria. Multiple design criteria are used to determine inter-process relationships; the system accommodates a number of design criteria, including volume of data transport, distribution of data references, and information and control distribution. The methodology begins with the assignment of a graph structure to subsystem components and their interdependencies. The resulting graph is partitioned to determine subgraphs (modules) with strong intra-dependencies and weak inter-dependencies. The set of subgraphs defines modules that satisfy the principles of high module strength and low module coupling. The decomposition method used also produces a hierarchical structure of modules with little resource sharing. The resulting design limits "reference distribution" and "information distribution" between modules, which reduces the complexity of the total structure. Analytical tools in support of these activities are presented, and support of the methodology is illustrated by a pilot study.
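The grouping step described in this abstract, partitioning a weighted graph of inter-process relationships into modules with strong internal and weak external coupling, can be illustrated with a small sketch. The relationship weights, the greedy merge strategy, and the target module count below are hypothetical and are not the dissertation's own partitioning method; they only show the idea of clustering processes by relationship weight.

```python
# Minimal illustration (not the algorithm from the dissertation): group processes
# into modules by merging the most strongly related pairs first, so that heavy
# inter-process relationships end up inside modules and light ones between them.
from collections import defaultdict

# Hypothetical weighted inter-process relationships (e.g., volume of data transport).
edges = {
    ("parse", "validate"): 9,
    ("validate", "store"): 8,
    ("store", "report"): 2,
    ("report", "format"): 7,
    ("parse", "report"): 1,
}

parent = {}

def find(x):
    # Union-find root lookup with path halving.
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def modules(edges, k):
    # Merge the heaviest relationships until only k groups (modules) remain.
    nodes = {p for pair in edges for p in pair}
    groups = len(nodes)
    for (a, b), _weight in sorted(edges.items(), key=lambda item: -item[1]):
        ra, rb = find(a), find(b)
        if ra != rb and groups > k:
            parent[ra] = rb
            groups -= 1
    grouped = defaultdict(list)
    for n in nodes:
        grouped[find(n)].append(n)
    return sorted(sorted(members) for members in grouped.values())

print(modules(edges, 2))  # [['format', 'report'], ['parse', 'store', 'validate']]
```

With these weights, the heavily coupled chain parse-validate-store ends up in one module and report-format in another, leaving only the light relationships between modules.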
35

Exact and approximation algorithms for DNA sequence reconstruction.

Kececioglu, John Dimitri. January 1991 (has links)
The DNA sequence in every human being is a text of three billion characters from a four-letter alphabet; determining this sequence is a major project in molecular biology. The fundamental task biologists face is to reconstruct a long sequence given short fragments from unknown locations. These fragments contain errors, and may represent the sequence on one strand of the double helix, or the reverse complement sequence on the other strand. The Sequence Reconstruction Problem is: given a collection of fragment sequences and an error rate 0 ≤ ε < 1, find a shortest sequence S such that every fragment F in the collection, or its reverse complement, matches a substring of S with at most ε|F| errors. Sequence Reconstruction is NP-complete. We decompose the problem into (1) constructing a graph of approximate overlaps between pairs of fragments, (2) selecting a set of overlaps of maximum total weight that induce a consistent layout of the fragments, and (3) merging the overlaps into a multiple sequence alignment and voting on a consensus. A solution to (1) through (3) yields a reconstructed sequence feasible at error rate 2ε/(1-ε) and at most a factor 1/(1-ε) longer than the shortest reconstruction, given some assumptions on fragment error. We define a measure of the overlap in a reconstruction, show that maximizing the overlap minimizes the length, and show that approximating (2) within a factor of α approximates Sequence Reconstruction within a factor of (1-ε)α under the overlap measure. We construct the overlap graph for (1) in O(εN²) time given fragments of total length N at error rate ε. We develop two exact and two approximation algorithms for (2). Our best exact algorithm computes an optimal layout for a graph of E overlaps and V fragments in O(K(E + V log V)) time, where K ≤ 2ᴱ is the size of the branch-and-bound search tree. Our best approximation algorithm computes a layout with overlap at least 1/2 the maximum in O(V(E + V log V) log V) time. This is the first treatment of Sequence Reconstruction with inexact data and unknown complementarity.
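The feasibility condition in the problem statement, that a fragment or its reverse complement matches a substring of the reconstruction with at most ε|F| errors, can be checked with a short semi-global edit-distance computation. The fragment, candidate reconstruction, and error rate below are hypothetical, and this plain dynamic-programming sketch only illustrates the condition; it is not the thesis's O(εN²) overlap-graph construction or its layout algorithms.

```python
# A plain dynamic-programming sketch (not the thesis's algorithms) of the feasibility
# test in the problem statement: a fragment F is placeable if F or its reverse
# complement matches some substring of the reconstruction S with at most eps*|F| errors.

def revcomp(s):
    # Reverse complement of a DNA string over the alphabet {A, C, G, T}.
    return s[::-1].translate(str.maketrans("ACGT", "TGCA"))

def min_substring_edits(frag, text):
    # Semi-global edit distance: characters of `text` before and after the match are free.
    prev = [0] * (len(text) + 1)
    for i, a in enumerate(frag, 1):
        cur = [i] + [0] * len(text)
        for j, b in enumerate(text, 1):
            cur[j] = min(prev[j] + 1,             # consume a fragment character (gap in text)
                         cur[j - 1] + 1,          # consume a text character (gap in fragment)
                         prev[j - 1] + (a != b))  # match or substitute
        prev = cur
    return min(prev)

def placeable(frag, S, eps):
    budget = eps * len(frag)
    return min(min_substring_edits(frag, S),
               min_substring_edits(revcomp(frag), S)) <= budget

# Hypothetical fragment and candidate reconstruction: one substitution is within 0.25 * |F|.
print(placeable("ACGT", "TTACGATT", 0.25))  # True
```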
36

Evaluation of a neonatal hyperalimentation microcomputer program

Angelier, Daniel Michael, 1950- January 1988 (has links)
A neonatal hyperalimentation microcomputer program was designed to generate labels and calculate mixing instructions. Artificial intelligence techniques, including interviewing experts and an inference algorithm, were employed to provide decision support in identifying clinically inappropriate orders. Development cost was $10,000. The program was alpha-phase tested by comparing the performance of pharmacists and technicians. When pharmacists and technicians used an electric typewriter and calculator, task times were high and the process was prone to human mathematical error. All subjects performed poorly, with low confidence, in manually determining inappropriate orders, although pharmacists scored slightly higher. Using the program, task time decreased by 17 minutes with no errors. Appropriateness, confidence, and composite performance were vastly improved with decision support. Pharmacists' composite performance was slightly higher.
37

The Forward Observer Personal Computer Simulator (FOPCSim) 2

Strom, Mark W. 09 1900 (has links)
Due to declining budgets and decreases in ammunition allowances, the opportunity to conduct live-fire artillery training has been greatly reduced. The available simulation trainers are outdated, require specialized contractor support, or are not deployable. FOPCSim was developed at no cost, is freely available, takes advantage of modern 3D graphics, eliminates costly contractor support, and will run on laptops in support of deploying units. The simulator provides users with real-time performance feedback based on the Marine Corps Training and Readiness standards and was designed according to a cognitive task analysis of the call-for-fire procedures. To evaluate how well FOPCSim trains the call-for-fire procedures, an experiment was conducted at The Basic School in Quantico, Virginia. FOPCSim was used in place of the current simulation, the Training Set, Fire Observation (TSFO), to evaluate its training effectiveness. By eliminating the overhead associated with most simulators, FOPCSim allows users to perform the call-for-fire procedures with the high degree of repetition needed to train this type of task.
38

Design and operational aspects of autonomous Unmanned Combat Aerial Vehicles

Baggesen, Arne 09 1900 (has links)
A family of advanced weapon systems that deserves special attention comprises autonomous aerial weapons called Unmanned Combat Aerial Vehicles (UCAVs), which are characterized by the ability to loiter in the target area, sense the targets, acquire the targets, and then engage them. Modeling this combination of capabilities in a specific operational setting is necessary for addressing design and operational issues of this weapon. This work focuses on the development of an analytic probability model that captures key aspects of the autonomous weapon systems' engagement process. Special attention is given to simultaneous attack occurrences, imperfect battle damage assessment, and attack coordination properties. The model is a continuous-time Markov chain; for its implementation, a state generator and an algorithm that computes the transition and limiting probabilities have been developed and programmed in Java-based software. The Markov model derives values for several measures of effectiveness (MOEs) and the average engagement time. Different operational scenarios and design configurations are examined in a sample analysis to demonstrate the model's capabilities. Trade-offs among the sensing and data-processing capabilities, vulnerability, and lethality of UCAVs are explicitly represented with respect to selected MOEs.
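The limiting probabilities of a continuous-time Markov chain such as the one described above solve the balance equations πQ = 0 together with the normalization Σπ = 1, where Q is the chain's generator matrix. The three-state generator below is hypothetical, and the sketch uses Python/NumPy rather than the thesis's Java software, purely to show how the limiting distribution is obtained.

```python
# Illustrative only: the thesis implements its own state generator and solver in Java;
# this NumPy sketch just shows how limiting probabilities follow from a generator matrix.
import numpy as np

# Hypothetical 3-state generator matrix Q: off-diagonal entries are transition rates,
# and each row sums to zero.
Q = np.array([[-0.7,  0.5,  0.2],
              [ 0.3, -0.6,  0.3],
              [ 0.4,  0.4, -0.8]])

# Limiting probabilities pi satisfy pi @ Q = 0 with sum(pi) = 1.  Replace one
# (redundant) balance equation with the normalization constraint and solve.
n = Q.shape[0]
A = np.vstack([Q.T[:-1], np.ones(n)])
b = np.zeros(n)
b[-1] = 1.0
pi = np.linalg.solve(A, b)

print(pi)      # limiting distribution over the three states
print(pi @ Q)  # balance check: should be close to the zero vector
```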
39

Authentication scenario for CyberCIEGE

Mueller, David S. 09 1900 (has links)
Frequent media reports of the loss or compromise of data stored on computer systems indicate that attempts to educate users on proper computer security policies and procedures seem to be ineffective. In an effort to provide a means of education that will more fully engage users, the CyberCIEGE game was created. It is hoped that by playing CyberCIEGE, users will absorb computer security concepts better than they have through more traditional forms of instruction, because many find games to be a compelling experience. Many users do not understand why good passwords and password management are important for information systems. This effort developed a scenario for CyberCIEGE to teach players about the issues involved in developing a password policy for a computer system. Limited testing showed that the scenario accomplishes this. CyberCIEGE uses a Scenario Definition Language to give developers and educators the ability to create scenarios that focus on particular concepts. To streamline scenario development, a Scenario Definition Tool has been created, and this work also included beta testing of that tool, a program that aids scenario developers in the creation of scenarios for the game. The testing resulted in several improvements to the tool.
40

Traffic profiling of wireless sensor networks

Kirykos, Georgios 12 1900 (has links)
Network security is vital in the wireless networks that are widely used today. We desire wireless networks that maintain a high degree of confidentiality, integrity, and availability. Wireless sensor networks pose unique challenges and limitations for the traditional security schemes used in other wireless networks, due mainly to their increased vulnerability to physical attacks and their energy and communication limitations. This thesis introduces the foundations of a network- and anomaly-based Intrusion Detection System (IDS) tool, including both hardware and software components, that can be used for traffic profiling and monitoring of a wireless sensor network. The work demonstrates how the IDS should capture and store traffic and use this information to create traffic profiles and baselines for normal traffic behavior. It then describes how these baselines can be used to generate alerts based on traffic variations that imply possible attacks. Profiles of typical wireless sensor network implementations were observed and analyzed. Finally, basic analysis of wireless sensor network traffic gave initial indications of a high degree of self-similarity.
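The baseline-and-alert idea described above can be sketched in a few lines: learn the normal traffic rate from a quiet observation window, then flag intervals whose rate deviates from that baseline by more than a chosen threshold. The traffic counts, threshold, and simple statistics below are hypothetical and are not taken from the thesis's IDS tool.

```python
# A minimal sketch (hypothetical data and threshold, not the thesis's IDS tool):
# learn a packets-per-interval baseline from a quiet period, then flag intervals
# that deviate from it by more than k standard deviations.
from statistics import mean, stdev

def baseline(training_counts):
    # Baseline is the mean and standard deviation of normal traffic volume.
    return mean(training_counts), stdev(training_counts)

def alerts(live_counts, mu, sigma, k=3.0):
    # Return (interval index, count) pairs whose volume deviates too far from the baseline.
    return [(i, c) for i, c in enumerate(live_counts) if abs(c - mu) > k * sigma]

# Hypothetical packets-per-minute samples from a sensor network gateway.
normal_period = [48, 52, 50, 47, 51, 49, 50, 48, 52, 51]
mu, sigma = baseline(normal_period)

live_traffic = [49, 51, 180, 50]
print(alerts(live_traffic, mu, sigma))  # [(2, 180)] -- a possible flooding or replay burst
```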
