1

Rapid Mission Assurance Assessment via Sociotechnical Modeling and Simulation

Lanham, Michael Jay 01 May 2015 (has links)
How do organizations rapidly assess command-level effects of cyber attacks? Leaders need a way of assuring themselves that their organization, people, and information technology can continue their missions in a contested cyber environment. To do this, leaders should: 1) require that assessments be more than analogical, anecdotal, or simplistic snapshots in time; 2) demand the ability to rapidly model their organizations; 3) identify their organization’s structural vulnerabilities; and 4) have the ability to forecast mission assurance scenarios. Using text mining to build agent-based dynamic network models of information processing organizations, I examine impacts of contested cyber environments on three common focus areas of information assurance—confidentiality, integrity, and availability. I find that assessing impacts of cyber attacks is a nuanced affair dependent on the nature of the attack, the nature of the organization and its missions, and the nature of the measurements. For well-manned information processing organizations, many attacks fall in the nuisance range, and only multipronged or severe attacks cause meaningful failure. I also find that such organizations can design for resiliency, and I provide guidelines on how to do so.
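The availability finding in this abstract — that redundancy keeps many attacks in the "nuisance range" — can be illustrated with a toy agent-based sketch. This is not the dissertation's model; the organization, pool size, and attack fraction below are invented for illustration: tasks can be handled by any of several redundant agents, and an availability attack knocks out a fraction of the workforce.

```python
import random

def simulate(n_agents=20, n_tasks=200, fraction_down=0.0, seed=42):
    """Toy information-processing organization: each task may be handled
    by any of 3 randomly assigned agents; the task fails only if every
    agent in its pool has been taken down by the attack."""
    rng = random.Random(seed)
    agents = list(range(n_agents))
    down = set(rng.sample(agents, int(n_agents * fraction_down)))
    completed = 0
    for _ in range(n_tasks):
        pool = rng.sample(agents, 3)  # redundant assignment: 3 candidate handlers
        if any(a not in down for a in pool):
            completed += 1
    return completed / n_tasks

baseline = simulate(fraction_down=0.0)  # no attack: every task completes
attacked = simulate(fraction_down=0.5)  # half the agents unavailable
print(baseline, attacked)
```

Even with half the agents down, most tasks still complete because a task fails only when all three candidate handlers are unavailable — a crude version of the resiliency-through-redundancy effect the abstract reports.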
2

A methodology for creating expert-based quantitative models for early phase design

Engler, William O., III 08 April 2013 (has links)
Early systems engineering and requirements definition require quantitative information about potential solutions before there is sufficient information or time to develop detailed models. This research develops and demonstrates a transparent and repeatable process for rapidly creating quantitative models that leverage existing expert knowledge. The process is built upon established modeling frameworks and current literature on low-fidelity modeling and hierarchical expert-based methods. It includes system definition using interactive morphological analysis and gathering information from subject-matter experts through computer-based interfaces in order to create a series of linear performance models. Available volunteers provided data for a relevant aerospace design to test the process as a whole and several hypotheses about specific methodological decisions made during development. The collected data were analyzed for similarity among participants and for similarity to model parameters of an existing trusted truth model. The results of the analysis demonstrated that expert-based models can accurately match the behavior of the truth models and of historical data.
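The core artifact this abstract describes — a linear performance model fit to expert-elicited data points — can be sketched in a few lines. The design variable, response, and data values below are hypothetical placeholders, not the study's actual aerospace design or elicited data:

```python
def fit_linear(xs, ys):
    """Ordinary least squares fit y = a + b*x, e.g. from a handful of
    expert-elicited (design variable, performance estimate) pairs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical expert estimates: wing span (m) vs. cruise range (km)
spans = [8.0, 10.0, 12.0, 14.0]
ranges = [900.0, 1100.0, 1300.0, 1500.0]
a, b = fit_linear(spans, ranges)
print(a, b)  # intercept and slope of the elicited performance model
```

A hierarchy of such low-fidelity linear models, one per performance metric, is cheap enough to build in an elicitation session yet still yields quantitative trade-off information for early-phase design.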