About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Information Assurance (IA) Considerations for a Telemetry Network System (TmNS)

Hodack, David 10 1900 (has links)
ITC/USA 2010 Conference Proceedings / The Forty-Sixth Annual International Telemetering Conference and Technical Exhibition / October 25-28, 2010 / Town and Country Resort & Convention Center, San Diego, California / The integrated Network Enhanced Telemetry (iNET) project was launched by the Central Test and Evaluation Investment Program (CTEIP) to foster network enhanced instrumentation and telemetry. The iNET program is preparing for the TmNS system demonstration. The goal of the demonstration is to prove that the proposed TmNS will meet the Test Capability Requirements Document (TCRD) and validate the iNET standards. One aspect of the preparation is looking at the IA issues and making decisions to ensure that the system will be certified and accredited, meet user needs, and be secure. This paper will explore a few of these considerations.
2

Obtaining an ATO for an iNET Operational Demonstration

Hodack, David 10 1900 (has links)
ITC/USA 2009 Conference Proceedings / The Forty-Fifth Annual International Telemetering Conference and Technical Exhibition / October 26-29, 2009 / Riviera Hotel & Convention Center, Las Vegas, Nevada / The integrated Network Enhanced Telemetry (iNET) project was launched to foster network enhanced instrumentation and telemetry. The program is currently implementing an operational demonstration, which will involve installing and using a network enhanced instrumentation system on a helicopter. This demonstration will be used as a learning exercise for the implementation of network technologies. This paper will give a brief description of the operational demonstration, explore the need for an Authority to Operate (ATO), and describe how one was obtained.
3

Kärnverksamhetsdriven Information Assurance : Att skapa tilltro till beslutskritisk information [Core-business-driven Information Assurance: Establishing trust in decision-critical information]

Haglund, Jonas January 2007 (has links)
This thesis was carried out on behalf of Generic Systems with the aim of, among other things, investigating the concept of Information Assurance (IA) based on a given definition (IA is the process of establishing trust for the information we use in our business). The focus came to rest on tactical military command-and-control systems, including the individuals who are part of them. There is a great need for a study of this kind, for several reasons, not least because a paradigm shift is currently under way: from a custodian attitude to a consumer attitude, and from need to know to need to share. In other words, an entirely new view of information is emerging. The first conclusion drawn was that IA is a truly multidisciplinary subject that requires deep knowledge in many widely differing fields, for example systems science, computer security, and organizational theory. Another conclusion was that for a user to feel trust in information, three basic requirements must be met: the user must have confidence in the system that delivers the information, the information itself must exhibit certain qualities, and, finally, characteristics of the user affect the perceived trust. Starting from the needs of the Swedish Armed Forces, it turned out that the central changes required to achieve IA include an increased focus on usability in systems development and more flexible, individually adapted training. The training must ensure both sufficient technical competence and security awareness among personnel. Other areas that should change include the information architecture; greater separation between the data model and the information model is desirable.
4

Confidentiality Protection of User Data and Adaptive Resource Allocation for Managing Multiple Workflow Performance in Service-based Systems

January 2012 (has links)
In this dissertation, two interrelated problems of service-based systems (SBS) are addressed: protecting users' data confidentiality from service providers, and managing the performance of multiple workflows in SBS. Current SBSs pose serious limitations to protecting users' data confidentiality. Since users' sensitive data is sent in unencrypted form to remote machines owned and operated by third-party service providers, there is a risk of unauthorized use of that data by the service providers. Although there are many techniques for protecting users' data from outside attackers, there is currently no effective way to protect users' sensitive data from service providers. This dissertation presents an approach to protecting the confidentiality of users' data from service providers and ensuring that service providers cannot collect users' confidential data while the data is processed or stored in cloud computing systems. The approach has four major features: (1) separation of software service providers and infrastructure service providers, (2) hiding information about the owners of the data, (3) data obfuscation, and (4) software module decomposition and distributed execution. Because the approach includes software module decomposition and distributed execution, it is essential to allocate server resources in the SBS effectively to each software module in order to manage the overall performance of workflows. A resource allocation approach is therefore presented that adaptively allocates the system resources of servers to their software modules at runtime so as to satisfy the performance requirements of multiple workflows in the SBS. Experimental results show that the dynamic resource allocation approach can substantially increase the throughput of an SBS and that the optimal resource allocation can be found in polynomial time. / Dissertation/Thesis / Ph.D. Computer Science 2012
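The adaptive resource allocation idea summarized in this abstract can be pictured with a small sketch (not the dissertation's actual algorithm): server capacity is periodically re-divided among workflow modules, giving more to workflows that are falling short of their required throughput. The module names, throughput figures, and the proportional rule below are illustrative assumptions only.

```python
# Minimal sketch of runtime re-allocation of server capacity among workflow
# modules, assuming a simple "give more to lagging workflows" rule. Module
# names and throughput numbers are hypothetical, not from the dissertation.

def reallocate(total_capacity, modules, min_share=0.5):
    """modules: list of dicts with 'name', 'required_tps', 'measured_tps'."""
    # Reserve a small guaranteed share per module, then split the remainder
    # in proportion to each workflow's throughput deficit.
    remaining = total_capacity - min_share * len(modules)
    deficits = {m["name"]: max(m["required_tps"] - m["measured_tps"], 0.0) for m in modules}
    total_deficit = sum(deficits.values())
    shares = {}
    for m in modules:
        if total_deficit > 0:
            extra = remaining * deficits[m["name"]] / total_deficit
        else:
            extra = remaining / len(modules)
        shares[m["name"]] = min_share + extra
    return shares

if __name__ == "__main__":
    demo = [
        {"name": "obfuscate", "required_tps": 200, "measured_tps": 150},
        {"name": "transform", "required_tps": 100, "measured_tps": 100},
        {"name": "aggregate", "required_tps": 300, "measured_tps": 180},
    ]
    print(reallocate(total_capacity=16.0, modules=demo))  # e.g. CPU cores per module
```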
5

Physical Security Assessment of a Regional University Computer Network

Timbs, Nathan H 01 December 2013 (has links) (PDF)
Assessing a network's physical security is an essential step in securing its data. This document describes the design, implementation, and validation of PSATool, a prototype application for assessing the physical security of a network's intermediate distribution frames, or IDFs (a.k.a. "wiring closets"). PSATool was created to address a lack of tools for IDF assessment. It implements a checklist-based protocol for assessing compliance with 52 security requirements compiled from federal and international standards. This checklist can be extended according to organizational needs. PSATool was validated by using it to assess physical security at 135 IDFs at East Tennessee State University. PSATool exposed 95 threats, hazards, and vulnerabilities in 82 IDFs. A control was recommended for each threat, hazard, and vulnerability discovered. The administrators of ETSU's network concluded that PSATool's results agreed with their informal sense of these IDFs' physical security, while providing documented support for improvements to IDF security.
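The checklist-based protocol described in this abstract lends itself to a simple data model: each requirement pairs a description with a recommended control, and every failed (or unanswered) check becomes a documented finding. The sketch below only illustrates that idea; the requirement texts are placeholders, not items from PSATool's actual 52-requirement checklist.

```python
# Hypothetical sketch of a checklist-driven IDF assessment in the spirit of
# PSATool: the checklist can be extended per organization, and every failed
# check yields a finding paired with a recommended control. The requirements
# below are illustrative placeholders only.

from dataclasses import dataclass

@dataclass
class Requirement:
    id: str
    description: str
    recommended_control: str

CHECKLIST = [
    Requirement("PHY-01", "IDF door is locked and access is restricted",
                "Install a keyed or card-controlled lock"),
    Requirement("ENV-01", "Room temperature is within equipment limits",
                "Add dedicated cooling or temperature monitoring"),
]

def assess_idf(idf_name, answers):
    """answers maps requirement id -> True (compliant) / False (not compliant)."""
    findings = []
    for req in CHECKLIST:
        if not answers.get(req.id, False):  # an unanswered item counts as a gap
            findings.append((idf_name, req.id, req.description, req.recommended_control))
    return findings

print(assess_idf("Library-2F-IDF", {"PHY-01": True, "ENV-01": False}))
```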
6

Combining Static Analysis and Dynamic Learning to Build Context Sensitive Models of Program Behavior

Liu, Zhen 10 December 2005 (has links)
This dissertation describes a family of models of program behavior, the Hybrid Push Down Automata (HPDA) that can be acquired using a combination of static analysis and dynamic learning in order to take advantage of the strengths of both. Static analysis is used to acquire a base model of all behavior defined in the binary source code. Dynamic learning from audit data is used to supplement the base model to provide a model that exactly follows the definition in the executable but that includes legal behavior determined at runtime. Our model is similar to the VPStatic model proposed by Feng, Giffin, et al., but with different assumptions and organization. Return address information extracted from the program call stack and system call information are used to build the model. Dynamic learning alone or a combination of static analysis and dynamic learning can be used to acquire the model. We have shown that a new dynamic learning algorithm based on the assumption of a single entry point and exit point for each function can yield models of increased generality and can help reduce the false positive rate. Previous approaches based on static analysis typically work only with statically linked programs. We have developed a new component-based model and learning algorithm that builds separate models for dynamic libraries used in a program allowing the models to be shared by different program models. Sharing of models reduces memory usage when several programs are monitored, promotes reuse of library models, and simplifies model maintenance when the system updates dynamic libraries. Experiments demonstrate that the prototype detection system built with the HPDA approach has a performance overhead of less than 6% and can be used with complex real-world applications. When compared to other detection systems based on analysis of operating system calls, the HPDA approach is shown to converge faster during learning, to detect attacks that escape other detection systems, and to have a lower false positive rate.
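The core idea, call-stack-aware models of system-call behavior, can be pictured with a much-simplified sketch: transitions keyed by the return addresses on the stack and the system call issued are learned from traces, and transitions absent from the model are flagged at detection time. This is an assumption-laden simplification of the general technique, not the HPDA construction actually used in the dissertation.

```python
# Hypothetical sketch of a stack-aware behavior model: learn transitions from
# (return-address chain, system call) pairs observed in training traces, then
# flag previously unseen transitions during detection. Addresses and traces
# below are made up for illustration.

class StackAwareModel:
    def __init__(self):
        self.transitions = set()

    def learn(self, trace):
        # trace: iterable of (return_address_tuple, syscall_name) pairs
        prev = None
        for stack, syscall in trace:
            self.transitions.add((prev, stack, syscall))
            prev = (stack, syscall)

    def detect(self, trace):
        anomalies, prev = [], None
        for stack, syscall in trace:
            if (prev, stack, syscall) not in self.transitions:
                anomalies.append((stack, syscall))
            prev = (stack, syscall)
        return anomalies

model = StackAwareModel()
model.learn([((0x401200,), "open"), ((0x401200, 0x4015A0), "read")])
# "write" at this stack context was never seen during learning, so it is flagged.
print(model.detect([((0x401200,), "open"), ((0x401200, 0x4015A0), "write")]))
```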
7

An Empirical Investigation Of The Influence Of Fear Appeals On Attitudes And Behavioral Intentions Associated With Recommended Individual Computer Security Actions

Johnston, Allen C 13 May 2006 (has links)
Through persuasive communication, IT executives strive to align the actions of end users with the desired security posture of management and of the firm. In many cases, the element of fear is incorporated within these communications. However, within the context of computer security and information assurance, it is not yet clear how these fear-inducing arguments, known as fear appeals, will ultimately impact the actions of end users. The purpose of this study is to examine the influence of fear appeals on the compliance of end users with recommendations to enact specific individual computer security actions toward the amelioration of threats. A two-phase examination was adopted that involved two distinct data collection and analysis procedures, and culminated in the development and testing of a conceptual model representing an infusion of theories based on prior research in Social Psychology and Information Systems (IS), namely the Extended Parallel Process Model (EPPM) and the Unified Theory of Acceptance and Use of Technology (UTAUT). Results of the study suggest that fear appeals do impact end users' attitudes and behavioral intentions to comply with recommended individual acts of security, and that the impact is not uniform across all end users, but is determined in part by perceptions of self-efficacy, response efficacy, threat severity, threat susceptibility, and social influence. The findings suggest that self-efficacy and, to a lesser extent, response efficacy predict attitudes and behavioral intentions to engage in individual computer security actions, and that these relationships are governed by perceptions of threat severity and threat susceptibility. The findings of this research will contribute to IS expectancy research, human-computer interaction, and organizational communication by revealing a new paradigm in which IT users form perceptions of the technology, not on the basis of performance gains, but on the basis of utility for threat amelioration.
8

Tackling the barriers to achieving Information Assurance

Simmons, Andrea C. January 2017 (has links)
This original, reflective practitioner study researched whether professionalising IA could be successfully achieved, in line with the UK Cyber Security Strategy expectations. The context was an observed change in the dominant narrative from IA to cybersecurity. The research provides a dialectical relationship with the past to improve understanding of IA. The academic contribution: using archival and survey data, the research traced the origins of the term IA and its practitioner usage, in the context of the increasing use of the neologism cybersecurity, contributing to knowledge through historical research. Discourse analysis was applied to predominantly UK government reports, policy direction, and legislative and regulatory changes, reviewing the texts to explore the functions served by specific constructions, mainly Information Security (Infosec) versus IA. The researcher studied how accounts were linguistically constructed in terms of the descriptive, referential and rhetorical language used, and the function that language serves. The results were captured in a chronological review of IA ontology. The practitioner contribution: through an initial Participatory Action Research (PAR) public sector case study, the researcher sought to make sense of how the IA profession operates and how it was maturing. Data collection from self-professed IA practitioners provided empirical evidence. The researcher undertook evolutionary work analysing the survey responses and developed theories from the analysis to answer the research questions. The researcher observed a need to implement a unified approach to Information Governance (IG) on a large, organisation-wide scale. Using constructivist grounded theory, the researcher developed a new theoretical framework, i3GRC™ (Integrated and Informed Information Governance, Risk, and Compliance), based on what people actually say and do within the IA profession. i3GRC™ supports the required Information Protection (IP) through maturation from IA to holistic IG. Again using PAR, the theoretical framework was tested through a private sector case study, the resultant experience strengthening the bridge between academia and practitioners.
9

The Impact of New Information Technology on Bureaucratic Organizational Culture

Givens, Mark Allen 01 January 2011 (has links)
Virtual work environments (VWEs) have been used in the private sector for more than a decade, but the United States Marine Corps (USMC), as a whole, has not yet taken advantage of the associated benefits. The USMC construct parallels a bureaucratic organizational culture and uses an antiquated information technology (IT) infrastructure. During an effort to upgrade the Marine Corps Combat Development Command's infrastructure to a VWE, the change agent noticed immediate resistance toward the VWE and the new work methodology. The problem identified for investigation was to discover why a bureaucratic organizational culture, matured through IT-savvy and cognitively adept personnel, resists the VWE and the new work methodology introduced by the evolution of IT. The explanatory, single case study documented the resistance toward the VWE and new work methodology and recommended a solution to the problem. Because of the noticeable resistance toward adoption of the VWE, the case study and pre-trial preparation began in fall 2009, and the data collection period occurred in spring 2010. The preparation phase entailed developing extensive instruments that probed the participants' technical expertise and willingness to accept change at the individual level. As part of the groundwork, the instruments were validated by an expert panel drawn from the following disciplines: knowledge management, information technology, and psychology. Analysis of the data showed resistance toward the VWE, both collectively as a group and at the individual level. The final report concludes that the study successfully accomplished its goal. Resistance was found at the user level in the work environment and stemmed from several key areas: lack of user input, lack of training, user ignorance, and the absence of a vision statement. Leadership should review the implementation methodology and decide upon a new course of action. Training in the use of the VWE at entry and advanced levels should be offered to newcomers and be available on a continuous basis. Future change agents must examine the outcomes of similar, previous work in order to gain a better understanding of what makes an initiative fail or succeed.
