About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

71

The association and probability of a predictive relationship between cyber security incidents and type of internet connectivity: A quantitative study

Lagrule, Carlos Manuel 07 May 2015 (has links)
Research has shown that the cost of information security (IS) breaches to organizations is estimated in the billions of dollars. Extant research has linked human error to about 65% of data breaches, which involve economic losses of more than $20 billion to US companies. Researchers concur, adding that end users' behaviors contribute to internal security breaches in organizations and that these behaviors include employee negligence and non-compliance with policies. Research has also shown that individuals' self-efficacy in strengthening information security starts at home; this behavior at home creates the foundation for Internet users to continue applying security behaviors at work. This study investigated the association, and the probability of a predictive relationship, between the independent variable (IV), type of Internet connectivity, and the dependent variable (DV), cyber security incidents, among adult users of the Internet in the U.S.A. Findings from a Chi-square test indicated that no statistically significant association, and no statistically significant predictive relationship, existed between the IV and the DV. The Chi-square results were consistent with the results of the binomial logistic regression.
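
As a rough illustration of the kind of Chi-square test of association the abstract describes, the sketch below builds a hypothetical 2x2 contingency table of connectivity type against whether an incident was reported; the counts and category labels are placeholders, not the study's data.

```python
# Illustrative sketch only: a Chi-square test of association between a
# categorical predictor (type of Internet connectivity) and a binary
# outcome (cyber security incident reported or not). All counts below
# are hypothetical placeholders.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: connectivity type (e.g., wired, wireless); columns: no incident, incident
observed = np.array([
    [180, 45],   # wired
    [150, 60],   # wireless
])

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.3f}, p = {p_value:.3f}, dof = {dof}")

# A p-value above the chosen alpha (commonly 0.05) would indicate no
# statistically significant association, which is the outcome the study reports.
```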
72

Attributes affecting software testing estimation: is organizational trust an issue?

Hammoud, Wissam 05 September 2014 (has links)
This quantitative correlational research explored the potential association between levels of organizational trust and software testing estimation. This was done by exploring the relationships between organizational trust, testers' expertise, the organizational technology used, and the number of hours, number of testers, and time-coding estimated by the software testers. The research, conducted in the software testing department of a health insurance organization, employed the Organizational Trust Inventory-Short Form (OTI-SF) developed by Philip Bromiley and Larry Cummings and revealed a strong relationship between organizational trust and software testing estimation. The research reviews historical theories of organizational trust and includes a deep discussion of software testing practices and software testing estimation. Given the significant impact of organizational trust on project estimating and time-coding found in this research, software testing leaders can use these findings to improve project planning and management by improving the levels of trust within their organizations.
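
The abstract describes a correlational analysis relating OTI-SF trust scores to testers' estimates. The sketch below shows one hedged way such a relationship might be checked with a Pearson correlation; the scores and hours are invented placeholders, not the study's measurements.

```python
# Illustrative sketch only: correlating an organizational trust score with
# the hours a tester estimates for a task. Hypothetical data, not the
# study's OTI-SF measurements.
from scipy.stats import pearsonr

trust_scores    = [3.1, 4.2, 2.8, 3.9, 4.5, 3.3, 2.5, 4.0]  # OTI-SF style scores
estimated_hours = [40, 28, 46, 30, 25, 38, 50, 27]           # estimated testing hours

r, p_value = pearsonr(trust_scores, estimated_hours)
print(f"r = {r:.2f}, p = {p_value:.3f}")
# A strong negative r here would suggest higher trust goes with leaner estimates.
```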
73

Evaluation of security methods for the prevention of malware on mobile devices

Giacalone, Anthony S. 08 August 2014 (has links)
Since the introduction of the iPhone in 2008, mobile devices have become ubiquitous in our society and have spawned a new area for attackers to steal private information and data. Malware has begun to appear on these devices despite the claims of Google and Apple that their devices are secure. To combat this growing problem, companies have started producing applications which claim to be able to scan for malware and protect devices from these threats. Current measures to prevent loss of data from malware and illicit use of mobile devices are discussed first. This thesis then analyzes the three most popular security application offerings on Android OS and, by conducting four separate benchmark tests on the software, determines whether these security suites offer the user any benefit beyond the standard malware scans performed by Google's servers. Potential problems with these security programs, including increased system load and loss of battery life, are discussed along with the results of the tests. Finally, this thesis explores the lack of heuristic scanning in these security applications and the potential threat that boot sector viruses might pose to mobile devices in the future.
74

An Automated System for Rapid and Secure Device Sanitization

LaBarge, Ralph S. 24 June 2014 (has links)
Public and private organizations face the challenge of protecting their networks from cyber-attacks while reducing the amount of time and money spent on information technology. Organizations can reduce their expenditures by reusing server, switch, and router hardware, but they must use reliable and efficient methods of sanitizing these devices before they can be redeployed. The sanitization process removes proprietary, sensitive, or classified data, as well as persistent malware, from a device prior to reuse. The Johns Hopkins University Applied Physics Laboratory has developed an automated, rapid, and secure method for sanitizing servers, switches, and routers. This sanitization method was implemented and tested on several different types of network devices during the Cyber Measurement & Analysis Center project, which was funded under Phases I and II of the DARPA National Cyber Range program. The performance of the automated sanitization system was excellent, with an order-of-magnitude reduction in the time required to sanitize servers, routers, and switches, and a significant improvement in the effectiveness of the sanitization process through the addition of persistent malware removal.
75

"Measuring Operational Effectiveness of Information Technology Infrastructure Library (ITIL) and the Impact of Critical Facilities Inclusion in the Process."

Woodell, Eric A. 31 May 2014 (has links)
Information Technology (IT) professionals use the Information Technology Infrastructure Library (ITIL) process to better manage their business operations, measure performance, improve reliability, and lower costs. This study compared the operational results of data centers that use ITIL against those that do not, and examined whether the results change when traditional facilities engineers are included in the process. Overall, the IT departments using ITIL processes showed no statistically significant improvements compared to those that do not. Inclusion of Critical Facilities (CF) personnel in the framework, however, offered a statistically significant improvement in the overall reliability of the data centers. IT departments that use ITIL but do not include CF personnel in the framework have a slightly lower level of reliability than those that do not use the ITIL processes at all.
76

An investigation of data privacy and utility using machine learning as a gauge

Mivule, Kato 18 June 2014 (has links)
The purpose of this investigation is to study and pursue a user-defined approach to preserving data privacy while maintaining an acceptable level of data utility, using machine learning classification techniques as a gauge in the generation of synthetic data sets. This dissertation deals with data privacy, data utility, machine learning classification, and the generation of synthetic data sets; data privacy and utility preservation using machine learning classification as a gauge is the central focus of this study. Many organizations that transact in large amounts of data have to comply with state, federal, and international laws to guarantee that the privacy of individuals and other sensitive data is not compromised. Yet at some point during the data privacy process, data loses its utility, a measure of how useful a privatized dataset is to the user of that dataset. Data privacy researchers have documented that attaining an optimal balance between data privacy and utility is an NP-hard challenge, and thus an intractable problem. Therefore, we propose the classification error gauge (x-CEG) approach, a data utility quantification concept that employs machine learning classification techniques to gauge data utility based on the classification error. In the initial phase of this approach, a data privacy algorithm such as differential privacy, Gaussian noise addition, generalization, or k-anonymity is applied to a dataset for confidentiality, generating a privatized synthetic data set. The privatized synthetic data set is then passed through a machine learning classifier, after which the classification error is measured. If the classification error is lower than or equal to a set threshold, then better utility might be achieved; otherwise, the data privacy parameters are adjusted and the refined synthetic data set is sent to the machine learning classifier again, and the process repeats until the error threshold is reached. Additionally, this study presents the Comparative x-CEG concept, in which a privatized synthetic data set is passed through a series of classifiers, each of which returns a classification error, and the classifier with the lowest classification error after parameter adjustment is chosen, an indication of better data utility. Preliminary results from this investigation show that fine-tuning the parameters of data privacy procedures (for example, in differential privacy) and increasing the number of weak learners in an ensemble classifier might lead to lower classification error and thus better utility. Furthermore, this study explores the application of this approach by employing signal processing techniques in the generation of privatized synthetic data sets and improving data utility. This dissertation presents theoretical and empirical work examining various data privacy and utility methodologies using machine learning classification as a gauge. Similarly, this study presents a resourceful approach to the generation of privatized synthetic data sets and an innovative conceptual framework for the data privacy engineering process.
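
The x-CEG loop described above lends itself to a short sketch. The code below is an illustrative reading of that loop, assuming Gaussian noise addition as the privacy step and a random forest as the gauging classifier; the function names, noise parameter, shrink factor, and error threshold are all assumptions, not the dissertation's implementation.

```python
# Illustrative sketch of the x-CEG idea: privatize a dataset, gauge utility
# via a classifier's error, and adjust the privacy parameter until the error
# falls below a threshold. All names and values here are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def privatize_gaussian(X, sigma, rng):
    """Gaussian noise addition as a stand-in for the privacy step."""
    return X + rng.normal(0.0, sigma, size=X.shape)

def classification_error(X_priv, y):
    """Train a classifier on the privatized data and return its test error."""
    X_tr, X_te, y_tr, y_te = train_test_split(X_priv, y, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    return 1.0 - clf.score(X_te, y_te)

def x_ceg(X, y, sigma=1.0, threshold=0.15, shrink=0.8, max_iter=20, seed=0):
    """Repeat privatize -> classify -> measure error until the threshold is met."""
    rng = np.random.default_rng(seed)
    for _ in range(max_iter):
        X_priv = privatize_gaussian(X, sigma, rng)
        err = classification_error(X_priv, y)
        if err <= threshold:      # acceptable utility reached
            return X_priv, sigma, err
        sigma *= shrink           # relax the privacy parameter and try again
    return X_priv, sigma, err     # best effort after max_iter adjustments
```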
77

Awareness of malicious social engineering among Facebook users

Slonka, Kevin J. 18 June 2014 (has links)
With the rapid growth of Facebook, the social networking website is becoming a lucrative target for malicious activity. Users of Facebook should therefore be aware of the various malicious attacks and know how to identify them. This research analyzed Facebook users' level of understanding in the domain of malicious social engineering on Facebook. The research examined differences in awareness among multiple generational groups; secondary research questions focused on how factors such as age, gender, education, Internet usage, and trust affected users' awareness of malicious activity. Results suggest that the Baby Boomer generation is the least aware of malicious social engineering tactics on Facebook, specifically with regard to the donation scam category. In addition, education level and educational background are significantly associated with awareness. These findings indicate a need for future work to gain a deeper understanding of Facebook users' awareness of malicious social engineering and to develop targeted training to increase that awareness.
78

IslaNet: An isolated and secure approach to implementing computer networking laboratories

Cruz-Zendejas, Rogelio 23 April 2014 (has links)
The SIGITE Computing Curricula suggest that a hands-on laboratory component is essential in teaching networking courses. However, there are drawbacks and limitations, including the high costs of implementation and maintenance, security risks to the campus network, and a limited number of practical guides that cover both design and implementation. Furthermore, with the advancement of other approaches such as virtualization and simulation, it has become increasingly difficult to justify funding a hands-on laboratory.

IslaNet is an isolated and secure approach to implementing computer networking laboratories that produces a low-cost model focused on hands-on implementation. IslaNet uses components from other approaches to mitigate, or in some cases completely eliminate, the risks and deficiencies that traditional hands-on laboratories introduce. The laboratory objectives are derived from the SIGITE Computing Curriculum to provide a solid, well-developed foundation. IslaNet addresses concept, design, and implementation using a unique multi-layer approach.
79

Voice activated personal assistant: Privacy concerns in the public space

Easwara Moorthy, Aarthi 23 April 2014 (has links)
No description available.
80

Validating the OCTAVE Allegro Information Systems Risk Assessment Methodology: A Case Study

Keating, Corland G. 22 March 2014 (has links)
An information system (IS) risk assessment is an important part of any successful security management strategy. Risk assessments help organizations identify mission-critical IS assets and prioritize risk mitigation efforts. Many risk assessment methodologies, however, are complex and can only be completed successfully by highly qualified and experienced security experts. Small-sized organizations, including small-sized colleges and universities, are challenged to conduct a risk assessment because of their financial constraints and lack of IS security expertise. Therefore, most small-sized colleges and universities do not perform IS risk assessments, which leaves the institution's data vulnerable to security incursions. The negative consequences of a security breach at these institutions can include a decline in the institution's reputation, loss of financial revenue, and exposure to lawsuits.

The goal of this research is to address the challenge of conducting IS risk assessments in small-sized colleges and universities by validating the use of the Operationally Critical Threat, Asset, and Vulnerability Evaluation (OCTAVE) Allegro risk assessment methodology at a small-sized university. OCTAVE Allegro is a streamlined risk assessment method created by Carnegie Mellon University's Software Engineering Institute. It can provide robust risk assessment results with a relatively small investment in time and resources, even for organizations that do not have extensive risk management expertise.

The successful use of OCTAVE Allegro was validated through a case study that documented the process and outcome of conducting a risk assessment at George Fox University (GFU), a small, private university located in Newberg, Oregon. GFU has the typical constraints of other small-sized universities: a relatively small information technology staff with limited expertise in conducting IS risk assessments and no dedicated IS risk manager. Nevertheless, OCTAVE Allegro was relatively easy for GFU staff to understand, provided GFU with the ability to document the security requirements of its IS assets, helped to identify and evaluate IS security concerns, and provided an objective way to prioritize IS security projects. Thus, this research validates that OCTAVE Allegro is an appropriate and effective IS risk assessment method for small-sized colleges and universities.
