911 |
Rethinking emancipation in critical security studies / Nunes, João January 2010 (has links)
Since the end of the Cold War, there has been a comprehensive challenge to dominant conceptions in Security Studies. Security has been approached as a political phenomenon, resulting from political assumptions and having political effects. The politicization of security has been pursued by a number of so-called ‘critical approaches,’ including ‘security as emancipation.’ The latter argues that security consists in removing or alleviating constraints upon the lives of individuals and groups – such as poverty, ill health, or lack of education. This thesis asks two questions: firstly, can the ‘security as emancipation’ approach, in its current formulation, deliver on its claims and promises in the context of the effort of politicization in Security Studies? And secondly, if weaknesses are shown, in what ways can the analytical and normative outlook of security as emancipation be strengthened through an engagement with other resources in the literature? Chapters 1 and 2 establish the context in which the merits of security as emancipation must be judged. They conclude that an engagement with this approach must focus on the way it conceives the multiple connections between security and politics. Chapters 3, 4 and 5 pursue this insight by focusing on the notions of reality, threat and power respectively. In each of these themes, the argument identifies gaps in security as emancipation and suggests theoretical reconsiderations based on an engagement with approaches and ideas – in the critical security literature and in social and political theory – that have so far been neglected or insufficiently examined by this approach. This thesis aims to re-establish security as emancipation as a valid interlocutor within critical debates about security. It also aims to show that dialogue between critical approaches is not only possible but also beneficial to understanding the politicization of security.
|
912 |
IS security leveraging the concept of knowledge management / Neville, Karen M. January 2010 (has links)
IS Security (ISS) has become a key element of business risk management and can itself create competitive advantage. Organisations therefore seek practical approaches to protect the operation of the business. Protecting the functionality of an organisation is a difficult task, but it is the responsibility of both senior management and ISS functions to do so. An analysis of the ISS literature reveals a paucity of research on ISS management, and a need for research that develops a holistic model for managing ISS knowledge to overcome the ever-increasing number of negative security incidents. The ISS research community is constrained by small-scale technical questions, while the social aspects of ISS are ignored, resulting in fragmented research across the IS field. Several possible methods are scattered throughout the literature, but they focus on the development of information systems. ISS professionals require a range of skills encompassing business knowledge, legal awareness, and organisational processes as well as technical security knowledge. Research to date has failed to provide an integrated approach to managing ISS knowledge. This study investigates how ISS could leverage the concept of knowledge management (KM). It proposes a theoretical model derived from the ISS and KM literatures. To address this gap in research, the study adopts an exploratory, interpretive, holistic case study approach using interviews and document analysis as data-gathering methods. The study focuses on the relationship between ISS and KM and the proposed benefits that an ISS KM initiative would produce. An analysis of the approaches used by these specialised structures in managing knowledge within and across the two case studies facilitated the development of an integrated model. The interplay between the functions provided rich descriptions of the approaches used to manage knowledge. This research builds on previous studies documented in the ISS literature by providing a much-needed model against which practitioners may diagnose problems, plan action and implement solutions. ISS models and standards today do not exhibit much flexibility, so managers make ISS decisions in a vacuum. ISS problems can be managed or reduced when the ISS functions and management are aware of the full range of controls available and implement the most effective ones. Unfortunately, they often lack this knowledge, and their subsequent actions to cope with threats are less effective. The focus of ISS research to date has been technical and grounded in positivism, and few, if any, studies utilise a qualitative approach, thereby precluding holistic, in-depth, rich descriptions of core issues within the field. Comparatively little work has taken a managerial point of view, covering broad organisational and social issues. This study acknowledges these issues and provides a solid conceptual foundation for future studies on ISS by answering calls for a theoretical model to guide research in the area. The study also identifies the positive and negative impacts of compliance and describes how organisations can apply the model to overcome the negative effects.
|
913 |
General Deterrence Theory: Assessing Information Systems Security Effectiveness in Large versus Small Businesses / Schuessler, Joseph H. 05 1900 (has links)
This research sought to shed light on information systems security (ISS) by conceptualizing an organization's use of countermeasures through general deterrence theory, positing a non-recursive relationship between threats and countermeasures, and extending the ISS construct developed in prior research. Industry affiliation and organizational size are considered in terms of differences in the threats that firms face, the different countermeasures in use by various firms, and ultimately, how a firm's ISS effectiveness is affected. Six information systems professionals were interviewed in order to develop the instruments necessary to assess the research model; the final instrument was refined through pilot testing intended to clarify its wording and layout. Finally, members of the Association of Information Technology Professionals were surveyed using an online survey. The model was assessed using SmartPLS and a two-stage least squares analysis. Results indicate that a non-recursive relationship does indeed exist between threats and countermeasures and that general deterrence theory can be used to effectively frame an organization's use of countermeasures. Implications for practitioners include the ability to target the use of certain countermeasures to have desired effects on both ISS effectiveness and future threats. Additionally, the model put forth in this research can be used by practitioners both to assess their current ISS effectiveness and to prescriptively target desired levels of ISS effectiveness.
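To make the estimation strategy concrete, here is a minimal sketch, assuming synthetic data and hypothetical instruments (organizational size and industry risk), of how a reciprocal, non-recursive relationship between threats and countermeasures can be estimated with two-stage least squares. It is illustrative only and is not the dissertation's model, measures or data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical exogenous instruments (illustrative assumptions):
# industry_risk shifts threats, org_size shifts countermeasures.
industry_risk = rng.normal(size=n)
org_size = rng.normal(size=n)

# Synthetic feedback system in which each construct influences the other.
threats = 0.6 * industry_risk + rng.normal(size=n)
countermeasures = 0.5 * org_size + 0.4 * threats + rng.normal(size=n)
threats = threats + 0.3 * countermeasures

def two_sls(y, endog, instrument):
    """Two-stage least squares with one endogenous regressor: stage 1 regresses
    the regressor on the instrument; stage 2 regresses y on the fitted values.
    Returns the estimated coefficient on the instrumented regressor."""
    Z = np.column_stack([np.ones(len(y)), instrument])
    endog_hat = Z @ np.linalg.lstsq(Z, endog, rcond=None)[0]
    X = np.column_stack([np.ones(len(y)), endog_hat])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

# Each direction of the reciprocal relationship uses the other side's instrument.
print("threats -> countermeasures:", round(two_sls(countermeasures, threats, industry_risk), 3))
print("countermeasures -> threats:", round(two_sls(threats, countermeasures, org_size), 3))
```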
|
914 |
The expected financing problems of the old age, survivors, and disability insurance, and possible solutions / Dettinger, Juergen Karl January 2010 (has links)
Typescript (photocopy). / Digitized by Kansas Correctional Industries
|
915 |
Security Benchmarking of Transactional Systems / Araujo Neto, Afonso Comba de January 2012 (has links)
Most organizations nowadays depend on some kind of computer infrastructure to manage business-critical activities. This dependence grows as computer systems become more reliable and useful, but so do the complexity and size of those systems. Transactional systems, which are database-centered applications used by most organizations to support daily tasks, are no exception. A typical solution to cope with systems complexity is to delegate the software development task, and to use existing solutions independently developed and maintained (either proprietary or open source). The multiplicity of software and component alternatives available has boosted the interest in suitable benchmarks, able to assist in the selection of the best candidate solutions with respect to several attributes. However, the huge success of performance and dependability benchmarking markedly contrasts with the small advances on security benchmarking, which has only sparsely been studied in the past. This thesis discusses the security benchmarking problem and its main characteristics, particularly comparing them with other successful benchmarking initiatives, like performance and dependability benchmarking. Based on this analysis, a general framework for security benchmarking is proposed. This framework, suitable for most types of software systems and application domains, includes two main phases: security qualification and trustworthiness benchmarking. Security qualification is a process designed to evaluate the most obvious and identifiable security aspects of the system, dividing the evaluated targets into acceptable and unacceptable ones, given the specific security requirements of the application domain. Trustworthiness benchmarking, on the other hand, consists of an evaluation process that is applied over the qualified targets to estimate the probability of the existence of hidden or hard-to-detect security issues in a system (the main goal is to cope with the uncertainties related to security aspects). The framework is thoroughly demonstrated and evaluated in the context of transactional systems, which can be divided in two parts: the infrastructure and the business applications. As these parts have significantly different security goals, the framework is used to develop methodologies and approaches that fit their specific characteristics. First, the thesis proposes a security benchmark for transactional systems infrastructures and describes, discusses and justifies all the steps of the process. The benchmark is applied to four distinct real infrastructures, and the results of the assessment are thoroughly analyzed. Still in the context of transactional systems infrastructures, the thesis also addresses the problem of selecting software components. This is a complex problem, as the security of such an infrastructure cannot be fully evaluated before deployment. The proposed tool, aimed at helping in the selection of basic software packages to support the infrastructure, is used to evaluate seven different software packages, representative of alternatives typically used in real infrastructures. Finally, the thesis discusses the problem of designing trustworthiness benchmarks for business applications, focusing specifically on the case of web applications. First, a benchmarking approach based on static code analysis tools is proposed. Several experiments are presented to evaluate the effectiveness of the proposed metrics, including a representative experiment where the challenge was to select the most secure application among a set of seven web forums. Based on the analysis of the limitations of that approach, a generic approach for the definition of trustworthiness benchmarks for web applications is defined.
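As a rough illustration of the two-phase framework summarised above, the following sketch first applies a security-qualification filter and then ranks the qualified candidates by a trustworthiness score. The requirement list, indicators, weights and package names are hypothetical placeholders, not the criteria or results of the thesis.

```python
from dataclasses import dataclass, field

@dataclass
class Candidate:
    name: str
    requirements_met: dict = field(default_factory=dict)  # hypothetical evidence
    alert_density: float = 0.0          # e.g. static-analysis alerts per KLOC
    unsafe_defaults: int = 0            # insecure out-of-the-box settings

# Hypothetical domain requirements used for qualification.
REQUIRED = ["authentication", "access_control", "encrypted_connections", "auditing"]

def qualifies(c: Candidate) -> bool:
    """Phase 1 (security qualification): reject any candidate that fails an
    explicit, checkable security requirement of the application domain."""
    return all(c.requirements_met.get(r, False) for r in REQUIRED)

def trustworthiness(c: Candidate) -> float:
    """Phase 2 (trustworthiness benchmarking): a score intended to reflect the
    likelihood of hidden security issues; the weights are arbitrary placeholders."""
    return 1.0 / (1.0 + 0.5 * c.alert_density + 1.0 * c.unsafe_defaults)

def benchmark(candidates):
    """Qualify first, then rank the acceptable candidates by trustworthiness."""
    return sorted((c for c in candidates if qualifies(c)), key=trustworthiness, reverse=True)

candidates = [
    Candidate("pkg-A", {r: True for r in REQUIRED}, alert_density=2.1, unsafe_defaults=1),
    Candidate("pkg-B", {r: True for r in REQUIRED}, alert_density=0.7, unsafe_defaults=0),
    Candidate("pkg-C", {"authentication": True}),   # fails qualification
]
for c in benchmark(candidates):
    print(c.name, round(trustworthiness(c), 3))
```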
|
916 |
Analysis of security issues in cloud based e-learning / Kumar, Gunasekar, Chelikani, Anirudh January 2011 (has links)
Cloud-based e-learning is one of the booming technologies in the IT field, bringing powerful e-learning products with the help of cloud computing. Cloud technology has numerous advantages over existing traditional e-learning systems, but at the same time security is a major concern in cloud-based e-learning. Security measures are therefore unavoidable to prevent the loss of users’ valuable data through security vulnerabilities. Cloud-based e-learning products also need to satisfy the security needs of customers and overcome the various security threats that target valuable data stored in cloud servers. The study therefore investigates the security issues involved in cloud-based e-learning technology, with the aim of suggesting solutions in the form of security measures and security management standards. These will help to overcome the security threats in cloud-based e-learning technology. To achieve this aim, we used theoretical and empirical studies. The empirical study is based on information gathered from the websites of various cloud-based e-learning solution vendors, and the theoretical study is based on text analysis of research articles related to our subject areas. Finally, the constant comparative method is used to compare the empirical findings with the facts discovered in our theoretical findings. These analyses lead to the identification of various security issues in cloud-based e-learning technology. / Program: Magisterutbildning i informatik
|
917 |
The formation of the National Security State : the State and the opposition in military Brazil / Alves, Maria Helena Moreira, 1944- January 1982 (has links)
Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Political Science, 1982. / MICROFICHE COPY AVAILABLE IN ARCHIVES AND DEWEY. / Includes bibliographical references. / by Maria Helena Moreira Alves. / Ph.D.
|
918 |
Trust- and clustering-based authentication service in MANET. January 2004 (has links)
Ngai Cheuk Han. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2004. / Includes bibliographical references (leaves 110-117). / Abstracts in English and Chinese. / Abstract --- p.i / Acknowledgement --- p.iv / Chapter 1 --- Introduction --- p.1 / Chapter 2 --- Background Study --- p.5 / Chapter 2.1 --- Mobile Ad Hoc Networks --- p.5 / Chapter 2.1.1 --- Definition --- p.5 / Chapter 2.1.2 --- Characteristics --- p.5 / Chapter 2.1.3 --- Applications --- p.6 / Chapter 2.1.4 --- Standards --- p.7 / Chapter 2.1.5 --- Routing Protocols --- p.8 / Chapter 2.2 --- Security in Mobile Ad Hoc Networks --- p.11 / Chapter 2.2.1 --- Vulnerabilities --- p.11 / Chapter 2.2.2 --- Motivation for the Attacks --- p.12 / Chapter 2.2.3 --- Types of Attacks --- p.13 / Chapter 2.3 --- Cryptography --- p.13 / Chapter 2.3.1 --- Cryptographic goals --- p.13 / Chapter 2.3.2 --- Symmetric-key encryption --- p.14 / Chapter 2.3.3 --- Asymmetric-key encryption --- p.14 / Chapter 2.3.4 --- Digital Signatures --- p.15 / Chapter 2.3.5 --- Digital Certificates --- p.15 / Chapter 2.3.6 --- Certificate Authority --- p.16 / Chapter 2.4 --- Literature Review --- p.17 / Chapter 3 --- Related Work --- p.23 / Chapter 4 --- Architecture and Models --- p.26 / Chapter 4.1 --- Architecture of the Authentication Service --- p.26 / Chapter 4.2 --- The Network Model --- p.28 / Chapter 4.2.1 --- Clustering-Based Structure --- p.31 / Chapter 4.2.2 --- Clusterhead Selection Criteria and Role --- p.33 / Chapter 4.3 --- The Trust Model --- p.37 / Chapter 4.3.1 --- Direct Trust --- p.40 / Chapter 4.3.2 --- Recommendation Trust --- p.41 / Chapter 4.3.3 --- Deriving Direct Trust --- p.41 / Chapter 5 --- Trust- and Clustering-Based Authentication Service --- p.43 / Chapter 5.1 --- Clustering Structure Formation and Maintenance --- p.43 / Chapter 5.1.1 --- Clustering Structure Formation --- p.43 / Chapter 5.1.2 --- Network Maintenance --- p.45 / Chapter 5.2 --- Security Operations --- p.50 / Chapter 5.2.1 --- Public Key Certification --- p.51 / Chapter 5.2.2 --- Identification of Malicious Nodes --- p.55 / Chapter 5.2.3 --- Trust Value Update --- p.58 / Chapter 5.3 --- Special Scenarios --- p.60 / Chapter 5.3.1 --- Join the network --- p.60 / Chapter 5.3.2 --- Move to another cluster --- p.61 / Chapter 5.3.3 --- Not Enough Introducer --- p.62 / Chapter 6 --- Simulations and Results --- p.65 / Chapter 6.1 --- Authentication Service Based on Trust and Network Models --- p.65 / Chapter 6.1.1 --- Experiments Set-Up --- p.65 / Chapter 6.1.2 --- Simulation Results --- p.67 / Chapter 6.2 --- Clusters Formation and Maintenance --- p.85 / Chapter 6.2.1 --- Experiments Set-Up --- p.85 / Chapter 6.2.2 --- Simulation Results --- p.86 / Chapter 6.3 --- Authentication Service Based on Trust and Network Models with Clusters Formation and Maintenance --- p.91 / Chapter 6.3.1 --- Experiments Set-Up --- p.91 / Chapter 6.3.2 --- Simulation Results --- p.94 / Chapter 7 --- Conclusion --- p.108 / Bibliography --- p.117
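As a rough sketch of the kind of decision such a trust- and clustering-based service makes, the code below combines a node's direct trust in a target with recommendations from introducing nodes before accepting a public-key certification request. The weighting scheme, the threshold and the example values are illustrative assumptions rather than the thesis's exact trust model.

```python
def recommendation_trust(opinions):
    """Aggregate introducers' opinions about a target node, weighting each
    opinion by how much we directly trust that introducer. `opinions` is a
    list of (trust_in_introducer, introducer_trust_in_target) pairs in [0, 1]."""
    if not opinions:
        return 0.0
    weighted = sum(t_intro * t_target for t_intro, t_target in opinions)
    return weighted / sum(t_intro for t_intro, _ in opinions)

def combined_trust(direct, opinions, w_direct=0.6):
    """Blend direct observation with recommendations; the 0.6/0.4 split is an
    illustrative assumption."""
    return w_direct * direct + (1.0 - w_direct) * recommendation_trust(opinions)

def certify(direct, opinions, threshold=0.5):
    """Accept a public-key certification request only if the combined trust
    value for the target node clears the threshold."""
    return combined_trust(direct, opinions) >= threshold

# Example: weak direct evidence, but two reasonably trusted introducers vouch strongly.
print(certify(direct=0.3, opinions=[(0.8, 0.9), (0.6, 0.7)]))   # True
```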
|
919 |
H.264 encoded digital video protection using temporal redundancy LSB steganography / Mitchell, Scott January 2018 (has links)
A steganographic method was developed based on the temporal redundancies present in digital video streams; these redundancies are exploited by the H.264 encoding standard to reduce the bandwidth requirements of a digital video stream while maintaining content quality. Here, the temporal redundancies are used to steganographically embed unique binary data within the digital video stream, producing an embedding strategy that is unique to each stream while targeting areas that reduce the potential data loss incurred during the H.264 encoding process. The effectiveness of the developed steganographic method is measured using the common steganographic metrics of Payload Capacity, Embedded Data Robustness and Media Impact. The results illustrate that compensating for mutative factors in the embedding process using temporal redundancies results in a more robust method of data embedding within digital video encoded using H.264.
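The following is a minimal sketch of the LSB-embedding idea, operating on a plain byte array that stands in for values taken from temporally redundant regions of a video frame. It is illustrative only; it does not model the H.264 encoding pipeline or the loss-compensation strategy that the thesis develops.

```python
import numpy as np

def embed_lsb(carrier: np.ndarray, payload: bytes) -> np.ndarray:
    """Embed payload bits in the least significant bit of each carrier byte.
    The carrier is assumed (for illustration) to hold values drawn from
    temporally redundant regions; a real method must also survive quantisation."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    if bits.size > carrier.size:
        raise ValueError("payload exceeds carrier capacity")
    stego = carrier.copy()
    stego[:bits.size] = (stego[:bits.size] & 0xFE) | bits
    return stego

def extract_lsb(stego: np.ndarray, n_bytes: int) -> bytes:
    """Recover n_bytes of payload from the least significant bits."""
    return np.packbits(stego[:n_bytes * 8] & 1).tobytes()

carrier = np.random.default_rng(1).integers(0, 256, size=4096, dtype=np.uint8)
message = b"unique marker"
stego = embed_lsb(carrier, message)
print(extract_lsb(stego, len(message)))   # b'unique marker'
```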
|
920 |
Resilient payment systems / Baqer, Khaled January 2018 (has links)
There have been decades of attempts to evolve or revolutionise the traditional financial system, but not all such efforts have been transformative or even successful. From Chaum's proposals in the 1980s for private payment systems to micropayments, previous attempts failed to take off for a variety of reasons, including non-existent markets and issues pertaining to usability, scalability and performance, resilience against failure, and complexity of protocols. Towards creating more resilient payment systems, we investigated issues related to security engineering in general, and payment systems in particular. We identified that gaps in network coverage, central points of failure, and attacks may cripple system performance. The premise of our research is that offline capabilities are required to produce resilience in critical systems. We focus on issues related to network problems and attacks, system resilience, and scalability by introducing the ability to process payments offline without relying on the availability of network coverage; a lack of network coverage renders some payment services unusable for their customers. Decentralising payment verification, and outsourcing some operations to users, alleviates the burden of contacting centralised systems to process every transaction. Our secondary goal is to minimise the cost of providing payment systems, so providers can cut transaction fees. Moreover, by decentralising payment verification that can be performed offline, we increase system resilience and can seamlessly maintain offline operations until a system is back online. We also use tamper-resistant hardware to tackle usability issues, by minimising cognitive overhead and helping users to correctly handle critical data, minimising the risks of data theft and tampering. We apply our research towards extending financial inclusion efforts, since the issues discussed above must be solved to extend mobile payments to the poorest demographics. More research is needed to integrate online payments, offline payments, and delay-tolerant networking. This research extends and enhances not only payment systems, but also other electronically-enabled services, from pay-as-you-go solar panels to agricultural subsidies and payments from aid donors. We hope that this thesis is helpful for researchers, protocol designers, and policy makers interested in creating resilient payment systems, by assisting them in financial inclusion efforts.
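The sketch below illustrates the general idea of offline-verifiable payments: a payer-signed message that a payee can verify and replay-check locally, without network access, and queue for later reconciliation. It assumes the third-party `cryptography` package, keeps the signing key in software purely for illustration (a deployment in the spirit of the thesis would use tamper-resistant hardware), and is not the protocol designed in the thesis.

```python
import json, time
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Illustrative only: in practice this key would live inside tamper-resistant hardware.
payer_key = Ed25519PrivateKey.generate()
payer_pub = payer_key.public_key()

def make_offline_payment(amount: int, payee: str, nonce: str):
    """The payer signs a payment message; no network connection is needed."""
    msg = json.dumps({"amount": amount, "payee": payee, "nonce": nonce,
                      "ts": int(time.time())}, sort_keys=True).encode()
    return msg, payer_key.sign(msg)

def payee_accepts(msg: bytes, sig: bytes, seen_nonces: set) -> bool:
    """The payee verifies the signature and rejects replayed nonces locally,
    then queues the accepted payment for reconciliation once back online."""
    try:
        payer_pub.verify(sig, msg)
    except InvalidSignature:
        return False
    nonce = json.loads(msg)["nonce"]
    if nonce in seen_nonces:        # replay / local double-spend attempt
        return False
    seen_nonces.add(nonce)
    return True

seen = set()
msg, sig = make_offline_payment(100, "merchant-42", nonce="a1b2c3")
print(payee_accepts(msg, sig, seen))   # True
print(payee_accepts(msg, sig, seen))   # False (replay rejected)
```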
|