71 |
Telemetry Post-Processing in the Clouds: A Data Security Challenge / Kalibjian, J. R. 10 1900 (has links)
ITC/USA 2011 Conference Proceedings / The Forty-Seventh Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2011 / Bally's Las Vegas, Las Vegas, Nevada / As organizations move toward cloud [1] computing environments, data security challenges will begin to take precedence over network security issues. This will potentially impact telemetry post-processing in a myriad of ways. After a review of how data security tools like Enterprise Rights Management (ERM), Enterprise Key Management (EKM), Data Loss Prevention (DLP), Database Activity Monitoring (DAM), and tokenization are impacting cloud security, their effect on telemetry post-processing will be examined. An architecture will be described detailing how these data security tools can be utilized to make telemetry post-processing environments in the cloud more robust.
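Tokenization, one of the tools the abstract lists, swaps a sensitive field for a random surrogate and keeps the real value in a separate vault. A minimal Python sketch follows; the `TokenVault` class and the example telemetry record are hypothetical illustrations, not taken from the paper:

```python
import secrets

class TokenVault:
    """Toy token vault: maps random tokens back to the sensitive values
    they replace. In a real deployment this would be a hardened service."""

    def __init__(self):
        self._store = {}

    def tokenize(self, value: str) -> str:
        # Issue a random surrogate with no mathematical relation to the value.
        token = secrets.token_hex(8)
        self._store[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Only callers with vault access can recover the original value.
        return self._store[token]

vault = TokenVault()
record = {"channel": "IRIG-106", "operator_ssn": "123-45-6789"}
# A record leaving the trusted zone carries the token, not the SSN.
safe_record = dict(record, operator_ssn=vault.tokenize(record["operator_ssn"]))
assert safe_record["operator_ssn"] != record["operator_ssn"]
assert vault.detokenize(safe_record["operator_ssn"]) == "123-45-6789"
```

Unlike encryption, the surrogate carries no key material, so a cloud-side post-processing environment that only ever sees tokens has nothing to decrypt.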
|
72 |
Nová média shromažďující informace o svém publiku a vztah uživatelů k bezpečnosti dat: kvalitativní studie / New media gathering users data and the attitude of users towards internet security: qualitative study / Laube, David, January 2015 (has links)
The theoretical part of the thesis analyzes the topic of new media and how they work with the privacy of their users. Using the examples of applications such as Facebook and Google services, I point to the intensive and extensive kinds of private information that are stored on providers' servers. These data are not just gathered, but also analyzed and evaluated. Private companies use their users' data to an extent never seen before. New media and their activities raise new questions about the possible misuse of such data. In this thesis I mention a few examples related to the topic of privacy and personal data protection. In the practical part I use the tools of qualitative research to explore how the issue of online privacy and data security is perceived by different user groups and how they explain their behavior. I examined whether privacy is an important issue for them and whether their online activity is regulated or restricted in this respect. For the research I chose two groups of respondents: younger users up to 37 years of age and older users aged 55 and over. I gathered information from the respondents through semi-structured interviews, which I then analyzed to draw conclusions.
|
73 |
Access control and inference problem in data integration systems / Problème d'inférence et contrôle d'accès dans les systèmes d'intégration de donnéesHaddad, Mehdi 01 December 2014 (has links)
In this thesis we are interested in controlling access to a data integration system. In a data integration system, a mediator is defined. This mediator aims to provide a unique entry point to several heterogeneous sources. In this kind of architecture, security aspects, and access control in particular, represent a major challenge. Indeed, every source, designed independently of the others, defines its own access control policy. The problem is then: "How to define a representative policy at the mediator level that preserves the sources' policies?" Preserving the sources' policies means that an access prohibited at the source level should also be prohibited at the mediator level. The mediator's policy also needs to protect data against indirect accesses. An indirect access occurs when one can synthesize sensitive information from a combination of non-sensitive information and semantic constraints. Detecting all indirect accesses in a given system is referred to as the inference problem. In this manuscript, we propose an incremental methodology able to tackle the inference problem in a data integration context. This methodology has three phases. The first phase, the propagation phase, combines source policies and thereby generates a preliminary policy at the mediator level. The second phase, the detection phase, characterizes the role of semantic constraints in inducing inferences about sensitive information. In this phase we also introduce a graph-based approach able to enumerate all indirect accesses that could lead to sensitive information. To deal with the detected indirect accesses, we introduce the reconfiguration phase, which provides two solutions: the first can be implemented at design time, the second at runtime.
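The detection phase's graph-based enumeration of indirect accesses can be pictured as a reachability check over semantic links. The sketch below is an illustrative assumption, not the thesis's actual model: the attribute names and the `semantic_links` table are invented for the example.

```python
from collections import deque

# Hypothetical semantic links at the mediator level: an edge (a, b) means
# that knowing attribute a, together with a known constraint, lets a user
# infer attribute b.
semantic_links = {
    "department": ["ward"],
    "ward": ["diagnosis_class"],       # the ward strongly implies a class
    "diagnosis_class": ["diagnosis"],  # the class narrows to the diagnosis
}

def indirect_accesses(readable, sensitive):
    """Return the sensitive attributes reachable by chaining semantic
    links from the attributes the user may read directly (BFS)."""
    seen, queue = set(readable), deque(readable)
    while queue:
        attr = queue.popleft()
        for nxt in semantic_links.get(attr, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen & set(sensitive)

# A user allowed to read only 'department' can still infer 'diagnosis',
# so the preliminary mediator policy must be reconfigured.
leaks = indirect_accesses(readable={"department"}, sensitive={"diagnosis"})
assert leaks == {"diagnosis"}
```

In this toy form, the reconfiguration phase amounts to either cutting an edge on every such path (the design-time solution) or denying the final query of an inference channel at execution time.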
|
74 |
Análise de segurança em criptografia e esteganografia em sequências de imagens / Analysis of the cryptography security and steganography in images sequences / Oliveira, Fábio Borges de, 14 February 2007 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / Information security is considered of great importance to private and governmental institutions. For this reason, we opted to conduct a study of security in this dissertation. We start with an introduction to information theory, then propose a new kind of Perfect Secrecy cryptographic scheme, and finally present a study of steganography in an image sequence, in which we suggest a more aggressive steganography in the coefficients of the discrete cosine transform.
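Embedding message bits in quantized DCT coefficients is the general family this dissertation's steganography belongss to. Below is a JSteg-style toy on a hand-made coefficient list, not the authors' "more aggressive" scheme; the helper names and the sample block are our own:

```python
def embed(coeffs, bits):
    """Embed message bits in the LSBs of quantized DCT coefficients,
    skipping coefficients equal to 0 or 1, as classic JSteg-style
    schemes do to limit visible distortion."""
    out, i = [], 0
    for c in coeffs:
        if i < len(bits) and c not in (0, 1):
            # Overwrite the LSB of the magnitude, preserving the sign.
            c = (c & ~1) | bits[i] if c > 0 else -(((-c) & ~1) | bits[i])
            i += 1
        out.append(c)
    if i < len(bits):
        raise ValueError("cover too small for message")
    return out

def extract(coeffs, n):
    """Read n message bits back out of the usable coefficients."""
    bits = []
    for c in coeffs:
        if c not in (0, 1):
            bits.append(abs(c) & 1)
            if len(bits) == n:
                break
    return bits

cover = [12, -7, 0, 3, 1, -4, 9, 2]   # toy quantized DCT block
msg = [1, 0, 1, 1, 0]
stego = embed(cover, msg)
assert extract(stego, len(msg)) == msg
```

A "more aggressive" variant in the abstract's sense would also write into coefficients that this sketch skips, trading capacity against detectability.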
|
75 |
Ontological lockdown assessment : a thesis presented in partial fulfilment of the requirements for the degree of Master of Science in Information Technology at Massey University, Palmerston North, New Zealand / Steele, Aaron, January 2008 (has links)
In order to keep shared access computers secure and stable, system administrators resort to locking down the computing environment to prevent intentional and unintentional damage by users. Skilled attackers are often able to break out of locked down computing environments and intentionally misuse shared access computers. This misuse has resulted in cases of mass identity theft and fraud, some with estimated costs ranging into the millions. In order to determine whether it is possible to break out of locked down computing environments, an assessment method is required. Although a number of vulnerability assessment techniques exist, none of the existing techniques is sufficient for assessing locked down shared access computers. This is because the existing techniques focus on traditional, application-specific software vulnerabilities. Break out path vulnerabilities (which are exploited by attackers in order to break out of locked down environments) differ substantially from traditional vulnerabilities, and as a consequence are not easily discovered using existing techniques. Ontologies can be thought of as a modelling technique for capturing expert knowledge about a domain of interest. The method for discovering break out paths in locked down computers can be considered expert knowledge in the domain of shared access computer security. This research proposes an ontology based assessment process for discovering break out path vulnerabilities in locked down shared access computers. The proposed approach is called the ontological lockdown assessment process. The ontological lockdown assessment process is implemented against a real world system and successfully identifies numerous break out path vulnerabilities.
|
76 |
Access Control Administration with Adjustable Decentralization / Chinaei, Amir Hossein, 22 August 2007 (has links)
Access control is a key function of enterprises that preserve and propagate massive data. Access control enforcement and administration are two major components of the system. On one hand, enterprises are responsible for data security; thus, consistent and reliable access control enforcement is necessary although the data may be distributed. On the other hand, data often belongs to several organizational units with various access control policies and many users; therefore, decentralized administration is needed to accommodate diverse access control needs and to avoid the central bottleneck. Yet, the required degree of decentralization varies within different organizations: some organizations may require a powerful administrator in the system; whereas, some others may prefer a self-governing setting in which no central administrator exists, but users fully manage their own data. Hence, a single system with adjustable decentralization will be useful for supporting various (de)centralized models within the spectrum of access control administration.
Giving individual users the ability to delegate or grant privileges is a means of decentralizing access control administration. Revocation of arbitrary privileges is a means of retaining control over data. To provide flexible administration, the ability to delegate a specific privilege and the ability to revoke it should be held independently of each other and independently of the privilege itself. Moreover, supporting arbitrary user and data hierarchies, fine-grained access control, and protection of both data (end objects) and metadata (access control data) with a single uniform model will provide the most widely deployable access control system.
Conflict resolution is a major aspect of access control administration in systems. Resolving access conflicts when deriving effective privileges from explicit ones is a challenging problem in the presence of both positive and negative privileges, sophisticated data hierarchies, and diversity of conflict resolution strategies.
This thesis presents a uniform access control administration model with adjustable decentralization, to protect both data and metadata. There are several contributions in this work. First, we present a novel mechanism to constrain access control administration for each object type at object creation time, as a means of adjusting the degree of decentralization for the object when the system is configured. Second, by controlling the access control metadata with the same mechanism that controls the users’ data, privileges can be granted and revoked to the extent that these actions conform to the corporation’s access control policy. Thus, this model supports a whole spectrum of access control administration, in which each model is characterized as a network of access control states, similar to a finite state automaton. The model depends on a hierarchy of access banks of authorizations which is supported by a formal semantics. Within this framework, we also introduce the self-governance property in the context of access control, and show how the model facilitates it. In particular, using this model, we introduce a conflict-free and decentralized access control administration model in which all users are able to retain complete control over their own data while they are also able to delegate any subset of their privileges to other users or user groups. We also introduce two measures to compare any two access control models in terms of the degrees of decentralization and interpretation. Finally, as the conflict resolution component of access control models, we incorporate a unified algorithm to resolve access conflicts by simultaneously supporting several combined strategies.
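The conflict-resolution problem described above, deriving an effective privilege from explicit positive and negative privileges over a data hierarchy under a chosen strategy, can be illustrated with a small sketch. The hierarchy, the authorizations, and the two strategy names are hypothetical examples, not the thesis's unified algorithm:

```python
# Hypothetical object hierarchy: child -> parent.
hierarchy = {"/db/hr/salaries": "/db/hr", "/db/hr": "/db"}

# Explicit authorizations: (subject, object) -> +1 (grant) or -1 (deny).
auths = {
    ("alice", "/db"): +1,              # broad grant near the root
    ("alice", "/db/hr/salaries"): -1,  # specific denial on one object
}

def effective(subject, obj, strategy="most_specific"):
    """Derive the effective privilege by walking up the hierarchy.
    'most_specific': the explicit authorization nearest the object wins.
    'denials_take_precedence': any denial on the path wins.
    Returns +1, -1, or 0 when nothing applies."""
    signs = []
    node = obj
    while node is not None:
        if (subject, node) in auths:
            signs.append(auths[(subject, node)])
        node = hierarchy.get(node)
    if not signs:
        return 0
    if strategy == "denials_take_precedence":
        return -1 if -1 in signs else +1
    return signs[0]  # bottom-up walk, so the first sign is the most specific

assert effective("alice", "/db/hr/salaries") == -1  # denial overrides grant
assert effective("alice", "/db/hr") == +1           # inherited from /db
```

A unified resolver in the thesis's sense would parameterize this walk over several such strategies at once rather than picking one per call.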
|
78 |
Lietuvos valstybės institucijų privatumo politika internete / The Policy of Privacy of Lithuanian State Institutions in the Internet / Gedgaudas, Andrius, 19 December 2006 (has links)
This work discusses state institutions' privacy policies on the internet. It also analyzes court practice in cases related to violations of personal data security in state institutions, discusses the new draft of the Personal Data Law, and offers suggestions on how to solve problems related to personal data security in state institutions.
The regulation of personal data is one of the most important social phenomena of our time. Most state institutions use, or plan to create and administer, computerized systems for the accumulation and transmission of personal data. Hospitals and other health supervision institutions accumulate and process patients' personal data. Tax administration institutions collect information about residents' income and govern huge personal databases, which systematize information not only about residents' income but also about their workplace, family status, and so on. Recent practice shows that personal data concerning a resident's private life have become a commodity that helps commercial structures increase profit. Understanding the value of such personal information, many commercial subjects are inclined to neglect the demand to honor the individual right to privacy. That is why state institutions that manage personal databases must ensure their security.
The work consists of preface, three chapters, which are divided into smaller sections, and conclusions, the list of literature and... [to full text]
|
79 |
Forensic computing : a deterministic model for validation and verification through an ontological examination of forensic functions and processes / Beckett, Jason, January 2010 (has links)
This dissertation contextualises the forensic computing domain in terms of the validation of tools and processes. It explores the current state of forensic computing, comparing it to the traditional forensic sciences. The research then develops a classification system for the discipline's functions, establishing the extensible base upon which a validation system is developed. / Thesis (PhD)--University of South Australia, 2010
|