371 |
Privacy-aware Use of Accountability Evidence. Reuben, Jenni, January 2017
This thesis deals with the evidence that enables accountability, the privacy risks involved in using it, and a privacy-aware solution to the problem of unauthorized evidence disclosure. Legal means to protect the privacy of an individual are anchored in the data-protection perspective, i.e., in the responsible collection and use of personal data. Accountability plays a crucial role in such legal privacy frameworks for assuring an individual’s privacy. In the European context, the accountability principle is pervasive in the measures mandated by the General Data Protection Regulation. In general, these measures are technically achieved through automated privacy audits. System traces that record system activities are the essential inputs to those automated audits. Nevertheless, the traces that enable accountability are themselves subject to privacy risks, because in most cases they inform about the processing of personal data. Therefore, ensuring the privacy of the accountability traces is as important as ensuring the privacy of the personal data itself. However, by and large, research involving accountability traces is concerned with storage, interoperability, and analytics challenges rather than with the privacy implications of processing them. This dissertation focuses both on the application of accountability evidence, such as in automated privacy audits, and on its privacy-aware use. The overall aim of the thesis is to provide a conceptual understanding of the privacy-compliance research domain and to contribute to solutions that promote privacy-aware use of the traces that enable accountability. To address the first part of the objective, a systematic study of the existing body of knowledge on automated privacy compliance is conducted. As a result, the state of the art is conceptualized as taxonomies.
The second part of the objective is accomplished through two results: first, a systematic understanding of the privacy challenges involved in processing the system traces is obtained; second, a model for privacy-aware access restrictions is proposed and formalized in order to prevent illegitimate access to the system traces. Access to accountability traces such as provenance is required for the automatic fulfillment of accountability obligations, but the traces themselves contain personally identifiable information; hence this thesis provides a solution to prevent unauthorized access to provenance traces. / This thesis deals with the evidence that enables accountability, the privacy risks involved in using it, and proposes a privacy-aware solution for preventing unauthorized evidence disclosure. Accountability plays a crucial role in legal privacy frameworks for assuring individuals’ privacy. In the European context, the accountability principle is pervasive in the measures mandated by the General Data Protection Regulation. In general, these measures are technically achieved through automated privacy audits. Traces that record system activities are the essential inputs to those audits. Nevertheless, such traces are themselves subject to privacy risks, because in most cases they inform about the processing of personal data. Therefore, ensuring the privacy of the traces is as important as ensuring the privacy of the personal data itself. The aim of the thesis is to provide a conceptual understanding of automated privacy-compliance research and to contribute to solutions that promote privacy-aware use of the accountability traces.
This is achieved in this dissertation through a systematic study of the existing body of knowledge on automated privacy compliance, a systematic analysis of the privacy challenges involved in processing the traces, and a proposal of a privacy-aware access control model for preventing illegitimate access to the traces.
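The access-restriction idea described above can be illustrated with a minimal sketch: a provenance record documents the processing of personal data, so only an authorized auditor role sees it in full, while other requesters receive a redacted view. The role name and field names below are illustrative assumptions, not the thesis's formal model.

```python
# Hypothetical sketch of privacy-aware access restriction on provenance
# traces: the full record is released only for accountability audits; all
# other roles get personal identifiers withheld. Field and role names are
# assumptions for illustration.
PERSONAL_FIELDS = {"data_subject", "operator_id"}

def view_record(record: dict, role: str) -> dict:
    """Return a role-appropriate view of one provenance record."""
    if role == "auditor":
        return dict(record)  # full trace, needed for the automated audit
    # Everyone else sees the processing metadata, not the identifiers.
    return {k: ("<redacted>" if k in PERSONAL_FIELDS else v)
            for k, v in record.items()}
```

A caller would filter each trace entry through `view_record` before release, so the same store can serve both audits and less-privileged analytics.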
|
372 |
Policy Merger System for P3P in a Cloud Aggregation Platform. Olurin, Olumuyiwa, 09 January 2013
The need for aggregating privacy policies is present in a variety of application areas today. In traditional client/server models, websites host services along with their policies in different private domains. However, on a cloud-computing platform where aggregators can merge multiple services, users often face complex decisions when choosing the right services from service providers. In this computing paradigm, the ability to aggregate policies as well as services will be useful and more effective for users who are privacy-conscious about their sensitive or personal information.
This thesis studies the problems associated with the Platform for Privacy Preferences (P3P) language and the current issues with communicating and understanding it. Furthermore, it discusses some efficient strategies and algorithms for the matching and merging processes, and then elaborates on privacy-policy conflicts that may occur after merging policies. Lastly, the thesis presents a tool for matching and merging P3P policies. If successful, the merge produces an aggregate policy that is consistent with the policies of all participating service providers.
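The matching-and-merging process can be sketched under a simplifying assumption: a policy maps each data category to the set of purposes for which the data may be used (P3P's actual vocabulary of statements, recipients, and retention is much richer). A merge succeeds for a category when the providers' purpose sets intersect; an empty intersection is the kind of conflict the thesis elaborates on.

```python
# Hypothetical simplification of P3P policy merging: keep only the purposes
# every participating provider allows for a category; an empty intersection
# is flagged as a conflict. Not the thesis's actual algorithm.
def merge_policies(policies):
    merged, conflicts = {}, []
    categories = set().union(*(p.keys() for p in policies))
    for cat in sorted(categories):
        # A category absent from a policy is treated here as unconstrained
        # by that provider -- an assumption, not P3P semantics.
        purpose_sets = [set(p[cat]) for p in policies if cat in p]
        common = set.intersection(*purpose_sets)
        if not common:
            conflicts.append(cat)
        merged[cat] = common
    return merged, conflicts
```

An aggregator would run this over the policies of all merged services and surface the `conflicts` list to the user instead of silently producing an inconsistent aggregate.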
|
373 |
An Empirical Investigation of Internet Privacy: Customer Behaviour, Companies’ Privacy Policy Disclosures, and a Gap. No, Won Gyun, 09 1900
Privacy emerges as a critical issue in an e-commerce environment because of a fundamental tension among corporate, consumer, and government interests. By reviewing prior Internet-privacy research in the fields of information systems, business, and marketing published between 1995 and 2006, we consider the following research questions: 1) how an individual’s privacy behaviour is affected by privacy policy disclosures and by the level of the individual’s involvement regarding the sensitivity of personal information; 2) how companies’ privacy policies vary with respect to regulatory approaches and cultural values; and 3) whether there is a gap between the privacy practices valued by individuals and those emphasized by companies. A three-stage study is conducted to answer these questions.
The first two stages, consisting of a Web-based survey and an online ordering experiment with 210 participants, found that individuals are more likely to read the privacy policy statements posted on Web sites, and less likely to provide personal information, in a high-involvement privacy situation than in a low-involvement one. However, the existence of a privacy seal did not affect individuals’ behaviour, regardless of involvement conditions. This study also found a gap between self-reported privacy behaviour and actual privacy behaviour. When individuals were requested to provide personal information, their reading of privacy policy statements was close to their self-reported behaviour; however, their provision of personal information differed from their self-reported behaviour.
The third stage, which entailed the study of 420 privacy policies spanning six countries and two industries, showed that privacy policies vary across countries, as well as with varying governmental involvement and cultural values in those countries. Finally, the analysis of all the three stages revealed a gap between individuals’ importance ratings of companies’ privacy practices and policies that companies emphasize in their privacy disclosures.
|
375 |
Oblivious Handshakes and Sharing of Secrets of Privacy-Preserving Matching and Authentication Protocols. Duan, Pu, 2011 May 1900
This research focuses on two of the most important privacy-preserving techniques: privacy-preserving element matching protocols and privacy-preserving credential authentication protocols, where an element represents information generated by users themselves and a credential represents a group membership assigned by an independent central authority (CA). The former is also known as a private set intersection (PSI) protocol and the latter as a secret handshake (SH) protocol. In this dissertation, I present a general framework for the design of efficient and secure PSI and SH protocols based on similar message-exchange and computing procedures that confirm the “commonality” of the exchanged information while protecting it from the other party when the commonality test fails. I propose to use a homomorphic randomization function (HRF) to meet the privacy-preserving requirements: a common element/credential can be computed efficiently based on the homomorphism of the function, and an uncommon element/credential is difficult to derive because of the randomization of the same function.
Based on the general framework, two new PSI protocols with linear computing and communication cost are proposed. The first protocol uses a full homomorphic randomization function as its cryptographic basis and the second uses a partial homomorphic randomization function. Both achieve element confidentiality and private set intersection. A new SH protocol is also designed based on the framework; it achieves unlinkability with a reusable pair of credential and pseudonym and the least number of bilinear mapping operations. I also propose to interlock the proposed PSI and SH protocols to design new protocols with new security properties. When a PSI protocol is executed first and the matched elements are associated with the credentials in a following SH protocol, authenticity is guaranteed for the matched elements. When an SH protocol is executed first and the verified credentials are used in a following PSI protocol, detection resistance and impersonation-attack resistance are guaranteed for the matching elements.
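One classic instantiation of this blind-then-compare style of PSI is the Diffie-Hellman construction: each party raises the other's hashed, blinded elements to its own secret exponent, so H(x)^(ab) collides exactly for common elements while uncommon ones stay randomized. The sketch below illustrates that general technique, not the dissertation's specific protocols, and uses a toy modulus for brevity.

```python
import hashlib
import secrets

# Toy DH-style PSI sketch. P is the Mersenne prime 2^127 - 1; a real
# deployment would use a standardized elliptic-curve group instead.
P = 2**127 - 1

def h(element: str) -> int:
    """Hash an element into Z_P (stand-in for a random-oracle group hash)."""
    return int.from_bytes(hashlib.sha256(element.encode()).digest(), "big") % P

class Party:
    def __init__(self, elements):
        self.elements = list(elements)
        self.key = secrets.randbelow(P - 2) + 1  # secret exponent

    def blind(self, values=None):
        """Exponentiate own hashed elements (or peer-supplied values)."""
        if values is None:
            values = [h(e) for e in self.elements]
        return [pow(v, self.key, P) for v in values]

def intersect(alice: "Party", bob: "Party"):
    # Alice's elements blinded by both keys, order preserved.
    alice_double = bob.blind(alice.blind())
    # Bob's elements blinded by both keys, as an unordered set.
    bob_double = set(alice.blind(bob.blind()))
    # H(x)^(ab) matches exactly when x is common to both inputs.
    return [e for e, v in zip(alice.elements, alice_double) if v in bob_double]
```

Each side only ever sees randomized group elements of the other's set, which mirrors the HRF property the abstract describes: homomorphism makes common elements comparable, randomization hides the rest.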
The proposed PSI and SH protocols are implemented to provide privacy-preserving inquiry matching service (PPIM) for social networking applications and privacy-preserving correlation service (PAC) of network security alerts. PPIM allows online social consumers to find partners with matched inquiries and verified group memberships without exposing any information to unmatched parties. PAC allows independent network alert sources to find the common alerts without unveiling their local network information to each other.
|
376 |
The Study of Customer Personal Data Protection. Huang, Li-Ying, 30 August 2005
In this customer-driven era, corporations and government agencies face challenges from customers. If government agencies and corporations can harness the power of computers to store and manage the huge amounts of personal data they have collected, data mining and customer relationship management can be applied to services, customer care, and marketing. This will increase the efficiency of government agencies and stimulate economic development; the government, corporations, and the people will all benefit. However, while organizations invest heavily in the security of their computer systems to fend off viruses and hackers, abuse and breaches caused by employees, contractors, and other legitimate users can undermine all of these preventive measures.
This study investigates the performance of customer personal-data privacy protection. While it discusses regulations such as the computer-processed personal data protection acts and telecommunications acts, the theory on which the study is based is the self-regulation mechanism. The self-regulation mechanism can be applied to the self-monitoring, self-esteem, information ethics, and self-efficacy of the users who have access to customer personal data, and also to the management of customer personal-data privacy at the organizational level. The study gathered 432 valid surveys from users of customer personal data, namely customer-service staff in the telecommunications industry. Using path analysis, it explores the interactions among organizational management, personal privacy-protection self-efficacy, and information ethics. With information ethics and self-efficacy as intervening variables between organizational management and protection performance, the study seeks to clarify how much each of these three factors affects the performance of customer information privacy protection.
Through model validation, the customer personal-data protection self-regulation mechanism proposed in this study proves suitable, and organizational management shows positive, direct, and noticeable impacts. However, the effects of information ethics on privacy-protection self-efficacy, and of self-efficacy on privacy-protection performance, are not significant. Organizations should therefore strengthen their employees' information ethics and improve their self-efficacy. The study offers feasible, concrete suggestions in the hope of improving the customer personal-privacy-protection performance of the organization and its members. By doing so, the organization earns customers' confidence; winning their trust and satisfaction will promote the organization's image and even bring in more business opportunities, which is good for running a long-term business.
|
377 |
An XACML Based Framework for Structured Patient Privacy Policy (S3P). Mizani, Mehrdad Alizadeh, 01 September 2006
The emergence of electronic healthcare has caused numerous changes in both substantive and procedural aspects of healthcare processes. Such changes have introduced new risks to patient privacy and information confidentiality. Traditional privacy policies fall short of responding to the privacy needs of patients in electronic healthcare. Structured and enforceable policies are needed in order to protect patient privacy in modern healthcare, with its cross-organizational information sharing and decision making. Structured Patient Privacy Policy (S3P) is a framework for a formalized and enforceable privacy policy in healthcare. S3P contains a prototype implementation of a structured and enforceable privacy policy based on the eXtensible Access Control Markup Language (XACML). By simulating healthcare scenarios, S3P provides a means for experts from different professional backgrounds to assess the effect of policies on healthcare processes and to reach ethically sound privacy policies suitable for electronic healthcare.
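The enforcement model behind XACML can be sketched in miniature: an access request carries subject/resource/action attributes, rules have targets and Permit/Deny effects, and a combining algorithm resolves overlaps. The rules and attribute names below are illustrative, not drawn from S3P, and only a deny-overrides combining algorithm is shown.

```python
# Hypothetical miniature of XACML-style evaluation: match request
# attributes against rule targets and combine effects with deny-overrides.
# Rule structure and attribute names are assumptions for illustration.
def evaluate(rules, request):
    decision = "NotApplicable"
    for rule in rules:
        # A rule applies when every target attribute matches the request.
        if all(request.get(k) == v for k, v in rule["target"].items()):
            if rule["effect"] == "Deny":
                return "Deny"          # deny-overrides: a Deny is final
            decision = "Permit"
    return decision
```

A healthcare scenario could then be simulated by replaying access requests from different professional roles against a candidate policy and inspecting the decisions.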
|
378 |
Attitudes and Opinions of People Who Use Medical Services about Privacy and Confidentiality of Health Information in Electronic Environment. Ozkan, Ozlem, 01 February 2011
In health services, it is a necessity to keep records of patients. Although paper-based records are commonly used for this purpose, they are not as convenient as computerized records. Therefore, many health facilities have recently started keeping patients’ health records in electronic databases. However, this new system raises new questions about the confidentiality and privacy of these records. This study aims to investigate the opinions and attitudes of people who use the health services of Turkey about the privacy and confidentiality of health information in the electronic environment. The survey had 596 participants from 64 different cities in the six geographical regions of Turkey. The findings show that people feel comfortable about computer usage in healthcare, but they are concerned about the privacy and confidentiality of their information and are not sure whether their medical information is currently safe and secure. Moreover, they are mostly unaware of the current regulations related to information privacy in Turkey. The study also shows that people trust their doctors, university health researchers, pharmacists, nurses, and other hospital staff with the privacy of their medical records, but do not trust insurance companies, the government, private-sector health researchers, information-technology specialists, or government health researchers.
|
379 |
Exploring everyday privacy behaviors and misclosures. Caine, Kelly Erinn, 08 December 2009
As access to information changes with increased use of technology, privacy becomes an increasingly prominent issue among technology users. Privacy concerns should be taken seriously because they influence system adoption, the way a system is used, and may even lead to system disuse. Threats to privacy are not only due to traditional security and privacy issues; human factors issues such as unintentional disclosure of information also influence the preservation of privacy in technology systems.
A dual-pronged approach was used to examine privacy. First, a broad investigation of younger and older adults' privacy behaviors was conducted. The goal of this study was to gain a better understanding of privacy across technologies, to discover the similarities, and to identify the differences in what privacy means across contexts, as well as to provide a means to evaluate current theories of privacy. This investigation resulted in a categorization of privacy behaviors associated with technology. Three high-level privacy behavior categories were identified: avoidance, modification, and alleviatory behavior. This categorization furthers our understanding of the psychological underpinnings of privacy concerns and suggests that 1) common privacy feelings and behaviors exist across people and technologies and 2) alternative designs that consider these commonalities may increase privacy.
Second, I examined one specific human factors issue associated with privacy: disclosure error. This investigation focused on gaining an understanding of how to support privacy by preventing misclosure. A misclosure is an error in disclosure: when information is misclosed, privacy is violated in that information not intended for a specific person is nevertheless revealed to that person. The goal of this study was to provide a psychological basis, grounded in empirical findings, for design suggestions to improve privacy in technology. The study furthers our understanding of privacy errors in the following ways. First, it demonstrates for the first time that both younger and older adults experience misclosures. Second, it suggests that misclosures occur even when the technology is very familiar to the user. Third, it reveals that some misclosure experiences result in negative consequences, suggesting misclosure is a real threat to privacy. Finally, by exploring the context surrounding each reported misclosure, I was able to propose design suggestions that may decrease the likelihood of misclosure.
|
380 |
WWW Privacy - P3P Platform for Privacy Preferences. Foerster, Marian, 10 July 2000
Joint workshop of the University Computing Center and the Chair of
Computer Networks and Distributed Systems (Faculty of Computer
Science) at TU Chemnitz.
Workshop topic: Infrastructure of the "Digital University"
WWW Privacy - P3P Platform for Privacy Preferences
The talk gives an insight into the W3C's P3P protocol, which at the time was still under development.
It presents the basic principle of P3P, some possible technical realizations, and a demo shopping system.
|