391 |
A review of the implementation of the personal data (privacy) ordinance in the Hong Kong Correctional Services Department / Kan, Chi-keung. January 1998 (has links)
Thesis (M.P.A.)--University of Hong Kong, 1998. / Includes bibliographical references (leaves 127-129).
392 |
A review of the implementation of the personal data (privacy) ordinance in the Hong Kong Correctional Services Department / Kan, Chi-keung. January 1998 (has links)
Thesis (M.P.A.)--University of Hong Kong, 1998. / Includes bibliographical references (leaves 127-129). Also available in print.
393 |
Factors influencing information privacy in Abu Dhabi Emirate / Aldhaheri, Omar. January 2016 (has links)
Individuals in the UAE, and in the Abu Dhabi Emirate in particular, have become increasingly concerned about their private information. This is mainly due to the use of technology, which makes accessing, transmitting and editing personal information faster and easier. Besides the use of technology, the awareness and understanding of privacy among expatriates working in the Abu Dhabi Emirate has had an impact on UAE citizens in terms of their rights to privacy. There is a need for organisations to comply with international bodies in protecting individuals' rights to privacy, and to increase the exploration of culturally sensitive information in the media. These issues have all led to the importance of, and the need to, explore and identify Abu Dhabi Emirate employees' perceptions of privacy, and the factors influencing their behaviour towards it. The aim of this research was to investigate and analyse the factors influencing employees' information privacy behaviour, and employees' perceptions, awareness and behaviour in the handling of private information, in the Abu Dhabi Emirate public sector, specifically the Abu Dhabi Education Council (ADEC), as well as to provide practical recommendations to improve privacy. The research methods used in this project are based on a mixed-method approach comprising both quantitative and qualitative strategies. Qualitative data collection included face-to-face interviews and focus groups with the Abu Dhabi Education Council; a quantitative survey across the whole Council was also utilised. The research identified the types of information that were considered private and defined privacy in the context of UAE culture. The main factors influencing privacy among Abu Dhabi Emirate employees, such as national culture, organisational culture and perceived benefits, were identified and analysed. Following this, practical recommendations for changes to promote and enhance privacy in the Abu Dhabi Emirate were offered.
A model was developed and designed based on how the factors influencing individuals' behaviour in handling private information interrelate and influence one another. This is essential to provide a practical model capable of acting as a guideline to help organisations, decision makers and strategic planners in the Abu Dhabi Emirate public sector decide how best to approach privacy policy.
394 |
Social privacy : perceptions of veillance, relationships, and space with online social networking services / Dumbleton, Steven Philip Holt. January 2016 (has links)
This research seeks to examine the experience of social privacy around online social networking services. In particular, it examines how individuals experience social privacy through the perception of veillance, relationships and space. It highlights that individuals need varying types of veillance and relationships in order to experience the social privacy they desire. It also highlights that individuals use the perception of space to infer the acceptable conventions within that space, seeking spaces, both real and metaphorical, that they perceive to afford them the experience of social privacy. Through the application of phenomenological methods drawn from ethnography, this study explores how the experience of social privacy is perceived. It does this by examining the perceptions of veillance, relationships and space separately, though it notes that the individual perceives all three simultaneously. It argues that the varying conditions of these perceptions afford individuals the experience of social privacy. Social privacy is, therefore, perceived as a socially afforded emotional experience.
395 |
A systematic methodology for privacy impact assessments: a design science approach / Spiekermann-Hoff, Sarah; Oetzel, Marie Caroline. January 2014 (has links) (PDF)
For companies that develop and operate IT applications that process the personal data of customers and employees, a major problem is protecting these data and preventing privacy breaches. Failure to adequately address this problem can result in considerable damage to the company's reputation and finances, as well as negative effects for customers or employees (data subjects). To address this problem, we propose a methodology that systematically considers privacy issues by using a step-by-step privacy impact assessment (PIA). Existing PIA approaches cannot be applied easily because they are improperly structured or imprecise and lengthy. We argue that companies that employ our PIA can achieve "privacy-by-design", which is widely heralded by data protection authorities. In fact, the German Federal Office for Information Security (BSI) ratified the approach we present in this article for the technical field of RFID and published it as a guideline in November 2011. The contribution of the artefacts we created is twofold: First, we provide a formal problem representation structure for the analysis of privacy requirements. Second, we reduce the complexity of the privacy regulation landscape for practitioners who need to make privacy management decisions for their IT applications.
396 |
A Privacy-Policy Language and a Matching Engine for U-PrIM / Oggolder, Michael. January 2013 (has links)
A privacy-policy matching engine may support users in determining if their privacy preferences match with a service provider's privacy policy. Furthermore, third parties, such as Data Protection Agencies (DPAs), may support users in determining if a service provider's privacy policy is a reasonable privacy policy for a given service by issuing recommendations for reasonable data handling practises for different services. These recommendations need to be matched with service providers' privacy policies, to determine if a privacy policy is reasonable or not, and with users' privacy preferences, to determine if a set of preferences is reasonable or not.
In this thesis we propose a design for a new privacy-policy language, called the U-PrIM Policy Language (UPL). UPL is modelled on the PrimeLife Policy Language (PPL) and tries to improve on some of PPL's shortcomings. UPL also tries to include information deemed mandatory for service providers according to the European Data Protection Directive 95/46/EC (DPD). In order to demonstrate the features of UPL, we developed a proof-of-concept matching engine and a set of example instances of UPL. The matching engine is able to match preferences, policies and recommendations in any combination. The example instances are modelled on four stages of data disclosure found in the literature.
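The three-way matching this abstract describes (preferences against policies, and both against DPA recommendations) can be reduced to a containment check in a toy sketch. The field names and the "no more data, purposes, or retention than allowed" rule below are illustrative assumptions, not UPL's actual schema or semantics.

```python
# Toy three-way matcher in the spirit of a preference/policy/recommendation
# engine: a policy "matches" a limit (a user preference or a DPA
# recommendation) when it collects no more data, for no more purposes, and
# retains it no longer than the limit allows. All field names are
# hypothetical illustrations, not the actual UPL schema.

def matches(limit: dict, policy: dict) -> bool:
    """True if `policy` stays within the bounds set by `limit`."""
    return (set(policy["data"]) <= set(limit["data"])
            and set(policy["purposes"]) <= set(limit["purposes"])
            and policy["retention_days"] <= limit["retention_days"])

user_preference = {"data": {"email", "name"}, "purposes": {"delivery"},
                   "retention_days": 30}
dpa_recommendation = {"data": {"email", "name", "address"},
                      "purposes": {"delivery", "billing"},
                      "retention_days": 90}
provider_policy = {"data": {"email"}, "purposes": {"delivery"},
                   "retention_days": 14}

print(matches(user_preference, provider_policy))     # policy within preferences
print(matches(dpa_recommendation, provider_policy))  # policy is "reasonable"
print(matches(dpa_recommendation, user_preference))  # preferences are "reasonable"
```

The same `matches` function covers all pairings the thesis mentions, since preferences, policies and recommendations share one shape here; the real UPL distinguishes them syntactically.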
397 |
Lessons from Québec: towards a national policy for information privacy in our information society / Boyer, Nicole-Anne. 05 1900 (has links)
While on the broadest level this paper argues for a rethinking of governance in our "information society," its central thesis argues for a national policy for data protection in the private sector. It does so through three sets of lessons from the Quebec data protection experience: lessons for (1) the policy model, (2) the policy process, and (3) the policy area as it relates to the policy problem, as well as general questions about governance in an information polity.
The methodology for this paper is based on a four-part sequential analysis. The first part is a theoretical and empirical exploration of the problem, which is broadly defined as the "tension over personal information." The second part looks comparatively at how other jurisdictions have responded to the problem. The third part assesses which model is the better policy alternative for Canada and concludes that Quebec's regulatory route is better than the national status quo. The fourth part uses a comparative public policy framework, as well as interviews, to understand the policy processes in Quebec and Ottawa so that we can highlight the opportunities and constraints for a national data protection policy in the private sector. / Arts, Faculty of / Political Science, Department of / Graduate
398 |
Policy Merger System for P3P in a Cloud Aggregation Platform / Olurin, Olumuyiwa. January 2013 (has links)
The need for aggregating privacy policies is present in a variety of application areas today. In traditional client/server models, websites host services along with their policies in different private domains. However, in a cloud-computing platform where aggregators can merge multiple services, users often face complex decisions in terms of choosing the right services from service providers. In this computing paradigm, the ability to aggregate policies as well as services will be useful and more effective for users that are privacy conscious regarding their sensitive or personal information.
This thesis studies the problems associated with the Platform for Privacy Preferences (P3P) language, and the present issues with communicating and understanding the P3P language. Furthermore, it discusses some efficient strategies and algorithms for the matching and merging processes, and then elaborates on some privacy policy conflicts that may occur after merging policies. Lastly, the thesis presents a tool for matching and merging P3P policies. If successful, the merge produces an aggregate policy that is consistent with the policies of all participating service providers.
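A conservative merge of the kind described here could take the union of collected data and purposes, keep data as long as the longest-retaining provider, and flag practices on which providers disagree as conflicts. The statement shape and field names below are simplified assumptions for illustration, not the real P3P vocabulary.

```python
# Conservative merge of simplified P3P-like statements from an aggregator's
# member services: the aggregate collects the union of data and purposes,
# retains as long as the longest-retaining provider, and flags a conflict
# when providers disagree on a yes/no practice such as third-party sharing.
# The statement shape is a hypothetical simplification, not real P3P.

RETENTION_ORDER = ["no-retention", "stated-purpose", "legal-requirement",
                   "indefinitely"]

def merge_policies(policies: list[dict]) -> dict:
    merged = {
        "data": set().union(*(p["data"] for p in policies)),
        "purposes": set().union(*(p["purposes"] for p in policies)),
        "retention": max((p["retention"] for p in policies),
                         key=RETENTION_ORDER.index),
        "conflicts": [],
    }
    if len({p["third_party_sharing"] for p in policies}) > 1:
        merged["conflicts"].append("third_party_sharing")  # providers disagree
    return merged

shop = {"data": {"name", "card"}, "purposes": {"payment"},
        "retention": "stated-purpose", "third_party_sharing": False}
ads = {"data": {"clicks"}, "purposes": {"marketing"},
       "retention": "indefinitely", "third_party_sharing": True}

aggregate = merge_policies([shop, ads])
print(aggregate["retention"])   # longest retention wins
print(aggregate["conflicts"])   # disagreements surfaced, not silently resolved
```

Surfacing conflicts rather than resolving them silently mirrors the thesis's point that merged policies can contain genuine conflicts that a user (or aggregator) must decide on.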
399 |
Privacy Concerns and Personality Traits Influencing Online Behavior: A Structural Model / Grams, Brian C. 05 1900 (has links)
The concept of privacy has proven difficult to analyze because of its subjective nature and susceptibility to psychological and contextual influences. This study challenges the concept of privacy as a valid construct for addressing individuals' concerns regarding online disclosure of personal information, based on the premise that underlying behavioral traits offer a more reliable and temporally stable measure of privacy-oriented behavior than do snapshots of environmentally induced emotional states typically measured by opinion polls. This study investigated the relationship of personality characteristics associated with individuals' general privacy-related behavior to their online privacy behaviors and concerns. Two latent constructs, Functional Privacy Orientation and Online Privacy Orientation, were formulated. Functional Privacy Orientation is defined as a general measure of individuals' perception of control over their privacy. It was measured using the factors General Disclosiveness, Locus of Control, Generalized Trust, Risk Orientation, and Risk Propensity as indicator variables. Online Privacy Orientation is defined as a measure of individuals' perception of control over their privacy in an online environment. It was measured using the factors Willingness to Disclose Online, Level of Privacy Concern, Information Management Privacy Concerns, and Reported Online Disclosure as indicator variables. A survey questionnaire that included two new instruments to measure online disclosure and a willingness to disclose online was used to collect data from a sample of 274 adults. Indicator variables for each of the latent constructs, Functional Privacy Orientation and Online Privacy Orientation, were evaluated using corrected item-total correlations, factor analysis, and coefficient alpha. 
The measurement models and relationship between Functional Privacy Orientation and Online Privacy Orientation were assessed using exploratory factor analysis and structural equation modeling respectively. The structural model supported the hypothesis that Functional Privacy Orientation significantly influences Online Privacy Orientation. Theoretical, methodological, and practical implications and suggestions for analysis of privacy concerns and behavior are presented.
400 |
Quantifying Information Leakage via Adversarial Loss Functions: Theory and Practice / January 2020 (has links)
Modern digital applications have significantly increased the leakage of private and sensitive personal data. While worst-case measures of leakage such as Differential Privacy (DP) provide the strongest guarantees, when utility matters, average-case information-theoretic measures can be more relevant. However, most such information-theoretic measures do not have clear operational meanings. This dissertation addresses this challenge.
This work introduces a tunable leakage measure called maximal $\alpha$-leakage which quantifies the maximal gain of an adversary in inferring any function of a data set. The inferential capability of the adversary is modeled by a class of loss functions, namely, $\alpha$-loss. The choice of $\alpha$ determines specific adversarial actions ranging from refining a belief for $\alpha =1$ to guessing the best posterior for $\alpha = \infty$, and for the two specific values maximal $\alpha$-leakage simplifies to mutual information and maximal leakage, respectively. Maximal $\alpha$-leakage is proved to have a composition property and be robust to side information.
There is a fundamental disjoint between theoretical measures of information leakage and their applications in practice. This issue is addressed in the second part of this dissertation by proposing a data-driven framework for learning Censored and Fair Universal Representations (CFUR) of data. This framework is formulated as a constrained minimax optimization of the expected $\alpha$-loss where the constraint ensures a measure of the usefulness of the representation. The performance of the CFUR framework with $\alpha=1$ is evaluated on publicly accessible data sets; it is shown that multiple sensitive features can be effectively censored to achieve group fairness via demographic parity while ensuring accuracy for several a priori unknown downstream tasks.
Finally, focusing on worst-case measures, novel information-theoretic tools are used to refine the existing relationship between two such measures, $(\epsilon,\delta)$-DP and Rényi-DP. Applying these tools to the moments accountant framework, one can track the privacy guarantee achieved by adding Gaussian noise to Stochastic Gradient Descent (SGD) algorithms. Relative to state-of-the-art, for the same privacy budget, this method allows about 100 more SGD rounds for training deep learning models. / Dissertation/Thesis / Doctoral Dissertation Electrical Engineering 2020
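The two endpoints of maximal $\alpha$-leakage named in this abstract can be computed directly for a small discrete channel: mutual information at $\alpha = 1$, and maximal leakage, $\log \sum_y \max_x P(y|x)$, at $\alpha = \infty$. The binary symmetric channel below is an arbitrary illustration, not an example from the dissertation.

```python
import numpy as np

# For a discrete channel P(y|x) with input distribution p(x), the two
# endpoints of maximal alpha-leakage are: mutual information I(X;Y) at
# alpha = 1, and maximal leakage log(sum_y max_x P(y|x)) at alpha = inf.
# Both are computed in nats below for a binary symmetric channel.

def mutual_information(px: np.ndarray, pyx: np.ndarray) -> float:
    """I(X;Y) in nats; pyx[x, y] = P(y|x)."""
    pxy = px[:, None] * pyx          # joint P(x, y)
    py = pxy.sum(axis=0)             # marginal P(y)
    mask = pxy > 0
    ratio = pxy[mask] / (px[:, None] * py[None, :])[mask]
    return float((pxy[mask] * np.log(ratio)).sum())

def maximal_leakage(pyx: np.ndarray) -> float:
    """log sum_y max_x P(y|x); independent of p(x) when p(x) has full support."""
    return float(np.log(pyx.max(axis=0).sum()))

px = np.array([0.5, 0.5])
pyx = np.array([[0.9, 0.1],          # binary symmetric channel,
                [0.1, 0.9]])         # crossover probability 0.1

mi = mutual_information(px, pyx)     # about 0.368 nats
ml = maximal_leakage(pyx)            # log(1.8), about 0.588 nats
print(round(mi, 3), round(ml, 3))
```

As the abstract's ordering of adversaries suggests, the worst-case endpoint dominates: maximal leakage is never smaller than mutual information for the same channel.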