21 |
A Framework for Measuring Privacy Risks of YouTube
Calero, Vanessa January 2020 (has links)
While the privacy risks associated with well-known social networks such as Facebook and Instagram
are well studied, there has been limited investigation of the privacy risks of YouTube
videos, which are mainly uploaded by teenagers and young adults known as YouTubers.
This research aims to quantify the privacy risks that arise when sensitive
information about the private life of a YouTuber is shared publicly. We developed
a privacy metric for YouTube videos, called the Privacy Exposure Index (PEI), by extending
existing social-networking privacy frameworks. To understand the factors
moderating the privacy behaviour of YouTubers, we conducted an extensive survey
of about 100 YouTubers. We also investigated how YouTube subscribers and
viewers may seek to influence the privacy exposure of YouTubers through interactive
commenting on videos or through YouTubers' parallel social-networking
channels. For this purpose, we conducted a second survey of about 2000 viewers.
The results of these surveys demonstrate that YouTubers are concerned about their
privacy. Nevertheless, inconsistent with this concern, they exhibit privacy-exposing behaviour
in their videos. In addition, we found that YouTubers are encouraged by
their audience to continue disclosing more personal information in new content.
Finally, we empirically evaluated the soundness, consistency and applicability of
the PEI by analyzing 100 videos uploaded by 10 YouTubers over a period of two years. / Thesis / Master of Science (MSc)
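A metric of this kind can be sketched as a weighted score over disclosed attribute categories. The abstract does not give the PEI formula, so the categories, weights, and scoring rule below are illustrative assumptions, not the thesis's actual metric:

```python
# Hypothetical sketch of a privacy-exposure score in the spirit of the PEI.
# The categories and weights are assumptions chosen for illustration only.

SENSITIVITY_WEIGHTS = {
    "full_name": 0.9,
    "home_location": 1.0,
    "school_or_workplace": 0.8,
    "daily_routine": 0.6,
    "family_members": 0.7,
}

def privacy_exposure_index(disclosed: set[str]) -> float:
    """Score a video by the weighted fraction of sensitive categories disclosed."""
    total = sum(SENSITIVITY_WEIGHTS.values())
    exposed = sum(w for cat, w in SENSITIVITY_WEIGHTS.items() if cat in disclosed)
    return exposed / total  # 0.0 = nothing disclosed, 1.0 = everything disclosed

# Index for a video that reveals the YouTuber's name and home location.
print(round(privacy_exposure_index({"full_name", "home_location"}), 3))
```

A per-video score like this could then be tracked across uploads to study how exposure evolves over time, as the thesis does for 100 videos from 10 YouTubers.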
|
22 |
New Surveillance Technologies and the Invasion of Privacy Rights
Simsek, Yilmaz 08 1900 (has links)
The definition of privacy has changed with the changes and improvements in information and surveillance technologies, and these changes require new legal decisions for new kinds of privacy invasions. This study explores the scope of the right to privacy, particularly when technological surveillance is carried out by law enforcement agencies. It focuses in particular on law enforcement's growing array of surveillance technologies and devices that have the potential to impact citizens' information privacy. These changes in surveillance technologies have important implications for both law enforcement and citizens. This study also discusses increasing law enforcement surveillance for the public's security, changes in the laws that grant law enforcement new surveillance powers as part of the war on terrorism, and citizens' concerns about information privacy. Particular attention is given to recent public opinion surveys that show citizens' increasing privacy concerns. Finally, a set of recommendations to address the security-privacy debate and reduce citizens' privacy concerns is offered.
|
23 |
Privacy and Australian law
Gibb, Susan Jennifer. January 1987 (has links) (PDF)
Includes abstract. Includes bibliography.
|
24 |
The Impact of Salient Privacy Information on Decision-Making
Tsai, Janice Y. 01 December 2009 (has links)
People value their privacy; however, they typically do not make the protection of their privacy a priority. Privacy is oftentimes not tangible, complicating the efforts of technology users to express and act according to their privacy needs. Additionally, people may not be fully aware of the risks they are subjecting themselves to once they use the Internet for financial transactions or create profiles on online social networks. Companies post privacy policies to inform people about their informational practices, but this information is extremely difficult to use and typically not considered in users' decision-making processes.
Privacy concerns have also had an impact on users' adoption of new technologies that share personal information. A plethora of mobile location-finding applications has become available over the last two decades, but the products and services offered by technology developers may not comprehensively address the privacy implications and privacy concerns surrounding their use. The design of these products may not provide the necessary amount of control or risk mitigation for users to ensure that their location information is not misused.
In this thesis, I focus on the impact of salient privacy information on privacy concerns and behavior in two contexts: online shopping and the use of a mobile location-sharing technology. I examine several case studies focusing on the evolution of privacy attitudes after people use specific technologies. Based on the examination of the use of a location-sharing system, I highlight several design considerations for mobile-location application developers to ensure they address their users' privacy concerns. I use the results of online surveys and user studies to provide concrete information on the impact of feedback on comfort with using location-sharing technology. This research shows that users will pay a premium to purchase from websites that offer better privacy policies, if that privacy information is made visible and understandable. This research points to the importance of control in the management of privacy concerns. Whether mandated by legislation or recommended in industry or design standards, offering users control in the form of understandable privacy policy information, or control over the disclosure of personal information by technology, is essential.
|
25 |
Analyzing Mobile App Privacy Using Computation and Crowdsourcing
Amini, Shahriyar 01 May 2014 (has links)
Mobile apps can make use of the rich data and sensors available on smartphones to offer compelling services. However, the use of sensitive resources by apps is not always justified, which has led to new kinds of privacy risks and challenges. While it is possible for app market owners and third-parties to analyze the privacy-related behaviors of apps, present approaches are difficult and tedious.
I present two iterations of the design, implementation, and evaluation of a system, Gort, which enables more efficient app analysis, by reducing the burden of instrumenting apps, making it easier to find potential privacy problems, and presenting sensitive behavior in context. Gort interacts with apps while instrumenting them to detect sensitive information transmissions. It then presents this information along with the associated app context to a crowd of users to obtain their expectations and comfort regarding the privacy implications of using the app. Gort also runs a set of heuristics on the app to flag potential privacy problems. Finally, Gort synthesizes the information obtained through its analysis and presents it in an interactive GUI, built specifically for privacy analysts.
This work offers three distinct new advances over the state of the art. First, Gort uses a set of heuristics, elicited through interviews with 12 experts, to identify potential app privacy problems. Gort heuristics present high-level privacy problems instead of the overwhelming amount of information offered through existing tools. Second, Gort automatically interacts with apps by discovering and interacting with UI elements while instrumenting app behavior. This eliminates the need for analysts to manually interact with apps or to script interactions. Third, Gort uses crowdsourcing in a novel way to determine whether app privacy leaks are legitimate and desirable and raises red flags about potentially suspicious app behavior. While existing tools can detect privacy leaks, they cannot determine whether the privacy leaks are beneficial or desirable to the user. Gort was evaluated through two separate user studies. The experiences from building Gort and the insights from the user studies guide the creation of future systems, especially systems intended for the inspection and analysis of software.
|
26 |
A generic privacy ontology and its applications to different domains
Hecker, Michael January 2009 (has links)
Privacy is becoming increasingly important due to the advent of e-commerce, but it is equally important in other application domains. Domain applications frequently require customers to divulge many personal details about themselves that must be protected carefully in accordance with privacy principles and regulations. Here, we define a privacy ontology to support the provision of privacy and to help derive the level of privacy associated with transactions and applications. The privacy ontology provides a framework for developers and service providers to guide and benchmark their applications and systems with regard to the concepts of privacy and the levels and dimensions experienced. Furthermore, it supports users or data subjects in describing their own privacy requirements and measuring them when dealing with other parties that process personal information. The ontology captures the knowledge of the domain of privacy and its quality aspects, dimensions and assessment criteria. It is composed of a core ontology, which we call the generic privacy ontology, and application-domain-specific extensions, which commit to some application-domain concepts, properties and relationships as well as to all of those of the generic privacy ontology. This allows for an evaluation of privacy dimensions in different application domains, and we present case studies for two different application domains, namely a restricted B2C e-commerce scenario and a restricted hospital scenario from the medical domain.
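The core-plus-extension structure described here can be sketched in code. The class names, dimensions, and scoring rule below are illustrative assumptions for showing how a domain extension commits to the generic core, not the thesis's formal ontology:

```python
# Hypothetical sketch: a generic privacy-ontology core extended for a B2C
# e-commerce domain. All names and the scoring rule are assumptions.
from dataclasses import dataclass, field

@dataclass
class PrivacyConcept:
    """Generic privacy ontology (core): a concept with privacy dimensions."""
    name: str
    dimensions: dict[str, int] = field(default_factory=dict)  # dimension -> level

@dataclass
class ECommerceConcept(PrivacyConcept):
    """Domain-specific extension: inherits all core dimensions, adds its own."""
    payment_data_involved: bool = False

    def privacy_level(self) -> float:
        # Aggregate per-dimension levels (simple mean), raised if payment
        # data is processed -- a stand-in for a domain-specific assessment rule.
        if not self.dimensions:
            return 0.0
        base = sum(self.dimensions.values()) / len(self.dimensions)
        return base + (1 if self.payment_data_involved else 0)

checkout = ECommerceConcept("checkout", {"collection": 3, "retention": 2}, True)
print(checkout.privacy_level())
```

A hospital-domain extension would follow the same pattern, committing to the core dimensions while adding concepts such as patient-record handling.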
|
27 |
Privacy, constitutions and the law of torts : a comparative and theoretical analysis of protecting personal information against dissemination in New Zealand, the UK and the USA : a thesis submitted in fulfilment of the requirements for the degree of Doctor of Philosophy in Law in the University of Canterbury /
Heite, Martin. January 2008 (has links)
Thesis (Ph. D.)--University of Canterbury, 2008. / Typescript (photocopy). Includes bibliographical references (leaves 338-361). Also available via the World Wide Web.
|
28 |
Privacy policy-based framework for privacy disambiguation in distributed systems
Alhalafi, Dhafer January 2015 (has links)
With the increasing pervasiveness of distributed systems, now and into the future, there will be growing concern for the privacy of users in a world where almost everyone is connected to the internet through numerous devices. Current ways of considering privacy in distributed-system development are based on the idea of protecting personally identifiable information such as name and national insurance number; however, with the abundance of distributed systems it is becoming easier to identify people through information that is not personally identifiable, which increases privacy concerns. As a result, ideas about privacy have changed and should be reconsidered in the development of distributed systems. This requires a new way to conceptualise privacy. Despite active effort on handling privacy and security concerns in the early design stages of distributed systems, there has been little work on specifying and designing a reliable and meaningful privacy policy framework. Beyond the difficulty of understanding how this earliest stage of the work is carried out, the procedure for privacy policy development risks marginalising stakeholders, and therefore defeating the object of what such policies are designed to do. The study proposes a new Privacy Policy Framework (PPF) based on combining a new method for disambiguating the meaning of privacy among users, owners and developers of distributed systems with distributed-system architecture and technical considerations. Towards the development of the PPF, semi-structured interviews and questionnaires were conducted to determine the current situation regarding privacy policy and technical considerations; these methods were also employed to demonstrate the application and evaluation of the PPF itself.
The study contributes a new understanding of, and approach to, the consideration of privacy in distributed systems, and a practical approach to achieving user privacy and privacy disambiguation through the development of a privacy-button concept.
|
29 |
Privacy in Complex Sample Based Surveys
Shawn A Merrill (11806802) 20 December 2021 (has links)
In the last few decades, there has been a dramatic uptick in issues related to protecting user privacy in released data, both in statistical databases and in anonymized records. Privacy-preserving data publishing is a field established to handle these releases while avoiding the problems that plagued many earlier attempts. The issue is of particular importance for governmental data, where both the release and the privacy requirements are frequently governed by legislation (e.g., HIPAA, FERPA, the Clery Act). The problem is further compounded by the complex survey methods employed to counter problems in data collection. The preeminent definition of privacy is differential privacy, which protects users by limiting the impact that any individual can have on the result of any query.

The thesis proposes models for differentially private versions of current survey methodologies and discusses the evaluation of those models. We focus on the issues of missing data and weighting, common techniques employed in complex surveys to counter problems with sampling and response rates. First, we propose a model for answering queries on datasets with missing data while maintaining differential privacy. Our model uses k-Nearest Neighbor imputation to replicate donor values while protecting the privacy of the donor, and it provides significantly better bias reduction in realistic experiments using existing data, as well as less noise than a naive solution. Our second model proposes a differentially private method of performing Iterative Proportional Fitting (IPF), a common technique used to ensure that survey records are weighted consistently with known values. We also stress the general need to incorporate privacy when creating new survey methodologies, rather than assuming that privacy can simply be added at a later step.
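The definition of differential privacy invoked above is commonly realized by the classic Laplace mechanism, in which noise scaled to (sensitivity / epsilon) bounds the influence any single record can have on a query answer. The sketch below illustrates that general construction only; the thesis's kNN-imputation and IPF models are more involved:

```python
# Minimal sketch of the Laplace mechanism for a counting query.
# A count changes by at most 1 when one record is added or removed,
# so its sensitivity is 1 and Laplace(1/epsilon) noise suffices.
import random

def laplace_noise(scale: float) -> float:
    # The difference of two i.i.d. exponential draws is Laplace-distributed.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(records, predicate, epsilon: float) -> float:
    """Differentially private count of records satisfying `predicate`."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Example: a noisy count of survey records with a given property.
random.seed(0)  # seeded only to make the sketch reproducible
noisy = dp_count(range(100), lambda r: r < 50, epsilon=1.0)
```

Smaller epsilon means more noise and stronger privacy; survey-specific mechanisms such as private imputation or private IPF must account for sensitivities well beyond the simple count shown here.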
|
30 |
An Examination of the Privacy Impact Assessment as a Vehicle for Privacy Policy Implementation in U.S. Federal Agencies
Pandy, Susan M. 13 February 2013 (has links)
The Privacy Act of 1974 was designed to protect personal privacy captured in the records held by government agencies. However, the scope of privacy protection has expanded in light of advances in technology, heightened security, ubiquitous threats, and the value of information. This environment has raised the expectations for public sector management of sensitive personal information and enhanced privacy protections. While the expanse of privacy policy implementation is broad, this study focuses specifically on how agencies implement privacy impact assessments (PIAs) as required under Section 208 of the E-Government Act of 2002. An enhanced understanding of the PIA implementation process serves as a portal into the strategic considerations and management challenges associated with broader privacy policy implementation efforts.
A case study of how the U.S. Postal Service and the U.S. Department of Veterans Affairs have implemented PIAs provides rich insights into privacy policy implementation and outcomes. Elite interviews enriched by process data and document analysis show how each organization undertook different approaches to PIA implementation over time. This study introduces the sociology of law literature using Lauren Edelman's conceptual framework to understand how organizations respond to and interpret law from within the organization, or endogenously. Building upon Edelman's model, certain characteristics of the PIA implementation are analyzed to provide rich description of the factors that influence the implementation process and lead to different policy outcomes.
The findings offer valuable insights into the privacy policy implementation process and introduce the sociology of law literature to the field of public administration. This literature furthers our understanding of how organizations enact policy over time, how the implementation process unfolds and is affected by critical factors, and how emergent patterns can be identified in organizations. This study furthers our understanding of how privacy policy, in particular, is implemented over time by examining the administrative capacities and levels of professionalism that are utilized to accomplish this effort. This research comes at a critical time in the context of the emerging legal and political environment for privacy, which is characterized by new expectations from the public and the expanding role of government in managing and protecting sensitive information. / Ph. D.
|