81

Optimizing Secure Function Evaluation on Mobile Devices

Mood, Benjamin January 2012 (has links)
Secure function evaluation (SFE) on mobile devices, such as smartphones, allows for the creation of new privacy-preserving applications. Generating the circuits on smartphones which allow for executing customized functions, however, is infeasible for most problems due to memory constraints. In this thesis, we develop a new, memory-efficient methodology for generating circuits. Using the standard SFDL language for describing secure functions as input, we design a new pseudo-assembly language (PAL) and a template-driven compiler, generating circuits which can be evaluated with the canonical Fairplay evaluation framework. We deploy this compiler and demonstrate that larger circuits can now be generated on smartphones. We also show that our compiler can interface with other execution systems and enable optimizations on those systems, and how runtime generation of circuits can be used in practice. Our results demonstrate the feasibility of generating garbled circuits on mobile devices. This thesis includes previously published co-authored material.
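The garbled-circuit technique underlying Fairplay-style SFE can be illustrated in miniature. The sketch below is an illustration of the general technique, not the thesis's PAL compiler or Fairplay itself: it garbles a single AND gate by assigning each wire two random labels, then encrypting the truth table so an evaluator holding one label per input wire can recover only the matching output label. The hash-based padding and zero-byte tag are simplifying assumptions; real schemes use fixed-key block ciphers and point-and-permute.

```python
import hashlib
import random
import secrets

LABEL = 16  # bytes per wire label
TAG = 4     # redundancy bytes so the evaluator can spot the right row

def _xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def _pad(key_a, key_b):
    # A hash of the two input labels serves as a one-time pad (sketch only).
    return hashlib.sha256(key_a + key_b).digest()[:LABEL + TAG]

def garble_and_gate(rng=random):
    # Two random labels per wire: index 0 encodes bit 0, index 1 encodes bit 1.
    labels = {w: (secrets.token_bytes(LABEL), secrets.token_bytes(LABEL))
              for w in ("a", "b", "out")}
    table = []
    for bit_a in (0, 1):
        for bit_b in (0, 1):
            plain = labels["out"][bit_a & bit_b] + b"\x00" * TAG
            table.append(_xor(plain, _pad(labels["a"][bit_a], labels["b"][bit_b])))
    rng.shuffle(table)  # hide which row corresponds to which input combination
    return labels, table

def evaluate_gate(table, key_a, key_b):
    # The evaluator holds exactly one label per input wire and tries each
    # row; only the matching row decrypts to a value with a valid zero tag.
    pad = _pad(key_a, key_b)
    for row in table:
        plain = _xor(row, pad)
        if plain[LABEL:] == b"\x00" * TAG:
            return plain[:LABEL]
    raise ValueError("no row decrypted cleanly")
```

Evaluating with the labels for a=1 and b=1 yields `labels["out"][1]`, while any combination involving a 0 bit yields `labels["out"][0]`; the evaluator learns only an opaque label, never the wire's semantic value. A real circuit composes thousands of such gates, which is exactly the memory pressure the thesis's compiler targets.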
82

Fairness and Privacy Violations in Black-Box Personalization Systems: Detection and Defenses

Datta, Amit 01 March 2018 (has links)
Black box personalization systems have become ubiquitous in our daily lives. They utilize collected data about us to make critical decisions such as those related to credit approval and insurance premiums. This leads to concerns about whether these systems respect expectations of fairness and privacy. Given the black box nature of these systems, it is challenging to test whether they satisfy certain fundamental fairness and privacy properties. For the same reason, while many black box privacy enhancing technologies offer consumers the ability to defend themselves from data collection, it is unclear how effective they are. In this doctoral thesis, we demonstrate that carefully designed methods and tools that soundly and scalably discover causal effects in black box software systems are useful in evaluating personalization systems and privacy enhancing technologies to understand how well they protect fairness and privacy. As an additional defense against discrimination, this thesis also explores legal liability for ad platforms in serving discriminatory ads. To formally study fairness and privacy properties in black box personalization systems, we translate these properties into information flow instances and develop methods to detect information flow. First, we establish a formal connection between information flow and causal effects. As a consequence, we can use randomized controlled experiments, traditionally used to detect causal effects, to detect information flow through black box systems. We develop AdFisher as a general framework to perform information flow experiments scalably on web systems and use it to evaluate discrimination, transparency, and choice on Google’s advertising ecosystem. We find evidence of gender-based discrimination in employment-related ads and a lack of transparency in Google’s transparency tool when serving ads for rehabilitation centers after visits to websites about substance abuse. 
Given the presence of discrimination and the use of sensitive attributes in personalization systems, we explore possible defenses for consumers. First, we evaluate the effectiveness of publicly available privacy enhancing technologies in protecting consumers from data collection by online trackers. Specifically, we use a combination of experimental and observational approaches to examine how well the technologies protect consumers against fingerprinting, an advanced form of tracking. Next, we explore legal liability for an advertising platform like Google for delivering employment and housing ads in a discriminatory manner under Title VII and the Fair Housing Act respectively. We find that an ad platform is unlikely to incur liability under Title VII due to its limited coverage. However, we argue that housing ads violating the Fair Housing Act could create liability if the ad platform targets ads toward or away from protected classes without explicit instructions from the advertiser.
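The causal methodology described above can be illustrated with a toy permutation test. This sketch is not AdFisher itself (which trains a classifier over page-level features); it only shows the core statistical idea: when browsing agents are randomly assigned to treatment and control groups, a permutation test on any observed outcome (here, hypothetical counts of a particular ad) yields a valid p-value for whether the treatment causally influences the ads served.

```python
import random

def permutation_pvalue(treatment, control, n_perm=10000, seed=0):
    """How often does a random relabeling of the pooled observations
    produce a group difference at least as large as the observed one?"""
    rng = random.Random(seed)
    observed = abs(sum(treatment) / len(treatment) - sum(control) / len(control))
    pooled = list(treatment) + list(control)
    k = len(treatment)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        stat = abs(sum(pooled[:k]) / k - sum(pooled[k:]) / (len(pooled) - k))
        if stat >= observed:
            hits += 1
    # Add-one smoothing keeps the p-value valid under the null hypothesis.
    return (hits + 1) / (n_perm + 1)
```

For hypothetical ad counts `[10, 11, 12, 13]` (treatment) versus `[0, 1, 2, 1]` (control) the p-value is small, which would count as evidence of information flow; near-identical groups yield a p-value near 1.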
83

The Infopriv model for information privacy

Dreyer, Lucas Cornelius Johannes 20 August 2012 (has links)
D.Phil. (Computer Science) / The privacy of personal information is crucial in today's information systems. Traditional security models are mainly concerned with the protection of information inside a computer system. These models assume that the users of a computer system are trustworthy and will not disclose information to unauthorised parties. However, this assumption does not always apply to information privacy since people are the major cause of privacy violations. Alternative models are, therefore, needed for the protection of personal information in such an environment.
84

Practices for Protecting Privacy in Health Research: Perspectives of the Public, Privacy Guidance Documents and REBs

Lysyk, Mary C. January 2014 (has links)
Health research is the single vehicle for uncovering the varying causes of disease or illness, understanding the broader determinants of health, and discovering new, or validating traditional, ways of treating the individuals who suffer from these conditions. Thus, health research activities are at the heart of medical, health and scientific developments. While health research activities exemplify some of the greatest hopes for improved health-care, they also highlight public concerns for the protection of personal health information (PHI). More specifically, advances in modern information technology and the increasing pace of international collaborative studies raise challenging issues regarding privacy protection in health research. The extensive quantities of data housed in general-use databases and electronic health records (EHRs) are two frequently cited examples of “electronic health information” that are now increasingly available to researchers globally. For example, individual discrete studies are expanding into long-term prospective disease or treatment databases without clear research questions and involving multiple research teams and jurisdictions. As well, EHRs are increasingly taking a prominent role in Western industrialized nations such as England, Australia, New Zealand, Germany, the Netherlands, the United States, as well as Canada. The expected large-scale demand for the secondary uses of PHI from electronic health records represents another significant challenge to privacy. EHRs facilitate clinical and population-based health research not only in terms of secondary uses, such as retrospective observational studies, but also for prospective cohort studies.
In Canada today, two documents provide direction applicable at a national level to privacy protection practices in health research: the Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans (TCPS 2) and the CIHR Best Practices for Protecting Privacy in Health Research (CIHR BPPP). The TCPS 2, a policy document, is the most influential Canadian policy applicable to the ethics of research with human participants and is widely followed by Canadian researchers and institutions. By contrast, use of the CIHR BPPP is purely optional. Canadian REBs are responsible for much of the governance of privacy, confidentiality and security in health research. However, the extent to which they apply and utilize the privacy provisions from the TCPS 2 and CIHR BPPP in their protocol requirements is not known. This thesis provides a descriptive comparative study of the international and Canadian contexts for privacy protection in health research and produces a greater understanding of two Canadian stakeholder groups: the Canadian public, whose participation and trust are imperative for valid research; and Canadian Faculty of Medicine (FoM) university biomedical REBs, with whom much responsibility for ensuring appropriate protection of privacy and confidentiality in health research rests.
85

Anonymity and Linkability

Fenske, Ellis January 2018 (has links)
This thesis considers systems for anonymous communication between users of a cybersystem. Specifically, we consider the scenario where communications generated by the same user repeatedly over time can or must be linked. Linked user behavior can leak information, which adversaries can use to de-anonymize users. Analyzing linked behavior can also generate information about the use of anonymity protocols that can be valuable for research, leading to more effective protocols. But techniques to collect such data must include assurances that the methods and outputs do not compromise user privacy. A main result of this thesis is an anonymity protocol called Private Set-Union Cardinality, designed to aggregate linked private user data safely. We prove that Private Set-Union Cardinality securely calculates the noisy cardinality of the union of a collection of distributed private data sets. This protocol is intended to take measurements in real-world anonymity systems like Tor, and we prove it is secure even if a majority of the participants are dishonest, as well as under general concurrent composition. The remaining results analyze path selection in anonymous routing systems. To obtain our results, we develop a mathematical framework to measure information leakage during repeated linkable path selection and propose new metrics: a radius that measures worst-case behavior, and a neighborhood graph that visualizes degradation of the system over time as a whole. We use these metrics to derive theoretical upper bounds on an adversary's accuracy in de-anonymization. Finally, we investigate an attack where users can be de-anonymized due to the information an adversary learns when failing to observe some event. We call these occurrences non-observations and we develop a theory of non-observations in anonymous routing systems, deriving theoretical bounds on the information leakage due to this behavior in the general case and for Tor.
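The plaintext functionality that Private Set-Union Cardinality computes can be stated in a few lines. The sketch below is a reference computation in the clear, not the secure multiparty protocol itself, which must produce this result without any party revealing its set; the choice of Laplace noise (sampled as a difference of two exponentials) and the `epsilon` parameter are assumptions for illustration.

```python
import random

def noisy_union_cardinality(private_sets, epsilon=0.5, seed=None):
    # What honest participants jointly learn: the cardinality of the union
    # of their private sets plus Laplace(1/epsilon) noise for differential
    # privacy. The real protocol computes this under encryption so that no
    # aggregator ever sees an individual party's set.
    rng = random.Random(seed)
    union = set().union(*private_sets)
    noise = rng.expovariate(epsilon) - rng.expovariate(epsilon)
    return len(union) + noise
```

With sets {1, 2} and {2, 3} the exact union cardinality is 3; repeated runs return values scattered around 3 with spread 1/epsilon, so no single output pins down whether any particular element was present.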
86

Responses to Privacy Turbulence: The Impact of Personality Traits on Recalibration and Privacy Boundaries on Facebook

Fechner, Valerie January 2016 (has links)
As individuals use social media to create and maintain relationships and connections, they must also decide how to manage the private information that they disclose to their connections. If private information is handled improperly online, it may evoke varying responses that affect previously held privacy boundaries. Using communication privacy management theory (Petronio, 2002) as a framework, this study seeks to understand how the severity of a privacy violation impacts how Facebook users respond to online privacy turbulence. It also investigates how personality characteristics influence these responses. Results reveal that more severe privacy violations are met with more discussion of the privacy violation and thicker privacy boundaries, both between the owner and the violator and between the owner and their social media network. Findings also imply that some of the Big Five personality traits impact the relationship between severity and the outcome variables.
87

Privacy-Preserving Facial Recognition Using Biometric-Capsules

Phillips, Tyler S. 05 1900 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / In recent years, developers have used the proliferation of biometric sensors in smart devices, along with recent advances in deep learning, to implement an array of biometrics-based recognition systems. Though these systems demonstrate remarkable performance and have seen wide acceptance, they present unique and pressing security and privacy concerns. One proposed method which addresses these concerns is the elegant, fusion-based Biometric-Capsule (BC) scheme. The BC scheme is provably secure, privacy-preserving, cancellable and interoperable in its secure feature fusion design. In this work, we demonstrate that the BC scheme is uniquely fit to secure state-of-the-art facial verification, authentication and identification systems. We compare the performance of unsecured, underlying biometrics systems to the performance of the BC-embedded systems in order to directly demonstrate the minimal effects of the privacy-preserving BC scheme on underlying system performance. Notably, we demonstrate that, when seamlessly embedded into state-of-the-art FaceNet and ArcFace verification systems, which achieve accuracies of 97.18% and 99.75% on the benchmark LFW dataset, the BC-embedded systems are able to achieve accuracies of 95.13% and 99.13% respectively. Furthermore, we also demonstrate that the BC scheme outperforms or performs as well as several other proposed secure biometric methods.
88

Concerned Enough to Act? Privacy Concerns & Perspectives Among Undergraduate Instagram Users

Zhang, Hongru 28 August 2023 (has links)
A gap, known as the privacy paradox, often exists between people's privacy concerns and the actions they take to protect their privacy. This thesis investigates how a small sample of undergraduate students perceive their online privacy, the measures they take to protect their online privacy, and the reasons for their action/inaction. In so doing, it seeks to ascertain whether Hinds et al.'s (2020) findings regarding Facebook users' perceptions of online privacy issues are replicable among a sample of undergraduate students who are regular users of Instagram. Interviews with 20 undergraduate students were conducted to gather information about their online privacy concerns, their attitudes toward social media platforms collecting their information, and the privacy protection strategies they employ. The findings suggest that in enacting protection strategies, the participants delineate between both social and institutional conceptions of privacy, and between the notion of privacy as a right requiring protection and privacy as a negotiable commodity. This, in turn, suggests a possible need to re-consider how privacy-related education is approached as well as privacy policy.
89

The cost of convenience: the extent of the reasonable expectation of privacy in the internet age

Karpf, Justin 01 May 2013 (has links)
Though the Internet and social media are fairly recent developments, the legal principles and issues embodied in them are well represented in the Constitution. Take, for example, the freedom of expression enumerated in the First Amendment. Though traditionally expressed in print, pamphlets, and film, recent developments in technology such as Facebook and blogs have become the new standard forms of communication. Like the physical mediums that arose before them, issues arise of what limits, if any, should be placed on the speech. Given the guise of anonymity, people on the Internet have less accountability for the comments they make, which has led to everything from passionate political speech to cyber-bullying, online harassment that has driven people to suicide. This thesis, however, will primarily focus on the Fourth Amendment's reasonable expectation of privacy. Because the information involved with the Internet and social media is digital, it is more difficult to identify when privacy has been breached. With a paper envelope, for example, one can tell if the seal was broken and the contents were potentially disclosed to an unwanted party. Electronically, however, no such seal exists to notify the sender or recipient of a communication. Furthermore, the Government has found itself under stricter scrutiny for searches with these new developments in technology; the lack of physical intrusion poses difficult questions for courts that must decide how far a reasonable expectation of privacy goes in the social media age. The thesis will also address how private companies obtain and use individuals' information through the services they provide and the issues that arise from them. Private companies have fewer restrictions than the Government, and both perspectives are important to keep in mind when trying to understand the policy implications rapid technological growth has brought about. The thesis will conclude by identifying issues that courts and legislatures will have to address in the coming years to adequately deliver justice in a dynamic society that is prone to powerful technological change.
90

Negotiation of Transparency and Privacy in the Urban Context

Taylor, Bethany L. January 2010 (has links)
No description available.
