About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
111

The right of publicity in the global market: is James Dean a living dead even in Korea?

Nam, Hyung Doo. January 2005 (has links)
Thesis (Ph. D.)--University of Washington, 2005. / Vita. Includes bibliographical references (leaves 325-349).
112

Fundamental Limits in Data Privacy: From Privacy Measures to Economic Foundations

January 2016 (has links)
abstract: Data privacy is emerging as one of the most serious concerns of big data analytics, particularly with the growing use of personal data and the ever-improving capability of data analysis. This dissertation first investigates the relation between different privacy notions, and then puts the main focus on developing economic foundations for a market model of trading private data.

The first part characterizes differential privacy, identifiability and mutual-information privacy by their privacy-distortion functions, i.e., the optimal achievable privacy level as a function of the maximum allowable distortion. The results show that these notions are fundamentally related and exhibit a certain consistency: (1) the gap between the privacy-distortion functions of identifiability and differential privacy is upper bounded by a constant determined by the prior; (2) identifiability and mutual-information privacy share the same optimal mechanism; (3) the mutual-information optimal mechanism satisfies differential privacy with a level at most a constant away from the optimal level.

The second part studies a market model of trading private data, where a data collector purchases private data from strategic data subjects (individuals) through an incentive mechanism. The value of epsilon units of privacy is measured by the minimum payment such that an individual's equilibrium strategy is to report data in an epsilon-differentially private manner. For the setting with binary private data that represents individuals' knowledge about a common underlying state, asymptotically tight lower and upper bounds on the value of privacy are established as the number of individuals becomes large, and the payment-accuracy tradeoff for learning the state is obtained. The lower bound establishes the impossibility of buying epsilon units of privacy with a lower payment, and the upper bound is achieved by a designed reward mechanism. When the individuals' valuations of privacy are unknown to the data collector, mechanisms with possible negative payments (aiming to penalize individuals with "unacceptably" high privacy valuations) are designed to fulfill the accuracy goal and drive the total payment to zero. For the setting with binary private data following a general joint probability distribution with some symmetry, asymptotically optimal mechanisms are designed in the high data quality regime. / Dissertation/Thesis / Doctoral Dissertation Electrical Engineering 2016
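
The notion at the heart of the second part, reporting a binary value in an epsilon-differentially private manner, can be illustrated with classic randomized response. The sketch below is illustrative only and is not the dissertation's mechanism; the flip probability and the unbiased estimator are the standard ones for binary randomized response.

```python
import numpy as np

def randomized_response(bit, eps, rng):
    """Report the true bit with probability e^eps / (1 + e^eps); flip otherwise.
    Each report satisfies eps-differential privacy for the reporting individual."""
    p_truth = np.exp(eps) / (1.0 + np.exp(eps))
    return bit if rng.random() < p_truth else 1 - bit

def estimate_mean(reports, eps):
    """Unbiased estimate of the population mean from randomized reports."""
    p = np.exp(eps) / (1.0 + np.exp(eps))
    return (np.mean(reports) - (1 - p)) / (2 * p - 1)

rng = np.random.default_rng(0)
true_bits = (rng.random(10_000) < 0.3).astype(int)   # 30% hold bit 1
reports = [randomized_response(b, eps=1.0, rng=rng) for b in true_bits]
print(estimate_mean(reports, eps=1.0))               # should be close to 0.3
```

Intuitively, a higher payment buys a higher epsilon, which raises the truth-telling probability and so improves the collector's estimate; the dissertation quantifies exactly this payment-accuracy tradeoff.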
113

A vision for global privacy bridges: Technical and legal measures for international data markets

Spiekermann-Hoff, Sarah, Novotny, Alexander January 2015 (has links) (PDF)
From the early days of the information economy, personal data has been its most valuable asset. Despite data protection laws and an acknowledged right to privacy, trading personal information has become a business equated with "trading oil". Most of this business is done without the knowledge and active, informed consent of the people concerned. But as data breaches and abuses are made public through the media, consumers react: they become irritated about companies' data handling practices, lose trust, exercise political pressure and start to protect their privacy with the help of technical tools. As a result, companies' Internet business models that are based on personal data are unsettled.

An open conflict is arising between business demands for data and a desire for privacy. As of 2015, no true answer is in sight for how to resolve this conflict. Technologists, economists and regulators are struggling to develop technical solutions and policies that meet businesses' demand for more data while still maintaining privacy. Yet most of the proposed solutions fail to account for market complexity and provide no pathway to technological and legal implementation; they lack a bigger vision for data use and privacy. To break this vicious cycle, we propose and test such a vision of a personal information market with privacy. We accumulate technical and legal measures that have been proposed by technical and legal scholars over the past two decades, and out of this existing knowledge we compose something new: a four-space market model for personal data.
114

Helping Smartphone Users Manage their Privacy through Nudges

Almuhimedi, Hazim 01 December 2017 (has links)
The two major smartphone platforms (Android and iOS) have more than two million mobile applications (apps) available from their respective app stores, and each store has seen more than 50 billion apps downloaded. Although apps provide desired functionality by accessing users’ personal information, they also access personal information for other purposes (e.g., advertising or profiling) that users may or may not desire. Users can exercise control over how apps access their personal information through permission managers. However, a permission manager alone might not be sufficient to help users manage their app privacy because: (1) privacy is typically a secondary task and thus users might not be motivated enough to take advantage of the permission manager’s functionality, and (2) even when using the permission manager, users often make suboptimal privacy decisions due to hurdles in decision making such as incomplete information, bounded rationality, and cognitive and behavioral biases.

To address these two challenges, the theoretical framework of this dissertation is the concept of nudges: “soft paternalistic” behavioral interventions that do not restrict choice but account for decision making hurdles. Specifically, I designed app privacy nudges that primarily address the incomplete information hurdle. The nudges aim to help users make better privacy decisions by (1) increasing users’ awareness of privacy risks associated with apps, and (2) temporarily making privacy the primary task to motivate users to review and adjust their app settings.

I evaluated app privacy nudges in three user studies. All three studies showed that app privacy nudges are indeed a promising approach to help users manage their privacy. App privacy nudges increased users’ awareness of privacy risks associated with apps on their phones, switched users’ attention to privacy management, and motivated users to review their app privacy settings. Additionally, the second study suggested that not all app privacy nudge contents equally help users manage their privacy. Rather, more effective nudge contents informed users of: (1) contexts in which their personal information has been accessed, (2) purposes for apps’ accessing their personal information, and (3) potential implications of secondary usage of users’ personal information. The third study showed that user engagement with nudges decreases as users receive repeated nudges. Nonetheless, the results of the third experiment also showed that users are more likely to engage with repeated nudges (1) if users have engaged with previous nudges, (2) if repeated nudges contain new information (e.g., additional apps, not shown in earlier nudges, that accessed sensitive resources), or (3) if the nudge contents of repeated nudges resonate with users.

The results of this dissertation suggest that mobile operating system providers should enrich their systems with app privacy nudges to assist users in managing their privacy. Additionally, the lessons learned in this dissertation may inform designing privacy nudges in emerging areas such as the Internet of Things.
115

The value required to negate consumer privacy concerns around location

Rosenberg, Dale Patrick 04 August 2012 (has links)
Privacy has been discussed throughout the ages as the world has developed and changed; however, privacy concerns have been reignited by the development of technology. One of these technologies, Location Based Services (LBS), together with how organisations are using it, is pushing consumers’ privacy boundaries. For this technology to become widely adopted, these privacy concerns need to be understood and addressed. The purpose of this research is therefore to examine whether consumers’ privacy concerns can be negated by consumers receiving a benefit that causes them to forego those concerns.

The research used scenarios to evaluate consumers’ comfort levels for four different intrusion levels and five different discounts offered. Given the nature of the scenarios, a repeated measures ANOVA design was used to allow for the analysis of each scenario, intrusion level and discount offered for each respondent.

It was found that although privacy concerns can be and were influenced by the offers made to the respondents, consumers have not yet gained a complete sense of comfort with the privacy boundaries that are being challenged. / Dissertation (MBA)--University of Pretoria, 2013. / Gordon Institute of Business Science (GIBS) / unrestricted
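
For readers unfamiliar with the analysis method, a repeated measures ANOVA of this shape can be run with off-the-shelf tools. Below is a minimal sketch using statsmodels' AnovaRM on synthetic data; the column names, factor levels and effect sizes are invented for illustration and are not the study's.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
subjects, intrusions, discounts = 30, 4, 5

# Long-format data: one comfort rating per subject per scenario,
# with a toy effect (comfort drops with intrusion, rises with discount).
rows = [
    {"respondent": s, "intrusion": i, "discount": d,
     "comfort": 5 - 0.8 * i + 0.4 * d + rng.normal(scale=1.0)}
    for s in range(subjects)
    for i in range(intrusions)
    for d in range(discounts)
]
df = pd.DataFrame(rows)

# Within-subject factors: every respondent rates every scenario.
res = AnovaRM(df, depvar="comfort", subject="respondent",
              within=["intrusion", "discount"]).fit()
print(res)
```

The design requires balanced data (each respondent contributes exactly one rating per intrusion-discount cell), which matches the scenario-based setup described above.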
116

Privacy Enhancing Techniques for Digital Identity Management

Hasini T Urala Liyanage Dona Gunasinghe (8479665) 23 July 2021 (has links)
Proving and verifying a user's identity information remotely has become a critical and challenging problem in the online world, with the increased number of sensitive services offered online. The digital identity management ecosystem has been evolving over the years to address this problem. However, the limitations of existing identity management approaches in handling this problem in a privacy preserving and secure manner have caused disruptions to users' digital lives and damage to the revenue and reputation of service providers.

In this dissertation, we analyze different areas of the identity management ecosystem in terms of privacy and security. In our analysis, we observe three critical aspects to take into account when identifying the privacy and security requirements to address in identity management scenarios, namely: i) protecting the privacy and security of users' digital identity and online transactions; ii) providing other stakeholders with assurance about user identity information and accountability of transactions; iii) preserving utility (e.g. accuracy, efficiency and deployability).

We show that existing authentication models and identity management protocols fail to address critical privacy and security requirements related to all three of these aspects, mainly because of inherent conflicts among the requirements. For example, existing authentication protocols, which aim to protect service providers from imposters by involving strong authentication factors such as biometrics, fail to protect the privacy and security of users' biometrics. Protecting an identity management system against counterfeits of identity assets, while preserving unlinkability of the transactions carried out using those assets, is another example of conflicting yet critical privacy and security requirements.

We demonstrate that careful combinations of cryptographic techniques and other technologies make it feasible to design privacy preserving identity management protocols that address critical and conflicting requirements related to the aforementioned three aspects. Certain techniques that we have developed for these protocols are independent contributions with applications beyond the domain of digital identity management. We validate our contributions by providing prototype implementations, experimental evaluations and security proofs.
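
As a toy illustration of the kind of cryptographic building block such protocols combine (this specific scheme is not claimed by the dissertation, and the attribute string is hypothetical), a hash commitment lets a user bind an identity attribute now and open it later, without the verifier learning the attribute in between:

```python
import hashlib
import secrets

def commit(attribute: bytes):
    """Commit to an attribute; return (commitment, opening nonce)."""
    nonce = secrets.token_bytes(32)
    digest = hashlib.sha256(nonce + attribute).digest()
    return digest, nonce

def verify(commitment: bytes, attribute: bytes, nonce: bytes) -> bool:
    """Check that the revealed attribute matches the earlier commitment."""
    return hashlib.sha256(nonce + attribute).digest() == commitment

c, r = commit(b"date_of_birth=1990-01-01")
assert verify(c, b"date_of_birth=1990-01-01", r)       # honest opening succeeds
assert not verify(c, b"date_of_birth=1985-01-01", r)   # forged attribute fails
```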
117

Privacy Preserving in Online Social Network Data Sharing and Publication

Gao, Tianchong 12 1900 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Following the trend of online data sharing and publishing, researchers have raised concerns about the privacy problem. Online Social Networks (OSNs), for example, often contain sensitive information about individuals. Therefore, anonymizing network data before releasing it becomes an important issue. This dissertation studies the privacy preservation problem from the perspectives of both attackers and defenders.

To defenders, preserving private information while keeping the utility of the published OSN is essential in data anonymization. At one extreme, the published data equals the original, which contains all the useful information but provides no privacy protection. At the other extreme, the published data is random, which gives the best privacy protection but is useless to third parties. Hence, defenders aim to explore multiple potential methods to strike a desirable tradeoff between privacy and utility in the published data. This dissertation starts from the most fundamental problem, the definition of utility and privacy, and builds on it through the design of the privacy criterion, the graph abstraction model, the utility metric, and the anonymization method to further address the balance between utility and privacy.

To attackers, extracting meaningful information from the collected data is essential in data de-anonymization. De-anonymization mechanisms exploit similarities between the attacker's prior knowledge and the published data to identify targets. This dissertation focuses on settings where the published data is periodic, anonymized, and does not cover the target persons. There are two thrusts in studying de-anonymization attacks: the design of the seed mapping method and the innovation of the generating-based attack method.

To conclude, this dissertation studies the online data privacy problem from both the defenders' and attackers' points of view and introduces privacy and utility enhancement mechanisms from different novel angles.
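
The privacy-utility tension described above can be made concrete with a toy perturbation experiment: rewiring more edges gives more protection against structural re-identification but degrades utility metrics such as clustering. The sketch below is illustrative only and is not the dissertation's anonymization method:

```python
import random
import networkx as nx

def perturb(graph: nx.Graph, fraction: float, seed: int = 0) -> nx.Graph:
    """Delete a fraction of edges and add back the same number of random non-edges."""
    rng = random.Random(seed)
    g = graph.copy()
    k = int(fraction * g.number_of_edges())
    g.remove_edges_from(rng.sample(list(g.edges()), k))
    nodes = list(g.nodes())
    while g.number_of_edges() < graph.number_of_edges():
        u, v = rng.sample(nodes, 2)
        if not g.has_edge(u, v):
            g.add_edge(u, v)
    return g

g = nx.karate_club_graph()
for frac in (0.0, 0.1, 0.3, 0.5):
    h = perturb(g, frac)
    # Utility proxy: how well the perturbed graph preserves clustering structure.
    print(f"rewired {frac:.0%}: avg clustering = {nx.average_clustering(h):.3f}")
```

As the rewired fraction grows, structural fingerprints that enable re-identification are destroyed, but so is the topology that third parties want to study; this is exactly the tradeoff the dissertation's criteria and models are designed to navigate.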
118

A PRIVACY-AWARE WEARABLE FRAMEWORK

Mohzary, Muhammad A. 05 December 2018 (has links)
No description available.
119

Enabling Accurate Analysis of Private Network Data

Hay, Michael 01 September 2010 (has links)
This dissertation addresses the challenge of enabling accurate analysis of network data while ensuring the protection of network participants' privacy. This is an important problem: massive amounts of data are being collected (Facebook activity, email correspondence, cell phone records), there is huge interest in analyzing the data, but the data is not being shared due to concerns about privacy. Despite much research in privacy-preserving data analysis, existing technologies fail to provide a solution because they were designed for tables, not networks, and cannot be easily adapted to handle the complexities of network data.

We develop several technologies that advance us toward our goal. First, we develop a framework for assessing the risk of publishing a network that has been "anonymized." Using this framework, we show that only a small amount of background knowledge about local network structure is needed to re-identify an "anonymous" individual. This motivates our second contribution: an algorithm that transforms the structure of the network to provably lower re-identification risk. In comparison with other algorithms, we show that our approach more accurately preserves important features of the network topology. Finally, we consider an alternative paradigm, in which the analyst can analyze private data through a carefully controlled query interface. We show that the degree sequence of a network can be accurately estimated under strong guarantees of privacy.
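
The final contribution, estimating a degree sequence under strong privacy guarantees, can be sketched with the Laplace mechanism plus a simple consistency step. This is a simplified illustration assuming edge-level differential privacy (adding or removing one edge changes two degrees by one, giving L1 sensitivity 2); re-sorting the noisy values stands in for the more sophisticated constrained inference used in this line of work.

```python
import numpy as np
import networkx as nx

def dp_degree_sequence(graph: nx.Graph, eps: float, seed: int = 0) -> np.ndarray:
    """Release a sorted degree sequence under (toy) edge-level eps-DP.

    One edge changes two degrees by 1 each, so the degree-sequence query
    has L1 sensitivity 2 and Laplace noise of scale 2/eps suffices."""
    rng = np.random.default_rng(seed)
    degrees = np.sort([d for _, d in graph.degree()])
    noisy = degrees + rng.laplace(scale=2.0 / eps, size=len(degrees))
    # Consistency step: a sorted degree sequence must be non-decreasing,
    # so re-sort the noisy values (a crude stand-in for isotonic regression).
    return np.sort(noisy)

g = nx.barabasi_albert_graph(200, 3, seed=1)
true_seq = np.sort([d for _, d in g.degree()])
for eps in (0.1, 1.0):
    est = dp_degree_sequence(g, eps)
    print(f"eps={eps}: mean abs error = {np.mean(np.abs(est - true_seq)):.2f}")
```

The post-processing step costs nothing in privacy (it touches only the noisy output) while reducing error, which is the key idea behind accurate estimation through a controlled query interface.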
120

DIFFERENTIAL PRIVACY IN DISTRIBUTED SETTINGS

Zitao Li (14135316) 18 November 2022 (has links)
Data is considered the "new oil" of the information society and digital economy. While many commercial activities and government decisions are based on data, the public raises growing concerns about privacy leakage when their private data are collected and used. In this dissertation, we investigate the privacy risks in settings where the data are distributed across multiple data holders and there is only an untrusted central server. We provide solutions for several problems under this setting with a security notion called differential privacy (DP). Our solutions guarantee that there is only limited and controllable privacy leakage from the data holder, while the utility of the final results, such as model prediction accuracy, remains comparable to that of the non-private algorithms.

First, we investigate the problem of estimating the distribution over a numerical domain while satisfying local differential privacy (LDP). Our protocol prevents privacy leakage in the data collection phase, in which an untrusted data aggregator (or a server) wants to learn the distribution of private numerical data among all users. The protocol consists of 1) a new reporting mechanism called the square wave (SW) mechanism, which randomizes the user inputs before sharing them with the aggregator; and 2) an Expectation Maximization with Smoothing (EMS) algorithm, which is applied to aggregated histograms from the SW mechanism to estimate the original distributions.

Second, we study the matrix factorization problem in three federated learning settings with an untrusted server, i.e., the vertical, horizontal, and local federated learning settings. We propose a generic algorithmic framework for solving the problem in all three settings, and show how to adapt the algorithm into differentially private versions to prevent privacy leakage in the training and publishing stages.

Finally, we propose an algorithm for solving the k-means clustering problem in vertical federated learning (VFL). A big challenge in VFL is the lack of a global view of each data point. To overcome this challenge, we propose a lightweight and differentially private set intersection cardinality estimation algorithm based on the Flajolet-Martin (FM) sketch to convey the weight information of the synopsis points. We provide theoretical utility analysis for the cardinality estimation algorithm and further refine it for better empirical performance.
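
The Flajolet-Martin sketch underlying the last contribution estimates a distinct count from the longest run of trailing zero bits seen among hashed items. The sketch below is a minimal, non-private illustration (the dissertation's version adds differential privacy and weight information on top); the 0.77351 correction factor is the classic FM constant, and the estimate is coarse by design.

```python
import hashlib

def trailing_zeros(x: int) -> int:
    """Number of trailing zero bits in x (x > 0)."""
    return (x & -x).bit_length() - 1

def fm_estimate(items, num_sketches: int = 64) -> float:
    """Flajolet-Martin distinct-count estimate, averaged over
    num_sketches independent salted hash functions."""
    max_r = [0] * num_sketches
    for item in items:
        for k in range(num_sketches):
            h = hashlib.sha256(f"{k}:{item}".encode()).digest()
            v = int.from_bytes(h[:8], "big") | (1 << 63)  # keep v nonzero
            max_r[k] = max(max_r[k], trailing_zeros(v))
    avg_r = sum(max_r) / num_sketches
    return (2 ** avg_r) / 0.77351  # approximate FM bias correction

# ~1000 distinct items, each seen several times; duplicates do not
# inflate the estimate, which is what makes the sketch useful for
# set intersection cardinality.
stream = [i % 1000 for i in range(5000)]
print(f"estimate: {fm_estimate(stream):.0f}  (true distinct count: 1000)")
```

Because the sketch depends only on maxima of hashed values, it is insensitive to duplicates and cheap to merge across parties, which is what makes it attractive for conveying cardinality information in VFL without exchanging raw records.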
