  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

An architecture for identity management

Richardson, Brian Robert 06 July 2005
Personalization of on-line content by on-line businesses can improve a user's experience and increase a business's chance of making a sale, but with stricter privacy legislation and Internet users' increasing concerns about privacy, businesses need to ensure they do not violate laws or frighten away potential customers. This thesis describes the design of the proposed Identity Management Architecture (IMA). The IMA system allows users to decide on a per-business basis what personal information is provided, gives users greater access to their personal information held by on-line businesses, and does not rely on a trusted third party for management of personal information. To demonstrate the design and functionality of the IMA system, a prototype implementation has been built, consisting of the IMA client application and an example participating business that exercises the features of the IMA client. To evaluate the design of the IMA system, it was compared with three high-profile identity management systems: Microsoft .NET Passport, the Liberty Alliance Project, and Microsoft Infocards. Each system was compared on the access to personal information it gives users and on the areas of privacy legislation compliance it improves for a participating business.
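The abstract describes the IMA client only at the architectural level, but the per-business disclosure model is easy to picture as a client-side data structure. The sketch below is purely illustrative, with invented class and attribute names rather than anything from the IMA prototype: the profile stays with the user, each business gets its own release policy, and a business only ever receives the attributes the user has approved for it.

    from dataclasses import dataclass, field

    @dataclass
    class ReleasePolicy:
        """Hypothetical per-business policy: which profile attributes may be disclosed."""
        business_id: str
        allowed_attributes: set = field(default_factory=set)

    class IdentityStore:
        """Minimal sketch of a client-side identity store (not the IMA prototype).

        Personal data stays on the user's machine; no trusted third party
        holds the full profile, and each business sees only the attributes
        the user has explicitly approved for that business.
        """
        def __init__(self, attributes):
            self._attributes = dict(attributes)  # e.g. {"name": ..., "email": ...}
            self._policies = {}                  # business_id -> ReleasePolicy

        def grant(self, business_id, attribute):
            policy = self._policies.setdefault(business_id, ReleasePolicy(business_id))
            policy.allowed_attributes.add(attribute)

        def release(self, business_id):
            """Return only the attributes approved for this business."""
            policy = self._policies.get(business_id)
            if policy is None:
                return {}
            return {k: v for k, v in self._attributes.items()
                    if k in policy.allowed_attributes}

    # The same profile yields a different view for each participating business.
    store = IdentityStore({"name": "Alice", "email": "alice@example.org", "phone": "555-0100"})
    store.grant("shop.example", "email")
    store.grant("bank.example", "name")
    store.grant("bank.example", "phone")
    print(store.release("shop.example"))  # {'email': 'alice@example.org'}
    print(store.release("bank.example"))  # {'name': 'Alice', 'phone': '555-0100'}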
52

The Common Law Right to Privacy

Lilles, Jaan 15 February 2010 (has links)
This paper justifies and delineates a common law right to privacy. The first part of the paper reviews the current state of the law of privacy. The second part defines privacy by distinguishing privacy rights from those otherwise protected by the common law. The paper argues that the appropriate organizing principle behind the legal concept of privacy is the idea of control over one’s interactions with others. The third part argues that protection of privacy at common law is justified both pursuant to the demands of the Charter and with a theoretical understanding of private law based on a Kantian notion of Right. The final part argues that such an analysis determines the substantive nature of the protection that should be afforded at common law, namely that privacy should be protected from both intentional and negligent interference.
54

Improving Tor using a TCP-over-DTLS Tunnel

Reardon, Joel 09 September 1923 (has links)
The Tor network gives anonymity to Internet users by relaying their traffic around the world through a series of routers. This incurs latency, and this thesis first explores where that latency occurs. Experiments rule out the latency induced by routing traffic and by computation, showing that a substantial component is caused by delay on the communication path; we determine that congestion control is responsible. Tor multiplexes multiple streams of data over a single TCP connection. This is not a wise use of TCP, and it results in the unfair application of congestion control. We illustrate an example of this occurring on a Tor node on the live network and show how packet dropping and reordering cause interference between the multiplexed streams. Our solution is to use a TCP-over-DTLS (Datagram Transport Layer Security) transport between routers and to give each stream of data its own TCP connection. We present the design of our proposal and the details of its implementation. Finally, we perform experiments on the implementation to show that it resolves the multiplexing issues uncovered in our performance analysis. The future work outlines a number of steps towards optimizing and improving this work, along with some tangential ideas discovered during the research. Additionally, the open-source software projects latency_proxy and libspe, which were designed for our purposes but programmed for universal applicability, are discussed.
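The interference described above comes from forcing every stream through one in-order, congestion-controlled pipe, so a single dropped cell stalls cells belonging to unrelated streams until it is retransmitted. The toy simulation below is not Tor code, and its drop probability and retransmission delay are made-up parameters; it only illustrates why moving from one shared TCP connection to one connection per stream removes this head-of-line blocking.

    import random

    def average_delay(per_stream_channels, num_streams=3, cells_per_stream=50,
                      drop_prob=0.1, retransmit_delay=10, seed=1):
        """Toy model of cross-stream interference (not Tor's actual transport).

        Cells from several streams are sent round-robin.  Over a single shared
        in-order channel (one TCP connection), a dropped cell holds back every
        later cell, whatever stream it belongs to, until it is retransmitted.
        With one in-order channel per stream, a drop delays only that stream.
        """
        rng = random.Random(seed)
        num_channels = num_streams if per_stream_channels else 1
        blocked_until = [0.0] * num_channels      # earliest time each channel can deliver
        delays = []
        for send_time in range(cells_per_stream):
            for stream in range(num_streams):
                channel = stream if per_stream_channels else 0
                arrival = max(send_time, blocked_until[channel])
                if rng.random() < drop_prob:
                    arrival += retransmit_delay   # wait for the retransmission
                blocked_until[channel] = arrival  # in-order: no overtaking on a channel
                delays.append(arrival - send_time)
        return sum(delays) / len(delays)

    # Same drop pattern (same seed) in both runs; only the channel layout differs.
    print("single shared connection, mean delay:", average_delay(per_stream_channels=False))
    print("one connection per stream, mean delay:", average_delay(per_stream_channels=True))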
57

Privacy-preserving data mining

Zhang, Nan 15 May 2009 (has links)
In privacy-preserving data mining, we address issues related to extracting knowledge from large amounts of data without violating the privacy of the data owners. In this study, we first introduce an integrated baseline architecture, design principles, and implementation techniques for privacy-preserving data mining systems. We then discuss the key components of such systems, which comprise three protocols: data collection, inference control, and information sharing. We present and compare strategies for realizing these protocols. Theoretical analysis and experimental evaluation show that our protocols can generate accurate data mining models while protecting the privacy of the data being mined.
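The abstract does not spell out the three protocols, so the sketch below is a generic illustration of privacy-preserving data collection rather than the thesis's own scheme: classic randomized response, in which each data owner perturbs a sensitive bit before submitting it, and the miner inverts the known perturbation to recover an accurate aggregate without learning any individual's true value.

    import random

    def randomized_response(true_bit, p_truth=0.75, rng=random):
        """Report the true bit with probability p_truth, otherwise a uniformly
        random bit, so any single report is plausibly deniable."""
        return true_bit if rng.random() < p_truth else rng.randint(0, 1)

    def estimate_proportion(reports, p_truth=0.75):
        """Invert the perturbation: E[report] = p_truth*pi + (1-p_truth)*0.5."""
        observed = sum(reports) / len(reports)
        return (observed - (1 - p_truth) * 0.5) / p_truth

    rng = random.Random(0)
    true_bits = [1 if rng.random() < 0.3 else 0 for _ in range(100_000)]  # ~30% hold the sensitive trait
    reports = [randomized_response(b, rng=rng) for b in true_bits]
    print(round(estimate_proportion(reports), 3))  # close to 0.3, yet no single report is certain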
58

Fusion center privacy policies: does one size fit all?

Harper, Jennifer L. January 2009 (has links) (PDF)
Thesis (M.A. in Security Studies (Homeland Security and Defense)), Naval Postgraduate School, December 2009. Thesis Advisor: Rollins, John. Second Reader: Petrie, Michael. Description based on title screen as viewed on January 26, 2010. Author's subject terms: fusion center, privacy policy, civil liberties, information and analysis center, New Hampshire. Includes bibliographical references (p. 91-96). Also available in print.
59

Towards a privacy-preserving platform for apps

Lee, Sangmin 09 February 2015 (has links)
On mobile platforms such as iOS and Android, Web browsers such as Google Chrome, and even smart televisions such as Google TV or Roku, hundreds of thousands of software apps provide services to users. Their functionality often requires access to potentially sensitive user data (e.g., contact lists, passwords, photos), sensor inputs (e.g., camera, microphone, GPS), and/or information about user behavior. Most apps use this data responsibly, but there has also been evidence of privacy violations. As a result, individuals must carefully consider what apps to install and corporations often restrict what apps employees can install on their devices, to prevent an untrusted app—or a cloud provider that an app communicates with—from leaking personal data and proprietary information.

There is an inherent trade-off between users’ privacy and apps’ functionality. An app with no access to user data cannot leak anything sensitive, but many apps cannot function without such data. A password management app needs access to passwords, an audio transcription app needs access to the recordings of users’ speech, and a navigation app needs users’ location. In this dissertation, we present two app platform designs, πBox and CleanRoom, that strike a useful balance between users’ privacy and apps’ functional needs, thus shifting much of the responsibility for protecting privacy from the app and its users to the platform itself.

πBox is a new app platform that prevents apps from misusing information about their users. To achieve this, πBox deploys (1) a sandbox that spans the user’s device and the cloud, (2) specialized storage and communication channels that enable common app functionality, and (3) an adaptation of recent theoretical algorithms for differential privacy under continual observation. We describe a prototype implementation of πBox and show how it enables a wide range of useful apps with minimal performance overhead and without sacrificing user privacy. In particular, πBox develops the aforementioned three techniques under the assumption of limited sharing of personal data.

CleanRoom extends πBox and is designed to protect confidentiality in a "Bring Your Own Apps" (BYOA) world in which employees use their own untrusted third-party apps to create, edit, and share corporate data. CleanRoom’s core guarantee is privacy-preserving collaboration: CleanRoom enables employees to work together on shared documents while ensuring that the documents’ owners—not the app accessing the document—control who can access and collaborate on the document. To achieve this guarantee, CleanRoom partitions an app into three parts, each of which implements a different function of the app (data navigation, data manipulation, and app settings), and controls communication between these parts. We show that CleanRoom accommodates a broad range of apps, preserves the confidentiality of the data that these apps access, and incurs insignificant overhead (e.g., 0.11 ms of overhead per client-server request).

Both πBox and CleanRoom use differential privacy for apps to provide feedback to their publisher. This dissertation explores how to adapt differential privacy to be useful for app platforms. In particular, we investigate an adaptation of recent theoretical algorithms for differential privacy under continual observation and several techniques to leverage it for useful features in an app environment including advertising, app performance feedback, and error reporting.
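For the "differential privacy under continual observation" component, the standard primitive in the literature is the binary-tree counter of Chan, Shi, and Song (and Dwork et al.). The sketch below implements that textbook mechanism, not πBox's adaptation of it, to show how a platform could release a running, noisy count of app events (ad clicks, crashes, and so on) to a publisher without exposing any individual user's contribution.

    import math, random

    class BinaryMechanismCounter:
        """Textbook binary-tree counter for differential privacy under continual
        observation (Chan, Shi and Song 2011).  Not the adaptation described in
        the dissertation; only a sketch of the underlying primitive.
        """
        def __init__(self, epsilon, horizon, rng=None):
            self.rng = rng or random.Random()
            self.levels = max(1, math.ceil(math.log2(horizon))) + 1
            self.scale = self.levels / epsilon    # Laplace scale per partial sum
            self.alpha = [0.0] * self.levels      # exact partial sums ("p-sums")
            self.alpha_hat = [0.0] * self.levels  # noisy partial sums
            self.t = 0

        def _laplace(self):
            u = self.rng.random() - 0.5
            return -self.scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

        def step(self, x):
            """Ingest one value x in {0, 1}; return a noisy running total."""
            self.t += 1
            i = (self.t & -self.t).bit_length() - 1   # lowest set bit of t
            self.alpha[i] = sum(self.alpha[:i]) + x   # merge the finished p-sums
            for j in range(i):
                self.alpha[j] = self.alpha_hat[j] = 0.0
            self.alpha_hat[i] = self.alpha[i] + self._laplace()
            # The released count is the sum of noisy p-sums on the set bits of t.
            return sum(self.alpha_hat[j] for j in range(self.levels)
                       if self.t & (1 << j))

    # Example: 1000 events, roughly 40% of which are the event being counted.
    rng = random.Random(0)
    counter = BinaryMechanismCounter(epsilon=1.0, horizon=1000, rng=rng)
    noisy_counts = [counter.step(1 if rng.random() < 0.4 else 0) for _ in range(1000)]
    print(round(noisy_counts[-1]))  # near the true total (about 400), never exact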
60

Strategic behavior and database privacy

Krehbiel, Sara 21 September 2015 (has links)
This dissertation focuses on strategic behavior and database privacy. First, we look at strategic behavior as a tool for distributed computation. We blend the perspectives of game theory and mechanism design in proposals for distributed solutions to the classical set cover optimization problem. We endow agents with natural individual incentives, and we show that centrally broadcasting non-binding advice effectively guides the system to a near-optimal state while keeping the original incentive structure intact.

We next turn to the database privacy setting, in which an analyst wishes to learn something from a database, but the individuals contributing the data want to protect their personal information. The notion of differential privacy allows us to do both by obscuring true answers to statistical queries with a small amount of noise. The ability to conduct a task differentially privately depends on whether the amount of noise required for privacy still permits statistical accuracy. We show that it is possible to give a satisfying tradeoff between privacy and accuracy for a computational problem called independent component analysis (ICA), which seeks to decompose an observed signal into its underlying independent source variables. We do this by releasing a perturbation of a compact representation of the observed data. This approach allows us to preserve individual privacy while releasing information that can be used to reconstruct the underlying relationship between the observed variables.

In almost all of the differential privacy literature, the privacy requirement must be specified before looking at the data, and the noise added for privacy limits the statistical utility of the sanitized data. The third part of this dissertation ties together privacy and strategic behavior to answer the question of how to determine an appropriate level of privacy when data contributors prefer more privacy but an analyst prefers more accuracy. The proposed solution to this problem views privacy as a public good and uses market design techniques to collect these preferences and then privately select and enforce a socially efficient level of privacy.
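The "perturbation of a compact representation" step can be illustrated generically. The sketch below is a standard Laplace-noise release of clipped second moments, not the ICA mechanism from the dissertation: rows are clipped to a norm bound, the empirical second-moment matrix is formed, and symmetric Laplace noise calibrated to the resulting sensitivity is added, so the released matrix still reveals the relationships between the observed variables while masking any single contributor.

    import numpy as np

    def noisy_second_moments(data, epsilon, bound=1.0, rng=None):
        """Release a perturbed compact representation: the empirical second-moment
        matrix X^T X / n plus symmetric Laplace noise.

        Rows are clipped to L2 norm <= bound, so replacing one row changes each
        entry by at most 2*bound**2/n.  Splitting the budget across the
        d*(d+1)/2 distinct entries and adding Laplace noise at that sensitivity
        gives epsilon-differential privacy for the whole matrix.  A generic
        sketch, not the mechanism from the dissertation.
        """
        rng = rng or np.random.default_rng(0)
        X = np.asarray(data, dtype=float)
        n, d = X.shape
        norms = np.linalg.norm(X, axis=1, keepdims=True)
        X = X * np.minimum(1.0, bound / np.maximum(norms, 1e-12))  # clip each row
        M = X.T @ X / n
        distinct = d * (d + 1) // 2
        scale = (2 * bound**2 / n) * distinct / epsilon            # per-entry Laplace scale
        noise = np.triu(rng.laplace(scale=scale, size=(d, d)))
        noise = noise + np.triu(noise, 1).T                        # keep the release symmetric
        return M + noise

    # Two correlated signals; the analyst sees only the noisy moment matrix,
    # which still exposes the correlation structure needed for further analysis.
    rng = np.random.default_rng(1)
    sources = rng.laplace(size=(5000, 2))
    mixed = sources @ np.array([[1.0, 0.4], [0.2, 1.0]])
    print(np.round(noisy_second_moments(mixed, epsilon=1.0, bound=3.0), 2))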
