61. Towards a privacy-preserving platform for apps. Lee, Sangmin, 09 February 2015.
On mobile platforms such as iOS and Android, Web browsers such as Google Chrome, and even smart televisions such as Google TV or Roku, hundreds of thousands of software apps provide services to users. Their functionality often requires access to potentially sensitive user data (e.g., contact lists, passwords, photos), sensor inputs (e.g., camera, microphone, GPS), and/or information about user behavior. Most apps use this data responsibly, but there has also been evidence of privacy violations. As a result, individuals must carefully consider what apps to install and corporations often restrict what apps employees can install on their devices, to prevent an untrusted app—or a cloud provider that an app communicates with—from leaking personal data and proprietary information. There is an inherent trade-off between users’ privacy and apps’ functionality. An app with no access to user data cannot leak anything sensitive, but many apps cannot function without such data. A password management app needs access to passwords, an audio transcription app needs access to the recordings of users’ speech, and a navigation app needs users’ location. In this dissertation, we present two app platform designs, πBox and CleanRoom, that strike a useful balance between users’ privacy and apps’ functional needs, thus shifting much of the responsibility for protecting privacy from the app and its users to the platform itself. πBox is a new app platform that prevents apps from misusing information about their users. To achieve this, πBox deploys (1) a sandbox that spans the user’s device and the cloud, (2) specialized storage and communication channels that enable common app functionality, and (3) an adaptation of recent theoretical algorithms for differential privacy under continual observation. We describe a prototype implementation of πBox and show how it enables a wide range of useful apps with minimal performance overhead and without sacrificing user privacy. 
In particular, πBox develops the aforementioned three techniques under the assumption of limited sharing of personal data. CleanRoom extends πBox and is designed to protect confidentiality in a "Bring Your Own Apps" (BYOA) world in which employees use their own untrusted third-party apps to create, edit, and share corporate data. CleanRoom’s core guarantee is privacy-preserving collaboration: CleanRoom enables employees to work together on shared documents while ensuring that the documents’ owners—not the app accessing the document—control who can access and collaborate on the document. To achieve this guarantee, CleanRoom partitions an app into three parts, each of which implements a different function of the app (data navigation, data manipulation, and app settings), and controls communication between these parts. We show that CleanRoom accommodates a broad range of apps, preserves the confidentiality of the data that these apps access, and incurs insignificant overhead (e.g., 0.11 ms of overhead per client-server request). Both πBox and CleanRoom use differential privacy for apps to provide feedback to their publisher. This dissertation explores how to adapt differential privacy to be useful for app platforms. In particular, we investigate an adaptation of recent theoretical algorithms for differential privacy under continual observation and several techniques to leverage it for useful features in an app environment including advertising, app performance feedback, and error reporting.
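The "differential privacy under continual observation" mentioned in this abstract is typically realized with the classical binary-tree (dyadic) counter mechanism, which releases a running count while adding only logarithmically many noise terms per release. The sketch below is illustrative only: the function name, parameters, and noise allocation are assumptions, not taken from the dissertation, and πBox's actual algorithms may differ.

```python
import math
import random

def tree_counter(stream, epsilon):
    """Release a running count of a 0/1 stream under continual observation
    (binary-tree mechanism sketch). Each dyadic interval gets one noisy
    partial sum; every prefix is assembled from at most log2(T) nodes,
    so each bit of the stream affects only log2(T) noisy values."""
    T = len(stream)
    levels = max(1, math.ceil(math.log2(T + 1)))
    scale = levels / epsilon          # each bit touches `levels` nodes
    noisy = {}                        # (level, index) -> noisy partial sum

    def node(level, index):
        if (level, index) not in noisy:
            lo, hi = index * 2 ** level, (index + 1) * 2 ** level
            true = sum(stream[lo:min(hi, T)])
            # Laplace(scale) noise as a difference of two exponentials
            noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
            noisy[(level, index)] = true + noise
        return noisy[(level, index)]

    releases = []
    for t in range(1, T + 1):
        # decompose [0, t) into dyadic intervals, largest first
        total, pos = 0.0, 0
        while pos < t:
            level = levels
            while pos % (2 ** level) != 0 or pos + 2 ** level > t:
                level -= 1
            total += node(level, pos // 2 ** level)
            pos += 2 ** level
        releases.append(total)
    return releases
```

With a large epsilon the releases track the true prefix sums closely; shrinking epsilon increases the noise on every node, trading accuracy for privacy.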
62. Strategic behavior and database privacy. Krehbiel, Sara, 21 September 2015.
This dissertation focuses on strategic behavior and database privacy. First, we look at strategic behavior as a tool for distributed computation. We blend the perspectives of game theory and mechanism design in proposals for distributed solutions to the classical set cover optimization problem. We endow agents with natural individual incentives, and we show that centrally broadcasting non-binding advice effectively guides the system to a near-optimal state while keeping the original incentive structure intact.
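The "near-optimal state" referenced above is the kind of solution a central advisor could compute and broadcast as non-binding advice. A standard way to compute it is the classical greedy approximation for set cover, sketched below; the function name and its centralized form are illustrative assumptions, not the dissertation's distributed mechanism.

```python
def greedy_set_cover(universe, subsets):
    """Classical greedy approximation for set cover: repeatedly pick the
    subset covering the most still-uncovered elements. Achieves an
    O(ln n)-approximation of the optimal cover."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = max(subsets, key=lambda s: len(uncovered & set(s)))
        if not uncovered & set(best):
            raise ValueError("universe not coverable by given subsets")
        chosen.append(set(best))
        uncovered -= set(best)
    return chosen
```

For example, covering {1, 2, 3, 4, 5} from [{1, 2, 3}, {2, 4}, {3, 4}, {4, 5}] picks {1, 2, 3} and then {4, 5}.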
We next turn to the database privacy setting, in which an analyst wishes to learn something from a database, but the individuals contributing the data want to protect their personal information. The notion of differential privacy allows us to do both by obscuring true answers to statistical queries with a small amount of noise. Whether a task can be carried out with differential privacy depends on whether the amount of noise required for privacy still permits statistical accuracy.
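The noise mentioned above is, in the standard formulation, drawn from a Laplace distribution whose scale is the query's sensitivity divided by the privacy parameter epsilon. A minimal sketch of this standard Laplace mechanism (names and the example database are illustrative, not from the dissertation):

```python
import random

def laplace_mechanism(true_answer, sensitivity, epsilon):
    """Answer a statistical query with epsilon-differential privacy by
    adding Laplace(sensitivity / epsilon) noise, sampled here as the
    difference of two exponential variables."""
    scale = sensitivity / epsilon
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_answer + noise

# Counting query: how many individuals have the attribute? Adding or
# removing one person changes the count by at most 1, so sensitivity = 1.
database = [1, 0, 1, 1, 0, 1]   # one bit per individual
noisy_count = laplace_mechanism(sum(database), sensitivity=1.0, epsilon=0.5)
```

Smaller epsilon means stronger privacy but a larger noise scale, which is exactly the privacy/accuracy tension the abstract describes.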
We show that it is possible to give a satisfying tradeoff between privacy and accuracy for a computational problem called independent component analysis (ICA), which seeks to decompose an observed signal into its underlying independent source variables. We do this by releasing a perturbation of a compact representation of the observed data. This approach allows us to preserve individual privacy while releasing information that can be used to reconstruct the underlying relationship between the observed variables.
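One natural "compact representation of the observed data" is the empirical covariance (or, for ICA proper, higher-order cumulants) of the observed signal. The sketch below perturbs a covariance matrix entrywise; the clipping bound, sensitivity estimate, and budget split are rough illustrative assumptions, not the dissertation's actual analysis.

```python
import random

def noisy_covariance(data, epsilon, bound=1.0):
    """Release a perturbed empirical covariance matrix, sketching the
    style of private release used for ICA-like decompositions. Assumes
    each sample's coordinates are clipped to [-bound, bound]; each entry
    gets Laplace noise calibrated to a rough bound on one sample's
    contribution, with the budget split across all d*d entries."""
    n, d = len(data), len(data[0])
    clip = [[max(-bound, min(bound, x)) for x in row] for row in data]
    sens = 2 * bound * bound / n       # one sample moves an entry by <= 2b^2/n
    scale = sens * d * d / epsilon     # basic composition over d*d entries
    cov = [[0.0] * d for _ in range(d)]
    for i in range(d):
        for j in range(d):
            true = sum(row[i] * row[j] for row in clip) / n
            noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
            cov[i][j] = true + noise
    return cov
```

The analyst can then run the (non-private) decomposition on the released matrix, so only the perturbed summary, never the raw samples, leaves the database.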
In almost all of the differential privacy literature, the privacy requirement must be specified before looking at the data, and the noise added for privacy limits the statistical utility of the sanitized data. The third part of this dissertation ties together privacy and strategic behavior to answer the question of how to determine an appropriate level of privacy when data contributors prefer more privacy but an analyst prefers more accuracy. The proposed solution to this problem views privacy as a public good and uses market design techniques to collect these preferences and then privately select and enforce a socially efficient level of privacy.
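Treating privacy as a public good means one epsilon is chosen for everyone, trading the analyst's value for accuracy against the contributors' aggregate privacy cost. The toy welfare maximization below illustrates that framing only; the linear cost model, the candidate grid, and the function names are assumptions, and the dissertation's actual mechanism additionally elicits these preferences truthfully and enforces the outcome privately.

```python
def efficient_privacy_level(privacy_costs, accuracy_value, candidates):
    """Pick the epsilon maximizing a toy social welfare: the analyst's
    value for accuracy (increasing in epsilon) minus the sum of the
    contributors' privacy costs, modeled here as cost_i(eps) = c_i * eps."""
    def welfare(eps):
        return accuracy_value(eps) - sum(c * eps for c in privacy_costs)
    return max(candidates, key=welfare)
```

For instance, with two contributors of unit cost and a concave accuracy value 6e - e^2, welfare 4e - e^2 peaks at epsilon = 2.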
63. Invasion of territorial and personal space as perceived by the surgical patient. Donahue, Donna Mae, January 1980.
No description available.
64. Insertion of Privacy Services in Privacy Architecture for Web Services (PAWS). Bryn, Ajith Winston, 20 March 2014.
The huge growth of the Internet is due to the large number of websites and web services through which information is easily accessible. E-commerce and e-services obtain much private data from users for various reasons, such as advertising and marketing. Collection, storage, and usage of private data are subject to various standards, privacy laws, and regulations. To adhere to these legal requirements, many privacy services, such as secure data transmission, authentication, notice, and consent, are required. Inclusion of these required privacy services early in the software development life cycle is preferred and advocated, but not fully adhered to. Inclusion of privacy services in legacy software and in currently developed software is required. We describe a software architecture and a system for automatic inclusion of privacy services, under the supervision of a privacy expert, into web pages after the development phase of the Software Development Life Cycle. This will help organizations adhere to standards, privacy laws, and regulations when collecting private data online from their clients. We also describe a prototype that we have developed as a proof of concept to demonstrate the feasibility of our approach.
65. Privacy Monitoring and Enforcement in a Web Service Architecture (WSA). Tong, Kai, 03 May 2012.
The growth of online activities in our daily lives has led to substantially increased attention on how organizations and their computer systems handle Personal Information (PI). Independently, the wide adoption of Web Service Architecture (WSA) for software integration creates an opportunity to facilitate support for privacy by monitoring the use of PI by web services and enforcing applicable privacy policies.
This thesis designs an agent for privacy monitoring and enforcement in a WSA environment and creates a prototype as a proof of concept. The agent is based on a specific multi-agent architecture for privacy compliance. The design of the agent has led to extension of the architecture to bring out its full potential in monitoring PI flows and enforcing privacy policies in a WSA environment. The evaluation of the prototype has led to suggestions on its implementation for an operational environment.
66. Workplace discipline and the right to privacy. Mookodi, Masego Magdaline, 2004.
No abstract available. Thesis (LL.M.), University of KwaZulu-Natal, Durban, 2004.
67. Offshore financial centres in a globalised economy: the sociological dimensions of bank confidentiality in Monaco. Donaghy, Matthew Paul, 1999.
No description available.
68. Privacy-Preserving Multi-Quality Charging in V2G Network. He, Miao, 05 September 2014.
The vehicle-to-grid (V2G) network, which provides electricity charging service to electric vehicles (EVs), is an essential part of the smart grid (SG). It can not only effectively reduce greenhouse gas emissions but also significantly enhance the efficiency of the power grid. Due to limited local electricity resources, the quality of charging service can hardly be guaranteed for every EV in the V2G network. To this end, multi-quality charging is introduced to provide quality-guaranteed service (QGS) to the qualified EVs and best-effort service (BES) to the other EVs. To perform multi-quality charging, an evaluation of the EV's attributes is necessary to determine which level of charging service can be offered to the EV. However, the EV owner's privacy, such as real identity, lifestyle, location, and sensitive information in the attributes, may be violated during the evaluation and authentication. In this thesis, a privacy-preserving multi-quality charging (PMQC) scheme for the V2G network is proposed to evaluate the EV's attributes, authenticate its service eligibility, and generate its bill without revealing the EV's private information. Specifically, by adopting ciphertext-policy attribute-based encryption (CP-ABE), the EV can be evaluated for the proper charging service without disclosing its attribute privacy. By utilizing group signatures, the EV's real identity is kept confidential during authentication and bill generation. By hiding the EV's real identity, the EV owner's lifestyle privacy and location privacy are also preserved. Security analysis demonstrates that PMQC can achieve the EV's privacy preservation, fine-grained access control on the EVs for QGS, traceability of the EV's real identity, and secure revocation of the EV's service eligibility. Performance evaluation results show that PMQC can achieve higher efficiency in authentication and verification compared with other schemes in terms of computation overhead.
Based on PMQC, the EV's computation overhead and storage overhead can be further reduced in the extended privacy-preserving multi-quality charging (ePMQC) scheme.
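In CP-ABE, access is governed by a policy tree of AND/OR gates over attributes, and in a scheme like PMQC the check is enforced cryptographically, so the evaluator never sees the EV's raw attributes. The toy evaluator below shows only the policy logic, not the cryptography; the attribute names and policy shape are invented for illustration.

```python
def satisfies_policy(attributes, policy):
    """Toy evaluation of a CP-ABE-style access policy: a policy is either
    an attribute name, or a tuple ("and", p1, p2, ...) / ("or", p1, p2, ...)
    over sub-policies. Real CP-ABE enforces this check inside decryption,
    so no party learns which attributes the vehicle actually holds."""
    if isinstance(policy, str):
        return policy in attributes
    op, *children = policy
    if op == "and":
        return all(satisfies_policy(attributes, c) for c in children)
    if op == "or":
        return any(satisfies_policy(attributes, c) for c in children)
    raise ValueError(f"unknown gate: {op}")

# Hypothetical QGS eligibility policy: registered AND (premium OR fleet).
qgs_policy = ("and", "registered", ("or", "premium-plan", "fleet-vehicle"))
```

An EV holding {"registered", "premium-plan"} would qualify for QGS under this example policy, while one holding only {"registered"} would fall back to best-effort service.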
69. Fully compliant?: a study of data protection policy in UK public organisations. Warren, Adam P., 2003.
No description available.
70. A principled approach to criminalisation: when should making and/or distributing visual recordings be criminalised? Burton, Kelley Jean, 2008.
Determining the boundaries of the modern criminal law has become a difficult issue, particularly as 21st-century criminal law struggles to deal with the widespread use of technology such as digital cameras, mobile phone cameras, video cameras, web cams, the Internet, email and the blogosphere, privacy concerns, and shifts in modern culture. This thesis discusses the making and/or distributing of visual recordings, and the issues which arise with the criminalisation of this conduct. Whilst various national and international jurisdictions have legislated in this regard, their responses have been inconsistent, and this thesis therefore takes a principled approach to examining the criminalisation of such conduct, examining constructs of privacy, harm, morality, culpability, punishment, social welfare, and respect for individual autonomy. In framing criminal offences around this conduct, this thesis suggests that the criminal law should respect the consent of the person visually recorded and consider the subjective culpability of the person making and/or distributing the visual recording.