About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
361

Privacy and data protection in a digital age - In the context of health data protection in Europe.

Johansson, Ellen January 2024
No description available.
362

Security and privacy in perceptual computing

Jana, Suman 18 September 2014
Perceptual, "context-aware" applications that observe their environment and interact with users via cameras and other sensors are becoming ubiquitous on personal computers, mobile phones, gaming platforms, household robots, and augmented-reality devices. This dissertation's main thesis is that perceptual applications present several new classes of security and privacy risks to both their users and bystanders. Existing perceptual platforms are often completely inadequate for mitigating these risks. For example, we show that augmented reality browsers, a popular class of perceptual platforms, contain numerous inherent security and privacy flaws.

The key insight of this dissertation is that perceptual platforms can provide stronger security and privacy guarantees by controlling the interfaces they expose to applications. We explore three approaches that perceptual platforms can use to minimize the risks of perceptual computing: (i) redesigning the perceptual platform interfaces to provide a fine-grained permission system that allows least-privileged application development; (ii) leveraging existing perceptual interfaces to enforce access control on perceptual data, apply algorithmic privacy transforms to reduce the amount of sensitive content sent to applications, and enable users to audit and control the amount of perceptual data that reaches each application; and (iii) monitoring applications' usage of perceptual interfaces to find anomalous high-risk cases.

To demonstrate the efficacy of these approaches, we first build a prototype perceptual platform that supports fine-grained privileges by redesigning the perceptual interfaces. We show that such a platform not only allows the creation of least-privileged perceptual applications but can also improve performance by minimizing the overheads of executing multiple concurrent applications. Next, we build DARKLY, a security- and privacy-aware perceptual platform that leverages existing perceptual interfaces to deploy several different protection mechanisms: access control, algorithmic privacy transforms, and user audit. We find that DARKLY can run most existing perceptual applications with minimal changes while still providing strong security and privacy protection. Finally, we introduce peer group analysis, a new technique that detects anomalous high-risk perceptual interface usage by creating peer groups of software providing similar functionality and comparing each application's perceptual interface usage against that of its peers. We demonstrate that such peer groups can be created by leveraging information already available in software markets, such as textual descriptions, application categories, and lists of related applications. Such automated detection of high-risk applications is essential for creating a safer perceptual ecosystem: it helps users identify and install safer applications with any desired functionality, and it encourages application developers to follow the principle of least privilege.
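The peer-group idea is concrete enough to sketch. Below is a minimal, hypothetical Python illustration, not code from the dissertation: applications are grouped by market category, and any application whose perceptual-interface usage sits far above its peers' norm is flagged. The app names, usage counts, and z-score threshold are all invented for illustration.

```python
from collections import defaultdict
from statistics import mean, stdev

# Hypothetical observations: (app name, market category, count of perceptual API calls)
apps = [
    ("photo_fun", "photography", 12),
    ("lens_pro", "photography", 13),
    ("cam_filter", "photography", 11),
    ("flashlight", "utilities", 1),
    ("note_taker", "utilities", 0),
    ("torch_plus", "utilities", 14),  # heavy camera use is unusual for this category
]

def flag_anomalies(apps, z_threshold=1.0):
    """Flag apps whose perceptual-interface usage is far above their peer group's norm."""
    groups = defaultdict(list)
    for name, category, usage in apps:
        groups[category].append((name, usage))

    flagged = []
    for category, members in groups.items():
        usages = [u for _, u in members]
        if len(usages) < 2:
            continue  # an app needs peers to be compared against
        mu, sigma = mean(usages), stdev(usages)
        for name, usage in members:
            if sigma > 0 and (usage - mu) / sigma > z_threshold:
                flagged.append((name, category, usage))
    return flagged

print(flag_anomalies(apps))  # [('torch_plus', 'utilities', 14)]
```

A real deployment would build richer peer groups (from descriptions and related-app lists, as the abstract notes) and use more robust statistics than a z-score over a handful of samples, but the comparison-against-peers structure is the same.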
363

Privacy in the Personal Landscape.

Newell, Patricia Anne. January 1984
No description available.
364

The Sociological Impact of the Family Educational Rights and Privacy Act on an Institution of Higher Education.

Sparrow, Alice Pickett, 1939- January 1985
No description available.
365

Multidimensional epidemiological transformations: addressing location-privacy in public health practice

Abdel Malik, Philip January 2011
The ability to control one's own personally identifiable information is a fundamental human right that is becoming increasingly vulnerable. However, just as significant, if not more so, is the right to health. With increasing globalisation and threats of natural disasters and acts of terrorism, this right is also becoming increasingly vulnerable. Public health practice - which is charged with protecting and promoting the health of society and its individuals and mitigating threats to it - has been at odds with the right to privacy. This is particularly significant where location privacy is concerned. Spatial information is an important aspect of public health, yet the increasing availability of spatial imagery and location-sensitive applications and technologies has brought location privacy to the forefront, threatening to negatively impact the practice of public health by inhibiting or severely limiting data sharing.

This study begins by reviewing the current legislation relevant to public health and investigates the public health community's perceptions of location-privacy barriers to practice. Survey participants identify bureaucracy and legislation as the two greatest privacy-related barriers to public health. In response to this clash, a number of solutions and workarounds have been proposed in the literature to protect location privacy. After outlining their weaknesses, a novel approach is developed and demonstrated - the multidimensional point transform - which works synergistically across multiple dimensions, including location, to anonymise data. Finally, a framework for guiding decisions on data sharing and identifying requirements is proposed, and a sample implementation is demonstrated through a fictitious scenario. For each aspect of the study, a tool prototype and/or design for implementation is proposed and explained, and the need for further development is highlighted. In summary, this study provides a multi-disciplinary and multidimensional solution to the clash between privacy and data sharing in public health practice.
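The thesis's multidimensional point transform is not reproduced here, but the underlying idea of perturbing locations before sharing can be sketched with a classic baseline: displacing each point uniformly at random within a fixed radius. This is an illustrative sketch only; the coordinates and radius are invented, and the actual transform operates across multiple dimensions, not just geography.

```python
import math
import random

def perturb_location(lat, lon, max_offset_m=500.0, rng=random):
    """Displace a point uniformly at random within a disc of radius max_offset_m.

    A common location-privacy baseline - NOT the thesis's multidimensional
    point transform, whose details are not reproduced here.
    """
    # Uniform sampling in a disc: radius scales with the square root of a uniform draw
    r = max_offset_m * math.sqrt(rng.random())
    theta = rng.uniform(0, 2 * math.pi)
    # Approximate metres-to-degrees conversion (reasonable away from the poles)
    dlat = (r * math.cos(theta)) / 111_320
    dlon = (r * math.sin(theta)) / (111_320 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

print(perturb_location(46.2044, 6.1432))  # a hypothetical case location near Geneva
```

One motivation for a multidimensional approach is apparent even in this baseline: perturbing location alone either destroys spatial utility or leaves cases re-identifiable once other attributes are joined in.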
366

Space in Space: Privacy Needs for Long-Duration Spaceflight

Aiken, Jo 05 1900
Space exploration is a uniquely human activity. As humans continue to push the limits of exploring the unknown, they have sought knowledge to support the sustenance of life in outer space. New technologies, advancements in medicine, and a rethinking of what it means to be a "community" will need to emerge to support life among the stars. Crews traveling beyond the Moon will rely on new technologies to support the technical aspects of their missions as well as their quality of life while away from Earth. Likewise, through advancements in medicine, scientists will need to address the remaining questions regarding the effects of long-duration spaceflight on the human body and crew performance. Space explorers must learn to utilize these new technologies and medical advancements while adapting to their new environment in space and as a space community. It is important that researchers address these issues so that human survival beyond Earth is not only achievable but life among the stars is worth living and sustaining. This thesis addresses these issues in an attempt to extend the trajectory of space exploration to new horizons.
367

Privacy Preserving in Online Social Network Data Sharing and Publication

Gao, Tianchong 17 October 2019
Following the trend of online data sharing and publishing, researchers have raised concerns about the privacy problem. Online Social Networks (OSNs), for example, often contain sensitive information about individuals. Therefore, anonymizing network data before releasing it becomes an important issue. This dissertation studies the privacy-preservation problem from the perspectives of both attackers and defenders.

For defenders, preserving private information while keeping the utility of the published OSN is essential in data anonymization. At one extreme, the final data equals the original, containing all the useful information but offering no privacy protection. At the other extreme, the final data is random, offering the best privacy protection but being useless to third parties. Hence, defenders aim to explore multiple potential methods to strike a desirable tradeoff between privacy and utility in the published data. This dissertation starts from the fundamental problem of defining utility and privacy, then draws on the design of the privacy criterion, the graph abstraction model, the utility method, and the anonymization method to further address the balance between the two.

For attackers, extracting meaningful information from the collected data is essential in de-anonymization. De-anonymization mechanisms exploit similarities between the attacker's prior knowledge and the published data to identify targets. This dissertation focuses on settings in which the published data is periodic, anonymized, and does not cover the target persons. There are two thrusts in studying de-anonymization attacks: the design of the seed-mapping method and the innovation of the generating-based attack method. To conclude, this dissertation studies the online data privacy problem from both defenders' and attackers' points of view and introduces privacy and utility enhancement mechanisms from different novel angles.
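The privacy/utility dial described above can be made concrete with a deliberately naive baseline: randomly rewiring a fraction of edges before publication. This is an invented sketch, not one of the dissertation's mechanisms; it only illustrates the two extremes the abstract describes (flip_fraction=0 keeps full utility and no protection, flip_fraction=1 approaches a random graph).

```python
import random

def anonymize_graph(edges, num_nodes, flip_fraction=0.4, rng=random):
    """Delete a fraction of edges and insert the same number of random new ones.

    A naive anonymization baseline for illustration only - real mechanisms
    must target a privacy criterion and measure utility explicitly.
    """
    edges = {tuple(sorted(e)) for e in edges}
    k = int(len(edges) * flip_fraction)
    for e in rng.sample(sorted(edges), k):
        edges.remove(e)  # drop k true edges
    while k > 0:
        u, v = rng.randrange(num_nodes), rng.randrange(num_nodes)
        candidate = tuple(sorted((u, v)))
        if u != v and candidate not in edges:
            edges.add(candidate)  # add a fake edge in its place
            k -= 1
    return sorted(edges)

toy = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
print(anonymize_graph(toy, num_nodes=4))
```

A defender would pair such a mechanism with an explicit utility measure (e.g., how well the degree distribution survives) and a privacy criterion; that design space is exactly what the dissertation explores.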
368

Privacy-aware Use of Accountability Evidence

Reuben, Jenni January 2017
This thesis deals with the evidence that enables accountability, the privacy risks involved in using it, and a privacy-aware solution to the problem of unauthorized evidence disclosure. Legal means to protect an individual's privacy are anchored in the data-protection perspective, i.e., in the responsible collection and use of personal data. Accountability plays a crucial role in such legal privacy frameworks for assuring an individual's privacy. In the European context, the accountability principle is pervasive in the measures mandated by the General Data Protection Regulation. In general, these measures are technically achieved through automated privacy audits. System traces that record system activities are the essential inputs to those automated audits. Nevertheless, the traces that enable accountability are themselves subject to privacy risks, because in most cases they inform about the processing of personal data. Therefore, ensuring the privacy of the accountability traces is as important as ensuring the privacy of the personal data itself. However, by and large, research involving accountability traces is concerned with storage, interoperability, and analytics challenges rather than with the privacy implications of processing them.

This dissertation focuses both on the application of accountability evidence, such as in automated privacy audits, and on its privacy-aware use. The overall aim of the thesis is to provide a conceptual understanding of the privacy-compliance research domain and to contribute to solutions that promote privacy-aware use of the traces that enable accountability. To address the first part of this objective, a systematic study of the existing body of knowledge on automated privacy compliance is conducted, and the state of the art is conceptualized as taxonomies. The second part is accomplished through two results: first, a systematic understanding of the privacy challenges involved in processing system traces; second, a model for privacy-aware access restrictions, proposed and formalized in order to prevent illegitimate access to the traces. Access to accountability traces such as provenance is required for the automatic fulfillment of accountability obligations, but such traces themselves contain personally identifiable information; hence, this thesis provides a solution to prevent unauthorized access to provenance traces.
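The access-restriction idea can be sketched as a deny-by-default check on trace records. This is a hypothetical simplification, not the formal model proposed in the thesis: the roles, purposes, and policy table are invented, and the real model reasons about the content of provenance traces, not just who is asking.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TraceRecord:
    subject: str   # whose personal data the trace describes
    action: str    # e.g. "read", "update"
    resource: str  # the data item that was processed

# Hypothetical policy: which (role, purpose) pairs may read accountability traces
POLICY = {
    ("auditor", "compliance_audit"): True,
    ("analyst", "performance_tuning"): False,  # traces reveal personal data
}

def can_access(role: str, purpose: str) -> bool:
    """Deny by default: access is granted only to explicitly authorized pairs."""
    return POLICY.get((role, purpose), False)

trace = TraceRecord(subject="alice", action="read", resource="medical_record_17")
print(can_access("auditor", "compliance_audit"))    # True
print(can_access("analyst", "performance_tuning"))  # False
```

The point of the sketch is the default: because the traces themselves contain personally identifiable information, any access not explicitly justified by an accountability obligation is refused.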
369

Policy Merger System for P3P in a Cloud Aggregation Platform

Olurin, Olumuyiwa 09 January 2013
The need to aggregate privacy policies arises in a variety of application areas today. In traditional client/server models, websites host services along with their policies in different private domains. However, in a cloud-computing platform where aggregators can merge multiple services, users often face complex decisions when choosing the right services from service providers. In this computing paradigm, the ability to aggregate policies as well as services is useful and more effective for users who are privacy-conscious about their sensitive or personal information. This thesis studies the problems associated with the Platform for Privacy Preferences (P3P) language and the issues with communicating and understanding it. Furthermore, it discusses efficient strategies and algorithms for the matching and merging processes, and then elaborates on privacy-policy conflicts that may occur after merging policies. Lastly, the thesis presents a tool for matching and merging P3P policies. If successful, the merge produces an aggregate policy that is consistent with the policies of all participating service providers.
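A core design question in merging is which promise wins when providers disagree. One defensible rule, sketched below, is that the aggregate policy must never overstate the protection a user actually receives, so the weakest retention promise among the merged services is the one reported. This is a hypothetical simplification of P3P, not the thesis's merger system: real P3P policies also cover purposes, recipients, and dispute resolution, and the retention vocabulary here is abbreviated.

```python
# Retention levels ordered from most to least privacy-friendly (simplified from P3P)
RETENTION_ORDER = ["no-retention", "stated-purpose", "legal-requirement", "indefinitely"]

def merge_policies(policies):
    """Merge per-category retention promises, keeping the weakest one, so the
    aggregate policy never claims more protection than every provider delivers."""
    merged = {}
    for policy in policies:
        for category, retention in policy.items():
            current = merged.get(category)
            if current is None or RETENTION_ORDER.index(retention) > RETENTION_ORDER.index(current):
                merged[category] = retention  # the weaker promise wins
    return merged

# Hypothetical policies from two aggregated services
shop = {"contact-info": "stated-purpose", "clickstream": "no-retention"}
payments = {"contact-info": "legal-requirement", "financial-info": "indefinitely"}
print(merge_policies([shop, payments]))
# {'contact-info': 'legal-requirement', 'clickstream': 'no-retention', 'financial-info': 'indefinitely'}
```

The conflicts the thesis discusses arise exactly here: when one provider's promise contradicts another's in a way no single aggregate statement can honestly express.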
370

An Empirical Investigation of Internet Privacy: Customer Behaviour, Companies’ Privacy Policy Disclosures, and a Gap

No, Won Gyun 09 1900
Privacy emerges as a critical issue in an e-commerce environment because of a fundamental tension among corporate, consumer, and government interests. Reviewing prior Internet-privacy research in the fields of information systems, business, and marketing published between 1995 and 2006, we consider the following research questions: 1) how is an individual's privacy behaviour affected by privacy-policy disclosures and by the individual's level of involvement regarding the sensitivity of personal information; 2) how do companies' privacy policies vary with respect to regulatory approaches and cultural values; and 3) is there a gap between the privacy practices valued by individuals and those emphasized by companies? A three-stage study is conducted to answer these questions. The first two stages, consisting of a Web-based survey and an online ordering experiment with 210 participants, found that individuals are more likely to read the privacy-policy statements posted on Web sites, and less likely to provide personal information, in a high-involvement privacy situation than in a low-involvement one. However, the existence of a privacy seal did not affect individuals' behaviour, regardless of the involvement condition. This study also found a gap between self-reported and actual privacy behaviour. When individuals were asked to provide personal information, their policy-statement reading behaviour was close to their self-reported behaviour, but their information-providing behaviour differed from it. The third stage, a study of 420 privacy policies spanning six countries and two industries, showed that privacy policies vary across countries, as well as with the degree of governmental involvement and the cultural values of those countries. Finally, the analysis across all three stages revealed a gap between individuals' importance ratings of companies' privacy practices and the policies companies emphasize in their privacy disclosures.
