  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
171

Privacy Preserving Network Security Data Analytics

DeYoung, Mark E. 24 April 2018 (has links)
The problem of revealing accurate statistics about a population while maintaining the privacy of individuals is extensively studied in several related disciplines. Statisticians, information security experts, and computational theory researchers, to name a few, have produced extensive bodies of work on privacy preservation. Still, the need to improve our ability to control the dissemination of potentially private information is driven home by an incessant rhythm of data breaches, data leaks, and privacy exposures. History has shown that both public and private sector organizations are not immune to loss of control over data due to lax handling, incidental leakage, or adversarial breaches. Prudent organizations should consider the sensitive nature of network security data and network operations performance data recorded as logged events. These logged events often contain data elements that are directly correlated with sensitive information about people and their activities -- often at the same level of detail as sensor data. Privacy-preserving data publication has the potential to support reproducibility and the exploration of new analytic techniques for network security. Providing sanitized data sets de-couples privacy protection efforts from analytic research. This de-coupling enables specialists to tease out the information and knowledge hidden in high-dimensional data while, at the same time, providing some degree of assurance that people's private information is not exposed unnecessarily. In this research we propose methods that support a risk-based approach to privacy-preserving data publication for network security data. Our main research objective is the design and implementation of technical methods to support the appropriate release of network security data so that it can be used to develop new analytic methods in an ethical manner. 
Our intent is to produce a database which holds network security data representative of a contextualized network and people's interaction with the network mid-points and end-points without the problems of identifiability. / Ph. D.
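The abstract above does not pin down a specific sanitization mechanism, but one widely studied approach to risk-based release of aggregate statistics from logged events is differential privacy. The sketch below is an illustration under that assumption only (the event count and epsilon value are hypothetical, not from the dissertation): it publishes a noisy event count via the Laplace mechanism.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw a Laplace(0, scale) sample via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def sanitized_count(true_count: int, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.

    A single logged event changes the count by at most 1 (sensitivity 1),
    so Laplace noise with scale 1/epsilon suffices.
    """
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical example: publish how many failed-login events a sensor saw,
# without revealing whether any one user's events appear in the log.
published = sanitized_count(true_count=1437, epsilon=0.5)
```

A smaller epsilon means more noise and stronger protection; a real release would also have to budget epsilon across every statistic published from the same log.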
172

Understanding the Impact of Data Privacy Regulations on Software and Its Stakeholders

Franke, Lucas James 06 July 2023 (has links)
The General Data Protection Regulation (GDPR) is a comprehensive data privacy law that limits how businesses can collect personal information about their consumers living in the European Union. For our research, we aimed to evaluate the impact that the GDPR has on the open-source community, an online community that encourages open collaboration between software developers. We conducted a quantitative analysis of GitHub pull requests in which we compared pull requests explicitly related to the GDPR to other non-GDPR pull requests from the same projects. We also conducted a qualitative pilot study in which we interviewed software developers with experience implementing GDPR requirements in industry or in open-source. From our research, we found that GDPR-related pull requests had significantly more activity than other pull requests, but that open-source developers did not perceive a significant impact on their software development processes when implementing GDPR compliance. Industry developers, on the other hand, had a more negative outlook on the GDPR, and found implementation to be difficult. Our results indicate a need to involve software developers in the lawmaking process in order to create direct and realistic expectations for developers when implementing privacy policies. / Master of Science / The General Data Protection Regulation (GDPR) is a comprehensive data privacy law that limits how businesses can collect personal information about their consumers living in the European Union. For our research, we aimed to evaluate the impact that the GDPR has on the open-source community, an online community that encourages open collaboration between software developers. We conducted a quantitative analysis of GitHub, a major online open-source platform. We compared pull requests (major contributions to a project) explicitly related to the GDPR to other non-GDPR pull requests from the same projects. 
We also conducted a qualitative pilot study in which we interviewed software developers with experience implementing GDPR requirements in industry or in open-source. From our research, we found that GDPR-related pull requests had significantly more activity than other pull requests, but that open-source developers did not perceive a significant impact on their software development processes when implementing GDPR compliance. Industry developers, on the other hand, had a more negative outlook on the GDPR, and found implementation to be difficult. Our results indicate a need to involve software developers in the lawmaking process in order to create direct and realistic expectations for developers when implementing privacy policies.
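As a hedged illustration of the quantitative comparison described above (the metric, the comment counts, and the choice of test statistic are all assumptions for the sketch, not the thesis's actual data or methods), comparing activity on GDPR-related versus other pull requests might look like this, using a hand-rolled Wilcoxon rank-sum statistic:

```python
from statistics import mean

# Hypothetical per-PR comment counts, split by whether the pull request
# explicitly references the GDPR. Values are illustrative only.
gdpr_comments = [14, 9, 22, 17, 11, 30, 8]
non_gdpr_comments = [3, 5, 2, 7, 4, 6, 1]

def rank_sum(a, b):
    """Wilcoxon rank-sum statistic for sample `a` against the pooled ranks."""
    pooled = sorted(a + b)
    ranks = {}  # value -> average 1-based rank (averaging handles ties)
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2
        i = j
    return sum(ranks[x] for x in a)

w = rank_sum(gdpr_comments, non_gdpr_comments)
# A rank sum well above the null expectation of n1*(n1+n2+1)/2 = 52.5
# indicates the GDPR group has systematically higher activity.
```

A real analysis would compute a p-value from the rank-sum distribution (or use `scipy.stats.mannwhitneyu`); the sketch only shows the shape of the comparison.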
173

Private in Public - Public in Private: A Library on H Street NE Washington, DC

Rutledge, Kathleen Anne 03 February 2011 (has links)
The thesis investigates private versus public space and the natural tendency of an individual to seek out their own place within a group. More specifically, the project studies whether private and public could not only occupy the same geographic space independently, but also activate one another. A library was chosen as the program for its opportunity to serve as a "third place" in the community. A "third place" is a neutral ground that is neither home nor workplace. The benefit of such a place is to stimulate conversation and interaction, to provide a way to either hide or be seen, and to encourage social cohesion by bringing together people who might not meet through normal daily life. The site is on the corner of 12th and H Streets NE in Washington, DC. Its location in a rebounding streetscape demands that the library give the surrounding context a large role in its design. Public space is a priority, and the building is porous to extend the exterior into the interior and vice versa. The library's ever-changing role in a city inspires flexibility in the design and a life beyond normal library hours. / Master of Architecture
174

Understanding Community Privacy through Focus Group Studies

Codio, Sherley 21 May 2012 (has links)
Just as an individual is rightly concerned about the privacy of their personally identifying information, so also is a group of people, a community, concerned about the privacy of sensitive information entrusted to their care. Our research seeks to develop a better understanding of the factors contributing to the sensitivity of community information, of the privacy threats that are recognized by the community, and of the means by which the community attempts to fulfill their privacy responsibilities. We are also interested in seeing how the elements of a community privacy model that we developed are related to the findings from the studies of communities. This thesis presents the results of a series of focus group sessions conducted in corporate settings. Three focus group interviews were conducted using participants from two information technology companies and one research group from the university. Three themes emerged from the analysis of these focus group interviews which are described as privacy awareness, situated disclosures, and confinement of sensitive information. These three themes capture the character and complexity of community oriented privacy and expose breakdowns in current approaches. / Master of Science
175

Privacy Notice and Choice in Practice

Leon-Najera, Pedro Giovanni 01 December 2014 (has links)
In the United States, notice and choice remain the most commonly used mechanisms to protect people’s privacy online. This approach relies on the assumption that users provided with notice will make informed choices that align with their privacy expectations. The goal of this research is to empirically inform industry and regulatory efforts that rely on notice and choice to protect people’s online privacy. To do so, we present a set of case studies covering different aspects of privacy notice and choice in four domains: online behavioral advertising (OBA), online social networks (OSN), financial privacy notices, and websites’ machine-readable privacy notices. We investigate users’ privacy preferences, information needs, and ability to exercise choices in the OBA domain. Based on our results, we provide recommendations to improve the design of notice and choice methods currently in use in this domain. In the context of OSNs, we explore the effect of nudging notices designed to encourage more thoughtful disclosures among Facebook users and recommend changes to the Facebook user interface aimed at mitigating problematic disclosures. We demonstrate how standardized notices enable large-scale evaluations and comparisons of companies’ privacy practices and argue that standardized privacy notices have an enormous potential to improve transparency and benefit users, privacy-respectful companies, and oversight entities. We argue that, in today’s complex Internet ecosystem, an approach that relies on users to make privacy decisions should also empower them with user-friendly interfaces, relevant information, and the tools they need to make privacy decisions. Finally, we further argue that notice and choice are necessary, but not sufficient, to protect online privacy, and that government regulation is needed to establish additional protections including access, redress, accountability, and enforcement.
176

Protecting Online Privacy

Winkler, Stephanie D. 01 January 2016 (has links)
Online privacy has become one of the greatest concerns in the United States today. There are currently multiple stakeholders with interests in online privacy, including the public, industry, and the United States government. This study examines the issues surrounding the protection of online privacy. Privacy laws in the United States are currently outdated and do little to protect online privacy. These laws are unlikely to change, as both the government and industry have an interest in keeping privacy laws lax. To bridge the gap between the desired level of online privacy and what is provided legally, users may turn to technological solutions.
177

Cloud Privacy Audit Framework: A Value-Based Design

Coss, David 01 January 2013 (has links)
The rapid expansion of cloud technology provides enormous capacity, which allows for the collection, dissemination and re-identification of personal information. It is the cloud’s resource capabilities such as these that fuel the concern for privacy. The impetus for these concerns is not far removed from those expressed by Mason in 1986, when he identified privacy as one of the biggest ethical issues facing the information age. There seems to be a continuous ebb-and-flow relationship between privacy concerns and the development of new information communication technologies such as cloud computing. Privacy issues are a concern to all types of stakeholders in the cloud. Individuals using the cloud are exposed to privacy threats when they are persuaded to provide personal information against their wishes. An organization using a cloud service risks non-compliance with internal privacy policies or legislative privacy regulations. The cloud service provider faces privacy risks of legal liability and credibility concerns if sensitive information is exposed. The data subject is at risk of having personal information exposed. In essence, everyone involved in cloud computing has some level of privacy risk that needs to be evaluated before, during and after they, or an organization they interact with, adopt a cloud technology solution. This underscores the need for organizations to develop privacy practices that are socially responsible towards the protection of their stakeholders’ information privacy. This research is about understanding the relationship between individual values and privacy objectives. There is a lack of clarity in organizations as to what individuals consider privacy to be. Therefore, it is essential to understand an individual’s privacy values. Individuals seem to have divergent perspectives on the nature and scope of how their personal information is to be kept private across different modes of technology. 
This study is concerned with identifying individual privacy objectives for cloud computing. We argue that privacy is an elusive concept due to the evolving relationship between technology and privacy. Understanding and identifying individuals’ privacy objectives is an influential step in the process of protecting privacy in cloud computing environments. The aim of this study is to identify individual privacy values and develop cloud privacy objectives, which can be used to design a privacy audit for cloud computing environments. We used Keeney’s (1992) value-focused thinking approach to identify individual privacy values with respect to emerging cloud technologies, and to develop an understanding of how cloud privacy objectives are shaped by the individual’s privacy values. We discuss each objective and how it relates to privacy concerns in cloud computing. We also use the cloud privacy objectives in a design science study to design a cloud privacy audit framework. We then discuss how this research helps privacy managers develop a cloud privacy strategy, evaluate cloud privacy practices and develop a cloud privacy audit to ensure privacy. Lastly, future research directions are proposed.
178

Internet Privacy : A look into the construct of Privacy Knowledge

Nordström, Michael, Sevcenko, Sergej January 2012 (has links)
Background: With the increasing use of personalized marketing and the increasing ability to collect information on consumers, consumers’ concern about privacy is growing. Therefore, it is important to understand what affects privacy concern, and how marketers can minimize this concern. Previous research suggests that factors such as computer knowledge, internet knowledge, and regulation awareness all affect privacy concern; however, we believe that these are all related to each other in a construct we call Privacy Knowledge.
Purpose: To investigate the construct of Privacy Knowledge and to what degree it influences a consumer’s attitude towards informational privacy.
Method: In order to validate the Privacy Knowledge construct and measure its relationship to Privacy Concern, we employed a deductive methodology based on questionnaires. The questionnaires were composed of summative Likert scales, three of which had been validated by previous research. We utilized a quota sampling technique in order to gather enough data from each age group. The results were then analyzed with tools such as factor analysis, ANOVA tests, and multiple regression analysis.
Conclusion: Through the factor analysis we found that the factors Internet Knowledge, Computer Knowledge, and Regulation Awareness were better organized as Basic IT Knowledge, Advanced IT Knowledge, and Regulation Awareness. Privacy Knowledge was found to be positively related to Privacy Concern. However, of the three factors that make up Privacy Knowledge, we could only conclude that Basic IT Knowledge had an effect on Privacy Concern. We believe this is due to the exclusion of other factors affecting Privacy Concern, such as situational factors, and suggest conducting further research that includes these variables.
179

Privacy paradox or bargained-for-exchange : capturing the relationships among privacy concerns, privacy management, self-disclosure, and social capital

Hsu, Shih-Hsien 16 January 2015 (has links)
The dissertation seeks to bridge the gap between privacy and social capital in SNS use by bringing together the essential elements of social networking, privacy concerns, privacy management, self-disclosure, and social capital to examine their complex relationships and the daily challenges every SNS user faces. The major purposes of this dissertation were to revisit the privacy paradox phenomenon, update the current relationships among privacy concerns, self-disclosure, and social capital on Facebook, integrate these relationships into a quantitative model, and explore the role of privacy management in these relationships. This goal was realized by using Amazon.com’s Mechanical Turk to test a theoretical model with survey data from 522 respondents. The findings show the impact of the structural factor -- Facebook social network intensity and diversity -- and the impact of individuals’ self-disclosure on Facebook on their perceived bridging and bonding social capital. This dissertation employed various measurements of key variables to update the current status of the privacy paradox phenomenon -- the disconnection between privacy concerns and self-disclosure on social media -- and found a breakdown of the traditional privacy paradox and the existence of a social privacy paradox. Findings also show that private information (personal details, thoughts, and ideas) shared on Facebook becomes an asset in using Facebook and accumulating social capital. Meanwhile, higher privacy concerns reduce the level of self-disclosure on Facebook. Therefore, privacy concerns become a barrier to Facebook use and to accumulating social capital within these networks. This dissertation further examined the mediating role of privacy management in resolving this dilemma. Findings confirmed that privacy management is important in redirecting the relationships among privacy concerns, self-disclosure, and social capital. 
People who have higher privacy concerns tend to disclose fewer personal thoughts and ideas on Facebook and miss the opportunity to accumulate social capital. However, when they employ more privacy management strategies, they are more willing to self-disclose and thus accumulate more social capital on Facebook networks. Lastly, the proposed integrated model examined through SEM analysis confirms the delicate relationships among the social networking characteristics, privacy concerns, privacy management, self-disclosure, and social capital. / text
180

A Grounded Theory of the Psychology of Privacy Management

Christofides, Emily 30 July 2012 (has links)
This dissertation describes the findings from a qualitative research study aimed at increasing our understanding of the psychology of privacy management. Specifically, I sought to explore people’s beliefs, perceptions, and process for managing privacy in the contexts that they inhabit. I conducted 32 one-on-one interviews with participants ranging in age from 18 to 85 years old. Using grounded theory methodology, I developed a substantive theory of privacy that outlines the way people manage their privacy in our current environment. This grounded theory takes into account people’s individual approach to privacy, the elements they consider when deciding whether or not to reveal aspects of themselves, and the behaviors they engage in to maintain their privacy or protect the privacy of others. Approach to privacy consists of beliefs about privacy, personality characteristics such as openness and self-confidence, and values, which include doing unto others, honesty, and choice (or control). In many cases this approach has never been explicitly considered, but it interacts with who one is speaking with, the topic of discussion, the context, and the perceived risks and benefits, in affecting the privacy decision. Trust is a key factor in deciding whether or not to reveal part of oneself to someone, but certain roles and relationships seem to bypass the privacy decision-making process. A risk-benefit analysis does occur, but it is one of several components that impact privacy decisions and is hampered by the emotional nature of the information that is considered. Some contexts, such as technologically mediated situations, heighten awareness of privacy issues, while others involve information or situations that are seen to override privacy rights. Ultimately, these considerations interact and lead to particular behaviors for maintaining or regaining a desired level of privacy.
