161

An examination of privacy in the socio-technological context of Big Data and the socio-cultural context of China

Fu, Tao 01 August 2015 (has links)
Privacy has been an academic concern, an ethical issue and a legislative conundrum. No factor has shaped the understanding of privacy as much as the development of technology, be it the invention of the printing press, the telephone or the camera. With the diffusion of the mobile Internet, social media and the Internet of Things, and the penetration of devices such as smartphones, global positioning systems, surveillance cameras, sensors and radio frequency identification tags, Big Data, designed to extract value economically from a huge amount and variety of data, has been accumulating exponentially since 2012. Data-driven businesses collect, combine, use, share and analyze consumers' personal information for revenue. Consumers' shopping habits, viewing habits, browsing histories and many other online behaviors have been commodified. Never before has privacy been as threatened by communication technologies as it is today. This dissertation studies some of the rising issues of technology and business that relate to privacy in China, a rising economic power of the East.

China is a country with a Confucian heritage, governed for decades by Communist leadership. Its philosophical traditions and social fabric have shaped perceptions of privacy for more than 2,000 years. In ancient China, "private" was not taken as negative, but commitment to the public or the greater good was an expected virtue. The country also has a long tradition of peer surveillance, whether under the baojia system or the later Urban and Rural Residents' Committees. Since China adopted the reform and opening-up policy in 1978, however, consumerism has inspired the new Chinese middle class to pursue more private space as a lifestyle. Alibaba, Baidu and Tencent are globally top-ranking Chinese Internet companies with huge numbers of users, transactions and revenues, whose businesses depend heavily on consumers' personal data. In response to the growth of consumer data and the potential intrusion of privacy by Internet and information service providers (IISPs), the Ministry of Industry and Information Technology, a regulator of China's Internet industry, enacted laws to regulate the collection and use of personal information by IISPs.

Drawing upon the literature, the privacy theories of Westin, Altman and Nissenbaum, and the cultural theory of Hofstede, this study investigated the compliance of Chinese businesses' privacy policies with relevant Chinese laws and the information those policies provide about the collection, use and disclosure of Internet users' personal information; Chinese consumers' privacy attitudes and actions, including awareness, concerns, control, trust and trade-offs related to privacy; the differences among Chinese Fundamentalists, Pragmatists and Unconcerned using the Core Privacy Orientation Index; and the conceptualization of privacy in present-day China. A triangulation of quantitative and qualitative methods, including case study, content analysis, an online survey and semantic network analysis, was employed to answer the research questions and test hypotheses.

This study found that Chinese IISPs, represented by Alibaba, Baidu and Tencent, comply well with Chinese laws; Tencent provides the most information about the collection, use and disclosure of consumers' personal information. Chinese consumers know little about how Big Data technologies collect their personal information. They have the most concerns about other individuals and the least about the government when their personal information is accessed without their knowledge. When their personal information is collected by online businesses, Chinese consumers have more concerns about their online chats, images and emails, and fewer concerns about the searches they perform, the websites they browse, and their shopping and viewing habits. Fewer than one-third of the Chinese consumers surveyed take proactive measures to manage online privacy settings. Chinese consumers make greater efforts to avoid being tracked by people who might criticize, harass or target them, and by advertisers, hackers and criminals; they rarely hide from the government, law enforcement, or people they are familiar with, such as people from their past, family members and romantic partners. Chinese consumers trust the laws and regulations issued by the government to protect personal data more than they trust online businesses. They trade privacy for benefits only occasionally, but when they see more benefit in a privacy trade-off, they have fewer concerns. To Chinese consumers, privacy means personal information, including but not limited to family details, home address, phone number, Chinese ID number and passwords to bank and other online accounts; its leakage or disclosure without the owner's consent, to people they do not want to know it, produces a sense of insecurity.
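For illustration, below is a minimal sketch of how a Westin-style segmentation might assign survey respondents to the Fundamentalist, Pragmatist and Unconcerned groups the study compares. The items, scale and thresholds are illustrative assumptions, not the actual Core Privacy Orientation Index instrument:

```python
# A toy Westin-style privacy segmentation. Assumes each respondent answers
# pro-privacy statements on a 5-point Likert scale (1 = strongly disagree,
# 5 = strongly agree); the >= 4 "agree" cutoff is an assumption for this
# sketch, not the study's scoring rule.

def classify_respondent(item_scores):
    agrees = sum(1 for s in item_scores if s >= 4)  # count of "agree" answers
    if agrees == len(item_scores):
        return "Fundamentalist"  # agrees with every pro-privacy statement
    if agrees == 0:
        return "Unconcerned"     # agrees with none of them
    return "Pragmatist"          # mixed answers

# One respondent per row of a hypothetical survey matrix.
survey = [[5, 4, 5], [1, 2, 1], [4, 2, 5]]
print([classify_respondent(r) for r in survey])
# -> ['Fundamentalist', 'Unconcerned', 'Pragmatist']
```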
162

Evaluating End Users’ Online Privacy Preferences and Identifying PET Design Requirements: A Literature Review

Kolivodiakos, Paraskevas January 2018 (has links)
In this research, end-user privacy preferences regarding online resources (web and mobile applications and websites) are investigated, and the design requirements needed to develop a privacy-focused, privacy-enhancing technology (PET) tool are identified, as derived from the literature. The crowdsourcing-based solution is the most appealing, so it is analyzed in full according to the main focus of our research.
163

Perspectives on privacy : a sociological analysis

Day, Katherine Julie January 1985 (has links)
No description available.
164

Aspects of statistical disclosure control

Smith, Duncan Geoffrey January 2012 (has links)
This work concerns the evaluation of disclosure risk in statistical disclosure control by adopting the position of the data intruder. The underlying assertion is that risk metrics should be based on the actual inferences an intruder can make. Ideally, metrics would also take into account how sensitive those inferences would be, but that is subjective. A parallel theme is that of the knowledgeable data intruder: an intruder who has the technical skills to maximally exploit the information contained in released data. This also raises the issue of computational costs and the benefits of using good algorithms. A metric for attribution risk in tabular data is presented. It addresses the issue that most measures for tabular data are based on the risk of identification. The metric can also take into account assumed levels of intruder knowledge regarding the population, and it can be applied to both exact and perturbed collections of tables. An improved implementation of the Key Variable Mapping System (Elliot et al., 2010) is presented. The problem is more precisely defined in terms of categorical variables rather than responses to survey questions. This allows much more efficient algorithms to be developed, leading to significant performance increases. The advantages and disadvantages of alternative matching strategies are investigated; some are shown to dominate others. The costs of searching for a match are also considered, providing insight into how a knowledgeable intruder might tailor a strategy to balance the probability of a correct match against the time and effort required to find one. A novel approach to model determination in decomposable graphical models is described. It offers purely computational advantages over existing schemes, but allows data sets to be more thoroughly checked for disclosure risk. It is shown that a Bayesian strategy for matching between a sample and a population offers much higher probabilities of a correct match than traditional strategies would suggest. The Special Uniques Detection Algorithm (Elliot et al., 2002; Manning et al., 2008), for identifying risky sample counts of 1, is compared against Bayesian alternatives (using Markov chain Monte Carlo and simulated annealing). It is shown that the alternatives are better at identifying risky sample uniques, and can do so at reduced computational cost.
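To make the notion of risky sample counts of 1 concrete, here is a minimal sketch of flagging "sample uniques": records whose combination of key (quasi-identifier) variables occurs exactly once in a sample. This is only the entry point to SUDA-style detection, which additionally grades uniques by their minimal unique subsets; variable names and data are illustrative:

```python
# Minimal sketch: flag "sample uniques", i.e. records whose combination of
# key (quasi-identifier) variables occurs exactly once in the sample.
# SUDA-style algorithms go further and score each unique on its minimal
# unique subsets; that scoring step is omitted here. Data are illustrative.
from collections import Counter

def sample_uniques(records, key_vars):
    """Return indices of records that are unique on the given key variables."""
    keys = [tuple(r[v] for v in key_vars) for r in records]
    counts = Counter(keys)
    return [i for i, k in enumerate(keys) if counts[k] == 1]

microdata = [
    {"age": 34, "sex": "F", "region": "N"},
    {"age": 34, "sex": "F", "region": "N"},
    {"age": 81, "sex": "M", "region": "S"},  # unique on all keys -> risky
]
print(sample_uniques(microdata, ["age", "sex", "region"]))  # -> [2]
```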
165

Privacy, surveillance and HIV/AIDS in the workplace : a South African case study.

Muskat-Gorska, Zuzanna 19 March 2009 (has links)
The study focuses on the socio-legal dimension of medical data surveillance in the workplace, using the example of the South African workplace response to HIV/AIDS. The starting point is the problem of growing data gathering and monitoring as an institutional feature of the information/surveillance society. Studying the problem in the workplace context aims to indicate possibilities for social partners to respond to new developments in workplace surveillance, and in HIV/AIDS management in particular. The empirical data were drawn from document analysis and from interviews with trade union and business representatives from South Africa involved in developing the workplace response to HIV/AIDS. In particular, the study is interested in identifying ways in which trade unions can make the treatment of personal data a trade union issue.
166

Factors influencing the use of privacy settings in location-based social networks

Oladimeji, Henry January 2017 (has links)
The growth of location-based social networks (LBSNs) such as Facebook and Twitter has been rapid in recent years. In LBSNs, users provide location information on public profiles that can potentially be used in harmful ways. LBSNs have privacy settings that allow users to control the privacy level of their profiles, thus limiting access to location information by other users; but for various reasons users seldom make use of them. Using protection motivation theory (PMT) as a theoretical lens, this dissertation examines whether users can be encouraged to use LBSN privacy settings through fear appeals. Fear appeals have been used in various studies to arouse fear in users in order to motivate them to comply with an adaptive behaviour through the threat of impending danger. However, within the context of social networking, it is not yet clear how fear-inducing arguments will ultimately influence users' use of privacy settings. The purpose of this study is to investigate the influence of fear appeals on user compliance with recommendations to enact the use of privacy settings toward the alleviation of privacy threats. Using a survey methodology, 248 social-network users completed an instrument measuring the variables conceptualized by PMT. Partial Least Squares Structural Equation Modelling (PLS-SEM) was used to test validity and reliability and to analyze the data. Analysis of the responses shows that PMT provides an explanation for social-network users' intention to use privacy settings. Risk susceptibility, response efficacy, self-efficacy and response cost were found to have a positive impact on the intention to use privacy settings, while sharing benefits and maladaptive behaviours were found to have a negative impact. However, risk severity and fear were not found to be significant predictors of the intention to use privacy settings. This study contributes to existing research on PMT by suggesting that fear appeals should focus more on coping appraisal than on threat appraisal, which is consistent with the results of most studies on protection motivation.
167

Perceptions of privacy and career impression management : the case of Facebook.

Pilcer, Danielle 21 June 2012 (has links)
Facebook (FB) is a ubiquitous Web 2.0 technology that has embedded itself in the present-day reality of people worldwide. It represents the constantly evolving online environment and brings to light the implications of synthesising people's online private and work lives. FB can act as a platform for employees to create and manage the impressions formed of them in their work context. Against the backdrop of social capital theory, this research explored the relationships between FB experience, perceptions of FB privacy and FB career impression management (FB CIM), and specifically whether perceptions of FB privacy moderated the impact of FB experience on FB CIM. Phase 1 was concerned with creating reliable scales through a pilot study. Phase 2 comprised the main study, with a convenience sample of 217 respondents made up of FB users and non-users recruited online on social networking sites and within a South African IT organisation. They completed an online survey consisting of biographical information and FB experience, perceptions of FB privacy and FB CIM items (self-developed scales). The analyses showed the constructed scales to be reliable, with coefficient alphas above 0.6, and structurally valid, as shown by the factor analyses. Younger respondents reported greater FB experience than older respondents (r=-0.39). FB experience was related to perceptions of FB privacy, with greater FB experience associated with higher levels of trust (r=0.16) (part of the perceptions of FB privacy subscale). FB experience was also associated with increased FB CIM activities (self-monitoring r=0.26; work relations r=0.23), with FB experience being the strongest predictor of FB CIM. As such, FB experience and one's perceived importance of FB privacy may influence the degree to which one actively engages in FB CIM.
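As a concrete aside, the coefficient alpha the study uses as its reliability threshold can be computed directly from a respondents-by-items matrix; the sketch below uses the standard formula with illustrative data, not the study's responses:

```python
# Coefficient (Cronbach's) alpha from a respondents-by-items matrix, using
# the standard formula: alpha = k/(k-1) * (1 - sum(item variances) /
# variance(total scores)). The response data below are made up.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = scale items."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()  # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of sum scores
    return k / (k - 1) * (1 - item_var_sum / total_var)

responses = np.array([[4, 5, 4],   # 4 respondents, 3 items, 5-point scale
                      [2, 2, 3],
                      [5, 4, 5],
                      [3, 3, 2]])
print(round(cronbach_alpha(responses), 2))  # -> 0.89
```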
168

Privatizing the Volume and Timing of Blockchain Transactions

Miller, Trevor John 20 March 2023 (has links)
With current state-of-the-art privacy-preserving blockchain solutions, users can submit transactions to a blockchain while maintaining full anonymity, using cryptographic techniques such as zero-knowledge proofs and homomorphic encryption to avoid leaking the contents of the transaction. However, the architecture of a blockchain consists of a decentralized network in which every participant maintains their own local copy of the blockchain and updates it upon every added transaction. As a result, the volume of an application's blockchain transactions and the timestamp of each transaction are publicly available. This is problematic for applications with time-sensitive or volume-sensitive outcomes, because users may want this information kept private, for example to avoid leaking the lateness of student examinations; existing blockchain research does not make this possible. In this thesis, we propose a blockchain system for multi-party applications that does not leak any useful information through the volume and timing metadata of the application's transactions, including maintaining the privacy of a time-sensitive or volume-sensitive outcome. We achieve this by adding sufficient noise through indistinguishable decoy transactions, such that an adversary cannot deduce which transactions actually impacted the outcome of the application. This is facilitated in a manner where anyone can publicly verify the application's execution to be correct, fair, and honest. We demonstrate and evaluate our approach by implementing a Dutch auction that supports decoy bid transactions on a private Ethereum blockchain network. / Master of Science / Blockchains are distributed, append-only digital ledgers whose current state is continuously agreed upon through the consensus of network participants rather than by any centralized party. These characteristics make them attractive for many applications, because they enable an application to be facilitated and executed in a public, verifiable, decentralized, and tamper-proof manner. For example, Bitcoin, the most popular cryptocurrency, uses blockchains to maintain a permanent, verifiable ledger of payment transactions. One downside of this public architecture, however, is that the volume of transactions and the timestamp of each transaction can always be publicly observed (e.g., the timestamp of every Bitcoin payment is public). This is problematic for applications with time-sensitive or volume-sensitive outcomes, because users may want this volume and timing information privatized; leaking the lateness of student examinations, for instance, could have severe consequences such as violating student privacy laws. With current state-of-the-art blockchain research, privatizing this information is not possible. In this thesis, we demonstrate an approach that enables these time-sensitive and volume-sensitive applications to be implemented on blockchains in a manner that maintains the privacy of their outcomes without sacrificing the application's integrity or verifiability. We then demonstrate and evaluate our approach by implementing a Dutch auction that supports decoy bid transactions on a private blockchain network.
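As a rough illustration of the decoy idea (a toy model only, not the thesis's Ethereum implementation; the decoy count and submission window are assumed parameters):

```python
# Toy model of volume/timing noise: submit the real transaction among
# indistinguishable decoys at independent random times, so an observer of
# the public timestamps cannot tell which transaction mattered. The decoy
# count and submission window below are assumed parameters.
import random

def schedule_transactions(real_payload, num_decoys=9, window_s=600.0):
    """Return (time, payload, is_real) triples sorted by submission time."""
    txs = [(random.uniform(0, window_s), real_payload, True)]
    txs += [(random.uniform(0, window_s), "decoy", False)
            for _ in range(num_decoys)]
    return sorted(txs)

# On chain the payloads would be encrypted or committed, so all ten
# transactions look identical to an observer; only timestamps are public.
for t, _payload, _is_real in schedule_transactions("bid: 42 wei"):
    print(f"t={t:7.1f}s  tx=<indistinguishable>")
```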
169

We hit turbulence: governing screenshot collection and sharing of digital messages

Shore, Alexis 08 March 2024 (has links)
Individuals rely on digital messaging to form and maintain intimate relationships, trusting that the mediums through which they communicate are protective of their privacy. The screenshot feature undermines attempts to establish such trust. This tool allows individuals to capture and store pieces of a private conversation as a separate file on their device, rendering them usable and shareable with third parties. While the screenshot feature serves utilitarian purposes, this dissertation focuses on its ability to breach privacy expectations, termed within communication privacy management theory (CPM) as privacy turbulence. This dissertation extends the scope of CPM beyond its interpersonal bounds, recognizing the power of platforms to create rules that users follow when making decisions about others' information. Experimental results suggest that blurring received messages upon use of the screenshot feature (i.e., obscurity) and creating an explicit confidentiality expectation (i.e., an explicit privacy rule) significantly reduce screenshot collection and sharing, respectively. Additionally, reflections from participants reveal that many individuals are willing to compromise others' privacy on digital messaging platforms while simultaneously expecting protection of their own. Qualitative analysis of relevant case law and of complaints and opinions from the Federal Trade Commission (FTC) reveals inconsistencies both within law and policy and in comparison with the empirical evidence. Judges have adopted overly broad definitions of "authorization" and lofty thresholds for sustaining individual harm, making statutory regulation of screenshot collection and sharing unlikely. However, guidance from the FTC demonstrates a more nuanced regulatory approach to privacy that recognizes the influence of platform design. The results suggest that design-based strategies, both ex-ante and ex-post, would be a promising first step toward adjusting the norms around screenshot collection and sharing of digital messages. Together, the results of this dissertation will inform policymakers and platform designers of the privacy harm enabled by the screenshot feature, providing tangible recommendations for creating messaging platforms that are truly private.
170

Privacy Preservation for Cloud-Based Data Sharing and Data Analytics

Zheng, Yao 21 December 2016 (has links)
Data privacy is a globally recognized human right for individuals to control access to their personal information and to bar negative consequences from its use. As communication technologies progress, the means to protect data privacy must also evolve to address the new challenges that come into view. Our research goal in this dissertation is to develop privacy protection frameworks and techniques suitable for emerging cloud-based data services, in particular privacy-preserving algorithms and protocols for cloud-based data sharing and data analytics services. Cloud computing has enabled users to store, process, and communicate their personal information through third-party services. It has also raised privacy issues regarding loss of control over data, mass harvesting of information, and unconsented disclosure of personal content. Above all, the main concern is the lack of understanding about data privacy in cloud environments. Currently, cloud service providers either advocate the third-party doctrine and deny users' rights to protect their data stored in the cloud, or rely on the notice-and-choice framework and present users with ambiguous, incomprehensible privacy statements without any meaningful privacy guarantee. In this regard, our research makes three main contributions. First, to capture users' privacy expectations in cloud environments, we conceptually divide personal data into two categories: visible data and invisible data. Visible data refer to information users intentionally create, upload to, and share through the cloud; invisible data refer to users' information retained in the cloud that is aggregated, analyzed, and repurposed without their knowledge or understanding. Second, to address users' privacy concerns raised by cloud computing, we propose two privacy protection frameworks, namely individual control and use limitation. The individual control framework emphasizes users' capability to govern access to visible data stored in the cloud. The use limitation framework emphasizes users' expectation to remain anonymous when invisible data are aggregated and analyzed by cloud-based data services. Finally, we investigate various techniques to support the new privacy protection frameworks in the context of four cloud-based data services: personal health record sharing, location-based proximity testing, link recommendation for social networks, and face tagging in photo management applications. For the first case, we develop a key-based protection technique to enforce fine-grained access control over users' digital health records. For the second case, we develop a key-less protection technique to achieve location-specific user selection. For the latter two cases, we develop distributed learning algorithms to prevent large-scale data harvesting, and we further combine these algorithms with query regulation techniques to achieve user anonymity. The picture that emerges from the above work is a bleak one. Regarding personal data, the reality is that we can no longer control it all. As communication technologies evolve, the scope of personal data has expanded beyond local, discrete silos and become integrated into the Internet. The traditional understanding of privacy must be updated to reflect these changes. In addition, because privacy is a particularly nuanced problem governed by context, there is no one-size-fits-all solution. While some cases can be salvaged by cryptography or other means, in others a rethinking of the trade-offs between utility and privacy appears necessary. / Ph. D.
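The abstract does not detail the key-less proximity-test technique, but one common construction for private proximity testing, sketched below under assumed parameters, hashes discretized location cells so that a service can match co-located users without ever seeing raw coordinates:

```python
# One common construction for a private proximity test: map coordinates to
# a coarse grid cell and share only a salted hash of the cell, so equal
# tokens imply co-location without revealing raw coordinates. Cell size and
# salt scheme are assumptions; users near a cell boundary can mismatch.
import hashlib

CELL_DEG = 0.01  # grid resolution (~1 km), a precision/privacy trade-off

def cell_token(lat, lon, epoch_salt):
    cell = (round(lat / CELL_DEG), round(lon / CELL_DEG))
    return hashlib.sha256(f"{epoch_salt}:{cell}".encode()).hexdigest()

salt = "epoch-1234"  # rotate per time epoch so tokens cannot be linked
alice = cell_token(40.7128, -74.0060, salt)
bob = cell_token(40.7130, -74.0062, salt)
print("nearby" if alice == bob else "apart")  # same cell -> "nearby"
```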
