  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

Designing Privacy-Enhanced Interfaces on Digital Tabletops for Public Settings

Irannejad, Arezoo January 2013 (has links)
Protection of personal information has become a critical issue in the digital world. Many companies and service provider websites have adopted privacy policies and practices to protect users’ personal information to some extent. In addition, various governments are adopting privacy protection legislation. System developers, service providers, and interface designers play an important role in determining how to make systems fulfill legal requirements and satisfy users. The human factor requirements for effective privacy interface design can be categorized into four groups: (1) comprehension, (2) consciousness, (3) control, and (4) consent (Patrick & Kenny, 2003). Moreover, the type of technology that people are engaged with has a crucial role in determining what type of practices should be adopted. As Weiser (1996) envisioned, we are now in a “ubiquitous computing” (Ubicomp) era in which technologies such as digital tabletops (what Weiser called LiveBoards) are emerging for use in public settings. The collaborative and open nature of this type of smart device introduces new privacy threats that have not yet been thoroughly investigated and as a result have not been addressed in companies’ and governmental privacy statements and legislation. In this thesis, I provide an analytical description of the privacy threats unique to tabletop display environments. I then present several design suggestions for a tabletop display interface that addresses and mitigates these threats, followed by a qualitative evaluation of these designs based on Patrick and Kenny’s (2003) model. Results show that most participants have often experienced being shoulder-surfed or have had privacy issues when sharing information with someone in a collaborative environment. They therefore found most of the techniques designed in this thesis helpful in protecting their information privacy when engaging in online social activities on digital tabletops in public settings. 
Among the proposed designs, the first three proved effective in providing the required privacy. Designs 4 and 5, however, had shortfalls that made them less helpful: participants had difficulty understanding what they had to do to complete the given tasks.

Understanding privacy leakage concerns in Facebook : a longitudinal case study

Jamal, Arshad January 2013 (has links)
This thesis focuses on examining users’ perceptions of privacy leakage in Facebook – the world’s largest and most popular social network site (SNS). The global popularity of this SNS offers a hugely tempting resource for organisations engaged in online business. The personal data willingly shared between online friends’ networks intuitively appears to be a natural extension of current advertising strategies such as word-of-mouth and viral marketing. Therefore, organisations are increasingly adopting innovative ways to exploit the detail-rich personal data of SNS users for business marketing. However, commercial use of such personal information has provoked outrage amongst Facebook users and has radically highlighted the issue of privacy leakage. To date, little is known about how SNS users perceive such leakage of privacy. A greater understanding of the form and nature of SNS users’ concerns about privacy leakage would therefore contribute to the current literature as well as help to formulate best practice guidelines for organisations. Given the fluid, context-dependent and temporal nature of privacy, a longitudinal case study representing the launch of Facebook’s social Ads programme was conducted to investigate the phenomenon of privacy leakage within its real-life setting. Qualitative user blog commentary was collected between November 2007 and December 2010 during the two-stage launch of the social Ads programme. Grounded theory data analysis procedures were used to analyse users’ blog postings. The resulting taxonomy shows that business integrity, user control, transparency, data protection breaches, automatic information broadcast and information leak are the core privacy leakage concerns of Facebook users. Privacy leakage concerns suggest three limits, or levels: organisational, user and legal, which provide the basis for understanding the nature and scope of the exploitation of SNS users’ data for commercial purposes. 
The case study reported herein is novel, as existing empirical research has not identified and analysed privacy leakage concerns of Facebook users.

Data-level privacy through data perturbation in distributed multi-application environments

de Souza, Tulio January 2016 (has links)
Wireless sensor networks used to have a main role as a monitoring tool for environmental purposes and animal tracking. This spectrum of applications, however, has dramatically grown in the past few years. Such evolution means that what used to be application-specific networks are now multi-application environments, often with federation capabilities. This shift results in a challenging environment for data privacy, mainly caused by the broadening of the spectrum of data access points and involved entities. This thesis first evaluates existing privacy preserving data aggregation techniques to determine how suitable they are for providing data privacy in this more elaborate environment. Such evaluation led to the design of the set difference attack, which exploits the fact that these techniques all rely purely on data aggregation to achieve privacy, an approach shown through simulation to be unsuitable for the task. The evaluation also indicates that some form of uncertainty is required in order to mitigate the attack. Another relevant finding is that the attack can also be effective against standalone networks, by exploiting the node availability factor. Uncertainty is achieved via the use of differential privacy, which offers a strong and formal privacy guarantee through data perturbation. In order to make it suitable for a wireless sensor network environment, which mainly deals with time-series data, two new approaches have been proposed. These have a contrasting effect when it comes to utility and privacy levels, offering a flexible balance between privacy and data utility for sensed entities and data analysts/consumers. Lastly, this thesis proposes a framework to assist in the design of privacy preserving data aggregation protocols to suit application needs while at the same time complying with desired privacy requirements. 
The framework's evaluation compares and contrasts several scenarios to demonstrate the level of flexibility and effectiveness that the designed protocols can provide. Overall, this thesis demonstrates that data perturbation can be made significantly practical through the proposed framework. Although some problems remain, with further improvements to data correlation methods and better use of some intrinsic characteristics of such networks, the use of data perturbation may become a practical and efficient privacy preserving mechanism for wireless sensor networks.
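The perturbation step at the heart of this approach can be illustrated with the standard Laplace mechanism applied to a bounded-sum query. The sketch below is a generic illustration, not the thesis's protocols: the function names, the bound on readings, and the ε value are illustrative assumptions.

```python
import math
import random

def laplace_sample(scale, rng):
    # Inverse-CDF sampling: if U ~ Uniform(-1/2, 1/2), then
    # -scale * sign(U) * ln(1 - 2|U|) ~ Laplace(0, scale).
    u = rng.random()
    while u == 0.0:                  # avoid log(0) at the boundary
        u = rng.random()
    u -= 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_sum(readings, epsilon, max_reading, rng):
    # Each sensor contributes at most max_reading to the sum, so the
    # L1 sensitivity of the query is max_reading; adding Laplace noise
    # with scale = sensitivity / epsilon yields epsilon-differential privacy.
    return sum(readings) + laplace_sample(max_reading / epsilon, rng)

rng = random.Random(7)
readings = [rng.uniform(15.0, 25.0) for _ in range(50)]  # e.g. temperature readings
noisy = private_sum(readings, epsilon=0.5, max_reading=25.0, rng=rng)
```

A smaller ε forces a larger noise scale, which is exactly the privacy/utility trade-off that any perturbation-based design must balance.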

The Role of Cognitive Disposition in Re-examining the Privacy Paradox: A Neuroscience Study

Mohammed, Zareef 01 January 2017 (has links)
The privacy paradox is a phenomenon whereby individuals continue to disclose their personal information, contrary to their claim of concerns for the privacy of their personal information. This study investigated the privacy paradox to better understand individuals' decisions to disclose or withhold their personal information. The study argued that individuals’ decisions are based on a cognitive disposition, which involves both rational and emotional mental processes. While the extended privacy calculus model was used as the theoretical basis for the study, findings from cognitive neuroscience were applied to it to address its limitation in assuming individuals are purely rational decision-makers. Three within-subjects experiments were conducted whereby each subject participated in all three experiments as if they were one. Experiment 1 captured the neural correlates of mental processes involved in privacy-related decisions, while experiments 2 and 3 were factorial-design experiments used for testing the relationship of neural correlates in predicting privacy concerns and personal information disclosure. The findings of this study indicated that at least one neural correlate of every mental process involved in privacy-related decisions significantly influenced personal information disclosure, except for uncertainty. However, there were no significant relationships between mental processes and privacy concerns, except for Brodmann’s Area 13, a neural correlate of distrust. This relationship, however, was positive, opposite to what was hypothesized. Furthermore, interaction effects indicated that individuals put more emphasis on negative perceptions in privacy-related situations. This study contributed to the information privacy field by supporting the argument that individuals’ privacy-related decisions are both rational and emotional. 
Specifically, the privacy paradox cannot be explained through solely rational cost-benefit analysis or through an examination of individuals’ emotions alone.

Discovering Constructs and Dimensions for Information Privacy Metrics

Dayarathna, Rasika January 2013 (has links)
Privacy is a fundamental human right. During the last decades, in the information age, information privacy has become one of the most essential aspects of privacy. Information privacy is concerned with protecting personal information pertaining to individuals. Organizations, which frequently process personal information, and individuals, who are the subjects of the information, have different needs, rights and obligations. Organizations need to utilize personal information as a basis to develop tailored services and products for their customers in order to gain advantage over their competitors. Individuals need assurance from the organizations that their personal information is not changed, disclosed, deleted or misused in any other way. Without this guarantee from the organizations, individuals will be more unwilling to share their personal information. Information privacy metrics are a set of parameters used for the quantitative assessment and benchmarking of an organization’s measures to protect personal information. These metrics can be used by organizations to demonstrate, and by individuals to evaluate, the type and level of protection given to personal information. Currently, there are no systematically developed, established or widely used information privacy metrics. Hence, the purpose of this study is to establish a solid foundation for building information privacy metrics by discovering some of the most critical constructs and dimensions of these metrics. The research was conducted within the general research strategy of design science and by applying research methods such as data collection and analysis informed by grounded theory, as well as surveys using interviews and questionnaires in Sweden and in Sri Lanka. The result is a conceptual model for information privacy metrics including its basic foundation: the constructs and dimensions of the metrics. 
At the time of the doctoral defense, the following paper was unpublished and had a status as follows: Paper 6: Accepted.

The Theory and Applications of Homomorphic Cryptography

Henry, Kevin January 2008 (has links)
Homomorphic cryptography provides a third party with the ability to perform simple computations on encrypted data without revealing any information about the data itself. Typically, a third party can calculate either the encrypted sum or the encrypted product of two encrypted messages. This is possible due to the fact that the encryption function is a group homomorphism, and thus preserves group operations. This makes homomorphic cryptosystems useful in a wide variety of privacy preserving protocols. A comprehensive survey of known homomorphic cryptosystems is provided, including formal definitions, security assumptions, and outlines of security proofs for each cryptosystem presented. Threshold variants of several homomorphic cryptosystems are also considered, with the first construction of a threshold Boneh-Goh-Nissim cryptosystem given, along with a complete proof of security under the threshold semantic security game of Fouque, Poupard, and Stern. This approach is based on Shoup's approach to threshold RSA signatures, which has previously been applied to the Paillier and Damgård-Jurik cryptosystems. The question of whether or not this approach is suitable for other homomorphic cryptosystems is investigated, with results suggesting that a different approach is required when decryption requires a reduction modulo a secret value. The wide variety of protocols utilizing homomorphic cryptography makes it difficult to provide a comprehensive survey, and while an overview of applications is given, it is limited in scope and intended to provide an introduction to the various ways in which homomorphic cryptography is used beyond simple addition or multiplication of encrypted messages. In the case of strong conditional oblivious transfer, a new protocol implementing the greater-than predicate is presented, utilizing some special properties of the Boneh-Goh-Nissim cryptosystem to achieve security against a malicious receiver.
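The group-homomorphism property described above can be made concrete with a toy instance of the additively homomorphic Paillier cryptosystem, one of the schemes such surveys cover. This is a minimal textbook sketch with deliberately tiny primes, not a secure implementation and not the threshold variants discussed in the thesis.

```python
import math
import random

def paillier_keygen(p=1789, q=1861):
    # Tiny illustrative primes; a real deployment needs primes of 1024+ bits.
    n = p * q
    lam = math.lcm(p - 1, q - 1)   # Carmichael function of n
    g = n + 1                      # standard simple choice of generator
    mu = pow(lam, -1, n)           # valid because L(g^lam mod n^2) = lam mod n when g = n+1
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:     # r must be a unit mod n
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    n2 = n * n
    ell = (pow(c, lam, n2) - 1) // n   # the function L(x) = (x - 1) / n
    return (ell * mu) % n

pub, priv = paillier_keygen()
c1, c2 = encrypt(pub, 123), encrypt(pub, 456)
c_sum = (c1 * c2) % (pub[0] ** 2)      # multiplying ciphertexts adds plaintexts
assert decrypt(pub, priv, c_sum) == 123 + 456
```

Because encryption maps plaintext addition to ciphertext multiplication, a third party holding only `c1` and `c2` can compute an encryption of the sum without ever seeing the plaintexts.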

Personalized Marketing : An invasion of privacy or an approved phenomenon? An empirical study of how organizations can respond to consumers’ concern over the threats of online privacy.

Birgisdottir, Johanna, Amin, Hiral January 2012 (has links)
The authors of this study analysed the increasing use of personalized marketing and consumer concerns regarding the access to personal information. The purpose was to find out how companies could react to these concerns. Several theoretical concepts were explored, such as Personal Data, Personalized Marketing, Privacy Concerns, Privacy Policies, Consumer Trust and Consumer Behaviour. Facebook Inc. was analysed as an example to address the problem. An online survey was conducted among university students and two interviews were performed with representatives from the Data Inspection Board in Sweden. The main findings were that individuals seem to approve of personalized marketing but are concerned about their privacy. Companies should therefore inform their consumers about how personal data is used for personalized marketing, respect their rights, and take governmental regulations into consideration.

Location privacy in automotive telematics

Iqbal, Muhammad Usman, Surveying & Spatial Information Systems, Faculty of Engineering, UNSW January 2009 (has links)
The convergence of transport, communication, computing and positioning technologies has enabled a smart car revolution. As a result, pricing of roads based on telematics technologies has gained significant attention. While there are promised benefits, systematic disclosure of precise location has the ability to impinge on privacy of a special kind, known as location privacy. The aim of this thesis is to provide technical designs that enhance the location privacy of motorists without compromising the benefits of accurate pricing. However, this research looks beyond a solely technology-based solution. For example, the ethical implications of the use of GPS data in pricing models have not been fully understood. Likewise, minimal research exists to evaluate the technical vulnerabilities that could be exploited to avoid criminal or financial penalties. To design a privacy-aware system, it is important to understand the needs of the stakeholders, most importantly the motorists. Knowledge about the anticipated privacy preferences of motorists is important in order to make reasonable predictions about their future willingness to adopt these systems. There is limited research so far on user perceptions regarding specific payment options in the uptake of privacy-aware systems. This thesis provides a critical privacy assessment of two mobility pricing systems, namely electronic tolls and mobility-priced insurance. As a result of this assessment, policy recommendations are developed which could support a common approach in facilitating privacy-aware mobility-pricing strategies. This thesis also evaluates the existing and potential inferential threats and vulnerabilities to develop security and privacy recommendations for privacy-aware pricing designs for tolls and insurance. 
Utilising these policy recommendations and analysing user perceptions with regard to the feasibility of sustaining privacy and willingness to pay for privacy, two privacy-aware mobility pricing designs have been presented which bridge the entire array of privacy interests and bring them together into a unified approach capable of sustaining legal protection as well as satisfying the privacy requirements of motorists. It is maintained that it is only by social and technical analysis working in tandem that critical privacy issues in relation to location can be addressed.