411

Parental Depression-Related Disclosures with Children: An Analysis Using Communication Privacy Management Theory

Starcher, Shawn C. 30 April 2019
No description available.
412

I'm Ready for My Close-Up: Cameras as a Privacy Issue in State and Federal Courts

O'Meara, Laura Ann January 2020 (has links)
No description available.
413

Telemetry for Debugging Software Issues in Privacy-sensitive Systems / Telemetri för att felsöka programvaruproblem i sekretesskänsliga system

Landgren, Kasper, Tavakoli, Payam January 2023 (has links)
Traditionally, when debugging a software system, developers rely on having access to the input data that caused the error. However, with data privacy concerns on the rise, it is becoming increasingly difficult to rely on input data, as it might be sensitive or even classified. Telemetry, the practice of collecting information about a running system, can offer assistance here. This thesis proposes a telemetry solution for debugging systems where the input data is sensitive. The solution was implemented in a geographical 3D visualization system and evaluated on two aspects: its effectiveness in helping the developers working on the system locate whether and where a problem has occurred, and whether the input data that generated the output can be deduced from it. The results indicate that the telemetry solution is successful in helping developers identify system errors. Additionally, the system's privacy is maintained, as it is not feasible to directly ascertain the responsible input data from the output; however, no formal proof is presented, so no guarantee can be made. We conclude that telemetry can be a useful tool for developers, making the debugging process more effective while protecting sensitive input data, although this might not hold for other customers or systems. The thesis demonstrates the potential of using telemetry for debugging privacy-sensitive systems, with a proof-of-concept solution that can be improved upon in the future.
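
The abstract does not detail the implementation, so the following is only a rough sketch of the general idea, not the thesis's actual solution. The hypothetical `record_event` helper emits derived metrics (sizes, digests, error classes) instead of the raw input, which is the property the evaluation above describes; only Python standard-library modules are assumed.

```python
# Minimal illustration (not the thesis's implementation): emit telemetry that
# describes what happened during processing without recording the raw input.
import hashlib
import json
import time
from typing import Optional


def record_event(stage: str, input_blob: bytes, error: Optional[Exception] = None) -> str:
    """Build a telemetry record from derived metrics only, never the raw input."""
    event = {
        "stage": stage,                                    # where in the pipeline the event occurred
        "timestamp": time.time(),
        "input_size_bytes": len(input_blob),               # coarse shape of the input, not its content
        "input_digest": hashlib.sha256(input_blob).hexdigest()[:12],  # correlates runs without revealing data
        "error": type(error).__name__ if error else None,  # error class only; messages may echo sensitive input
    }
    return json.dumps(event)


if __name__ == "__main__":
    data = b"coordinates and attributes that must stay private"
    try:
        raise ValueError("simulated failure while triangulating")
    except ValueError as exc:
        print(record_event("triangulation", data, exc))
```
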
414

Examining the Impact of Privacy Awareness on User Self-Disclosure on Social Media

Attablayo, Prudence 02 June 2023
No description available.
415

RISK INTERPRETATION OF DIFFERENTIAL PRIVACY

Jiajun Liang (13190613) 31 July 2023
How to set privacy parameters is a crucial problem for the consistent application of DP in practice, and the current privacy parameters do not, by themselves, provide direct guidance. Moreover, different databases may leak different amounts of information, allowing attackers to enhance their attacks with what is available to them. This dissertation provides an additional interpretation of the current DP notions by introducing a framework that directly considers the worst-case average failure probability of attackers under different levels of knowledge.

To achieve this, we introduce a novel measure of attacker knowledge and establish a dual relationship between (type I error, type II error) and (prior, average failure probability). Leveraging this framework, we propose an interpretable paradigm for consistently setting privacy parameters across databases with varying levels of leaked information.

Furthermore, we characterize the minimax limit of private parameter estimation, driven by $1/(n(1-2p))^2 + 1/n$, where $p$ represents the worst-case probability risk and $n$ is the number of data points. This characterization is more interpretable than the current lower bound $\min\{1/(n\epsilon^2), 1/(n\delta^2)\} + 1/n$ for $(\epsilon,\delta)$-DP. Additionally, we identify the phase transition of private parameter estimation based on this limit and offer suggestions for protocol designs that achieve optimal private estimation.

Lastly, we consider a federated learning setting where the data are stored in a distributed manner and privacy-preserving interactions are required. We extend the proposed interpretation to federated learning under two scenarios: protecting against privacy breaches targeting the local nodes and against breaches targeting the center. Specifically, we consider a non-convex sparse federated parameter estimation problem and apply it to generalized linear models. We tackle two challenges in this setting. First, privacy requirements limit the number of queries to the database, which complicates initialization. Second, we overcome the heterogeneity of the distributions across local nodes in order to identify low-dimensional structures.
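
As a purely illustrative aid (not part of the dissertation), the snippet below evaluates the two rates exactly as stated in the abstract for some made-up parameter values, to show how each scales with the sample size $n$. The function names and parameter choices are assumptions for this sketch.

```python
# Illustrative only: evaluate the two rates quoted in the abstract for
# made-up parameter values and compare how they shrink as n grows.


def risk_based_rate(n: int, p: float) -> float:
    # Minimax limit stated in the abstract: 1/(n(1-2p))^2 + 1/n,
    # where p is the attacker's worst-case probability risk.
    return 1.0 / (n * (1.0 - 2.0 * p)) ** 2 + 1.0 / n


def eps_delta_rate(n: int, eps: float, delta: float) -> float:
    # Existing lower bound for (eps, delta)-DP: min{1/(n eps^2), 1/(n delta^2)} + 1/n.
    return min(1.0 / (n * eps**2), 1.0 / (n * delta**2)) + 1.0 / n


for n in (10**3, 10**4, 10**5):
    print(n, risk_based_rate(n, p=0.45), eps_delta_rate(n, eps=1.0, delta=1e-5))
```
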
416

DIFFERENTIALLY PRIVATE SUBLINEAR ALGORITHMS

Tamalika Mukherjee (16050815) 07 June 2023
Collecting user data is crucial for advancing machine learning, social science, and government policies, but the privacy of the users whose data is being collected is a growing concern. Differential Privacy (DP) has emerged as the standard notion for privacy protection with robust mathematical guarantees. Analyzing such massive amounts of data in a privacy-preserving manner motivates the study of differentially private algorithms that are also super-efficient.

This thesis initiates a systematic study of differentially private sublinear-time and sublinear-space algorithms. The contributions of this thesis are two-fold. First, we design some of the first differentially private sublinear algorithms for many fundamental problems. Second, we develop general DP techniques for designing differentially private sublinear algorithms.

We give the first DP sublinear algorithm for clustering by generalizing a subsampling framework from the non-DP sublinear-time literature. We give the first DP sublinear algorithm for estimating the maximum matching size. Our DP sublinear algorithm for estimating the average degree of a graph achieves a better approximation than previous works. We give the first DP algorithm for releasing $L_2$-heavy hitters in the sliding window model and a pure $L_1$-heavy hitter algorithm in the same model, which improves upon previous works.

We develop general techniques that address the challenges of designing sublinear DP algorithms. First, we introduce the concept of Coupled Global Sensitivity (CGS). Intuitively, the CGS of a randomized algorithm generalizes the classical notion of global sensitivity of a function by considering a coupling of the random coins of the algorithm when run on neighboring inputs. We show that one can achieve pure DP by adding Laplace noise proportional to the CGS of an algorithm. Second, we give a black-box DP transformation for a specific class of approximation algorithms. We show that such algorithms can be made differentially private without sacrificing accuracy, as long as the function has small global sensitivity. In particular, this transformation gives rise to sublinear DP algorithms for many problems, including triangle counting, the weight of the minimum spanning tree, and norm estimation.
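
The CGS-based noise addition described above follows the same pattern as the classical Laplace mechanism, with CGS playing the role of the sensitivity term. The sketch below shows only the classical mechanism, not the thesis's CGS construction; the `laplace_mechanism` helper and the counting-query example are assumptions for illustration, and NumPy is assumed available.

```python
# Minimal sketch of the classical Laplace mechanism. The Coupled Global
# Sensitivity approach above generalizes the sensitivity term, but the
# noise-addition step has the same shape: scale = sensitivity / epsilon.
import numpy as np


def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float,
                      rng: np.random.Generator) -> float:
    """Release true_value with pure epsilon-DP by adding Laplace(sensitivity/epsilon) noise."""
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)


rng = np.random.default_rng(0)
# Example: a counting query has global sensitivity 1, since adding or
# removing one person changes the count by at most 1.
print(laplace_mechanism(true_value=1234.0, sensitivity=1.0, epsilon=0.5, rng=rng))
```
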
417

LEVERAGING MULTIMODAL SENSING FOR ENHANCING THE SECURITY AND PRIVACY OF MOBILE SYSTEMS

Habiba Farrukh (13969653) 26 July 2023
Mobile systems, such as smartphones, wearables (e.g., smartwatches, AR/VR headsets), and IoT devices, have come a long way from being just a method of communication to sophisticated sensing devices that monitor and control several aspects of our lives. These devices have enabled useful applications in domains ranging from healthcare and finance to the energy and agriculture industries. While this advancement has enabled applications in several aspects of human life, it has also made these devices an interesting target for adversaries.

In this dissertation, I focus on how the various sensors on mobile devices can be exploited by adversaries to violate users' privacy, and I present methods that use sensors to improve the security of these devices. My thesis posits that multi-modal sensing can be leveraged to enhance the security and privacy of mobile systems.

First, I describe my work demonstrating that human interaction with mobile devices and their accessories (e.g., stylus pencils) generates identifiable patterns in permissionless mobile sensors' data, which reveal sensitive information about users. Specifically, I developed S3 to show how embedded magnets in stylus pencils impact the mobile magnetometer sensor and can be exploited to infer a user's incredibly private handwriting. Then, I designed LocIn to infer a user's indoor semantic location from 3D spatial data collected by mixed reality devices through LiDAR and depth sensors. These works highlight new privacy issues due to advanced sensors on emerging commodity devices.

Second, I present my work that characterizes the threats against smartphone authentication and IoT device pairing and proposes usable and secure methods to protect against these threats. I developed two systems, FaceRevelio and IoTCupid, to enable reliable and secure user and device authentication, respectively, to protect users' private information (e.g., contacts, messages, credit card details) on commodity mobile devices and to allow secure communication between IoT devices. These works enable usable authentication on diverse mobile and IoT devices and eliminate the dependency on sophisticated hardware for user-friendly authentication.
418

Data Security and Privacy under the Binary Cloak

Ji, Tianxi 26 August 2022 (has links)
No description available.
419

Private Law & Public Space : The Canadian Privacy Torts in an Era of Personal Remote-Surveillance Technology

Thomasen, Kristen 29 June 2022 (has links)
As increasingly sophisticated personal-use technologies like drones and home surveillance systems become more common, so too will interpersonal privacy conflicts. Given the nature of these new personal-use technologies, privacy conflicts will increasingly occur in public spaces. Tort law is one area of the Canadian legal system that can address interpersonal conflict and rights infringements between people with no other legal relationship. However, building on a historical association between privacy and private property, the common law and statutory privacy torts in Canada fail to respond to such conflicts, a failure I argue is inappropriate. Privacy is an important dimension of public space and of the social interactions and relationships that arise there. Failing to recognize public-space privacy in tort law leads to an overly narrow understanding of privacy and can be considered contrary to binding precedent holding that the common law should evolve in line with (or at a minimum, not contrary to) Charter values. The Charter values of privacy, substantive equality, and expressive freedom support various reforms to the judicial understanding of the privacy torts in Canada. Privacy, also understood as "private affairs" or "private facts" in tort, should not be predicated on property and can sometimes take on greater value in public spaces. Privacy interests should be assessed through a normative lens, with a view to the long-term implications of a precedent for both privacy and substantive equality. Courts should assess privacy through a subjective-objective lens that allows for consideration of the lived experiences and expertise of the parties, their relative power, and their relationships. Adopting these principles into the statutory and common law torts would permit tort law to serve as a legal mechanism for addressing interpersonal, technology-mediated privacy conflicts arising in public spaces. This will be a socially valuable development as such conflicts become increasingly common and potentially litigated.
420

Web browser privacy: Popular desktop web browsers' ability to continuously spoof their fingerprint

Henningsson, Sebastian, Karlsson, Anton January 2022 (has links)
Background. Web tracking is a constant threat to our privacy when browsing the web. Multiple tracking methods exist, but browser fingerprinting is more elusive and difficult to control. Browser fingerprinting works by a website collecting all kinds of browser and system information from visiting clients and then combining it into one set of information that can uniquely identify users. Objectives. In this thesis, we tested three of today's most used desktop web browsers to determine their ability to utilize one type of countermeasure, attribute spoofing. We aimed to determine how the browsers perform in two cases: first, when running with a default configuration, and second, when the attribute spoofing is improved with the help of both altered settings and installed extensions. We also aimed to determine whether the choice of browser matters in this respect. Methods. We conducted an experiment collecting 60 fingerprints from each browser and determined the effectiveness of the attribute spoofing via a weight-based system. We also used statistics to see the value range for spoofed attributes and to determine whether a browser restart is required for certain spoofing to occur. Results. Our results show little to no attribute spoofing when the browsers run in their default configuration. However, significant improvements were made through anti-fingerprinting extensions. Conclusions. If the tested browsers do not utilize any countermeasure other than attribute spoofing, using them in their default configuration can leave a user alarmingly vulnerable to browser fingerprinting. Installing extensions aimed at improving this protection is therefore advised.
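
The thesis scores spoofing effectiveness with a weight-based system; the toy sketch below is not that system, only an illustration of the underlying idea: hashing the collected attributes into a single fingerprint and counting how many distinct fingerprints repeated visits produce (a browser that spoofs effectively should yield many). All function names and sample attributes are invented for this sketch.

```python
# Toy illustration (not the thesis's weight-based method): combine collected
# attributes into one fingerprint hash and measure how often it changes
# across repeated visits by the same browser.
import hashlib
import json


def fingerprint(attributes: dict) -> str:
    """Hash a canonical JSON encoding of the collected attributes."""
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]


def spoofing_score(collected: list) -> float:
    """Fraction of distinct fingerprints across visits (1.0 = every visit looked different)."""
    hashes = [fingerprint(attrs) for attrs in collected]
    return len(set(hashes)) / len(hashes)


visits = [
    {"userAgent": "BrowserX/1.0", "screen": "1920x1080", "timezone": "UTC+1"},
    {"userAgent": "BrowserX/1.0", "screen": "1920x1080", "timezone": "UTC+1"},
    {"userAgent": "BrowserX/1.0", "screen": "2560x1440", "timezone": "UTC+1"},  # spoofed screen size
]
print(spoofing_score(visits))  # 2 distinct fingerprints out of 3 visits -> 0.67
```
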
