
PET-Exchange: A Privacy Enhanced Trading Framework : A Framework for Limit-Order Matching using Homomorphic Encryption in Trading / PET-Exchange: Ett Ramverk för Integritetsbevarande Limitordrar i Kontinuerliga Auktioner med Homomorfisk Kryptering

Wahlman, Jacob January 2022
Over recent decades, an increasing number of new traders have entered the securities markets to trade securities such as stocks and bonds on electronic and physical exchanges. This increase in trader activity can largely be attributed to a simpler trading process, including the growth of electronic securities exchanges that allow for more dynamic and global trading platforms. Ever since their introduction, electronic exchanges have grown in terms of volume traded, while the underlying trading mechanisms have mostly stayed the same over the years, with some additions and improvements. However, over the past decade, high-frequency traders (HFTs) using algorithmic trading have shifted the playing field with practices that many consider unethical. Furthermore, insider trading continues to cause trust issues on certain trading platforms. Multiple solutions to these kinds of unethical trading behavior have been proposed, among them homomorphic encryption as a potential preventative mechanism. This thesis analyses the properties and effects of a privacy-preserving framework for trading securities on an electronic stock exchange. The method used to evaluate the effects on trading was to implement a framework for handling trading and matching encrypted orders. The framework was then evaluated against its unencrypted counterpart to compare their performance in terms of volume handled, number of orders matched, and timings of certain instructions. Their security properties were also analyzed to understand the proposed solution's potential impact on transparency, fairness, and opportunities for financial crime in an electronic securities exchange. The implementation was evaluated on its privacy-preserving properties by examining its ability to prevent information disclosure in the trading process, and its performance was evaluated using a generated trading session that simulated the market with sample trade data. Finally, based on the proposed framework and the findings on privacy preservation and performance, a conclusion is presented regarding its applicability as an alternative to off-exchange trading and as a preventative measure against unfair practices and financial crime in trading. The evaluation showed that the privacy-preserving and cryptographic properties of the suggested encrypted exchange were reasonably strong and fulfilled the goal of preventing unfair advantages in trading stemming from access to plaintext order information. However, the performance of the suggested implementation shows that more work is needed for it to be viable on public electronic stock exchanges, although the solution could be suitable for small-scale trading and privacy-preserving auctions.
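To make the matching step concrete, below is a minimal Python sketch of plaintext limit-order matching with price-time priority, the kind of baseline the encrypted framework is compared against. In the encrypted variant, the price comparison in `_match` would be evaluated over homomorphically encrypted values rather than plaintext integers. All class and method names here are illustrative assumptions, not the thesis's actual implementation.

```python
from dataclasses import dataclass, field
import heapq
import itertools

_seq = itertools.count()  # global tie-breaker preserving time priority


@dataclass(order=True)
class Order:
    sort_key: tuple = field(init=False, repr=False)
    price: int       # limit price in the smallest currency unit
    quantity: int
    side: str        # "buy" or "sell"

    def __post_init__(self):
        # Buy orders: highest price first; sell orders: lowest price first.
        key_price = -self.price if self.side == "buy" else self.price
        self.sort_key = (key_price, next(_seq))


class OrderBook:
    """Toy continuous-auction order book with price-time priority."""

    def __init__(self):
        self.bids: list[Order] = []
        self.asks: list[Order] = []

    def submit(self, order: Order) -> list[tuple[int, int]]:
        book = self.bids if order.side == "buy" else self.asks
        heapq.heappush(book, order)
        return self._match()

    def _match(self) -> list[tuple[int, int]]:
        trades = []
        # This plaintext comparison is the step an encrypted exchange would
        # have to perform under homomorphic encryption instead.
        while self.bids and self.asks and self.bids[0].price >= self.asks[0].price:
            bid, ask = self.bids[0], self.asks[0]
            qty = min(bid.quantity, ask.quantity)
            trades.append((ask.price, qty))  # simplified: execute at the ask price
            bid.quantity -= qty
            ask.quantity -= qty
            if bid.quantity == 0:
                heapq.heappop(self.bids)
            if ask.quantity == 0:
                heapq.heappop(self.asks)
        return trades


book = OrderBook()
book.submit(Order(price=101, quantity=10, side="buy"))
print(book.submit(Order(price=100, quantity=4, side="sell")))  # -> [(100, 4)]
```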

Differential privacy and machine learning: Calculating sensitivity with generated data sets / Differential privacy och maskininlärning: Beräkning av sensitivitet med genererade dataset

Lundmark, Magnus, Dahlman, Carl-Johan January 2017
Privacy has never been more important to maintain than in today's information society. Companies and organizations collect large amounts of data about their users. This information is considered valuable because of its statistical use, which provides insight into areas such as medicine, economics, and behavioural patterns among individuals. A technique called differential privacy has been developed to ensure that the privacy of individuals is maintained, making it possible to produce useful statistics while protecting the individual. The disadvantage of differential privacy, however, is the magnitude of the randomized noise applied to the data in order to hide the individual. This research examined whether it is possible to improve the usability of the privatized result by using machine learning to generate a data set that the noise can be based on. The purpose of the generated data set is to provide a local representation of the underlying data set that is safe to use when calculating the magnitude of the randomized noise. The results show that this approach is currently not a feasible solution, but they point to possible directions for further research on improving the usability of differential privacy. The research indicates that limiting the noise to a lower bound calculated from the underlying data set might be enough to meet all privacy requirements. Furthermore, the accuracy of the machine learning algorithm and its impact on the usability of the noise was not fully investigated and could be of interest in future studies. / Aldrig tidigare har integritet varit viktigare att upprätthålla än i dagens informationssamhälle, där företag och organisationer samlar stora mängder data om sina användare. Merparten av denna information är sedd som värdefull och kan användas för att skapa statistik som i sin tur kan ge insikt inom områden som medicin, ekonomi eller beteendemönster bland individer. För att säkerställa att en enskild individs integritet upprätthålls har en teknik som heter differential privacy utvecklats. Denna möjliggör framtagandet av användbar statistik samtidigt som individens integritet upprätthålls. Differential privacy har dock en nackdel, och det är storleken på det randomiserade bruset som används för att dölja individen i en fråga om data. Denna undersökning undersökte huruvida detta brus kunde förbättras genom att använda maskininlärning för att generera ett data set som bruset kunde baseras på. Tanken var att det genererade datasetet skulle kunna ge en lokal representation av det underliggande datasetet som skulle vara säker att använda vid beräkning av det randomiserade brusets storlek. Forskningen visar att detta tillvägagångssätt för närvarande inte stöds av resultaten. Storleken på det beräknade bruset var inte tillräckligt stort och resulterade därmed i en oacceptabel mängd läckt information. Forskningen visar emellertid att genom att begränsa bruset till en lägsta nivå som är beräknad från det lokala datasetet möjligtvis kan räcka för att uppfylla alla sekretesskrav. Ytterligare forskning behövs för att säkerställa att detta ger den nödvändiga nivån av integritet. Vidare undersöktes inte noggrannheten hos maskininlärningsalgoritmen och dess inverkan på brusets användbarhet vilket kan vara en inriktning för vidare studier.
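For context, the noise discussed above is typically drawn from a Laplace distribution with scale sensitivity/epsilon, where the sensitivity is the maximum amount one individual's record can change the query result. A minimal sketch of this standard Laplace mechanism follows; it illustrates the baseline mechanism only, not the thesis's machine-learning-generated data set, and the clipping bound and example values are illustrative assumptions.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float,
                      rng=None) -> float:
    """Return an epsilon-differentially private estimate of `true_value`.

    The noise scale b = sensitivity / epsilon is the "magnitude of the
    randomized noise" discussed above: a larger sensitivity or a smaller
    privacy budget epsilon gives noisier, less useful output.
    """
    rng = rng or np.random.default_rng()
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Example: privately release the mean income of a small data set.
incomes = np.array([31_000, 45_500, 52_000, 39_800, 61_200], dtype=float)

# Sensitivity of the mean when each income is clipped to [0, 100_000] and
# neighbouring data sets differ in one record's value (illustrative bound).
sensitivity = 100_000 / len(incomes)

print(round(laplace_mechanism(incomes.mean(), sensitivity, epsilon=1.0), 2))
```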

An exploratory paper of the privacy paradox in the age of big data and emerging technologies / En undersökning av the privacy paradox i en tid med big data och ny teknik

Serra, Michelle January 2018
Technological innovations and advancements are helping people gain an increasingly comfortable life, as well as expand their social capital through online networks, by offering individuals new opportunities to share personal information. By collecting vast amounts of data, a whole new range of services can be offered, information can be collected and compared, and a new level of individualization can be reached. However, with these new technical capacities come the omnipresence of various devices gathering data, potential threats to privacy, and individuals' increasing concern over data privacy. This paper aims to shed light on the 'privacy paradox' phenomenon, the dichotomy between privacy attitude, concern, and behavior, by examining previous literature as well as using an online survey (N=463). The findings indicate that there is a difference between attitude, concern, and actual behavior. While individuals value their data privacy and are concerned about the information collected about them, few take action to protect it, and actions rarely align with expressed concerns. However, the 'privacy paradox' is a complex phenomenon that requires further research, especially given the implications of a data-driven society and the introduction of emerging technologies such as Artificial Intelligence and the Internet of Things. / Tekniska innovationer och framsteg har bidragit till att människor kan erbjudas en alltmer bekväm livsstil. Genom insamling av stora mängder data kan individer erbjudas ett helt nytt utbud av tjänster, information kan samlas in och jämföras, och en helt ny nivå av individualisering kan uppnås. Dock innebär dessa innovationer en allt större närvaro av datainsamlande enheter, potentiella hot mot privatliv, samt individers ökade oro kring dataintegritet. Denna uppsats undersöker "the privacy paradox", skillnaden mellan attityd och beteende kring datasäkerhet, och dess konsekvenser i ett datastyrt samhälle i och med att ny teknik introduceras. Undersökningen har skett genom en litteraturstudie samt en enkätundersökning (N=463) och resultaten visar på att det finns en skillnad mellan attityd och beteende. Individer värderar datasäkerhet och är oroliga kring vilken mängd information som samlas in, dock är det få som agerar för att inte dela information och attityd går sällan i linje med faktiskt beteende. "The privacy paradox" är ett komplext fenomen och mer forskning krävs, speciellt i och med introduktion av ny teknik så som Artificiell intelligens och Internet of Things.

Upplevelser av ‘Privacy’ vid förlossning : en kvalitativ metasyntes ur kvinnans och barnmorskans perspektiv / Experiences of ‘Privacy’ in childbirth : a qualitative meta synthesis from the perspective of the woman and the midwife

Hed, Linda, Nyström, Emma January 2022
Bakgrund: Bevarad 'privacy' under förlossningen kan underlätta den fysiologiska födselns process och minska känslor av stress och otrygghet. Detta ger ökade förutsättningar att hantera smärta och minska lidande och leder även till minskat behov av interventioner. Rätten till 'privacy' i samband med födslar betonas i globala standarder för födslovården men upplevelser av 'privacy' vid födslar är lite studerat. Syfte: Att identifiera och syntetisera kvalitativ forskning om kvinnors och barnmorskors upplevelser av 'privacy' i samband med vaginala födslar. Metod: Kvalitativ metasyntes med metaetnografisk analysmetod. Resultatet baserades på 20 vetenskapliga artiklar framsökta i PubMed, Cinahl och Web of Science, granskade med JBI QARI. Resultat: Huvudtemat 'Privacy' som tillgången till resurser innefattar upplevelser av den fysiska miljön, mänskliga och personliga resurser samt av hur kultur och religion påverkar. Bevarandet av 'privacy' har en relationell komponent som påverkas av kvinnans och barnmorskans personliga resurser. Huvudtemat 'Privacy' som kontroll över tillgången till ens jag beskriver upplevelser av att ha eller förlora kontrollen samt av antalet personer i födslorummet. 'Privacy' innebär mer än den fysiska miljön, som känslan av att rummet är den födandes plats, vara huvudpersonen, känna trygghet och ha kontroll över vad som sker. Slutsats: Metasyntesen belyser att bevarad 'privacy' är betydelsefullt för välmående vid födslar. Globalt är bevarandet av 'privacy' otillräckligt. Miljön som kvinnorna önskar för bevarad 'privacy' har stora likheter med den som krävs för att främja födselns fysiologi. Klinisk tillämpbarhet: Metasyntesen bidrar med konkreta åtgärder för att bevara kvinnans 'privacy' och kan vara till stöd i den enskilda barnmorskans arbete med att främja födselns fysiologi samt till barnmorskor och beslutsfattare vid utformning av arbetsrutiner och förlossningsenheter. / Background: Preserved privacy during childbirth may facilitate the process of physiological birth and reduce feelings of stress and insecurity. This gives better preconditions for handling pain and reducing suffering, and it also reduces the need for medical interventions. The right to privacy in birth is emphasized in global standards for maternity care, but experiences of privacy in childbirth are sparsely researched. Aim: To identify and synthesize qualitative research about women's and midwives' experiences of privacy in vaginal childbirth. Method: A qualitative meta-synthesis with meta-ethnographic analysis. The results were based on 20 scientific articles retrieved from PubMed, Cinahl and Web of Science and reviewed with JBI QARI. Results: The main theme Privacy as access to resources includes experiences of the physical environment, of human and personal resources, and of the impact of religion and culture. The preservation of privacy has a relational component which is influenced by the personal resources of the woman and the midwife. The main theme Privacy as control over access to the self describes experiences of having or losing control and of the number of people in the birth room. Privacy includes more than the physical environment, such as the feeling that the room is the birthing woman's place, being the protagonist, feeling safe and having control over what is happening. Conclusion: The meta-synthesis highlights that preserved privacy is important for well-being during birth. Globally, the preservation of privacy is insufficient. The environment that women desire for preserved privacy has great similarities to the one required to promote the physiology of birth. Clinical applicability: The meta-synthesis contributes concrete measures for preserving the woman's privacy and can therefore support the individual midwife's work in promoting the physiology of birth, as well as midwives and decision-makers when designing work routines and delivery units.

An information processing model and a set of risk identification methods for privacy impact assessment in an international context / 国際的な文脈におけるプライバシー影響評価のための情報取扱モデル及び一連のリスク特定手法

Kuroda, Yuki 25 September 2023
Kyoto University / New-system doctoral program / Doctor of Informatics / 甲第24935号 / 情博第846号 / 新制||情||142(附属図書館) / Graduate School of Informatics, Department of Social Informatics, Kyoto University / (Chief examiner) Professor 黒田 知宏; Professor 矢守 克也; Professor 曽我部 真裕 / Eligible under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Informatics / Kyoto University / DGAM

A Jagged Little Pill: Ethics, Behavior, and the AI-Data Nexus

Kormylo, Cameron Fredric 21 December 2023
The proliferation of big data and the algorithms that utilize it have revolutionized the way in which individuals make decisions, interact, and live. This dissertation presents a structured analysis of the behavioral ramifications of artificial intelligence (AI) and big data in contemporary society. It offers three distinct but interrelated explorations. The first chapter investigates consumer reactions to digital privacy risks under the General Data Protection Regulation (GDPR), an encompassing regulatory act in the European Union aimed at enhancing consumer privacy controls. This work highlights how consumer behavior varies substantially between high- and low-risk privacy settings. These findings challenge existing notions surrounding privacy control efficacy and suggest a more complex consumer risk assessment process. The second study shifts to an investigation of historical obstacles to consumer adherence to expert advice, specifically betrayal aversion, in financial contexts. Betrayal aversion, a well-studied phenomenon in the economics literature, is defined as the strong dislike for the violation of trust norms implicit in a relationship between two parties. Through a complex simulation, it contrasts human and algorithmic financial advisors, revealing a significant decrease in betrayal aversion when human experts are replaced by algorithms. This shift indicates a transformative change in the dynamics of AI-mediated environments. The third chapter addresses nomophobia – the fear of being without one's mobile device – in the workplace, quantifying its stress-related effects and impacts on productivity. This investigation not only provides empirical evidence of nomophobia's real-world implications but also underscores the growing interdependence between technology and mental health. Overall, the dissertation integrates interdisciplinary theoretical frameworks and robust empirical methods to delineate the profound and often nuanced implications of the AI-data nexus on human behavior, underscoring the need for a deeper understanding of our relationship with evolving technological landscapes. / Doctor of Philosophy / The massive amounts of data collected online and the smart technologies that use this data often affect the way we make decisions, interact with others, and go about our daily lives. This dissertation explores that relationship, investigating how artificial intelligence (AI) and big data are changing behavior in today's society. In my first study, I examine how individuals respond to high and low risks of sharing their personal information online, specifically under the General Data Protection Regulation (GDPR), a new regulation meant to protect online privacy in the European Union. Surprisingly, the results show that changes enacted by the GDPR, such as default choices that automatically select the more privacy-preserving option, are more effective in settings in which the risk to one's privacy is low. This implies that the process by which people decide when and with whom to share information online is more complex than previously thought. In my second study, I shift focus to examine how people follow advice from experts, especially in financial decision contexts. I look specifically at betrayal aversion, a phenomenon studied in economics that captures individuals' unwillingness to trust someone when they fear they might be betrayed. I examine whether betrayal aversion changes when human experts are replaced by algorithms. Interestingly, individuals displayed no betrayal aversion when given a financial investment algorithm, showing that non-human experts may have certain benefits for consumers over their human counterparts. Finally, I study a modern phenomenon called 'nomophobia' – the fear of being without your mobile phone – and how it affects people at work. I find that this fear can significantly increase stress, especially as phone battery levels decrease. This leads to a reduction in productivity, highlighting how deeply technology is intertwined with our mental health. Overall, this work utilizes a mix of theories and detailed analyses to show the complex and often subtle ways AI and big data are influencing our actions and thoughts. It emphasizes the importance of understanding our relationship with technology as it continues to evolve rapidly.

Platform Privacy Construction – A case study of privacy on public digital healthcare platform 1177

Lindholm, Nina January 2023
Privacy is an essential concept in the field of healthcare. As healthcare is rapidly digitalizing and undergoing platformization, understanding how privacy is constructed has become important. The Swedish digital healthcare platform 1177 is a unique case that can be used to analyze how the different actors present in the platform environment, such as the platform owner, healthcare organizations, and patients, take part in the construction of privacy, and how, for example, national and international legislation affects the societal norms of the platform. By using the context analysis method and operationalizing platform society theory and contextual integrity theory on 1177, it was revealed that while there is a clear hierarchy between the actors, all of them participate in different ways in making sure that the data that flows within the platform is processed securely and with privacy in mind. While platforms are their own socio-technical environments, they are also affected by national and international legislative norms. Compliance with legislation such as the GDPR and the Swedish national patient data law is important, as non-compliance can cause issues for the platform. However, even where the platform's privacy construction is strong, the development of data analysis methods and AI can pose a risk that data is transferred from the platform for other purposes, such as research.

An analysis of the Privacy Policy of Browser Extensions

Zachariah, Susan Sarah January 2024
Technological advancement has transformed our lives by bringing unparalleled convenience and efficiency. Data, particularly consumer data, is at the heart of this transformation, essential for shaping businesses and developing personalized experiences. Companies can improve consumer satisfaction and loyalty by using data analysis to customize their products and services. However, the collection and utilization of consumer data raise privacy concerns. Protecting customers' personal information is essential to maintaining trust, respecting individual autonomy, and preventing unauthorized access or misuse. Alongside data protection, transparency is another essential factor: when companies or organizations handle users' data, they are obliged to inform those users of everything that happens with their data. Our study focuses on the online privacy policies of Google Chrome browser extensions. We examine which extensions comply with data protection guidelines and whether Google Chrome browser extensions are transparent enough to disclose the details required by those guidelines. Using Natural Language Processing (NLP) techniques, we extract insights from these policies.
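The abstract does not specify the NLP pipeline, so the snippet below is only a hypothetical illustration of one simple approach: pattern matching over a policy's text to flag which disclosure topics it appears to address. The checklist categories, phrases, and function names are assumptions for illustration, not the study's actual methodology.

```python
import re

# Hypothetical transparency checklist: topic -> phrases that signal disclosure.
CHECKLIST = {
    "data_collected": [r"we collect", r"information we (collect|receive)"],
    "third_party_sharing": [r"third[- ]part(y|ies)", r"share your (data|information)"],
    "retention": [r"retain", r"retention period", r"how long we keep"],
    "user_rights": [r"delete your (data|account)", r"access your (data|information)"],
    "contact": [r"contact us", r"data protection officer", r"dpo@"],
}

def audit_policy(policy_text: str) -> dict[str, bool]:
    """Return which checklist topics a privacy policy appears to address."""
    text = policy_text.lower()
    return {
        topic: any(re.search(pattern, text) for pattern in patterns)
        for topic, patterns in CHECKLIST.items()
    }

sample = "We collect browsing data and may share your information with third parties."
print(audit_policy(sample))
# {'data_collected': True, 'third_party_sharing': True, 'retention': False, ...}
```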

Light and Privacy, A proposal towards a testing and education standard

Torgersrud, Cody January 2020
The transformation of the architect's vision into architectural form is a lengthy process. From initial sketch to day-to-day life, a design is transformed through the reality of occupation. No matter how much effort is put into a design, its final effectiveness is determined by the end user. Access to ample daylight, balanced with an adequate sense of visual privacy within one's own home, is not often accounted for in the planning process. With current legislation making access to daylight a right in many developed countries, guaranteeing that access within the dense urban environment can mean putting residents' privacy into question when planning to meet these daylight requirements. Failing to consider the privacy needs of all residents, especially immigrant groups, can lead to privacy-driven modifications counterproductive to the overall goal of increasing access to daylight. Resident modifications can, in turn, reduce daylight levels within the home. There is a need for a system of analysis when it comes to balancing access to daylight with adequate visual privacy, connecting the critical impacts of these factors on human physiology and psychology. This proposal puts forward a system to analyze the relationship between the effective light transmission and the perceived visual privacy provided by a given visual privacy solution. The study is based on an analysis of current research regarding the effect of daylight on the human body, the importance of privacy within the home, the impact of cultural background on the perception of privacy, and the impact of changing urban density on how people live. The research proposes a system of measurement that takes into consideration both the quantitative effective daylight transmittance and a systematic qualitative analysis of perceived visual privacy through participant surveys. The data collected would eventually be combined in a way that could be easily communicated to architects, designers, manufacturers and, most importantly, the end user. This system would be used to ensure that residents are able to effectively balance the level of privacy they require while mitigating the loss of daylight within their homes, helping to ensure the most benefits for the resident regardless of what home they find themselves in.
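The proposal stops short of defining a formula, but the idea of pairing a measured daylight quantity with a surveyed privacy rating can be sketched as below. The function name, the 1-to-5 Likert scale, and the equal weighting are purely hypothetical assumptions for illustration, not the author's proposed metric.

```python
from statistics import mean

def screen_score(effective_transmittance: float,
                 privacy_ratings: list[int],
                 daylight_weight: float = 0.5) -> float:
    """Combine measured daylight transmittance (0..1) with perceived visual
    privacy ratings (1..5 Likert items from a participant survey) into a
    single 0..1 score for a given screening or shading solution.

    The equal weighting below is an illustrative assumption only.
    """
    privacy = (mean(privacy_ratings) - 1) / 4  # normalize 1..5 -> 0..1
    return daylight_weight * effective_transmittance + (1 - daylight_weight) * privacy

# Example: a perforated screen passing 62% of daylight, rated 4.1/5 for privacy.
ratings = [4, 4, 5, 4, 4, 4, 4, 4, 4, 4]
print(round(screen_score(0.62, ratings), 2))  # -> 0.7
```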

Protecting Online Privacy in the Digital Age: Carpenter v. United States and the Fourth Amendment's Third-Party Doctrine

Del Rosso, Cristina 01 January 2019
The intent of this thesis is to examine the future of the third-party doctrine given the proliferation of technology and the online data that surrounds us daily, specifically after the United States Supreme Court's decision in Carpenter v. United States. In order to better understand the Supreme Court's reasoning in that case, this thesis will review the history of the third-party doctrine and its roots in United States v. Miller and Smith v. Maryland. A review of Fourth Amendment history and jurisprudence is also crucial to this thesis, as it is imperative that individuals do not forfeit their constitutional guarantees for the benefit of living in a technologically advanced society. This requires an understanding of the modern-day functional equivalents of "papers" and "effects." Furthermore, this thesis will ultimately answer the following question: why is it legally significant that we protect under the Fourth Amendment at least some data that comes from technologies our forefathers could never have imagined? Looking to the future, this thesis will consider how to move forward in this technological era. It will scrutinize the relevance of the third-party doctrine given the rise of technology and the enormous amount of information held about us by third parties. In the past, the third-party doctrine may have been good law, but that time has passed. It is time for the third-party doctrine to be abolished so the Fourth Amendment can join the 21st century.
