1. A Study of Internet Privacy Invasion in PTT
Li, Ming-Han, 03 February 2010
Although technology is well developed, personal data also spreads widely on the internet. Privacy invasions are common on PTT: people can uncover a great deal of information about someone from as little as a name or a video, and many are afraid of being exposed online. Yet some people support this kind of privacy invasion while others don't. What is the main factor behind these decisions?
This study uses descriptions of different kinds of events to examine whether the framing of a description affects people's decisions about privacy invasion. We separate descriptions into privacy-violating and non-violating framings, and use a two-stage survey to observe whether, when people face different kinds of descriptions, the two framings lead to different privacy decisions.
In the end, we find how gender, description, and event type affect privacy decision making, and we offer a suggestion concerning situational privacy decisions and the main factors driving privacy decision making.
2. Helping Smartphone Users Manage their Privacy through Nudges
Almuhimedi, Hazim, 01 December 2017
The two major smartphone platforms (Android and iOS) have more than two million mobile applications (apps) available from their respective app stores, and each store has seen more than 50 billion apps downloaded. Although apps provide desired functionality by accessing users’ personal information, they also access personal information for other purposes (e.g., advertising or profiling) that users may or may not desire. Users can exercise control over how apps access their personal information through permission managers. However, a permission manager alone might not be sufficient to help users manage their app privacy because: (1) privacy is typically a secondary task and thus users might not be motivated enough to take advantage of the permission manager’s functionality, and (2) even when using the permission manager, users often make suboptimal privacy decisions due to hurdles in decision making such as incomplete information, bounded rationality, and cognitive and behavioral biases. To address these two challenges, the theoretical framework of this dissertation is the concept of nudges: “soft paternalistic” behavioral interventions that do not restrict choice but account for decision making hurdles. Specifically, I designed app privacy nudges that primarily address the incomplete information hurdle. The nudges aim to help users make better privacy decisions by (1) increasing users’ awareness of privacy risks associated with apps, and (2) temporarily making privacy the primary task to motivate users to review and adjust their app settings. I evaluated app privacy nudges in three user studies. All three studies showed that app privacy nudges are indeed a promising approach to help users manage their privacy. App privacy nudges increased users’ awareness of privacy risks associated with apps on their phones, switched users’ attention to privacy management, and motivated users to review their app privacy settings.
Additionally, the second study suggested that not all app privacy nudge contents equally help users manage their privacy. Rather, more effective nudge contents informed users of: (1) contexts in which their personal information has been accessed, (2) purposes for apps’ accessing their personal information, and (3) potential implications of secondary usage of users’ personal information. The third study showed that user engagement with nudges decreases as users receive repeated nudges. Nonetheless, the results of the third experiment also showed that users are more likely to engage with repeated nudges (1) if users have engaged with previous nudges, (2) if repeated nudges contain new information (e.g., additional apps, not shown in earlier nudges, that accessed sensitive resources), or (3) if the nudge contents of repeated nudges resonate with users. The results of this dissertation suggest that mobile operating system providers should enrich their systems with app privacy nudges to assist users in managing their privacy. Additionally, the lessons learned in this dissertation may inform designing privacy nudges in emerging areas such as the Internet of Things.
3. A Jagged Little Pill: Ethics, Behavior, and the AI-Data Nexus
Kormylo, Cameron Fredric, 21 December 2023
The proliferation of big data and the algorithms that utilize it have revolutionized the way in which individuals make decisions, interact, and live. This dissertation presents a structured analysis of behavioral ramifications of artificial intelligence (AI) and big data in contemporary society. It offers three distinct but interrelated explorations. The first chapter investigates consumer reactions to digital privacy risks under the General Data Protection Regulation (GDPR), an encompassing regulatory act in the European Union aimed at enhancing consumer privacy controls. This work highlights how consumer behavior varies substantially between high- and low-risk privacy settings. These findings challenge existing notions surrounding privacy control efficacy and suggest a more complex consumer risk assessment process. The second study shifts to an investigation of historical obstacles to consumer adherence to expert advice, specifically betrayal aversion, in financial contexts. Betrayal aversion, a well-studied phenomenon in economics literature, is defined as the strong dislike for the violation of trust norms implicit in a relationship between two parties. Through a complex simulation, it contrasts human and algorithmic financial advisors, revealing a significant decrease in betrayal aversion when human experts are replaced by algorithms. This shift indicates a transformative change in the dynamics of AI-mediated environments. The third chapter addresses nomophobia – the fear of being without one's mobile device – in the workplace, quantifying its stress-related effects and impacts on productivity. This investigation not only provides empirical evidence of nomophobia's real-world implications but also underscores the growing interdependence between technology and mental health. 
Overall, the dissertation integrates interdisciplinary theoretical frameworks and robust empirical methods to delineate the profound and often nuanced implications of the AI-data nexus on human behavior, underscoring the need for a deeper understanding of our relationship with evolving technological landscapes. / Doctor of Philosophy / The massive amounts of data collected online and the smart technologies that use this data often affect the way we make decisions, interact with others, and go about our daily lives. This dissertation explores that relationship, investigating how artificial intelligence (AI) and big data are changing behavior in today's society. In my first study, I examine how individuals respond to high and low risks of sharing their personal information online, specifically under the General Data Protection Regulation (GDPR), a new regulation meant to protect online privacy in the European Union. Surprisingly, the results show that changes enacted by GDPR, such as default choices that automatically select the more privacy-preserving choice, are more effective in settings in which the risk to one's privacy is low. This implies the process in which people decide when and with whom to share information online is more complex than previously thought. In my second study, I shift focus to examine how people follow advice from experts, especially in financial decision contexts. I look specifically at betrayal aversion, a common trend studied in economics, that highlights individuals' unwillingness to trust someone when they fear they might be betrayed. I examine if betrayal aversion changes when human experts are replaced by algorithms. Interestingly, individuals displayed no betrayal aversion when given a financial investment algorithm, showing that non-human experts may have certain benefits for consumers over their human counterparts. 
Finally, I study a modern phenomenon called 'nomophobia' – the fear of being without your mobile phone – and how it affects people at work. I find that this fear can significantly increase stress, especially as phone battery levels decrease. This leads to a reduction in productivity, highlighting how deeply technology is intertwined with our mental health. Overall, this work utilizes a mix of theories and detailed analyses to show the complex and often subtle ways AI and big data are influencing our actions and thoughts. It emphasizes the importance of understanding our relationship with technology as it continues to evolve rapidly.