
Combating Problematic Information Online with Dual Process Cognitive Affordances

Dual process theories of mind, developed over the last several decades, posit that humans use both heuristic or mental-shortcut (automatic) reasoning and analytical (reflective) reasoning while consuming information. Can such theories be used to support users' information consumption in the presence of problematic content in online spaces? To answer this question, I merge these theories with the idea of affordances from HCI into the concept of dual process cognitive affordances, consisting of automatic affordances and reflective affordances. Using this concept, I built and tested a set of systems to address two categories of online problematic content: misinformation and filter bubbles. In the first system, NudgeCred, I use cognitive heuristics from the MAIN model to design automatic affordances for better credibility assessment of news tweets from mainstream and misinformative sources. In TransparencyCue, I show the promise of value-centered automatic affordance design inside news articles that differentiates content quality. In NewsComp, I use comparative annotation to design reflective affordances that enable active engagement with stories from opposing-leaning sources, encouraging information consumption outside users' ideological filter bubbles. In OtherTube, I use parasocial interaction, that is, experiencing an information feed through the eyes of someone else, to design a reflective affordance that enables users to recognize filter bubbles in their YouTube recommendation feeds. Each system shows varying degrees of success and outlines considerations for cognitive affordance design. Overall, this thesis showcases the utility of design strategies centered on the dual process model of human information cognition to combat problematic information online.

Doctor of Philosophy

Over the last several decades, billions of users have moved to the internet for everyday information gathering, allowing information to flow around the globe at a massive scale. This flow is managed by algorithms personalized to each user's needs, creating a complicated trio of producer, algorithm, and consumer. This has resulted in some unforeseen challenges. Bad actors take advantage of these systems to promote problematic content, such as false information, termed misinformation. Personalized algorithms filter what people see, often isolating them from diverse perspectives and creating a distorted perception of reality. Augmenting online technology infrastructure to combat these challenges has become crucial, and it is the overall goal of this thesis. Cognitive psychologists theorize that two cognitive processes are at play when people consume information, a view known as dual process theory. Can we design new tools that combat these challenges by tapping into each of these processes? In this thesis, I answer this question through a series of studies. In each study, I combine this theory from psychology with design guidelines from Human-Computer Interaction to build socio-technical systems, and I evaluate each system through controlled experimentation. The results of these studies inform ways we can capitalize on users' information processing mechanisms to combat various types of problematic information online.

Identifier oai:union.ndltd.org:VTETD/oai:vtechworks.lib.vt.edu:10919/115989
Date 04 August 2023
Creators Bhuiyan, MD Momen
Contributors Computer Science and Applications, Lee, Sang Won, Mitra, Tanushree, Luther, Kurt, Horning, Michael A., Goyal, Nitesh
Publisher Virginia Tech
Source Sets Virginia Tech Theses and Dissertations
Language English
Detected Language English
Type Dissertation
Format ETD, application/pdf
Rights Creative Commons Attribution 4.0 International, http://creativecommons.org/licenses/by/4.0/
