41
Fake and Spam Messages: Detecting Misinformation During Natural Disasters on Social Media. Rajdev, Meet, 01 May 2015.
During natural disasters or crises, users on social media tend to readily believe posts related to the events and retweet them, hoping the information will reach many other users. Unfortunately, malicious users exploit this tendency and post misinformation, such as spam and fake messages, in the expectation of wider propagation. To address this problem, we conduct a case study of the 2013 Moore Tornado and Hurricane Sandy. Concretely, we (i) characterize the behavior of these malicious users; (ii) analyze properties of spam, fake, and legitimate messages; (iii) propose flat and hierarchical classification approaches; and (iv) detect both fake and spam messages while also distinguishing between them. Our experimental results show that our proposed approaches identify spam and fake messages with 96.43% accuracy and 0.961 F-measure.
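The flat versus hierarchical distinction can be illustrated with a short sketch. The Python snippet below (scikit-learn; not the authors' implementation, and the toy tweets, TF-IDF features, and logistic-regression models are illustrative assumptions) trains one three-class classifier for the flat approach, and for the hierarchical approach first separates legitimate messages from misinformation before distinguishing spam from fake.

```python
# Illustrative sketch of flat vs. hierarchical classification of disaster tweets.
# Labels: 0 = legitimate, 1 = spam, 2 = fake (toy data, not the study's corpus).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

tweets = [
    "shelter open at the community center, bring supplies",
    "click here to win free flight vouchers #sandy",
    "shark photographed swimming on the flooded highway",
]
labels = [0, 1, 2]

# Flat approach: a single three-class classifier.
flat = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
flat.fit(tweets, labels)

# Hierarchical approach, stage 1: legitimate vs. misinformation (spam or fake).
stage1 = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
stage1.fit(tweets, [0 if y == 0 else 1 for y in labels])

# Stage 2: spam vs. fake, trained only on the misinformation examples.
misinfo = [(t, y) for t, y in zip(tweets, labels) if y != 0]
stage2 = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
stage2.fit([t for t, _ in misinfo], [y for _, y in misinfo])

def predict_hierarchical(text: str) -> str:
    """Route a message through the two-stage classifier."""
    if stage1.predict([text])[0] == 0:
        return "legitimate"
    return "spam" if stage2.predict([text])[0] == 1 else "fake"

print(predict_hierarchical("retweet this free giveaway link to help victims"))
```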
42
SHIFT: An alternate future for experiencing reality in digital imagery / SHIFT: Mapping reality in digital imagery. Revi Poovakkat, Manu, January 2021.
A picture is worth a thousand words. Great visuals can enhance, dramatize, and even bend a narrative. From historical photographs to modern digital images made of millions of pixels, images have always been instrumental in shaping our visual understanding of the world around us. But just as they have shaped reality, easy access to image manipulation has also resulted in widespread misinformation. When anything can be faked, honest representation of reality in images becomes a hard problem to crack. After multiple design explorations, I recognized the need for a fundamental change in how we interact with images. The thesis resulted in an alternate landscape for digital imagery called SHIFT, in which images are connected entities that serve as access points to multiple perspectives and alternate realities. It is not an attempt to challenge image-manipulation technology but to use images as a means of developing a more informed understanding of the reality they represent.
43
Modality Effects in False Memory Production Using the Misinformation Paradigm. Hendrich, Megan A., 18 June 2019.
No description available.
44
Vulnerability to the Misinformation Effect as a Function of Handedness Consistency. Monroe, Stephanie R., 18 June 2019.
No description available.
45
Correcting eyewitness suggestibility: does explanatory role predict resistance to correction? Braun, Blair E., 20 November 2020.
No description available.
46
Will Spacing Retractions Modulate the Continued Influence Effect? Arreola, Hailey (16426194), 26 June 2023.
Globally, the misinformation crisis exposed the need for cognitive researchers to investigate interventions that will mitigate the influence of misinformation within memory. One proposed solution is a retraction, whereby misinformation is indicated to be inaccurate. Previous studies have demonstrated that providing a retraction after misinformation may reduce references to misinformation. The continued reliance on misinformation even after it has been corrected is known as the continued influence effect (CIE). It is unclear whether repeated retractions and the spacing of repeated retractions can reduce the CIE. In the present study, two experiments were conducted to investigate whether spacing repeated retractions among news messages would be more effective at reducing the CIE compared to massing retractions. Both experiments exposed participants to a news story containing misinformation. Each experiment included four retraction conditions: no retraction, a single retraction, or repeated retractions that were spaced or massed. In Experiment 1, a single retraction reduced reliance on misinformation, but we did not observe an additional benefit of repeated retractions when there were two retractions. In Experiment 2, we provided participants with three repeated retractions. Using this stronger manipulation, repeated retractions reduced references to misinformation compared to a single retraction, but there was no benefit of spacing them out. Collectively, our results suggest that repeating corrective messages can help reduce references to misinformation, with no supporting evidence that it matters how the repetitions are organized.
47
A Multi-Agent Model to Study the Effects of Crowdsourcing on the Spread of Misinformation in Social Networks. Bhattacharya, Ankur, 06 June 2023.
No description available.
48
Misinformation in the Media and its Influence on Racism. Champa, Jared, 01 January 2021.
The purpose of the current study was to examine how the media's positive and negative portrayals related to racism affect viewers' attitudes regarding African Americans. Previous research has shown that misinformation in the media can implicitly affect one's level of racism, and that gender and sociodemographic status can affect the way individuals perceive misinformation. This study aimed to address the relationship between misinformation depicting racist views directed toward African Americans and consumers' attitudes toward African Americans. It was hypothesized that exposure to misinformation would have a significant impact on participants' level of racism. A series of linear regression analyses was conducted to determine how race, sex, social class, right-wing authoritarianism, religious involvement, political preference, and exposure to real and fake news together predict participants' pro-black and anti-black views. Results indicated that exposure to fake news had a significant negative impact on pro-black viewpoints; however, neither real nor fake news significantly affected anti-black views.
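As a rough illustration of the "series of linear regression analyses" described above, the sketch below fits one such model in Python with pandas and statsmodels. The data file and column names are hypothetical placeholders, not the study's actual variables or code.

```python
# Hedged sketch: regress a pro-black attitude score on the predictors listed
# in the abstract (one regression per outcome; only one outcome is shown here).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey_responses.csv")  # hypothetical dataset

model = smf.ols(
    "pro_black_score ~ race + sex + social_class + rwa"
    " + religious_involvement + political_pref"
    " + real_news_exposure + fake_news_exposure",
    data=df,
).fit()
print(model.summary())
```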
49
The Indirect Threat of Misinformation to Democracy. Mortenson, Chloe R., 04 October 2021.
No description available.
50
Bridging the Value-Action Gap. Giese, Michel, January 2023.
Using an online charitable dictator game experiment (n=214), we explored how different, randomly assigned experimental treatments (social media posts) containing anti-climate-change sentiment (n=77, 36%), misinformation (n=74, 34.6%), and a control condition (n=63, 29.4%) affected the real donation behaviour of pro-environmentalists toward an environmental non-governmental organisation. Participants were recruited through social media (Facebook, LinkedIn, and Reddit). We found that the treatments produced minimal differences in donation likelihood and amount. We used the same charitable dictator game experiment (n=56) to explore how these treatments containing anti-climate-change sentiment (n=20, 35.7%), misinformation (n=26, 46.4%), and a control condition (n=10, 17.9%) affected the social media response behaviour of pro-environmentalists, as well as their real donation behaviour. We found that the treatments produced differences in reply frequency (p=0.02935) and minimal differences in reply tone (p=0.05698), while donation behaviour was unaffected. Donation behaviour did not vary with demographic factors, with the exception of geographic location (p=0.04825). These results suggest that the donation behaviour of pro-environmentalists is resistant to climate-change misinformation and anti-climate-change opinions presented through social media, while these treatments may influence social media reply behaviour. Further research is needed into the effect of this reply behaviour on other social media users and online spaces, and into whether these observations apply to the general population. These results also call into question the necessity of moderating misinformation and climate scepticism in online spaces, as there is some evidence that this content does not negatively affect prosocial behaviour and may instead encourage cross-attitudinal discussion. / Thesis / Master of Arts (MA)
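Comparisons of the kind reported above could be run along the following lines. This is a minimal sketch under assumed column names (treatment, donation, replied), not the thesis' analysis script: a Kruskal-Wallis test on donation amounts across the three conditions and a chi-square test of reply frequency by condition.

```python
# Hedged sketch of treatment comparisons for a charitable dictator game.
import pandas as pd
from scipy.stats import chi2_contingency, kruskal

df = pd.read_csv("dictator_game.csv")  # hypothetical dataset

# Donation amounts across the three treatment conditions.
groups = [g["donation"].to_numpy() for _, g in df.groupby("treatment")]
print("donation amount:", kruskal(*groups))

# Reply frequency (did the participant reply to the post?) by condition.
table = pd.crosstab(df["treatment"], df["replied"])
chi2, p, dof, _ = chi2_contingency(table)
print(f"reply frequency: chi2={chi2:.3f}, p={p:.4f}, dof={dof}")
```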