Psychological Research on Misinformation: Current Issues and Future Directions
In: European psychologist, Band 28, Heft 3, S. 135-138
ISSN: 1878-531X
In: Political psychology: journal of the International Society of Political Psychology, Band 40, Heft 2, S. 241-260
ISSN: 1467-9221
Abstract: Misinformation often continues to influence people's memory and inferential reasoning after it has been retracted; this is known as the continued influence effect (CIE). Previous research investigating the role of attitude‐based motivated reasoning in this context has found conflicting results: Some studies have found that worldview can have a strong impact on the magnitude of the CIE, such that retractions are less effective if the misinformation is congruent with a person's relevant attitudes, in which case the retractions can even backfire. Other studies have failed to find evidence for an effect of attitudes on the processing of misinformation corrections. The present study used political misinformation—specifically fictional scenarios involving misconduct by politicians from left‐wing and right‐wing parties—and tested participants identifying with those political parties. Results showed that in this type of scenario, partisan attitudes have an impact on the processing of retractions, in particular (1) if the misinformation relates to a general assertion rather than just a specific singular event and (2) if the misinformation is congruent with a conservative partisanship.
In: Political psychology: journal of the International Society of Political Psychology
ISSN: 1467-9221
Abstract: Making misleading statements may benefit a politician, for example, during an election campaign. However, there are potentially also negative consequences; political misinformation can taint democratic debate, voters may be misled into forming false beliefs, and being fact‐checked may damage a politician's reputation. Previous research has found that correcting misleading statements made by established politicians reduces topical misperceptions, but hardly affects voter feelings and support. Here, we examined the impact of political misinformation and fact‐checking when politicians are unfamiliar. Participants (N = 406) were engaged in a simulated election campaign set in an unfamiliar country, featuring statements from fictional candidates. Participants indicated their feelings toward the candidates, cast a vote, and rated their belief in the fact‐checked statements. Misleading statements that were not corrected positively affected feelings toward and voting for (right‐leaning) politicians. Corrective fact‐checks had large effects, reducing belief in misinformation, and fact‐checked candidates were viewed much less favorably and attracted far fewer votes. This demonstrates that in the absence of strong pre‐existing attitudes, corrective fact‐checks can negatively impact misinformation‐spreading politicians who are not (yet) well known.
In: Humanities and Social Sciences Communications, Band 8, Heft 1
ISSN: 2662-9992
Abstract: The COVID-19 pandemic has caused immense distress but also created opportunity for radical change. Two main avenues for recovery from the pandemic have been discussed: A "back to normal" that foregrounds economic recovery, and a sustainable and progressive "build back better" approach that seeks to address global problems such as inequality and climate change. The article reports two experiments conducted on representative British and American samples (N = 600 and N = 800, respectively) that show that people in both countries overall prefer a progressive future to a return to normal, although that preference is stronger on the political left and center-left with ambivalence prevailing on the right. However, irrespective of political leanings, people consider a return to normal more likely than a progressive future. People also mistakenly believe that others want the progressive scenarios less, and the return to normal more, than they actually do. The divergence between what people want and what they think others want represents an instance of pluralistic ignorance, which arises when public discourse is not reflecting people's actual opinions. Publicizing public opinion is thus crucial to facilitate a future with broad support. In additional open-ended items, participants cited working from home, reduced commuting, and a collective sense of civility as worth retaining post pandemic.
In: Lewandowsky, S., Facer, K., & Ecker, U. K. H. (2021). Losses, hopes, and expectations for sustainable futures after COVID. Humanities & Social Sciences Communications, 8(1), 296. https://doi.org/10.1057/s41599-021-00961-0
In: Lewandowsky, S., Jetter, M., & Ecker, U. K. H. (2020). Using the president's tweets to understand political diversion in the age of social media. Nature Communications, 11, 5764. https://doi.org/10.1038/s41467-020-19644-6
Abstract: Social media has arguably shifted political agenda-setting power away from mainstream media onto politicians. Current U.S. President Trump's reliance on Twitter is unprecedented, but the underlying implications for agenda setting are poorly understood. Using the president as a case study, we present evidence suggesting that President Trump's use of Twitter diverts crucial media (The New York Times and ABC News) from topics that are potentially harmful to him. We find that increased media coverage of the Mueller investigation is immediately followed by Trump tweeting increasingly about unrelated issues. This increased activity, in turn, is followed by a reduction in coverage of the Mueller investigation—a finding that is consistent with the hypothesis that President Trump's tweets may also successfully divert the media from topics that he considers threatening. The pattern is absent in placebo analyses involving Brexit coverage and several other topics that do not present a political risk to the president. Our results are robust to the inclusion of numerous control variables and examination of several alternative explanations, although the generality of the successful diversion must be established by further investigation.
Abstract: Misinformation can undermine a well-functioning democracy. For example, public misconceptions about climate change can lead to lowered acceptance of the reality of climate change and lowered support for mitigation policies. This study experimentally explored the impact of misinformation about climate change and tested several pre-emptive interventions designed to reduce the influence of misinformation. We found that false-balance media coverage (giving contrarian views equal voice with climate scientists) lowered perceived consensus overall, although the effect was greater among free-market supporters. Likewise, misinformation that confuses people about the level of scientific agreement regarding anthropogenic global warming (AGW) had a polarizing effect, with free-market supporters reducing their acceptance of AGW and those with low free-market support increasing their acceptance of AGW. However, we found that inoculating messages that (1) explain the flawed argumentation technique used in the misinformation or that (2) highlight the scientific consensus on climate change were effective in neutralizing those adverse effects of misinformation. We recommend that climate communication messages should take into account ways in which scientific content can be distorted, and include pre-emptive inoculation messages.
In: Humanities and Social Sciences Communications, Band 8, Heft 1
ISSN: 2662-9992
Abstract: Several countries have successfully reduced their COVID-19 infection rate early, while others have been overwhelmed. The reasons for the differences are complex, but response efficacy has in part depended on the speed and scale of governmental intervention and how communities have received, perceived, and acted on the information provided by governments and other agencies. While there is no 'one size fits all' communications strategy to deliver information during a prolonged crisis, in this article, we draw on key findings from scholarship in multiple social science disciplines to highlight some fundamental characteristics of effective governmental crisis communication. We then present ten recommendations for effective communication strategies to engender maximum support and participation. We argue that an effective communication strategy is a two-way process that involves clear messages, delivered via appropriate platforms, tailored for diverse audiences, and shared by trusted people. Ultimately, the long-term success depends on developing and maintaining public trust. We outline how government policymakers can engender widespread public support and participation through increased and ongoing community engagement. We argue that a diversity of community groups must be included in engagement activities. We also highlight the implications of emerging digital technologies in communication and engagement activities.
In: Swire, B., Berinsky, A. J., Lewandowsky, S., & Ecker, U. K. H. (2017). Processing political misinformation: Comprehending the Trump phenomenon. Royal Society Open Science, 4(3), 160802. https://doi.org/10.1098/rsos.160802
Abstract: This study investigated the cognitive processing of true and false political information. Specifically, it examined the impact of source credibility on the assessment of veracity when information comes from a polarizing source (Experiment 1), and effectiveness of explanations when they come from one's own political party or an opposition party (Experiment 2). These experiments were conducted prior to the 2016 Presidential election. Participants rated their belief in factual and incorrect statements that President Trump made on the campaign trail; facts were subsequently affirmed and misinformation retracted. Participants then re-rated their belief immediately or after a delay. Experiment 1 found that (i) if information was attributed to Trump, Republican supporters of Trump believed it more than if it was presented without attribution, whereas the opposite was true for Democrats and (ii) although Trump supporters reduced their belief in misinformation items following a correction, they did not change their voting preferences. Experiment 2 revealed that the explanation's source had relatively little impact, and belief updating was more influenced by perceived credibility of the individual initially purporting the information. These findings suggest that people use political figures as a heuristic to guide evaluation of what is true or false, yet do not necessarily insist on veracity as a prerequisite for supporting political candidates.
In: Swire-Thompson, B., Ecker, U. K. H., Lewandowsky, S., & Berinsky, A. J. (2020). They might be a liar but they're my liar: Source evaluation and the prevalence of misinformation. Political Psychology, 41(1), 21-34. https://doi.org/10.1111/pops.12586
Abstract: Even if people acknowledge that misinformation is incorrect after a correction has been presented, their feelings towards the source of the misinformation can remain unchanged. The current study investigated whether participants reduce their support of Republican and Democratic politicians when the prevalence of misinformation disseminated by the politicians appears to be high in comparison to the prevalence of their factual statements. We presented U.S. participants either with (1) equal numbers of false and factual statements from political candidates or (2) disproportionately more false than factual statements. Participants received fact‐checks as to whether items were true or false, then rerated both their belief in the statements as well as their feelings towards the candidate. Results indicated that when corrected misinformation was presented alongside equal presentations of affirmed factual statements, participants reduced their belief in the misinformation but did not reduce their feelings towards the politician. However, if there was considerably more misinformation retracted than factual statements affirmed, feelings towards both Republican and Democratic figures were reduced—although the observed effect size was extremely small.