
Coronavirus Shows You May Not Be As Good At Detecting Misinformation As You Think

Discussion in 'General Discussion' started by The Good Doctor, Dec 13, 2020.

  1. The Good Doctor


Most of us believe we're above average at detecting misinformation. Of course, that can't be true for all of us – and the coronavirus pandemic has well and truly put that belief to the test.

    As case numbers surge in the US, bringing a devastating wave of over 2,000 deaths per day, physicians and other health experts plead with citizens to heed public health advice. The problem is, some people still think the pandemic is a hoax.

    "You go to different parts of the country, and even when the outbreak is clear and hospitals are on the verge of being overrun, there are a substantial proportion of the people who still think that this is not real, that it's fake news or that it's a hoax," immunologist and director of the US National Institute of Allergy and Infectious Diseases (NIAID) Anthony Fauci told CNN.

    A new study has found this tendency to believe we're more discerning than others is one of the key reasons why stopping misinformation about the pandemic online has proven so tough. We all think it's everyone else that's more vulnerable to misinformation.

    "This makes it harder to get people to participate in media literacy education or training efforts, because it suggests that most people think everyone else needs the training more than they do," said communications researcher Yang Cheng from North Carolina State University.

So those around us end up unwittingly fuelling the infodemic, leading not only to further spread of the virus, but also directly to other deaths.


Surveying 1,793 adults in the US, Cheng and colleague Yunjuan Luo from South China University of Technology also found that misinformation was more likely to evoke negative emotions like fear and disgust.

    "Since fear, worry, or other negative emotions can facilitate information seeking, or encourage people to avoid specific behaviours during a crisis, communicators may want to consider using these emotional messages to convey accurate information about COVID-19 and public health," Cheng advises.

    Making use of disgust to communicate health advice has also been suggested by research into vaccine hesitancy. However, too much negativity can also be unproductive.

    "One of the most difficult things is to raise the right amount of fear in people," cautioned communications researcher Holley Wilkin from Georgia State University, who was not involved in the study. "You want them to take the pandemic seriously, but you don't want to go overboard so they think 'please, that will never happen."

    Wilkin explains that when people face constant fear, fear-based messages are no longer as powerful. Focusing on what can be gained then becomes more important. So, correctly targeting messages requires an understanding of your audience.

    "There's no monolithic audience out there that will respond to the same messages in the same ways. We need to use multiple sources through multiple channels because a segment of people may not trust this person, but they might trust that person," said Wilkin.

    But there are some techniques we can use to speak to the people we care about who are misinformed. One such strategy is motivational interviewing.

    "I ask my patients about their biggest barriers to changing their minds or habits; this way, I know which worries or misinformation to try to address," physician Yoo Jung Kim wrote for Undark in June, warning against resorting to ridicule and fearmongering when talking to people who are misinformed and stressing the importance of using empathy, patience, and respect.

    "Those who feel their beliefs are being threatened can become even more entrenched in their views."

Just as with the climate crisis, we often mistakenly approach health behaviours as if they were 100 percent up to the individual. But this is often not the case, Wilkin points out. Between political polarisation and bots enlisted to spread fear and lies about the virus, it is important to remember that a lot of this misinformation arises from wider problems, including a lack of leadership.

The media have also played a role in creating this problem.

    "If a media outlet gives equal voice and time to science experts and science skeptics, whether it's with regards to climate change or COVID-19, it conveys the message that 'this is a debatable thing,' even when it's not. People start to question whether scientists know what they're talking about," said Wilkin.

    The reality is that even if we all wanted to, not everyone will ever have the opportunity to learn the media (and science!) literacy required to process the masses of information we now deal with every day. So, misinformation must be combatted at the data science, regulation and political levels, too.

The good news is that Cheng and Luo also found that the better someone thinks they are at detecting misinformation, the more likely they are to support such regulations.

    Source
     

