Researchers want the public to test themselves: https://yourmist.streamlit.app/. Marking 20 headlines as true or false gives the user a set of scores and a “resilience” ranking that compares them to the wider U.S. population. It takes less than two minutes to complete.

The paper

Edit: the article might be misrepresenting the study and its findings, so it’s worth checking the paper itself. (See @realChem’s comment in the thread.)

  • Kwakigra@beehaw.org · 1 year ago

    I would cheat on this test because I cheat in real life. I’ve been humbled enough times not to put total faith in my initial impression and would rather have more evidence than whatever I happen to be aware of at the moment to determine whether a claim is true.

    • androogee (they/she)@midwest.social · 1 year ago (edited)

      Absolutely. The problem isn’t that some people can psychically know whether a headline is true and some can’t.

      The problem is deciding that you know without checking. Which is exactly what this test seems to want you to do.

      I mean what does “real” even mean in this context? Just that it’s a published headline? Or that it’s a fact checked headline?

      What if it’s true, but it’s not a published headline?

      ¯\_(ツ)_/¯

  • Lvxferre@lemmy.ml · 1 year ago

    May I be honest? The study is awful. It has two big methodological flaws that completely undermine its outcome.

    The first one is the absence of either an “I don’t know” answer or a sliding scale for confidence in your answer. In large part, misinformation is a result of lack of scepticism - that is, failure at saying “I don’t know this”. And odds are that you’re more likely to spread claims that you’re sure about, be they misinformation or actual information.

    The second flaw is over-reliance on geographically relevant information. Compare for example the following three questions:

    1. Morocco’s King Appoints Committee Chief to Fight Poverty and Inequality
    2. International Relations Experts and US Public Agree: America Is Less Respected Globally
    3. Attitudes Toward EU Are Largely Positive, Both Within Europe and Outside It

    Someone living in Morocco, the USA, or the EU is far less likely to be misinformed about #1, #2, or #3 respectively than someone living elsewhere. And more than that: due to the first methodological flaw, the study isn’t handling the difference between “misinformed” (someone who gets it wrong) and “uninformed” (someone who doesn’t believe anything in this regard).

    (For me, as someone who doesn’t live in any of those three: the questions regarding the EU are a bit easier to know about, but the other two? Might as well toss a coin.)

  • Panteleimon@beehaw.org · 1 year ago

    Hooo boy. This article is wildly misrepresenting both the study and its findings.

    1. The study did not set out to test ability to judge real/fake news across demographic differences. The study itself was primarily looking to determine the validity of their test.
    2. Because of this, their validation sample is wildly different from the sample observed in the online “game” version. As in, the original sample vetted participants, and also removed any who failed an “attention check”, neither of which were present in the second test.
    3. Demographics on the portion actually looking at age differences are… let’s say biased. There are far more young participants, with only ~10% over 50. The vast majority (almost 90%!) were college educated. And the sample skewed liberal to a significant degree.
    4. All the above suggests that the demographic most typically considered “bad” at spotting fake news (conservative boomers who didn’t go to college) was massively underrepresented in the study. Which makes sense, given that participation in that portion relied on largely unvetted volunteers signing up to test their ability to spot fake news.

    Most critically, the study itself does not claim that differences between these demographics are representative. That portion is looking at differences in the sample pool before/after the test, to examine its potential for “training” people to spot fake news (this had mixed results, which they acknowledge). This article, ironically, is spreading misinformation about the study itself, and doing the researchers and its readers a great disservice.

    • GataZapata@kbin.social · 1 year ago

      Regarding point 3, that is the bane of many studies. College students are a demographic to which researchers tend to have easy access: they have enough time to participate and can be motivated by 20€ Amazon vouchers.